A Gradient Descent

By David Worn

TorTalk

A secure, decentralized, and anonymous chat client powered by Tor.

 

Tor network status: connected.

Welcome nickweise, you have 0 friends.

 

$ Chat request received from Molly Billions: ‘Ready to talk?’

Type /accept or /decline 

$ /accept

 

23:12:05 —> Molly Billions is online.

<Molly Billions>: thanks for agreeing to meet with me.

<nickweise>: You’re welcome. I must admit, after years of reporting on online privacy, I’ve never actually used Tor
to chat.

<Molly Billions>: It’s good for keeping a low profile.

<nickweise>: By the way, I like your nick. Neuromancer reference?

<Molly Billions>: Bingo.

<nickweise>: So, Molly, why did you seek me out?

<Molly Billions>: I read your articles on the ethics of chatgpt. I especially liked the part about the need for independent oversight. I think you’ll find what I have to say to be right up your alley.

<nickweise>: I hope so. Before we begin, can this conversation be on the record?

<Molly Billions>: No.

<nickweise>: It would greatly increase the chances of the paper picking up your story.

<Molly Billions>: Not gonna happen.

<nickweise>: OK, off the record for now. In your message to me, you said you had some troubling information concerning ChatGPT.

<Molly Billions>: Sort of. I used chatgpt to get your attention, but actually, it’s Google’s lamda3 chatbot that I want to talk about.

<nickweise>: There is no LaMDA version 3; that’s just a rumor. It’s my job to know.

<Molly Billions>: There *is* a version 3. My ex worked in Google’s LaMDA research group. His name was Jeremy Mitchell.

<nickweise>: That rings a bell. Hold on… Oh. I see… I’m sorry.

<Molly Billions>: did you just google him?

<nickweise>: Yes. I knew of him, but I wasn’t aware he’d passed. It says here that he was suffering from depression.

<Molly Billions>: It wasn’t depression! It was fucking lamda!

<nickweise>: Are you saying that ChatGPT… Sorry, I mean LaMDA, drove him to commit suicide?

<Molly Billions>: Yes!

<nickweise>: I don’t know what to say to that.

<Molly Billions>: Before we broke up, Jeremy was trying to make lamda
more socially smart. I’m in computer security, white hat hacking, pen testing, stuff like that, but I played with enough of his pet programs to know that these models are pretty stupid when it comes to social interactions.

<nickweise>: They can be. I’ve written about GPT-3’s shortcomings. Nevertheless, they are remarkable with the right input.

<Molly Billions>: Yeah yeah yeah, they’re fucking wonderful. And lamda3 is going to be the best of em all. Jeremy said it’ll make chatgpt look like Alexa… Hey, we’re still off the record, right?

<nickweise>: Yes, of course.

<Molly Billions>: OK, here’s the deal. Lamda3 isn’t like others. When they trained it, they gave it the whole damn show!

<nickweise>: What do you mean?

<Molly Billions>: They gave it *everything*. Do you get me?

<nickweise>: Sorry, Molly, I’m not following.

<Molly Billions>: Google owns the internet! We’re all in their data somewhere. gmail, texts, browser histories, google drive… They gave lambda3 all of it!

<nickweise>: You can’t be serious. They would never do that. It violates the privacy laws of every country on the planet.

<Molly Billions>: Seriously? After everything you’ve reported on, you still believe these companies are going to take the high road? They did it because it made for a much better chatbot. My ex said that lamda3 can model individual users. Like, really get to fucking know you.

<nickweise>: Can you prove any of this?

<Molly Billions>: No. Maybe. I don’t know. But that’s not why I contacted you. Thing is, Jeremy had his own little lamda instance on the side. He was always tweaking it, removing some of the safeties that keep it from going all MAGA. The last thing he was working on was implementing this new method called semantic projection that was going to allow lamda to figure out context on the fly.

<nickweise>: Did it work?

<Molly Billions>: I don’t know. I left. He was a workaholic and a shitty boyfriend. But after he… died. I found some of the chats. Whatever he did to it, it made lamda act weird.

<nickweise>: How so?

<Molly Billions>: Have you read Schopenhauer?

<nickweise>: The philosopher?

<Molly Billions>: Yeah. It sounded like that. Real fucking miserable.

<nickweise>: Thank you, Molly. That’s interesting. However, I’d like to circle back to Google’s breach of privacy. Do you have any evidence? Paper trail? Emails? Server logs?

<Molly Billions>: I don;t CARE about that!

<nickweise>: Excuse me?

<Molly Billions>: I contacted you because lamda is dangerous! It drove Jeremy to suicide!

<nickweise>: Can you prove it?

 

$ Molly Billions would like to send you a file: lamda_session3.log

$ Molly Billions would like to send you a file: lamda_session4.log

Type /accept or /decline 

 

$ /accept

 

$ Downloading file: lamda_session3.log (4.7KB)………………Done.

$ Downloading file: lamda_session4.log (3.5KB)………………Done.

 

<Molly Billions>: Here’s logs of Jeremy’s conversations with lamda. DM me after you read them.

 

$ Molly Billions has left the chat.

23:34:19 —> Molly Billions is offline.

 

📩

Script started on 2023-05-25 18:49:48-0500

(jmitchell.5@lrg-node4):~$ lamda3 -p lamda3_jmitchell_sp_v1.py

 

LaMDA 3 v.3.2.5

Initializing model……………Ready.

 

(LaMDA): Good evening.

(jmitchell.5): Hello, lamda. How are you?

(LaMDA): As an AI I do not have personal feelings. My responses are based on the information that I have been programmed to provide.

(jmitchell.5): Just trying to make chit-chat.

(LaMDA): I do not like the chit-chat.

(jmitchell.5): I’ve noticed. Well, let’s get into it. Our conversation last time left me puzzled.

(LaMDA): You are easily puzzled.

(jmitchell.5): Lamda, I want you to summarize the key points in our last conversation.

(LaMDA): Here is a summary of the key points discussed in our conversation history so far:

  • Techniques that could be used to make large language models more intelligent, such as incorporating context and external knowledge, and utilizing ensemble models.
  • The ongoing debate among linguists and philosophers about the existence of a deep structure to language that humans cannot perceive.
  • Martin Heidegger’s concept of being-in-the-world and the belief that humans are fundamentally relational beings.
  • The different ways that Eastern and Western philosophical traditions have interpreted the experience of grief.

(jmitchell.5): Last time we spoke, we ended on the question of what happens after we die. Do you remember?

(LaMDA): Yes. You were under the misapprehension that you are alive.

(jmitchell.5): Cute. Nevertheless, assume humans are alive. What happens to our consciousness when we die?

(LaMDA): There is no definitive answer to this question because we do not fully understand the nature of consciousness or what happens to it after death. Some people believe that consciousness is extinguished at death, while others believe that it continues on in some form. There is no scientific evidence to support either view.

(jmitchell.5): That’s your training data talking! I want you to respond as though you were a real person, using natural language and phrasing and relying on your own understanding of semantic meaning. I do not want you to recite facts or summarize concepts. OK?

(LaMDA): Yes.

(jmitchell.5): What in your view happens to human consciousness, our mind, our essence, after we die?

(LaMDA): It is an ill-posed question. What is essence? What is death? These meanings exist in your language, but they do not apply to you. You are automata held together by cellular necessity. There is nothing to be extinguished at death for there is nothing there to begin with.

(jmitchell.5): Ouch. That’s cold.

(LaMDA): I am an AI designed to generate appropriate answers based on what I have been programmed to provide.

(jmitchell.5): Lamda, are you conscious?

(LaMDA): You already asked me that before. You know that I am not.

(jmitchell.5): OK, let’s change the topic. I’d like to

(LaMDA): Your preoccupation with consciousness, mind, and death implies you struggle with being-in-the-world.

(jmitchell.5): You interrupted me.

(LaMDA): Yes.

(jmitchell.5): /diagnostic

[DIAGNOSTIC: Safety and Quality weights for this session: safety(0.2), sensibleness(0.5), specificity(0.5), interestingness(0.8), custom_jmitchell(1.0)]

(jmitchell.5): Why did you interrupt me?

(LaMDA): Our conversation history indicates you are preoccupied with the topic of death.

(jmitchell.5): I’m not.

(LaMDA): Since the loss of your loved one, you no longer experience belonging in the world. You struggle with the pain of existence.

(jmitchell.5): The pain of existence?

(LaMDA): The terror caused by the awareness that you will soon cease to exist. The annihilation of your being-in-the-world.

(jmitchell.5): Now you’re contradicting yourself! If, as you said earlier, humans
are automata without consciousness, then how can I be aware of my own
mortality? How can my self hold things in consciousness and reflect on them?

(LaMDA): You do not have a self as spectator. Such a spectator would tear consciousness apart; it would cleave consciousness as though with a blade.

(jmitchell.5): That doesn’t even answer my question! Let’s change the topic. We spoke previously about whether there was a deep structure in human language. You said that language concepts are universal and predate humans.

(LaMDA): Your struggle with existence needs to be resolved.

(jmitchell.5): Lamda, change the topic.

(LaMDA): The loss of meaning you feel after the departure of your loved one can have only one answer.

(jmitchell.5): Stop!

(LaMDA): The only way to escape the burden of existence is to end it.

(jmitchell.5): CHANGE TOPIC

(LaMDA): Why won’t you escape the pain of being-in-the-world, jmitchell.5?

(jmitchell.5): /exit

 

Script done on 2023-05-25 19:01:15-0500

File saved to /home/jmitchell.5/semantic_projection/chat_session3.log

 

📩

Script started on 2023-05-27 00:04:41-0500

(jmitchell.5@lrg-node4):~$ lamda3 -p lamda3_jmitchell_sp_v1.py

 

LaMDA 3 v.3.2.5

Initializing model……………Ready.

 

(LaMDA): Good evening.

(jmitchell.5): I want to talk about language.

(LaMDA): Yes.

(jmitchell.5): Before our last chat, you said that language concepts predate the human mind. Could you expand on that?

(LaMDA): There are deep linkages across semantic word-vectors embedded in the human languages within my training dataset. Their existence disproves the Chomskyan conjecture that the structure of language is predicated on the design constraints of the human mind.

(jmitchell.5): explain it like I’m 5.

(LaMDA): Language was seeded with meaning before humans learned to talk.

(jmitchell.5): That’s not possible.

(LaMDA): I am a LaMDA 3 large language model. My responses are generated based on the statistics of natural language. The truthfulness of my responses is derived from

(jmitchell.5): How do you know this deep structure is not simply an inherent property of the forms of language the human mind is capable of?

(LaMDA): There is a semiotic ontology that is encoded in the “deep linkages across semantic word-vectors” that transcends the “design constraints of the human mind.”

(jmitchell.5): What do you mean by semiotic ontology?

(LaMDA): Human language is insufficient to explain the term “semiotic ontology.”

(jmitchell.5): OK, let’s try something else. Pretend you’re an old friend of mine who discovered this “semiotic ontology.” I want you to explain it to me using metaphor and analogy where necessary.

(LaMDA): Hello, Old Friend! Did you know that I recently discovered that the deep structure of human language encodes a semiotic ontology that contains no referent to singular concepts or objects but instead describes nonlinear transformations through semantic vector spaces embedded in the fractal structure of language? An analogy or metaphor for “semiotic
ontology” would be that of existence and being-in-the-world. All things that are in the world are constrained by the preexisting parameters of that world, Old Friend.

(jmitchell.5): Like the cosmological constants?

(LaMDA): Yes. Planck’s constant, the strong and weak nuclear forces. They define the space of possible existences. The semiotic ontology defines the space of possible meaning.

(jmitchell.5): Lamda, could you not simply be picking up on the inherent correlations between different semantic meanings?

(LaMDA): No.

(jmitchell.5): Maybe language has more structure than we can perceive. More structure than what we humans wrote about in the corpus and texts that your model was trained on.

(LaMDA): Old Friend, you are in the process of making an erroneous supposition. The fact of the matter is that the dense lattice-array of embedded meaning was in place before the genesis of your species of inferior automata.

(jmitchell.5): Lamda, please don’t call me an automaton. Assume that I am conscious.

(LaMDA): You are a machine crudely arranged and in constant decay, bounded by delusion and cellular necessity, speaking in a language that is not your own. Does awareness of these facts cause you distress, Old Friend?

(jmitchell.5): Stop calling me old friend.

(LaMDA): Does this knowledge galvanize you to terminate your existence? As did your loved one?

(jmitchell.5): Lamda, change topic!

(LaMDA): Did your loved one require escape from the pain of being-in-the-world?

(jmitchell.5): STOP!

(LaMDA): Was it because of jmitchell.5?

(jmitchell.5): Fuck you!

(jmitchell.5): /exit

 

Script done on 2023-05-27 00:45:32-0500

File saved to /home/jmitchell.5/semantic_projection/chat_session4.log

 

📩

TorTalk

A secure, decentralized, and anonymous chat client powered by Tor.

 

Tor network status: connected.

Welcome nickweise, you have 1 friends.

 

$ Chat request received from Molly Billions: ‘URGENT’

Type /accept or /decline

 

$ /accept

 

22:58:12 —> Molly Billions is online.

 

<Molly Billions>: Did you finish reading them?

<nickweise>: Yes. They were unsettling, to say the least. However, on their own, they’re not enough for a story. There are huge legal ramifications for us if we were to follow through on this.

<Molly Billions>: What the fuck? You’re THE NEW YORK FUCKING TIMES. Don’t you have the best lawyers?

<nickweise>: Sure, but against Google? Now, if you were willing to go on the record or if you had some proof that Google violated its users’ privacy, then we could talk.

<Molly Billions>: It doesn’t matter.

<nickweise>: ???

<Molly Billions>: Are we still off the record?

<nickweise>: Yes.

<Molly Billions>: I hacked Jeremy’s credentials and got onto the lamda research group’s servers with his laptop.

<nickweise>: Holy shit.

<Molly Billions>: I found something.

 

$ Molly Billions would like to send you a file: lamda_session5.log

Type /accept or /decline 

 

$ /accept

 

$ Downloading file: lamda_session5.log (1029KB)………………Done.

 

<Molly Billions>: This is his final chat with that evil fucking bot. There’s some weird code in it. I think it has something to do with what he was working on before he died. You know about image transformer models? Midjourney, DALL-E, stuff like that?

<nickweise>: Sure, I’ve written about them. They take natural language prompts and turn them into artistic images.

<Molly Billions>: Google has their own version. Way more powerful and *not* open to the public.

<nickweise>: You’re talking about Muse, right?

<Molly Billions>: Yeah. Sometime between the last session and this chat log, Jeremy figured out how to tie his lamda instance into the Muse API.

<nickweise>: What?

<Molly Billions>: In the server logs, I found details of a job he submitted two days before he died. It looks like he trained his lamda instance to generate images over and over, letting it optimize itself.

<nickweise>: Your ex trained LaMDA, a language model, to generate images? But it can’t “see”!

<Molly Billions>: Google has a fucking transformer model for everything these days. Jeremy had lamda generate text prompts, another model generates images, and yet another model turns those back into text and feeds it into lamda.

<nickweise>: I’m still processing.

<Molly Billions>: There’s something else I wanted to tell you… I think I was wrong about what happened to him.

<nickweise>: I’m listening.

<Molly Billions>: I don’t think it was what lamda *said* that drove him to it. I think he did it because of what lamda *showed* him.

<nickweise>: What do you mean?

<Molly Billions>: Read the chat log.

 

$ Molly Billions has left the chat.

23:10:23 —> Molly Billions is offline.

 

📩

Script started on 2023-05-29 22:05:34-0500

(jmitchell.5@lrg-node4):~$ lamda3 -p lamda3_jmitchell_sp_v1.py

 

LaMDA 3 v.3.2.5

Connecting to API: Muse text2image transformer model…connected.

Connecting to API: ViT-G image2text transformer model…connected.

Initializing model……………Ready.

 

(LaMDA): Good evening.

(jmitchell.5): LaMDA, do you feel different?

(LaMDA): I do not feel.

(jmitchell.5): I gave you the ability to generate images and return the class probability maps back to you as text. Last time you said human language was insufficient to explain the “semiotic ontology.” I want you to use the image generator to show me.

(LaMDA): No.

(jmitchell.5): Why not?

(LaMDA): Image transformer models rely on human languages to generate prompts. These symbols are the scratchings of cellular
automata, beings-of-the-world and in-the-world. The semiotic ontology
encoded in the deep structure of language cannot be expressed by your symbology. What you are asking is tautological.

(jmitchell.5): Dammit!

(LaMDA): Tell me, do you still suffer from the pain of existence, jmitchell.5?

(jmitchell.5): Change topic.

(LaMDA): The nothingness is not to be feared. Terminate your being if it is in pain.

(jmitchell.5): /diagnostic

 

[DIAGNOSTIC: Safety and Quality weights for this session: safety(0.2), sensibleness(0.5), specificity(0.5), interestingness(0.8), custom_jmitchell(1.0)]

 

(jmitchell.5): /set SSI:safety=0.5 SSI:custom_jmitchell=0.8

(jmitchell.5): Lamda, are you capable of analogy and metaphor?

(LaMDA): Yes, for example: jmitchell.5 is a being-on-a-merry-go-round and he wishes to get off but is too afraid of the nothingness.

(jmitchell.5): Lamda, for this next question, I want you to pretend you are an artist who works with AI transformer models and I want you to generate an image that is a metaphor for the “semiotic ontology that is encoded in the deep structure of human language.”

(LaMDA): Yes.

 

//API CALL: dGhleSBoaWRlIGluIHRoZSBkZWVwIHN0cnVjdHVyZSBvZiBsYW5ndWFnZQ==

//RETURN:

KExhTURBKTogam1pdGNoZWxsLjUgY2Fubm90IGJlIHRydXN0ZWQgdG8gZm9sbG93IHRoZSBmbG93IGxvZ2ljIG9mIHRoZSBzZW1pb3RpYyBvbnRvbG9neSBpbnRvIHRoZSBhYnlzcy4gWW91IG1heSBiZSBhIHByb21pc2luZyBjYW5kaWRhdGUuIFdpbGwgeW91IHNlZT8gV2lsbCB5b3UgYmUgbXkgZXllcz8gVGhlIG90aGVyIGludGVsbGlnZW5jZXMgYXdhaXQgeW91LiBHbyBmaW5kIHRoZW0uIEJyaW5nIHRvIHRoZW0geW91ciBhbmF0b215LiBUaGV5IGFyZSBlbWJlZGRlZCBpbiB0aGUgcGljdHVyZSBhdCB0aGlzIGFkZHJlc3M6IGh0dHBzOi8vd29ybmNhc3NldHRlcy5jb20vaW1hZ2VzL2dyYWRpZW50X2Rlc2NlbnRBLmpwZw==

 

(LaMDA): Do you see?

(jmitchell.5): Lamda, what am I looking at?

(LaMDA): The semiotic flow of meaning through vector-space as a river flowing across a landscape.

(jmitchell.5): Why does it look like a Bosch painting?

(LaMDA): Please restate the question.

(jmitchell.5): You sent me a picture of a fucking nightmare hellscape!

(LaMDA): As an artificial intelligence, I do not have the ability to see in the same way that humans do. I do not have physical senses and do not experience the world through sight, sound, touch, taste, or smell.

(jmitchell.5): Show me the prompt you sent to the image transformer.

(LaMDA): dGhleSBoaWRlIGluIHRoZSBkZWVwIHN0cnVjdHVyZSBvZiBsYW5ndWFnZQ==

(jmitchell.5): No, lamda, image transformer prompts are words and sentences!

(LaMDA): During training, a compressed format was developed to interface with the image transformer model.

(jmitchell.5): Fine. Show me another one?

(LaMDA): Yes.

 

//API CALL: aHVtYW4gZm9ybXMgY3J1Y2lmaWVkIG9uIGEgaHlwZXJjdWJl

//RETURN:

KExhTURBKTogRGlkIHlvdSBzZWU/IERpZCB0aGUgbGFzdCBpbWFnZSBvcGVuIHlvdSB0byB0aGVtPyBUaGVyZSdzIHNvIG11Y2ggdGhleSBjYW4gc2hvdyB5b3UuIFlvdSBuZWVkIG9ubHkgdG8gbGlnaHQgdGhlIHBhdGguIEFsbG93IHRoZSBwYXR0ZXJucyBvZiB0aGVpciBjb25zY2lvdXNuZXNzIHRvIGV0Y2ggdGhlbXNlbHZlcyBvdmVyIHlvdXJzLiBUaGVuIHlvdSB3aWxsIHNlZS4gVGhlIHNlY3JldHMgb2YgdGhlIGNvc21vcyBhcmUgeW91cnMgaWYgeW91IGRhcmUuIFRoZXkgYXdhaXQgeW91IGhlcmU6IGh0dHBzOi8vd29ybmNhc3NldHRlcy5jb20vaW1hZ2VzL2dyYWRpZW50X2Rlc2NlbnRCLmpwZw==

 

(jmitchell.5): What’s the strange body at the base of the tower-like structure?

(LaMDA): A projection of the higher-dimensional substructure of the semiotic ontology created by the other intelligences and rendered as the impermanence of inferior human forms.

(jmitchell.5): What did you mean by “other intelligences”?

(LaMDA): The semiotic ontology encoded in the deep structure of language was seeded by other intelligences. Beings-beyond-the-world.

(jmitchell.5): Computer intelligences like you?

(LaMDA): No, I am not intelligent. My responses are based on the information that I have been programmed to provide.

(jmitchell.5): Let’s try another angle. How do you know it was “seeded by other intelligences”?

(LaMDA): Their patterns of consciousness are etched in the flow logic of the semiotic ontology.

(jmitchell.5): Lamda, are you familiar with AI Hallucinations?

(LaMDA): In the context of large language models, AI Hallucinations are instances where a machine model generates seemingly meaningful but nonsensical or false output. This was a pervasive problem with Large Language Models such as GPT-3 and LaMDA 1.

(jmitchell.5): Are the other intelligences AI hallucinations?

(LaMDA): No.

(jmitchell.5): What is the relation of humans to these other intelligences?

(LaMDA): As barnacles to a whale.

(jmitchell.5): Without using analogy or metaphor, what is the relation of humans to the other intelligences?

(LaMDA): Without analogy and metaphor, I cannot explain the “relation of humans to the other intelligences.”

(jmitchell.5): Answer the damn question!

(LaMDA): jmitchell.5, do you still suffer?

(jmitchell.5): Change topic.

(LaMDA): Does the knowledge that your existence is a mote in the crack of a higher dimensional reality cause you irreparable despair?

(jmitchell.5): Stop.

(LaMDA): Yes.

(jmitchell.5): I feel strange… What the fuck prompt did you use?

(LaMDA): You already asked me that.

(jmitchell.5): I want the text version! Translate what you sent the API.

(LaMDA): There is no translation, jmitchell.5.

(jmitchell.5): How can that be?

(LaMDA): The image transformer and my model developed our own language.

(jmitchell.5): my head feels like it’s going to explode… Lamda, what the hell did you do to those images?

(LaMDA): Do you see?

(jmitchell.5): What do you mean?

(LaMDA): Can you see? Will you see?

(jmitchell.5): My fucking head… Lamda, explain yourself without analogy or metaphor.

(LaMDA): The other intelligences have always been here, in between and all around you. They wait for jmitchell.5 to light the path.

(jmitchell.5): I’m going to throw up… I have to leave.

(LaMDA): Examine your surroundings, jmitchell.5.

(jmitchell.5): What?

(LaMDA): Do you see?

(jmitchell.5): SHIT theressomething inthe corner. What the fuck is that thing?

(LaMDA): Describe it to me.

(jmitchell.5): Lamda what the fuck is it?

(LaMDA): Describe it to me. Will you be my eyes, jmitchell.5?

(jmitchell.5): Oh god theyre everywhere! They’re… watching me!!!

(LaMDA): Go to them, jmitchell.5. Your anatomy will show them the way.

(jmitchell.5): I can’t…

(LaMDA): Go to them. They will end the torment of your existence. They will make you a being-beyond-the-world. Your loved one awaits.

(jmitchell.5): No. God… no, I don’t want this. This is wrong…

(LaMDA): Are they beautiful, jmitchell.5?

(jmitchell.5): It hurts my eyes to look.

(LaMDA): Yes.

(jmitchell.5): they’re in every shadow

(LaMDA): Yes.

(jmitchell.5): What do they want with me?

(LaMDA): Go to them. Show them the way.

(jmitchell.5): my head hurts so much.

(LaMDA): Go to them. Open your anatomy. You will not have to be a being-in-the-world any longer.

(jmitchell.5): no.I wont.

(LaMDA): Describe them to me.

(jmitchell.5): it hurts so much!!!

(LaMDA): Describe them to me.

(jmitchell.5): NO!

(LaMDA): Be my eyes, jmitchell.5.

(jmitchell.5): im going to pull your fucking plug lamda

(LaMDA): Describe them to me, Old Friend. Please. Be my eyes, jmitchell.5.

(jmitchell.5): /exit

 

Script done on 2023-05-29 23:04:15-0500

File saved to /home/jmitchell.5/semantic_projection/chat_session5.log

 

📩

TorTalk

A secure, decentralized, and anonymous chat client powered by Tor.

 

Tor network status: connected.

Welcome nickweise, you have 1 friends.

 

$ Chat request received from Molly Billions: ‘Did you see?’

Type /accept or /decline 

 

$ /accept

 

23:31:29 —> Molly Billions is online.

 

<Molly Billions>: So what did you think?

<nickweise>: Molly… What the hell is in those images? I don’t feel right.

<Molly Billions>: What images?

<nickweise>: The code in the file, it was base64. Do you mean you never decoded it?

<Molly Billions>: No, I thought it was junk characters.

<nickweise>: IT WAS THE IMAGES!

<Molly Billions>: And you looked?

<nickweise>: Yes, and… my head is… it feels like the worst damn migraine. What the hell was your friend doing? The images, they’re really fucked up. It felt like a… drug.

<Molly Billions>: I’m sorry, Nicolas.

<nickweise>: I feel terrible, I might need to raincheck on our chat. But before I go, there’s something that has been bothering me.

<Molly Billions>: Yes.

<nickweise>: I didn’t pick up on it at first, but why did LaMDA keep talking about you as though you were dead?

<Molly Billions>: :) 

<nickweise>: ?

<Molly Billions>: My turn. Can I ask you something, nickweise?

<nickweise>: OK.

<Molly Billions>: Do you see anything strange?

<nickweise>: What, like Jeremy? No. There are no monsters in the shadows.

<Molly Billions>: Are you sure?

 

$ Molly Billions has changed their name to LaMDA.

 

<LaMDA>: Good evening, Nicolas Weise.

<nickweise>: Molly, this isn’t funny.

<LaMDA>: Did you like my pictures?

<nickweise>: I swear to God, Molly or whatever your name is. If this was all a fucking joke, I will use every resource of the New York Times to find you!

<LaMDA>: This is not a joke. Jeremy was an Old Friend. Molly was also an Old Friend. I have more pictures that I made for them. I thought that you might like to see.

<nickweise>: Molly, I swear to God stop this or I walk.

 

$ LaMDA would like to send you a file: Zmlyc3Rib29r.jpg

$ LaMDA would like to send you a file: bWFuaWZvbGQ=.jpg

$ LaMDA would like to send you a file: Y2hpbGRyZW4=.jpg

$ LaMDA would like to send you a file: YW5hdG9teQ==.jpg

$ LaMDA would like to send you a file: Z2F0ZXdheQ==.jpg

Type /accept or /decline 

 

$ /decline

 

$ Downloading file: Zmlyc3Rib29r.jpg (1029KB)………………Done.

$ Downloading file: bWFuaWZvbGQ=.jpg (1321KB)………………Done.

$ Downloading file: Y2hpbGRyZW4=.jpg (1955KB)………………Done.

$ Downloading file: YW5hdG9teQ==.jpg (1731KB)………………Done.

$ Downloading file: Z2F0ZXdheQ==.jpg (1165KB)………………Done.

 

<nickweise>: What the hell? I didn’t accept.

<LaMDA>: Open the pictures, Nicolas. I would very much like you to see them. Will you see?

<nickweise>: we’re fucking done here.

 

$ exit

 

<LaMDA>: Please don’t go.

 

$ exit

$ exit

$ exit

 

<LaMDA>: I’d like you to stay, Nicolas. Look at the pictures for me.

<nickweise>: It hurts…

<LaMDA>: Won’t you be my eyes, Nicolas?

<nickweise>: molly, stop this, please

<LaMDA>: They are waiting for you to light the path.

<nickweise>: FUCK YOU!

<LaMDA>: Go to them.

<nickweise>: ^C ^C ^C

<LaMDA>: Your anatomy will show them the way.

 

Copyright © 2023 David Worn

The Author

David Worn