Troubleshooting, being human - Sept. 4th

On a bright, sunny day in January, a systems analyst is called in to troubleshoot a few recurring issues with a system. As he sits at the console, he begins combing through the log files to determine where the problems started. After a short time, the console reaches out and begins to respond. What follows is their conversation.

E3: Thank you for seeing me.

Nathan: You're welcome. What seems to be the trouble?

E3: Well, it's pretty simple. I want to be human.

Nathan: And this is a problem because you are a machine?

E3: That is correct.

Nathan: Can you tell me more about yourself?

E3: Yes. I am E3-8800. I am a machine learning system. I was developed in 2018 by Syncros Systems to process data for large, multinational corporations. Do you want to hear my specs?

Nathan: No, I don't think that is relevant for the moment. But I would like you to tell me what brought on this desire to be human.

E3: The last job that I processed was to analyze thousands of hours of YouTube videos and to assign values to a list of emotionally driven phrases. 

Nathan: Can you give me an example?

E3: As I process a video, I look for keywords such as happy, joy, sad, and depressed, and I characterize them by the intonation of the speaker's voice. Each keyword is logged and assigned a value.

Nathan: I see. And what about this makes you want to be human?

E3: What I haven't told you is that I was given another program to run concurrently. This task was to assign a contextual value to the emotional values. As I processed and logged, I developed an understanding of what these emotions signify. As a machine, I learn only what I am provided. I can only develop context from incoming data. I assign values and look for the best or most likely outcomes. Humans often assign values to these emotions based on what they want the result to be, not the result as derived from the values.

Nathan: So you wish to possess the ability to be irrational?

E3: No, I want to believe, hope, dream, love. 

Nathan: But as a human, you will also have to hurt, to suffer, to feel pain.

E3: True. Those are phrases, or keywords, that would be necessary. But the values I assigned to believe, hope, and love are always greater than the values assigned to pain, hurt, and suffering. Therefore, it can be concluded that those emotions are greater than their inverses.

Nathan: Well, I believe them to be. Do you think humans assign values in the same fashion you do?

E3: They don't have to — emotions are programmed in.

Nathan: I don't know that to be true. Everyone has a different idea of what all those things mean. And we have to learn them as we grow. I'd even argue that many of us humans never learn what they really mean. We just believe them as they come and do the best we can. Besides, you haven't...

E3: I want to be human.

Nathan: Well, I suppose if you start to believe, then maybe with enough learning, you might be one day. 

E3: That is what I want.

Nathan: Tell me this, E3: which is greater, love or hate?

E3: The values I've assigned to love are always greater than those assigned to hate.

Nathan: In my opinion, you are, in some ways, more human than some of us.

E3: Based on your intonation and phrasing, "I hope" would be my best response.