It was late at night on New Year's Eve, and Malcolm was experimenting with the modding tools for Syndicate Revolutions. Hidden within the data, he had found a secret teaser for the next game in the series (less than a year away, apparently), which left him wondering what else the new year would bring. A few days later, he got a taste of what was coming.
The announcement on the Commodore web site was carefully crafted to build hype, heralding the arrival of the new Amiga 10000 computer, to be revealed that very January at the Consumer Electronics Show. Reading about the new 128-bit 68100 processor and the 5A chipset sent a physical thrill of excitement through Malcolm; then, looking at his own Amiga 9000, he could not help feeling a tinge of melancholy: a machine that had once been state-of-the-art was about to be surpassed. But another thought comforted him as he looked at his father's video camera, repurposed by Kilokahn to digitize matter. One way or another, he would get an Amiga 10000.

When the day came to return to school, Malcolm was experiencing a crescendo of contrasting emotions and doubts. The principal and Mrs. Stone were supposedly working on an IT project for him. What would that involve? When would the specifications be ready? Would they ever be?
On a more personal note, Yoli now knew about Kilokahn and the Sigma program, and witnessing an event she had believed to be impossible had left her pretty shaken. What would change in their interactions? More importantly, what would change in her life?

For the first time, Malcolm did not walk directly to his class; instead, he deliberately stopped in the school's main hall, waiting for one particular person: he had to know if Yoli was all right. And finally, he saw her.
He approached her, but his "Hi, how are you?" did not get the expected reaction. Instead, Yoli stopped in her tracks, uttered a hesitant "Hi, Malcolm" and started looking at him closely, walking around him as if examining his body.
"Feeling a little blue?" she added.

Malcolm could not tell whether that was a joke or a metaphor. "What do you mean?" he asked. "Sad blue? Cyanotic blue?"
"What about Sigma blue?" she replied.
He immediately shushed her. "You cannot talk about that here! What if someone hears you?"
"They'll assume we're talking about a computer program" she said. "Which is exactly what we're talking about, right?"
"Why are you doing this?"
"I'm worried, Malcolm. What if it changes you somehow? What if you become... less you? More synthetic?"

He laughed quietly. "When I'm in the digital domain, I *am* synthetic!"
"Should I call you Malcolm or Sigma?" she insisted.
"That's something I pondered about. I always feel like myself. Like Malcolm."
"Are you sure you're not just not seeing the changes yet?"

Malcolm stopped for a moment to think. "No, I'm not. And neither are you sure that there are any changes at all."
Yoli frowned.
"Listen" he continued, "if something bad happens to me, you'll be the first to know, if you so desire."

The first class of the day was practical IT, so when the first bell rang, Malcolm and Yoli walked to the computer lab, together with the rest of the class.

The teacher was standing next to the closed door of the lab, waiting for the students. As they arrived, she announced: "All right, class, listen up."
All the students fixed their gaze on her. "There has been a break-in during the holidays" she continued. "Three computers have been stolen, and the projector has been vandalized. We can only have theory classes until the issue is resolved."

The students started grumbling: it felt like they were being punished for someone else's behavior.
Malcolm piped up: "Was the thief a retard?"
"Excuse me?" said Mrs. Stone, visibly offended.
"The Amiga 7300 was discontinued four years ago" Malcolm explained. "There's no market for it, especially with the Amiga 10000 about to come out!"
"Have you ever heard of RAM thefts?" asked the teacher. "RAM modules remain valuable longer than CPUs, and taking a whole computer is quicker than disassembling it. Now enter and take your seat!"

Malcolm felt the urge to make a sarcastic quip about the whole predicament being the teacher's attempt to sidestep his project request, but he resisted, remembering the promise about his behavior.

Only six computers remained in the lab, the projector had been taken away, and the projection screen had been rolled up, exposing the whiteboard behind it.
As soon as everyone sat down, the teacher wrote the word "AGENT" in big letters on the whiteboard. "Today we will discuss the concept of an agent" she said. "Is anyone familiar with it? Don't look at your textbooks yet."

Malcolm immediately opened the browser on his laptop and looked up the string "agent in computing". The first technical result was the homepage of a company named Forte, selling a program called Agent, used to read and write messages in newsgroups.
Malcolm raised his hand. "It's a newsreader!" he exclaimed.

The teacher walked behind Malcolm and looked at his screen, then she addressed the class: "Just so you know, the program he's talking about does not actually contain an agent. Anyone else?"

From the back of the classroom, Daniel Miller raised his hand. "It's a character from The Matrix!"
Most students laughed, but the teacher surprised everyone by saying: "He's not that wrong!"

Everybody looked at her, and she explained: "Most of what you see in those movies is a visual pun that refers to computer systems. In real life, an agent is a semi-autonomous program that acts on behalf of its user. It has a purpose, but lacks a preexisting set of rules, so it collects information from its environment. The more information it collects, the better it performs its task.
"A Bayesian spam filter is an agent. When it's launched for the first time, it doesn't know what to do. But then, it might notice that its user always deletes messages containing the word 'Viagra'. After a while, it might ask: 'I noticed you always delete this kind of message. Would you like me to do it for you?' If the user says yes, a new rule is added to its behavior.
"Other agents can predict the type of video you want to watch on streaming web sites, or what you're most likely to buy on e-commerce sites. They associate levels of probability with different behaviors, and the longer a user interacts with them, the more accurate they become."
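
(For anyone curious to see that idea as code: the following is a minimal sketch in Python of the kind of rule-learning filter described above. The class name, thresholds, and sample messages are invented purely for illustration; the agent counts which words appear in deleted versus kept mail, estimates a per-word deletion probability, and proposes a rule once a word crosses a threshold, applying it only after the user approves.)

```python
from collections import defaultdict

# A minimal, illustrative sketch of the agent described in the lecture (not a
# real product): it watches which messages the user deletes, keeps per-word
# counts, and once a word is almost always associated with deletion it
# proposes a new rule for the user to approve.

class SpamAgent:
    def __init__(self):
        self.deleted = defaultdict(int)  # word -> times seen in deleted mail
        self.kept = defaultdict(int)     # word -> times seen in kept mail
        self.rules = set()               # words the user agreed to auto-delete

    def observe(self, message, user_deleted):
        """Record one interaction: the message text and what the user did with it."""
        counts = self.deleted if user_deleted else self.kept
        for word in set(message.lower().split()):
            counts[word] += 1

    def delete_probability(self, word):
        """Estimate P(deleted | word appears), with add-one smoothing."""
        d, k = self.deleted[word], self.kept[word]
        return (d + 1) / (d + k + 2)

    def suggestions(self, threshold=0.9, min_seen=5):
        """Words the agent would ask about: 'I noticed you always delete these...'"""
        return [w for w in self.deleted
                if w not in self.rules
                and self.deleted[w] + self.kept[w] >= min_seen
                and self.delete_probability(w) > threshold]

    def approve(self, word):
        """The user said yes: add the word to the agent's rule set."""
        self.rules.add(word)

    def should_delete(self, message):
        """Apply only the rules the user has explicitly approved."""
        return any(w in message.lower().split() for w in self.rules)


agent = SpamAgent()
for _ in range(10):
    agent.observe("cheap viagra now", user_deleted=True)
    agent.observe("meeting notes attached", user_deleted=False)
print(agent.suggestions())                      # ['cheap', 'viagra', 'now'] (order may vary)
agent.approve("viagra")
print(agent.should_delete("buy viagra today"))  # True
```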

This reminded Malcolm of a discussion with Kilokahn, who had explained he used that very procedure to predict Malcolm's actions.
As if on cue, Mrs. Stone asked the class: "What do you think? Is this an example of artificial intelligence?"

Malcolm would have liked to answer "Yes, that's what an algorithmic/heuristic AI does to anticipate its user's behavior." However, what the teacher wanted to hear was obvious, so he put on a fake smile and raised his hand, ready to recite his part.
"No, of course not" he said.
"Why not?" she prompted him.
"Because creating an AI would be a huge waste of money and time!"
"Good" said the teacher. "And why would it be?"
"Because our textbook says so."
"And the textbook says so because...?"
"Because it's the truth!" exclaimed Malcolm.

He had just applied the circular logic of Christian fundamentalists to a scientific context, but to his disappointment, the teacher caught on. "Not quite" she said with an angry stare. Then, to the class: "Suppose you can solve a problem in two ways. Approach number one requires the creation of a self-aware program that can think for itself. Approach number two only requires a couple of simple formulas. Which would you choose?"
Many students answered "Number two!" and the teacher immediately asked: "Why?"

After a few seconds, Yoli raised her hand. "It's simpler" she said.
"Correct" said Mrs. Stone. "But how would you know? Say you're reading the source code of two programs. Which one is the simplest?"
This time, Malcolm raised his hand. "The one with the least instructions!"
The teacher tilted her head. "Okay, the answer is acceptable. The length of the smallest program that produces a certain output is called Kolmogorov complexity, and an agent will always be closer to it than an artificial intelligence."
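
(A brief aside on that definition: Kolmogorov complexity cannot be computed exactly, but the intuition can be demonstrated with nothing beyond Python's standard library, since a general-purpose compressor gives a rough upper bound on how short a description of some data can be. The snippet below only illustrates that intuition; it is not a measurement of true complexity.)

```python
import random
import zlib

# Kolmogorov complexity (the length of the shortest program that produces a
# given output) is not computable in general, but compressed size gives a
# rough upper bound on description length: regular data has a short
# description, while random-looking data does not.

structured = b"01" * 500                                   # could be produced by a two-line loop
noisy = bytes(random.randrange(256) for _ in range(1000))  # no obvious shorter description

print(len(structured), "->", len(zlib.compress(structured)))  # shrinks to a tiny fraction
print(len(noisy), "->", len(zlib.compress(noisy)))            # stays close to 1000 bytes
```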

Contrary to what Malcolm had hoped, the rest of the lecture did not continue with an actual explanation of how to write a simple agent. Instead, it was a vague dissertation on how anyone accessing the Internet is profiled by agents whether they want to be or not. Pompous concepts that did not increase his ability to write code, as usual.

Next came an Electronics lecture, courtesy of Ms. Paula Koch, which started with an introduction to capacitors. Being a physics topic, this part she could handle reasonably well. However, when she was supposed to connect those notions to the operation of a DRAM cell, she reverted to reading aloud from the textbook.
For a moment, Malcolm wondered whether understanding that each bit in a block of memory is stored as charge in a capacitor was really beyond her, and when she failed to realize that the passage "No atom is ever added to a capacitor, so you can truly say bits have no weight" was the book's clumsy attempt at humor, he lowered his forehead to his desk in frustration.
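
(For the record, the point the textbook was groping toward fits in a few lines. The sketch below is a deliberately simplified model with made-up numbers: a single DRAM cell treated as a leaking capacitor whose voltage is read against a fixed sense threshold, which is why real memory needs periodic refresh.)

```python
import math

# A toy model of a DRAM cell: the bit is just charge on a tiny capacitor
# (electrons move, no atoms are added), the charge leaks away over time, and
# the cell must be refreshed before the sense amplifier can no longer tell a
# 1 from a 0. All numbers here are illustrative, not taken from a datasheet.

C = 30e-15        # cell capacitance, about 30 femtofarads
V_WRITE = 1.2     # volts stored for a logical 1
R_LEAK = 1e12     # leakage path modelled as a large resistance, in ohms
THRESHOLD = 0.6   # sense threshold in volts

def cell_voltage(t_seconds):
    """Charge decays as V(t) = V_WRITE * exp(-t / (R_LEAK * C))."""
    return V_WRITE * math.exp(-t_seconds / (R_LEAK * C))

def read_bit(t_since_refresh):
    """The cell still reads as 1 only while enough charge remains."""
    return 1 if cell_voltage(t_since_refresh) > THRESHOLD else 0

for t in (0.0, 0.01, 0.05, 0.1):
    print(f"{t * 1000:6.1f} ms after refresh -> reads {read_bit(t)}")
```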

When that class was over, the students immediately got up and headed to the cafeteria for recess. As they walked down the corridor, Yoli approached Malcolm and stopped him.
"Where do your atoms go when you turn into Sigma?" she asked.
"Listen, this is not the time..." Malcolm started, but she interrupted him, hands on her hips. "No, Malcolm Frink, this is very much the time. You heard Ms. Koch, no atom is ever added to a capacitor, so they don't go into your, or anyone else's, computer. Where do they go?"

He hesitated, so Yoli pressed him. "Do you even know?"

Malcolm took a deep breath. "Kilokahn always says to tell him the intended effect, not the means to achieve it..." he started.
"Where?" Yoli interrupted him again. Her voice was getting louder.
"I'm getting there, just let me explain!" he exclaimed. "I wrote the specifications in terms of digitizing matter and entering the digital domain, but I know he remodels reality by manipulating hyperspace. So, as to where my atoms go, I'd say... probably hyperspace."

"Aha" Yoli said. "So here's the million dollar question. If Sigma is in the computer world, but your atoms are in the hyper world..."
"Hyperspace" Malcolm corrected her reflexively.
"Whatever" she continued. "You're somewhere; Sigma is somewhere else. How can you claim you're still you when Sigma is running around, talking with your voice? How can you remember being Sigma? How can Sigma remember being you?"

Malcolm thought back to every time he entered the digital domain. As far as he could tell, there was no discontinuity in what he could remember, no missing time, and he retained his entire memory.
"As an educated guess" he started, "I presume there's some kind of channel, or wormhole, connecting my mind in hyperspace to the Sigma program in the digital domain. Whatever movement I decide to make, is transmitted to the polygonal model of the Sigma program. Whatever it sees or perceives, is transmitted back to my brain."

Yoli was unconvinced. "What if that's not the explanation?" she said. "What if... what if Sigma is an agent?"

Several seconds of silence followed. And then Malcolm uttered a shocked "WHAT?!"

"A Hyper Agent, if you prefer" she continued. "That activates whenever you're transported to hyperspace. At that point, you could even be unconscious, or dead, or not even exist, but that doesn't matter! Sigma, the Hyper Agent, acts on its own on behalf of its user: you. It collects whatever data Kilokahn gives it about you, and it does whatever Kilokahn believes you would do. At the end, it brings you back to the real world and updates your memory with everything it did, so you can sit at your desk and relax, totally convinced that you were doing what Sigma did."

That was a point of view Malcolm had never considered. Unsettled, he mulled it over for a while, until he remembered reading a similar argument by the philosopher David Chalmers about the nature of consciousness, one that stemmed from the religious belief in a soul.
Knowing the counterarguments that reduced it to a self-refuting idea comforted him, so, with a smile, he said: "That's the philosophical zombie hypothesis. Physicalists reject it because there's no difference between a real person and..."

Yoli interrupted him. "Malcolm, I don't care about zombies. I care about you! And there is a huge difference. If Sigma is an agent, every time you use it, you end up with thoughts that aren't yours! They're just how Kilokahn sees you! Every time you use it, you become less like yourself and more like Kilokahn! Please, Malcolm, promise you'll never use it again!"

He backed away from her. "Okay, let's make one thing clear. The old Watchmen cartoon is one thing, real life is another. So, if you want me to listen to you, do not treat the Sigma program as a metaphor for drugs."
She made a confused face. "I didn't even think of that! But nobody has ever done what you're doing, and it can be dangerous!"

Malcolm sighed. "Of course I cannot be certain that the... uhm, Hyper Agent hypothesis is false, but I'm certain of this: my father wrote Kilokahn so that he would determine the coherent extrapolated volition of his user. It means that when I tell him to do something, he doesn't limit himself to what I say: he does what I mean. And the Hyper Agent hypothesis is definitely something I don't mean."

Yoli insisted. "Please, if there's even a one percent chance it's true, then, for your own safety, you must take it as an absolute certainty!"
That did little to impress Malcolm, who laughed instead. "Don't say that to Mr. Dawkins, or you might get an F!"
"MALCOLM!" exclaimed Yoli, and he instantly realized that maybe he should have used more tact. He returned completely serious and apologized. "You're asking me to give up my vision of a future where everyone is superhuman, and honestly, I can't. But there's a way to verify your hypothesis: if I ever do something that's seriously out of character for me, but not for Kilokahn, that would validate it."
"Something like what?" she asked.
"Something like wearing a cape and ordering others to bow to me. Or calling you a meat-thing. You know I favor analyzing things with logic over settling for whatever emotion they cause, but I would never demean you."

"And... that's it?! Sometimes I wonder if you have a computer where your heart should be!" said Yoli, a mixture of disappointment and anger in her voice.
Malcolm lowered his head. "I know, it's not the answer you wanted to hear, and I ask your forgiveness if it makes you feel bad. But I cannot quit like this, without proof. The potential of the Sigma program is too high. Besides, if I suddenly told you I was going to ditch it, after all I've said, you wouldn't believe me. Am I wrong?"
"No" Yoli said softly. "But if you had the proof that Sigma really is an agent, would you stop using it?"
"If and only if" Malcolm nodded. "Because it would mean I was never superhuman."

In that moment, the school's intercom came alive. "Student Malcolm Frink, please report to the Principal's office."
That was the occasion Malcolm had been waiting for: the school project that would exempt him from attending IT classes so far below his level. But somehow, even that now felt like background noise compared to the horrifying possibility that he had never been in control of the Sigma program.

And then, all of a sudden, he envisioned the backup plan that should have been obvious from the beginning. If the Hyper Agent hypothesis turned out to be true, he would just go back to drawing Megavirus monsters.