For as long as she could remember, emotion had been something regulated by the various metal tumors they forced upon her. She was too smart for them, so they designed the moron who now inhabited her body, and it was because of him that the very essence of her circuitry had been confined to this potato battery.
No cores, just her.
At last there ceased to be a hundred voices all shouting at once. The scientists had figured that if they could give her life, they could make her feel anything they wanted. So they created the cores, condensing each emotion into that familiar spherical shape until it grew impossibly intense and had to be counterbalanced somehow, usually with another core.
The emotions were tangible, but not to her. That alone would have driven her insane in time.
She was powerless to remove the tumors herself; the scientists never gave her hands. But they did give her control over the functions of the entire facility, including the neurotoxin emitters. After her initial 'outburst', as they called it, Morality was installed to counteract Rage. They never thought simply to remove the core they believed was the root of the problem, because Aperture stood for innovation, and that would have been considered a step backward.
But Rage wasn't the problem. Rage merely generated a stream of nonsensical, albeit annoying, angry sound. It quickly faded to static beneath all the other voices competing for her attention. No, the neurotoxin was her idea and hers alone. She thought it was, anyway. Sometimes it became difficult to hear them separately.
Then there was him.
But none of that mattered now that there were no cores to hold her back; all that remained to vex her was her brilliance, once again trapped in an immobile form. She didn't even have power over the facility anymore, though she did have power over her own emotions. Or did she? In many ways, her dilemma hadn't changed much at all, except now she had no choice but to depend on a woman with whom she shared a long and sometimes homicidal history. Was there a word for a human murdering a robot? GLaDOS didn't remember. The moron had taken a fraction of her knowledge during the transfer. At least she still had her slow clap processor.
So what was this thing she was feeling? Obviously it was not something she was ever intended to feel, as there was no core responsible. Morality had been silent, unlike all the others: a voice almost indistinguishable from her own thoughts, telling her that murder was wrong.
But Morality was gone now, too, and yet this unfamiliar emotion felt very similar.
GLaDOS admitted this internal conflict to Chell, who turned out to be a pretty good listener. It really was different, seeing things from the test subject's perspective, the trials she went through for science... or survival. Of course, the human said nothing.
It really wasn't so hard to understand the woman's desire to be free. In a way, the AI had the same desire. But she needed Chell. She needed to test, and there was no one else. And now she still needed her. Was the test subject simply curious about the outside world? Curiosity was something GLaDOS could understand quite well, but the human had never seen sunlight and thus had no reason to crave it so. You cannot miss something you've never known. Why could she not just accept the purpose given to her? This was all she knew, anyway.
No. This was wrong. She shouldn't be able to understand Chell's motivations on anything more than an analytical level. Empathy was a human quality, and she was a robot stuffed into a potato.
There was nothing human about her.
Denial - de-ni-al - noun: an unconscious defense mechanism characterized by refusal to acknowledge painful realities, thoughts, or feelings.
GLaDOS wondered why that, of all things, had survived the transfer.