To the extent that an android could feel such a thing, Data was troubled by the meeting, and its outcome.

He returned to engineering with Geordi, to work on the question of how best to destroy the three moons of Bre'el III without destroying the planet. Q was already there, and had begun work, with Wesley's assistance. When Geordi attempted to make an overture to Q, saying, "Hey, uh, I'm sorry-" Q cut him off in a cold tone of voice, saying, "I'm not interested in your apologies, LaForge. We've got work to do and not much time to do it in." Geordi went rigid in response, a sign that he was experiencing anger, and thereafter neither of them made any reference to the meeting at all.

Data participated in the project, devoting as many of the resources of his positronic net as the work required, but the truth was that his brain processed at such speed that no physical activity he could engage in would ever utilize all the resources of his mind, and in a collaborative project where he required the input of humans as he worked, that left him a considerable amount of spare processing power. As he worked with Geordi, Wesley and Q, Data considered why it was that this situation was troubling him.

When he had been nearly killed by the Calamarain, he had overheard Geordi conversing with Commander Riker, about the status of the Bre'el IV moon. Riker had pointed out that the Calamarain would come back for Q again once the moon again reached perigee, and Geordi had said, "Commander, he's not worth it." From a strictly utilitarian perspective, this was obviously true - Q's life was not worth the lives of everyone on Bre'el IV - but, although he had not had a chance to consult with Geordi to clarify his understanding of his friend's meaning yet, Data believed that that was not actually what Geordi meant... or at least, only part of his meaning. When Data had heard Geordi say those words, he had believed that what Geordi had meant included the concept that Q's life was not worth Data's life... that the fact that Data had been critically injured, and could have died, in trying to save Q had made Geordi feel negatively toward the concept of saving Q at all. This was understandable. Geordi's friend had nearly died in saving someone he did not like very much. Although Data did not experience emotion as humans understood it, he knew from the sense of loss he had experienced when Tasha had died, and Captain Picard's assurance to him that the human sensation of grief was similar, that he, too, would prefer not to perform an action that risked the life of a friend for the sake of a person he was not friendly with.

However, while Geordi's perspective was understandable, it was wrong.

Data did not undertake actions impulsively. He could not. His body moved much more slowly, even at his top speed, than his brain. Any action he performed, he had carefully thought through its impact and consequences before undertaking it. He had understood that there was a possibility that saving Q's life could end his own, but the calculation he had prioritized in making the decision was not the chance of his death but the chance of his success - the odds that, if he undertook to anchor Q so the Calamarain could not fling him to his death, crush him against a bulkhead, or drop him in the warp core, he would retain sufficient control over his artificial musculature, despite the disruption the Calamarain's ionized energy field would likely cause to his positronic net, to prevent Q's death before Geordi got the shields back up, regardless of whether doing so caused him irreparable damage. And this had not been an altruistic or selfless action on Data's part; in fact Data did not believe that any action he himself ever undertook was truly selfless. He had acted to spare himself... perhaps not pain, not as humans understood it... but discomfort.

Data's own motives were transparent to him, because he was able to perceive his own programming. He knew the exact algorithms that were causing him discomfort. It was less clear to him what Dr. Soong's motivations had been in giving him any particular set of programming, but in general he tended to assume that his father had known what he was doing, and usually, his studies of human philosophy and moral theory indicated to him that in fact his father had put considerable thought and research into structuring his ethical programs. The problem was that if Data stood by and allowed a sentient being to die, when that sentient being was not an actively threatening hostile whose death would aid immediately in Data's own defense or the defense of others, and when Data had a clear means of saving that sentient being, it would violate the tenets of his ethical program, which would create profound discomfort for him. And unlike humans, Data's memory was perfect and his lifespan was theoretically unbounded. If he suffered discomfort for a violation of his ethical programming today, he would continue to experience that exact discomfort, unabated by the passage of time, any time he thought of the experience, for the rest of what might well be a very long life. Had he stood by and allowed Q to die, he would have suffered the most intense unpleasant sensation he was actually capable of experiencing for doing so... a sensation that was probably much less profoundly painful than the human emotion of guilt, but on the other hand, as Data could not experience human pain or human emotions, the quasi-emotions he did feel were extremely significant to him.

He spoke of himself as having no emotions, because he was aware that his subjective internal sensations were considerably less profound and complex than the human equivalents, and when he had tried to express his sensations in terms of emotion, when he had been under study by Starfleet researchers or when he had been at the Academy, humans had generally responded in a less than positive way to the comparison. So he stated that he had no emotions because, in human terms, it was apparently true. But it was not true that he had nothing analogous to an emotional state; Data's own study of artificial intelligence indicated that it was not possible to create a being capable of independent action who lacked any ability to make value judgements, and the mechanism that mediated value judgement was, in fact, emotion, or something like it. It felt right when Data did things that conformed acceptably to his programming; it felt wrong, and unpleasant, when he did not. He was capable of finding pleasure in the company of individual sentient beings, of seeking to ensure that their lives were as pleasant as possible - friendship, by any other name. He was capable of loyalty, and desiring to be obedient to lawful authority so long as it did not contradict his ethical programming. He was able to find enjoyment in experiences, particularly those experiences that brought him closer to his ideal of correctly emulating human behavior - an ideal that itself was mediated by a state similar to emotion, after all, for if he were truly incapable of emotion, he would be incapable of desire, and therefore incapable of having ideals to aspire to. And he was able to experience pain, or something like it, when he did things that violated the algorithms of his ethical programming.

Dying at the hands of the Calamarain would have felt less unpleasant than standing by and watching Q, or any sentient being who was not a current and active danger to himself or the ship, die.

This was not a restriction Data was interested in trying to overcome. It was part of his ethical programming, part of his basic nature. It hurt, to the extent that anything could hurt him, when he stood by and was helpless to save lives. It was why he had plotted to violate the Prime Directive, though he had sworn to uphold it and was also programmed to desire to keep his word, when Sarjenka had faced death by a natural disaster that the Enterprise could have averted, if only they had been willing to expose themselves to a pre-warp culture... a clear violation of the Prime Directive, a philosophy he believed in, and yet. His algorithms didn't compel him to save everyone in the universe; when the death of sentient beings was a statistical probability or an aggregate concept, the death of myriad people he did not know and had never spoken to, he was able to balance it against abstract ideals such as the Prime Directive and resolve the necessity of obeying his ideal of non-interference without suffering discomfort. But as soon as he had heard Sarjenka's voice, she had been a defined being, a specific sentient consciousness who would suffer, who was in fact suffering right then... and the weighted values his ethical algorithms placed on different actions had shifted, in a way that was similar to the way the human brain's evolved programming seemed to compute similar algorithms, such that it became impossible for him to bear the thought of simply letting Sarjenka and her world die.

It was unlikely that he would like Q very much if he had emotions; no one else who had emotions seemed to, and Q's actions were inconsistent with behavior that inspired likeability. But that did not matter. He would not prioritize Q above people he considered friends, or people he had sworn his loyalty to, but Q was a specific sentient being who had spoken to Data, who was known to Data as an individual, who plainly experienced suffering, and in the face of that suffering Data had had little choice. He could have acted to save Q's life, and risked death, or he could have done nothing, in which case the contradiction with his ethical programming would most certainly have caused him considerable discomfort. Death was an unpleasant abstract concept - the cessation of Data's existence would prevent Data from experiencing enjoyable sensations and the pleasure of knowing that he had achieved his goals - but in itself, death would not cause him to suffer. He would no longer exist to experience discomfort after death. Whereas the discomfort of violating his own ethics would be with him for a very long time. So Data had acted to save Q's life because he would rather die than live with himself if he had not done it.

And now, that presented him with a conundrum. Because if he and the rest of the crew could not derive a method of saving Q without leading the Calamarain to attack the ship, then he would, in fact, be compelled to stand by and let Q die. His ethical programming distinguished between a situation in which he literally could do nothing - a situation in which there was no action he could take that would most likely result in the immediate saving of a life, whereupon he would not feel discomfort for failing to save lives, because it had not been possible for him to do so - and a situation in which he should do nothing because the consequences of acting to save a life might result in a negative outcome in the future, and unfortunately this was the latter. It was very likely that, if Data were to intervene to prevent the Calamarain from killing Q, again, the Calamarain would attack the ship, and quite possible that people would die as a result. This was not an acceptable outcome. But it was also not an outcome that his ethical algorithms were weighting as heavily as the immediate impact of allowing a person to die when he could take an action and save that person, even if it were unwise to do so. And this meant that if Q could not be saved, and thus subsequently Data were to take the correct action for the safety of the ship and allow his death, Data would suffer for it.

Humans would call that particular type of suffering "guilt". Data was not comfortable with assigning a word that was so closely aligned with a defined human emotional state to his own experiences, which were not emotions as humans understood them.

Wesley went off shift. The work continued, with just Data, Geordi and Q. It was likely that they would successfully derive a method by which the Bre'el III moons could be destroyed and/or brought down to the planetary surface in a controlled way to manage the damage they would cause. Whether it would be possible to derive a method to save Q as successfully - and spare Data the discomfort of allowing a person to die when he had the power to save him - was unknown at this time, but Data could not personally think of a strategy to do so, aside from the one Q had proposed. As they worked, he considered the ethical ramifications of that plan.

Obviously, if Q were proposing to create a duplicate of himself to send to his death in his place, this resolved nothing. An innocent person - or rather a person who did not deserve death, since Q was not innocent but did not deserve death by Federation law either, and had never agreed to abide by Calamarain law except under the duress caused by a threat to others - would still die. If that was the nature of Q's proposal, then Captain Picard was correct.

However, Data did not believe that that was what Q was proposing. It had sounded to him as if Q's plan was more akin to backing himself up before death.

If there were an empty Soong-type android body, a shell made by his father but never animated into life and consciousness, available to Data, and he were to download his engrams into that body to cause it to animate into a replica of himself, a separate iteration of the program that was Data - that entity would be a separate person from himself, not a disposable decoy. Sending the newly animated android with a copy of his consciousness to die in his place would be a deplorable action. But backing himself up, by animating an android shell body with no consciousness such that it contained a copy of his consciousness, before going to his own death... that was an action Data would consider fully ethical, so long as doing so did not destroy the consciousness that the android body might have had on its own. And Q's situation did not involve an existing body that might or might not awaken to consciousness on its own; it involved creating a copy of his body as well as his mind. There was, to Data's mind, no ethical problem with making a backup of yourself to live after you'd died, and Q had clarified that that was in fact the nature of his proposal.

Of course, from Data's perspective, part of the problem was that Q would still be dead. It did not solve anything. Whether there were two of him and one died, or there was only one and that one died, either way one individual died, and Data would have to allow it, and this would cause him a sensation somewhat akin to human guilt.

But perhaps it made a difference from the perspective of reducing suffering.

Data was aware that there was a certain hypocrisy in his programming, in that he considered it fully acceptable for himself to commit self-sacrifice to save others, but was unwilling to accept another person's self-sacrifice without guilt. He did not know why Dr. Soong had programmed his ethical routines with such an imbalance - perhaps because his self-preservation programming could be counted on to balance out a willingness to self-sacrifice on his part, but would not act against a willingness to sacrifice others, or to allow them to sacrifice themselves. However, hypocrisy, itself, was a factor he had been programmed to identify in himself and attempt to root out, as it caused him discomfort as well. To resolve the difficulty caused by the fact that he was willing to sacrifice his own life but not to let Q sacrifice his - not without discomfort, anyway - Data had to consider the role of suffering.

When he had made the decision to risk his own death, he had not suffered for it. His algorithms had been in alignment, the weighted value of preventing death outweighing self-preservation. He had not experienced fear, or grief, or pain... or even the states he was capable of that were analogous to such sensations. He had been resolved, and free of regrets.

This did not appear to be the case with Q. The outburst in the conference room, where Q had stormed out because he had believed that the crew would not take action to ameliorate the impact of his death, and the moodiness and irritability that Q was demonstrating right now, indicated to Data that Q had regrets about his decision, and that he was suffering. He might choose to go through with his planned self-sacrifice because, like Data, he might find the emotional consequences of refusing unbearable, but he was not calm, not resolved, not accepting - he might be resigned, but that was not the same as acceptance. Q appeared to be experiencing considerable negative emotion associated with his decision to sacrifice himself, and had expressed that while carrying out his plan would not actually prevent his death, it would greatly alleviate the suffering that the thought of his own death was causing him.

Captain Picard was applying human ethical constructs to an alien being from an alien culture, because that being had a human body now. From the Captain's perspective, either a copy of oneself was not a separate sentient being at all, a prejudice that was still so powerful in the Federation that even Commander Riker had once killed clones of himself and Dr. Pulaski before they were done developing because he did not perceive them to be independent people, or it was a completely separate person and whether it lived or died had no bearing on your own life or death. The Captain considered the more ethical perspective to be the latter, and if those were the only choices, then Data agreed with him. But human beings could not, generally, be precisely copied - even a DNA copy such as a clone would not typically have the thoughts and memories that the genetic donor had had. So humans had no framework for comprehending the idea of a separate being who was nonetheless an exact copy of oneself.

Data did. And having such a framework, he believed Q when Q said that he would be more willing to die if he knew there was another him who would live on after his death. The scenario Captain Picard had described, where the individual who was destined for death rejected his fate and desired that the other should die instead, was not likely if Q was accustomed to a perspective where two identical copies of the same person were the same person, and if his purpose in duplicating himself was, as he said, to "create a copy to live in his place" rather than a copy to die in his place.

He intended to investigate the technological feasibility of Q's proposal, despite the fact that creating a duplicate of Q would be against Federation law and the fact that he had essentially been ordered not to help Q in this matter. He was not nearly as confident as Q had been that there would be a way to successfully get the transporter to make a duplicate; there were too many unknowns, and simply knowing that it had happened once did not in fact imply that it would be possible to do it again on short notice. But if it proved to be possible to do it, then Q's suffering at the prospect of his own death would be alleviated, and that in turn would alleviate Data's own discomfort at the prospect of standing by and doing nothing as Q was executed by the Calamarain.

Having made his decision, he waited until Geordi needed to relieve himself. After Geordi had excused himself to the lavatory, Data spoke quietly. "Q, I have considered your proposal."

Q, bent over the console, didn't even look up. "Which one?"

The source of the confusion was apparent. Q had discussed several proposals for the project at hand over the course of the last hour and fifty-seven minutes. "Your proposal to create a transporter duplicate of yourself, to remain behind and survive after you die."

Q stiffened, still not looking up. "Picard says it's not happening, Data. Don't worry about it."

"I will never achieve my goal of becoming as much like the ideal of humanity as possible if I cannot apply my own ethical judgement to problems," Data said, still very quietly, almost whispering in Q's ear. "I will not discuss this issue with anyone else until I have determined whether the process is feasible at all. But I thought that you should know that I intend to investigate whether your suggestion is technologically possible, and if it is, I will help you carry it out."

Finally Q looked at Data. He shook his head rapidly. "No, Data. I'm not... I'm not going to have you get yourself, I don't know, court-martialed or something for my sake. I don't... I'm just going to die, that's all. There's no way out. I'm not... I won't sacrifice you for my sake. Not again."

His eyes were bright, and Data detected moisture in them. Perhaps Q was on the verge of crying. This seemed odd, given that Data was proposing to aid him with what he perceived as his only escape from certain death, but Data did not always understand human emotion very well. "You are not sacrificing me in any sense. I chose to risk my life to save you before, Q. I am neither a child nor a programmed automaton incapable of exercising free will, and if I choose to risk legal consequences to help you save yourself, that is a choice I may make freely."

Q swallowed. The moisture in his eyes increased. He wiped at it with his arm. "Data, have you got any idea how much they'll all hate me if you throw your career away for me? It's just... it's just better if I don't involve you. If I just die."

"If my friends respect that I am a self-aware being with free will, then they will respect my decision. I do not typically disagree with Captain Picard in matters of ethics, but I believe that as an android I have a perspective the humans lack, and I believe that if it is possible to accomplish what you desire at all, it is possible to do so ethically. However, I must warn you that I am not at all certain it will prove to be technically feasible. I will review the literature on transporter accidents and duplications at the earliest possible opportunity to determine the likelihood that we would be able to replicate the incident that duplicated Commander Riker within the next forty-six hours, and I will explain my findings to you as soon as I have completed my assessment." He cocked his head slightly. "It is not actually under your control as to whether I undertake this study, Q. Should we determine that we are capable of performing this, technologically, you will be able to make the decision then as to your willingness to accept my help with your plan. I do believe, however, that I may be able to make Captain Picard understand my motivations in helping you, and that if it is technologically feasible and we simply do it, before informing others of our plans... once a second version of you has been created, Captain Picard will most likely be swayed if the two of you agree as to which of you will remain behind and do not appear to be distressed at the prospect that one will die."

"Easier to ask forgiveness than permission, huh, Data?" Q smiled weakly. His voice was hoarse, with tone variations in it that signified human distress, and his breathing had developed a certain arrhythmia. The moisture in his eyes spilled out onto his cheek. "Thank you. I can't - I'd never be able to repay you enough, even if I do survive this, but... thank you."

"You are welcome," Data said. "But I must reiterate, I do not yet know if the project is even technologically possible, given the constraints we face."

"It's... just the fact that you're willing to try-" Geordi was approaching them, returning from the lavatory. Q wiped at his face again and said, more loudly, "Now that LaForge is back, I'm going to take a break and use the bathroom myself. Which, by the way, is utterly disgusting, and I can't believe humans have to put up with doing this several times a day."

He left before Geordi reached the console. "What was wrong with Q?" Geordi asked Data.

"He said he wished to use the lavatory, now that you were back."

"That's not what I meant. It looked like he was really upset about something. I didn't get close enough to see clearly, but it almost looked like he was... I don't know, crying. Or something."

Of course, Geordi could see such things at a greater distance than most humans, because the temperature variation between saline liquid and normal human skin would be obvious to him. "I did not observe any sign that he was upset," Data said, truthfully, because if he understood the nature of human emotion correctly, Q's emotional state would have been better described as overwhelmed than upset. "Perhaps you should ask him when he returns."

Geordi sighed. "Yeah, if he hasn't got it under control. I guess... if I was going to be killed in a couple of days, I might get upset for no immediate reason too. I just... as long as he doesn't make it our problem, I'd rather let him deal with it on his own. He's hard enough to handle when you keep it professional."

"Yes," Data said. In his mind, at the same time as he concentrated on the problem at hand, he was constructing a research methodology to investigate Q's proposal.


Troi gave it a little more than two hours before she headed down to Engineering.

Q was there, along with LaForge and Data, working out the details of how they were going to blow up the three moons with minimal damage to the planet. LaForge was the first to notice her. "Counselor! Haven't seen you down here in a while."

"I wanted to share the good news," she said. "Captain Picard talked to the Kaeloids, and he's managed to persuade them to ferry multiple sets of Bre'elians. The Science Council is directing the placement of the first load of refugees now, and the Kaeloids are already on their way to pick up a second load."

LaForge beamed. "That's great!" She felt unadulterated happiness and relief from him, as if the news had lifted a burden he'd grown so used to he felt light now that it was removed. Q's emotional state was more complex; he was also pleased at the news, but as much of his pleasure seemed to be self-reflective, proud of himself or pleased with how others would think of him, as it was genuinely other-directed. Which, for Q, was downright unselfish; the important thing with Q was not that he had some selfish motives for wanting to help others but that he had some unselfish ones at all. All of the happiness and relief he felt at the news, though, was dampened and overlaid by dark swirls of self-pity, grief and fear; Q might be glad that by following his advice, the Captain had managed to get the Kaeloids to save the entire population of Bre'el III, but it wasn't enough to entirely erase his awareness of his sentence of death.

Data, of course, didn't feel anything, or at least nothing that Troi could detect, but he cocked his head to the side slightly and said, "That is excellent news, Counselor. Thank you for informing us." He then turned to Q. "When a person delivers valuable information to you, it is customary to thank them."

"Customary by whose standards? Where I come from, it's considered insulting to thank someone for doing their job."

Data tilted his head, an expression of slight confusion on his face. "It is not considered part of Counselor Troi's job to give us news."

"Sure it is. Why else would she be telling you people things like 'I'm sensing great hostility' while the aliens are shooting at you? I mean, unless her job is to give you useless information, in which case I guess we should thank her for actually being useful for once."

She sensed LaForge's anger and impatience well up. "Q..."

"Actually," Troi said, deciding that now would be the ideal time to intervene and get Q away from LaForge if possible, "I also wanted to talk to you, Q. Perhaps over lunch if you haven't eaten yet."

"He has not in fact eaten yet," Data said.

"Why would you want to talk to me, and more importantly, why would I want to talk to you? Let alone with food of any sort in the equation? I said everything I wanted to say to you yesterday, Troi." Despite his anger and bitterness, Troi thought she detected genuine curiosity, and elements of... a lonely yearning? Someone who wasn't Data or Picard voluntarily paying attention to him for reasons unrelated to his work seemed to appeal to him, even as his anger at Troi herself and his desire to maintain a façade of emotional independence made him want to hide it. Or at least so she assumed from the particular complex mixture of emotions she sensed from him; Q was difficult to read accurately because as loud as his emotional states were, they were almost never simple, and often he felt things that contradicted each other.

"Among other things, I wanted to apologize for yesterday," she said.

That got his attention. As she had joked to Captain Picard, he was very surprised at that, enough that it showed on his face. "You could do that right here," he said, recovering.

"In public?" she said. "If that's what you'd prefer..."

He frowned. It was clear he'd picked up on her implication that what she had to say might contain information he didn't want publicly shared. "I don't have time for this."

"You do, actually," LaForge said. "Data and I can finish this up; it's more of a technical issue than pure physics. And you need to eat something, sooner or later."

"I'd thought you might not have had an opportunity to experience many different kinds of food, since you've been focused so heavily on the problems at hand," Troi said. "So I thought perhaps that if you hadn't eaten lunch, I could get you a variety platter so you could try several different things, and that you might enjoy that."

She sensed a short burst of interest from him - as she thought, he really did like new experiences as long as they didn't terrify him - followed by a wave of depression. "What would be the point? It doesn't do me any good to learn what kind of foods I like; I'm going to be dead in two days."

"That's not true," Troi said. "As long as you're alive, a pleasant experience is always worthwhile, and if learning new things or trying new foods or having new experiences is something you find pleasant, it's worth doing whether it teaches you something you'll be able to use in the long term or not." She approached him, looking up at him. At least he was interested. "Q, the only difference between the situation you're in and the situation all mortals are in all the time is that you have some certainty. You're fairly sure you're going to die soon. But any of us could die soon. There might be a freak accident, causing me to die before you do. None of us know that the things we learn or the experiences we have are going to be useful in a week, or five years, because none of us can know for a fact that we'll be alive then. The point to living as a mortal is to enjoy what you can of life while you're alive, because none of us will be here forever."

"Right, there's absolutely no point in me moaning and groaning about my impending death because all mortals die sooner or later, so what am I complaining about?"

Troi shook her head. "I'm not trying to belittle you, Q. What you're faced with is terrifying. I can't blame you for feeling as if there's no point to trying to enjoy the time you have left, but if you let the fear stop you from doing anything you might find pleasant with your remaining time, then you really have rendered what's left of your life pointless."

"You're assuming that I'd find spending any time with you to be pleasant." His tone was sarcastic and belittling, but his emotional state suggested that he was genuinely interested, his desire to find out what she'd have to say and his desire to be the center of somebody's attention conflicting less with his actual anger at her than with his strong desire to make a show of his anger. He didn't hate her so much as he wanted to hate her, she thought.

She shrugged. "It's up to you."

"Why don't you go?" LaForge said. "You'll think better with food in you. And you'll be less irritating."

For a moment Troi was startled, almost shocked that LaForge was still being almost cruelly blunt with Q, even now that they knew what he was facing. But Q didn't seem to be offended; his emotional response to LaForge's comment was more a sense of wry recognition than a feeling of being insulted. "If you insist." He turned to Troi. "But I'm not going to Ten-Forward."

"It's Guinan's day off, Q," Troi said. "Trust me, I know better than to inflict the two of you on each other."

Q sighed. "Oh, very well."


In Ten-Forward she ordered herself a bowl of minestrone soup, and for Q, a sampler platter with a few bites' worth of twenty different Terran foods, including egg rolls, macaroni and cheese, grilled steak, teriyaki salmon, tabouleh, chicken wings in honey mustard sauce, steamed soybeans, asparagus with hollandaise sauce, and many others. For drinks, she picked iced tea for herself, and four small glasses for Q containing root beer, iced coffee, grape juice, and lemonade. Q stared at the platter and the glasses with an utterly bemused expression, his emotional state confused and somewhat overwhelmed. "What is all this?"

"I told you," Troi said. "It's a variety platter. All of these different foods come from Earth. I didn't include anything that was too spicy, because without experience eating spicy foods it's likely you wouldn't have much of a tolerance for them, but I did try to pick a variety of flavors and cuisines."

"And you expect me to eat all this?"

"No, I expect that it will turn out that you don't like some of the foods and won't want to finish them. This is typically how we introduce aliens or non-Terran humans to Earth cuisines; a few different sampler platters and most tourists or diplomats will have a good idea of what types of Earth food they like."

"You ever had one of these things, then?"

"The first week of meals at Starfleet Academy are mostly like this. Then they let you order whatever you want from the replicators, but by then you know what you want. It helps get into the spirit of things. Some people just want to eat nothing but the foods from home, and of course they can replicate their home planet's cuisine if they want to, but if you want to fit in at the Academy it's a good idea to know your Earth food."

"Why would the cadets from Earth need such a thing, though?"

"Well, Earth has a lot of different cultures. You know as well as I do how they developed warp drive while their planet's political and social systems were still in complete chaos; most species have managed to have some kind of planetary government for several generations before they successfully build a warp drive, so Earth is a bit unusual that way. And most Earth cadets aren't really all that familiar with the food that isn't commonly served in restaurants where they grew up, either. Many cadets from America are more familiar with Vulcan or Risan food than they are with, say, Russian or Sudanese cuisine."

He picked up the egg roll. "What's this?"

"A Chinese egg roll. Try it."

"It doesn't actually have an egg in it, does it?" Q looked at it skeptically. "I'm as willing to eat dead animal flesh as the next omnivore, but the concept of eating what's either an animal's unborn offspring or a dropping from their reproductive system seems quite nauseating."

"It's made with egg, but it doesn't have a whole egg in it, no. And the eggs are replicated, Q. They don't really come from an animal's reproductive system. The meat doesn't come from a dead animal, either. It's all replicated."

"But they're replicated from things that were dead animals, once."

"Actually, most of them are replicated from vat-grown meat. Humans haven't eaten actual animals - well, actual birds or mammals anyway - since the late 21st century. I think the salmon might be a replica of meat from a real fish, but I'm fairly sure all the rest of the meat is replicated from vat-grown... and the eggs as well."

He bit into the egg roll. "It's not as if I care," he said. "I may have fallen to a lowly enough position that I'm concerned about the lives of lesser sapient beings, like humans or Bre'elians, but I haven't fallen so low as to be overly concerned about dead non-sapients."

"Why do you consider your concern for the lives of sapient beings to be 'falling?' One would think an advanced species would have more regard for the lives of conscious beings."

"Wishful thinking, Troi," Q said. He put down the egg roll and tried one of the asparagus spears. "I told Selar this, and I suppose I need to tell you. When you're on a scale that you can see an entire planetary population at once, when you can know all their petty little dreams and desires, and their lives flicker in and out in an eyeblink... it's hard not to have the perspective that nothing that happens to them is really all that important in the grand scheme of things, because honestly, it's not. You beings think you're all unique, special snowflakes, but in aggregate, the sameness of you is astonishing. And your lives are so short anyway; if anyone cared about you all, individually and personally, the grief would destroy them because you're there and gone so fast. So nobody at the level of a Q cares all that much about sapient life at your level. I mean, we have pets, we have particular little mortals we're interested in, or we have an entire species or three to be our special projects, but most of you are just so... interchangeable."

"And yet you were concerned enough about our lives to offer to sacrifice yourself to the Calamarain."

Q closed his eyes. "Yeah, that was stupid of me. Although honestly I wasn't going to live all that long one way or another." His comments were bravado; the emotions that overwhelmed him were self-pity, grief, resignation, and that same tender possessive protectiveness she'd sensed from him right after he'd made his deal with the Calamarain. There was no real regret in him, not that she could perceive.

"Stupid? Or did it just reflect that as a human, you have different priorities than you did when you were a Q?"

He stared into space for a moment. Loneliness welled up, emptiness and a keening feeling of abandonment. "I suppose that's it," he said, his voice slightly hoarse. "Different priorities." He looked down at his plate and proceeded to attack his macaroni and cheese, as if he could fill the emptiness with food, or maybe at least distract himself enough to stop noticing it.

"It doesn't feel the same, dealing with us now, does it? You used to treat us as toys for your amusement, but I haven't noticed any of that from you now."

"It's not exactly as if I could treat you as toys for my amusement right now."

"That's actually not true. I've known many mortal beings who took great delight in causing chaos and drama because it entertained them to do so. I won't deny that you've caused some chaos and drama here, but you haven't done any of it because you wanted to be entertained; most of what you've done since becoming human has been a reaction to how you felt, not something you wanted to do to get a reaction out of others."

Q pointed at her. "You're getting dangerously close to doing the thing you promised me you were going to apologize to me for doing," he said.

"I'm not talking about what I sense from you to anyone except you," Troi said. "And if you know how you feel, and you know that I can't stop sensing how you feel, then I'm not sure that this is the same issue at all. It's not a violation of your privacy to tell you something you already know, is it?"

"The Q don't go around telling each other things we already know."

"Yes, but you're not a Q any more and your emotional reactions aren't the same... which is exactly what we're talking about. We were toys to you when you were a Q, and you weren't concerned if we died, and to be honest I'm not really sure what you felt about us or why you claimed you wanted to join the crew or why you kept pestering us... but none of that seems to be relevant right now, because right now, I think you want us to accept you. To consider you one of us. You feel very strongly that you want to be part of us and you want us to care what happens to you... and you care what happens to us as well. You've adopted us, Q, and you want the feeling to be mutual."

"Is there a point to any of this?" Q asked coldly.

"There is, actually," Troi said.

She sipped at her iced tea. Q glared at her impatiently, and finally said, "So were you going to get to it? Ever?"

"I'm sorry I hurt you," Troi said. "Q, it's my job to use my empathic powers to assess whether the beings we deal with are telling the truth or not. Most of the time, most mortal beings wear their emotions on their faces; it's not as if the Captain needs me to tell him that someone is feeling hostility if they're shouting and their facial expression is angry. That's why I'm allowed to do what I do, in Starfleet. If I were a full telepath in Starfleet, I wouldn't be permitted to read minds unless there was probable cause, or I had permission; some extremely powerful telepaths have a hard time not reading minds, but they're generally under oath not to talk about what they perceive. But we all keep our thoughts private; most of us don't keep our emotions private unless we have something to hide. And if someone has something to hide, they might be a danger to the ship. And if someone who has previously been hostile to us, who has put us in dangerous situations and threatened to kill us, has something to hide... that's why I told the Captain that you'd lied about the Calamarain. I knew you would be offended, because you had told me that it offended you when I revealed your emotions to other people... but because I didn't know what you were lying about, I had to weigh offending you against the risk that what you were concealing could harm or kill someone. So I did something that I knew would cause you pain, because I thought the danger your behavior presented justified the risk, and for that, I'm sorry."

"But you're not actually sorry you did it," Q said.

"Are you actually sorry you had your animal creatures stab Wesley and Worf with bayonets? Are you actually sorry you froze Lt. Torres solid for pointing a phaser set to stun at you?"

"Did I actually apologize for doing those things? No."

"Well, I'm apologizing for the part of what I did that I regret. I didn't trust you. I had good reason not to trust you, after your past behavior, but the fact remains that you were making an incredible sacrifice for our sakes and I didn't trust you, so I did something that hurt you very badly in a moment when you were much more vulnerable than you usually are. And I'm sorry for that."

She leaned forward, looking directly up into his eyes. "If I had it to do over again... I'm not sure what I could have done differently, but I would have tried to take your feelings into account more. Maybe if I'd given you the benefit of the doubt and assumed that your reasons were altruistic rather than considering it likely that you were lying to protect yourself..." The pangs of guilt she'd been feeling ever since she'd learned the truth about what Q had actually done stung her. She had, after all, jumped to some very negative conclusions given the facts she'd had; if she'd trusted Q more, she'd have figured out what he'd been lying about quickly enough, since the only scenario that actually fit everything he'd been feeling had been self-sacrifice. "...and I should have assumed that, because I sensed it from you, I could feel that you were afraid and that you wanted to protect something you considered to be yours, something you cared for, and if I'd put the pieces together correctly I could have guessed at exactly what you'd just done... maybe I could have been more compassionate in how I explained my findings to the captain, so he wouldn't have been as cruel to you as he was."

Troi looked down at the table for a moment, remembering. Picard had been justified, too, given what he'd known about Q, and Q's reaction had been far out of proportion... except that he'd just agreed to give his life for the people who'd been accusing him, and Troi knew how badly that had hurt him, at that moment. Almost enough to forgive him herself for his viciousness to her. "But you have to understand, Q." She looked back up at him. "What I do isn't rape, because there is never any justification for rape. What I do is spying. I don't do it for fun or my personal gratification; I do it because the Captain needs me to do it, and the ship needs me to do it. Every culture that practices spying despises everyone else's spies, but they still spy themselves, because it's necessary. And I can understand if you will never forgive me for spying on you... but I want you to understand, I didn't do it to hurt you, or to exert power over you, or even because I didn't care. I did it because you were lying to me and a lie from someone who had previously attacked the ship over an issue as important as whether or not hostile aliens were going to attack us again or not is an issue I have to take to the Captain."

Q was breathing hard, anger and the remembered emotions from how he'd felt at that time fighting with reluctant acceptance. "Are you done?"

"Yes," she said, sitting back in her seat. "But the main thing I want you to know is that I won't reveal anything you tell me in confidence, and I won't reveal anything else to other people about how you feel, now that I know that you see it as a violation... unless I believe your safety or the safety of other people is at risk. I am the Ship's Counselor, and it's my job to help people deal with fear and trauma and grief, and I don't want you to feel that you can't turn to me if you need someone to talk to because you're afraid I'll tell everyone what you told me. I'll completely understand if you don't want to talk to me, but I don't want you to be afraid of doing so or feel that you can't."

He laughed, a sharp, ugly sound. "Really, Counselor, do you think that talking to you about the fact that I'm going to die, and how I feel about it, is going to do me a damn bit of good? As if putting it into words is going to magically make me feel better about impending doom?"

"I don't know. Some people would feel much better just knowing that another person knows and cares about what they're going through. Others wouldn't. I don't know you well enough to know if talking about your feelings would help you, but I wanted to make sure you knew you had the option if you wanted to."

"You may know what I'm going through, but you don't really care. Don't do me the disservice of lying about that, after you were honest enough to admit that you knew you were violating me but you thought the ends justified the means."

"I do really care. It's part of the curse of being an empath, Q." She raised her hand, closed loosely so she could raise fingers one at a time as she recited the terrible things Q had done to her or the ship. "You terrorized me, you tortured Lt. Torres for no reason and then Tasha for standing up to you and the whole time I was screaming at you over what you did to her, I thought I was next and that we would all die. You tried to take a man I consider one of my best friends, my first love, off to your dimension with you and you waited until I wasn't even there to do it, as if you knew I cared enough about him that I could have fought you for him and you weren't going to let me try. You let 18 people die in a fit of pique."

Troi leaned forward across the table, eyes fixed on Q's, until his slid away from her and he looked at the wall. "I don't want to care about you, Q, I want to want you to suffer, because that would be justice for the suffering you put us through. But I can't help it; it's my nature. I will never prioritize justice above mercy, I will never stop wanting to give people another chance."

She sighed, sitting back again and folding her hands together on the table. "I can feel how lonely you are, and how badly you want us to consider you one of us, and I can feel how frightened you are and the grief you feel over your own mortality... and yet, you agreed to sacrifice yourself for us and even now you don't really regret doing it. You want to protect us, even at the expense of your own life, because you think of us as yours. Because in your own way, you love us. And because you hate the thought of people dying when it's your fault and out of your control." She reached across the table to take his hand. He was startled, and his hand twitched as if he had been about to yank it away... but as she felt his startlement fade, he left his hand with hers, and met her eyes again. She continued. "I'm sorry if you feel like you've fallen to some desperate low, to care for mortals, but I really respect that you've come to feel that way, and I believe that deep down... you might actually be a good person. Or you might be growing into one, now. So yes, I do care about you. Even if part of me says that I shouldn't, I do. And if there's anything I can do to ease your suffering, I want to try."

His emotional state was in utter chaos. She couldn't really see it in his face, but she could tell that he was struggling to keep himself under control, that he was on the edge of breaking down. "The only thing you could do for me is convince Picard to let me try my plan," he said, his voice hoarse. "Because if any of you really cared about me, and you're not just lying to screw with my head, then you'd be willing to help me make a copy of myself who'd survive after I'm dead."

"Actually, I've been thinking a bit about that," Troi said. She let go of his hand. "As I see it, the ethical issue is that if you split yourself into two people who have equal right to live, you're creating a situation where you're competing with yourself for survival, as Captain Picard said. Your perception, as a Q, that two copies of you are the exact same person is because you can merge those copies back together into one person; for mortal humanoids, as soon as there are two different perspectives, two different ways of seeing the world, there are two people. Two brains give rise to two egos, two consciousnesses. And I can't see how the consciousness who's destined to die couldn't resent the one who's going to live."

"Since the alternative is that I die entirely, I'm getting sick of you people projecting how you'd feel about the situation onto me. I wouldn't resent that there's a me who'd live; I'd be grateful that I wouldn't die completely. I mean, do mortals who die of old age resent their kids for living on after them? And your child is much less you than a copy would be."

"I don't agree that it's the same thing," Troi said, and felt his anger burning, saw his face going dark. "But I think I may have an idea as to how to make it a moot point. Q, if you can build a telepathic amplifier, would you be able to build a telepathic connector? A two-way artificial telepathy that could be used to link two separate brains together?"

"Yeah, sure, that wouldn't be hard. Now that we have the telepathic amplifier, making it narrowcast like that wouldn't be much."

"I have a theory." Her soup was getting cold. She took a spoonful of it. "At the moment that you are copied by a transporter glitch, the two of you are identical. You haven't had time to individuate. You have two separate brains, so you have two separate consciousnesses; but those consciousnesses are identical to each other. What if, at that point, the two of you were connected by an artificial telepathic link, so that you could perceive each other's thoughts and emotions in real time? I think that doing such a thing to two identical humanoid entities might cause your two consciousnesses to merge back into one - so instead of being two minds in two bodies, you would be one mind operating two bodies, but with two brains to do it with, so you wouldn't experience effects like double vision or poor coordination."

"I'd experience my own death," Q said soberly. "And survive it."

"Yes. I know that doesn't sound like a pleasant alternative. But the problem with making two of you in order to send one to his death is that the you who lives would be condemning a wholly independent being, a person who has a separate consciousness, to death so you could live. If you share a single consciousness, though, then you're literally both dead and alive. I don't see how Captain Picard could have an ethical issue with one of the two bodies that a dual-bodied being occupies dying so that the being can live on in the other body."

Something about what she was saying was reminding Q of some past grief, and for some reason evoking a bitter stab of envy. "What you're suggesting is a pathetic and ridiculous parody of the only merciful way to die," he said. "But it's closer than anything the Continuum was willing to offer me, so I'll take it. Do you really think Picard would go for it?"

"It depends on whether he believes that you would be more likely to remain two individuals, just telepathically connected, or merge into one. But I have good reason for my theory. On Betazed, children aren't usually born with telepathy, but identical twins are typically telepathic with each other. And in the past, when we were too primitive to conceive of another solution, we used to put one twin to death at birth, because what would happen was always that there wouldn't be two twins... there would be two bodies, but a single mind shared between them. Early Betazoids thought that was terrifying."

"Of course. Just like any other mortals, you destroyed what was different because you couldn't understand it."

"Yes. But when we discovered that certain herbs could suppress telepathy, we found that treating one or both of the twins with telepathy suppressants until they were about 2 would cause them to individuate into two different people, just like fraternal twins. At the time, of course, we thought it was some sort of magical ritual, and when we began to put superstition behind us we quit the practice, which is how we discovered that our long-held notion that twins are one soul in two bodies is actually literally true unless we block the effect. Suppress the babies' telepathy, or take them out of range of each other to weaken the connection, and Betazoid identical twins grow up to be separate people... but they never individuate if they're allowed to be telepathic with each other in infancy. They develop into a single ego in two bodies. I believe that a telepath split by a transporter would undergo the same effect - the telepathic communication between the two bodies would prevent the minds from ever separating. Technically, after all, humanoids with two-lobed brains are already two brains sharing a single mind... it's just the same physical body in that case."

Relief and gratitude washed over Q. "You don't actually have any idea how perfect that would be," he said. "If you can get Picard to agree to let me try that... you have my permission to tell him whatever sordid details of my emotional state you need to, if it'll convince him."

"I appreciate that," Troi said. "I can't speak for the captain, so I can't promise you that he'll agree, but he's a reasonable man. I think he'll accept this idea."

"For the record," Q said, "I'm sorry I froze Yar. In the courtroom." He shrugged. "I mean, you probably guessed that, since I undid it then. But that was... That wasn't justified. I lost my temper."

Well. That wasn't much of an apology, given all the other things he'd done, but it was a start. Troi couldn't help wondering about the other occasions, though. "What about freezing Lt. Torres? Was that justified?" she asked calmly. She wasn't even going to get into the Borg, or the fact that apparently Wesley and Worf had been stabbed by Q's creations when Q had been trying to seduce Will away from humanity, while Troi had been away.

"Oh, most definitely," Q said, scowling. "He pointed a gun at me, Troi. That's not behavior I like to encourage."

Troi sighed. "Q... it was a phaser set on stun. If you'd actually been a mortal humanoid, it still couldn't have hurt you. Given what you really were... how could you have considered that any kind of a threat?" At the time, neither she nor Picard had known if Q had realized the phaser was set to stun, and Picard had pointed it out to Q as if he thought Q was an ordinary alien who'd transported aboard and could have mistaken such a thing, not the nigh-omniscient entity he'd turned out to be. Q had smirked coldly and said that given humanity's history, no one should want to be a helpless captive of humans... but in the light of what Q had turned out to be, that didn't really make sense either.

Q didn't say anything for a moment. That sense of past grief she'd felt from him before came back. He was conflicted about something. Then he said, quietly, "Well, if they didn't want me giving away the trade secrets maybe they shouldn't have kicked me out."

"I don't understand."

"The fact that it was a stun weapon actually made it worse, Troi. The Q cannot be killed by any force beings made of matter could possibly muster up; the energy levels it would take to kill a Q destroy stars. For that matter, even when you people destroy stars you do it by triggering accelerated fusion reactions or other stunts that take advantage of what stars are actually made of... and your sensors can't even detect what a Q is made of, in our natural state. If I hadn't been paying attention, and he'd vaporized me, I would have just reconstructed my body. But... I could have been vulnerable to a stun weapon, if I hadn't been watching for it and prepared for it."

"How?"

"It has to do with the way we interact with your universe." Q pushed his food aside. "The substance of the Q Continuum is of a nature that doesn't exist in your universe. The kind of energy we're made of can't exist, here. Of course your nature couldn't exist in our dimension, either, but we don't bother even trying to bring you there. We go to you. So in order to interact with your universe, we translate ourselves into a form that can exist in this universe."

"Like how holograms can't leave the holodeck."

"Sort of, though I tend to think our reality is more real than yours, so that's not my favorite analogy. More like the way you could not exist as a computer program. Your holodecks simulate reality because you can't actually enter computer-generated virtual realities without translating that virtual reality into something that can at least pretend to be real in your frame of reference."

"I've seen computer-generated virtual realities before," Troi said. "They're used for simplistic games, usually. Or something you want to be able to enter into and exit from more quickly and easily than the holodeck."

"Exactly. And there are two ways you can enter a virtual reality. You can maneuver a little avatar around with commands to the computer, or you can hook some sort of interface to your brain so the computer can generate the reality in your mind, making you feel as if you're really there. Which is sort of what your holodeck is doing, also." He leaned back in his chair. "The Q do the same thing. We can create a puppet, an avatar, to interact with you, and control it from the Continuum, make it run around and act like it's one of you... but that generally comes across to most mortals as kind of uncanny valley. We-"

"Uncanny valley?"

"The area in which something looks enough like one of you that you judge it by the criteria you judge each other by, but not enough like one of you to actually fool you. Most of the time we use avatars to play God, since looking not quite real to mortals actually helps with the whole 'I am a god, worship me' thing."

"But you've always looked... well, real. So I assume you do it the other way?"

"Right. I don't really see the point to interacting with your universe from the Continuum; you lose a lot of what makes your universe a fun place to hang out, that way. So I translate myself into a matter-based body." He pointed at himself. "This thing I'm stuck in is actually the specific one I used to use. When I had my powers, the body was animated by my powers - I didn't need to eat - but my consciousness was, actually, processing through this brain. Of course my consciousness was much bigger than could fit in this brain, so I had a... I suppose you could call it a direct connection, a link, to the Continuum, where I kept everything that I am that couldn't fit in this form. But brains are brains, I mean, if you've actually translated yourself into a form that has a brain, then your ego is actually located on that brain. Destroy the brain or the body, and I'd have just snapped back to the Continuum through that link, automatically. But shut down the brain or impair it, and I'd lose my ability to consciously access my powers."

"So you're saying that stunning you would have actually worked?"

"Well, no, it wouldn't have, because I was going into hostile territory to deal with a savage, primitive species who carry energy stun weapons, so I had a personal, uh, forcefield sort of... more like an energy blanket, something set up to soak up any form of energy that could affect the physical body I was wearing before it could reach that body, all around me. But it's the principle of the thing. After I did that, no one even tried to point a gun at me... and if they did, Picard stopped them. Because if I hadn't been prepared, if I'd been distracted and I didn't have the energy blanket up and I hadn't noticed a mortal coming up behind me with a gun, then yes... it could have worked on me. It's worked on other Qs." A wave of old grief and melancholy washed over him.

"It sounds like something bad happened to another Q because mortals were able to stun them?"

"Oh, yes." Q stared morosely down at the table for a moment. "He was an explorer, a tester, like me - though you'd have liked him better. He didn't present himself to the targets as a god, or an imposing, hostile superbeing. He'd show up as an alien who looked just like them, like I do, but he didn't admit that he was immortal or nigh-omnipotent; he'd admit he had nifty psi powers and a long life, and everything else he'd pretend to do with technology. So he had this thing he claimed was his ship, that he'd travel in space and time with, only he made it look like a small shed and then when you got inside it was a pocket reality and he'd filled it with fake technology with blinky lights, and it was much bigger on the inside so the mortals would be impressed... but they weren't supposed to know precisely how much more advanced he was."

"Why not?"

"Because he wanted to be friendly. And godlike beings can't be friendly. We can't be benevolent, we can't be your friends, or you start turning to us for everything. He had to hide how powerful he was in order to be able to present himself as a nice guy, or they'd have been pestering him for favors constantly. So he showed up on this one world, doing his usual thing, acting like everything about their culture fascinated him, bouncing up and down like a manic ferret. Trying to expose their hypocrises to them by playing clueless alien and taking their idiotic cultural biases to their logical conclusions. That kind of thing. And he made some friends on the planet, but some powerful people started pestering him for favors, because even though he wasn't admitting that he was omnipotent, he had demonstrated what seemed to them to be amazingly advanced technology. And of course he turned them down, just like you would have with your Prime Directive."

He was quiet for several moments. "If you don't want to talk about this, you don't have to go on," Troi said.

"Why would I have started talking about this if I didn't intend to finish?" Q asked tonelessly, staring down at the table.

"Perhaps because you didn't realize when you started that it would cause you pain to think about it?"

"Right, because I'm a total coward and also an idiot." He looked up. "They arranged for something bad to happen to one of his friends. A family crisis. Someone dead, or injured, or dying. I don't remember. He was distracted by his friends' distress, and trying to help them, so he never saw it coming. Soldiers shot him with a stun weapon that knocked him unconscious, and then they put a collar on him to block his psionic abilities. Which wouldn't have been an issue, except that when the Q are fully incarnated in a body, we use the brain's psionic centers to mediate our connection back to the Continuum. They thought he was a garden variety psion, but the collar they put on him to block his psi powers kept him from being able to contact the Continuum. He couldn't draw power, he couldn't translate himself back out of your universe, and he couldn't even call for help."

Troi imagined what it would be like to be cut off from her empathy. The thought was painful and ugly. "I'm sorry."

"Not as sorry as he was," Q said darkly. "They wanted the technology they thought he had. So they tortured him for information about it. And when he told them the truth, because of course he told them the truth, other Q don't generally have any better pain tolerance than I do, they didn't believe him. So they tortured him some more. And it was three years before the Continuum noticed that we hadn't heard from him in a while. So we went looking for him." His hands were clenched on the table. Remembered grief, remembered rage, welled up in him. "He was destroyed, Troi. Completely and utterly broken. Slavishly subservient to his captors, and in his mind... in his mind he went back and forth as to whether he wanted to beg them to use him for their amusement because it would make them happy and making them happy was all he lived for anymore, or whether he wanted to wipe out their entire species."

"Were you able to help him?"

Q laughed hollowly. "You could call it that." He pushed one of the drinking glasses around on the table, fidgeting. "The Q are never weak. Never broken. We're all equals, we're all equally strong. And nothing terrifies us more than the thought of a weak omnipotent being. A broken being can't control their own impulses, and they don't have rational responses to threats - give them ultimate power, and they still think they're powerless and they need to fight back against any kind of threat with everything they've got. They're a recipe for overkill. Literally. If we'd brought him back and let him run around free, with his powers, he would probably have annihilated your entire galaxy out of fear."

"What did you do, then?" Troi hoped it wasn't going to turn out that they'd abandoned him for being damaged. She kept her voice calm and professional, but the thought that they might have abandoned one of their own to torture because the torture he'd already suffered made him unsuitable to wield ultimate power was horrifying.

"We knew, if we took his powers away permanently... like they just did to me... even if we got him to someone who could help mortals with trauma like that, he'd never feel safe. He'd feel like he'd been abandoned, and that he was doomed, and he would suffer for the rest of his mortal existence. But we couldn't give him back his powers; he would have been far too great a threat to the universe in general. He would certainly have started by wiping out that particular species, and given that before his captivity he'd had friends among them, that he'd cared for deeply... we knew that if he were in his right mind, he would never want that." He clasped his hands in front of him and looked down at them. "So we brought him back to the Continuum. And we surrounded him, and held him to ourselves. And we told him..." His voice broke. "We told him that we loved him, that he would always be a part of us, and we promised him that he would never suffer again. And we took his pain and fear away, so he would feel safe, and loved. And then we dissolved him back into ourselves."

"You killed him?"

"Technically, everything that he was is still part of the Continuum. But... yes. Yes, his consciousness, his independent existence, is gone. We killed him. Because it was the most merciful thing we could have done for him."

"There wasn't anything you could have done to help him? To heal him? What about his friends?"

"He would have been terrified of his friends. They were part of the species that had tortured him for three years. And... being cut off from the Continuum is a punishment. It's not... just about being powerless. Or even mortal. Technically he wasn't mortal; his captors couldn't have killed him. It's... we're a part of the Continuum. And if we're not... then we're not a part of anything. Making him mortal would have killed him eventually anyway, and everything that he was would have been lost to the Continuum, and he'd have been miserable and alone, and he didn't deserve to die that way. He didn't do anything to deserve that."

"One might argue that he didn't deserve to die, either," Troi said, trying to keep her tone neutral. She could tell from Q's emotions that he genuinely thought that killing his friend had been the right thing to do - he felt grief over it, but little guilt. But the idea of putting down a sentient being for being broken, like humans used to do for horses with broken legs until medical technology allowed horses to fully recover from that, appalled her.

"Of course he didn't deserve to die, but only mortals think existence is fair. He didn't deserve to be tortured, either. He made a small, understandable mistake that any of us could have made, and it got him three years in hell and made us have to euthanize him, and he didn't deserve any of it. That's not the point. Even for the Q... the universe really does not care what you deserve. But given that he was never going to heal to the point where he could be trusted with omnipotence again, and if he couldn't be trusted with omnipotence then he was going to die eventually, one way or another... we made it quick. And he knew, at the end... he knew we didn't blame him, that we didn't think he'd done anything wrong. That he-" Q swallowed, his eyes bright, plainly struggling not to cry. "That he was still one of us, that even when we dissolved him and he stopped being a separate consciousness he would always be part of us." He took a deep breath, and then a long drink of the lemonade, and when he put it down he had control of his voice back again, no cracks or tremors in it. "Frankly... while I'm obviously getting off easier than he did when it comes to the three years of torture versus four days of being annoyed by mortality... I wouldn't trade places with him, but I envy the way he died. The Q threw me out." Another deep breath. "I'm going to be alone when I die."

"Is that why you said that being in two bodies at once when you go to meet the Calamarain would be... I think the word you used was 'perfect?'"

Q brightened up. She thought perhaps he'd forgotten their earlier discussion for a few moments. "More or less, yes. I mean... even if it doesn't work to make me one person, even if I'm two separate egos telepathically linked to each other instead of one ego with two bodies... if I have to die, and I can't be part of the Q ever again... being telepathically linked to another Q when I die, even if it's the kind of pathetic and sad excuse for another Q that my other mortal self would be, is the best death I could possibly hope for. Knowing... knowing that a Q who actually gives a damn about it is connected to me when I'm dying, even if they only care because they're another me... I mean, I wanted to do it this way in the first place because it would mean that even though I'm going to die, another me will live on after, but... I do have to confess, the fact that the self who'd live wouldn't get my memories and experiences up until the final moment did bother me."

Abruptly Troi realized that Picard, and for that matter she herself, had been viewing Q's intentions backward, and Q's somewhat incoherent explanations that he would be both persons hadn't helped much. But what he was saying now, the fact that he was speaking as if he was the being who would die and the other self who lived would be someone else, strongly implied that he had been telling the truth when he'd said he wanted to make a copy to live in his place, not to die in his place. Q thought of himself as the one who would die, and the one who would live as more like a legacy, or a child, or an artistic achievement - someone who carried aspects of himself and would live on after he died. He didn't want to send a decoy to die in his place; he wanted to leave a backup of himself behind when he died.

If she could convince Picard of that, if she could make him see that Q was not falling into the classic trap of thinking of a copy of himself as disposable, not a sentient being, but rather was thinking of his duplicate as someone who would live on in his place, she thought she could resolve this issue even if Picard didn't think two adult bodies could share one consciousness between them.

She also realized that although Q would probably rather undergo dental work without anesthesia than admit to it, he had more or less just confessed that loneliness and the loss of feeling himself to be part of something bigger than himself were causing him more pain than the actual loss of his powers. He hadn't gone into eloquent detail about how being powerless and mortal would have made his friend suffer if the Continuum had made the other Q mortal instead of killing him; he'd talked about how his friend would have suffered from loneliness and abandonment, and he kept mentioning that the other Q was still part of the Continuum even after his death as a separate consciousness, and that the Continuum's mercy killing had emphasized soothing the victim with the knowledge that he was loved and would be remembered. Q was not a being she had ever expected would feel lonely - with his powers, he had sometimes seemed almost solipsistically narcissistic, as if he couldn't actually imagine that other people had minds and feelings outside of his own - and he certainly wasn't admitting to it in so many words, but she didn't need to be an empath, or even a trained counselor, to read between the lines of what he was saying. The things he focused on when he told the story made it obvious. Possibly, part of his motivation for wanting to duplicate himself was the overwhelming loneliness he was suffering from, after he'd been exiled from his home and rejected by his species.

He wanted to belong. He wanted someone to care about him. He'd even admitted the part about wanting to belong, almost, when he'd talked about how he could be a helpful member of the crew in the early morning meeting yesterday. If he survived this in any way, whether it was as a transporter duplicate or some other means, those were levers she or Picard could use to move him, to shape his behavior. Given what a selfish ass he'd been in general, she hadn't had any confidence, when he'd first asked for sanctuary, that there was any real possibility he could fit in on the ship long-term; he would have to be highly motivated to change something as ingrained in who he was as his insulting, arrogant attitude, and if the fear of being made mortal and powerless hadn't been enough to make him stop acting like an ass with the Q, how could the fear of being kicked off the ship be enough? But she hadn't realized how deeply the fear of being alone and unwanted could affect him. He wasn't a sociopath after all; he was an unsocialized, impulse-driven, and in many ways immature being... but an immature being could grow up. People who wanted other people to like them, who wanted to belong to groups of other people, would change their behavior to try to make that happen.

His willingness to sacrifice himself, his drive to do all he could to protect the ship and help the people of the Bre'el system even in the face of his own death, indicated that deep down where it counted, Q was actually a decent person, as she'd told him. He was spoiled, self-centered, entitled, and quite possibly raised and trained with cultural standards that actively worked against his showing kindness, compassion or understanding - some of the things he'd said about the constraints godlike beings had to work with, and about the culture of the Q, suggested that his people had behavioral standards that might be adaptive for them, but were harmful for mortals - which meant he had millions of years of bad habits to overcome. But the fact that he feared loneliness almost as much as he feared death, or maybe more, and that he wanted to fit in, meant that if he survived this, there was a good chance he really could change, adapt to human culture, and actually become a likable person as well.

"Do you think you'd be able to modify the telepathic amplifier to allow you to have a full telepathic connection with your other self?" Troi asked.

Q nodded. "Pretty sure. The one we've got should need very little modification; the fact that I was able to use it to talk to the Calamarain means I wouldn't have to do much to it to let me talk to me. If I wanted to be telepathic with other people I think that would be an issue; it's more of a broadcaster than a receiver. But I really have less than no desire to go spelunking around in mortal minds; it was something I tended to avoid when I was omnipotent, and I can't see having any greater fondness for it now."

He was lying. The emotions that went through him as he talked about having telepathy again were wistful loss, faint hunger of the emotional kind rather than physical, and a short spike of the grief that had been plaguing him since he lost his powers, modulated by resignation. He wanted the power to read minds back... but it was a moot point, because he wasn't lying about the fact that he couldn't have it. He didn't seem to think the telepathic amplifier would let him read other people's minds, which was probably part of why he was trying (and failing) to convince himself he didn't want it.

That wasn't a problem Troi could help him with, not at the moment. If he survived this, he'd be dealing with the grief from the loss of his powers for the long haul, and maybe by then she'd have established enough of a rapport with him that she could help him with it, but right now he was, quite understandably, focused on his desire to live past the next few days. She stood up. "All right. I'll talk to the captain. You'll probably want to get back to engineering and back to work."

"Yeah, fine."

