
5.6 seconds after Data pressed the door chime, he heard Deanna Troi's voice.

"Come." Her tone was even and uninflected, and within what he knew to be her normal parameters while on duty.

Data stepped through the door as it opened. Counselor Troi was sitting on her couch, facing the door, as she had been on forty-seven point three six percent of the times he had called upon her in her office. Her face and body language conformed to the parameters of "professionally neutral" as he knew them from general study and personal observation of her during their time together. Neither her heart rate nor her breathing changed as she saw him, and her pupils remained constant. There was a high probability that she was not afraid of him, despite what she had seen him do to Geordi. That was good.

"Hello, Data," she said. "Come in, take a seat."

He waited the appropriate interval necessary for communication with humans before responding. Normally, he was not conscious of the programming that handled such details, but at the moment he was monitoring all of his subsystems with close attention. It required significant computing power, and would not work as a long-term solution. "Hello, Counselor Troi." He took the seat indicated, and considered developing a subroutine to handle the monitoring, but set it aside for later consideration. He did not wish to be distracted at this time. "I am here to apologize for my behavior."

"Apology accepted," Counselor Troi said. "But Data, you have nothing to apologize for. You weren't yourself—you were being controlled by Lore."

"That is true. And yet, to some degree I allowed myself to be manipulated." Data cocked his head and waited 1.7 seconds before continuing, to give the appearance of thoughtfulness. Was it an attempt to facilitate communication with his human and humanoid colleagues? Was it manipulation? Was it both? "That may be because I had never before experienced the sensations those emotions produced in me. The effect, particularly when combined with the deactivation of my ethical subroutines, was … difficult to counter."

"Emotions can be difficult for all sentient beings to deal with," Troi said, gently. "And most of us have grown up with them. You had no time to accustom yourself to handling emotions, and the only emotions you could feel were those Lore wished you to. And yet you managed to overcome his influence once your systems had been restored."

"It was not easy," Data acknowledged. "Even with my ethical programming restored, Lore was able to impose significant pressure on my thought processes, through both his words and the emotions he projected on me. I had known that humans sometimes do things that they know to be wrong because of emotional motivations, but I had not understood how or why that was possible." He paused. "I did not know how that … felt. Nor did I believe I would be susceptible to such pressures if I were to experience emotion, given the strength and explicitness of my ethical programs relative to that of humans."

"Now that you know that about yourself, has anything changed?"

Data compared her tone to his database of Counselor Troi's voice, and decided there was a 72.4% chance she was anticipating a particular answer from him. Counselor Troi had learned much about his unique thought processes during their years on the Enterprise. He found it interesting that she, with far less data available, and far less analytical power than he, was often more adept at understanding him than he was at understanding even those living beings he was closest to. "Yes. I am no longer sure that I wish to have emotions. I do not wish to be susceptible to such manipulation in the future."

She nodded. "That's very understandable, Data. It was a traumatic experience for all of us, but especially you."

Geordi's experience had been far more difficult than Data's, he noted, without changing his expression.

"I think you need to spend some more time dealing with the repercussions before you make any big decisions. You have been under a lot of stress, and your programming was tampered with, and you had to kill your brother for the second time. When people have been through such traumatic events, they often make decisions in the aftermath that they later regret."

"Unlike other people, I am not swayed by emotions at this time," Data said. "Also, given the speed at which my positronic brain functions, I have had time to fully consider the ramifications of any decisions I might make at this time."

Counselor Troi smiled. "I know, Data, and I'm not trying to convince you to do anything you truly don't want to do. But there's a difference between considering the ramifications, and living with them. There's also a difference between understanding an experience, and integrating it into your thoughts and memories. I just don't want you to make any decisions now that you might regret later."

"Has Geordi been to see you for counseling regarding his experiences?" Data was curious: had she heard about his decision to destroy the emotion chip from Geordi, or had she inferred it from this conversation?

Counselor Troi tilted her head. "You know I can't tell you that, Data."

Given her autonomic responses such as breathing, heart rate, and pupil dilation, there was a high probability that the answer was yes. "I believe that Geordi may be understating his own reactions to his experiences because of his friendship with me. I am not sure that this is a healthy way for him to process what I did to him."

"Are you sure you're not doing the same thing?"

Data blinked. "Please elaborate."

Counselor Troi sighed. "Data, you're right that Geordi went through a terrible ordeal. He was tortured. He was tortured by his best friend, and the fact that he knows, intellectually, that it wasn't your fault doesn't change that. I would never downplay or ignore his suffering. However, Geordi is an experienced officer who's served on a front-line starship for over six years now. He's been tortured before, and recovered, and has learned coping skills. He also has a good support system and knows how to use it. Dealing with the issues raised by this mission will take time, but I have every confidence in Geordi's ability to do so. But you also suffered, and in a way that you never have before, in a way you are not well equipped to handle. Of course you are worried about Geordi and the rest of us, and you are right to be. But I'm concerned that you might use that worry as an excuse to avoid truly processing and examining what happened to you. And in the process of hiding, you might throw away your chance for something you've dreamed of your entire life."

"Even after my ethical programs were restored, I continued to follow Lore, causing additional suffering," Data said. "If that is what emotions cause, I do not want them."

"But Lore is a very skilled manipulator," Counselor Troi said, leaning forward. "The emotions you felt were not your own—the only emotions you felt were those Lore believed he could use. That is not what you would have felt in any other circumstance, and it's not the way you would have acted. And you had never had to deal with emotions before, yet you were still able to act." She paused. "What do you think would happen, if tomorrow we encountered someone else who was able to project emotions at you and force you to feel what he or she wanted you to?"

"I do not know," Data said.

"What happened with Lore, after your ethical programming was rebooted?"

"I went to Lore," Data said. He reviewed each action he'd taken, each step of each decision made. "I tried to argue with him, but I wanted those emotions so badly that I could not disobey him, or simply attack and overpower him without warning so that I might free you. I did not act against him until others did so."

"It took time, for you to do what you knew was right," Counselor Troi said. "Do you think it would take as long for you to act if it happened tomorrow?"

Data reviewed possibilities. He knew what he had done; but the vague scenario the Counselor outlined had too many variables for anything approaching a clear answer. "I do not know."

"I think you would do the right thing," Counselor Troi said, gently. "I think you've learned from this, or you could if you let yourself. I think you would be able to respond faster."

"On what do you base this belief?" Data asked. "Recent events would seem to argue otherwise."

"I have faith in you, Data." She smiled. "So does Geordi, and so does Captain Picard."

"You believe I should repair the emotion chip and install it," Data said. "In your professional opinion, would that help me 'process' my actions over the last few days?"

She hesitated. "Data, no one can make this decision for you. Only you can decide what's best for you. I don't think you should rush into anything, right now. Wait until you've recovered more from what Lore did to you."

"I am not concerned with what Lore did to me," Data said. "I am concerned with what I myself chose to do. It is my own actions, not Lore's, that trouble me. I cannot control Lore or anyone else who may try to do what he did. But I can control myself."

"But what you did was a reaction to Lore's manipulation," Counselor Troi said. "You are responsible for your own actions, and yet you would have acted very differently if Lore had not abused you."

Data cocked his head, analyzing the psychological definitions of 'abuse' as they might apply to this situation. "You believe Lore abused me?"

"He separated you from your friends and loved ones. He forced you to rely solely on him for something you've wanted all your life. He programmed you to his liking, and only let you feel and think what he wanted you to. That is a more extreme form of mental abuse than most humans can manage."

"He said he loved me," Data said, every moment of all their interactions running through his mind on a continual loop, searching for patterns he had not noticed before. "As I was deactivating him, he said he loved me."

Counselor Troi stood, moving to the couch to sit next to him. He did not react, intent on his analysis.

She took his hand. "I'm sorry," she said gently. "That must have been difficult."

"It was. Although I assume it to have been a last attempt at manipulating me into letting him live."

"That's probably true," she said. "Although that doesn't mean Lore didn't mean it. Lore's emotions and thought processes weren't typical, Data. If he were human, I would diagnose a severe personality disorder. Many abusers do genuinely believe they love the people they abuse. I would classify his feelings as closer to obsession than love." She sighed. "But if he didn't care for you, in his own way, he would never have expended so much effort to get you on his side."

Data frowned. "You are not making my decision easier."

"I know, Data," she said. "I know."