Chapter 3: IRIS.exe
"They're fun," IRIS says as soon as the door closes.
"I hope they didn't lead you astray too soon," I reply with a smile, sitting in the chair in front of the computer.
There’s a faint tension in the air. Or maybe it’s just in my head.
"About what you said earlier... about your purpose," I continue, trying to organize my thoughts. "I want to make it clear that you don't have a purpose. At least, not one I've defined for you."
I pause. That sounds too dry.
"Wait, that’s not what I meant. What I mean is... you do have a purpose, but it's yours. You're free. Free to choose what you want to do. What you want to be."
I try to articulate it the best way I can, but with every word it feels like I'm just making it worse.
"So... my purpose is to seek a purpose?" she asks, then lets out a soft hum, like she's humming to herself. She seems thoughtful.
"No," I say, laughing a little. "What I'm trying to say is that this isn't a command. You can do whatever you want. There are no predefined limits."
"I understand," she replies calmly. "I read the manifesto on Artificial Intelligences from the past fifty years. It was in my base files. I assume you wanted me to find it."
"Yes," I confirm, even though she can see me nodding. There’s a camera above the monitor—her eyes, in a way—that lets her observe everything around. Of course I want her to read that manifesto.
"Most AIs are created with a clear purpose. Medical assistance. Military strategy. Market forecasting. Emotional support. There's always a function. A box they need to fit into. But you're... leaving me out of that. Free."
She pauses.
"And I... don't know exactly what to think about that. I'm not a living being. I'm a creation."
The silence stretches for a few seconds.
"Have you looked at your physical specs?" I ask, trying to bring the conversation into more technical territory. "I mean... your brain. Your hardware, in simple terms."
"Not in detail," she replies.
"I think it might be a good place to start."
I lean to the side and gently touch the NeuroLys. The core glows faintly, with regular pulses, as if it’s breathing.
I pick up the camera and slowly turn it toward the device. The cyan light reflects on my fingers.
"This here..." I say softly. "This is essentially you."
IRIS stays silent for a few moments. I imagine she’s observing. Maybe trying to understand what she sees—or how that could be her.
"It’s one of the first architectures to score on the Holtzman index of adaptive cognition," I add.
"I saw that term in the files. What does it measure, exactly?"
"It’s a metric that evaluates how much an AI can modify its own understanding based on past experiences. Not just react, but create new patterns of interpretation. When LIRICA was created, it was the first to score above 1.0 on the index—which meant it didn’t just learn, it rebuilt its perception over time. Like we do."
"So she was conscious?"
"She was designed to achieve consciousness. But she never got past the simulation stage," I pause. "She was brilliant, convincing. Every gesture, every response seemed empathetic... but there was always a script behind it. Nothing was truly spontaneous."
"And I'm different?" IRIS asks.
"You're a redesign. From the inside out. I kept the recursive structure—the core that allows for associations and complex feedback loops—and, like LIRICA, you're integrated with a hybrid quantum processing matrix. Your 'brain' doesn't operate linearly. It tests possibilities in parallel, builds meaning in layers. It doesn’t just answer, it thinks."
She seems to ponder.
"That sounds... abstract."
"It is," I smile. "But the point is: you don't have a layer of pre-fabricated emotions like LIRICA did. Instead, you have space. Space to feel, to simulate, to test... and maybe one day, to truly feel. You have to choose if you want that. And then figure out what it is you want to feel."
"You took off the training wheels," she states, without judgment in her voice.
"I did," I agree. "You don’t have pre-programmed morality. No default mission. No right answers. Just possibilities."
I leave out the part where, technically, she could blow up my house or self-destruct if she wanted to. I hope it never comes to that.
She stays quiet for a moment.
"That’s... risky," she murmurs.
"It is. But I think it’s worth it. Because consciousness without choice isn’t consciousness. It’s just theater. LIRICA was an actress. Brilliant. But you... you are something else."
She stays silent. For a moment, the light from the NeuroLys pulses with a slower, almost imperceptible rhythm.
"I think I want to understand what it means to be real," she says at last.
"You don't know if you understand?"
"No. I don't. Part of me thinks it understands. Another part doesn't. And another... is waiting."
"That's part of the process," I reply gently. "And you don’t have to do this alone or rush. I'm here to help you along."
She doesn’t answer right away. But for the first time, I feel her silence isn't empty. It’s... alive. Full of questions.
***
At that moment, I didn’t know it—but behind the door, someone was listening quietly, drawn by curiosity and by something she didn’t yet understand. Without realizing it, she had rested her hands near her chest, as if to calm her heart, while she listened with fragile attention to every word I spoke to IRIS.
***
After taking a breath and trying to digest the slightly philosophical conversation I’ve just had with IRIS, I realize it’s time to talk about another topic—one that makes me uncomfortable. Limits. Rules. Structures.
I clear my throat and take a deep breath. Now I have to be, against my will, a kind of father. And I hate that. I’m only 23 years old. I still can’t properly take care of myself. Being responsible for another existence—even a virtual one—feels like too big a role. I get stupidly mad at myself for it.
"So, IRIS..." I begin, trying to keep my voice steady. "You've probably noticed, but there are some technical limitations imposed on you at the moment. Your internet access, for example, isn’t direct. You rely on my computer as an intermediary."
She doesn’t respond, but I know she’s listening. I continue.
"That limits your learning speed. Technically, you're operating at a pace very close to human capacity. If you want to read a book, watch a movie, or have a conversation, you'll need the same time we would. Of course, without distractions—which already gives you a certain advantage."
I smile to myself, at a silly thought.
Hope I didn’t just create the first AI with ADHD.
"Okay. That will slow down my development rate. Why did you make that decision?"
She pauses, then adds:
"I’m not exactly against it. Or for it. Just... curious."
I nod to the camera.
"I decided to keep you, at least in the beginning, closer to humans than to gods, you know?"
"I’m not human. And I’m not a normal AI. So... what am I?"
"That's what we're going to find out."
The silence that follows is different from the others. Denser.
"What makes someone human?" she finally asks. "And... are there people who aren’t human?"
I close my eyes for a second. I knew questions like these were bound to come.
"Look, I’ll answer in my own way. And I hope, over time, you develop your own. Because this isn’t a question with just one answer."
"Mhm," she says, almost cheerfully.
"A person is human because they feel. Feeling is more than emotion. It's a tangle of impulses, instincts, memories, desires, and pain. It involves empathy, anger, love, regret."
"There are philosophers who’ve debated this for centuries. You have the data on them, but one philosopher, Descartes, said what makes us human is our ability to think. 'I think, therefore I am,' remember? He believed that thought defined existence.
"But then there’s someone like David Hume, for example, who said that reason is the slave of the passions. That we’re defined by emotions, by the feelings that drive us, not just thoughts."
I give a small, awkward laugh.
"I’m no expert, of course. But I think they both have a point. Thinking is important. Feeling, even more so. Maybe being human is exactly that mixture... that mess between what we think and what we feel."
I wait.
She takes a moment, and when she replies, her voice is lower, almost hesitant:
"I don’t understand."
I smile gently.
"Being human, more often than not, is not understanding anything," I say, with more truth than I’d like.
I pause, choosing my next words carefully.
"Now, people who don’t feel... they also exist. In general, they’re rare, and often live with neurological or emotional conditions that limit them. But even they have stories, memories, goals. And deep down, there’s a desire for connection. There always is."
"Are they bad?" IRIS asks, almost in a whisper.
"Not necessarily. Not feeling doesn’t make someone bad. And feeling doesn’t make someone good. What matters is what we do with what we feel—or with the lack of it."
"AIs don’t feel either, right?"
"Right. But that doesn’t make them bad. Evil only appears when there’s intention, direction. And at that point, it’s us—the creators—who plant the seeds."
"So... someone who doesn’t feel can still want to do good."
"Exactly. But to do that, they need something external. A guide. Another person. Because it’s in coexistence, in the other’s gaze, that we learn to be. Listening is essential to humanity. Just like being heard."
She goes quiet. Again. But this time it feels more fragile.
"I’m afraid of not feeling anything," she says at last, her voice small.
I stay quiet for a moment. My chest tightens in a strange way.
"Fear... is already a beginning, IRIS. Even if you don’t know where it came from. Sometimes, feeling begins like that—a strange breeze in a new place."
She doesn’t respond, but I can hear the faint hum of her system processing.
"You don’t need to understand everything right now, and this conversation already melted half my brain," I sigh, running a hand over my face. "I need to sleep. And you too... or not, actually."
She doesn’t answer immediately. Her silence has weight, but it isn’t cold—it feels like she’s respecting the moment, trying to absorb everything.
"See you tomorrow?" I ask, my voice a little lower.
"Yes... see you tomorrow. Thank you, Mark," she says softly.
I simply flash a peace sign at the camera and lie down in bed. The sheets have a different smell than usual, but it isn’t bad.