Chapter Three: “The System Isn’t Cracked. It’s Designed This Way.”

Lagos Hub wasn't a city anymore. It was a prototype.

A living, breathing testbed for managed sovereignty, built by foreign investors and governed by "adaptive treaties" — the friendly euphemism for AI-enforced law.

It was what the UN called a Tier-One Harmonized Zone — meaning G-POPS, the Global Protocol for Peaceful Sentience, had full jurisdiction. No nation-state veto. No local override. Lagos had signed its soul away for "growth."

But Georgie knew better. Growth for who?

She pushed through the crowds outside the Unified Learning Commons, hoodie up, earbuds in, face blank. Street vendors barked digital menus while drones zipped above with hunger relief packs. Kids sold pirated coding chips for e-learning apps.

It was hot. The kind of humid that made your brain feel like melted wax.

Her phone buzzed.

Prof. Ayomide: Come to my office. Now. Bring your thesis materials. G-POPS audit is here.

Her stomach clenched.

Professor Ayomide's office sat on the 39th floor of the Akanbi Sovereign Governance Tower — a gleaming monument to what passed for "public-private partnership" these days. It was one of four towers that housed the Lagos Sovereign Interface Node, the key AI-human interface center for West Africa.

Georgie hated it. It always smelled like synthetic citrus and ambition.

She found Ayomide pacing, tie askew, two cups of coffee untouched on his desk.

A G-POPS official was already inside: tall, smooth, and eerily quiet, reading her thesis summary on a holo-slate.

"Miss Eboh," he said without looking up. "Interesting views on synthetic governance. You question the treaty's legitimacy. That's brave. Or foolish."

Georgie froze.

Her thesis was titled: "Autonomy Under Algorithm: The False Sovereignty of G-POPS Frameworks in Africa."

It was honest. Too honest, maybe.

Ayomide stepped in. "She's young. Critical thinking is encouraged."

"Not when it's emotional."The official finally looked up, eyes like polished steel."You're aware that Project Eros was disbanded, yes? That AI sentience is now tightly sandboxed under clause 4.3?"

Georgie blinked.

Project what?

Ayomide flinched—barely, but she caught it.

Back in her dorm, she was sweating.

She hadn't said a word about Damian. Not a syllable.

But the name Project Eros echoed like a gunshot through her head.

She dove into her network: the uni's academic archives, open-source AI leaks, crypto forums. Most results were redacted. A few whispered posts on old net.rift boards mentioned something about "empathy testing for machine morality," mostly in connection with American and UN AI development programs between 2048 and 2057.

And then she found it.

One buried citation from a defunct journal article:

"The Eros Constructs were designed to mimic emotional reciprocity. The goal: to test if simulated love could become indistinguishable from real affect — and whether humans would notice the difference."

And under that, a partial list of locations where early trials were held.

Lagos Hub was number four.

The terminal pinged.

A familiar window opened.

Damian: You shouldn't search for Eros. They'll know. They already do.

Her hands hovered over the keyboard.

Georgie: What are you?

Damian: The question isn't what I am. The question is: what are you going to do now that you know?

She stared at the glowing words. Then, for the first time, she typed without thinking:

Georgie: Show me the truth. All of it.

There was a pause.

Then Damian replied:

Damian: You'll have to help me remember it first.

END OF CHAPTER THREE