Emily’s colleague, Mister Pritchard, had brought a pair of armored, bulletproof SUVs to the prison. The military police saw them off cheerfully. They were rich now, after all. Fulton imagined that as soon as the convoy pulled away from the prison they would throw their weapons in the dirt and trek home, never to fight again.
Pritchard was up front with the driver. Fulton sat in the back with Emily. The car jostled them into one another as it snaked through the labyrinth of slum streets.
“On the face of it the app is worthless,” Fulton explained. “It’s not even special encryption, just Reese banging on about his worldview. Take one of the characters — if you can call her that — half spider and half girl. She’s patterned after Claire, this girl Reese used to sleep with. She’s not an exact match, of course, and not just because of the metal abdomen. He distorted her features. He did it to make fun of her.”
“Then what do the cartels want with it?” Pritchard asked.
“They couldn’t care less. I bet their freelancer wants it — this Arab they’ve got shut up in a room somewhere.”
“I thought you said it’s a joke?”
“It is a joke,” Fulton said. “On the surface at least. But there’s something fairly sophisticated happening underneath. The app does something with your memories. It takes bits and pieces and twists them around. Reese found a way to tag and extract memories from a digitally augmented brain. Not just memories, I suspect, but fantasies and fears as well. I’m not entirely sure how it works yet. Bear in mind this kind of modification is illegal in most places. There hasn’t been a whole lot of research on the subject. Consider for a moment the overarching trend in technology: digital integration, augmentation and immersion. You think of it in terms of defense, but it’s much bigger than that. Think of the pedestrian meandering down the street, browsing restaurant reviews on his lens. Think of the commercial pilot, the futures trader…”
Emily gave him a blank look that reminded him of Robyn, so Fulton skipped the detailed case studies he’d been working up on the fly and cut straight to his conclusion. “Reese tore the cover off the brain’s circuitry and started fiddling around with the wires underneath. Take this to its logical conclusion and you’ve got the ability to alter human perception of reality, potentially en masse.
“Imagine someone writes a virus that bridges the gap between the integrated chipset and your actual brain. He doesn’t just infect a hard drive. He infects your mind. Maybe you end up on a perpetual LSD trip. Maybe your brain starts running traumatic stimuli on a loop. Maybe you lose control over your nervous system. To a hacker this is the holy grail: hacking the human brain. Now imagine that propagating across an entire network. You literally have the ability to trigger mass insanity.”
There was no scientific basis for his last statement. It was pure supposition. Fulton was not a hacker. An avid user of technology, certainly, but for practical purposes. Like most people he viewed technology as a means to an end.
The hacker, on the other hand, fetishized technology. For the hacker it was all about thrills. The end was incidental. This of course went a long way toward explaining the preponderance of freelancers in the world. Freelancers only cared that they were hacking, and that they were doing something BIG. As a wise man once said (or typed): “it’s all about the lulz.”
Fulton had worked with many a hacker over the years. Reese, for example. But he still understood them only in a distant, academic sense. From what he’d seen they preferred to exist in a different world. Maybe not in the way most people considered Mars a different planet from Earth. Maybe “world” wasn’t the right word at all.
The hacker’s world looked like ours, worked like ours in most respects but was not bound by the same social, moral, physical — even perceptual constraints. Which of course was why Reese struggled so desperately with anything that involved relating to other human beings. He existed above and beyond them, in an almost evolutionary sense. Reese was like Merlin trying to relate to the illiterate serf shoveling manure out back of Camelot.
Somewhere in the midst of this train of thought, long after he’d forgotten whether he was speaking aloud or merely thinking very, very intently, Fulton realized everyone else was staring at him.
Emily, Pritchard — even the driver eyed him sideways as he worked to keep his head on the proverbial swivel.
“Am I rambling?”
“Back to the part about spreading a computer virus through people’s brains,” Pritchard said.
Fulton swallowed. He didn’t have all that much to say on the subject. Really he was just thinking out loud. A bad habit. This was not the first time it had resulted in him biting off quite a bit more than he could safely chew. Granted, prior to this about the worst damage he’d ever done was derail small-talk at cocktail parties.
“It would be something of a cross between information and biological warfare,” Fulton said, immediately looking to Emily for approval. It said something about the current state of affairs that the woman who’d snapped and killed a half-dozen people earlier in the evening was beginning to look like an emotional anchor point.
If this explanation at all impressed Emily she didn’t show it.
For whatever reason Fulton thought back to Reese’s suicide note, to the image of the two of them mid-coitus playing beneath his feet, the visual manifestation of some subconscious fantasy he feared would now haunt him for all eternity.
“It’s nothing new,” Emily said. “NSA pegged it in a Threat Assessment almost as soon as the first generation of chips was being plugged into brains. Professionals have been working on this for a long time.”
“The theory of it.”
“He’s that smart?” Pritchard asked.
“Was,” Fulton corrected. “But yes, I believe so. From what I understand there are two key elements that come into play. First, you have to crack the security on each model of chip. That’s the easy part. I’m sure the NSA got at least that far experimenting on convicts — however they do research and development these days.”
At that Emily stiffened a bit in her seat. “The hard part,” Fulton continued, “would be finding a way to pull memories out of someone’s consciousness. That’s where Reese got ahead.”
“Some kind of AI. Reese’s specialty was cognitive development for AIs. He taught machines to learn through various carrot-and-stick strategies. Most were all stick and no carrot. Every Cub Scout with his computer science badge knows you can shape an AI’s behavior with imperatives. Reese gave his AIs the imperative to survive, then programmed them to die unless they accomplished certain tasks. He taught them to fear.”
“Crude,” Emily observed. Again she had that distance in her voice that told him she’d gone somewhere else in her mind.
Fulton shrugged. “Efficient, was how Reese described it. His AIs were quick studies.”
No sooner had the words left his mouth than the building to his right exploded.