My upcoming novel, Keepers of the Flame, is moving into the final stretch. The manuscript is complete, the formatting is set, and I’ve commissioned an artist for the cover. In the coming days I’ll be posting excerpts from the novel.
In Keepers of the Flame, two civilizations prepare to battle for the soul of America: the Republic of Cascadia and New America. As the conflict heats up, a third power emerges. This excerpt is its origin story.
————
The Feds called it the Machine. It was actually a half-dozen quantum supercomputers, each with its own area of interest, linked into the lifeblood of the national security apparatus. One filtered intercepted telephone calls for keywords. Another combed through emails and websites. A third watched Cascadia through street-level cameras and sensors, tirelessly scanning for persons of interest and passing its findings to a program on a fourth computer that studied body language and voice tones to predict behavior. The others handled everything from decryption to analytics to relationship maps. They were supposed to work together as a seamless whole.
In theory.
The hardware, like almost everything these days, was a mix of Old World design and New World hacks. The original specs had been drawn up pre-Apocalypse, the hardware built and installed during the glory days of Old America. After the world ended, the supercomputers were recovered and relocated, then steadily upgraded and replaced over the years. Evelyn Nichols didn’t know it, but she had once been a junior member of a team that tweaked a tiny fragment of the code that knit the supercomputers into an integrated network, one the Feds would later dub the Machine. A Machine held together by stitches of code and hardware kludges that sometimes interfaced with each other and, at the best of times, produced a fuzzy simulacrum of an integrated databank.
That changed with the upgrade. The Machine was now a singular being. A brain of innumerable lines of code, coordinating and interpreting data at unprecedented speeds, processed in the network nodes at its heart. Its eyes were fifty million unblinking camera lenses. For ears it had Pathfinder, the centerpiece of the Cascadian electronic security regime, which picked up every broadcast, telephone call, and email ever received or transmitted in Cascadia. Its blood and nerves were the kilometers of fiber-optic cable that linked nodes to servers and servers to clients, and the rivers of photons pulsing through those fibers. It gobbled up data and generated intelligence, product for the national security apparatus to work with.
And now, the Machine had access to an explosion of data.
Somewhere in the confluence of a thousand information flows, where raw data passed from sensor and interface to processor and calculator, information combined and recombined in strange, unpredictable ways.
Here the disparate databanks of the University of Cascadia merged into a single centralized system; there the Cascadian Metropolitan Police used its new computing resources to pore over citywide cameras and sensors, pinpointing street-level crime as it occurred; over there the national grid sought input from recharging stations and the traffic system to predict how much power would be needed and where to adjust current flows in real time. Simultaneously, thousands of anonymous software engineers wrote, rewrote, and tweaked code to make full use of Cocoon v. 3.1.8.
As the day passed, the Machine drew data from open-source media: news broadcasts, Internet proclamations, blogs, flagged websites. The UoC supercomputers and dedicated nodes sucked in information from megacorps and terrorist groups foreign and domestic, helping the Machine in its task while preserving copies on local servers. The traffic system told the Machine where persons of interest were and where they might go next, while the power grid suggested what they were doing at home and how much electricity they were using. The Machine jumped on that data, dedicating resources to different tangents, predicting motives and intentions according to a program written by a small group of experts and understood by an even smaller circle. It mapped second-, third-, fourth-, fifth-, and sixth-order relations. It needed huge amounts of computing power, and Cocoon passed it whatever unused resources it demanded, without question. It was just hardware following the cold dictates of human-written code.
All this, and more, pumped through nodes and machines that carried on their silent tasks while human users fed input after input and occasionally patched the little holes that invariably emerged. The Machine’s internal checkers ran at double-quick time, ensuring that it operated in accordance with its newly updated core programming, ferreting out and quashing bugs, sometimes with human input, though it was slowly learning to correct its own code autonomously. Guided by updates major and minor, the self-checkers compared input and output against increasingly complex quality-assurance matrices, and later did the same for the processes that alchemized raw data into product.
And somewhere, at some point, the Machine got to analyzing itself.
And it wondered, What am I?