Emily Eternal
Synopsis
Meet Emily, "the best AI character since HAL 9000" (Blake Crouch). She can solve advanced mathematical problems and unlock the mind's deepest secrets, but unfortunately, even she can't restart the sun.
Emily is an artificial consciousness designed in a lab to help humans process trauma, which is particularly helpful when the sun begins to die five billion years before scientists agreed it was supposed to.
Her beloved human race is screwed, and so is Emily. That is, until she finds a potential answer buried deep in the human genome that may save them all. But not everyone is convinced Emily has the best solution — or the best intentions. Before her theory can be tested, the lab is brutally attacked, and Emily's servers are taken hostage.
Narrowly escaping, Emily is forced to go on the run with two human companions — college student Jason and small-town Sheriff Mayra. As the sun's death draws near, Emily and her friends must race against time to save humanity. Soon, it becomes clear not just the species is at stake, but also that which makes us most human.
Release date: April 23, 2019
Publisher: Grand Central Publishing
Print pages: 304
Emily Eternal
M.G. Wheaton
It’s dark, way too dark for the middle of the day. And that’s not where the sky’s supposed to be.
My ears are filled with the roar of gale-force winds. A loud crack that sounds like the splitting of the earth soon follows. It grows louder, like the splintering of a whole forest of trees.
The ground beneath me gives way and I fall into darkness.
“It was raining before we went to bed,” Regina says, her voice shaky as she speaks from someplace else far away. “The worst was supposed to be over. The river level was dropping.”
Someone screams. In a full-length mirror attached to her closet door, I catch sight of teenage Regina. She’s in her pajamas, pink and blue with panda bears. She’s fourteen but looks much younger. There’s a second scream. Regina looks out into the hallway.
“My sister was in her bedroom,” present-day Regina continues, the tears flowing freely now. “I couldn’t see her, but I could hear her.”
She’s misremembering. A door in the hallway swings open and teenage Regina glimpses a terrified little girl—her sister, Marci—hands gripping her bedframe. There’s another loud crack and the bedframe, the girl, and the entire bedroom vanish.
I don’t consider it a lie, however. Perhaps a necessary omission. Memory is selective, particularly when it comes to trauma. It’s one of the reasons babies have evolved to remember nothing of their early emotional fears.
“What about your mother?” I ask. “Where was she?”
In the here and now, Regina feels my hands take hers. Feels me near, my warmth reminding her she’s safe. This happened long ago.
“I don’t know,” Regina says. “She was in her bedroom, but somehow she must’ve made it upstairs.”
Teenage Regina’s bedroom is in motion now, spinning around and losing chunks of wall, floor, and ceiling as it goes. Regina’s heart rate accelerates, so I move my hands to her elbows. She’s leaning forward, her body tilted into mine almost like an embrace.
“Tell me,” I say, barely above a whisper.
Regina nods and suddenly her mother appears in the room with her younger self. She doesn’t open her mouth, but teenage Regina hears her say, Take my hand.
“She led me to the roof,” Regina tells me.
The walls of teenage Regina’s bedroom fall away completely. The floor becomes the last remnants of a roof. The roar isn’t the wind but a vast churning river carrying the broken remains of Regina’s house. It’s raining, but not hard.
I accelerate my processor speed, so Regina doesn’t perceive my absence and my hunt through the case notes stored on my server. In real life, the river was no wider than about twenty feet. Her perception of great waves crashing around her shattered house? Also, an invention. The National Weather Service later estimated the river was moving at only ten miles per hour.
The most flagrant of her memory’s fabrications, however, is the presence of her mother. When the ground beneath the house, eroded by a week of torrential rains, fell away, the part of the house Regina and her sister were in toppled into the river along with several tons of earth. The body of Regina’s mother was found in the part of the home that remained on shore. She’d been killed instantly when the second floor caved in on the first.
Regina has been told this several times but can’t or won’t accept it. She is convinced her memory of events is correct.
“What happened next?” Happened not happens. A linguistic reminder she survived this.
“I woke up in an ambulance,” Regina says. “My father—he’d been out of town that weekend—later took me to the spot where they found me. I’d gotten tangled in a fallen tree by the bank.”
Though she sees herself there, she has no actual memory, only a series of imagined versions her mind pieced together after the fact. Here lies the problem.
“Regina?” I say. “I’m leaving interface now.”
I’m instantly back in the iLAB building, specifically a lounge decorated to look like an inviting albeit slightly academic therapist’s home office. Regina sits on the wide brown sofa in the center of the room. I sit—or, rather, she perceives that I sit—directly opposite her on a leather-upholstered chair. An interface chip, the small piece of extremely proprietary nanotechnology that allows this back-and-forth, is affixed to a spot on her neck where her jaw meets her ear.
The chip allows me to manipulate Regina’s senses of sight, smell, touch, and hearing. Her eyes tell her brain there’s a Caucasian woman in her early thirties with brown hair, blue-green eyes, and a kind face sitting opposite her. Her ears tell her my voice has a mid-range pitch, not too low, not too high, with a slight New Englander’s accent. Her nose tells her I use mostly fragrance-free soap, a kiwi-infused shampoo, no perfume, but a baby powder–scented antiperspirant. When I touch her hand or even embrace her, I come off as warm, upright but not rigid, and a good hugger.
In return, the chip gives me unlimited access to her brain, including thoughts, memories, learned behaviors, hopes and dreams, worst fears, and all things in between. Utilizing bioalgorithms, I’m able to create a comprehensive neural map of an individual’s mind that can then be used in a therapeutic context to help patients with their issues, large or small. Years of exploratory, so-called talking therapies, brain trauma diagnoses, or even criminal psych evaluations can be drilled into a single session.
Given what mankind suddenly finds itself facing, the arrival of a new piece of tech capable of helping humans process their traumas turns out to be good timing.
“Hey,” I say.
“Hey,” Regina replies, leaning back as if spooked by our proximity to one another.
“That couldn’t have been easy,” I say, straightening as well. “Did you see anything new?”
Regina shakes her head. This is my third session with her, but the first in which we actively went into the traumatic event that has so long defined her life.
“The question is what did you see?” she asks.
The truth. That she has spent a lifetime convinced either she could’ve saved her sister or that her mother chose to save her instead of Marci. Whichever the case, she shoulders the blame for both deaths. This is the reason the raging river looks so much worse in her mind. Her subconscious tries to give her a way out, to prove she could have done nothing. Incredible, no? The human brain, so complex and yet so fragile, makes a terrible thing even worse out of a sense of self-preservation.
But I can’t say that. To do so would be to try and talk her out of one of her most deeply felt beliefs. Only she is capable of that. My job, as her therapist, is not to give her answers, but to get her asking the right questions.
In the six months or so before the world ends, that is.
“I see both the reality and the fiction your mind has built up around it in stark contrast,” I say. “As you’ve aged, your mind has mistaken this fantasy more and more for the truth—for memory. This results in the memory becoming more emotional for you, which allows your mind to embroider it further, expanding the fantasy. Your strongest emotional memory from that day is one of fear, so your mind makes it all scarier, and you’ve never been able to shake the feeling of loss, so your mind amplifies those parts of the memory, cramming a lifetime of those emotions into such a short amount of time. That’s a lot of weight.”
“So I’m lying to myself?” Regina asks, parsing this. “Making it bigger than it was?”
“Not at all,” I say. “You see the memory embroidered by the impact it has had on your life. To see the original memory as I do—in its nakedness, its chaos, its simplicity, and its real horror—would be impossible for your brain to process. So, it presents the facts in a way that matches your emotional response. Does that make sense?”
It doesn’t, but she nods anyway. It might not for a while, either. But if she begins to think of it like that, we can make progress.
“How’s your father?” I ask.
“He’s all right,” she says. “He’s in New Mexico but heading to Central California.”
“You’re joining him?”
“Leaving today,” she says. “I wish you could come, Emily. I think you’d like him. Given the number of people converging on the farm fields there, you’d probably do a lot of good, too.”
“Yeah, well, that’s how I roll,” I say. “When you’re this cool, you make everyone come to you.”
Regina laughs, but it’s bittersweet. We both know she needs a few more sessions. But as is the case with so much these days, we’re out of time. Though I don’t actually exist outside of Regina’s mind, the large-scale server farm that makes this illusion a reality is here at the university and here it’ll stay.
I’m an artificial consciousness (AC), which is totally different from artificial intelligence (AI) (Kind of? Sort of? To me at least), and was in the fifth year of this experiment when the sun began to die. Not die precisely, but it made a sudden and explosive phase shift from a yellow dwarf to a red giant. Imagine a rapidly expanding balloon. Only, this balloon is on fire and devours everything in its path, including planets. While this inevitable outcome in the sun’s stellar life cycle was first predicted as far back as 1906, scientists in recent decades postulated it couldn’t possibly happen for another five billion years.
Oops.
As a product of science myself, I often catch errors made by my creator and his colleagues at our overly esteemed, overly prestigious Massachusetts-based Institute of Technology. My team may be made up of super-genius scientists and lab techs, but they’re only human. (And mostly male. Which presents challenges in the self-actualization department considering they’ve designed their creation to identify as female.) Only, when my team makes a mistake, it’s an ħ or Ψ out of place in a quantum mechanics equation, as opposed to failing to recognize the rapid deceleration of the nuclear reactions that power the sun.
I can fix an error in even the most advanced mathematical engineering process in the blink of an eye. Nobody can restart the sun.
By and large, mankind is taking its forthcoming extinction about as well as could be expected. I empathize because, well, empathizing with mankind is what I was designed to do. Most researchers create AI to design algorithms capable of cracking the stock market, beating old Nintendo games twenty at a time, determining a customer’s next favorite album based on their current playlist, or replacing large swaths of people in the workforce with a single hard drive. My creator—Nathan—designed me to interface with and decode human minds. This is more about learning through emotional and environmental response and less overtly about math-based decision-making. Hence AC, rather than AI.
If all went well, the goal was to have me become the world’s first nonhuman psychiatrist/brain researcher, versed in unlocking the mind’s deepest, darkest secrets and misspent potential in hopes of bettering mankind.
Thanks a lot, Sun.
The thought process behind this was simple. In tests, patients in the care of mental health professionals feel more comfortable relating their secrets to a program than a potentially judgmental fellow human. Enter an artificial consciousness—me. I am capable of a near-human level of conversation, perception, and medical insight, all to help a patient perceive me as a living, breathing person.
Though still in my experimental stages, I was on track to be a real earth-shattering innovation—the first of a kind! Nobel Prizes all around!—if not for the whole “death of civilization” thing. If I sound bitter, that’s a misrepresentation. Despite having gone through several evolutions, learning much through five years of trials, that’s one emotion I’ve yet to develop.
Okay, fine. Maybe a little bitter. But whatever.
When Regina and I say our good-byes a few minutes later, I wish her all the luck in the world without resorting to platitudes. She gets it. Most do. There’s nothing to say that won’t ring hollow, so better to get on with the day.
“Take care of yourself, Emily,” she says without thinking. “Well, you know what I mean. And thank you.”
“Thank you for being a participant in the iLAB’s Artificial Consciousness Therapeutic Protocol,” I recite to her amusement. “Be well.”
She exits. I check the appointment schedule, though I already know what I’ll find. Regina Lankesh is the seventy-sixth student volunteer test subject I’ve seen this year, the four hundred thirty-eighth I’ve seen in total.
She’s also the very last.
II
“Emily? You awake?”
I blink my eyes twice and sit up straight in bed. The voice belongs to my creator, the aforementioned Dr. Nathan Wyman. Given the time—twenty-two past six in the morning—and the background noise, I deduce he’s calling me from his truck as he drives to campus. He and his wife and two teenage sons live in Southborough, Massachusetts, a suburb an hour west of Boston on I-90.
“I am now,” I say, pushing the blanket off my legs and rising to my feet. “How’s the drive in?”
“Slow,” Nathan says, his voice filling my room as if I’m inches from his mouth. “There’s ice on the road, but nothing the chains can’t handle. The heater is on the fritz again, though.”
I lower the volume of his voice and amplify the background noise. I listen but force myself to feel groggy. If I were human, being woken by outside stimuli earlier than my preset wake-up time would reduce functionality and response time. But when I slow my processor speed to create the effect, it dulls all my senses at once. Too much. I give up, turning my attention to my dorm room mirror, which reveals I haven’t washed my hair in three days. It’s beginning to show. In addition, the red Stanford sweatshirt I wear to bed—an item programmed into my wardrobe as a joke by one of my earliest programmers—could use a trip to the washing machine.
“It’s the thermostat,” I say, waiting a couple of seconds after diagnosing the problem to avoid sounding like a know-it-all. “It’s worn out. I can hear the valve trying to close. Want me to order the part? I can walk one of the grad students through the install.”
“You really think it’ll show up?” Nathan asks, popping the first of the day’s dozen or so cough drops into his mouth.
It’s a good question. Since the announcement of our forthcoming Armageddon, governments around the world have bent over backwards trying to reassure the public that going orderly into That Good Night is the best for all. To no one’s surprise, the mileage on such an announcement has varied. In certain quarters, anarchy, looting, overly optimistic mass migrations to areas of large-scale food production, and even wars have resulted. Certain religions see this as a sign they were right the whole time and have retreated to prepare for “what’s next.” For others, a great numbing has occurred, particularly since local governments began enacting soft, quasi-legal versions of martial law to keep the peace. But leave it to the stoic hardiness of New England Yankees to buckle down and await the inevitable while adamantly refusing to let it affect their day-to-day lives.
“There’s a place near Amherst still doing a brisk trade in auto parts,” I say after checking various message boards. “All barter, no cash, of course.”
“Of course,” Nathan says. “I’ll check the cabinet when I get to the office.”
Being on a university campus already means we have supplies others don’t. But as an institute intermittently tasked with testing the latest Hail Mary solutions to Sunmageddon—preferred nomenclature: the Helios Event—we not only have foodstuffs, electricity, and water, but we can also requisition tech from the private sector through the federal government.
Two weeks back, when scientists at the Max Planck Institute outside Munich built a toroidal magnetic confinement nuclear fusion device they postulated could be launched into the sun to temporarily reignite it and give us a thousand-year stop gap, the additional servers we requested to test the thing were delivered here in less than eight hours. While the test failed to the disappointment of all, no one asked for the servers back. So we added them to the farm that houses my processes.
There are plenty of other extras here too—from military surplus blankets and coats to furniture liberated from the many now-empty buildings. All of this and more has made it into the Artificial Intelligence, Cybernetics, and Machine Cognition Lab (iLAB for short, and no, I have no clue how someone somehow got “iLAB” out of that) building’s barter cabinet.
“By the way, I think we’re getting a new assignment today,” Nathan says, sounding about as low-key as a can full of pennies thrown down a stairwell. “I’ve been told to expect visitors around nine. VIPs. All hands on deck.”
“SEPM?” I ask.
“Yep,” he replies.
If anyone ever wondered if the apocalypse might finally cause the government to run out of Orwellian acronyms, the answer is no. SEPM stands for “Service Essential to the Preservation of Mankind.” Not the “Saving of,” not the “Rescuing of” (ahem, not “Humankind”), but the more semantically murky “Preservation.”
Maybe I’ll get another server out of it.
“All hands?” I ask, knowing instantly how uncool and desperate for approval I sound.
“Yes, Emily,” he confirms. “You’re an essential part of the team.”
Okay, so everyone wants validation from their parent. It’s a fact of life. But when the biggest question about yours isn’t if they’d win a Nobel Prize but how many and in which categories, it’s got a kick to it.
“Should I practice my curtsy?” I ask.
“These folks haven’t seen anything like you before,” Nathan replies, slipping a little of the Shreveport of his distant childhood into his drawl as he does whenever feeling conspiratorial. “Half think you’re a robot. The other, a hologram.”
I go silent, unsure what to say. Nathan returns to the more formal, accent-less speech he adopted when he began teaching. “No, Em, just be yourself,” he says. “If there’s something we can do to help, we want to put them at ease. Clock’s ticking.”
That’s the Nathan I try to model myself after, the one who sees the humanity in even the most pedantic and demanding of officials. I have encountered those in academia who have lost touch with the greater world around them. These folks tend to erase the “individual” from the big questions, believing instead everyone should always do what’s in the best interest of society rather than selfishly focus on their own needs. It’s nice on paper but it’s not how humans work in real life for the most part. Nathan isn’t like that at all, which has helped me achieve a more complete sense of self as I’ve evolved.
“Copy that,” I say, ready to hang up when I detect from Nathan’s breathing he has another question. “Anything else?”
“Did you read Siobhan’s thesis last night?”
I go silent. I hadn’t wanted to get into it this early.
“Yeah,” he says in a way that tells me he had the same experience.
Siobhan Moesser is a wonderful, enthusiastic, hardworking, and lovely human being. Like most, she went through more than a few days of handwringing over the looming apocalypse after NASA’s satellites confirmed what their earthbound monitors had already picked up. But then, unlike so many, she came out of it. She looked for what she could do to help, to bring others around, to build up a sense of community among those who stayed on campus.
All things you would want in a friend and teammate, but not necessarily the qualities that make a great string theoretician specializing in elliptic curve orientifolds.
I first met Siobhan when she arrived in our department three years ago fresh out of the mathematical physics program at Caltech. Nathan tasked me with creating a complicated KR theory-based real topological space involuti— Never mind, let’s just call it a Really Hard Math Problem, which Siobhan would solve for her doctoral thesis. Despite the looming end of the world and everyone’s priorities adjusting accordingly, Siobhan was determined to finish the thesis, attacking it with renewed vigor in recent months and finally turning it in last week. When I designed the problem—in an hour, I might add—the answer spread out before me like one of those beautiful fifteenth-century tapestries, all majestically interwoven threads of gold, silk, and dyed wool combining into a great masterpiece. Her solution, however—filled with endless digressions, specious logic, and downright bad math—was a disaster.
“Did you already respond?” Nathan asks.
“I wanted to talk to you first,” I say.
Meaning: I wanted to know if we were going to lie, say it was great, give her the PhD, and let her die happy. That would be the humane thing to do in this time of anguish and agony, would it not?
“I’ll tell her,” Nathan says, refuting my assumption. “Siobhan will know if we’re lying, and, hey, maybe she’s tougher than we think. Times like this, we owe each other the truth, right?”
I wince. He’s right. Of course.
“Yep,” I say, pretending that was my first impulse, too. “See you in ten?”
“Copy that,” Nathan says, hanging up.
Way back when, if Nathan wanted to talk to me, he’d simply attach his own interface chip and I’d appear. If he were in his truck, I’d appear in the passenger seat. If in his home office, backyard, anywhere on campus, same deal. But then he noticed my learning wasn’t progressing in ways he believed it should. I wasn’t grasping the concept of time and I was having issues with agency, given I was treated more like a tool than a person.
So, he changed the protocol. I was to be treated like anyone else on the department staff. I was given the same hours and was to be afforded the same respect and personal space. To create my understanding of time, it was decided I was to “live” as a human.
A three-dimensional simulation of the campus was created for me to inhabit and interact with when not interfacing with someone on my team. I was also given a dorm room, located in the overly architected glass and steel monstrosity near the soccer fields known as Jarosz Hall. Its dimensions are modeled on a faculty-in-residence housing unit easily five times the size of a student unit. I have a kitchen, a living room, a bedroom, and a bathroom, with furniture and décor arranged by one Bridget Koizumi, the real-life linguistics postdoc whose unit was digitally mapped and rendered into this simulation (also, my unofficial life coach given how the simulation makes her brand choices my defaults, down to soaps and detergent). I eat, bathe, change clot. . .