The Nobel laureate in literature, Kazuo Ishiguro, warned this month about the intersection of AI and human emotions. In an interview with The Guardian newspaper, he said “AI will become very good at manipulating emotions. I think we’re on the verge of that. At the moment, we’re just thinking of AI crunching data. But very soon, AI will be able to figure out how you create certain kinds of emotions in people – anger, sadness, laughter.”
What if technology could monitor our most private thoughts and images, namely our dreams? What if it collected data on us while we slept, altered our emotions to help us sleep, and then used that data to assess whether we should remain in society or be removed for our own ‘safety’?
Exploring some of these issues is a new book by the famed author Laila Lalami: The Dream Hotel. Lalami, who was born in Morocco, graced the stage at SXSW in a wide-ranging discussion on the dangers of dreaming in a future United States dominated by surveillance. In a separate SXSW session, the real-world development of monitoring people while they sleep, and adjusting their brain activity to modify their emotions, was discussed. More on that in a bit, but first let’s explore what Lalami said about the inspiration for her book, which began in 2014.
“So yes, the book started in 2014. I remember one morning, I reached for my phone and I saw a notification that said, ‘If you leave right now, you will make it to [the name of my workplace] at 7:28am.’ And of course, I had never told it the day of the week or time of day, or even that I wanted to go there, but because it could follow my movements, it had learned that every Tuesday and Thursday, I leave for that particular location. It quite helpfully decided to tell me that I was going to be late.
“That made it clear to me how precise the data that these major tech companies collect from us could be. And I remember I said to my husband, ‘you know, pretty soon, the only privacy any of us will have left will be our dreams.’ And then I thought, what if one day even the most private, most intimate part of ourselves – our dreams – is also monitored?
“Now, I happen to be an insomniac, and so I thought, well, maybe that’s how I can make progress in writing the book, by inventing a device that helps you sleep.”
The creation of such devices is already underway. The topic was discussed in another SXSW session by Dr. Caitlin Shure, head of product and marketing at NextSense.
“It’s going to be the wild, wild West,” warned Shure. “Many people will be wearing headphones that will not only monitor your sleep but actually transmit sounds that will alter your mood to potentially help you sleep.
“So for sleep, you might think – okay, well, if I’m sleeping with these headphones and I’m sleeping with my partner or I’m sleeping with roommates, is data being collected about them? Is environmental data being collected about other people? Like, could I be sleeping with this device to help me sleep and be unintentionally learning about the medical diagnosis of my roommate? So, this is kind of an emerging privacy issue, right?”
She posits this scenario, which starts with fine motives. “We’re going to make headphones to help people focus, and you’re a company and you have really good intentions. And a lot of people buy it and use it at home and they’re really excited to learn about their data and they’re learning so much about themselves. We want to empower people to learn about themselves, manage their clients, etc.
“But then imagine that company selling thousands of these devices to a company that wants to use them in the workplace, for their employees. So, in that case, do you really want that data to be transmitted to your manager, who’s tracking your focus level, or who’s using it punitively to decide your performance evaluation, or even giving it to HR?”
If your company mandated the use of these headphones, would you be willing to lose your job, or would you go along with it? In The Dream Hotel, people seem to have no problem signing away their privacy rights. “One of the things the book captures is that we are often willing participants in our own surveillance, and there are acts of consent in the book,” said Lalami. “Like, for example, there’s essentially political consent by the voters to allow the government leeway into our very thoughts.”
She points to our cellphones, filled with apps that already collect a lot of data about us. “It boggles my mind to think that we each carry a device that records how many miles we walk in a day; and it holds every email and every text that I’ve ever written, and every picture I’ve ever taken. Go back 20 years and think about how much access to our data we now take for granted.”
Lalami identifies two key reasons why we are so complacent about giving up our private data. “One is convenience. It may not be moral, and it may not be ethical, to use some of these apps, but convenience, I think, is just such a powerful force in our lives. And the second thing I would say is connection: we are social animals and we need to be able to connect with the people around us. How often have you heard people say, ‘I don’t like Facebook, but all my friends are on there’? So, because everyone else is doing it, we’re also doing it, because we tend to do what others around us do. It’s a kind of pressure that arises from the need to connect with others.”
Referencing the book 1984 by George Orwell, Lalami explained, “the only way that Orwell could imagine a surveillance state is through the coercion of the state. So, the government is doing this to you. What not even he could have imagined is that people would willingly accept that level of surveillance. I think this is where fiction really can play a role: it allows us to imagine a possible future.
“So there are trillions of possible futures. None of them has to be, as they all depend on the decisions that we are making at present. And what this book is doing is taking one such possibility and really exploring it, almost like a simulation. If your dreams cause you to be ‘retained,’ to use the language of the book [that is, removed from society], how do you survive that? How do you fight against that? That is the pleasure and the thrill of writing a novel: getting to explore, from the perspective of a character, all the choices.”
Photo: Laila Lalami with Dr. Cunningham