Can I Ask My Friend to Turn Off Her Alexa?
Can I ask my friend to turn off her Alexa when I go to her house? To answer that question, Dr. Simone Browne joins the show to talk about privacy, ethics, and whose job it is to convince someone not to opt into surveillance.
- Dark Matters: On the Surveillance of Blackness
- Interview with Dr. Simone Browne
- Digital Epidermalization: Race, Identity and Biometrics
- Race and surveillance
- Simone Browne | Dark Sousveillance: Race, Surveillance and Resistance
Advice For And From The Future is written, edited and performed by Rose Eveleth. The theme music is by Also, Also, Also. The logo is by Frank Okay. Additional music this episode provided by Blue Dot Sessions.
To get even more, you can become a Flash Forward Presents Time Traveler for access to behind-the-scenes exclusive content, early access to new shows, and other surprises & goodies.
Show sponsors:
- The Listener: A daily podcast recommendation newsletter, sending three superb episodes to your inbox every weekday. Get 20% off your first year using the code ADVICE20 at checkout at thelistener.co/advice.
- Shaker & Spoon: A subscription cocktail service that helps you learn how to make hand-crafted cocktails right at home. Get $20 off your first box at shakerandspoon.com/futureadvice.
- Tab for a Cause: A browser extension that lets you raise money for charity while doing your thing online. Whenever you open a new tab, you’ll see a beautiful photo and a small ad. Part of that ad money goes toward a charity of your choice! Join team Advice For And From The Future by signing up at tabforacause.org/futureadvice.
- Tavour: Tavour is THE app for fans of beer, craft brews, and trying new and exciting labels. You sign up in the app and can choose the beers you’re interested in (including two new ones DAILY), adding them to your own personalized crate. Use code futureadvice for $10 off after your first order of $25 or more.
- Purple Carrot: Purple Carrot is THE plant-based subscription meal kit that makes it easy to cook irresistible meals to fuel your body. Each week, choose from an expansive and delicious menu of dinners, lunches, breakfasts, and snacks! Get $30 off your first box by going to www.purplecarrot.com and entering code futureadvice at checkout today! Purple Carrot, the easiest way to eat more plants!
TRANSCRIPT
Transcript by Emily White @ The Wordary
Advice For And From The Future
S1E9: “Can I Ask My Friend to Turn Off Her Alexa?”
[store bell jingles]
[Advice For and from the Future theme kicks in: low, long synths under a steady, crunchy rhythm]
ROSE EVELETH:
Hi again. Welcome back. I’m so glad you could join us. The repairs from our little incident earlier this month are almost done, and here you are. Got a question about tomorrow? Well, you are in the right place. Welcome to your friendly neighborhood futurology shop, where you can get the answers to tomorrow’s questions, today.
On today’s trip to and from the future, we are considering questions of surveillance and friendship. This is an emailed-in question. Ellia writes in the following question. They say:
A lot of my friends at this point have some kind of smart home assistant; Google, Alexa, etc. Often, they are hidden in the house so I don’t even know that they’re listening until they say, “Hey Alexa, play some music,” or something like that. Personally, I hate the idea of being surveilled constantly by these devices, and I really don’t want to normalize them. I’ve been tempted to ask my friends to turn their smart home systems off when I visit. Is that weird? Is it okay to ask your friends to turn off their surveillance devices when you come over?
[theme fades out]
I love this question because I actually have the same one. I would love to know the answer. And to grapple with this conundrum, I called Dr. Simone Browne, an Associate Professor of African and African Diaspora Studies at the University of Texas at Austin, and the author of an amazing book called Dark Matters: On the Surveillance of Blackness.
ROSE (on call):
Simone, thank you so much for coming to talk about surveillance and how to talk about surveillance with people. I guess my first question is how often you talk to your friends about surveillance and about these technologies, or do you try to avoid it in social conversation?
DR. SIMONE BROWNE:
I think I avoid it sometimes in social conversations, or I’m not, like, in teacher mode, but just will ask… just talk about the things in their lives that they’re using, or that’s being done to them in terms of crossing borders, whether on their social media or other things. I think I play the long game with friends in that, and hopefully they come around at some point because, you know, many people just see it as a form of convenience in their lives.
ROSE:
Do your friends ask you for advice about surveillance technology? Do they come to you as, like, the expert?
SIMONE:
No. (laughs)
ROSE:
Really?
SIMONE:
I don’t think so. I think I have a different type of friend group where I don’t take that stuff home. But I feel like sometimes my colleagues, and students, and people that I know… I think I probably have different definitions of friends.
ROSE:
I feel like I get text messages from friends all the time, like, “Is my phone listening to me?” And I’m constantly being like, “Well, yes and no. Not in the way you think,” like, trying to explain it to them.
SIMONE:
Yeah.
ROSE:
When you go to someone’s house, do you notice if there is one of these, sort of, always-on devices? Do you notice if there’s, like, a Ring device? Are you looking for that stuff?
SIMONE:
They’re in a lot of homes now, so it’s almost like an expectation for a particular class and category of people that they have things like, you know, Google Home or Echo. I think at some point they were giving them away for free practically, you know, to source data from people’s homes. So, I think I see a lot of people with those things. And I feel like, in some homes they kind of hide it. It’s in a corner somewhere. Maybe for their own aesthetics, but maybe it’s just that they have a certain type of, maybe, anxiety and guilt about how those things look.
ROSE:
If a friend did come to you and say, like, “Should I buy one of these?” like an Alexa, or Google Home, or something like that, how would you advise them?
SIMONE:
Well, that has happened to me before, and I just say no, and “What do you need it for? So you can have your playlist?” I think that’s really the main thing. But oftentimes they come to me with a product that they already bought. So for me, it’s like… I tell people sometimes some spectacular story about this technology, and it’s just, like, they are looking for the trigger word or the wake-up word and it has to listen. How is it going to understand various accents and ways of talking? There’s training that’s being done by the recordings in your home. And they can say that they’re disposed of at some point or they’re not checked, but how many times do you get on your phone and have to say your credit card number out loud, or social security number?
Those are things that you’re relying on, not just an AI, but actual people that are being outsourced to do this thing that… you know, people are people. We just saw recently the kind of breaches that were happening around Shopify and people’s data in terms of credit cards and stuff. So, it requires a lot of trust. Those types of stories are the ones that I tell.
ROSE (Mono):
In case you missed it, in September, Shopify revealed that two employees had, basically, gone rogue and stolen data from about 200 merchants who use the service, and that included names, addresses, emails, and phone numbers. This is not the first time this has happened. In fact, Amazon also recently caught some employees who were taking bribes in return for helping sketchy merchants look more legit on the site. Humans, more often than not, are the weakest link in any security system.
SIMONE:
I have another friend that has some facial recognition thing on their television, which is a kind of convenience, but what does it do in terms of, like, spontaneity? You can’t just walk around your house naked anymore with that thing. I get them to think in those ways and maybe they can question that, you know, you can actually get up and, you know, turn the track on the CD, or the record, or whatever it is.
ROSE (on call):
Yeah, is there a particular story that you go to that you feel like works really well? I’m secretly asking because I want to better describe to my friends why they shouldn’t be buying these things. I feel like it’s hard when it’s vague, where you’re talking about what feels like a vague security threat. People are like, “I don’t really care about that.” But when you can tell a specific story about something that happened or can happen, that tends to work better, it feels like. Do you have a specific one that you go to?
SIMONE:
I think for a lot of people that have things for security measures, they see themselves as the one that needs to be secured from outsiders. So, those stories about, you know, whether it is package porch pirates or whatever they call them, or somebody coming into their home, and that there could be a misrecognition or whatever it is, or that this data is being fed to police stations, that’s a selling point for these types of people. So, those stories, once you have those things in your home, I don’t think that’s really going to work. I think the question is to say that there could possibly be some type of, like, data breaches, leakages, what happens to your information if it’s traded, or stored, or whatever. But those kinds of at-home assistants or AI assistants, I think the stories I could tell, probably for many people, don’t garner that much… pull any kind of heartstrings.
I’d gone to Accra, Ghana to look at the kind of electronic waste work that is happening in a space like that, where our end-of-life products – phones, any type of consumer electronics – end up, to think about what happens when Alexa goes to Accra. It’s really… The health effects of this type of dismantling are so brutal on the people doing the work and the people living there. But if you can’t see that, or you refuse to see that, the labor and all of these things that go into it, you just want your convenience because you need to pull up a recipe – who wants to get their hands dirty or their cookbooks dirty?
So, I think that for some people, it’s going to be quite difficult to make those arguments, but you have to decide for yourself, what’s your political project in how we consume? I think that’s where it… I don’t want to give up on those folks, but you just have to find those stories where some kind of data has been breached and they’re personally… Because I feel like, you know, a lot of people have really deep investments in, like, “we’re a nation of laws,” or the idea that there are not these really brutal and wicked ways that power gets wielded in very asymmetrical ways. It’s like, “You know, I have nothing to hide.” It’s almost like you cannot convince a certain category of consumer.
ROSE:
I wanted to ask you about the “I have nothing to hide” argument because I’m sure you hear it a lot. I also hear it, like, “Well, I’m not breaking any laws. I’m not doing anything wrong, so why should I worry?” How do you try to disabuse people of that argument?
SIMONE:
I feel like I’m not really good at snappy one-liners. People say, “I have nothing to hide,” or I think Edward Snowden says, “It’s like saying you have nothing to say and therefore you don’t have freedom of speech,” or, “Let me see inside your phone, then, or let me see inside your fridge or your bathroom.” But in some ways, just having a conversation with my class, you know, via Zoom on the platform, I showed them a couple things that came out just two days ago where these kinds of proctoring technologies, where they want to make sure the student is watching the camera, that there’s no one else in the room. There was an image of a Black woman who had to shine a light so the facial automation technology would recognize her and not flag her as cheating. There was another woman who was in tears, really traumatized by the fact that as she was, you know, typing up her questions, moving her lips, she was flagged as if she was talking to someone.
And you know, they were telling me they just took a test where everybody’s sequestered in these rooms. They have their roommates, but they could be marked as cheating because they all had the same IP address, and then one had to go outside, and people were walking, and all of these ways that even if you had nothing to hide, why does the university via Proctor or Zoom have to have a scan of your entire room? You do need to have a sense of privacy, you know? I don’t need to be in your home.
ROSE:
It’s sometimes hard to bridge the gap between different threat models and different people who might have different things to worry about, and I’m curious how you talk about that with people and try to explain to people that just because you might not have to worry, you may be putting other people at risk by the choices that you’re making.
SIMONE:
Yes. I think that’s the key question around consent. And that’s how one of my students convinced me to start using, at the time, Signal or one of those types of encrypted apps, because I’m communicating with people that might be undocumented or at risk in some way, where the things that I do that don’t seem risky at the time, like talking unencrypted or sending a message in a particular way, could reveal something and could put them in harm’s way or something like that.
But for me, I’m not really in the convincing business. It’s so urgent, you know, that we have to look at what is really… who needs the most protection from the kinds of harms and violences that are mediated by these technologies? For example, we saw that… It was, like, Geofeedia a couple years ago, this data collection company trawling various tweets and social media platforms for keywords, or images, or anything that could then be used to criminalize resistance, or uprising, or rebellion, or riots, or whatever we want to name it. And I think that getting to know how those things are working hand-in-hand with policing agencies, that’s kind of important, as opposed to “Instagram has been listening to whatever I’m purchasing and I get these ads.”
So, I think that, yeah, I just focus on the work. I’m a Black studies professor; I’m thinking about surveillance through what it means to understand surveillance in Black life and what that has to do for understanding surveillance generally. Those are the communities and the kind of work that I’m doing.
ROSE:
They don’t need to be convinced.
SIMONE:
Yeah. (laughs) A lot of people don’t need to be convinced.
ROSE:
Did you ever, at any point in your life, feel like you should be in the convincing business, or were you always interested in this other way of thinking about it?
SIMONE:
I guess, yeah… My social location growing up as, like, a working-class kid of immigrants in Toronto, I was not going to be in the business. I was not going to be part of, I guess, the ruling class. So, I don’t know if I ever was in that, but I was always teaching. Not always; I think we’re all teaching in different ways, but I was a kindergarten teacher, and second grade, and various things, and now I’m teaching adults, some older and some younger. So, I think that I am in the convincing business, but it has to look like I’m not doing that much convincing. I think that’s how it works, through this Socratic, kind of, asking questions, telling stories, and letting them put that knowledge to work.
ROSE:
So, in terms of this question-asker asking, “I go to my friend’s house. I see this Alexa device. Can I ask my friend to turn it off?” what would you say to that question? Is that a worthwhile conversation for two friends to have when they are in a place together? Or is it not worth having that fight then? It doesn’t have to be a fight, I suppose. That conversation.
SIMONE:
I think it’s worth having it. I will say that I am not that person. Like, I don’t even ask people to take off their shoes when they come to my home, and I’ll just complain about them afterward. (laughs) So, I don’t try to get into it. I’ll throw down a mat, there’s slippers, everything.
But I think you have to have these conversations with friends because… It’s almost like you need to have a sign on the door saying that “These Premises Are Being Recorded” and that being in that space you consent to that. I think you should have a right of refusal, and that can really be a moment of having a conversation about why some people want to refuse this kind of data capture. Whether it’s Google, or Amazon, or whatever technology is the thing. So, I think those kinds of conversations are legitimate and necessary, and it might just be about asking, “Well, do you know what happens to that data? Can I get access to it? Can it be played back to me?” Asking questions in those kinds of naïve and sly ways probably works a lot better.
ROSE:
I’m curious… In your thinking, because so much of these conversations around to-have or to-not-have an Alexa, position the answers or conversations around privacy and surveillance as, like, consumer choices, like, “You choose or you do not choose,” and that is the way in which some people talk about influencing policy. Do you feel as though these sorts of personal consumer choices matter at all, or matter in some way, that people should be doing other things as well? Like, how important are consumer choices in the context of talking about surveillance and trying to change the conversation about surveillance?
SIMONE:
I think those that can choose to refuse, should. I’m thinking of the example of direct-to-consumer DNA testing. And I know you’ve had Alondra Nelson on the show talking about that work and her research in detail. And I think that those kinds of… the buy-in that you get from consumers, there’s always the B side to that, like how those kinds of collections are being used at the border where people don’t necessarily have a right of refusal. Or GEDmatch, which has now been bought by this forensic agency. Or even, like, Parabon, where they’re making these mugshots, like imaginary faces.
ROSE (Mono):
We’ve talked about Parabon on Flash Forward before but never on this show, so a quick explainer. Parabon is a DNA sequencing company that, among other things, uses DNA information to create what they call ‘virtual mugshots’ of what they think a suspect might look like just based on their DNA. If that sounds not great to you, you are correct! You cannot actually tell what someone looks like from their DNA.
SIMONE:
So, I think that once you… If you can refuse, you refuse the normalization of these types of technologies, because we have to think about how electronic monitoring is being used as, basically, “ankle bracelets” for people who are still incarcerated by these technologies. Their homes become sites, or their bodies become, basically, satellites of the prison, the carceral state. So, those are the moments when it’s important that we refuse because there’s always a policing backend in a lot of these technologies. This is the soft sell that, you know, we can have these things in our homes that play our favorite tunes or turn down the heat when we’re out. But the way that it’s all part of a large data capture and it doesn’t… It actually heightens what they call social stratification, or the power imbalances, the racism, the sexism, the homophobia. All of those things we have in society could be heightened by a lot of these technologies.
ROSE (on call):
And if a listener is interested, like, maybe in resisting these sorts of technologies beyond just not buying an Alexa or not buying a Ring device, what are some of the ways that they might try and resist? Maybe having conversations with their friends about these things? What are some of the places that they can resist that go beyond consumer choice?
SIMONE:
I find that the… So, consumer choice is important. Talking to people is important. But it’s collaborative work that needs to be done. Whistleblowing is so important. Communities helping each other is so important. Which gaps that are being filled by these technologies could instead be filled in a more collective, humane type of way? For me, I think the work that’s being done around policy with things like the ACLU and the facial recognition bans in certain cities, or the Algorithmic Justice League thinking about biometric technologies, those types of… Just learning, and reading, and getting involved in understanding what’s at stake now, but also how people have historically instrumentalized certain technologies for the goals of, like, liberation, or anti-racism, or even just invention to think of new ways of living in this world.
ROSE:
If you could have one conversation with every person in the US about surveillance, what would that conversation be?
SIMONE:
You know, I think I would start with how I start my class, by putting the questions around surveillance that many people are thinking about now into… to historicize it through what really is foundational in this country, which is colonialism, genocide, and slavery, and to think about how those practices, and performances, and policies that we have now in our current moment have histories in racial management and other means of upholding a state that is a slaveholding state. So, those are the conversations… And that doesn’t really, perhaps, sound that interesting to people, but I will… Not not interesting, but for people that don’t necessarily… There’s a lot of not convincing… I mean, we’re living in a place where the president has said, like, “critical race theory must be extinguished,” and the 1619 project, and all those types of things.
So, how I begin the class is really talking about technologies that were used in enslavement and plantation logic, whether that be branding, or the auction block, or the slave ship. But also, importantly, looking at moments of resistance and rebellion, and when that whole system became upended by strategic uses of these technologies. You have, like, sociologist Christian Parenti calling the people that forged and made counterfeit documents to run away, to escape – like certificates of freedom – the first hackers. And that’s one way of understanding the uses of technology.
When, for example, Harriet Jacobs escaped, first into her grandmother’s attic for, like, seven years. She would artfully use the postal service and write letters from where she was, like, in her cell in North Carolina, and have a trusted friend mail them from places like Boston, New York, or Canada, and that would send the slave catchers looking for her. And so, those are moments of really using that. And you have, for example, the Student Nonviolent Coordinating Committee, which would make use of CB radios and wide-area telephone service lines, like a 1-800 line, so they could circumvent the local police, the KKK, or whoever would be listening in on their calls to record and to monitor activities that were happening during Freedom Summer. So, there’s always been these kinds of very artful uses and repurposings of the very technologies of repression.
And I think it’s so important to understand that surveillance is not, like, a post-9/11 formation, or a post-Edward Snowden revelation, but that there have been whistleblowers, people blowing the whistle on the state, on surveillance, since the inception of this country and what it was before. So that, to me, is the way that I would bring those conversations up.
And people always want a hope scenario, I’ve found. And I’m not really a hope scenario kind of person. (laughs) You know, Mariame Kaba says hope is a practice, like you have to keep… I guess I’m trying to have that, but you know… I think I told you earlier that I’m not, really, like, a feelings type of person. (laughs) But I think that it’s important to show people that there are moments of allyship, of mutual aid, and of communities coming together in various formations to challenge and to create new ways of being in the world, and seeing, whether through invention or innovations, that there is something outside of this surveillance state for us and that surveillance is not just about Instagram platforms. There are entire carceral logics with it too.
ROSE:
Yeah, I always have to remind myself of the active practice of hope and that it is not just, like, a thing you take a bath in or whatever. We started with, “Can you ask your friend to turn off their Alexa,” and “is that the right question to ask” is an open question in general, but is there anything else around, sort of, day-to-day choices that people might make around surveillance that you want people to know, or that you want to mention or talk about?
SIMONE:
I do, but I’m still… I don’t think I can convince people at all. Learn about how these products are sourced and what kind of energy and labor go into these data processing centers, or whatever the structures of surveillance are. Like Facebook having those undersea cables that are destroying that environment. I think users and consumers who see these as benign or helpful technologies have to grapple with their own complicity in all of the really negative outcomes of these technologies. Do we really need this next new thing? This Amazon drone that flies in our home? Or the Roomba that maps our home? Whatever it is, these gadgets are not really going to get us where we need to be when it comes to the pandemic life that we are in now, which could be in perpetuity. Oh, I’m supposed to have hope… (laughs) Sorry.
ROSE:
You don’t have to, necessarily. You don’t have to.
SIMONE:
No, but it’s a practice. I said ‘in perpetuity’, and it’s a practice. So yes, we’re going to get through it, but not with Amazon drones.
ROSE:
I appreciate your commitment to sticking it out. That’s great. Thank you so much for coming on the show. This is really interesting, and I will link everybody to your book and all your work in the show notes.
SIMONE:
Thank you so much for having me. This was a wonderful discussion.
ROSE (Mono):
Do you have a question about the future? Some conundrum you’re facing now, or one that you think we might face in the future? Send it in! You can send a voice memo to Advice@FFwdPresents.com, or call (347) 927-1425 and leave a voice message.
And now, a quick break. When we come back, I want to talk about canyons.
ADVERTISEMENT: SKILLSHARE
This episode is supported in part by Skillshare.
So, the next few months are probably going to be… intense? Is that an understatement? Is that a weird way to say it? I don’t even really know exactly what’s going to happen, obviously, coming up, but it feels like it’s going to be intense. But what I do know is that for me, having some kind of creative outlet where I can step away from everything and go make… for me it’s my weird ceramic sculptures. That really does help stave off some, most, some percent of the abject panic that 2020 seems to heap on every single day. I do ceramics, but maybe you do woodworking, or graphic design, or photography, or poetry.
Maybe you’re looking for something to do. Maybe you’re looking for a new skill or hobby, something to take your mind off things. Maybe for work you really want to learn how to animate, or to create a really amazing pitch deck, or how to give incredible Zoom presentations. And if you are looking for a place to get tips on any of those things that I just mentioned, or other hobbies, professional skills, and more, check out Skillshare.
Skillshare is an online learning community for creatives where millions come together to take the next step in their creative journey. Whether you are a working creative, a beginner, a pro, a dabbler, or a master, Skillshare has something for you. Again, maybe you want to learn about illustration, or photography, or web design. Maybe you want to learn how to start a podcast. Maybe you want to learn how to keep plants alive… which is something that I need help with. There’s probably a class for that. In fact, there is an amazing-looking plant class called Plants at Home: Uplift Your Spirit & Your Space by Christopher Griffin, who goes by @PlantKween, which is described as “Discover the power of plants, no green thumb required!” which is good for me, because I absolutely do not have a green thumb, so I will probably be watching this.
And if you’re not sure where to start, I would suggest that you check out Skillshare’s series called Creativity with a Purpose, which is full of videos by Black creatives about everything from digital poster design to creative writing. There’s some amazing-looking classes in there.
Take some time for yourself to learn something new or to hone your skills on something you’re already good at. Take some time for you. You deserve it.
Explore your creativity at Skillshare.com/Future and get TWO FREE months of premium membership. That’s Skillshare.com/Future.
ADVERTISEMENT END
[contemplative music begins]
ROSE:
You’ve probably heard of the ‘uncanny valley’, this idea that there’s an upsetting chasm between totally unbelievable humanoid robots and totally convincing ones. That there’s this moment as you get closer to convincing, but not quite there yet, in which things get really creepy.
What you might not know is that the evidence for the uncanny valley is kind of spotty. The phrase is widely accepted to have originated in 1970 when a roboticist named Masahiro Mori published a paper in an obscure journal called Energy. Mori’s original paper was in Japanese. Its title, Bukimi No Tani, does not mean uncanny valley. A more accurate translation is ‘valley of eeriness’. The main graph in Mori’s paper has been mistranslated too, leaving many people unsure what he actually meant.
Mori uses the Japanese word shinwakan on the Y axis, a word that has no direct translation into English. The most common interpretation is ‘likeability’. But not all translators agree about that. Other suggestions include familiarity, affinity, and comfort level. And those are all kind of different things, especially when you start to think about scientific studies that might try and prove the uncanny valley’s existence. How do you measure something if you can’t even name it?
In fact, Mori’s paper didn’t actually include any measurements. It was more of an essay than a study. And there is some contention today about whether or not this phenomenon really exists the way he described it, this ‘valley’, the dip just before you get to perfection. And to just be extra pedantic, it’s also not really a valley, if you think about it. It’s more of a canyon; a steep decline in comfort, a ravine, a gorge. Here we stand on one side with our not-so-convincing robots, and to get to the other, where the totally lifelike ones stand, we must climb down into the dark depths of discomfort, unease, before we can climb back up, sweating and scrambling into the realm of true success.
[music ends]
I’ve been thinking a lot about canyons in technology lately. Metaphorically, that is. These places where there’s a dark, often dangerous, chasm between where we are now and where we could be. It seems like with so many technologies, there is this ‘valley of death’, this canyon between reality and possibility in which scary and terrible things happen. We stand on one side and look across to the other, where some lovely outcomes live. On the other side of the canyon, there’s a world where we can use algorithms to eliminate bias in hiring or sentencing. A world where we could use AI to model our bodies and test treatments effectively. A world where we could use genetic editing to help people.
We have to look through binoculars to see these outcomes, this lovely meadow on the other side of the ravine. And sometimes it’s hazy, like a mirage. Some people see different things than others. But they’re lovely outcomes, happy ones, useful ones. Standing on our side, here, these technologies can’t do any of those things just yet. We have to get to the other side somehow.
[ominous synth music begins]
To get there, what normally happens is that companies, and inventors, and engineers begin to climb down, to descend into the depths of the ditch. Where bias lives, where harm happens, where people are unfairly targeted, and marginalized, and excluded from healthcare or jobs. Where eugenics happens. Where police brutality happens. These companies wade through that muck and mess, the dark depths of the ravine, keeping that other side in their mind’s eye. “It will be worth it,” they say. “We’ll fix all of this. Iron it all out just on the other side.”
Evaluating whether these canyons are worth traversing is hard. How do you weigh the pros and cons? How much damage is worth doing to get to the promised land on the other side? And who gets to decide that? Those deciding can often float above that damage and decide to do so with ease.
You might also be wondering, “Wait a minute. Why can’t we just build a bridge? Surely there is a way to span this canyon, to get to the other side without having to go down.” And it’s true, but there’s a catch. Bridges are expensive. They require planning, and engineering, and safety testing. They require permits and public engagement. And the money for all of that? The money is down inside the canyon. There is little incentive for technologists, capitalists, and their band of travelers who can see the other side so clearly in their imagination to wait, to go through the trouble of bridge building, when they know they can gather money in the ditch and come out the other side richer.
And to extend this metaphor even further, along the way they will also write books about how to scale a canyon, blueprints for others to follow, guides for how to pitch a tent, how to train for the climb, how to come out alive. Those books will make money too. Nobody will ask them about all the bodies along the trail.
It’s possible that in some select cases, the promise of the other side really is worth the descent. But in some cases, the ascent might never be possible, or people may get to the top only to find that it’s not the idyllic future they imagined. They were wrong, or too late, or too early, or their attempt itself changed the outcome. Embers from their campfire at the bottom floated up and scorched the other side.
So how do you decide whether to cross? Whether to descend? How to do so as humanely as possible? Who gets to make these decisions? And the question I’m the most interested in is: How do you build incentives to wait? And to build the bridge? And to cross safely, and together?
[ominous music fades down and Advice For and From the Future theme fades in]
Advice For and From the Future is written, edited, and performed by me, Rose Eveleth. The theme music is by Also, Also, Also, who has a new album out called The Good Grief, which you can get on Bandcamp. Thank you to Ellia for the question and to Dr. Simone Browne for joining me to talk about privacy, and personal choices, and the future. Additional music provided by Blue Dot Sessions.
If you want to ask a question for or from the future, send a voice memo to Advice@FFwdPresents.com. And if you want to get behind-the-scenes stuff about this show or any of the other shows in the Flash Forward Presents network – that includes Flash Forward, Open World, and Advice, as well as some upcoming top-secret experiments – you can do that by becoming a member of the Time Traveler program. Go to FFwdPresents.com for more about that.
Until next time…
[music fades down]
[store bell jingle]