We explore the unsettling landscape of data privacy through three emerging trends: a website publishing detailed voter information, car manufacturers collecting (and selling) intimate details about drivers and passengers, and a new technology scanning human irises for identity verification. As we discover how our personal information is being gathered and sold without meaningful consent, we uncover what's happening to our data and how we might regain control.
Selected Links:
National Privacy Test, https://nationalprivacytest.org/
Cox, Joseph. “Voted in America? This Site Doxed You.” 404 Media. 6 Nov 2024, https://www.404media.co/voted-in-america-this-site-doxed-you/
Caltrider, Jen, Misha Rykov, and Zoë MacDonald. “It’s Official: Cars Are the Worst Product Category We Have Ever Reviewed for Privacy.” The Mozilla Foundation. 6 Sep 2023, https://foundation.mozilla.org/en/privacynotincluded/articles/its-official-cars-are-the-worst-product-category-we-have-ever-reviewed-for-privacy/
Jodka, Sara. “Dashboard Confessions: Unveiling The Privacy Issues In Connected Cars.” Reuters. 25 Apr 2024, https://www.reuters.com/legal/legalindustry/dashboard-confessions-unveiling-privacy-issues-connected-cars-2024-04-25/
Pascual, Manuel. “Why The Iris Offers The Most Precious Biometric Data.” El País. 9 Mar 2024, https://english.elpais.com/technology/2024-03-09/why-the-iris-offers-the-most-precious-biometric-data.html
Vance, Ashlee. “Inside Worldcoin’s Orb Factory, Audacious and Absurd Defender of Humanity.” Bloomberg. 12 Aug 2024, https://www.bloomberg.com/news/features/2024-08-12/how-worldcoin-is-building-digital-ids-to-combat-the-ai-apocalypse
Episode Transcript:
Sue: Welcome to Signal Shift, by Horizon Shift Lab. We're your hosts, Lana Price, Raakhee Natha, and Sue Chi. Each episode, we explore the latest signals in technology, culture, and society, uncovering insights that will impact our daily lives in the future. Join us as we shift perspectives, explore possibilities, and delve into real changes in our world. Curious to learn more? Go to horizonshiftlab.com.
[0:40]
Sue: Hello, and welcome to another episode of Signal Shift. This is Sue, and I've got Lana and Raakhee here with me. And well, before we get into this week's episode, we just wanted to acknowledge we are post-Election Day now. And we just wanted to acknowledge the collective angst that many people in the U.S. and beyond have felt regardless of your party affiliation. So we just wanted to extend some grace to you all, and we hope you're all finding ways to be whole this week and really being in community with those you love.
And so while we won't get into politics today, we'll leave that for actual pundits.
The theme I chose this week has a lot to do with what I experienced during the election: the sheer number of emails, texts, and phone calls that my neighbors and I have received over the past few months related to the election, because we live in a swing state.
And it has just been an onslaught. I've missed, like, actual texts from my friends because there's just been this bombardment of other things. Strangers are writing me letters, so just thought, how are they getting all my information? How do I get off these lists?
And so today's theme is on data privacy and security. And it's not just about that. Every day it feels like in the news we're seeing there's another hack, there's some kind of compromise of our information. Is there any way to stay on top of this? And what is the future going to look like in terms of our data, especially with AI becoming more prevalent in our lives?
Lana and Raakhee, I don't know if you know this, but October was Cybersecurity Awareness Month. I had no idea. And I found a link on Yale University's website to something called the National Privacy Test. You can go and take it just to see how you're faring.
Wow, I did not do as well as I thought I would. So it was a big wake-up call. But with that, I wanted to see what signals you brought to discuss today.
[2:43]
Lana: Well, I think mine's directly related to what you're talking about. So I guess I'll start with that. And I also really need to take this assessment, I think.
But so my signal is from 404 Media, which is one of my favorite independent media sites. They're focused on technology and the internet. And so they recently published an article about VoteRef, which is a website that has publicized voter data from every state that they can. So there are some states that have laws against publishing the voter rolls. So California -- lucky Raakhee in California is protected -- Arizona, Minnesota, others. But typically, if you want to get a voter roll, you have to go to the state body individually to request it or purchase the data. But this website has done that.
And so they've gone to each of the state bodies, and then they have published all the information online and made it very easily searchable. So in many cases, you can search and find your full name, including your middle name, your complete address, your age, who else is a voter at your same address, what party you're with, and which elections you've voted in. For one, the organization that runs this site is very partisan. ProPublica has reported that it's backed by the billionaire CEO of Uline. He's a mega donor for the Republican Party and has also funded election-based conspiracy theories. So there is that.
So whatever the intention of publishing this data, it isn't hard to imagine it being used for nefarious purposes, like discrimination or intimidation.
There are people who need to have their privacy -- federal officials, judges, people who have been victims of domestic violence -- so there are a lot of protective cases you can think about, and beyond those, your average person might not want to have all of this out here.
And it isn't even hard to imagine that if they've done this layer, there are other layers of publicly available data out there. For example, if you own property, that's searchable in a database. If you're a business owner, that is in a database. And so it isn't hard to see how these databases could just come together.
I think in terms of a signal of the future -- I mean, on one hand, we need to have stronger privacy laws, right? That's like one whole thing.
But then I started thinking about a scenario -- I wanted to get your guys' opinion on this -- is I can imagine that maybe you were born with your government name or your legal name, but that becomes almost as private or as secret as your Social Security Number, right? So you're not out there advertising what your legal name is. And so maybe you're using, in public, your nickname or a different name, or even like a trend towards people being named very generic names, so that even if someone tried to search you, like it will come up with like a hundred versions.
And so I just started thinking about that as a potential implication. You know, even for myself, I feel very like coming of age with the Internet, you know, a lot of people have like your full name as your email address. Like it's not very mysterious as to like tying all of your pieces together in terms of your name, your physical address, your digital address.
Thinking about how we might, you know, kind of go on... I don't know if it's offense or if it's defense... really unlinking all of those pieces and making it harder for people to link the different sources back to the same person.
So that's what I started to think about as I was thinking about implications of our stuff, just kind of just being out there for folks to find and however they want to use it.
Sue: Thanks, Lana. That is really scary where you see how accessible so much of this data is. Yeah, Raakhee, what are your thoughts and what's your signal today?
Raakhee: Yeah, Lana, such a great point. There's something there around names, but also just avatars and identity and having more than one identity in this world for various reasons -- for privacy, but also just how we show ourselves in certain places, because maybe it's safe or it isn't. With all the changes that we see around us, with the protections we have or don't have, I think that's almost another topic we need to explore: identity and names and profiles in the world.
[8:16]
Raakhee: But yeah, I guess my signal is adding on to all data privacy concerns, but a product that is actually a very real problem and a very, very real threat to data privacy is our cars. And it's going to get worse as more cars become connected cars, smart cars and self-driving cars.
I think McKinsey says that by 2030, 95% of all new vehicles will pretty much be connected cars, right? So the future is right there. And car manufacturers track literally every move. They have sensors, they have cameras, they have trackers, they have microphones: everything you're saying, the way you're moving your body, aside from the obvious things we know about -- of course, you're connecting your phone, you may connect your home surveillance system to the car. It knows a lot about how you're moving, what you're saying, where you're going, how you get there, how you behave while getting there. And I think those are the layers that we may often not think about, right, or just forget about.
So yeah, they're tracking everything. So nothing that you say or you do when you're in your car is sacred or secret. And that's, you know, that's the truth.
But the thing is that not only do they collect this data, but they also sell this data. So there isn't much protecting that, and there's a lot of that happening, right? And we talk a lot about things like Facebook and them taking our data and then selling it. But we have a lot more control over those touch points of Facebook. Like we put up those profiles, we share the photos we want to. So there's a lot more control in something like Facebook, and yet we all have a real issue with them selling our data.
So imagine the reality of knowing that your car manufacturer is doing the same thing with much, much more of your data. It's just buried behind tons and tons of paperwork that none of us read -- we just think, I want that new car, right? I'm buying it. But we have very little control, and they have pretty much all the control when it comes to our data.
So the Mozilla Foundation, some of their researchers and journalists did a study and they took 25 of the car manufacturers out there, and they looked at their data privacy and honestly the results are as bad as you can imagine.
And most of the car manufacturers obviously didn't want to engage with them or even comment back on the articles written, because there's just too much liability there. There's not a big enough momentum among consumers right now to fight for that, or rather there's no noise around it. So, you know, it's just kind of like, okay, it's happening and no one's making a big enough noise about it.
But every car brand they looked at collects more data than necessary and they're using it for reasons other than to just operate your vehicle and manage a relationship with you. And that's every car brand. This is all 25, so 100%.
And you compare this to places like mental health apps, which also, sadly, according to this research, do pretty poorly -- but that was 63%. So at a really bad level, you have 63%, and then you have car manufacturers, which is 100%. That's the kind of accountability around data: what they're collecting, where they can sell it.
What's also interesting is we don't realize this, but you know, it's even information about people who sit in your car and interact with your car. And we sign that away. We give consent to say, we will inform people about our privacy practices when they get in our car. And I don't think any of us know we're doing that, but that's the legal side of it. That's the truth. That's the rule.
So, you know, we see this in all kinds of policies online. You have a similar thing for a website, et cetera. It's just that when it's a car, it becomes insidious, I guess. It's just something a lot deeper.
And then there's brands like Nissan and Kia, who acknowledge in their privacy policies that they collect information about your sex life and your sexuality. And I guess that's maybe because if you're doing something in the car, why can we not collect that?
A lot of these brands acknowledge collecting genetic information, right? We're all like, I'm not going to sign on to Ancestry and give my, you know -- and yet there are other ways that information may be shared, right? In your vehicle. You're sitting in there. You can think of all kinds of situations, right?
And again, the intent probably, hopefully, is mostly good in the sense that it's to help law enforcement if there's a crime. We can have things like if you're in a vehicle accident, you know what I'm saying? Somebody will be notified. You won't be able to do it, but your car will do it for you. So there's all these good applications, but it opens up a whole other realm of just issues around data privacy and how much these manufacturers know about us, can collect about us, can share about us. And I think exactly like you said, Lana, is like, we need better data privacy laws here. When you look at this sort of stuff, you realize how lax our laws are.
It's a little bit wild what they can collect and how they can use it. Of course, right now, one of the top buyers is insurance companies. And again, sure, I get it -- mileage, accidents, that makes sense -- but then you start going a little bit further, because the other thing they do is take this data and create what are called “inferences.” They make, you know, sort of categories, like customer personas or groups, and say, this is that group, this is that group. There's a lot of bias, a lot of assumption, and maybe a lot of wrong that can happen in there and in how we classify groups in society. But yeah, again, who's protecting that, who's controlling that, who's managing that?
So I think with cars, we're going to have all the convenience and the flying and the great stuff. One of the major downsides clearly is data privacy and the role that these car manufacturers are going to play in how they work in our society and work with our governments with that data and control our data and hence us.
Sue: Aah! This is a very grim picture we're painting today. Oh, no. But yeah, Raakhee, that is so important and not something that I was aware of, because like you said, who's going to sit there at the dealership or wherever and read all the pages around data that you've got. So thank you for letting us know about that. So we've had searchable data, sellable data, and really no limit. And the signal that I have, unfortunately, also takes us one step further, too.
[15:07]
Sue: So we all know Sam Altman's name at this point with OpenAI. Well, what I didn't know is he's also the co-founder of another company that, until very recently, was called Worldcoin. It was recently renamed to World. But it was basically, I think, more or less a cryptocurrency institution. It was selling and offering crypto, but in order to access the crypto, they had this metal orb -- it's called an orb -- that would scan your iris for verification, and then you'd get some cryptocurrency in exchange.
So lo and behold, a couple of weeks ago, Sam Altman made an announcement that they're changing the name to World. They're kind of dropping the priority on cryptocurrency and instead they are broadening the purpose to really have these orbs scan everyone's iris as a way to verify your humanness in a world where there will be more bots. And it'll be really hard to distinguish who's real and who's not and who you really, really are.
They've been able to get quite a few scans already, I think, partially because it was all around crypto. They have 7 million scans, but so many governments are very up in arms about this.
I saw an article from El País in Spain saying they banned World from offering these orbs to scan irises, because they were seeing really long lines to go do this. Many other countries were also talking about instituting some kind of ban. But on the other hand, they now have Taiwan and Malaysia as partners, because those countries are starting to realize, oh, digital identity is still going to be so important in the future.
So I thought that was really interesting, but apparently, you know, your iris scan even goes a step further than the facial scanning technology we have right now. It is apparently like one of the most accurate identity markers for you because they measure some biomarkers that are very consistent over time, regardless of how you age.
And so, you know, at least for me, it is very, very scary. It's very frightening to think that this is going to be inevitable.
You know, I know in our episodes, we've talked a lot about having human-only experiences. This was not what I was expecting. You know, like this is his version of a human-only experience. Like in this world, how are you going to verify that you're a human? And so this is what the orbs are intended to do.
There aren't that many located in the US, but interestingly, Raakhee, there is one located in Venice in Los Angeles, if you just wanted to go see what it looks like.
But yeah, I started thinking, you know, this is getting into more of like a “Minority Report” type of plot where it's all about the iris as your unique marker. And what kind of future does that bode for us? So that's the signal that I'm bringing this week.
And I see a lot of shaking heads. Yeah. What's going on? How are you feeling?
Lana: Oh, I was just thinking we need to have, like, a running list of films, because it was on the tip of my tongue, and I was like, it's a Tom Cruise film. Yes, “Minority Report.” We need to have a list of films that's, like, prescient, or, you know... ugh.
I'm so glad that you brought awareness to this with the cars, Raakhee, because I just, you know, you sort of think about the car as kind of a safe space. If you're by yourself or like have conversations or listen to whatever you're listening to or if you're with someone and it's like a traveling private office, but it's not.
So yeah, I did hear about the World thing, but my understanding was that even how they collected the initial seven million was a little suspect, right? Is it because it was almost the lure of the crypto, almost like buying the scans? Is that right, Sue? That's what I recall.
Sue: Yeah. The article from El País was saying that essentially one way to get the scan was the offer of free cryptocurrency -- it wasn't guaranteed, but you might get some crypto in the end. And I think that was what was causing some of these long lines. So that's at least what the Spanish official was saying.
[19:48]
Raakhee: That goes exactly to -- I mean, the laws should prevent that, right? We've seen that in other cases even recently, where things are being offered to incentivize people, and where are the laws and regulations stopping that? That should not be okay; separate those things.
So yeah, a lot of stuff coming up about laws and regulation, and really a concern with everything we've learned today. We're on the back end from the legal side, in terms of where progress has happened technologically and where we are. We're really behind on legality and protection -- our protection. So how do we catch up and keep pace with the speed of all the stuff that's happening out there? And Sue, I guess for me it's the part where you said, well, to be able to distinguish between humans and robots -- yeah, it's all so real now. It is all so real. That's a reality. I don't know. I think for me the legal side of it is coming up as a real point of reflux, actually, in my body right now, like I'm having acid reflux.
Sue: Oh, no, Raakhee. OK, we have to figure out how we move forward from this, because, yeah, I think you're right. It's moving so quickly that laws and regulation are a bit behind, as we would expect. So it's kind of on us to figure out what to do in the meantime. And I guess for me, one thing is we have some watchdogs, which are good, and that is something we're going to constantly need: third-party observers, people who can continuously report the downside of all this. So I'm just really appreciating the work of so many advocacy organizations, because it's really, really hard to do all that.
And I think the other thing for me is, you know, we grew up in an analog world. So for me, I'm thinking, that's not so bad. I have nostalgia now for an analog world. And you're actually seeing more and more analog-related products.
So maybe in the future, if the car companies continue to sell and they're completely connected, maybe they'll start to offer a completely different alternative or there will be some companies that are figuring out how to build just old school analog cars and vehicles. Maybe people will take to the streets by bicycle now. You know, like there might be other very, very simple alternatives in the meantime. But I think to me, it's like, oh, I didn't even know any of this. And I guess that's why we have a National Cybersecurity Awareness month to unearth these things.
Raakhee: Yeah. I think you're right, Sue. I want to go do the assessment that you mentioned, right? And yeah, just really almost create a data kind of tracker, maybe even like a privacy and a data tracker in my life and be like, where am I really giving out what? And how do I scale that back? What can I control? And how do I do it?
Lana: Yeah, I agree. I mean, I feel like my assumption is like the big leak, and I'm like looking at it right now, is the phone, right? Like we sort of joke about when you see an ad for something that you were just talking about the other day. I think initially it was like, oh, this is weird, or this is scary. And then you just kind of are like, OK, it just is. I don't know. It's the convenience of these things. It's probably set in a way that like the default is for everything to be on. And so it does feel like it's about educating to see like, OK, to take this seriously, what are the steps that we have to take, to little by little, all through our different like devices and our smart homes and our computers and our phones to figure this out. And it feels overwhelming to be honest, but I don't know if there's like a better way to do it.
Sue: That's a really good call to action, both of you. Yeah, I'm definitely taking away a lot of to-dos, a lot of new things that I've learned.
And for all those listening out there, you know, I wonder how you're feeling as we've been talking about this. Thank you all for listening so much. Yeah, stay tuned for our next episode and we'll see you soon. Bye for now.
[24:22]