Host Cat Coode and special guests take a deep dive into the noteworthy, concerning, and downright fascinating stories featured in recent episodes of the Data Drop News podcast.
Pro tip: you can listen to The Data Drop Panel on your phone by subscribing to our podcast.
About The Data Drop
The Data Drop podcast is a production of the Data Collaboration Alliance, a nonprofit dedicated to advancing meaningful data ownership and global Collaborative Intelligence.
Join The Data Collaboration Community
The Data Collaboration Community is a data-centric community where professionals, nonprofits, and researchers join forces to collaborate on datasets, dashboards, and open tools in support of important causes. Learn more.
Full Transcript
Cat Coode: Hi folks. I'm Cat Coode, a data privacy consultant from Binary Tattoo and a member of the Data Collaboration Community. Welcome to the Data Drop Panel, where each month we gather some leading data and privacy professionals to hear about the news stories that stood out for them over the past month or so in the fast-paced world of data privacy.
It's always interesting to hear what's raising eyebrows and curling the fists of practitioners. I should note that all of the stories we'll feature today have been included in our podcast, which delivers a five-minute privacy news roundup every other week. This month on the Data Drop Panel, we have three amazing guests.
We have Jeff Jockisch, who's the CEO of PrivacyPlan; David Kruger, who's the VP of Strategy at Absio; and Heidi Saas, who's a data privacy and technology attorney. All right, Heidi's gonna kick us off today. Our first topic is the long-awaited US data privacy bill, which appears to be on track again.
Heidi, why don't you tell us about that?
Heidi Saas: Yes, it's no longer a mythical creature. We have it in draft form. But as I've been telling people, believe you me, this is not the first draft. Like, firstborn children were sold, so many promises were made, deals were cut all over the place. This is the deal that is supposed to represent all of the stakeholders.
It's not just about privacy. This one also includes children, includes biometric data. It includes AI and audits. It includes so many different things: civil rights, not just consumer rights, and cybersecurity. It's got a little bit of everything in here. I don't wanna call it an omnibus, because that's too all-encompassing.
But what it does have in it is some good stuff, along with a couple of issues. And not everything that we want to be covered is going to be covered. But it's at least something right now, which is when we really, really need it in our country. Because I don't know if you're aware of what's going on in our country, but women are about to lose our right to bodily sovereignty.
And so our data will be able to criminalize us, basically. That's a story that we're gonna cover next. But that is part of the driving force behind why people suddenly pulled a rabbit out of their hat and said, here is this giant privacy bill. So, the major headwind it's facing in DC right now is that Blumenthal and Cantwell don't support this.
So it's a draft that's been introduced in the House. It's not been introduced in the Senate yet, and that's why: because they're not behind it. There are other competing bills that they feel have stronger consumer protections. Part of their issue is the private right of action. There is a private right of action, but it's very limited, and it doesn't even kick in for four years.
So four years is there so the other stakeholders can claw back any protections over time through whatever administration we may have next. That's kind of the reasoning why it was drafted this way, the purpose and story behind the actual draft itself. Some of the great things that we do see in it are algorithmic bias audits and things like that.
Because we are recognizing, across the board, that every industry is using these tools, and nobody really knows how they work, but we do know they're causing disparate impacts. That's wildly illegal. So at this point we need to do something, at least to raise the base level of knowledge and provide best practices for businesses,
so that they don't run afoul of these issues that they're not even aware of, for the most part. So I think the new data privacy bill includes so many moving parts that I can't really cover it in about five minutes. But those are some of the things that I see that are positive, and some of the things that I see that are gonna hold it back.
And we are limited by time. I think this is the most important part right now: we are limited by time. The August recess is coming, and if we get to the August recess and nothing is done with this bill, it is likely that it will just lie fallow, because then we're caught up in the midterm elections, and then who knows what we have for how the next Congress lines up.
This is our best opportunity now, and you can see where people are starting to add pressure. Elizabeth Warren just threw a new bill in last week, just to throw fire on the flame, to try to get Cantwell pushed back into line so that she'll accept what everybody else is on board with. The hearing last week was interesting to me because I don't recall seeing a whole hearing where all the legislators and all of the lobbyists who were there presenting their issues
were in agreement that we've gotta do something now. That was the vibe in the room: something, anything, now, and everything else we can work out later. We all agree something's gotta happen. I don't recall seeing that before. So I think that's the most unusual thing out of the whole circus that is these last two weeks of crazy privacy news.
And Canada doesn't wanna feel left out, so they threw something in there too, and you gotta get to that too.
Cat Coode: Right? They did. Yeah, they did. Jeff, David, do you have anything to add to that one? I think Heidi's covered it.
David Kruger: Yeah. Well, I...
Jeff Jockisch: I would say it's a pretty good bill. We have to pass it. There's some good coverage of data brokers in there. I like what it's doing with preemption. I mean, I'd rather it didn't preempt stuff, but we have to have some uniformity, and the fact that it's doing some limited preemption and still giving states the ability to do things,
to cover facial recognition laws and some other things, leaving those intact, I think that's good. So I'm pretty happy with this law. And I really like the fact that it's covering sensitive data really stringently, requiring opt-in for things like location tracking.
That's important.
Cat Coode: Yes, that's right. And I agree, location tracking should be sensitive data; I'm always surprised when it isn't. Where's the foam finger emoji on this? So what's the timeline in the US for this? If this gets passed, what are we looking at before this is actually a law?
Jeff Jockisch: You know, I'm not sure what the dates are on it, but I think it goes into effect pretty quickly. It's only the private right of action that's delayed, for like four years.
Heidi Saas: Yeah, there's a rolling basis for some of the other provisions that need to happen. But for the most part, day one, there's a flat-out ban on using children's data, like flat out. And children,
they've decided to identify as minors, not 13 to 16, none of that, just as minors. So that would be 18 and under.
Cat Coode: That's great. All right, Jeff, we're going on to our next topic, which is yours, and that's FTC Chair Khan's planned key work, as we were just talking about kids, on kids' data privacy online.
Jeff Jockisch: Yeah. So, Lina Khan at the FTC is saying that they're really gonna crack down on children's data. You'll note that children's data is really sort of a subset, or maybe even a superset, of sensitive data, which is a real bugaboo of mine. And really what the FTC said is that they're gonna crack down on education technology companies in particular, right?
Companies who illegally surveil children when they go online to learn; that's one of their real big focuses. And this is really a little bit of a response. You remember recently that senators were grilling executives from YouTube and TikTok and Snapchat about what they're doing to ensure user safety, in the wake of suicides and other harms to teens attributed by their parents to the usage of those kinds of platforms, after Facebook sort of got called on the carpet by Frances Haugen.
So, you know, this is very much in the spotlight right now, but it's also because children's data is sensitive data, and it needs to be protected even more than our personal data, which needs to be more protected, right? So we have to have our privacy protected, our personal data protected, but children's data and sensitive data have to be even more protected.
It requires not just an ability for us to opt out of that collection; companies need to be required to get an opt-in to protect that sensitive data, that children's data. And while we've had laws like COPPA in effect for a long time, there are loopholes in that legislation that essentially let companies collect that data
if they don't know that you're a child. And that's a really huge loophole that trucks drive through every day.
Cat Coode: Yeah. And for anyone listening who isn't familiar with COPPA, that's the Children's Online Privacy Protection Act. It does say that you can't pull data from children under 13, which is why all of the social media platforms of course have a minimum age of 13.
And I used to do a lot of kids' internet safety workshops, and I'd get in front of anyone under grade six and say, who here has an Instagram or a Snapchat account? Every hand would go up, which means every one of those kids is lying about their age, which is perfectly on point with what you said, Jeff: kids are lying in order to get in.
So the data's being collected anyway. Right. David, why don't you tell us about the Murena One and protecting users' privacy from Google and surveillance?
David Kruger: Well, there's a sort of mantra of mine: partial privacy is no privacy. Right? And this really has to do with the structure of data brokerages and how things flow.
If a broker gets data from one source or a hundred sources, they don't care. They care that they get the data, and then that data is sold and reused and reused and reused without end. And then there's another thing that's a particular pet peeve of mine, where things are marketed as being privacy enhancing without actually quantifying or qualifying what that means.
Right. So there was a, and I'm not dissing this product, and I'm glad that they're trying to do things like this, but there's a couple of lines that jumped out at me: we have removed many pieces of code that send your personal data to remote servers without your consent.
Sounds great. What was the key word in that sentence? Right. Let me say it again: we have removed many pieces of code that send your personal data to remote servers without your consent. Many. That's correct: there are many, many more. And again, the way the structure of the market works, if you're going to sell and reuse that information perpetually, there's no practical distinction between many and all.
I mean, between many and some and all. It is essentially: if you can't do all, you're not really doing anything except, you know, what I call rubbing a little privacy salve on you, and then making money from it, right? It's not actually a solution to the problem.
I'm not dissing these people; I'm glad somebody's doing something, but it is an illusion to think that they're actually doing anything that has real force and effect in the world. They also go on to say in this article that because /e/OS, their proprietary operating system, runs on Android, you can still run most of your favorite Android apps.
The company continues. So, Jeff, I think you snagged the Tim Hortons story before I did. These two things, and actually even the DuckDuckGo Microsoft story, are all related to each other. Because again, if you don't stop the flow, period, you don't stop the flow at all.
So if you can run your favorite Android apps, ask yourself a question: why are there millions of apps, somewhere between four and six million now, I think, in the Android store? Why are there millions of apps? Do we have millions of actual applications that we need all those apps for?
And the answer is no. For most apps in the app store, their financial model is selling the data that they collect. So they have every impetus in the world to lie. And if you've got an OS that cuts off some functions, that doesn't allow stuff to be sent out, and then you allow people to load native Android apps
whose sole function, for many of these apps, is to collect the data and sell it up to the data brokerages, and/or to deliver advertising. And I love the ones that make you pay to not get ads but still collect your information and sell it. God, you gotta love the chutzpah for doing that. So again, these things are partial, and partial
privacy is no privacy. But that does not prevent people from making beaucoup bucks off of the earnest desire that we have for privacy. And frankly, that bothers me a lot.
Jeff Jockisch: Well, there's very little to add to what David said. I mean, he's just spot on. It's great to want to improve privacy, but I'm not sure how much this would really do to improve that privacy.
David Kruger: Yeah, it's a step in the right direction. The legislation is a step in the right direction. But the thing is, these are all battles that have to be fought and won, and we're still nowhere near winning the war.
Heidi Saas: I'll just add that I hate data brokers. I hate data brokers. Like, whatever you wanna do to them, I mean, that's a good idea.
Jeff Jockisch: You know, I was actually just thinking one thing maybe we could add to this: if you actually, you know, submitted DSAR requests and went to a couple of data brokers and pulled your data off of those data brokers, what is the real point of doing that?
I mean, if you're individually gonna try to do that and get your data out of like two or three or five or ten data brokers, does that accomplish anything?
David Kruger: That data is already out of the tube. Yeah. Remember this privacy salve: just rub a little bit on and you'll be fine. And we have cybersecurity salve for this arm too.
Those two things are related, right?
Cat Coode: All right, Heidi, you have our next topic, which you had touched on a little bit earlier: health data privacy concerns grow as abortion laws change nationwide. What does this have to do with privacy and tracking?
Heidi Saas: So this is why everybody freaked out when Ruth died. This is why: because of Roe being overturned, there are 13 different states that enacted legislation that says if Roe is ever overturned, it triggers new laws in those states immediately. And some of them criminalize the act, including accomplices. So if you're in Texas and your friend wants to use your phone to look up, how do I get an abortion and travel to Maryland?
That could be data collected and sold by a data broker, or gathered by the government, because they have all of these powers over big tech to walk in and say, give me data on people in this area, or people searching these terms, and then they can pursue these people. So your period can lead to a sentence.
And that is, that's why we freaked out. That's a fundamental freedom that we have. It's a safety issue now. These lists: SafeGraph was busted by Vice magazine for selling location data on people who were using Planned Parenthood services, and they offer a lot of services, but they were selling these lists.
I wanna know who's buying these lists and what they're doing with them. But also, you know, that turns it from privacy into personal safety. So cyber can take a back seat for a minute, because ransomware is like the last of my concerns right now. I am more concerned with religious white people showing up
when I'm trying to go to work, if I'm a healthcare provider, and shooting at me or something. These are real concerns, because these are things that happen. So the fact is that Roe being overturned is gonna upset the penumbras and emanations that privacy in our country is based on. So I don't even know how the decision is gonna be written.
But I read one of the leaked drafts from Alito, and I gotta tell you, not since Plessy v. Ferguson have I read a Supreme Court decision that was so vile, so hateful, so paternalistic, and so willfully ignorant of the issues. And I am afraid of a decision that comes out looking anything like that. So this is a big reason for everybody to get upset.
Privacy is now everybody's problem. If you have a uterus, or you love someone with a uterus, now is the time to get involved in privacy, because your information can be used against you in ways that you never thought possible and never agreed to, and you can't do anything about it. So if you are using one of these apps,
check out the FTC site, the Federal Trade Commission, and their settlement with Flo. You can get some good information on the flow of your data through Flo, right out to everybody they wanted to market all of your information to. That was a big settlement that nobody really wanted to pay a lot of attention to, because it involves the female cycle.
And now it's at a point where you can't look away from this issue. I don't care how much you don't like to hear these words; you are gonna have to get used to hearing them, because your rights are at stake too. So that's where I'm at on the health data. HIPAA does not protect most of the information that you put into these apps unless they're from a health service provider or your insurance company.
And they do so much contracting, with sub-apps and this-and-that APIs to link things together, that nobody's responsible in the end, because we don't have strong enough laws for consumers to know about what's going on and enforce their rights. Until we get privacy rights in this country, we still won't be able to do much about this.
So this is why everybody's freaking out now, this is why we freaked out when Ruth died, and this is why I'm gonna keep freaking out until we get this decision, which could come as early as Wednesday.
David Kruger: Those data flows are all over the place.
Jeff Jockisch: And not just location data; search terminology, a variety of different things, right?
If you search for abortion services, or prenatal services, or whatever, that information goes to ad tech vendors, goes to data brokers, can go to law enforcement, can go to anti-abortion, I guess, providers, what do you call those services? Pro-life services. And probably to people that might be angry with you for thinking whatever you're thinking.
So yeah, it's a big privacy concern for people, whatever your thoughts are on the issue.
David Kruger: Well, you know, the thing is, again, you have the search engine. I mean, there's a story about DuckDuckGo giving stuff to Microsoft that, you know, violates their own policies. And there's a technical problem with browsers:
the way browsers are architected, you cannot stop people from downloading stuff into a browser. You have progressive web apps that don't go through any kind of vetting in an app store. And even the vetting in an app store: again, you vet out what you don't like; you don't vet out the things that give you information that you can resell and make money on.
So there's a lot of variability in those things. And the useful thing about all of this, and I know this sounds terrible, is that the more outrages people experience based on their privacy, in one respect, that's almost gotta happen for people to wake up and smell the coffee and realize that their personal information (a) has real value and (b) can be used against them by anybody for any reason.
So wake up: this privacy stuff isn't some side issue, right? This is real serious stuff.
Jeff Jockisch: Yeah, remember the real-time bidding stuff that Johnny Ryan's lawsuit is going after right now? It applies to this too. If you're searching for services like this, your information is going out to potentially hundreds of different organizations every time you search for a term.
And that includes abortion services.
Heidi Saas: Massive ongoing data breach.
Cat Coode: Yep. Yeah. And just to tie this one off, to reiterate, because I think you all covered different areas: there is the curated version, where I'm gonna search something online and I'm literally putting the term in; there's the maybe back-of-your-mind version of these health apps that are tracking your cycle;
and then there is this totally non-curated area of geolocation, where we're being tracked to these sites and we're not even thinking about what our apps are doing. So now, David, you said wake up and smell the coffee, so we're now gonna go back to Jeff for an issue about Tim Hortons that
everyone knows about. The Tim Hortons app collected vast amounts, vast amounts, of sensitive data. So what is the story behind that one?
Jeff Jockisch: Yeah, this is an interesting story. Apparently Tim Hortons was unhappy that people were drinking coffee other than Tim Hortons coffee, and so they started tracking their customers when those customers were going to Starbucks and Dunkin' Donuts.
And some people found out about it. One guy, a reporter, James McLeod, found out that Tim Hortons tracked him 2,700 times in less than five months. So he did a story about this, and now Tim Hortons got in a lot of trouble. Obviously what they were trying to do was protect their brand and, you know, find out when customers were going someplace else,
probably to try to figure out what they were doing wrong. Which makes a hell of a lot of sense from a marketing perspective, not so much from a privacy perspective and a customer trust perspective. And so Canada's privacy commissioner was not real happy with Tim Hortons and came down on them pretty hard.
But as we discussed a little bit pre-show, I do not think there were any fines on Tim Hortons. They did agree to stop doing this, and they got in a hell of a lot of PR trouble, but there were no privacy fines, because this is not illegal in Canada, at least at the moment. So it's an interesting issue.
It's not actually illegal in the United States right now either, I don't think, but it will be in 2023, because California, Colorado, and Connecticut are all making location tracking opt-in, because it's sensitive personal information. So that's the story. I imagine other people here might have a comment on that.
David Kruger: Yeah, well, from one aspect of this thing: you know, shame on Tim Hortons' developers, or their contract developers, for being stupid enough to get caught. And I'm saying that tongue in cheek, but this goes to the underlying problem, right? Location tracking doesn't just happen in the app.
It happens in the browser as well as in the app, and it can happen while you search. And again, this goes back to some fundamental design problems that none of this legislation ever addresses. It's all reactive, right? It's: oh, you did a bad thing, and you shouldn't do that, so we're gonna make a law against your doing that.
But the legislators never ask the right question. They always ask the question, how are we going to stop this? They never ask the question, why is it possible? They never attack the fundamental technical engineering things that make all this crap possible. And I'm glad we're seeing progress; I'm not discounting that stuff.
But until they get to the fundamental question, why is this possible, we never get to real privacy. We'll never get there as long as they keep on focusing on how do we stop it. That's the wrong question; it's too far downstream to solve the problem.
Cat Coode: That's privacy and protection by design.
Yep.
Heidi Saas: Yeah. Why were there no fines, Cat? What's going on with that? Why were there no fines?
Cat Coode: OK, so a couple of things. First of all, we do have consent and transparency requirements in Canada, but we don't have geolocation covered, although it's coming. And my understanding of the Tim Hortons app, because I don't drink coffee, is that its intent was that you would open it and say, you know, where's the closest Tim Hortons? Which is always
30 meters away. And then you would say, this is my order, which Tim Hortons do you want it ready at? And you would press the button. So there was a legitimate reason to have geolocation baked into the app, but it should have been, like you're saying, David, privacy by design; that was the whole idea.
And the consent and transparency said that we will use location in order to identify the closest Tim Hortons to where you are. But then the location was left on 24/7. And I'm thinking, you could even track someone from home
to work and say, this person always stops at Tim's 10 minutes after they've left their home, no matter when they leave, whether it's at eight in the morning or three o'clock; they always stop at Tim's 10 minutes after they've left their home. Even that data is not what they were supposed to be collecting. But we do have another story, and then we can talk a little bit about the Canadian law, but geolocation is being marked as sensitive data
in the new Canadian law. So, yay.
David Kruger: What I wish somebody would've asked in this article is not just whether they collected it. What I wanna know, which was not in the article, is: did they sell it? That's what goes back to data brokers.
Cat Coode: Data brokers, data broker questions. Everything could just be summed up with data brokers.
All right, David, you've got our last story. You had kind of alluded to this one before: DuckDuckGo is caught giving Microsoft permission for trackers, despite its strong privacy reputation.
David Kruger: Is that for me? Oh, okay. All right. Well, again, it's all of a piece, and this is the problem: you've got whoever's doing search, whoever's selling a browser, right,
and whoever's installing an app, whether that's a browser-based app or something installed natively on the device. And if you've got leakage from any of the three of them, again, there's no privacy. There's no partial privacy; it's either private or it's not.
So in this one news cycle, we've got the Tim Hortons app, we've got the DuckDuckGo ostensibly private browser, and then you always have data collection by anybody who's running a search. Well, not everybody; there are some search engines that do not collect information, but they're rare and they're not well used.
Right? So those are three holes that you have to fill. And again, fundamentally, the thing that we don't address is what makes all that possible. Because you can slap people's hands and you can fine them, but as long as the amount of the fines is something that they can write off as a cost of doing business, which is true across the board, slapping their hands is never gonna solve the problem.
Again: privacy by design at the browser level, at the app level, at the search appliance level, and at the social media level. All of those things have yet to be addressed in any single piece of legislation that I've seen.
Heidi Saas: Yeah, I want new devices. I want our devices to be built differently, because we're going to use services differently moving forward.
So I want devices to be built differently, but that doesn't really make good business sense for the makers right now. So that's kind of where I'm hung up, because here's the situation: I'm at the office, I'm hanging out, and I'm on my computer. My computer and my phone are talking; I didn't tell them to, but they do.
I leave, I get in the car; my phone and my car are talking. I don't want them to, but they do. Then I get home, I've got my other computer. You know, by the time I get back to the office on Monday, my phone is like, guess all these tech items she's been using, telling all my secrets to everybody. And I didn't give consent to any of that stuff.
You know, it's almost like a jealous mate: were you using another computer? Did they have better graphics than I do? What is all this about? If this was a person in your life that acted this way, you would ditch them, because they're nuts. So I'm not really sure how we can convince the device makers to make better devices that are privacy-respecting and privacy by design at the device level.
But if we're gonna live our lives out of these things, and they're gonna be our passkey to everything else, we're gonna need better devices. We can't go into the metaverse with this crap; this is not okay. And we can't do data ownership with these either. Access, not copies, right? That's what we're looking to do.
We wanna own the data, we wanna give people access to it, we don't want people making copies of it, but we've gotta build that technology into the infrastructure, starting with our devices. So that's where I'm at with that. I'm glad to see people are starting to do a little bit here and there, but like you said, it is privacy washing.
So the more we call that out, the more people are gonna realize there's so much more we should pay attention to.
Jeff Jockisch: You know, I think the ultimate solution, as we move to this next version of the web, whether that's Web3 or the metaverse or whatever it is, is that we have to control our identity.
And there are a lot of solutions that people have been working on for a long time now. There are so many people working on identity from so many different angles that who knows what's gonna happen, but ultimately we need to control that identity. And hopefully it's some sort of self-sovereign, portable identity solution.
If we can actually control that identity, it solves a lot of our privacy issues, because then we can actually parcel out that identity in ways that allow us to control our data more effectively. But of course, if identity goes in the wrong direction, it could actually be much more of a disaster for privacy
than what we have right now. So it's a problem.
David Kruger: Well, you know, I'll throw this in here: beware Web 3.0. Because, you know, we're trying to control our identities, we're trying to control our data, and Web3 so far is a very difficult thing to define. And if you think of your data as something that's mobile, something that moves around, because it is physical,
it is real objects that are controlled by physics that get moved around and used by machines and people, then Web3 so far is an attempt to put the brakes and the steering wheel on the highway instead of on the car. That's probably not going to work out very well. The gap between the amount of hype around Web 3.0 and the amount of engineering substance is extraordinary.
And I love that term, privacy washing.
Heidi Saas: You should also check out my article that I published today on The Interline about fashion tech, fashion, and gaming.
David Kruger: I read your stuff.
Heidi Saas: At The Interline, about privacy and trust as brand messaging in the metaverse, because there's so much garbage out there.
David Kruger: I will read your stuff, and again, it's because I do like your stuff.
But there's a lot of hype. Just keep that little thing in your mind when people are talking about changing the structure of the internet to accommodate privacy: remember the brakes and the steering wheel, which are the controls we want on our data, right?
The brakes and the steering wheel go on the car, not the road.
Cat Coode: So we already built the car and we didn't put the brakes on it. Now people are on it.
David Kruger: Correct. And then you can read all of my articles that point that out.
Heidi Saas: So there we go. And I do, we do, we love your articles, too.
Cat Coode: An abundance of analogies that I love.
All right. So, quickly, the Canadian law that has just been announced on June 16th, which is new. As Jeff had said earlier, we didn't used to have a tribunal that could actually create fines or say that anything was illegal, and that's coming in Canada. So our fines are actually up to 4% like GDPR, or 3%, I think, but there's a 5% of gross revenue that's coming in, especially from an AI perspective, which is all new in Canada. So we are getting an AI and Data Act and an actual AI and Data Commissioner to deal with ethics. And I know Heidi does a lot of work on data ethics around AI. So we are looking at algorithms and how they're using data and abusing data.
The big thing: we had a CPPA, because we don't have enough acronyms, as everyone says. That was the Consumer Privacy Protection Act, announced last year. That version is also coming, and it includes rights we didn't used to have: the right to deletion, the right to portability, all of that.
But one of the big things everyone complained about last year was that de-identified information was not considered personal information, 'cause it had been de-identified. And in the world of data brokers (Heidi's already shaking her head), we know that they don't know how to properly de-identify information to a point where it cannot be re-identified.
So now the argument is that de-identified information in this new law is personal information and falls under all of these wonderful rules and regulations, and that if it is truly anonymized, then it can be used and abused because it is not personal information. So I am waiting, 'cause we need subsequent things to go with this, like what that definition of anonymized means.
Because again, we see this often, where people say, "I'm only using gender and postal code and this one other thing," and then you have one non-binary individual in that one area, and then bam, you have identified the individual. So, yes, we are waiting on how we're actually gonna define that.
Heidi Saas: But is it passed?
Cat Coode: Pardon? Is it passed? Yes, it's coming. It's coming. I'm the non-lawyer, so I don't know how this whole thing works, but yes, it's coming for sure.
Heidi Saas: Is it a bill? Is it passed? Is it happening for sure?
Cat Coode: It is happening for sure. So, yes. I don't know the terminology for that, but it is happening for sure. It's long and it's coming soon. And we might actually be using postal code, which is like zip code, as an identifier. We'll see. That might actually be construed as enough of a personal piece of information that all of the data brokers in Canada might be in trouble, because so many of them use it.
Because our postal codes go by even side of the street and odd side of the street, they're very specific. Sometimes the areas are very small, sometimes they're bigger. Either way, that's the Canadian law. So now everyone knows. I don't know if anyone has any closing remarks. I think we covered a lot today.
Jeff Jockisch: Yeah, it's important. You know, I would throw one stat out here real quick. I don't know if people realize how easy it is to re-identify data, but if you know the gender of somebody and you know their zip code, it's something like 50% of the time that you can uniquely identify that individual.
It's relatively trivial.
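The point Jeff and Cat are making (that a couple of quasi-identifiers like gender plus postal code can single someone out) is usually measured as k-anonymity: the size of the smallest group of records sharing the same quasi-identifier values. A minimal Python sketch, with entirely made-up toy data:

```python
from collections import Counter

# Toy dataset: (gender, postal_code) are the quasi-identifiers.
# All values here are invented for illustration.
records = [
    ("F",  "N2L 3G1"), ("F", "N2L 3G1"),
    ("M",  "M5V 2T6"), ("M", "M5V 2T6"),
    ("NB", "K1A 0B1"),  # the lone non-binary person in this postal code
]

# Count how many records share each quasi-identifier combination.
group_sizes = Counter(records)

# Any record whose combination is unique (group size 1) is re-identifiable
# on those two fields alone, no name or birthdate required.
unique = [r for r in records if group_sizes[r] == 1]
k = min(group_sizes.values())  # the dataset's k-anonymity level

print(unique)  # [('NB', 'K1A 0B1')]
print(k)       # 1 -> the dataset is not even 2-anonymous
```

On real data the same check is run over millions of records; any group of size 1 is exactly the "bam, you have identified the individual" case described above.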
Heidi Saas: So yeah. And the location data too, you know. I mean, numerous stories have been done on this, but I'll just tell you: where I live right now, there are six people in a square mile. The square mile I'm in, there's only six people in it.
And how many of those are data privacy badass lawyers? One, right? So, I mean, that's kind of a problem. And I'm wary of vigilante justice too, so I really need somebody to do some more on privacy rights in this country, 'cause I really wanna get back to working on access to justice, but I can't do anything on that big societal issue until we have rights in our data and trustworthy data practices.
So if we could get this resolved and move over to data ownership, do access, not copies, and get that handled already, then I can move on to what I really wanna do. And then I'll be less cranky, I promise.
David Kruger: yeah, yeah, yeah. This what I'm, I'm waiting for this logical progression where you do get the data brokers who do have some laws that said that they have to so forth, you know, de anonymize you, I mean, the data.
And they take it as an opportunity to open up a sister company that does open source intelligence capabilities to re attach the data to identities. So they get to bill twice.
Heidi Saas: So, the Office of Science and Technology Policy is asking for comments; they're due by July 8th. If you have some comments on privacy-enhancing technologies, if you know of a PET, if you're working on a PET, or if you just wanna tell them something as they're setting forward best practices, please send your comments in so that you can let them know. PETs are not common across the board. They're also not all effective; a lot of 'em are snake oil. We need to develop audit capabilities for these things. We need to verify that the PETs are working before we mandate their use. So if you wanna speak on that issue, I would highly encourage you or anybody else to reach out and let me know.
I'll give you all the information you need, so you can send those comments to the people who can do something about it.
Cat Coode: Nice. Awesome. Thank you guys so much. So that's a wrap. We're gonna thank our guests again: Heidi Saas, Jeff Jockisch, and David Kruger.
I'd also like to invite everyone to listen in to our free podcasts that happen every few weeks, and to join the free Data Collaboration Community.
We're a vibrant group (look at us, vibrant!) of data-savvy professionals who collaborate on open datasets and build free tools for important causes. You can visit datacollaboration.org/community to learn more about becoming a founding member and get the link to sign up. So thank you again, everyone. Until next time.
David Kruger: Thank you.