Host Erik Rind and special guests take a deep dive into the noteworthy, concerning, and downright fascinating stories featured in recent episodes of the Data Drop News podcast.
Pro tip: you can listen to The Data Drop Panel on your phone by subscribing to our podcast.
About The Data Drop
The Data Drop podcast is a production of the Data Collaboration Alliance, a nonprofit dedicated to advancing meaningful data ownership and global Collaborative Intelligence.
Join The Data Collaboration Community
The Data Collaboration Community is a data-centric community where professionals, nonprofits, and researchers join forces to collaborate on datasets, dashboards, and open tools in support of important causes. Learn more.
Full Transcript
Erik Rind: Hi folks, I'm Erik Rind, CEO of ImagineBC and a member of the Data Collaboration Alliance. Welcome to the Data Drop Panel, where each month we gather some leading data and privacy professionals to hear about the news stories that stood out for them over the past month or so in the fast-paced world of data privacy.
It's always interesting to hear what's raising the eyebrows and curling the fists of practitioners. I should note that all the stories we'll feature have been included in our podcast, which delivers a five-minute privacy news roundup every other week. This month, the Data Drop Panel has three guests:
Jeff Jockisch, CEO of PrivacyPlan; Priya Keshav, CEO of Meru Data; and Sameer Ahirrao, CEO of Ardent Privacy. Welcome, everybody.
Priya Keshav: Thank you. Happy to be here.
Sameer Ahirrao: Thank you for having us, Erik. Looking forward to the conversation.
Erik Rind: All right, let's get kicked off. Jeff, you're first here.
So the topic is "Your online activity and location is being exposed 747 times a day." I'm actually a little surprised it's that small.
Jeff Jockisch: Yeah, well, that's an interesting case. The Irish civil liberties organization, I think that's the one led by Johnny Ryan, filed a lawsuit, and it's against the IAB in Europe.
This is a case about real-time bidding. They've done a lot of research, and one aspect of that research is understanding how our personal information is shared in the real-time bidding process. When you break that down and look at Americans, for instance, all that ad tech means that Americans' information is given out to over 4,000 companies.
And our information is shared roughly 750 times a day. This is personal information, personal data as well as location, which in their opinion, and in a lot of people's opinion, is a data breach, on a massive scale, every single day. That's just a scary thing. Now, the ad tech folks and Google and whoever else is involved in this would probably tell you that this is information that can't be attributed to an individual person, that it's de-identified.
And that is technically true. But when you have a large set of information about people, 20, 30, 50, a hundred pieces of personal data, even if it's de-identified it can be pretty easily re-identified. I think that's pretty much a well-known fact.
So if I were to take that information about you, Erik, and take it to another data broker, they could probably pretty easily figure out that it's you.
Erik Rind: Yeah, that's interesting, Jeff. When I started my company, at the very onset I actually hired Ipsos, the market research company, to do a study on exactly that: how much data can you collect before they can find your needle in that haystack?
And I was surprised how small that set was. Hey, if you have this piece of data and this piece of data, I can figure out who you are. That really surprised me.
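To make that concrete, here is a minimal sketch of the linkage attack Jeff and Erik are describing. All of the data, the field names, and the reidentify helper are invented for illustration; the point is only that a "de-identified" record can be matched against a second dataset on a few shared quasi-identifiers.

```typescript
// Hypothetical linkage attack: a pseudonymous ad-tech record is matched
// against a broker dataset on quasi-identifiers (ZIP, birth date, gender).
// All data here is invented.

interface AdTechRecord {
  deviceId: string;       // pseudonymous ID, no name attached
  zip: string;
  birthDate: string;
  gender: string;
  sitesVisited: string[];
}

interface BrokerRecord {
  name: string;           // the broker's dataset carries real identities
  zip: string;
  birthDate: string;
  gender: string;
}

// Return every named person who matches the "anonymous" record.
function reidentify(ad: AdTechRecord, broker: BrokerRecord[]): string[] {
  return broker
    .filter(p =>
      p.zip === ad.zip &&
      p.birthDate === ad.birthDate &&
      p.gender === ad.gender)
    .map(p => p.name);
}

const adRecord: AdTechRecord = {
  deviceId: "a91f-3c07",
  zip: "21201",
  birthDate: "1972-03-14",
  gender: "M",
  sitesVisited: ["health-site.example", "news.example"],
};

const brokerData: BrokerRecord[] = [
  { name: "E. R.", zip: "21201", birthDate: "1972-03-14", gender: "M" },
  { name: "J. J.", zip: "33301", birthDate: "1980-07-02", gender: "M" },
];

// With just three attributes, the candidate set often shrinks to one person.
console.log(reidentify(adRecord, brokerData)); // ["E. R."]
```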
Priya Keshav: A little bit of credit to the ad tech industry, though. When they started out, their intention was not to collect personal information, but the definition of personal information has changed, right? And just the sheer volume of what they collect makes it personal. Obviously they're not collecting name and email address and physical address,
so they feel like they don't have personal information. But because they collect so much information about everyone, even if it's a cookie ID or a device ID, it becomes easy to end up knowing more about you than you know about yourself. It's amazing how much they track.
And I think part of it is just the failure to recognize that cookies and device IDs and IMEI numbers and all of those things, which look like some random string of ones and zeros, are still personal information. I think that's the problem: it's still difficult for them to recognize that.
Erik Rind: Yeah. Sameer, are you buying the idea that they really don't know who you are? I don't buy that.
Sameer Ahirrao: Yeah, I think you guys covered it well, but I'll tell you, there is an interesting story, which I'm going to get to later, right out of India, with that scooter company, along similar lines: what exactly is PII versus PI, right? Everybody's trying to get away with GDPR or CCPA by saying, hey, we don't have personal data, we don't process personal data. So I think regulations eventually should evolve the whole definition of personal data. It's not just your first name and last name and Social Security number.
As long as you can be identified, it should be treated as personal information. You remember that New York Times article where they actually tracked people in the White House? Can you believe that? From free and some partially bought data, they mapped everybody's location.
And there were some people in Afghanistan, right, in a war zone, and they could still map all kinds of family connections and social connections. So yeah, I think we need to get away from the idea that we have anonymized data.
I don't think it really exists the way it's claimed. We need something that says, as long as you can be identified, that data should be under regulation, with the appropriate due diligence.
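To ground what "sharing" means in real-time bidding, here is a rough sketch of the kind of bid request broadcast to bidders each time a page or app loads. The field names are loosely modeled on OpenRTB conventions but simplified, and every value is invented; the takeaway is that no name appears anywhere, yet the combination of device ID, IP, and precise location is exactly the kind of "anonymous" data the panel is questioning.

```typescript
// Simplified sketch of a real-time-bidding bid request, loosely modeled on
// OpenRTB field names. All values are invented. Note what travels with
// every ad impression even though there is no "name" field.
const bidRequest = {
  id: "req-7f3a",                           // unique auction ID
  device: {
    ifa: "6D92078A-8246-4BA4-AE5B",         // advertising ID: pseudonymous but stable
    ip: "203.0.113.42",                     // IP address
    ua: "Mozilla/5.0 (iPhone ...)",         // user agent
    geo: { lat: 39.2904, lon: -76.6122 },   // precise location
  },
  user: { id: "u-19c4" },                   // exchange-specific user ID
  site: { page: "https://health-site.example/article" }, // what you are reading
};

// A request like this can be fanned out to hundreds of bidders per
// impression, which is how one person's activity ends up "shared"
// hundreds of times a day.
console.log(JSON.stringify(bidRequest, null, 2));
```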
Erik Rind: So Jeff, I started by saying that I was surprised it was only 747 times a day.
This problem is only going to get worse with 5G technologies, right? With the ability to put more and more sensors out there, we'll be tracked more and more without us even knowing. I mean, sensors tracking our cars as they move down roads.
Jeff Jockisch: Yeah. I certainly agree with you. It doesn't get better without regulation.
I don't think notice and consent solves the problem, so it comes down to regulation. What is it going to take for our government to act?
Erik Rind: To be honest, there are a number of US senators who are talking about this. The problem is that they just talk about it; nothing concrete is being done. It's going to take aliens from outer space to get really useful legislation passed.
Jeff Jockisch: Yeah, I'm not sure where we're going there.
Erik Rind: Okay, all right, let's move on. Priya: the Roe draft raises concerns that data could be used to identify abortion seekers and providers. Now, this is clearly a very hot topic right now.
Priya Keshav: It's actually on the front page of the New York Times today. I think issues like this bring privacy to the forefront and make us realize how little privacy there is.
And not that we haven't been talking about it; the previous topic was pretty similar. Data brokers end up collecting data both directly and indirectly, and they collect both mundane and very intimate details about us. Things like, what kind of prescription medications did you get?
Did you go to a particular site to read about, maybe, abortion? Some of these trackers are supposed to be anonymous and aggregated, but you end up with very specific information too. And all you need, I think you mentioned this, is one to two data points. You have Bluetooth beacons in various places,
so if you go anywhere near them, they know, maybe not you, but at least that your iPhone or whatever smartphone you have was there, whether you're in the grocery store or it could be a Planned Parenthood. And collecting that kind of data is not very expensive.
I don't know if you saw, one of the late-night comedy shows was talking about data brokers and how cheap it all is. And if you look at the New York Times piece, one of the things they point out is that getting details about 600 Planned Parenthood locations, who visited them and where they went, costs about $160, and it's available to anybody.
And not so long ago, just a few months back, maybe my timelines are a little fuzzy, we were talking about Afghanistan, right? We were talking about how the fact that everybody had a digital identity suddenly became an issue, because now it could be in the hands of the Taliban.
I'm not trying to compare the two issues, but when issues like this come up, that's when you realize how little privacy we really have. There's absolutely no way to escape. If you buy something, they track you. Your phone is tracking you all the time. And again, the fact that there is no regulation on who can access that information or how many people can buy it just makes it a very tough topic: you don't know how it can be used and for what purposes.
Erik Rind: Yeah. And what's interesting there is that in the first topic with Jeff, we really focused on advertising, these guys using our data to try to sell us goods. Annoying, not necessarily harmful. But with this type of data there's not really a product to be sold, so if I'm buying it, I would think I'm buying it for political purposes.
It's the politicization of our data, and that's a lot scarier. Neither is good, but that one is really worrisome. I mean, when we walk into voting booths, we're covered, right, so people can't see how we vote. If they could start using data like this to target us for political reasons, that's not okay.
Sameer, what do you think about that?
Sameer Ahirrao: I think there is a perspective here of passive versus active, right? If you look at advertising in our old TV era, it was passive. You'd see the advertisement on the screen, but it didn't follow you back home.
It was one-way; that's passive. With this active advertising, they see you here, they see you in Target or whatever grocery store you go to, then they see you on your browser. Oh man, this is like somebody actually tracking you. And with abortion, we get into bodily privacy here.
No matter whether you are pro or anti, whatever side you are on, it doesn't matter. Everybody has a right to their opinion, to their religious or other beliefs. But everybody also has a right to bodily privacy. This falls under a very intimate kind of privacy.
And the other thing: Silicon Valley, big tech, martech, everybody tried to make it so big, tech is so awesome. But you saw this Cambridge Analytica stuff, right? You cannot just give access to this data when the buyers are just automated bots.
I think it's over-automation, and that's how all that data leaked. So yes, you can automate, you can buy advertising, bid on it, but at the same time, if there is no human element to actually monitor what exactly is being sold... You saw the recent statement from one of the Facebook executives, right?
He clearly said, we don't know where this data goes, and we can't really control it, and our goal is to basically connect the world. Fine, but not at the cost of my data. That is something we really need to worry about.
Jeff Jockisch: You know, I know we're running out of time on this topic, but let me add something really quickly.
Data doesn't care how it's used. 68 to 70% of Americans consistently say they're scared about how the government might use their personal data. What they don't think about is that all of these companies they're giving their data to freely, or that are grabbing it freely, will give that data to the government
pretty much whenever the government asks, because that's what our privacy laws allow them to do. So essentially they're giving it to the government by proxy. And the government can buy that data from data brokers for a couple of dollars.
Erik Rind: Yeah, it's a little bit crazy, and the government being complicit in it is really bad. All right, let's move on to the next topic: Indian customer sends notice to Ola Electric for making telemetry data public.
Sameer Ahirrao: Yeah, so this is a very interesting case, and I actually specifically asked for this one.
This is an electric scooter company, kind of the Uber of India. A guy had an accident, and there was all kind of controversy around why that accident happened. Basically the guy said the brakes did not work right, maybe some kind of fault.
And then the scooter company actually published his telemetry data, saying that he was driving faster than the speed limit, or whatever. Even so, that does not mean your brakes should not work. But look at what data they have, right? You are riding the vehicle,
and you have no idea they're collecting it. So yeah, he actually sued the company, saying that under our contract you should not be publishing this. There is no direct regulation which covers that, but under some section of, I think, the IT Act, or contract law, whatever it is, he said, you should not be sharing that data.
And of course the company's line is: look at your speed, you were driving fast, so it's not our fault. Okay, but making that public is for your own positive press: hey, it wasn't our fault. I think that follows on from the first news item we had, right? How many data points there are, and here
the companies keep claiming, we did not share any personal data. Come on: some person had an accident, his name was already in the press anyway, and now they're saying, oh, we didn't share personal data, we shared telemetry data, which has no identification. Come on. That is his data.
You know who was driving, and you are collecting that data every second. And now you're saying, oh, it's not personal data. I think we need to, again, redefine these PI and PII definitions: it's just not about first name and last name. As long as you can identify the individual, it is personal data.
And a simple example: yesterday I went to a hospital for some minor stuff, and, oh man, they just ask you for your birth date and first name, sometimes not even the last name. With only two characteristics,
they can find you in their systems. That's identifying a person. We did not have that problem in the past when we had paper records; now everything is automated, we've got all this AI stuff going on. So yeah, I thought that would be a very interesting one, but I'd like to hear others' thoughts.
Erik Rind: Yeah. Jeff, how responsible should big companies be in protecting our data?
Jeff Jockisch: Well, all of the privacy laws really just say that companies need to have reasonable security. And I think that's part of the problem: it's hard for companies to figure out what reasonable security is, and it's hard for anybody to enforce it.
What is reasonable security? Obviously it's not enough, because cyber attacks and identity theft and ransomware just keep increasing, pretty much every month. So we're not doing enough, companies aren't doing enough, and we need to change.
We can't just say reasonable anymore. We've got to do something different.
Erik Rind: Priya, what do you think about that?
Priya Keshav: I have some comments on the case itself. I think the usage matters a lot, how you use the data. The telemetry was captured, but there are no restrictions on how it can be used.
And again, I agree with Sameer's point that it is personal information, because it talks about your behavior, how fast you drive. And it could be misused if it's sold or shared: an insurance company could start using that information to say, hey, you've been speeding 200 times in the last month,
so your premium goes up. The same thing can happen in healthcare. All of these things have repercussions, and we have no idea who has what type of data. We don't even know if insurance companies are already using that kind of information to, for example, reject people or offer different premiums to different types of people.
They're probably already doing that. Those kinds of things have direct repercussions. So it's not just about releasing telemetry data for PR purposes, as in this case; it has other repercussions as well.
Sameer Ahirrao: I think there's an important point to note there.
What do you call consent, right? He may have given a general consent when he bought the scooter, that click-to-accept kind of stuff. But he did not know that if an accident happened tomorrow, they would publicly publish the telemetry data against him. So I think it's a very interesting precedent.
I always say that the whole privacy perspective will not be implemented just because of regulation. More and more incidents happen every day, and that's what is going to force people, companies, and eventually regulators to actually tighten things up.
That's how I think it's gonna go.
Erik Rind: All right. Jeff: do people caught on Ring cameras have privacy rights?
Jeff Jockisch: Yeah, I think this is more of a philosophical debate, I guess, but one that I have a personal interest in. I tend to walk in the evenings in my neighborhood, and I'm pretty sure I'm caught on about a hundred Ring cameras
every time I take a walk. In fact, I even get caught on security spotlights, because some of my neighbors seem to think that motion sensors that trigger when I'm walking on the public street are okay, which seems a little odd to me. You know, I think we're in a very strange place right now, because, I don't know if everybody knows this, but I think most people do by now, police departments are getting access to a lot of these Ring doorbell cameras as well.
So it's not just for people's own individual protection; now we've almost got a whole surveillance network in every community. And that's great for crime prevention, but maybe not so much for privacy. It hasn't really gone through the courts yet as to whether or not that's a violation of our individual privacy.
I think it's probably going to get challenged, and who knows where it's going to go, but I think we've got some concerns. I have some personal concerns.
Erik Rind: Well, if you take a picture of somebody and you go to publish that picture, you need to get their approval to use their image in that picture.
Otherwise you have to blur them out. I mean, what's a Ring camera except a live stream of me, right?
How do you feel about that?
Priya Keshav: I think cameras are everywhere, and with AI they can do a lot of things. Facial recognition is becoming more and more common and easy. So Ring poses a question, and privacy is a problem. If somebody could find out what I was doing, whether it can be used to solve a crime is one question, but in the name of crime, what else can be identified?
There's a reason why there needs to be due process. When you skip those processes, even if it's for a good cause, one must question where we should draw the line.
And I think privacy should be a fundamental right. We're all privacy professionals, so we believe in it, so maybe we're biased, but I believe almost everybody you ask is going to say, yes, I do not want to be monitored 24/7. But the devil is in the details, so I don't know.
Cameras are everywhere. I have a Ring camera, I have motion sensors, so I guess it serves a purpose, but I don't store the data. And I agree with you, I would probably not reveal any kind of images without permission and would be mindful of privacy. But whether everybody would do that is the question.
Erik Rind: Sameer, so Priya says it serves a purpose. Where's the line between our theoretical safety, our increased security, and our privacy? I mean, that's the decision we as citizens are going to have to make somewhere in the near future, right? Who's to say, I'm concerned we're becoming a surveillance state,
and therefore I'll accept less security in my life and want more privacy? What are you thinking?
Sameer Ahirrao: Well, I think it's high time to think about this, right? And some people, like Priya said, are actually loving it. I see all those neighborhood discussions on Nextdoor and those kinds of apps:
hey, I saw somebody on Ring checking out my car, he has no business being here, so please be aware. So there are some benefits to that kind of free surveillance. The key thing is, I think, that Ring provides this information to the police, which is kind of concerning. But then, see, it's privacy by design,
that's what Ring should do. When you put up a Ring camera, how much territory can you cover on somebody else's property? For example, there is a difference between a half-acre or one-acre house versus your townhouse or even your condo, because there you're going to cover ten houses.
So I think there probably has to be something Ring implements: different products depending on what kind of house you have, or at least an option. We should not be allowing people to monitor the entire street. I do have my personal cameras, which I don't upload to the cloud.
I also have a Ring, but it covers a very limited area: at most the area my lens covers, which is me, my house, the house opposite, maybe four houses maximum. I think we've got to think about those kinds of limits. Just because you've got a Ring camera does not mean you are a cop, or that you have the right
to monitor the whole neighborhood. That gets into a really gray area. And the problem with all this privacy stuff is nobody saw it coming: everybody somehow adopted the Ring camera, then all the data mining became an economy, and I think the laws couldn't keep up with it.
We also did not understand the repercussions of it early on. So I think all these companies like Ring had better actually implement privacy by design, rather than waiting for somebody to enforce it and for people to throw out those cameras. I don't use my Google Home or Alexa anymore.
A lot of people unplugged those a long time ago. If people don't use it, there's less data, so it's going to be less accurate. So companies need to think about building privacy into the product up front, rather than having people throw out their devices. But all right, let's stop there.
Erik Rind: Priya: the Polish DPA rules on the right to access personal data contained in trackers.
Priya Keshav: Yeah, this is an interesting case, right? Somehow these issues kind of come back to the same topic: what is PI, and how much do we understand about what we collect? In this case, a data subject had asked a website operator for information about what kind of information is shared with third parties,
what they collect about him, how it's shared with third parties, and a copy of the data itself. The website operator did not provide the information, saying that they did not process anything that would identify him, and then just basically gave some information on how to disable cookies on the website.
So it comes back to this lack of a clear definition, or maybe understanding, that personal information includes cookies, it includes IP addresses. It's not just name, email address, Social Security number, which obviously we've started to realize is personal.
People don't think about how some random number or gibberish is attributable to me, because it's tracking a lot of intimate details about what websites I visit, and obviously that is being used to target specific ads at me.
In this case, the website operator did not understand how it works, so the disclosure and the consent were not very clear, and they were sending this information to a third party even before they obtained any kind of consent. And of course, because they didn't know,
they also did not provide this as part of the DSR request, so they violated GDPR through that process. But it just comes back to fundamentally understanding how IP addresses work, how cookies work, what types of processing happen through cookies, how it's transferred to third parties,
and how it's used to track specific behavioral information about me: that I'm interested in certain types of shoes, or that I'm in the market for a certain type of product. A lot of information is gleaned from that, but most companies seem to not be quite clear on what it is they have enabled through these scripts.
They kind of just assume that since they accepted the script being put on their website by a third-party provider, somehow they are not liable for it. So it's an interesting case that brings to light some of these issues from a privacy perspective.
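As a concrete illustration of the failure mode Priya describes, here is a minimal sketch of gating a third-party tracking script behind recorded consent, so nothing fires before the visitor agrees, and the same record can answer a data subject request. The types, function names, and script URL are all hypothetical, not taken from the case or from any particular consent platform.

```typescript
// Hypothetical sketch: only load a third-party tracker after consent is
// granted, and keep a record sufficient to answer a DSR. Runs in a browser.

type ConsentRecord = {
  analytics: boolean;      // has the visitor allowed analytics trackers?
  grantedAt?: string;      // ISO timestamp of consent, kept for DSR answers
  thirdParties: string[];  // recipients disclosed to the visitor up front
};

function loadTracker(consent: ConsentRecord): void {
  if (!consent.analytics) {
    return; // no consent: the script never loads, so no cookie or IP leaves
  }
  const script = document.createElement("script");
  script.src = "https://tracker.example.com/pixel.js"; // placeholder URL
  script.async = true;
  document.head.appendChild(script);
}

// A DSR answer should be derivable from records like this one: what was
// collected, when consent was given, and which third parties received data.
function describeSharing(consent: ConsentRecord): string {
  return consent.analytics && consent.grantedAt
    ? `Analytics data shared with ${consent.thirdParties.join(", ")} since ${consent.grantedAt}.`
    : "No analytics data has been collected or shared.";
}
```

The website in the case did the opposite: the script fired before consent, and the operator could not reconstruct what it had enabled when the access request arrived.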
Erik Rind: Jeff, you're shaking your head a lot after hearing that.
Jeff Jockisch: I was just thinking, you know, this is a company that actually had a consent mechanism on their website, and they actually did have some sort of DSR process, and yet they still failed pretty miserably, primarily because they hadn't really thought through all of the information that they collect, how they collect it, and what they needed to do with it.
Part of that is their failing, but part of it is that it's hard, it's complicated, right? This is technology that a lot of people don't understand, if you don't really understand the ad tech, if you don't really understand privacy, if you don't understand what can be personal information.
So if you don't have the expertise, you've got to go out and hire it, or you've got to grow it from within. And if you don't do that, if you don't invest in that, you're going to fail.
Priya Keshav: I agree with that. I just want to point out something: most people kind of think that they buy a technology,
let's say cookie consent, and there it is, they're compliant. But privacy is a lot more complex, and it requires a deeper understanding of the topics, and they don't put the two together, which is why you implement something partially and yet you fail. Erik, sorry to jump in,
but I just wanted to point that out.
Erik Rind: Yes, that's a great point. Sameer, anything on that?
Sameer Ahirrao: Yeah, and it's not easy, right? There are a lot of these third-party plugins, and all this API economy. You go to a travel site, you don't know how many there are. I was talking to a major company, I think a QuickBooks kind of company, and they have like hundreds of partners with whom they exchange data.
So first of all, the people who are doing this, do they even know how this data is used? That's my first question, because I am just a consumer of that, and I haven't checked those privacy policies ten times for every third-party plugin. We recently published a privacy quiz, and we basically checked
what the quiz platform's privacy policies were like. So I think some automation is coming; we have some good privacy tech coming up which will give people that ability. But Priya is right: consent management, and not even consent management in general, just consent management for the website, which is the front-end part of
a whole business. That's the only thing: oh, we've got privacy, okay, we've got consent management, we've got an accept-all button and things of that nature.
Erik Rind: All right, so in the time we have left on this program, we have one more topic. Sameer: the FTC sets its sights on ed-tech companies.
Sameer Ahirrao: Yeah, that's pretty near and dear to me, and it should be near and dear to everyone.
It's basically about the whole generation. And thanks to the pandemic, kids are doing everything online, education and what not. I think the whole thing started with the State of the Union. If you followed it, President Biden actually gave almost 45 seconds to this topic, which was a big deal, and the area of concern was all about children and privacy,
and how these technologies are used. What they mentioned was that before one child turns 13, there are like 12 million data points already collected on them. That's something to be concerned about. We were kind of lucky, right? If I'm 40 years old, they couldn't start collecting my data points until decades in.
Now the collection starts when you are born and then use your computer, and I'm pretty sure you can figure out, from the patterns, the age of the person on average. So I think what this news tells all the ed-tech companies is: be careful how you use that data. And you saw that Life360 app, right?
There was a big controversy around that one. It's an app for parents watching out for their kids, in case kids get lost or something, so a very good use. But they were taking that data and selling it, and people actually found evidence that this data was being sold. How dangerous is that? In the olden days,
we were worried about kids being kidnapped at school or on the way home from school. Now a predator doesn't need to go to the school; they can get to kids on their Discord channels or whatever, and just basically befriend them.
Social media has given so much access, and all these gaming platforms follow those eight-, eleven-, thirteen-year-old children and whatever mistakes they make there. So I think these kids' issues should be treated very carefully, and I'm glad the FTC is taking a very good initiative on that.
It's a very sensitive topic, because it's all about children, so it's very important. Ed-tech companies: be careful how you're collecting and sharing data. Because the education sector, another point of concern, is not highly resourced, right?
You see the budgets; schools operate mainly as nonprofits or in the public sector, so they are not as well organized. All these ed-tech companies are giving them tools, and now they've got to monetize, because they may not be able to be very profitable in this industry otherwise, so they're going to find ways to be profitable.
And parents may not know. There are tons of platforms out there which your kids and my kids use, and who knows how they're sharing that data, with identity or without identity. So watch for it. If you're running an ed-tech company, especially in the United States, this is a very good warning, and it's good news, because minors' data is absolutely sensitive.
Erik Rind: Stunning. Jeff, should the government step in and prohibit certain industries, like ed tech, period: you can't share that data. I mean, forget it, just don't even think about it. You're providing a service, I get it, but charge for it, and don't share the data and don't sell the data. I think the government has to get involved.
Don't they?
Jeff Jockisch: Yeah, I think they should. They probably need to have more guidelines; it's probably pretty rightful for that to happen. Especially because you've got a lot of private equity firms that are now getting into ed tech and merging a lot of these companies, probably exactly so that they can share that data between those companies, if not externally.
And I think there's a lot of risk there. You see a lot of data brokers in the ed-tech space. And regulators have always sort of hit the children's market first. In the education arena, FERPA is a very early privacy law, as well as COPPA, which was really sort of America's first comprehensive privacy law,
if you think about it. You can think of it as a sectoral law, but it really was comprehensive, just for a subsection of the population: children. And it has holes, because companies can get around it by saying, well, I didn't know they were kids.
If we were to actually plug that hole and make companies figure out whether their users were actually kids, that would be a pretty damn strict law. But we haven't done that yet. And if you were to couple that with this new focus on ed tech by the FTC, you'd actually have a pretty damn strict regime for privacy.
Priya Keshav: I think it's a very important topic. I have kids, and the number of apps they are exposed to is just crazy. I can count on one hand the apps where parental consent is required; 99% of the time there is nothing, and the passwords are horrible, and there is no security.
Plus, of course, you don't even know what happens in the secondary market in terms of data sharing. While kids are being educated on this topic a little bit, I don't think it's possible for them to take care of themselves and protect themselves. It's crazy to think that by the time you become an adult, somebody would have 12 million data points about you.
That's just way too much. I feel there needs to be more regulation and a lot more control, because parents can't do this alone. And what Sameer said is true: there is not enough money, but that doesn't mean you can shortchange the future of these kids.
Erik Rind: Hear, hear. Hopefully the government's watching this podcast. And that's a wrap, everyone. Thanks again to our guests: Jeff Jockisch of PrivacyPlan, Priya Keshav of Meru Data, and Sameer Ahirrao of Ardent Privacy. I'd also like to invite our listeners to join the free Data Collaboration Community.
We're a vibrant group of data-savvy professionals who collaborate on open datasets and build free tools for important causes. Visit datacollaboration.org to learn more about becoming a founding member and get a link to sign up. Thanks, everybody, for joining.