
The Data Drop Panel for January 2022

Host Debbie Reynolds and special guests take a deep dive into the noteworthy, concerning, and downright fascinating stories featured in recent episodes of the Data Drop News podcast.


Pro tip: you can listen to The Data Drop Panel on your phone by subscribing to our podcast.


About The Data Drop


The Data Drop podcast is a production of the Data Collaboration Alliance, a nonprofit dedicated to advancing meaningful data ownership and global Collaborative Intelligence.


Join Node Zero


Node Zero is a data-centric community where professionals, nonprofits, and researchers join forces to collaborate on datasets, dashboards, and open tools in support of important causes. Learn more.

 

Full Transcript


Debbie Reynolds: My name is Debbie Reynolds. I'm a data privacy advisor and strategist from Chicago. Welcome to the Node Zero community at the Data Collaboration Alliance. This is the Data Drop Panel, where every month we gather some of the leading data privacy and data professionals to hear about the news stories that stood out to them the most in the past month or so.


The stories featured today are also included in our sister Data Drop News podcast, which delivers a four-minute data privacy news roundup each week. This month we have very special guests. We have Heidi Saas, who is the founder and CEO of H.T. Saas, LLC. She's a data privacy and data protection professional from the


DC area. We also have Sameer Ahirrao from Ardent Privacy; he's the CEO and founder of that organization, also from the DC area. And then we have Jeff Jockisch, who is the founder and CEO of PrivacyPlan. He's also a data researcher and a dear friend of mine. So let's jump right in. Heidi, you have a story: WhatsApp rewrites its privacy policy after a record $225 million GDPR fine.

Heidi Saas: Yeah, that one was a little bit more than just having to hit the swear jar. So they decided to go and do a little bit more about it. What they did was say: we are not changing how we collect the information, or how we share it, or how we use it.

And you don't have to agree to anything new. They've just offered a little bit more transparency about what they're doing. So they've added more words to it. And if you start digging into the words, as I was: oh, you can delete your account, you can do this and that. Well, if you want to learn more about data deletion, you can click right here.


And when you click right there, here's what you get. So we've still got a few problems with transparency, and it grates on me, but I think this was right; this is the UK version of the WhatsApp privacy policy. What's also interesting is that this week I also got updated privacy policies from other companies, such as the terms of service for Google.

Google now says that besides the terms, there's their privacy policy, and although it's not part of the terms, "we encourage you to read it." So Google has decided that the privacy policy is no longer part of their terms of agreement. I mean, there was a lot of shifting in the regulatory notices and things going out to people at the end of the year.


It's not just people getting upset and wanting more transparency. The question, I think, is what are people going to do now that they have the transparency and know more about what these companies are doing? Well, what are you going to do about it? They're not going to stop, you know. So did you really accomplish your goal by getting them to take this step?


So I think, you know, the enforcement fines getting bigger is adding a little bit more legitimacy, but are we getting closer to our goal?


Debbie Reynolds: That's a great point. I think people already misunderstand what their rights are, and it isn't as transparent as it should be. And also, you know, I think someone somewhere needs to address the fact that these things aren't written in plain English and they're entirely too long.


So it's like: do you sell my data or don't you? They'll tell you all the other stuff too, but they bury you in so many words that you won't even know about it or care about it.


Debbie Reynolds: Yeah. What do you think, Jeff?


Jeff Jockisch: Well, I'm glad to see somebody finally getting a fine, right? I mean, we don't see very many big fines coming out in privacy.


There are a few, but they're few and far between. And I think transparency really is the thing that we want to look for, right? We get a lot of other things happening in privacy and a lot of focus on regulations, but transparency really needs to be the big goal, I think. And this is a sort of step toward that.


And we need real transparency, as you just said, right? Letting people really understand what's going on rather than throwing a lot of complicated words at them. And I think this was maybe a small step toward that. I don't think WhatsApp is actually getting to true transparency here.


But maybe that's the intent.


Debbie Reynolds: Right. What do you think, Sameer?


Sameer Ahirrao: So the interesting thing about WhatsApp compared to other social media platforms is that you don't need to actually sign up, right? It acts almost like your default messaging app for billions of people. So people probably don't even realize they're signing up for something.


Because your phone number is essentially your user ID, and that's the killer feature for WhatsApp: unlike Facebook, you don't have to go and fill out full forms, right? You just add your phone number and download the app, and it's so simple to start messaging. So of course these record fines and things are at least making them accountable, and I think that's good.


They got pretty good action, or at least some attention, in India as well about the misinformation and so on. So I think it's good news. More transparency is required, and even more so for WhatsApp, because a lot of people don't even realize they're signing over their data, since there is no registration form or anything that you read.


Boom, you're on WhatsApp. Yeah.


Debbie Reynolds: That's a great point. It's, I guess, the opposite of friction. So it's like, nope, no friction whatsoever. It's very easy to slip into these things. So yeah.


Jeff, you have a story: Australian Online Privacy Bill to make social media age verification mandatory for tech giants, Reddit, Zoom, and gaming platforms.


Jeff Jockisch: Yeah, Australia is looking to really update their Privacy Act. They were one of the first countries to come in with a strong privacy act, but that was almost a couple of decades ago now. So their Privacy Act needs some updating, and they've put a bill out essentially for comment, and they've promulgated


67 different proposals to seek public commentary. And they've essentially organized those into three different classes of organizations that they're looking to regulate more heavily: social media platforms, data brokers, and large online platforms. And the large online platforms class roughly correlates


to what Europe is doing with very large online platforms as part of some of their legislation. So that's sort of interesting. And I think what's really important here is that this could actually be stronger than what the GDPR is doing right now. The enforcement mechanism could potentially go up to 10% of global turnover.


Which is pretty amazing, right? So they're not playing around. Now, it could be that this doesn't actually stay there; I mean, that's the proposed amount, and some of the other things they're proposing are pretty heavy-handed as well. So we'll see if it actually stays there. But Australia has been taking a pretty leading role in privacy for a long time.

They're maybe not so forward-thinking at the moment, because they haven't updated their legislation in a while, but I expect them to move forward and sort of get ahead of where everybody else is now. So I'm expecting some big changes.


Debbie Reynolds: That's a great point. Well, what do you think, Heidi?


Heidi Saas: You're right. Yeah. People don't really pay enough attention to what's going on in other democratic countries to see how it may have an impact, because they're trying other things and, you know, meanwhile we're getting nothing done. But we can be learning from what they're trying, and they definitely know what they're talking about.


I mean, the privacy commissioner for New Zealand is now the head privacy commissioner for the UK. And I think one of the tweets that got him in trouble was essentially saying they don't give a damn. So, I mean, that's the word of a privacy commissioner now?


So, I mean, the people that we really want to be in positions to make change are getting there, not necessarily in our country, but globally I think we're getting where we need to go. Because data is global, and so our strategy should be as well. So I'm encouraged by some of the things that I see coming out of there.


Do I see something like that coming our way? No, because of the power of dark money in our country. But I'll get to more of that in a minute. So I am encouraged to see new legislation come out that thinks about and applies the mistakes we've already learned from, saying, let's do our legislation this way


because the other way wasn't working for them. So that's smart. That's, you know, encouraging.


Debbie Reynolds: I think that in Australia, first of all, things about children tend to get passed, so this is definitely going to go through. And then also, depending on how difficult it is for organizations to implement,


I think the organizations that don't have to comply will probably start doing this as well. They'll see it as a harbinger of things to come. What do you think, Sameer?


Sameer Ahirrao: I think social media and gaming, right, those tie directly back to our society. And I think privacy is just a fundamental human right.


More and more people are realizing that. So I think, even more than the lawmakers, it's all the incidents happening around the world, right? Those are naturally prompting these laws to come forward in legislatures. And I think we're going to see more countries doing the same. It's just the social impact.


I mean, the power of all this social media, we saw it in the United States during the elections. And the gaming part, I don't think it's really monitored; there are good things done there by the companies and the nonprofits who actually try to solve this.


But I think they actually picked three very important industries, right, social media, gaming, and one more, which directly impact people. So it's good news, and I think that's going to happen globally no matter what. And all these incidents are going to push that envelope forward.


I think that's what's going to happen.


Debbie Reynolds: So Sameer, I'm going to start with you now: the digitization of courts raises privacy concerns, but India lacks a right to be forgotten. I'm fascinated by what's happening in India. So tell us about this.


Sameer Ahirrao: Yeah. So there was a big WhatsApp story in India as well, right, about how we stop misinformation. But overall, the right to be forgotten, I think it's a very big one, and together with data minimization it's something we are really passionate about, so I could talk for hours, but I'll be brief. I think the good news is it is coming up. The PDPB, the new data protection bill, does have a right to be forgotten. It's not passed yet, but it's very close, because it was supposed to be reviewed by the joint parliamentary committee. So that headline may be a little bit misleading, because I think they're working on it. If we look at the court decisions for that Bengali actress, there were two incidents. In one, she asked to have videos taken down which she didn't like, or which basically should not be public.


I think the court gave a favorable decision to her request. The second court decision involved a man who was accused of rape and asked that this data be deleted, but the Madras High Court ruled that no, the concept of open courts applies, so that has been stopped there.


But I think it's moving in the right direction, right? The awareness is coming, and eventually the PDPB will pass; it's just not there today. And the right to be forgotten is very important, in a ton of other places too, because how do we even delete the footprint of data which is not true?


And who is eventually the decision maker on whether it is true or not? If somebody publishes something about me on YouTube and it goes viral, that does not mean it's true; anybody can say anything, right? And people sometimes do that in the name of freedom of speech.


But I think freedom of speech also has a flip side: the right not to be spoken about when it's not correct. So it's a thin line in general, but I think we're heading in the right direction. The right to be forgotten is actually part of the new law.


Heidi Saas: I think data destruction is one of my favorite topics, because there's not enough investment in that area and we've got so much unstructured data.


We're not going to need it. We are never going to need it. And we're not going to stop collecting. So what do we need to do to get rid of it? That's why I like what Sameer does with data minimization: get in and get rid of that stuff, you know? And I think the right to be forgotten isn't really going to be realized until we have better technology. The data is never going to fully go away, but we can make sure it's not going to be used in a way that is harmful to you.


You know, the free speech rights, I hear that, but that's only if things are true and not intended to shame, harass, or damage you. So in those instances, unless it's newsworthy, people should have some rights as to what the public knows about them. Data ownership is how we're going to be able to achieve that.


Debbie Reynolds: What do you think, Jeff?


Jeff Jockisch: I'm glad India is updating their law. And I think a right to be forgotten is sort of interesting. Obviously, it sort of cuts both ways. Twitter actually just changed some of their terms of service to allow them to do better management of harassment and things like that.


Privacy activists are a little scared about how that may fall out. But overall, I like the right to be forgotten if it's implemented well. That said, I'm also a person who thinks that we have to have some level of a reputation economy. So everything that you do can't just be erased because you don't like what you did in the past.

Right? We have to have some level of responsibility for our actions as well. So we have to figure out that balance, and I think we'll probably, as a society, figure out what that is eventually.


Debbie Reynolds: I just want to add here that the Data Collaboration Alliance supports eliminating copies and supports forgetting, you know, to reduce the data footprint and also the risks of


retaining data too long. As for the right to be forgotten, I think it will pass as part of this Indian law. But the right to be forgotten really only works in places where privacy is a fundamental human right. In a place like the US, where we don't have that as a fundamental human right, you


can't forget something that you never had a right to in the first place. So I would love to see more places, first of all, make privacy a fundamental human right, and then also look at the right to be forgotten as going further than just having a deletion policy going forward. So let's go to the next topic.


Heidi: lawmakers push for a federal data privacy law after a report reveals Amazon is gutting state legislation. This is interesting.


Heidi Saas: Okay. This article was forever long, but it was a great article. Basically, here's what happened. Jay Carney used to work in DC for past administrations. Amazon was looking for someone who could do some things for them in Washington.


After a long search, they finally got Jay Carney. Jay Carney got over there and said, hey, here's what we need to do: we need to tell them that they can't live without Amazon, all the jobs we provide here, all the things that we do, so they need to back off us on all these other things. And that was kind of their methodology of going through things.

Eventually they started to understand, because it took him a while to get Silicon Valley to hear him when he said: if you don't engage in dark money activities, you will not have a seat at the table, and these legislators will have their way with you. So finally they started listening, and they started opening up a couple of 501(c)(4)s, because with a 501(c)(4)


you don't have to publish the list of donors. That is the basic mechanism, the tax structure, for a super PAC. Now there are a couple of them out there doing things. This article goes a long way in showing how well-linked the lobbyists are to the legislators, because legislators don't write the law; the lobbyists write the law.


And you can see email exchanges back and forth saying, change the definition of this and change the definition of that, because we don't like it. And they did. So yeah, this is nothing new, but it's one of those things where, when other people find out what's going on, they start to get excited about it.


And for those people, I would like to call your attention to OpenSecrets, because OpenSecrets is an organization that is always following the money. Think of it as NASCAR for politicians: you want to know who's sponsoring them. They don't have the labels all over their jackets, but OpenSecrets does. And they follow these people.


They know who the lobbyists are. They know where they worked previously. Dark money is called dark because it comes from groups with hidden ties to the industry they want to regulate. So this is how it works in Washington. But OpenSecrets will tell you who these people are. These are all the people trying to fight the regulations we would like to see for data ownership and regulation of big tech.


All of these right here on this page. American Edge is the newest one that they set up; it's Facebook, pretty much. And they say, hey, I'm a think tank, and here's what I'm doing, I'm working on all these great projects. But what they're really doing is pushing back on antitrust and any other sorts of regulations.


And you can just see the interplay of how these groups work, where the money's coming from, and the impact it's having on Washington. This is why we're not getting things done here. The entrance of big tech into dark money in the last two years is exactly why you're seeing everything that you're seeing right now, because before, they just said the regulations don't apply to us, and whatever happens in Washington doesn't concern us.

But they're here now. They're definitely here now. And it's fascinating for me to watch. I'm excited when other people get clued into it, because they can start watching it as well. It's my favorite sport. But really, our rights are at stake here. And I like the fact that OpenSecrets likes to call people out.


We need to know which of our colleagues are working with these people. If you're getting paid $15,000 a month to advise Amazon, then I don't necessarily know that you have the foundations and fundamentals or the privacy principles that I have. So colleague, yes; the same, no. That's where I am on that. I like the daylight on this situation too.

Debbie Reynolds: Well, Heidi is very badass. We love it. What do you think, Jeff?

Jeff Jockisch: Well, I think you can see that money is flowing into a lot of these issues just based upon how the state legislation games have been playing out, right? If you look at states like Florida, Texas, Oklahoma, Arizona, where a lot of these


privacy laws have been going back and forth: it looks like they're going to pass, and then they change radically, and they go back and forth. Florida is really a great example. They almost passed the bill; it came down to a nail-biter, and it almost passed, but in the end it didn't.


And I mean, all of these laws are caught up in our politics, right? But it was pretty obvious that there was a lot of money at play in that particular legislative session. So I think Heidi is very much right. There's a lot of money at stake, and it's influencing what's in state legislation.


Sameer Ahirrao: Yeah, of course. I think this definitely set us back as a society, probably two to three years. Because see, if you don't push this legislation forward, if you don't get some kind of limitation on what data is collected, eventually that data is getting to big tech or whoever collects it.


So it's basically just pushing out the timeline. We all know it's going to have its impacts, and we're going to get privacy legislation eventually, with all these stories coming up. But I don't think it's good news. I mean, I was also looking at the dollar amounts, and some of those donations weren't big. I won't sell my privacy for $800, seriously.

When you see those small donations, I would say it's a little bit shocking. You know, every company has lobbyists, and lobbyists do those kinds of things, but I wouldn't have imagined it was being done at this scale. And what does that mean as an American, right?


It means we've lost our freedom over all the data collected. Who's going to pay for that if something bad happens out of it? So I think we've got to do this data minimization at the collection point, and anything else extending the responsibilities. Transparency is one thing, but I don't think it's enough.


People don't realize what data they are throwing out there for free. I mean, the concept of sharenting and things like that: people post willingly because they just don't know what the impacts are, for their kids and for themselves, and how people can misuse that data. And I was reading the other day about this whole doxing story.


We'll talk about that in the next news cycle, I think. But no, it's not good news. I think the story should actually push us toward fast-tracking this regulation across the states.

Jeff Jockisch: This is a really important story, right? And I just want to make this additional point about Florida.


The Democrats wanted this bill to pass, the Republicans wanted the bill to pass, and the governor wanted the bill to pass. Right? And the bill didn't pass. Why is that?


Heidi Saas: Because there's a lot more money to be made in prolonging the problem than in coming up with a solution. That's why.


Debbie Reynolds: I agree. So Jeff, you're up next: hacking fingerprints is affordable and simple, says Kraken Security. Tell us about that.


Jeff Jockisch: So this story is pretty interesting. Obviously, fingerprints are probably one of the first biometric pieces of information a lot of people think about; well, maybe they think about facial recognition first, right? But fingerprints are very basic, right?


This Kraken article essentially says that for five bucks, anybody can probably steal your fingerprints and use them to log into anything you've tried to secure with them, right? Myself, I actually use fingerprints to secure my bank account and my health records on my phone. And so this is pretty damn scary.


Luckily, I have my phone secured with other stuff, so if somebody were to steal my phone, they wouldn't be able to immediately log into those apps. But assuming they could break into my phone, they could get into my bank and they could get into my health records. I've got to go change my security now after reading this story, because that's just not acceptable to me.


Right? That's really, really bad. You know, I think people don't realize how problematic this is. I mean, it sounds like securing your bank account with your fingerprint would be a good thing, and it's just not.


Debbie Reynolds: Yeah. What do you think, Heidi?


Heidi Saas: I'm concerned as well. You know, I've always had the position that I can't get a new face. Where do I go to argue with somebody that this is my face? If someone stole my face and is doing something with it, how do I catch them? And how do I argue that this is my face, instead of them saying it's their face? Like, I don't know how to prove it. I have one face and one set of fingerprints, and I can't get new ones.


So, I mean, that's always been my problem with collecting biometric data. You can have as many safeguards in place as you want, but you're still collecting it. I'm doing the same thing with my phone, too. So I don't know what would happen if somebody suddenly got hold of my fingerprint; that's the danger here that we're seeing.


Debbie Reynolds: And what we'll continue to see more and more is that when we digitize biometric information, it becomes its own other thing. The fingerprint is removed from the individual, right? It becomes its own data point in and of itself. And you can manipulate computers to do almost anything.


So it's very crazy and scary, definitely. Sameer, do you have comments?


Sameer Ahirrao: Yeah. So I think the key thing is permanence: you can change your passwords, you can change your two-factor, but you cannot change your fingerprint. It is the simplest way, and it makes authentication so easy for you.


But at the same time, if it is stolen, it is stolen, and I don't think we have found a solution for that. So most of the states actually worry about that. Everybody has tried to do some kind of biometric, but we need to be really careful about the whole biometric space. Some people try to actually not use that feature.


I see a lot of people who don't actually want to put their fingerprint on the iPhone and use it as an authentication method. But technologically we need to evolve, in terms of either using alternative biometrics, or maybe two kinds of biometrics, I don't know. Of course, the uniqueness of biometric authentication definitely needs to be weighed very heavily, because what happens to users if there is a mass collection of it?


And if somebody misuses that, I don't think you can even stop it. So of course, anything we do in that direction would be very positive, because once someone has bypassed it, well, it's out there, right? So I don't know if we'll move away from biometric authentication in the future, or just alternate those methods.


Debbie Reynolds: Biometrics is crazy, and it's going fast and far, wherever we're going. The only thing I can think of, and maybe this is on the horizon, is kind of liveness detection stuff. So if Jeff is using his fingerprint in the town that he's in now, and someone used his fingerprint in some other location, they'd probably say, well, that's not Jeff,


because, you know, he's not there. Or checking that the signature is from a human, someone who gives off heat or light or something. I don't know, I'm just talking crazy at this point. So the next story we have, and this is going to be interesting. Sameer, do you want to talk about it? Clearview AI was told it broke Australia's privacy law and ordered to delete data.

I like this story. I know a lot about it. Tell me.


Sameer Ahirrao: Yeah. So Clearview is actually facing action in Australia, covering some timeframe between October 2019 and 2020, I think. Everybody knows there was a big story earlier, I think out of the UK or somewhere in Europe, in terms of Clearview AI.


So here is the interesting thing, right, about the whole Clearview concept. I do give consent, right? When I put my picture on Facebook or any social media, it's just to identify myself in one way or another. That is one thing, right? You have consent to show it to my connections, and basically my connections look at my photo, right?


But, like we talked about, when they do a massive collection of that and use it for surveillance or doxing, that's completely illegal. And I think that has a mass impact. So now, okay, I may have given digital consent for a particular purpose, but we are actually violating that purpose.


An AI engine is not supposed to have my information just because someone pays $10,000 or something to get it, just because they can do it in a faster way, right? They can get those facial features from ten different places. So I think, of course, any kind of scrutiny on that is welcome. And the whole use at airports and all that:


there are a lot of legitimate use cases where people want to use it for security, but it's again a balance between security and privacy. So it's an interesting story. Australia eventually realized that its citizens' data, mass data, was at stake. So far we have heard stories, right, that the government has our data, like fingerprints and things of that sort.


But now, if somebody is doing this from a private entity that the government has no control over, that isn't elected, then it's basically up for sale to anyone. That's kind of scary.


Jeff Jockisch: I think it's just an example of what's to come, right? They're going to be cracking down a lot, probably not just on big tech, but also on data brokers, let's say.


Heidi Saas: I want to hear from the Data Diva on this, because I know you've been following BIPA and everything in Illinois, and, ooh, I want to hear it from you.


Debbie Reynolds: Oh my goodness, I have lots to say. So basically, this is a big deal, because Clearview AI has cases like this pending in multiple jurisdictions, including a case in the US. Australia basically put the hammer down on Clearview and told them to delete this data. And they were working with the UK on this case.


Recently, the UK moved to order them to do the same thing, and I think they've been fined as well, several million dollars, for this. So they also have to delete those images. Next up, all eyes are on Illinois and what's going to happen with Clearview AI there. Part of their case in the US, though, is that they feel they have a First Amendment right


to collect this data. And the problem that we have here in the US goes back to us not having privacy as a fundamental human right; we don't really have a right not to share. So the problem with this argument, if it passes, is: okay, does Clearview AI's First Amendment right to free speech usurp my right as an individual to free speech when I haven't said anything?


My data that they're collecting surreptitiously, or scraping off the internet: is that speech for me, and is it protected? So this is going to be a crazy, blockbuster, bananas case with Clearview AI. I'm really interested to see how this and the other data-scraping cases


play out, because what they're really doing, and the thing that Australia and the UK and even the US are concerned about, is that they're putting data about innocent people in databases that are used to investigate criminal activity. So it's creating a guilty-until-proven-innocent type of scenario when you do that.


There's an issue in the tech world right now about these types of databases, where we have technology people saying, okay, we can supply empty databases to police departments to give them the capability to collect this stuff, but they need to collect it based on the evidence that they have, or the people they want to target, as opposed to having this huge mosh pit of anyone and everyone and trying to create sort of a criminal profile for every person.


Sameer Ahirrao: There was a concept, actually; we were doing some research with a law school student two years ago, and we introduced in that article the concept of national privacy. So even if you are doing mass data collection, say for legitimate government reasons or anything, I'm comfortable sharing my face and my data with the national government of that particular country.


But that does not mean any company from outside, right? Locally, it may be used for legitimate purposes, but the concept of national privacy is: okay, that particular data can be shared within my country, but not beyond those boundaries. So that was an interesting concept. We actually wrote a paragraph on that in the larger article, and I think things are getting there.


And I think, all over, the question of who should have all this mass data, whether it's you or the government, is a big thing too. And I hope that gets straightened out at some point with these kinds of things.


Debbie Reynolds: Excellent. That's excellent. Does anyone have anything else to add? That's our last story.


Heidi Saas: Yeah, I was just going to add that since we are coming up on the fresh new season of people introducing their favorite BIPA knockoffs in the spring legislative session, I just want to remind people that cut and paste didn't work last year. So don't do it again.


Let's look and see what legislation is working. If you really are interested in drafting legislation that's going to go somewhere, then I think you need to find out what is not working about BIPA, because there are appellate cases on, you know, whether each instance is a violation, those sorts of small things.


So if you're going to cut and paste legislation that's already had things disputed in it, where case law has altered it, in the legal field we call checking that Shepardizing. You'd Shepardize it and make sure you're not going forward with bad law. I wish lobbyists would do that.


Like, yes, go and create things, but be smart about it, because if you're not, we're just going to call you out. The same thing didn't work last year, and it's ignorant to try it again.


Debbie Reynolds: Excellent. Well, thank you all for doing this session. I really enjoyed it, and it was very educational and thought-provoking for the listeners.


Sameer Ahirrao: And thank you for hosting, Debbie. I'm pretty sure this was pretty interesting.


Debbie Reynolds: Yeah, this is great. Thank you.





