Privacy Please

S5, E201 - Securely Speaking: Privacy & Tech Attorney Heidi Saas

Cameron Ivey

Send us a text

Join us for insights into the crucial aspects shaping the future of privacy. We'll delve into the significance of diversity in the privacy sphere, with women at the forefront of leadership roles. Discover why mentorship is indispensable for those embarking on privacy careers. Explore the delicate balance between profit and privacy, as companies often prioritize the former, jeopardizing user trust. Learn about the anticipated impact of CPRA enforcement on data management and privacy practices. Finally, we'll discuss the imperative of addressing biases in AI development to ensure the fairness of algorithms. Don't miss these essential discussions that are shaping the landscape of privacy.

Support the show

Speaker 1:

All right, we are live. Ladies and gentlemen, welcome to Privacy Please, live edition: Securely Speaking. I am your host, Cameron Ivey, and with me soon should be Gabe Gumbs, my co-host. We have a really special episode today, so I just want to thank everyone for taking the time to join us on this Valentine's Day special episode. We have a special guest, Heidi Saas. Heidi, how are you doing today?

Speaker 2:

I'm doing all right. Hey everybody, thanks for coming.

Speaker 1:

It looks like you're standing right now. I feel a little jealous. I don't have a standing desk at my house.

Speaker 2:

I do this so much that if I don't have a standing desk, I don't get up.

Speaker 1:

Yeah, that's fair. I've actually been looking at those little treadmills that you can put under your desk.

Speaker 2:

This is a bookcase. It's just a bookcase with all my law books on it and a couple of shelves, and I put a laptop holder on top of it and lights and put my foot there. When it gets hard to stand, I can shift my weight, whatever, but yeah.

Speaker 3:

Yeah.

Speaker 2:

Yeah, it's nothing fancy. I don't have any fancy treadmill desk. That's too much.

Speaker 1:

Yeah, you got a nice setup going on. I like it.

Speaker 2:

I live in this box. I don't get out of this box often unless people pay me to come scare them in person.

Speaker 1:

Well, Heidi, here's what we like to do on this show. Obviously, you're our star, and thank you for taking the time to join us as people roll in. But why don't we kick things off with you telling us a little bit about yourself and where you came from? Everybody's journey is different, right, and that's what's so beautiful about the privacy industry. We'd love to learn where you came from and how you got to where you are today.

Speaker 2:

All right. Well, I started, I would say, in consumer rights, so that's kind of where I started my journey into privacy. When I was working in consumer rights, I was dealing with debt issues with people during the financial crisis, and at that time they were buying and selling commercial paper on just Excel spreadsheets. They didn't have supporting documentation. It was very shady. So we decided we would do what we could to help consumers by not charging a billable hour and instead charging based on the success we achieved for them: negotiating lower amounts to pay on valid debts, and on ones that weren't valid, drafting the documents and preparing people to go argue it on their own behalf. In those cases we got a lot of debt wiped out, because if you're going to sue my client on a contract, I'm going to want you to produce that contract, and I know you don't have it. So we were able to help a lot of people that way. I managed over 200 lawyers in all 50 states, and at that time we were actively lobbying for Dodd-Frank and the establishment of the CFPB, because we needed an agency to look after the consumers. The FTC is good as a cop on the beat, but we needed someone looking deeper into these issues, and so the CFPB was created.

Speaker 2:

Thirteen years later, it's now in a position where it's starting to do some things. So I'm a little hesitant when people say we need an AI agency. I've learned from experience that it takes a very long time to stand up an agency, and the government is not particularly good at doing anything quickly. Do you really want to give one agency control over that? All agencies should have AI officers in them.

Speaker 2:

So when I was leaving consumer rights, I thought: all we are is data. I've got to learn as much as I can about that when I come back into the workforce after staying home to have a few kids. Because, honestly, motherhood is often where legal careers go to die. So it's a choice you make about what path you want to take after that. I decided to learn privacy and data technology, as much as I could, to fight consumer violations where they're happening. So I started working in privacy.

Speaker 2:

I took the CIPP exam in 2019. Everybody was talking about the GDPR and cookie banners and this and that, but it was still enterprise heavy. I thought, well, nobody's looking at what's causing the issue. So I wanted to learn more about behavioral economics, targeted advertising, behavioral profiling, algorithmic bias: all of the dirty details about data, how it's used, how the tools are built, all of these things. At this point, I feel like the world has come all the way back around to where I am, and now they're coming to people like me to say: we're ready to listen. So it's an exciting time, and I'm not the only person.

Speaker 2:

There are a lot of other people that are in this role, that have this skill set, and they're being overlooked. What I'm seeing right now is that they're being replaced by pale males from Yale, because they went to law school and stayed at the Marriott. What other qualifications did they need to take this job? But they're superseding the hardworking women who have been there all along, using all of their talents and efforts to keep the ship afloat. That's disconcerting. This used to be a female-led industry, and once there was money in it, everybody else started showing up, so it's starting to dilute the original makeup that I thought was so encouraging.

Speaker 2:

I still have faith in us as a privacy community. I started doing outreach on LinkedIn and working with other people, in addition to the consulting work I was doing, because in the early days they weren't ready to listen to what I had to say. And I told the FTC this too. Clients said: we understand this is a solid program for privacy and it's probably the right thing to do, but we don't want to invest in it right now because we don't really see any threats on the litigation or enforcement landscape. And I heard that enough, and I said: you know what, let me see what I can do about that landscape for you.

Speaker 2:

So, that's when I started branching out and trying to move towards positive change in this area, because I didn't see really too many other people doing it. So I'm encouraged by the support that I've received. Some people just think I'm saying scary things and scroll right past, but you can't really deny it when you see those scary things happening in the world around you later on, Right?

Speaker 1:

No, that's great. That's so amazing, because you said a lot of good things there, and I want to take it back a little bit. Obviously, diversity in this realm is so important, and you can see it's growing and growing, especially for women in privacy. It's nice to see, because a lot of the leaders in your area seem to be strong, confident women leading the privacy front, which is amazing and encouraging, because you guys are staples for younger women that are looking into maybe going down this path, which is super important. So how does that make you feel? I would imagine that's part of the drive as well; like you were saying, you wanted to bring it back to the way it was originally formed, but it's good to see that it's growing and that someone like yourself is in this space fighting for it.

Speaker 2:

It's humbling to see students of all ages in all stages of life. Maybe they got laid off recently and wanted to upskill, or maybe they're in law school currently and looking for direction on where they want to go. Other people that have expressed interest, that have turned into excellent students, are people that previously worked in safety and trust; big tech isn't really interested in safety and trust right now, so those people are not working, but they have a great transferable skill set to get into privacy engineering, because it combines cyber and privacy and the code where you need them.

Speaker 2:

So it depends on where people are coming from, but so far I haven't met individuals with useless skills. I mean, there's some sort of skill that you have, even if it's just soft skills: dealing with people, breaking down complex issues into bite-sized pieces they can digest. If that's the only skill you have, there's value in that.

Speaker 1:

You know you start where you are.

Speaker 2:

If you are interested in getting into privacy, then you can find a mentor, and you can find a lot of free resources to try to figure out what area you want to go into, because it's not just: there are 14 states, and the DSARs have to include this, and checking boxes and getting through the exercise. There's a lot more to it. Ooh, there it is. Hey, there he is. You're late. Did you at least bring me flowers?

Speaker 3:

No, I did. There's a dump truck right out there, a dump truck, a dump truck filled with flowers. I killed as many red roses as I could humanly get my hands on.

Speaker 2:

I think it's killing time.

Speaker 1:

Yeah, I don't think a dump truck was a good thing to describe having flowers in.

Speaker 3:

No, you're not wrong, you're not wrong, you're not wrong. I did show up wearing red too, though.

Speaker 1:

Yeah, I missed that memo. Sorry guys, that's my fault, I missed the memo. It's a date, it's a date. Well, you know, let's move on.

Speaker 2:

I'm coming late and give you great deco.

Speaker 1:

Well, you know, whatever happens, I missed it.

Speaker 3:

My apologies for the delay. I had a tiny bit of technical difficulties at first there; I was in the wrong platform. How is everyone doing? It's great to see you.

Speaker 1:

Doing well. We're diving into diversity and a little bit of Heidi's background, where she's coming from and her passion for privacy. And, Heidi, you mentioned something before Gabe jumped on about larger companies. I'm not remembering exactly what you said, so I don't want to quote it wrong, but you were mentioning something about larger companies not caring about privacy as much. What was it you were getting into? Because I wanted to dig into that a little bit.

Speaker 2:

Oh, with the safety and trust.

Speaker 1:

Yes.

Speaker 2:

Yes, if you care about safety and trust, why did you lay off 98% of the people who do the safety and trust work for you? Right, so that's just kind of common sense. I'm calling BS on you if you say my privacy is important to you and there are no humans over there doing the work.

Speaker 1:

Right, it's interesting. What was that you said, Gabe?

Speaker 3:

I said it's disingenuous at best. I'm going to agree with that. Yeah.

Speaker 1:

And we're seeing that across the board too, with a lot of large companies.

Speaker 2:

What about data minimization? Every law is requiring data minimization. We've been telling you: only collect the data that you need. Yeah, yeah, yeah. And then if you talk to people: what do you need? I need everything. How long do you need it? All the time. And if you go monitor your network traffic to find out what they're actually looking at and what they're using, that's just not true. So in a lot of cases I tell people: shut off their access and see if they complain. If they don't complain, they didn't need it. Purpose limitations and access controls are how you effect your corporate governance documents in your tech stack, and if you don't,

Speaker 2:

that's unfair and deceptive. That's a violation of Section 5. You can have policies all day long, but unless it's the same thing in the tech stack, in the back end, it's just not true.
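As a concrete illustration of the "shut off their access and see if they complain" audit Heidi describes, here is a minimal Python sketch that compares granted data access against observed access logs and flags grants that were never exercised. The grant table, log format, and 90-day window are illustrative assumptions.

```python
# Sketch: flag access grants with no observed use, so they can be shut off
# and validated against complaints. All names and data here are illustrative.
from datetime import datetime, timedelta

grants = {
    # (user, dataset) -> purpose recorded in the governance docs
    ("analyst_1", "customer_emails"): "support",
    ("marketing_svc", "purchase_history"): "campaign_reporting",
    ("legacy_etl", "geolocation"): "unknown",
}

access_log = [
    # (user, dataset, timestamp) as it might come from query auditing
    ("analyst_1", "customer_emails", datetime.now() - timedelta(days=3)),
]

def stale_grants(grants, access_log, window_days=90):
    """Return grants with no observed use inside the review window."""
    cutoff = datetime.now() - timedelta(days=window_days)
    used = {(u, d) for (u, d, ts) in access_log if ts >= cutoff}
    return [g for g in grants if g not in used]

for user, dataset in stale_grants(grants, access_log):
    print(f"revoke-candidate: {user} -> {dataset} (no use in 90 days)")
```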

Speaker 3:

Do we have an enforcement problem or do we have an apathy problem?

Speaker 2:

You know, we've got a series of problems here. I'm going to back us up to the beginning of when data started to get its power. In 2002, Google built AdWords and quickly realized the power of big data, and also the obstacle of the Fair Credit Reporting Act standing in their way: I'm not going to be able to comply with that thing if I need to build this. So they went and got an exception.

Speaker 2:

Congress passed FACTA. They sold it to the American people as: you now have more control over your credit report; you can ask for a credit report once a year and then fight with them to change what's wrong. But also: the rapacious plaintiffs' bar has run away with it, oh my God, we're just overrun, these lawyers are so hungry and greedy, this isn't even about consumer rights, so we've got to have tort reform. That is how they got it passed, because it was during the Bush administration; it was all Republican everywhere, everything. It created what's called a reseller exception. The reseller exception says: if you're one of the big three credit bureaus, then you have to follow the rules of a credit reporting agency, and if you're not, you can do whatever you want. And they did.

Speaker 2:

Your rights under the Fair Credit Reporting Act include the right to know what data they have, the right to correct it if it's wrong, the right to opt out of mass marketing, and a private right of action to sue them if they don't afford you these rights in a timely manner. And it only covers five particular areas: housing, education, lending, employment and insurance. Those are coming back.

Speaker 2:

The CFPB is bringing this back, repealing the reseller exception and saying: in these five critical areas, people need to have some agency over the information used against them. Right? So more people are going to be declared credit reporting agencies, which is why you see mergers and acquisitions, and people scrambling real fast to call everything first-party data, even though that's dubious. It's coming in the next two months, because they proposed the rules in the fall, then they had the small businesses review them, and they're reviewing comments now, and Chopra said they're going to drop those new rules this spring. So that's going to come up pretty soon, and businesses are going to have a lot to deal with, with a private right of action on some of that data.

Speaker 3:

Let me understand that little skip you did there. That's interesting. So there's some sleight of hand that I can do to make my data appear to be first party data, even though that's not how I acquired it.

Speaker 2:

Yes, if you own business A and you merge with business B, you bought business B's data, and it says in their terms of service: successors will be sold to, right? This is currently an issue with Near Intelligence. Near Intelligence was a data broker collecting data about visitors to Planned Parenthood and other abortion-related sites. Well, they got in trouble for a myriad of other things. That's a big, big story.

Speaker 2:

You're going to hear a lot more about Near Intelligence and all the many shady things going on over there, but they've had to file bankruptcy, and in the bankruptcy filings their assets are listed, and they have tons and tons of specific location data on people visiting these service locations. The problem is that in bankruptcy, things get chopped up and sold to other people. So, yeah, that's why Senator Wyden wrote a letter to the FTC to say: you need to reach into the bankruptcy court and stop the sale of this data set, because it is in the public interest that we not continue selling this data. It shouldn't have been collected in the first place. Privacy is a safety issue for women in our country, and this data was collected on behalf of an anti-abortion group.

Speaker 2:

So they paid for it. Yes, they are trying to collect the data so they can target the audience and say: you need religion, you don't need choices with your body. That's happening right now. Through the bankruptcy filing, everything is exposed, and that's one of the issues we're seeing now with mergers and acquisitions and bankruptcy filings and those sorts of things, because the data doesn't go away. You know as well as I do, they're not deleting the data when they say they are. So, right, wow: does it end up in the right hands? What are they doing with that data? The purpose limitations don't carry forward.

Speaker 1:

That's scary to think about. It is.

Speaker 3:

I hadn't thought about that nice little shell game there, but it makes sense that the assets of business A would now be, you know...

Speaker 2:

All these agreements are ones only a lawyer could love.

Speaker 3:

Yeah, I'm sure, for sure. There is something really challenging here, though: how do we put a spotlight on this for the people it's really affecting? I mean, we can sit around this table and talk about it, and I think that's very useful, but from an action perspective, because you've got me riled up now: what do we do?

Speaker 2:

I've been riling people up for five years. Welcome to the party.

Speaker 3:

Don't get mad yet, I'll show you some charts.

Speaker 2:

I'm going to blow your mind. No, I've been doing this for a long time. I just respond to outrage; it's not that I set out to generate outrage. It was more to educate people, because people already don't trust, and they fear what's happening, to the point where they've just resigned themselves: oh, I don't know, there's nothing I can do about it. Oh, but there is. So I've been asking the same three questions whenever I get out in a crowd of people, at a music festival, the post office, waiting in line at the grocery store. What does privacy mean to you? Right?

Speaker 1:

Right.

Speaker 2:

And do you know what data brokers are? And do you know how to change the privacy settings on your phone? That's enough for a full conversation, because you empower people with a little bit of information and an action they can take to do something and feel better about it. That also inspires curiosity, which is another reason I do public speaking: to inspire curiosity so that later you look into these things on your own, and you slowly but surely start to connect the dots in a way that makes sense to you, and then you start to share information with other people that you care about, because everybody is upset about this. It's a bipartisan issue; it's a human issue. And it hasn't always been this way. I mean, think about 15 years ago: how did you do your job? You had three things, email, Word and Excel, and we were running the world. We were doing fine with those three tools. Look at your desktop now.

Speaker 1:

No TikTok.

Speaker 2:

Look at your desktop now. How many things have you got on your desktop now? Do you really need all this stuff? Your job hasn't changed too much from what it was 15 years ago, in theory, and you only had three tools then. So there's a lot of snake oil out there, and I think trust is coming back. Social proof is important. That is the remedy to disinformation: social proof. Where are you getting your sources? Do you trust them to bring you quality information? Advertising hasn't been around forever, and it won't be around forever, especially behavioral, targeted advertising. It will not be around forever, and those who keep resisting the inevitable future are going to get eaten alive, and I'm excited for that day.

Speaker 2:

I'm tired of saying the same thing over and over again. So consumers are angry. They're not going to get any less angry until they see change. Congress is not going to bring the change you want to see right now because of a conflict of interest: they need this data so they can run their campaigns, raise dollars and keep their jobs.

Speaker 1:

Yeah.

Speaker 2:

Yeah.

Speaker 1:

That's strong. I had a thought, and now I went blank. All right.

Speaker 2:

So I want to talk about the money for a minute.

Speaker 1:

Yeah, let's go to that.

Speaker 2:

I gave a talk a couple of weeks ago, and I presented to the audience and said: almost $600 million was spent on behalf of your industry last year to get Congress to do nothing. Can you think of other ways to have spent that money to achieve positive goals, other than getting Congress to do nothing? And there were 2,571 lobbyists assigned to lobby on this industry's behalf. And I said: that must seem like a lot of lobbyists. Does anybody know how many votes there are to be influenced on the whole of Capitol Hill? 535. That's it. In the House and the Senate, 535 votes to influence, and yet billions and billions of dollars are being spent to influence those votes.

Speaker 2:

The election this year, I mean, one PAC, one super PAC on the Democratic side has announced that they have already done an ad campaign spend of $140 million to raise awareness among female voters. There are not 140 million female voters of voting age in this country. They're spending over a dollar on every single female voter, who's going to show up and vote or not on her own, without this ad spend. And that's just one PAC, and we're at the very beginning of the cycle. Imagine how much money is going to be spent per vote so these few people can keep their jobs, which currently is not getting much of anything done. It's a ludicrous waste of money, and it's all fueled by targeted advertising. So whatever law we get, most likely Congress is going to carve out an exception for themselves to continue using our data this way.
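For scale, the per-vote arithmetic behind the lobbying figures Heidi quotes ($600 million in spend and 2,571 lobbyists against 535 congressional votes) works out roughly like this; the numbers are hers, the calculation is just illustration.

```python
# Back-of-the-envelope math on the cited lobbying figures.
lobbying_spend = 600_000_000
lobbyists = 2_571
votes = 435 + 100  # House seats + Senate seats = 535

print(f"spend per vote:     ${lobbying_spend / votes:,.0f}")  # ~ $1.1M per member
print(f"lobbyists per vote: {lobbyists / votes:.1f}")         # ~ 4.8 lobbyists each
```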

Speaker 3:

Following the money is always a good approach.

Speaker 2:

Where else? I love to hot-shame them every chance I get.

Speaker 3:

I love hot shame. I love hot shame, that is my thing. Hot takes, hot shame, that's my thing, absolutely. Where's that money coming from?

Speaker 2:

Your grandma, who's being told that if she doesn't give, Trump's standing behind her, you know, gonna get them, or something. Like, oh my God, we've got to stop, you know, Orange Jesus. I don't know, I'm not really sure. It's coming from the people. And on the Democratic side it's: they're gonna take even more of your rights away, look at the LGBTQ community, look what they did with Roe. So they're trying to squeeze these dollars out of them. They're scaring the hell out of people to try to get more money, because the money is what shows the output they're looking for. But is it the result that you want? I mean, how much money do we have to spend to convince ourselves to vote for either one of these old men?

Speaker 1:

It's never gonna end.

Speaker 2:

So that's the power issue that we have, and the standstill that we have with legislation and data. The other way to achieve change is negotiation, which we've tried. We allowed for innovation, and then, once we saw harm, we tried to mediate that harm with the companies and say: you're causing harm, please make changes. How's that working out? Right?

Speaker 1:

Right.

Speaker 2:

Angry hearings, with parents holding pictures of their dead kids, and Congress still doing nothing about it. So negotiation hasn't really worked, and that leaves litigation.

Speaker 3:

Oh, I was going to say sabotage, but you are, you are, you are kind of.

Speaker 2:

I am, I guess. Litigation is the last resort, but we are there now. The plaintiffs' bar has figured out how to take these cases. The regulators are responding to the outreach from consumers, and so the walls are closing in from every angle when it comes to the human-centric view of data. These tools were built to process human data without any respect for human dignity, and you know as well as I do, strong privacy is not an add-on.

Speaker 2:

So some of these things require transformative change, and people are afraid of that, because they don't know who to trust to say: here are the steps you need to take to make the change you want. And they also don't want to upend the way they do business, because they've always done it that way.

Speaker 3:

So how far behind are we going to fall before we catch up? Because we were talking on the show yesterday (we were time traveling, so some of you folks won't have heard this yet) about the proposed legislation coming out of California regarding AI protections. Right, and it immediately occurred to me that this felt very divorced from, well, the CCPA and its data protections. But I don't really know how you separate an AI bill from a data bill, considering all AI has to be trained on data. It almost feels like there shouldn't be a separate AI bill; that AI bill should be part of the original data protection bill. So I feel like we're about to make a few more mistakes before we start making the right corrective actions.

Speaker 2:

Right, I agree with that. 44 states are currently considering legislation on AI. 44 states, exactly. And it's just February; we just got into the legislative session. So a lot of these are trying to get to root issues. Remember, it is an election year too, so everybody wants to have a victory, okay? So there's motivation there too.

Speaker 1:

Right.

Speaker 2:

So they're trying to get labeling, watermarks, those sorts of basic things passed first, because that's easy, everybody recognizes that, and you're not going to have too much resistance to it. That's not covered under the CCPA or CPRA. What California law does cover currently are automated decision-making tools, like the ones the Fair Credit Reporting Act covers. You need to make sure that if there's risk, you conduct audits for bias and make sure you are not unlawfully discriminating against people when you're hiring, or in offering housing or lending, those sorts of things. That is in there, exactly.
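One concrete form those bias audits can take is the EEOC's classic four-fifths (80%) rule for adverse impact in hiring: if any group's selection rate falls below 80% of the highest group's rate, the tool's output warrants a closer look. A minimal sketch, with made-up applicant counts:

```python
# Sketch: four-fifths rule screen over a hiring tool's outputs.
# The counts below are illustrative, not real audit data.
from collections import namedtuple

Group = namedtuple("Group", "name applicants selected")

groups = [
    Group("group_a", applicants=400, selected=120),  # 30% selection rate
    Group("group_b", applicants=300, selected=60),   # 20% selection rate
]

rates = {g.name: g.selected / g.applicants for g in groups}
best = max(rates.values())

for name, rate in rates.items():
    ratio = rate / best
    flag = "ADVERSE IMPACT?" if ratio < 0.8 else "ok"
    print(f"{name}: rate={rate:.0%} impact_ratio={ratio:.2f} {flag}")
```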

Speaker 2:

So audits are a thing. The work of privacy has expanded considerably since I started doing this. It involves law, policy, engineering, as well as statistical analysis. That's a lot of different areas, whereas previously people were just going in to make sure: do we have our cookie banner right, let's set up our preference center, that kind of thing. So there's a lot more to it, and I've always thought that privacy starts at the code. What I try to do is teach people where the culpability of the law meets the code that's causing the harm, because if you're not aware of where your risks are, you cannot mitigate them.

Speaker 3:

That requires introspection. The examination of your own bias, period, requires introspection, and it starts with code, which you and I will agree on. I tagged you on a post recently because it just jumped off the page at me. I have these communities of folks that I interact with, both in real life and digitally, and Cameron and I try to bridge that gap between privacy and security. But you see so much reflected in the security world of folks discussing what are obviously privacy-impacting issues as still just a security code issue, and not making that connection. There is a healthy amount of introspection in the security world from a code perspective, right? Folks, to name-drop a few, the guys at Bugcrowd and WhiteHat and all those other places, they have been at this for a while. I don't see that level of introspection being brought through a privacy lens, though.

Speaker 2:

Privacy engineering, I think, is where that is starting to take off, because you do have to address these issues with the code. I don't think it's good to have one privacy professional that does your compliance work, the contracts, and the policies, because that person either has to work on a team with other people who can implement those changes into the stack, or they have to be able to do that themselves. Otherwise, how do you validate the work you just did? And cyber is how you put in purpose limitations and access controls. In privacy, I'm calling for that in my notice, and cyber is how you actually make that happen in the code, so that you automate this. Right?
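A minimal sketch of what purpose limitation enforced in code, rather than only in a policy document, can look like: every read declares a purpose, and a policy table derived from the governance documents decides. The categories and purposes here are illustrative assumptions.

```python
# Sketch: purpose-based access control, the governance doc expressed as code.
ALLOWED_PURPOSES = {
    "health_data": {"care_delivery", "billing"},
    "contact_info": {"support", "billing"},
}

def access(category: str, declared_purpose: str) -> bool:
    """Gate reads on declared purpose; deny (and log) anything off-policy."""
    allowed = ALLOWED_PURPOSES.get(category, set())
    if declared_purpose not in allowed:
        print(f"DENY  {category} for '{declared_purpose}' (allowed: {sorted(allowed)})")
        return False
    print(f"ALLOW {category} for '{declared_purpose}'")
    return True

access("health_data", "billing")        # on-policy
access("health_data", "ad_targeting")   # off-policy: denied and logged
```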

Speaker 2:

And you can't locate everyone's data if you don't tag and classify it properly. Well, if you tag it and classify it, then you've got risk levels, and you need to treat that data differently. If you were just writing policies, you're not necessarily going to be abreast of all the changes, like health data. People that tagged their data as health data three years ago may want to look at re-tagging that data, because the definition of health data has been expanded. Washington State has expanded the definition of health data, and it isn't just the people in Washington State; it applies if you're offering services to people in Washington State. So for those sorts of things you've got to re-tag that data, or bring in a service to do that. But first you have to be aware that that's what your issue is. You don't know what you're missing if you're missing it. What do you think?
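The re-tagging problem she raises can be made concrete with versioned tag definitions: when the legal definition of health data expands, records tagged under the old version get re-evaluated. A sketch, with illustrative categories only (not legal advice):

```python
# Sketch: versioned definitions of "health data" drive periodic re-tagging.
HEALTH_DATA_DEFS = {
    1: {"diagnosis", "prescription"},                      # older tagging run
    2: {"diagnosis", "prescription", "biometrics",
        "location_near_clinic", "wellness_app_logs"},      # expanded definition
}

records = [
    {"id": 1, "category": "prescription",      "tags": {"health_data"}},
    {"id": 2, "category": "wellness_app_logs", "tags": set()},  # tagged pre-expansion
]

def retag(records, version):
    """Re-evaluate every record against the current definition version."""
    current = HEALTH_DATA_DEFS[version]
    for r in records:
        if r["category"] in current and "health_data" not in r["tags"]:
            r["tags"].add("health_data")
            print(f"record {r['id']}: re-tagged as health_data under def v{version}")

retag(records, version=2)
```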

Speaker 1:

What industry do you think is the furthest behind? Do you think it's the health industry, banking? Oh, wow, okay.

Speaker 2:

When's the last time you saw the monitor at the bank? It looks like DOS, with function keys.

Speaker 3:

Most of them still only allow for four digits as a PIN, which is bananas.

Speaker 2:

And here's one: Gramm-Leach was passed in 1999, then Dodd-Frank came later and made another round of improvements, and states have addressed cybersecurity here and there. But basically every new law that comes out says: we're going to exempt those covered by Gramm-Leach and HIPAA and this and that. Rolling exemptions over time have created technical debt. If you don't have to invest in technical upgrades, then why would you? It's hard to give them a preemption moving forward if you're constantly getting hacked to bits.

Speaker 2:

You cannot tell me that you have these covered under another law and everything is fine, that you don't need the additional administrative burden of a new law because you already have it covered. Bullshit you have it covered. Why did you get five breach notices from the MOVEit breach? Bullshit you have it covered under Gramm-Leach. So I don't know that exemptions are going to continue to roll the way that they are, but that is definitely the enterprise view. We've got to have these preemptions, because companies don't want to change a damn thing about what they're doing. They want to tout best practices, and they want to say: regulate us, and then they want to monitor and write the language for what regulations there are, which pretty much just rubber-stamps what they're already doing, so they don't have to change. That's why they get in the lead and start leading these discussions.

Speaker 2:

Please regulate me: as long as I can write it, I don't have to change anything about how I do my business and how I treat your data. Enough of that has gone on that people are getting wise to it, saying: yeah, I don't really believe you anymore. I had a law school professor who was a sitting judge in the bankruptcy court, and she was talking to us about how you conduct yourself in the profession when you leave law school, and I'll never forget this. She said: credibility is like virginity.

Speaker 1:

Okay. Exactly. Strong.

Speaker 2:

You remember that, especially in dealing with other people, because you learn to trust other people, or you learn who you should not trust but still keep close. That's just how you conduct yourself moving through the profession. But that always stuck with me. She's absolutely right: when people no longer trust you, what do you have left?

Speaker 1:

Yeah, and to go back: a closed mind is the mind of an idiot, so open it. I'm just dropping bars, I'm just dropping bars.

Speaker 2:

You're doing something, man. What are your questions? Give me your questions.

Speaker 1:

So, actually, while you brought it up: the CPRA. Obviously, I think it was a few days ago that it started being enforced, right? Right now, yeah, right now.

Speaker 2:

And I told people when they filed that appeal, I said: don't rest on your laurels, because this is going to happen, and you need to go ahead and plan for this inevitability now. And they said: no, we want to pay a shit ton of money to our big law firm, who says wait and see. Yeah, how's that working out for you now?

Speaker 3:

It always gets me, like: well, we're going to wait and see what happens to others.

Speaker 2:

It's like: well, no, wait. Wait while I keep charging you, and then you will see how much I'm going to charge you to fix this on the next one.

Speaker 3:

It's a good racket if you can get in on it, that's for sure.

Speaker 2:

That's a whole other show we can do, about the brothel model of law practice that needs to die. But that's a whole other show.

Speaker 1:

Let's talk on the side. We'll create it, we'll do it. I want that show.

Speaker 2:

How many women would want to join that show and tell you why they left?

Speaker 1:

You want to do a panel? We can do a panel.

Speaker 2:

Why women leave. Yeah, oh my gosh. So many women would be so excited to say: here's why I left. And these are some of the most talented, gifted people you've ever met practicing in the field, but they just could not live under those structured environments that are designed to wear you down. They're extracting every bit of value out of you, so there's nothing left for you to invest in yourself.

Speaker 3:

My sister-in-law has recently joined your profession, and she's already expressed exactly that. Her desire to leave is already there, and she's only just started.

Speaker 2:

That's right. What's your exit plan? That's one of the first questions they ask a new lawyer: what's your exit plan?

Speaker 3:

Yeah, yeah.

Speaker 2:

I didn't know this. I'm the first in my family to go to college. I didn't know what it was like to be a lawyer until I was one. I had worked with a few over time, but I just didn't know what to expect. And I have a lot of respect for some of the hardworking lawyers out there that are trying to do the right thing and help people. That's not the vast majority of the lawyers that I see.

Speaker 3:

Yeah, what's the advice you would give someone like that, though, who is new to the industry and finds themselves challenged with any of the shenanigans you've just laid out?

Speaker 2:

Find a good mentor. Yeah, you always need someone cheering for you in your corner, unconditional love and support. You need to find someone who thinks you're worth their time and effort to invest in, so that they can share their knowledge and wisdom with you. That's the right kind of mentor to have: someone who's always cheering you on, someone who is willing to give you the hard truth when you fall down and say, here's how you get back up. They can be the same age as you, but have more experience. They don't have to be the toughest old bird around whose secrets you want to know; maybe you need to leave her alone. There are a lot of other people, also in cross-functional teams. I mean, you work in privacy; you may work with cyber. That's a valuable relationship to have, because it's going to give you insight and different ways to think about things outside of your comfort zone. So I think finding a mentor is probably the best thing you can do, no matter what field you work in.

Speaker 3:

I'm inclined to agree. Someone in the audience asked what I think is the natural right question on the heels of that: where would you recommend finding mentors when attempting a career or sector transition? And are there any characteristics that you would look for in a solid mentee?

Speaker 2:

Oh, okay. So, first of all, when I take on a student to mentor, it's generally women and people of color. That's how I pay rent on my privilege, to give back. But other people, you know, you may be looking for someone who can help you along the way, because you need help with the research part of a project you're doing, and it would be beneficial to them to do some of that research, and then you can teach them through doing it. I mean, having some of those kinds of skills.

Speaker 2:

I think the best place to find these people is on LinkedIn. LinkedIn is where people are showcasing the work that they can do. So if you're going through LinkedIn and you see, say, the Data Diva put something out, read her stuff, and then look and see who comments on her stuff. Then you can go through there and find other people, people that maybe she comments on. You can find stuff that I'm doing, or the posts that I comment on, and you can start following people and find others along the way that have ideas that align with yours. And then you can start asking: where can I network with more of you guys? Where can I find more of this community involvement? There are some privacy companies that have set up Slack channels and things like that, where people can come and do some networking.

Speaker 2:

But I think if you put out work that you do in this field and say, here's what I can do, other people will see value in that, and maybe they'll start to interact with you, and slowly, you know, we're going through social proof here, because we're humans and you've got to find people that you can trust. But it needs to be an even value exchange, because in teaching other people, I learn more about what's going on in the younger generation's mind and how they view things, which makes me a better teacher. At the same time, they're learning things from me, from my experience and how I view things, which makes them a stronger student. So you've got to have that positive feedback loop; that's a good relationship with a mentor.

Speaker 2:

I would say reach out to people whose work inspires you. Maybe not, you know, Timnit Gebru and people at the very top of the field like that, because everyone in the world is reaching out to them, so you might not get a response. That should not discourage you. Same with the lead partner at the firm that does this work: maybe they're busy, but maybe a more junior associate can give you a real feel for what life looks like on the ground. Those sorts of relationships you can build with people.

Speaker 3:

And what about the second question: what are the characteristics you look for in a solid mentee? I know you mentioned how you pay rent on your privilege, I love that statement. But beyond that, what do you really look for in that individual?

Speaker 2:

Curiosity. That's it. Right on. Curiosity. Yeah, that takes a lot, because you have to be brave enough to say, I'm curious about this, and speak it out loud, and then you have to look into it. And if you have the research skills to look into it, then you can start to analyze things. But curiosity is native to the individual. You either are curious or you're not, and things just go right past you. So I'm interested in those who are curious, because those are the people who are going to go do the deep research and find the things that you're looking for. I have the valuable conversations with the curious.

Speaker 1:

I like those answers. Yeah, and that comes with awareness, being aware of your surroundings and your community, just to add on to what you said, Heidi. This community, since we've been a part of it, Gabe, is amazing. I can't even count the good things I could say about the people in the privacy community. All the women, like you mentioned, the Data Diva, and yourself, and so many others. There are so many good characters in the trenches, responding to people, talking to people. So go out there, don't be afraid to reach out if you want to connect, and be curious.

Speaker 3:

Going out is key. I would have used the phrase: I've been lucky enough to have mentors close to me. But I think about your answer, Heidi, and the truth is that's a byproduct of networking inside of that community. And for myself, for the folks that I tend to take on as mentees, I don't usually go looking for them. It's a self-selection process for me.

Speaker 2:

That's right. Yeah, they come to you. It's maybe eight to one that I'll take one, and for the other seven, I might have someone else in mind who would be a better fit, or they're just a hot mess still, and I don't have time to help you with your life right now.

Speaker 2:

Or it depends. Whistleblowers come to me all the time. I don't do whistleblower work. They're just looking for someone they can trust, someone they can share their story with, and sometimes when they tell me these things, it's the first time they've said them out loud.

Speaker 3:

So is there a platform for whistleblowers in privacy? I know there are lots of other whistleblower platforms, but do they really cover privacy issues? I mean, if you were, let's say, working at Facebook circa any time, and you wanted to report something that you found, what do you do?

Speaker 2:

You know, I'm not sure what they do inside their organization, but I have spoken to people who have left that organization, and they came and dumped their dirty deeds on me, and then I brought them where they needed to go. People do qui tam, or torts, or privacy, or the False Claims Act: did you work in cyber, and do you have evidence that what is going on is defrauding taxpayer dollars? That's another whistleblower statute that's starting to pick up. They had 18,000 complaints under that statute last year. They couldn't follow up on all of those.

Speaker 3:

That's wild, and that's just the people that had the courage to come forward. Certainly there's a lot unreported.

Speaker 1:

So to go back real quick on the CPRA, just for anyone listening: what does that mean now that it is enforced? What should we look out for? Were there any changes? Is there anything that people should know about?

Speaker 2:

So all of the things that people were going to do, and then all of a sudden stopped dead cold and iced, all of those projects just came back to life.

Speaker 1:

Right.

Speaker 2:

And that's not where your organization wants to be, because if you're going to do this foundationally, you need to do this first, and they keep stacking it up in the to-do pile. One of the things I think people are missing out on is deletion schedules: data retention and deletion schedules.

Speaker 2:

If you don't have that, you just have to have it, and you need to demonstrate that you are following it, so it can be validated that you do in fact delete the data this way, on this regular running basis, or whatever it is. And employee notices: employee data is now on the table, right? So you have a privacy notice for your website, but you need to do a completely different one for your employees under the CCPA. It's little things like that, that you had plenty of time to have already taken care of, but if you're scrambling at the last minute, it's those things. As soon as the appeal hit and everybody stopped working, all the privacy pros were sitting on the beach for a while, like: what do we do?
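A minimal sketch of a retention and deletion schedule that can be demonstrated rather than merely written down: each data category gets a retention period, and every deletion run emits audit evidence. Periods and categories are illustrative assumptions.

```python
# Sketch: retention schedule as code, with audit evidence per deletion run.
from datetime import datetime, timedelta

RETENTION = {
    "web_logs": timedelta(days=90),
    "support_tickets": timedelta(days=365 * 2),
}

records = [
    {"id": 1, "category": "web_logs", "created": datetime.now() - timedelta(days=200)},
    {"id": 2, "category": "web_logs", "created": datetime.now() - timedelta(days=10)},
]

def run_deletion(records):
    """Delete expired records; return survivors plus an audit trail."""
    audit, kept = [], []
    for r in records:
        if datetime.now() - r["created"] > RETENTION[r["category"]]:
            audit.append({"deleted_id": r["id"], "at": datetime.now().isoformat(),
                          "reason": f"{r['category']} past retention"})
        else:
            kept.append(r)
    return kept, audit  # the audit log is the evidence a regulator can validate

records, audit = run_deletion(records)
print(audit)
```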

Speaker 2:

Suddenly, as soon as this is game on again, it's like playing stickball in the street. Everybody yelled car, and we got out of the road, and now the car is gone, we're going back into the street and starting to go as fast as we can. Like, why did it have to be this way?

Speaker 1:

So like Wayne's World: game on.

Speaker 2:

Yeah, pretty much. These are important, and people are starting to get, like, hella upset about New Jersey, with the DSARs and litigation going on right there. But the bottom line is this: if you got a request from a consumer to delete the data, and you located the data, and there was no overriding legal obligation for you to hang on to that data, then you should have deleted it and told the consumer you did that. If you failed in that opportunity, you deserve the heat you're getting, because people have not been paying enough attention to what the consumers want. So the consumers say: I want you to delete my data.
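Heidi's deletion-request logic (locate the data, check for an overriding legal obligation, otherwise delete and tell the consumer) maps directly to code. A sketch, with a stand-in legal-hold table and made-up records:

```python
# Sketch: a consumer deletion request handled the way the law expects.
LEGAL_HOLDS = {"tax_records"}  # categories you must retain despite a request

datastore = {
    "consumer_42": [{"category": "marketing_profile"}, {"category": "tax_records"}],
}

def handle_deletion_request(consumer_id: str) -> str:
    """Locate, check legal holds, delete, and report back to the consumer."""
    found = datastore.get(consumer_id, [])
    if not found:
        return "Nothing located for this consumer."
    kept, deleted = [], []
    for item in found:
        (kept if item["category"] in LEGAL_HOLDS else deleted).append(item["category"])
    datastore[consumer_id] = [i for i in found if i["category"] in LEGAL_HOLDS]
    # The consumer response states what was deleted and what was retained, and why.
    return f"Deleted: {deleted}; retained under legal obligation: {kept}"

print(handle_deletion_request("consumer_42"))
```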

Speaker 2:

In New Jersey, over 18,000 people used an authorized agent to say: I want you to help me delete my data. These are family members of law enforcement, judges, things like that, pursuant to Daniel's Law. They started in 2020 with that, after a crazed man went to a judge's house and killed her son in the doorway because he was looking for the judge. So they passed Daniel's Law in 2020, and then they tweaked it a little bit because the industry wanted to fight back with some exceptions, but ultimately it passed last year. So the data brokers had plenty of time to know this law was coming.

Speaker 2:

So when they started getting a huge amount of DSARs in January, when they started this project, they had 18,000 opportunities to do something about it. And at this point, after they filed the suit, you can still go on the data brokers' sites and find that data. Yes. So I mean, you had plenty of opportunity; you knew it was coming. And so at this point, pardon me, but I'm fresh out of pity for you. The problem is, the enterprise lawyers have not had to deal with litigation like this before, and so this is your new world. Welcome. Exactly. The wait-and-see approach is what got you into this hot water.

Speaker 3:

Is it irresponsible for me to call for more hacktivism? I think it technically is. I think I'm not allowed to say that, and I have to say that I'm making that statement purely as Gabe Gumbs, and I do not represent any employers, right?

Speaker 2:

That was a big one, yeah. So here's another thing: the data brokers are already getting upset about it, and so they're now introducing legislation saying: we've got to kill off the authorized agents. You can't have authorized agents do the DSARs, because we got this wave of litigation. Yeah, but then you did nothing about it when you had a clear chance to avoid it.

Speaker 3:

Said another way: you made it too easy for the natives to get restless.

Speaker 2:

You made your bed, now lie in it. Somebody's going to have to get eaten; those hiding in the herd are going to get eaten, because they're the dumbest and the weakest. You had plenty of opportunity to do something about it. So pardon me, but I'm hearing nothing but bitching from the enterprise lawyers on this. You've had a long time with nothing to do but check boxes; now you're going to have to do some work. So the idea is not to just go after the plaintiffs' bar on this. They're doing what they need to do, because consumers are over it. The regulators are coming, the consumers are outraged, the plaintiffs' bar is coming in to do the dirty work, and that's going to force change. It did not have to be like this. Remember, years ago, when I started consulting, I said: these are the changes you need to make. And they said: we don't really see a litigation or enforcement landscape. Look at that landscape now.

Speaker 1:

Oh yeah.

Speaker 2:

It didn't have to be this way.

Speaker 1:

Heidi, what is your prediction? What do you want to happen, and what do you think is going to happen this year and next when it comes to privacy trends and laws and things like that?

Speaker 2:

So I see this in two parts. You have the data, and you have provenance, quality and privacy issues with data. Then you have the tools.

Speaker 3:

How are the tools built?

Speaker 2:

What are the parameters? How are they weighted? What are the outputs? So you've got two competing categories, and I think they're about to come together. One of the think tanks that I work with is ForHumanity (shout-out to my ForHumanity peeps), and we've been working on this for years. We actually were one of the first to draft the criteria for how to conduct an audit for bias in autonomous tools, and that was a couple of years ago. We've since expanded it, and we're covering the EU AI Act at this point and moving forward. And we were included in the U.S. AI Safety Institute Consortium. So our group is going to be part of the humans writing the rules of the road that will determine the future of AI. Now, the AI executive order applies to federal agencies and those who do business with the government, a big group of people, but my prediction is that these rules are going to proliferate throughout the ecosystem.

Speaker 3:

It's the enforcement that I'm always concerned with. Rules are great, but it becomes a whole rules-for-thee-not-for-me game when they can figure out their way around it via, well, a couple million dollars per vote, for example. Right, yeah.

Speaker 2:

Well, that takes people like me, who aren't motivated by money, who are willing to get in there, elbows out, and do battle. So I currently have my own case on algorithmic bias. I got so angry that I'm actually doing the litigating. I'm not a litigator, I do strategy, but I'm litigating this one myself. Yeah, right. A recruiter came to me and said, you know, here are great jobs. And I said: what about the motherhood gap in my resume? And she said: our systems are bad, and women with a motherhood gap are getting left behind, and I'm also having trouble trying to get businesses to overlook that gap when reviewing candidates. Those are startling admissions, both illegal, right? Yeah.

Speaker 2:

So she said she was going to send my resume here and there, but she didn't know I knew those people, because privacy is small. So I DMed them: my resume is coming over. They never got it, and they didn't have any explanation for what they did with my resume, but they did feed it right into their system, because I'm getting targeted advertising for... uh-huh, right.

Speaker 1:

That's a problem.

Speaker 2:

I went to the EEOC with it. I said: you're talking a big game, like you're going to do something about algorithmic bias. I've got a case; I can show you the tools are broken. Let's see what you can do. Yeah, they sat on it for two years, and they freaked out and were like: we don't know what to do with this. They've not gotten into the tools. The iTutor case, which they said is the first case on algorithmic bias: that's not true. It was discriminatory on its face; it wasn't algorithmic bias. They didn't get into the tools, because they don't really know how. There are 57 branch offices, and technologists in zero of them. And they're going to be the frontline responders to this issue. Like you said, enforcement is a problem. When people come looking for enforcement, these are the enforcers, and they don't know what to do. What's worse is that they keep talking about "we're doing this", but they're not actually demonstrating that they're doing it.

Speaker 2:

And then, if you look at the lawyers that do employment law on behalf of consumers, they don't know jack shit about AI. I had a couple of them tell me I should pay them 700 to 800 bucks an hour to teach them about AI, and then they would decide whether to take my case. So I told them to pound sand, and I filed it myself, pro se. So at this point there are two data brokers, and I'm still waiting to hear back from the court to see if I have standing to proceed with my case. But if I do, it'll be one of the first in the country on employment discrimination, unlawful discrimination based on algorithmic bias.

Speaker 2:

The defendants have said: we don't use AI, we use humans. But one of the defendants' parent companies is the second largest recruiter in the world. I mean, they did like $29 billion in business and data brokering last year, but you want me to believe you did this by hand? So I would be surprised if the court said there's really nothing here and just threw my case completely out. At this point I've asked for oral arguments before the ruling on standing, because they just keep spewing lies, and I'm trying to figure it out. So I don't know where this is going to go.

Speaker 1:

So, to that point, do you think there's a problem in terms of technology being too far ahead of the people that are making the decisions on this exact issue? They don't know, they almost don't even believe it, and they're just like: nah, this is not real.

Speaker 3:

I was going to ask, and then I want to address your question directly too, as a technologist who almost made a mistake, a bias mistake, with AI. I'm willing to own that, because it's a very useful story. But my question to you on that topic, Heidi, is: is it likely that the court comes back and says, ah, it turns out the technology was biased and it broke the law, but it was an accident, and the law doesn't account for that? Hopefully not, right? Like: oh no, the machine broke it.

Speaker 2:

No, liability is clear in this case. It is on the business, absolutely. If you got rid of all the racist humans in human resources and replaced them with a racist tool, you are still responsible for the racism. It's still your racist tool. So, no, you cannot absolve yourself of liability by bringing in all of these apps. The app makers themselves, however, are a different story. Their terms of service have so many indemnity clauses; they shift liability to you. They're operating with impunity.

Speaker 3:

You cannot expect them to be decent, and some of that deserves different litigation by itself.

Speaker 3:

I think that's part of the problem too: all software in our country still operates as-is, right? There's no justice for not getting what you were claimed to have bought, when what you buy is sold as-is. I will just very quickly tell my side of this story. I was working on a project several years back, an AI project that associated data with humans, without knowing who the humans were, without even knowing, sometimes, enough information about them, looking through large quantities of data and making these very abductive inferences about who they were, explicitly.

Speaker 3:

And I woke up one morning in a panic and realized: holy shit, wait a second, what did we use to seed that with? Because it struck me that if all we had pulled from was, you know, a slice of Americana names, that could be problematic. That could be very problematic, considering America is made up of lots of people with lots of different types of names, some of which are East Indian or Asian by descent, and some of which have one letter or two letters in them, and there were all kinds of weird things that we could have left out, like African American names, so many things. I woke up in a panic. I was like: did we just train this goddamn model and not account for any of those things?

Speaker 3:

Luckily, my lead data scientist is a much smarter human than I am, and he had not done that. But as the person overseeing this, I didn't have, in my own process at the time, an "ah, this has to be accounted for as we do A, B and C". I don't say that to give any excuse to these folks, which is why I asked the liability question; it would have still been my fault if that was the outcome. But where does fault lie here? I'd be curious how the law sees that, though.
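The sanity check Gabe wishes had been in his process can be approximated with a simple coverage report over the seed name list, flagging name patterns the list never represents. The buckets and sample list are illustrative assumptions, not anything from his actual project.

```python
# Sketch: test a seed name list for coverage gaps before training on it.
import unicodedata

def coverage_report(names):
    """Count how many seed names fall into each pattern bucket."""
    buckets = {"short (<=2 chars)": 0, "has diacritics": 0, "non-latin script": 0}
    for n in names:
        if len(n) <= 2:
            buckets["short (<=2 chars)"] += 1
        if any(unicodedata.combining(c) for c in unicodedata.normalize("NFD", n)):
            buckets["has diacritics"] += 1
        if any(ord(c) > 0x024F for c in n):  # beyond Latin Extended-B
            buckets["non-latin script"] += 1
    return buckets

seed = ["John", "Mary", "Smith", "Jones"]  # a narrow "slice of Americana"
for bucket, count in coverage_report(seed).items():
    status = "MISSING" if count == 0 else f"{count} example(s)"
    print(f"{bucket}: {status}")  # every zero here is a gap the model inherits
```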

Speaker 2:

So one of the things that I like to do, in some of the speaking that I do, is talk to groups including developers, because developers need to know when they're making an ethical choice; they're not trained in this. They're coding the widget in two-week sprints: make this thing work. Well, I am coming in to ask you to pause at certain points, because each of these is an inflection point. You need to give this decision to someone else to make, because it's an ethical choice: an ethics committee, or management, or someone else makes the ethical call and then comes back to you to decide which way you want to code it. But developers are the front-line workers building these tools.

Speaker 2:

So they're the ones that I wanted to talk to about these types of things, because I've talked to people who worked in programmatic advertising before, and they held their nose and did the work, but they clearly saw what they were doing.

Speaker 2:

You know, if you have an audience of this kind of person and that kind of person, and you only want to sell your thing to these kinds of people: who are you leaving out, and is that okay? Fundamental fairness is within all of us, and we each have our own standards for where that lies. But I wanted to inspire their curiosity so that they would look at their work in a different way and say, this might be an ethical choice, before I code this and include this library. Right? I mean, this little library over here doesn't even have pictures of gorillas in it, because Google couldn't figure out how to separate pictures of Black people from gorillas, so they just got rid of all the gorillas and said, now we're bias-free. Which is ludicrous. Tell that to the poor gorillas.

Speaker 2:

Well, at this point, you know, they're also pulling out things like breastfeeding information, because any image of a breast might be pornographic. So people looking for information on breastfeeding are not going to find it. I mean, who are the arbiters of the truth? And what happens to information in the future, because people decided it was too risky to include these data sets? There are also missing data sets. How many gay and lesbian people were denied housing three years ago? We don't know, because nobody bothered to collect that data; there was no financial incentive to collect it. So how do you manage a situation if you don't measure it? We have data inequities all over the place. I mean, look at Gen AI. It's proficient in white English, so if you are proficient in white English, you're probably going to do fine. How about the rest of the world? Ernie? Yeah, if you speak Mandarin you can use Ernie, but I wouldn't advise it. It's an LLM from Baidu.

Speaker 3:

No Ernie. No, no, no. For those not tracking, it is a Mandarin-trained LLM.

Speaker 2:

When you do research, you know, Translate is sometimes the most effective feature, because we're stuck in English, so you're reading a lot of research papers in English. Well, I'm sorry: somebody in another part of the world researched this five years ago and came out with something extraordinary, but you never saw it because it's never been translated into English. There's a bulk of knowledge out there if we look outside of our little, you know, privileged bubble.

Speaker 3:

You're getting us squirreled off into Tower of Babel territory, and I like it, because I happen to think that's one of the biggest problems with AI too: it is myopic. It requires data sets, but the data is highly cherry-picked, highly cherry-picked, and some of it is just garbage.

Speaker 2:

Like you mentioned inferencing. Inferencing means making stuff up.

Speaker 3:

Yes, yeah, I know. I will completely, completely cop to that. We were making things up intentionally, because what we started with was: we know nothing. So we have to make inferences, right? We have to start inferring our way back to truth, because we are intentionally starting from "we don't know a thing." But inferences are very, very dangerous for exactly that reason. There's a whole abductive-versus-inductive reasoning process that we spent a lot of time looking at, and it got us thinking differently about the mistakes we could make inferring things about humans, inferring things about people we did not know. And there were lots of gotchas that could happen: you start inferring the wrong things about someone and associating them with other activities or other documents.
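A toy illustration of why chained inferences are so dangerous: even when each individual inference is fairly reliable, the odds that a whole chain of them is right decay fast. The 90% per-step confidence is an assumption picked purely for illustration.

```python
# Assume each inference in a chain is independently 90% likely to be correct.
per_step_confidence = 0.9

for chain_length in range(1, 7):
    p_all_correct = per_step_confidence ** chain_length
    print(f"{chain_length} chained inference(s) -> "
          f"{p_all_correct:.0%} chance the conclusion holds")

# By five hops you are under 60%, approaching coin-flip territory, and that's
# before you associate a real person with an activity or a document.
```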

Speaker 2:

Yeah, like resume screeners. Resume screeners right now: you can't code for this. I can't get a job because I'm an outlier, but resume screeners are trying to find the people in the middle of the road, right? So what happens is that you keep churning through the mediocrity of, you know, the normal people in the middle of the road. Meanwhile, businesses are wondering why excellence doesn't emerge.
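A hypothetical sketch of the failure mode Heidi is describing: a screener that ranks candidates by similarity to the average past hire will, by construction, rank every outlier last, however excellent they are. All of the features and numbers here are invented.

```python
import statistics

# Invented features per candidate: (years_experience, job_changes, skill_score)
past_hires = [(5, 2, 0.7), (6, 3, 0.6), (4, 2, 0.8), (5, 3, 0.7)]
candidates = {
    "middle-of-the-road": (5, 2, 0.7),
    "career-changer":     (1, 6, 0.9),  # outlier: unusual path, strong skills
    "prodigy":            (2, 1, 1.0),  # outlier: excellent but atypical
}

# The "ideal" candidate is defined as the average of past hires.
centroid = tuple(statistics.mean(h[i] for h in past_hires) for i in range(3))

def distance(a, b):
    """Euclidean distance: smaller means more like the average past hire."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# The screener ranks by sameness, so both outliers land at the bottom.
for name, feats in sorted(candidates.items(),
                          key=lambda kv: distance(kv[1], centroid)):
    print(f"{name}: distance from the average hire = "
          f"{distance(feats, centroid):.2f}")
```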

Speaker 1:

We'll be right back! No, I'm just kidding, we'll keep it right here. It is three minutes past; I don't know if you have a hard stop, Heidi, I just want to be respectful.

Speaker 2:

No, I figured that this, you know, is live, and I didn't really know what to expect here.

Speaker 1:

So yeah, okay, we'll go a few minutes over. That's cool.

Speaker 2:

I don't have a date right after this.

Speaker 1:

Okay, good, good; we're your only date. So, until your day date or your night date: is there anything that we didn't talk about today that you want to bring up, anything that might be important, or that you want to shout out?

Speaker 2:

I would encourage people to do a few things if you're interested in getting more into the privacy engineering side, just because you want to know and understand more about what people are telling you, not necessarily because you want to do the privacy engineering work yourself. The shots book, yeah, the privacy book for engineers, is fantastic. You don't have to be an engineer, but it will tell you how stuff works, so that's a good resource. If you're looking for more information about ethical AI, responsible AI, whatever marketing term people are starting to use for ethics washing now, and you want trustworthy sources of information: ForHumanity offers training and also has working groups. So if you are interested in learning more, you can come and meet more people in the community. We have over a thousand people now from over 80 countries, and there are so many different projects that you don't have to be an expert in your field to add value in one of the working groups. You can just check it out and see if you like the people that are there, maybe if you're looking for a mentor or something like that. And there are no imposter-syndrome rules; you have value just showing up. So that is a good place. Also, Shea Brown is teaching some very useful classes, and his company is BABL AI.

Speaker 2:

Shea's got the right opinion, in my opinion: we need to bring as many people into this as possible and teach them how to do this work. It's not a competition or a secret-sauce kind of thing, like "I am not the only one that can save you." No, there need to be lots of other people that can do this, because that's also how you achieve diversity of thought. We need to bring as many people into this field who know how to do this work. Some people have skills in data science and statistical analysis; some people have skills in change management, in dealing with the people you are implementing change with, and the soft skills you need there. So it kind of depends on what skills you have and what project you're working on, and how you can go and get more information.

Speaker 2:

Personally, when I started doing my research into AI a few years ago, to find out what this is and how it works, one of the simpler things I did was take Google's training classes. They don't offer them anymore, but they had training classes on targeted advertising and funneling and lookalike audiences, so I got to look under the hood and find out how you do that. It's horrifying, but it was a free resource. Likewise, PBS has a free resource called Crash Course, and Crash Course AI is taught by a guy named Jabril, and he's phenomenal. It's a video series; each episode is about 20 minutes long, and there are about 20 of them. It goes through natural language processing and all of the building-block pieces in a way that you can understand, so you can get a base level of knowledge before you start looking at certifications, if you want to go that route.

Speaker 1:

Yeah, that's great. Great resources and advice right there.

Speaker 3:

I like it. Shameless plug: I'll add one more. I host a GitHub resource that has a ton of privacy engineering resources, and the shots book is at the top of that list.

Speaker 2:

Nice, good deal. That GitHub resource, I mean, I need some information from my peeps on that.

Speaker 3:

Yeah, I will. I will make sure we share it. It's basically all of the resources that I have found useful, and some of the ones that you just spoke about aren't on it yet. Let's connect; I want to make sure I update the list to include those resources too.

Speaker 1:

Yeah, to that point: Heidi, are you a part of the Privacy Pulse Slack community? Okay, I'll send you an invite. We've got a good amount of people in there, and I know you're in there too, Gabe. But, Heidi, thank you so much for taking the time to join us today. Seriously, you're awesome; we appreciate you very much. Thank you for joining us for a day date and talking privacy. Your passion is clearly evident. Thank you for what you do, and for what you're doing for women in this industry; it's very important. We really appreciate your time, and appreciate you as a colleague, another person working alongside us in this industry.

Speaker 2:

Love it. I'm just running my mouth on LinkedIn.

Speaker 3:

Probably not! Yeah, that's fun. Let's do it.

Speaker 2:

Thank you so much for this lovely date. This was a great discussion. I'm gonna go outside, and if there's not a dump truck of flowers, I'm gonna say it's only because someone messed it up in the AI department and it was bad training data. Thanks, guys, thanks for doing what you do.

Speaker 1:

Thank you, Heidi. It's been a pleasure. We'll talk soon. Take care, Heidi. See you, everyone.
