Privacy Please
Tune into "Privacy Please," where hosts Cam and Gabe engage with privacy and security professionals around the planet. They bring expert insights to the table and break down complicated tech topics into terms everyone can understand.
S5, E227 - Avishai Ostrin, Founder & CEO at TrustIZ
Join us for an enlightening journey into privacy and technology with Avi, founder of TrustIZ. Discover proactive strategies to tackle privacy challenges and gain insights into Israel's evolving privacy laws post-GDPR alignment. Explore the intersection of privacy and cybersecurity and the transformative role of AI in privacy processes. Avi's dynamic perspective, inspired by the energy of Hamilton, emphasizes the importance of being "in the room where it happens." This episode is a must-listen for those eager to understand the future of privacy in our tech-driven world.
All righty, then. Ladies and gentlemen, welcome back to another episode of Privacy Please. I'm your host, Cameron Ivey, alongside my other host, Gabe Gumbs. Gabe, how you doing today, man? I am well. How are you, sir? Doing well, great. We got a guest with us today. Avi, he's been on before. Actually, we were looking back. What was it?
Speaker 1:It was almost three years ago. A lot of different things have been going on since then, so we wanted to catch up with you and see what's going on, what has changed, what you're getting into right now, what you're dealing with, and just talk privacy and all the things. So, Avi, welcome back on the show, man. Welcome back.
Speaker 2:Thank you very much. First, you're going to have to tell us how it is you look five years younger, even though we saw you three years ago. You should be going the other way.
Speaker 3:Yeah, that's true. I can tell you it's not for a lack of stress and stuff going on in my life, so it's some magic elixir, it seems. But yeah, thank you, you guys don't look too shabby yourselves. I appreciate it, especially for being a Jets fan and having kids and a company. That's a lot. It is. I'm not sure which order I would put those in, in terms of levels of stress: Jets fan, father and company owner.
Speaker 3:But yeah, for sure, that'll do it.
Speaker 1:So what's going on right now? Okay, let's catch everybody up, in case they never listened to your episode.
Speaker 3:Yeah, so if anyone wants to go back and listen to that, it was a fun episode we did in February 2021.
Speaker 2:So it was almost three years ago.
Speaker 3:Yeah, awesome. So back then I was heading the privacy department at a UK law firm that was based out of here, out of Israel. Pretty quickly after we had that episode, I made a bit of a change, where I moved from a law firm providing legal advice, always in privacy, to my clients, to a company that provided data protection officer consultancy services, and that was an interesting transition. I had reached a point in my career where I felt like, just as a privacy lawyer, companies or clients were coming to me when something was broken, when something was wrong. We were talking about this before, Gabe, about things being broken and having to be fixed. And that's where I felt like I was always coming in after the fact.
Speaker 3:And I really wanted to grab the bull by the horns and be there right at the beginning, right before things were broken, and try to prevent them from being broken. So I joined a consultancy that helped companies build privacy programs and offered data protection officer services, and worked with some really awesome companies right on the cutting edge of the tech coming out of here, out of Israel, obviously the startup nation. I did that for a couple of years, and then a year ago, a year and a month actually, I decided to go out on my own, jump in the deep end, and start my own consultancy. It's been a year now, and it's been a whirlwind, but things have been really going great. The company that I founded is called TrustIZ, and it's sort of a play on the phrase "trust is everything," because everything that we do in our space, in tech, in compliance, whether it's privacy or AI governance or things like that, is all about building trust.
Speaker 3:That's the end game. What we're always striving for is building trust with consumers, building trust with the people that are using the tech, and that's what I help companies do. At the end of the day, I help companies build that trust through privacy compliance, data protection officer services and AI governance. So that's been my journey over the past three years. It's been an amazing one, and I'm thrilled with the stuff that I've achieved, but I'm also looking forward. I think there's still a lot to do, and I'm excited about it.
Speaker 2:Awesome. Well, A, congratulations on the new venture. That's just huge. That's a lot of work, and hats off to you. By the way, for those that were interested, it was season two, episode 52, where Avi first appeared.
Speaker 1:That's crazy Episode 52.
Speaker 2:Yeah, season two, episode 52. A lot's changed in the world since then. We're on season five now. So, from a catch-up perspective, what's the number one change you've seen over those last three years that's really affected the way you're thinking about the problems today?
Speaker 3:So obviously, from a technological perspective... how many minutes are we into the episode now, and we still haven't said the term AI?
Speaker 1:Actually, I mentioned it before, but yeah, it's AI, right? No question, hands down.
Speaker 3:that's what everyone's talking about, and it's coming up in every single conversation. It's funny, because even the more traditional companies weren't as tech savvy before. You sort of had the tech-heavy companies that were building this really cool cutting-edge technology, and then the more traditional ones. And now even the traditional ones, just to throw an example out there, like a local construction company, are thinking, oh, how can I leverage AI to make my people more efficient and reduce costs and things like that. So it's really permeated everything, and AI is just all over the place. In terms of the technology, that is certainly something I'm seeing much more of today than we did back then.
Speaker 3:The other thing, if I'm thinking about the clients themselves, what's interesting is I'm actually seeing a lot more maturity when it comes to talking about these types of issues of compliance and regulation. You're seeing companies coming much, much earlier in the journey, which makes my job, A, much more interesting, and B, much more efficient and effective, if I can get in on the ground level, when they're just laying the foundation and the scaffolding of the product, and steer them in the correct direction. So I'm seeing those conversations happen earlier in the cycle, and that's very exciting and, I think, a really good byproduct of some of the regulations that we've seen come out over the last couple of years.
Speaker 2:Look, we couldn't spell your name without AI, could we?
Speaker 3:I like that. You can't spell my name without AI.
Speaker 1:I don't know if I'm jumping the gun here, and I don't want to say gun, that's the wrong term, but I'm going to bring up the war. How has the war affected what's going on in privacy and data security and everything? What have you seen there?
Speaker 2:You might want to be more specific. Which of the wars? That's fair.
Speaker 3:Yes, that's fair. So, for those who don't know my background, I'm originally from the US, but I grew up here in Israel. I've lived here since age seven, so I really grew up here, and I live with my family about 20 minutes north of Tel Aviv. People are probably familiar with that area. If you haven't been here, it's very beautiful, and I would recommend coming and visiting, although I would wait until things calm down a little bit, which we're all hoping happens soon. So, domestically, for us it's been really interesting. Personally, I started my business on October 1st, 2023, six days before October 7th, which was the big, terrible day that happened last year. That was a shock to the system, and I was totally questioning myself for months after that, like, what am I doing? How can
Speaker 3:I possibly build a business? Is this viable? What have I done to the future of my family? And I'm really proud to say that there was a rough patch there at the beginning, but I pushed through and persevered, and I'm very thankful that I had the support of my family. It doesn't necessarily have to be starting a business, but if you're hearing this and you're thinking, oh, I want to make this change, or I want to go into this type of business, or I want to start my own business, or anything like that, what I took away from the experience personally is that we can really overcome a lot of challenges. So I would say, go for it. I believe in every single person in our amazing industry, so I would encourage everyone to take those risks, and good things will come. That's sort of the personal side, I guess, or national slash personal side with me.
Speaker 3:Interestingly, in parallel, in the past year Israel has managed to pass a pretty serious reform of our privacy law. There was an amendment that was passed a few months ago, and it's going to go into effect in 2025. Because of that change in the law, we're seeing a lot more interest domestically from companies that didn't necessarily take privacy all that seriously, because given the history of privacy legislation in Israel, it's historically been something that hasn't really been at the top of people's minds. Now, because the law has changed and gotten more serious, and the regulator has gotten way more ability to impose sanctions and much more authority, people are all of a sudden perking up and starting to take note of it. From that perspective, it's also been really great, because we've gotten a lot of people calling in and saying, hey, we heard something's going on, can you help us navigate
Speaker 3:this new thing? And we're able to be there and help them navigate it, so that's also been a really cool thing that's happened.
Speaker 1:Yeah, that must have helped your business, I would imagine.
Speaker 3:Yeah, definitely, it is helping, for sure. There's actually even a requirement, similar to the GDPR requirement, for certain kinds of entities, like public entities and companies processing a high volume of sensitive personal data, to appoint a data protection officer. That's a job title that just hasn't existed here before, and all of a sudden, not only does it exist, but it's now a legal requirement in certain circumstances. So we're getting questions and queries from companies about, like, what is this new thing that we need to deal with? That's been really cool and really interesting, and I've been enjoying a lot of the conversations that we've been having around that.
Speaker 2:What was the driver of such legislation? I'm always curious, because in the US, obviously, CCPA is not at the federal level and has a very different driver.
Speaker 3:Every piece of legislation has the history behind how it came to be, and this law is no different. So, for people who aren't familiar, Israel's privacy law is actually one of the earlier ones; it dates from 1981.
Speaker 3:So it's pretty old. It certainly predates me, so there you go, disclosing a little bit of my age right there. But you can imagine what technology was like back then, and the law hadn't gone through many major changes since. I think there were two major amendments made over the years, but nothing recent.
Speaker 3:In 2011, Israel received adequacy status from the European Union under the old regime, the EU privacy directive, so this is pre-GDPR. For people who may be less familiar with that, there are only a handful of countries around the world who have this status, and Israel is one of them, and basically what it means is that data can flow freely from the European Union to Israel without the need for additional measures like standard contractual clauses or things like that. That was achieved in 2011 with the promise that Israel's privacy law was going to be modernized and amended. That didn't really happen, and then the EU came and said, we're doing a review of all of the adequacy decisions that were given, and for those who haven't brought their law up to speed, we may be thinking about taking that away or putting limitations on it. That was, I think, the main driver. Obviously, there were internal things too; the privacy regulator really wanted to push this forward, and it was almost like a personal driver for him to be able to raise the bar when it comes to privacy compliance in Israel, and they were able to do it.
Speaker 3:They had a really intense period of sessions in Parliament. Obviously, this type of legislation has to go through hearings and committees, and the process is pretty lengthy, and they had a lot of sessions on it, but they were able to finally pass it through. It was passed in August, and it's going to go into effect next August, in 2025.
Speaker 1:That was my next question.
Speaker 3:Yeah, so it's going into effect in August 2025. It's interesting because we have this phrase, the Brussels effect, which is regulation out of the EU that impacts regulation around the world. So the formation of CCPA, obviously different from the GDPR but based on some of the principles of how GDPR was built, is one effect that it had, EU to US. And now the Israeli law is sort of the Brussels effect that we received from the GDPR: the push from the adequacy standpoint to say, guys, you need to get your ducks in a row if you're going to keep this adequacy status.
Speaker 1:I don't know if we went over it. What's the name of this one? Does it have a name?
Speaker 3:Of Israel's privacy law.
Speaker 1:Yeah.
Speaker 3:Yeah, it's called the Protection of Privacy Act. It's still the 1981 law, but it's been amended.
Speaker 1:I'm sure it's got an acronym, right?
Speaker 3:P-O-P-A. I think Papa maybe.
Speaker 1:Oh yeah, Papa.
Speaker 3:Papa, I've heard of that, not to be confused with Tom Papa.
Speaker 1:And there's also one called PEPA somewhere else. Not Peppa Pig. Right, there's POPIA.
Speaker 3:There's PIPEDA, there's all of them. When you get privacy, it's always with a P. So there you go.
Speaker 1:It's true. Do you know any of the main differences? You said there's similarities with the GDPR. Any key differences that stand out to you, off the top of your head?
Speaker 3:Yeah, so it's always interesting when you talk about local legislation and compare the approach to the way it's approached in other countries. Obviously, in the US privacy is very much seen as a consumer right, whereas in the EU
Speaker 3:it's seen as a basic human right. In Israel as well, privacy is considered a human right. Israel's constitution does talk about people's right to privacy in their bodies and in their homes, so from that perspective, the approach is a bit similar. Traditionally, Israeli privacy law has actually been very security focused, because we're seen as the cyber nation, right? There's a lot of cyber companies, there's a lot of focus on cybersecurity. So there's always been this conflation in Israel between privacy and cybersecurity, and people say, oh, I'm protecting the data, therefore I'm privacy compliant, and you have to explain that, no, they're obviously two sides of the same coin, but they're two separate issues. And so now what we have in the privacy law is a principle-based law.
Speaker 3:One of the main differences between GDPR and Israeli privacy law is that, while GDPR has six different lawful bases for processing, you have consent, contractual necessity, legitimate interest, legal obligation, et cetera, in Israel it's all based on consent. So you have to obtain consent in order to process personal data, and if someone withdraws consent, then you can't process their data anymore. For those familiar with the Canadian privacy regime, it's very similar to that, in the sense that Canada also has consent as the sole lawful basis for processing personal data. So that is one very major difference. Also, conceptually, one of the things that we unfortunately still have in Israel is a definition of databases. So it's not about processing of personal data, it's about processing of data in databases, which is a little bit of an antiquated way of thinking about things: protecting a database as opposed to just looking at specific processing activities.
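To make the consent-only point concrete for readers who think in code, here is a minimal sketch in Python. It is purely illustrative: the class, purposes and function names are hypothetical, not anything defined by the Israeli law, the GDPR, or the episode.

```python
from dataclasses import dataclass, field

@dataclass
class DataSubject:
    """Hypothetical record of one person's consents."""
    subject_id: str
    consented_purposes: set[str] = field(default_factory=set)

def may_process(subject: DataSubject, purpose: str) -> bool:
    # Consent-only regime (as described for Israel and Canada):
    # processing is lawful only while consent for this purpose is on record.
    return purpose in subject.consented_purposes

def withdraw_consent(subject: DataSubject, purpose: str) -> None:
    # Once consent is withdrawn, further processing for that purpose must stop.
    subject.consented_purposes.discard(purpose)

# Hypothetical usage:
alice = DataSubject("subject-001", {"marketing", "analytics"})
assert may_process(alice, "marketing")        # allowed while consent exists
withdraw_consent(alice, "marketing")
assert not may_process(alice, "marketing")    # must cease after withdrawal
```

Under a GDPR-style regime the same check would instead branch on whichever lawful basis applies (contract, legitimate interest, and so on); here consent is the single gate.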
Speaker 2:I'm not super offended by that, I think, unless one defines database too narrowly, because what I like about it is that it closes a loophole others try to skirt, data brokers certainly do stateside. They claim they're not keeping a database of a bunch of PII, right, because of how they keep it. They're like, well, that's not a database. Obviously, they're just getting cheeky with some of the wording there. But a database does not necessarily have to mean an ACID-compliant RDBMS system by any stretch of the imagination. It could just be this notebook.
Speaker 3:Yeah, no, it's true, it's true. I think what bothers experts most about it is that, given the way modern technology has evolved and the way that we live our lives and have our personal data and our PII out there, database implies some sort of structured base of information, whereas we have a lot of information that's out there in unstructured formats, exactly.
Speaker 2:Right.
Speaker 3:Unstructured formats, all kinds of connections between different databases, data lakes, and places where our personal information resides that you can't necessarily classify into a neat filing cabinet of database A, database B, database C.
Speaker 1:So is it like a general term?
Speaker 3:Yeah, it's a general term. There isn't really a specific definition of what a database is. It's just a collection of personal data that then needs to be protected. The way we think about it is less about a specific system and more about the type of information. So let's say you're looking at a company that has human resources data. They may have seven different systems in which that data is stored, but you look at it holistically, and all those systems together comprise one database, which is HR information.
Speaker 3:So it's more of a conceptual legal term than a technological term. But yeah, it's just a different way of thinking about it and conceptualizing it.
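A rough sketch of that conceptual grouping, in Python. The system names and categories are invented for illustration; the point is only that many technical systems can count as one "database" because they hold the same category of personal data.

```python
# Hypothetical inventory: several technical systems, grouped by the category
# of personal data they hold rather than by storage technology.
systems = {
    "payroll_app":       {"category": "HR",        "fields": ["name", "salary", "bank_account"]},
    "applicant_tracker": {"category": "HR",        "fields": ["name", "cv", "email"]},
    "crm":               {"category": "Customers", "fields": ["name", "email", "purchase_history"]},
}

def conceptual_databases(inventory: dict) -> dict[str, list[str]]:
    """Group systems by data category: the 'database' in the legal sense."""
    grouped: dict[str, list[str]] = {}
    for system_name, meta in inventory.items():
        grouped.setdefault(meta["category"], []).append(system_name)
    return grouped

print(conceptual_databases(systems))
# {'HR': ['payroll_app', 'applicant_tracker'], 'Customers': ['crm']}
```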
Speaker 1:That's interesting because you know, I don't know, do you work with companies in the US as well?
Speaker 3:Yeah, sure. I work with companies all over the world.
Speaker 1:Okay, that's good to know for the listeners too. But yeah, so, from what I'm hearing, compared to Israel's law and the European GDPR, the state privacy acts in the United States, like CCPA, and there's a lot of them coming out in January of 2025, I think there's like six or seven, have thresholds, opt-out mechanisms, a lot of things like that in them. There's similarities, but they're very different because of what you said: it's more about the human right compared to the consumer right, and that's why there's so many little things in the state laws. Am I on the right path?
Speaker 3:Yeah, you're definitely on the right path, and I completely understand. The answer is that, in terms of scope, it's much more similar to GDPR, so there's no thresholds. For example, the security measures that you have to impose on the data that you have will depend on the sensitivity of the data, the number of data subjects that you have, and the number of people that have access to the database. That's an interesting one that doesn't exist elsewhere: access rights determine the security measures that you need to impose. But it's very much different from the US state laws that are coming out, because, number one, it covers all data subjects, regardless of the consumer context. They can be employees, they can be individuals, even if they aren't direct consumers of the business, so it's not just in the business context. And also, there isn't that threshold; it applies to all processing.
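As a toy illustration of that idea (required safeguards scaling with sensitivity, the number of data subjects, and the number of people with access), here is a small Python sketch. The tier names and thresholds are invented for the example and are not the ones in Israel's actual data security rules.

```python
def required_security_level(sensitive: bool, num_subjects: int, num_authorized_users: int) -> str:
    """Illustrative only: safeguards scale with sensitivity, volume, and access."""
    if sensitive and (num_subjects > 100_000 or num_authorized_users > 100):
        return "high"
    if sensitive or num_subjects > 10_000:
        return "medium"
    return "basic"

# A small internal list vs. a large repository of sensitive data:
print(required_security_level(False, 500, 5))        # basic
print(required_security_level(True, 250_000, 40))    # high
```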
Speaker 3:So nobody's really exempt, even if they're government or health care or whatever? Yeah, no, obviously there's the standard household exemption, like there is in the GDPR. You know, if you're keeping your, the modern-day equivalent of a Rolodex.
Speaker 1:Oh man.
Speaker 3:Yeah, there you go. See, now you're showing your age. Now I'm showing my age.
Speaker 1:Although I did say that the 1981 law predates me, so you can guess. I'm from the 80s too, so if that makes you feel any better.
Speaker 3:Yeah, there you go.
Speaker 3:So, yeah, the household exemption exists, but certainly in terms of the applicability of the law, it would apply to anyone processing personal data.
Speaker 3:The other interesting thing to note, for anyone who is doing business in Israel or has any interest in Israeli privacy law, is that we have a regulator that is very much leaning into the new regime, speaking at every privacy event in Israel, talking about how enforcement is going to ramp up and how they're really going to take it to the next level, and putting people on notice. The regulator has received a lot more leeway and a lot more authority under the new law, and at least from what they're saying publicly, they seem like they're going to start cracking down on enforcement in August 2025. So it's certainly going to be interesting.
Speaker 1:What's the most common challenge that you run into with your customers right now? Is there one that stands out, that's coming up very frequently?
Speaker 3:Yeah, I think it's still, I mean, this was the case back when we spoke last, and I think it still is very much the case. The clients, the companies that I tend to work with, are operating globally, and the lack of a unified global standard is just a really difficult thing for companies to comply with. Obviously, companies that are operating in the EU and other countries will take the GDPR as the global benchmark. But once you go beyond, say, Israel, the EU and the US, and you start going into other jurisdictions, it starts becoming really tricky and really complicated, and in some cases you have conflicts between certain laws, right? For example, I do some work with fintech companies, where there's always financial regulation versus privacy. Are you retaining the
Speaker 2:data. Are you deleting the data?
Speaker 3:Everyone has different interests, and they're all like, you know, we need to delete it under EU law, but we have to retain it under, you know, South African banking regulations.
Speaker 3:So how do we reconcile those? That's just an example. But building a global privacy program sounds really great in theory and is very difficult to implement in practice, and I would say that's probably the most difficult thing that I find.
Speaker 2:Yeah, I imagine. I certainly want to be mindful of the time, and I'm not certain how much more we have, but we're nearing the end of November, which means we're pulling up on 2025 territory, which means it's time for predictions. What do you see really changing the landscape in the coming year, or becoming more ever-present than it is now? What are your
Speaker 3:general predictions for 25? Wow, that's an excellent question. So I think the state privacy laws, it's snowballing, and the snowball is just going to continue growing. We have, if I'm not mistaken, 19 laws in the US, and we're going to keep going. I was a little bit hopeful when the APRA, the draft federal privacy bill, was introduced earlier this year, but that was a very short-lived experiment, as were its predecessors, so I don't see that coming anytime soon. But I do certainly see many more states adopting consumer privacy laws.
Speaker 3:I think that we'll see some really interesting regulatory questions coming out around AI and privacy. There's still a lot of unanswered issues around that, things like data retention and deletion, questions that we haven't had to grapple with yet, but we really will, and I think that's going to be super interesting. As a person that is very interested in and follows very closely the developments in AI governance as well as privacy, I want to see where that is taking us, for example, some of the copyright lawsuits that are pending in the courts now against some of the big AI providers, like OpenAI and Google and others, around the use of copyrighted material in AI. And just a continuation of the trends that we've seen, with regulators cracking down on big tech.
Speaker 3:I don't see anyone slowing down on that front, and I think 2025 will be no different. If I can say something that's probably going to survive, this is my prediction: I do think that the third iteration of US adequacy, in the form of the Data Privacy Framework, seems, at least for now, to be safe. I haven't seen any major challenges to it. As opposed to its predecessors, Privacy Shield and Safe Harbor, which were both struck down by the EU courts, it doesn't look like the Data Privacy Framework is going down that route, which is, in my opinion, a very good thing, so we can finally put all those shenanigans behind us and just focus on actually doing the work.
Speaker 1:So from that perspective, I'm happy.
Speaker 3:Wouldn't that be nice right?
Speaker 1:Well, so you have your predictions. Those are good. Now take what you would hope to happen. What do you think is one thing that could actually make things less complicated, given how complicated things are, especially for these global companies that are dealing with data all over the place?
Speaker 3:Yeah, I think anything that brings us closer to some sort of global standard. Again, I'm realistic, so I don't think this is actually going to happen, but if there was one thing that I would be hopeful for, it's that one of the international bodies, whether it's the UN or the OECD, would stand up and say, okay, enough with this nonsense of jurisdiction by jurisdiction, here's the global privacy standard. It obviously needs to be principle based. It can't be specific rules, because each country will interpret and implement it differently, in the same way that different countries in the EU interpret the GDPR differently, and somehow it still seems to work.
Speaker 3:So we can have harmonization on a principle-based, harmonized privacy framework. You can figure out the acronym for that, PBPF, the principle-based privacy framework, and sorry, you've got to put the G, the global, in there. Yeah, there you go. Doesn't it work the other way around? First you figure out a really good acronym and then you figure out what it stands for, right, like CAN-SPAM.
Speaker 3:That would be a backronym. A backronym, oh, there you go. Very good, I like it. So, yeah, just something like that. Because, at the end of the day, the frustrating thing, from my perspective, is you have companies that really want to do the right thing. It's not like someone's coming and saying, oh, I want to do shady stuff, so let me figure out where I need to incorporate and what I need to do in order to avoid X, Y, Z rules. That's not the case. The case is, we want to do the right thing, but it just works out that because we're operating in 10 different jurisdictions, we have 10 different legal regimes that apply to us, and it makes things that much more complicated. So what I would hope to see is some sort of harmonization.
Speaker 1:I can't imagine.
Speaker 3:No.
Speaker 2:Harmonization, that'd be awesome.
Speaker 1:When you're working with these privacy teams, or maybe they don't even have a privacy team, who knows? I mean, if they're a pretty large company, I would imagine they would. But what do you see on the tech side when it comes to that collaboration with these companies or customers that you're working with? Is there still a disconnect there from your perspective? Is it still lacking on the privacy side, or what do you see from that end?
Speaker 3:So I work with companies of various different sizes. As you say, with the smaller ones that don't necessarily have an in-house privacy person, I'm sort of the sole privacy resource as an external consultant. That's always a challenge.
Speaker 3:It's always a challenge getting to the relevant people in the company, trying to put together the regulatory and the tech, trying to figure out how to work collaboratively, and convincing them that working with me is actually good for them. I'm not trying to block anyone or stop anyone. I'm just trying to help us do a better job and build trust, right, because trust is everything.
Speaker 3:And then internally, companies that have an internal resource tend to be better, because they tend to already be on top of things and collaborate more with the technical people. But I will also say that one of the things that I am seeing, which I think is great, is a lot of tech tools in our space that are designed to give the internal privacy teams more collaboration, more control, more alerts about what's going on in the code and within the actual technical teams, and it's really cool to see that people are actually investing in that type of software.
Speaker 1:That was leading me to my next question, which was, are you still seeing a lot of companies using legacy tools for privacy now that there's so many better tools out there? Not to do a shameless plug with something like Transcend, but, you know, thinking about that next-gen privacy where we're trying to automate things to make it easier for those tech teams and those privacy teams, and to make things a lot easier for their consumers, to build that trust. That's the biggest thing you were talking about, building trust: how do you build that trust? Well, I think by using some kind of privacy solution, a platform that is very efficient and automated, where you can understand your data, know where it is, and have things automated for your customers and consumers, to make them feel like they have that trust in the technology as well.
Speaker 3:Yeah, absolutely, and I think that's one of the things that AI, specifically in our space, is going to really make much, much easier for the experts: giving you that bird's-eye view of what's going on. Things like RoPAs, records of processing activities, and other standard documentation that used to take ages to complete can now be automated and kept up to date in real time.
Speaker 3:I've seen and I work with several tools, like Transcend and some of the other players in this space that are out there and are being built literally as we speak, and it's really exciting to see tools that are scanning code and giving privacy teams real-time access to, okay, not what it says in the policy, because the policy is already two months outdated, because the engineers have already moved five steps ahead of what it says in the policy.
Speaker 3:It just hasn't been updated yet because the update date hasn't arrived, but actually scanning in real time, giving alerts in real time, giving information in real time, and alleviating the necessity to do all of these manual tasks all the time. I think that's going to be really, really great.
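As a very loose sketch of what that kind of scanning might do under the hood: the comment convention, function and output shape below are all hypothetical, not how Transcend or any other vendor actually works.

```python
import re
from pathlib import Path

# Hypothetical convention: engineers tag new collection points in code comments,
# e.g. "# collects: email, ip_address (purpose: fraud detection)".
MARKER = re.compile(r"#\s*collects:\s*(?P<fields>[^(]+)\(purpose:\s*(?P<purpose>[^)]+)\)")

def scan_for_processing_activities(repo_root: str) -> list[dict]:
    """Walk a codebase and draft rough record-of-processing entries
    from tagged collection points, instead of updating them by hand."""
    records = []
    for path in Path(repo_root).rglob("*.py"):
        for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if match := MARKER.search(line):
                records.append({
                    "file": str(path),
                    "line": lineno,
                    "data_fields": [f.strip() for f in match["fields"].split(",") if f.strip()],
                    "purpose": match["purpose"].strip(),
                })
    return records

# Hypothetical usage: rerun on every commit and feed the drafts into a RoPA review.
# print(scan_for_processing_activities("path/to/repo"))
```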
Speaker 1:Yeah, agreed, because I know with a lot of the older tools, it takes a lot more resources, and a privacy team isn't going to know what an engineer would know. For sure, they're going to need to use those resources. Yeah, definitely.
Speaker 3:One of the biggest things that I hear from clients is, oh, we feel like we're working for the tool more than the tool is working for us, right? That's the number one piece of feedback that I hear about some of the more legacy tools: we need to keep feeding it information, and we haven't yet seen the benefits come back to us. Whereas some of these AI tools are almost plug and play, to use one of the older, antiquated tech phrases.
Speaker 1:Yeah, I mean, it's interesting. It's going to be really neat to see how that develops and if AI and privacy can find that balance of innovation without taking a hit on the privacy side.
Speaker 3:For sure. I think it's definitely going to happen. It's not a question of if, it's a question of when. And it's going to be great, because what it's going to do is almost give privacy professionals superpowers, because the number one thing that a privacy professional always yearns for and desires is to be in the room. Have you seen Hamilton? I love that song, you know, "The Room Where It Happens."
Speaker 3:Yeah, so that's all a privacy pro ever wants: to be in the room where it happens, in the room where that decision gets made, in the room where the architecture of the new feature is getting built. And what these tools are going to allow them to do is be in the room where it happens, because they're literally going to be seeing it in real time, as the code is being written, as opposed to three months down the line, when they discover that they have a feature that's been collecting data, sharing data, tools have been onboarded, and they don't have any clue about what's going on. Now, all of a sudden, we're real time.
Speaker 1:We're seeing it in real time. We're shortening.
Speaker 3:Exactly, we're shortening the lag time for them, and that's going to give
Speaker 1:Yeah, it's going to be huge. That goes back to your point in the beginning, Avi, where you said, I don't want to come into these after the fact, I want to be able to see it when it happens and be a part of it, in the room where it happens. Interesting. Okay, and this episode is brought to you by Hamilton.
Speaker 3:You can go see it on.
Speaker 1:Disney Plus.
Speaker 3:Lin-Manuel Miranda. There we go, yeah.
Speaker 1:Awesome. Is there anything that you want to bring up that we didn't get to talk about? I know we're coming up on the hour, but anything that you want to talk about, or if you want to let the listeners know how to find you, if they're interested in working with you, learning more from you or connecting with you, whatever that is.
Speaker 3:With pleasure. Thank you very much. So the company is TrustIZ, the website is trustiz.ai, it's with a Z, so T-R-U-S-T-I-Z dot AI, or zed if you're from the UK. And you can find me on LinkedIn. I am very active on LinkedIn. I love to connect with people there, I love to have conversations there. Please feel free to reach out, DM, engage with any of the posts. I always love to hear people's feedback and thoughts on the things that I post and speak about. And if anyone needs any help with anything relating to privacy, consulting globally, DPO as a service or AI governance, please also feel free to reach out. That would be awesome, and I'm really looking forward to having great conversations, because if there's one thing that our community is good at, it's having really interesting conversations with really interesting people like yourselves.
Speaker 1:I agree. That's why you're on, because you're interesting and you care, and you are you. One thing that stands out to me, from my perspective, is that you're just being you and you're digging in and going all in. I love that. I appreciate it, and thank you for taking the time to be with us again.
Speaker 3:Thanks. Well, the feeling is definitely mutual, Cam and Gabe. It's always a pleasure to speak to you guys, and same.
Speaker 1:One last thing before you go, though. Since we're coming up on Thanksgiving, what's one thing that you're thankful for?
Speaker 3:Oh, wow, that's... Or multiple things?
Speaker 1:No.
Speaker 3:I was going to say I'm thankful for so many things that it's hard to pick just one, but I would have to say I'm really thankful for my family. This past year has been, you know, a lot. And our dog, who gives me emotional support on a daily basis. My dog, Brownie, is by now a pretty big privacy expert, because she's had to have so many conversations with me about some of this stuff when I didn't have anyone else around.
Speaker 3:My wife was out at work and my kids were at school. So now she knows all about building privacy programs, drafting policies, reviewing DPAs, so she actually has a lot of knowledge in the privacy space.
Speaker 1:So she's pretty certified, sounds like. Oh, she is definitely by far the most certified canine privacy professional out there. I think you should start, I don't know, an Instagram page with you and her, like a privacy sidekick or something. Yeah, for sure.
Speaker 3:Yeah, I mean, she's pretty cagey, no pun intended, so it's hard to get her view on things. It's usually me giving her my view, but she's a good listener.
Speaker 1:I love that, man. Well, Avi, thank you again for your time. It's always great catching up. I'll definitely talk to you on LinkedIn and so on, but good luck going into 2025.
Speaker 3:Thank you so much for having me on again and wishing you and your families, and everyone listening, a very happy Thanksgiving.
Speaker 1:Appreciate that. Same to you and yours.