Privacy Please
Welcome to "Privacy Please," a podcast for anyone who wants to know more about data privacy and security. Join your hosts Cam and Gabe as they talk to experts, academics, authors, and activists to break down complex privacy topics in a way that's easy to understand.
In today's connected world, our personal information is constantly being collected, analyzed, and sometimes exploited. We believe everyone has a right to understand how their data is being used and what they can do to protect their privacy.
Please subscribe and help us reach more people!
S6, E255 - Privacy, Law, and Emerging Tech: A Conversation with Sonia Siddiqui
Privacy and cybersecurity leader Sonia Siddiqui joins us to explore the collision between emerging technologies and privacy regulations, offering insights on how companies can navigate this complex landscape while building trust.
• Sonia's journey from aspiring architect to privacy expert, motivated by the intersection of civil rights and privacy
• The growing gap between rapid technological innovation and slower-moving regulatory frameworks
• Examining real-world tensions like WorldCoin's iris scanning under GDPR's biometric data provisions
• Why privacy should be a core business enabler rather than just a compliance checkbox
• The importance of implementing privacy by design as a living process that evolves with technology
• Why principles-based regulation allows for better adaptation to new technologies than prescriptive rules
• The inseparable relationship between privacy and security in building customer trust
• How privacy professionals can stay current through professional networks, podcasts, and continuous learning
• Essential privacy resources including "The Unwanted Gaze" and "Determann's Field Guide to Privacy"
Find Sonia and her privacy consulting practice at tamarack.solutions or connect with her at the upcoming AI conference in Boston.
Speaker 1:All righty, then. Ladies and gentlemen, welcome back to another episode of Privacy Please. I'm Cameron Ivey over here with Gabe Gumbs, as always, and today we have a special guest with us, Sonia Siddiqui. She is a privacy and cybersecurity leader and lawyer working with emerging technologies. She's an ex-Coinbase employee. But, Sonia, I'll let you kind of chime in and just tell the listeners a little bit about yourself. We'll go from there.
Speaker 2:Yeah, sure. So first of all, thank you both for having me on this show. I'm a big fan and very excited to be, you know, a guest now. My name is Sonia Siddiqui. I am, present day, Fractional Privacy Counsel and founder of my own practice, Tamarack Solutions, where I advise tech companies and crypto companies on building scalable, compliant privacy programs. I'm also a former chief privacy officer at Kohler and, Cameron, as you mentioned, I was also head of privacy and security legal at Coinbase for a bit, and sort of an early, early joiner there. I bring a business-aligned lens and a practical application as it relates to privacy and tackling all the challenges that come in this space.
Speaker 3:Love it. Welcome to the show. It's a pleasure to have you.
Speaker 2:Thank you.
Speaker 1:Before we get into the meaty, greedy, or, that's not even a saying, what was the dream before becoming a lawyer? Like, where did that all come from?
Speaker 2:Man, where did it come from? So it came from the dream before becoming a lawyer was I wanted to be an architect. But you know, it turns out I'm really bad at math. I do love design. I'm terrible at math. And so that dream got deferred. And then there is a little bit of a holy trinity still in South Asian culture about what you can be, and so it's doctor, lawyer, engineer, and so which one doesn't use math right? And so I joke and it sounds sad, but it was fine. I really do think I landed in what is truly my calling, which is to be a lawyer.
Speaker 2:It may not have been the most idealistic path, but I really love what I do, you know?
Speaker 1:Love that. So we were talking offline and I'd love to get kind of into it. Gabe, you had a, you kind of had a question. We can start off from there.
Speaker 3:First, kind of a, you know, follow-up introductory question. You built your career on that intersection of law, technology, and privacy, as we were talking about offline, and I was curious what drew you to it. I mean, you mentioned kind of the natural progression, you know, maybe by elimination, what else was left for you to really explore and exploit as an intellectual human. But what drew you to the field? Because at any given point you could have done like some of the black sheep in the family and said, you know what? I'm not going to be a doctor, lawyer, or engineer. I sure should have tried.
Speaker 2:Yeah, no, and that's a great question. Thinking in particular about ending up in sort of the privacy and cybersecurity field: as I was going through law school, there was a case that came up in New York around surveillance of minority communities, particularly Muslim communities, and it was quite controversial at the time, and there was a lot of litigation around it and questions around civil rights. I've always been kind of interested in the civil rights side of the world, and I had been exploring internships and clerkships with civil rights firms in law school and taking on some of those issues. And this one was interesting because it really came at the intersection of the inherent right to privacy that human beings have and civil rights, right? Like, tying those two things together. And I found it so fascinating and so interesting, and it really spoke to me just as an individual and as a person, as a part of that community.
Speaker 2:As I graduated law school and began to explore my professional path, I started trying to figure out what I wanted to do, and I think I mentioned this before: part of it was truly something I'm passionate about, and some of it was luck and timing, right? As I was graduating law school, it was on the heels of, well, sort of on the heels, I don't know, but GDPR was about to come into effect. Essentially, we had seen the ePrivacy Directive come down, and we were kind of in this limbo space. And so I met some privacy professionals and started realizing that this could actually be a truly meaningful career for me, to really dig in on the privacy side of things. And so I started my career over at Grant Thornton, which is a consulting firm, and helped start up that privacy service offering there.
Speaker 2:And this came in, again, 2017 to 2018, as GDPR was going into effect, and then, of course, CCPA and so on. I just loved that. I mean, candidly, we all have to be able to survive and make a living, and so I love that I was able to marry that fundamental need with something that I really love, which is really contemplating the human need for privacy and how we preserve that as we continue to evolve as a civilization, really.
Speaker 3:Follow-up, another one of those interesting intersection and interplay questions. So you've expressed an extremely strong interest in the interplay between emerging technologies and privacy. Blockchain, for example: you worked at Coinbase early on. I know you've expressed some pretty thought-leading ideas around things such as ML and AI as it pertains to training data and privacy also. So what do you see as the most promising or concerning applications in those emerging tech spaces?
Speaker 2:Help me understand a little bit more. Promising concerning applications of.
Speaker 3:Yeah, so in the technology world, there are some things that are very promising as it pertains to privacy and privacy protections, and then there are others that are a bit more detracting from that, if you would, namely adding a net positive effect to the world. And so are there any emerging technologies, whether it be blockchain or AI in particular, that you see having the most concerning applications or the most promising applications?
Speaker 2:Yeah, I honestly don't know. I don't know if there's anything that really, like, sticks out to me. I think there are some, and we talked about this, things that kind of challenge the norms and our expectations of what it means to be privacy protective. And I think one of the things that we continue to see, particularly with privacy-enhancing technologies on the blockchain, is that tension between anonymization and legal obligations, right? And like, how do you contend with that? And is the world really truly built for people to be truly anonymous? I don't know. Those are just, like, philosophical questions, I guess. So nothing really sticks out to me, but those are kind of my general, macro thoughts in this space.
Speaker 3:Sure. So, more specifically, if I were to pick explicitly on one, two things in particular, let's take privacy regulation, maybe GDPR, and then let's take biometric data. Do you see that as having more promising applications in the real world or more concerning applications?
Speaker 2:Yeah, and that's a great question, and it's something I keep coming back to over the course of my career, examples like that. And I'll preface this with, you know, views are my own, and so anything I say is really just my own thoughts and views and nobody else's. I really do think that technology is evolving faster than regulation can keep up, and I think we hear that a lot. A lot of people say that. But what does that really mean? A really good example of this, touching on biometrics, is Tools for Humanity and WorldCoin. Right, users look into this orb, it scans their iris, it generates a unique digital code, and from there they're issued WorldCoin tokens.
Speaker 2:The company behind it argues that the system is anonymous, that no raw biometrics are stored, and that it can't be reverse-engineered to figure out who you are. But under GDPR, the very act of scanning and creating a biometric template is still considered processing personal data. So there's a real clear clash here, where you have this crypto project that's engineered to minimize exposure of personal information, but a regulatory framework that still treats it as traditional data processing. And so it's an unanswered question, really, and there's some push there around how we solve for that. I think that same dynamic exists when we think about AI, when we think about server-side ad tech, and, you know, other parts of this space. I've worked in crypto for a long time now, and sort of transactions being pseudonymous and permanent, that immutability directly colliding with GDPR's right to erasure, right? These are not new things, but they remain challenges, unanswered by regulators.
Speaker 3:Right, right, right right. So let's keep pulling on that thread a little bit more. I've been in InfoSec the better part of my entire adult life, and my lesser than adult life wasn't terribly far from when Bill Clinton signed the larger Computer Security Act, et cetera. I'm dating myself a little bit there, but here's the thing that I'm always torn with.
Speaker 3:I agree that technology moves way faster than the law, but I'm torn between whether that's a good thing or just something we'll have to live with. I tend to lean towards thinking it's a good thing, because one of the things about technology moving so fast is we also leave a lot of technology behind us. For all the things that we know of today that stay with us, that we think of as being ubiquitous, there are tons of protocols and other technologies that, you know, if we had spent the time to legislate and regulate them, it's equally hard to say whether we would have evolved to the technology that we have today. And so how do you balance that? How do you balance the need for technology to move as fast as it does with what I think, anyway, is a requirement for the laws to be interpretable broadly, under some reasonable human's expectation of what it reads to be?
Speaker 2:Yeah, and that's such a great point. I think the issue doesn't necessarily lie in the fact that innovation often comes prior to the regulations that address the new needs it creates, right? Sure, there's this challenge that these laws were written for a world of file cabinets and databases, and now we're in this new world of zero-knowledge proofs and AI and blockchain. What I think this really comes down to, as we think about it from a legal perspective, is that it's not necessarily rewriting the law so much as it's allowing for new interpretations and understanding the spirit of the law, right? So GDPR is the most comprehensive privacy regulation that we continue to have globally, but how do we allow for those standards, those principles, to continue to be upheld but also be applicable in somewhat of a new world order?
Speaker 3:Yeah, that's a great point. I appreciate the allowing for the interpretation to be updated for the world that we're in, versus simply trying to rewrite things, because for as much as the world has changed, a lot of it genuinely is the same under the surface.
Speaker 2:Now, for that to happen, though, Gabe, regulators and those that are interpreting the law need to be very savvy, and this is something I've said before, and knowledgeable on the technology and how it operates. Even with the EU Data Act, right, we see somewhat reductive requirements for very complex organizations, and so I think it does require a level of study. I think the most successful practitioners in these spaces will be the ones that really commit to having a deeper technical understanding that underpins their legal knowledge as well.
Speaker 3:That makes sense. We were talking offline, and I think there is definitely a little bit of friction between the law and the understanding of how the technology works. Right, if I were to make a blanket statement that a biometric model constitutes a privacy risk, I'm of the opinion that absolutely it does. I know that there are different models that have different risk levels of exposure, and there are a few that have zero. But how you get to the model still has this middle ground: you have to train the model, and so while in the process of training the model, you will have something that is exposed to risk. But I do see sometimes where the legislation feels a little too broad. Because if I were to, in this conversation, try to say that, you know, like a linear model or a regression model, you know, a naive Bayes model, I apologize, I know math is...
Speaker 2:I'm bad at math.
Speaker 3:It's OK, for what it's worth. I joke all the time: there are lots of things I'll do in public. Math is not one of them. You want me to get absolutely naked and run through a field? Sure. You want me to do math in public? You got the wrong guy. But there are some models that have no risk. They contain literally no biometric data. But think about the models that people use in their homes every day, to bring it home a little bit more. Voice recognition models are a great example. They're trained on voice recordings, and they create unique fingerprints, such that those embeddings can be reverse-engineered for the purpose of re-identification, because they contain unique voice prints. I don't have an expectation that the law should get that detailed, but again, how do we balance that? Maybe you're right, maybe the answer is more education on the legislation side, but new technologies are really starting to blur the lines of where you can apply that logic. Right, it's not just the file cabinet anymore.
Speaker 2:Yeah, it's a new calculus, right, so and.
Speaker 3:I don't.
Speaker 2:I guess I don't have an answer for this either, but there has, well, not right now on this show, but there has to be a more meaningful debate, discussion, and analysis of these novel technologies and how they process data. And I think, I mean, having been a practitioner for a dozen years here, voice recognition models are a cakewalk.
Speaker 2:We can run that through GDPR, we absolutely can, and we'll get an answer, and we can mitigate the risks, and we can silo the data on the back end, and we can make sure it's only processed for certain reasons. Yes, we can do that. But voice is identifiable, and it's really kind of hard to de-identify that and still have a voice recognition model, right? So that's an example where the processing is so clear, right? And we're entering territory where the processing isn't as clear, right? Like, even when we think about server-side ad tech: we're no longer storing user-level data. It's instantly anonymized. It's never identifiable to begin with.
Speaker 2:However, if we take this sort of stricter understanding of how this is addressed by regulation, if the data passes through your systems, even fleetingly, it's still processing. But is it? Right? How do we deal with that? How do these companies think about this stuff? And I think these are things companies are contending with every day, and they're finding answers. But, you know, some leadership in the space from the regulators, some indication, would also be super, super helpful, and it'll come from somewhere. I think sometimes these things are industry-led, sometimes they're regulator-led. I think here we may see some industry leadership emerge, and it's in their interest, right, to define that, and my hope is that it comes together.
Speaker 1:I want to keep pulling on this real quick, Gabe. So obviously we're pretty well familiar with the concept of privacy by design, and since we're talking about novel technologies, AI, how it's always moving extremely fast: from your point of view, you know, you've been in-house counsel, you've done professional services for firms, how are we able to apply privacy by design to these technologies? They're always evolving. That must be an extremely challenging thing to navigate.
Speaker 2:Yeah, I think so. I think it's: how do we apply the principles of privacy by design to evolving tech? I think the principles themselves hold up; privacy by design itself is, like, built on GDPR principles.
Speaker 3:Mm-hmm.
Speaker 2:And I want to talk about this very practically, because I've been in-house, and I've been at varying types of companies. Some companies will treat it really as a compliance checkbox, and those companies may be in a different growth stage, or they may not be as enmeshed in technology development or product development, and so maybe, candidly, that can be okay for some companies. I'm not going to say that everyone needs to have these holistic, beautiful, well-engineered programs. Privacy by design isn't meant to inhibit innovation. It really is truly the sense of embedding those reviews in the development life cycle, sort of continuously. And so when we think about AI models, this could mean assessing the risk at each stage: training, deployment, fine-tuning. Because what's low risk today may become high risk tomorrow, given evolving and new attack vectors and techniques, right? And then for blockchain, again, it means designing by default, and this is work I've done, with zero-knowledge proofs and pseudonymization and selective disclosure from the outset. It has to really, really be a living process.
Speaker 2:With AI moving as fast as it is, you can't just sign off on a single privacy review and call it done. You really do need ongoing guardrails built into the development lifecycle, so as the tech shifts, privacy safeguards kind of shift with it. I really do see this. It's like a wave, right? There's always some ebb and flow, but there needs to be this kind of vibiness. Can we call it that?
Speaker 3:Especially if we're talking about AI.
Speaker 2:We can definitely talk about vibiness. Yeah, right, that's my legal advice: it vibes.
Speaker 1:Okay, we're just gonna vibe our way right through all of it. And, Sonia, to that same point, that's probably why it's not best to write laws, these state laws in particular, around specific technologies. Like, we don't want to write laws based on technology, right?
Speaker 2:Yeah.
Speaker 1:And I don't know if any states have done that before. I haven't really done my research there, but I think that just makes a good point. That's probably something you would never want to do, just because of how quickly those things change.
Speaker 2:Exactly. It would be stale the day it comes out.
Speaker 1:Yeah, basically.
Speaker 2:I mean, thinking about that a little bit more, and always coming back to sort of this North Star of using a principles-based approach when addressing any of this stuff. We think about this in cyber as well. What are the principles that need to be regulated, right? And it's risk, it's accountability, it's human impact, right?
Speaker 2:So, you know, right now we have lawmakers that always chase the shiny object. Four years ago it was crypto; right now it's regulating AI. And so the second you drive it off the lot, what does it even mean, right? I love my metaphors, and they're always so inaccurate.
Speaker 2:But a principles-based approach allows this legislation, these requirements, to really focus on, regardless of the tool, regardless of the tech, what rights individuals need to retain and what duties companies should bear when they process data: minimization, transparency, all that stuff. And then let regulators apply those principles flexibly to new technologies. I think that's what we're calling for, right? GDPR and CCPA are great regulations. They really touch at the core of those principles. Now the flexibility in the application needs to come, right? It needs to be demonstrated, to kind of indicate to industries that innovation is okay, innovation is allowed, it won't be stifled, it just has to be done responsibly.
Speaker 1:Yeah, agreed. Have you guys heard of that horses-to-cars analogy with privacy law?
Speaker 2:Vaguely, but I don't remember it.
Speaker 1:I can't remember who actually came up with it, but I recently heard of it because of Steve Elkins, who wrote the Minnesota law. Great conversation, actually, he had it with Ron. Ron and Steve had a great conversation, but they were talking about outdated frameworks and how, you know, the analogy kind of highlights several key points about the current state of technology and privacy law, and how you can't write old laws for new technologies, basically, is what I'm kind of talking about. So it's pretty cool, because it's like horses to cars. Back in the day, I think the old story was, and I've probably botched this, that the laws of the horse were rendered almost useless. Cars moved at speeds horses couldn't, didn't need to be fed and rested, that kind of thing. It's a pretty cool analogy for today compared to back then, when horses and cars were in that change, I guess.
Speaker 2:No, I agree, I definitely agree.
Speaker 1:It's pretty cool.
Speaker 3:So, Sonia, congratulations. You've been recognized as a Fellow of Information Privacy, and so what does leadership in the field of privacy mean to you?
Speaker 2:What does leadership in the field of privacy mean to me? I am always humbled when people think that I am a leader in the privacy space. What does it mean to me? It means pushing the envelope a little bit on how everyone thinks about this stuff, and so, to me, it's pushing privacy professionals to think about these things in different ways. As lawyers, I think, in-house lawyers especially, there's this notion of, like, get to yes, right? And I think that's particularly interesting for privacy professionals: like, how do you get to a yes, right? And seasoned in-house privacy professionals have, like, tons of scars from battles on this. But I think it's such a healthy exercise, and it really gets back to the things that we're talking about, right, where you can kind of distill where you can no longer apply your expertise because of the limitations in front of you, right? But it does really mean pushing the envelope, challenging, being a partner to the businesses that you support or the clients that you support, but also challenging them to think about things differently. It means being willing to have hard conversations about what the regulatory activity needs to look like.
Speaker 2:It means, I don't know, I think those are the things it means to me. I just have a lot of fun. I think it's important to have fun when you're doing this work, and that's a big part of who I am and what I do. I was telling my husband about this. He's like, when would you stop working? I was like, I don't think I would. I just really like it.
Speaker 1:It's just fun. So, yeah, right, because it's always ever-changing. I mean, that's the cool thing about this space. And for other, you know, Muslim girls that might want to do what you do someday, getting to see someone like you be successful in the position that you're in, it must be something that's just very, I don't know, rewarding, in the sense that you might have others that look up to you and see, oh my gosh, I could actually do something like this. That's really cool.
Speaker 2:Yeah, it's always weird. I do have, like, younger high school and college-age females that come up and say, like, I want to be like you, and I'm like, what, me? I think you're doing way better than me, and I hope they do surpass what I've done and where I've come to. But yeah, it's always nice to hear. It's important... no, go ahead.
Speaker 1:I was just gonna say it's important in this industry because, you know, the tech industry in general has more males than females, so that's why it's also very important.
Speaker 2:Yeah, and I really, really dug myself a niche here between law and cyber and tech. I really was like, let's just find the hardest places to find a seat. And yeah, representation absolutely is important. Diversity is important. I think we're certainly seeing much more of it as time marches on, but it was hard, I think, in the early years of my career. I was often one of the few, you know, females of color, a minority, and that's slowly changing, for sure. And you deal with challenges all the time, from unconscious biases to outright biases, and you kind of have to just navigate those, and you develop a thick skin. But then you also realize that everything you do, everything I do, hopefully just makes a difference. As long as I continue to practice in this space, I do hope to create space for more people like me and others.
Speaker 3:While we have you here, then, for those folks who do look up to you and may not necessarily have an opportunity to interact with you one-on-one with any regularity: is there one book or resource or podcast, not named Privacy Please...
Speaker 2:Privacy Please, Privacy Please.
Speaker 3:That you'd recommend to someone who wants to better understand privacy and technology.
Speaker 2:Yeah, I am looking at my bookcase right now. The one that always sticks out... well, there are two that stick out to me, and I'm looking at them. One is The Unwanted Gaze by Jeffrey Rosen. It says "the destruction of privacy in America." I like that one, I do. Okay, I have three. Okay, that's one. I think one that every privacy professional should have on their desk is Determann's Field Guide to Privacy. Who doesn't love a field guide?
Speaker 2:It is so useful. It is the easiest reference tool that you can have on your desk, like, if you don't have it, guys, buy it and just put it on your desk.
Speaker 1:Are these graphic novels as well? Are there pictures for people that like pictures?
Speaker 2:Fortunately, no. You're talking to a lawyer.
Speaker 1:This one's a pop-up, oh, it's a pop-up. Okay, I like pop-ups.
Speaker 2:And then the last one is Weapons of Math Destruction. Gabe, you may have heard of that one.
Speaker 1:Yeah, math, I thought we didn't like math.
Speaker 3:We don't, but we like math destruction though, so it's a lot yes.
Speaker 2:I don't know who that's by, but I can tell you I love that one.
Speaker 1:While you're looking that up, Cathy O'Neil. Okay, what do you think is the biggest misconception from tech companies about privacy law?
Speaker 2:I think the biggest one is that it's just a compliance checkbox. And so for, like, early-stage tech companies or, you know, founder-builders, I think the important thing is to think about privacy at the outset. And I can say this: I really do give some credit to Coinbase here on contemplating privacy from the beginning, at least for my tenure there and perhaps beyond that; I haven't been there since, so I can't really speak to it. Privacy is such a core piece of the crypto industry itself, but Coinbase really held onto that as part of their design and their product development, and it sits very neatly. But the reason it does is because it was designed in at the outset, right? And so, as companies continue to build, keeping that in mind, it's so much easier to consider privacy and have it be an enabler for the business when it's thought of as such, right. When you think about it as a compliance factor...
Speaker 2:It almost always comes in as an afterthought, and then it almost always is very difficult to meaningfully implement, right? I mean, Cameron, I know you're at Transcend. Think about all the companies that are looking for these privacy automation solutions, and the biggest pain point is that they haven't had anything up to this point, and we've hit an inflection point where they need something. Right, but at that point it takes so much more engineering work to pull something together versus having built it from the start.
Speaker 2:But that comes from a mindset, right? And the mindset has to be: this is a core part of our fabric. If we're processing personal information, if we want to be viewed as trustworthy, right, forget about privacy, think about trust. Just think about what that means, right? If we want to be a trusted fiduciary for customers, if we want to be a trusted partner for customers, if we want people to trust our product, and with their children, what does that look like? Right, you don't have to use the P word. What does that look like? And that should force founders, force developers, to be embedding those things that we won't mention, by design, right?
Speaker 1:Nail on the head, because, and I'm talking to you, cybersecurity leaders, I'm talking to CISOs and CIOs, people that are making decisions for a lot of larger companies that have privacy underneath them: privacy is an enabler. It is a way to help grow the business, just like you were kind of stating. That's the way you have to see it, especially in today's world.
Speaker 2:Yeah, and I think talking to leadership about that often comes back down to the word trust, right? It's user trust, it's customer expectation. And I mean, we see all these surveys all the time, but when we ask customers what their expectations are when they're using a product, one of the top ones almost always is security and safeguarding of the data that they're sharing with these products, right? And what does that tie back to? It ties back to strong data protection controls and privacy by design.
Speaker 1:What do you think about that, Gabe? You're on that other side.
Speaker 3:I think it's true why we started this podcast many moons ago. The intersection of privacy and security cannot be overstated. You cannot have security without privacy, and you cannot have privacy without security. You can have some semblance of security, you can have some minor controls, right? You can do things like apply identity controls, but that isn't privacy by any stretch of the imagination. But you also cannot have privacy at all without the security controls. If you do not apply confidentiality controls and integrity controls and availability controls, you will not succeed in having privacy. The challenge that I see, as someone who interacts with security leaders frequently, is that they still see these as separate problems. They still see them as distinct and unique and apart from each other, even though they will acknowledge that there's an overlap. I think they've strayed from their roots: that confidentiality, integrity, and availability, agreed upon as the core tenets of security, cannot be stripped out of the DNA of privacy.
Speaker 2:Yeah, I agree. I mean, they're absolutely interdependent.
Speaker 3:Yeah.
Speaker 2:And it's the key, and I think a lot of companies get this right. You have to acknowledge that interdependency and be willing to work across lines. Especially when privacy programs are born out of legal, which is still not abnormal, there has to be that cross-collaboration with the security partners, what I call the arms and legs of privacy. They're the ones that are going to be able to operationalize a lot of this stuff. I'm just writing privacy policies, right?
Speaker 3:No, I'm just kidding. But you know, speaking of writing, you do have quite the reputation as a writer and a thought leader. How do you stay ahead of trends in this space?
Speaker 2:I listen to Privacy Please. No, I do.
Speaker 3:You heard it here first, gentlemen. You heard it here first.
Speaker 2:I do, but I listen to a lot of podcasts. I mean, I don't know, this is like asking somebody how to get in shape, and they tell you diet and exercise. The answer is always going to be the same, and nobody wants to hear it. But I'm subscribed to a ton of newsletters that are privacy-related. I'm part of the IAPP.
Speaker 2:I'm very lucky to have a very broad network of fellow CPOs and privacy professionals. I mean, Cameron saw me at the last conference. I know a lot of people, and so I have a really good pulse, I will say, on what's going on and what is afoot and what could be, both here in the US and abroad. And then, of course, my whole family, my kids, my toddlers, listen to podcasts too, so we all have our things we listen to. On my drives I'll usually have something on helping me stay up to date, not just with privacy but with tech, right, just pure what's-going-on, particularly with the clients that I'm supporting, trying to stay up to speed with that as well. So there's no easy answer. I guess we should say that ChatGPT plays a role here, but I haven't really used it for keeping up with that. Is that a bad thing to say?
Speaker 1:but I mean, it's just the way I can do it right now but it's just another. I mean just like using gemini. It's just an easier way to get research and and obviously you want to question everything you don't want to yeah, and there is value to the long form, right.
Speaker 2:so I think gpt had, or these um LLMs have value, they absolutely do and it's not going away. But I think when it comes to meaningful research and understanding, there is some legwork that should be done. The traditional, the good old gold fashion way where you read that fourth paragraph down on that article, right, you go all the way through.
Speaker 1:You know what? Okay, first of all, two things. I think, like you were mentioning before, you know a lot of people. You're easy to like. When we met, we instantly just had a good vibe, because you have a good sense of humor, which I think also helped you in law school early on and in dealing with law, because you said you had to have thick skin. Having a good sense of humor also helps in a lot of different areas and jobs, but especially in the one you're in, being in law, because, at least from movies and stuff that I've seen, it's like a dog-eat-dog world. I saw Suits. I don't know how realistic that was.
Speaker 1:It's based on my life yes, but I do this all the time. I always forget the second thing that I was going to say, which was the bigger point. What were we just talking about before that? The last thing that you ended on reading traditionally reading, writing, math, I forget. I had a good point. It'll come back to me, but having a brain fart see, I do this to myself all the time I had two thoughts. I had to say the first one and I forget the second one.
Speaker 2:I do the same thing if I don't ask me to repeat something, I just cannot um, let me track back.
Speaker 1:Let me see, see if I can remember. Why don't I do the following?
Speaker 3:So why don't I give you the opportunity that we rarely give guests? Why don't we like to turn the tables and ask us a question? We've interviewed at this point countless numbers of privacy and security professionals.
Speaker 2:Ask away. I talk a lot about what I think is important to see in privacy professionals. I've been in-house and I've worn a ton of different hats in my years of practicing. Curious from your guys' perspective, what stands out to you? I mean, I'm very honored to be here. What makes you see someone and say like this is a person I want to know what they think about. This is someone that we see as our definition of success in the industry. What does that mean to you? What does that look like for?
Speaker 3:you Raw, unfiltered passion. Within the first answer to the first question you give them, it literally just comes spewing out of their very being. They tend to lean in, they start really. You can see it in the body language. The eyes lights up, the shoulders come up and they get a little close to the camera. If you're in person, the same thing too. I have a strong appreciation for it. I have a strong appreciation for it no matter what someone does. But as it pertains to privacy and security in particular, I don't think it's a place to be blasé. It's definitely not a place to be equivocating. You either lean into this thing and you're here for it, or make room for others that are, and the folks we tend to interview. We usually choose them based on that. I will not call any names, vibes, and we've very rarely had to do this, but I can only think of maybe two guests where it's like, wow, that person has less passion than my house cats.
Speaker 1:Hey, don't let the house cat demeanor fool you just because they seem like they're uninterested. That's a good point.
Speaker 3:That's a good point. Yeah, yeah, given the right motivation, they get real interested, don't they?
Speaker 1:Let's be honest, they play with strings.
Speaker 3:Is that open?
Speaker 1:Okay, so I agree with Gabe, definitely passion, because it helps Like someone that has passion and curiosity, like that's why I fell in love with this industry and you know, I came into like cybersecurity and then we kind of molded into privacy and then I fell into privacy in it. I just love the complexity of everything on both sides, when it comes to the law, when it comes to technology. It's just fascinating to me. There's so much and so much. There's so much darkness, there's so much positivity, there's just so much going on back and forth. So anyone like yourself that's passionate about it too, it's, it's so fun to have these conversations and learn where you came from and why you, you know, have that passion as well, and everybody has a different reason for it too. Everybody has a different backstory. It's just, it's very unique and, to be honest, like we're honored to have you here First of all, this is just, we're just two goofballs talking about privacy and security, and I mean you guys, our guests. When we have guests on, that's the real treat for us and, hopefully, for our listeners. Sure For sure.
Speaker 1:Okay, I thought about what I forgot about. What do you think about? All right, we have companies that still use these old school 10,000 page terms and conditions. Is it hurting their company or is it helping their company? Well, maybe that's the wrong question to ask. I think it's hurting it, but why do companies still is it the same mindset of like it's just a compliance checkbox? Like, are those companies that are still outliers? And why do we still have that? Why is that still a thing? Because nobody reads that stuff.
Speaker 2:Yes, and nobody reads privacy policies. Really, I mean, I do all day but Well, naturally.
Speaker 2:Yes, interesting question day, but well, naturally, yes, um, uh, interesting question. I think I have so many, I have so many thoughts on this. Just general space from from, uh, user, my own terms of service user agreement perspective. There's so many different laws that companies have to contend with, and I think there is an element in a level of paperwork that you need to bound customers to when things go wrong. Right, these are our promises, uh, promises from a company to a user and a user to a company. Right, that's the contract. That is the contract that needs to come into place. And so I don't think there'll ever be the death of the 10,000 page user agreement, because, particularly in the US, which is much more litigious than anywhere else in the world, without that there's there could be causes of action. That would just lead to way too much liability for customers and, sorry, for companies, and so it is, to an extent, a protective measure. And then, to the other extent, if we think about the customer, it's kind of a know your rights mechanism to let them know what their rights are and what their rights aren't right and what they're signing up for, and then it's the last step of love it or leave it, and most people don't read it, so they love it and they move on right and then you're bound to arbitration clauses and that's really fun. I hope all the lawyers listening to this just laugh at that. But that's the user agreement side.
Speaker 2:On the privacy notice policy side, I think things are interesting.
Speaker 2:I think after GDPR we kind of saw the shift initially this pendulum swing to having like a lot of global comprehensive privacy policies, like one policy notice to rule them all, one notice to rule them all, and I think we're starting to see more jurisdictional, the startup and more jurisdictional approach. And I think the driving force there from regulators is transparency and clarity to users. And so what I do think we're seeing on the privacy side is regulators being like this doesn't make sense, like I need one for my pick a country I don't know Australia right, we need one that is specific to Australia. We want to know where the DPO is in Australia, we want to know what their rights are particularly, and so we're starting to see more again this breakout of jurisdiction-specific notices or companies that host them, sort of against the IP address rate. So yours will populate and it'll be the specific Australia one or the Japan one or whatever, I think that trend is coming out now.
Speaker 2:Maybe people have seen otherwise, but that's sort of been my observation doing this work and so it's interesting and I do think that is meant to help the end user. I will say there are customers that absolutely do read the privacy notice, particularly when you're dealing with in the financial space, right when, like I mean, I absolutely read it for financial products. So I do think you're seeing more and more of this in the other direction.
Speaker 3:And those are shorter.
Speaker 2:Those are shorter and they are a bit more clear. I think there's much more flexibility with privacy notices versus user agreements with, like, making them the UI friendly yeah, the UX friendly. And making them the ui friendly yeah, the ux friendly. And making them short form, like. I think google does a good job with this for what it's worth. I think other companies do too. I think tesla does a pretty decent job too that was a great take.
Speaker 1:Appreciate that, gabe. You got anything else? Tech privacy wise. Before we get into some of our fun questions, which I'm excited about because we haven't done in a while- obligatory boo down with meta, but otherwise no, I'm uh they have a great.
Speaker 3:No, I'm just kidding, yeah yeah, look, I gotta, I gotta roll it out at least once a show, otherwise my sponsors don't, they don't cash. Yeah, oh, you gotta get a couple boo.
Speaker 1:Um, that's always fun. Well, actually, before we get into the fun questions before and wrap it up, is there anything, sonia, that you want to talk about that we didn't touch on? Before we get into the funnies?
Speaker 2:no, we had a really fun conversation about emerging technologies and regulatory. That was fun for me, hopefully fun for you guys guys?
Speaker 1:No, I don't think so. Anything you want to promote I know you said you're going to Boston next week.
Speaker 2:I'll be in Boston next week at the AI conference, so people should find me. And then I launched my website, which is very beautiful. I designed it myself.
Speaker 3:Yeah, that's nice.
Speaker 2:My architect dreams have fallen short to the web design. Tamaracksolutionscom is my. Oh wait, no, I'm sorry. Wow, I messed it up. Tamarack dot solutions. It's even cooler. Tamarack dot solutions is my site, so that's my little solo shop.
Speaker 1:How do you spell it?
Speaker 2:I just like that font T-A-M-A-R-A-C-K dot solutions.
Speaker 3:Go check them out, listeners, we'll make sure we include that in the show notes, as well as our social outreach as well too. Sonia, it was an absolute pleasure having you on the show. Good luck in Boston. Wait, wait, wait, wait. Oh no, no, I'm not signing off. I'm turning it over to you, my friend, but before you go, we do have some privacy probing questions. Some probing privacy questions. Okay, we do that are camera you pose on guest, if you would, sir.
Speaker 1:I'm going to admit this was created by Gemini, knowing your background and stuff. So if that's weird, sorry, that's fine, but I found these kind of interesting and we're definitely going to use one that we've used with most of our guests, because, that's, we'll go ahead and get that one out of the way. First For your toilet paper. Situation in the house for you and your husband is the toilet paper, is the flap on the top or the bottom? The only answer.
Speaker 3:The only, the only answer. And the rest of those people don't deserve privacy, by the way, if they keep it true I should have said it depends, so I could it could, because what if?
Speaker 1:what if you accidentally did it and you didn't realize it because you were just in a rush?
Speaker 2:then you should fix it, because there should never be that much of a rush yeah, that's true, good point.
Speaker 1:If you had to create a new privacy policy using only emojis, what would that be?
Speaker 2:Let me look at my keyboard.
Speaker 1:Check it out.
Speaker 2:I know what emojis I have.
Speaker 1:That sounds like way too futuristic if we're using emojis for just.
Speaker 3:I mean.
Speaker 2:I would have.
Speaker 3:You're describing Gen Z lawyers, essentially.
Speaker 2:I would have the smiley face with the eyeglasses. I would have the smiley face with the eyeglasses and then I would. No, I would have the smiley face with the the disguise, sunglasses and the mustache, then I would have the melty smiley face, and then I would have the praying hands.
Speaker 1:I like it All, right, okay.
Speaker 2:I'm not so jaded, but that's okay.
Speaker 3:Sonia, give it time.
Speaker 1:You can only use one app for the rest of your life. What app would that be?
Speaker 3:currently Does the off button count as an app, or is that I mean? Is it?
Speaker 2:like based on high utility, Because I really should just say no, no no, it's based on.
Speaker 3:you have the restrictment of one app. That's the only thing. Yeah, your phone will only allow you to use one app for the rest of eternity. Choose wisely.
Speaker 2:I would just use my chat GPT app and develop a parasocial relationship with my that's the right answer. With my LLM and have a lot of other problems. But it's useful it does, because then I can pull everything. If you buy a service, you can then pull everything from the web. So as it really went out, think about it.
Speaker 3:I thought you'd go calculator app, but all right.
Speaker 1:Calculator app, ai can do the math for you.
Speaker 3:That's why Sonja's answer was the better answer. That was the right answer that.
Speaker 1:That makes me okay. So be careful with this, though, because I've seen people I don't know if you guys have seen this on social media, but I can't remember where I saw it but someone was using chat, gpt or gemini to, instead of like the, what was it like? The husband would start using it to get more direct answers, and then the wife started to talk like the gemini, so it would like basically get one over on the ai. Does that make sense? So the wife was jealous that the husband was using ai to have these normal conversations because she's too complicated. And then she started to talk like I don't know, it's very.
Speaker 2:there's all these weird niche things. Yeah, Like romantic relationships with their yeah their models and stuff. I think it's I don't know. I think it's a little strange. I also find it very like it will always tell you that you're right. Great job. Yeah that's a great point, and so it's too. For me as a cynical lawyer. It's way too positive to be right. I'm like. No, I agree, there's no way I'm right all the time right.
Speaker 1:But that's when you know you should always question it, because you can literally it could be right and you can tell it no, that's not right, this is what I meant. And they'll be like oh, I'm so sorry, I missed that. It's too agreeable.
Speaker 2:Yeah, yeah too agreeable. And this is also the debate I know a lot of people say is this going to replace lawyers? I think maybe some of the more rote functionality, like functions or things that are done by lawyers, maybe, but again, legal analysis and the ability to be like it's. So, especially in the privacy and cyberspace, there's so much gray right. There is no one right answer, there's no one way to get to an answer, and I think that's where it's so hard to replace that human element that I mean we talk about this all the time. Privacy has such a deeply human element to the work that's done and I think it's written into the way it's even regulated. When you think about GDPR and like asking about the rights and impacts to humans, human freedoms, right, like that is such a deeply human thing and you really have to think about that as a person, and so I don't think there's really coming away from that. You need people behind that, one will say, and hope.
Speaker 1:Well, thank you, it was such a pleasure to have you on. I know we've kind of gone over time, so thank you so much for joining us and just doing what you do. It's inspiring, and we were just I'm just honored to know you and get to collab with you, so thank you so much for being on.
Speaker 2:Thank you, Cameron, Thank you Gabe.
Speaker 3:Sonia, it was an absolute pleasure. Enjoy Bye.