Privacy Please

S5, E210 - Engineering the Future of Privacy: Bridging Technology, Ethics, and Law with Jake Ottenwaelder

Cameron Ivey

Discover the intricate dance between technology and ethics as Jake Ottenwaelder, principal privacy engineer at Integrative Privacy LLC, takes us into the heart of fractional privacy engineering. Join us for a captivating journey where Jake, pivoting from cybersecurity to privacy engineering, decodes the complexities of modern data protection laws. He artfully bridges the gap between legal mandates and technical implementation, providing a lifeline to organizations navigating the treacherous waters of GDPR compliance and beyond. This episode is a beacon for anyone seeking clarity on the interplay between privacy, technology, and legal frameworks.

As we navigate the subtle distinctions between security and privacy engineering, Jake imparts wisdom on the essence of an engineering mindset in privacy practices. He dissects the rich tapestry of privacy engineering, painting a landscape where ethical data stewardship takes center stage, and integrative privacy solutions redefine how companies interact with consumer data. For those in the security sector eyeing a shift to privacy, Jake offers a roadmap, underscoring process improvement and the potential of privacy automation to transform the industry landscape.

We wrap up with a profound discussion on the broader implications of privacy engineering — from the ethical quandaries in AI to the pivotal role data privacy plays in national security. Jake sheds light on the convoluted challenges faced by organizations, such as data deletion and retention, and advocates for equitable privacy that transcends user sophistication. The episode concludes with a reflection on the future of data economy ownership amidst geopolitical shifts, a conversation that will resonate with anyone invested in the intersection of technology, privacy, and global affairs.

Speaker 1:

The recording button started. That's what someone told me at the gym. It's not true. It's exactly what you look like. So can I use a voice? Can I do the voice? Is that okay?

Speaker 3:

I think they're going to ban us if you start using the voice. We're definitely going to get canceled if we start doing that. No, I'll stay away from that, definitely stay away from that.

Speaker 1:

Ban? Definitely say it, try it, ban us. You know how we feel about that. Alright, well, ladies and gentlemen, welcome to Privacy Please. I'm Cameron Ivey, hanging out with Gabe Gumbs as always, and we have a special guest with us today. Usually we like to let our guests in once we go through the intros and just kind of chat and stuff, but we're going to kick it off right away with him, so we'll go ahead and introduce you now, Jake. Honestly, I haven't even tried to say your last name because I don't want to mess it up, so I'm going to let you tell us how you say it, because that's just how we roll. Jake is a principal privacy engineer at Integrative Privacy LLC, which I know we'll get into a little bit more. But, Jake, before we dive into all that stuff, how are you guys doing? Gabe, how are you doing?

Speaker 3:

Man, I'm solid. I'm here to talk to Jake. How are you doing, Jake?

Speaker 2:

I'm doing pretty well. I'm based out of Seattle. We are not having a lot of rain, we're getting into that spring season, so it's the only time we get the vitamin D and can actually get out. Excited. Everything's going well.

Speaker 3:

Nice. Seasonal depression drops to an all-time relative low.

Speaker 1:

Yep, yep. So, Jake, let's go ahead and dive right in, because I know Gabe was going to get into this. But, Jake, tell the listeners a little bit about yourself and then we'll kick off from there.

Speaker 2:

Yeah, awesome. Thanks, Cam. Hey everybody, my name is Jake Ottenwaelder. Definitely a tough last name, 16 letters or something like that. I am a privacy engineer by trade. I started off in cybersecurity engineering, transitioned myself over time into the privacy engineering field, and was a solo privacy engineer at a couple of different pre-IPO organizations before I ran off and spun up my own practice. There are not very many people doing fractional privacy engineering, if any. There's a lot of fractional DPO services, a lot of fractional CISO services, chief information security officers, and fractional legal support. I'm really just looking to be able to support organizations who might not have the full funding for a desired privacy engineer headcount. I've been working in the space for a couple of years. I had the pleasure of working with Transcend, and Transcend has been a great partner to my organization, but I spun off and created Integrative Privacy, and I've been trying to build this fractional service and just get invested in the community.

Speaker 3:

So you make a good point. I, at least, am certainly unfamiliar with fractional privacy engineers; you'd be the first one I've met. There are lots of other fractional types of, well, everything from fractional marketing officers to you name it. And so the primary question I'm left asking myself is: well, hell, most people don't even have a full privacy engineer, let alone a fractional one. So what problem are you solving for by offering fractional services as a privacy engineer?

Speaker 2:

Yeah, there are a lot of skill sets and conversations that privacy engineers can have that maybe vendors can't have, or legal services, or consulting services.

Speaker 2:

Right, your DPO is going to come in and tell you, hey, you have to comply with this privacy regulation, but they can only go so far and can't actually implement the technology. Or there are always conversation breakdowns between what legal is asking to do and what the engineers are trying to do. As a fractional privacy engineer, you can come in and support some of those point-in-time assessments, support ongoing maintenance, or review requests. Or you come in and just say: hey, legal is telling you that you have to comply with GDPR, and you don't know what that means, or you need to ask somebody what that means, or what it means specifically for your technology stack. And I think what we're seeing, and what I've been seeing personally, is people coming in and trying to put up guardrails that are supposed to help their organization, but they're not specific to the technology stack, not specific to the environment or the context of the organization. That's where I can come in.

Speaker 3:

And on that: you said you were a security engineer prior to being a privacy engineer, and we frequently discuss on this show that you cannot have privacy without security. So that already seems like it scratches an itch for me. But from your perspective, as the person that has been both the security engineer and is now the privacy engineer, could you do the privacy engineering job... Let me rephrase that, because I don't want to set up a scenario where it's a mandatory background, right? But how does having a security engineering background help you as a privacy engineer?

Speaker 2:

Yeah, so I was just thinking about this before. I get so frustrated when people are like, oh, Jake, you still work in security. I'm like, no, I don't work in security, I work in privacy now. So while they're related, they're two sides of the same coin.

Speaker 2:

To me, privacy blends in this ethical consideration. It has a lot more gray area than security. In security, the problem solving is around creative ways to make sure that things are secure, but at the end of the day it's black and white whether something is secure or not, right? Either you're using TLS 1.2 or you're not, kind of thing. With privacy, there's a lot more gray area and nuance, because we're working with human factors, the actual people working with our technology and our implementation. So honestly, I don't think going through security into privacy is the best route for most people. I think it just takes somebody with an engineering mindset, if you're specifically trying to go into privacy engineering: the problem solving, the conceptual understanding it takes to go through a computer science program, to understand how things are built and how to take them apart. I don't know if that fully answers the question.

Speaker 3:

I don't know if there was any right or wrong answer to the question, so yeah, I'm good with that. But say we're in agreement that one doesn't have to be a security engineer. There are lots of security engineers out there that are always looking at what their options are from a future career path perspective. What advice would you give someone who is currently a security engineer, is fascinated by the world of privacy, and maybe wants to become a privacy engineer?

Speaker 2:

Yeah, I mean, for me, my transition started with focusing on security automation and also privacy automation work, so I have a background in process improvement and privacy automation.

Speaker 2:

I think what we're seeing with the industry is that there are so many different fields and so many different specializations within privacy, and even within privacy engineering.

Speaker 2:

I would not consider myself anywhere close to the same privacy engineer as somebody at Google or Apple. Their privacy engineering departments and what they do are very different from the work and specialization that I focus on, which is more small-business privacy engineering work. Security engineering is the same, where you have penetration testing and AppSec and scanning tools and all these different specializations you can focus on. So the big thing is figuring out: where do I want to play? And again, it comes down to the regulations and stuff too. So start off with the regulations and see how those mesh with your thought process. Is this something that comes easily, that makes sense? Because I think when your morals and ethics are aligned with privacy, it's a lot easier to work in this space.

Speaker 3:

Does that require organizational morals or individual morals, right? Sometimes we talk about the fact that companies are entities unto themselves. Some might consider them not exactly people, so it's hard to say that a company, quote, has ethics. But I think we could all point to some companies and be like, ah, that one has no ethics, that one has some ethics, and this one exudes a lot of ethics. Is there a necessity for the business itself to demonstrate some form of ethical standards?

Speaker 1:

Or do they?

Speaker 3:

Do they just need to comply?

Speaker 2:

I mean, I wouldn't enjoy it, just as a worker. Whether you're a privacy engineer or somebody from sales, engineering, marketing, business dev, I feel like you should have some...

Speaker 3:

My ethics are tingling. Are you attacking the salespeople?

Speaker 2:

No, no, no, not attacking the salespeople. I'm just saying salespeople have ethics too. They have souls. The legal department as well, I'll throw them in. But no, I think you have to have some alignment with the organization that you support. And, not to get too theoretical: if I say, okay, we need to delete data that's 14 years old, that we don't use anymore, like, why are we holding on to this, and that push, that question, requires a lot of scrutiny, then it's probably not the same people that I want leading my potential career, right? So there's some alignment there of: let's all have some North Star that we can focus on. But I think it is somewhat the privacy engineer's responsibility, or the privacy manager's, whoever you want to say, to start pushing that vision and getting those values adopted into the core of the company.

Speaker 1:

What's another one? Oh, you have another one, Gabe?

Speaker 3:

Last one, last one, go, go, go.

Speaker 1:

Last one. Go. I'll wait, I'll just sit back. Yeah, I apologize.

Speaker 3:

It's just fascinating. We've never had a full-fledged privacy engineer on, so let's dig in, let's dig in. So: you've got me on your couch, you're doing your diagnostics, I'm telling you all about how my privacy was violated from the time I came out of the womb, completely stark naked, and you're just sizing me up and going through everything. What is the problem? When someone is in front of you as part of the fractional services, as you help many different organizations, what's that problem, the one where they sit on the couch and you go: okay, that's the one we should start with first? That's always the place where we start. We go right back to that time when your mom told you no.

Speaker 2:

I think that's the main issue, and the reason why I've started Integrative Privacy, and the methodologies that I've developed, are to drive this better cohesion of: what are we doing here, what does privacy mean to this organization, and how do we live by what is right for the customer and what is right for our data subjects, the individuals whose data we're processing on their behalf? And to pinpoint it even further, it's probably this understanding that companies think they own the data that they're collecting, when in reality they're just stewards of other people's data. Ethically, this is data that is me as a person, this is my identity, this is who I am. No organization or company should ever own that part of me, in my opinion. And organizations generally, when they become more predatory or less ethical, have this idea that they own data, or own parts of my interaction or my lived experience, and that's where I think we have to stop that pretty quickly and reshape the conversation.

Speaker 3:

I completely lied, Cameron. There was still one more hidden in there, but it's Jake's fault. Jake did this. You said something about a methodology, so you've developed a methodology to address this. So, two questions there. A: is it something where you can share parts of it with the audience on this show, such that, you know, I like to leave people with concrete takeaways that they can use, or anything you've published on the topic? And then the second is, published or unpublished, can you tell us more about it, man?

Speaker 2:

Yeah, yeah. So I just posted, or I'm working with IAPP, or hopefully, if somebody from there hears this, let's get this published on IAPP. Come on, IAPP.

Speaker 3:

I know we got at least a dozen, if not more, IAPP listeners out there, certainly within the ranks. Can we just contact Jake already and make this thing happen?

Speaker 2:

So, Integrative Privacy. Throughout my experience, I feel everybody starts off with this basic understanding of: let's implement privacy by design. That's the first goal that organizations have. I've gone into companies, again pre-IPO, and they're sitting there like: okay, you're coming in, you know how to implement privacy by design, we need you to do it. And I sit there and I'm like: you don't even know what you're asking for. Privacy by design gives you principles you can strive for, but there are no concrete implementation steps. How do you even get to that position? And if you just jump the gun and try to implement some of the stuff that privacy by design is asking for, you're going to have a very hollow implementation, which can give you a false sense of security when it comes to compliance risk. So throughout the past couple of months, starting my own practice, I've developed four key building blocks that organizations should focus on. I try to keep it simple. It's analyze your surroundings, identify your data, acknowledge your risks, and enable your champions.

Speaker 3:

Slow it down a little bit, I apologize, and then expand on each of those a little, if you would too, please.

Speaker 2:

Yep, sure. So, analyze your surroundings: uncover the context behind why you're processing data. What are the internal or external factors around processing data, and what are you trying to do, what are you trying to accomplish? A lot of this comes down to business objectives: what is the goal behind what you're doing with data? Then, identify your data: again, this is just documenting and understanding what data elements you have.

Speaker 2:

Privacy by design talks about minimizing your data, but you can't start the conversations about minimizing your data until you know what data you're collecting. What are you collecting, how is it being processed, and where is it going? Third is acknowledging your risks. I think it takes a maturity assessment, but not just a maturity assessment; it takes a technical gap analysis, and that's something that, as a privacy engineer, I can focus on and provide to organizations. A technical gap analysis looks at not just "are we doing double opt-in," but where that actually comes into the code base and what that means, and it acknowledges the risks we have so that the organization can make risk-based decisions to accept, reject, or mitigate those risks. And then the final piece is enable your champions.

Speaker 2:

So in this entire process, we want to make sure that we're developing human-centered organizational processes. We want to enable other individuals within the company to have more of a stake in how their department or their team manages privacy. We want to have individualized conversations with marketing. Just doing a privacy training once a year is never going to satisfy this requirement. You have to sit down and say: Marketing, when you drop a cookie, this is what I need you to do, and this is how we develop that process so that it is as least burdensome for you as possible.

Speaker 2:

And a big key to this, for me, is sitting down and learning what people are doing within the organization and where that intersects with privacy. Not just going in and saying, hey, you need to do this. We sit down and say: what do you do here, what makes you enjoy this work, and how can I insert privacy into what you're doing, so that they're enabled to enjoy and want to do privacy work. And once you have that kind of buy-in, developing more processes or adding additional checks becomes a lot easier; I've just found everybody's much more willing to give that support. So those are the four methodologies that I take into technical implementations and consulting work, looking again at gap analysis and risk analysis. This is what I've developed.
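For readers who want to see where a check like "are we doing double opt-in" actually lands in a code base, here is a minimal sketch. All names (the in-memory store, the mailer, the confirmation URL) are hypothetical stand-ins, not any specific vendor's API; the point is that the auditable consent record is only created when the confirmation token comes back.

```ts
import { randomBytes } from "node:crypto";

type PendingSignup = { email: string; expiresAt: number };

const pending = new Map<string, PendingSignup>(); // token -> signup intent
const confirmed = new Set<string>();              // active, consented subscribers

// Stand-in for a real transactional mailer.
async function sendMail(to: string, body: string): Promise<void> {
  console.log(`mail to ${to}: ${body}`);
}

// Step 1: record intent only. No marketing consent exists yet.
export async function requestSignup(email: string): Promise<void> {
  const token = randomBytes(16).toString("hex");
  pending.set(token, { email, expiresAt: Date.now() + 24 * 60 * 60 * 1000 });
  await sendMail(email, `Confirm: https://example.test/confirm?t=${token}`);
}

// Step 2: only this click creates the consent record a gap analysis
// would go looking for in the code base.
export function confirmSignup(token: string): boolean {
  const signup = pending.get(token);
  pending.delete(token); // single use
  if (!signup || signup.expiresAt < Date.now()) return false;
  confirmed.add(signup.email); // the auditable double-opt-in event
  return true;
}
```

The gap-analysis question "where does that come into the code base" has a one-line answer in this sketch: consent exists only where confirmSignup writes it, which is the kind of pinpointing Jake describes.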

Speaker 1:

What do you think the "why" is now, not only for you but for privacy engineering, and why are we seeing it more? Obviously there's a lack of that role. But yeah, why are we seeing it now more than ever, you think?

Speaker 2:

It's a good question.

Speaker 1:

I mean, it could just be your personal opinion. Obviously something drove you toward wanting to take this path. Did something happen to you in your past when it came to a privacy violation, or did something trigger that passion for you to dive into this, or was it more of an "I see an opportunity here"?

Speaker 2:

You know, my internet is having some trouble, so sorry about that.

Speaker 2:

I got the question, though. Yeah, for me, I started off in college looking at cybersecurity engineering. I was in a specific honors program for cybersecurity, and I was sitting around going: everybody around me is so much smarter than I am. I cannot sit and code with an engineer or go line for line with any of these computer science guys. I need to find a different way to differentiate myself. I did end up going through and doing my computer science minor and focused more on business implementation and stuff. So even from an early time, I focused a lot on that gap.

Speaker 2:

I think there's a massive gap between non-technical requirements and actually technically implementing them, and I don't think we've seen a larger gap; the gap in privacy is larger than it is in security. I think security is slowly starting to figure it out, and they have different specializations that help out. There are no privacy engineering managers the way there are security engineering managers. I have a great mentor who is an amazing engineering manager and knows privacy, but is maybe not detailed into privacy engineering management. So we're still growing the field.

Speaker 2:

But for me it comes down to: there's such a gap, and there's such a false sense of security when it comes to privacy work in general. People implement various technology solutions, like OneTrust, that they just don't understand, or they throw it on their website and then just leave it. That's not how to implement privacy solutions. And so the lawyers are saying: well, we told engineers to do it this way, and we thought that we did it, but we don't have any ability to actually check. And I think that, again, bridging those gaps is where privacy engineering is growing.

Speaker 1:

Companies using the cheaper tools to show the vision of having privacy protocols in place to protect them, but also not having to really invest in a privacy team: do you see that a lot? Do you think that's pretty common today still?

Speaker 2:

Yeah, funding and headcount are still a big challenge. I think as more regulations come out, we're going to see larger funding for headcounts and people. But it's a very specialized role, which warrants a larger salary, and I think every organization needs somebody liaisoning those conversations, and people aren't able to afford a full six-figure salary for a privacy engineer. That's where the fractional support can be very beneficial: you get the expertise of somebody who's worked with 10 to 15 companies over their experience and can come in and know exactly how to solve the problems.

Speaker 3:

How many people are taking shortcuts? I presume some people would at least explore them, right? Because shortcuts aren't inherently bad; by design, they're there to help us get someplace faster. But what are some of the naughty shortcuts you see in privacy engineering that you would warn people not to be tempted by? Like, if you do it this way, you know it's tempting.

Speaker 3:

For example, buy some cheap, shitty tool, slap it on your website, it's like, look at me, right? That's an obvious shortcut that should be avoided. But what else?

Speaker 2:

Yeah, I mean, even that tool. The biggest thing is that I know of so many implementations where a cookie consent solution, whether it's any of what I consider the first-gen tools, like OneTrust or Usercentrics, the ones we're so familiar with, is not integrating into Google Tag Manager. That's one of the biggest issues, because you're going to block some things but not everything.
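As a rough illustration of the integration Jake is pointing at, here is a minimal sketch using Google's Consent Mode calls, which Google Tag Manager understands. The CMP callback name is hypothetical, since the real hook depends on the vendor (OneTrust, Usercentrics, and so on).

```ts
// Consent Mode: GTM-fired tags are gated on this state rather than
// firing regardless of what the cookie banner says.
const w = window as unknown as { dataLayer: unknown[] };
w.dataLayer = w.dataLayer || [];
function gtag(...args: unknown[]): void {
  // gtag.js expects the Arguments object itself, not a copied array.
  // eslint-disable-next-line prefer-rest-params
  w.dataLayer.push(arguments);
}

// 1) Restrictive defaults, set BEFORE the GTM container script loads.
gtag("consent", "default", {
  ad_storage: "denied",
  ad_user_data: "denied",
  ad_personalization: "denied",
  analytics_storage: "denied",
});

// 2) Hypothetical CMP callback: on opt-in, update consent so GTM's
//    built-in consent checks let the gated tags fire.
function onCmpConsentChanged(choices: { analytics: boolean; ads: boolean }): void {
  gtag("consent", "update", {
    analytics_storage: choices.analytics ? "granted" : "denied",
    ad_storage: choices.ads ? "granted" : "denied",
    ad_user_data: choices.ads ? "granted" : "denied",
    ad_personalization: choices.ads ? "granted" : "denied",
  });
}
```

The failure mode Jake describes is skipping one of the two halves: the banner records a choice, but nothing in GTM ever reads it, so some tags keep firing anyway.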

Speaker 2:

That's the biggest red flag that I see, and I have experienced, or have known, that there are a lot of organizations out there that just don't do it well or don't do it at all, and that's really scary. Other challenges: the whole data deletion requirement. Doing that right is very challenging. It takes a massive amount of investment. Like you said, there are a lot of shortcuts, and I think there are a lot of people who just put Band-Aids on the solution, a makeup-on-the-pig kind of situation. I don't know many organizations that fully do it the best that I think it's intended to be done.

Speaker 3:

The number of shortcuts in consent is way too high. And shortcuts in deletion, that sounds wicked problematic. What do you mean by that?

Speaker 2:

Let's dig into it. Yeah, the challenge is, and we can get down into legal designations and stuff, but there is a certain sphere of understanding, right? If I'm interacting with this website, say I want my data deleted from Ticketmaster, I need them to delete certain information about me from different platforms. But how do they delete information used for advertising, right? If they go to Facebook's advertising portal and try to, a lot of the time they can't.

Speaker 2:

Facebook doesn't have great deletion processes, and neither do a lot of other organizations. The fruit's on the ground.

Speaker 3:

And it's been run over. I mean, I just use them as an example, but yeah. Fucking Facebook.

Speaker 2:

Yeah, so it's a massive issue when you look at not just the first-party processors but those second and third parties that are still going down that pipeline.

Speaker 2:

And then you also think about, and I don't think many organizations are worried about this, the linear nature of data: deleting data from a hierarchy perspective. For instance, if you're doing an employee deletion, you delete data from, say, Paylocity. Where does that then flow? How does the data get deleted lower down on some of these other HR systems? We need to delete data in specific orders, because if we don't delete Paylocity and we delete your benefits data first, Paylocity might come in and re-push that. So there are issues around the timings of data deletions, and then even auditing data deletions. I don't know of any good tools that really do great auditing of deletion processes, or that go in and continue to monitor to make sure that data stays deleted.
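Here is a sketch of the ordering problem Jake describes, with hypothetical interfaces rather than any vendor's API: systems of record get wiped before the systems they sync into, so an upstream source can't re-push data that was already deleted downstream, and an audit pass re-checks everything afterward.

```ts
interface SystemClient {
  name: string;
  feeds: string[]; // names of systems this one syncs data INTO
  deleteSubject(subjectId: string): Promise<void>;
  hasSubject(subjectId: string): Promise<boolean>;
}

// Kahn's algorithm over the sync graph: a system is emitted only after
// everything that feeds it has been emitted, so sources come first.
function deletionOrder(systems: SystemClient[]): SystemClient[] {
  const byName = new Map(systems.map((s) => [s.name, s]));
  const indegree = new Map(systems.map((s) => [s.name, 0]));
  for (const s of systems)
    for (const child of s.feeds)
      if (indegree.has(child)) indegree.set(child, indegree.get(child)! + 1);
  const queue = systems.filter((s) => indegree.get(s.name) === 0);
  const ordered: SystemClient[] = [];
  while (queue.length > 0) {
    const s = queue.shift()!;
    ordered.push(s);
    for (const child of s.feeds) {
      if (!indegree.has(child)) continue;
      const d = indegree.get(child)! - 1;
      indegree.set(child, d);
      if (d === 0) queue.push(byName.get(child)!);
    }
  }
  for (const s of systems) if (!ordered.includes(s)) ordered.push(s); // cycle fallback
  return ordered;
}

// Delete in order, then audit: nothing should have re-synced.
async function deleteEverywhere(systems: SystemClient[], subjectId: string): Promise<void> {
  for (const sys of deletionOrder(systems)) await sys.deleteSubject(subjectId);
  for (const sys of systems)
    if (await sys.hasSubject(subjectId))
      console.warn(`${sys.name} still holds ${subjectId}; re-queue deletion`);
}
```

In Jake's example, the HRIS would be declared as something like { name: "paylocity", feeds: ["benefits"] }, so it is wiped before the benefits platform it pushes into; the audit loop is a stand-in for the ongoing deletion monitoring he says good tools don't yet do.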

Speaker 3:

So that's all manual, right? Most of those processes?

Speaker 2:

Yeah, a lot of it's still manual, and I'm not a huge fan of it. I understand why it's required, but I think it's a Band-Aid on the problem. We need stricter data retention policies and better public auditing of data retention, to understand how long data is being maintained for, because that's the larger culprit, right? How long are companies storing and maintaining data?

Speaker 2:

I don't think anybody should need to submit a DSAR request. DSAR requests and deletion requests are only helping so much, I would say. And this is a great study that I want to work on; I'm talking with Carnegie Mellon about it: what are the distributions of demographics of people who are doing deletion requests? If we're trying to make privacy equitable for everybody, we can't have a solution that's only benefiting people who are better educated about their rights or more technologically savvy. It needs to be a solution that helps everybody, and for me, that just means you have to have better deletion and better retention. Sorry, I know that was a bit all over the place.

Speaker 3:

No, no, you did not. I'd say the "all over the place" is a consequence of the fact that there are a lot of places where shortcuts can be taken, and a lot of places where people are currently taking shortcuts.

Speaker 1:

Oh, that's good. Let me see if I can quote that for you in a nice snapshot, Gabe. Let's talk about, since we're talking about jumping all over the place: with the rise of AI and machine learning technologies and things like that today, how do you think privacy engineering can, from your perspective, be leveraged to help ensure responsible and ethical use of data, if you've ever thought about it?

Speaker 2:

I mean, I feel like you should always have a bingo board for privacy podcasts, and the free space is always "AI will come up." But from a privacy engineer's perspective... I find it very funny, all the talk around AI and gen AI and stuff. I know I'm asking IAPP to publish some stuff, but they just put out a certification for AI, the AIGP certification and stuff. It's like, I don't think anybody's really an expert in AI right now. I think to claim that, or to focus on that, is not genuine.

Speaker 2:

I think there's a lot of value there, but the AI problem stems from privacy, right? From a privacy engineering perspective, for me, nothing's changed with AI. When it comes to machine learning models and generative AI, I focus on: what is the data that's being collected, do they have consent for collecting that data, and what has the de-identification process been for that information? It all comes down, again, to de-identification and anonymization, which we've had ongoing debates and discussions around, and I don't think anybody can settle them clearly.

Speaker 2:

I think the consensus is that there is no consensus, and that we're going to continue to barrel down this path of non-committal implementations of anonymization. Because of that, you'll see a lot more privacy risk coming with these AI models. So yeah, for me, nothing has really changed. I consider the problem the same as I would consider implementing an API: who's getting access, how secure is it, how are we protecting against it being maliciously leveraged, and how are we protecting sensitive populations and sensitive data within it?
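To make the "nothing's changed" point concrete, here is a hedged sketch of the kind of gate a privacy engineer might put in front of a training pipeline; the field names are hypothetical. Note this is pseudonymization, not anonymization: with the key, or with enough quasi-identifiers, re-identification stays possible, which is exactly the no-consensus problem Jake raises.

```ts
import { createHmac } from "node:crypto";

interface RawEvent {
  userId: string;
  email: string;   // direct identifier: never leaves this boundary
  country: string; // quasi-identifier: keep it coarse
  action: string;
}

// Keep the key out of the ML environment; whoever holds it can reverse
// the mapping, which is why this is only pseudonymization.
const PSEUDONYM_KEY = process.env.PSEUDONYM_KEY ?? "rotate-me";

// Keyed hash: a stable join token that never exposes the raw identifier.
function pseudonymize(id: string): string {
  return createHmac("sha256", PSEUDONYM_KEY).update(id).digest("hex").slice(0, 16);
}

function toTrainingRecord(
  e: RawEvent,
  hasTrainingConsent: (userId: string) => boolean,
): { subject: string; country: string; action: string } | null {
  if (!hasTrainingConsent(e.userId)) return null; // no consent, no training data
  return {
    subject: pseudonymize(e.userId),
    country: e.country,
    action: e.action,
    // email and all other direct identifiers are dropped entirely
  };
}
```

The consent gate and the dropped identifiers are two of the questions Jake lists; the keyed hash is where the de-identification debate he mentions actually lives.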

Speaker 1:

And in your opinion, what do you think are some of the most pressing privacy challenges for organizations over the next few years? What do you foresee, what do you hope to see happen, and how can privacy engineering, or something like Integrative Privacy, help address those?

Speaker 2:

Yeah, I think we're dealing with a lot of tech debt, technical debt, which, specifically when we talk about it in privacy, looks like this:

Speaker 2:

You implement a solution and you get it out the door quickly, but you don't go back and fix any of the privacy concerns, so you're sitting around with a lot of debt that needs to be reconciled when you're trying to go public or you need to fix known vulnerabilities.

Speaker 2:

So I think that's the main issue, and I think we're going to continue to see challenges with funding. We're going to continue to see issues with tech debt needing to be resolved over the next couple of years, and that will require a lot more work and more investment from the organization after the fact than if you had come in the first time and thought about it holistically: done a gap analysis, done a technical assessment, or worked with Integrative Privacy on implementing your solution, so that you know where all the things are early on, acknowledging your risks and focusing on what you need to resolve, and not having skeletons in the closet.

Speaker 1:

Yeah, so there are two topics I want to bring up. Gabe, I'm going to let you introduce this one, because I want to get your take on it, Jake. Gabe and I touched on it a little bit. Well, no, we didn't get a chance to, actually; we did a little short broadcast on it. But the Microsoft crisis, have you heard of this yet, Jake?

Speaker 2:

I don't think so.

Speaker 3:

The Government Accountability Board that gave them a very, very negative security review, if you would.

Speaker 3:

Basically stating that Microsoft has systemically, and as an organization, deprioritized security. Which, in the context of our conversations on this show, where we always talk about that intersection of security and privacy, one of the things lost in that statement is that they are legitimately jeopardizing the privacy of untold numbers of people too. But ultimately, their security practices, and the way they have monetized security such that it becomes more difficult for others to implement security solutions, is being called out. I would equally say the same of their privacy practices. Microsoft has nary a smattering of privacy tools, right? They just don't have that many, and I don't think there's any particular reason why, other than they haven't prioritized it. I mean, they're Microsoft. If they want to prioritize security, they can, they will, and it will happen.

Speaker 2:

Yeah, all I'll jump in with is: I completely agree. With large organizations, if they want it to happen, it'll happen. From a privacy and security side, privacy and security are never revenue-driving ventures, right? It's more of a cost-savings kind of measure, and unfortunately, with a lot of the large oligopoly, I would say, of technology providers and technology developers in the US, we don't see a lot of the financial or business risk that would come with these kinds of decisions. I think that speaks to a requirement, or the desire, for broad privacy regulations or security regulations that actually have some teeth to them, making sure this is not just a one-time thing, and developing a solution that would actually drive some change.

Speaker 1:

Cool. And the other one I wanted to talk about was obviously the most recent thing about the Biden administration signing off on the TikTok ban. The memes will be around, though.

Speaker 3:

They will survive well after TikTok. If you require any additional cat memes, Jake, I got you.

Speaker 2:

That's true. Thank you, I'll reach out.

Speaker 1:

I mean, the format is already on Instagram, so it's not like you won't have that type of platform to use. But I want to dig in. Obviously this ban is going to happen, but what's the overall goal they're trying to accomplish? Just to ban TikTok because they don't like the owners of TikTok, ByteDance? That doesn't solve our problem, right? All social media is still going to have the same problems when it comes to teenagers' and young kids' safety and privacy, along with organizations and individuals. It's a whole mess. It's not really going to change much. What do you guys think about the whole situation?

Speaker 2:

I was trying to look somebody up. I'll share it in a second.

Speaker 3:

I don't think the US government was solving for a strictly privacy problem.

Speaker 3:

They were solving more for a "who is invading the privacy." I say without any hesitation or doubt that our government doesn't have a problem with spying on its citizens. It never has. Any suggestion to the contrary is a gross misunderstanding of history at best; at worst, it is an intentional distraction from the obvious. So none of this has anything to do with whether or not our privacy was being violated as American citizens. It was all about who is violating that privacy and what they're able to accomplish with that platform.

Speaker 3:

There's the obvious privacy part of it, but there's also just driving behavioral change, larger behavioral change. At the top of the show we joked about things like seasonal depression, but when that behavioral change is simply driving opinions and moods, et cetera, such that come election time everyone is feeling general apathy and maybe they don't go out and vote, kind of thing. I think this is larger than just privacy. This is about control. This is about mind control. This is the part of the show where you all break out your tinfoil hats and I tell you that they're not necessary, because none of this is that far-fetched. In fact, it's all well-documented.

Speaker 1:

What do you mean? Now I feel like we should get tinfoil hats and start putting them on. What are we doing, fedoras to pirate hats?

Speaker 2:

They're all made out of tinfoil. I mean, Gabe, I think you're spot on. That's what I was looking up as well. There was a Greek economist, Yanis, and I'm going to butcher the last name.

Speaker 3:

It took me long enough to figure out how to say Yanis.

Speaker 2:

He wrote a book called Technofeudalism: What Killed Capitalism. And if we get into this conversation, he said exactly what you're talking about, Gabe. It's about mind control, it's about who owns the data, and it's about who's able to make the decisions. And you said the same thing as well: whether it's TikTok or Instagram or Google. Personally, I am more afraid of Facebook having all of my data, because they are in the US and can probably affect my life more than a country on the other side of the world can with the data they're collecting.

Speaker 3:

What if I pose a different argument when we think about who owns the data? Because I, by and large, fully agree with you. Anyone who's listened to more than half an episode of this show knows that there's no love lost between me and the book of faces. They can burn to the ground tomorrow night, and I would be right there with a big pile of marshmallows, handing them out to everyone. But we, at least in theory, have levers of control over that US-based entity. We have zero levers of control over a non-US-based entity, especially one where our international relationships ebb and flow, right? So I don't disagree with you on the surface, but I'd rather none of them exist. I'm just not certain that I'm more comfortable with the American company versus the foreign company; the foreign company owning it, I see the real danger in, right? We have just far fewer levers of control. That's concerning.

Speaker 2:

I agree with you, but my statement is: what's the impact on me personally?

Speaker 2:

Right. Like, if, let's say, the book of faces is collecting all my data.

Speaker 2:

We'll put the tinfoil hat on: okay, maybe the US government is also getting that same information now, right, because the government requires companies operating in the US to abide by certain requirements for subpoenas and stuff. So if we're looking at what is more likely to infringe on my personal rights, to privacy or whatnot: yes, we have fewer levers of control for a company organized in China, but what are they going to feasibly do, unless I go to China, or to any of these other nations that the US doesn't have great relations with? I'm more concerned about what they are going to do with that data. And TikTok sends me some great opportunities to buy cool decorations for my house. That's awesome. I don't want the US government to pull my data from Facebook.

Speaker 3:

I don't disagree, and I think this is where the real danger is: the danger is harder to see, especially at an individual level. It might be easier for me to paint a picture at a larger nation-state level. These are things that, for example, Facebook even did, where they conducted large-scale experiments as to whether or not they could change people's moods and behaviors through what they put on their feed. So we know that those same types of strategies are desired to be employed by our international adversaries. The direct impact becomes: I use that information to make more of the American populace generally apathetic, more depressive, such that they don't get involved in their own well-being. That has a direct impact. I think your point is a good one, though, which is that it is difficult to see the direct one-to-one impact that something like that has. But I would be mindful to highlight some of the bigger-picture problems, such as: I can direct more people toward the type of commerce that intentionally favors my country and affects yours negatively, right?

Speaker 3:

It's the long game. Let's make no mistake about it: TikTok is not about the short game. Facebook, arguably, was more about the short game in the beginning, but things like TikTok are the super long game, right? I mean, we will still... we're in agreement?

Speaker 2:

Oh, we are in agreement.

Speaker 2:

I feel like all these US companies are doing the exact same thing anyway, whether it's driving apathy or driving commerce. So again, if we're concerned about driving commerce or creating apathy, that's been proven to happen at the US companies too. So when we're talking about levers of control, do we really have full levers of control?

Speaker 3:

That's why I said theoretically. Exactly: we don't control these companies in the way that we want to. Theoretically, we do have those controls, and you'd be right that they are not being exercised in any meaningful way, except for this time, when we magically signed a bill like that. Which again goes back to the "nothing can be done about it" line. What's that bullshit? It absolutely can be done. It just requires a little more than the stroke of that pen.

Speaker 2:

Yeah. So, going all the way back to privacy: I see the goal behind this as really being about who owns the next stage of the economy. I think the next stage of the economy is going to be around the data economy, and who owns that, who controls that. Right now, China is doing a great job with that, and the US is not happy, and they want to maintain that control. And so that's where we're seeing, in my opinion, this push for, as you said, the TikTok ban. I don't think TikTok will end up selling the organization. If I'm going to put on, instead of a tinfoil hat, I'll bring out a crystal ball: I don't think TikTok's going to sell. I think it's going to take a couple of years for this to actually go through, even if it is signed in, and it's extremely unpopular.

Speaker 3:

So we'll see how that plays out, right? What is likely? Because I have yet to understand what the, quote, ban means. Ban what, right? I don't have it on my phone, but I presume everyone can still log into it.

Speaker 1:

Yeah, it must be. Is it a chess move? Don't you guys think it's a chess move, like the US is trying to see what they can do?

Speaker 2:

So if we go all the way back into the history of TikTok: we look at Vine, which was one of the first platforms to do short-form content, and then we look at Musical.ly, which was bought out and became the grounds for TikTok. Another platform is just going to come up in its place. I think you could just copy the same source code, change the parent company to a different Chinese parent company, and they could still operate the tool. And again, they've given them, I don't know when the timeline is, but they have a couple of months to actually sell the organization. But I don't think that's going to happen.

Speaker 1:

I think it was, like, 90 days or something, if I have that right. But listen, hey, we all know the Earth is flat, the moon landing was staged and, apparently, the moon is flat.

Speaker 3:

That's why, during the eclipse, you saw it happening. It was right there, everyone looked up, and the moon is flat. Don't be fooled. You've never seen a round moon. When you look up at night, you haven't seen a round moon. Get out. Yeah, I didn't go blind. It's perfectly flat. I mean, it's circular, but it's perfectly flat.

Speaker 1:

But hey, Bigfoot's real.

Speaker 2:

Hey, I'm right up in the Pacific Northwest. I went on a family vacation over to Forks, where they set the Twilight movies and stuff. Nothing from the Twilight movies was ever filmed out in Washington, but the entire little town is all focused on Twilight. And then I met a guy who ran the Bigfoot store, who said he lived as a Bigfoot for three months.

Speaker 3:

The judges will allow it. Superior human race? Absolutely. A superior sapien race to us, said he. He lived as a Bigfoot for three months.

Speaker 1:

So yeah, my theory is, I actually believe more in shapeshifters than I do Bigfoot, skinwalkers and stuff. Yeah, I could see it. But I'm saying, like, maybe Bigfoot was a shapeshifter. The judges will allow it also, absolutely.

Speaker 2:

Hey, I'm firmly in the camp of aliens having been here for a while. It's silly to think not, and I'm excited for when we get to hang out with them and stuff.

Speaker 1:

Oh man, they're among us humans right now, the weird ones. There's just too many weird people in this world. I mean, we've seen Men in Black. I believe that.

Speaker 3:

I mean, we've seen Men in Black. I believe it. I've seen people from Florida.

Speaker 1:

So yeah. Well, Jake, I'm going to ask you one fun question. Well, first of all, before we end things here, is there anything that you wanted to bring up that we didn't? I always like to end with that question. Anything we didn't talk about that you might want to talk about or promote, anything for your business or what you're trying to do, or just anything you want to shout out to all the listeners?

Speaker 2:

I mean, I'm very humbled to be sitting in this virtual room with you two, super honored to be invited on. Well, I kind of twisted Cam's arm anyway, but yeah, I really appreciate you guys taking the time to chat with me.

Speaker 3:

This show is all about you.

Speaker 1:

We appreciate you coming on, indeed. And I appreciate the compliment, but I will reflect it right back.

Speaker 2:

Hopefully there are some resources or links that'll come out of this.

Speaker 3:

I realize, as we're talking, my internet's been bad. Sorry about that.

Speaker 2:

It's all good. Yeah, if anybody wants to learn more about privacy engineering or Integrative Privacy, feel free to reach out.

Speaker 3:

Sounds like a plan. The resources that you have, make sure you get those over to us. We'll post them in the show links. But also, we maintain a GitHub page of awesome privacy engineering resources, and I'll make sure to get that at the top of the list, because fractional privacy engineering services is not a thing that exists on that list yet. Which is to say, of all the things on that list, 100% of it is free: go help yourself, right? So, happy to share with some folks how they can also get help.

Speaker 2:

Yeah, yeah. My next project is, I'm working on what I consider a privacy body of knowledge. My goal is to have an open-source body of knowledge so that, again, when we're talking about getting everybody on the same page with privacy, let's get all the definitions written out, let's get implementation guidance and clear requirements to do that. So that's my next project.

Speaker 3:

Tell us when you're ready.

Speaker 1:

Before we go, this one's for the folks at IAPP: if you haven't contacted Jake yet, I'm looking at you sideways. Like that one dog, I don't know. Anyways, nobody can see that if you're not watching the video, which will be up on YouTube, and also all the platforms, once this releases, probably next week. We'll see you in the future, folks. We'll see you in the future. If this makes no sense to you at all, it doesn't even matter.

Speaker 1:

All righty then. Well, thank you guys. Yeah, thanks, Jake. We appreciate it, man, and we'll see you guys next time.
