Privacy Please

S5, E217 - The Operational Edge of Privacy & Security with Amit Danenberg

Cameron Ivey


An exclusive livestream episode presented by Transcend and Privacy Please. Join Cam and Gabe as they chat with industry leader Amit Danenberg, who has a proven track record of spearheading security & privacy operations at top organizations like Cisco, Sonos, and Willow Innovations.

We dig into some crucial topics, including:

✅ Uncovering the advantages of prioritizing security & privacy in both operational and cultural aspects.
✅ Learning effective ways to manage privacy programs and choosing the right technology partners who share the same vision.
✅ Understanding essential steps to prepare your enterprise for AI while ensuring data privacy & security.


Speaker 1:

All righty then. Ladies and gentlemen, welcome to Privacy Please Live. I am Cameron Ivey and with me, as always, Mr. Gabe Gumbs. I'm really excited about our guest today, but before we get started, Gabe, you got anything?

Speaker 2:

What do I have? Another day, another privacy dollar. I'm eager for today's guest. I'm looking forward to it. I don't have anything in particular. Hopefully the audience brings their A game and has some really solid questions. It's a great opportunity to speak to a live professional who lives and breathes this as well.

Speaker 1:

Well, it's really kind of you to say, man, but we need to talk about our guest, of course. Oh, you meant the guest.

Speaker 2:

Nice, nice. I like what you did there.

Speaker 1:

There's no dad joke for the day. Just kidding, I'll throw another one, but let's go ahead and let Amit in right now.

Speaker 3:

Hi, so good to be here. Thank you for having me. I'm good, but very, very hot here in Cambridge, Massachusetts. I have to say, we're going to hit 93 today, I think, and that's a lot for us. It doesn't seem to let off at night at all, it just stays there. But you know, we try to remind ourselves of these days when it gets to be January, February. So only complaining a little bit, not much, with weather like that.

Speaker 1:

Would you prefer to be shoveling snow out of your driveway and freezing to death, or would you rather it be hot like it is right now?

Speaker 3:

No, I'd take hot like it is right now, because then you can get in the pool or you can go grab something cold. You know, when your toes freeze off while you're shoveling... yeah, you lose them, they're gone. Exactly, exactly.

Speaker 1:

They're gone forever. Well, I guess you can get prosthetic ones if you want. But, Amit, thank you so much for taking the time to join us today for our Privacy Please Live episode, brought to you by Transcend. Yeah, thanks for being here.

Speaker 3:

It's a pleasure and I'm really looking forward to the conversation.

Speaker 1:

Yeah, me too. Well, let's go ahead and kick things off. Why don't you tell us and the listeners a little bit about yourself, where you came from, and how you ended up in privacy and security?

Speaker 3:

Sure. So I actually started my career as an engineer in software development, and at the time it was really cool technology, like high-throughput Wi-Fi for video streaming that today we take for granted, or RFID and smart speakers. So it was very interesting technology and it was exciting to be doing it, but I always felt that I was sort of a supporting function. It wasn't the heart of the company, and I really wanted to be part of that.

Speaker 3:

Software development is exciting as it is, and it is groundbreaking technology. When you get a chance to work on it, it's really exciting, but for me it wasn't what drove the company. It wasn't that heart that was throbbing and getting products to customers, making customers happy, getting revenue for what you were selling, and just understanding how teams across the company work to make all of that magic happen. I realized that that actually took place in operations, not in development. So I did my MBA and then transitioned gradually into operations, and it's been fascinating to see how teams work together, how often they don't really speak to each other, and how you can bring them together and say, hey, just across the wall, this is who's expecting this stuff from you and what it means. When you make things work better, it's amazing, and I always thought that it would be amazing for the company, right? I mean, it's obvious that if you make things work better, it's more efficient, you probably get more profitable, you can scale, all those great things. What I didn't expect to see is how much it impacted employee happiness, because when people see things don't work well, they're super frustrated, they're angry at teammates, they're definitely angry at other teams, and it just creates this animosity. When you come in and solve these things, even if the intention is to make the company more efficient, you end up making team members happier, and that was something I didn't expect. It also helps retention, it builds collaboration, and it strengthens how committed people are to the company as a whole. So I really love that aspect of having that impact in operations.

Speaker 3:

But what brought me to privacy is that everything in operations is driven by data, right? Because otherwise what happens is you get into gut feel or intuition, and that's where the emotions run high, and that's not a good place to go. It's not a way to solve problems. But when you rely on the data, you see how things are working and where the opportunities are, and sometimes it highlights things you wouldn't otherwise notice. One example: when I was working at Sonos, we were looking at how much we were paying for freight, and we realized that there were some things that were being sent back and forth across distribution centers before they were shipped out.

Speaker 3:

Now, if we hadn't really dug into that data, we wouldn't have seen that. It would have been hidden, because it didn't happen that often, but it happened often enough that we should have taken a look and resolved it. So what the data shows you is really fascinating. But with data comes risk, and that's where privacy and security come in, so that you're actually using data in a responsible way. And so that's how my winding path got me to privacy and security, to actually make operations better.

Speaker 1:

That's amazing. I mean, Gabe, I'm sure you have some questions popping up in your head. I'm thinking about privacy and security as a whole and how, you know, before they used to be separate things, but nowadays it seems like they're finally starting to come together on both sides.

Speaker 2:

Yeah, I was thinking about exactly what you were pressing on there, Cam, and in particular how those things come together operationally. We don't usually see them under one umbrella. Does infrastructure become a component of that? How much does the business recognize the tendrils of data and privacy? Where does it begin and end, or is there no beginning and end?

Speaker 3:

It's true. I think a lot of companies have this affinity for data. I want to see who's buying my product. I want to see how they're using it. I want to see where I can sell more. Why do my customers fall off and not engage?

Speaker 3:

There's this affinity to keep asking these questions to improve operations, but I'm not sure how much companies really stop and say: by getting this data, am I risking my customers? Because I'm pulling more data from them, and more data means higher chances of security breaches, higher risk to the customer. Do companies really take the time to do that trade-off and say, I'm willing to expose my customers for the benefit that I give them, not just for my company, but for the benefit that they get? And today, you know, with all the Gen AI, I think the security risk gets much higher, and so I think this is even more important.

Speaker 3:

So there's this affinity and, like I said, data drives operations, but there's also a level of responsibility that a company needs to have. Introducing that, especially when it comes from top leadership, really impacts the company's culture, and I think it has to come from top leadership in order for the company to really embrace it, because privacy and all the different regulations around it dictate certain things, right? You can't just decide to collect a new piece of data.

Speaker 3:

You have to have good reasons, you have to make your reasons transparent and get them approved before you decide to collect a new piece of data, and I think many companies have not done that or really embraced it, because, you know, storage was cheap. I've worked at many companies that said, well, you know what, let's collect it, we'll figure out what we're going to do with it later, because there was no pain point. There weren't a lot of situations, at least where I was, where someone said, hey, just a second, maybe storage is cheap, but what about the privacy concerns for our customers with what we're collecting?

Speaker 2:

Why do you think they don't do that? Is there some common thread across those organizations you see that aren't protecting their data that way?

Speaker 3:

I don't know if it's protecting so much as understanding the privacy implications for customers. And you know, Gabe, a lot of companies that I see or have worked at collect data, but they don't immediately use it.

Speaker 3:

Sometimes that data is wrong, and sometimes that data is just dirty with stuff that you don't need, and so what it becomes is just a pile of junk put in storage. It still presents a high risk, because it's still being accumulated there and can potentially be targeted, but the sad thing is it's never actually being used, because the company hasn't sat back and asked why. Why are we collecting this, how are we going to use it, how is it going to advance what we do operationally, and can we then also benefit our customers? Can we be more efficient and maybe make the right product available at the right time, or maybe even drive prices lower, things like that? That's some of what the GDPR and the greater focus on privacy over the last few years is bringing up, and it's forcing people, right, because regulations always force people to do things even if they don't want to.

Speaker 1:

Yeah. In your years of experience, I would imagine, incorporating that company culture around privacy, around strong privacy practices, have you seen a positive change when implementing those in some of your past roles and operations?

Speaker 3:

I wish I could answer that with a yes. I think, and maybe this is more the case for smaller companies, I can't speak for super large ones, but even for larger ones, whenever budgets are tight it's always something that's easy to say, okay, let's not invest in it, because it's hard to show the benefit to the bottom line, right? Investing in a tool like Transcend that helps you really enforce privacy, or in resources, training in-house staff, or relying on consultants. It's much easier to say, okay, let's invest more in advertising or in some other marketing campaign or stuff like that. It's easier for leadership to gravitate to that versus to privacy.

Speaker 3:

That said, I think there are very cost-effective ways to make a big dent in privacy. It doesn't have to be all out, and that's what I would really encourage top leaders to do: start with behavioral changes in the company. For example, when a new feature is going to be developed, right, product management has come up with a great thing and they understand customers need it, just documenting what new data will be collected and sharing it with the right teams across the company. That transparency, bringing a set of different voices around the table early on versus much later down the line, is just a behavioral change. It creates conversations and it brings in opinions, and you didn't really need to spend a dime on it, but you still pushed the privacy needle really far along. And I think that's something good to do operationally in general.

Speaker 3:

Many times data engineering teams only see things much later down the line, or analytics teams only get visibility into what's going to be collected way down the line, and they often say, hey, why didn't anybody ask us for our opinion before? We would have suggested something different. With privacy as an excuse, you bring these people around the table earlier. So I think that's a great way to do it, and not costly at all.

Speaker 1:

Do you think there actually is a shift, and this goes back to your point about it starting from the top? Are there more companies where the push around privacy is actually coming from the top down to the rest of the company, or is it still a fight from the bottom up?

Speaker 3:

I'm happy to chime in. From what I've seen, it comes from the top when it comes as a result of regulation. So if you want to expand your market into an area that has different privacy regulations, it forces leadership's hand, and then it comes from the top. When it's grassroots, in most cases it's just because it feels like the right thing to do, and then I think it's much harder to advocate for it. It's hard to accept, but it's possible to understand why. I don't know, what have you guys seen?

Speaker 2:

Yeah, I'm inclined to agree. I've seen much of the same. The times when I've seen it work best from the bottom up still actually started from the top down. There was at least one person championing that cause further up somewhere, and they took it all the way down to the ground floor, so to speak.

Speaker 1:

What do you think some common misconceptions among businesses around these topics have been in the last five years? Any common misconceptions?

Speaker 3:

It's expensive, it's a headache, it's a distraction from what we need to do. All these feelings of, hey, this is a monumental task that we're going to be deep in, we know how we're going to get into it but we don't know how we're going to get out. Versus understanding that, hey, maybe it can be done piecemeal. There are things that can be rolled out slowly. If you find the right technology partner with the same mindset, it might not be that expensive.

Speaker 3:

Relying on outside experts does tend to be expensive, but I'm a strong believer in, wherever possible, doing it yourself and being self-sustaining, because I think only employees really have the full context of what's going on in a company. It's also a great way to drive retention, because if you can train people and give them privacy skill sets they didn't have in their toolkit before, then it's a wonderful new thing they can add to their resume, and so they'd be happy to stay.

Speaker 3:

So maybe I'm naive. Maybe there are companies where privacy is much bigger and they can't afford to take this path. But for the companies I've worked at, I think having a path where, together with the right technology partner, you can train and educate different levels of the company to the level of knowledge they need, it's really possible to, like I said, change the way we operate, our mindset, and the skills we have. Then it doesn't have to feel like, hey, this is way out of our budget, this is going to blow what we had planned, or there's no ROI on the new market that this could open for us.

Speaker 2:

Do you have a story that you tend to tell when trying to share this vision? How do you communicate this to those that are otherwise unwashed?

Speaker 3:

It's difficult. I think I had a certain level of success doing something very similar in cybersecurity, not on the privacy side. A small company that could not afford and couldn't justify an in-house CISO took a different path: it found the right partner that was not just a virtual CISO but took us on the path to being self-sustaining, working with the really great internal IT team that we had, and obviously the other functions like data and product, but mainly the IT team. Together with this external partner, we were able to write our own policies and playbooks. That's very difficult to do just relying on consultants, again because the context is missing. Employees really understand the context, so they bring the right people together and write the policies that would really be the ones we'd use in case of an emergency.

Speaker 3:

And so, working on that, working through the NIST assessment and running through the high-priority items, we had this virtual CISO as our support function. But we were learning, and we asked them very explicitly to teach us. It wasn't a case where we left with the same low level of knowledge as when we came in. After a year or so, our reliance on the external partner went down and our level of internal support really went up, and that taught me that it's possible. If you set it up with the right partner and you have the right thirst for learning internally in the company, I think you can really do it. Again, I don't want to generalize, but that was my case. I'm sure there are plenty of companies with sensitive data that can't afford to do that, but then they probably have a different budget, or it's a different item in their overall budget. I think any company that collects data needs to have privacy as a high-priority consideration.

Speaker 1:

Yeah, because I think, no matter how small or big the company is, that should be more important than ever now. The biggest shift in my eyes, and I might not be the only one, obviously, is around enhancing customer trust. You see a shift now because if your company doesn't demonstrate that kind of privacy, you're going to lose customers because of it. So I think you're seeing that push from the top as well: we need to tighten up our privacy program, because we're losing customers because it's not at the place it needs to be. Would you guys agree with that? It seems like that's pretty common, right? This is what's happening in the market and this is the way things are going.

Speaker 3:

I think it is going more that way, and when you're able to show that it doesn't come with a high price tag, it's easier to get that buy-in. And yeah, people are more concerned today than ever before, right, about their data being used, about their data being exposed and used to train AI models out there. There's a lot of concern, and I think it will only go up over time. Companies need to recognize that, respect it, and treat customers' data the way they want their own data to be treated, and that will get them loyalty from their customers.

Speaker 2:

Is there a roadmap that you generally see as being available for folks to achieve what you're describing? Let me ask that a different way: where do you start? You walk in day one. Where does Amit begin? What's the first thing you look at?

Speaker 3:

Educate yourself. So I didn't know much about privacy when I started, and the first place I started was educating myself and understanding. There was some knowledge on the team, based again on past experience, so we drew on that, and we actually reached out to our networks of folks to give us some pointers to online resources that are out there to educate ourselves. And then we started looking into technology partners. That's how we came to Transcend. By talking and doing this due diligence across different partners, every conversation is an education, and I have to say I very explicitly asked and said, look, I'm learning here, bring me along, teach me, and I apologize if my questions sound way off, but it comes from just wanting to educate myself. Once I got a better sense, what I needed was two things: understanding what had to be done and what it takes to do it, and then the price tag on it. Once I had that, we could put together a high-level roadmap as a team and present it to our executive team, because once they have that information, they understand how long it's going to take and whether that aligns with the strategic roadmap they had for the company.

Speaker 3:

If it's going to take a year and they actually needed to go into a new region with different privacy compliance within six months, then there was a gap and there was a problem. And if the price tag was way off from what we could afford, then that was another problem. So first it's education, and there's a lot of material out there, a lot of online trainings, a lot of resources. But I found that reaching out to our networks, and having people, the technology partners, bring us along and educate us, was very helpful, and from there we started to piece things together.

Speaker 2:

On that education journey, as you look back over your shoulder, what do you think is the most important question that you asked, now that you know what you know? Looking back, what was that one question where you went, ah, that's the most important question I asked? Or what's the most important question you wish you'd asked sooner?

Speaker 3:

I think the most important question was: can I do this? Can we do this in-house alone? Given the skills we have, is it possible? Is it something we can acquire, or is it something where you really need an outside expert who's got years of experience in this space? That was the biggest question for me because, again, it always comes back to money, right, and it always comes back to ROI. So, to answer that question of the price tag, that was the question I was asking.

Speaker 3:

I asked several consultants, I asked tech platform experts and, like I said, folks in my network, and the answer was really: it depends. It depends on the size of the company. Like I said, larger companies can afford to and do bring this knowledge in-house. It also depends on what type of data you're collecting. The conclusion I came to was that it is possible if you're willing to invest and learn. You can do this, again with an expert to rely on, but you can, over a reasonable amount of time, become self-sufficient.

Speaker 1:

Amit, to your point, I think there are more people in your position than you probably realize who are in the same boat mentally when it comes to privacy. I don't know if you ever felt like you were kind of alone there, but I think a lot of people in your position at other companies have probably felt the same way. And that's why I think your point about finding a partner or a product that makes privacy easier for companies of all sizes is super important, because there are a lot of legacy tools out there that involve a lot of manual work, a lot of hours and time from engineers and other resources in the company, and that costs time and money. So you need a tool that's going to be efficient and flexible, that works with and grows with the company that you're at, instead of the other way around. So I think, to your point, it's super important to look for tools like that, or to be able to find those types of tools.

Speaker 3:

And, I think, also one that helps you prioritize, so you don't feel like: I've been collecting data for 15 years, I don't even want to look and see what's in there, I've only used a fraction of it. It's overwhelming to think of needing to handle such a massive amount of data. Coming to it with someone who can help structure that and break it down into doable milestones is very helpful, and I think it also builds confidence within leadership that it can be done, and that it's not this massive hole that we're going to keep pouring money into and never see the end of.

Speaker 3:

It all comes back again to: if only we'd been asking these questions from the beginning, right? Why are we collecting the data? Where are we going to store it? How are we going to use it? Versus: okay, let's collect it, it's available, these devices report this data, or we can pull this data off this website or whatever it is, and then we'll figure out what we do with it.

Speaker 3:

I think companies that start today, with privacy being so prominent, probably do ask these questions. It's the legacy companies that have come from a different era, where having tons of data was this great thing. It was so wonderful that we didn't really ask the questions around privacy; we were just happy that it was so cheap to store and that we could use it later. That's why, like I was saying at the beginning, these tools really play into operations and how you operate, and if you incorporate them into the fabric of your operations, then it becomes second nature. You don't feel like you're doing something just to check off the privacy box. It's how you behave.

Speaker 1:

From going through that process of vetting out tools and things like that, what are some of the things you've learned that could be helpful or beneficial to others who are looking for privacy tools, or for security and privacy technology partners? What kind of questions should they be asking? What kind of long-term benefits could it bring to the business? What are some of the visions you've had?

Speaker 3:

How would your partner scale with you? It's always that trade-off. One of the things with smaller or mid-sized companies especially is that they evolve a lot within a year. Most agreements are at least one year; most vendors prefer them to be multi-year. The problem is that a multi-year agreement conflicts with the amount of change happening in the company between when you start and where you end up. Even after a year or two, there could be a huge disconnect between the agreement you're committed to and what your company actually needs. That's a challenge, and it's a lack of aligned incentives, which is totally understandable.

Speaker 3:

Right? SaaS providers want to show recurring revenue, it's important to them, and companies want to spend as little as they can and get the most benefit, so it's really understandable how there's this misalignment. It's about finding the right technology partner that understands this misalignment and is willing to be more flexible, I'm not sure exactly how to do it, I wish I could say, willing to be flexible and adapt, understanding that the company is going to change from the way it starts, especially if it's only early in its privacy journey. Within a year, within six months, it could look very different. So one thing is really understanding that this is a partnership between the SaaS provider and the customer, and it's not that one is beholden to the other because you've now signed a long-term agreement.

Speaker 3:

With a lot of platforms, once you use them it's very hard to leave, and companies, especially small ones, have little leverage. That's true, but when both companies see this as a partnership, it's different. It isn't about how much leverage I have over you. It's: as a SaaS provider, I want you to get the best value out of my tool, because then you will be advocating for us going forward through your network. That's the way to align these different financial incentives, and I think that's key with all partners I've worked with, not just in privacy.

Speaker 3:

It really makes or breaks it, because sometimes companies get stuck with agreements that they cannot get out of while getting very little value. So I think that would be my top item. The second one, I think, is by far support and training. How much does the partner want to share with and train internal staff, versus keeping that knowledge to themselves and making the customer totally dependent on them for every little thing? That's very difficult and very slow. So a company that comes in and wants to invest in really training internal employees to be able to support their platform, evolve it, and grow it as the company does, that would be my choice number two. And from there, everything else is easy.

Speaker 1:

Classic, everything's easy from here, guys. Yeah, I think it's funny, because when you were explaining the two differences, in my brain I'm seeing older tools compared to next-generation tools that are built more efficiently and are easier to use for anyone in the company, instead of being so technical and manual and things like that. And, you know, I think one important thing is that a lot of companies don't really tell you their secret sauce or how things work. I think it's kind of neat when certain companies give you that glossary of all the tools and how they work and operate. I think that's super important, because you're being transparent with your customers, and even with people who aren't your customers, if you have something like that on your website, because it encourages them to learn how these things work and operate so they can better use them for their organization.

Speaker 1:

Yeah, that's fascinating. It's interesting. Gabe, being on the security side of things, what is your take on all this? What's going on in your head?

Speaker 2:

I think it's time we see more partnerships between the security side of the house and the privacy side of the house. I love, Amit, that your purview over operations brings together the data and the storage, and I think my lamentation here is that, as the security person in the room, so to speak, we oftentimes look at the data as belonging to some other part of the business, and the same thing with where that data is stored. That's an IT problem, and the ownership of the data, well, that's the business unit's problem. And so, as security, what's our responsibility for making all of these boats rise with the tide, right? Like, how do we link our hands together here? And actually, to the point you're making, how do we embrace a privacy program leveraging security, for example? Because we know we can't actually have privacy end to end across the business without security; that is not possible. You have to be able to protect the confidentiality and the integrity of that data, and that falls rather squarely in the laps of the security folks.

Speaker 3:

Agreed. And I think the more the security folks are ingrained with the data folks, the better they understand what data is being collected, what data is going to be combined with other data, and whether, when it is combined, it becomes private when before it wasn't, and then obviously it's a higher risk. Having these conversations with these stakeholders early on, like you said, if you think about privacy early, can actually enhance security, right? Because if you're collecting less data, or you're thinking three times about whether you really need to collect this extra piece of data, there's less to secure. Or if you're able to influence the way you analyze the data, or the way your product uses the data, in a way that makes it overall more secure, then you've actually got product, data engineering, security, and privacy all working together, versus them all being, like Gabe was saying, in their own silos and being called in only later.

Speaker 3:

I think privacy, and the frameworks that privacy dictates, really can bring people closer together and, like I was saying, it really impacts the way we work. It's fascinating. It's not an obvious thing when you think, okay, I'm going to embrace privacy, but it has much greater implications. And when companies realize that, it might also make them more ready to invest in privacy than they were before, because they get so many other benefits almost as side effects of it. It's the culture change in security and privacy.

Speaker 1:

But I have a hot-take question here: who do you think is more stubborn, security folks or privacy folks?

Speaker 3:

Oh, it's definitely security folks. You know, the security folks have that fear ticket that always gets people paranoid, and that trumps everything, right? When you get people scared, then they'll do what you tell them.

Speaker 2:

I mean, in their defense, they're worried about things like ransomware. So, you know, the fear is real, undoubtedly extremely real. But you're right, they certainly do travel in the less comforting parts of our business.

Speaker 3:

Exactly, and maybe if they saw privacy as a partner in helping them reduce the chance of ransomware, then maybe they'd see it in a different light. Because sometimes I think security folks feel alone, like, hey, we're the ones holding the flag here, why doesn't anybody listen to us? And maybe if they could see, hey, we can be in cahoots here with the privacy folks, and together... I mean, one of the things we did in setting up the cybersecurity function was to adopt a training platform, and once a month we'd send out a short snippet of training, a few minutes, just to keep security front of mind for people. If we could also do more of that for privacy, or find ways to, then you actually have your whole organization being part of this and playing their role in their own part of the company.

Speaker 3:

But I think if we picked, or even developed, training for the rest of the company together, in collaboration between security and privacy, that could be a super powerful partnership.

Speaker 2:

Agreed, agreed Very much.

Speaker 1:

So AI, AI, AI, everything AI.

Speaker 2:

And the world of Gen AI. I mean, yeah, I was going to say, you're forgetting Gen AI, LLM AI, all the elemental AI. You've got the elemental pie. You can't forget all the elements.

Speaker 1:

So yeah, I mean, it's the most intriguing topic in our realm right now, and things are moving at a rapid pace, especially with, what is it, 15 or 16 days until the EU AI Act goes into effect. What do you think, for the foreseeable future, the next few years? What are some of the things we should look out for on the operations side? What are some of your fears? What do you think will happen? What do you hope doesn't happen? I know I'm picking at you a lot.

Speaker 3:

No, no. I think, operationally, there has to be something much further down the line that happens first, because otherwise it has much higher chances of hallucinating. If you were to run some Gen AI on only sales data, or only marketing, or only finance, without giving it the ability to see all the company's data, then you're going to get very different, and wrong, results. I was listening to some podcasts a while back about how the biggest challenge is actually getting teams to agree on working off a single data lake, and that normally also means a centralized data team. That's a touchy subject, because there's always this tension between centralized and federated. Is it going to be one choke point? Does the central team understand the context of the data as well? Getting alignment within a company on that, and making sure that whatever you're running your Gen AI on is as complete as you can get it, that it represents your whole company versus just a certain side of it, I think that's going to be the harder part.

Speaker 3:

The second thing is for companies to decide what they want to do with AI. Today it feels a lot like a solution looking for a problem, desperately looking for a problem. You need to really know what it is that you want: how is this AI going to help you as a company operate more efficiently, or help your customers be happier and want to spend more with you? But doing that without first understanding how you're going to give it the full context of the company, agreeing on that, getting buy-in, and not having each team want to own and control its own piece of data, bringing that to awareness is maybe an even harder challenge, because again it comes back to how we operate, change management, humans getting in the way, and all of that. So I think that's where it's going to start.

Speaker 3:

The other thing I'm worried about is testing this stuff. Companies are used to QAing their product, right, or analysts are used to reviewing their dashboards and checking that they're correct before they release them. Testing Gen AI and its results, I think, is a whole different level, and it's also going to keep changing as the company changes. You can't just test this once; you're going to have to test it over and over and over again. Understanding that challenge, and what it means to QA Gen AI outputs, is something companies also need to think about before embracing it.

Speaker 1:

I mean, it seems like Gen AI can continue to evolve and change rapidly. Would you say it's going to move faster than any kind of technology that we've ever experienced? Is that a bold statement?

Speaker 3:

I think it's true, at least the hype of it.

Speaker 1:

Yeah.

Speaker 3:

I mean, we see how quickly it's moving; whether that's going to taper off in a year or two, it'll be interesting to see. If it doesn't deliver benefits, right, it's going to be hard for the market to keep investing in it. It comes back to companies really needing to be clear on what value, what benefit, this AI is going to give them before deploying it. And I think there are going to be more justifications that teams and companies will have to make to their investors and internal stakeholders before they get resources to go off and do this.

Speaker 1:

Yeah, because I wonder if it's something that's more threatening than helpful in the long run, or is it equally both?

Speaker 3:

It's interesting, and it's very interesting from my perspective. I've been trying to think about how it plays into privacy, you know?

Speaker 3:

How does the whole privacy thing play into what data you allow to train your model on? How do you make sure that the data you're training it on is compliant with all the privacy regulations, even internal data? I'm not talking about all the issues around training on publicly available data and the internet, but the data that you've collected as a company. How do you make sure that when you train your model, you're still complying with all these privacy regulations? That, for me, is something I'd like to learn more about, and I'll try to see how to educate myself. I think it's a big question.

Speaker 3:

I wonder if it's also a question that tools need to address, and how much privacy tools have actually taken into account and thought about the changes they may need to make to support Gen AI, model training, and stuff like that. I don't have an answer for that at all.

Speaker 1:

Yeah, because it's going to be scary. One day we're not even going to know what's real anymore. Some people already don't know what's real.

Speaker 3:

We all live in our own little reality, right? Whatever suits us.

Speaker 1:

It's scary. I mean, technology... I was telling you about this yesterday, Amit, and it really sits with me: if you live in the past, you're going to be depressed. If you live in the future, you're going to be anxious. But if you live in the present, you will find peace. I mean, it's kind of true.

Speaker 3:

I like it, though. Yes, it's good to keep in mind, just in general in life, I think, even putting aside AI and technology.

Speaker 1:

Just in general, is there anything we didn't touch on today that you want to bring up or talk about, anything that's really important to you, anything that comes to mind?

Speaker 3:

I would have loved to turn the question over and ask you, as the privacy platform provider, what you would like to see from your customers in order to make your platform more effective when it's deployed, sort of flipping my perspective as a customer. What do you see? Which customers tend to be more successful, which deployments? And is there something you would like a customer to set up and be ready with to make a deployment of an overall privacy solution, not just the platform but the overall solution, effective? I know that may be too big a question for the few minutes we have.

Speaker 1:

That's a good question, though. I don't mean to put Gabe on the spot, but, you know, being the president at Myoda, it might be interesting to hear your take on that version of it, Gabe, and what you might want to see in your customers and that kind of thing, because I don't run a product, but I see what you're saying.

Speaker 2:

Yeah, it probably is the same across the board. And I mean, you touched on it in what you were saying earlier, which is education. I often encounter folks who have some very strong misconceptions about what offerings are in the market, and some of that is just a byproduct of good old-fashioned marketing shenanigans, right? A lot of things all sound the same, they say they do the same things, buyer beware and all that good stuff. But I think some of that is education: know what outcomes you need for yourself first, and know what outcomes are possible versus just what is stated.

Speaker 2:

As a concrete example, and I'm certain this applies to privacy as well, we see things like immutability of data become quite buzzwordy too, and I don't think most customers actually understand what it means for their data to be non-alterable. So they will do things like check a box inside of, you know, infrastructure that says, ah, this thing is now blank, and it doesn't actually govern the data, or, let me rephrase that, it doesn't actually have the outcome they want, even though they think they've applied some type of governance to it. Education, I think, is a good place to start. Understand that not all things are made equal.

Speaker 1:

I like that.

Speaker 3:

Yeah, I like that too. And I think when, as a vendor, you come to a customer with this willingness to really educate them, not in a way of, hey, let me convince you to buy my product, but really coming in and saying, I want to educate you, because when you understand, you make much better-educated decisions, it also builds trust. I then trust you as a vendor because you came really wanting what's good for me as a customer, versus trying to push your product onto me in different ways. And then you get a more educated customer who can make better decisions, and overall the partnership is better. I really like that, thanks.

Speaker 1:

We're going to bring it. Since we have a minute or two, I'm going to do some classic Privacy Please fun questions real quick, because it's been a while. So I've got two for you, Amit. This is a classic. It's a very important question.

Speaker 2:

When it comes to your toilet paper situation, do you put the roll on so it pulls from the top or the bottom? Remember, we're judging you.

Speaker 3:

Yes, of course. I know, it's the bottom, and I wonder what that means. Does that show something about my personality that I should be aware of?

Speaker 2:

You're on a list now, that's for sure.

Speaker 1:

Well, we joke around that there's only one way, but honestly I have no idea what one or the other means. I just think it's funny.

Speaker 3:

I have to say I am consistent though.

Speaker 2:

That's good. Yeah, no, there are those who are pure chaos. They just put the roll on and walk away.

Speaker 1:

Those people want to see the world burn.

Speaker 2:

Do you?

Speaker 1:

They just don't care. One last question: what is one fun TV show or movie that you've seen recently that was surprising to you, or maybe made an impact on you?

Speaker 3:

Well, it's old, so I'm not sure, but we've been watching it for a few seasons now. It's called Monk, I don't know if people watched it. A detective who is super neurotic, and it's sort of quirky, but it makes you smile. It's not that serious. It shows a little bit, I'm not sure in what way, the challenges of people who have these types of conditions, how difficult their life is, yet how special they are, because they do notice a lot of things. So it's a lighthearted way to pass 40 minutes at the end of the day. We've been really enjoying that, Mr. Monk. It's been good. Thank you to Netflix for providing that.

Speaker 1:

Well, Amit, thank you so much for taking the time to be with us today. Really appreciate it. Really appreciate what you do and your mindset on just wanting to grow and learn and better security and privacy for organizations. So thank you for all that you do and thanks for being here with us.

Speaker 3:

Thank you. I appreciate your time and it's been a wonderful conversation.

Speaker 2:

Thank you.

Speaker 3:

Thank you so much.

Speaker 1:

See you everyone. Thanks for being here.
