Privacy Please

S5, E214 - Mastering Secrets Management & Governance in a Data-Driven World with Brian Vallelunga, Founder of Doppler

Cameron Ivey

What if managing your company's digital secrets could be as seamless as sending a Slack message? Join us as we converse with Brian Vallelunga, the innovative CEO and founder of Doppler, who reveals his journey from Uber to spearheading a game-changing solution for developers. We cover Brian's intriguing path, including a rocky venture into the world of crypto machine learning, and a transformative moment in Mexico that highlighted the pressing need for efficient secrets management. Learn how a pivotal dinner conversation and inspiration from Slack's Stewart Butterfield catalyzed the creation of Doppler, making it an indispensable tool for developers around the globe.

Why do so many companies fail to protect their most sensitive data? Brian and our hosts break down the alarming oversight of "secrets" like API keys and database URLs in data security. Hear a harrowing personal story of a scam that exploited breached data, gather practical tips on safeguarding your information and much more!

Speaker 1:

Ladies and gentlemen, welcome back to Privacy Please. Cameron Ivey here, hanging out with Gabe Gumbs. Got a special guest today. Brian, I don't want to botch your last name. How do you pronounce it? It's Brian Vallelunga, and pleasure to meet you. Vallelunga, yeah, man, thank you so much. CEO, founder of Doppler. Can't wait to get a little bit more pointed with what you guys do and the problems you solve. But first, let's kind of start things off with: tell us about yourself, where you came from, how you got to creating Doppler. Yeah, happy to.

Speaker 2:

So my career, or really the creation of Doppler, got started when I was at Uber. I've been a founder my entire life; Doppler is company number eight. All the past ones feel more like projects now than anything else, compared to the size of Doppler today. But yeah, I was working on a crypto machine learning marketplace, kind of like all the buzzwords in one. It was definitely one of those projects where it felt like pushing a boulder up a hill: you move one foot forward and slip five feet back from exhaustion. It's just a really hard thing to get off the ground. Funny enough, though, it did teach me a powerful lesson. Eventually I decided that business wasn't going to work, which I'll talk about in a sec, but it turns out that years later another company actually did build a part of that business, and it's really big today: Hugging Face. So, very powerful lesson that sometimes there's founder fit, and I just wasn't the right founder fit. But yeah, fast forward. I'm really struggling with getting this thing off the ground, and I decide to take a trip to Mexico, take a break from it all, and my one rule was that I wasn't going to think about this crypto machine learning marketplace. And that's all I ended up doing the entire time; completely broke that rule. And I realized pretty quickly that I was never going to get this thing off the ground. It just wasn't for me. But I've always been inspired by other founders. Kind of like some people track basketball players and have all the stats of those players, I do the same thing with founders, and one of my favorites is Stewart Butterfield. He's a founder of Slack; we've probably all used his tools, and he's probably the best in the industry at failing upwards, in my opinion. Creates a video game. That video game fails, but out of it is born Flickr.
Then he's like, I'm going to create a video game successfully this time, and does it again. Creates another video game, and born of that was Slack. And so I was like, okay, what can I learn from him in this moment?

Speaker 2:

And managing API keys, database URLs, certificates, encryption keys, all these types of secrets, was just really problematic for me, and I was really thinking about that: wow, okay, this is a weak point in the stack. This is something I'm struggling with. And so I come back from Mexico and I go to this dinner almost immediately afterwards, like two days later, and it was a Stripe Atlas dinner. Stripe, the payment processor, would host these dinners with developers and founders. About 50 of them would go to this restaurant called Zero Zero in SF, and you just chat about whatever's top of mind. And at this dinner I'm like, hey, I just can't tell anymore: am I a shitty engineer, or is the world broken? You tell me. And about 30 to 40% of them, maybe even 50, were like, yeah, I'm having this problem too.

Speaker 2:

And one woman in particular just comes running up to me. I thought she was going to trample me and she said I've had three outages this week. Hand me a solution by Sunday. And I'm like nah, nah, nah, nah, nah, this is a Wednesday, I can't build it by Sunday. She goes I don't give a fuck, have it by Sunday. And I was like, okay, people need this, there's a problem here.

Speaker 2:

And so I didn't go build it by Sunday, but I did go start researching more and more, and what I found is that developers, large companies, and everyone in the middle were struggling, and they were struggling because there wasn't a tool built for developers.

Speaker 2:

So we built the first version, really developer focused. Doppler became its own first customer. And then right afterwards we did something called Chipotle sales, where I'd take any engineer who could possibly have the problem I was trying to solve, and I'd take them to Chipotle. And the rule was: I'll get you anything you want on the menu, but in exchange I get to rant at you for about two hours. We got our first set of customers that way, and then we were off to the races. We went into Y Combinator and eventually did a seed round led by Sequoia, and that's kind of how Doppler got started. I'm happy to go more into what Doppler actually is, what problem we're solving, and all that. I also have a pretty important message I want to share, so I'm happy to dive in deeper.

Speaker 1:

So you're the cause of this whole Chipotle portions issue going on.

Speaker 2:

Maybe, maybe. I think they owe me some lost revenue.

Speaker 1:

I think the word on the street is that that's what they're doing as a marketing scheme, a way to have Chipotle in everybody's minds. But if so, it's genius. It really is. Go ahead, Gabe. Sorry, I thought you had something to add there.

Speaker 3:

No, I was going to say, if you have something important to talk about, let's start there. I want to talk about the things you're passionate about. I mean, you're obviously passionate about Doppler, and I have some ancillary questions. For example, you've heard the phrase identity is the new perimeter, right? And I'm curious how that pertains to machine identities, because it's mostly humans. But let's put that on the back burner. Front burner: what's burning at you? What's the passion?

Speaker 2:

Yeah, I'll actually build right into this, so this is the perfect talk track for us. I'm excited to be here today because I'm here with a PSA, a public service announcement, to share a little bit about what's going on in the industry and why now is exactly the right time to act. If you think about it, everyday people, me, you, and everyone else who's going to be listening to this podcast, all use a number of services like Uber, Stripe, Netflix, Twilio. I would say there are maybe 50 to 100 companies that we use on a very frequent basis, and those companies store our data. Every time I take a trip somewhere, my data is in Uber; they know where I've been. Every time I look something up on Google Maps, same kind of thing.

Speaker 2:

Our banks have our banking information and how much money we have, and all these services store incredible amounts of data. All that data is protected by something called secrets, and secrets are basically the keys to the digital kingdom. So imagine we have this door, and behind this door is all of Netflix's data. That door has a lock on it, and to unlock that lock you have a key, and that key is called a secret in this developer world, this data-centric world. And most of these companies that we really trust with our data aren't really safeguarding those keys. So there's this big question I have, which is: how can you possibly protect everyone's data if you're not protecting the keys to the digital kingdom? That's what I'm here to talk about today, why this actually matters, and not just for companies.

Speaker 2:

When we see data breaches, we see this big thing of, oh, Equifax got breached, Toyota got breached, Twitch got breached, and we think, oh, this company is going to get a slap on the wrist, some little fines, maybe a couple of bad PR articles. But there's actually way more of a cost to it, and that's our data. Our data is getting exposed, and I'm here to talk about how that's really scary just as a citizen, how it's affected me, and what motivates me even more strongly. I have some stories I can share about how data breaches led to me getting scammed even further, and that's probably going to lead to even more scams in the future.

Speaker 3:

If you're comfortable sharing that story, I'd love to hear it.

Speaker 2:

Yeah, so this is actually a real story. I live in Austin, Texas, and I just moved, and I got my mom to fly out from Los Angeles, and I was taking her to barbecue. While we were at the barbecue restaurant, I get this call, and it's someone claiming to be at the Texas Customs and Borders, and they're like, hey, Brian, we're investigating you. We found this package in your name that has illegal drugs and money in it, and we want to verify who you are. And they start listing off a bunch of stuff about me: where I've lived in the past, where I'm living currently, who my friends are, just so much stuff. And I'm like, okay, this has to be the government, right? Who else has this kind of information on me? So I'm really being investigated here. And we're diving deeper and deeper, and I'm like, crap.

Speaker 2:

This is like the moment where your life flips and you're going to end up in a jail cell.

Speaker 2:

And I don't even know how I did this, because I don't buy drugs. At some point, when the "lawyers" get on the call and start probing deeper, I realize that this is a scam, and they were able to have all this information on me because of other data breaches that happened, like the Twitch data breach and so on. So it really grounded me in this idea: especially data people, DevOps, and developers, we talk about data as, oh, there's a terabyte of data here, or a gigabyte of data here, or a petabyte of data, but we're not talking about what that data is, and that's real people's data.

Speaker 2:

And when that real people's data gets out, it's then used against those people, which then leads to even more data getting out. Eventually you could wake up someday with no money in your bank account, because they guessed all your recovery questions and now have access to your bank account: your email got breached, then your password got breached, then all your recovery questions got breached, then maybe a scammer called you to get a little more information out of you to answer those last remaining questions. So what I'm here to do is talk about how this is a real problem, and there's also a really easy solution. This doesn't have to be an insurmountable thing that every company has to go through; there are viable paths forward. But it's a reminder to anyone listening, especially people at companies with responsibility to protect data, that there are real costs associated with it if they don't.

Speaker 3:

I mean, because one of the things that I think is going to be useful for this episode is: let's leave people with some real-world things. I'll put one on the table first. Those questions where they ask you what's your dog's name, et cetera: you should already have some standard or non-standard answers to those that you know to use, or you can simply treat them like additional passwords. Right, it's just yet another password; put another password in there, and now leverage your password manager, because you have one, right? If you rewind back to episode one, we did talk about these, and we even listed several to use back then. If you haven't been listening that long, go all the way back, listen to all of them, and then land on one. But more practically, I always appreciate when PSAs in particular come with something we can do. You were getting to: this doesn't have to be insurmountable. I'm listening.
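
The "treat recovery questions like passwords" tip is easy to put into practice: generate a random string per question and keep it in your password manager. A minimal Python sketch (the function name and length are illustrative choices, not from the episode):

```python
import secrets
import string

def random_answer(length: int = 24) -> str:
    """Generate a random 'security question' answer.

    Treat it like any other password: store it in your password
    manager and never reuse it across sites or questions.
    """
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One unique answer per site/question pair, e.g. for "first pet's name".
answer = random_answer()
```

The point is that a recovery answer only needs to be something you can retrieve, not something true about you.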

Speaker 2:

Yeah, okay. So we're aligned that companies have a lot of data on us and they're charged with protecting that data: keep the private data private. And we know that secrets are, in a way, the keys to that digital kingdom, so what you really need to do is protect those secrets. The first thing we can do is identify whether you are protecting those secrets or not, and there are three very simple questions that everyone listening who is charged with protecting this kind of data should be asking themselves. First: where are all my secrets, where are all the keys to the digital kingdom? Because if you don't know where all the keys are, how can you possibly protect them? Just simple logic. The second is: who has access to those secrets, and when were they accessed last? Can I set up permissions around who has access to the keys, and do I have an audit trail of when those keys are accessed? Just like when you go into any building and scan your fob to get into your apartment or office, there's a record of you showing up to work every day and leaving afterwards.

Speaker 2:

And the last one is: if those keys ever get out in the wild, which is called a breach, a data breach, can you stop it from happening? So hackers are in your system, they're draining your data out. Can you stop it in that moment, at least? If you can't, well, what good is your security team? If you do not have strong, confident answers to those three questions, you're guaranteed to have a problem, and I think the first step, before pitching any type of solution, is to understand that we do have a problem here, and I think most companies do. What's even more scary is what I found about bigger and especially older institutions: A, they have more data, so a higher density of valuable data, and B, they're the least prepared to go up against hackers. And remember, hackers want the data, and they don't need to figure out how to hack every system. All they need to do is find the one crack in the system, the weakest link, and hit that. And if they get access to all the keys, they get access to everything else, all the doors. So that's the first thing: let's identify that there's a real problem here that needs to be addressed.
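
The first two questions, where are my secrets and who accessed them when, amount to an inventory with access control and an audit trail. A toy Python sketch of that shape (class and field names are invented for illustration; this is not Doppler's API, and real secrets managers add encryption at rest and tamper-proof audit storage):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SecretRecord:
    name: str
    value: str
    allowed_roles: set
    audit_log: list = field(default_factory=list)

class SecretsInventory:
    """Toy answer to: where are my secrets, who can read them,
    and when were they last read."""

    def __init__(self):
        self._secrets = {}

    def put(self, name, value, allowed_roles):
        self._secrets[name] = SecretRecord(name, value, set(allowed_roles))

    def read(self, name, actor, role):
        rec = self._secrets[name]
        stamp = datetime.now(timezone.utc)
        if role not in rec.allowed_roles:
            rec.audit_log.append((actor, "DENIED", stamp))
            raise PermissionError(f"{actor} ({role}) may not read {name}")
        rec.audit_log.append((actor, "READ", stamp))
        return rec.value

    def inventory(self):
        # Question 1: a complete list of where the keys live.
        return sorted(self._secrets)
```

Even this toy version makes the logic concrete: if a secret isn't in the inventory, nothing else about it can be answered.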

Speaker 2:

I can pause here, or we can keep going. So once you've identified that there is a problem, the next step is thinking about how you solve it, and I think you just reverse all those questions. Do I know where all my secrets are? That's a really fundamental part of the solution: I can attribute and track every single secret there is. I can put access controls in place, so developers have access to just the development secrets, the development keys, and the DevOps and security team has access to the production secrets, and we have an audit trail whenever those secrets are used.

Speaker 2:

And then the really big one: I think the right way to think about this is that a breach will eventually happen, and you always train for the worst-case scenario. If you train as if a breach is going to happen, then when it does happen, you're prepared, versus if you act like it will never happen, then when it does happen, you're screwed. So if you assume that a data breach is going to happen sometime in the future, you need to be able to stop it, and the way you stop it is this thing called rotation. Rotation is a very, very fancy way in the industry of saying we're going to swap out the locks. Let's just say someone steals all the keys. Well, if you swap out the locks, the keys aren't valuable anymore. So: can we do rotation? And this is a real story I'll tell. We have a customer, and I guarantee you have heard their name. I can't say it on the call, but I promise you everyone listening has heard of this company. They had a malicious actor in the company, meaning someone joined the company with bad intentions and stole all the secrets.

Speaker 2:

This is a real story. When that happened, the security team said the only safe thing we can do is swap out all the locks, rotate all the secrets. And it took their security team of three security engineers, fully focused on this, six months to rotate all the locks. One of the reasons it's so hard is that when you're swapping out a lock, you still have production infrastructure relying on that secret, so you can actually bring your service down by swapping out all the locks. You have to do it in a safe way, where you don't bring yourself down in the process, while also going as fast as possible, because remember, that was a six-month period where attackers basically had free rein in their databases and systems. Six months to get whatever data they wanted out and cause any harm. And hey, I'll be the first to tell you you don't have to buy Doppler. I'd love it if you did, but you don't have to. Just solve the damn problem; that way I stop having my data in data breaches, and you stop having your data in data breaches. But after they adopted the right type of solution, which is called a secrets manager, they're now able to solve that problem within minutes. So if hackers are in their system, they click a button, and within minutes all of their production infrastructure is updated with a new set of secrets, a new set of locks. Those are the fundamental pillars of a good secrets manager: you know where all your secrets are, you can set up access controls, you have audit logs for when secrets are read and changed, and you have a quick way to rotate all your secrets that updates your production infrastructure so you don't have any outages in the process. That is the fundamental set of pillars for solving this problem.
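
The reason rotation can cause outages is that old credentials die the instant new ones are issued. One common way around that, sketched below, is a rollover window where both the old and new credential validate until every service has picked up the new one. The names are invented; this is a sketch of the general technique, not how any particular secrets manager implements it:

```python
class RotatableSecret:
    """Zero-downtime rotation via a dual-validity window: after rotate(),
    both the old and new credential are accepted until retire_previous()
    is called, so live services keep working while they reload config."""

    def __init__(self, value):
        self.current = value
        self.previous = None

    def rotate(self, new_value):
        # Swap the lock, but keep the old key working during rollover.
        self.previous = self.current
        self.current = new_value

    def retire_previous(self):
        # Call once every consumer has reloaded the new credential.
        self.previous = None

    def is_valid(self, presented):
        return presented is not None and presented in (self.current, self.previous)
```

The "click a button" experience described above is essentially this pattern automated across every secret and every consuming service at once.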

Speaker 2:

And there are a fair number of really good tools in the market that do this very well and are very developer friendly, Doppler being one of them. If you're a small company, a startup, say Seed to Series A, you can set up Doppler in a day; we have most customers set up in a day. Medium-sized companies, Series B to Series C, set it up in a couple of weeks. And then very large enterprises, we have some enterprises with hundreds of thousands of engineers, set it up in a couple of months, and even a couple of months is pretty fast for an enterprise. It is very, very easy to solve this problem with the right set of tools, and it only takes one really cohesive platform that does it, Doppler being one of them. I have a question. Let's do it.

Speaker 3:

Let's dive in, because secrets are near and dear to our hearts; I travel in this secrets business a tiny bit, in a very different way. But how do you protect the secrets? Because that's always, pun intended, the key to this problem. There is indeed the secret, which is why even mom and pop need a password manager: they have to protect their secrets. And then there's this other secret that they now have to have, which protects all of the other secrets. It's a cascading, never-ending nesting doll of secrets, is it not?

Speaker 2:

No, I think you're right about that, it is. So I would say there are two paths forward: there's what we're doing today, and there's where we're bringing the industry. What we're doing today is, when secrets enter our systems, we immediately do this process of tokenization. The idea is we encrypt that secret with our encryption keys; we have an encryption key for every customer, and we allow enterprise customers to double encrypt it with their own encryption keys, so they can connect their AWS, GCP, or Azure KMS and use their own keys on top. Once it's encrypted, we assign an ID to it, and that ID is what lives in all of our systems. The encrypted secrets themselves are stored in something completely firewalled off from the internet, not accessible to the internet. It has no dependencies, it's a binary, and it's exceptionally locked down, with a lot of alerting around it and only one way to communicate with it. It stays as locked down as we can possibly make it; you'd basically need to hack the underlying programming language to hack this thing. That's what we're doing today, and I think that's a pretty strong model. We have top-tier penetration testers that try to break us all the time. We use some of the best firms out there, the firms that BlackRock and Apple use, and they have not been able to find even a medium- or high-severity issue with us yet. So we do a pretty good job there, but I don't think that's the end-all be-all.
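
The flow described here, encrypt on entry, optionally wrap with a customer-held key, then pass around only an opaque ID, is essentially tokenization with envelope encryption. A toy Python sketch of that shape (the XOR keystream below is a stand-in for real encryption such as AES-GCM via a cloud KMS, every name is invented, and none of this should be used as actual cryptography):

```python
import hashlib
import secrets

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # Toy XOR keystream standing in for real AES-GCM/KMS encryption.
    # XORing twice with the same key decrypts.
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ s for b, s in zip(data, stream))

class TokenVault:
    """Tokenization sketch: a secret comes in, gets encrypted (optionally
    double-encrypted with a customer key), and only an opaque token ID
    circulates in the rest of the system."""

    def __init__(self, provider_key, customer_key=None):
        self.provider_key = provider_key
        self.customer_key = customer_key  # e.g. a customer-held KMS key
        self._store = {}                  # token ID -> ciphertext

    def tokenize(self, plaintext: bytes) -> str:
        ciphertext = toy_encrypt(self.provider_key, plaintext)
        if self.customer_key is not None:
            ciphertext = toy_encrypt(self.customer_key, ciphertext)
        token = secrets.token_hex(16)     # the only thing other systems see
        self._store[token] = ciphertext
        return token

    def detokenize(self, token: str) -> bytes:
        ciphertext = self._store[token]
        if self.customer_key is not None:
            ciphertext = toy_encrypt(self.customer_key, ciphertext)
        return toy_encrypt(self.provider_key, ciphertext)
```

The design point is that a breach of the systems holding the token IDs yields nothing without the separately stored ciphertext and both layers of keys.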

Speaker 2:

I think, really, where I want to take the industry is to get rid of secrets altogether, like, why have this static token that basically says, hey, if you have this thing, you can unlock a door?

Speaker 2:

We should really switch to an identity-based model, and we're starting to see this today with passwords: we had passwords before, and now we have passkeys, where you can scan your face or use Touch ID and get into a system because it's your identity. And machines have identities too.

Speaker 2:

Right, when you are in an EC2 instance or in a Kubernetes cluster or wherever it may be, that machine has an identity.

Speaker 2:

And where I'd like to take the industry, and this is something Doppler's very passionate about and working on behind the scenes, and it's going to take a couple of years for us to really get it to the scale we want, is that Doppler knows the identity you have in your cluster, and you can automatically authenticate to Doppler just by being on that machine.

Speaker 2:

So if you're on that machine, you're automatically authenticated to Doppler, and we have an identity mapping there. Then we also build an identity mapping with all the services you use, your database, your payment processor, your messaging system, all of those systems: instead of having an API key or a database URL, they have an identity that Doppler has mapped in. Just by being on that machine, you're automatically authenticated to your database, your messaging system, and any other systems you're using, and there are no secrets at all. In that kind of world it's very hard to hack anything, because you need to be on the machine to get access, versus just being able to steal a secret, run away with it, and use it at an arbitrary later date.
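
The secretless model described here generally works by having a broker verify a machine's identity (for example an EC2 instance profile or a Kubernetes service account) and mint short-lived credentials on demand, so no static key ever ships with the app. A toy Python sketch of that shape (all names invented; real systems verify identity cryptographically via attestation or signed tokens rather than trusting a caller-supplied string):

```python
import secrets
import time

class IdentityBroker:
    """Sketch of identity-based, secretless access: a verified machine
    identity maps to the services it may reach, and credentials are
    minted per request with a short TTL instead of living in config."""

    def __init__(self):
        self._grants = {}  # machine identity -> set of allowed services

    def register(self, machine_id: str, services):
        self._grants[machine_id] = set(services)

    def mint_credential(self, machine_id: str, service: str, ttl_s: int = 300):
        # In a real system, machine_id would come from verified
        # attestation, not from the caller's say-so.
        if service not in self._grants.get(machine_id, set()):
            raise PermissionError(f"{machine_id} may not access {service}")
        return {
            "service": service,
            "token": secrets.token_hex(16),     # ephemeral, single-purpose
            "expires_at": time.time() + ttl_s,  # stolen tokens age out fast
        }
```

Because every credential is short-lived and tied to an identity, stealing one buys an attacker minutes on one machine rather than indefinite access from anywhere.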

Speaker 3:

Let's talk about the other important side of the secrets coin: trust. I would love to get out of the secrets business altogether, but I would also prefer not to trust any of my secrets to other people.

Speaker 2:

Yeah, that's a good question.

Speaker 3:

Right, which is the reason why you are heavy in the developer community. There are so many people moving back to self-hosted models for a lot of those reasons, especially all the reasons you mentioned: all of the data scraping and analytics and everything that they're collecting on us. You've got folks moving to platforms that allow them to do everything on their own, including their own photo cloud albums; you can self-host all the things yourself these days, if, of course, you are capable of those things. But you travel in these circles, very developer-heavy environments. All of my developer friends are self-hosting a lot of platforms that other people typically trust others for, and the same is to be said for secrets in general. How do you see trust playing into this, especially in the world you just described, where we move away from traditional secrets? Because it feels like trust becomes even more important when secrets go away.

Speaker 2:

Yeah, the topic of trust is incredibly important. Doppler lands on the side of: we are a trusted system, not a no-trust system. We are a cloud-hosted service, and we don't support on-premise today. I think in the future we definitely will, but not today. And we have debated this thoroughly internally, this exact argument of whether we should ask customers to trust us or not, and after going through a lot of back and forth, this is where we landed.

Speaker 2:

We think these kinds of systems are the underlying dependency that keeps all your other systems up. If your secrets manager goes down, all of your infrastructure pretty much goes down at some point, so it's a real critical dependency, like a level-zero dependency. And in that world, we have found that one of the biggest problems with self-hosted systems is that they go down often. The most common complaint we hear about HashiCorp all the time is that it will seal itself automatically or bring itself down, and it's just a real pain. The other problem with self-hosting is that a company will go install the self-hosted version but won't update it, so when there is a vulnerability, I can't force you to update, I can't force you to get that security update, versus when it's cloud hosted, you can. So our model right now is: we are going to be very transparent about what we do to build your trust. We're going to have extremely strong documentation about our security posture. We're going to remove all analytics tooling from the dashboard. We're going to talk openly about the dependencies we do have, how they work, and what data enters those systems, so you have a very complete picture of what it means to trust Doppler. And eventually, as we move more and more upmarket, at some point I'm sure we'll start selling to serious government institutions and the military, and then we will have to move into a trustless world.

Speaker 2:

But for the customers we service today, those mid-market and enterprise customers, we have found that they're far more willing to trust us and offload the burden of management in exchange, because they know that if that system goes down, it could ripple and bring down all of their other systems, which they really, really don't want, and they also want automatic security updates, which is very hard to do when you're deploying on-prem. The other really hard part about on-prem is after you deploy it. Organizations tend to be very big, especially enterprises, and someone can go change a port or a firewall rule, and all of a sudden the thing goes down and we get called in to fix it, but we have no clue what changed. It may be 10 layers outside of our abstraction. We're looking at this instance, and all the green lights are on, traffic's being served correctly, but 10 layers up a firewall rule got added, and now it's broken inside that private network, and we have no ability to debug that. But they're coming back to us saying, fix the problem.

Speaker 2:

We're like, we have no idea, everything we see is green. Those are some of the problems we just don't want our customers to face, and that's why we've gone so much slower on on-prem, because I think it's very easy to take a bunch of code, bundle it up, and ship it to a customer. It's everything after that that's really freaking hard, and we want to make sure that when we ship something, it ships with that Doppler-level brand: you know that when you press the install button with Doppler, it's guaranteed to work, and to work the exact way you expect, and ten generations from now, or two years from now, it's going to work even better. That's a very hard promise to keep with on-prem. We'll eventually get there, but we've purposely chosen the side of trust.

Speaker 3:

No, right on, I respect that completely. There's something to be said for, if one is going to trade for trust, that value should then be returned. Yeah.

Speaker 2:

I appreciate that. If there weren't a fair trade, I would say we absolutely had to do it differently. But honestly, another interesting thing we've talked about a lot internally is that it's not really up to us to decide what the market wants. The market wants what it wants, and if the market is willing to trust us, then the market has spoken. So far we have been able to maintain and grow that trust over time, and I really hope to God we don't lose it. I think over time, as we move into bigger enterprises, they'll say, hey, we just don't have an option, we have to move to a no-trust model. But it's really dependent on what kind of customers you're going after, and for us, we think there's a huge density of important data that needs to be protected in the mid-market and enterprise before we move to government and military.

Speaker 1:

On the topic of trust, and I think this is different from what you guys are talking about: we're not talking about customer trust, right? I mean, kind of we are. I guess I'll expand on the thought you guys are digging into. When it comes to cybersecurity and privacy, nowadays I think companies and individuals are even more aware and more knowledgeable about privacy and about trust, and that's huge moving forward, especially for a company like Doppler, a company like Transcend, a company like Miota, any company in the tech space. When it comes to customer trust, I would imagine that is top of mind, because companies are not only looking at the actual tech that they're using and paying for, they're paying attention to how their private data is handled. That's huge. I'm sure you're seeing that with Doppler.

Speaker 2:

I'll give you a great example of why this is so hard, and why we've made some decisions that have made the business arguably harder to scale but better for the customer. A lot of companies today have these really crazy tracking policies. You go to their website and they track everything under the sun, and then, especially when you get into the product, they track even more. They'll have hindsight analytics, which is a big no-no in our eyes, and we purposely chose not to do that.

Speaker 2:

And we have a very low-attribution model. All of our analytics require you to opt in and say, hey, you're going to allow us to track this. Even then, we do a lot less tracking than normal companies do. And what does that come at the cost of? Our ability to grow revenue faster, because a lot of ad strategies require high attribution, and we basically said we're going to give up attribution for the sake of customer privacy and customer trust. Those are trade-offs we don't regret making, and we also think the world is moving in that direction, where the companies that are breaking the mold will have to get back in line very soon, or else they're going to start getting fines from Europe. But yeah, there are a lot of trade-offs, and I think you always have to choose the customer at the end of the day, because eventually they will start to choose you back.

Speaker 1:

Yeah, you have to. No matter what you're doing, I don't care if you're building tech or making pizza for a community, it's always about the customer, the experience, feeling like they're being heard and taken care of. That's true in any business.

Speaker 2:

Yeah, I'll give you another example of things we do at Doppler that make things a lot harder for us, especially around product decision-making, but better for the customer. We have an analytics solution we use called PostHog, and it does no client-side tracking at all once you're inside the dashboard, inside the product.

Speaker 2:

But we go a step further. We actually anonymize all the accounts. So in our analytics solution, we have no idea which company is using which features or tools. It's just a random ID for the users and for the companies. There's no emails, there's no PII, there's nothing. It is very, very hard for us to go and say this company did X behaviors inside the product. And we obviously do that because we didn't want PII data in PostHog. We didn't want another tool to have access to that data. What if PostHog has a breach someday? That could be Doppler data that gets exposed. Well, now it's not Doppler data that will get exposed. We know exactly where all PII data is, because we keep a very close guard on it. We know that even names and emails are things our customers are trusting us with, and they should be respected because of that.
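(For readers: the anonymization Brian describes can be sketched roughly like this. The salt, function name, and event fields below are illustrative assumptions, not Doppler's or PostHog's actual implementation.)

```python
import hmac
import hashlib

# A server-side salt that never leaves your infrastructure and is never
# shared with the analytics vendor (hypothetical value).
ANALYTICS_SALT = b"rotate-me-and-store-me-separately"

def anonymous_id(internal_id: str) -> str:
    """Derive a stable but unlinkable ID to send to the analytics tool.

    The vendor only ever sees this digest, never the real user or
    company ID, email, or any other PII, so a breach on their side
    exposes nothing identifiable.
    """
    return hmac.new(ANALYTICS_SALT, internal_id.encode(), hashlib.sha256).hexdigest()

# The event that leaves our infrastructure carries no PII at all:
event = {
    "distinct_id": anonymous_id("user_84212"),
    "group_id": anonymous_id("company_acme"),
    "event": "secret_rotated",
}
```

The trade-off is exactly the one described above: the same user always maps to the same digest, so feature usage can still be counted, but there is no way to walk back from the analytics dashboard to a named customer.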

Speaker 3:

That's a tough decision to make. The PostHog one, that is. Yeah, no, I respect it. I respect the hell out of it. That is not an easy one to make.

Speaker 1:

So what's the alternative there if you didn't make that decision?

Speaker 2:

Put all that PII data in PostHog, which a lot of companies do, and then you can do much better analytics. Can't you, and I'm a broken record now, but can't you run PostHog on-prem? You could. Actually, I think they just stopped doing that. So no, you can't. Okay.

Speaker 3:

There's another one... yeah, I can't remember, because I looked into this a while ago too, but there's a privacy-respecting one also that you can self-host. But it is PostHog, it is PostHog. I thought you were onto that.

Speaker 2:

Yeah, yeah, they started out on-prem, and this is exactly the thing I was talking about before. They started on-prem and they removed on-prem because they found it way too hard to support customers on-prem. They have a whole blog post about this: it's hard to get installs up, it's hard to debug things when they break, they were spending an enormous amount on support costs, and customers weren't happy with the on-prem installation. Like I said, it's easy to cut a release and ship it over. It's very hard to manage it and deliver a brand-level experience afterwards.

Speaker 3:

No, you're preaching to the choir. No two ways about it. I completely get it.

Speaker 3:

It has become very difficult because we come from a world where you couldn't SaaS-ify everything, and so you had to solve those problems yourself. And it does always beg the question for me sometimes, because we make similar product decisions and trade-offs, and we obviously have similar dependencies, all the things one might find in technology companies. Because so many things are available as SaaS, in some ways it has made it too easy to make that decision. So now, to your point, you still have to trust someone else, and you took that extra step to remove things even before the data goes out, which is, I think, a commendable move, maybe arguably the right one to make. But that, too, still brings a lot of friction for a lot of businesses.

Speaker 3:

That, I think, is difficult to make, and I guess what I would argue is man, there really should be some middle ground where we don't have to choose between not being able to understand behaviors of our customers, even at a customer level, because that can be useful, but still worry about how we again trust others, because you can say to yourself you know we'll keep that data ourselves and we trust us because you know we trust us, right Like wherever you want to draw the trust. Your valid points about, yes, but now we have to maintain all the infrastructure, is again one of those reasons why sass has become the behemoth it is, but, uh, I think the the entire sassification of the planet, though, is part of where the trust does break down, right like because there's this, there's this web of trust, even below that, that you just can't unravel for the most part yeah, I mean, at the end of the day, even our infrastructure is SaaS, right.

Speaker 2:

We're still using AWS. I don't own my servers, I rent them. All of us are pretty much using the same set of regions anyway. My code is probably running on the same server as yours. Almost guaranteed. Almost guaranteed. What is it, Virginia? Yeah. No, I think there is a strong argument to be made that not everything needs to be SaaS, not everything should be SaaS. And I think even with secrets management, it's not black and white. It's not like everything should be SaaS, because government and military make complete sense for why it should never be SaaS, right? The last thing I want is military secrets on AWS, right?

Speaker 3:

I mean, if you've spent a lot of time working with or inside the government, you might argue that SaaS would do it better.

Speaker 2:

I guess I haven't seen that other side of the story yet, and I know there's AWS GovCloud.

Speaker 3:

Yeah, which is a nice middle ground, if you would, because leaving it purely to the government to run their own infrastructure is also problematic, even from a citizen's perspective. It is just wholly problematic too. I don't want them... I mean, yeah, anyway, that's a whole aside.

Speaker 1:

On that same topic, though, isn't that why they created FedRAMP?

Speaker 3:

It's related. Yeah, the short answer is it's related, for sure. It allows them to trust others.

Speaker 2:

We just touched on something I actually want to dig a little deeper on. Yeah, this is another big thing I think is wrong with the industry around secrets management. We have compliance bodies like SOC 2, ISO and so on that are there to tell small and medium-sized companies not only who they should trust, but what set of things their dependencies, their vendors, should have from a security perspective, why you should trust them.

Speaker 3:

Yeah, exactly yeah, here's who you should and here's why you should.

Speaker 2:

Yeah, and you know the one thing they're all missing? All those compliance bodies, outside of FedRAMP, are missing secrets management. They don't tell you anything about how to store the keys to the digital kingdom.

Speaker 3:

Oh, no, I find myself asking very similar questions, which is why I asked you, too, about the nesting doll of trust I've been having this long-running conversation about. I put a very cheeky post online where I was like, prove to me your immutability is not just a toggle, with the little meme of the dog sitting there with the coffee, because your immutability is likely just a toggle. And then there's this trusting that comes back, and folks are like, but I pressed the button, and I'm like, and what exactly governs that? To your point, it's just a policy. We've gotten to a place where some other body has created a policy, and we trust these entities based on that policy. I think you're right. There's a massive problem there. Yeah, a massive problem. Are people actually fighting it?

Speaker 1:

You know that specific topic. It's almost like a follower type of thing.

Speaker 3:

Oh, I mean, I'll certainly say the following: I've never heard anyone other than Brian complain about the obvious, which is that secrets management is woefully missing. I know NIST finally updated their own guidance on secrets management like two years ago, which means it's also already outdated. I'm fairly certain it hasn't been updated since, but I could be wrong.

Speaker 1:

Is that because they then won't have control? Is that why, Brian, you think?

Speaker 2:

I think it's missing not because of anything super nefarious, but more because secrets just haven't been thought about that much in the industry. If we go back five years, to when Doppler was created, everyone was on ENV files. For anyone who doesn't know, an ENV file is a literal file on disk, unencrypted, that has all your secrets, and that was the standard way. And even if you go to the most popular frameworks on the planet today, including Kubernetes, you will find there's still a recommendation to use ENV files. The industry is still on an archaic system that basically says, hey, yeah, put your secrets unencrypted on disk, which is the worst of the worst, of the worst possible solutions you can do. That's the default recommendation for almost every framework I can find on the internet right now. It's just the mentality that's going on in compliance, and I might argue that it's a little worse than "it hasn't been thought of."
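(For readers: the ENV-file pattern Brian is criticizing, and the runtime-injection alternative that secrets managers offer through run wrappers such as `doppler run -- python app.py`, looks roughly like this in Python. `get_secret` is an illustrative sketch, not any product's API.)

```python
import os

# The anti-pattern: a plaintext .env file sitting on disk, e.g.
#   DATABASE_URL=postgres://admin:hunter2@db.internal/prod
# Anything that can read the filesystem (a leaked Docker layer, a stray
# backup, a path-traversal bug) now has the keys to the kingdom.

def get_secret(name: str) -> str:
    """Read a secret that was injected into the process environment.

    With a secrets-manager run wrapper, secrets live only in the process
    environment at launch and are never written to a file on disk.
    """
    value = os.environ.get(name)
    if value is None:
        # Fail fast instead of limping along with a missing credential.
        raise RuntimeError(f"secret {name!r} was not injected; refusing to start")
    return value
```

The application code stays identical whether the environment is populated by a secrets manager in production or by a developer's shell locally; only the plaintext file disappears.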

Speaker 3:

I think it's worse. I think they think they included it. I think they think that all of the controls around who should have access to what covered it, in the larger overall least-privilege access story. And you would be right, it doesn't go far enough. But I think it's even worse than "they haven't thought about it." I think they think they thought about it.

Speaker 2:

I think you may be right, and what it is is they don't understand the difference in severity between hackers having your code and hackers having your secrets. To them it's all just text, and both texts are the same, especially if it's an ENV file, which is a file, and files are code.

Speaker 3:

It all kind of blends together. I am largely a hacker by trade, for the better part of my career, and that's the problem with a large part of the industry: you have a lot of IT solutions being built by brilliant IT people, and even worse, you have all of these IT policies, and there's almost guaranteed no security thought in the room. And when I say security thought, I'm not talking about the defenders, I'm talking about the builders and the breakers. There's largely only defenders in that conversation, and to your point, if you had both builder and breaker in that room as well, you would get two people who understood that, ah, code is also a thing that requires that thought process. But there's only defenders in that room.

Speaker 2:

No, I think you're 100% right. Quick tangent story: there was some big thing I went through in the past, and we were using lawyers for it, and I realized there are two sets of lawyers. There are lawyers that can review docs, and then there are lawyers that will try to break those docs in court, and they're not the same, by the way. You really want the guy who has spent his entire life in court, the guy who goes to court every day excited to break those documents, writing your docs, because he's going to know all the loopholes and cover them up. And we need the exact same thing in those docs for compliance.

Speaker 2:

Yeah.

Speaker 3:

I was at a user conference last week, like literally screaming at the top of my lungs: why do all of these people have security slapped all over their vendor booths, and there are zero security people here? How is this possible? How is this possible that you all swear that your product is secure? Because security is a thing. I'm not going to name names and get into it, but it's a hard focus in that world at the moment.

Speaker 2:

Yeah, I'll add one. This one frustrates me to no end. SOC 2 has a really interesting policy that says that if you're SOC 2, all your vendors have to be SOC 2. So that basically means everyone in the industry that does B2B, and most B2C, have to be SOC 2, be it business-to-business or business-to-consumer. And what I find really hilarious about that is that the people verifying that a company has done all the security things to be SOC 2 approved are not security people, but tax people.

Speaker 3:

I was just going to say, you're about to make enemies with the big four, because they're the ones that wrote SOC 2, and it's written that way so that more people would have to have SOC 2. Do you like hot takes, though? Yeah, that's the hottest of takes. I'm with you on this one, Brian. Rabble rabble. Where do we roll? Because I've got extra pitchforks.

Speaker 2:

I got more than enough kerosene. Honestly, there is something Doppler is working on right now where we are trying to get secrets management into SOC 2 as a compliance requirement.

Speaker 2:

And to be very clear for everyone on the call, we're not trying to get Doppler to be a hardcore requirement, Doppler and only Doppler. We are trying to get secrets management in. You can go use our competitor, for all I care. There's a big hole in the security stack that governs all startups and enterprises. We need to plug that hole, and we're trying to work with any company that wants to help work on that.

Speaker 3:

You've got to go to the big four system integrators, honestly. They're the gatekeepers to getting that document updated.

Speaker 2:

I shit you not, we are starting those conversations with them. We're thinking on the same train, yeah.

Speaker 1:

That's pretty cool.

Speaker 2:

So if you want to work hand in hand with us, we're very happy to do that.

Speaker 1:

I'm trying to think. Gabe looks like he's frozen, so I don't know if he's talking or not. No, he's not frozen anymore. I completely froze up on my side. Awesome. All right, let's move to the future. Obviously, AI is a huge topic. Do you have any concerns, or what should we be looking at ahead when it comes to secrets management, data privacy, all the things we've been talking about, with AI included? I mean, we've got freaking people, I think in Arkansas, writing laws with ChatGPT. There have got to be concerns there. What's top of your mind for the future? Is that not an improvement?

Speaker 2:

I'll do the stuff related to my industry, and then I'll give you one that I think is more personal to me. Actually, I'll do the personal one first and close out with the industry one. So on the personal side, I tell my friends this all the time: I think we are living in the coin-flip era, and in the coin-flip era we can either get utopia or dystopia, and it's going to happen in our lifetime, guaranteed. And there's a fundamental mathematical reason for that. Technology is growing at an exponential rate. Society is maturing at a linear rate, maybe even a negative one right now. And when you have that big divide, where technology grows faster than the maturity of society, but society is the one controlling the use of the technology, it gets very, very scary. Imagine if an infant had the button to press a nuke. Maybe some nukes would go off. So I don't know where that leads us, but I do think it's very possible we'll get either utopia or dystopia in our lifetime.

Speaker 1:

I don't know why, but first thing that popped in my head was Stewie from Family Guy. He would definitely press the button.

Speaker 2:

Yeah, and if you want to dive a little deeper into the thinking behind that, there's a really good book called What's Our Problem? by Tim Urban. Really easy read, too. Goes pretty deep on that. And then on AI, yeah.

Speaker 3:

I was going to throw in, because it is a fascinating thought. Part of what scares me, though, is that dystopia is the utopia some people are literally praying for.

Speaker 2:

Yeah, I guess it really depends on what variant of dystopia we're looking at.

Speaker 3:

One thought that I have a lot: I live in the US South, and there are people whose entire belief system is literally grounded in a dystopian future. That is their utopia. I guess what I'm getting at is, I don't disagree with what you're saying. It does depend on what you think dystopia is. The rapture is a whole-ass thing that many people want to see come about.

Speaker 2:

It's also a flavor thing. I think the real question is, what's the flavor of dystopia?

Speaker 3:

That's also true, because some people would just as soon see a whole other cross-section of society find themselves in what looks a lot more dystopian, while they remain in what they think is utopian. Yeah, no, I don't disagree with anything you're saying at all. I just worry that some people are literally rooting for dystopia.

Speaker 2:

It just depends. I think it really comes back to... sorry, if their meaning for dystopia is anarchy, yes, I can totally see that. But I can totally see another side, which is that the human brain, at the end of the day, is only so complex, and it's not going to get more complex.

Speaker 2:

And if AI decided, hey, we want to understand how to master the human brain, to control all human thought... well, in my mind, a dystopia would be one where humans lose all ability to have any level of reasoning and thought, and we just become mindless drones that do whatever the fuck an AI says. That'd be dystopia. Mine too. And I don't think anyone wants that, right? Everyone, at the end of the day, still wants a little bit of free will. But that could be a very real future, right? That could be a future where an AI is tasked by an authoritarian government that says, we want to control our people. Well, the best way to do that is to literally control them, right? That can be quite scary. So, yeah, there's a lot of flavors of utopia and dystopia.

Speaker 1:

So I know we're coming up on time here. I want to be respectful, but if there's anything, Brian, that you wanted to touch on that we didn't, anything you wanted to bring up... this has been great. I feel like there's a lot more stuff we can dig into. We can always do a part two down the road. We're going to have to have a part two.

Speaker 3:

Yeah, totally. We're going to have to dig into dystopian futures.

Speaker 2:

I appreciate that, and I would love to come back. I'll just close out with why I came here, which is this is a PSA. I want everyone listening who actually has authority over data and is responsible for protecting data to remember that it's real people's data on the other end of the line. And if you can't answer these three questions: where are all my secrets? Who has access to them, and do I have auditability of that access? Can I stop a data breach when it happens? Then you have a problem. Go fix it. It could be Doppler, it could be something else. If you want to know more from Doppler's perspective, go to doppler.com/unfuck. U-N-F-U-C-K.

Speaker 3:

We'll teach you how to unfuck your secrets and you can get going from there. That's awesome.

Speaker 2:

You heard it here first, ladies and gentlemen. Your secrets are fucked.

Speaker 1:

Very quickly, I love that. Reminds me of that one book. I forget what it's called, but it's got the F word in it, the one about how to not give a fuck. Yeah, that one.

Speaker 3:

Yeah, you had it. Yeah, ladies and gentlemen, this has been a fabulous show. I appreciate you so much, brian. Thank you for being here today. Yeah, thank you, brian. Appreciate you, man.
