Mission: Impossible Movie Teaches Real Cybersecurity Lessons

This episode explores surprising cybersecurity lessons hidden within Mission: Impossible's latest blockbuster. We analyze how Hollywood's depiction of AI threats, immutable backups, and air-gapped storage actually reflects real-world data protection challenges.
Curtis and Prasanna dissect the movie's central premise: an AI entity altering digital reality, making it impossible to distinguish truth from fiction. The solution? An underwater Doomsday Vault containing an immutable, offline backup of the original source code. We discuss how this fictional scenario mirrors actual cybersecurity best practices, from 3-2-1 backup strategies to cryptographic hash verification.
Key topics include the spectrum of immutability, why truly offline storage matters for ransomware protection, and how insider threats can compromise even the most secure systems. We also cover practical applications like object storage, SHA-256 hashing, and the human vulnerabilities that often undermine technical security measures. Whether you're a backup professional or just curious about data protection, this episode proves that sometimes the best cybersecurity lessons come from the most unexpected places.
You found The Backup Wrap-Up, your go-to podcast for all things
Speaker:backup, recovery, and cyber recovery.
Speaker:In this episode, we extract some interesting cybersecurity and backup
Speaker:lessons from Hollywood's take on AI threats and data protection.
Speaker:The movie actually gets a lot of things right about immutable backups, air gap
Speaker:storage, and the importance of offline copies when everything goes sideways.
Speaker:Well, we also talk about how an underwater doomsday vault saves the world.
Speaker:Why you need cryptographic hashes to prove data integrity and what
Speaker:happens when your cybersecurity plan doesn't account for the human element.
Speaker:The episode, I think, is a lot of fun, and you will actually learn a few things
Speaker:about protecting your data from AI
Speaker:and regular old ransomware attacks.
Speaker:By the way, if you don't know who I am, I'm W. Curtis Preston, AKA Mr.
Speaker:Backup, and I've been passionate about backup and recovery ever since
Speaker:I had to tell my boss that there were no backups of the production
Speaker:database that we had just lost.
Speaker:I don't want that to happen to you, and that's why I do this.
Speaker:On this podcast, we turn unappreciated backup admins into Cyber Recovery Heroes.
Speaker:This is The Backup Wrap-Up.
Speaker:Welcome to the show.
Speaker:Hi, I am W. Curtis Preston, AKA Mr. Backup.
Speaker:And with me, I have a guy that took way too long to see the movie
Speaker:that we're talking about today.
Speaker:Prasanna Malaiyandi.
Speaker:How's it going?
Speaker:Prasanna,
Speaker:I am good.
Speaker:Do you wanna tell listeners why it took me so long to see it?
Speaker:because you were doing research sort of
Speaker:No.
Speaker:That, no, I
Speaker:wait.
Speaker:in the country.
Speaker:Oh well there is that, but you could have seen it in India.
Speaker:Couldn't you?
Speaker:Have had
Speaker:I could have, I don't know if it would be, it probably would
Speaker:be the same experience, but
Speaker:It would be interesting if they dub it.
Speaker:If they dub it, not dub it.
Speaker:What do they call it?
Speaker:What?
Speaker:What do they call it?
Speaker:Do they dub it?
Speaker:Is that the word?
Speaker:Yeah,
Speaker:dubbed or
Speaker:in Hindi watching Tom Cruise speak Hindi.
Speaker:That would be, that'll be awesome.
Speaker:what?
Speaker:If you, I bet if you watch the old ones on Amazon, you
Speaker:Uhhuh.
Speaker:watch it in different languages and I bet you Hindi and I think
Speaker:they might also have Tamil,
Speaker:Oh really?
Speaker:which
Speaker:Wow.
Speaker:me off all the time.
Speaker:But yes, I was watch because every once in a while, for some reason, um.
Speaker:Uh, when I watch like a TV show on Amazon Prime, the audio
Speaker:would be changed to Tamil,
Speaker:Uhhuh.
Speaker:Oh, really?
Speaker:and I'm like, wait, this is not the right language.
Speaker:not right.
Speaker:Um, yeah, that's, that's pretty funny.
Speaker:But that it, that it, every once in a while for me, it
Speaker:will, it will put it in Spanish,
Speaker:Yeah.
Speaker:So the thing though is when I, so I have seen English movies in India.
Speaker:In fact, I saw Life of Pi in India and so it was English, fine.
Speaker:is kind of funny given, you know, given the content of that movie,
Speaker:seeing that movie in India is, what's the, the guy, the, the main dude.
Speaker:Is he
Speaker:Patel,
Speaker:da?
Speaker:No, he's Indian.
Speaker:Dev Patel.
Speaker:Yeah.
Speaker:Yeah,
Speaker:grew up in the UK though, or something like that.
Speaker:because he, he's, yeah, he is got kind of a, kind of an English accent a little bit.
Speaker:Yeah.
Speaker:But it was a good movie, you know?
Speaker:it was.
Speaker:And you saw it recently again.
Speaker:No.
Speaker:the way, for people who don't know,
Speaker:Yeah.
Speaker:we're talking about for this episode is actually the Mission Impossible series.
Speaker:The Mission Impossible Final Reckoning.
Speaker:Yes.
Speaker:And well, sorry, what I was referring to is you recently saw it again,
Speaker:Oh yes.
Speaker:The, the Mission Impossible movie.
Speaker:Yes.
Speaker:Yeah.
Speaker:So I've now seen it three times.
Speaker:Has
Speaker:have now spent 10 hours of my life watching this movie.
Speaker:Or basically two Bollywood movies.
Speaker:Basically,
Speaker:No, actually I
Speaker:I don't know.
Speaker:This might qualify to be a single Bollywood movie given,
Speaker:given that it's length.
Speaker:Yeah.
Speaker:In fact, my wife was like, oh my God, that was a little long.
Speaker:Yeah.
Speaker:Yeah.
Speaker:It's 'cause when you guys watch Bollywood movies, you kind
Speaker:of, don't you, pause them.
Speaker:We do it over
Speaker:Yeah,
Speaker:Yeah.
Speaker:yeah.
Speaker:But this was like, you gotta watch it.
Speaker:Well.
Speaker:I, yeah.
Speaker:You have to watch this movie like that.
Speaker:Yeah.
Speaker:and the best part is I think when we're watching it after it was done,
Speaker:she was like, they don't give you an intermission in the middle of the movie,
Speaker:so you could take a bathroom break or grab some snacks or whatever else.
Speaker:Because most Indian movies, they will pause in the middle,
Speaker:Oh, interesting.
Speaker:an intermission, so then you could go like, go grab snacks or
Speaker:go to the bathroom or whatever
Speaker:And,
Speaker:continue.
Speaker:and this is my chance to mention my favorite app,
Speaker:I did
Speaker:um, which, which is, which is, you know, not a sponsor.
Speaker:This is an app I've been using for 10 years, and it's like, you know.
Speaker:Changed my life.
Speaker:So the app is called RunPee, literally the word RunPee.
Speaker:And it will tell you, like, no, pee as in go pee.
Speaker:Okay.
Speaker:It will tell you when you can go run and pee in the movie.
Speaker:Right?
Speaker:And it's free, uh, as long as you don't mind, like movies
Speaker:older than like a week or two.
Speaker:Uh, but if you, if you don't mind either watching a bunch of ads to get, uh,
Speaker:uh, credits to use towards pee coins is what they call them,
Speaker:or, in my case, I pay a dollar a month
Speaker:to, to belong to the, you know, to the pee universe.
Speaker:And that, that literally, I'm on the West Coast, so this stuff comes out.
Speaker:Literally, I, I get, like if it's, I'm going to see a show here in a
Speaker:couple hours, that comes out today, the Pee Times will be in that
Speaker:movie by the time I get to the movie.
Speaker:And basically it will tell you if you just use it like regularly, if you use
Speaker:it without paying at all, the number one thing it will tell you is, is there
Speaker:anything at the end of the credits.
Speaker:Which is like, it, it does that for all movies, which is nice to know.
Speaker:Uh, but if you, if you have the credits or whatever, it will tell you
Speaker:basically at 15 minutes and 37 seconds.
Speaker:Um, when, uh, in the, in this movie, uh, when, um, when Angela Bassett
Speaker:says, Ethan, where are you?
Speaker:Um, you can go, you get three minutes and they tell you what
Speaker:happens while you're gone.
Speaker:And then even better.
Speaker:If you don't want to have to think about that while you're watching the
Speaker:movie, you can set a timer and it'll tell you when the Universal logo fades
Speaker:out, press the go button and then you can just, and it will buzz your leg
Speaker:Hmm.
Speaker:time to pee.
Speaker:And you can, and you can, uh, go run.
Speaker:And then they have this little synopsis while you're gone and
Speaker:tells you, um, it's awesome.
Speaker:Uh, and they'll also tell you, they'll be like.
Speaker:You know, the second P time is the best.
Speaker:Like you don't miss anything, you know?
Speaker:Or like the third p time is, you know, it just depends on the movie.
Speaker:They'll be like, the third p time is for emergencies only because
Speaker:you know, you do miss this or that.
Speaker:Yeah, yeah,
Speaker:so for Bollywood movies, would they basically just time it
Speaker:for every single dance number?
Speaker:So you basically have like 12 different options
Speaker:yeah, yeah, yeah.
Speaker:Yeah.
Speaker:So.
Speaker:I, first off, I loved this movie.
Speaker:Um, it, um, and, and by the way, spoiler alert, do not listen to this
Speaker:episode if you don't want to hear, you know, secrets about this movie.
Speaker:And if you haven't seen the movie, you know what?
Speaker:I don't even know what to tell you.
Speaker:'cause it's about to leave theaters.
Speaker:Um, yeah.
Speaker:a critical thing.
Speaker:'cause I think it's so, unlike you, Curtis, I don't watch
Speaker:many movies in theaters.
Speaker:I think it's
Speaker:Yeah.
Speaker:couple a year.
Speaker:Right.
Speaker:And this was one of them though, where it was like, you must see it on a
Speaker:Yeah.
Speaker:to really.
Speaker:enjoy it
Speaker:Yeah.
Speaker:the action sequences, everything else is just so much bigger
Speaker:on a big T, a big screen.
Speaker:Yeah.
Speaker:TV is 98 inches, it's still,
Speaker:it's still not the same.
Speaker:still not the.
Speaker:that there's, there's a pounding audio and everything, right?
Speaker:And, and also the communal experience of watching it with other people
Speaker:and the other people reacting.
Speaker:When I saw it on opening day, the, again, spoiler alert,
Speaker:uh, this is your last chance.
Speaker:Uh, when, um, bill Dunlow shows up on the screen, the audience
Speaker:literally cheered, right?
Speaker:people who may not know who Bill Donloe is,
Speaker:I.
Speaker:from one of,
Speaker:Go ahead.
Speaker:know the actor's name,
Speaker:Okay.
Speaker:He
Speaker:That's the character's name,
Speaker:Oh, is
Speaker:but go ahead.
Speaker:Yeah.
Speaker:So he is a person, if you remember whichever movie it was, where
Speaker:The very first episode.
Speaker:where they're trying to steal the NOC list and he's in that
Speaker:Yeah.
Speaker:secure vault and he is
Speaker:Yeah.
Speaker:rappelling down and he drops the knife and
Speaker:It's where the, the famous scene of the, where he like hovers six
Speaker:inches from the floor and almost,
Speaker:So.
Speaker:The guy who Curtis is referring to was in that movie, and he's a person
Speaker:who basically, they stole his ID and were impersonating him inside.
Speaker:And what you learn from this movie is he's actually the designer of the vault.
Speaker:And so when the vault actually ultimately gets penetrated and the,
Speaker:the data is stolen, they send him to a, you know, a, a listening station
Speaker:in, you know, the North Pole somewhere.
Speaker:Yeah.
Speaker:Alaska.
Speaker:And, um, and he shows up in the movie now, like 30 years later,
Speaker:which is, which was just really cool.
Speaker:It was a nice fan service thing.
Speaker:But yeah.
Speaker:So anyway, um, what I thought would be fun, uh, what I enjoyed the most
Speaker:about this movie, is that, first off, in some sense it was a little on
Speaker:the nose, because we are starting to see, because the, you know, the core, like,
Speaker:premise here is that there is this artificial intelligence that is
Speaker:altering reality as we know it.
Speaker:And, you know, and they show pictures of like, it's like
Speaker:replacing tanks with school buses and, you know, and all this stuff.
Speaker:And, and nobody knows what truth is anymore.
Speaker:And we are in the middle of that, right?
Speaker:Did you see the article?
Speaker:In fact, The Register published something recently
Speaker:Mm-hmm.
Speaker:they had talked about how today with AI, it's very similar to like
Speaker:when the atomic bomb went off,
Speaker:that everything changed, right?
Speaker:Like all the particles, everything else changed after that one moment,
Speaker:Yeah.
Speaker:like, AI is the exact same.
Speaker:Where like everything is different now.
Speaker:And like you said, like this movie points out, it's really hard to tell
Speaker:like what's real versus what's not.
Speaker:And it could
Speaker:Yeah.
Speaker:worse in the future.
Speaker:Up.
Speaker:unleashed something.
Speaker:Yeah.
Speaker:Up until now.
Speaker:If you saw a video of something, you could believe that that thing actually happened.
Speaker:Now it's like, here's this video of this politician saying this
Speaker:thing, and you immediately go, did he actually say the thing?
Speaker:Or is this a fake video?
Speaker:That's the new reality that we're living in, and that, and some of
Speaker:that AI is amazingly, uh, accurate.
Speaker:You know what?
Speaker:I hope it'll do.
Speaker:What.
Speaker:I hope we could just like tell it, create this podcast episode and it'll generate
Speaker:videos of the two of us that are lifelike, that have all the nuances that we use
Speaker:How do we,
Speaker:the episode.
Speaker:how do we know that hasn't happened already?
Speaker:Listeners, if you think that this is an AI-generated podcast
Speaker:episode, please leave a comment.
Speaker:Yeah.
Speaker:And don't, don't ask us if we're homeless.
Speaker:There was a, we got that one comment.
Speaker:The guy said he was joking.
Speaker:He said he was joking.
Speaker:'cause I was like, wow, that ouch.
Speaker:You know, uh, yeah.
Speaker:He's, he is like, yeah, these guys, they kind of look homeless, but they
Speaker:know what they're talking about.
Speaker:Um, yeah, so we're definitely in, you know, some call it the post-truth world.
Speaker:Uh, it, it's very difficult.
Speaker:I, in some sense, I think
Speaker:that if it causes everyone to question everything and then you go and have to
Speaker:verify the source, uh, that's not necessarily bad, but it is difficult
Speaker:because there are a lot of things that have happened over history where
Speaker:when you had the video of the thing,
Speaker:uh, that was, that was conclusive proof. I can think of, like, celebrities
Speaker:dropping the N-bomb, for example.
Speaker:Uh, you know, if you've got a video of that or, or like a celebrity, I can
Speaker:think of some people that have like beaten up their girlfriends and it
Speaker:was caught on a security camera again.
Speaker:Yeah, yeah.
Speaker:Elevator footage.
Speaker:Yeah, exactly.
Speaker:Um, anyway, so, but
Speaker:no,
Speaker:yeah, but go ahead.
Speaker:thing though, just talking about this, right?
Speaker:How it's hard to tell, uh, reality from fiction, right?
Speaker:And what really happened in the movie though, one of the things that they
Speaker:were trying to do as this entity, this AI thing, started taking over
Speaker:the world and changing everything.
Speaker:Is they were like, we just need to dump all the data that we have
Speaker:and like basically go offline.
Speaker:Right.
Speaker:So they
Speaker:Yeah.
Speaker:scenes where there were like CIA analysts and military researchers.
Speaker:Right.
Speaker:They were doing everything.
Speaker:Old school.
Speaker:Right.
Speaker:They had like giant displays and models where they were
Speaker:physically moving ships around on
Speaker:Yeah.
Speaker:and giant rooms of people just like, like,
Speaker:writing down physical copies of everything that was digitized.
Speaker:Right.
Speaker:So
Speaker:yeah,
Speaker:some physical copy
Speaker:yeah.
Speaker:happened to the digital.
Speaker:And, and you know, to bring it back to our world, um, there's definitely, there
Speaker:was definitely some, some ideas there.
Speaker:Um, well, I, I don't wanna get into that just yet.
Speaker:Uh,
Speaker:I, I, I don't
Speaker:but go ahead.
Speaker:either because there was
Speaker:Yeah.
Speaker:I wanted to talk about too.
Speaker:One thing that I wanted to ask you.
Speaker:So they're sitting there tr so.
Speaker:You can't trust if what you're seeing is truth or fiction.
Speaker:Right.
Speaker:Right,
Speaker:Um, and so they were trying to preserve it.
Speaker:right.
Speaker:Right.
Speaker:I guess a couple questions I have for you
Speaker:Mm-hmm.
Speaker:are there alternative ways to preserve that data in such a way that you know
Speaker:that it is the original, authentic piece and not something that has been changed?
Speaker:Yeah, let's let, let's, let's not talk about that just yet.
Speaker:But yeah, that's definitely a question that I want to, that I
Speaker:want to answer in this episode.
Speaker:Right.
Speaker:Um, but just, I just want to finish commenting on the overall movie.
Speaker:I really enjoyed it.
Speaker:It was
Speaker:amazing, really long.
Speaker:Um, I, um, first off, the concept, the premise of the movie, a bit on the
Speaker:nose at, at the moment, um, that was actually somewhat uncomfortable at parts.
Speaker:What I really enjoyed was that for the most part, with a few,
Speaker:especially towards the end, a few crazy things,
Speaker:they got the concepts right, the tech concepts, they got the scuba concepts.
Speaker:I'm a scuba diver.
Speaker:Um, and they even got things right that they didn't really have to explain to
Speaker:the average viewer, but they got it right for the, you know, the handful of us
Speaker:that are in the scuba, scuba community.
Speaker:And same thing with the techie world.
Speaker:Right?
Speaker:I loved this idea of the Doomsday vault.
Speaker:Right.
Speaker:I, I'm pretty sure it's total fiction, uh, because that, that sounds like
Speaker:cooperation between like world governments and I don't see that happening.
Speaker:Um, I thought it was weird that it was like floating in water, but whatever.
Speaker:I, I
Speaker:Um,
Speaker:that scene and I was like, that is really interesting.
Speaker:It's not completely outside fiction
Speaker:It is not completely unrealistic.
Speaker:Right.
Speaker:There is, by the way, there is like, there is a doomsday like seed vault.
Speaker:I, I, I'm aware of like heirloom seeds and stuff.
Speaker:Yeah, yeah.
Speaker:Norway isn't also, not technically Doomsday Vault, but
Speaker:isn't like the LDS uh, genetic
Speaker:Yes, yes.
Speaker:Yeah.
Speaker:And
Speaker:similar
Speaker:the salt mines in, in Utah.
Speaker:Yeah.
Speaker:Yeah.
Speaker:Um, and.
Speaker:But yeah, I, I liked that they, they took a lot of effort with
Speaker:some, obviously artistic license.
Speaker:They took a lot of effort to get the tech right.
Speaker:Um, I mean, you know, there were a couple of silly things like, so,
Speaker:so one of my best friends is a, is a former, uh, submariner, right.
Speaker:And I asked him, I was like, do they have a room like that?
Speaker:And, and he is like, they actually do kind of have a room like that where
Speaker:you can go in and outta the sub.
Speaker:It's meant mainly for escaping.
Speaker:It's not, subs aren't made to.
Speaker:To go diving the way that they depicted in the, in the movie.
Speaker:But he's, but the, the most unbelievable part about that room is how big it was
Speaker:that even if they had a room like that, that room could fit like 10 people.
Speaker:He's like, there's no room on the sub that fits 10 people, let
Speaker:alone a room that's used once in a while for, for scuba diving.
Speaker:But, but anyway, um, yeah, I, I enjoyed that, that stuff.
Speaker:But let's talk about the the main plot.
Speaker:I'm gonna reword it in my words.
Speaker:In order to kill the AI, what we need is
Speaker:an immutable copy of the original source code that's been stored in
Speaker:an air gapped, a water gapped, um, a water gapped storage facility,
Speaker:far offline from everything.
Speaker:And, uh, and then, you know, we get, we're gonna combine that with this poison pill.
Speaker:Um, so, uh, I I love that.
Speaker:I love that.
Speaker:Basically the entire, the final episode of, of, um, because we just
Speaker:did an episode on air gapping, right?
Speaker:That the, the final episode of Mission Impossible and arguably the
Speaker:entire Mission Impossible series,
Speaker:the core, the, the, the thing that's gonna win the day is an air gap backup, right?
Speaker:Or, or like I said, like a water gap backup.
Speaker:What did you think about that?
Speaker:No, that was, yeah, when I, I, you know what?
Speaker:I actually didn't think about that until you just mentioned it,
Speaker:Yeah,
Speaker:then I was like, yeah, you're right though.
Speaker:It was, we have a copy that's like buried with a submarine
Speaker:that no one knows where it is.
Speaker:yeah.
Speaker:part
Speaker:I.
Speaker:little kind of.
Speaker:Not quite what we would think about, right?
Speaker:You wanna know where your air gap
Speaker:Yeah, we, it's kind of important when you do the 3-2-1, it's
Speaker:important to know where the one is.
Speaker:but, but other than that, you're right.
Speaker:It was completely offline, immutable, no access to anything.
Speaker:Yeah,
Speaker:that is literally what, uh, saves the world.
Speaker:yeah.
Speaker:Um, and I, I did, you know, it did make me want to ask questions of, like,
Speaker:so there's no other copy?
Speaker:There's no, I think what they were trying to say was that all other copies
Speaker:Had been
Speaker:have been infected by the ai.
Speaker:Right.
Speaker:To which I still want to say there are no other copies that are
Speaker:stored in like immutable storage
Speaker:No,
Speaker:the original source code.
Speaker:probably not.
Speaker:That, that just, that, that bothers me.
Speaker:Well, because remember though, Curtis, right?
Speaker:Part of it is, right, if you think about AI in general, right?
Speaker:Or LLM, machine learning, right?
Speaker:Part of it is the source code, but part of it is all this like
Speaker:training data and other things, which is constantly evolving.
Speaker:So what you get from one day to the next may not be the same, right?
Speaker:And
Speaker:Yeah.
Speaker:it could be learning, it could be, right?
Speaker:It's the constant learning model, right?
Speaker:So.
Speaker:Yeah.
Speaker:And you know, and, and again, there's things that you just have to let
Speaker:go, you know, the willing suspension of disbelief, as we call it.
Speaker:I'm like, how do you write a poison pill for some source code that you, that you
Speaker:don't know what the source code does?
Speaker:I don't know, whatever.
Speaker:Not to mention the fact that you're, you got this special little slot
Speaker:thing that's designed to go in, that's designed to go in the special little
Speaker:slot in a device that you've never seen.
Speaker:That part reminded me a little bit of, um, Independence Day, where, um,
Speaker:virus and.
Speaker:yeah, where they had, they upload the virus to the thing.
Speaker:It's like, it's there, there was actually a blog post that came out back then.
Speaker:It was like the, the 20 things I learned about
Speaker:it from watching, or I don't know.
Speaker:I think it was like the 20 things I learned, and one of them was if
Speaker:you go up to a top secret military facility, if you just show them an
Speaker:alien, they will let you right in.
Speaker:If you remember, they had 'em in the back of the pickup truck.
Speaker:They're trying to get in, they pull up the tarp and they're like,
Speaker:oh yeah, come on, come on in.
Speaker:Uh, and then, but my favorite was a Mac can interface with, even though it can't
Speaker:interface with most earth computers.
Speaker:This was, this was back when Macs were, were really different, you know, uh,
Speaker:uh, as I'm sitting here on a Mac.
Speaker:Um, but yeah, that was, um.
Speaker:That was a funny one.
Speaker:And so there's, there's a couple things like that to, to poke fun at.
Speaker:I will say, just for those curious, the scuba stuff,
Speaker:they got it pretty dang right?
Speaker:Um, you know, the, the, the fact that one of the, again, one of those things that,
Speaker:that, um, that they included that they didn't have to include was when you, when
Speaker:you go down in, in depth, every 10 meters you add, uh, one to the denominator.
Speaker:So when you're at the surface, it's, it's equal to one over one.
Speaker:When you're at 10 meters, the air, uh, is compressed by half.
Speaker:When you're at, you know, 20 meters, it's compressed by one third.
Speaker:So you'd add a number all the way.
Speaker:And so he went to 300 feet.
Speaker:So I'm just gonna approximate that to be a hundred meters.
Speaker:So that means,
Speaker:he's at one,
Speaker:the air, at the, at his depth was one eleventh the size that
Speaker:it would be at the surface.
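For anyone who wants to check that math: absolute pressure under water rises by roughly one atmosphere per 10 meters, so the relative volume of a gas is about 1/(depth/10 + 1). Here is a quick sketch in Python, using the rounded 100-meter depth from the conversation:

```python
# Relative gas volume vs. depth, per Boyle's law: volume scales as 1 / absolute pressure.
# Absolute pressure in atmospheres is roughly 1 + depth_m / 10 for seawater.

def relative_gas_volume(depth_m: float) -> float:
    """Fraction of its surface volume that a gas occupies at the given depth."""
    absolute_pressure_atm = 1 + depth_m / 10
    return 1 / absolute_pressure_atm

for depth in (0, 10, 20, 100):  # 100 m approximates the 300-foot dive in the movie
    print(f"{depth:>3} m -> {relative_gas_volume(depth):.3f} of surface volume")
# 0 m -> 1.000, 10 m -> 0.500, 20 m -> 0.333, 100 m -> 0.091 (about one eleventh)
```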
Speaker:So that's why when they said they, they're like, how do you, how could
Speaker:he have taken one breath of air?
Speaker:I
Speaker:Right?
Speaker:How can you take one breath of air at 300 feet and last to the surface?
Speaker:And you know, one answer is, well, he didn't.
Speaker:But the other answer is, and by the way, I've actually done this, not at, not at
Speaker:a hundred meters, but, but at 90 feet.
Speaker:Um, I've run out of air.
Speaker:Um, and because as you come up the surface that, that, that process reverses itself.
Speaker:So that's why.
Speaker:You have to breathe continuously.
Speaker:The number one rule of scuba that you learn over and over and
Speaker:over again is always breathe.
Speaker:Never hold your breath.
Speaker:Hmm.
Speaker:as you come up, the air, the air is expanding, and so your lungs would explode
Speaker:if you didn't let air out of your, your thing.
Speaker:And so you can see him as he's coming up.
Speaker:So it literally is possible to take a single breath of air at
Speaker:depth and make it to the surface by slowly blowing out, uh, air.
Speaker:Yeah.
Speaker:And um, and I know that for a fact because I did run out of air.
Speaker:At depth at 90 feet.
Speaker:Um, that's a story for another day, but at night for the record.
Speaker:Um, yeah, in rough seas.
Speaker:Yeah.
Speaker:Scary,
Speaker:Not good.
Speaker:actually, it's actually happened to me twice, but anyway,
Speaker:Hmm.
Speaker:nevermind.
Speaker:Um, but so let's talk about, um, I think the core thing of the movie
Speaker:is that air gap backup's good.
Speaker:Right.
Speaker:You, you do need a, you do need a, the ability to trust, you
Speaker:know, a copy of, of your stuff.
Speaker:So here's a question for you.
Speaker:Yeah.
Speaker:that technically considered an air gap backup because they've never actually
Speaker:used it for restore testing before
Speaker:I.
Speaker:to.
Speaker:Well, I would, I would say that it sort of, it was sort of forced to be
Speaker:an air gapped backup, and again, we're, you know, it's a water gapped backup.
Speaker:But, but, um, it, it was offline, right?
Speaker:The point is that it was offline.
Speaker:It was that whole sub was meant to be offline.
Speaker:Now, I don't, I, I don't know if, was the sub offline because it crashed
Speaker:Yeah.
Speaker:was it offline?
Speaker:Because that was the design of subs.
Speaker:'cause remember the, that sub, the other sub, it had to come to the surface to
Speaker:be online and he was staying at depth on purpose to avoid detection and, and
Speaker:also, uh, infection by the, by the entity.
Speaker:Um, but yeah, I mean, sometimes you do what you gotta do, right?
Speaker:But in this case, um, I.
Speaker:And when we say it was an immutable copy, it was only immutable because there
Speaker:was nothing there that could change it.
Speaker:It was just a hard drive.
Speaker:yep,
Speaker:was just a hard drive with some source code.
Speaker:And by the way, why is the source code there anyway?
Speaker:It should be the compiled version of the whatever, you know, we
Speaker:will let that go.
Speaker:But you know, it, it, it could, it wasn't technically immutable.
Speaker:It was just a hard drive.
Speaker:Right.
Speaker:And then we get to the end, what they do to get the entity into the,
Speaker:um, what was the name of the place?
Speaker:Doomsday Vault, right?
Speaker:Because the, you, you may be thinking, well, isn't the whole purpose of
Speaker:this vault to be offline and, you know, and, and outlive the world?
Speaker:The answer is yes.
Speaker:That was the whole point of what she was doing.
Speaker:Where she's splicing in the, the red wire and the thing, and then she's connecting
Speaker:a, uh, a transmitter or receiver.
Speaker:Right?
Speaker:Which again, you know.
Speaker:Insider threat.
Speaker:insider threat.
Speaker:The, um, again, that's the part you kinda have to let it go because if you
Speaker:see how deep that vault is, there's no way a cell phone signal's getting
Speaker:outta that thing, but whatever.
Speaker:Um, and, um, and then he's, and then he ends up connecting his thing
Speaker:where he's, as he's falling through the air, you may recall that he
Speaker:does his final connection as he's falling through the air on a reserve
Speaker:parachute.
Speaker:Thank God.
Speaker:Um.
Speaker:By the way, total aside, I have an ex-brother-in-law who's whose,
Speaker:uh, parachute failed like that.
Speaker:And, and, uh, cigar rolled,
Speaker:Okay.
Speaker:is what they call it.
Speaker:Um, and he bounced.
Speaker:He did not die.
Speaker:He, he bounced, but he, he, he dove from 10,000 feet and, uh,
Speaker:or, or it died at 10,000 feet.
Speaker:And, uh, and he bounced and lived, uh, but he was, he was never the same again.
Speaker:Um, but yeah, so he.
Speaker:He, he transmits the stuff and they get it in there, and the whole thing
Speaker:of the, the blink of an eye and the, you know, that whole thing.
Speaker:wanna talk about.
Speaker:So they're
Speaker:Okay.
Speaker:One of the people, they designed this five dimensional storage system
Speaker:is that?
Speaker:was like a crystal-like structure, which was maybe like three inches
Speaker:by two by one inch,
Speaker:Yeah,
Speaker:small,
Speaker:yeah, yeah.
Speaker:able to dump all of the entity's data.
Speaker:There's.
Speaker:Yeah.
Speaker:Entity into it in like a blink of an eye.
Speaker:Right.
Speaker:And I was like, okay, does, and it would have to change
Speaker:colors when it was receiving.
Speaker:I was like, really?
Speaker:And once again, going back to what you said, they didn't know what the interface
Speaker:looks like because who had access to the Doomsday Vault and there was a guy making
Speaker:Yeah,
Speaker:was like
Speaker:yeah,
Speaker:in New York City in a
Speaker:yeah,
Speaker:vault.
Speaker:And
Speaker:yeah.
Speaker:So, yeah, so stuff like that, not to mention the whole,
Speaker:like he starts describing the wires that she's gonna find.
Speaker:Yeah.
Speaker:I am like, you'll, you know, to the right, you're gonna find a cluster of wires
Speaker:and you're gonna want to cut the red one and, you know, um, or don't cut the red.
Speaker:I, I don't remember.
Speaker:But, but, um, yeah.
Speaker:So not to mention the amount of data that, that, that,
Speaker:that thing had, being transmitted over a cell phone signal over the
Speaker:public internet via thing, you know?
Speaker:Yeah.
Speaker:You, you just gotta let that part go.
Speaker:Yeah.
Speaker:Um,
Speaker:So,
Speaker:it, it reminds me of an episode of Alias where she's trying to steal data from
Speaker:a server and she does this upside down hanging thing, kind of like, uh, Ethan
Speaker:Hunt in the first episode and she has a wireless modem that she's using to
Speaker:transfer the data out of the server.
Speaker:And I'm like, that's not,
Speaker:How it
Speaker:that's not how that works.
Speaker:Well, it's like, do you remember, uh, with old car fob or key fobs
Speaker:where you would hold it to your chin to turn your head into an antenna?
Speaker:No, I do not remember that.
Speaker:So Volkswagen, Audi Keys, in the early two thousands, the distance
Speaker:was awful, if you held the car, uh, key fob underneath your chin.
Speaker:I, this sounds,
Speaker:increase the
Speaker:like an urban legend
Speaker:I've actually done it, Curtis.
Speaker:really.
Speaker:Yes.
Speaker:It explains a few things, but, um, yeah, so, so.
Speaker:talk about something.
Speaker:Okay.
Speaker:So what, just so we've got some things that we could talk about,
Speaker:we can talk about, I mean, we, we could talk about, um, this concept
Speaker:of the, of the doomsday vault.
Speaker:We can talk about the offline copy, we can talk about the immutability.
Speaker:We can talk about this question that you had about what would you
Speaker:do if you actually wanna store data that you can trust like that?
Speaker:Yeah.
Speaker:Yeah.
Speaker:So, which, which question do you wanna start with?
Speaker:start with production,
Speaker:Uh, so you're saying that you, you want to be able to have a copy?
Speaker:sorry.
Speaker:So data you're creating today,
Speaker:Mm-hmm.
Speaker:that either you're creating or already exists.
Speaker:Yeah.
Speaker:How do you go about ensuring that
Speaker:it doesn't get changed, and you could prove the authenticity of said data?
Speaker:Yeah.
Speaker:Right, because this comes up in compliance use cases,
Speaker:Yeah.
Speaker:right?
Speaker:With their SOX regulations, financial regulations, medical regulations,
Speaker:all of these things, right?
Speaker:Yeah.
Speaker:And so the one thing they didn't cover, which I'm wondering maybe that was a miss
Speaker:on their part, was, was there something they could have done for production data?
Speaker:Instead of we only focus on making sure that you're able to determine
Speaker:has it been altered or not?
Speaker:Yeah.
Speaker:So the answer is yes.
Speaker:I will say that the core thing still comes down to we need something,
Speaker:we need a place that we, that we can trust that's immutable, right?
Speaker:We need some type of storage, and, and I, I think this is a, this is true
Speaker:for so much, for, for, for backups, for ransomware protection, for, uh,
Speaker:lawsuit, you know, for eDiscovery purposes, you need to be able to
Speaker:demonstrate for multiple reasons and multiple purposes that the thing that
Speaker:you're presenting, whether it's for a restore or for a lawsuit, is really
Speaker:the thing that you started with, right?
Speaker:And, um, and so.
Speaker:It has to start with a truly, truly, truly immutable copy of the data.
Speaker:Right?
Speaker:And, and again, I go back to the only thing that's truly immutable is
Speaker:something that even you aren't allowed to delete, even if you have all power.
Speaker:Right?
Speaker:Which is rare.
Speaker:Very,
Speaker:really, really rare.
Speaker:Right.
Speaker:Um,
Speaker:had access to a server,
Speaker:yeah.
Speaker:just physically destroy the server, the hard disks.
Speaker:Yeah,
Speaker:Which, which is why you're saying it's very, very hard to truly have
Speaker:well.
Speaker:copy.
Speaker:Well, what I, well, I'll also say there's no such thing as an immutable
Speaker:Yes.
Speaker:because of what you just said, right?
Speaker:There is no such thing as 100% immutable.
Speaker:We talk about immutability being a, a binary condition like pregnancy
Speaker:or death, but it's not, right?
Speaker:It is a spectrum.
Speaker:Um, but.
Speaker:But there's, you know, there's good, better, best, right?
Speaker:And the best immutability is you can't delete it.
Speaker:No one else can delete it short of what you just described,
Speaker:short of destroying it.
Speaker:And then what you have to do to protect against that is you have to have
Speaker:multiple copies and multiple situations.
Speaker:Right.
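To make that "even the owner can't delete it" idea concrete, here is a minimal sketch using S3 Object Lock in compliance mode; the bucket and key names are made up, and it assumes the bucket was created with Object Lock enabled:

```python
# Minimal sketch: store a backup object that nobody, including the account owner,
# can delete or overwrite until the retention date passes (S3 Object Lock, COMPLIANCE mode).
# Bucket and key names are hypothetical; the bucket must be created with Object Lock enabled.
from datetime import datetime, timedelta, timezone
import boto3

s3 = boto3.client("s3")
with open("backup-2025-01-01.tar.gz", "rb") as body:
    s3.put_object(
        Bucket="example-immutable-backups",
        Key="backups/backup-2025-01-01.tar.gz",
        Body=body,
        ObjectLockMode="COMPLIANCE",  # compliance mode: retention cannot be shortened or removed
        ObjectLockRetainUntilDate=datetime.now(timezone.utc) + timedelta(days=3650),
    )
```

Object Lock buckets also require versioning, so even an "overwrite" just creates a new version rather than replacing the locked one, which is why physically destroying the media is about the only way around it.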
Speaker:Um, and, um, by the way, I do, um,
Speaker:before I forget, the thing that happened from the first episode.
Speaker:The thing that we learned from the first episode is you can have all
Speaker:these, like there's this scene where they describe the 10 different security
Speaker:measures to get into the front door.
Speaker:And so what does he do?
Speaker:He goes in the back door,
Speaker:Yep.
Speaker:he's gonna come through the AC vent,
Speaker:Yeah.
Speaker:is just hilarious.
Speaker:Anyway, um, but.
Speaker:It has to start with a 100% immutable storage thing.
Speaker:There are a variety of ways to do that.
Speaker:Um, you know, uh, maybe we should just have an episode, which is talk
Speaker:about the different ways to have a hundred percent immutability.
Speaker:Um, although there's no such thing as a hundred percent I, um, but, so once you
Speaker:have that, then it's a matter of metadata
Speaker:helping you to prove that the thing is the thing, right?
Speaker:So if you've got a place where you can store metadata that is also immutable,
Speaker:then you can use things like hashes.
Speaker:You can use things like, you know, MD5 hashes, or, you know, and, and, and, and,
Speaker:you know, they gotta be made more and more secure, bigger and bigger hashes, uh, to,
Speaker:to prevent, you know, hash collisions.
Speaker:Um, you know, MD5, like, well, there's, um, uh, what's.
Speaker:Yeah.
Speaker:Thank you.
Speaker:Well, there's, there was SHA-1
Speaker:Yeah.
Speaker:SHA-2 pretty much gone.
Speaker:So people are saying SHA-256, um, which just, the, the, I, I think that
Speaker:just means the 256 bits, right?
Speaker:The hash, right.
Speaker:Um, and you, you need that, you, so with that, if you have,
Speaker:you know, a cryptographic hash of
Speaker:each object that you stored in the immutable storage, when you retrieve
Speaker:it from that immutable storage, you can say, here's the thing, we run the hash
Speaker:against it and, and, and it matches.
Speaker:Right.
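A minimal sketch of that retrieve-and-verify step; the file name and expected digest below are just placeholders:

```python
# Minimal sketch: recompute the SHA-256 of a retrieved object and compare it to the
# digest that was recorded (in immutable metadata) when the object was stored.
# The file name and expected digest are placeholders.
import hashlib

def sha256_of_file(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):  # read in 1 MiB chunks
            digest.update(chunk)
    return digest.hexdigest()

stored_digest = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
if sha256_of_file("restored-object.bin") == stored_digest:
    print("Hash matches: this is the thing we started with.")
else:
    print("Hash mismatch: the object has been altered or corrupted.")
```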
Speaker:Um, now again, you know, I'm constantly thinking about ways to get around
Speaker:that, we go through the back door.
Speaker:The way to get around that would be to hack the source code of the SHA,
Speaker:Yeah.
Speaker:Uh, algorithm.
Speaker:Yeah.
Speaker:Um, but, um, which again, that's why you gotta protect that, you know, you know
Speaker:it, it's all these different things.
Speaker:Which it, so
Speaker:go ahead.
Speaker:very interesting what you just mentioned because I don't know if we've talked
Speaker:about it on the podcast before.
Speaker:Right.
Speaker:I think normally when we talk about dealing with immutability, we
Speaker:only focus on the storage pieces.
Speaker:Right, the data storage,
Speaker:Mm-hmm.
Speaker:ever touched upon metadata, but I think that's a really important point
Speaker:Yeah.
Speaker:well, is whatever system you're using to keep track of the metadata
Speaker:also needs to be immutable because otherwise someone could just fake it
Speaker:Yeah.
Speaker:you're screwed.
Speaker:Yeah.
Speaker:That, but that's the whole point of that is like, and, and generally that's just
Speaker:the way object storage works, right?
Speaker:Is the whole, that, that hash is the identity of that thing.
Speaker:And so when, if the thing would ever change, its identity
Speaker:changes and, you know, and right.
Speaker:And so you, and, and object storage is also self-healing.
Speaker:And then if, if a part of it is damaged, then it, it replicates and, and, and
Speaker:replicates that stuff, and when we replicate it, we, we use the hashes
Speaker:to make sure that it's been properly replicated and all of those things.
Speaker:That's just many of the amazing things about mu um, uh, just object
Speaker:storage and what makes it so amazing.
Speaker:Um, and I, I do think that it, that it, that it has become the
Speaker:defacto way to store data from an immutability perspective.
Speaker:Um, uh.
Speaker:And in a way, in a way that can be used to prove that something
Speaker:is what it, what it was.
Speaker:Right.
Speaker:Yep.
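That "the hash is the identity" point is easy to show in miniature. Here is a toy content-addressed store, purely illustrative rather than any particular product's API, where changing the data necessarily changes its address:

```python
# Toy content-addressed store: the SHA-256 of the content is its key, so altered data
# cannot silently replace the original, and corruption is detected on read.
import hashlib

class ContentAddressedStore:
    def __init__(self) -> None:
        self._objects: dict[str, bytes] = {}

    def put(self, data: bytes) -> str:
        key = hashlib.sha256(data).hexdigest()  # identity is derived from content
        self._objects[key] = data
        return key

    def get(self, key: str) -> bytes:
        data = self._objects[key]
        if hashlib.sha256(data).hexdigest() != key:  # self-check on every read
            raise ValueError("object is corrupt: content no longer matches its identity")
        return data

store = ContentAddressedStore()
key = store.put(b"original source code")
assert store.get(key) == b"original source code"
# Any modified copy would hash to a different key, so it cannot masquerade as the original.
```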
Speaker:and, uh, I, I think that's the, that's the best answer to that.
Speaker:You, you add to that the challenge of storing it long term, it's
Speaker:a different discussion, an entirely different discussion.
Speaker:Yeah,
Speaker:Right.
Speaker:Um, because again, there's no medium.
Speaker:That lasts forever.
Speaker:There's no medium that even lasts more than say, 30 ish years.
Speaker:Right.
Speaker:the
Speaker:M-disc people will disagree with you.
Speaker:well, there are some that, that, like when we talk about like m-disc, right?
Speaker:Um, M-disc is designed to last longer than that, but it's only been tested
Speaker:in a lab to last longer than that.
Speaker:Right.
Speaker:Um, and I, I, as much as I'm a fan of m-disc, I don't think m-disc
Speaker:is gonna be around in 50 years.
Speaker:Yeah.
Speaker:Uh, so, you know,
Speaker:You're gonna have to
Speaker:just,
Speaker:to something.
Speaker:yeah.
Speaker:You're gonna have to migrate to something.
Speaker:Curtis, DNA.
Speaker:And that's the thing right there.
Speaker:There are all, I've, I've talked to some people in some skunkworks projects
Speaker:where they talk about DNA storage.
Speaker:They talk about optical storage by optical, I'm sorry.
Speaker:Um, like, like a block,
Speaker:Yeah,
Speaker:a, like 3D optical storage or 5D, 5D optical storage,
Speaker:whatever the hell that is.
Speaker:Um, what is 5D?
Speaker:That was the one part.
Speaker:I was like, wait,
Speaker:Well,
Speaker:is that?
Speaker:Yeah,
Speaker:Um,
Speaker:maybe time and space are included in
Speaker:'cause that, that'd be like four D if it's time.
Speaker:And then space.
Speaker:No, but space is part of 3D.
Speaker:What's the fifth dimension?
Speaker:Something new.
Speaker:Maybe it's a black hole
Speaker:Fifth Dimension is a, is a musical group from the seventies.
Speaker:that, is something that you should be looking into now, a way to store your
Speaker:data in a way that, not just from, like, being attacked by AI, but being attacked.
Speaker:Well, you are being attacked by AI, but not in the way that the movie's depicting.
Speaker:It's more like, um, opportunistic cyber attacks that are using ai.
Speaker:We joked about it a lot, but the other value here in that movie is that it
Speaker:shows the value of an air gap backup or a water gap backup apparently.
Speaker:Um, yeah.
Speaker:think it was actually critical because one of the things that they were worried
Speaker:about, I know you just talked about immutable, 'cause they're worried about
Speaker:the entity modifying existing data, but also to some extent they were worried
Speaker:about the entity just learning, right?
Speaker:Gathering, reading the data
Speaker:Mm-hmm.
Speaker:that's there and learning from it and understanding where all the faults are.
Speaker:And so they
Speaker:Right?
Speaker:that offline air gap, water gap, whatever you wanna call it.
Speaker:Yeah.
Speaker:Yeah.
Speaker:Yeah.
Speaker:And I mean.
Speaker:That's the one part of the movie,
Speaker:I'm like, nobody in this world, nobody in the Mission Impossible universe
Speaker:has an air gap copy backup of stuff.
Speaker:Like, what the hell?
Speaker:Um, but it is super important.
Speaker:But, we've already had a whole episode on that.
Speaker:So that's all I'm gonna say about that there.
Speaker:The other is that, um, is to just, when you look at security,
Speaker:just, just think about this.
Speaker:Think about the first episode.
Speaker:Think about this episode.
Speaker:When you look at security, sometimes we focus so much on this or that, right?
Speaker:It's important when designing your DR system, when you're designing
Speaker:your cybersecurity system to make sure that you have somebody who
Speaker:can help you think outside the box
Speaker:Yep.
Speaker:to help you think about the air vent,
Speaker:Yep.
Speaker:help you think about, uh, basically what what did happen was everybody
Speaker:sort of gave up and abandoned the, the, the doomsday vault, right?
Speaker:Um, basically, um.
Speaker:And they, they weren't killed.
Speaker:They just left.
Speaker:Yep.
Speaker:Um, and so we can talk about the, the vulnerability of the
Speaker:human in the cybersecurity plan.
Speaker:Yeah, it's like a lot of times when we talk, or you have a great example of
Speaker:this, right, where you always assume that someone may be around, like you
Speaker:used to do your DR testing when you
Speaker:Yeah,
Speaker:at the bank, and they would tell you, no, Curtis, don't do the test.
Speaker:Have someone else do it.
Speaker:yeah, yeah.
Speaker:we
Speaker:I.
Speaker:the talk about the derecho, right?
Speaker:Or the podcast episode where they're like, we had people who didn't have
Speaker:contact and who normally would be doing things and are not there,
Speaker:Yeah.
Speaker:so it's all these assumptions that you make about people being available or
Speaker:communications being available that may not be available when you need it.
Speaker:Yeah.
Speaker:A derecho.
Speaker:By the way, we had an episode on that and I learned about this in that episode.
Speaker:A derecho is a hurricane that forms over land.
Speaker:It's a thing.
Speaker:So, so the one last thing I want to bring up on this
Speaker:Yeah.
Speaker:so we've been talking about this entity, right, which is going and
Speaker:changing all the people's data and
Speaker:Mm-hmm.
Speaker:the rest and trying to take over the world.
Speaker:And at one point, right, it hijacks the nuclear weapons of all these
Speaker:Mm-hmm.
Speaker:right?
Speaker:Would you think that this is like a perfect cyber attack ransomware scenario?
Speaker:Well, I mean, it's a, it's a huge cyber attack ransomware scenario.
Speaker:Right.
Speaker:And it, and it's an argument for offline control of certain things.
Speaker:Right.
Speaker:I, I actually, I'm pretty sure that, for example, the control of
Speaker:the nuclear arsenal is not online.
Speaker:Right.
Speaker:It's much more about process.
Speaker:Right.
Speaker:And they show some of it with the cracking up the key and all that stuff.
Speaker:Again, I have zero knowledge of how the actual nuclear arsenal is controlled.
Speaker:You know, I.
Speaker:the football that the person carries
Speaker:Other than the, yeah, the thing they call the football for some reason, which I
Speaker:believe is just the, the codes, right?
Speaker:The cracking of the codes, and it's a verbal thing.
Speaker:And again, that process needs to be updated possibly
Speaker:because of, you know, AI and
Speaker:yeah,
Speaker:of voices and things.
Speaker:But it's, I think it's offline very much by default, by by design.
Speaker:Yeah.
Speaker:Um, and not to mention the fact that those things are running on
Speaker:like eight and a half inch floppies.
Speaker:isn't that a little scary?
Speaker:Yeah, it's what it is.
Speaker:There are some things that need to be, you know, so far the world has agreed,
Speaker:for example, when it comes to warfare, that the decision to kill should
Speaker:always be in the hands of a human.
Speaker:Even if you have AI assisting you in the decision that the actual, that,
Speaker:that we don't want RoboCop, right?
Speaker:We don't want the scene in the original RoboCop, which for
Speaker:the record is an amazing movie.
Speaker:Um, and, um, you know, you have 50 seconds to comply.
Speaker:That's that scene, you know, is like that.
Speaker:That's when the movie like.
Speaker:That's when you're like, what am I watching?
Speaker:Yeah.
Speaker:Um, but um, yeah.
Speaker:Good movie.
Speaker:Anyway, this was fun.
Speaker:Yeah, no, I hope the listeners enjoyed this.
Speaker:Uh, slightly different take on thinking about backup and data protection
Speaker:Yeah.
Speaker:through Hollywood.
Speaker:as seen through Hollywood
Speaker:you
Speaker:and I.
Speaker:this, maybe we should do some others,
Speaker:I think we can, I think there's more.
Speaker:I think there's other movies and other TV shows where I've seen
Speaker:backups and cybersecurity depicted.
Speaker:That's really funny.
Speaker:We could probably bring Mike on, uh, to have another conversation.
Speaker:Um, once he finishes his chapter though, uh, by the way, I
Speaker:have written, I have finished
Speaker:the, the rough draft of the final chapter, my final chapter of the book, he's
Speaker:writing the final chapter of the book.
Speaker:And, um, uh, so now it just begins the, the, oh, it's awesome.
Speaker:It's awesome.
Speaker:And, um, and now it just begins a process of the, the big
Speaker:thing is the copy, the tech edit.
Speaker:And by the way, if you're in the cybersecurity space and you wanna be
Speaker:a tech editor of the book, reach out to me, W. Curtis Preston at gmail.
Speaker:You get a free book,
Speaker:right, by being a tech editor, and you get mentioned in the book, so there's that.
Speaker:There's actually a guy in, in, in my second book, which was Using SANs and NAS.
Speaker:There's a guy that's mentioned in it twice.
Speaker:Have I told you about this?
Speaker:His name is, uh, Grant Melvin.
Speaker:I know Grant, he used to be my boss.
Speaker:Yeah.
Speaker:So Mel, he's mentioned in the book twice, as Grant Melvin and Melvin Grant.
Speaker:From NetApp, right?
Speaker:Yes.
Speaker:Yeah,
Speaker:Yeah, he was
Speaker:Grant.
Speaker:Grant, Melvin.
Speaker:Yep.
Speaker:I, what's funny is I interacted with him for months over email and then I
Speaker:ran into him in, in, um, in the, you know, in NetApp headquarters and he is
Speaker:like, I'm Grant Melvin from NetApp.
Speaker:I had no idea.
Speaker:Yeah.
Speaker:All right.
Speaker:Well, uh, thanks.
Speaker:Thanks for, thanks for, thanks for finally seeing the movie
Speaker:Yeah.
Speaker:uh, thanks for the episode.
Speaker:Yeah.
Speaker:All right, and thanks to the listeners.
Speaker:Hope you enjoyed this.
Speaker:If you like it, uh, tell us, you know, maybe we'll make some more.
Speaker:That is a wrap.
Speaker:The Backup Wrap-Up is written, recorded, and produced by me, W. Curtis Preston.
Speaker:If you need backup or DR consulting, content generation, or expert witness
Speaker:work, check out backupcentral.com.
Speaker:You can also find links to my O'Reilly books on the same website.
Speaker:Remember, this is an independent podcast and any opinions that
Speaker:you hear are those of the speaker and not necessarily those of an employer.
Speaker:Thanks for listening.