Oct. 6, 2025

Deepfake Attacks: The Growing Threat to Enterprise Security

Deepfake attacks are exploding, and your company is probably not ready. In this episode of The Backup Wrap-up, we dive into how cybercriminals are using AI to clone voices and create fake videos to authorize fraudulent wire transfers and reset credentials. With nearly 50% of businesses already experiencing deepfake attacks, this isn't a future problem – it's happening right now. We break down the two main attack vectors: authorization fraud (where fake CEOs trick employees into wiring money) and credential theft (where attackers reset passwords and MFA tokens). More importantly, we give you actionable defense strategies: multi-channel verification protocols, callback procedures for sensitive transactions, employee training programs, and break-glass scenarios. You'll learn what not to rely on (spoiler: caller ID is worthless) and why policy and procedure matter more than technology alone. This is a must-listen for anyone responsible for security or financial controls.

Speaker:

You found The Backup Wrap-up, your go-to podcast for all things

Speaker:

backup, recovery, and cyber recovery.

Speaker:

In this episode, we're talking about

Speaker:

Deepfake attacks.

Speaker:

Trust me, it's way scarier than you think.

Speaker:

Prasanna and I break down how attackers are using AI to clone voices and

Speaker:

create fake videos of CEOs and CFOs.

Speaker:

They only need 30 seconds, by the way, of a voice to impersonate someone.

Speaker:

We cover the different types of attacks, from wire fraud to credential theft, and

Speaker:

also give you practical steps that you can take right now to defend against this.

Speaker:

We also talk about how to set up procedures that won't get overridden

Speaker:

when someone says they're having an emergency. Nearly half of

Speaker:

all businesses have already been hit by deepfake phone calls.

Speaker:

Listen in before you become the next victim.

Speaker:

By the way, if you don't know who I am, I'm W. Curtis Preston,

Speaker:

AKA Mr. Backup, and I've been passionate about backup and recovery

Speaker:

ever since I had to tell my boss that we had no backups of the production

Speaker:

database that we had just lost.

Speaker:

I don't want that to happen to you, and that's why I do this.

Speaker:

On this podcast, we turn unappreciated admins into Cyber Recovery Heroes.

Speaker:

This is The Backup Wrap-up.

Speaker:

Welcome to the show.

Speaker:

Hi, I am W. Curtis Preston, AKA Mr. Backup, and I have a guy with me who

Speaker:

continues to allow me to change the design of a project that I have going on.

Speaker:

And so it's all his fault.

Speaker:

Prasanna Malaiyandi. How's it going, Prasanna?

Speaker:

I am good, but I just, just to make sure everyone knows,

Speaker:

Uhhuh.

Speaker:

I had initially suggested this design to you.

Speaker:

And you poo-pooed the idea, and now you're walking it back after you've

Speaker:

done additional investigation.

Speaker:

well, you know,

Speaker:

like your first design was solid

Speaker:

yeah,

Speaker:

because there's sort of the difference between like a

Speaker:

design and what is practical.

Speaker:

And what, and what doesn't cost a million dollars.

Speaker:

$1 billion.

Speaker:

uh, and I, I don't want to go into detail right now because it may

Speaker:

be, it may just be insanity, and I may, you know, this, this idea, um.

Speaker:

You know, and

Speaker:

We'll see how it

Speaker:

we'll see how it goes and, you know, but yeah, I just keep,

Speaker:

I keep, I haven't pulled the trigger on the design and I keep changing the gun.

Speaker:

But, but at least it's not like you're swapping out a car for

Speaker:

a boat or an airplane, right.

Speaker:

No.

Speaker:

like, oh, do I want a car versus do I want a four-door

Speaker:

Do I?

Speaker:

or a coupe?

Speaker:

Yeah.

Speaker:

That's sort of the thing.

Speaker:

You're narrowing

Speaker:

Or do.

Speaker:

it down.

Speaker:

Or do I want a pickup, or do I want, or

Speaker:

I don't know if it's quite at the pickup level.

Speaker:

I think you're beyond the, like, is it an SUV?

Speaker:

Is it a car,

Speaker:

Yeah.

Speaker:

Okay.

Speaker:

Yeah.

Speaker:

you're beyond that.

Speaker:

I think you're now

Speaker:

But I'm like, do I want 19-inch wheels?

Speaker:

Do I want plastic wheels?

Speaker:

Do we want

Speaker:

Yeah.

Speaker:

run-flats?

Speaker:

Yeah.

Speaker:

Yeah.

Speaker:

Yeah.

Speaker:

Nice analogy.

Speaker:

I like that.

Speaker:

See, you're not as, it's not as, uh, bad as

Speaker:

It's just that every time I think I've got it sorted out, I come up

Speaker:

with a problem with my current design.

Speaker:

Yes, and so all you have to do is just stop talking to me.

Speaker:

Honestly, if you just stop talking to me, I bet you can

Speaker:

like quickly narrow down your

Speaker:

Yeah.

Speaker:

You should talk to me after it's done, after it's

Speaker:

No, I can't do that.

Speaker:

You're my everything advisor, you know that.

Speaker:

Um, I just wanna say, I just wanna say to the audience, we've often mentioned

Speaker:

that Prasanna has this like, uh, this ridiculous level of random knowledge.

Speaker:

Today.

Speaker:

I'm talking to Prasanna and he's like, so yeah.

Speaker:

So I think the, the flow rate of a garden hose is like one to one

Speaker:

and a half gallons a minute.

Speaker:

And I was just like, like, why do you know that?

Speaker:

Also, I was significantly

Speaker:

Oh, were you, were you?

Speaker:

So it depends on the size of the pipe; they say it could be between five and eight.

Speaker:

Yeah.

Speaker:

Okay.

Speaker:

So, okay, so in this case you thought you knew something,

Speaker:

Yes.

Speaker:

right?

Speaker:

But I did clarify and say I think that might be low just given that

Speaker:

bathroom faucets are like one to one and a half gallons per minute.

Speaker:

Yeah.

Speaker:

Yeah.

Speaker:

All right.

Speaker:

Well, uh, we're not gonna talk about that stuff at all.

Speaker:

We're gonna talk about

Speaker:

deepfake attacks.

Speaker:

Um, and this is a real problem.

Speaker:

You sent me a great article that started this whole thing.

Speaker:

Uh, yeah, from The Register, and the title, we'll put, we'll put the link in here.

Speaker:

Uh, and they're citing Gartner, saying nearly

Speaker:

half of businesses suffered deepfake phone calls against their staff.

Speaker:

Um, so do you want to, you wanna just give an overview of

Speaker:

what we're talking about here?

Speaker:

Yeah, sure.

Speaker:

So with the prevalence of AI and all these models being amazing and being

Speaker:

able to generate video and photos, another aspect is being able to generate audio.

Speaker:

So now these AI models are really good at taking someone's voice

Speaker:

and being able to mimic it such that an attacker can make it sound

Speaker:

like you are talking the same way as the person that they're trying to

Speaker:

Right, right.

Speaker:

imitate.

Speaker:

And the quality has gotten so good that it's pretty difficult to differentiate

Speaker:

between the real person and the deepfake.

Speaker:

Especially if you don't know the real person.

Speaker:

Right?

Speaker:

So like if you, if you don't know them personally, I, I think I would

Speaker:

like with you, I think I would pretty readily notice a deepfake, right.

Speaker:

Well, first off, I would ask the deepfake some random piece of information.

Speaker:

I'd be like,

Speaker:

probably may not

Speaker:

you know.

Speaker:

a deepfake probably has that

Speaker:

Oh, that's true.

Speaker:

I, I'd be like, I'd be like the, the classic line and, um, what, what's

Speaker:

the, what's the ignition timing on a four-barrel carburetor on a 1968?

Speaker:

Yeah, my cousin, My Cousin Vinny.

Speaker:

Um, it's a trick question.

Speaker:

Um,

Speaker:

um, the, um,

Speaker:

So, so there are deepfakes, right?

Speaker:

We talked about people also using them as pranks, right?

Speaker:

And kids use it to be like, Hey, I am imitating someone else.

Speaker:

Or, you call in sick from school and the school calls your parents

Speaker:

and is like, where are you?

Speaker:

And you can imitate your parents and be like, Hey, so-and-so's sick.

Speaker:

He's not coming in today.

Speaker:

Right, right,

Speaker:

So there are sort of not so like critical reasons people use

Speaker:

it, like for the fun aspects of

Speaker:

right.

Speaker:

but then there's also the malicious uses of this technology, which is

Speaker:

kind of what this episode is about.

Speaker:

And, and also there, so there's audio deepfakes and there's video deepfakes, right?

Speaker:

Um, and it was interesting, again, that same article said that 36% of people, uh,

Speaker:

had had video deepfakes, and that 5% had caused a serious problem.

Speaker:

So

Speaker:

Yeah,

Speaker:

go ahead.

Speaker:

so, and just the point about that.

Speaker:

Audio deepfakes are less costly today versus video.

Speaker:

Right.

Speaker:

I think in one of the articles I read, I can't remember if it was the

Speaker:

Register article, but like a video deepfake, a really convincing

Speaker:

one might cost millions of dollars.

Speaker:

Hmm.

Speaker:

Um, that number has probably significantly gone down with a lot of the new AI

Speaker:

models that have been announced, but doing it in real time is a little

Speaker:

bit more difficult than doing sort of a pre-process, but doing audio

Speaker:

is significantly easier than video.

Speaker:

Right.

Speaker:

Right.

Speaker:

And so, and so it's, it's probably less likely to be there.

Speaker:

Right.

Speaker:

Uh, but, but we will talk about one particular type of attack that

Speaker:

uses video deepfakes, where I think that it addresses that, the issue

Speaker:

that you talked about right there.

Speaker:

Right.

Speaker:

Yeah.

Speaker:

Um, and, uh.

Speaker:

And, and why would they do this?

Speaker:

Why, why would somebody, you know, what, what are they attempting to do here?

Speaker:

So they are basically trying to trick the company, right, specifically

Speaker:

an employee, into doing something.

Speaker:

It might be to gain access to the corporate network by doing password

Speaker:

resets, which is what we see a lot of, uh, Scattered Spider currently

Speaker:

doing for their attacks, where they're calling in pretending to be someone.

Speaker:

Mm-hmm.

Speaker:

and being like, Hey, wire the money over to this account

Speaker:

Right,

Speaker:

rather than the one that was given to

Speaker:

right.

Speaker:

you.

Speaker:

And now they just siphoned all your money, and it's really hard to get it back.

Speaker:

I'd say that, that, those are sort of two broad categories.

Speaker:

So I mean, the one giant category is basically get the employee to do something

Speaker:

that they wouldn't have otherwise done.

Speaker:

Right.

Speaker:

And the, the two big categories would be, one, grant somebody

Speaker:

access to something, and two, grant them access to some money.

Speaker:

Right?

Speaker:

Send some money over.

Speaker:

And by the way, wire fraud is very real.

Speaker:

Wire fraud is very real.

Speaker:

And once it's done, it's kind of done, right?

Speaker:

Um, and, uh, you know, once it's discovered, if the

Speaker:

money's gone, the money's gone.

Speaker:

Right.

Speaker:

Yep.

Speaker:

And there was a huge issue, and I think it's still a huge issue, where

Speaker:

people would basically get so-called signing instructions or last minute

Speaker:

wiring instructions from the broker and be like, wire it to this account.

Speaker:

They'd wire it, and then it turns out the broker had been compromised,

Speaker:

Right,

Speaker:

the attackers had imitated them, sent out all these emails, and now had a bunch

Speaker:

of money that people thought

Speaker:

was like their down payment going to the person, and it turned out it

Speaker:

was going to a criminal actor.

Speaker:

right, right.

Speaker:

Not good.

Speaker:

Not good.

Speaker:

Yeah.

Speaker:

Um, so.

Speaker:

Uh, first off,

Speaker:

Focusing on business?

Speaker:

Yeah, focusing on businesses.

Speaker:

But first off, I think the number one goal of this call is to just,

Speaker:

if you don't know how big of a deal this is, this is a big deal.

Speaker:

It's a big deal.

Speaker:

Literally today, uh, there was one statistic that suggested

Speaker:

that they thought that 30%

Speaker:

um, you know, of enterprise fraud will be deepfake-based by next year.

Speaker:

Okay.

Speaker:

Um, and also realize that they only need 30 seconds of clear audio to

Speaker:

be able to clone somebody's voice.

Speaker:

Um, which is kind of a

Speaker:

we're

Speaker:

Yeah, we're screwed.

Speaker:

Yeah, we're screwed.

Speaker:

I guess one question is these people, they're cloning,

Speaker:

Yeah.

Speaker:

usually the people in power, right?

Speaker:

Meaning the CFO, right?

Speaker:

The CEO, trying to get the employees, say the person in accounting to

Speaker:

sign a check or to send money

Speaker:

Right,

Speaker:

And so these people

Speaker:

usually have, like, public spotlights.

Speaker:

They're the ones who you find on LinkedIn with the social media posts,

Speaker:

right.

Speaker:

the videos the company puts out.

Speaker:

Right.

Speaker:

Trying to promote the company.

Speaker:

Or they're giving talks at conferences, so it's not like, oh, they're just

Speaker:

locked away and someone somehow goes and like, records their audio.

Speaker:

Right.

Speaker:

right,

Speaker:

uh, downloads their phone messages.

Speaker:

Or their phone calls, right?

Speaker:

This data's already out there.

Speaker:

It's not like it's hard to find.

Speaker:

right, right.

Speaker:

Let's talk about a couple of the different types of attacks.

Speaker:

The, the worst one, I think, and, and it's worst in terms of

Speaker:

like the potential ramifications, but also in terms of, uh, how devious it is,

Speaker:

and that is that they create a very short video of the person in power.

Speaker:

So this is like a CEO, CFO, that type of person, someone with authority to

Speaker:

authorize a wire transfer, and then they, uh, create a very short video

Speaker:

that, that basically, um, mimics the thing that they want them to do.

Speaker:

Then, as part of this video, the call drops.

Speaker:

Right?

Speaker:

So they, they, so what that does is that, that addresses the problem that

Speaker:

you were talking about; it addresses the cost of making that fake video.

Speaker:

It also completely removes the need to do a real-time video fake,

Speaker:

which would be the thing that would be super, super expensive.

Speaker:

So if they can get a video of the person and then, um, do that.

Speaker:

And then, um, and then what they do is they're like, they drop the call

Speaker:

and then they immediately switch to an alternate communication method,

Speaker:

such as, the easiest would be, text.

Speaker:

It's like, Hey, I was trying to call you and my internet dropped.

Speaker:

And text now is the only thing.

Speaker:

And because of, again, human nature, you felt you were just

Speaker:

talking to the CEO and now you think you're still talking to the CEO.

Speaker:

And here, here's a question.

Speaker:

How easy is it to clone a phone number?

Speaker:

Oh, super simple.

Speaker:

Super simple.

Speaker:

spoof.

Speaker:

You can spoof a phone number and show, or it could, it

Speaker:

doesn't even have to be the CEO.

Speaker:

As an example.

Speaker:

Say you were on a video call.

Speaker:

Yeah.

Speaker:

you claim that, oh, my video's poor.

Speaker:

You drop back off, you hide your video.

Speaker:

Yeah,

Speaker:

You could still be doing the audio,

Speaker:

you could.

Speaker:

Yeah.

Speaker:

Yeah.

Speaker:

You don't even have to go to text messages,

Speaker:

Right,

Speaker:

a voice call or other channels.

Speaker:

right.

Speaker:

You just leverage the same channel.

Speaker:

Yeah, so you could use the same channel, but just drop the, drop the video, right?

Speaker:

Um, yeah, go ahead.

Speaker:

And I think that's what happened. The first time I heard about this attack,

Speaker:

it was, I think, a company where the CEO joined and said, send out to, uh, this vendor,

Speaker:

I think it was like $25 million or something.

Speaker:

it was, I think on a video call they were on where they got deep

Speaker:

faked and then they basically sent it to these fraudulent people.

Speaker:

Right, right.

Speaker:

Yeah.

Speaker:

I, and, and, and again, you know, go back to once that happens,

Speaker:

you're kind of screwed, right?

Speaker:

Yep.

Speaker:

and it's not like, uh, it's not like with a credit card where

Speaker:

you can, uh, do a chargeback.

Speaker:

Yeah, and the other thing to also mention here is,

Speaker:

It's not like they're just going and saying, oh, I'm just

Speaker:

gonna randomly target you.

Speaker:

They might have already gained access into your systems.

Speaker:

They might have already wandered around looking at, okay,

Speaker:

who are the important people?

Speaker:

Who's in accounting?

Speaker:

What meetings are there?

Speaker:

So they are already snooping, listening in on your conversations,

Speaker:

looking at your emails, and so they know exactly where to attack, who

Speaker:

to attack, and who to impersonate.

Speaker:

Right.

Speaker:

So, uh, that, so that's the one, that's one type.

Speaker:

That's where they're trying to get, like, to authorize like some sort

Speaker:

of invoice, some sort of payment.

Speaker:

Uh, the other is the, basically trying to gain access to an account.

Speaker:

So this is either resetting a password or, um, you know, resetting MFA tokens.

Speaker:

Yeah.

Speaker:

Um, can you think of anything else that they would try there?

Speaker:

I think those are the main

Speaker:

Yeah.

Speaker:

But even there, I do wonder how much is really around the

Speaker:

voice aspects and the deepfakes

Speaker:

Mm-hmm.

Speaker:

versus just the social engineering.

Speaker:

Because I think in those scenarios, social engineering

Speaker:

becomes more critical, right?

Speaker:

Uh, in terms of understanding all the things needed in order

Speaker:

to be able to reset an account.

Speaker:

It's not necessarily that the help desk person or the person on the other

Speaker:

side knows your voice in particular.

Speaker:

Well, yeah, again, again, you're not necessarily, that's why I was saying

Speaker:

earlier in the call, it's not necessarily a voice that's known to the person,

Speaker:

but they're calling into a help desk.

Speaker:

But what you are doing is you're using a deepfake, uh, to, you're, you're

Speaker:

trying to sound generally like somebody.

Speaker:

Yeah.

Speaker:

But, uh, again, it kind of relies on you not knowing that person really

Speaker:

well, because I think the, the, the nuances of how that person talks

Speaker:

are going to be, I think, relatively obvious to someone.

Speaker:

The point there again, is to reset something to reset

Speaker:

MFA so that then you could,

Speaker:

you know, do the MFA with some other channel.

Speaker:

Yeah.

Speaker:

The other thing to also think about is there are still companies that

Speaker:

use voice as authentication,

Speaker:

Yes.

Speaker:

right?

Speaker:

And so now it becomes a question, okay, if my voice is my password,

Speaker:

Passport.

Speaker:

Passport.

Speaker:

If you're gonna quote the line, my voice is my passport.

Speaker:

Verify me.

Speaker:

Yep.

Speaker:

so now that becomes an attack vector,

Speaker:

Yeah.

Speaker:

It's, now someone can mimic you and unlock it. The other person doesn't know

Speaker:

what you are supposed to sound like, but you've unlocked the account

Speaker:

because you've made a deepfake.

Speaker:

That quote of course is from Sneakers, one of my favorite movies ever.

Speaker:

I need to go watch that again.

Speaker:

I do wanna talk just a little bit about personal stuff and then jump

Speaker:

into the company stuff, right?

Speaker:

There are a couple things that you could do that, and they're,

Speaker:

they're actually kind of similar.

Speaker:

One is to, to create some sort of code word within the family.

Speaker:

Right.

Speaker:

Some sort of shared secret that only you and the other person knows and say,

Speaker:

Hey, if I ever actually do call you and I say I'm on the roadside and I need you

Speaker:

to wire me 500 bucks for a tow truck or whatever, I'm gonna say this code word.

Speaker:

Right?

Speaker:

Um, and you can also, if you don't have a shared code word, you can, you can

Speaker:

reference, like with you and I, if, if this was happening right now, I would say, Hey.

Speaker:

You know that thing that we were talking about earlier today, you know, and you

Speaker:

can reference something that both, that only you and that person would know.

Speaker:

But, but yes, and I think that becomes the key, is something that only you

Speaker:

and that other person knows that hasn't been shared over a video, over text,

Speaker:

over some other communication method.

Speaker:

It needs to be.

Speaker:

Yeah, very much.

Speaker:

Uh, that, and then also, and we're gonna get to this with the company as well, is verify

Speaker:

through an alternate channel, right?

Speaker:

Um, it's, it's something other than, um, you know, call them back on

Speaker:

the number that you know them on.

Speaker:

Now, there are ways to get around that too, but in general, they're

Speaker:

gonna be faking the number.

Speaker:

And when you call the other person's actual number back, uh, it's not gonna,

Speaker:

it's not gonna be the scammer, right.

Speaker:

And, and it's interesting that we are talking about this episode today.

Speaker:

So this morning I sent an article to my wife,

Speaker:

Mm-hmm.

Speaker:

and, uh, for some reason the link was broken.

Speaker:

It didn't show up properly in text, and it just showed up as an empty bubble.

Speaker:

And she immediately calls me, she's like, did someone hack your phone?

Speaker:

Because normally I don't send things like that.

Speaker:

Right.

Speaker:

It looked very weird.

Speaker:

She

Speaker:

wanted to verify using a different channel to be like,

Speaker:

Hey, is that really you or not?

Speaker:

Yeah.

Speaker:

Yeah.
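
To make that concrete, here's a minimal sketch in Python of the personal defense we just described: ignore caller ID entirely, call back on a number you already have, and confirm a shared code word. Every name, number, and code word below is hypothetical.

```python
# Hypothetical sketch of the family verification flow described above.
# Never act on the inbound call alone: hang up, call back on a number
# you already have, and confirm a code word that was agreed in person
# and never shared over text, email, or video.

KNOWN_NUMBERS = {"curtis": "+1-555-0101"}   # numbers of record, saved in advance
CODE_WORDS = {"curtis": "tow-truck-500"}    # shared secrets, agreed face to face

def call_back_and_reconfirm(number: str) -> bool:
    # Placeholder: in real life, this step is you dialing the known
    # number yourself and re-asking about the request.
    print(f"Hang up and call {number} back before doing anything.")
    return True

def verify_caller(claimed_identity: str, spoken_code_word: str) -> bool:
    """Approve only if every independent check passes."""
    number = KNOWN_NUMBERS.get(claimed_identity)
    if number is None:
        return False  # unknown identity: refuse by default
    # Caller ID is ignored completely; it can be spoofed.
    return (call_back_and_reconfirm(number)
            and spoken_code_word == CODE_WORDS[claimed_identity])

print(verify_caller("curtis", "tow-truck-500"))  # True
print(verify_caller("curtis", "wrong-word"))     # False
```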

Speaker:

So let's talk, let's talk about, uh, let's go to the companies now, right?

Speaker:

And because the first suggestion

Speaker:

is the same as the, as the suggestion that we just made, right, is, uh, is

Speaker:

to have multi-channel verification.

Speaker:

So the, the, the real key here is they're calling you in over one channel, or

Speaker:

they're contacting you via one channel.

Speaker:

In this case, what we're talking about is an audio deepfake.

Speaker:

So they are calling you and they've called you over this one channel, and

Speaker:

then they want you to do the thing.

Speaker:

So the question is, you need to contact them back over another channel.

Speaker:

Um, now I can think of like an attack against that.

Speaker:

Do you know what that might be?

Speaker:

Well, it's basically they've compromised all your channels,

Speaker:

Well, that wa that wasn't the, yeah, that wasn't the attack I was thinking

Speaker:

about, although I, I do know, you know,

Speaker:

Yeah.

Speaker:

am aware of a situation where there was a company that

Speaker:

had had that happen, right?

Speaker:

They had, they had compromised all of the things.

Speaker:

That's not gonna fix that.

Speaker:

But in general, that's not the case.

Speaker:

But, uh, what I was thinking was the usual case or the usual hacker, what

Speaker:

they're gonna do is they're gonna try to create a sense of urgency.

Speaker:

They're gonna say, oh, you, you know, I, I need to call you back on

Speaker:

your, your phone number of record.

Speaker:

Oh, well, my phone's dead, whatever.

Speaker:

You know, I can't, you know, I'm, I'm calling in on my friend's phone.

Speaker:

Uh, I'm trying to, you know, I'm trying to do the thing.

Speaker:

They, they come up, I'm sure that they're well trained on how to, um,

Speaker:

create that sense of urgency and, and that sense of exception of, here's why

Speaker:

I'm doing it in this really weird way.

Speaker:

So, a hundred percent agree and I have a story around this.

Speaker:

I actually have a story for

Speaker:

Okay.

Speaker:

so,

Speaker:

The way you say that, it's like you think that's all I am is just a

Speaker:

bunch of stories, but that's okay.

Speaker:

you always have

Speaker:

so many amazing

Speaker:

stories from all your

Speaker:

experience of being in the field.

Speaker:

But I do have a

Speaker:

Yep.

Speaker:

Mm-hmm.

Speaker:

The other day I was looking to switch my, uh,

Speaker:

internet plan and, uh, my contract was expiring, so I called in and

Speaker:

placed an order online and I get a call back like a couple hours later.

Speaker:

I didn't recognize the number.

Speaker:

The person leaves a message and is like, Hey, this is Xfinity calling,

Speaker:

uh, we're calling about your plan.

Speaker:

We had an issue.

Speaker:

Can you please call us back?

Speaker:

We can't do this right now.

Speaker:

You have to call in.

Speaker:

They leave a number.

Speaker:

Then they call like three or four other times, like in the next couple days.

Speaker:

And then, uh, three days later they call in and I pick up the

Speaker:

phone and I talk to the person.

Speaker:

They're like, hi, this is Xfinity.

Speaker:

Uh, we had an issue with your order and they had my name

Speaker:

Mm-hmm.

Speaker:

I'm like, oh, that's great.

Speaker:

And I was like, uh, and then it dawned on me, I'm like, I don't

Speaker:

even know if this is Xfinity.

Speaker:

'cause there's a

Speaker:

Right.

Speaker:

garbage calls out there,

Speaker:

Yeah.

Speaker:

right?

Speaker:

So I was like, can you please give me your number so I can call back and verify?

Speaker:

And they've gotten really professional.

Speaker:

They're like, oh yeah, we totally understand.

Speaker:

This happens to a lot of people.

Speaker:

Here's our 1-800 number.

Speaker:

Call us

Speaker:

Uh,

Speaker:

And then I looked at the number, I Google searched it, I couldn't find it

Speaker:

in Xfinity's normal list.

Speaker:

And Xfinity has a customer security assurance team.

Speaker:

Uhhuh.

Speaker:

So I called 'em and I gave them the number, and they're like,

Speaker:

yeah, that's not an Xfinity number.

Speaker:

Yeah.

Speaker:

Right, but the person had a, the exact same script that Xfinity

Speaker:

would say whenever they call

Speaker:

Yeah.

Speaker:

They had the exact same accent, everything else, calling from a local number.

Speaker:

Yeah.

Speaker:

Right?

Speaker:

But it's

Speaker:

You did the right thing.

Speaker:

You, you did the, basically what we're talking about here, the multi-channel

Speaker:

verification, calling back to a, I mean, in that case it's technically

Speaker:

the same channel, but you're, you're calling back on a different phone number.

Speaker:

Right?

Speaker:

Um, and that's what you're doing here is the whole point of the, uh, the different

Speaker:

ways that you log in and secure those logins.

Speaker:

The whole point of that is to maintain that security.

Speaker:

If they're then trying to circumvent that security, there could be a real reason.

Speaker:

There could be a real person with a real scenario that's really down.

Speaker:

If, if that's, you know, the case and they can't use

Speaker:

the normal methods, that's where you really need to slow things down.

Speaker:

That person on the other end, they may be, they may be having the worst day of their

Speaker:

lives, but you need to slow things down and verify in every way that is possible.

Speaker:

Uh, and that may indeed be very inconvenient for the

Speaker:

person in question, right.

Speaker:

Yeah.

Speaker:

The other thing too is hopefully this is part of your processes.

Speaker:

Yes.

Speaker:

You have a process in place when something like this has to happen

Speaker:

and someone's asking for this, right?

Speaker:

There is, Hey, I need this approval.

Speaker:

We have these alternate mechanisms that we use to verify, so everyone's on the same page,

Speaker:

Right.

Speaker:

rather than it being like, oh, I don't know what to do,

Speaker:

and everyone's frantic.

Speaker:

right.
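
Here's a minimal sketch of what that kind of documented, multi-channel verification procedure might look like if you wrote it down as code. The two-channel rule, the channel names, and the roles are all hypothetical examples, not a prescription.

```python
# A minimal sketch of a documented, multi-channel approval procedure.
# The point is that the steps are agreed on in advance, not improvised
# when someone claims an emergency.

from dataclasses import dataclass, field

REQUIRED_INDEPENDENT_CONFIRMATIONS = 2  # the inbound channel never counts

@dataclass
class SensitiveRequest:
    requester: str          # who claims to be asking, e.g. "CEO"
    action: str             # e.g. "wire_transfer" or "mfa_reset"
    inbound_channel: str    # channel the request arrived on
    confirmations: set = field(default_factory=set)

def record_confirmation(req: SensitiveRequest, channel: str) -> None:
    # Only out-of-band checks count: a callback to the number of
    # record, an in-person ask, a signed ticket, and so on.
    if channel != req.inbound_channel:
        req.confirmations.add(channel)

def may_proceed(req: SensitiveRequest) -> bool:
    return len(req.confirmations) >= REQUIRED_INDEPENDENT_CONFIRMATIONS

req = SensitiveRequest("CEO", "wire_transfer", inbound_channel="phone")
record_confirmation(req, "phone")                        # ignored: same channel
record_confirmation(req, "callback_to_number_of_record")
record_confirmation(req, "in_person")
print(may_proceed(req))                                  # True
```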

Speaker:

And, uh, so speaking of multi-channel, uh, the other thing is to ban personal

Speaker:

messaging apps for official business.

Speaker:

You, you're never gonna ban, you know, let's say Signal, forever.

Speaker:

I, I don't know if you can ban Signal, like, technologically. You think you

Speaker:

can, like, using firewall rules.

Speaker:

It's

Speaker:

Yeah.

Speaker:

Yeah.

Speaker:

I don't think you can.

Speaker:

Well, also it's, especially if it's going out over cell phone signals.

Speaker:

Right.

Speaker:

Yeah.

Speaker:

Um, but you can ban them for business use.

Speaker:

And again, this goes back to a lot of this, you mentioned it just a minute ago.

Speaker:

This is about policy and procedure.

Speaker:

Mm-hmm.

Speaker:

there's, there's a certain amount that you can do

Speaker:

technologically, but in the end, what you need is regular training and policy

Speaker:

and procedure that says we don't use personal messaging apps for business.

Speaker:

And then when suddenly somebody is somehow contacting you via, you

Speaker:

know, let's say, let's say they found out that you use Signal,

Speaker:

like on your phone or something, and so suddenly someone is contacting

Speaker:

you via Signal on your phone.

Speaker:

Well, that's a giant red flag right there.

Speaker:

It's like, Hey, this is not, we don't use it for official business.

Speaker:

Right, right.

Speaker:

Exactly.

Speaker:

Yeah.

Speaker:

Um,

Speaker:

And so establishing these mechanisms is good.

Speaker:

Now, you probably also need to have sort of a break-glass scenario,

Speaker:

you do.

Speaker:

Yeah.

Speaker:

right?

Speaker:

Uh, I could imagine the case, remember the derecho and how everything went down,

Speaker:

Mm-hmm.

Speaker:

Mm-hmm.

Speaker:

right?

Speaker:

Scenarios like that, that you couldn't have thought about, right?

Speaker:

There needs to be, okay, this is a documented procedure when we have

Speaker:

to go outside the normal scope.

Speaker:

Yes.

Speaker:

And that person should be, well,

Speaker:

you, you should regularly train on that procedure.

Speaker:

And, uh, and that procedure, the more remote a person is,

Speaker:

the more that procedure needs to be kind of ironclad, right?

Speaker:

And, and you need to have

Speaker:

scenarios for those scenarios, right?

Speaker:

Yeah.

Speaker:

Right.

Speaker:

Of like, okay, you know, a derecho hit.

Speaker:

You know, if you get hurricanes all the time, you need to have a scenario.

Speaker:

You need to have a, a procedure that says, here's what we do in this scenario,

Speaker:

and then whoever that is, that, 'cause you don't know, you

Speaker:

don't know if it's a, a deepfake or not.

Speaker:

Yep.

Speaker:

Whoever it is.

Speaker:

Well, you need to follow our special procedure and maybe that special

Speaker:

procedure is we have this other phone number that you should have in your

Speaker:

phone or you should have in your, you know, like you said, your break

Speaker:

glass, you got your special, uh, thing that we gave you that you're

Speaker:

supposed to store in your right front

Speaker:

breast pocket.

Speaker:

I don't know.

Speaker:

I'm just making stuff up, but

Speaker:

like embedded into your

Speaker:

it's not,

Speaker:

it.

Speaker:

you need to, you need to embed the little chip on your arm.

Speaker:

Yeah.

Speaker:

Uh, but yeah, you have that break-glass procedure.

Speaker:

Uh, I remember, um, when I was, when I ran IT for a small organization,

Speaker:

the volunteer, it was a volunteer position and, uh, the, the

Speaker:

main person there really did not like the fact that they did not have root.

Speaker:

And I was like, I will not do this if you have root.

Speaker:

Like I will.

Speaker:

I'm, I'm not here.

Speaker:

Like, I'm just not interested in that.

Speaker:

But what they did was we created a break-glass procedure for that, which

Speaker:

was, we had it in a special envelope that was sealed, that was in the safe.

Speaker:

And if I was ever incapacitated, they could break that.

Speaker:

But, uh, if that was ever the case, then obviously I had to change

Speaker:

the root password and everything.

Speaker:

Right?

Speaker:

Yep.

Speaker:

Yeah.

Speaker:

Yeah.
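
A toy sketch of that break-glass pattern, under the same assumptions as the story: the credential sits sealed in a safe, every use is logged, and any use forces a rotation afterward. All names and values here are made up.

```python
# Hypothetical break-glass credential: opening the seal is allowed,
# but it is never silent, and the exposed secret is burned afterward.

from datetime import datetime, timezone

class BreakGlassCredential:
    def __init__(self, secret: str):
        self._secret = secret
        self.seal_intact = True
        self.audit_log = []

    def break_seal(self, who: str, reason: str) -> str:
        """Open the sealed envelope: allowed, but always logged."""
        self.seal_intact = False
        self.audit_log.append((datetime.now(timezone.utc), who, reason))
        return self._secret

    def rotate(self, new_secret: str) -> None:
        # Mandatory after any use: the old root password is now burned.
        self._secret = new_secret
        self.seal_intact = True

root = BreakGlassCredential("original-root-password")
root.break_seal(who="board member", reason="admin incapacitated")
# ...handle the emergency, then reseal with a fresh secret...
root.rotate("new-root-password")
print(root.seal_intact, len(root.audit_log))  # True 1
```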

Speaker:

Going back to sort of the multi-channel verification, right?

Speaker:

One thing that you should also think about,

Speaker:

for some sensitive procedures or processes, right?

Speaker:

Sort of having a callback protocol that says, okay, when this happens, this

Speaker:

is the mechanism we're going to use.

Speaker:

As an example, if someone's like, Hey, I need to wire a hundred

Speaker:

thousand dollars to someone, it's like, okay, that is a special case.

Speaker:

Right,

Speaker:

Right.

Speaker:

Kind of like how we talked about break-glass,

Speaker:

right.

Speaker:

right?

Speaker:

This is a special case.

Speaker:

Here are the exact steps that are required, and here are the people you need

Speaker:

to contact using this type of protocol to make sure that yes, this is legitimate.

Speaker:

Right.

Speaker:

And, and one of the big ones would be a big payment to a new vendor, right?

Speaker:

That should, uh, set off all sorts of red flags, right?

Speaker:

You're sending a hundred thousand dollars to a vendor that you've

Speaker:

never done any business with before.

Speaker:

Uh, and then, and for some reason the, the CFO is calling you about this payment.

Speaker:

That should be like, that should, you know, even if it sounds so much like the

Speaker:

CFO, that should set off, uh, you know.

Speaker:

Or, or it should be like, Hey, you should talk to procurement to figure

Speaker:

out, okay, when did the vendor get added?

Speaker:

Yeah.

Speaker:

Right?

Speaker:

And are they even aware of them?

Speaker:

Yeah.
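
As a rough sketch, a callback protocol like the one we're describing might boil down to a few written-down rules: over a threshold, or any payment to a vendor with no history, trigger extra verification steps. The threshold, vendor list, and step wording below are all hypothetical.

```python
# Hypothetical callback protocol for wire transfers: large amounts, or
# any brand-new vendor, require out-of-band verification before money moves.

CALLBACK_THRESHOLD = 100_000            # dollars; tune to your risk tolerance
KNOWN_VENDORS = {"acme_corp", "globex"}

def required_steps(amount: int, vendor: str, requested_by: str) -> list[str]:
    steps = []
    if vendor not in KNOWN_VENDORS:
        steps.append("ask procurement when this vendor was added")
    if amount >= CALLBACK_THRESHOLD or vendor not in KNOWN_VENDORS:
        steps.append(f"call {requested_by} back on their number of record")
        steps.append("get sign-off from a second approver outside the request")
    return steps  # an empty list means the normal payment process applies

for step in required_steps(100_000, "brand_new_llc", "the CFO"):
    print("-", step)
```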

Speaker:

And, and I just wanna speak to something here.

Speaker:

You need to have a conversation with these people who are, uh, you know,

Speaker:

your, in your management chain, and you need to have a conversation in advance.

Speaker:

Say, Hey, if you ever have a situation where it's triggering my,

Speaker:

you know, um, what do you call it?

Speaker:

Um.

Speaker:

Uh, red flag.

Speaker:

It's, yeah.

Speaker:

Yeah.

Speaker:

It's triggering off my spidey sense, whatever you want to call it.

Speaker:

And I then put the kibosh on the thing you're asking for.

Speaker:

You need to not like yell at me.

Speaker:

Right.

Speaker:

We need to agree in advance that if I put a stop to a large transaction or

Speaker:

something like that, or I don't reset your MFA because you're trying to log

Speaker:

in and then you, you know, you cannot

Speaker:

then just use your position to say, just do it.

Speaker:

Right.

Speaker:

Um, because that, that's exactly what a, a threat actor would do, you know?

Speaker:

Yeah.

Speaker:

And you've established these processes ahead of time,

Speaker:

Right, right,

Speaker:

and so you should be following it as a company.

Speaker:

right.

Speaker:

Yeah.

Speaker:

And kind of along that line of sort of the processes, procedures, callback protocol,

Speaker:

It's really important that employees get training,

Speaker:

Yeah.

Speaker:

right, and periodic training, because this landscape is changing.

Speaker:

How people are attacking with deepfakes is constantly evolving.

Speaker:

What the new attacks are is also changing on literally a day-by-day basis.

Speaker:

And it's interesting 'cause Curtis, I don't, I know in the past, like we both

Speaker:

have done sort of security training.

Speaker:

I don't think I've ever run across deepfake training in those lessons.

Speaker:

Right.

Speaker:

No.

Speaker:

Yeah, and

Speaker:

Well, I mean, if you think about when the last time was, well, at least for me,

Speaker:

the last time I was working for a company that was doing that kind of training.

Speaker:

It was before deepfakes were a thing, but yeah,

Speaker:

but hopefully companies are now updating their trainings to include

Speaker:

deepfakes. Maybe you should just periodically do a deepfake training, or

Speaker:

just have someone pretend to be a deepfake, join a video conferencing call,

Speaker:

yeah, yeah.

Speaker:

I like that.

Speaker:

see how people react.

Speaker:

I wonder, uh, like, no.

Speaker:

Before, I wonder if, uh, if they've updated their stuff to include it,

Speaker:

they've got to at this point.

Speaker:

yeah.

Speaker:

And along with those employee trainings, also include this

Speaker:

in your tabletop exercises,

Speaker:

Mm-hmm.

Speaker:

right?

Speaker:

Just like we've talked about ransomware preparation.

Speaker:

I think that you must include deepfake exercises to figure out, and

Speaker:

especially with people from finance and accounting who maybe normally

Speaker:

are not part of your tabletop exercises.

Speaker:

Mm-hmm.

Speaker:

It's important that they now have this focused training to

Speaker:

make sure they understand the scenarios and the situations.

Speaker:

Um, and there are, so I think policy and procedure is the big tool here, right?

Speaker:

Having said that, there are also starting to be detection tools right now.

Speaker:

This will be for the rest of our lives.

Speaker:

This will be an arms race, right, where we will have tools that detect AI.

Speaker:

Right now, it's relatively easy if you know what to look for to

Speaker:

detect an AI video or an AI audio.

Speaker:

If you know what to look for

Speaker:

AI?

Speaker:

you.

Speaker:

I don't think anybody could really do you as AI, but um.

Speaker:

But there are tools out there, there is software that can do real

Speaker:

time AI detection, and they look at things like how the voice sounds,

Speaker:

they look at the cadence of the voice.

Speaker:

They look at, uh, if it's video, they're looking at eye movement and you know, and

Speaker:

also like, uh, biometric watermarking.

Speaker:

There, there's all sorts of things, especially for high-risk

Speaker:

people or high-risk industries.

Speaker:

Um, and uh, and then also there, there's enhanced

Speaker:

things, you know, enhanced security measures like liveness detection

Speaker:

on multifactor authentication.

Speaker:

Right?

Speaker:

Voice, voice biometric systems.

Speaker:

Right?

Speaker:

That, that can actually detect that the, the voice that they're

Speaker:

listening to is synthetic.

Speaker:

Right.
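
For the curious, here's an illustrative skeleton, not a real detector, of how the tools described above tend to work: score several independent signals (voice cadence, synthesis artifacts, liveness challenges) and flag the call when the combined score crosses a threshold. Every scoring function below is a stub you would replace with an actual model.

```python
# Illustrative skeleton only: combine independent deepfake signals
# rather than trusting any single one.

def cadence_score(audio) -> float:
    return 0.2   # stub: too-regular pacing and rhythm

def artifact_score(audio) -> float:
    return 0.4   # stub: synthesis artifacts in the frequency domain

def liveness_score(audio) -> float:
    return 0.9   # stub: response to a random spoken challenge

def probably_synthetic(audio, threshold: float = 0.5) -> bool:
    # Average the signals; a high liveness score lowers suspicion.
    scores = [cadence_score(audio), artifact_score(audio),
              1.0 - liveness_score(audio)]
    return sum(scores) / len(scores) >= threshold

print(probably_synthetic(audio=None))  # False with these stub values
```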

Speaker:

Yeah,

Speaker:

Um.

Speaker:

also along those lines, there are now regulations coming out trying to

Speaker:

target sort of AI-generated content.

Speaker:

So if you look at things like Facebook videos, if they are generated using

Speaker:

AI, I believe they now have to be tagged.

Speaker:

Mm-hmm.

Speaker:

And so there is sort of that awareness that AI is getting so good that

Speaker:

it's hard to differentiate, so it requires this additional tagging and

Speaker:

information to be able to help users discern AI versus a real person.

Speaker:

So there is technology.

Speaker:

So just, just search and, and keep yourself up to date on

Speaker:

deepfake detection technology.

Speaker:

It's going to be an arms race.

Speaker:

This is gonna be fun, right?

Speaker:

Um, so let, let's talk about some things not to rely on.

Speaker:

Caller ID.

Speaker:

Yep.

Speaker:

Caller ID don't mean jack squat, right?

Speaker:

Yep.

Speaker:

Um, and also, uh, if you have voice recognition systems, if you don't

Speaker:

have additional authentication on top of that voice recognition.

Speaker:

Um, same thing with video calls.

Speaker:

And again, this, this urgency, when, when it's like super urgent.

Speaker:

This is why you've got to, you've got to agree upfront.

Speaker:

Even when it's urgent, you know, um, you, you, you follow the

Speaker:

policy and procedure, right?

Speaker:

But I do wonder if people move away from urgent.

Speaker:

As an example, if they start to understand, because like you said, it's

Speaker:

Mm-hmm.

Speaker:

If bad actors start to realize, hey, people are now alert to the urgency,

Speaker:

and we don't necessarily need to be urgent in order to be able to

Speaker:

do these sorts of attacks, maybe that starts to drop away.

Speaker:

So I just wanna caution people to not rely on urgency as the only flag.

Speaker:

It is one of the

Speaker:

Yeah.

Speaker:

but it should not be the only signal.

Speaker:

Agreed.

Speaker:

Yeah.

Speaker:

It's, it's not the only flag, but it's definitely a big flag.

Speaker:

Right.

Speaker:

Yeah.

Speaker:

Uh, and of course, you know, I mean, this should go without saying, but things like,

Speaker:

oh, I know, I know this person's voice.

Speaker:

And, and also they knew, just like you mentioned earlier, they

Speaker:

knew details about me, dude.

Speaker:

They, they got all kinds of stuff, right.

Speaker:

And, and how real the video looks, uh, because it's, it's

Speaker:

pretty amazing, uh, especially if the video is very short, right.

Speaker:

Yeah.

Speaker:

Um, so I, I, I guess the, the summary is that, you know, the human element

Speaker:

is still, it's both our strongest defense and our weakest link, right?

Speaker:

Um, we, we can't, we can't solve our way out of this with technology alone.

Speaker:

It's got to be a combination of, uh, people, process and technology.

Speaker:

Yeah.

Speaker:

Yep, a hundred percent agree.

Speaker:

Yeah.

Speaker:

All right.

Speaker:

Well, it's been fun.

Speaker:

Prasanna.

Speaker:

Deepfake Prasanna.

Speaker:

Who knows?

Speaker:

Who knows?

Speaker:

All right, well, uh, thanks for listening, folks, and, um, you know, be sure

Speaker:

to subscribe wherever you see us.

Speaker:

And, um, that's a wrap.