May 11, 2026

How Honeypots and Canary Files Catch Attackers Before They Strike


Honeypots and canary files are two of the most underused tools in cybersecurity — and in this episode, Dr. Mike Saylor and I break down exactly how they work and why you should be using them. The short version: they're tripwires. They tell you a bad guy is poking around your network before anything gets encrypted.

Mike walks through his layered security analogy, explains the three different ways organizations use honeypots — learning attacker tactics, distraction, and testing — and then we get into canary files: what makes them different from a honeypot, how they beacon home when stolen, and why clock synchronization matters more than most people think if you ever want that evidence to hold up.

We also cover how to stand one up without a big budget, what tools are available, and why something is absolutely better than nothing. Plus, Mike and I have news about our new O'Reilly book, Learning Ransomware Response and Recovery.

0:00 - Intro and book news

1:09 - Meet the crew

3:45 - Security is all about layers

9:22 - What are honeypots and canary files?

11:00 - Three ways honeypots work for you

13:17 - Real-world examples: bait cars and glitter bombs

15:20 - Making your honeypot convincing

19:11 - Honeypot tools and options

21:13 - Something is better than nothing

24:10 - Monitoring and notifications

25:05 - Canary files explained

27:03 - How canary files beacon and track attackers

28:03 - Don't forget to sync your clocks

29:05 - Final thoughts

Speaker:

You found the Backup Wrap-Up, your go-to podcast for all things backup,

Speaker:

recovery, and cyber recovery.

Speaker:

This episode, we take a look at honeypots and canary files, two

Speaker:

tools that can tell you that a bad guy is poking around your network

Speaker:

before they've done any real damage.

Speaker:

Dr. Mike Saylor is back with me and Prasanna to break down how these work,

Speaker:

uh, why layered security matters, and what the actual difference is

Speaker:

between a honeypot and a canary file.

Speaker:

We also cover, uh, how to stand one up without spending a fortune, why

Speaker:

clock synchronization matters more than you might think, uh, that is, if you

Speaker:

ever want your evidence to hold up.

Speaker:

And, uh, Mike and I also have, uh, a little bit of news about

Speaker:

our new O'Reilly book, Learning Ransomware Response and Recovery.

Speaker:

Hope you enjoy it

Speaker:

Welcome to the Backup Wrap-Up.

Speaker:

I'm your host, W. Curtis Preston, AKA Mr. Backup, and I have with me

Speaker:

a guy who's apparently in a hurry.

Speaker:

So we gotta get started.

Speaker:

Prasanna Malaiyandi.

Speaker:

How's it going, Prasanna?

Speaker:

Let's go.

Speaker:

Let's go.

Speaker:

Let's go.

Speaker:

Come on.

Speaker:

Fast, fast, fast.

Speaker:

you know how I do all my YouTube at two x?

Speaker:

Can we do this at like two x?

Speaker:

Yeah, I don't think we can do that.

Speaker:

at least pretend to speak faster.

Speaker:

Yeah.

Speaker:

I think maybe this episode might be quick.

Speaker:

I always say that and then, next thing I know, 45 minutes later,

Speaker:

One hour.

Speaker:

yeah.

Speaker:

run around with my hair on fire.

Speaker:

Yeah, I see what you did there.

Speaker:

And, speaking of, hair on fire.

Speaker:

we have our guest with us once again, Dr.

Speaker:

Mike Saylor.

Speaker:

How's it going, Mike?

Speaker:

Whoop, whoop.

Speaker:

It's going well.

Speaker:

and he, unlike me is the possessor of physical books.

Speaker:

Do you have it nearby?

Speaker:

Can you hold it up?

Speaker:

I don't have it nearby.

Speaker:

Oh, so disappointed in you, Mike.

Speaker:

so both of us have been waiting for the shipment, the physical

Speaker:

shipment of the book that we just wrote, Learning Ransomware Response

Speaker:

and Recovery, and he beat me.

Speaker:

I guess shipment to Texas is faster than shipment to California.

Speaker:

I don't know, maybe it's 'cause

Speaker:

I have pictures of it.

Speaker:

although, yeah,

Speaker:

pictures.

Speaker:

of it.

Speaker:

By the way, I like your offer and I'm gonna follow it up as well.

Speaker:

I'm gonna do the same thing with my copies.

Speaker:

We as authors, we get 10 copies and, we're looking for stories, we're

Speaker:

looking for ransomware stories, ransomware events, things like that.

Speaker:

And, we'll pick the best, basically send me a DM on LinkedIn, send Mike a

Speaker:

DM on LinkedIn, and then we'll pick the best ones, and the best ones get

Speaker:

a signed copy if that's what you want.

Speaker:

and,

Speaker:

Can they get it made out to eBay?

Speaker:

you're so funny.

Speaker:

There's also audiobook versions for those that don't

Speaker:

like to hold books while they

Speaker:

I literally

Speaker:

consume knowledge.

Speaker:

listened to the audiobook version just a few minutes ago.

Speaker:

It is very weird.

Speaker:

I'm curious what voice they're using.

Speaker:

is it good?

Speaker:

The guy's fine.

Speaker:

he's a professional audio book narrator.

Speaker:

Mike

Speaker:

Okay.

Speaker:

name.

Speaker:

I actually know a guy that does audio book narration.

Speaker:

His name's William Shakespeare, believe it or not.

Speaker:

and I know him.

Speaker:

And then I have another friend whose name is Robert Louis Stevenson.

Speaker:

I introduced the two of them.

Speaker:

I also know Stuart Little.

Speaker:

I just know some random odd people.

Speaker:

Anyway.

Speaker:

so Mike, why don't you give us a little, there was a

Speaker:

little story I think that can help, an analogy here with, talking about

Speaker:

doing alarms in the yard and stuff.

Speaker:

why don't you tell us that story?

Speaker:

Security's all about layers.

Speaker:

Whether you're talking about cybersecurity or physical security, you don't want,

Speaker:

you don't want the one bell or alarm that goes off, to indicate that you

Speaker:

know a threat is right in your face.

Speaker:

you'd prefer that, you know something off in the distance.

Speaker:

A tripwire, smoke, something coming across the river and making noise in the water.

Speaker:

there's some indication at some point out in the distance that gives you

Speaker:

the idea that something is coming as well as the time necessary to

Speaker:

identify the threat if it is a threat.

Speaker:

respond appropriately.

Speaker:

Like, how do I prepare for this?

Speaker:

if you think about it, in the middle of the night, if you hear something,

Speaker:

the first thing you do is you wake up and go, I think I heard something.

Speaker:

And then you're gonna wait a second and then maybe you hear it again.

Speaker:

And if not, you eventually, you may get up and check it out.

Speaker:

But at that point.

Speaker:

They're already in the house.

Speaker:

And so what do you grab whatever's closest to you, whether it's

Speaker:

your cat or a candlestick or

Speaker:

Baseball bat.

Speaker:

a shoe.

Speaker:

You're not truly prepared for whatever that might be.

Speaker:

You're reacting in the moment and that's not good.

Speaker:

And so from a physical security perspective, typically.

Speaker:

And like in a neighborhood, you've got the street, you've got a curb, sidewalk,

Speaker:

yard fence, and then you got the perimeter of your house with, doors and windows.

Speaker:

And then inside the house, maybe outside the house too, you've got motion lights

Speaker:

and some other things that may trigger.

Speaker:

awareness.

Speaker:

and then you've got, maybe you've got, a door and window sensors.

Speaker:

So if any of those, are interacted with you, maybe you get a beep or the alarm

Speaker:

goes off, and then hopefully you've got other stuff in the house like a dog that's

Speaker:

gonna bark, or more lights or more people.

Speaker:

but the point is if a car, or somebody comes off the street.

Speaker:

there, there's layer one, layer two is across the sidewalk.

Speaker:

maybe that's how far out your motion sensors are, and now they're in the yard.

Speaker:

Now the light goes off, or the dog starts barking.

Speaker:

and so now there's other indicators that there's a threat approaching.

Speaker:

and you, so now you're starting to respond.

Speaker:

you're looking out the window.

Speaker:

you're on the phone with nine one one or you're grabbing your weapon, or you're.

Speaker:

You got your crew together, to respond to this threat if it comes through the door.

Speaker:

So that's the idea.

Speaker:

And very similarly in cyber, we don't wanna wait until

Speaker:

something gets to our laptop and, someone's moving our mouse around.

Speaker:

we want other things between us and the bad guy, at least enough

Speaker:

of them, more than one thing.

Speaker:

so that we start to become aware of weird stuff much sooner than all

Speaker:

of a sudden our files are encrypted and we can't use our computer.

Speaker:

Mike, I love that analogy.

Speaker:

one thing that I know people sometimes say is you just want to make your house

Speaker:

look less appealing for a would be robber right than the house next door.

Speaker:

and by having those motion lights or motion triggered lights and things like

Speaker:

that, it's like, Hey, maybe I should go next door and see what's there.

Speaker:

Rather than trying to go to your house, is there something similar, like

Speaker:

does that analogy also apply in the cyber world as well, or not so much?

Speaker:

At least that part.

Speaker:

Okay.

Speaker:

So in the physical world, we've got a much, a much more

Speaker:

personal sense of risk, right?

Speaker:

So if I'm on the street and I'm a bad guy and I'm on the street and I'm

Speaker:

looking at house A and house B. I'm absolutely assessing the risk to myself,

Speaker:

my health, my consciousness perhaps, being caught and going to jail.

Speaker:

injury.

Speaker:

Do they have a dog?

Speaker:

I don't want, I don't like dogs, right?

Speaker:

I don't wanna get bitten.

Speaker:

or is there a sign in the yard that says there's an alarm system?

Speaker:

there a car in the driveway that says, local police department?

Speaker:

first Amendment, or Second Amendment, or, don't tread on me.

Speaker:

And I like guns and we don't call the police.

Speaker:

We call the landscaper. So I'm doing this assessment because I am

Speaker:

personally, physically, personally involved in this threat, this crime.

Speaker:

cybersecurity is much different for a couple of reasons.

Speaker:

One, very rarely do bad guys sit back in their chair and they, look at

Speaker:

company A and company B, or victim A or victim B. They don't always do that.

Speaker:

In fact, it's very rare.

Speaker:

what they typically do is just cast a wide net using tools to

Speaker:

find open windows and open doors in these victim networks and systems.

Speaker:

And it's not until they start looking into, what does this door,

Speaker:

what, who, where does this door go?

Speaker:

Where does this window go?

Speaker:

That then they determine, that's NASA, right?

Speaker:

I don't want to, maybe I'm risk averse to.

Speaker:

That,

Speaker:

Yeah.

Speaker:

versus, Joe Schmo dental office.

Speaker:

so it's, it does happen, but not to the same degree and definitely

Speaker:

not early on, like physical.

Speaker:

I'm doing it.

Speaker:

Step one.

Speaker:

I'm assessing risk cyber, it's down the road a bit.

Speaker:

as they get to know who the victims are.

Speaker:

Yeah.

Speaker:

And so

Speaker:

Targets are.

Speaker:

is, that determines like how you're going to do the kinds of things.

Speaker:

It's a good analogy of the security, but how we're going to

Speaker:

respond is a little bit different.

Speaker:

But I think there are some.

Speaker:

Analogous things.

Speaker:

So you talked about, you literally used the word tripwire.

Speaker:

You could have a trip wire.

Speaker:

there's a product called Tripwire.

Speaker:

the, there are, and they operate in much the same way.

Speaker:

I think the primary thing that we're gonna talk about today,

Speaker:

I don't think there's really an analogy to that in the, kinetic world.

Speaker:

that word keeps coming up.

Speaker:

I don't think there's really an analogy to this concept of a

Speaker:

honeypot or a canary file.

Speaker:

Before we talk about that, Prasanna, do you know why

Speaker:

we use the term canary files?

Speaker:

Or why do we use the term canary?

Speaker:

Of course I've watched movies.

Speaker:

Did they not use a canary in Zoolander?

Speaker:

I don't think they did.

Speaker:

I don't think they did in Zoolander, but that's funny.

Speaker:

I was like, what is he, why is he talking about Zoolander?

Speaker:

Yeah.

Speaker:

Yeah, the phrase is the canary in the coal mine.

Speaker:

what was that?

Speaker:

So when miners would go into these coal mines, of course at some

Speaker:

point oxygen gets low, right?

Speaker:

You have a good chance of suffocating.

Speaker:

And so what they would do is they would bring a canary, which is a little bird

Speaker:

with them along deep into the mines.

Speaker:

And if a canary passed out or something else like that, then they

Speaker:

knew, okay, there's less oxygen.

Speaker:

Probably a buildup of carbon monoxide.

Speaker:

We need to get out of here before we pass out.

Speaker:

We die.

Speaker:

So canary in a coal mine.

Speaker:

Of course the honey pot.

Speaker:

I think this is really just an analogy to, Winnie the Pooh.

Speaker:

and, 'cause he cannot resist a honey pot.

Speaker:

So let's talk about it. So this idea, Mike, is again that we're

Speaker:

trying to figure out that somebody is doing something before they

Speaker:

actually do something. I'm trying to take your analogy, which I really

Speaker:

like, and then bring it into this world.

Speaker:

We're trying to figure out that they're up to no good before they

Speaker:

actually get up to no good.

Speaker:

Does that sound right?

Speaker:

I.

Speaker:

Close.

Speaker:

That is one objective. Honeypots actually serve a

Speaker:

couple of different purposes.

Speaker:

one of them, is, a blatantly vulnerable honeypot is designed to allow, entice,

Speaker:

bad guys to attack it so that we learn about their tactics and techniques.

Speaker:

So what's the newest way bad guys are attacking the newest version of Windows

Speaker:

or Apple iOS or whatever.

Speaker:

So we put these honeypots out there to figure out why or how, bad guys are doing

Speaker:

it, and we use that information to make our products better, or it's also a good

Speaker:

way to attribute a given attacker's

Speaker:

Techniques and tactics and procedures, we call those TTPs.

Speaker:

so how do we attribute then, because maybe we've got this open case, all these

Speaker:

victims, they're getting attacked, and we've documented those TTPs,

Speaker:

but we can't figure out who's doing it.

Speaker:

so we can put a honeypot out there with similar victim.

Speaker:

attributes, something that looks like the profile of a company these bad guys are

Speaker:

attacking and we learn from that and it gives us an opportunity to potentially

Speaker:

track that activity in real time.

Speaker:

So there's that.

Speaker:

So it's a learning tool to try and get, more, familiarity with bad guy TTPs.

Speaker:

the other one is a distraction, I've got this very valuable production network.

Speaker:

So maybe I'm an IoT network, and IoT's very vulnerable, especially

Speaker:

the older ones where we didn't really build secure architecture.

Speaker:

It's, I can ping a wellhead from a conference room.

Speaker:

That's not good.

Speaker:

If I build a very similar, potentially even simulated, honeypot

Speaker:

environment, bad guys compromise it, and it looks and reacts just

Speaker:

like my production environment, then they're gonna stop looking.

Speaker:

They think they've achieved their goal, right?

Speaker:

And so it's a distraction.

Speaker:

And then lastly, honeypots can be used as a test environment.

Speaker:

so I can.

Speaker:

If it's only used for that, then we wouldn't call it a honeypot.

Speaker:

But because we've already created this simulated environment that's

Speaker:

supposed to replicate and behave like production, why don't I also use that

Speaker:

for testing my changes, like change management, other stuff, right?

Speaker:

so there's a couple of different ways we can use honeypots, and a real world

Speaker:

kinetic example would be like bait cars.

Speaker:

The auto theft division of police departments.

Speaker:

how are bad guys stealing the brand new Cadillac when it's supposed to have these

Speaker:

coded keys and this, that, and the other?

Speaker:

they'll put one out on the street and they'll let the

Speaker:

right people know that car's gonna be there for a while and then they observe

Speaker:

bad guys and how they attempt to steal it and then they steal it and now we

Speaker:

can track it and figure out all this stuff and those bad guys get caught.

Speaker:

and then back on the.

Speaker:

Gone in 60 Seconds.

Speaker:

That's what's, it's a good show.

Speaker:

and they do that with a variety of different things.

Speaker:

It's not just cars.

Speaker:

They do it with bikes and computers and that's, of value that they've

Speaker:

got this high volume of theft with.

Speaker:

they'll create these bait situations where they want bad guys

Speaker:

to take it so they can learn from it and track 'em and potentially curb

Speaker:

the volume of that crime.

Speaker:

So a non sequitur, I'm going to comment on one of my favorite bait sort of

Speaker:

things that people do: porch pirates, where they do the glitter bombs.

Speaker:

Yep.

Speaker:

So that's also a honeypot, right?

Speaker:

Yeah, it's like a honey pot.

Speaker:

A honeypot with exploding honey.

Speaker:

yeah, I do love the Porch Pirate glitter bomb, folks.

Speaker:

and.

Speaker:

Yeah, I do.

Speaker:

I do love that very much.

Speaker:

and I think the idea with the honeypot, especially given what you're saying,

Speaker:

that it's not just a matter of alerting us, but it's also a matter of

Speaker:

learning about the attacker and also.

Speaker:

their techniques and also potentially slowing them down.

Speaker:

That is, I hadn't actually thought about that second one.

Speaker:

The idea that if we do a good enough honeypot, that it, they

Speaker:

actually might think that they've accomplished their objective.

Speaker:

That's a really interesting, method.

Speaker:

And of course, I'm assuming that it would also activate some sort

Speaker:

of notification, so that we know that something has happened.

Speaker:

did we talk, go

Speaker:

but yeah.

Speaker:

Just a question on a honeypot though, right?

Speaker:

It all boils down to how realistic of a honeypot you create.

Speaker:

Because if a bad guy knows, hey, it's like I think Mike, in one of the previous

Speaker:

episodes, you talked about, okay, if malware runs and it's Hey, this is running

Speaker:

within a virtual machine or a sandbox machine, right?

Speaker:

It's okay, maybe the ransomware.

Speaker:

Or malware doesn't operate in a certain way.

Speaker:

And so I think in order for a honeypot to be useful, it needs to really

Speaker:

emulate a real world, example use case such that a bad guy doesn't know,

Speaker:

Hey, I am in this isolated network.

Speaker:

I am attacking a machine which doesn't have any value because then

Speaker:

they'll just move on from that.

Speaker:

Right?

Speaker:

And honeypots are, to your point, easy to set up,

Speaker:

but it takes some time and management, care and feeding.

Speaker:

but if you're a, a sizable organization that has a test development

Speaker:

environment, it's a similar.

Speaker:

Exercise, right?

Speaker:

So you wanna replicate to a degree, or mirror,

Speaker:

your production environment so that your testing is applicable.

Speaker:

if you can then take that test dev environment and mirror

Speaker:

that in your honeypot, whenever you update test dev, just update

Speaker:

the copy in the honeypot.

Speaker:

and you should be using scrubbed or simulated data

Speaker:

in your test dev environment.

Speaker:

so all that stuff could, if you're doing it the right way, be pretty easy to

Speaker:

replicate or mirror in your honeypot.

Speaker:

And that takes me back, back in the day.

Speaker:

once again. I remember that we had a naming convention

Speaker:

that spoke very much to the purpose of the server, and it would allude to

Speaker:

the fact that it was a production server or a test or a dev server.

Speaker:

And in this case, if the test or dev is also gonna be acting as a honeypot,

Speaker:

then you would definitely not want to do that, right?

Speaker:

You would want it to

Speaker:

Yeah, don't call your honeypot assets.

Speaker:

don't start their names with honeypot, or test dev.

Speaker:

Yeah,

Speaker:

Yeah, honey.

Speaker:

one.

Speaker:

honey pot one.

Speaker:

Yeah.

Speaker:

Don't do

Speaker:

Someone call that a clue.

Speaker:

The idea is that it's gonna be something like it's still running

Speaker:

Windows Server 2003, with SMB turned on with no authentication.

Speaker:

RDP, my favorite, the ransomware deployment protocol, is

Speaker:

turned on and it's available.

Speaker:

All of the things that we're not supposed to do in our production environment,

Speaker:

it's expected that you would do that in a honeypot environment.

Speaker:

one of the things too, bad guys are absolutely gonna jump at the opportunity

Speaker:

to get into a network like that, but at the same time, they're gonna go.

Speaker:

They're gonna go cautiously because of how blatantly obvious it is.

Speaker:

So if you're gonna, if you're gonna, if you're gonna really walk outside

Speaker:

with your robe open, make sure that you sprinkle in some documentation, for maybe

Speaker:

there's a text file that says, we really shouldn't be putting this online yet.

Speaker:

Or some false artifacts that would state or identify risk, or,

Speaker:

my favorite is emails from executives telling IT, I don't care what the

Speaker:

risk is, we've gotta do it this way.

Speaker:

stuff like that would help, further the story and potentially keep,

Speaker:

bad guys engaged a little longer.
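
[Editor's note: the "false artifacts" idea Mike describes above can be sketched in a few lines of Python. This is an illustrative sketch only, not a tool from the episode; the filenames, contents, and 90-day backdate are hypothetical choices.]

```python
import os
import pathlib
import time

# Hypothetical decoy names and contents -- the point is plausible-looking bait,
# echoing the "we really shouldn't put this online yet" example from the episode.
DECOYS = {
    "do_not_deploy_yet.txt": "We really shouldn't be putting this online yet.",
    "risk_acceptance_email.txt": "I don't care what the risk is, we've gotta do it this way.",
}

def plant_decoys(root, age_days=90):
    """Drop decoy files and backdate them so they don't look freshly planted."""
    stamp = time.time() - age_days * 86400
    planted = []
    for name, body in DECOYS.items():
        p = pathlib.Path(root) / name
        p.write_text(body)
        os.utime(p, (stamp, stamp))  # backdate both atime and mtime
        planted.append(p)
    return planted
```

A brand-new timestamp on a "forgotten" memo is exactly the kind of tell a cautious attacker looks for, which is why the backdating step matters.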

Speaker:

Interesting that, so the social engineering aspect of

Speaker:

this is quite interesting.

Speaker:

I'm actually thinking about when, in World War

Speaker:

II, when they staged the attack for D-Day and they had that entire other

Speaker:

fake army and they had plans and all of that, right?

Speaker:

and they really did, they fooled them, right?

Speaker:

that they thought that this, that this was happening.

Speaker:

It was, but they thought it was happening in a completely different

Speaker:

place at a completely different time.

Speaker:

I've actually been to the

Speaker:

Yep.

Speaker:

location where they staged it, and it was really cool that the

Speaker:

fake tanks and things like that

Speaker:

are there.

Speaker:

Are there tools people can use for creating honeypots and configuring them?

Speaker:

I know you said it's probably not too bad to set it up, but to actually

Speaker:

like configure it and getting it more legitimate and all the rest

Speaker:

takes time, caring and feeding.

Speaker:

And so I was just wondering what are some of the tools out

Speaker:

there people may use for this?

Speaker:

there's a couple, you can just Google it.

Speaker:

you'll find open source projects there.

Speaker:

There's one called the Honeypot Project.

Speaker:

and so there's image files there, there's pretty much everything you

Speaker:

would need, to set up your own honeypot.

Speaker:

but really a honeypot is just an environment that's segmented

Speaker:

away from everything else.

Speaker:

and it could just be a, literally a mirror of your test dev environment.

Speaker:

or if you've got a backup of a production environment from

Speaker:

years ago, just spin that up.

Speaker:

'cause all that stuff isn't gonna be patched.

Speaker:

So you don't necessarily need anything special.

Speaker:

Okay.

Speaker:

That's actually good to know because that means you don't need something special.

Speaker:

You could use whatever you already have for the most part

Speaker:

in order to spin up honeypots.
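
[Editor's note: to Mike's point that you can spin one up from whatever you have, a low-interaction honeypot can be as small as a socket that logs every knock. A minimal sketch, assuming you pick a port that nothing legitimate on your network uses -- the port below is an arbitrary example, and real deployments would forward the hits somewhere quiet rather than keep them in a list.]

```python
import socket
import threading
from datetime import datetime, timezone

def run_honeypot(host="127.0.0.1", port=2222, hits=None, stop=None):
    """Accept connections and record who knocked; any hit at all is suspicious."""
    hits = hits if hits is not None else []
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(5)
    srv.settimeout(0.5)  # wake up periodically to check the stop flag
    while stop is None or not stop.is_set():
        try:
            conn, addr = srv.accept()
        except socket.timeout:
            continue
        # Record timestamp and source address, then drop the connection.
        hits.append((datetime.now(timezone.utc).isoformat(), addr[0]))
        conn.close()
    srv.close()
    return hits
```

Because no legitimate traffic should ever reach this listener, there is no tuning problem: one connection equals one alert worth investigating.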

Speaker:

Yep.

Speaker:

So there's free stuff and there's also services, mostly

Speaker:

for the research part of that.

Speaker:

I think T-Pot, T dash Pot, it's European or it's somewhere overseas.

Speaker:

It's foreign service.

Speaker:

and I think there's a pay for version of that, and there's a free version of that.

Speaker:

I need to go find the link that

Speaker:

I sent to you guys, I don't know, a week or two ago.

Speaker:

For people on budgets or time constraints,

Speaker:

I've been seeing this advertised, basically a honeypot in a box, and

Speaker:

it looks around the same size as the firewall,

Speaker:

which is a product I like a lot.

Speaker:

and it is just literally a tiny little box.

Speaker:

It is just plug and play.

Speaker:

honeypot.

Speaker:

I don't know much about how it works, but that's a tool that I've seen out there.

Speaker:

I gotta see if I can find that and put that in the episode description.

Speaker:

I think just the idea, to go back to what we say a lot, is

Speaker:

something's better than nothing, right?

Speaker:

and you crawl before you can walk

Speaker:

and walk before you can run.

Speaker:

anything is better than nothing.

Speaker:

What do you think about this idea?

Speaker:

Mike, go ahead.

Speaker:

As long as you train with it.

Speaker:

we don't have a whole lot, but I have this thing.

Speaker:

have you ever used that thing for that purpose?

Speaker:

A great example of that is having a cat.

Speaker:

I have a cat.

Speaker:

I love cats.

Speaker:

Hypothetically, I'm actually allergic to cats, but, let's just,

Speaker:

You don't

Speaker:

let's say you're a cat person.

Speaker:

example.

Speaker:

No, a cat is a great example.

Speaker:

I could use a dog, but after I get through this, you'll understand why I didn't pick

Speaker:

a dog. Maybe a chihuahua, but not my dog.

Speaker:

all right, so you have a cat.

Speaker:

You love your cat, you pet your cat, you take care of your cat.

Speaker:

Your cat is company.

Speaker:

it's a domesticated animal.

Speaker:

You don't have any other security controls in your house.

Speaker:

maybe you're a good fighter.

Speaker:

You've got nails.

Speaker:

you can scream.

Speaker:

you can run fast, but no other real security controls.

Speaker:

If someone breaks into your house, if you think about it

Speaker:

and you're comfortable doing it, throw your cat at the bad person.

Speaker:

'cause the first thing the cat's gonna do is claws first towards

Speaker:

whatever it's heading towards.

Speaker:

And bad it.

Speaker:

Put yourself in the bad guy's shoes.

Speaker:

Someone just threw a cat at you.

Speaker:

Are you still coming forward or you're stopping to like,

Speaker:

defend yourself against this cat?

Speaker:

that gives you a couple of seconds at least to, to get away.

Speaker:

'cause maybe you are fast.

Speaker:

but again, it's not that I'm not a cat person. I like cats.

Speaker:

I just, I'm allergic.

Speaker:

I couldn't use dogs.

Speaker:

'cause obviously in that example, dogs are heavier, and dogs, if you've

Speaker:

trained them or you've put them in that situation, would attack or at least

Speaker:

Growl and

Speaker:

Yeah, it

Speaker:

make a scene.

Speaker:

Make someone think twice.

Speaker:

I used to have a Great Pyrenees, a 160-pound Great Pyrenees, super

Speaker:

sweet dude, big male, and I was outta town.

Speaker:

Once, he set off the motion alarm in the house, which called the police.

Speaker:

Police showed up to check to make sure nothing was in the house.

Speaker:

And.

Speaker:

He got to the side door and this big Great Pyrenees is just sitting

Speaker:

on his butt, looking out the door, head down a little bit.

Speaker:

And the police, I was on the phone with the officer, and he's like, yeah,

Speaker:

no one's going in that house.

Speaker:

that's different.

Speaker:

That's different.

Speaker:

but if all you have is a cat and you think about it and

Speaker:

you've actually done a little.

Speaker:

Maybe mental exercise.

Speaker:

Throwing your cat at a bad person is the next best thing to nothing. To

Speaker:

Curtis's point, it's better than nothing.

Speaker:

Right.

Speaker:

but you have to have at least put yourself in that situation so that

Speaker:

you can think of or do, whatever that one thing is that's better than

Speaker:

Don't try it with the goldfish.

Speaker:

it.

Speaker:

Though throwing a goldfish bowl at somebody is probably good.

Speaker:

Another important thing about a honeypot, and we're

Speaker:

in the action items part here.

Speaker:

So if we're gonna do something right, if we're gonna do a honeypot

Speaker:

there, part of that needs to be a notification, right?

Speaker:

And so you do need to create a type of notification that won't easily

Speaker:

be noticed, 'cause you want to be notified that somebody has hit the

Speaker:

tripwire, that somebody has put

Speaker:

their hands in the honeypot, but you don't want the bad guys

Speaker:

to know that they've been caught with their hands in the honey pot.

Speaker:

You wanna monitor the honeypot just like you would other

Speaker:

parts of your network, because you need to know

Speaker:

bad guys are attacking your honeypot so that you can better prepare

Speaker:

for the scenario where they attack your real network, if

Speaker:

you're using it to learn from them.

Speaker:

Obviously you're gonna collect all that telemetry data.

Speaker:

and if they start taking things from you, that's a great place

Speaker:

to put those canary files.

Speaker:

So now you can track where it goes.

Speaker:

I was going to just ask about canaries, 'cause we talked about it at the very beginning

Speaker:

and then we moved off to honeypots.

Speaker:

So yeah, canary files would be something like a

Speaker:

honeypot within a server, right?

Speaker:

So you have a regular server, and then you've got, these files that

Speaker:

were very, very... but are you okay with that description, that

Speaker:

it's like a honeypot within a server?

Speaker:

Does that sound okay?

Speaker:

Mike

Speaker:

doesn't

Speaker:

So a honeypot is really just anything that

Speaker:

looks appealing, it's gonna attract.

Speaker:

So you're attracting bees, you're attracting things to the honey.

Speaker:

So yeah, the canary file could be the honey for sure.

Speaker:

So I think it's just a different

Speaker:

term for kind of the same thing.

Speaker:

It's a file as opposed to an entire server.

Speaker:

And with that file,

Speaker:

the objective is a little different.

Speaker:

So the honeypot in general is designed to capture, it's designed to distract,

Speaker:

it's designed to attract and capture activity, to learn from, and to keep them away

Speaker:

from the beehive. But the canary file

Speaker:

adds a couple of other objectives,

Speaker:

because it's a file, a particular object, and that object is typically a file,

Speaker:

like a spreadsheet or a Word document.

Speaker:

It's a file; it's not a folder, it's not a volume, it's a file.

Speaker:

That file has its own little tripwires, and what's

Speaker:

going to trip them is any change to the attributes and metadata of that file.

Speaker:

So if we set that file two days ago, the last accessed and

Speaker:

created date was two days ago.

Speaker:

If any of the metadata changes today, that trips the canary file, so we'll get an

Speaker:

alert that says, Hey, someone touched it.
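
The metadata tripwire Mike describes can be sketched in a few lines. A minimal example; the bait filename is a made-up assumption, picked only because it looks valuable:

```python
import os

# Hypothetical bait file -- name it something tempting ("payroll",
# "passwords") that no legitimate process ever touches.
CANARY = "/srv/finance/Q3_payroll.xlsx"

def snapshot(path):
    """The metadata the tripwire watches: size, modified and accessed times."""
    st = os.stat(path)
    return (st.st_size, st.st_mtime, st.st_atime)

def tripped(path, baseline):
    """True if anything about the file changed since we planted it."""
    return snapshot(path) != baseline
```

Record `snapshot(CANARY)` when you plant the file, poll `tripped()` on a schedule, and fire your quiet notification when it returns True. One caveat: whether atime updates on read depends on the filesystem's mount options (e.g. `relatime` on Linux), which is partly why commercial canary tools watch access at a lower level than a polling script.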

Speaker:

Then in addition to that, if someone takes it,

Speaker:

We can track that.

Speaker:

So wherever it lands, that hosted, that rented command and control server, the bad

Speaker:

guy's cell phone, the bad guy's laptop.

Speaker:

It goes from here to there, it calls home.

Speaker:

It beacons, to the extent possible.

Speaker:

If a bad guy really knew what he was doing, he would prevent that.
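
The "calls home" part usually works by embedding a reference to a unique URL inside the document (a remote image, a template); when the stolen file is opened, the viewer fetches it, and that hit is the evidence. A minimal sketch of the listener side, with a made-up token path:

```python
import datetime

# Hypothetical unique token path -- real canary token services issue
# one per file, so a hit identifies exactly which file was opened.
TOKEN = "/t/9f3c2a"

def record_beacon(path, client_ip, now=None):
    """Turn a request against the token URL into an evidence line.

    Returns None for any other request, so ordinary traffic to the
    same server doesn't generate false alerts.
    """
    if not path.startswith(TOKEN):
        return None
    ts = (now or datetime.datetime.now(datetime.timezone.utc)).isoformat()
    return f"{ts} beacon from {client_ip}"
```

You'd wire this into whatever web server receives the callback and route the evidence line to your alerting. As Mike says, it's best-effort: a careful attacker who opens the file offline or blocks egress never triggers the fetch.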

Speaker:

But the canary,

Speaker:

It.

Speaker:

excuse me.

Speaker:

yeah.

Speaker:

Canary files could be classified as malware, because the

Speaker:

recipient isn't fully aware of the intent or behavior of what they took.

Speaker:

So that would be classified as malware.

Speaker:

It's not.

Speaker:

It says it's the financial spreadsheet, but really it's a canary file,

Speaker:

so that would be classified as malware.

Speaker:

It's not presenting itself as what it really is.

Speaker:

But yeah, canary files do a little bit more, and they're designed to go

Speaker:

outside your honeypot, go with the bad guy, and hopefully report back.

Speaker:

It.

Speaker:

It's like in all those TV shows where someone downloaded

Speaker:

the file, oh, I can track 'em with GPS at this particular location.

Speaker:

Okay, let's go bust it and grab them.

Speaker:

Sort of, yeah.

Speaker:

Yeah.

Speaker:

But one final thing, just to note: if you're looking

Speaker:

at implementing a honeypot or canary files, it's important to

Speaker:

have your clocks synchronized.

Speaker:

So use NTP to make sure that all your clocks are the same.

Speaker:

Why would that matter, Mike?

Speaker:

Time synchronization in general is just important from a networking

Speaker:

perspective so that things work.

Speaker:

Misaligned time sync is actually a vulnerability.

Speaker:

And so you could use that as an enticement, but if you're looking

Speaker:

at collecting data for evidence and for behavior and for tracking, that

Speaker:

information needs to be correct.

Speaker:

It happened on this day and time, and you can tie that together with

Speaker:

other sources. So if I'm gonna catch a bad guy and I say, you

Speaker:

hacked my system last Tuesday, but all the logs and event data are from 1999,

Speaker:

You're gonna have trouble in court.

Speaker:

Or January 1st, 1970.

Speaker:

Yep.
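
(That 1970 date is the Unix epoch, which is what a clock that was never set reports.) For the curious, the arithmetic behind NTP's offset estimate is simple; a sketch using the standard four timestamps, not a replacement for actually running an NTP daemon like chronyd or ntpd:

```python
# Seconds between the NTP epoch (1900-01-01) and the Unix epoch (1970-01-01).
NTP_UNIX_DELTA = 2_208_988_800

def ntp_to_unix(ntp_seconds):
    """Convert an NTP-era timestamp to Unix time."""
    return ntp_seconds - NTP_UNIX_DELTA

def clock_offset(t1, t2, t3, t4):
    """Estimated offset of the local clock from the server's clock.

    t1 = client send, t2 = server receive, t3 = server send,
    t4 = client receive, all in seconds. Network delay cancels out
    (assuming it's roughly symmetric), leaving just the skew.
    """
    return ((t2 - t1) + (t3 - t4)) / 2.0
```

The evidentiary point Mike makes is exactly why this matters: if every box computes and corrects this offset continuously, the timestamps on your honeypot logs, canary alerts, and production events all line up, and the timeline holds together in court.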

Speaker:

All right, I think we've talked about canary files enough.

Speaker:

Any final thoughts?

Speaker:

You should go watch Zoolander.

Speaker:

Go watch.

Speaker:

I think I got the black long pop.

Speaker:

And how about you, Mike?

Speaker:

Any final thoughts?

Speaker:

I think canary files and honeypots are useful tools if you've got the

Speaker:

time to deploy them and to keep them updated, along with the playbook for

Speaker:

what happens when bad guys actually start poking around. And it's not always bad guys.

Speaker:

Sometimes it's just, what we call script kiddies.

Speaker:

Just the curious, low-level people that are just getting into

Speaker:

cyber, or are curious about cyber.

Speaker:

They're running queries across the internet with the

Speaker:

tools they have access to.

Speaker:

They're gonna find your honeypot.

Speaker:

And so what do you do?

Speaker:

What do you do when they do? It's kinda like back in the day when

Speaker:

your house is on fire, what do you do?

Speaker:

You go to the closest exit, stop, drop and roll, get outta the house.

Speaker:

We don't do that anymore.

Speaker:

I don't think my kids would know what to do if my house caught on fire.

Speaker:

'cause there's no commercials, there's no public service announcements.

Speaker:

But quite similarly, if you're gonna build something to attract bad guys,

Speaker:

make sure you've got a plan for how to handle that when it does happen.

Speaker:

I have one question for Mike.

Speaker:

Since you engage with a lot of people who've been attacked by ransomware,

Speaker:

malware, et cetera, how often do you see cases where the honeypot

Speaker:

exposes the attack, the tripwire firing before the main attack comes in?

Speaker:

Very often, if it's a representative honeypot. In other words,

Speaker:

everything the bad guy did to this honeypot network would

Speaker:

have been similar to what they did to the production network.

Speaker:

If they're not the same, then you've got some indication of how a bad

Speaker:

guy operated and where they ended up.

Speaker:

You can compare that to how that would've gone against the production environment.

Speaker:

But very often, you see bad guy tactics

Speaker:

from one type of bad guy, one group, one skillset.

Speaker:

It's kinda like the real world, where we ask: what's their MO, their modus operandi?

Speaker:

Some bad guys on the cyber side, they use the same tools.

Speaker:

They follow the same process.

Speaker:

They do this first and this second, or there's this decision tree, but

Speaker:

it's almost always the same from one victim to the next.

Speaker:

So is it safe to say though, if a bad guy attacks a honeypot, you detect it, there's

Speaker:

a high likelihood that your production environment is going to be safe?

Speaker:

It's gonna be next.

Speaker:

Okay.

Speaker:

Okay.

Speaker:

That's a yes.

Speaker:

Yeah, so the hunt's on, because at some point they're gonna go:

Speaker:

Why aren't they reacting?

Speaker:

Why aren't they shutting this down?

Speaker:

Why aren't they shutting me out?

Speaker:

I'm just in here having a good time.

Speaker:

I'm gonna invite some friends.

Speaker:

At some point they're gonna go, this is weird.

Speaker:

I wonder if we're in a honeypot.

Speaker:

Now there's a good chance they're gonna get upset about that.

Speaker:

Back to Curtis's point, we've gotta monitor that honeypot so that you're

Speaker:

aware when they're attacking you.

Speaker:

That's your defense.

Speaker:

That's one of those layers.

Speaker:

That's gonna give you time now to really put focus on your

Speaker:

production environment, 'cause it's likely gonna be the next target.

Speaker:

And now you've got a chance to be ready and prepared, and maybe

Speaker:

even add some staff, call a friend, start blocking stuff

Speaker:

that you don't necessarily need.

Speaker:

Start shutting remote access down and things like that.

Speaker:

Turn off all your third party vendors that don't need access right now.

Speaker:

And maybe for the next six weeks, really put focus on that, 'cause

Speaker:

If they find out that there was a honeypot, yeah.

Speaker:

They're gonna look for your real network, your real environment.

Speaker:

Call a friend.

Speaker:

Call a Mike.

Speaker:

Right.

Speaker:

And with that, I will say that is a wrap.