
10: Cryptomath (Cryptography)

Breaking Math Podcast
964 plays · 7 years ago
Language and communication are a huge part of what it means to be a person, and a large part of this importance is the ability to direct the flow of that information; this is a practice known as cryptography. There are as many ways to encrypt data as there are ways to use them, ranging from cryptoquips solvable by children in an afternoon to four-kilobit RSA taking eons of time. So why are there so many forms of encryption? What can they be used for? And what are the differences in their methodology, if not philosophy?

---

This episode is sponsored by
· Anchor: The easiest way to make a podcast. https://anchor.fm/app

Support this podcast: https://anchor.fm/breakingmathpodcast/support
Transcript

Intro: Promotions and Announcements

00:00:00
Speaker
With Lucky Land slots, you can get lucky just about anywhere. Dearly beloved, we are gathered here today to... Has anyone seen the bride and groom? Sorry, sorry, we're here. We were getting lucky in the limo when we lost track of time. No, Lucky Land Casino, with cash prizes that add up quicker than a guest registry. In that case, I pronounce you lucky.
00:00:21
Speaker
Play for free at LuckyLandSlots.com. Daily bonuses are waiting. No purchase necessary. Void where prohibited by law. 18 plus. Terms and conditions apply. See website for details.
00:00:30
Speaker
It is Ryan here, and I have a question for you. What do you do when you win? Like, are you a fist-pumper, a woohooer, a hand clapper, a high-fiver? I kind of like the high-five, but if you want to hone in on those winning moves, check out Chumba Casino. At chumbacasino.com, choose from hundreds of social casino-style games for your chance to redeem serious cash prizes. There are new game releases weekly, plus free daily bonuses, so don't wait. Start having the most fun ever at chumbacasino.com.
00:01:00
Speaker
Hey Breaking Math fans. First, I want to thank you for listening. I have an important message for everyone. You can start your own podcast right now with Anchor. Anchor lets you create and distribute your own podcast. Just get an idea, record and upload. It's just that easy. Anyone can do it.
00:01:17
Speaker
I'm on my way to accomplishing my dream, and you can too. Just get on your device's app store and download Anchor. It contains everything you need to make a podcast. With Anchor, you can put your podcasts on all the big platforms. Apple Podcasts, Spotify, Google Podcasts, Amazon, and more. Reach the whole world with Anchor. Best of all, Anchor is free. You have nothing to lose with a free platform. Download the Anchor app or go to anchor.fm to get started.
00:01:47
Speaker
This episode is distributed under a Creative Commons Attribution-ShareAlike 4.0 International license. For more information, visit creativecommons.org. Somebody stole our website. Oh no, whatever shall we do? I mean, I guess you could go to the new website, http://breakingmathpodcast.app with no www for all you old timers.
00:02:15
Speaker
So breakingmathpodcast.app? I mean, if you're into that sort of thing.

The Role and History of Cryptography

00:02:30
Speaker
Language and communication are a huge part of what it means to be a person. And a large part of this importance is the ability to direct the flow of information.
00:02:44
Speaker
This is a practice known as cryptography. There are as many ways to encrypt data as there are ways to use them, ranging from cryptoquips solvable by children in an afternoon to four-kilobit RSA taking eons of time. So why are there so many forms of encryption? What can they be used for? And what are the differences in their methodology, if not philosophy? All of this and more on this episode of Breaking Math. Episode 10: Cryptomath.
00:03:20
Speaker
I'm Jonathan. And I'm Gabriel. And today we have on a new guest, would you like to introduce yourself? I'm Anthony Jay Mintley. And Anthony, what's your background? I'm a graduate of engineering at University of New Mexico, and I'm an open source software developer. Anthony, would you tell us a little bit about your interest in cryptography? Well, I like to pretend I'm a super spy like everybody's after me.
00:03:47
Speaker
Now, in reality, you know, crypto is something that can help you be safe from people stealing your information. You know, you read in the news all the time of identity theft and so on. And it's, I don't know, just a way that you can protect yourself in certain ways. And also joining us today is? I'm Jaleela Arthur, your studio engineer. Happy to be here.
00:04:12
Speaker
And Gabriel, you're going to talk about the recent relevance of cryptography with respect to viruses in Google documents. Yes, absolutely. As of recently, there was actually a big notice that made the rounds. I don't know what news sources you all watch, but a lot of the news sources talked about a Google Docs virus. I have some colleagues who were on this show previously. Julian Wilde sent me a message.
00:04:34
Speaker
And he told me not to open any emails from him that have a Google Drive attachment. Now the thing is, the entire Breaking Math podcast, we all use Google Docs regularly in our planning. So that was very interesting of him to tell me. So it looks like this virus will get into your system and then it'll also send out other emails from your Google address and invite people to a shared Google document.
00:04:57
Speaker
So that's very relevant. Now this happens, oh goodness, in my lifetime every single year there's always a new news report about another computer virus that is stealing information and making people vulnerable. Yeah, I think one of the first ones that I was kind of aware of was the ILOVEYOU virus that spread via, wasn't it a macro in some program?
00:05:19
Speaker
I'm not familiar with that one. Tell me a bit about the ILOVEYOU virus. Basically you open this email and if you ran it, what it did was email itself to all of your colleagues and so on, and it spread that way. There was an early virus though in the early days of the internet that worked with a backdoor in, I believe, SSH that literally took down basically the entirety of the internet for a while.
00:05:43
Speaker
The guy who wrote it said that it was just for personal testing, but you never know. You had something to add? That was the Morris worm, wasn't it? Yes, yeah. That was, I think, the first computer virus that attacked networks on a widespread scale. And Morris was the son of a famous computer scientist who was also named Robert Morris.
00:06:05
Speaker
Well, so I think it's very, very clear why this is an incredibly relevant topic to discuss.

Cryptography Techniques and Applications

00:06:11
Speaker
I always like to talk about our topic and why we chose it on breaking math. What's interesting is even the name breaking math, I think of, you know, something involving encryption, you know what I mean? So I have a question for you, Jonathan. How would you explain to any listeners who are curious why we're discussing cryptography on a math podcast?
00:06:31
Speaker
It's one of those things where, well, it's twofold. A lot of the math behind cryptography started to be developed in the 1800s and earlier. I mean, for a long time there were basically no practical applications of pure number theory.
00:06:47
Speaker
And now there's a ton. I think the NSA employs a good portion of the world's mathematicians, at least relative to its size. And second of all, it's just an interesting topic from the mathematical point of view. I'm a programmer.
00:07:04
Speaker
Something related to cryptography is called hashing, and basically it's when you want to look something up by name. So if you are looking something up on certain types of programs and you put in a name, and it looks up the information related to that name, it changes the name into a number and that's how it looks it up. So it's intricately tied to math, and it always has been.
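A minimal sketch of the name-to-number lookup described here; the hash function and table are illustrative inventions (Python's own dict does the same job internally with a different hash), not anything from the episode:

```python
def toy_hash(name, table_size=101):
    # Illustrative (non-cryptographic) string hash: maps a name to a bucket number
    h = 0
    for ch in name:
        h = (h * 31 + ord(ch)) % table_size
    return h

# A minimal hash table: each bucket holds (key, value) pairs
buckets = [[] for _ in range(101)]

def put(key, value):
    buckets[toy_hash(key)].append((key, value))

def get(key):
    # Hash the name to jump straight to the right bucket
    for k, v in buckets[toy_hash(key)]:
        if k == key:
            return v
    return None

put("alice", "555-0100")
put("bob", "555-0199")
```

Looking up "alice" hashes the name to a bucket index and scans only that bucket, rather than the whole table.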
00:07:31
Speaker
It's not one of those things that only recently got intricately tied to math, and we're going to talk about the art and history of cryptography in just a second. On this episode, we'll be discussing the art of cryptography itself, general encryption strategies, social engineering, which has a lot to do with Feynman actually, white, gray, and black hats and what's the difference between them, and finally the arms race that is sort of intrinsic to cryptography.
00:07:56
Speaker
I got to say I'm really excited about the Feynman one. I find that if you go to any party where there's people who are often interested in physics, you just name drop Feynman and people just open up to you. Isn't that, that's the coolest thing. So I'm very excited whenever we get to talk about Feynman.
00:08:12
Speaker
The art of cryptography goes back a little more than 2,000 years to the Roman Republic and has continued increasingly to the present time. What was once the purview of kings and generals is now used frequently by common people; it has mainly been technology and an increase in mathematical knowledge which has driven this change.
00:08:30
Speaker
So what types of cryptography were and are common? Okay, so before we go into the exact types of cryptography, Anthony, I was wondering if you wanted to give a quick overview of what public versus private key cryptography is, or symmetric versus asymmetric, and then I'll give some analogies.
00:08:48
Speaker
Okay, so when people think of crypto, they usually have this sort of vague idea of a single key that two people use to encrypt and decrypt a message. And that is a real type of crypto, a very common type of cryptography. But there's another one that's got some very interesting properties, and it's called key-pair encryption.
00:09:12
Speaker
And key pair encryption involves every party in the conversation, each having two keys, two keys that work together, a public key that they show to everybody and a private key that they keep private. And they can do certain kinds of operations with those keys.
00:09:29
Speaker
Yeah, it's almost like a mailbox where you have the key to get all your mail, but everybody has a key to be able to put mail in your specific mailbox. And the other kind is like you have to use the exact same key and put mail into the same mailbox and kind of retrieve it. Would you say that that's accurate? I think it works.
00:09:48
Speaker
That was extremely helpful. Now, I do not have a background in cryptography whatsoever. So I really appreciate the analogy about the mailbox. Yeah, that was very helpful. That's all. I don't know if that added a lot. And so you talked about some operations and you're talking about signing, right? And would you like to go into what signing is?
00:10:06
Speaker
So there are two sets of operations that are pretty fundamental to key pair cryptography. They're signing and verifying and encrypting and decrypting. And often they're combined so that you both sign and encrypt or both verify and decrypt. So signing is the, I guess the first one to go over. Signing uses your private key.
00:10:31
Speaker
So, for example, if Gabe and I wanted to have a conversation and we're leaving notes on a table for each other to pick up, well, anyone can leave a note on the table. How does he know it's written from me? If he has your seal, right? His seal that he puts on with digital wax.
00:10:50
Speaker
Right. So historically, people used signatures as something that were signatures or seals, like on a ring or whatever, as a method of something that couldn't be easily forged and could uniquely identify that person. So with a key pair, someone's private key can be used to create a signature for any message. And the public key is used to verify the signature, correct?
00:11:18
Speaker
Then, yes, the recipient can use my public key, which is posted on bulletin boards. Everybody knows what my public key is. And they can use that to verify that I signed that message. And it's not only useful for other people to have it, it's even more secure when the most people have it, right? Yes.
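The sign-with-private, verify-with-public flow can be sketched with textbook RSA and deliberately tiny primes. The numbers and function names below are illustrative; nothing this small is remotely secure, and real signatures also use padding schemes:

```python
import hashlib

# Textbook RSA with tiny primes; illustration only, never secure at this size
p, q = 61, 53
n = p * q                              # public modulus (3233)
e = 17                                 # public exponent
d = pow(e, -1, (p - 1) * (q - 1))      # private exponent, kept secret

def sign(message):
    """Only the private-key holder can produce this value."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)

def verify(message, signature):
    """Anyone holding the public pair (n, e) can check the signature."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest

sig = sign(b"meet at noon")
```

Tampering with either the message or the signature makes verification fail, which is the "digital wax seal" property described above.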
00:11:36
Speaker
Just in the interest of breaking this down to the most simple terms, I was first introduced to cryptography when I was an eight year old for this road, it was via this road trip book, it was games for road trips, and they had a wheel.
00:11:52
Speaker
So when you talk about this key pair, I'm just interested in kind of relating it to that so that I can understand it. So the wheel, you can change what each character meant. So when you have the public key, it's like somebody else has that same wheel and the private key. I don't know. Could you, is this helpful to you in breaking it down?
00:12:11
Speaker
What you're describing is probably the symmetric kind, the kind that only has a single key that is used by both the author and the recipient of the message. What is interesting, though, is that because public keys have to be so difficult to reverse, and that's something we'll be talking about in a little bit, public-key operations are a lot of times more computationally expensive for the same security
00:12:37
Speaker
than symmetric ones. And symmetric keys are even sometimes exchanged using asymmetric, or public-key, cryptography. You can imagine it kind of like making a wormhole. Now, I'm with Jaleela on this because cryptography is not my forte. So the terms and the analogies that you guys are using are brand new to me. So I'm still not entirely following with respect to the private key and the public key.
00:13:02
Speaker
So public-key cryptography actually depends on multiplying prime numbers together. So let's say I gave you, for example, the number 309. It's going to take you longer to figure out that that's divisible by the primes 3 and 103 than it is to multiply those two numbers together and get 309.
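That asymmetry can be demonstrated directly: the forward direction is a single multiplication, while the naive way back is to try divisors one at a time. A sketch (the `factor` helper is a toy, not a serious factoring algorithm):

```python
def factor(n):
    """Naive trial division: returns the smallest prime factor and its cofactor."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1   # n itself is prime

# Forward direction: one multiplication
product = 3 * 103
```

For 309 the search is quick, but the number of candidate divisors grows rapidly with the size of the primes, which is what RSA leans on.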
00:13:29
Speaker
So one of the earliest forms of cryptography is a simple letter substitution cipher. And you can see these on a once popular children's toy. Oh, that's right. That's right. Actually, I recently saw the show Scrubs where JD and Turk were very excited because their cereal box gave them two decoder rings and they wear them everywhere.
00:13:50
Speaker
And there's this popular game you might have seen in a newspaper called a cryptoquip that basically uses this. It's not secure at all. It takes a little bit longer to figure out than, you know, just reading text. But, I mean, if you just look at letter frequencies, you can figure it out pretty easily. How secure would you say that is? Well, my grandma can decode it. So I'd say it's not that secure.
00:14:18
Speaker
So this is like where J equals K, and then you can figure out the offset of the alphabet. OK, so if J equals K, then most likely K equals L, or you can figure out the frequency. That's what you're talking about.
00:14:36
Speaker
That's a Caesar cipher. It works for things that are even more complicated than that, for instance, where J is K but K is Q, for example. Oh, so it doesn't have to be linear. But cryptoquips are linear? No, cryptoquips are not linear. Cryptoquips are just kind of scrambled up.
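The linear case, the Caesar cipher, is only a few lines of code; this sketch is illustrative:

```python
def caesar(text, shift):
    """Shift every letter by a fixed offset, wrapping around the alphabet."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)   # leave spaces and punctuation alone
    return "".join(out)
```

A shift of 1 turns J into K, exactly the offset pattern being described; decryption is just the opposite shift.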
00:14:57
Speaker
and they're really, they're fun to solve actually. So you only need about 20 characters worth of stuff for it to be decryptable. Anthony, do you have anything to say about the number of letters that you have to have or the amount of information you have to have for something to be decodable?
00:15:15
Speaker
In terms of formally knowing mathematically how many letters you need, I don't have much input there, but you can look at the patterns of words and consider E. I mean, anyone who's played Wheel of Fortune knows.
00:15:31
Speaker
E is the most common letter in the English language, and T is up there also. And so by looking at the frequency of the letters, you can assume that the most common symbol is probably an E or a T or some other common letter, and from there you can figure out enough to decode the words. So it's something that you can just analyze and figure out. The security in that is not that strong.
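The frequency count just described can be sketched in a few lines; the helper name is illustrative:

```python
from collections import Counter

def letter_frequencies(ciphertext):
    """Letters of the text counted and sorted, most common first;
    the starting point for breaking any simple substitution cipher."""
    letters = [ch for ch in ciphertext.lower() if ch.isalpha()]
    return Counter(letters).most_common()
```

Run on a substitution-ciphered message, the most frequent symbol is a strong candidate for E, and you work outward from there.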
00:15:56
Speaker
Yeah, it's fascinating. And I wanted to go back real quick and say, so I'm still not clear on how a decoder ring itself works. Oh, a decoder ring just literally has the letters on the inside and a shuffled bunch of letters on the outside. And so when you're encoding it, you look for the letter from the outside to the inside. And when you're decoding, you do the opposite way. And what's bizarre about that is if you get three decoder rings, you can make an Enigma machine.
00:16:23
Speaker
Oh my gosh, that's amazing. And so, an Enigma machine is much more complex. Now, I've seen the recent movie with Alan Turing, so I'm familiar with the Enigma machine. But for our listeners, just in case they're not familiar with it, I don't feel confident myself explaining it. It was used by the Germans during World War II, and by many other people.
00:16:43
Speaker
Yeah, and before that it was used by a lot of businesses because, as you know, corporate spies used to be an even bigger part of business. But what it is, basically, is: imagine you have three decoder rings on three fingers. So let's say you look up A, and then inside of that it's B, and then you look at B on the outside of the second one.
00:17:05
Speaker
And then the inside is, like, Q. And the third one, it goes from Q to F. So you write down F. But then you turn the third decoder ring by one, and then you just keep doing that. Oh my gosh. You put in the cryptotext and get out plaintext. And Gabriel, what are cryptotext and plaintext? Oh, yeah, it's actually in the word itself. Cryptotext is just the encrypted text and plaintext is just the unencrypted text.
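The three-decoder-rings picture can be sketched as a toy rotor machine. The "wirings" below are just example scrambles, and a real Enigma also had a reflector, a plugboard, and odometer-style stepping; this only illustrates the key idea that the last ring turns after every letter, so the same letter encrypts differently each time:

```python
import string

ALPHA = string.ascii_uppercase

# Three "decoder rings": fixed scrambles of the alphabet
ROTORS = [
    "EKMFLGDQVZNTOWYHXUSPAIBRCJ",
    "AJDKSIRUXBLHWTMCQGZNPYFVOE",
    "BDFHJLCPRTXVZNYEIWGAKMUSQO",
]

def encode(plaintext):
    """Pass each letter through all three rings, then turn the last ring by one."""
    offset, out = 0, []
    for ch in plaintext.upper():
        if ch not in ALPHA:
            out.append(ch)
            continue
        for i, rotor in enumerate(ROTORS):
            idx = ALPHA.index(ch)
            if i == 2:                      # the stepping ring
                idx = (idx + offset) % 26
            ch = rotor[idx]
        out.append(ch)
        offset += 1
    return "".join(out)

def decode(ciphertext):
    """Run the rings in reverse order, undoing the stepping ring first."""
    offset, out = 0, []
    for ch in ciphertext.upper():
        if ch not in ALPHA:
            out.append(ch)
            continue
        for i, rotor in enumerate(reversed(ROTORS)):
            idx = rotor.index(ch)
            if i == 0:
                idx = (idx - offset) % 26
            ch = ALPHA[idx]
        out.append(ch)
        offset += 1
    return "".join(out)
```

Because of the stepping, "AA" encodes to two different letters, which is what defeats simple frequency analysis.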
00:17:29
Speaker
This area really, really excites me. See, this is why math is really, really cool. Just the fact that it can go from simple to complex that quickly. Just with three decoder rings, you can already make an enigma machine. That's one of the fascinating things about mathematics. Yeah, and RSA basically works by you have a message and you have the public key, which is in two parts.

Historical Case Studies in Cryptography

00:17:53
Speaker
I can't remember what the two parts are called. Do you remember what they're called, Anthony?
00:17:57
Speaker
I don't quite get it
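For what it's worth, the two parts of an RSA public key are usually called the modulus and the public exponent. A textbook sketch with deliberately tiny primes (illustration only, never secure at this size):

```python
# Textbook RSA key generation, encryption, and decryption
p, q = 61, 53
n = p * q                            # the modulus, one part of the public key
e = 17                               # the public exponent, the other part
d = pow(e, -1, (p - 1) * (q - 1))    # the private exponent, kept secret

def encrypt(m):
    # Anyone can encrypt with the public pair (n, e)
    return pow(m, e, n)

def decrypt(c):
    # Only the holder of d can reverse it
    return pow(c, d, n)
```

Recovering d from (n, e) requires factoring n, which is the hard problem from the 309 = 3 × 103 example scaled up to enormous primes.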
00:18:20
Speaker
There's always a sort of duality in cryptography. Now I want to talk about a few forms of encryption that are being used historically that aren't related to the Enigma or the decoder rings. And one of them was used during the Revolutionary War. Gabriel, what is that?
00:18:40
Speaker
Yeah, now this is one that I am aware of actually. This is one that my friends and I used to make at sleepovers when I was in fourth grade. I don't know what, I think I was reading, oh gosh, what mystery book was it? It was like The Boxcar Children. Hardy Boys. Or Hardy Boys or something like that. Yeah, yeah, yeah, one of those stories. And there was a lot of talk of secret messages in them. We simply did the one where you have a newspaper article and you identify where the letters are that you need in your message, just anywhere that you'll find them.
00:19:07
Speaker
And then you have a card where you have hole punches, and the holes are in specific locations corresponding to the letters in your message. And all you do is put the card right over the newspaper article and there you go, you can read your message. So I think that would require the person receiving the message to know exactly where to put the card; for somebody who didn't know exactly where to put it, I see that as a further level of encryption.
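The hole-punched card being described is a grille cipher. A minimal sketch with a made-up page and hypothetical hole positions:

```python
def read_grille(page_rows, holes):
    """Overlay the hole-punched card: keep only the letters visible at the
    given (row, column) positions."""
    return "".join(page_rows[r][c] for r, c in holes)

# A made-up "newspaper page" and hole positions; both are illustrative
page = [
    "the market opens early",
    "every lot must be sold",
    "taxes are due tuesday",
]
holes = [(0, 0), (1, 0), (2, 1)]   # spells out the hidden message
```

Without the card, the page is just an ordinary article; with it, the hidden word reads straight off.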
00:19:42
Speaker
I probably butchered that, but that means do you speak Navajo in Navajo? Oh, okay. What is that referred to? Oh, yes. Now, this is a whole other topic. This is a very fascinating topic and this is relevant to World War II. In addition to encryption, there's also the Navajo language.
00:19:59
Speaker
And during World War II, on the American side, on the ally side, we had Navajo code talkers who communicated and they were utilized because we knew that the enemy, that the Axis powers did not have anybody who could translate the Navajo language. So that's also a means of encrypting information just from lack of access to a key, aka a translator.
00:20:25
Speaker
There's a great movie about the war. OK, I'm going to rephrase that. There's a movie about that. A Nicolas Cage movie called Windtalkers, in fact, and that does, you know, communicate the principle of the story. Yeah, the sad fact about that is that a lot of Navajo code talkers were shot because they were mistaken for Japanese soldiers.
00:20:44
Speaker
So many of our examples come right out of World War II, which is just fascinating. We could do a podcast about the encryption of almost any war. What would you call this? Security by obfuscation, or do you have any thoughts on this?
00:21:08
Speaker
It was not exactly a strong code in terms of mathematics. If Japan had had native Navajo speakers, they probably could have figured it out. The primary protection was that no one knew the language. There was a layer of code underneath the Navajo words, so a Navajo speaker who didn't know the code would still not be able to decode the messages. But he could probably figure it out given a little bit of time.
00:21:38
Speaker
So it was sort of a security by obscurity method, one that worked well, but it probably wouldn't hold up today. Although some people still use that, especially in hardware. I've heard of a few examples that are just atrocious. I wish I knew specifics right now.
00:21:57
Speaker
So back to the Enigma and the Bombe machine which decoded it: it exploited a flaw in the Enigma design, and that flaw was that no letter could be encrypted to itself.
00:22:11
Speaker
Interesting. Oh, wow. And actually I was not familiar with that. So I know about the Enigma machine from the recent Alan Turing movie, The Imitation Game, in which the actor Benedict Cumberbatch plays Alan Turing. However, I was not aware of the exploit. Can you tell me a little bit more about that?
00:22:29
Speaker
Sure, it basically works by a little bit of statistical analysis, basically asking where letters should not be, and then it relied on a lot of clever tricks like that. Basically, they cracked it by
00:22:50
Speaker
taking a message that they knew the contents of and then just reverse engineering it. Interesting, interesting. Yeah, and actually in the movie, they have a great scene where they illustrate one of the hacks, I guess we'll call it. And do you all, I don't know if you all have seen that movie. I think that they figured out somehow that every German message started off the exact same way. And what was it? It started off with, like,
00:23:13
Speaker
Heil Hitler or something of that sort. Using that, they were able to leverage that information to help them decode the German code. It's like modern code breaking, which we're going to talk a little bit about.
00:23:32
Speaker
If I know the random number generator that you're using to make your random or pseudo-random numbers, then I'm going to be that much closer to knowing what your code is. I mean, it even gets to the point where if you have a sensitive enough microphone listening to the internal workings of the computer, you could break RSA pretty quickly. And RSA is, in theory, absurdly secure. Wow, that is fascinating.
00:24:02
Speaker
So by having a microphone record the noises of the computer, they can break the code.
00:24:08
Speaker
Yeah, well, because they know that you're doing, like, addition is taking you this long. Actually, if they know how long it takes you, even that can give them enough information to help break the code a lot quicker. God, that's so interesting. Any information. I love how almost all of this, all of this can always be a shout back to an old episode. And recently, I've been doing callbacks to the episode on information theory. I love information theory. Fascinating how, again, as you just said earlier, if you can identify the time it takes to do an operation, that gives you much needed information. That's fascinating.
00:24:38
Speaker
Oh, yeah. Yeah, everything comes down to information theory, including the best way of picking a password. And yeah, we're going to move on right now to general encryption topics. Now we're going to talk about a hodgepodge of encryption topics. So I think we should talk a little bit about salting. And that has to do with what?
00:25:03
Speaker
Well, one attack that's very popular these days is to break into a database belonging to some company or website that has user information. And it'll often have passwords, for example. And if those passwords are just stored in the database as is, most people reuse passwords on other websites.
00:25:27
Speaker
Oh, I just wanted to mention a horrible thing that I saw recently where there's a website that if you typed in the same password that somebody else had, it would let you know and then it would tell you which user had it. Wow, I'm sorry. To me, that seems evil.
00:25:46
Speaker
It was probably more of a certain incompetence than evil. Sorry, I didn't mean to interrupt.
00:26:01
Speaker
Yeah, and also not the same password as user 23, and then of course you could just break into anybody's account. I mean, that's where dictionary attacks, which we're gonna talk about, are even easier. But sorry, continue with your story. So it's pretty common these days for databases with usernames and passwords to get hacked, to get released on the black market. And sometimes the passwords are stored in plain text in the database, which is
00:26:29
Speaker
really bad, and that's unfortunately way too common. So the first step in securing those passwords is hashing them. So when you're implementing a web form to log in, the user enters their password and you don't compare their password to anything stored directly, because that would require keeping it in plain text in the database, and that could be leaked. Instead you hash the password and you store the hash in the database.
00:26:58
Speaker
So then when they enter their password, you hash what they entered, compare it with what's in your database, and if they match, then you know they entered the right password.
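The register-then-check flow just described can be sketched like this. Using a bare SHA-256 is only for illustration; as discussed below, real systems add a salt and use a deliberately slow scheme such as bcrypt or Argon2:

```python
import hashlib

# The database stores only password hashes, never the passwords themselves
database = {}

def register(username, password):
    database[username] = hashlib.sha256(password.encode()).hexdigest()

def check_login(username, password):
    # Hash what the user typed and compare it with the stored hash
    attempt = hashlib.sha256(password.encode()).hexdigest()
    return database.get(username) == attempt

register("alice", "correct horse")
```

If the database leaks, the attacker gets hashes rather than passwords, which is the point of the one-way function.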
00:27:05
Speaker
And do you want to go real quick into what hashing is, just one more time, so that everybody's on the same page? Hashing is kind of a one-way function. It's a way of taking a value, maybe a number or a string of letters, and making a random-looking string out of it, one that has no relation to what the value actually is.
00:27:31
Speaker
And it's a function, of course, so by definition, if I put in "cat" and I come up with "QWERTY", it's always going to go from "cat" to "QWERTY", correct? That's right. That's how this login form works: if you enter the right password, it will get hashed the same way and compared with the hash from five months ago when you signed up for the website originally. So if hashing is such a difficult thing to reverse, why do we do something called salting?
00:28:01
Speaker
Well, the point of salting is that if, say, two people have the same password, and this is very common, you know, a bunch of people use 123456 as their password. A bunch of people use their pet's name as a password, and so there's probably a large number of people who use Fido or whatever.
00:28:22
Speaker
So if all these people have the same password, then if they're all hashed in the same way, then in the database, anyone with the same hash for their password must have the same password. So people who attack the database can focus their efforts on the hash that they see the most.
00:28:42
Speaker
So, salting the password, which is basically adding another value into the password before you hash it, means that every user will have a different hash. And this also protects against dictionary attacks. Gabriel, do you want to talk a little bit about dictionary attacks?
00:29:00
Speaker
They just hash every word in the dictionary and then look for those hashes in the stolen database. All right, yeah, so in dictionary attacks, basically whoever is attacking will try every word in the dictionary, figure out what each word's hash is, and then match those hashes against the database to recover the passwords.
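Against unsalted hashes, that attack is one precomputation followed by instant lookups. A sketch with a made-up word list:

```python
import hashlib

def sha(password):
    return hashlib.sha256(password.encode()).hexdigest()

# Attacker's side: precompute hashes of common passwords once...
wordlist = ["123456", "password", "fido", "letmein"]
precomputed = {sha(w): w for w in wordlist}

def dictionary_attack(stolen_hash):
    # ...then recover any matching password with a single dictionary lookup
    return precomputed.get(stolen_hash)
```

Every account whose password appears in the word list falls immediately, which is exactly what salting is designed to prevent.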
00:29:19
Speaker
Yeah, so in salting, let's say your password is, like, apple, but then I keep the salt at least private to myself. So you give me the word apple and I add in banana, and then I hash apple-banana and get something different than the hash for apple. It's very useful.
00:29:40
Speaker
Now the salt has to be stored somewhere because it has to be used every time the user logs in. So the salt is usually stored alongside the password in the database. But the point of the salt is not to make it impossible to recreate the password hash, it's to make it more expensive.
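Salting as just described can be sketched like this; again, a plain SHA-256 is only illustrative, since real systems use a dedicated slow password-hashing scheme:

```python
import hashlib
import secrets

def hash_password(password, salt=None):
    """Salted hash: a fresh random salt per user means two identical
    passwords get different hashes, so the attacker can no longer focus
    on the most frequent hash or reuse one precomputed dictionary."""
    if salt is None:
        salt = secrets.token_hex(8)   # random salt, stored next to the hash
    digest = hashlib.sha256((salt + password).encode()).hexdigest()
    return salt, digest

def verify_password(password, salt, digest):
    # Re-hash the login attempt with the stored salt and compare
    return hash_password(password, salt)[1] == digest
```

The salt is not secret; it just forces the attacker to redo the dictionary work separately for every single user.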

Creating Secure Systems and Passwords

00:29:57
Speaker
Hashing all the passwords in a hundred-thousand-user database is going to take a long time, whereas an unsalted dictionary attack is almost instantaneous. So the point is not to make it impossible, just to make it prohibitively slow. Yeah, and that could be the difference between somebody getting, like for a credit card company, a lot of credit cards and maybe, like, seven.
00:30:21
Speaker
Wow. Man, I'm learning so much from this episode, and I love it. I have a question for you, if this concept has been considered: in cybersecurity culture, when people know about a lot of these techniques, do people ever get too comfortable knowing that you already have hashing and all this stuff, and figure, eh, no one's going to try this? You know what I mean? And then suddenly that leaves a vulnerability out of, I guess we'll call it, laziness.
00:30:46
Speaker
Laziness is the enemy of good cryptography. It's always there. So is that a big problem right now with cryptography? Because I take their word for it. All I know is the HTTPS thing.
00:30:58
Speaker
The security field is moving rapidly and it's unfortunate but a lot of that is driven by the attackers. People are always finding new and inventive ways to break pretty rigorous cryptographic methods now. And so the good method to use is always changing. It used to be that salting was not that common because dictionary attacks way back when were not common, probably not even all that useful because
00:31:27
Speaker
Having a big dictionary used to take up too much space on the computer. But now space is almost free. Everybody can get a 500 gigabyte hard drive for like 30 bucks. It's amazing. Wow. Yeah. And I think next we should talk a little bit about picking passwords. This is a little bit of a practical thing. Oh, this goes directly to information theory.
00:31:53
Speaker
Oh, God, that's a good issue right now. Now, what's really cool is I know that later on we're going to do a story that involves social engineering that also really illustrates the need, but that'll be a very nice callback later on. So I think that'd be a great topic. Oh, yeah. And basically, let's say that there's a million words in the English language, and you choose a random word.
00:32:17
Speaker
That's about 20 bits of entropy. So let's say you make some substitutions in there, like a common substitution, swapping a 1 in for a letter, say the word is bitter, and you make five or six of them. That's only adding about 10 bits of information, meaning that somebody could brute-force your password in just about a billion tries, which these days is not that much if they have access to the hashing function.
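The arithmetic here is worth writing down; the figures match the estimates in the conversation:

```python
import math

# One word chosen uniformly from a million-word list:
word_entropy = math.log2(1_000_000)      # about 19.9 bits, the "about 20"

# Five or six character substitutions add roughly 10 more bits,
# so the search space is about 2**30, about a billion guesses:
guesses = 2 ** (20 + 10)
```

At modern hashing rates, a billion guesses against a fast unsalted hash is a matter of minutes, which is why one word with substitutions is weak.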
00:32:45
Speaker
I'm following along and for the sake of our listener, I want to say let's actually do it. So as you said earlier, did you say that we choose one word at random or did you say a group of words?
00:32:55
Speaker
A group of words is much more secure than just one word. Okay, so for our listeners who are following along, if we were to do it now: as you said earlier, choose completely at random, choose words that are intentionally not related, just flip open a book and start picking words. You said bitter, and then I guess the next word that we could... Well, let's get a random word generator so that we can genuinely show how easy it is to make a secure password. Okay, yeah, yeah, yeah. Now, this is a very, very relevant topic. I've worked for jobs where they made sure that we had the most secure passwords.
00:33:25
Speaker
The first word is COIL. The second word is HOST.
00:33:32
Speaker
Third word is eraser. Okay. And fourth word is blindly. Okay. All right. So our four words are coil, host, eraser, blindly, chosen completely at random. So the way to remember this is just mnemonics. Coil, host: imagine Alex Trebek, but he's a big slinky. Sorry. I like that. I like that. Awesome. Eraser, blindly: I think of an eraser erasing everything blindly. All the data is going away.
00:34:01
Speaker
Yeah, it's going away because the guy on the game show is trying to erase his answer: eraser, blindly. So: coil, host, eraser, blindly. You already know that password, and it's very secure. Nice, nice. I like that. Now, that's a certain level of security, because I think when people try to pick their own random passwords, they often aren't really random. Isn't that right?
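If you want to generate a passphrase like coil host eraser blindly yourself, a minimal sketch in Python looks like this. The word list here is a tiny stand-in for illustration; a real list (like a diceware list) has thousands of words.

```python
import secrets

# Tiny stand-in word list; a real diceware-style list has ~7,776 words,
# giving about 12.9 bits of entropy per word.
WORDS = ["coil", "host", "eraser", "blindly", "bitter", "waffle", "lantern", "marble"]

def passphrase(n_words=4, wordlist=WORDS):
    # secrets.choice draws from the OS's cryptographic RNG,
    # unlike random.choice, whose output is predictable.
    return " ".join(secrets.choice(wordlist) for _ in range(n_words))
```

With a 7,776-word list, four words give about 51.7 bits of entropy, far more than a single word with a few substitutions.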
00:34:23
Speaker
Oh, yeah. You could do Markov chain stuff. And, yeah, if you want to make a random password by just kind of pounding on the keys, that could be cracked really easily by a brute force system. Yeah. And, you know, there's tons of stories. There are so many stories of people saying, I'm going to choose a password that nobody will ever know, and it's completely, embarrassingly predictable. So what are the stories we're going to talk about?
00:34:48
Speaker
And you enjoy the story a lot, Gabriel. Oh, this story got me laughing out loud. I am thrilled that we are going to share the story on this podcast.
00:35:00
Speaker
Now, what it's about is Xerox. Xerox had this time-sharing system. What a time-sharing system is, it's like the early internet where you get to use somebody's computer. It's kind of like Google Cloud back in the 1970s. I have to tell you, this is Xerox, the company that we all know and love that is in every single office everywhere that has that distinct smell. They used to have a really good R&D department, Xerox PARC.
00:35:27
Speaker
I don't remember what it stands for, but that's where the mouse was invented, for example. I didn't know that. Now, real quick, this story, I've never heard this story before. It's new to me. Is this a pretty well-known story? In hackish circles, yes. Okay, so the idea is to bring this story to the masses, but obviously, a lot of our listeners may have heard it and a lot of them may not have.
00:35:47
Speaker
And it's from a thing called the jargon file, which you can find online and I encourage you to read it. But yeah, so what happened was they found a bug in a program and this bug allowed you to execute a small amount of code in executive mode. Gabriel, do you want to break that down?
00:36:06
Speaker
Okay. So yes. Obviously we've all got software, and almost everybody who's listening to this podcast has hardware and software; the software is used to play the podcast. Now, there's the user mode, which allows you certain controls to turn something on and to use it. And then there's the executive mode. That's for somebody who wants to make changes to the software that may affect the user experience. So that's like your manager's keys, right? And is there a better way to explain that?
00:36:35
Speaker
No. Yeah. And the problem with being able to execute even a small amount of code in that mode is that all hell breaks loose. Okay. Yeah. Yeah. So it's like forcing a Trojan horse. So the idea, as we said earlier, is imagine if a bunch of users got the keys to their manager's office. That's kind of the bug that we're talking about here. Right.
00:36:55
Speaker
Yeah, and it's like, also it's like, imagine you get the keys to your manager's office and then all of a sudden you're the manager. That's the kind of bug we're talking about. Yeah, so all hell can break loose. This is why we have encryption. And secure modes that are enforced by certain rules. That's why you can't make certain calls. But Xerox, they made a bug report and Xerox ignored them.
00:37:20
Speaker
What do you all expect when you send in a bug report? Why they'll get right on it, right? Apparently you've never worked for an actual software company. No, no, no. Okay, so we've got listeners out there who are computer scientists who work for the government and they work for a computer software company. And when you send a bug report, what's a good analogy? It's kind of like... Well, it's hard to tell what the good bug reports are if you don't have a good bug reporting system because it just feels like people bugging you on email or on Trello or whatever you're using.
00:37:50
Speaker
Okay, so suffice it to say that this report got filed away on some desk and is collecting dust. Oh, like making IPRA requests. I'm not familiar. Probably. Just like if you try to get information from a public entity, you have to file a records inspection form, because we as citizens have rights to know what the government does; it's all public information. And if you file one, they just never get back to you.
00:38:19
Speaker
Bug reports can be a lot like that in a bad company. But yeah, so they made this bug report and they waited for weeks and weeks. So what happened next, Gabriel? Okay, so I think that they continued to send more requests and they were ignored. Now, I think they made an effort to follow up with the company because this was important. When you're a computer scientist and you identify a vulnerability,
00:38:46
Speaker
That is, there's a lot at stake here, the well-being of the companies at stake. So I think they even made some phone calls, didn't they? Well, I think they worked inside of the company, but yeah, they made phone calls. They did all that kind of stuff. Everything you're supposed to do. Yeah, they did everything, you know, everything you're supposed to do to identify the higher ups at Xerox that they have a vulnerability and it's not being addressed. So what happens when you aren't heard when you talk, you shout? Yes. I love that. Yes. When you're not heard when you talk, you shout. Yes.
00:39:16
Speaker
And this is how they shouted. They created one of the first viruses. That's right. So these are the actual people who were trying to report the vulnerability. They created a virus. Yeah. And back in the day you had these things called tape drives.
00:39:30
Speaker
We still have them today, just not as frequently. They're really high-capacity; they could store a lot of stuff. And they would make them go back and forth so that they'd walk across the floor. The card punch would act like it was haunted, things like that. Okay. Yeah. So it just made the hardware go haywire and it was a big nuisance. We have a tale for you, and it's one of the early classic computer hacks. I hope you enjoy it.
00:39:55
Speaker
Back in the mid-1970s, several of the system support staff at Motorola discovered a relatively simple way to crack system security on the Xerox CP-V time-sharing system. Through a simple programming strategy, it was possible for a user program to trick the system into running a portion of the program in master mode (supervisor state), in which memory protection does not apply.
00:40:20
Speaker
The program could then poke a large value into its privileged level byte, normally write protected, and could then proceed to bypass all levels of security within the file management system, patch the system monitor, and do numerous other interesting things. In short, the barn door was wide open.
00:40:41
Speaker
We're going to break this down a little bit for you so you don't get lost. Master mode is like having the keys to do anything, right, Gabriel? Yes. Yes, exactly. Exactly. Yeah. So essentially this story goes way back. This was a story of a virus in the mid-70s, and of course it was at the company Xerox. And again, I think the best phrase was that the barn doors were left wide open to the master mode. So that's a big, big vulnerability.
00:41:08
Speaker
Now with the master mode, you could do anything on a computer basically. You could rewrite everything on a computer. Nothing is safe except for stuff that's encrypted. Yes. Yes. So for our listeners who are less familiar with the vernacular, this is just a big, big vulnerability. It's sort of like the keys to the kingdom were given to anybody, to everybody. Now let's hear a little bit more of that story.
00:41:31
Speaker
Motorola quite properly reported this problem to Xerox via an official level 1 SIDR, a bug report with an intended urgency of needs to be fixed yesterday. Because the text of each SIDR was entered into a database that could be viewed by quite a number of people, Motorola followed the approved procedure. They simply reported the problem as security SIDR
00:41:59
Speaker
and attached all of the necessary documentation, ways to reproduce, etc. The CP-V people at Xerox sat on their thumbs. They either didn't realize the severity of the problem or didn't assign the necessary operating system staff resources to develop and distribute an official patch.
00:42:20
Speaker
Months passed. The Motorola guys pestered their Xerox field support rep to no avail. Finally, they decided to take direct action to demonstrate to Xerox management just how easily the system could be cracked and just how thoroughly the security safeguards could be subverted. What would you do if you were in this situation?
00:42:42
Speaker
So again, to put yourselves in the perspective of the folks from Motorola, you've done everything in your power. You've done everything in your power to alert a company that they have a big vulnerability and you're feeling like you're not being heard. That would bother me. I don't know what I would do. So maybe it wasn't everything in their power.
00:43:04
Speaker
They dug around in the operating system listings and devised a thoroughly devilish set of patches. These patches were then incorporated into a pair of programs called Robin Hood and Friar Tuck. Robin Hood and Friar Tuck were designed to run as ghost jobs (daemons, in Unix terminology).
00:43:24
Speaker
They would use the existing loophole to subvert system security, install the necessary patches, and then keep an eye on one another's statuses in order to keep the system operator (in effect, the superuser) from aborting them.
00:43:39
Speaker
One fine day, the system operator on the main CP-V software development system in El Segundo was surprised by a number of unusual phenomena. These included the following. Tape drives would rewind and dismount their tapes in the middle of a job. The card punch output device would occasionally start up of itself and punch a lace card (a card with all positions punched). These would usually jam in the punch.
00:44:07
Speaker
And the Xerox card reader had two output stackers. It could be instructed to stack into A, stack into B, or stack into A unless the card was unreadable, in which case the bad card was placed in stacker B. One of the patches installed by the ghosts added some code to the card reader driver. After reading a card, it would flip over to the opposite stacker.
00:44:30
Speaker
As a result, card decks would divide themselves in half when they were read, leaving the operator to recollate them. All right, now we're going to explain a little bit of what's going on, because computers are so different nowadays.
00:44:42
Speaker
A tape drive is like a giant cassette tape. Do we have listeners young enough to not know what cassette tapes are, do you think? I guarantee that we do. Well, I know. I think we do. I think we do. We are really popular among kids four and under, right? Oh, yeah. We are played in nurseries in Budapest. Yeah, I don't know. I don't know. There may be some kids out there or future listeners who may be less familiar with the tape deck. You have to rewind them.
00:45:09
Speaker
Yeah. So basically what it is, imagine a scroll made out of magnetic tape. Yes. There are some BuzzFeed articles; Google search BuzzFeed "you know you're born in the eighties if," and you'll see: you have tape decks, not iPods. And so tape drives are basically like giant scrolls, but the scroll can move itself, and they would go back and forth so quickly that they would walk across the floor.
00:45:34
Speaker
A card punch is basically just something that punches holes in cards, and one thing about punching all the holes in a card is that it jams up the works. Imagine using one of those hole punchers and trying to punch everything out of a piece of paper; you'd start jamming the puncher after a while. Now, in addition to this mayhem that we've discussed so far, there were also insulting messages left from Robin Hood to Friar Tuck and vice versa.
00:46:01
Speaker
Naturally, the operator called in the operating system developers. They found the bandit ghost jobs while they were running and killed them, and were once again surprised. When Robin Hood was gunned down, the following sequence of events took place. ID1: Friar Tuck, I am under attack! Pray save me! ID1: Off. ID2: Fear not, friend Robin! I shall rout the Sheriff of Nottingham's men!
00:46:30
Speaker
ID1. Thank you, my good fellow. Each ghost job would detect the fact that the other had been killed and would start a new copy of the recently slain program within a few milliseconds. The only way to kill both ghosts was to kill them simultaneously. Very difficult. Or to deliberately crash the system. And, of course, they crashed the system. They eventually did patch it.
00:46:57
Speaker
But that's a good story of why security levels and encryption are important. Because, you know, Gabriel, what's your take on this?
00:47:06
Speaker
Oh, gosh, gosh. This was just a phenomenal, I mean, like, this is almost like, you know, a senior high school prank, but this was done at the corporate level. And this was done very, very much to teach a lesson. And I think it's a phenomenal story. It's phenomenal because in this case, we're examining the motives of the team that created the bug. And the motive was to point out the inefficiencies of human
00:47:35
Speaker
Oh gosh, how do I phrase this? The vulnerabilities inherent to trusting an organization to humans. Exactly. Yes, yes. This is pointing out the inefficiencies of trusting humans to run things.
00:47:51
Speaker
So for this next topic: I said earlier that I was a big fan of Richard Feynman. I wanted to talk about a well-known story of Richard Feynman that involves safe cracking. Before we begin, I want to ask you guys: if you had to crack into a safe, how would you do it? Yeah, it's a very difficult thing. How would you do it, Jalyla?
00:48:14
Speaker
I would freeze it with nitroglycerin. No, the freezy stuff. Liquid nitrogen? Liquid nitrogen. I would freeze it with liquid nitrogen and then hit it with a hammer. Actually, that's related to a common way of safe cracking: you use a carbide drill, you drill into it, and then you open the safe from the inside using tools. Because safes are actually designed to be really easily opened from the inside, but not the outside.
00:48:43
Speaker
Another thing you could do is listen to it, like they do in the movies. Now, it's a lot more complicated than they make it in the movies. Let's say you had a safe that took six numbers in its combination. You'd need to try every permutation of the numbers where you hear the click, and for each one, the number itself or its opposite, in every single way. And that could be as many as, I think, six factorial times two to the sixth, but don't quote me on that.
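For the curious, that back-of-the-envelope count works out like this (a sketch in Python of the episode's own "don't quote me on that" estimate):

```python
import math

# Six numbers in the combination: every ordering of the six clicks (6!),
# and for each click, the number itself or its "opposite" (2^6).
combinations = math.factorial(6) * 2 ** 6
print(combinations)  # 46080
```

Tens of thousands of dial sequences, which is indeed a lot of combinations to work through by ear.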
00:49:08
Speaker
That's a lot of combinations. Now, interestingly enough, there are people who figure out a way. I wish we had Charlie here on the episode, because he talked about that great quote on the Chaos episode from Jeff Goldblum's character, where he says, life finds a way.

Security Flaws and Real-world Examples

00:49:26
Speaker
Richard Feynman would find a way.
00:49:29
Speaker
So yes, there's a very famous story. I might just give you the bullet points of his involvement at Los Alamos Labs. So in the 1940s, Richard Feynman, along with many, many other physicists, were in Los Alamos, New Mexico, working on a big project. Do you all know what project that would be? The very peaceful project, correct?
00:49:50
Speaker
The project that is justified to exist in order to ensure world peace in the form of deterrence: the Manhattan Project. I was gonna... I'm sorry, did I spoil your moment there? No, it's okay. I was gonna try to, you know, because it was like, which project is that, and then the suspense builds, and then: it's the Manhattan Project.
00:50:12
Speaker
Those stories are great stories, guys, the whole story. So this one fascinates me because you have a brilliant physicist, clearly brilliant enough to work on the Manhattan Project, and in Feynman's memoir there's an entire chapter all about safe-cracking and lock-picking. This is mischief. This is mischief. I love it because this story shows that the serious scientists were also full of mischief. And if you're wondering what chapter we're talking about, that's in Surely You're Joking, Mr. Feynman! It's a wonderful book.
00:50:38
Speaker
Yes. Oh, surely you must be joking, Mr. Feynman. Oh, yeah. Sorry. Yes. And I think they've, I don't know, I've seen it available online. This is very standard material for anybody who's into him. So this is all about his mischief. Essentially, in the story, when he was bored at work, you know, around his office, he spent his time trying to pick locks and learn the tricks of the trade. He did this on all kinds of locks. First, he started on very, very simple locks.
00:51:06
Speaker
And lock picking is a very common hackish pastime. Yes. Yes. If you're a hacker, then you like picking locks. Yeah. It's just a fun thing; it's almost like a puzzle you have to solve. And you learn how safe things really are. So he spent his time and really figured out how to hack, and I'm sorry,
00:51:24
Speaker
how to pick locks around his office. And he became known as the guy to go to in case there was a physicist who was absent that day and people needed to access his files. Instead of, you know, going to find the janitor with the key, they'd just say, Feynman, will you please help us out? Because he knew the tricks of the trade for picking a lock. But he was also very good at safe cracking, which is a little different. Now, Jonathan, do you know more about that story and how he reduced the total number of combinations?
00:51:50
Speaker
Yes. And I think a lot of that story boils down to one question that he asked his friend. He said, do you think that this person is the type of person who would use a date as the combination for their lock? And he said, yeah, yeah, he's definitely that kind of person. And so they tried every date that would probably be important to him. And what was it? It was his daughter's birthday.
00:52:11
Speaker
Oh, my gosh. So yes, yes, that shows the reason why you shouldn't have a date as the combination for an important thing. You need random numbers. Dates are not good. The only problem with random numbers is they're so difficult to remember, which we also talked about. And that's a problem with password making, too.
00:52:36
Speaker
Same exact thing. Now, another trick that he used was to just use the default combination. 25, 50, 25. Yeah, yeah. The actual story ends where he's talking with an actual safe cracker. And the safe cracker says that a lot of times he just says, what's the factory default? And that's so many times people in the government wouldn't even change the password from the factory default. I think he said one in six safes would open with just the factory default. Isn't that ridiculous?
00:53:06
Speaker
And people still do that. For example, there's this program called MySQL. It's a database program. And by default, you don't need a password, and people don't reset that. They should reset that. They don't do that. Ah, guys, cybersecurity is important. We can't emphasize that enough. Either through, you know, laziness, maybe it's intellectual laziness, people just don't want to put in the time. It's awful.
00:53:34
Speaker
I just think it's funny because I have the password, or the code, to my shed embedded in my brain from when I was a kid, and it's my dad's birthday. It's that same example. And then I can think of another: the admin for any organization that has a database or whatever, they're always punching in admin, or just something super standard, default user, no password.
00:54:05
Speaker
I'm going to go steal your dad's trowels.
00:54:07
Speaker
No! You know what part of that story I just love? He puts the fear of God into another physicist. Because, you know, there was a day when he was in a position to access some top secret stuff that the government was working on, and these were government secrets involved with the Manhattan Project, so you bet they were secured. Well, Feynman goes to these safes and he thinks, he just thinks: should I?
00:54:35
Speaker
I don't know. And of course, he's a mischievous and curious guy. So he thinks, if I were a physicist, and if I were really, really into math, what would I set the combination to? Hmm. What could it be? What could it be? What do you think, Jalyla? What do you think? I don't know. Fibonacci series. I don't know. I just, that's immediately what I think. Some sort of... Kaprekar's constant. I don't know. Oh, yeah. Okay. Or like pi. Well, I think he tries pi, right?
00:55:01
Speaker
Yeah, if I recall correctly, Pi was one of the top-secret ones. Yeah, that may have been one of them. Or maybe he tried Pi and it didn't work. I mean, you all can read the story. I know that he tries Euler's number. That's probably the second. Euler's, oh my God, blame Euler for everything. Yeah, exactly, which is like the actual number, what is it, 2.1718? 2.7182818. How embarrassing. We need to edit that out. I know Euler's number.
00:55:26
Speaker
Of course he does. I'm just kidding. Yeah, yeah. So he tries Euler's number. He tries some combination of the first six digits of Euler's number, and boom, it worked. So he had access to this safe, and he began to drop all these little notes, you know, like: just so you know, this safe is really easy to crack, just pointing it out. Then he signs it, you know, something or other.
00:55:49
Speaker
And then he continues, you know, he tries other locks, they all have the same combination, they all have Euler's number. So he takes out files and he leaves notes and he's like, this one was so easy to crack. And then he signs same guy. Well, what's funny is when the physicist comes back to work, he doesn't open the safes in the correct order.
00:56:09
Speaker
or the order that Feynman had assumed he would, he goes to the safe that doesn't have Feynman's name. And when the guy opens it, there's a missing document and a note that says this was so easy to hack into. Same guy. The guy turns white. He could have died. He looked like he had seen death. And when Feynman approached him, he's like, whoa, buddy, are you okay?
00:56:31
Speaker
and the guy, you know, he looked like he had never been more scared in his life, realizing that these secrets had been stolen right under his nose, not knowing that it was Feynman and his mischief that took the code. That's hilarious. And Feynman's the type of person who, true story,
00:56:46
Speaker
stole the door, and then when they asked him if he stole the door, he said yes, and they said, Feynman, stop joking around. That is absolutely one of my favorite stories. Who stole the door? He just told them straight up: it was me. Yeah, and they're like, Feynman, quit screwing around. Oh my gosh. We're trying to find the culprit. Yes. So this next segment is a very interesting one. We're talking about a lot of things in this episode, about encryption and about hacking into things. One of the things I want to talk about, a very, very real issue, is social engineering.
00:57:15
Speaker
Now, this includes things like phishing, with a ph, where you make an email look like it's official. Social engineering goes way back. Spam, if you think about it, could be a form of social engineering. In a certain way, any hack that requires human intervention is a social engineering hack.
00:57:39
Speaker
Now, I want to tell you guys a cool story. I went to a seminar specifically on security. This seminar was so amazing because in my company, I was with engineers and physicists and PhDs and skeptical people who strongly, strongly believed that they would never be able to be taken advantage of because they're too smart for it. They're too self-aware. You know what I mean?
00:58:00
Speaker
And that's a huge problem. I mean, executives fall prey to the Nigerian email scheme more than other people. A lot of times, the smarter you are, the more hubris you have. We've got blind spots. We've got blind spots. It's true. So I want to tell you guys a story that is embarrassing to tell, and it made me feel really bad afterwards, but I learned a valuable lesson and now I can tell the story. Now, if you figure out the punchline ahead of time, please don't say it.
00:58:26
Speaker
Uh, because it is possible. But just hear my story. Okay. So I go to the seminar, right? And we're small talking, and they show up, and these are a bunch of folks who have experience working for, you know, police departments and the CIA. And they're like, you guys, we are here to tell you a story... or, I'm sorry, not tell a story. We're here to give you a seminar on security.
00:58:50
Speaker
We need to be very frank with you. This topic is dry. We've worked here for a long time. We know the drill. It's a dry topic. We're going to do our best. You might be yawning by the end. No judgment here. We know how governments operate.
00:59:07
Speaker
in order to make it a little bit more lively, we actually have permission to make it a little bit more like a game show. This is some kind of an attempt to make sure you guys can pay attention and ingest the lessons. So if we're going to do a game show here, I'm going to give you a speech here in a minute and we're going to ask you questions at the end and we're going to have some prizes. So here are the prizes and suddenly they show us the darn coolest things like an iPhone case, you know, and I'm like, I want that.
00:59:36
Speaker
and they showed us like a badge holder that's shinier than the other ones, and it's just really sharp looking, and office supplies. Guys, who is not a sucker for really cool office supplies? They had cool stuff and I wanted it. And suddenly I'm more awake and I'm like, oh, I want it, I want it. And they're like, okay, okay. So, you know,
00:59:55
Speaker
before we begin, we want to identify a few contestants for the first round. Who wants to go up? And, you know, everyone's hand is up, and they're like, oh, uh, we need to randomly pick somebody. Okay, who's born in January? Let's just start with January. And we're like, oh, I'm not a January birthday, you know, but
01:00:10
Speaker
There were like four or five hands up, and they're like, oh, yikes, we only need three of you, and we need a first and a second and a third. Who was born closest to January 1st? And suddenly everyone was sorting each other out, and they identified the person born closest to January 1st, and they said,
01:00:28
Speaker
You're gonna have to tell us when you were born so we can sort you. So, you know, they're like, I was born the 15th. I was born the 17th. And they said, oh my God, you're born on the 17th? I was born on the 17th. I was born in 68. When were you born? And they said, oh no, I was born in 1972. Like, oh yeah, yeah, well that's pretty close.
01:00:44
Speaker
Almost had the same birthday. So of course, they had the first-round guests up there, and they did this cool contest, and they identified the contestants according to their birthdays. And I thought, well, that's pretty clever. That's cool. I really wanted to be part of the next round. So this pattern continues.
01:01:06
Speaker
They give a lesson, they ask for contestants, and then they give you a quiz, and they give prizes for participation. They make you really want to participate, and they make you, I don't want to say feel bad if you don't get to, but feel bad if you don't get to, like you're left out. Next round, they said, we need to change it up, you guys. We did birthdays last time. What are we going to do this time? I need something. What do you guys have? And someone said, age. No, we're not going to do age. That's offensive, even though we did birthdays.
01:01:32
Speaker
So they finally decided some obscure thing. Somebody had said, mother's maiden name alphabetical order. And they're like, you know what? Sure, let's do it. So they had some conversation about how we are a patriarchal society, but to change it up, let's do your mother's maiden name. So they had to say, if your mother's maiden name starts with a B, you are in this round.
01:01:55
Speaker
Nobody? How about a C? How about a D? Finally, they had a crowd of people up there, all sorted by maiden name, and they were able to extract what everyone's mother's maiden name was. Now, I'm going to pause my story here. A lot of our listeners, I think, know exactly what's going on. Have we caught on to what's going on? I didn't catch on yet, Jonathan.
01:02:20
Speaker
Oh, definitely. But I'm paranoid. Okay, okay. So I was thinking like, wow, this is an amazing conversation. I'm engaged. My mother's maiden name is Lane, so I wouldn't be in this round. And essentially, the way this seminar worked, by the end of it, they had our birthdays, they had our mother's maiden names, they had the elementary school that we went to.
01:02:48
Speaker
They can now get our passwords. They can now go to any website and say: forgot your password, I need to reset it. See, there's actually a very easy way to protect against this. Okay. Create another persona and lie. Your mother's maiden name? It was Waffle.
01:03:05
Speaker
Okay. Okay, don't use that, because now everybody's going to use that as well. But that's interesting. Cool. I was hacked and I was a sucker, you guys. So here's my hope. I hope that people who are listening to this can glean some wisdom and be aware of when a stranger is suddenly warm and welcoming, and be aware of giving away trivial information.
01:03:28
Speaker
Yeah. In a study, a huge percentage of people give up their work password for a chocolate bar. And I mean, people are really simple when it comes to that kind of thing. And it's your job to educate yourself.
01:03:42
Speaker
Oh my gosh. Okay. Okay. So, so the story was amazing because it seemed like trivial information, but I could have lost everything from that game. So yeah. And there were PhD physicists who said, you know, I feel embarrassed now, but now I'm aware. So.
01:04:01
Speaker
I have a great story about passwords that are not secure. This comes from middle school. I had two friends; let's just name them Jim and Bob. They're actually brothers, and Jim decided to take Bob's laptop and hack into it.
01:04:22
Speaker
So he types the username, which is Bob, and do you all want to guess at a middle school level what he tried for Bob's password?
01:04:32
Speaker
Not in polite company. Oh, Bob's password? No, no. Well, okay. It was the equivalent of Bob's stuff. That was the password to hack into Bob's computer. So, you know, my friend who we're calling Jim was very, very clever, and Bob's password was not very well chosen. So don't pick something that you think nobody will guess just because it's handy. You really need truly random words.
01:04:58
Speaker
I do know somebody whose password used to be your mom with a 1 on the end. How secure is that? How secure is your mom as a password, or even just inappropriate passwords? No, those are not secure. There are only a million or so inappropriate words. Those are not secure. Yes.
01:05:18
Speaker
So everybody hates password policies, right? The ones that make you change your password every 30 days, and say you have to have a number, a capital letter, and a symbol. Those are maybe an example of where people looked at the math, made a policy out of it, and it turned out not to be useful in the real world.
01:05:42
Speaker
Jonathan's method of picking passwords as words, where the point is you get lots of entropy by picking multiple words, that's great. But for a long time, everybody picked passwords by saying, oh, a password's like eight characters long and it's going to look like a jumble of symbols.
01:06:02
Speaker
The reason they have the policy of requiring a capital letter and a number is this: if you just have a password of all lowercase letters, let's say it's eight characters long, then each character is a 1 in 26 guess, 1 in 26 times 1 in 26 times 1 in 26,
01:06:18
Speaker
which, if you multiply that out eight times, still leaves a fairly high chance of that password being guessed.
01:06:30
Speaker
Yeah, I mean, there are only about 208 billion possibilities, and on a modern computer, trying them all would take just a couple of hours. Now, if you mix capital letters in, instead of 1 in 26 per character, it's 1 in 52, 1 in 52, 1 in 52, which makes it dramatically harder. And then you add numbers, which makes it 62, and so on. So they put in this strict requirement.
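The arithmetic behind that policy is easy to check; a quick sketch in Python, using the character counts from the discussion:

```python
# Search-space sizes for an 8-character password, per the discussion above.
lower = 26 ** 8   # all lowercase
mixed = 52 ** 8   # lowercase + uppercase
alnum = 62 ** 8   # lowercase + uppercase + digits (26 + 26 + 10)

print(lower)            # 208827064576, the "208 billion" mentioned
print(mixed // lower)   # 256: capitals alone multiply the work 2**8 times
```

Each extra character class multiplies the attacker's work per character, so eight characters compound the effect 2**8-fold.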
01:06:58
Speaker
But unfortunately, if you have a password with a jumble of symbols like @ because you were forced to add them, you're most likely not going to remember it. And not only that, you're going to choose something easy to remember, which is likely going to have even less entropy. Exactly.
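Jonathan's word-based method can be put into numbers too. A rough comparison, assuming the words are drawn uniformly at random from a 7776-word diceware-style list (an assumption for illustration, not necessarily the method used on the show):

```python
import math

# Entropy in bits = length * log2(choices per position), assuming
# each position is chosen uniformly at random.
jumble = 8 * math.log2(62)        # 8 random chars from a-z, A-Z, 0-9
passphrase = 4 * math.log2(7776)  # 4 random words from a 7776-word list

print(round(jumble, 1))      # 47.6 bits
print(round(passphrase, 1))  # 51.7 bits, and far easier to remember
```

Four random words already beat the eight-character jumble, and adding a fifth word buys another ~13 bits.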
01:07:15
Speaker
And one last thing that I want to talk about in this section is digital postage. And that's when, to send a message, you have to put a certain amount of work into it. A hash is essentially random. And if I tell you the requirement that it has to have, for example,
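The explanation here is cut off in the recording, but the scheme being described sounds like hashcash-style proof of work. A minimal sketch, assuming SHA-256 and a leading-zero-bits requirement (both assumptions for illustration):

```python
import hashlib
from itertools import count

# Hashcash-style "digital postage": the sender must find a nonce whose
# hash meets a requirement, which costs CPU time; checking costs one hash.
DIFFICULTY_BITS = 16  # illustrative; real systems tune this number

def mint_stamp(message: bytes) -> int:
    target = 1 << (256 - DIFFICULTY_BITS)
    for nonce in count():  # ~2**16 hashes on average before success
        digest = hashlib.sha256(message + str(nonce).encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def check_stamp(message: bytes, nonce: int) -> bool:
    digest = hashlib.sha256(message + str(nonce).encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - DIFFICULTY_BITS))

nonce = mint_stamp(b"hello, here is my email")
print(check_stamp(b"hello, here is my email", nonce))  # True
```

Because the hash is essentially random, the only way to satisfy the requirement is trial and error, which is exactly the "work" being paid as postage.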
01:09:25
Speaker
...from forging the data on the magstripe. I have a reader-writer at home, cost me five bucks, and I can add data to a magstripe with it. I just plug it into my computer, type some things, swipe the card, and it's got new info on it. I don't do that with my credit cards, but with other kinds of swipe cards, you can actually have a bit of fun.
01:09:46
Speaker
I heard that a scam for a while, and maybe it's still ongoing in certain places, is for waitstaff to hide one of those swipe readers in their apron and swipe the customer's credit card before they bring it to the register; by swiping it, they take the data off.
01:10:12
Speaker
The other thing about mag stripes is that they can be easily read in that way. And this is a common source of fraud, by employees who have readers like that, or by third parties. For instance, there have been news stories in New Mexico recently about skimmers put at ATMs and gas stations. They put a little device on top of the card slot in a gas pump,
01:10:42
Speaker
and it will read the data from the card as you enter it. And so they come back a week later and take the little machine off and they have all this information on all these other credit cards, which they can then put on their own fake credit cards that they've made and use them for purchases.
01:11:00
Speaker
And so you were talking about how the new chip cards work. Is it that they have the private key, and the older cards have the public key? I'm just trying to understand the private key.
01:11:18
Speaker
So the operations you can do with key pairs are you can sign a message and someone else can verify it.

Modern Security Technologies

01:11:26
Speaker
You sign it with your private key and they verify it with your public key.
01:11:31
Speaker
One good example of that is: imagine I mail you a box with a key in it, and I ask you to put something inside of it. So you take the key that's in the box, put the thing in it, lock it, and then the key gets stuck in the lock. You can't open it again. But there's a second slot for my own private key, where I can open the thing up again.
01:11:57
Speaker
OK, but you need that public key. And you broadcast it. You give it to everybody. Oh, I'm getting it now. Thanks, guys. So, how chip cards work: they don't really do encryption, but they use signatures and verification, which is as fundamental a part of this as the actual enciphering of a message.
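Sign-and-verify can be sketched with a toy RSA key pair, using deliberately small numbers so the whole thing fits on screen; the primes and messages here are made up for illustration, and real keys are thousands of bits long:

```python
import hashlib

# Toy RSA signature. These primes are an assumption for illustration
# only; this is nowhere near real-world security.
p, q = 104729, 1299709      # two small known primes
n = p * q                   # public modulus
e = 65537                   # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)         # private exponent (modular inverse, Python 3.8+)

def toy_sign(message: bytes) -> int:
    """Sign with the PRIVATE key: only the key's owner can do this."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def toy_verify(message: bytes, sig: int) -> bool:
    """Verify with the PUBLIC key: anyone can check the signature."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(sig, e, n) == h

sig = toy_sign(b"charge $5 to card 1234")
print(toy_verify(b"charge $5 to card 1234", sig))    # True
print(toy_verify(b"charge $500 to card 1234", sig))  # False: tampered
```

The second check fails because changing even one byte of the message changes its hash, so the old signature no longer matches.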
01:12:20
Speaker
So older chip cards, which came out a few years ago here (and may have been in use in Europe for a while; I don't know the entire history), had a signature from the bank on the card. The bank would sign the number on the card, as if to say: yes, we assert that this is the real number on the card.
01:12:48
Speaker
Then you plug the card into a reader, and the reader checks the signature on the card: oh, this signature came from the bank, so therefore this is a valid credit card number, and we'll charge to it, and so on.
01:13:07
Speaker
So those are the older cards. And the reason that they don't use those kinds of cards anymore is that it still doesn't stop the problem of duplicating a card. Because the signature, once it's been made, can be duplicated. It would be like taking one of those wax seals and taking a photo of it and recreating it. You can take the data from the card and put it on another card.
01:13:29
Speaker
And when that new card gets plugged into a terminal, it'll reproduce the same signature, even though that's still someone else's card. So what do the new cards do? The new cards contain a private key on the card. And the terminal will ask the card a question. The card actually responds with a little bit of computer logic in there. It can actually respond to those questions in real time.
01:13:57
Speaker
and it will sign that response so that the terminal knows that it only just answered the question. It knows that it's not the same question that was replayed from when someone asked the same question five days ago. Oh, that makes sense.
01:14:15
Speaker
So why is this more difficult to replicate than the old card models? It seems like all it does is have a function. But if I see what the function is, why is that impossible to replicate? Because each terminal asks a different question at any given time. So one day it might ask, what color is the sky? And the card would respond blue and sign that message and tell that terminal. And the terminal would know that was the answer to the question.
01:14:41
Speaker
Now, if someone were to sniff that transaction with a skimmer, they would see, oh, here's the word blue, and here's the signature of the card saying blue. And they put that on their card. But next time the terminal might ask, what color is grass? And most grass isn't colored blue. And so if they use that signed response, it would be answering the wrong question.
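The question-and-answer exchange above can be sketched in code. Real EMV chips answer with an asymmetric signature; in this hypothetical sketch, an HMAC over a secret "burned into" the card stands in for it, but the replay protection works the same way:

```python
import hashlib
import hmac
import secrets

# Hypothetical challenge-response sketch. The secret never leaves the
# card; only challenge/response pairs cross the wire.
CARD_SECRET = secrets.token_bytes(32)

def card_respond(challenge: bytes) -> bytes:
    # The card binds its answer to this exact challenge.
    return hmac.new(CARD_SECRET, challenge, hashlib.sha256).digest()

def terminal_accepts(challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(CARD_SECRET, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(response, expected)

# A legitimate transaction: fresh challenge, fresh response.
challenge = secrets.token_bytes(16)
response = card_respond(challenge)
print(terminal_accepts(challenge, response))      # True

# A skimmer replays yesterday's response against today's challenge.
new_challenge = secrets.token_bytes(16)
print(terminal_accepts(new_challenge, response))  # False
```

The replay fails because each terminal picks a fresh random challenge, so a recorded answer is the right answer to the wrong question.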
01:15:08
Speaker
So what you're saying is that if I had the card itself and I analyzed the card using all these electron microscopes, I could replicate that. But if I'm just somebody who sees what goes in and out of the card, I could not replicate it easily.
01:15:21
Speaker
That's exactly right. The information secret to the card that allows it to make those signatures is not exposed over the pins. That's the difference from mag stripes, for example, aside from the fact that mag stripes didn't do any signatures at all; all the information on the card was accessible by swiping the mag stripe,
01:15:42
Speaker
whereas with the chip, not all the stuff on the chip can be gotten by inserting the card into the terminal. You could acquire those secrets perhaps by taking the card apart and looking at it with a microscope, but your dishonest waiter at the restaurant is not going to do that. Yeah, unless of course technology improves exponentially and then we're going to need even more. It's always an arms race, isn't it?
01:16:11
Speaker
Yes, there's a really fascinating scene of hardware analysis and forensics, and these are the kinds of problems that people in that field work on. And today we had on Anthony. Anthony, is there anything you would like to plug? Well, I like open source software. It's free, and it's a great way for people to learn about technology like encryption and to experiment with learning how computers work.