Introduction to the Digital Revolution and 'Breaking Math'
00:00:00
Speaker
We live in an era of unprecedented change, and the tip of the spear of this era of change is currently the digital revolution. In fact, in the last decade we've gone from an analog to a digitally dominated society, and the amount of information has recently been increasing exponentially. Or at least it seems like it's recent.
00:00:20
Speaker
In fact, however, the digital revolution has been going on for hundreds of centuries. From numerals inscribed in bone to signals zipping by at almost the speed of light, our endeavors as humans and some argue our existence in the universe is ruled by the concept of digital information. So how did we discover digital information and what has it been used for? All this and more on this episode of Breaking Math. Episode 4, Digital Evolution.
00:00:54
Speaker
I'm Jonathan Baca. And I'm Gabriel Hesh. And we're coming at you from KUNM Studios in Albuquerque, New Mexico. You can check us out at facebook.com slash breakingmathpodcast or at breakingmathpodcast.com.
Meet Zach Bigger: Astrophysics and Connections
00:01:08
Speaker
And with us we have... My name's Zach Bigger. I'm an undergrad student here at UNM. And Gabriel actually approached me about appearing as a guest on the Breaking Math Podcast.
00:01:18
Speaker
Yeah, actually, I know Zach from, oh gosh, where was it, Zach? I think it was a church youth group years ago, I think. I think that's probably exactly right. Goodness, yes. And since then, we've both gone our ways and learned a lot more about the world. I consider myself a mathematical evangelist. I believe math is everything. I did approach Zach. I recently became aware that Zach is studying astrophysics at UNM.
00:01:38
Speaker
And I thought, oh, what a great opportunity to reach out to another student. And we can talk about, obviously, math and physics, which we know there's a huge relationship between, but also just bring you onto the podcast and get your perspective. So thanks a lot for trusting us and coming down to the
Digital Information in Physics
00:01:54
Speaker
studio. We're very happy you're here. Happy to be here. Now, Zach, as a physics student, you study an area that's dominated in large part by continuities.
00:02:05
Speaker
But where does digital information fit into what you've studied? How do you mean digital information? Numbers, data like that. Of course there's measurements to be done with numerals, but how does digital information otherwise impact what you study?
00:02:21
Speaker
Oh, sure thing. I'm sorry, I'm going to jump in here, actually. I literally bumped into some physics students maybe a month ago over at a nearby coffee shop, and every single one of them had to learn LabVIEW and programming. So I hope that's what you're going for. What I'm going for is a sort of understanding that even though there might be continuities in physics, measurements still have to be made digitally. So digital information has its
00:02:49
Speaker
place in physics, even when you write things down on paper, you write down numerals and you have significant figures and things like that.
History and Significance of Computing
00:02:58
Speaker
Would you agree with that assessment? Absolutely. One of the things that I find absolutely fascinating about physics in general is specifically the relationship between different values and different quantities that we assign to different phenomena. But it's interesting that most of these are also followed by
00:03:16
Speaker
what we define as constants, Planck's constant, things like that, that are specifically ratios to how these things interact with each other. You don't measure the force of gravity by just multiplying the masses together and dividing by the radius squared. You have to include the universal gravitational constant.
00:03:33
Speaker
And it's always interesting to me that these numbers that we assign things are not directly related and they have to be followed by some proportionality, some constant that changes them in a way that we can actually use.
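The point about proportionality constants can be made concrete with a quick numeric sketch (our illustration, not something from the episode): multiplying the masses and dividing by the distance squared only gives a physically meaningful force once the universal gravitational constant comes in.

```python
# Newton's law of gravitation: F = G * m1 * m2 / r^2.
# Without the constant G, the product of the masses over r^2
# is just a number with no physical meaning.
G = 6.674e-11  # m^3 kg^-1 s^-2

def gravitational_force(m1, m2, r):
    """Force in newtons between point masses m1, m2 (kg) separated by r (m)."""
    return G * m1 * m2 / r**2

# Earth (5.972e24 kg) and Moon (7.348e22 kg), about 3.844e8 m apart:
force = gravitational_force(5.972e24, 7.348e22, 3.844e8)
print(f"{force:.3e} N")  # roughly 1.98e20 newtons
```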
00:03:48
Speaker
That's very interesting. In fact, on our first episode, we talked about how ratios were important to the Egyptians and the Greeks and all that. Now shifting a little bit into computing, we're going to go way, way back and we're going to talk about the first computers, which everybody here has.
00:04:08
Speaker
I wanted to talk a little bit about our planning for this episode and why we decided to do a whole episode on computing and the history of computing. So if you look at our website online, you'll see our biographies. I am a former science teacher. I also taught math, and I am now an electrical engineer. And of course, Jonathan is a computer science student. And of course, Zachary is an astrophysics student. So you would know just from that that we're interested in computing.
00:04:33
Speaker
As I did research for this episode, I realized how much our knowledge about the world all throughout history is based on computation and how different cultures did it. I'm trying to get at the main justification, the crux, for our listeners of why we dedicated an entire episode just to computation.
Early Computing Devices and Cultural Counting Systems
00:04:55
Speaker
If you were to ask me why we dedicate an episode to computation, it's because it's the way that we move things around. When you do any kind of symbolic manipulation, you're taking part in a computation. It's the backbone of mathematics, which is the backbone of many natural science fields.
00:05:17
Speaker
One of the things that interests me is you can take an abacus, you can take a simple calculator, you can take even up to a smartphone. In fact, I carry a scientific calculator in my backpack. Most of what I do is add things with it, but that doesn't really reveal
00:05:36
Speaker
the full potential of these computing devices. I mean, ancient cultures used the abacus for incredible computations and interesting equations. And if you gave me an abacus right now, I would know how to count beads. And what I do with my incredibly advanced and expensive scientific calculator is I add two numbers together over and over and over again.
00:06:03
Speaker
Well, the first computer was specifically adapted to addition. Everybody in this room has them: their hands and feet. In fact, if you look at most cultures, they start with base 20. That's to say they group things in 20, kind of like how we group things in 10. We have 1, 10, 100, et cetera.
00:06:26
Speaker
That was the first computing machine, and the first computer hardware is something that we're also very familiar with, tally marks. Tally marks have been found on bone as far back as 400 centuries ago.
00:06:42
Speaker
Okay, now I think I'll go ahead and call this a well-known fact, at least in the College of Education. When we're learning about mathematics, we're always taught that the assumption, of course, is that the reason we use base 10 in our counting is because we've got 10 fingers. Perhaps base 10 was more popular sometime after the dawn of shoes. I don't know, because then you can't count on your toes anymore. But I think that most number systems, including the Mayan,
00:07:11
Speaker
They're either base 10 or base 20. Is that right? The Mayan number system is directly base 20. If you want to get technical, it's base 20, comma 18. You can find out more about that in the paper for this episode, which you can find at the website, breakingmathpodcast.com slash papers.html.
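To see how grouping in twenties works, here's a small sketch of pure base-20 digit expansion (our illustration; it ignores the Mayan calendar variant that puts 18 in the second place):

```python
def to_base(n, base=20):
    """Return the digits of n in the given base, most significant first."""
    if n == 0:
        return [0]
    digits = []
    while n > 0:
        digits.append(n % base)  # peel off the lowest-order digit
        n //= base
    return digits[::-1]

print(to_base(365))      # [18, 5], because 18*20 + 5 = 365
print(to_base(365, 10))  # [3, 6, 5]
```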
00:07:30
Speaker
Yeah. I've got to say I love that. I love that we're able to go into much more depth in the papers. We also include diagrams. I know when we were talking about this podcast, one of the things we were hesitant about was the fact that, you know, you're listening to us. We can't say look at figure one, figure two, but we can when we direct you to the website.
00:07:48
Speaker
Now, we're going to talk about exactly why a hand and a tally mark system works as a computing system. What is an essential computing system? What is a minimum computer that you need for it to be considered a computer? A computer needs memory.
00:08:04
Speaker
that's found in the tally marks. You can write numbers down that way. A computer needs direction, and that's done in the human mind when you're using your hand. And a computer needs a way to distinguish processes, which can be done using hands very easily.
Number Systems and Cultural Significance
00:08:22
Speaker
There's a mathematical method developed actually pretty recently, within the last century, called Chisanbop math that is done completely on the fingers.
00:08:31
Speaker
I'm sorry, I'm gonna make sure I'm saying it correctly. Chisanbop? Chisanbop. Yeah, I believe he was from a Slavic country. Okay, I'm sorry, was that just in brief? In brief, it's a method of calculating on your fingers. Calculating on your fingers was, of course, very popular in the medieval era, where, for example, touching your thumb to various fingers was
00:08:56
Speaker
representative of, like, 10, 20, 70. You could make any number from zero to like a thousand on your hands. Okay, wow. You know, I like that. Actually, one thing that I love about this entire podcast format is when we can connect what we do in one podcast to other podcasts. And what I've been thinking about, Jonathan, since you've been telling us about the history is,
00:09:17
Speaker
One of the hallmarks, I would say, of humanity as opposed to other animals and other mammals, or I'll even say of consciousness, is counting things and storing information symbolically, part of which is calculating, of course. In my anthropology classes from my first degree back in 2006, we learned about cultures that don't have numbers, or words for numbers, beyond three. They literally have one and two, and then after three, it's just many.
00:09:45
Speaker
So, I don't know. The development of a number system is very, very interesting as an aspect of consciousness. I don't want to overreach here, but there's a connection. There is definitely a connection. One thing to mention when we're talking about digital computation, meaning computing on our human digits, is that in many cultures, for example, the word for seven would be the same as the word for the right index finger,
00:10:14
Speaker
because that's the seventh digit. Okay. Okay. It makes sense. I'm following you. So, as you were saying earlier, I'm debating how much time to spend on each of these concepts, because really we could have an entire podcast, or an entire podcast series, on the history of computation. But if we were to start with what we said was maybe one of the first methods of computation and counting, I believe you said it was the tally mark?
00:10:39
Speaker
Yeah, one of the first numerical storage methods. Okay, so again, for all of our audience members: if you've seen old cartoons, you'll see a couple of prisoners who've been in there for years and years, and they've got a long beard, and there's an entire wall that is filled with tally marks. And the first thing I noticed when I was very young is the slash marks.
00:11:00
Speaker
And it's interesting, there was a point where the slash mark was introduced. There was a first person, well, it could have been many people, who thought of this: every time you get to a fifth one, instead of putting a fifth line down, you know, if you draw one line down for every day that you're in prison, or every year you're in prison, the fifth one would be a slash mark, which goes across all of them. That would allow you to then group by five, and it improves your efficiency of counting.
00:11:28
Speaker
Yes, which does speak a lot to both our short term memory, because if you see five pebbles, you know that there's five pebbles. If you see 87 pebbles, you're not going to know that there's 87 pebbles. You're going to know there's a bunch of pebbles. And it also speaks to the fact that we have naturally a base 10 or base 20 system because five divides 20.
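The slash-grouping idea can be sketched in a few lines (our illustration, with `||||/` standing for a crossed-out group of five):

```python
def tally(n):
    """Render n as tally marks, grouped in fives ('||||/' per full group)."""
    groups, rest = divmod(n, 5)
    return " ".join(["||||/"] * groups + (["|" * rest] if rest else []))

print(tally(12))  # ||||/ ||||/ ||
```

Twelve marks become two full groups and a remainder, which the eye can count at a glance instead of tallying 87 individual pebbles.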
00:11:49
Speaker
In our information episode, we talked about communicating information. So if you can reliably assume that each group of tally marks with a slash is five, then yeah, you just look at a group of them and instantaneously count by fives. You can efficiently communicate information, which is what we always do even nowadays in computation. And Zach, I'm going to put you on the spot real quick. What's three plus eight? Eleven. Two minus one? One.
00:12:17
Speaker
Now, you said earlier you use your calculator for adding and subtracting mostly. Absolutely. Can you explain to me the differences between what you do on your calculator and what you just did? Most of what I just did was instant recall, with the obvious exception of two minus one; that threw me for a loop for some reason. And it's a lot like times tables. You memorize a lot of things. We spit it out because we've seen it so many times. It's so basic and we just know it.
00:12:46
Speaker
But if I put up two fingers on one hand and three fingers on the other, you can see that there's five fingers directly up, right? Yeah, I mean, and that would be a case where I would group them together and just visually see five. Kind of like counting change. You know, what a lot of us do, what I personally do is I group them by dollars. I stack up a dollar in quarters. I stack up another dollar in quarters. And rather than counting, there's 25 cents plus 25 cents plus 25 cents.
00:13:14
Speaker
however many times, I can just see one, two, three, four, five, six, however many dollars. And being able to have that visual cue in my brain is essential. And what's interesting is
Mechanical Calculators and Early Algorithms
00:13:26
Speaker
that the abacus is especially adept at doing that kind of addition, which I think speaks to the fact that you use a computer, essentially one that's faster than the vacuum tube computers that we had 50 or 60 years ago,
00:13:42
Speaker
to do something as simple as addition means that there's a cognitive limit to what you can add and subtract, and that's why we need computers.
00:13:51
Speaker
Wow, okay, actually, even though I have a degree in education and I took classes on the history of mathematics, we actually did not spend a whole lot of time on the abacus. So even though I would say that's probably more well known, can we talk a little bit more about how an abacus works? Because that's amazing. Sure, let's talk about the soroban. Okay. The soroban is a Japanese abacus. It's a frame that's divided into two pieces, with rods going through the two frames.
00:14:18
Speaker
On one side of the rod, there's two beads, and on the other side of the rod, there's five beads, or might be one and four. The way these work is through another touchstone of computation. I'm glad that you brought this up. Algorithms. To add two things, you increase the digit. If there's a carryover, you increase the one next to it. It's the same way that we add just on paper.
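That digit-by-digit carry procedure, the same whether on beads or on paper, can be sketched like this (our illustration of the algorithm Jonathan describes, not code from the episode):

```python
def add_digits(a, b):
    """Add two numbers digit by digit with carries, soroban-style.
    a and b are lists of decimal digits, least significant first."""
    result, carry = [], 0
    for i in range(max(len(a), len(b))):
        da = a[i] if i < len(a) else 0
        db = b[i] if i < len(b) else 0
        carry, digit = divmod(da + db + carry, 10)  # carry moves to the next rod
        result.append(digit)
    if carry:
        result.append(carry)
    return result

# 47 + 85, digits least significant first:
print(add_digits([7, 4], [5, 8]))  # [2, 3, 1], i.e. 132
```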
00:14:43
Speaker
There are algorithms for multiplication, square roots, cube roots. The soroban was actually so well developed that it was taught well into the 20th century. Oh my gosh, wow. I've never really spent a whole lot of time thinking about the history of computation. I'm really glad we did this episode, because then I'm thinking, first of all, you had the revolutionary slash mark, which was introduced and changed the way we think of numbers.
00:15:07
Speaker
And then once you really get these patterns, you introduce, as you said, algorithms. And again, I think that word needs more unpacking. I think now a lot of folks have heard "algorithm"; there are Facebook algorithms. But if you were to describe an algorithm, let's say to a fourth grade student asking, "Mommy, what's an algorithm?" or "Dad, what's an algorithm?" What would you say, son?
00:15:27
Speaker
I would say that I don't have a kid, so I don't know why you're calling me Dad. But then I would say that an algorithm is a set of instructions to do something to numbers. That's what an algorithm is. Okay, you know, one of the things that actually just kind of
00:15:43
Speaker
popped into my mind is the difference between algorithms and heuristics. And what I was describing to you just a few minutes ago was more of a heuristic. That's a shortcut that my brain takes without having to go through the algorithm, the set of instructions that tells me how to get 2 plus 2 equals 4.
00:15:59
Speaker
which I think just as an aside is why the storytelling method in 1984 where they told Smith that two plus two equals five is so powerful because it exploits the fact that heuristics can sometimes summarize algorithms.
00:16:18
Speaker
So, so far, the things that we've touched on for our audience: the first thing we said was counting. We then said efficient methods of counting and symbolism, both in slash marks and in numbers, where a single numeral stands for an amount and it's understood. We then mentioned the third thing, really a third and fourth: algorithms and heuristics. I'm just trying to keep a tally mark for these main components. You know what I mean?
00:16:45
Speaker
Yes, and I think one thing to mention is that the abacus was the first place value system. So even though the Egyptians had different symbols for ones, tens, and hundreds, on the abacus the positions of the stones meant the same thing; they just didn't write it down that way. Much like Zach's electronic calculator, one of the first mechanical calculators was invented by Blaise Pascal.
00:17:08
Speaker
You may know Pascal from his triangle or from his many mathematical theorems, but he was also an inventor, and he invented a mechanical adding device that was very ingeniously developed. Really? It was a set of wheels, like if you've ever used an old telephone where you have to turn the wheel around. A rotary, right? A rotary phone. It worked like a rotary phone: you dial in the first number, then you dial in the second number, and you get the sum as a result.
00:17:38
Speaker
I read that Pascal actually put away his mathematical developments in favor of a life of religious observance, which was very interesting. Yes. He had a belt with spikes on it, and anytime he had an impure or mathematical thought, he would jab his elbow into it. Wow. Okay. He gave up his abstinence from mathematics only when he was sick and needed a distraction.
00:18:02
Speaker
Regarding the brutal nature of what we just heard, I'm still kind of stuck on the rotary adding device. When you dial in a number on a rotary phone and then you find the sum of it, how exactly does that work? Does it count rotations? Does it count degrees in rotation?
00:18:21
Speaker
Each number was separated by 36 degrees of rotation, so the numbers were equally spread out on the wheel. Now the way that it worked internally was you would turn it and let's say you're adding 5 plus 5 and you want to get 10.
00:18:35
Speaker
You turn it halfway, you get five, and on the output dial, you see five. Now you keep turning it, it goes six, seven, eight, nine, and then it goes to zero. But when it goes to zero, it hits a lever, which uses gravity. This is the ingenious part of his machine to move the next one over by 36 degrees. A mechanical principle. It's a lot like the odometer on an old school car.
00:19:01
Speaker
Yeah, and what's brilliant about it is that because it uses gravity, you can make an infinitely large Pascal machine.
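A sketch of that rollover-carry mechanism, modeled loosely on the Pascaline (the class and method names here are ours, for illustration):

```python
class Pascaline:
    """Toy model of Pascal's calculator: a row of base-10 wheels.
    When a wheel rolls over from 9 to 0, a lever bumps the next wheel."""

    def __init__(self, wheels=6):
        self.wheels = [0] * wheels  # least significant wheel first

    def dial(self, wheel, clicks):
        """Advance one wheel by `clicks` positions (36 degrees each)."""
        for _ in range(clicks):
            self.wheels[wheel] += 1
            i = wheel
            while self.wheels[i] == 10:  # rollover: the lever trips
                self.wheels[i] = 0
                i += 1
                self.wheels[i] += 1      # carry into the next wheel

    def value(self):
        return sum(d * 10**i for i, d in enumerate(self.wheels))

m = Pascaline()
m.dial(0, 5)  # dial in 5
m.dial(0, 5)  # dial in 5 again; the ones wheel rolls over and carries
print(m.value())  # 10
```

Because each carry is a purely local event between neighboring wheels, the same design scales to as many wheels as you care to build, which is the point about the machine being extendable.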
00:19:08
Speaker
One of the things that I like to say a lot in defense of mathematics is when it comes to, quote unquote, the fuzzy studies, sociology, geography, psychology even, even to a degree biology, there are a lot of good answers and then there's a best answer and there's kind of a world of room to argue that. When it comes to things like mathematics and even to a degree physics,
00:19:32
Speaker
You set up your equations, you plug in your numbers, and provided you didn't screw up your input variables, going back to the Chaos Theory podcast, setting up your equation, and everything works out right, you will get the right answer. And the right answer is the right answer, and anything else is demonstrably not. Although, wasn't it Feynman who said that anything that's too complicated isn't physics? I'm not sure, but that sounds like Richard Feynman to me.
00:20:01
Speaker
Now we're going to talk about a few more mechanical calculators real quick, just to show how they progressed. Right after Pascal made his device, Leibniz, famous for calculus, made his own. He had a wheel that could multiply numbers, not just add them. Its operation is too complicated to describe here, but it can be found in the paper. And again, that's breakingmathpodcast.com slash papers.html. Now before
The Jacquard Loom: Punch Cards and Information Storage
00:20:30
Speaker
we move on, we'd like to talk about one last machine. It was invented in the early 1800s, and it's called the Jacquard loom. It was a loom operated by punch cards, completely mechanical, and it made, I believe, lace using patterns from the punch card. And this is important because now we have a very compact way of storing information using punched holes in paper.
00:20:57
Speaker
That was so useful that it was used until the 70s, even in personal computers. The Jacquard loom: wasn't he inspired by a train punch card? Yeah, a train conductor's ticket punch. Because train conductors, depending on where the person was sitting, would punch the card in a different place. And Jacquard said eureka, and he made his loom.
00:21:19
Speaker
Okay. Wow. Okay. So, so basically you've already got a memory storage device. You know, right now we've got our USB drive back then they've got their punch card. Okay. So that, but still that is a legitimate memory storage device. It's just a piece of a card that just has a hole punched in it. And from that you've got information.
00:21:36
Speaker
Yep. Nice. From the Jacquard loom, in the early 19th century, an inventor named Charles Babbage took inspiration. But instead of weaving threads, he wanted to weave equations. To this end, he devised two machines. The first was known as a difference engine, and had the task of calculating complicated polynomials.
00:21:54
Speaker
The second, the Analytical Engine, was more complex and had all the components of a modern computer. It was never built, but it was designed. And it was programmed by Ada Lovelace, who devised the first algorithm ever invented to run on a computer. It was designed by Charles Babbage; it was never built during his lifetime, but it was essentially a precursor to the modern-day computer, albeit made from gears and... Steam engines. Exactly, exactly. I love the story of the difference engine.
00:22:24
Speaker
Yeah, as I recall, it was partially built. Yes, yes. A seventh of it actually was built in his lifetime, but the complete machine never was. So back in his day, there was a job that was a very menial job. The job was called a computer. A computer was somebody who spent their time doing arithmetic by hand.
00:22:45
Speaker
In engineering and in science and in navigation, you do need computations done. You need trigonometry, you need logarithms, and you need reference charts. So basically, they would have these rooms of people who were computers who would spend all day filling out a table that would be used later for engineers. And it was very, very boring. It was drudgery, spending all your time doing these computations by hand.
00:23:13
Speaker
And it was very economically useful to have these tables at the time. For one of these problems, they offered, I believe, a prize that would be equivalent to $250,000 today: to design a clock reliable enough for reckoning at sea, so that you could calculate longitude. If you think about it, the better you could plan a route, the quicker you get goods to where they need to go.
00:23:39
Speaker
So this was an expensive problem and having inaccuracies was expensive and that's why part of this was built at all.
Charles Babbage: Visionary Challenges
00:23:49
Speaker
If you can imagine a device where it's got a set of wheels and on the wheel there are numbers from zero through nine. There are also columns, there are gears. It looks sort of like if you could imagine the inner workings of a clock and a factory assembly line all put together. How else would you describe it?
00:24:09
Speaker
I'd describe it as like a row of those CD stacks. OK. OK. But made out of gears. So, and again, obviously, if you think of a 19th-century computer, it doesn't have a monitor, of course. But as I said earlier, it's got wheels with numbers on them. And as you turn the crank, the wheels turn, and the numbers they settle on, that's your final number. Essentially, he had this design for a computation engine. And now, sadly, it was never built. And I think that's tragic.
00:24:39
Speaker
It wasn't until over 100 years after him that such devices were actually built, right? Well, I mean, mechanical calculators, like the ones that Leibniz had made, continued to be built. But a true computer, a stored-program computer as it's called, where the computer has the program as part of its data, was not built until 1948 at Manchester.
00:25:06
Speaker
If we go to the century previous to that, we have the designs for this difference engine and this analytical engine, but they never actually made it. Now, here's the humanities aspect of this. I'm going to go ahead and veer off of the mathematics. I want to talk about the human element of all of this. There's a couple of reasons why this device was never built. Apparently, Babbage was extremely, extremely difficult to work with.
00:25:34
Speaker
for a few reasons. As he would make these grand designs and the process of building them would start, he would continuously scrap the plans because, oh, he's got a better idea. So what's the problem? How would you describe somebody who can never finish a task because they keep having more ideas?
00:25:55
Speaker
Attention deficit. Perhaps in today's terms, he'd be very attention deficit. You know, and it's funny, he's so inspired, but he keeps changing his plans and doing it more efficiently, so things don't get done. There are other issues as well. Also, as I understand it, according to various records, he didn't have people skills whatsoever.
00:26:17
Speaker
He was able to get the funding for his device. He was able to hire machinists. But according to the records, there was a dispute about something about being reimbursed. I guess the machinist packed up everything and moved his entire family closer to Babbage. And I guess Babbage wasn't really willing to compensate him for various reasons. According to the story, his machinist quit working for Babbage before the device was complete. So you have an incomplete device and no staff.
00:26:47
Speaker
And they wasted about 20,000 pounds on that. Real quick, just so that our listeners know exactly how much that is, 20,000 pounds at that time. That is equivalent to the price of 22 brand new steam engine locomotives. That's a lot of coin.
00:27:07
Speaker
Yeah, in modern terms, 20,000 pounds is about $5 million. And the last issue I've heard of, a very, very human issue, was that it was very difficult for Babbage to explain the significance of his device to the generals and the admirals who made decisions about funding. Now, this wasn't always the case. He was successful with procuring some money. A lot of money. Yes, yes. But in some cases,
00:27:35
Speaker
you know, especially when you're not delivering according to your set timetable. Imagine trying to explain it to a general who's used to thinking, you know, how can we use this device to win wars? How can we use this device to defeat our enemies? And you're like, oh, but you can calculate math real fast, here, let me show you. Well, I mean, it's interesting to note that
00:27:55
Speaker
scientists in general kind of have a reputation for being bad at communicating. You'll come across a scientist who says, hey, let's build this laser-cooled cryogenic chamber for whatever, whatever, because it'll be really cool and we can do a lot of science and discover a lot of things. And you have to have the more advocate side of things, like the Neil deGrasse Tysons and the Bill Nye's to come in and be like, okay,
00:28:21
Speaker
Yes, it'd be really cool, and here's how it would be useful; essentially, why you should invest in this really, really cool project. Scientists historically don't have a gift for that.
00:28:32
Speaker
Yeah, I think the same thing happened to Faraday. And it's worth mentioning that during this time, England was known for being extremely backwards as far as innovation went. The banks, for example, kept everybody's accounts on carved sticks. They started that practice in the 13th century. They only stopped because of a fire.
00:28:54
Speaker
That brings us back to tally marks on bones, right? No, like literally, a big notch meant like a hundred pounds and then a squiggly notch meant something else. In the 20th century, with widespread electricity came a new breed of inventor who used this marvelous resource as a primary inventing material.
Electric Computing Revolution and Binary Systems
00:29:14
Speaker
Claude Shannon, while still a student at MIT, devised a way to calculate in binary using relays. Relays are essentially electrically operated switches. And thus the electric computing revolution was born. Quickly, computer after computer came out, and Moore's law was established.
00:29:33
Speaker
Now, what Moore's law basically says is that the number of transistors you can fit on a chip doubles about every two years; a transistor is the basic thing computers use to compute with. So as I understand it, transistors were the major thing of the last century of the digital revolution. I mean, basically, that's what allows us to incorporate electronics into computing devices.
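That doubling can be sketched numerically (our illustration; the Intel 4004 baseline of roughly 2,300 transistors in 1971 is just a convenient anchor, not a figure from the episode):

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count under Moore's law:
    doubling every `doubling_years` from a known baseline."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Exponential growth: five doublings per decade at this rate.
for year in (1971, 1981, 1991, 2001):
    print(year, f"{transistors(year):,.0f}")
```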
00:29:55
Speaker
Well, a little bit. Prior to transistors being used in computers, you had vacuum tubes, which are an electronic device. Okay, okay. And then before that, you had the purely electric device of relays, which was used in certain early computers like Konrad Zuse's Z3. Okay, well, I'm not familiar with that. Real quick, sorry, when did the Z3 come out? I believe 1948.
00:30:21
Speaker
Okay. Okay. Now, in that paragraph that you just read aloud, you mentioned our good buddy Claude Shannon. We almost had an entire episode dedicated just to Claude Shannon earlier. Oh yeah, our information theory episode. And it's amazing that he also devised a way to calculate using relays, which is related, I mean, it's binary, but not totally related to his other field of expertise. He was a polymath.
00:30:48
Speaker
Wow. Yeah. So essentially at age 21, you know, he's a master's degree student at MIT. His master's thesis essentially laid the groundwork for digital logic. You know, just for those of us who are not entirely familiar, can you describe what a polymath is?
00:31:05
Speaker
A polymath is someone who's a genius in multiple areas. So, for example, Leonardo da Vinci is my favorite polymath. He was good at art. He was good at mechanics. He was good at all sorts of stuff. Now, I catch the, you know, the prefix poly, but why math?
00:31:26
Speaker
Oh, um, I think it's Greek; I don't know what it's Greek for. Fair enough. Yeah, very cool. Thank you. Essentially, my introduction to digital logic came from computer logic design. How can we do a quick crash course in digital logic?
00:31:42
Speaker
Let's just do a quick review of what binary is. Okay, binary. Should we pick on Zach? Zach, what do you know about binary? Well, I mean, it essentially goes back to the information podcast that you guys did. Isn't it essentially a series of yeses or nos or zeros and ones? It's not necessarily zeros and ones, it's on or off, yes or no. It's two options and it's a series of these options that define
00:32:11
Speaker
a program, a memory, information, however you wanna say that, that's what it is, is it not? I think he hit the nail on the head there, don't you think there, Jonathan? Yes, and I think that one of the important things about binary to mention is that it's like counting and skipping every number that has two through nine in it. That's another way of looking at it. So you got zero, right? You got one.
00:32:38
Speaker
Don't do 2, don't do 3, don't do 4, don't do 5, don't do 6, don't do 7, don't do 8, don't do 9, but then you got 10. And that's 2 in binary.
00:32:48
Speaker
Then you got 11, which is three in binary. Then you skip every number up to 100, and that's four in binary. And that's why it's good for electricity, because electricity could be more than just on or off, but when you're dealing with what are called nonlinear components, on or off is the easiest thing to deal with.
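The "skip every number that has a two through nine in it" trick described here is easy to check with a few lines of Python (a sketch for illustration; the function name is ours):

```python
def looks_binary(n):
    """True if the decimal numeral for n uses only the digits 0 and 1."""
    return set(str(n)) <= {"0", "1"}

# Count upward, skipping every numeral that contains a digit 2 through 9.
# What survives is the binary sequence: 0, 1, 10, 11, 100, ...
binary_like = [n for n in range(101) if looks_binary(n)]
print(binary_like)  # [0, 1, 10, 11, 100]

# The k-th surviving numeral is exactly k written in binary:
for k, numeral in enumerate(binary_like):
    assert numeral == int(bin(k)[2:])
```

So the fifth surviving numeral, 100, really is four in binary, just as in the dialogue.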
00:33:09
Speaker
Nice, nice. So, you know, essentially it's all of the ways of storing information and communicating when you break it down. So you also have gates, right? Like, you know, in digital logic.
Interactive Computing and Algorithm Complexity
00:33:21
Speaker
Yeah. We'll discuss the gates in just a second. But right now what we're going to do is we're going to create a quick computer. Cool. Gabriel, you're going to be the memory. Alrighty. Zach, you're going to be the ALU. I have no idea what that is. You're going to have to break it down for me.
00:33:35
Speaker
An ALU is an arithmetic logic unit. I'm going to ask you questions. And I will be the main computer. And our problem is two plus two. We want to calculate two plus two. So I'm going to pretend I'm getting instructions from a main memory source. So I'm going to pretend I'm getting them from you, Gabriel. The first thing I'll give to you is, I would say, in slot zero, store number two.
00:34:01
Speaker
All right, slot zero, we've got number two, check. Slot one, store two. Done. Slot one has a two. All right, now ALU, get slot one and zero, add them together, and put them in slot zero. And so slot one is two and slot zero has two. Add them together and do what again? Put them in slot zero. Put them in slot zero. Okay, so there's now a four in slot zero, correct? Gabriel, is there a four in slot zero? I got to be very honest. I lost track.
00:34:29
Speaker
Gotta be honest for our listeners. Here's where my mind was during that last bit. I thought, man, I wonder if you could completely model a computer through a mailroom at, like, a big bustling business. Could you do that? That'd be better as a model of parallel computing. Okay, dude, I'm sorry I failed. It's okay. You've got bad memory, you and me both.
00:34:53
Speaker
I was having trouble adding two and two because I wasn't sure what I was supposed to be doing. Another reason why computers are better than humans at this stuff. Yeah, I mean, just a demonstration. We had two numbers. We had two numbers, Gabriel. Sorry. Okay, you know what, I'll cheat. I'll use my digits, my fingers, my phalanges.
00:35:14
Speaker
But as you can see, everybody had their different tasks. All the memory was supposed to do was say what number correlated to what other number. All the ALU did was manipulate numbers and all the main computer did was tell people what to do. So do we want to try this again, kind of understanding what our tasks are?
00:35:38
Speaker
Yes. OK. All right. Gabriel, on your left hand, store two. OK. Boom. Got it. Gabriel, on your right hand, store two. OK. Zach, add his two hands together and make his right hand the sum. Make your right hand four. And do I? You don't do anything to the other one. Make your right hand four. Your left hand stays two. What is on your right hand? Four, actually. OK. We just added two and two like a computer does. Nice. That is painful. Oh, my gosh.
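The exchange above can be sketched as a toy program, with one function per role (the slot numbers and function names are just for illustration):

```python
memory = {}  # Gabriel's role: remember which number sits in which slot

def store(slot, value):
    """The control unit tells memory: 'in slot N, store this number.'"""
    memory[slot] = value

def alu_add(a, b):
    """Zach's role: the ALU only does arithmetic, nothing else."""
    return a + b

# The "main computer" issues the same instructions as in the dialogue:
store(0, 2)                              # slot zero, store number two
store(1, 2)                              # slot one, store two
store(0, alu_add(memory[0], memory[1]))  # add them, result back to slot zero
print(memory[0])  # 4
```

Each part only does its one job, which is exactly why the computer never loses track the way a human does.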
00:36:08
Speaker
But to a computer, it's automatic. And it does this using gates. We're going to just go through the quick types of gates. Let's say you want to get into a bar that has a cover charge. If you have the $10 and you're 21, you can get in. But if you don't have either one, then you can't get in. That's called an AND gate.
00:36:28
Speaker
Okay, so basically, an AND gate, you have to meet, in this case, if we're talking binary, for an AND gate you have to have two criteria that work. If you're gonna have an output of a one, both of your inputs have to be one: one and one.
00:36:42
Speaker
Yeah, and a one can mean true as well as the number one. Oh, right, right, right. Okay, cool. So, Gabriel, are you 21? I'm over 21. Do you have $10? I do have $10. All right, you could go into the bar. Cool. That's an AND gate. Nice. Now, an OR gate is, instead of having both of them have to be one, either one could be one.
00:37:02
Speaker
OK, and then just purely mathematically, of course, you can think about AND as more like multiplying. So if you need to end up with a one, you have to have a one times a one, and your only options are ones or zeros. Yeah. And then you have an exclusive OR gate. So let's say you have two friends, Andy and Bob, and they don't get along. If you're going to have a party and you invite both Andy and Bob, you're not going to have a good time. If you don't invite either one, you're not going to have a good time. But if you invite just one of them, you're going to have a good time.
00:37:29
Speaker
Oh, I like that. Okay. Okay. So only one or the other. So it's kind of like a sorting thing. Yeah. And when you add things together, if I add one and one together, I get 10 in binary. You have two ones, and because you have two ones, you get a zero out. Okay. And if you've got a zero and a one, or a one and a zero, you get a one out. Nice. Now our next topic is going to be lambda calculus. Oh gosh. I was not prepared for this. Yeah.
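Putting the gates together, the "two ones give a zero out, with a carry" rule described here is exactly a half adder; a minimal sketch in Python:

```python
def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

def XOR(a, b):
    return a ^ b  # exclusive or: one or the other, but not both

def half_adder(a, b):
    """Add two bits: XOR produces the sum bit, AND produces the carry."""
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))  # (0, 1): one plus one is 10 in binary
print(half_adder(1, 0))  # (1, 0)
```

Chaining half adders (with an extra OR for the incoming carry) is how a computer adds whole binary numbers.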
00:37:55
Speaker
Now, before we go into lambda calculus, we're going to have to talk to you about Church numerals.
00:38:00
Speaker
Okay, dude, you know what? Honestly, I worry about our listeners, especially those who are driving, because you might be putting them to sleep there, Jonathan. Before we go into Church numerals, we're going to have to go into Church logic. Is this going to keep going? Yeah, I think he's torturing us here. For the sake of our listeners, okay, I'm sure that that's awesome and that's relevant, and it'll be on the paper. Okay, it'll be on the paper. And I'm sure there are listeners out there who would like to know more about that.
00:38:27
Speaker
Wow, we went through a lot. Okay, so we made it up to Shannon, and then basically an introduction to digital logic, just a basic intro, and that brings us almost to the present day.
00:38:37
Speaker
Some of the first computers were the size of rooms, as you may well know. While some computers, such as the Sunway TaihuLight, are still that size, the computers that we interact with day-to-day are much smaller. Or at least they seem that way; since the advent of the internet, we've arguably created a global computer. This poses the question: where is computing going, what form will it take, and what role will it play in our lives? So, before we talk about what role it will play in our lives, let's talk about the role that it does play in our lives.
00:39:05
Speaker
Unfortunately, you know, I feel like myself and many others have a cell phone on us, and more often than not, when we're waiting in line, we're on Facebook or Instagram, and we've sort of become zombies to a degree. But that's not only a bad thing, you know; there are two sides to it.
00:39:22
Speaker
Whether or not it's a good or a bad thing, the fact remains that that is definitely a thing. And these systems are ruled by, as we described before, algorithms. Except for the algorithms aren't simple algorithms like adding two and two. They're algorithms that treat human beings like pieces of data and manipulate the connections between them. Wow. The way you phrase that, I almost feel objectified. Should I?
00:39:52
Speaker
Maybe you should, maybe you shouldn't. It's up for debate. I think that you shouldn't, because I think that objectification of humans in terms of numbers is just the Platonic ideal.
00:40:03
Speaker
OK, moving on. And one thing that we have to talk about is how little we know about algorithms, really. For example, sorting algorithms. We do not know the minimum bound on how fast a sorting algorithm can go. It's still up for debate. And we've been studying these for 50 years. OK, so I know that computer scientists will completely jive with what you're saying. But for those who are not computer scientists, if you were to describe sorting algorithms in short,
00:40:32
Speaker
Sure, let's say I give you a list of numbers. They're completely out of order and I need you to put them in order. How do you do that if you're a computer? The simplest way you might think is, well, just take the smallest one, put it first, and then take the next smallest one, put it second. But it turns out that that's a very slow process.
00:40:51
Speaker
A faster way to do it would be, think about if you had a deck of cards and you need to put them in order. If you look through the entire deck for the smallest card every single time, it's going to take a long time. However, if you split the deck into two halves, and then split each half into two halves until you have just pairs of cards, and then you sort them that way, it's going to be a lot quicker, and that's actually called merge sort. Oh, okay. Well, actually, the whole card example, that actually cleared it up quite well for me. Thank you for that.
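The deck-of-cards description here is merge sort; this is one minimal way it might be written (an illustrative sketch, not an optimized implementation):

```python
def merge_sort(cards):
    """Split the 'deck' in half until single cards remain, then merge sorted halves."""
    if len(cards) <= 1:
        return cards
    mid = len(cards) // 2
    left = merge_sort(cards[:mid])
    right = merge_sort(cards[mid:])
    merged = []
    while left and right:  # repeatedly take the smaller front card
        merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return merged + left + right

print(merge_sort([7, 2, 9, 4, 3, 8]))  # [2, 3, 4, 7, 8, 9]
```

Because each level of splitting halves the deck, the work grows proportionally to n log n rather than the n squared of repeatedly hunting for the smallest card.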
00:41:20
Speaker
Before we understand what a computer can do, we need to understand the difference between a computer and a supercomputer.
00:41:28
Speaker
And basically the difference between a supercomputer and a computer is that a supercomputer is what we call massively parallel. An example of this that you actually have on your computer is your graphics card. It's basically tons of tiny little computers on your graphics card, but in a supercomputer you have tons of big computers all working in concert. And these are such complex machines that just to debug them you have to have specialized software.
00:41:55
Speaker
Of course, you have to have specialized software to debug anyway. The way that the software works is it treats the entire computer memory as a tree and gives you heuristics.
00:42:07
Speaker
So is this sort of where, a bunch of years back, I heard about a guy who made a quote-unquote supercomputer out of the little Raspberry Pi chips, essentially, and had, like, 200 nodes on this supercomputer. Is that sort of what this is? Yeah, it's a massively parallel computer, which, I can tell you from experience, maybe not relatable experience, is very fun to program.
00:42:34
Speaker
There's a limited amount of speed up, however, even with a supercomputer. There are certain algorithms that you cannot parallelize. Certain algorithms you have to just do step one, step two, step three. You can't do steps one, two, and three at the same time.
00:42:49
Speaker
The fastest algorithms are ones where you could do all the steps at the same time and get a good result. Google uses a lot of these in their search algorithms; for example, the algorithm at the base of their search is called PageRank.
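As a rough sketch of the idea behind PageRank (this tiny graph and the damping value are illustrative, not Google's actual implementation), rank flows along links until it settles:

```python
def pagerank(links, damping=0.85, iters=50):
    """Power-iteration PageRank on a dict mapping page -> list of pages it links to."""
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(iters):
        # Every page keeps a small base share, plus rank flowing in along links.
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            for q in outs:
                new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# A made-up three-page web: A and C both link to B, and B links back to A.
ranks = pagerank({"A": ["B"], "B": ["A"], "C": ["B"]})
print(max(ranks, key=ranks.get))  # B, since two pages point at it
```

Each iteration updates every page independently from the previous round's ranks, which is what makes this kind of algorithm friendly to parallel hardware.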
Quantum Computing and Future Prospects
00:43:02
Speaker
And for a more complete description of PageRank, please refer to the paper. Now, Zach, what are you familiar with when it comes to quantum phenomena and how it might be used for computation?
00:43:14
Speaker
Truth be told, not very much. I know that there are a lot of different, quote, unquote, quantum states that can be used more effectively than binary because there's more than an on and an off position. And that's really about the limit of my knowledge as far as that goes. And I don't even really know how accurate that is. It's pretty accurate.
00:43:36
Speaker
When I worked at Sandia Labs earlier, actually, there were a lot of advertisements for people who were specifically working in the area of quantum computing. There's a lot of research right now in it. Now, it took me over two years of working there before I had somebody explain to me exactly the relevance or even basically how a quantum computer worked.
00:43:56
Speaker
It was a really, really cool analogy. Happy to share it. Oh, yeah, please. Okay, sure, sure thing. So the way it was explained to me was like this. If we're up to speed, at least to some degree, about the phenomenon of quantum mechanics, it's the idea that very small things, you know, parts of atoms like electrons or protons, or even groups of atoms, you know, I'm not quite sure how high this goes up. Yeah, it could go up; quantum dots and all that.
00:44:24
Speaker
Right, so you've got a very small thing, a very small unit that can exist either as a particle or as a wave. And with regard to quantum computing, it was explained to me like this. Imagine that you have a maze, the kind of maze that would be on a children's menu at a restaurant, and you have to find your way out of the maze from the center or from something like that.
00:44:50
Speaker
It was explained to me that if you had a typical computer, not a quantum computer, try to find its way out of a maze, it would try one path at a time. It would just send out a marble, or send out one particle, and it would take a path, and then if it took the wrong path, you'd have to start again.
00:45:06
Speaker
And of course, if the computer had memory, it would know which path it took and it would take a while. It would take some amount of time before it could make its way out of the maze through the process of trial and error. Now, a quantum computer utilizes the wave property of things. So if it was a quantum computer, it would send out a wave of particles and it would take every path at once. Therefore, it would solve the puzzle the first time, every time. That is simply mind blowing to me.
00:45:36
Speaker
And that's one thing that actually a lot of people misunderstand about quantum computers. And I'll use my own analogy, similar to yours, to explain what people misunderstand about it. Imagine you have a waiter taking orders.
00:45:56
Speaker
And this waiter has a property that it could split into two waiters. It just does that. And these waiters exist in the same space at the same time. Now, this waiter takes a bunch of orders. If the waiter keeps splitting up and it just splits into 64 different waiters, it could do all these orders in a very short amount of time. However, we have all these waiters with different orders and we have to somehow consolidate them.
00:46:24
Speaker
The consolidation itself takes a lot of time. A quantum computer is a computer that could split into as many computers as there are bit patterns representable on the qubits. A qubit is the basic unit of quantum computation. However, even though you could get every solution, you still have to find the correct one, and that takes time.
00:46:44
Speaker
So, I mean, is this sort of like when you have an eight-lane-wide freeway, so everybody can move really, really, really fast on the freeway, and then everybody needs to get off at once, and all of a sudden everything stops and slows way down? Is that sort of it?
00:46:59
Speaker
Very similar to that, yeah. And one thing that quantum computers are very good at, though, is factoring integers. How good they are is actually a subject of debate, but they're faster than classical computers at it. This would pose a problem for encryption. Encryption depends on numbers being very difficult to factorize. And if they're easy to factorize,
00:47:23
Speaker
Every single time you do anything on the internet, it's going to be insecure in the future if this turns out to be a viable method. I know there's a lot of physical limitations that anyone can read up on the internet, so obviously quantum computers aren't here yet, but they are quite a thing. Well, there have been some quantum computers developed. I think they're up to about five qubits. Oh, okay. Well, I guess, yeah, I'm not completely up to speed on the development of them.
00:47:50
Speaker
Yeah, they might go as high as seven qubits. I'm not totally sure. That's one of the problems with quantum computers is that we have not discovered a way of adding on qubits. It's kind of similar to when humans were very nascent and we didn't have a place value system. It's kind of the same problem. We need a place value system for quantum computers. OK. Let's say that we had a world where we had quantum computers and every computer was a quantum computer. How might things be different?
00:48:19
Speaker
It's difficult to say because there are so few algorithms for quantum computers that have really been developed that are faster. What might be different is we'd have very low energy computing because quantum computers don't take a lot of energy.
00:48:35
Speaker
because they have to be reversible. Reversible means that every computation that you do, you can reverse. If you have the data that comes out, you can recover the data that went in, and vice versa. There's actually a physical limit to computation, however. Moore's law cannot go on forever. And that's another thing that we have to talk about with the future of computing. The most
00:48:57
Speaker
a one-kilogram laptop can do is about 10 to the 50th operations, that's a one with 50 zeros after it, per second. And you said, the laptop made by God? OK. And you said the greatest amount of computations that it can do is, what did you say? One with 50 zeros after it per second. OK. OK. So that's a lot of computations. Now, in our modern day it is.
00:49:21
Speaker
Can I take a stab here at something? Zach, earlier you were talking about some of the constants in the universe and you had mentioned Planck's constant. I'm wondering, is that number based on Planck's constant at all? You know, I was actually going to ask the same thing. What's the limiting factor there? Is it the minimum size i.e. Planck's constant?
00:49:40
Speaker
It has to do with the entropy of information. It takes a certain amount of entropy to do a calculation, to do a bit flip. And it's based directly on Planck's constant. You can find more information in the paper. Wow. Okay. Interesting. So it's just crazy to me; again, it's unintuitive, but obviously factually demonstrable.
00:50:03
Speaker
I was actually also going to ask, as far as you mentioned low energy computing with quantum computers, how does that all relate to energy versus entropy? Because in order to reorder a system, you have to add energy to it. And how do you minimize the energy that you add to a system and still come out with the order that you were looking for to begin with?
00:50:26
Speaker
It basically has to do with the fact that when you turn a one into a zero, it takes a certain amount of entropy. I can't remember exactly how much it takes at the present moment. When you do an operation like AND, where you take two bits, if I tell you bit one AND bit two is one, you know that both bits had to be one. However, if it's zero, you don't know which two bits you came in with. So, therefore, you have a loss of information. So you have, again, an entropy. Cool.
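That loss of information is easy to tabulate: three different input pairs all AND to zero, so you cannot run the gate backwards. A quick illustrative check:

```python
from itertools import product

# For each possible AND output, collect the input pairs that produce it.
preimages = {}
for a, b in product((0, 1), repeat=2):
    preimages.setdefault(a & b, []).append((a, b))

print(preimages[1])       # [(1, 1)]: an output of 1 pins down both inputs
print(len(preimages[0]))  # 3: three input pairs collapse to 0, so information is lost
```

A reversible gate, by contrast, would map distinct inputs to distinct outputs, which is why reversible computing can in principle sidestep this entropy cost.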
00:50:56
Speaker
And the 10 to the 50th limit holds true for quantum computers just as much as it does for classical computers. There's no getting around it. Okay. So the only way around that, then, would obviously be if Planck's constant were different. Yes. And who knows? I mean, when we get to that level of complexity, maybe we could generate a universe where that's true. Who knows?
00:51:16
Speaker
Wow, wow, interesting. And that's getting into post-singularity territory. Singularity being, of course, when computers become as intelligent as humans and start improving on themselves faster than humans can currently imagine. Exciting stuff, exciting stuff. Exciting for some people, scary for everybody, but a reality for us
Conclusion: Evolution and Future of Computing
00:51:38
Speaker
Computing has existed as long as structured human thought has existed, and computing mechanisms have evolved with our understanding of the way in which we manipulate our world. We've explored everything from tally marks to electronic pulses, from gears to wires, and beyond. As we evolve further as a species, so will computers, our faithful allies. I'm Jonathan Baca. And I'm Gabriel Hesh. And with us today, we had... Zach Bigger. And Zach Bigger, is there anything you'd like to plug?
00:52:05
Speaker
I think mathematics is really cool. I think it's very applicable as a tool to the real world, and I think people tend to be afraid of it. And I hope that this podcast kind of dims that fear a little bit and allows more people to jump in and try and use it for cool things.
00:52:23
Speaker
And for resources and to join our community, you could go to facebook.com slash breakingmathpodcast or to breakingmathpodcast.com. And of course, this is made possible through KUNM and Generation Listen. And this has been Breaking Math.