
78: Perpetual Notion (Entropy and Thermodynamics)

Breaking Math Podcast

As you listen to this episode, you'll be exerting mental effort, as well as maybe exerting effort doing other things. The energy allowing your neurons to continually charge and discharge, as well as exert mechanical energy in your muscles and chemical energy in places like your liver and kidneys, came from the food you ate. Specifically, it came from food you chewed, and then digested with acid and with the help of symbiotic bacteria. And even if that food you're eating is meat, you can trace its energy back to the sun and the formation of the earth. Much of this was established in the previous episode, but this time we're going to explore a fundamental property of all systems in which heat can be defined. All of these structures had a certain order to them; the cow that might have made your hamburger had all the same parts that you do: stomach, lips, teeth, and brain. The plants, such as the tomatoes and wheat, were also complex structures, complete with signaling mechanisms. As you chewed that food, you mixed it, and later, as the food digested, it became more and more disordered; that is to say, it became more and more "shuffled", so to speak, and at a certain point, it became so shuffled that you'd need all the original information to reconstruct it: reversing the flow of entropy would mean converting vomit back into the original food; you'd need all the pieces. The electrical energy bonding molecules together was thus released and made available to you. And, if you're cleaning your room while listening to this, you are creating order only at the cost of destroying order elsewhere, since you are using energy from the food you ate. Even in industrial agriculture, where 350 megajoules of human and machine energy can often yield 140 gigajoules of corn per acre, a ratio of more than 400:1, the order that the seeds seem to produce from nowhere is constructed from the energy of the chaotic explosion of a nearby star.
So why are the concepts of heat, energy, and disorder so closely linked? Is there a general law of disorder? And why does the second law mean you can't freeze eggs in a hot pan? All of this and more on this episode of Breaking Math.

Distributed under a CC BY-SA 4.0 License (https://creativecommons.org/licenses/by-sa/4.0/)

[Featuring: Sofia Baca, Millicent Oriana, Jacob Urban]

Transcript

Origins of Energy and Entropy

00:00:00
Speaker
As you listen to this episode, you'll be exerting mental effort as well as maybe exerting effort doing other things. The energy allowing your neurons to continually charge and discharge as well as exert mechanical energy in your muscles and chemical energy in places like your liver and kidneys through chain reactions came from the food that you ate. And even if that food you're eating is meat, you could trace this energy back to plants that could trace their energy back to the sun and the formation of the earth.
00:00:25
Speaker
This was established in the previous episode when we talked about conservation of energy, but this time we're going to explore a fundamental property of all systems in which heat can be defined, and properties related to disorder. All these structures had a certain order to them. The cow that might have made your hamburger had all the same parts that you do, stomach, lips, teeth, and brain.
00:00:41
Speaker
The plants, such as the tomatoes and wheat, were also complex structures, complete with signaling mechanisms. As you chewed that food, you mixed it, and as the food digested, it became more and more disordered. That is to say, it became more and more shuffled, so to speak, and at a certain point it became so shuffled that you would need all the original information to reconstruct it.
00:00:58
Speaker
Reversing the flow of entropy would mean converting vomit back into the original food, or ashes back into a book. You'd need all the pieces and information. The electrical energy bonding molecules together was thus released and made available to you. And if you're cleaning your room while listening to this, you're creating order only at the cost of destroying order elsewhere, since you're using energy from the food you ate.
00:01:17
Speaker
Even in industrial agriculture, with 350 megajoules of human and machine energy, often 140 gigajoules of corn can be derived per acre, which is an energy ratio of more than 400 to 1, the order that it seems to produce from nowhere, and the energy it produces from nowhere, is constructed itself from the energy of a chaotic explosion of a nearby star. So why are the concepts of heat, energy, and disorder so closely linked? Is there a general law of disorder? And why does the second law of thermodynamics mean you can't freeze eggs in a

Podcast Introductions and Promotions

00:01:44
Speaker
hot pan?
00:01:44
Speaker
All this and more on this episode of Breaking Math. Episode 78, Perpetual Notion. I'm Sofia, and this is Breaking Math. With me I have the host and co-host of another podcast that we have on this network, Nerd Forensics, and that's Millicent Oriana and Jacob Urban. Welcome, everybody.
00:02:10
Speaker
Hey, everyone. So before we continue with the episode, I want to talk to you a little bit about Magic Mind. Magic Mind are these little, um, shots that you take. They have a bunch of different stuff in them: they have cordyceps mushrooms, a little bit of caffeine, essential vitamins, and a bunch of other stuff you can see on their website. Paul Erdős, the mathematician, said that a mathematician is a machine for turning coffee into theorems, so "when I put a little nitrous in the tank" is what I've been thinking.
00:02:34
Speaker
But legitimately, I do use this stuff. It's about the price of an energy drink per serving, and you can get a 30-day or 60-day supply. But if you want to save a little bit of money on that and try it out, you can go to magicmind.co/breakingmath and get up to 56% off your subscription for the next 10 days with my code breakingmath20.

Deep Dive into Entropy and Thermodynamics

00:02:56
Speaker
So today we're going to talk about the concept of order and disorder and how it kind of relates to energy. So what do y'all know about entropy right now? Entropy is the depletion of energy, correct?
00:03:08
Speaker
It's related to that. What entropy is, and I'll just define it really quick for the rest of the episode, is, well, it has a lot of definitions, but essentially it's the amount of unusable energy in a system. So if you have two boxes of air, one hot and one cold, and you mix them together, the entropy will increase, because the mixture is going to be more disordered than the original two boxes; in the original two, the hot and the cold were at least separated.
00:03:35
Speaker
And if you do the math, it actually works out. Because the math is that you take every probability that a state of the system can have, you multiply each probability by the natural log of itself, which is just a logarithmic function, sum those up, and negate the result. And I'll go over that in a second. You get the total entropy. And it also turns out to be really closely related to temperature. So we're going to be talking about entropy and thermodynamics in this episode, as opposed to just energy.
00:04:04
Speaker
And what do you know specifically about thermodynamics? It's the laws of how energy works.
00:04:11
Speaker
It dictates things like how energy can't be created or destroyed, just redistributed. And it also talks about how you can't create things like perpetual motion, perpetual heat, perpetual cold, things like that. Oh, yeah, absolutely. Yeah. And there's the tradition of first, second, and third laws. And there's also a zeroth and a negative first law, because physicists love to do things like that, you know, instead of just making them the fourth and fifth laws. That's stupid.
00:04:38
Speaker
Welcome to physics. That's stupid. They know it's stupid and they're doing it on purpose. Oh, they are. I mean, I could go on. Feynman had a disagreement with the way that color charge is named, because there's red, blue, and green charge, but it has nothing to do with the actual colors red, blue, and green, which makes it hard to learn for people. And if you have the Internet, look at the history of how black holes were named.
00:05:09
Speaker
All right. So shall we recap last episode? Millie, what do you remember from last episode? We were discussing things like pushing a block of ice down, or pushing a block down something like a sled that was basically made of ice.
00:05:22
Speaker
Yeah, yeah, like the loop-de-loop. We talked about how high a loop-de-loop would have to be. We're talking like a perfect Hot Wheels track, basically. And we calculated, surprisingly, without having to use the shape of it or anything, except for the fact that it's circular, that the loop could only go up to four-fifths of the original height. We also talked about the principle of conservation of energy, which is, like, you know, energy cannot be created or destroyed.
00:05:47
Speaker
So, like, even if you throw, like, a block of clay at the wall and it sticks to the wall and it seems like the energy is lost, it's actually converted into sound energy and heat energy. Are you familiar with the concept? Yeah, yeah. The transference of energy. And it's specifically called the conservation of energy, because it says that in a closed system, energy is conserved.

Energy Transformations and Calculations

00:06:11
Speaker
Yeah, it can't be created or destroyed. It can only continue to exist in a different manner. Oh, yeah. And then we talked about kinetic energy, which, to recap, is half the mass times the velocity squared, which means if you go 20 miles per hour, you actually have four times the amount of energy of going 10 miles per hour, and at 80 miles per hour, you have 64 times the amount of energy of going 10 miles per hour.
00:06:35
Speaker
There's also potential energy, gravitational potential energy. So for example, if I hold something and drop it, how much energy can I derive from that? Let's see, we also talked about the work energy theorem, which is how basically work is force times distance. And the amount of work done is also the amount of change in energy in any process.
00:06:58
Speaker
So, for example, if I accelerate a car, you know, if I accelerate it to 10 miles per hour, it goes a certain distance. But if I accelerate it to 20, using the same amount of acceleration, it's going to go a lot further. So that's why the kinetic energy goes up as the square instead of directly proportionally. Yeah. No, that makes sense, yes. Yeah, absolutely.
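As an editor's aside, the squared scaling described here can be checked in a couple of lines (a sketch; the function name and the 1000 kg mass are just illustrative):

```python
# Kinetic energy: KE = (1/2) * m * v^2, so energy grows with the square of speed.
def kinetic_energy(mass, speed):
    return 0.5 * mass * speed ** 2

base = kinetic_energy(1000, 10)             # a 1000 kg car at 10 units of speed
print(kinetic_energy(1000, 20) / base)      # 4.0: double the speed, 4x the energy
print(kinetic_energy(1000, 80) / base)      # 64.0: 8x the speed, 64x the energy
```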
00:07:21
Speaker
Then we finally talked about mass-energy equivalence. We talked about how, if you converted all the explosions from all the nukes that the United States has set off into matter, it would make 9,000 kilograms of matter. For the USSR, it was 13,800 kilograms of matter. Roughly a sixth of that was actually the Tsar Bomba, which I thought would be interesting to bring up.
00:07:45
Speaker
To my audience: I've not been on for a while, and I apologize for that. I was dealing with life stuff, but everything is back on track. We're going to start releasing more regularly, and I'm very happy to have the hosts of Nerd Forensics on today. Give it a listen if you like nerd culture or, well, pretty much anything. Do you want to promote that real quick?
00:08:04
Speaker
Uh, yeah. So basically our show goes over everything that we possibly enjoy. Um, we talk about movies, talk about video games, comic books, uh, sports. Yeah. Like anything that catches our attention, we will talk about it. And when we talk about things, we usually try to find things that aren't very commonly talked about things that are kinda, you know, a little bit kooky, a little bit weird. Our show is dedicated to discovery.
00:08:30
Speaker
Yeah, like the last episode: we did a review of Phantom of the Paradise, which is a Brian De Palma movie that's absolutely incredible. And before that, we started our series where we're reading Scott Adams's book, God's Debris. Oh, yeah. If you want to get a kick out of it.
00:08:48
Speaker
If you want to hear us critique a terrible book, it's really good. Yeah, and it'll be as painful for you as it was for us, or hopefully it'll be comical. So let's start with the first law of thermodynamics.

Thermodynamics Laws Explained

00:09:00
Speaker
So before, we were talking about the energy of a closed system, right? Yes. So that could be like... So the heat energy of a system is the sum of all the kinetic energy of the molecules bouncing around. So if these molecules are bouncing around... So let's say U is what we're going to call the internal energy of a closed system.
00:09:17
Speaker
The change in U is equal to the energy supplied to the system as heat, Q, minus the work, W, done by the system on its surroundings. So the first law of thermodynamics says that the internal energy of a closed system changes as the heat supplied to the system minus the work the system does on its surroundings.
00:09:37
Speaker
There's also another way of looking at it, which comes from how the convention was originally set up. You could also use W as the amount of work that the surroundings do on the system, and then it would be delta U equals Q plus W. Does that make sense? Or do I need to? No, no. Yeah, I get it. That actually made perfect sense.
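A minimal sketch of the sign convention just described (the function name is ours; the 500 J and 200 J figures are made-up illustrations):

```python
# First law of thermodynamics in the sign convention described above:
#   delta U = Q - W, where W is the work the system does ON its surroundings.
# (In the other convention, W is work done on the system and delta U = Q + W.)
def delta_u(heat_in, work_by_system):
    return heat_in - work_by_system

# Illustrative numbers: 500 J of heat in, the system does 200 J of work.
print(delta_u(500, 200))  # 300: internal energy rises by 300 J
```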
00:09:55
Speaker
And essentially, it's the law of conservation of energy, but applied to heat and work. And the work is, for example, if you have a piston in your car, the heat is the gasoline and the explosion that you supply to the piston. And the work done is the amount of torque and, essentially, rotational energy through torque, the amount of rotational energy that it supplies to the car.
00:10:21
Speaker
to the wheels, which turns into linear motion as the friction from the wheels pushes on the ground, which rotates the earth slightly in space. Okay, interesting. So yeah, when you were a little kid and you thought that the car was staying still and the roads were moving past you, if you're one of the kids who thought that, it turns out that you're a little bit right.
00:10:45
Speaker
So the second law of thermodynamics actually defines entropy, and it says, well, partially defines it. It's based on the observation that energy always dissipates in real processes. And it's saying that the change of entropy over a change of time, so basically the rate at which entropy changes over time, is greater than zero. So basically, entropy is always increasing is what that statement says.
00:11:07
Speaker
But yeah, this is observed with engines, how they slow down, and even with processes like cleaning and paper burning. And the reason why is because you start noticing that as these processes happen, there's a disorder. And that's the first clue that entropy is related to disorder, basically. And also, for example, if we have a piston that's moved by a really hot source, if the source cools down over time because of the reservoir of just the environment that it's in,
00:11:37
Speaker
the entropy will increase as the piston slows down basically as it has less and less energy. So you can see entropy directly as the loss of energy to heat and other processes through gases, through friction with gases, etc.
00:11:57
Speaker
So we're going to do a short aside on information theory. We talked about Claude Shannon on a previous episode and he worked for AT&T and he had the fundamental problem, how many voices can we put on one wire? So he was grappling with the problem of how much information you could transfer over a wire and he came up with information theory. And so Jacob, what do you remember about logarithms?

Information Theory and Entropy

00:12:20
Speaker
Uh, it has to do with base numbers, right? Exponentials. Yeah, exponential bases, essentially. Yeah. And basically, what it's defined as is, like: log base 10 of a thousand is three, because 10 times 10 times 10, 10 to the third, is a thousand. And log base 10 of 10,000 is what now? How many zeros does it have?
00:12:46
Speaker
10,000 has five zeros. It doesn't have five. No, it's four. Four. I'm not great at this, but 10,000 has four zeros, which means 10 times 10 times 10 times 10, which means log base 10 of 10,000 is four. And log base 10 of a million is what, Millie? Five. Well, how many zeros does a million have? Six? Sorry. So yeah, six. Because five would be a hundred thousand. Yeah.
00:13:10
Speaker
Yeah, and how about a billion? Yeah, well, you can think about each thousand as a group of three zeros, which means that log base one thousand of a million is two, because a thousand times a thousand is a million. Yeah, but this is why I can't math. So do you want to guess what log base two of eight is? Log base two of eight would be... Two to the what is eight?
00:13:34
Speaker
To the what is eight... three. Yep, you got it. And 64? Log base two of 64 would be: two, four, eight, 16, 32, 64. That's one, two, three, four, five, six. Yeah, and so on. And, uh, what Claude Shannon uses is log base two. Um, and that's actually where we get the term bit from: binary digits. Um, it was first used in this essay. Okay.
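The logarithm examples from this exchange, checked numerically (a quick sketch using Python's standard math module):

```python
import math

# The examples from the conversation: log base 10 counts zeros,
# log base 2 counts doublings, and log base 1000 counts groups of three zeros.
print(math.log2(8))                # 3.0
print(math.log2(64))               # 6.0
print(math.log10(10_000))          # about 4.0
print(math.log(1_000_000, 1000))   # about 2.0: a thousand thousands is a million
```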
00:14:01
Speaker
And a bit is also a choice between on and off. But yeah, Claude Shannon, this is from his mathematical theory of communication. The logarithmic measure is more convenient for various reasons. One, it is practically more useful. Parameters of engineering importance such as time, bandwidth, number of relays, etc. tend to vary linearly with the logarithm of the number of possibilities.
00:14:25
Speaker
For example, adding one relay to a group doubles the amount of possible states of the relays. It adds 1 to the base 2 logarithm of this number. Doubling the time roughly squares the number of possible messages, or doubles the logarithm, etc. 2. It is nearer to our intuitive feeling as to the proper measure. This is closely related to 1, since we intuitively measure entities by linear comparison with common standards.
00:14:47
Speaker
One feels, for example, that two punched cards (or two pages in a book, etc., editor's note) should have twice the capacity of one for information storage, and two identical channels twice the capacity of one for transmitting information. Three, it is mathematically more suitable. Many of the limiting operations are simple in terms of the logarithm, but require clumsy restatement in terms of the number of possibilities.
00:15:12
Speaker
So that's how he defines entropy in his theory. So for example, the entropy of a letter in English is about 1.1 bits, because, so if it were one bit, for each letter you'd have about a 50-50 shot of guessing the next one, which turns out to be roughly true. So, like, for example, let me think of a simple sentence in my head and we'll guess letters. Millie, you wanna guess the first letter? D? Nope. One guess.
00:15:43
Speaker
We're gonna count the guesses for each letter and we'll see it actually goes down over time. I. Nope. C. Nope. A. Nope. U. Nope. O. Nope. M. Nope. L. Nope. K. Nope. U. I'm sorry I already said U. I. T. Yep. So 10 guesses for the first letter. T.
00:16:09
Speaker
So now the next letter is probably not Q, right? So there's no reason to guess Q. So what's the guess for the next letter? E. Nope. H. Yep. So that was two guesses.
00:16:20
Speaker
OK, but here's the thing is that linguistics is kind of a thing I like. So there are patterns to how the English language is built. Exactly. And the bit content of each letter is encoded in the patterns of the language. Yeah. Yeah. So just knowing that the first letter is a T, I automatically know that the next letter very strong likelihood of the English language to be an H.
00:16:47
Speaker
Oh yeah, and by the way, for this exercise, space is also a letter. So next character, wanna guess? E. Yep, so that would take one guess. Next one? Space. Yep, that took one guess. E. Nope. L. Nope. M. Nope. C. Yes. So that was four guesses. O. Nope. U. Nope.
00:17:18
Speaker
Nope. A. Yep. So it was four guesses. T. Yep. Space. Nope. A. Nope. H. Nope. Okay. O. Nope. A. Nope. T. Nope. L. Nope. W. Nope. You already guessed O.
00:17:47
Speaker
Yup. I know. I'll give you a lecture. I can't give you continue. Nope. Uh, next space. Uh, a R E space B L. Nope. R.
00:18:18
Speaker
Yep. Those two guesses, right? Oh.
00:18:25
Speaker
W. N. Space. N. O. W. Yeah. So the sentence was: the cats are brown now. The amount of guesses went down for each letter. So let me double-check this. Yeah. Except for the S, which really threw us off, because we were not picking up that there was an S. Yeah. Oh yeah. I didn't think of an S because there wasn't a space after cat.
00:18:53
Speaker
I thought you were trying to give me a hard one. Yeah. Like, I thought: catapult, cattle, catheter. Like, I was like, I hope it's not a catheter. That's why I guessed H hesitantly. But yeah, you're right. Oh, that is very fascinating, the way it went down.
00:19:11
Speaker
Well, and also we do know that quote. Yeah. Oh yeah. But that's actually one of the patterns; like, you see this via very familiar patterns all over the place, and that was just, you know, an example of how that information can be constrained. But you notice that the first two letters were harder than the rest, and it got easier over time. And as it became clear that it was a quotation, it became even easier, because the amount of possibilities went down even further.
00:19:39
Speaker
All right. So that took us an average of 2.13 guesses per letter, which, if you take the log base two of that, is 1.09, which actually means that we were very close to the actual average for guesses in English, 1.1 bits per letter. Oh wow. Now, English is such an odd language that there are points where we had 10 guesses.
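The back-of-the-envelope conversion from average guesses to bits works out as described (a sketch; the variable names are ours):

```python
import math

# Rough conversion from the guessing game: if each letter takes g guesses
# on average, treat that as roughly log2(g) bits per letter.
avg_guesses = 2.13  # the average measured during the game above
bits_per_letter = math.log2(avg_guesses)
print(round(bits_per_letter, 2))  # 1.09, close to the ~1.1 bits/letter for English
```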
00:20:01
Speaker
Oh, yeah. And also, you notice that the beginnings of words were also hard, because you don't know exactly what the next word is going to be. But, you know, like if I say "can I ask you a", the next word is probably "question". Maybe it's "query", but, you know, it probably starts with a Q. And just a funny quotation:
00:20:27
Speaker
So John von Neumann, one of the inventors of game theory, told Claude Shannon why he should borrow the term for information theory. As he told Myron Tribus, as I learned from xkcd.com slash 2318, he said: you should call it entropy. No one knows what entropy really is, so in a debate you'll always have the advantage.
00:20:48
Speaker
And what's interesting is it actually turned out to be very closely related to the concept of entropy.

Thermodynamic Laws in Depth

00:20:53
Speaker
It was kind of related; then it became very related. So, for example, if you have a system like a coin that's half heads, half tails: if you take negative one half times log base two of one half, minus one half times log base two of one half,
00:21:13
Speaker
you get the answer, which is one bit per flip. Versus if you had a coin that's weighted one quarter heads, three quarters tails, or vice versa: if you multiply that out into negative one quarter times log base two of one quarter, minus three quarters times log base two of three quarters, you get that it actually only generates about 0.81 bits. Interesting.
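Editor's sketch of the coin-flip entropy calculation above (the helper name is ours):

```python
import math

# Shannon entropy in bits: H = -sum(p * log2(p)) over all outcomes.
def entropy_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))              # 1.0 bit per flip for a fair coin
print(round(entropy_bits([0.25, 0.75]), 2))  # 0.81 bits for the weighted coin
```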
00:21:34
Speaker
So the third law of thermodynamics says that as the temperature of a system approaches absolute zero, the entropy approaches a constant. And remember that the entropy always increases, right? So that state can never actually be reached. Does that make sense? Yes. And the entropy at absolute zero is Boltzmann's constant times the natural logarithm of the number of ground states, where a ground state is a quantum mechanical concept that denotes a state a system can occupy at minimum energy.
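The residual-entropy formula just stated, S = k_B ln(number of ground states), can be sketched like this (the function name is ours):

```python
import math

# Residual entropy at absolute zero: S = k_B * ln(number of ground states).
K_B = 1.380649e-23  # Boltzmann's constant in J/K (exact by SI definition)

def residual_entropy(ground_states):
    return K_B * math.log(ground_states)

print(residual_entropy(1))      # 0.0: a unique ground state means zero entropy
print(residual_entropy(2) > 0)  # True: degenerate ground states leave some entropy
```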
00:22:02
Speaker
So now we're going to talk about the negative first and zeroth laws of thermodynamics. So the negative first law is that information is conserved. What that basically means is that from every state, there's only one possible state that it could go to. Because if there were two possible states that it could go to, there would be, depending on your perspective, an increase or a decrease in information.
00:22:24
Speaker
Interesting. That is fascinating. The consequence also is that if you know everything perfectly about a system, you could always derive the past. If we're taking into consideration a system like the Earth, you have to consider anything that might interact with the Earth, which expands backwards in time at the speed of light, outward from a person's position.
00:22:45
Speaker
And the zeroth law of thermodynamics says that if two thermodynamic systems are in thermal equilibrium with one another and are also separately in thermal equilibrium with the third system, then all three systems are in thermal equilibrium with one another. What that means, thermal equilibrium, is that you connect them without energy flowing either way. So what this says is like you have three hot gases and you connect them. If you connect the first two hot gases, then no heat will flow, right? Because they'll just stay the same temperature? Yeah. Yes. And we'll talk more about heat flow and temperature change in a second.
00:23:15
Speaker
but then if you have the second and third one and no heat flows between them, that means between the first one and the third one, no heat will flow between them if you connect to them in the same way. Yeah, interesting.
00:23:26
Speaker
And it also standardized the concept of temperature, as we'll see. One kind of interesting thing is that entropy is about a human perspective, essentially, in many ways. Because if you had perfect knowledge of the system, that actually means that the system has zero entropy with respect to your perspective.
00:23:47
Speaker
So it's very fascinating how entropy is about perspective. Okay, so let's say one of our outcomes essentially has probability zero. And if we take p log p, we're essentially taking zero times negative infinity. So to calculate that, you have to have them battle it out: use L'Hopital's rule. Rewriting p log p as log p over one over p, L'Hopital's rule turns the limit as p approaches zero into the limit of negative p, which is equal to zero. But yeah, that's just an aside.
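A quick numerical illustration of the limit being described, that p log p goes to zero as p does:

```python
import math

# As p approaches 0, p * log(p) approaches 0, even though log(p) blows up.
for p in [0.1, 0.01, 0.001, 1e-6]:
    print(p, p * math.log(p))  # the product shrinks toward zero
```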
00:24:19
Speaker
So let's say you have two systems connected, right? And so they're gases at a different temperature. We know that if the energy flows from one to the other, that means that the energy that one gas loses, the other gas will gain, right? Yes. So the change in gas A, so that's DEA here, is equal to negative DEB. Change in energy of A, change in energy of B.
00:24:43
Speaker
We also know that the average entropy has to increase of any closed system, and these two systems together become a new closed system, right? Yeah. Which means that the entropy of A plus entropy of B, the change of entropy has to be greater than zero. And what that means is, since we could define the change of energy as the temperature times the change in entropy, that means that the change of entropy for A is equal to the change of energy of A over the temperature that A is at.
00:25:09
Speaker
And we do the same for B, and when we substitute these together, we get DEA over TA plus DEB over TB is greater than zero. And you work out the math, and eventually you get DEA times (TB minus TA) is greater than zero. Therefore, that means that if TB is greater than TA, the change of energy in A, which is DEA, must be positive. Is that clear from here?
00:25:34
Speaker
Yeah, because if Tb is greater than Ta, then Tb minus Ta is greater than zero, which means, and since the change of energy times whatever is here has to be greater than zero, it can't be negative. No, yeah. B has to be greater than A now. Yeah, and if B is greater than A, then Dea has to be greater than zero.
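The heat-flow argument above can be sketched numerically (the temperatures and the one-unit energy transfer are illustrative):

```python
# Total entropy change when gas A gains energy dE from gas B:
#   dS = dE/Ta - dE/Tb   (since dEb = -dEa)
def total_entropy_change(de_a, t_a, t_b):
    return de_a / t_a - de_a / t_b

# Illustrative: A at 300 K, B at 400 K, one unit of energy transferred.
print(total_entropy_change(+1.0, 300, 400) > 0)  # True: cooler A gaining energy is allowed
print(total_entropy_change(-1.0, 300, 400) > 0)  # False: heat won't flow cold to hot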
00:25:55
Speaker
And so we're going to talk very briefly about Carnot engines. A Carnot engine is an idealized engine that basically produces no useful work beyond turning itself, essentially. Yeah. I've heard of those. Yeah. Did we talk about them in the last episode? I don't think the last episode, but I know you and I have discussed them in an episode.
00:26:14
Speaker
Yeah, that's right. Because the efficiency of a Carnot engine is the temperature of the hot reservoir minus the temperature of the cold reservoir divided by the temperature of the hot reservoir. So basically there's a proof, and I think we'll go over this in a problem episode, but the construction of the proof is very interesting.
00:26:33
Speaker
You assume that you have an engine more efficient than a Carnot engine, and you connect that engine to a Carnot engine and run it in reverse, and you get energy out of nothing. And since we can't get energy out of nothing because of the first and second laws, that means that nothing can be more efficient than a Carnot engine, but also useless.
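The Carnot efficiency formula mentioned above, as a sketch (the 500 K and 300 K reservoirs are made-up numbers):

```python
# Carnot efficiency: eta = (Th - Tc) / Th, with temperatures in kelvin.
def carnot_efficiency(t_hot, t_cold):
    return (t_hot - t_cold) / t_hot

print(carnot_efficiency(500, 300))  # 0.4: at best 40% of the heat becomes work
```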
00:26:51
Speaker
Yeah. That's amazing. It can't do anything but power itself. Oh yeah. So now we're going to talk quickly about Maxwell's demon.

Thought Experiments and Theoretical Concepts

00:27:02
Speaker
So let's say we have two reservoirs that are equal temperature, right? They're gases. And we have a little demon that when he sees a hot, a very fast molecule going through, he lets it go from right to left. And if he sees a really cold molecule coming through, he lets go from left to right, making one of the reservoirs colder and one of the reservoirs hotter.
00:27:21
Speaker
you can use those reservoirs to drive an engine. So it seemingly seems that you could reorder the gas and derive energy for nothing, right? Do you know what the solution to this is, why this is not the case? It turns out the amount of energy it would take to detect the molecules and operate a gate capable of shutting molecules out would be the energy that would drive the system. So you're back to the Carnot engine.
00:27:48
Speaker
Well, you're back to conservation of energy essentially. So now we're going to talk about the entropy of a black hole, which we talked about in our black hole series, but I think it's interesting. It turns out that the entropy of a black hole is equal to the Boltzmann constant times the area of the boundary of the black hole divided by 4 times the Planck length squared.
00:28:10
Speaker
So what that means is that the area in Planck-length-squared units is proportional to the entropy of a black hole, which is something that drove Hawking crazy, because he thought that a black hole could not have entropy. Yeah, that's something I learned today. Yeah, interesting.
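A sketch of the Bekenstein-Hawking formula just given (constant values are rounded, and the solar-mass example is ours):

```python
import math

# Bekenstein-Hawking entropy: S = k_B * A / (4 * l_p^2),
# where A is the horizon area and l_p is the Planck length.
K_B = 1.380649e-23   # Boltzmann's constant, J/K
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
L_P = 1.616e-35      # Planck length, m

def black_hole_entropy(mass_kg):
    radius = 2 * G * mass_kg / C ** 2    # Schwarzschild radius
    area = 4 * math.pi * radius ** 2     # horizon area
    return K_B * area / (4 * L_P ** 2)

# A solar-mass black hole (~2e30 kg) comes out on the order of 1e54 J/K.
print(black_hole_entropy(1.989e30))
```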
00:28:32
Speaker
And finally, one last property of entropy that I thought would be interesting to talk about: if the entropy of a closed system always increases, right, that means that the amount of usable energy goes down, right?

Entropy's Cosmic Implications

00:28:43
Speaker
Yes, yes. Oh, yes, from what we've been told. Yes. And if you consider the universe to be a closed system in itself, that means that the entropy will always increase in the universe, right? Yeah. Yes. That's what's meant by the heat death of the universe. Now, this
00:29:00
Speaker
has the potential to change if you look at the idea of the universe as the ultimate free lunch, or, like, you know, just some esoteric views. But what it basically means is, like, you know: all systems decay. All time is temporary. All things turn to dust.
00:29:22
Speaker
Entropy and thermodynamics are to energy what the study of mechanics is to Newtonian physics: you really can't appreciate how the latter works without the former. We've explored the strange property called entropy, how it relates to energy, and even how it applies to information theory. I'm Sofia Baca, and I was joined today by Millicent Oriana and Jacob Urban. And yeah, we talked about entropy. Um, any final thoughts, Jacob?

Final Thoughts and Outro

00:29:45
Speaker
Like, entropy is kind of a scary thought when you put it into the idea of, like, heat death and things of that nature, but again, it is absolutely essential for how the universe has to function. Oh yeah, absolutely. How about you, Millie, any last thoughts?
00:30:08
Speaker
Um, I mean, yeah, everything has to decay; stuff breaks down over time. And I mean, it would be pretty horrible if, like, things didn't. We'd be so crowded with just crap. There'd be bones everywhere and, like, yeah, horrible. No, no, it's a good thing that everything breaks down over time. All right. Um, thanks, y'all, for joining us on Breaking Math. Uh, want to plug your show?
00:30:35
Speaker
Yeah, so again, that's Nerd Forensics, a show about discovery, you know, anything in nerd culture that we think is awesome. And again, we try to do the, you know, out of the ordinary. We actually had an episode about a wrestler named New Jack, and our next two sports episodes we've been talking about are going to involve a basketball player who
00:31:00
Speaker
disappeared mysteriously, and a NASCAR driver who had no business being on a track. Well, stay tuned for those. Jacob, anything else you want to plug? No, no, just Nerd Forensics. Come give us a listen. Uh, sounds good. Uh, check out Nerd Forensics if you're into that kind of thing. I will say it's a more explicit podcast than this one, so definitely, you know, let that guide you. But, um,
00:31:29
Speaker
it's more of a comedy show. Yeah, yeah, it's much more comedy oriented. Absolutely. The editing on that show is completely different than the editing on this show; there's clips that are related to the things that y'all are talking about. Yeah, this is more just straight information. Yeah, this is more like an actual education program, where ours is more kind of like junk food. Yeah. Well, I mean, we are the show that has the intro that sounds like an intro and the outro that sounds like this.