PAUL: So it's a pleasure to introduce today's speaker, Lenny Susskind, who is here giving a series of free lectures this week as one of our Messenger Lecturers. The Messenger Lectures are described on the university website as one of the most important of Cornell's extracurricular activities. I'll save the full description of the series for the public lecture on Wednesday night at 7:30. That will be "Black Holes, the Conservation of Information, and the Holographic Principle."
I'll mention here just that we've had stunningly distinguished Messenger Lecturers in physics and astrophysics at Cornell, including R.A. Millikan in 1925, through Arthur Eddington in 1933, J. Robert Oppenheimer in 1945, Fred Hoyle in 1960, of course Richard Feynman in 1964, up to and including Steven Weinberg in 2007 and Nima Arkani-Hamed, the most recent in physics, in 2010.
I happened to host Weinberg in 2007. And when I contacted him to ask him about the Messenger Lectures, he told me, of course I know about the Messenger Lectures. I frequently referred to Feynman's 1964 lectures, which later became his book, The Character of Physical Law. When I happened to see Lenny about a year ago and asked him the same question, Lenny said, of course I know Feynman's 1964 Messenger Lectures. I was there.
[LAUGHTER]
So Lenny was a graduate student here. He obtained his PhD from Cornell in 1965 under adviser Peter Carruthers. He was at Yeshiva University, off and on, from then until 1979, when he moved to Stanford, where he's been since.
I'll again say it's a long story for Wednesday night. The short story is that since 2009, he's been director of the Stanford Institute for Theoretical Physics, and has authored two popular science books-- one about cosmology and the other about quantum mechanics and black holes. He's also interested in teaching popular courses. He wrote a series of courses about modern physics entitled The Theoretical Minimum, which can be found online, leading the MOOC movement in physics.
I first sort of met him while I was still an undergraduate. He was, in 1976, giving a series of Loeb lectures on coarse-grained quantum field theory. And I didn't attend those lectures. I didn't learn about lattice QCD until I came here as a graduate student.
But I clearly remember that just before the start of one of the lectures, this fellow who sort of looked like an Olympic marathon runner-- he was also taller than-- came out and asked me an urgent question, to which I actually knew the answer-- down the hall, first door to the left.
[LAUGHTER]
So as you'll see, Lenny is the unique person to be giving this 50-year follow-on to Feynman's lectures, satisfying four specific criteria. Number one, he was there.
[LAUGHTER]
Number two, he's extremely eminent.
[PHONE CHIMING]
Number three, he's still sufficiently compos mentis to pull it off.
[LAUGHTER]
LEONARD SUSSKIND: We'll see.
PAUL: And number four, has the chutzpah to be willing.
[LAUGHTER]
Let's welcome Lenny Susskind.
LEONARD SUSSKIND: OK.
[APPLAUSE]
Thank you very much, Paul.
PAUL: Turn it on down there, your mic.
LEONARD SUSSKIND: Oh, yes. Hello? Am I here? Ah, there we go. Can you hear me?
SPEAKER 1: No.
LEONARD SUSSKIND: No?
[LAUGHTER]
OK. Let me say one thing to begin with. I really feel sorry and sad for all the young physicists in the world who will never get to meet Dick Feynman. He was a close friend of mine. I think the greatest inspiration that I ever had in my life was listening to the 1964 Messenger Lectures.
I never looked at them again. I couldn't. I said, once is all you ever want to do this particular thing. In fact, I always felt the same thing with Dick. I never wanted to go back over the things I did with him. Once was enough. Once was so unique and so exceptional that to see them again or to do them again just seemed something I didn't want to do.
So I haven't looked at the Feynman lectures again. But I do remember-- now, I could be wrong about where I heard Feynman say this. I think if my memory serves me right, it was in the first of the Messenger Lectures. I could be wrong. It could have been another lecture somewheres.
But as I remember, Feynman once described theoretical physics as follows. First, you get an intuitive idea of how something works, how to explain something. Then you find equations to quantitatively express your idea. From the equations, you make calculations and predict something new-- a new number or a new relationship.
After that, somebody else-- usually somebody else-- goes out and does an experiment to test the prediction. And if enough predictions turn out to be right, then you've got a theory. If the idea covers enough different cases, the theory becomes a principle.
This, of course, is correct. But I think it's not the whole story. There are other ways that theoretical physics progresses, or physics in general progresses-- I'm specifically interested in theoretical physics. One I call the Dirac way. Dirac said, look for beautiful equations. If an equation is beautiful enough, it will be right.
[LAUGHTER]
Well, maybe. It didn't work for Dirac. Then there's the Einstein way. Sometimes principles, these things we call principles, collide. When two things, both of which are so deeply rooted that we cannot conceive of either of them being false, nevertheless when taken together seem to lead to a contradiction, then we have a conflict of principles.
It's when such conflicts are resolved that physics makes the greatest conceptual advances. My lectures are about three related conflicts of principle. I think it's fair to say that all have powerfully changed physics, even though not one of them has been completely resolved.
Today, I will discuss Boltzmann's struggle with the second law of thermodynamics. On the one hand, entropy always increases. On the other hand, Newton's laws are reversible. We will come to what that means. The outcome of the struggle was profound.
Entropy is hidden information. Boltzmann's insight deeply affected every area of science. But in the end, it did not solve the problem that Boltzmann originally set out to conquer. Why is there an arrow of time? Why does time go one way? What's different about the future than the past?
I'm going to explain how modern developments in cosmology make this puzzle even more puzzling and how they also suggest a solution. But be prepared to be very skeptical. We'll be driven to extremely remote ranges of time and space way beyond anything we can hope to directly access.
Nevertheless, the arrow of time is a fact of nature and deserves an explanation. So let me begin. I will just comment that I think it's an absolute disgrace that a beautiful, wonderful room like this in which physics can be presented does not have a permanent blackboard.
[LAUGHTER]
Had I known that when Paul asked me-- I asked him explicitly, Paul, is there a blackboard in the room? And Paul said, of course there's a blackboard in the room. He was lying to me.
[LAUGHTER]
I don't know if I would have turned down the opportunity to give these lectures. I don't think I would have. I don't think I would have. But nevertheless, I have to say I'm deeply saddened and disappointed by the lack of a blackboard in the lecture hall at Rockefeller Hall.
[LAUGHTER]
I'm going to use this monstrosity of a machine. I don't like using PowerPoint. I will use it today. Oh, there's another interesting fact. Paul said, give lecture 1, 2, and 3 in that order-- first the first one, then the second one, the third one.
And I assumed that this meant that the first lecture should be the first logically ordered one and that it should be for the general public, a broader thing that introduced some concepts at not exactly a lower level but with fewer equations and fewer technical concepts. And so I wrote the first lecture. And the first lecture was about Boltzmann and so forth.
I then found out just a couple of days ago that the first lecture is the physics department colloquium, and the second lecture was the general lecture. Sorry, buddies. You are getting the first lecture, because it's the first lecture. I can't help it. So if you find this a little bit trivial, then I invite you to--
PAUL: The lecture.
LEONARD SUSSKIND: What's that?
PAUL: To the second lecture.
LEONARD SUSSKIND: To the second lecture. All right, good. You said it. I didn't know what I was going to say next. All right. The lectures, the entire series of lectures, had some weird title that Paul made up, had nothing to do with me.
The correct title should have been "Conflicts of Principle." I don't know if it's "Conflicts of Principle" or "Conflict of Principles," but one or the other. And basically it's three related conflicts of principle.
OK, let me start with an experiment. This is a real genuine experiment. I did this in my laboratory. I took pictures of it. And I have 232 slides on here, incidentally, and you'll see. But most of them are little movies, little film clips. The film clips are homemade film clips. I make them myself. And we'll see how they work. All right.
We start with a room. The room is the purple area. It's a sealed room. Nothing can get into it or out of it. And up in the corner of the room is a little bit of gas-- maybe a lot of molecules-- all stuffed into the corner of the room.
Now, I want you to tell me which of the following two little movies makes sense as physics and which does not. OK, so here we go. That's one. Here's the other one. Should I do them again? No. That looked pretty good. I like that. That looks very uncomfortable. I don't believe that will ever happen. Most of you don't believe that the air in the room will also rush into the corner.
So what's going on? Well, we usually blame this on the second law of thermodynamics. The second law of thermodynamics says that entropy always increases. And if you know just a little bit about entropy, you will know that a room filled uniformly with gas in thermal equilibrium has a high entropy. And when all the molecules are stuffed off into the corner, it has low entropy. Entropy increases, end of story. The second law of thermodynamics tells us which film is correct.
The clash of principles, the conflict of principles. The second law says that entropy always increases. Newton's laws of motion-- and for practical purposes today, I don't want to introduce quantum mechanics. But quantum mechanics also says the same thing. It says that the laws of physics, in particular Newton's laws, are reversible.
Let's remind ourselves what that means. What it means, in the context of a very simple example-- oh, here it is-- the context of a simple example is that, if a ball can roll-- a frictionless ball on a frictionless surface-- can roll from one point to another in a certain amount of time, then Newton's equations say that there's another solution in which it can roll back to the first position.
Therefore there cannot be a quantity which always increases. If it increased going this way, it would decrease going that way. They can't be-- Boltzmann's effort to find the quantity in mechanics, which always increases, was doomed from the beginning. And people told him that, incidentally. He struggled with it.
OK, so that raises the question then, what is entropy? What is it that seems always to increase, despite what Newton might say about it? What's the meaning of the second law? And why is there an arrow of time? OK.
Let's begin with what entropy is. So let's see. Yeah. Entropy, according to Boltzmann in the end, when he finally understood it, is hidden information. What does it mean that information is hidden? Well, in the practical example that he was thinking about, a gas, information is hidden because it's contained in a collection of degrees of freedom which are too small to see and too numerous to keep track of. When information is contained in a set of degrees of freedom too numerous to keep track of and too small to see, that information is called entropy.
Let's take an example. Here's an example of 64 coins. Now, neglect the fact, ignore the fact, that they're on a lattice. That's not the important point here. I had to draw them some way. So here's the 64 coins. They might just be in a bag. They might even be invisible to you. But there they are-- 64 coins. And each coin has two faces. One face is red. One face is blue.
Here's a special configuration of the coins, in which they're all showing red. How many such configurations are there? Well, the number is right over here. There's one such configuration. It's rather special. Incidentally, if you saw it, even if you didn't know where the coins were, if you saw those coins, you'd recognize it instantly. Easily recognized.
Here's a configuration with one blue coin. It's less unique. There are 64 of them. Here's another one. And there are 64 of them altogether. You might not so quickly recognize which was which if it was flashed in front of your eyes. 64 configurations with one flipped coin.
How many with two flipped coins? Well, you can work it out. It's 64 times 63 over 2, and the answer is 2,016. The numbers are going up fast, rather fast. Here's another one. Three coins. 41,664 configurations with three coins. I suspect, if I flash that at you pretty fast, you would not be able to tell me afterwards which three coins were flipped, but maybe you could.
Here's four. The number is the better part of a million. 650-- whatever. Some large number-- that's how many configurations. Notice they're going up really fast. How about if half the coins are red and half the coins are blue? This is a kind of generic situation. The numbers are vast. 2 times 10 to the 18th configurations.
Supposing you were just to pick a configuration randomly, what would you be likely to pick? You would be very unlikely to pick one. You would be extremely unlikely to pick all reds. Picking them at random, you'd probably pick something like this just because there are so many configurations like this.
The technical definition of entropy for an ensemble-- a collection of states, a collection of discrete states of a system, like the system of coins-- is this: the entropy of the particular ensemble-- it's a technical word, ensemble, but it just means a collection of states that are somehow recognizable in some way, recognizable or not-- is the logarithm of the number of configurations that satisfy a certain criterion.
OK. For example, the ensemble with no flipped coins. There's only one state. Log 1 is 0. It has no entropy. And as you go down, the entropies rise, not as fast as these numbers on the left. But the entropy goes up and eventually reaches a maximum.
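To make the counting concrete, here is a short Python sketch (not part of the lecture itself): for each number of blue coins it computes the binomial count of configurations and the entropy, defined as the logarithm of that count.

```python
from math import comb, log

N = 64  # number of coins, as in the example

# Number of configurations with exactly k blue (flipped) coins,
# and the entropy S = log(number of configurations).
for k in (0, 1, 2, 3, 4, 32):
    count = comb(N, k)   # binomial coefficient "64 choose k"
    S = log(count)       # natural logarithm; S = 0 for the all-red state
    print(k, count, round(S, 2))

# k = 2 gives 2,016; k = 3 gives 41,664; k = 4 gives 635,376;
# k = 32 gives about 1.8 x 10^18 -- the numbers quoted above.
```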
All right. Now, let's talk about something different. Let's talk about equations of motion. What is an equation of motion? An equation of motion is a rule for updating the state. Given the state, the equation of motion tells you what the state is in the next instant. Think of it that way. I'm imagining time is discrete, but that's not important.
These six boxes represent six different states. How many states are there of the 64 coins? 2 to the 64th, too many to draw. I can't draw them all. So I've just drawn six. Why? Six is a nice number. OK, those are six states, but think of them as representing all the possible states of some system, not the ensembles but the individual states.
What is an equation of motion? An equation of motion is simply a rule which tells you, given the state, what's the next state? 1 goes to 2. 2 goes to 5. 5 goes to 3. 3 goes to 6. 6 goes to 4. 4 goes back to 1. And what happens next? 1 goes to 2, 2 goes to 5, and so forth, and you cycle through these states.
This is a typical example of a simple discrete equation of motion. It tells you how the system moves forward. In classical physics, it's completely deterministic. In quantum physics, a similar thing is true. It's called unitarity, but let's just take the classical version of it.
Here's a law of physics, or an equation of motion, which is also deterministic. If you start with 1, you go to 2. If you start with 6, you go to 2. If you start with 5, you go to 2. That's deterministic. I can tell you exactly where you'll be after a certain number of steps.
Something's wrong with it. What's wrong with it? What's wrong with it is that it's not reversible. If you know you're at 2, you can't tell where you came from. You can't retrodict. If you know you're at 1, you didn't come from anywhere. You couldn't have come from anywhere. So this is an example of an irreversible equation of motion.
The laws of Newton and the laws of quantum mechanics and the laws of every physical system that we know, at the very bottom of its description, are reversible-- all of them. And so equations like this are forbidden. Do we know why? Well, in some way, we do know why. But I'm going to ask you to trust me that this is one of the things that the laws of physics, as we know them, require. What's the rule? The rule is that every box has one arrow in and one arrow out-- an arrow to tell you where you were and an arrow to tell you where you'll be next.
OK. That's the sense in which information in classical physics and in quantum physics, if you work out the quantum analog of it, that information is never lost. You always know where you came from. You always know where you will go to. And there is no contraction into a smaller number of states so that you lose information. Incidentally, in classical physics, it's called Liouville's theorem. In quantum mechanics, it's called unitarity.
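The six-box laws can be written out explicitly. In the Python sketch below (an illustration; only the arrows actually named in the lecture are taken from it, and the rest of the irreversible map is filled in arbitrarily), the reversible law is the six-state cycle, and the "one arrow in, one arrow out" rule is precisely the statement that the map is an invertible permutation.

```python
# Reversible law: the cycle 1 -> 2 -> 5 -> 3 -> 6 -> 4 -> 1.
# Every state has exactly one arrow in and one arrow out.
reversible = {1: 2, 2: 5, 5: 3, 3: 6, 6: 4, 4: 1}

# Irreversible law: 1, 6, and 5 all go to 2, as described above.
# The arrows out of 2, 3, and 4 are arbitrary fill-ins for illustration.
irreversible = {1: 2, 6: 2, 5: 2, 2: 3, 3: 4, 4: 3}

def is_reversible(law):
    # Reversible means no two states share a successor,
    # so the map can be run backward without ambiguity.
    return len(set(law.values())) == len(law)

print(is_reversible(reversible))    # True
print(is_reversible(irreversible))  # False: at state 2 you cannot retrodict

# The reversible law simply cycles through all six states forever.
state = 1
for _ in range(6):
    state = reversible[state]
print(state)                        # back at state 1 after six steps
```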
OK. Let's now ask what an evolution would look like. Now, this is a made-up evolution. I'm going to assume that, as you go from one step to the next, not too many coins are flipped simultaneously. One, two, or three coins are flipped. All right, here it is. Now, keep track of it, because it's important to keep track of the sequence.
OK, I'll go back over it too. Let's do it a little more slowly. We started with the lowest entropy state. Why? I just decided to start with the lowest entropy state to see what would happen. Not too many coins get flipped simultaneously, so the next step might be one blue coin. Which blue coin? Don't know. But the point is the equation of motion tells me which blue coin.
What's going to be next? Well, it can't go back to all red. Why not? Because it has to cycle through all the states. The assumption is that the system cycles through all the states. It can't just go back and forth from all red to one blue coin and back to all red again, so it'll go to something else. Oh, let's go back a minute.
It might go to a single blue coin someplace else. But there are many, many fewer of those states than there are states with, let's say, two, three, or four flipped coins. Much more likely, just for that reason, that will go perhaps to four flipped coins. Remember, there's about a half a million-- more than a half a million configurations of that type.
From here, it might go back to one coin. It might go back to three coins. But much more likely that it will go to more coins for the simple reason that there are more configurations like that. And thus it goes. Once it gets to this point, where the entropy is maximum, where it's almost half and half, give or take, it will simply rattle around in similar states for a long, long time, for a very long time, much longer than we have to run through this.
But every so often, a fluctuation will happen-- a Boltzmann fluctuation. A rare and unusual configuration will arise. Here's one with only three blue coins. What's the next one going to be? Maybe it's two blue coins. Now, there were just so many more with four blue coins that it went to four blue coins. And so it goes.
Whew. It's getting interesting. That's an interesting-- whoa, ah. Didn't go anywheres. Oh. Maybe we'll get back to the beginning. Not likely. And so we go and so we go and so we go. Ooh. Whoops. There's an interesting one. How do we get there? It has a star in it. Not a star, not an astronomical star. But it has a little star in it-- one star. That's a really interesting configuration. How did it get there?
Well, it got there randomly. It just accidentally got there. Remember, you have to cycle through everything. Not so surprising that you got something really rare and interesting. But it didn't get there for an interesting reason. It didn't get there for a reason which you could explain by some inevitable law of physics that says that stars will emerge in some very, very specific and organized way.
This example of a star that I just described is an example of a recurrence. You can call it a recurrence, more or less a Poincaré recurrence. You can call it a Boltzmann fluctuation. You can call it a freak unexplained happening.
They are a general property of finite systems. Because the system is finite, it only has a finite number of states to cycle through. And so weird things will happen. Weird things will happen over and over again. And on top of that, they will vastly outnumber the normal histories.
The normal histories are the ones that started with extremely low entropy. I will come to why I say that in a moment. But the normal histories, the ones that we normally understand or think we understand, are histories that start with very, very low entropy.
There are many more ways to create that star than to start with low entropy. Just many, many more random statistical ways to make it happen. Roughly speaking, in the real universe, what we're talking about is a very, very freak phenomenon where a bunch of random dust molecules or gas molecules might simultaneously come together and make a galaxy.
That sounds almost completely impossible, but it's not completely impossible. And what this tells us is that, in a finite system-- a system that just cycles around itself-- it is much, much more likely than any other way of making that star or that galaxy. We'll go through a few more examples to get the idea.
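A toy version of the coin evolution is easy to simulate. The sketch below uses random flips of one to three coins per step as a stand-in for the deterministic equation of motion in the slides (a deliberate simplification): starting from all red, the entropy climbs quickly toward its maximum near 32 blue coins and then rattles around it, with occasional small dips playing the role of fluctuations; a return all the way to the all-red state is possible in principle but essentially never seen.

```python
import random
from math import comb, log

N = 64
coins = [0] * N   # 0 = red, 1 = blue; start in the lowest-entropy state

def entropy(coins):
    # Entropy of the ensemble with this many blue coins: log of the count.
    return log(comb(N, sum(coins)))

random.seed(0)
for step in range(1, 201):
    # Flip one, two, or three randomly chosen coins per step --
    # a random stand-in for the deterministic rule in the lecture.
    for i in random.sample(range(N), random.randint(1, 3)):
        coins[i] ^= 1
    if step % 25 == 0:
        print(step, sum(coins), round(entropy(coins), 2))

# Typical run: the blue count drifts toward ~32 and the entropy toward
# its maximum of about 42 nats, where it then fluctuates.
```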
Let's explore the idea further. And suppose for a moment that the universe is such a finite system-- finite system, finite number of degrees of freedom, number one and number two contained in a finite volume. If it's contained in a finite volume and it has a finite energy, we describe it-- well, we can describe it in any case by phase space.
The space of states is not a series of six boxes; the state is a point in the phase space of some number of molecules-- for example, 10 to the 80th molecules, or whatever the right number is. And so it's described by a phase space, which is a plot of position and momentum, or velocity.
Now, to make it a finite system, let's assume that the positions of the molecules are bounded, they can only go between here and here, and that there's a brick wall here that prevents them from going any further. Furthermore, let's suppose that the energy of the system is finite. In that case, the momentum of the particles cannot exceed a certain amount. And so the whole thing is contained in a box.
Now, there's a region of phase space that's my target. By my target, I mean the thing I want to explain, how we got there. This is some region of phase space which describes the world, let's say, as we know it now. It's got a certain abundance of helium, a certain abundance of hydrogen, a certain abundance of lithium. It's got planets. It's got stars. It's got you and me. And that's it.
It's not unique. The state is not unique. There are many, many states that look like that. It's an ensemble, and it's an ensemble with some pretty significant entropy-- an entropy of about 10 to the 100 in some units or other. But still, it's a very small region of the whole phase space. Most states are not like that.
So the question is then, how do we get there? And the standard idea of a theory for an explanation-- forget that we call that explanation inflationary cosmology right now. We'll come back to that in a little while. But the basic idea is that there's some special family of states-- not a unique state but some special family of states-- with a very low entropy, very few of them as a starting point.
But if we follow the phase trajectory, the trajectory of the system, from any point within that little region there, it always evolves into here. That's called an explanation of our world, if we can find such a thing. If we can find a small region of very low entropy such that almost every point within that region evolves into the region where we are, then we say we have an explanation. The explanation is, for reasons unknown, we started out in here. And for reasons known, we evolved to here. Inflationary cosmology might be one such candidate description.
OK. But once we get into this-- let's call this the zone of life just to give it a name. Once we get into the zone of life, we don't stop. The world goes on. We eventually will exit the zone of life, sadly. We will eventually exit the zone of life. Our universe will expand, dilute. We will eventually get out of the configuration space where we can exist, and we will wind up outside it.
Is that the end? No, that's not the end. The phase point continues to wander around. And for long, long periods of time, it evolves around in what's roughly thermal equilibrium in this world in a box. It evolves around in thermal equilibrium, which is like the series of coins which were in this featureless configuration, about half red and half blue. And it stays there for a very long time, not in any particular state but wandering around through the phase space-- the wandering phase space point. Featureless, nothing interesting happening.
But notice over here, the phase space point managed to wander back into the zone of life. But it didn't wander into the zone of life from the trajectory that started with this low entropy state. It wandered in from someplace else.
The zone of life is much bigger than the starting point here. How do we know that? We know that because entropy increases. The reason entropy increases is because we went from a small region to a bigger region. We wander around and then accidentally-- practically accidentally-- in a freak event, we wander into here.
The people who live in here-- look around them. And they see a world which is not explainable in this way that we're familiar with explaining phenomena. In fact, my guess is those people in there would not be too surprised. They would say, what would be overwhelmingly surprising would be to start from here. There's almost no states there. We know what's happening. Our phase point is just wandering around. And every once in a while, it wanders into the zone of life.
If we wait even longer, we will find out that the phase point wanders in and out and in and out of the region of life. Very, very rarely, much more rarely, does it actually evolve back through this point and wind up behaving in the way that we normally consider explainable.
What I'm telling you is the thing that we normally consider explainable is the most outrageously unlikely of all possible ways of getting where we are. Why? Because it starts with a very, very low entropy state. And low entropy states are very rare.
So in this world in a box, what one should expect is that the typical example of life, the typical example of a world which would support life, would be what we would call a freak world. We don't live in a freak world. We know that inflationary cosmology, tracing back to some very low entropy starting point, works.
So there's something wrong with this picture of-- let me go back. There's something wrong. Yeah. There's something wrong with this picture of living in a box. There are ways out. Maybe we don't live in a finite box. That would get us out of it, perhaps. The problem is that modern cosmology tells us that we do live in a finite box, in a finite and, in a certain sense, ungrowing box. If the box grows, then you can get out of this problem.
OK. All evidence that we have about cosmology-- experimental, observational-- points to the idea that we do live in a finite box. The evidence that I'm talking about is the existence of a cosmological constant, sometimes called vacuum energy, sometimes called accelerated expansion, and most often called dark energy. They're all the same thing.
So let me go through in a few minutes what the arguments tell you, what the equations tell you. A good fraction of you have seen this equation before. I'm curious, what percentage of people have seen this equation? Probably about a third, I would say, something like that. So let me tell you what this equation is.
This is called a Friedmann-Robertson-Walker equation. It's the equation for the expansion of the universe. a represents the radius of the universe at any time. When physicists put a dot on top of a variable, it means the time derivative of it.
And the left-hand side is called the rate of expansion. The rate of expansion-- a dot is the time derivative of the radius. The universe is expanding. a dot over a is finite, and this is called the rate of expansion. And the equation says that the rate of expansion is proportional to--
Forget this number here. This is a number. We'll set it equal to 1. In some units or other, it really is equal to 1. Some Planck units or some sort of units, it's equal to 1. This object is the energy density, or the mass density, in the universe. E equals mc squared-- same thing.
And what you can say about this energy density is, at least the usual sources of energy, the usual types of energy density-- particles, photons, all the usual things. Let's call the particles matter. Those have an energy density which decreases as the universe gets big. Why? Because it dilutes. Energy density dilutes as you make the universe bigger. Same thing with the radiation. Radiation dilutes even faster for a technical reason that I won't bother you with.
But there is one form of energy. It's often called very mysterious. I think it's not so mysterious. It's called a cosmological constant, or vacuum energy. It exists in the world. It exists in our equations. It exists in quantum field theory. It must exist, and it does exist. What's queer about it is how little of it exists.
But we're not going to get into that today. We're just going to say there is a vacuum energy. And the thing about the vacuum energy is it doesn't dilute. You take a box of a certain amount of vacuum energy, and you grow the box. What happens is the density of vacuum energy stays the same.
Now, you can ask where the energy came from. That's another issue. It is understood. It's not mysterious. But there is a vacuum energy, and we know that it really exists. It's been measured astronomically. Here's the number. It's even bigger. It's very small, but it's actually bigger than the density of ordinary material throughout space.
That tells us that the Friedmann-Robertson-Walker equation just has a right-hand side, which is just a number. It doesn't change. You can solve this equation, a dot over a equals the square root of lambda. And the solution is an exponentially growing universe.
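Written out, the step being described is just this, with the constant on the right-hand side absorbed into the definition of lambda (the "units where it equals 1" mentioned above):

$$
\left(\frac{\dot a}{a}\right)^{2} = \rho = \Lambda = \text{constant}
\quad\Longrightarrow\quad
\frac{\dot a}{a} = \sqrt{\Lambda}
\quad\Longrightarrow\quad
a(t) = a(0)\, e^{\sqrt{\Lambda}\, t} .
$$

The matter and radiation contributions, which dilute as a grows, drop out at late times; only the vacuum term survives, so the expansion rate settles to the constant square root of lambda and the scale factor grows exponentially.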
Now, that doesn't look like a finite box of a certain size. That looks like a growing box. And with a growing box, you can get out of this problem of the Boltzmann fluctuations. But this is misleading. Let me show you, let me take you through the arguments about why this is misleading and what the equations really tell us.
All right. First of all, the universe, the entire universe described by this theory is expanding exponentially. That means every point is moving away from every other point-- moving away according to the Hubble law. And the thing that's exceptional about a cosmological constant is that the Hubble number here, the relation between velocity and distance-- this is velocity.
Distance from you. Here you are right here. I'm also pretty close to there. And we're looking out at some distant thing. It's moving away from us, because the universe is expanding. And how fast is it moving out? A velocity proportional to the distance away. The coefficient here is called the Hubble constant. And the special thing about vacuum energy is that the Hubble constant is truly constant. It doesn't change with time. That's special.
Now, this is a strange equation. If you think about it, what it says is, if you go far enough away, where D times the square root of lambda is equal to the speed of light, there will be things moving away from you with the speed of light.
Moreover, if you even go further out, they're moving faster than the speed of light. Trust me, this is OK. This is allowed. Things moving away faster than the speed of light, but you can't see them. When they send signals back to you, the signals are also swept up and travel away from you.
And so the result is that there is a certain radius called the horizon, the radius of the horizon. And for all practical purposes, everything that you can ever see, everything that you can ever know about, is within that radius.
That radius, or that shell, out at that distance is called your horizon. And it's very much like a black hole horizon, except looked at not from the inside of the black hole. But it's as if there was a black hole on the outside that, when things cross over into this Neverland out beyond here, you'll never see them again.
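For a sense of scale, the horizon radius being described is c divided by the Hubble constant, the distance at which the recession speed reaches the speed of light. A rough Python estimate, using today's measured Hubble constant as a stand-in for the pure dark-energy value (an approximation, since dark energy is only about 70 percent of today's density):

```python
c = 2.998e8                  # speed of light, m/s
Mpc = 3.086e22               # one megaparsec, in meters
H0 = 68.0 * 1e3 / Mpc        # Hubble constant ~68 km/s/Mpc, converted to 1/s

R_horizon = c / H0           # distance at which things recede at the speed of light
light_year = 9.461e15        # one light year, in meters

print(R_horizon)                     # about 1.4e26 meters
print(R_horizon / light_year / 1e9)  # roughly 14 billion light years
```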
Well, that's not quite the way either black holes or cosmological horizons really work. The way they really work is that, if you with your telescope were following something moving out, you would not see it cross the horizon.
Why not? You can never see anything cross the horizon. First of all, you can't see anything beyond the horizon. You actually can never see anything cross the horizon. What you do see, or what your mathematics would tell you, is that, as time goes on, the particles, the dust, the galaxies asymptotically-- within your horizon here-- asymptotically approach the horizon, moving slower and slower and slower, freezing at the horizon. That's what the equations of general relativity say. Very much like a black hole.
So here is either what you would see through a telescope or what the mathematics would tell you. There's all your galaxies. How did they form? They formed, perhaps, by starting with some low entropy state and creating galaxies in the standard way.
And they start to move out. They start to move out. And all the particles eventually arrive near the boundary, near the horizon, taking longer and longer and longer to get there. Let me draw a picture of one way of thinking about this.
Think of all the particles in the universe being contained within this horizon. So they're within this region here. And there's a potential function-- a potential function which looks like this. You're at the center here. You're exactly at the center. So you'll neither fall off this way or this way.
But something which is over here will start picking up steam and eventually fall down to here. Something over here will fall down to here. So if this region was filled up with particles, they will all fall down to here. They will all fall down to here. That's what's going on here. And from your perspective, within your telescope, you will see them simply freeze and congregate or pile up, pile up at the horizon. That's the mathematical description, the general relativity, tells us you would see.
Now, if we add some quantum mechanics. Now, we have to add some quantum mechanics to go next. If we add some quantum mechanics, the main result of quantum mechanics is that a horizon like this cannot tolerate an infinite number of degrees of freedom or an infinite number of particles all piled up like that.
And in fact, the pileup has a finite thickness. The finite thickness is proportional to H-bar. Things cannot get that close. Rather, the pile-up piles up at a distance which is comparable to a length called the Planck length.
These particles pile up here. And because they've fallen down, they have some kinetic energy. And therefore, they're hot. This system has temperature. That's a well understood feature of spaces of this type. They're called de Sitter spaces. De Sitter spaces have temperature. And all the thermal junk is out near the edge. That's your description of the world.
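The temperature of such a horizon has a simple closed form, the Gibbons-Hawking temperature T = hbar H / (2 pi k_B). Plugging in today's Hubble rate (again only a rough stand-in value) gives an absurdly small but nonzero number:

```python
from math import pi

hbar = 1.055e-34   # reduced Planck constant, J s
k_B  = 1.381e-23   # Boltzmann constant, J/K
H    = 2.2e-18     # Hubble rate, 1/s (roughly today's value)

T_dS = hbar * H / (2 * pi * k_B)   # Gibbons-Hawking temperature of the horizon
print(T_dS)                        # about 2.7e-30 kelvin
```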
Now, of course every once in a while, because these particles are hot, once in a while, one of them will get some extra energy and jump back up to the top. One particle-- who cares about one particle? It might jump up to the top. Two particles might actually make it to the top-- even rarer.
So here's what we would expect to see as we follow the system. Particles, a few particles, jump up. They fall back down. They jump up. They fall back down. They jump up. You can see why I have 180 slides. I think it's 230. This keeps happening over and over.
But every once in a while-- and accept that when I say "every once in a while," I mean a long while. Every once in a while, the mathematics in any case tells us that there will be an exceptionally large number of particles that jump out of the horizon and congregate somewheres.
A dust cloud is formed. A dust cloud is formed by no mechanism that you would recognize as a standard astronomical mechanism-- by random fluctuation. That random fluctuation might collapse to form a black hole. That's a rather likely thing for it to do. And this process is well understood. It's been studied for many years-- the creation of black holes out of de Sitter space.
But before it forms a black hole, maybe it will not form a black hole. It might form a galaxy. The dust cloud. The dust cloud may then evolve into a galaxy. What would the people who live on that galaxy think? They would look out, and they would say, gee, this is interesting. We are alone in the whole universe.
Why would they be alone? Well, the probability of making one galaxy is incredibly small by random fluctuation. The probability of making two galaxies is vastly, vastly smaller. So the most likely thing is, if you found yourself in a galaxy, you would say, ooh, a fluctuation happened. But the likelihood that a double fluctuation happened is negligible. So these people would not be all that surprised that they were alone, if they understood the theory of fluctuations. And if they believed that their birth was due to a random fluctuation, they would have a very nice theory of the world.
So, lesson-- over very long time scales, everything happens. Freak events vastly outnumber the "comprehensible," in quotes, histories-- the ones which start with small entropy and go where we're going. And the reason that the comprehensible or normal histories are so overwhelmed by the other ones is that they originate from low entropy configurations, which are the most unlikely.
Something is clearly wrong with this picture. We did evolve out of an inflating early low-entropy state. I don't believe that we are simply the one part in 10 to the 10th to the 10th to the 10th of civilizations that were just so lucky that they were the first ones to be formed and therefore were not these crazy recurrences. That, I think, is very unlikely.
So something's wrong. There are various possibilities of what could be wrong. Perhaps this idea that vacuum energy means that we live in a finite box, which just recurs and recurs and recurs. Maybe that's wrong. Maybe the mathematics, the quantum mechanics, the combination of ideas of quantum mechanics and gravity don't really fit together. And this idea that everything piles up-- ooh, I have two lasers.
[LAUGHTER]
Ooh, ooh, wait, wait. Look, look. One of them is behind the other. All right, this idea of pile-up here-- maybe that's wrong. Maybe the use of thermodynamics is wrong. On top of everything else, the time scales that we're talking about are truly vast-- exponentially, exponentially large. Can we trust anything about such enormous time scales? Aren't we being a little bit-- jumping ahead of the game a little bit by assuming that we know how physics works for 10 to 10th to the 10th to the 10th years?
Of course we're jumping ahead. Something unexpected may happen in between. But that's just the point. That's just the point. What this is telling us very likely is that something unexpected that's unaccounted for by the equations as we now know them must come into play over these very long time scales. And what kind of thing might it be?
That was intended. I'm trying to remember what the next one is. OK, let me go through them. That didn't help. Oh, there it is. Here is the standard theory of the evolution of the universe. This is inflationary theory. Inflation is the theory which was confirmed over the last few months by this famous experiment, which just took place-- the BICEP2 experiment. It didn't lead to this picture. It confirms this picture. I'm going to tell you what the picture is.
There is a field. The field has a name. It's called the inflaton. It's a field in space, and that field can vary. The vacuum energy depends on the value of that field. So if the field is at this value here, there is a large vacuum energy.
Remember what a large vacuum energy means. It means a tremendously rapid growth of expansion. But it also means that the horizon is very small, because you don't have to go very far until you get to the point where things are moving away with the speed of light.
The smaller the expansion rate, the further you have to go till you get to the point where you reach the speed of light, and therefore the bigger the horizon. This pot, this pot of particles here, is big when the cosmological constant, or the vacuum energy, is small. And it's small when the vacuum energy is big.
So if we start up at the top of this potential, where the vacuum energy is large, the observable universe within a horizon is very, very small. And then the standard-- I'm going to give you the quick nutshell version of all of cosmology. The universe rolled down to the bottom, where there's a little bit of vacuum energy.
So the universe is still expanding. It's still accelerating. It's still in this phase like this. But much slower expansion, much bigger universe. But it's stuck there. What happens when it gets down to here incidentally? Everything interesting that happened in the universe happened on the way down here, while it was out of equilibrium, including us. We're at somewheres down near the bottom. But we're not really at the bottom yet.
Once it gets to the bottom, that's when you're in thermal equilibrium. That's when the universe has evolved to where everything has fallen down to the sides, and it's sitting there in thermal equilibrium. And what does it do? It jiggles around there. It jiggles. This is not my nerves. This is my purposeful jiggling right now. I'll try to stop. I can't. OK.
It jiggles around here. Jiggles around here for how long? For an immensely long amount of time. Every so often, though, it might get a fluctuation, which pushes it up the hill a little bit. And then it will fall back down-- a partial incomplete recurrence.
It'll do it again and again and again, over and over again. Very, very rarely will it jump up to the top. And when I say rarely, I mean it's the most unlikely thing. The entropy up at the top here is minimal. It's like all the particles having gotten up to the top. You don't need all the particles to get up to the top to make a galaxy.
OK, so that's the history of the universe in a nutshell-- a roll from a high value of the vacuum energy down to the bottom, sits there for long periods of time, and then now and then jumps back. What would it look like from the point of view of the pictures we drew?
Well, when we were up on the top-- the universe, or at least the horizon, the portion that we can see, the portion which comes into the mathematical description or quantum description of the universe, is very small. How small? I mean really small.
This would be really, really small up at the top of the potential. I can't remember what the number is, but it's microscopic. It's a lot smaller than a proton. And so we sit there with a very, very small universe. But it rolls down the hill.
When it gets down to the bottom of the hill, the horizon is much bigger-- big enough for us to live in, big enough for the galaxies to live in, big enough. And think of these red dots now as galaxies. But then it evolves. It continues to evolve and expands. And everything goes out toward the edges.
And sooner or later it gets to thermal equilibrium, with all the particles down in the bottom here. They sit there. They do things. They knock against each other. And it's quite boring. Nevertheless, every once in a while, a few particles might jump up out and then fall back in. Still boring.
But again, with a very, very low rate, a dust cloud may form. A dust cloud may form, make a black hole, or it may make a galaxy. I'm repeating. I didn't make these new. These are the same ones again. And then evaporate back to nothingness again.
This is the picture that we are now stuck with. Over and over again, the vast majority of observers are these freaks-- the freaks who occur for no good reason other than random things happen. They vastly, vastly outweigh the number of times that the universe jumps up to the top. Where is the top? The top is gone.
This is where we are. We're stuck there. We're stuck there with a theory which tells us that the vast overwhelming majority of civilizations-- their physicists will correctly have a theory that says that they were born at a random fluctuation. They will not be surprised to find themselves alone in the world. And they would consider what we see or at least an attempt to describe the universe the way we describe it as deeply, deeply misguided. We don't expect crazy fluctuations. And the craziest fluctuation would be back all the ways up to the top.
Something's wrong. Now, one thing that might be wrong-- I rather think it is what is wrong-- is that this phase space box is not really closed and sealed. Imagine it had a little hole in it. The hole cannot be arbitrarily small for it to do its work. But imagine that there was a hole in the side of the box so that the universe phase point could leak out of a box.
Then the history would be, you start-- I mean, we still don't understand why we started here. We still don't understand. But let's take it for granted that the cosmologists are right and we did start here. What will happen is we will go into the zone of life. Then eventually, we will leave it. We will rattle around a few times, maybe many times, and then find the way out of the box.
Once you're out of the box, finished. Once you're out of the box, assuming the rest is infinite, assuming the rest is infinite, you will not come back into the box. The likelihood of coming back into the box altogether over the infinite range of time, beyond that time, is negligible.
And so once you're out, you're out. You will not recur again. If the box was open like this, and we started, let's imagine, a Gedankenexperiment where we-- maybe not we, maybe somebody else-- starts a sequence of universes-- starts one here, and follows it. Starts another one, follows it.
The overwhelming majority of them now will pass through the zone of life only once. Therefore, the overwhelming number of observers will see a world which is consistent with being traced back to this very early low entropy state. Do the equations support this view? Yes, actually they do.
Remember I drew that picture of the potential? Well, I'm not the world's biggest believer in string theory, although I did have something to do with it. But I'm not. But nevertheless, string theory tells us something. And the thing might be much broader and more general than string theory.
What it says is that there's always a way out of this minimum here. There are always vacuums in the technical jargon. There are vacuums with zero cosmological constant. They're supersymmetric. Doesn't matter what you call them.
They are configurations, where when you leak out of here, you jump around, you jump around here. Every so often, you go over the top, and you leak out here. When you leak out there, you do not come back. There's no way back after that.
Why? Because these things have zero vacuum energy, and zero vacuum energy means infinitely big horizon. You've leaked out to infinity. If this is true, it would be a way out. It would not explain why we start in a particular configuration, but would get us out of this trap of the recurrent nightmare of Boltzmann fluctuations.
The world would look something like this then-- or the evolution. You would start small up on the top of the potential. You would fall down to the bottom, get bigger. You would then tunnel through to the other side and simply get big, and end of story. You get one shot. It's a one-shot universe, where you get one chance to have life in it and, after that, finished.
What? I'm essentially finished. That is a possible explanation of what's going on. As I said, I don't expect you to be unskeptical, nor do I think that you should be unskeptical, about everything I said. I think you should be skeptical about it. We have very little chance of extrapolating that far correctly.
This is what current theory is pushing us toward. Will surprises come in between now and the time of recurrences or whatever time it will take for people to get these things straight? I rather suspect surprises will happen. But nevertheless, let me just leave you with the statement that I deeply believe. I said it before, the arrow of time is a fact of nature and needs an explanation. Thank you.
[APPLAUSE]
PAUL: Yes?
LEONARD SUSSKIND: You may have to translate. My hearing isn't so good.
SPEAKER 2: So in the tradition of relativity, you started out with a point of view and joked about, well, we're here, about here, and that the horizon was based on that point of view. It seems like another point of view would create a different bubble. The two bubbles would intersect somewhere at super hot points.
LEONARD SUSSKIND: Well, my next lecture will be about black holes.
PAUL: Please repeat the question.
LEONARD SUSSKIND: I think you are asking the very simple question, which we could phrase this way-- imagine somebody not at rest in the center of that bubble here. But imagine somebody who was actually traveling with a galaxy that was falling through the-- where are you? Yeah. That was falling through.
Would they see themselves passing through the super hot region and get themselves scalded? No. That's the puzzle of horizons, which we will come to next time. Next time is the puzzle, the conflict of principle between information conservation, which is one side of this, and the equivalence principle, which is the other side.
The equivalence principle says that somebody falling down here sees nothing special when they cross that point. But the conservation of information, at least from the perspective of somebody at the center, says in some sense nothing really can fall through.
So this is going to be the discussion of next time. Let me just say, again, I'll have to say, trust me that the mathematical description that we've evolved out of largely the work of people like Hawking and Bekenstein and other people that the right way to think of our description at the center of all of this is in terms of a finite box with all of the degrees of freedom contained within that box.
Each one has a different description, and then the region of overlap they have to agree. I think that's what you were asking, if I'm not mistaken. But I would wait until the next lecture to try to be more specific about that.
SPEAKER 3: Could one say that the unidirectionality of time is just a statement about the nature of consciousness?
LEONARD SUSSKIND: Could be.
SPEAKER 3: Doesn't have anything to do with physics at all? We see at one point in time. But we assume that we're adding time along with us.
LEONARD SUSSKIND: I guess I don't think so. I guess I think there's a physical fact here that has to be explained. Now, I understand what you're asking. I think I don't subscribe to that view of it. I do think there's a physical fact that needs explanation. But could consciousness be part of the way we perceive time? Sure, it is. Of course it is a part of the way we perceive time. Nevertheless, I think underlying there is a physical fact. I could be wrong.
SPEAKER 4: An individual particle might-- does it or does it not have an arrow of time if you just have one particle in the box? And so it looks like the arrow of time comes from statistical mechanics.
LEONARD SUSSKIND: Yes, it does. And in fact, I think the way Boltzmann finally formulated it is not that the entropy always increases, but that, whatever configuration you find yourself in, the next configuration is most likely to have a larger entropy.
But it's also true that, if you just find yourself at a random point among these configurations, the previous configuration also was most likely to have a larger entropy. So it does come into it-- always this term "most likely." The idea that entropy always increases was misguided, as Boltzmann eventually discovered, and the right statement, you might say, is that entropy almost always increases. So does that address what you were asking?
SPEAKER 4: Close.
LEONARD SUSSKIND: Yeah, close. It's better than nothing.
SPEAKER 4: Yeah.
LEONARD SUSSKIND: You didn't pay much to get in here. So what the hell, I mean.
[LAUGHTER]
SPEAKER 5: So I guess there's folks in the audience that take the Bayesian interpretation of quantum mechanics, and they provide an alternative interpretation of the direction of time having to do with updating prior information or something like that. I don't quite understand it all.
LEONARD SUSSKIND: Me neither.
SPEAKER 5: But would you care to comment?
LEONARD SUSSKIND: No.
[LAUGHTER]
I don't know. No. As I said, I think there's a physical fact there that needs explanation.
SPEAKER 6: I want to ask kind of a similar question. Do you actually have any evidence that time is continuous?
LEONARD SUSSKIND: That it's continuous? No. No, I think there is always a level at which it could be discrete and we wouldn't notice that. I don't think anything I said really depends on the discreteness or continuity of time. The pictures that I drew of evolution were discrete, OK?
The basic set of ideas was a sort of summary of the logic of classical mechanics. In classical mechanics, it's not discrete updating. It's solving differential equations. But the idea of one arrow in and one arrow out becomes what is called Liouville's theorem in classical mechanics. If you know what that is, you will recognize what I said. If not, you will get to it when you learn classical mechanics.
PAUL: Just so that we don't leave people concerned, as a final question, you suggested there might be a tunneling to some zero cosmological constant.
LEONARD SUSSKIND: Oh, yeah. Yeah, I jumped--
PAUL: Do you want to just allay people's concerns and estimate the timescale for that?
LEONARD SUSSKIND: Before I allay your concerns, before I allay your concerns, let me make you more worried. Jumping to a zero cosmological constant is not an innocent thing. It's true. If the cosmological constant-- if that's all that happened, and the cosmological constant suddenly became zero, I don't think it would affect us very much.
But you're jumping over this barrier to something new, to a place where the parameters of physics are different, where everything is different. There may or may not be electrons in that world. There may or may not be photons. In other words, physics, the properties of a physical world, would be very, very different. In addition to the cosmological constant being different, everything would be different.
In other words, it could well be a world in which atoms don't exist. So that's the bad news. The good news is that such events are exponentially unlikely. Let's see. Where was it? Where was the picture? Let me redraw the picture over here. I'll draw a picture, and then we dump this. And then we do that.
The time scale for the penetration through here is very, very sensitive to the parameters of this bump. It's typically exponentially long times. But is 10 billion years exponentially long? Yeah, it's exponentially in something, namely the log of 10 billion.
So there's no good reason why it couldn't happen tomorrow. But if you take some simple reasonable parameters for this coming out of various kinds of particle physics frameworks, you might expect that the time scales are much, much longer than the age of the universe. But, you know, I don't think anybody has a precise theory of this. Could it be tomorrow?
PAUL: Yeah, it will.
[LAUGHTER]
LEONARD SUSSKIND: I don't think so. I don't think so.
PAUL: Not tomorrow, but next lecture is Wednesday night.
[LAUGHTER]
LEONARD SUSSKIND: Unless it happens between now and tomorrow.
PAUL: Let's thank the speaker again.
[APPLAUSE]
Theoretical physicist Leonard Susskind delivered the first of his three Messenger Lectures on "The Birth of the Universe and the Origin of Laws of Physics," April 28, 2014. Susskind is the Felix Bloch Professor of Theoretical Physics at Stanford University, and Director of the Stanford Institute for Theoretical Physics.