
Selected Readings (26 of 66): Entropy: The Invisible Force That Brings Disorder to the Universe

Published: 2020-08-17 09:01:12

Entropy: The Invisible Force That Brings Disorder to the Universe
by Jesslyn Shields, Mar 27, 2020


Entropy describes how disorder happens in a system as large as the universe or as small as a thermos full of coffee. Jose A. Bernat Bacete/Getty Images
You can't easily put the toothpaste back into the tube. You can't expect molecules of steam to spontaneously migrate back together to form a ball of water. If you release a bunch of corgi puppies into a field, it's very unlikely you're going to be able to get them all back together into a crate without doing a ton of work. These are the problems associated with the Second Law of Thermodynamics, also known as the Law of Entropy.
The Second Law of Thermodynamics
Thermodynamics is important to a wide range of scientific disciplines, from engineering to the natural sciences to chemistry, physics and even economics. A thermodynamic system is a defined region of space; an isolated system is one that lets no energy or matter in or out.
The first law of thermodynamics has to do with the conservation of energy: you probably remember hearing that the energy in an isolated system remains constant ("energy can neither be created nor destroyed"). That energy constantly changes form, however. A fire can turn the chemical energy of a plant into thermal and electromagnetic energy; a battery turns chemical energy into electrical energy. The world turns, and energy becomes less organized.
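In symbols, the first law is just bookkeeping: the change in a system's internal energy equals the heat added minus the work the system does on its surroundings. A minimal Python sketch (our made-up numbers, purely for illustration):

    # First law of thermodynamics: dU = Q - W
    # Q: heat added to the system (J); W: work done BY the system (J)
    Q = 500.0   # heat flowing in
    W = 200.0   # work extracted, say by a piston pushing outward
    dU = Q - W  # change in internal energy
    print(f"Internal energy change: {dU:.0f} J")  # 300 J stays in the system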
"The second law of thermodynamics is called the entropy law," Marko Popovic, a postdoctoral researcher in Biothermodynamics in the School of Life Sciences at the Technical University of Munich, told us in an email. "It is one of the most important laws in nature."
Entropy is a measure of the disorder in a closed system. According to the second law, entropy in a system almost always increases over time — you can do work to create order in a system, but even the work that's put into reordering increases disorder as a byproduct — usually in the form of heat. Because the measure of entropy is based on probabilities, it is, of course, possible for the entropy to decrease in a system on occasion, but that's statistically very unlikely.
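One way to see why a decrease is possible but wildly unlikely is to simulate it. The little Python sketch below (our illustration, not part of the original article) uses the classic Ehrenfest urn model: N particles split between two halves of a box, and at each step one randomly chosen particle hops to the other side. Start with everything neatly on the left, and the system drifts toward the disordered 50/50 split, then just jitters around it.

    import random

    N = 100       # number of particles
    left = N      # start fully ordered: every particle on the left
    for step in range(1, 2001):
        # pick a particle at random; whichever side it's on, it hops across
        if random.random() < left / N:
            left -= 1
        else:
            left += 1
        if step % 500 == 0:
            print(f"step {step}: {left} of {N} particles on the left")

    # Typical runs hover near 50. A spontaneous return to the ordered state
    # (left == 100) is allowed, but its odds are on the order of 1 in 2**100.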
The Definition of Disorder
It's harder than you'd think to find a system that doesn't let energy out or in — our universe is as good an example of one as we have — but entropy describes how disorder happens in a system as large as the universe or as small as a thermos full of coffee.
However, entropy doesn't have to do with the type of disorder you think of when you lock a bunch of chimpanzees in a kitchen. It has more to do with how many possible permutations of mess can be made in that kitchen than with how big a mess is possible. Of course, the entropy depends on a lot of factors: how many chimpanzees there are, how much stuff is being stored in the kitchen and how big the kitchen is. So, if you were to look at two kitchens, one very large and stocked to the gills but meticulously clean, and another that's smaller with less stuff in it but already pretty trashed by chimps, it's tempting to say the messier room has more entropy, but that's not necessarily the case. Entropy concerns itself more with how many different states are possible than with how disordered the room is at the moment; a system, therefore, has more entropy if there are more molecules and atoms in it, and if it's larger. And if there are more chimps.
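That counting view can be made concrete. In the toy sketch below (our illustration, not the article's example), the "kitchen" is a row of slots and the "mess" is a set of identical objects scattered among them; the number of possible arrangements W, and with it Boltzmann's entropy S = k ln(W), explodes as the kitchen gets bigger.

    import math

    def log_multiplicity(slots: int, objects: int) -> float:
        # Natural log of the number of ways to scatter identical objects
        # among slots. Since S = k * ln(W), this is entropy in units of k.
        return math.log(math.comb(slots, objects))

    # A small kitchen vs. ever larger ones, each half-full of stuff:
    for slots in (10, 100, 1000):
        print(f"{slots:5d} slots: ln(W) = {log_multiplicity(slots, slots // 2):7.1f}")

    # ln(W) grows roughly in proportion to the number of slots, meaning the
    # raw count of arrangements W grows exponentially with system size.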
Entropy Is Confusing
Entropy might be the truest scientific concept that the fewest people actually understand. The concept can be very confusing, partly because there are actually different types. The Hungarian-American mathematician John von Neumann lamented the situation this way: "Whoever uses the term 'entropy' in a discussion always wins since no one knows what entropy really is, so in a debate one always has the advantage."
"It is a little hard to define entropy," says Popovic. "Perhaps it is best defined as a non-negative thermodynamic property, which represents a part of energy of a system that cannot be converted into useful work. Thus, any addition of energy to a system implies that a part of the energy will be transformed into entropy, increasing the disorder in the system. Thus, entropy is a measure of disorder of a system."
But don't feel bad if you're confused: the definition can vary depending on which discipline is wielding it at the moment:
In the mid-19th century, the German physicist Rudolf Clausius, one of the founders of thermodynamics, was working on a problem concerning efficiency in steam engines and invented the concept of entropy to help measure the useless energy that cannot be converted into useful work. A couple of decades later, Ludwig Boltzmann (entropy's other "founder") used the concept to explain the behavior of immense numbers of atoms: even though it is impossible to describe the behavior of every particle in a glass of water, it is still possible to predict their collective behavior when they are heated using a formula for entropy.
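Clausius's definition turns entropy into something you can actually calculate for everyday processes. As a worked sketch (our numbers, not the article's): gently heating a glass of water obeys dS = dQ/T with dQ = m*c*dT, which integrates to a change in entropy of m*c*ln(T2/T1).

    import math

    m = 0.25      # kg of water, about one glass
    c = 4186.0    # specific heat of liquid water, J/(kg*K)
    T1 = 293.15   # 20 degrees C, in kelvin
    T2 = 353.15   # 80 degrees C, in kelvin

    # Integrating dS = dQ/T with dQ = m*c*dT gives delta_S = m*c*ln(T2/T1)
    delta_S = m * c * math.log(T2 / T1)
    print(f"Entropy gained by the water: {delta_S:.0f} J/K")  # about 195 J/K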
"In the 1960s, the American physicist E.T. Jaynes, interpreted entropy as information that we miss to specify the motion of all particles in a system," says Popovic. "For example, one mole of gas consists of 6 x 1023 particles. Thus, for us, it is impossible to describe the motion of each particle, so instead we do the next best thing, by defining the gas not through the motion of each particle, but through the properties of all the particles combined: temperature, pressure, total energy. The information that we lose when we do this is referred to as entropy."
And the terrifying concept of "the heat death of the universe" wouldn't be possible without entropy. Our universe most likely started out as a singularity, an infinitesimally small, highly ordered point of energy, that ballooned out and is still expanding. As space grows, there are more possible states of disorder for the atoms in it to adopt, so the universe's entropy is constantly increasing. Scientists have hypothesized that, long after you and I are gone, the universe will eventually reach some point of maximum disorder, at which point everything will be the same temperature, with no pockets of order (like stars and chimpanzees) to be found.
And if it happens, we'll have entropy to thank for it.
Now That's Interesting
Twentieth-century scientist Sir Arthur Eddington thought the concept of entropy was so important to science that he wrote in The Nature of the Physical World in 1928: "The law that entropy always increases holds, I think, the supreme position among the laws of Nature.... If your theory is found to be against the Second Law of Thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation."

 
