Most probable distribution. Second law of thermodynamics. Entropy. Statistical weight of the most probable distribution.

Maxwell discovered a path that eventually became a wide highway. Over the next hundred years the grand edifice of statistical mechanics was erected, thanks in large part to the work of Ludwig Boltzmann and J. Willard Gibbs. (Gibbs was the first great American theoretical physicist who, like other "prophets", was the last to be recognized at his own university. It is said that the president of Yale University, having decided to create a physics department, turned to several European scientists for help. They referred him to Willard Gibbs, whom the president did not know, even though Gibbs was on the Yale faculty at the time.)

The essence of the statistical hypothesis formulated for gases is that we give up trying to know the exact position and velocity of each of the many particles that make up the system and instead assume, in the absence of additional information, that for each particle of the system all possible positions and directions of velocity are equally probable (the word equiprobable deserves special emphasis). We do have some information: the total energy E of the system and the total number N of particles in it are assumed fixed (energy and particle number are conserved). Therefore some combinations of particle positions and velocities are forbidden. As an example of a forbidden combination, consider one in which at least one particle has an energy greater than E: the total energy of the system would then exceed E.

One could imagine a situation in which all the energy of a gas is invested in a single particle, which moves at the extremely high speed corresponding to that energy, while the remaining particles stand still. We feel, however, that such a configuration is unlikely to be "viable": one would expect the fast-moving particle to collide with the others and give up some of its energy to them. The opposite extreme is also conceivable, in which the total energy of the gas is divided equally among all the molecules, which move in formation, one after another, at identical speeds; but this situation too, as intuition tells us, looks unlikely, since collisions will ultimately randomize the motion.

Let us consider all possible (and mutually distinct) distributions of molecules over positions and speeds satisfying the conditions that the energy E and the number of particles N remain fixed: all the molecules in one corner of the vessel with one speed, all in another corner with a different speed, and so on; that is, we take into account absolutely all possible combinations. Let us now find the most probable distribution of positions and velocities of the molecules. Under the conditions listed above this problem is solvable. The basic idea of statistics lies in the hypothesis that if a system is at a given temperature (in thermal equilibrium, such as a gas in a vessel), the velocities and positions of its molecules are described by the most probable distribution. Knowing this most probable distribution, one can calculate the viscosity coefficient, the pressure, and other quantities.

The Maxwell-Boltzmann distribution requires that the particles be distributed uniformly in space and that their velocities be distributed as shown in Fig. 385.

This is the most probable distribution of particles in positions and velocities, provided that all configurations are equally probable, and the total number of particles and their total energy are fixed.

Thus we do without the assumption that the particle velocities are equal, and we do not solve the equations of motion from which we could obtain the exact coordinates and velocities of each particle; instead we introduce the most probable distribution of positions and velocities for all particles. This very radical assumption goes far beyond the laws of mechanics; not without reason was it discussed and analyzed long and intensively after Maxwell and Boltzmann. The assumption has been formulated in different ways, but essentially it all comes down to a purely intuitive guess: in any real physical situation, improbable distributions of molecules (in both position and speed) cannot arise often enough to have any influence on the equilibrium properties of the system.

Let us illustrate the meaning of this hypothesis with several examples. Consider a gas consisting of a large number of particles enclosed in a container. It is quite possible for a distribution of particles to occur in which all the particles move in one direction, strike one wall of the vessel, and none of them hits the opposite wall (Fig. 386). As a result of this motion a significant force would be applied to one wall of the vessel but none to the other, so the whole vessel would jump sideways until the opposite wall collided with the molecules, after which the vessel would jump back. This is possible, but unlikely: it is improbable that the molecules could momentarily organize their motion and move in one direction instead of rushing randomly in all directions.

Fig. 386. All molecules move in the same direction.

It may also happen that at some point all the molecules suddenly find themselves in one corner of the vessel, and all other parts of the vessel appear empty (Fig. 387). At this instant, the density of the gas in one corner of the vessel will become very large, while in other parts the density will be zero. This situation is also possible, but unlikely.

Suppose there are 10,000 cars in a parking lot with only one exit; when the football game ends, all the car owners get behind the wheel. The question arises: is it possible for all the cars to leave the lot in one continuous flow, without creating traffic jams or pile-ups anywhere?

Fig. 387. All molecules gathered in one corner.

Of course, this is possible, but it is extremely unlikely unless there are a large number of traffic police on the scene. As a rule, when a parking lot is vacated, an incredible mess of cars forms, since each of them moves almost randomly, trying to leave the parking lot.

The assumption contained in the works of Maxwell, Boltzmann and Gibbs is equivalent to the statement that a large number of particles obeying Newton's laws of motion, subject to certain external constraints (for example, constancy of the total energy and of the total number of particles), eventually pass, as a result of mutual collisions, into some average state. From Boltzmann's famous H-theorem it follows that for given initial conditions particle collisions lead to the gradual establishment of the most probable state. Statistical mechanics relieves us of all the inconvenience of solving the equations of motion. It rests on the assumption that the distribution of particles in the equilibrium state is the most probable one, and then derives all the consequences that follow from this distribution. Obviously, distributions that are not the most probable may also arise. It is no less obvious, however, that such distributions will quickly disappear if the vessel is shaken or disorder is introduced in some other way.

Consider a system consisting of a large number of molecules. Let's call it a macroscopic system. The state of such a system can be described in two ways:

1. Using average system characteristics, such as pressure P, volume V, temperature T, energy E. A state defined by characteristics averaged over a large number of molecules will be called a macrostate.

2. By describing the state of every molecule forming the body; for this it is necessary to know the coordinates q and momenta p of all the molecules. A state defined in this way will be called a microstate.

Let the macroscopic system be part of some large closed system, which we will call the environment. Let us find the microcanonical Gibbs distribution, i.e. the probability distribution over the states of a macroscopic system that does not interact with surrounding bodies and has constant energy. Different states of the system that have the same energy have the same probability.

Each energy value of a macroscopic system can correspond to various microstates; the number of such states is called statistical weight.

Let the macrostate of a system of 4 molecules be specified using the parameters: P, V, T, E. The molecules are in a vessel separated by a permeable partition (Fig. 10.1a). The vessel is located in some environment, but does not interact with it.

Fig. 10.1a. Fig. 10.1b. Fig. 10.1c.

If all 4 molecules are in the right half of the vessel, then the macrostate of the system (0 - 4) can be realized by a single microstate, obtained by listing the numbers of the molecules. In this case the statistical weight is W = 1.

Now let one of the molecules move to the left half of the vessel (Fig. 10.1b). It could be molecule 1, with molecules 2, 3, 4 remaining in the right half; or molecule 2, with molecules 1, 3, 4 remaining on the right, and so on. In total 4 different microstates are possible; therefore the statistical weight of the macrostate (1 - 3) is W = 4.

The probabilities of all microstates are the same. The state with molecule 1 on the left and 2, 3, 4 on the right has the same probability as the state with molecule 2 on the left and 1, 3, 4 on the right. This conclusion rests on the assumption that all the molecules are identical in their behavior.

The uniform distribution of molecules between the two halves of the vessel becomes obvious for large numbers of molecules. We know that the pressure in the two halves equalizes over time, P1 = P2; and since P = nkT, the concentrations and hence, at constant temperature, the numbers of molecules on the left and on the right will be the same: N1 = N2.

Since the highest statistical weight corresponds to the highest probability w of the state, the probability is evidently proportional to the number of microstates. The state (2 - 2) is the most probable, because it has the greatest statistical weight, W = 6 (Fig. 10.1c).
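The counting of microstates for four molecules can be verified by direct enumeration. A minimal sketch (the "L"/"R" labels and the enumeration scheme are illustrative):

```python
from collections import Counter
from itertools import product

# Enumerate all 2^4 = 16 microstates of 4 numbered molecules, each of which
# can sit in the left ("L") or right ("R") half of the vessel.
microstates = list(product("LR", repeat=4))

# Group microstates into macrostates by the number of molecules on the left.
weights = Counter(state.count("L") for state in microstates)

for n_left in sorted(weights):
    w = weights[n_left]
    print(f"macrostate ({n_left} - {4 - n_left}): "
          f"statistical weight {w}, probability {w}/16")
```

The enumeration reproduces the weights 1, 4, 6, 4, 1 quoted in the text, with the uniform state (2 - 2) the most probable.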

Since all directions of molecular speed are equally probable, at first glance it seems that all speed values ​​should be equally probable. However, as experiment shows, at each temperature T there is such a speed (the most probable) that most gas molecules move at speeds that are not very different from it. Molecules whose speeds are much greater or much less than the most probable are rare.

It makes no sense to ask for the number of molecules that have one precisely specified speed v, since at any given moment there may be no molecules with exactly that speed at all. But we can ask for the number of molecules whose speeds lie in some definite interval. The entire range of speeds is divided into intervals, and the number ΔN of molecules with speeds from v to v + Δv is found.

Then the probability of encountering such a speed is

ΔW = ΔN/N,

i.e. the probability equals the fraction of molecules having a speed in the specified interval relative to the total number of molecules N.

Obviously, this probability depends not only on v but also on the width Δv of the interval. To characterize the distribution itself it is therefore convenient to take the fraction of molecules not in an arbitrary, but in a unit speed interval:

f(v) = ΔN/(N Δv).

Here f(v) is the probability density, or distribution function. The dependence of the probability density (fraction) of a random variable on its value is called the distribution function of that random variable. Here we consider the distribution function of molecular speeds, f(v).

For different v, the value of f(v) will be different. If we knew it, we could build “steps” called a histogram. Such histograms are used not only in physics, but also in sociology, medicine, technology, etc.

Since the speed in a system of molecules changes continuously, the distribution function f(v) can be defined more precisely as a limit:

f(v) = lim(Δv→0) ΔN/(N Δv) = (1/N) dN/dv.

The function f(v) must be normalized by the condition

∫₀^∞ f(v) dv = 1.

The infinite upper limit does not mean that there are molecules in the gas with infinitely high speeds; it is a computational device, permissible because there are very few molecules with very high speeds.

The graph of the function f(v) is shown in the figure; the curve is asymmetric and passes through zero at the origin. The area of the elementary strip shaded in the figure, f(v) dv, gives the probability that the speed of a molecule lies in the interval from v to v + dv; multiplied by the total number of molecules N, it gives the probable number of molecules with speeds in that interval.

The distribution function of gas molecules over speeds was obtained by J. Maxwell. He solved the problem for a gas consisting of a very large number N of identical molecules in a state of random thermal motion at a given temperature, assuming that no force fields act on the gas. Maxwell's law of speed distribution has the form:

f(v) = 4π (m/(2πkT))^(3/2) v² exp(−mv²/(2kT)).

The function f(v) has a maximum at the most probable speed v_p = √(2kT/m) (m is the mass of a molecule, T the temperature, k the Boltzmann constant).
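A short sketch evaluating Maxwell's f(v) and the most probable speed; the choice of nitrogen at 300 K is illustrative:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def maxwell_f(v, m, T):
    """Maxwell speed distribution f(v) = 4*pi*(m/(2*pi*k*T))^(3/2) * v^2 * exp(-m*v^2/(2*k*T))."""
    a = m / (2.0 * math.pi * k_B * T)
    return 4.0 * math.pi * a ** 1.5 * v ** 2 * math.exp(-m * v ** 2 / (2.0 * k_B * T))

def v_most_probable(m, T):
    """Most probable speed: the location of the maximum of f(v)."""
    return math.sqrt(2.0 * k_B * T / m)

# Nitrogen molecule (m = 28 u) at T = 300 K
m_N2 = 28 * 1.66054e-27
T = 300.0
vp = v_most_probable(m_N2, T)

# f(v) is indeed larger at vp than slightly to either side of it
assert maxwell_f(vp, m_N2, T) > maxwell_f(0.95 * vp, m_N2, T)
assert maxwell_f(vp, m_N2, T) > maxwell_f(1.05 * vp, m_N2, T)
print(f"most probable speed of N2 at {T:.0f} K: {vp:.0f} m/s")
```

For nitrogen at room temperature the most probable speed comes out a little above 400 m/s, in line with kinetic-theory estimates.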

As the temperature increases, the speed distribution curve of the molecules is deformed: the fraction of molecules with low speeds decreases, while the fraction with higher speeds increases. The maximum value of the distribution function f(v) decreases and shifts toward higher speeds, so that the area under the curve remains unchanged (equal to unity).

Maxwellian distribution:

    1) is always established at thermal equilibrium in systems of non-interacting or elastically interacting particles, the movement of which can be described by the laws of classical mechanics: gases, some liquids, Brownian particles (small particles in liquids).

    2) stationary (does not change with time), despite the fact that as a result of collisions the molecules constantly change their speed.

Knowledge of the statistical distribution functions makes it possible to calculate average values of microscopic parameters without knowing their values for individual molecules. For example, the arithmetic mean speed of molecular motion is found by calculating the integral

⟨v⟩ = ∫₀^∞ v f(v) dv.

Substituting Maxwell's f(v) and integrating, we obtain the mean speed of ideal-gas molecules:

⟨v⟩ = √(8kT/(πm)) = √(8RT/(πM)).

To determine the root mean square speed one calculates the integral

⟨v²⟩ = ∫₀^∞ v² f(v) dv,

which gives

v_rms = √⟨v²⟩ = √(3kT/m) = √(3RT/M).
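These averages can be checked by integrating the Maxwell distribution numerically. A sketch using simple trapezoidal integration (the gas, temperature, cutoff v_max and step count are illustrative choices):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def maxwell_f(v, m, T):
    """Maxwell speed distribution."""
    a = m / (2.0 * math.pi * k_B * T)
    return 4.0 * math.pi * a ** 1.5 * v ** 2 * math.exp(-m * v ** 2 / (2.0 * k_B * T))

def moment(g, m, T, v_max=4000.0, steps=40000):
    """Trapezoidal estimate of the average of g(v) over the Maxwell distribution.

    The cutoff v_max is justified because f(v) is utterly negligible there."""
    h = v_max / steps
    total = 0.5 * (g(0.0) * maxwell_f(0.0, m, T) + g(v_max) * maxwell_f(v_max, m, T))
    for i in range(1, steps):
        v = i * h
        total += g(v) * maxwell_f(v, m, T)
    return total * h

m_N2 = 28 * 1.66054e-27  # nitrogen molecule, kg
T = 300.0

v_mean = moment(lambda v: v, m_N2, T)
v_rms = math.sqrt(moment(lambda v: v * v, m_N2, T))

# Compare with the analytic results <v> = sqrt(8kT/(pi m)) and v_rms = sqrt(3kT/m)
assert abs(v_mean - math.sqrt(8 * k_B * T / (math.pi * m_N2))) < 1.0
assert abs(v_rms - math.sqrt(3 * k_B * T / m_N2)) < 1.0
print(f"<v> = {v_mean:.0f} m/s, v_rms = {v_rms:.0f} m/s")
```

The numerical moments agree with the closed-form expressions to well under 1 m/s, which also confirms the ordering v_p < ⟨v⟩ < v_rms.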

Experimental measurements of molecular velocities (one of the first such experiments was carried out by O. Stern using molecular beams) showed good agreement with the theoretical velocities given by the Maxwell distribution. In 1921 Richardson tested Maxwell's law of velocity distribution in his experiments on thermionic emission: in a state of equilibrium a saturated vapor of electrons forms above the metal surface, and the experimentally determined electron velocities obey Maxwell's law.

In a gas, there is a certain distribution of molecules by speed that is, on average, constant over time.

The equilibrium state of a gas is characterized not only by the distribution of molecules by speed, but also by coordinates. In the absence of external fields, this distribution will be uniform, i.e. the gas is evenly distributed throughout the entire volume of the vessel: in any equal macroscopic volumes inside the vessel, on average, there is the same number of molecules. But what about the situation in the presence of a field acting on the molecules, for example, a gravitational field?

The law of distribution of gas molecules with height in a uniform gravitational field can be found from the condition of mechanical equilibrium.

Consider a vertical column of gas with base area S and mentally select in it, at height h, a layer of thickness dh; the temperature T = const. Mechanical equilibrium of the layer requires that the pressure difference across it balance the weight of the gas in it:

dP = −ρ g dh,

where ρ is the gas density. From the Clapeyron-Mendeleev equation for an arbitrary mass m of gas,

PV = mRT/M,

the density is

ρ = m/V = PM/(RT).

Substituting this into dP = −ρ g dh and integrating at constant g and T, we obtain the barometric formula:

P = P₀ exp(−Mgh/(RT)),

where P₀ is the pressure at height h = 0. Since P = nkT and M/R = m₀/k (m₀ being the mass of one molecule), the concentration of molecules varies with height as

n = n₀ exp(−m₀gh/(kT)) = n₀ exp(−E_p/(kT)),

i.e. it is determined by the Boltzmann distribution. It characterizes the spatial equilibrium distribution of particle concentration depending on their potential energy E_p.
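The barometric formula is easy to evaluate numerically. A sketch for an isothermal column of air (the air parameters and the isothermal-atmosphere idealization are illustrative assumptions):

```python
import math

R = 8.314        # molar gas constant, J/(mol*K)
M_air = 0.029    # molar mass of air, kg/mol
g = 9.81         # free-fall acceleration, m/s^2
T = 273.0        # isothermal atmosphere (an idealization)
P0 = 101325.0    # pressure at h = 0, Pa

def pressure(h):
    """Barometric formula: P(h) = P0 * exp(-M*g*h / (R*T))."""
    return P0 * math.exp(-M_air * g * h / (R * T))

for h in (0, 1000, 5000, 10000):
    print(f"h = {h:5d} m:  P = {pressure(h) / 1000:6.1f} kPa")
```

At about 5 km the pressure has fallen to roughly half its sea-level value, a familiar rule of thumb for the real atmosphere as well.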

The general theory of equilibrium statistical distributions was created by J. Gibbs. He showed that in a state of thermal equilibrium at temperature T, the law of distribution of molecules over any quantity characterizing their state (coordinate, velocity, energy) has an exponential character, with the exponent equal to the ratio of the characteristic energy of the molecule to the quantity kT, taken with a minus sign; kT is proportional to the average kinetic energy of the chaotic motion of the molecules.

Test questions and tasks

1. What is the velocity distribution of gas molecules in the equilibrium state?

2. What is the distribution function of a random variable, for example the speed of molecules?

3. Plot the Maxwell velocity distribution function. What happens to the distribution function curve as the temperature increases?

4. What experiments confirm the conclusions of the theory about the distribution of molecules by speed?

5. How to find the average speed of molecules using the distribution function? Root mean square speed of molecules?

6. Define the most probable speed of movement of molecules.

7. Write the law of distribution of molecules in the gravitational field.

8. At what temperature does the root mean square speed of oxygen molecules exceed their most probable speed by 100 m/s?

In general, the statistical weight of a distribution with N₁ molecules in the first half of the vessel and N₂ in the second is

W = N!/(N₁! N₂!),

where N = N₁ + N₂ is the total number of molecules. The thermodynamic probability in the example under consideration, the state (2 - 2), is

W = 4!/(2! 2!) = 6.

Likewise for the distribution (1 - 3):

W = 4!/(1! 3!) = 4.

For (0 - 4):

W = 4!/(0! 4!) = 1.

Note that the highest thermodynamic probability belongs to the uniform distribution: it can be realized in the greatest number of ways.

The relationship between entropy and probability was established by Boltzmann, who postulated that entropy is proportional to the logarithm of the thermodynamic probability of the state:

S = k ln W + const

(entropy is determined only up to a constant), where k is the Boltzmann constant and W the thermodynamic probability.
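The weights W = N!/(n!(N−n)!) and the corresponding entropies S = k ln W can be tabulated directly; a sketch for the four-molecule example:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def W(N, n):
    """Thermodynamic probability: number of microstates with n of N molecules on the left."""
    return math.comb(N, n)

def S(N, n):
    """Boltzmann entropy S = k ln W (defined up to an additive constant)."""
    return k_B * math.log(W(N, n))

N = 4
for n in range(N + 1):
    print(f"distribution ({n} - {N - n}): W = {W(N, n)}, S/k = {math.log(W(N, n)):.3f}")

# The uniform distribution (2 - 2) has the greatest weight and hence the greatest entropy
assert max(range(N + 1), key=lambda n: W(N, n)) == 2
```

The table reproduces the weights 1, 4, 6, 4, 1 from the text; entropy, like W, peaks at the uniform distribution.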

The second law of thermodynamics and its statistical interpretation

    Boltzmann formulation:

All processes in nature proceed in a direction leading to an increase in the probability of the state.

    Clausius' formulation:

Processes are impossible whose only final result would be the transfer of heat from a less heated body to a more heated one.

From the point of view of Boltzmann's formulation, the transfer of heat from a cold body to a hot one is possible in principle, but extremely unlikely.

Example. Using the Boltzmann formula, let us calculate, from the change in entropy of two bodies at temperatures 301 K and 300 K, the ratio of the probabilities of the corresponding states when a small amount of heat ΔQ is transferred from one body to the other. Denote by W₁ the probability of the state with the heat residing in the body at temperature T₁ = 300 K, and by W₂ the probability of the state with it in the body at T₂ = 301 K. From S = k ln W,

ln(W₁/W₂) = (S₁ − S₂)/k = ΔS/k.

Because the transferred energy is small, the entropy difference can be estimated from the relation

ΔS = ΔQ/T₁ − ΔQ/T₂ = ΔQ (T₂ − T₁)/(T₁T₂),

so that

W₁/W₂ = exp(ΔS/k).

This means that for every W₁/W₂ transfers of heat ΔQ from the body at 301 K to the body at 300 K there occurs only one case of transfer of the same amount of heat from the body at 300 K to the body at 301 K. (Note that for a very small amount of heat ΔQ the probabilities become comparable, and for such cases the second law can no longer be applied.)
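The original text does not specify the amount of heat, so the values of ΔQ in the sketch below are purely illustrative; it evaluates the probability ratio for the two bodies at 300 K and 301 K:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T1, T2 = 300.0, 301.0
dQ = 1.0e-16         # transferred heat, J -- an assumed illustrative value

# Entropy gain when dQ flows the "natural" way, from the hotter body to the colder:
dS = dQ / T1 - dQ / T2          # = dQ * (T2 - T1) / (T1 * T2) > 0

# By S = k ln W, the ratio of the probabilities of the two states is
ratio = math.exp(dS / k_B)
print(f"dS = {dS:.3e} J/K; the forward transfer is ~{ratio:.1e} times more probable")

# For a far smaller dQ the ratio approaches 1: the second law loses its force
dQ_small = 1.0e-21
ratio_small = math.exp(dQ_small * (T2 - T1) / (T1 * T2) / k_B)
assert ratio_small < 1.01
```

Even for a heat transfer of order 10⁻¹⁶ J the forward direction is overwhelmingly favored, while for 10⁻²¹ J the two directions are practically equiprobable, as the note in the text warns.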

Generally speaking, if several paths or processes are possible in a system, then by calculating the entropy of the final states one can theoretically determine the probability of one path or process or another without actually carrying them out; this is an important practical application of the formula connecting thermodynamic probability with entropy.

The concept of statistical weight (the term thermodynamic probability is also used) is one of the fundamental ones in statistical physics. To formulate its definition we must first define the concepts of macrostate and microstate.

The same state of a macroscopic body can be characterized in different ways. If the state is characterized by specifying macroscopic state parameters (pressure, volume, temperature, density, etc.), we will call such a state a macrostate.

If a state is characterized by specifying the coordinates and velocities of all molecules of the body, then such a state will be called microstate .

It is obvious that the same macrostate can be realized in different ways, that is, by different microstates. The number of different microstates through which a given macrostate can be realized is called its statistical weight, or thermodynamic probability.

To clarify these concepts, consider a model(!): a vessel containing N molecules. Suppose the vessel is divided into two identical halves, and that different macrostates differ in the number of molecules in the left and right halves. Therefore, within this model, the state of a molecule is considered given if it is known which half of the vessel it is in.

Different microstates differ in which molecules are on the right and which on the left. The arrangement 1, 2 - 3, 4 (as shown in Fig. 9.5) is one microstate; 1, 3 - 2, 4 is another.

Each molecule can be found on the left or on the right with equal probability, so the probability that the i-th molecule is, for example, on the right equals 1/2. The appearance of one molecule in the left half of the vessel and that of another are statistically independent events, so the probability of finding two given molecules on the left is (1/2)(1/2) = 1/4; three molecules, 1/8; four, 1/16, and so on. Therefore the probability of any particular arrangement (microstate) of the molecules equals (1/2)^N.

The statement that the probabilities of all microstates are equal is called the ergodic hypothesis, and it lies at the basis of statistical physics.

Let us consider N = 4. Each arrangement of the molecules in the halves of the vessel is a specific microstate. The macrostate with no molecules on the left corresponds to 1 microstate; its statistical weight equals 1, and the probability of its realization is 1/16. For the other macrostates the following can be stated:

the macrostate (2 - 2) corresponds to 6 microstates: statistical weight 6, probability 6/16;

the macrostate (1 - 3) corresponds to 4 microstates: statistical weight 4, probability 4/16;

the macrostate (0 - 4) corresponds to 1 microstate: statistical weight 1, probability 1/16.

Now it can be seen that, by virtue of the ergodic hypothesis, the statistical weight is proportional to the (ordinary!) probability of realization of a given macrostate.

If the vessel contains N molecules, it can be proven that the statistical weight of the macrostate with n molecules on the left and (N − n) on the right is

W = N!/(n!(N − n)!).

If for four molecules the probability of gathering in one of the halves of the vessel is 1/16, a quite noticeable value, then for N = 24 this probability is (1/2)^24, of the order of 10^-7.

Under normal conditions, 4 cm³ of air contains about 10^20 molecules. The probability of all of them gathering in one of the halves of the vessel is estimated as (1/2)^(10^20), i.e. about 10^(−3·10^19).
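The collapse of this probability with growing N is easy to make quantitative by working with decimal logarithms, since (1/2)^N itself underflows for large N. A sketch:

```python
import math

# Probability that all N molecules gather in one given half of the vessel: (1/2)^N.
# For large N it is convenient to report the decimal logarithm of this probability.
def log10_probability(N):
    return -N * math.log10(2.0)

for N in (4, 24, 10 ** 20):
    print(f"N = {N}: probability ~ 10^({log10_probability(N):.3g})")
```

For N = 4 this gives 10^(-1.2) = 1/16, for N = 24 about 10^(-7.2), and for N = 10^20 an exponent of order -3·10^19, matching the estimates in the text.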

Thus, as the number of molecules in the system increases, the probability of significant deviations from approximate equality of the numbers of molecules in the parts of the vessel falls off very quickly. Correspondingly, the statistical weight of states with approximately equal numbers of molecules in the halves turns out to be very large, and it decreases rapidly as the distribution deviates from equality.

If the number N is not very large, then over time noticeable deviations of the number of molecules in one of the halves from N/2 occur. Random deviations of a physical quantity x from its average value ⟨x⟩ are called fluctuations:

Δx = x − ⟨x⟩.

The arithmetic mean of the fluctuation itself equals zero. Therefore fluctuations are usually characterized by the mean square fluctuation

⟨(Δx)²⟩.

More convenient and indicative is the relative fluctuation

δ = √⟨(Δx)²⟩ / ⟨x⟩.

Moreover, in statistical physics the following relation is proved:

δ ∝ 1/√N,

i.e. the magnitude of the relative fluctuation is inversely proportional to the square root of the number of particles in the system. This statement confirms our qualitative conclusion above.
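The 1/√N law can be seen in a small simulation of the two-halves model; the trial count and the choice of N values are illustrative:

```python
import math
import random

random.seed(1)

def relative_fluctuation(N, trials=1000):
    """Simulate `trials` snapshots of N molecules choosing halves at random and
    return the relative fluctuation of the number found in the left half."""
    mean = N / 2.0
    sq = 0.0
    for _ in range(trials):
        n_left = sum(random.getrandbits(1) for _ in range(N))
        sq += (n_left - mean) ** 2
    return math.sqrt(sq / trials) / mean

# Theory predicts delta ~ 1/sqrt(N): growing N by a factor of 64
# should shrink the relative fluctuation roughly 8-fold.
r16 = relative_fluctuation(16)
r1024 = relative_fluctuation(1024)
print(f"N = 16: {r16:.4f}, N = 1024: {r1024:.4f}, ratio = {r16 / r1024:.1f}")
```

For the binomial model the exact value is δ = 1/√N (0.25 for N = 16, about 0.031 for N = 1024), and the simulated values land close to these.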

Similar to the number of molecules in one of the halves of the vessel, other macroscopic characteristics of the state - pressure, density, etc. - fluctuate near average values.

Let us consider the nature of equilibrium and nonequilibrium states and processes from the point of view of statistical physics. Equilibrium, by definition, is a state that shows no tendency to change with time. Clearly, of all the macrostates of the system this property belongs in the greatest measure to the most probable one, that is, to the state realized by the greatest number of microstates and therefore having the greatest statistical weight. The equilibrium state can thus be defined as the state whose statistical weight is maximal.

An example of a typical irreversible process is the spreading of gas molecules, initially concentrated in one half of a vessel, over its entire volume. This process is irreversible, since the probability that thermal motion will carry all the molecules back into one half of the vessel is vanishingly small. In general, a process is irreversible whenever its reverse is extremely improbable.


LECTURE No. 10. STATISTICAL PHYSICS AND THERMODYNAMICS

10.1. ENTROPY

As we have established, the probability of a state of a system is proportional to its statistical weight, so the statistical weight W itself could be used as a characteristic of the probability of the state. However, W is not an additive quantity. Therefore, to characterize the state of the system one uses the quantity

S = k ln W,

which is called the entropy of the system. Indeed, if we consider two systems of 4 molecules each, then the statistical weight of the state in which each subsystem has, for example, one molecule on the left equals W = W₁W₂ = 4·4 = 16. This multiplicative relation holds under any conditions; hence the statistical weight is non-additive. At the same time the entropy of the combined system, S = k ln(W₁W₂) = k ln W₁ + k ln W₂ = S₁ + S₂, is an additive quantity.
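The non-additivity of W and the additivity of S = k ln W can be checked directly on the two four-molecule subsystems just described:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Two independent subsystems of 4 molecules, each with one molecule on the left:
W1 = math.comb(4, 1)   # statistical weight of the first subsystem (= 4)
W2 = math.comb(4, 1)   # and of the second (= 4)

# Weights multiply (W is not additive) ...
W_total = W1 * W2
assert W_total == 16 and W_total != W1 + W2

# ... while entropies add:
S1, S2 = k_B * math.log(W1), k_B * math.log(W2)
S_total = k_B * math.log(W_total)
assert math.isclose(S_total, S1 + S2)
print(f"W = {W_total}, S = {S_total:.3e} J/K = S1 + S2")
```

Taking the logarithm is exactly what converts the multiplicative weight into an additive state function, which is the reason for the definition S = k ln W.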

Since during irreversible processes an isolated system passes from less probable to more probable states, it can be asserted that the entropy of an isolated system increases when irreversible processes occur in it.

The equilibrium state is the most probable state, which means the entropy of the system that has passed into an equilibrium state is maximum.

Therefore, it can be argued that the entropy of an isolated system remains constant if it is in an equilibrium state, or increases if irreversible processes occur in it.

The statement that the entropy of an isolated system does not decrease is called the second law of thermodynamics, or the law of increasing entropy.

Entropy is obviously a state function and must be determined by the state parameters. A monatomic ideal gas has the simplest properties: its state is completely determined by two parameters, for example temperature and volume. Accordingly its entropy can be defined as a function of temperature and volume, S = S(T, V). The corresponding calculation shows that the entropy of a mole of a monatomic ideal gas is given by

S = C_V ln T + R ln V + const,     (2)

where const is a certain constant, to within which the entropy is determined.

Now we can clarify how the entropy of a non-isolated system changes, for example when a certain amount of heat δQ is imparted to it. Take the differential of (2) and multiply it by T:

T dS = C_V dT + RT dV/V.     (3)

But C_V dT = dU is the increase in the internal energy of the gas, and from the equality PV = RT we have RT/V = P. Then (3) is transformed to the form

T dS = dU + P dV.     (4)

The quantities entering (4) are additive, and therefore (4) is true for any mass of gas:

T dS = dU + P dV.     (5)

According to the first law of thermodynamics, the right-hand side of (5) is the amount of heat δQ received by the gas. Therefore

dS = δQ/T.     (6)

Formula (6) turns out to be valid for any bodies; it is only necessary that the amount of heat be communicated reversibly.
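Formula (6) can be checked numerically for a concrete reversible process, say isochoric heating of one mole of a monatomic ideal gas (the temperatures below are illustrative): summing the contributions δQ/T = C_V dT/T step by step should reproduce ΔS = C_V ln(T2/T1), in agreement with (2) at constant volume.

```python
import math

R = 8.314          # molar gas constant, J/(mol*K)
C_V = 1.5 * R      # molar heat capacity of a monatomic ideal gas at constant volume

# Reversible isochoric heating of one mole from T1 to T2:
# each small step receives dQ = C_V * dT at temperature T, contributing dS = dQ / T.
T1, T2 = 300.0, 600.0
steps = 100000
dT = (T2 - T1) / steps
S_num = sum(C_V * dT / (T1 + (i + 0.5) * dT) for i in range(steps))

S_exact = C_V * math.log(T2 / T1)
assert abs(S_num - S_exact) < 1e-6
print(f"numerical dS integral: {S_num:.6f} J/K, C_V ln(T2/T1): {S_exact:.6f} J/K")
```

The step-by-step sum of δQ/T converges to the state-function difference ΔS, illustrating that entropy depends only on the end states of a reversible path.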

Let's dwell on physical essence of entropy .

Let us introduce definitions: a state realized in a relatively small number of ways will be called ordered, or non-random; a state realized in a large number of ways, disordered or random.

Then it can be asserted that entropy is a quantitative measure of the degree of disorder in a system. Imparting an amount of heat to the system intensifies the thermal motion of the molecules and hence increases the entropy. Moreover, the higher the temperature of the system, the smaller the share of disorder introduced by a given amount of heat; this is the physical meaning of formula (6).

If the amount of heat δQ is communicated to the system in an irreversible process, then its entropy increases not only because of the heat received, but also because of the irreversibility of the process itself, since an irreversible process is accompanied by an increase in the probability of the state of the system and of its statistical weight:

dS > δQ/T.     (7)

Here T in (7) means the temperature of the reservoir from which the system receives the heat δQ. Combining (6) and (7) together, we can write

dS ≥ δQ/T.

At absolute zero every system is in its ground state, i.e. the state with the lowest energy. The statistical weight of this single, well-defined state equals unity, which means the entropy of the system is zero. This corresponds to Nernst's theorem, according to which the entropy of any body tends to zero as its temperature tends to zero: S → 0 as T → 0.

Nernst's theorem is also called third law of thermodynamics .
