
Statistical Entropy


    Let us imagine that we have four colored blocks inside of a cardboard box that is divided in two by a line drawn down its center. Where would the blocks appear if you were to shake the box, set it down, and then open it to take a look? Well, let us examine the possibilities.

    Figure 2: Possibilities for colored blocks in a box.

    In figure 2 you can see that after opening the box there is a myriad of possible color combinations. But these color combinations follow a pattern. Notice that there is only one way for all four blocks to be on the left side of the box, while there are six ways the blocks can be arranged so that two blocks are on each side. Notice also that the distribution of these arrangements resembles a Gaussian curve. More possibilities exist for two blocks on each side of the box, so most of the time the blocks will be found in that arrangement. Let’s call the number of possible arrangements, or microstates, \(W\). As figure 2 demonstrates, there are 6 microstates in which two blocks are on each side of the box. To calculate this value we can use the formula

    \[ W = \dfrac{N!}{n_{left}!\, n_{right}!} \tag{1}\]
    where \(N\) is the total number of particles in the box, and \(n_{left}\) and \(n_{right}\) are the number of particles on the left and right sides, respectively. This equation generalizes to a system divided into an arbitrarily large number of regions:
    \[W = \dfrac{N!}{\prod_i n_i!} \tag{2}\]
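Equation 2 is easy to evaluate directly. The sketch below (the function name `microstates` is ours, not from the text) reproduces the counts in figure 2 for four blocks split between the two halves of the box:

```python
from math import factorial

def microstates(*counts):
    """Number of ways to arrange N = sum(counts) distinguishable blocks
    so that counts[i] of them land in region i (equations 1 and 2)."""
    n_total = sum(counts)
    w = factorial(n_total)
    for n in counts:
        w //= factorial(n)  # integer division is exact here
    return w

# Four blocks, n_left ranging from 0 to 4:
print([microstates(n_left, 4 - n_left) for n_left in range(5)])
# → [1, 4, 6, 4, 1]
```

The middle entry, `microstates(2, 2) == 6`, is the six two-and-two arrangements, and the row `[1, 4, 6, 4, 1]` is the Gaussian-like pattern noted above.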

    To find the probability (P) that the red block will appear on the left side of the box we can simply count the number of times where this occurs and divide by the total number of possibilities.

    \[ P_{r,left} = \dfrac{W_{r,left}}{W_{tot}} = \dfrac{8}{16} = \dfrac{1}{2} \tag{3}\]

    To find the probability of finding both the red and green blocks on the left side of the box we follow the same pattern

    \[ P_{rg,left} = \dfrac{W_{rg,left}}{W_{tot}} = \dfrac{4}{16} = \dfrac{1}{2} \times \dfrac{1}{2} = \dfrac{1}{4} \tag{4}\]
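Both probabilities can be checked by brute force: with four blocks there are only \(2^4 = 16\) microstates, so we can enumerate them all. A minimal sketch (the helper `probability` is ours):

```python
from itertools import product
from fractions import Fraction

blocks = ["red", "green", "blue", "yellow"]
# Every left/right assignment of the four blocks: 2**4 = 16 microstates.
arrangements = list(product(["left", "right"], repeat=len(blocks)))

def probability(condition):
    """Fraction of microstates satisfying the given condition."""
    favorable = sum(1 for a in arrangements if condition(dict(zip(blocks, a))))
    return Fraction(favorable, len(arrangements))

p_red = probability(lambda side: side["red"] == "left")          # equation 3
p_rg = probability(lambda side: side["red"] == "left"
                   and side["green"] == "left")                  # equation 4
print(p_red, p_rg)
# → 1/2 1/4
```

Note that `p_rg == p_red * p_red`: the blocks land independently, which is why equation 4 factors into \(\frac{1}{2} \times \frac{1}{2}\).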

    Now, what if instead of four blocks we have Avogadro’s number of blocks (\(6.022 \times 10^{23}\))? Then the number of possible arrangements becomes EXTREMELY large, and as a result the probability of all the blocks being on the left side of the box gets arbitrarily close to 0 because
    \[ P_{n,left} = \left(\dfrac{1}{2}\right)^n \tag{5}\]
    where \(n\) is the number of particles on the left side of the box. We can think of entropy as a measure of the number of possibilities available in a given state. For example, the middle column in figure 2 has the most entropy because the highest number of possibilities exists there. Thus we might be tempted to say that \(S \propto W\); however, this is not the case, because \(W \propto V^N\), where \(V\) is the volume of the box and \(N\) is the number of particles in the box. Entropy should be extensive, growing in proportion to \(N\) rather than exponentially with it, so to remedy this proportionality we can simply take the natural log of \(W\), because of the following property
    \[ \ln (V^N) = N \ln (V) \tag{6} \]
    We can connect \(S\) and \(\ln(W)\) with Boltzmann’s constant \(k_B = 1.38 \times 10^{-23}\, \text{J/K}\):
    \[ S = k_B \ln(W) \tag{7}\]
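Equation 7 can be evaluated for the block example, and it also makes the point about logarithms concrete: microstate counts of independent systems multiply, so their entropies add. A short sketch (the `entropy` helper is ours; the \(k_B\) value is the CODATA figure, slightly more precise than the rounded value above):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def entropy(W):
    """Statistical entropy S = k_B ln(W), equation 7."""
    return k_B * math.log(W)

# Entropy of the most probable (two-and-two) arrangement, W = 6:
print(entropy(6))  # a few times 10**-23 J/K

# Because ln(W1 * W2) = ln(W1) + ln(W2), entropies of independent
# systems add even though their microstate counts multiply:
W1, W2 = 6, 70
assert math.isclose(entropy(W1 * W2), entropy(W1) + entropy(W2))
```

This additivity is exactly what the \(\ln(V^N) = N \ln(V)\) property of equation 6 buys us: \(S\) scales with the number of particles, as an extensive quantity should.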


    Statistical Entropy is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by LibreTexts.
