In any case, this Schottky defect is a prime example of what we call a point defect. Defects occur in all solids at any finite temperature, and this can be shown by a fairly simple thermodynamic argument. Instead of the usual ΔG = ΔH - TΔS treatment that we apply in Chemistry, let us consider a more top-down approach that is much more complete, albeit still lacking in details.
Now, at absolute zero, the Third Law of Thermodynamics tells us that a perfect crystalline solid has zero entropy: its atoms are arranged perfectly in a regular crystal lattice, with no disorder at all (we neglect any residual entropy in this discussion for simplicity!). As the temperature increases, there is a corresponding increase in thermal agitation that tends to produce defects in the crystalline structure - defects being misalignments or irregularities in an otherwise regular array of atoms in the solid.
When the temperature increases, the atoms vibrate ever more vigorously about their lattice positions, and occasionally an atom can be displaced from its lattice site altogether and migrate to the surface of the solid. This leaves behind a vacant lattice site, which is referred to as a Schottky defect. The diagram below shows a perfect ordering in two dimensions, and a Schottky defect occurring:
And that's basically it: a Schottky defect forms due to disorganizing thermal motion. What we wish to do now is to determine how the number of Schottky defects varies with the absolute temperature for a crystal in thermal equilibrium at a temperature T. Well, let us put in place certain assumptions:
1) The energy associated with a Schottky defect is ε: in other words, we take the zero of energy to be that of an atom within the lattice, so that an atom on the surface of the solid has energy ε relative to an inner atom - this is the energy required to produce one defect.
2) For N atoms, let there be n defects, such that the total energy is E = nε: by saying this, we are assuming that there are relatively few defects compared to the total number of atoms (n is much less than N). As such, as shown in the diagram below, all defects are well spaced from one another, so that each defect is surrounded by a regular array of atoms.
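With these assumptions, and taking the usual counting in which the n vacancies are distributed over the N lattice sites, the statistical weight and the corresponding entropy are:

Ω = N! / [n!(N − n)!]

S = k ln Ω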
Now you might say: isn't this expression too easy? Using simple statistics for the statistical weight? And then for the entropy? Well, you're right - we have made some additional assumptions:
1) We have neglected surface effects: there is an additional entropy term that we have left out, corresponding to the number of ways you can arrange the atoms on the surface of the solid. The reason is simple: for a typical amount of solid, say one mole, we have about 10^23 atoms present. Since the number of atoms on the surface scales as the two-thirds power of this, we have roughly (10^23)^(2/3) ≈ 10^15 sites on the surface of the solid. Comparing this number (~10^15) to the number of inner atoms (~10^23), we see that surface effects can be safely neglected under normal conditions.
2) We have neglected vibrational effects: earlier, we mentioned that the atoms vibrate about their equilibrium positions, and this too contributes to the entropy, particularly at higher temperatures.
However, neglecting these two effects doesn't mean we're calculating the wrong entropy - it just means that we're calculating an incomplete entropy. The key point here is that surface effects, defects and vibrations each have their own entropy and energy terms: we can calculate their energies separately (and likewise their entropies) and sum them up later, and the result is still correct. The basis for such a treatment is that each contribution comes from a subsystem, and these subsystems interact only weakly. Because they interact weakly, they are essentially independent of one another, and thus we can separate out their thermodynamic variables.
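In symbols, and under the assumption of weak coupling just described, the contributions simply add:

S ≈ S_defects + S_surface + S_vibrations,  and likewise  E ≈ E_defects + E_surface + E_vibrations,

and in what follows we work with the defect contribution alone.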
This is reminiscent of an isolated system split into two or more subsystems, where each subsystem has its own characteristic temperature; it is these temperatures (and other variables) that must become equal to one another when thermal equilibrium is established within the solid.
Now that the problems are out of the way, let us consider a mathematical simplification by applying Stirling's approximation, which states that for large values of x:
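ln x! ≈ x ln x − x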
If we apply this to the entropy of the system, we now have:
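S ≈ k [N ln N − n ln n − (N − n) ln(N − n)]

∂S/∂n = k ln[(N − n)/n]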
I've obtained the derivative of S with respect to n in anticipation that we'll need it later. Now, we'll make use of the basic definition of temperature in terms of entropy so that:
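1/T = ∂S/∂E = (∂S/∂n)(∂n/∂E)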
The right hand side is simply an extension of the middle term where I've used the chain rule; now that we've obtained the partial derivative of S with respect to n, let us obtain the other derivative:
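∂n/∂E = 1/ε   (since E = nε)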
Putting everything together we obtain the final expression:
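1/T = (k/ε) ln[(N − n)/n],  or equivalently  n/(N − n) = e^(−ε/kT)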
Let us keep in mind that n << N, so that N − n ≈ N and the expression simplifies to:
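n/N ≈ e^(−ε/kT)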
And voila! This expression has the form of the much-celebrated Boltzmann distribution! An amazing thing, don't you think?
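To get a feel for the numbers, here is a small Python sketch evaluating n/N ≈ e^(−ε/kT) at a few temperatures; the formation energy ε = 1 eV used below is just an assumed, illustrative value rather than data for any particular material.

import math

k_eV = 8.617e-5            # Boltzmann constant in eV/K
epsilon = 1.0              # assumed defect formation energy in eV (illustrative only)

for T in (300, 600, 1000, 1500):
    fraction = math.exp(-epsilon / (k_eV * T))   # n/N in the n << N limit
    print(f"T = {T:5d} K  ->  n/N ≈ {fraction:.3e}")

With this assumed ε, the fraction is vanishingly small at room temperature and grows rapidly as the temperature rises.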
Just ruminate over it. :p