Saturday, January 26, 2008

Repeat After Me: Schottky, Not Schotty Defect!

What a lovely Saturday night for reminiscing; ever since learning about point defects in my Inorganic Chemistry module, I've never once gotten the pronunciation right. Can you believe it?

Schottky Defect

Amazing. I keep pronouncing it as 'Schotty' by insisting on the absence of a 'k'; I should really give the guy some credit! Haha.

In any case, this Schottky Defect is a prime example of what we call a point defect; basically defects occur in all solids, and this can be easily proven via a consideration of thermodynamic parameters. Instead of the usual ΔG = ΔH - TΔS treatment that we usually apply in Chemistry, let us consider a more top-down approach that is much more complete, albeit still lacking in details.

Now, at absolute zero, we know that according to the Third Law of Thermodynamics the atoms of a perfect crystalline solid are arranged perfectly in a regular crystal lattice, hence giving rise to zero entropy and zero disorder (we neglect any residual entropy in this discussion for simplification purposes!). Now, as the temperature increases, there will be a corresponding increase in thermal agitation that tends to produce defects in the crystalline structure - defects are basically misalignments or irregularities in an otherwise regular array of atoms in the solid.

When the temperature increases, the atoms go into a more frenzied state of vibration about their lattice positions, and sometimes an atom can actually be displaced from its lattice site and migrate to the surface of the solid. This results in a vacant lattice site, which is referred to as a Schottky defect. The diagram below shows a perfect ordering in two dimensions, and a Schottky defect occurring:

And that's basically it, a Schottky defect that forms due to disorganizing thermal motion. Now what we wish to do now is to determine how the number of Schottky defects varies with the absolute temperature for a crystal that is in thermal equilibrium at a temperature T. Well, let us put into place certain assumptions:

1) The energy associated with a Schottky defect is ε: in other words, we say that the zero of energy is an atom within the lattice, and the energy of an atom on the surface of the solid with respect to an inner atom is ε, which is what is required to produce a defect.

2) For N atoms, let there be n defects, such that the total energy is nε: by saying this, we are saying that there are relatively few defects compared to the total number of atoms (n is much smaller than N). As such, as shown in the diagram below, all defects are well spaced from one another, such that each defect is surrounded by a regular array of atoms:

3) We have assumed that n is much smaller than N: in general, this is true at temperatures below about 500 K, because the energy of a Schottky defect is roughly of the order of 1 eV, while the thermal energy kT at 300 K is around 1/40 eV. As such, we say that very few defects form at normal to moderate temperatures.
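The comparison in assumption 3 is easy to check numerically. Here is a quick sketch in Python (the function names are mine; the Boltzmann constant is taken in eV/K):

```python
import math

# Boltzmann constant in eV/K
k_B = 8.617e-5

def thermal_energy_eV(T):
    """Thermal energy scale kT in eV at absolute temperature T (in K)."""
    return k_B * T

def boltzmann_factor(epsilon_eV, T):
    """Rough suppression factor e^(-epsilon/kT) for creating one defect."""
    return math.exp(-epsilon_eV / (k_B * T))

kT_300 = thermal_energy_eV(300)     # ~0.026 eV, i.e. roughly 1/40 eV
factor = boltzmann_factor(1.0, 300) # ~e^(-39): an astronomically small number
```

With ε ≈ 1 eV and kT ≈ 1/40 eV at room temperature, the suppression factor is around 10^-17, which is why so few defects form at moderate temperatures.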

With these three assumptions, we can now say that:

Total energy of system, E(n) = nε

The next step is then to determine the statistical weight (or thermodynamic probability) of a typical macrostate of the solid crystal. Going by the above assumptions, consider a crystal composed of N atoms with n defects as our macrostate; the number of microstates (i.e. the number of ways to obtain such a configuration) is simply the number of ways one can select n lattice sites from N lattice sites:

Ω(n) = N! / [n! (N - n)!]

The corresponding entropy term is then given by Boltzmann's formula:

S = k ln Ω = k ln {N! / [n! (N - n)!]}
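For a toy crystal small enough to count exactly, the statistical weight and entropy can be computed directly (a sketch in units where k = 1; the function names are mine):

```python
import math

def microstates(N, n):
    """Number of ways to choose n defect sites out of N lattice sites."""
    return math.comb(N, n)

def entropy(N, n, k=1.0):
    """Configurational entropy S = k ln(Omega), here in units where k = 1."""
    return k * math.log(microstates(N, n))

# Toy crystal: 100 lattice sites, 3 defects
omega = microstates(100, 3)  # 161700 distinct configurations
S = entropy(100, 3)
```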

Now you might say: isn't this expression too easy? Using simple statistics for the statistical weight? And then for the entropy? Well, you're right, we have made some additional assumptions:

1) We have neglected surface effects: basically, there is an additional entropy term that we neglected, which corresponds to the number of ways you can arrange the atoms on the surface of the solid. But the reason for this is simple: for a typical amount of solid, say one mole, we have about 10^23 atoms present. Since the number of atoms on the surface scales as (10^23)^(2/3), we have roughly 10^15 to 10^16 sites on the surface of the solid. Comparing this number to the total number of atoms (~10^23), we see that surface effects can be suitably neglected under normal conditions.
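The surface-to-bulk estimate above is a one-liner to verify:

```python
# One mole: about 10^23 atoms (order-of-magnitude, as in the text)
N_bulk = 1e23

# Surface sites scale as N^(2/3) for a roughly cubic chunk of solid
N_surface = N_bulk ** (2 / 3)  # a few times 10^15 sites

ratio = N_surface / N_bulk     # ~10^-8: surface effects are negligible
```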

2) We have neglected vibrational effects: earlier, we mentioned that the atoms do vibrate about their equilibrium positions, and this in fact contributes to the entropy as well at higher temperatures.

However, having neglected these two effects doesn't mean we're calculating the wrong entropy - it just means that we're calculating an incomplete entropy term. The key point here is that surface effects, defects and vibrations each have their own entropy and energy term - we can then calculate their energies separately (same goes for their entropy) and then sum them up later on, and it is still correct. The basis for such a treatment is that each component is a subsystem and they all interact weakly. Because they interact weakly, they are essentially independent of one another, and thus we can separate out their thermodynamic variables.

This is reminiscent of having an isolated system being split into two or more subsystems, where each subsystem has its own characteristic temperature; it is these temperatures (and other variables) that must be equal to one another when thermal equilibrium ensues within the solid.

Now that the problems are out of the way, let us consider a mathematical simplification by application of Stirling's approximation, which states that for large values of x:

ln x! ≈ x ln x - x
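How good is Stirling's approximation in this regime? A quick numerical check (using the log-gamma function, since ln Γ(x + 1) = ln x!):

```python
import math

def ln_factorial_exact(x):
    """Exact ln(x!) via the log-gamma function: ln Gamma(x+1) = ln(x!)."""
    return math.lgamma(x + 1)

def ln_factorial_stirling(x):
    """Stirling's approximation: ln(x!) ~ x ln x - x, valid for large x."""
    return x * math.log(x) - x

x = 1_000_000
exact = ln_factorial_exact(x)
approx = ln_factorial_stirling(x)
rel_error = abs(exact - approx) / exact  # well under 0.001% at x = 10^6
```

For the ~10^23 atoms in a real crystal, the relative error is utterly negligible, which justifies the simplification below.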

In which case, if we apply it to the entropy of the system, we now have:

S = k [N ln N - n ln n - (N - n) ln(N - n)]

I've obtained the derivative of S with respect to n in anticipation that we'll need it later:

∂S/∂n = k ln[(N - n)/n]

Now, we'll make use of the basic definition of temperature in terms of entropy, so that:

1/T = ∂S/∂E = (∂S/∂n)(∂n/∂E)

The right-hand side is simply an extension of the middle term, where I've used the chain rule; now that we've obtained the partial derivative of S with respect to n, let us obtain the other derivative. Since E(n) = nε, we have:

∂n/∂E = 1/ε

Putting everything together, we obtain the final expression:

1/T = (k/ε) ln[(N - n)/n], which rearranges to give n/(N - n) = e^(-ε/kT)

Let us keep in mind that n << N, so that N - n ≈ N; the expression then simplifies to:

n/N ≈ e^(-ε/kT)

And voila! This expression has the form of the much-celebrated Boltzmann distribution! An amazing thing, don't you think?
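To see what this Boltzmann-type expression for the defect fraction n/N actually predicts, here is a short numerical sketch under the earlier assumption ε ≈ 1 eV (the function name is mine):

```python
import math

k_B = 8.617e-5  # Boltzmann constant in eV/K

def defect_fraction(epsilon_eV, T):
    """Equilibrium fraction of Schottky defects: n/N = exp(-epsilon/kT)."""
    return math.exp(-epsilon_eV / (k_B * T))

f_300 = defect_fraction(1.0, 300)    # ~10^-17: essentially a perfect crystal
f_1000 = defect_fraction(1.0, 1000)  # ~10^-5: defects become noticeable
```

The steep temperature dependence is the whole story: near room temperature the crystal is essentially defect-free, while close to the melting point the defect population grows by many orders of magnitude.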

Just ruminate over it. :p
