This idea for a post has been percolating for a while. I feel like now I am finally ready to share it.
Last semester, in the class 8.13 — Experimental Physics I, one of the experiments that I did was investigating the phenomena of Johnson noise and shot noise and using these to find, respectively, the Boltzmann constant and the electron charge. The other experiments that I did were investigating properties of hydrogen-like atoms through spectroscopy and determining the speed and decay times of cosmic-ray muons. By far the best and worst experience was with Johnson noise. Follow the jump to read on.
I feel that the physics behind Johnson noise, compared to the other experiments, is truly spectacular. Part of me is biased because I have become rather interested in statistical mechanics (and more of that will come in future posts), but I think it is really profound how a microscopic quantity like the Boltzmann constant can be measured through purely macroscopic means. Nyquist's paper on the subject needs a few read-throughs to be fully understood, but the way the physics is presented there is amazing. He essentially uses basic circuit theory and thermodynamics to argue for the form of Johnson noise. I would like to share the argument here, though the paper is certainly well worth reading.
Consider two resistors $R$ connected by parallel wires with no impedance of any kind. Electrons move in the wires thanks to thermal agitation at a temperature $T$. If the system is in thermal equilibrium, then the power $P = RI^2$ transferred from the first resistor to the second is the same as that transferred from the second resistor to the first, regardless of the details of the resistors (e.g. shape, chemical composition).
Now suppose that an inductor $L$ and a capacitor $C$ in series connect the two wires, such that $R^2 C = L$. If the exchanged power depended on anything besides resistance and temperature, then near the resonance of this inductor-capacitor element one resistor could deliver more power to the other than it receives back, so heat would flow between two bodies at the same temperature; this violates the second law of thermodynamics. Hence, the voltage from thermal excitations of electrons depends only on resistance and temperature.
Now consider the original situation, modified so that the parallel wires have inductance and capacitance per unit length such that the characteristic impedance of the transmission line is equal to the resistance at either end: $Z_0 = \sqrt{\frac{L}{C}} = R$. This implies that the transmission line is impedance-matched at either end, so any power sent from one resistor will be fully absorbed (i.e. no reflection) by the other resistor, and vice versa.
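The impedance-matching claim can be checked with a quick numerical sketch (the component values below are hypothetical, chosen to give the familiar $Z_0 = 50\,\Omega$): the voltage reflection coefficient $\Gamma = \frac{R - Z_0}{R + Z_0}$ at a termination vanishes exactly when $R = Z_0 = \sqrt{\frac{L}{C}}$, so no power is reflected.

```python
import math

def reflection_coefficient(R, L, C):
    """Voltage reflection coefficient seen by a wave hitting a
    termination R on a lossless line whose per-unit-length
    inductance and capacitance give Z0 = sqrt(L/C)."""
    Z0 = math.sqrt(L / C)
    return (R - Z0) / (R + Z0)

# Hypothetical per-unit-length values: L = 250 nH/m, C = 100 pF/m
# give Z0 = 50 ohms.
L, C = 250e-9, 100e-12
print(reflection_coefficient(50.0, L, C))   # matched termination: 0.0
print(reflection_coefficient(75.0, L, C))   # mismatched: some reflection
```

A matched termination absorbs everything sent down the line, which is exactly the condition the argument needs.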
Electromagnetic waves will be traveling in both directions. If the resistors at each end are now shorted out, an electromagnetic cavity of sorts has been formed, and the waves in each direction get superposed into standing waves. The boundary conditions require that the amplitude of the waves be zero at each end. If they propagate with velocity $v$ and if the separation between the shorts is $\ell$, then the allowed frequencies (i.e. the ones that satisfy these boundary conditions) are $\nu_n = \frac{vn}{2\ell}$ for $n \in \mathbb{N}$. Each frequency labeled by $n$ gives two electromagnetic modes/degrees of freedom because of the contributions of the electric and magnetic fields. Because a very large number of such modes are allowed by assumption, $\nu$ can be approximated as a continuous variable. The equipartition theorem says that if the Hamiltonian (i.e. the energy) is a quadratic function of a given variable, then that variable contributes $\frac{k_B T}{2}$ to the thermal energy of the system in an ensemble average. In this case, the Hamiltonian density (energy per unit volume) is $H = \frac{1}{8\pi}\left(E^2 + B^2 \right)$; as there are two variables that are quadratic for each mode, each electromagnetic mode contributes $k_B T$ to the thermal energy. (UPDATE: saying that the contribution is $k_B T$ for each electromagnetic mode simply because the Hamiltonian is quadratic in the electric field and quadratic in the magnetic field is a naive way to view this situation. As a counterexample, using the notation $u^2 \equiv \mathbf{u} \cdot \mathbf{u}$, consider $H = \frac{p^2}{2m}$: it seems like it would contribute only $\frac{k_B T}{2}$ to the thermal energy, yet in reality it contributes $\frac{3k_B T}{2}$, because it is quadratic in each of the three components of $\mathbf{p}$. This would then imply that, as $\mathbf{E}$ and $\mathbf{B}$ are 3-dimensional vectors, they should each contribute $\frac{3k_B T}{2}$ to the thermal energy.
However, this too is naive, because $\mathbf{E}$ and $\mathbf{B}$ are not free to exhibit all three degrees of freedom each. A transmission line like the one in this model supports TEM modes, meaning that the electric and magnetic fields are orthogonal to each other and to the direction of propagation. This means that the electric and magnetic fields have only one degree of freedom each despite being 3-dimensional vectors; seen from another angle, once the direction of propagation is chosen, one field can lie anywhere in the transverse plane consistent with the boundary conditions, and the other field then has its direction fixed by the direction of propagation and the direction of the first field. Indeed, then, the total thermal energy contribution from the electromagnetic field is $k_B T$ per mode rather than $3k_B T$.) Hence, in the continuum approximation of allowed frequencies, the thermal power in a finite range of frequencies is $P = \int k_B T \, d\nu.$
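The mode-counting step above can be sketched numerically. The snippet below (with hypothetical values for the propagation speed and line length) counts the allowed standing-wave frequencies $\nu_n = \frac{vn}{2\ell}$ in a band, assigns each mode $k_B T$ of energy, and recovers the continuum result $P = k_B T \, \Delta\nu$ for the power flowing toward one resistor.

```python
k_B = 1.380649e-23  # Boltzmann constant (J/K)

def johnson_power(T, nu_min, nu_max, v=2.0e8, length=1.0e5):
    """Power absorbed by one matched resistor in the band
    [nu_min, nu_max], computed by discrete mode counting on a
    shorted line (v and length are hypothetical; the answer
    should not depend on them once the line is long)."""
    spacing = v / (2.0 * length)          # mode spacing v/(2*l)
    n_modes = int(nu_max // spacing) - int(nu_min // spacing)
    energy = n_modes * k_B * T            # k_B*T of energy per mode
    transit_time = length / v             # time for a wave to cross the line
    # Half the energy travels toward each resistor, and delivering
    # it takes one transit time, so the one-way power is:
    return (energy / 2.0) / transit_time

# A 1 MHz band at room temperature versus the continuum P = k_B*T*dnu:
p_discrete = johnson_power(300.0, 10.0e6, 11.0e6)
p_continuum = k_B * 300.0 * 1.0e6
```

Note that the line length and wave speed cancel out, as they must for the result to be a statement about the resistors alone.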
Because there are two resistors $R$ in this circuit, the current is related to the voltage by $V = 2RI$, or $I = \frac{V}{2R}$. The power transferred, as before, is $P = RI^2$, which becomes $P = \frac{V^2}{4R}$. Hence, the Johnson noise voltage is related to the resistance and temperature by \[ \frac{V^2}{4R} = \int k_B T \, d\nu. \] More generally, if the resistance is replaced by a frequency-dependent impedance (thanks to the presence of other circuit elements) whose real part is $R(\nu)$, then \[ V^2 = 4k_B T \int R(\nu) \, d\nu, \] which is the familiar expression for the Johnson noise voltage. (Please do correct me in the comments if I have messed up anywhere.)
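Rearranged, this last formula is exactly how $k_B$ gets extracted in the lab: measure the mean-square noise voltage over a known bandwidth and divide. Here is a sketch with made-up (but physically plausible) numbers, not real data from the experiment.

```python
def boltzmann_from_noise(v_rms, R, T, bandwidth):
    """Invert <V^2> = 4 k_B T R (bandwidth) to estimate k_B from a
    measured rms noise voltage."""
    return v_rms**2 / (4.0 * R * T * bandwidth)

# Made-up numbers: a 10 kOhm resistor at 300 K measured over a
# 10 kHz band yields roughly 1.3 microvolts rms of Johnson noise.
k_B_est = boltzmann_from_noise(1.29e-6, 10e3, 300.0, 10e3)
print(k_B_est)  # close to the accepted 1.38e-23 J/K
```

The microvolt scale of the signal is also why the experiment is so sensitive to pickup, as the list below attests.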
Isn't this incredible? First of all, a microscopic quantity $k_B$ can be found purely through macroscopic measurements of $V$, $R$, $T$, and $\nu$. Second, the formula was found in a completely general manner; the shape, chemical composition, and other properties of the circuit matter not one bit, so the approximation of the circuit as a transmission line is totally fine.
So the underlying physics is really cool. But what about the experiment itself? Well, to say the least, it evokes pretty much the full spectrum of emotions possible in a human being:
- Curiosity: the physics looks so cool, and the experimental setup looks pretty interesting to say the least.
- Relaxedness: the calibration is fairly easy and quick to do.
- Confusion: is that really Johnson noise or are the wires functioning as a radio antenna for a faraway source?
- Suspicion: no, that doesn't really look like Johnson noise, though the voltage amplitude is low, and I have zoomed in horizontally quite a bit.
- Self-control: massage and caress the Johnson noise box and the preamplifier; somehow, that messes with the degrees of freedom in the desired way.
- Anger: spank the Johnson noise box and the preamplifier...hey it works! Except now it doesn't. What the heck happened?
- Outrage: the muon experiment across the bench is affecting our calibration!? How on Earth are we supposed to do anything now?
- Superstition: quick, wrap the twisted pair in more shielding aluminum foil! That'll block out further incoming radiation!
- Horror: what happened to our calibration this time?
- Frustration: why is the Arduino controller for the oven so terrible? Aren't Arduino units supposed to be amazing all the time?
- Resignation: please, somebody, just help us!
- Craziness: sing songs to the Johnson noise box to make it calm down and fall asleep...except that (1) the Johnson noise box is not a sentient being and (2) hey it worked...except now it doesn't.
- Joy: $k_B$ is what it should be!
On a side note, you may have noticed that I'm able to type $\mathrm{\small{\LaTeX}}$ into Blogger. I'm able to do that by inserting the code
<script type="text/x-mathjax-config"> MathJax.Hub.Config({tex2jax: {inlineMath: [['\$','\$'], ['\\(','\\)']]}}); </script> <script src="http://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML" type="text/javascript"></script>
(remove the backslashes before each dollar sign; they are there only so that the dollar signs display in this post and do not get eaten by $\mathrm{\small{\LaTeX}}$) at the beginning of the HTML code for each post. This is how I will be able to insert mathematical typesetting into posts from now on.