We know that Planck was a great physicist because Planck’s constant is named after him. Likewise, the strength of forces is measured in units named after Newton. So you shouldn’t be surprised to learn that the physicists’ unit for the frequency of light waves is named after the man who first demonstrated that light really is a wave—Heinrich Hertz.
Through a series of exceptionally challenging experiments, Hertz demonstrated in the laboratory that oscillating charges emit waves that move exactly the way Maxwell predicted about 15 years before. How ironic, then, that Hertz stumbled upon, and then brushed aside, a side effect that would eventually demonstrate that light is not a wave after all—or, at least, not all of the time.
Hertz created radio waves in his lab by charging a pair of metallic spheres so much that a huge electric spark would jump from one sphere to the other. He found that he got the juiciest sparks when he kept the spheres nicely polished. He also found that when ultraviolet light shone on the negatively charged sphere, the sparks flew all the easier.
Since this effect involved both light and electricity, it was christened the photoelectric effect. About 10 years later, after J. J. Thomson had discovered the electron, it was recognized that the spark between the spheres was basically a stream of electrons. This led to a physical interpretation of the effect of ultraviolet light on the sparks: if the light delivered enough energy to the electrons residing within the metal, those electrons could escape the atoms to which they were bound. A fixed amount of energy was required to liberate an electron in the first place, and any extra energy would be carried away by the liberated electron in the form of kinetic energy.
DEFINITION
The photoelectric effect is the ejection of electrons that occurs when electromagnetic radiation shines on a (typically metal) surface.
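To put that energy bookkeeping into symbols, here is a minimal sketch (the labels $E_{\text{delivered}}$ for the energy the light hands to an electron, $W$ for the fixed liberation energy, and $KE$ for the leftover kinetic energy are convenient names of our own, not terms introduced above):

$$E_{\text{delivered}} = W + KE$$

In other words, whatever energy the light delivers beyond the fixed amount $W$ shows up as the speed of the escaping electron.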
So what does this have to do with Maxwell and his waves? First of all, Maxwell’s equations tell us that the energy of a light wave has nothing to do with its frequency. Therefore, provided you shine your light brightly enough, you should always liberate electrons, no matter what frequency of light you happen to be using. Yet in the laboratory, physicists saw that no electrons were ejected at all when they used light with a frequency below a certain threshold (or “cut-off”) frequency. Moreover, as they increased the frequency of their light sources, the electrons that were ejected traveled faster and faster.
Second, Maxwell’s classical theory tells us that the energy of a light wave is determined by its intensity. This implies that the brighter you shine your light, the faster the liberated electrons should whiz through the lab. In reality, however, we find that the kinetic energy has no dependence whatsoever on the light’s intensity. As scientists increased the brightness of their light sources, they simply produced more electrons with the same kinetic energy.
Lastly, if you were to shine a very dim light on the metal, Maxwell’s theory would tell us that you’d have to wait a little while for the electrons to accumulate enough energy to break free of their atoms. His equations would even let you calculate just how long you’d have to wait. However, experiment proved that no such time lag existed, no matter how dimly physicists shined their light—provided, of course, that it was of high enough frequency.
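As a rough sketch of that classical wait-time calculation (the light intensity, atom size, and liberation energy below are illustrative assumptions, not values from the text), the wave picture says an electron can only soak up the energy that the wave delivers to an atom-sized target:

```python
# Rough classical estimate of how long an electron would need to soak up
# enough wave energy to escape -- all input numbers are illustrative assumptions.
import math

intensity = 1e-2             # assumed light intensity in W/m^2 (a dim source)
atom_radius = 1e-10          # assumed atomic radius in meters
liberation_energy_eV = 2.0   # assumed fixed energy needed to free one electron, in eV

eV_to_joules = 1.602e-19
liberation_energy = liberation_energy_eV * eV_to_joules   # joules

# In the wave picture, the electron collects only the energy the wave
# delivers to an atom-sized cross-sectional area.
target_area = math.pi * atom_radius**2           # m^2
power_collected = intensity * target_area        # watts (joules per second)

wait_time = liberation_energy / power_collected  # seconds

print(f"Classical wait time: about {wait_time:.0f} seconds "
      f"({wait_time/60:.0f} minutes)")
# With these assumed numbers, the wave picture predicts a wait of many minutes,
# while experiments saw electrons ejected with no measurable delay.
```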