Hi all
We just began studying the photoelectric effect and the quantum nature of photons in our physics class. While our textbook describes black-body radiation fairly accurately, it fails completely in one respect: how can quantisation of energy be used to derive a theoretical emission curve resembling the experimental one? Why does intensity decrease as the frequency of the emitted radiation increases past the peak? Thinking about this a little more, I concluded that as a black body increases in temperature, the frequency of vibration in the body also increases. That would explain the rise in high-frequency intensity as temperature increases, but it fails to explain the decrease. I am aware of the implications of classical theory and of the principle of conservation of energy, but I am looking for a more technical answer: why does the intensity fall off in the pattern it does?
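To make the curve I'm asking about concrete, here is a quick numerical sketch of Planck's law (the standard textbook formula for spectral radiance), evaluated at an illustrative temperature of 5000 K. The constants and the 2.821 factor from Wien's displacement law are standard values; the frequency grid is just my own choice for illustration.

```python
import math

# Physical constants (SI units)
h = 6.626e-34   # Planck constant, J s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def planck(nu, T):
    """Planck spectral radiance B(nu, T) in W sr^-1 m^-2 Hz^-1."""
    # expm1 computes e^x - 1 accurately for small x (low frequencies)
    return (2 * h * nu**3 / c**2) / math.expm1(h * nu / (k * T))

T = 5000.0  # illustrative temperature, K

# Logarithmically spaced frequencies from ~1e13 Hz to ~3e15 Hz
nus = [1e13 * 1.05**i for i in range(120)]
radiances = [planck(nu, T) for nu in nus]

# The curve rises, peaks, then falls; Wien's displacement law
# predicts the peak near nu_peak = 2.821 * k * T / h.
peak_nu = nus[radiances.index(max(radiances))]
print(f"numerical peak:  {peak_nu:.3e} Hz")
print(f"Wien prediction: {2.821 * k * T / h:.3e} Hz")
```

Running this shows exactly the shape in question: intensity grows with frequency up to a peak and then decreases, rather than growing without bound as classical theory would suggest.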
On a side note, what exactly is the visual model for an 'atomic vibration'? As energy increases, both frequency and amplitude (and thus intensity) could in principle increase, yet the spectrum's peak only shifts to higher frequency. Why is this? Then again, am I mistaken? As temperature increases, the intensity (and hence amplitude) at each specific wavelength also increases. Are both increasing? What determines such a disproportionate distribution of the added energy (temperature increases tend to shift the peak to higher frequencies rather than simply raising the intensity everywhere)?
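For whoever answers: the standard textbook comparison I have seen hinted at, but not derived, is between the classical and quantised mean energy of a single oscillator mode at frequency $\nu$ and temperature $T$:

\[
\langle E \rangle_{\text{classical}} = kT,
\qquad
\langle E \rangle_{\text{quantised}} = \frac{h\nu}{e^{h\nu/kT} - 1}.
\]

In the two limits,

\[
\langle E \rangle_{\text{quantised}} \to kT \quad (h\nu \ll kT),
\qquad
\langle E \rangle_{\text{quantised}} \approx h\nu\, e^{-h\nu/kT} \to 0 \quad (h\nu \gg kT),
\]

so high-frequency modes get exponentially little energy. Is this suppression the mechanism behind the decrease I'm asking about, and how does it connect to the picture of vibrating atoms?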
Please help; any replies are much appreciated.