One of the key performance attributes of any radio receiver, old or new, professional or domestic, is its sensitivity: it needs to be sensitive enough to receive the weak signals required of it.
There are several ways in which receiver sensitivity can be measured, but one popular method is to specify the signal-to-noise ratio the receiver provides.
This video looks at what limits radio receiver sensitivity, what signal-to-noise ratio is, and how it can be used to measure the performance of a radio, and then explains a typical specification and what each part of it means. In terms of what limits the sensitivity of a radio receiver, it is not the gain that can be incorporated, because large amounts of gain are easy to achieve. Instead, the real issue is that noise is generated within the radio itself, and this can mask the very weak signals that need to be received.
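To get a feel for the scale of this internal noise, the thermal noise floor kTB can be computed directly. The sketch below (a simple illustration, not taken from the video) evaluates it at room temperature for the 2.7 kHz SSB bandwidth mentioned later:

```python
import math

def thermal_noise_dbm(bandwidth_hz, temp_k=290.0):
    """Thermal noise power kTB, expressed in dBm (dB relative to 1 mW)."""
    k = 1.380649e-23          # Boltzmann constant, J/K
    noise_w = k * temp_k * bandwidth_hz
    return 10 * math.log10(noise_w / 1e-3)

# Noise floor in a 2.7 kHz bandwidth at 290 K: roughly -140 dBm.
print(round(thermal_noise_dbm(2700), 1))
```

Any noise the receiver adds on top of this theoretical floor raises the weakest signal it can usefully detect.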
One of the most straightforward ways of measuring the sensitivity or noise performance of a radio is to use a measure called the signal-to-noise ratio, or SNR. Essentially, the signal-to-noise ratio compares the output level of the receiver for a signal of known input strength with the output level of the noise when no signal is present.
It is possible to define the signal-to-noise ratio for a radio receiver as the difference between the wanted signal and the background noise for a given input signal level, in a given bandwidth, and for a specific type of signal modulation. If amplitude modulation is used, the modulation depth must also be specified.
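Since the ratio is normally quoted in decibels, it can be computed from the RMS signal and noise voltages using the 20 log rule. The short sketch below (an illustrative helper, not from the video) shows the calculation:

```python
import math

def snr_db(v_signal, v_noise):
    """Signal-to-noise ratio in dB from RMS voltages (20 log rule)."""
    return 20 * math.log10(v_signal / v_noise)

# A signal about 3.16 times the noise voltage corresponds to a 10 dB SNR.
print(round(snr_db(3.1623e-6, 1e-6), 1))
```

Working in voltages with a factor of 20 is equivalent to working in powers with a factor of 10, since power is proportional to voltage squared.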
A typical specification for a radio might be: a sensitivity of 0.5 microvolts for a 10 dB signal-to-noise ratio when receiving a single-sideband signal in a 2.7 kHz bandwidth. The various figures within the specification are all explained in the video, along with why the specification may sometimes be quoted as a signal-plus-noise to noise ratio, (S+N)/N.
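The numbers in such a specification can be related to each other with a little arithmetic. The sketch below (an illustration under the common assumption of a 50 ohm input) converts the 0.5 µV sensitivity figure into a power level in dBm, and shows how little a measured (S+N)/N figure differs from the true S/N at a 10 dB ratio:

```python
import math

def uv_to_dbm(v_uv, r_ohms=50.0):
    """Convert an RMS voltage in microvolts to power in dBm across r_ohms."""
    p_w = (v_uv * 1e-6) ** 2 / r_ohms   # P = V^2 / R
    return 10 * math.log10(p_w / 1e-3)

# 0.5 uV into 50 ohms: about -113 dBm.
print(round(uv_to_dbm(0.5), 1))

# (S+N)/N versus S/N: for a true 10 dB (10x) signal-to-noise ratio,
# the measured signal-plus-noise to noise ratio is 10*log10(10 + 1) dB.
snr_linear = 10.0
print(round(10 * math.log10(snr_linear + 1), 1))
```

At a 10 dB ratio the two measures differ by only about 0.4 dB, which is why the simpler (S+N)/N measurement is often quoted in practice.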