Why 50 Ohms Became the RF Standard
The history, physics, and practical trade-offs behind a number every RF engineer knows.
If you've ever ventured into the field of radio frequency (RF) engineering, the first thing you will notice is that everyone is obsessed with the value of 50 ohms. It is the value assumed at the inputs and outputs of RF circuits such as amplifiers and filters, and the reference resistance against which many quantities are compared.
In this short post, you will learn why the choice of 50 ohms has prevailed in RF engineering: a mix of history, physics and convenience.
The humble coax
The coaxial cable was one of the earliest ways to carry a high frequency signal from one point to another. Unlike low frequency signals, such as AC line signals at 50/60 Hz or audio signals below 20 kHz, which often get by with simple wires, high frequency signals require the use of transmission lines to avoid signal reflections. A coaxial cable, shown below, has a center conductor of radius a and an outer ground conductor at radius b.
One of the fundamental properties of a coaxial cable is its characteristic impedance often represented by Z0. It is the value of resistance seen when looking into a coaxial cable of infinite length. The value of Z0 is solely determined by the physical construction of the cable. For reasons we will see below, Z0 = 50 ohms has become the standard for almost all radio frequency circuits.
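For reference, the standard transmission-line result for a coax with inner radius a, outer radius b, and a dielectric of relative permittivity εr between the conductors is:

$$Z_0 \;=\; \frac{1}{2\pi}\sqrt{\frac{\mu_0}{\varepsilon_0\,\varepsilon_r}}\;\ln\frac{b}{a} \;\approx\; \frac{60}{\sqrt{\varepsilon_r}}\,\ln\frac{b}{a}\ \ \Omega$$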
Loss and power handling
Assuming that the transmission line is air-filled (εr = 1), the plot below shows how the characteristic impedance changes with the ratio of outer to inner radius. From a practical point of view, diameter ratios between about 1.5 and 4 are reasonable: too small and you risk shorting the center conductor to the ground; too large and the cable becomes massive. Since any value between roughly 25 and 80 ohms is a workable choice, how should the best ratio be chosen?
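If you want to reproduce the numbers behind that curve, here is a minimal Python sketch (the function name coax_z0 is just for illustration) that evaluates the air-filled coax impedance over the practical range of diameter ratios:

```python
import math

def coax_z0(ratio, eps_r=1.0):
    """Characteristic impedance of a coax: Z0 ≈ (60 / sqrt(eps_r)) * ln(b/a) ohms."""
    return 60.0 / math.sqrt(eps_r) * math.log(ratio)

# Sweep the practical range of outer-to-inner diameter ratios
for ratio in (1.5, 2.0, 2.3, 3.0, 3.6, 4.0):
    print(f"b/a = {ratio:.1f}  ->  Z0 ≈ {coax_z0(ratio):5.1f} ohms")

# The values run from roughly 24 ohms at b/a = 1.5 to about 83 ohms at b/a = 4,
# which is the 25-80 ohm window described above.
```

Incidentally, a ratio of about 2.3 lands almost exactly on 50 ohms for an air-filled line.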
Well, the other aspect of an RF cable is its loss: how quickly the signal is attenuated as it travels along the transmission line. The copper used in the cable has some resistance that dissipates energy, and if the cable is filled with a dielectric like Teflon, that adds some loss too.
When the loss is calculated across different diameter ratios, it turns out there is a minimum at a ratio of about 3.6, which corresponds to a characteristic impedance of 77 ohms. This is why cable TV coaxial cables are all designed to be 75 ohms: it keeps the cable losses close to the minimum.
But loss isn't everything. Depending on the application, these coaxial cables need to handle power too. When the power handling capability of these cables was studied across different diameter ratios, it turned out that the best power handling occurs at a ratio of about 1.6, which corresponds to a characteristic impedance of about 30 ohms.
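As a rough way to check both numbers, here is a small numerical sketch. It assumes the usual simplified models for a fixed outer radius: conductor loss proportional to (1 + b/a)/ln(b/a), and peak power handling, limited by the electric field at the inner conductor, proportional to ln(b/a)/(b/a)^2. This is only an illustration of the trade-off, not the original analysis.

```python
import math

def z0(ratio):
    # Air-filled coax: Z0 ≈ 60 * ln(b/a) ohms
    return 60.0 * math.log(ratio)

def conductor_loss(ratio):
    # For a fixed outer radius, conductor attenuation scales as
    # (1/a + 1/b) / ln(b/a), i.e. (1 + b/a) / ln(b/a)
    return (1.0 + ratio) / math.log(ratio)

def peak_power(ratio):
    # For a fixed outer radius and a fixed breakdown field at the inner
    # conductor, peak power scales as a^2 * ln(b/a), i.e. ln(b/a) / (b/a)^2
    return math.log(ratio) / ratio**2

ratios = [1.01 + 0.001 * i for i in range(5000)]   # sweep b/a from ~1 to ~6
best_loss = min(ratios, key=conductor_loss)
best_power = max(ratios, key=peak_power)

print(f"Minimum loss  at b/a ≈ {best_loss:.2f}  ->  Z0 ≈ {z0(best_loss):.0f} ohms")    # ~3.59 -> 77 ohms
print(f"Maximum power at b/a ≈ {best_power:.2f}  ->  Z0 ≈ {z0(best_power):.0f} ohms")  # ~1.65 -> 30 ohms
```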
Just wing it
In engineering, when you run into conflicting trade-offs like this, you often just pick something in the middle that provides okay performance on both fronts and call it a day. If you take the geometric mean of 30 and 77 ohms, you get about 48 ohms, which you then round up to 50 ohms and tell everyone this is what they must design to.
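For the record, that compromise is just:

$$\sqrt{30 \times 77}\ \Omega \;\approx\; 48\ \Omega \;\longrightarrow\; 50\ \Omega$$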
In 1949, the US military released the MIL-C-17 specification sheet, which defined the various types of coaxial lines and what their characteristic impedance should be. It included definitions for both 50 and 75 ohm coaxial cables. Today, this specification has evolved into the MIL-DTL-17 standard, which all suppliers to the military must adhere to.
When AT&T and Bell Labs started constructing long distance microwave relay networks, they aligned much of their cabling with the readily available military-grade 50 ohm cabling systems. When RF test equipment manufacturers such as Hewlett-Packard (later Agilent, now Keysight) got in the game, they built their laboratory equipment, such as network analyzers and signal generators, around 50 ohms.
Today, most of the RF industry revolves around 50 ohms. There is nothing magical about this arbitrary value; it was just the messy reality of engineering trade-offs, early players setting the standard, and everyone else adopting it.
But hey, it works!
Here is a more in-depth article on the fundamentals of transmission lines if you’re interested.
Absolute basics of Transmission Line Theory
Transmission lines are everywhere whether you realize it or not.
References
E. J. Sterba and C. B. Feldman, "Transmission Lines for Short-Wave Radio Systems," in Proceedings of the Institute of Radio Engineers, vol. 20, no. 7, pp. 1163-1202, July 1932.