If you follow the news, you’ve probably heard something about 5G. It’s been billed as the foundation of a new technological revolution, the next thing that will change the way people do everything. I’m always cautious of predictions like that, since prediction is notoriously difficult, and we have a tendency to remember only the people who turned out to be right. But even if half of what is being hyped about 5G comes to pass, it would change a lot…on the backend. Users might not even notice much of a difference in daily life. Yet for all that this is supposedly a world-changing technology, it seems that most people have no idea what it actually means.
Of course, the vast majority of people have very little idea of how their cellphones and other technological devices actually work, so perhaps that shouldn’t be too surprising. If everyone knew just how much information their devices collect on them, how the technology functions, and how susceptible it can be to pernicious problems, adoption rates would probably be much lower. Considering the responses I tend to get when I start explaining to people how microwaves work, attempting to change that is probably futile, but I’ll try anyway. To start, we need to talk about wireless communication generally.
Almost all wireless communication – whether it’s the WiFi connection on your computer, the 4G connection on your smartphone, the 3G connection on your less smart phone, the radio antenna on your car, the satellite dish on your house (if you have satellite television), or the pair of walkie-talkie radios you were obsessed with when you were nine (no, that was just me?) – operates on the same basic principles. Information as we know it – for instance, a fragment of a conversation – is encoded on a computer as binary information, a series of ones and zeroes. Think of it like this: I could assign a numeric designation to every letter of the alphabet. A would be 0, B would be 1, C would be 2, and so on and so forth. Those numbers are expressed in the decimal system (base 10), but they could also be expressed in the binary system (base 2). Binary counting works just like decimal counting: when you run out of digits, you add another one – decimal does this after 9, binary after 1. So to express the number 4 in binary, we need to count. For this exercise, we will designate numbers in base 10 like this: n₁₀. Numbers in binary will be expressed n₂.
Counting to four in base ten is easy: 0₁₀, 1₁₀, 2₁₀, 3₁₀, 4₁₀. Counting in binary, it looks like this: 0₂, 1₂, 10₂, 11₂, 100₂. If 0 is A, 1 is B, and so forth, then I could say A-B-C-D-E in binary code as 0-1-10-11-100. That, in brief, is how computers encode all information, though of course real encoding schemes are a bit more complicated than 0=A (if you’re interested in learning more about how computers encode specific characters, look up ASCII).
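The toy cipher above is easy to sketch in a few lines of code. To be clear, the 0=A mapping is just this post’s illustrative scheme, not a real standard:

```python
# Toy encoding scheme from above: A=0, B=1, C=2, ... expressed in binary.
# (Real systems use schemes like ASCII/Unicode, but the idea is the same.)

def letter_to_binary(letter):
    """Map a letter to its position in the alphabet, written in binary."""
    position = ord(letter.upper()) - ord("A")  # A -> 0, B -> 1, C -> 2, ...
    return bin(position)[2:]                   # strip Python's "0b" prefix

message = "ABCDE"
encoded = "-".join(letter_to_binary(c) for c in message)
print(encoded)  # 0-1-10-11-100
```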
Now that the computer has the information encoded, it might want to transmit that information to a receiver (for instance, maybe you want to send a text message that says ABCDE). To do this, the computer must encode the information onto an electromagnetic waveform. Almost all modern wireless communication occurs via electromagnetic waveforms – light waves. Different frequencies have different properties, and are useful for different things. Think of someone turning a flashlight on and off to signal to someone else in the dark; wireless communication works on a similar principle. Light behaves like a wave (and also a particle, but we won’t get into that here – for our purposes today, light is a wave), which means it has peaks and valleys. Those waves can have different properties. Wavelength is the distance from one peak to the next peak. It is inversely proportional to the frequency, which is how often peaks arrive: wavelength equals the speed of light divided by the frequency. So when you hear something like “millimeter wave radar” or similar, the distance between two adjacent peaks of the electromagnetic waveform being employed by that device is on the order of millimeters. We choose different wavelengths for different tasks – some are absorbed by weather, some transmit farther, and so forth – but that is beyond the scope of the present discussion.
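The wavelength–frequency relationship is a one-liner. The frequencies below are just representative examples (FM radio, WiFi, automotive radar), not tied to any particular device:

```python
# Wavelength = speed of light / frequency (lambda = c / f).
C = 299_792_458  # speed of light in m/s

def wavelength_m(frequency_hz):
    """Return the wavelength in meters for a given frequency in Hz."""
    return C / frequency_hz

print(wavelength_m(100e6))  # FM radio (~100 MHz): about 3 m
print(wavelength_m(2.4e9))  # WiFi (2.4 GHz): about 12.5 cm
print(wavelength_m(77e9))   # automotive "millimeter wave" radar: about 4 mm
```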
Aside from frequency/wavelength, electromagnetic waveforms also have amplitude, which is how high the peaks get. So I could, for instance, designate the peak value as “1” in my binary encoding scheme, and the trough value as “0.” Let’s call the peaks P, and the troughs T, so 1=P and 0=T. If I wanted to send the ABCDE message using this scheme, I would modulate the waveform I am transmitting to look like: T-P-PT-PP-PTT. Congratulations: you’ve just learned how amplitude modulation (AM radio) works. FM (frequency modulation) uses the same concept, but varies the frequency transmitted instead of the amplitude. All of it – letters to binary, binary to waveform, and then back again into something that the receiver can understand – works on the same basic premise as a cipher.
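Putting the two pieces together, here is a sketch of the toy peak/trough scheme described above. Again, this illustrates the idea; it is not how a real AM transmitter encodes data:

```python
# Toy "amplitude modulation": map each letter to binary (A=0, B=1, ...),
# then map each bit to a waveform symbol: 1 -> P (peak), 0 -> T (trough).

def modulate(message):
    """Return the P/T symbol string for a message under the toy scheme."""
    symbols = []
    for letter in message:
        bits = bin(ord(letter.upper()) - ord("A"))[2:]  # letter -> binary
        symbols.append(bits.replace("1", "P").replace("0", "T"))
    return "-".join(symbols)

print(modulate("ABCDE"))  # T-P-PT-PP-PTT
```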
However, not all electromagnetic waveforms are created equal. As you may have noticed from the discussion above, it takes a significant length of waveform to send even a simple message. Now, the wave is moving at the speed of light, so it may not seem to matter much, but if you’re sending a lot of information, how much data you can encode onto a wave becomes significant. We measure data in bits, nibbles, and bytes. There are four bits in a nibble, and two nibbles in a byte (so there are eight bits in a byte). Bits are usually represented with a lowercase b, and bytes are usually represented with an uppercase B. We can also apply prefixes, which is how we end up with kilobytes (kB), megabytes (MB), and gigabytes (GB). So a gigabyte is roughly 8,000,000,000 (eight billion) bits – roughly, because there are actually two conventions: the decimal gigabyte (10⁹ bytes) and the binary gibibyte (2³⁰ bytes).
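The unit arithmetic looks like this, with both prefix conventions shown:

```python
# Bits, nibbles, and bytes, plus the decimal-vs-binary prefix question.
BITS_PER_NIBBLE = 4
NIBBLES_PER_BYTE = 2
BITS_PER_BYTE = BITS_PER_NIBBLE * NIBBLES_PER_BYTE  # 8 bits per byte

# Decimal (SI) gigabyte: 10**9 bytes. Binary gibibyte: 2**30 bytes.
gigabyte_bits = 10**9 * BITS_PER_BYTE
gibibyte_bits = 2**30 * BITS_PER_BYTE

print(gigabyte_bits)  # 8,000,000,000 bits
print(gibibyte_bits)  # 8,589,934,592 bits -- ~7% larger
```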
If you know the encoding scheme, the waveform properties, and some other information, you can calculate a data rate yourself, but let’s throw some numbers out just to get an idea of what this looks like in practice. If you go to a site like SpeedTest, you can see what kind of data rates you’re currently getting on your device. My computer was on my home WiFi when I ran it, and I got a downlink rate of about 60 Mbps (megabits per second) and an uplink rate of about 0.25 Mbps. (It’s normal for consumer devices for the downlink to be far, far faster than the uplink – if you think about it, you spend much more time downloading things from the web than uploading. Uplink speed only becomes a concern if you’re trying to do something like upload a video stream.) I then ran the same test on my phone using the local cellular network, and got a downlink speed of about 4 Mbps. These are actually really fast: the first satellite I worked on had a downlink speed of about 100 kbps, and an uplink speed of about 10 kbps (one time, it took us a week to upload a single file).
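A quick back-of-the-envelope using the speeds above shows why data rates matter. Note the bits-versus-bytes conversion, a classic trap:

```python
# How long does a transfer take? File sizes are usually quoted in bytes (B),
# link speeds in bits per second (b), so multiply by 8 before dividing.

def transfer_seconds(size_bytes, rate_bits_per_sec):
    """Idealized transfer time, ignoring protocol overhead and congestion."""
    return size_bytes * 8 / rate_bits_per_sec

one_gb = 10**9  # a 1 GB file
print(transfer_seconds(one_gb, 60e6))    # ~133 s on a 60 Mbps downlink
print(transfer_seconds(one_gb, 0.25e6))  # 32,000 s (~9 hours) on a 0.25 Mbps uplink
print(transfer_seconds(one_gb, 100e3))   # 80,000 s (~22 hours) at 100 kbps
```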
5G, which stands for 5th Generation, is all about data rates. It uses higher frequency (shorter wavelength) signals (in the 25 to 50 GHz range, in some cases) to be able to pack more data into a certain length of waveform. If each peak and valley can, for instance, hold a single bit of information, and you’re multiplying the number of peaks and valleys within a meter of waveform by a thousand, you’ve just managed to encode a thousand times more information within that meter, which means that a thousand times more information will be transmitted in the amount of time it takes to transmit that meter. Now, I realize that waveforms are not static entities, but the visualization is a valid one to understand why this matters.
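The “peaks per meter” visualization can be made concrete. The band frequencies below are representative choices on my part, and in practice 5G’s speed gains come mostly from wider channel bandwidths and denser modulation rather than the carrier frequency alone, but the intuition holds:

```python
# Cycles of waveform per meter is just frequency / speed of light
# (the reciprocal of the wavelength).
C = 299_792_458  # m/s

def cycles_per_meter(frequency_hz):
    return frequency_hz / C

lte = cycles_per_meter(2.5e9)     # a typical 4G/LTE-era band
mmwave = cycles_per_meter(28e9)   # a millimeter-wave 5G band
print(lte)            # ~8.3 cycles per meter
print(mmwave)         # ~93 cycles per meter
print(mmwave / lte)   # ~11x more peaks and valleys per meter
```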
By using these higher frequencies, far more information will be able to be transmitted. That can be helpful in the simple, single device context I’ve been referencing here, where one device is trying to communicate with another. It would allow you, for instance, to get download speeds on your phone comparable to the download speeds you would ordinarily get on WiFi. Hurrah, you can download shows faster (yes, that was supposed to be a bit sarcastic, and yes, all bit puns in this post are at least a nibble intentional). There’s talk that people wouldn’t even bother with home WiFi anymore; they’d just connect all of their devices to a 5G data plan. Yet that isn’t what has people so excited about 5G.
What is making the technologists and futurists and Silicon Valley data harvesting companies and the CCP so excited about 5G is its potential to enable the Internet of Things (IoT). If you know someone who has a refrigerator with an iPad in the front, you know about the IoT. It’s this idea that we shouldn’t just connect our tablets, smartphones, laptops, and desktops to the internet. Instead, we should connect everything to the internet: televisions, appliances, lights, fans, curtains, blinds, furniture, hardwood floors…if it’s a thing, somebody probably wants to make it part of the IoT. I mean, haven’t you always wanted your nice, hardwood floor to sell your personal data to the highest bidder and display advertisements for your optimized brand of internet-connected vacuum cleaner?
Because of the vastly higher data capacity of 5G networks, they’re supposed to be able to support this kind of omnipresent interconnectivity. That is the short answer to the question about 5G that I just spent an entire post trying to answer. There are certainly advantages to being able to support and leverage that level of interconnectivity, especially in something like, for instance, manufacturing and automation. Imagine if your factory floor was rigged out with IoT connected sensors and devices. Instead of having to retool the factory or run a bunch of optimization efforts, you could set it up so that the factory would optimize itself. Or you could use it so that your refrigerator could sell you advertisements for a new kind of apple.
In other words, I have reservations about the use of the IoT. I can see a lot of specific, tailored situations in which it could be hugely beneficial to have much higher data rates available, and I can see a lot of useful technologies being enabled by it. At the same time, 5G does have its drawbacks. The range is much shorter, because higher frequency waveforms are absorbed and blocked much more readily by matter. This problem doesn’t seem to have been entirely solved; even before accounting for the likely interior dead zones, we’ll need far more 5G transmitters than we have 4G transmitters. There are also immense security concerns. The only way to make something truly secure from cyber threats is to isolate it from the internet. By connecting everything, we are introducing far more vulnerabilities into an already porous system. It might not seem important to have a lot of security features on your IoT microwave, right up until someone hacks your bank account through said microwave.
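The range problem can be seen in the standard free-space path loss formula, which grows with frequency; the distance and band frequencies here are just illustrative:

```python
# Free-space path loss (FSPL), in decibels:
#   FSPL(dB) = 20*log10(d) + 20*log10(f) + 20*log10(4*pi / c)
# This is the ideal, unobstructed case -- walls, foliage, and atmospheric
# absorption make the higher-frequency picture even worse.
import math

C = 299_792_458  # m/s

def fspl_db(distance_m, frequency_hz):
    return (20 * math.log10(distance_m)
            + 20 * math.log10(frequency_hz)
            + 20 * math.log10(4 * math.pi / C))

# The same 500 m link at a 4G-era band vs. a millimeter-wave 5G band:
print(fspl_db(500, 2.5e9))  # ~94 dB
print(fspl_db(500, 28e9))   # ~115 dB -- about 21 dB more loss
```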
There are also health considerations. Now, I’m not one of these people who thinks that we’re all being neurologically manipulated by the lightwaves coming off our cellphones (on a good day, at least), but I do have some reservations about the effects of long-term exposure to higher energy radiation, which is what 5G uses: the energy carried by each photon scales with frequency. Sure, it’s been demonstrated to be “safe enough” so far, but we haven’t yet seen what the cumulative effects of prolonged cellphone use are over a human lifetime – cellphones have only really been ubiquitous for about a decade. Simply put, higher energy, shorter wavelength radiation is more dangerous to living cells, and 5G is higher energy than 4G.
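The “higher energy” comparison refers to energy per photon, which is simple to compute from Planck’s relation E = h × f. For scale, both bands sit far below even visible light, let alone ionizing radiation; the band frequencies are representative examples:

```python
# Per-photon energy: E = h * f, converted to electron-volts for readability.
H = 6.626e-34   # Planck's constant, joule-seconds
EV = 1.602e-19  # joules per electron-volt

def photon_energy_ev(frequency_hz):
    return H * frequency_hz / EV

print(photon_energy_ev(2.5e9))   # 4G-era band: ~1e-5 eV
print(photon_energy_ev(28e9))    # mmWave 5G band: ~1.2e-4 eV (about 11x higher)
print(photon_energy_ev(5.5e14))  # visible light, for scale: ~2.3 eV
```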
You know I like to look for historical parallels, and I can’t help seeing similarities between the visions people have for 5G and the IoT, and Nikola Tesla’s vision for wireless electricity. Tesla claimed that he could transmit high voltage alternating current through the atmosphere without wires using a modified Tesla Coil. His Wardenclyffe tower was never completed, and experiments to prove whether or not it would have been possible have consistently failed, but I look at this project and think “did anyone even stop to wonder what beaming huge amounts of wireless electricity through the atmosphere might do to the planet and its inhabitants?” 5G seems a lot more likely to succeed than that project ever did. I can only hope that we won’t uncover the flaws in this miracle technology too late.