5G technology is gradually being introduced into our lives, and as the term is used more and more, we thought it was time to help you understand what it is really about.
Lee Woodland, Product Manager at Airbus Secure Communications, has spent over 20 years working in commercial Fixed and Mobile Telecoms. Now the lead product manager for Tactical Wireless communications, he has led the development of Everus, a 4G Tactical Cellular communications node. Hear from Lee as he discusses the key fundamentals behind 5G technology and what we can expect to see in the future.
In the beginning… 1G in 1979
1G, or the first “Generation”, was the first commercially automated mobile network. Launched in Tokyo by Nippon Telegraph and Telephone (NTT) in 1979, it was soon expanded to cover the whole population of Japan.
Despite being incredibly expensive, and thus out of reach for the general public, adoption was extremely rapid. The technology’s potential was clearly massive, but there were still some glaring issues that needed to be resolved: poor sound quality and coverage, incompatibility between different 1G networks due to differing frequency ranges, no roaming between operators, support for voice calls only, and weak security due to the lack of encryption.
In 1991, 2G: The Revolution of mobility
The second generation of mobile networks originated in Finland in 1991. Launched under the Global System for Mobile Communications (GSM) standard, 2G brought a wireless standard that was far superior to what 1G had to offer, the biggest revolution being support for more than just voice. It also offered better coverage and capacity than 1G.
The next biggest change was a cultural one; people started using text messages, images, and even videos.
Still, 2G had some serious constraints, especially as it only offered data speeds of up to around 236 kbps.
3G in 2001: The Transition to better connectivity
Launched in 2001 by NTT DoCoMo, 3G essentially brought about the ‘packet-switching’ revolution, ensuring much better connectivity for the ‘data packets’ that drive internet connectivity. One of the key drivers behind 3G was standardization: it aimed to provide a single network protocol in a market that had been littered with different vendors and protocols. This standardization enabled international roaming services and the ability to access data from any location in the world. With speeds of up to 2 Mbps, 3G was clearly made for internet usage and multimedia sharing.
In addition, the much greater data transfer capability opened the way for a variety of new internet-driven services such as voice over IP (e.g. Skype), video conferencing and streaming, as well as location-based services. The 3G era also heralded the introduction of the BlackBerry, the iPhone and other ‘smartphones’. These devices fundamentally changed the way the mobile phone was used by transforming it into a mobile mini-computer.
In 2009, 4G: Today's smartphone data
The first-release Long Term Evolution (LTE) 4G standard was commercially deployed in Oslo and Stockholm in 2009. Since then it has quickly spread throughout most parts of the world, including the US. With theoretical data speeds of up to 200 Mbps, this network generation made high-quality mobile streaming a reality.
4G also addressed one of the historical limiting factors of 2G and 3G, namely restricted frequency spectrum, by operating across a much wider range of frequency bands. This is a theme that carries forward into 5G.
Now that we understand where 5G comes from, look out for our upcoming article explaining the changes 5G will bring.