Author: Ali Khayrallah has been working away at the G’s of mobile for many years. He leads a research team shaping the future of mobile technology at Ericsson in Santa Clara. He is currently focused on 6G efforts in the US with industry, academia and government.
[Disclaimer: The below article is the author's personal opinion]
Just as the main operators in North America are completing the first wave of 5G network rollouts and 5G phones are becoming mainstream, we are starting to hear about 6G (or Next G, or whatever name sticks eventually).
Why so soon and what will it do for us? This article will try to give you a glimpse of some answers.
The long game
History doesn’t quite repeat itself but it kind of rhymes. Each ‘G’ (for generation) of mobile from 2G to 4G has lasted about 10 years, and it seems 5G will too. So we can guess that the 6G era will start around 2030. What is less obvious to the general public is that the buildup also takes a decade, so the time to start working on 6G is now. As you will come to appreciate, this is truly a long game from early research to commercial deployment on a global scale. Each new G offers an opportunity for radical changes, unconstrained by the backward compatibility that must be maintained within a single generation. To get there, we need time: to do the research and mature the technologies that potentially drive changes; to integrate them into complex systems and figure out ways to exploit their potential; to reduce them to practice and understand their feasibility; to create standards that incorporate them; to design products and services based on those standards; and finally to deploy networks.
I will first discuss what 6G is about, then how to get there, in particular standards and spectrum, as well as the geopolitical factors that may help or hinder us.
6G: use cases and benefits
It is of course difficult today to pin down the technologies that will enable 6G networks or the use cases that will drive the need for them, but we can paint a big picture of where we might be headed.
We expect the trend towards better performance in customary metrics such as capacity, bit rate, latency, coverage and energy efficiency to continue, as it has in previous G’s. To that end, we foresee further improvements in workhorse technologies such as multi-antenna transmission and reception, in particular more coordination of transmissions across sites. Also, the insatiable appetite for more spectrum will continue to lead us to ever higher frequencies, into the low 100’s of GHz. The need for ubiquitous coverage will push for integration of non-terrestrial nodes such as drones and low earth orbiting satellites into terrestrial networks. The success of these various directions hinges on solving a wide array of tough technical problems.
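To see why pushing into the low 100’s of GHz makes deployment harder, consider free-space path loss, which grows with the square of the carrier frequency even before walls and foliage enter the picture. The sketch below uses the standard Friis free-space formula; the specific distances and bands chosen are illustrative assumptions, not figures from any particular deployment.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB (Friis formula): 20*log10(4*pi*d*f/c)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Loss over an illustrative 100 m link at representative low, mid,
# high, and sub-THz bands (band choices are assumptions for this sketch)
for f_ghz in (0.9, 3.5, 28, 140):
    print(f"{f_ghz:6.1f} GHz: {fspl_db(100, f_ghz * 1e9):.1f} dB")
```

The roughly 44 dB gap between 0.9 GHz and 140 GHz at the same distance is part of what narrow beams, dense sites, and non-terrestrial nodes must claw back.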
Networks will also need to evolve in other ways, such as trustworthiness, which entails the network’s ability to withstand attacks and recover from them. One aspect is confidentiality, which goes beyond protection of data during transmission to secure computation and storage. Another aspect is service availability, which requires resilience to node failure and automated recovery.
We can also think of use cases that will create the demand for 6G. One use case is the internet of senses, where we expect the trend from smartphones toward AR/VR devices, and beyond that to devices engaging most of our senses, to continue, leading to a merging of the physical and virtual worlds and imposing very tough latency and bit rate requirements on the network. Another use case is very simple and possibly battery-less devices such as sensors and actuators for home automation, asset tracking, traffic control etc. Such devices must be accommodated by the network with appropriate protocols. Yet another is intelligent machines, where the network provides specialized connectivity among AI nodes, allowing them to cooperate. Speaking of AI, it is also expected to increasingly pervade the operation of the network itself, moving down from high level control closer to signal processing at the physical layer.
Setting up standards: why do we need them?
It sounds so 20th century but there are very good reasons, the main one being mobility. In mobile communications we need well defined interfaces so network elements speak and understand the same language. Phones move around and they have to be able to connect to different networks. Within a network, components from different vendors have to work together. Standards define the interfaces to make it all work together, and they do much more, including setting the minimum performance requirements for phones and base stations. In practice, companies spend a lot of money and effort on interoperability testing to ensure their equipment plays well with others.
Three main ingredients to 6G success (or failure)
In the mobile industry, the main standards body is 3GPP, which issues releases about every 18 months. A release fully defines a set of specifications that can be used to develop products. For example, Release 15 (2018) provided the first specifications for 5G, primarily covering the signaling and data channels to support improved mobile broadband. One particularly useful feature is the so-called flexible numerology, which enables the same structure to be adapted for use over a wide range of frequency bands. Release 16 (2020) added several features, including unlicensed spectrum operation and industrial IoT. Release 17, currently under construction, will include operation at higher frequencies, more IoT features and satellite networks. From where we stand today, we expect the first release with 6G specifications to be around 2028.
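The flexible numerology mentioned above is simple at its core: 5G NR scales the OFDM subcarrier spacing as 15 kHz times a power of two, and slot durations shrink in proportion, so the same frame structure serves everything from low-band macro cells to millimeter wave. A minimal sketch of that scaling (following 3GPP TS 38.211, where the parameter μ runs from 0 to 4 in Release 15):

```python
# 5G NR flexible numerology (3GPP TS 38.211): subcarrier spacing
# scales as 15 kHz * 2^mu, and the slot duration shrinks by the
# same factor, keeping 14 OFDM symbols per slot.
def numerology(mu: int):
    scs_khz = 15 * 2 ** mu   # subcarrier spacing in kHz
    slot_ms = 1.0 / 2 ** mu  # slot duration in milliseconds
    return scs_khz, slot_ms

for mu in range(5):  # mu = 0..4 as defined in Release 15
    scs, slot = numerology(mu)
    print(f"mu={mu}: {scs:3d} kHz subcarriers, {slot:.4f} ms slots")
```

Wider subcarriers tolerate the larger phase noise and Doppler of higher bands, while shorter slots help the latency-critical services; one structure covers both ends.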
3GPP standards enable mobile networks to flourish globally, making it possible to recoup the enormous R&D investments. Since the advent of 4G, there has been a single effective standard worldwide. Earlier, there were two dominant factions developing the CDMA and GSM families of standards. This split probably led to the failure of several large companies. In our industry, fragmentation is the F-word. I will revisit this in the context of current geopolitics.
Until recently, all mobile spectrum was in the low band (below 3 GHz), which has become jam-packed not only with mobile but many other services. The psychedelic-colored spectrum map gives you a feel for it. With 5G, the floodgates have opened, with new spectrum becoming available in mid band (roughly 3 to 7 GHz) and high band (24 to 52 GHz). These higher bands are great because it’s possible to operate with wider bandwidths (in the 100’s of MHz, compared to 10-20 MHz in low band) and support higher rate services. But propagation characteristics in higher bands make for challenging deployment, as signals don’t travel well through walls etc. Moving into even higher bands in the 100’s of GHz will exacerbate this problem. Also, spectrum used by legacy systems will get gradually re-farmed for use by new networks. In addition, there is a push led by the FCC (Federal Communications Commission) to mandate spectrum sharing between networks and incumbent users such as radar as a way to accelerate spectrum availability. The CBRS band at 3.55 GHz is the leading example of this type of policy. Keep in mind that spectrum is our lifeline and we’ll take it and make the best of it wherever and however it’s available.
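The pull toward wider bandwidths has a clean information-theoretic basis: the Shannon limit grows linearly with bandwidth but only logarithmically with signal power. A quick sketch makes the point; the 10 dB SNR and the bandwidth values are illustrative assumptions, not claims about any specific band plan.

```python
import math

def shannon_capacity_mbps(bw_hz: float, snr_db: float) -> float:
    """Shannon limit C = B * log2(1 + SNR), returned in Mbit/s."""
    snr_linear = 10 ** (snr_db / 10)
    return bw_hz * math.log2(1 + snr_linear) / 1e6

# At the same (assumed) 10 dB SNR, capacity scales linearly
# with bandwidth: a 20x wider channel gives a 20x higher limit.
for bw_mhz in (20, 100, 400):
    print(f"{bw_mhz:4d} MHz -> {shannon_capacity_mbps(bw_mhz * 1e6, 10):.0f} Mbit/s")
```

This is why a 400 MHz high-band carrier is so attractive despite its propagation headaches: no amount of extra transmit power in a 20 MHz low-band channel buys the same rate headroom.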
The “trade is good” principle that has dominated government policies since the fall of the Soviet Union seems to be on its way out, being replaced by more nationally centered policies. In this context there is now keen awareness of the rise of China as a serious technological rival to the US and its allies. This has manifested itself to a full extent in telecom with all the recent attention on 5G and mobile networks as a strategic national asset.
There is wide support in Congress for big spending on technology R&D, including 6G, evidenced by several proposals under discussion around the National Science Foundation (NSF) alone. Their common thread is a multifold budget expansion and an increased emphasis on technology transfer.
In the private sector, the Alliance for Telecommunications Industry Solutions (ATIS), which represents the interests of the telecom industry in North America, has launched the Next G Alliance to develop a roadmap towards 6G and lobby the government to influence policy and secure funding for R&D.
This is all good on the national scale, but it may come back to bite us with standards fragmentation and the threat of losing the global market scale. Navigating this complicated landscape will be challenging, and it will be fascinating to see how it all plays out over the coming years.