New life at the edge of the cloud

Covid-19, with its teleworking, distance learning and endless Zoom meetings, has shown that our digital life has long been taking place in the “cloud”. But gigantic data centers have a drawback: for new applications such as online gaming, smart devices, the Internet of Things, drones or robots, data transmission simply takes too long. The solution? Edge computing – processing power at the edge of the cloud that responds in real time.

Helmut Spudich

History repeats itself, in digitalization as elsewhere, and brings evolutionary change with it. Computers entered our everyday life in the 1960s as mainframes that took up whole rooms, with “dumb” terminals at the periphery that provided – and strictly controlled – access to computing power and data. The PC turned this relationship upside down: the continuous miniaturization of chips and storage ousted the bulky monsters and democratized computing.

Then the Internet came along, and the connections between PCs and servers created a new role for the central computer: the server, at the hub of millions of connections between end devices – no longer a mainframe, but huge server farms providing both enormous processing power and virtually unlimited storage capacity. Because their physical location was irrelevant and in practice unknown, the “cloud” was born: a nebulous formation that takes on a specific shape for its users as needed, whether as a ticketing service for event organizers or as server capacity for start-ups.

As users, whether private individuals or corporations, we all live in the cloud in one form or another today. Covid-19 has demonstrated this impressively over the past months: Zoom – a cloud service – has become a synonym for endless private and professional video calls, and teleworking and distance learning would simply be impossible without the cloud. The infrastructure for all this has long been in place; many were just not aware of it.

Without a connection to the digital infrastructure of these server farms, individual PCs, tablets or smartphones are of little use. Numerous cloud providers offer them a home that can be scaled quickly as required, and corporate IT has long since moved into the cloud, often in hybrid forms. According to estimates by the Synergy Research Group, the cloud industry generates revenues of more than 300 billion US dollars. The market leaders are Amazon, Google and Microsoft, which only a few days ago announced an investment of a billion euros to build a data center in Austria.

Now the wheel of digital history keeps turning, and in many ways the current shift resembles the replacement of mainframes by PCs. Smartphones, smartwatches, smart cars and the Internet of Things generate vast quantities of data, and sending them to far-away server farms for processing and routing the results back to our end devices causes waiting times – high latency, in technical terms. For searches, messaging or video streaming this is barely noticeable. But some applications cannot wait that long.
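How much difference proximity makes can be estimated with simple physics: even at the speed of light in optical fiber, every kilometer costs time. The following back-of-the-envelope sketch uses illustrative distances, ignores routing, queuing and processing, and is an assumption for orientation rather than a measured value.

```python
# Propagation delay only: a rough lower bound on latency.
# Distances and fiber speed are illustrative assumptions.

SPEED_IN_FIBER_KM_PER_S = 200_000  # light travels at roughly 2/3 c in glass fiber

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds for a given one-way distance."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_S * 1000

print(f"Cloud data center ~1,500 km away: {round_trip_ms(1500):.1f} ms")  # ~15 ms
print(f"Edge server ~15 km away:          {round_trip_ms(15):.2f} ms")    # ~0.15 ms
```

In practice, routing hops and processing add much more on top, but the physical floor alone shows why a server a few kilometers away can react orders of magnitude faster than one on another continent.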

Pirelli, for instance, is testing car tires whose sensors supply information on the condition of the road. If the road is icy, for example, following vehicles can be warned immediately – provided this happens in real time and the information does not first have to travel to a data center thousands of kilometers away. Online gaming with many participants needs ultra-short latencies, otherwise the fun of the game is lost. Virtual and augmented reality – whether for games, entertainment or information systems – need extremely short latencies so that users can immerse themselves in their own cosmos. A smartwatch designed to detect dangerous heart rhythms and sound an alarm needs its results immediately, not with a delay. And car manufacturer Toyota has estimated that in future the assistance systems of smart cars will transmit 10 exabytes of data – an almost inconceivable ten billion gigabytes – between vehicles and servers every month.
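To put that last figure into perspective, a quick calculation translates the monthly volume into a sustained data rate; averaging over a 30-day month is a simplifying assumption of this sketch, not part of the estimate itself.

```python
# Rough sense of scale for an estimate of 10 exabytes per month,
# averaged over a 30-day month (a simplifying assumption).

BYTES_PER_EXABYTE = 10**18            # 1 EB = one billion gigabytes
SECONDS_PER_MONTH = 30 * 24 * 3600

monthly_volume = 10 * BYTES_PER_EXABYTE
avg_rate = monthly_volume / SECONDS_PER_MONTH   # bytes per second

print(f"{avg_rate / 1e12:.1f} TB/s, about {avg_rate * 8 / 1e12:.0f} Tbit/s, around the clock")
# -> roughly 3.9 TB/s (~31 Tbit/s) for the whole fleet
```

A permanent stream of several terabytes per second illustrates why it makes little sense to funnel all of this raw data through a handful of central data centers.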

All of these are tasks for edge computing – computers at the edge of the cloud. Its logic is obvious: if transmitting data to distant data centers takes too long, the processing has to move closer again. As the smart vehicle example shows, it makes little sense to send such data volumes to central server farms at all: they must be evaluated immediately and can then be deleted, with only critical information being stored in the cloud where necessary. The same applies to AI-based simultaneous translation, as offered by Skype: edge computing reduces the irritating pauses that would occur if the translation were done on distant cloud servers.
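The pattern described here – evaluate locally, react in real time, keep only what matters – can be sketched in a few lines. The sensor fields, threshold values and the forward_to_cloud helper below are hypothetical illustrations, not part of any specific product or API.

```python
# Minimal sketch of edge-side filtering: raw readings are evaluated on the spot,
# only critical events are kept and forwarded to the cloud, the rest is dropped.
# All names and thresholds are hypothetical.

from typing import Iterable

ICE_RISK_TEMP_C = 0.5   # hypothetical road temperature below which ice becomes a risk
MIN_GRIP = 0.4          # hypothetical grip level below which the road counts as slippery

def forward_to_cloud(event: dict) -> None:
    """Placeholder for the rare upload of a condensed critical event to the cloud."""
    print("-> stored in cloud:", event)

def process_at_edge(readings: Iterable[dict]) -> None:
    """Evaluate raw tire-sensor readings locally and discard all but critical events."""
    for reading in readings:
        if reading["road_temp_c"] <= ICE_RISK_TEMP_C and reading["grip"] < MIN_GRIP:
            # React in real time, then persist only the condensed critical event.
            print("WARNING to following vehicles: icy road at", reading["position"])
            forward_to_cloud({"type": "ice_warning", "position": reading["position"]})
        # Non-critical raw data is simply dropped instead of being shipped
        # to a data center thousands of kilometers away.

process_at_edge([
    {"position": "A2 km 183", "road_temp_c": 4.0, "grip": 0.9},
    {"position": "A2 km 184", "road_temp_c": -1.2, "grip": 0.3},
])
```

The decision is taken in the local loop; the cloud only ever sees the condensed warning, never the raw sensor stream.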

New providers such as the New York start-up Packet aim to cover this need with distributed micro data centers – much as PCs once took over tasks from mainframes. The major cloud providers are also aware of the latency problem: with Azure IoT Edge and AWS Greengrass, Microsoft and Amazon offer developers software that lets local machines act as a bridge to their cloud services. And with 5G, mobile network operators are entering the edge computing arena, because 5G promises extremely short latencies, especially for industrial applications. That is only possible if the operators provide the corresponding edge infrastructure, and base station sites already equipped with servers lend themselves to this purpose. In Austria, that amounts to roughly 18,000 locations, all substantially closer to the application than any cloud data center.

Edge computing will not render the cloud obsolete; rather, new life will emerge at its edge. The two complement each other: less time-critical tasks such as web hosting, search, streaming of entertainment content or online shops will continue to rely on the enormous performance of the cloud, while edge computing handles the tasks that require real-time reactions.

Both cloud and edge computing play a central role in development work at AT&S. Ever larger data centers and distributed edge computers at numerous locations demand highly energy-efficient hardware and miniaturized servers to save space. As the 5G rollout progresses over the next few years, a dense network of so-called microcells will be needed to enable extremely short latencies and high data rates. Such cells are installed in unconventional places, from manhole covers to lampposts – which makes minimal size and maximum energy efficiency a prerequisite.

This is where the ability of intelligent printed circuit boards to integrate all kinds of components, together with miniaturization and energy efficiency, plays a key role. For this purpose, AT&S works with customers on their specific applications to develop integrated circuit substrates (IC substrates): highly integrated printed circuit boards in which semiconductors and other components are combined in a single module to save space and – thanks to shorter signal paths and new technology – power.

Published on: 13 January 2021
