Written by Tania Piunno
"The 'Edge' equation is quite simple - reduce data volumes sent to the Cloud and automatically lower your costs".
What was once believed to be the foreseeable future is happening right now, whether we are ready for it or not. With the emerging 5G ecosystem, we are in the middle of a data revolution that has operators scrambling to overcome the challenges associated with having a centralized cloud computing environment. Edge computing promises to remedy these challenges (addressed later in this blog) by creating a distributed network architecture as a means to offload tasks from the core.
There are many facets to edge computing; however, when we discuss digital transformation today, the hot topic, as in real estate, is location, location, location. Many of my recent conversations about the growth of 5G and the Internet of Things (IoT) focus on the geographic location where applications will live, and the Edge of the network is where everything is going.
The birth of Edge Computing
Cloud computing architecture has been around far longer than Edge. Organizations would send all of their data to the Cloud to be processed and analyzed in one centralized location. Although cloud computing brings many benefits, such as flexibility and scalability, many would now argue that large amounts of data cannot always be migrated or moved quickly to the Cloud, and alternate strategies are needed. Thus the birth of edge computing as an efficient way to handle the massive explosion in data that will continue to be generated at tremendous speed.
Why is location so important?
Organizations need a more natural way to access data, interact with it, and make faster decisions in real time. Enterprises, for example, require more intelligent placement of workloads and division of tasks among multiple computers and access points connected via a single network to speed those tasks up. Having a distributed cloud in your network infrastructure puts less strain on the core network and provides the means to deploy ultra-low-latency applications in remote locations.
The optimal time to adopt a distributed model, including applications at the far Edge of the network, is when endpoint devices leverage more and more data. The emergence of 5G brings with it surges in data traffic and the speed demands of next-generation networks. Edge computing allows service providers to process much of the data locally, at a cell tower or as close to end-users as a street cabinet. This eases pressure on the core during peak times, since information is sent to the central Cloud only when it truly needs to be.
Why 5G and Edge Computing are perfect together
The shift from 4G to 5G networks is in its early stages, yet it is becoming a reality faster than we can imagine with the help of new technologies. Thanks to 5G, people have warmed up to edge computing, because most 5G use cases require far more computing power at the network edge. With 5G come more smartphones, more data, and more network congestion, and unfortunately, centralized cloud computing may be too slow for devices that need data processed in milliseconds.
Take online gaming, for example. Edge computing promises to offer much better in-game experiences by reducing latency. Since latency is such a fundamental component of online games (and of AR and VR, for that matter), the future of gaming is where you think it is: at the Edge.
Also, the potential for edge computing to transform the telecommunications industry by leveraging existing cellular networks is revolutionary. With 5G networks rolling out worldwide and deployments by big players such as AT&T and Verizon already underway, mobile edge computing plays a critical role in any telecommunications company’s next-generation infrastructure strategy, regardless of business size.
Although the idea may seem like a huge leap, edge computing promises substantial economic benefits by reducing datacenter costs. Datacenter cost reduction alone is enough to drive many organizations to move compute towards the Edge as they transition to a network architecture that supports 5G.
Edge and increased cost savings
Having a centralized architecture model means storage is located close to compute resources. That works well for companies looking for pay-per-use billing, even if it means lower performance for end-users. However, when vast loads of data are transferred on or off the Cloud, the transfer becomes very expensive and, as a result, much less appealing. Costs associated with bandwidth, the distance data travels, and the resources needed to monitor and manage those transfers all add up quickly. The 'Edge' equation is quite simple: reduce data volumes sent to the Cloud and automatically lower your costs.
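As a back-of-the-envelope illustration of that equation, consider a site that filters and aggregates data at the Edge before forwarding it to the Cloud. All of the prices, volumes, and the retention ratio below are made-up assumptions for the sketch, not quoted rates from any provider:

```python
# Illustrative cost sketch: what does edge filtering save on cloud data transfer?
# All prices and volumes are hypothetical assumptions, chosen only for illustration.

def monthly_transfer_cost(gb_per_day: float, price_per_gb: float) -> float:
    """Cost of shipping data to the Cloud over a 30-day month."""
    return gb_per_day * 30 * price_per_gb

RAW_GB_PER_DAY = 500.0   # raw device/sensor data generated at one site (assumed)
EDGE_KEEP_RATIO = 0.10   # fraction still forwarded after local filtering (assumed)
PRICE_PER_GB = 0.08      # assumed transfer price in USD per GB (hypothetical)

cloud_only = monthly_transfer_cost(RAW_GB_PER_DAY, PRICE_PER_GB)
with_edge = monthly_transfer_cost(RAW_GB_PER_DAY * EDGE_KEEP_RATIO, PRICE_PER_GB)

print(f"Cloud-only transfer cost: ${cloud_only:,.2f}/month")          # $1,200.00/month
print(f"With edge filtering:      ${with_edge:,.2f}/month")           # $120.00/month
print(f"Savings:                  ${cloud_only - with_edge:,.2f}/month")  # $1,080.00/month
```

Under these assumptions, forwarding only 10% of the raw data cuts the transfer bill by 90%; the exact numbers will vary by provider and workload, but the direction of the equation holds.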
These cost savings are pushing companies to rethink their IT infrastructure approach and incorporate Edge into their traditional data processing strategies. With edge server hardware, part of the data processing can be done close to the users rather than automatically sent to the Cloud for analysis. This distributed offloading reduces capital expenses as well as bandwidth costs. And by shortening the distance data travels before it is processed, operators have more money to spend on modernizing legacy systems that were previously considered barriers to infrastructure innovation.
Where Kontron fits into the Edge Computing value chain
Kontron sits at the much-needed hardware portion of the value chain, enabling a more distributed infrastructure from a single, intelligent platform, with servers that live inside the local datacenter, in cell phone towers, in street cabinets, and so on. We provide the compute infrastructure that resides near the source of the data: hardware that allows analytics and data gathering to occur near mobile phones, tablets, laptops, and more. We are beginning to see movement toward this virtualized infrastructure and the numerous benefits it brings to operators, enabling edge services and, essentially, more computing power in a smaller hardware footprint.
Although centralized datacenters will undoubtedly continue to play a vital role in digital services, Edge becomes a better option under these primary conditions:
1. When there isn’t enough bandwidth or reliability to send the data back to the Cloud.
2. When low latency is critical. Autonomous vehicle (AV) applications, for example, require real-time response to avoid accidents or collisions, where the outcome can be a matter of life or death.
3. When operators need ruggedized equipment and stronger security to deploy applications anywhere, even in harsh environments such as a street-side cabinet.
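The three conditions above can be sketched as a simple placement heuristic. The field names and threshold values here are illustrative assumptions for the sketch, not part of any real product or API:

```python
from dataclasses import dataclass

@dataclass
class Workload:
    """Illustrative workload profile; all fields and thresholds are hypothetical."""
    uplink_mbps: float       # available bandwidth back to the central Cloud
    link_reliable: bool      # is the backhaul link dependable?
    max_latency_ms: float    # latency budget the application can tolerate
    harsh_environment: bool  # deployed in e.g. a street-side cabinet?

def place_workload(w: Workload) -> str:
    """Return 'edge' when any of the three conditions from the list holds."""
    if w.uplink_mbps < 10 or not w.link_reliable:  # 1. bandwidth / reliability
        return "edge"
    if w.max_latency_ms < 20:                      # 2. low-latency demand (e.g. AV)
        return "edge"
    if w.harsh_environment:                        # 3. ruggedized deployment
        return "edge"
    return "cloud"

# An autonomous-vehicle workload: plenty of bandwidth, but a tiny latency budget.
av = Workload(uplink_mbps=100, link_reliable=True,
              max_latency_ms=5, harsh_environment=False)
print(place_workload(av))  # edge
```

In practice the decision is far more nuanced (cost, data gravity, regulation), but the shape of the logic is the same: any one of these constraints is enough to pull a workload toward the Edge.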
As more compute power goes to the far Edge, it will need a foundation. Operators are looking for a converged platform that handles storage, processing, security, and networking, with a software layer running on top. Kontron’s ME1100 mobile edge computing servers quickly adapt to the service provider's virtual infrastructure and unlock IT and cloud computing capabilities within the radio access network (RAN). Ideal for ultra-low-latency and high-bandwidth applications, these short-depth Xeon-D servers bring content closer to the Edge while helping operators overcome restricted space, power, and environmental challenges, and save on costs.
What does this mean for Cloud?
Finally, this brings me to question whether Cloud is disappearing with the rise of edge computing. I realize the answer is 'no.' All it means is that Cloud is coming closer to us, the consumers of the data. An exciting challenge for forward-thinking companies will be how to deploy both Cloud and edge computing technologies and work with both simultaneously.