The last few years have witnessed a tremendous increase in the consumption of mobile applications. With falling smartphone prices and mobile tariffs, there is a growing dependence on mobile apps for professional and personal tasks. From banking to shopping to work, apps enable us to complete everyday tasks, especially during the ongoing COVID-19 pandemic.
Many apps require a quick response, which is the key reason telcos are increasingly using the mobile edge to maintain quality of service.
Traditionally, data collected from devices had to be routed back to the cloud or a central data centre to be analyzed for insights. CSPs and organizations are now processing that data at the network edge instead of sending it to a central location.
The mobile edge has opened up new opportunities for Communications Service Providers (CSPs). It has unlocked innovation at the network edge, with the development of applications for businesses, industries, and consumers. Emerging technologies such as Augmented Reality (AR), Virtual Reality (VR), Artificial Intelligence (AI), and gaming will further increase demand for the mobile edge.
Hosting applications on the network edge offers several benefits. The most significant is reduced latency: several new-age use cases, such as connected vehicles, mission-critical public safety, gaming, smart cities, and smart factories, require latencies below 10 milliseconds. The network edge also brings down costs by reducing backhaul traffic, and it helps CSPs meet regulatory and compliance requirements because the data never leaves the owner’s site. This is particularly relevant for sensitive and highly regulated applications.
Despite these benefits, running applications at the network edge comes with several challenges. One significant problem is the variety of underlying platform architectures, such as virtual Radio Access Network (vRAN) and Next Generation Central Office (NGCO), among others. Further, there are various access termination methods, such as LTE, CUPS, SGi, S1, and 5G. All of this leads to network complexity, which deters developers from building innovative applications for the network. It also complicates the development and testing cycles for applications developed by CSPs and restricts their portability.
More often than not, enterprises use different apps hosted on different public and private clouds. Applications running on the edge will need to connect to these various cloud platforms.
Further, the present generation of servers and network gear is unable to meet the requirements for data crunching, delivery, acceleration, and orchestration. This poses a unique distributed-systems challenge for CSPs.
Telcos need a scalable and flexible framework that allows them to efficiently manage a multi-service edge. Service providers also need the ability to service-chain network applications at the edge based on the subscriber and the associated policies. The framework should integrate present and future applications and manage them effectively. Needless to say, service providers require high throughput at the network edge.
Over the last few years, Intel’s Open Network Edge Services Software (OpenNESS), a software toolkit, has emerged as a technology of choice for the telcos to address these challenges.
It allows developers to create apps that can run in any edge location or even on a centralized cloud. OpenNESS exposes standard APIs aligned with 3GPP and the ETSI Multi-access Edge Computing (MEC) industry group.
It also offers cloud adapters to work seamlessly with Amazon Web Services (AWS) and Microsoft Azure, among others, and provides cloud and IoT developers with an easy-to-use toolkit to develop and deploy applications at the network edge or in on-premise edge locations.
Telecom service providers also need to ensure that the platform is not only great for developers but also easy to operate at scale. Existing infrastructure and service-orchestration solutions are not built for this app ecosystem. Further, some orchestration solutions are proprietary and come with vendor lock-in.
OpenNESS works with Kubernetes, which provisions edge resources and configures the enhanced platform features for the network edge. With OpenNESS, CSPs can create a platform that scales efficiently and can orchestrate resources and applications effectively.
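As a rough illustration of how Kubernetes can steer workloads to edge resources, the sketch below labels a node as an edge node and pins a Deployment to it with a `nodeSelector`. The label key, app name, and image are illustrative assumptions, not taken from the OpenNESS documentation:

```shell
# Hedged sketch: dedicate a node to the edge role and schedule an app onto it.
# The label key, app name, and image below are hypothetical examples.

# 1) Mark a cluster node as an edge node (run against a real cluster):
#    kubectl label node <edge-node-name> node-role.kubernetes.io/edge=

# 2) A Deployment that only schedules onto edge-labelled nodes:
cat > edge-app.yaml <<'EOF'
apiVersion: apps/v1
kind: Deployment
metadata:
  name: edge-analytics            # hypothetical app name
spec:
  replicas: 1
  selector:
    matchLabels:
      app: edge-analytics
  template:
    metadata:
      labels:
        app: edge-analytics
    spec:
      nodeSelector:
        node-role.kubernetes.io/edge: ""   # only edge-labelled nodes match
      containers:
      - name: analytics
        image: registry.example.com/edge-analytics:latest
EOF

# 3) Apply it (again, against a real cluster):
#    kubectl apply -f edge-app.yaml
echo "manifest written"
```

A plain `nodeSelector` keeps the sketch simple; in practice, taints and node affinity give finer control over which workloads may land on constrained edge nodes.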
The OpenNESS platform supports innovation with a microservices-based, cloud-native architecture that gives CSPs a clear path for driving the adoption of cloud-native solutions at the edge. Because OpenNESS is built on microservices and open APIs, other solutions can easily align with this environment: Virtual Network Functions (VNFs), user-facing apps, and orchestrators can all use the APIs to work with the framework. Microservices are also easier to manage and upgrade than traditional monolithic applications.
Clearly, Intel’s OpenNESS makes it easier for software and application developers to create innovative products that can run in any edge environment.
Further, the microservices architecture allows CSPs to fast-track the adoption of cloud-native solutions at the edge and connect to multi-cloud environments, enabling them to orchestrate services at scale.
As we advance, edge computing will support multi-tenancy and the KPIs of several new-age services and applications, encouraging a more decentralized and collaborative approach.