Cloud Continuum: From Cloud to IoT to Edge Computing

by Sanjeev Kapoor, 17 Feb 2023

The concept of cloud computing services has its roots in the 1960s, when researchers introduced the idea of accessing computing resources remotely through a supercomputer. This idea was put into practice during the mainframe era, when mainframe systems served multiple client computers. Over the years, the concept evolved into a distributed computing paradigm, thanks to the advent of multi-tier models and of virtualization technologies. Nevertheless, the term "cloud computing" itself did not come into use until the early 2000s, when companies like Amazon, Google, and Salesforce began offering web-based applications and services that could be accessed from anywhere and on any device, using a pay-as-you-go model. The popularity of cloud computing has since exploded, as businesses and organizations of all sizes have recognized the benefits of accessing computing resources on demand, without having to invest in and maintain their own physical infrastructure.

Nowadays, a great number of modern enterprises leverage the capacity, scalability, and quality of service of the cloud. Specifically, modern businesses use a variety of cloud computing services, ranging from access to virtualized computing resources (e.g., compute cycles) to the use of a variety of software applications (e.g., Enterprise Resource Planning (ERP) and Customer Relationship Management (CRM) systems). Despite its popularity, cloud computing has some well-known limitations when it comes to supporting real-time, low-latency applications. This is because cloud resources are usually accessed over a wide area network, which incurs delays that are intolerable for real-time applications. At the same time, several organizations are reluctant to move sensitive data to the cloud due to privacy and data protection concerns. These are some of the main reasons why cloud computing is nowadays complemented by other computing paradigms, such as edge computing.

 

From Cloud Capacity to Edge Intelligence

Edge computing is a distributed computing paradigm that processes data and executes applications closer to the source of the data. Hence, edge computing emphasizes local data processing rather than sending all data to a centralized data center or cloud. This is often done to reduce latency, improve response times, and lower the amount of data that needs to be transmitted over a wide area network. The development of edge computing infrastructures is propelled by the advent of 5G networks, which enable the deployment of high-performance local networks.

In edge computing, data is processed and analyzed on devices or servers that are located closer to the edge of the network, such as edge servers, edge clusters, and embedded devices. This provides faster and more efficient data processing. At the same time, it results in a much more efficient use of network resources, given that a much smaller volume of data must travel through the network to the cloud.
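To illustrate the bandwidth savings, here is a minimal Python sketch of an edge gateway loop; the sensor, thresholds, and cloud endpoint are hypothetical stand-ins for a real deployment. Routine readings are aggregated locally, so one small summary message replaces dozens of raw samples on the wide area network, while urgent readings are forwarded immediately.

```python
import random
import statistics
import time

WINDOW_SIZE = 60        # raw samples aggregated per upload (assumption)
ALERT_THRESHOLD = 90.0  # readings above this are forwarded at once (assumption)

def read_sensor() -> float:
    """Stand-in for a real sensor driver (e.g., an I2C temperature probe)."""
    return random.gauss(70.0, 10.0)

def send_to_cloud(payload: dict) -> None:
    """Stand-in for an MQTT publish or HTTPS POST to a cloud endpoint."""
    print("uploading:", payload)

def edge_loop(iterations: int = 120) -> None:
    window: list[float] = []
    for _ in range(iterations):
        value = read_sensor()
        if value > ALERT_THRESHOLD:
            # Urgent events still reach the cloud with minimal delay.
            send_to_cloud({"type": "alert", "value": value, "ts": time.time()})
        window.append(value)
        if len(window) >= WINDOW_SIZE:
            # One compact summary replaces WINDOW_SIZE raw messages.
            send_to_cloud({
                "type": "summary",
                "mean": statistics.fmean(window),
                "min": min(window),
                "max": max(window),
                "n": len(window),
            })
            window.clear()

if __name__ == "__main__":
    edge_loop()
```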

Edge computing is particularly useful in situations where real-time data analysis is required, or where the volume of data generated is too large to transmit to a centralized data center for processing. This is, for example, the case in a very large number of applications that must implement intelligent functionalities locally, i.e., at the point of action rather than within the cloud. Here are some prominent examples (a toy latency sketch follows the list):

  • Most autonomous and connected vehicles must implement local intelligence functionalities in real-time (e.g., obstacle avoidance, automated braking) to ensure safe and effective driving.
  • Several industrial control systems (e.g., production defect detection) must operate in real-time to ensure timely and high-quality production.
  • Smart security applications in urban environments (e.g., real-time video analytics) need to identify security issues (e.g., abnormal behaviors) at the point of interest rather than losing time for transferring and processing large amounts of data from cameras in the cloud.
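To make the latency argument concrete, the following hypothetical Python sketch evaluates a toy braking rule locally on the vehicle. The numbers are assumptions for illustration: a wide-area round trip to the cloud often takes tens of milliseconds, whereas a local decision like this one completes in microseconds, comfortably inside a single-digit-millisecond safety deadline.

```python
import time

DEADLINE_MS = 10.0  # assumed safety deadline for the braking decision

def local_braking_decision(distance_m: float, speed_mps: float) -> bool:
    """Toy rule executed on the vehicle itself: brake if a collision
    is less than two seconds away (purely illustrative logic)."""
    time_to_collision = distance_m / max(speed_mps, 0.01)
    return time_to_collision < 2.0

start = time.perf_counter()
brake = local_braking_decision(distance_m=25.0, speed_mps=15.0)
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"brake={brake}, decided in {elapsed_ms:.4f} ms "
      f"(budget: {DEADLINE_MS} ms; no network round trip involved)")
```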

Overall, edge computing drives the implementation of edge intelligence functionalities at the point of interest. Furthermore, it reduces the attack surface of cloud applications, as it drastically limits the amount of data that is transferred from the data sources to the cloud. This leaves far fewer opportunities for data breaches.

In practice, edge computing is a complementary approach to cloud computing, as it allows for distributed data processing and provides a balance between centralized cloud resources and local device processing. This is also why the terms "cloud" and "edge" are commonly combined into the term "cloud/edge computing," a paradigm characterized by the interplay between cloud and edge applications.

 

The IoT Part of the Continuum: Edge Intelligence on Devices

In several cases, Internet of Things (IoT) devices can serve as edge nodes in the cloud/edge computing paradigm. Therefore, IoT plays a crucial role in both cloud and edge computing. IoT devices are typically small, low-power devices that collect and transmit data to be analyzed and processed. These devices generate vast amounts of data, and cloud and edge computing provide the infrastructure needed to store, process, and analyze this data.

IoT devices can benefit from both cloud and edge computing, depending on their specific needs. For example, an IoT device that generates large amounts of data may benefit from cloud computing, while an IoT device that requires real-time processing may be better served by edge computing. Moreover, IoT enables novel forms of edge intelligence, which are provided on the device itself. Nowadays, this form of edge intelligence is becoming more sophisticated than in the past, as it is possible to execute Artificial Intelligence (AI) models within pervasive, resource-constrained devices (e.g., microcontrollers). This explains the popularity of Embedded Machine Learning and TinyML, which are increasingly used to make machines more intelligent than ever before.
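As a rough illustration of the TinyML workflow, the sketch below converts a small Keras model into a quantized TensorFlow Lite flatbuffer, the format that TensorFlow Lite for Microcontrollers can execute on-device. The tiny untrained model is a placeholder for a real trained one; in practice, the exported file is typically embedded into firmware as a C array.

```python
import tensorflow as tf

# Placeholder model: a real project would train this on sensor data first.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(3,)),               # e.g., accelerometer x/y/z
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),  # e.g., "idle" vs "moving"
])

# Convert to TensorFlow Lite with default optimizations, which quantize
# weights to shrink the model so it fits in microcontroller flash/RAM.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
print(f"exported {len(tflite_model)} bytes")
```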

Overall, IoT is an important enabler of both cloud and edge computing, as it generates vast amounts of data that need to be stored, processed, and analyzed. Cloud and edge computing provide the infrastructure needed to make sense of this data and extract meaningful insights from it.

The symbiotic relationship between cloud computing, edge computing, and IoT is reflected in the term cloud/edge/IoT computing continuum. The continuum is a way of visualizing how cloud computing, edge computing, and the Internet of Things (IoT) relate to each other. It represents the different layers of the computing infrastructure that work together to support different applications. Specifically:

  • At the top of the continuum is cloud computing, which provides a highly scalable and flexible infrastructure for storing, processing, and analyzing data, spanning a single cloud or multiple clouds (e.g., hybrid clouds).
  • In the middle of the continuum is edge computing, which brings the computing and storage resources closer to the IoT devices themselves.
  • At the bottom of the continuum is the IoT itself, which consists of a vast network of connected devices that generate and transmit data. IoT includes a wide variety of devices, from sensors and wearables to industrial machinery and smart appliances.

At each layer, different technologies and architectures are used to meet the specific requirements of enterprise applications. By working together, these layers form a continuum that enables the seamless and efficient processing of data from IoT devices. Modern enterprises are expected to invest in the different layers of the continuum to meet the requirements of their applications and improve their business results.
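As a closing illustration, here is a hypothetical Python sketch of how a workload might be placed along the continuum. The thresholds and workloads are assumptions for illustration only; real placement engines weigh many more factors, such as cost, privacy, energy, and connectivity.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    max_latency_ms: float  # end-to-end deadline the application tolerates
    data_rate_mbps: float  # raw data produced at the source

def place(w: Workload) -> str:
    """Toy placement rule for the cloud/edge/IoT continuum (illustrative)."""
    if w.max_latency_ms < 10:
        return "IoT device"  # hard real-time: decide at the point of action
    if w.max_latency_ms < 100 or w.data_rate_mbps > 50:
        return "edge"        # latency-sensitive or bandwidth-heavy
    return "cloud"           # scalable storage, analytics, and batch work

for w in [
    Workload("automated braking", max_latency_ms=5, data_rate_mbps=1),
    Workload("video analytics", max_latency_ms=50, data_rate_mbps=200),
    Workload("monthly ERP report", max_latency_ms=60_000, data_rate_mbps=0.1),
]:
    print(f"{w.name:>20} -> {place(w)}")
```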
