What is Edge Computing?
  • by x32x01
See, most organisations took the plunge into the cloud computing paradigm. The cloud provided plenty of benefits, which propelled its massive adoption, but it was and still is a very 'centralized' approach. Most of their DATA was sent either to their data-centers or to the cloud, where it would be processed and analyzed to support decision-making.

But many companies soon realized that this approach was introducing unnecessary LATENCY, along with challenges around BANDWIDTH, resiliency, and data sovereignty. A new, hybrid computing model emerged: they would still lean on cloud capabilities for tasks with very intensive computing workloads, but they also needed to handle workloads that required processing in 'REAL-TIME.' This was achieved with Edge Computing.

WHAT IS EDGE COMPUTING?
At its simplest, edge computing is the practice of capturing, processing, and analyzing data near where it is created.
Put another way, edge computing brings data and computing closest to the point of interaction. It takes a 'decentralized' approach to data processing and allows your company to move analysis (read: analytics) and the subsequent decision-making literally VERY CLOSE to where the actual data is produced. And this analysis happens in 'real time.'
In October 2018, Gartner declared that "around 10% of enterprise-generated data is created and processed outside a traditional centralized data-center or cloud." Gartner also predicted that this figure would reach 75% by 2025. This is what makes Edge Computing so important.
However, Edge Computing can be a bit difficult to understand, because it involves the complete IT environment, the devices, and the processes that happen at the edge of a network.

What Is This Network Edge?

The edge of a network refers to where the local network or its devices interact with the internet - the outer border that “touches” the internet. It presents both a network security concern and an opportunity to speed up processing CLOSER to - or WITHIN - the devices at the edge.

For Internet devices, the network edge is where the device, or the local network containing the device, communicates with the Internet. The edge is a bit of a fuzzy term; for example, a user’s computer or the processor inside an IoT camera can be considered the network edge, but the user’s router, ISP, or local edge server is also considered part of the edge. The important takeaway is that the edge of the network is GEOGRAPHICALLY close to the device, unlike origin servers and cloud servers, which can be very far from the devices they communicate with.

EXAMPLES OF EDGE COMPUTING

There are literally countless devices operating at the edge of your network. For example:
  • IoT devices: from the toaster and refrigerator in your kitchen to your smartwatch, or the many types of scanners used on a factory floor.
  • Smartphones and other wireless devices that transmit data over a 5G network, where faster local processing creates a smoother end-user experience.
  • Mobile functionality embedded in your car or other vehicle. Self-driving cars need to process the information they receive from sensors about the speed and proximity of vehicles, people, and other objects. With edge computing, this can be done instantly, enhancing the safety of the driver and others.
  • Many devices used in Building Management solutions
  • Many devices and sensors used in manufacturing plants, offshore oil rigs, or retail outlets
  • Many devices used in hospitals or other medical settings
A wearable health monitor is an example of a basic edge solution. It can locally analyze data like your heart rate, blood pressure, or sleep patterns and provide recommendations WITHOUT needing to connect to the cloud frequently. Only certain information is then sent to the cloud, while most of it is handled within the edge network.
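
To make that split concrete, here is a minimal Python sketch of the 'analyze locally, send only summaries upstream' pattern. The heart-rate threshold and the upload_to_cloud() helper are made up for this illustration; they are not taken from any real wearable's SDK.

```python
# Hypothetical sketch: a wearable processes readings locally and only
# forwards summaries or anomalies to the cloud.

from statistics import mean

HEART_RATE_MAX = 120  # example threshold, chosen arbitrarily for illustration


def upload_to_cloud(payload: dict) -> None:
    """Placeholder for the (rare) network call to a cloud backend."""
    print(f"uploading to cloud: {payload}")


def process_on_device(heart_rate_samples: list[int]) -> None:
    """Run the analysis on the device itself; most data never leaves it."""
    avg = mean(heart_rate_samples)

    if avg > HEART_RATE_MAX:
        # Only the anomaly summary travels over the network, not the raw samples.
        upload_to_cloud({"event": "elevated_heart_rate", "average_bpm": round(avg, 1)})
    else:
        # Normal readings are handled entirely at the edge.
        print(f"average {avg:.1f} bpm is normal; nothing sent to the cloud")


process_on_device([72, 75, 71, 74])      # stays local
process_on_device([130, 128, 134, 140])  # summary sent upstream
```

The point is simply that the raw samples never leave the device; only a small summary does.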

More complex edge computing solutions can act as gateways. In a vehicle, for example, an edge solution may aggregate LOCAL data from traffic signals, GPS devices, other vehicles, proximity sensors and so on, and process this information locally to improve your safety or navigation on the roads.
A still more complex example is the edge server, such as those currently being deployed in next-generation (5G) mobile networks. These edge servers sit in 5G cellular base-stations, and their job is to host applications and cache content for local subscribers. With this topology, data does NOT have to travel all the way to a remote data-center for the edge device to function properly.

How Is Edge Computing Different from Cloud Computing?

Cloud computing is about processing data in a data-center or public cloud, whereas in edge computing the processing takes place 'locally.'
As I touched on with the example of Edge Servers above, edge servers actually perform many of the functions of full-fledged data-centers. They are capable of hosting applications and caching content close to where end-users are doing their computing.
You may not realize it, but when you watch movies on Netflix or YouTube (video streaming services), you are not connected to their centralized data-center the whole time. Instead, you are connected to the geographically nearest (Edge) servers, which keep a cache of the most frequently accessed content. The content is then streamed to you from those nearby servers.
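
The caching behaviour described above can be sketched in a few lines of Python. This is not how Netflix's or YouTube's CDNs actually work internally; it is just an illustrative in-memory cache where fetch_from_origin() stands in for the expensive trip back to the central data-center.

```python
# Illustrative sketch of an edge cache: serve popular content locally,
# and only go back to the distant origin server on a cache miss.

edge_cache: dict[str, bytes] = {}


def fetch_from_origin(content_id: str) -> bytes:
    """Stand-in for the slow, long-distance request to the origin data-center."""
    print(f"cache miss: fetching {content_id} from the origin server")
    return f"<video bytes for {content_id}>".encode()


def serve_from_edge(content_id: str) -> bytes:
    """Serve from the nearby edge server if cached; otherwise fetch and cache."""
    if content_id not in edge_cache:
        edge_cache[content_id] = fetch_from_origin(content_id)
    else:
        print(f"cache hit: streaming {content_id} from the nearest edge server")
    return edge_cache[content_id]


serve_from_edge("popular-show-s01e01")  # first viewer pays the long round trip
serve_from_edge("popular-show-s01e01")  # later viewers are served locally
```

The first request for a piece of content pays the long round trip; every later request from the same area is served from the nearby edge server.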

The biggest advantage of Edge Computing, after real-time processing, is that a powerful edge device can support smart applications. These can incorporate machine learning (ML) and artificial intelligence (AI), AR/VR, robotics, etc., taking advantage of their proximity to the source of input. In this way, smart applications can recognize patterns in the environment of the edge devices on which they operate, and then use this information to adjust how they function and the services they provide.

Every time you ask Siri or Alexa or Google a question, for example, your voice recording is sent to an edge network where Google, Apple, or Amazon uses AI to translate voice to text to enable a command processor to generate an answer to your question.

Hence, the data is stored at INTERMEDIATE POINTS at the ‘edge’ of the network, rather than always at the central server or data-center. It is very critical for you to remember this point!

Edge computing is essential to the manufacturing, healthcare, transportation, telecommunication, and farming sectors, as they require a large chunk of their data to be processed locally and in real time. However, some of their data still needs to be sent to the cloud or to their data-centers.
Cloud computing, on the other hand, can introduce latency because of the distance between users and the data-centers where cloud services are hosted. Edge computing moves computing closer to end users to minimize the distance that data has to travel, while still retaining the centralized nature of cloud computing.
Thus, instead of dwelling on the differences between cloud and edge computing, you should think of Edge Computing as an extension of the cloud rather than a replacement.

A Beautiful Example of Edge Computing

Consider a building secured with dozens of high-definition IoT video cameras. These are "dumb" cameras that simply output a raw video signal and continuously stream that signal to a cloud server. On the cloud server, the video output from all the cameras is put through a motion-detection application to ensure that only clips featuring activity are saved to the server’s database.
This means there is a constant and significant strain on the building’s Internet infrastructure, as significant bandwidth gets consumed by the high volume of video footage being transferred. Additionally, there is very heavy load on the cloud server that has to process the video footage from all the cameras simultaneously.
Now imagine that the motion sensor computation is moved to the network edge. What if each camera used its own internal computer to run the motion-detecting application and then sent footage to the cloud server as needed?
This would result in a significant reduction in bandwidth use, because much of the camera footage would never have to travel to the cloud server.
Additionally, the cloud server would now only be responsible for storing the important footage, meaning that the server could communicate with a higher number of cameras without getting overloaded. This is what edge computing looks like.
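
To make the before-and-after concrete, here is a hedged Python sketch of the 'smart camera' side of that scenario. The frame-difference check and the send_clip_to_cloud() helper are simplifications invented for this example; a real camera would run a proper computer-vision pipeline on its internal computer.

```python
# Hypothetical sketch: motion detection runs on the camera itself, and only
# frames that actually contain motion are uploaded to the cloud server.

MOTION_THRESHOLD = 10  # arbitrary per-pixel difference threshold for this example


def send_clip_to_cloud(frame: list[int]) -> None:
    """Placeholder for uploading an interesting clip to the cloud server."""
    print(f"uploading frame with motion: {frame}")


def has_motion(previous_frame: list[int], current_frame: list[int]) -> bool:
    """Very crude motion check: did any 'pixel' change by more than the threshold?"""
    return any(
        abs(curr - prev) > MOTION_THRESHOLD
        for prev, curr in zip(previous_frame, current_frame)
    )


def process_frame_on_camera(previous_frame: list[int], current_frame: list[int]) -> None:
    """Decide locally whether a frame is worth sending upstream at all."""
    if has_motion(previous_frame, current_frame):
        send_clip_to_cloud(current_frame)
    else:
        # The vast majority of frames are discarded here, saving bandwidth.
        pass


process_frame_on_camera([100, 100, 100], [101, 99, 100])   # no motion, nothing uploaded
process_frame_on_camera([100, 100, 100], [100, 180, 100])  # motion detected, uploaded
```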

What Are The Challenges Of Edge Computing?

1. Edge computing can simplify a distributed IT environment, but edge infrastructure isn’t always simple to implement and manage.
2. Scaling out edge servers to many small sites can be more complicated than adding the equivalent capacity to a single core data-center. The increased overhead of physical locations can be difficult for smaller companies to manage.
3. Edge computing sites are usually remote with limited or no on-site technical expertise. If something fails on site, you need to have an infrastructure in place that can be fixed easily by non-technical local labor and further managed centrally by a small number of experts located elsewhere.
4. Site management operations need to be highly reproducible across all edge computing sites to simplify management, allowing for easier troubleshooting. Challenges arise when software is implemented in slightly different ways at each site.
5. Physical security of edge sites is often much lower than that of core sites. An edge strategy has to account for a greater risk of malicious or accidental situations.

Security Concerns with Edge Computing

There are significant cybersecurity concerns with respect to Edge computing.
So many types of devices make up edge computing. Taken together, they significantly increase the 'ATTACK SURFACE' your organisation has to defend. Every edge device connected to your systems and your network topology is another attack surface.
For example, suppose you have a number of edge devices in your factory. You assign different workers to log in to those devices so they can use them. These workers are tasked with sending regular information to a local edge server, and this server also sends some data back to those devices. It is quite a natural situation...

What if some of those devices are left with weak passwords? Anyone can then breach those devices, whether an outside hacker, a disgruntled insider, or any other malicious threat-actor... and they can send harmful code to the server that supports your edge network.

Such threat-actors can very easily spy on every activity going on within your network and steal the data being transferred across it, if you have not put proper security measures in place for each device.

An edge computing environment is also susceptible to distributed denial-of-service (DDoS) attacks. Because many edge networks are still connected to the internet, a DDoS attack could render the devices at the edge useless. It is therefore vital to ensure your edge network is adequately secured.

To combat these security challenges, you will have to adopt a combination of micro-segmentation and zero-trust principles...
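
As a loose illustration of what micro-segmentation plus zero trust can mean at the edge, the sketch below authenticates a device and checks its segment policy on every single request before the edge server acts on it. The token store and segment policy are invented for the example; real deployments would rely on mutual TLS, certificates, or a dedicated identity service.

```python
# Hypothetical sketch of zero-trust-style checks on an edge server:
# every request is authenticated and confined to its network segment,
# no matter where it comes from.

# Invented example data: registered device tokens and the segment each device belongs to.
DEVICE_TOKENS = {"scanner-01": "s3cret-token-a", "sensor-07": "s3cret-token-b"}
DEVICE_SEGMENT = {"scanner-01": "factory-floor", "sensor-07": "factory-floor"}

# Micro-segmentation policy: which segments may talk to which services.
SEGMENT_POLICY = {"factory-floor": {"telemetry-service"}}


def handle_request(device_id: str, token: str, target_service: str) -> bool:
    """Authenticate the device and enforce its segment policy on every request."""
    # 1. Never trust a device just because it is on the local network.
    if DEVICE_TOKENS.get(device_id) != token:
        print(f"rejected {device_id}: invalid credentials")
        return False

    # 2. Even authenticated devices only reach services their segment allows.
    segment = DEVICE_SEGMENT.get(device_id, "unknown")
    if target_service not in SEGMENT_POLICY.get(segment, set()):
        print(f"rejected {device_id}: segment '{segment}' may not reach {target_service}")
        return False

    print(f"accepted {device_id} -> {target_service}")
    return True


handle_request("scanner-01", "s3cret-token-a", "telemetry-service")  # allowed
handle_request("scanner-01", "wrong-token", "telemetry-service")     # rejected
handle_request("sensor-07", "s3cret-token-b", "billing-service")     # blocked by policy
```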
 
Tags: edge computing
