What Is Edge Computing? Everything You Need to Know - TeKnology Article


Data is the lifeblood of modern business, providing valuable business insight and supporting real-time control over critical business processes and operations. Today's businesses are awash in an ocean of data, and huge amounts of data can be routinely collected from sensors and IoT devices operating in real-time from remote locations and inhospitable operating environments almost anywhere in the world.

But this virtual flood of data is also changing the way businesses handle computing. The traditional computing paradigm, built on a centralized data center and the everyday internet, isn't well suited to moving endlessly growing rivers of real-world data. Bandwidth limitations, latency issues, and unpredictable network disruptions can all conspire to impair such efforts. Businesses are responding to these data challenges with edge computing architectures.

Let's take a closer look at edge computing.


TABLE OF CONTENTS

WHAT IS EDGE COMPUTING?

GREATEST EXAMPLES OF EDGE COMPUTING

HOW DOES EDGE COMPUTING WORK?

ADVANTAGES OF EDGE COMPUTING

HOW TO BUY AND DEPLOY EDGE COMPUTING SYSTEMS?




WHAT IS EDGE COMPUTING?

Edge computing is the practice of capturing, storing, processing, and analyzing data near the client, where the data is generated, instead of in a centralized data-processing warehouse.

Edge computing is a distributed computing paradigm that brings computation and data storage closer to the sources of data, which is expected to improve response times and save bandwidth. A common misconception is that edge and IoT are synonymous: edge computing is a topology- and location-sensitive form of distributed computing, while IoT is a use case instantiation of edge computing. The term refers to an architecture rather than a specific technology.

Edge computing moves some portion of the storage and compute resources out of the central data center and closer to the source of the data itself. Rather than transmitting raw data to a central data center for processing and analysis, that work is instead performed where the data is actually generated -- whether that's a retail store, a factory floor, a sprawling utility, or across a smart city. Only the result of that computing work at the edge, such as real-time business insights, equipment maintenance predictions, or other actionable answers, is sent back to the main data center for review and other human interactions.
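That "send only the result" pattern can be sketched in a few lines. This is a hypothetical illustration, not a real edge platform API: the `summarize_readings` helper, the threshold, and the readings are all invented for the example.

```python
# Hypothetical sketch: an edge node aggregates raw sensor readings locally
# and forwards only a compact summary to the central data center.
from statistics import mean

def summarize_readings(readings, alert_threshold):
    """Reduce a batch of raw readings to the small result that actually
    travels upstream: an average, a count, and any out-of-range alerts."""
    return {
        "avg": round(mean(readings), 2),
        "count": len(readings),
        "alerts": [r for r in readings if r > alert_threshold],
    }

# 1,000 raw readings stay at the edge; only this small dict crosses the network.
raw = [20.0 + (i % 7) * 0.1 for i in range(1000)]
summary = summarize_readings(raw, alert_threshold=25.0)
```

The payload sent upstream is a few dozen bytes regardless of how many raw readings the edge node ingested, which is exactly the bandwidth win described above.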


GREATEST EXAMPLES OF EDGE COMPUTING

There are far too many examples of edge computing use cases to list here, so we’ve chosen 10 important ones below:


AUTONOMOUS VEHICLES

Autonomous platooning of truck convoys will likely be one of the first use cases for autonomous vehicles. Here, a group of trucks travels close behind one another in a convoy, saving fuel costs and decreasing congestion. With edge computing, it will be possible to remove the need for drivers in all trucks except the front one, because the trucks will be able to communicate with each other with ultra-low latency.




REMOTE MONITORING OF ASSETS IN THE OIL AND GAS INDUSTRY

Oil and gas plants are often in remote locations. Edge computing enables real-time analytics with processing much closer to the asset, meaning there is less reliance on good quality connectivity to a centralized cloud.

SMART GRID

Edge computing will be a core technology in the more widespread adoption of smart grids and can help allow enterprises to better manage their energy consumption.

Sensors and IoT devices connected to an edge platform in factories, plants, and offices are being used to monitor energy use and analyze their consumption in real-time. With real-time visibility, enterprises and energy companies can strike new deals, for example where high-powered machinery is run during off-peak times for electricity demand. This can increase the amount of green energy (like wind power) an enterprise consumes.
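Finding an off-peak window for high-powered machinery boils down to a small scheduling calculation an edge platform could run against live meter data. The function name and the hourly prices below are illustrative, not real tariff data.

```python
# Hypothetical sketch: choosing an off-peak run window for high-powered
# machinery, given hourly electricity prices reported by edge-connected meters.

def cheapest_window(hourly_prices, hours_needed):
    """Return (start_hour, total_cost) of the cheapest contiguous window."""
    best_start, best_cost = 0, float("inf")
    for start in range(len(hourly_prices) - hours_needed + 1):
        cost = sum(hourly_prices[start:start + hours_needed])
        if cost < best_cost:
            best_start, best_cost = start, cost
    return best_start, best_cost

# Illustrative prices: demand peaks in the evening, overnight is cheapest.
prices = [12, 10, 9, 9, 10, 14, 18, 22, 25, 24, 22, 20,
          19, 19, 20, 22, 26, 30, 28, 24, 20, 16, 14, 13]
start, cost = cheapest_window(prices, hours_needed=3)
```

With real-time visibility at the edge, this kind of decision can be made locally and continuously, rather than batched through a central system.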

PREDICTIVE MAINTENANCE

Manufacturers want to be able to analyze and detect changes in their production lines before a failure occurs.

Edge computing helps by bringing the processing and storage of data closer to the equipment. This enables IoT sensors to monitor machine health with low latencies and perform analytics in real-time.
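A minimal sketch of what such monitoring might look like on the edge node itself, assuming a simple statistical drift check; the thresholds and vibration readings are illustrative, not from a real deployment.

```python
# Hypothetical sketch: flag a machine whose recent vibration readings have
# drifted away from its healthy baseline, before an outright failure.
from statistics import mean, stdev

def detect_drift(baseline, recent, z_threshold=3.0):
    """True if recent readings sit more than z_threshold standard
    deviations above the healthy baseline mean."""
    mu, sigma = mean(baseline), stdev(baseline)
    z = (mean(recent) - mu) / sigma
    return z > z_threshold

healthy = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1]   # baseline vibration
drifting = [1.6, 1.7, 1.65]                              # vibration creeping up
machine_flagged = detect_drift(healthy, drifting)
```

Running this check locally means the alert fires in milliseconds, without waiting for a round trip to a central analytics cluster.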

IN-HOSPITAL PATIENT MONITORING

Healthcare contains several edge opportunities. Currently, monitoring devices (e.g. glucose monitors and other health sensors) are either not connected or, where they are, large amounts of unprocessed data have to be stored in a third-party cloud. This presents security concerns for healthcare providers.

An edge deployment on the hospital site could process data locally to maintain data privacy. Edge also enables real-time notifications to practitioners of unusual patient trends or behaviors (through analytics/AI), and the creation of 360-degree patient dashboards for full visibility.

VIRTUALIZED RADIO NETWORKS AND 5G (vRAN)

Operators are increasingly looking to virtualize parts of their mobile networks (vRAN). This has both cost and flexibility benefits. The new virtualized RAN hardware needs to do complex processing with low latency. Operators will therefore need edge servers to support virtualizing their RAN close to the cell tower.

CLOUD GAMING

Cloud gaming, a new kind of gaming that streams a live feed of the game directly to devices (the game itself is processed and hosted in data centers), is highly dependent on latency.

Cloud gaming companies are looking to build edge servers as close to gamers as possible to reduce latency and provide a fully responsive and immersive gaming experience.




CONTENT DELIVERY

By caching content – e.g. music, video streams, web pages – at the edge, content delivery can be greatly improved and latency significantly reduced. Content providers are looking to distribute CDNs even more widely to the edge, thus guaranteeing flexibility and customization on the network depending on user traffic demands.
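The caching an edge CDN node does can be sketched with a small LRU (least-recently-used) cache, so repeat requests for popular content never leave the local network. The `EdgeCache` class and the content keys are invented for illustration.

```python
# Hypothetical sketch: a tiny LRU cache of the kind an edge node might keep
# for high-demand content.
from collections import OrderedDict

class EdgeCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()

    def get(self, key):
        if key not in self.items:
            return None                      # cache miss: fetch from origin
        self.items.move_to_end(key)          # mark as recently used
        return self.items[key]

    def put(self, key, value):
        self.items[key] = value
        self.items.move_to_end(key)
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)   # evict least recently used

cache = EdgeCache(capacity=2)
cache.put("video/intro.mp4", b"...bytes...")
cache.put("page/home.html", b"<html>...")
cache.get("video/intro.mp4")                 # touch: now most recently used
cache.put("audio/theme.mp3", b"...")         # evicts page/home.html
```

Real CDN nodes add TTLs, origin revalidation, and byte-range handling, but the core idea is the same: popularity-driven local storage at the edge.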

TRAFFIC MANAGEMENT

Edge computing can enable more effective city traffic management. Examples of this include optimizing bus frequency given fluctuations in demand, managing the opening and closing of extra lanes, and, in the future, managing autonomous car flows.

With edge computing, there is no need to transport large volumes of traffic data to the centralized cloud, thus reducing the cost of bandwidth and latency.

SMART HOMES

Smart homes rely on IoT devices to collect and process data from around the house. Often this data is sent to a centralized remote server, where it is processed and stored. However, this existing architecture has problems around backhaul cost, latency, and security.

By using edge compute and bringing the processing and storage closer to the smart home, backhaul and round-trip times are reduced, and sensitive information can be processed at the edge. For example, voice-based assistant devices such as Amazon’s Alexa could respond much faster.


HOW DOES EDGE COMPUTING WORK?

Edge computing works by capturing and processing the information as close to the source of the data or desired event as possible. It relies on sensors, computing devices, and machinery to collect data and feed it to edge servers or the cloud. Depending on the desired task and outcome, this data might feed analytics and machine learning systems, deliver automation capabilities or offer visibility into the current state of a device, system, or product.

Today, most data calculations take place in the cloud or at a data center. However, as organizations migrate to an edge model with IoT devices, there’s a need to deploy edge servers, gateway devices, and other gear that reduce the time and distance required for computing tasks—and connect the entire infrastructure. Part of this infrastructure may include smaller edge data centers located in secondary cities or even rural areas or cloud containers that can easily be moved across clouds and systems, as needed.

Yet edge data centers aren’t the only way to process data. In some cases, IoT devices might process data onboard, or send the data to a smartphone, an edge server, or a storage device to handle calculations. In fact, a variety of technologies can make up an edge network. These include mobile edge computing that works over wireless channels; fog computing that incorporates infrastructure that uses clouds and other storage to place data in the most desirable location; and so-called cloudlets that serve as ultra-small data centers.

An edge framework introduces the flexibility, agility, and scalability required for a growing array of business use cases. For example, a sensor might provide real-time updates about the temperature at which a vaccine is stored and whether it has been kept within the required range throughout transport.

Sensors and edge IoT devices can track traffic patterns and provide real-time insights into congestion and routing. And motion sensors can incorporate AI algorithms that detect when an earthquake has occurred to provide an early warning that allows businesses and homes to shut off gas supplies and other systems that could result in a fire or explosion.

ADVANTAGES OF EDGE COMPUTING

As organizations wade deeper into the digital realm, edge computing and edge technologies eventually become a necessity. There’s simply no way to tie together vast networks of IoT edge devices without a nimbler and more flexible framework for computing, data management, and running applications outside a data center. Edge computing boosts device performance and data agility. It also can reduce the need for more expensive cloud resources, and thus save money.

Also, because edge computing networks are highly distributed and essentially run as smaller interconnected networks, it’s possible to use hardware and software in a highly targeted and specialized way.

This makes it possible, for example, to use different programming languages with different attributes and runtimes to achieve specific performance results. The downside is that heterogeneous edge computing frameworks introduce greater potential complexity and security concerns.

SOME KEY BENEFITS:

ENHANCED SPEED

From a performance standpoint, edge computing can deliver much faster response times. That’s because locating key processing functions closer to end-users significantly reduces latency. In traditional networking, data is typically collected on the edge and transmitted back to centralized servers for processing. If a response is needed, these servers then send instructions back to devices on the edge. But with edge computing frameworks, this processing is handled much closer to the source of the data. Devices can respond much faster since they spend less time waiting for data packets to traverse the distance from the edge to the core and then back again.

BANDWIDTH RELIEF

By keeping more data on the network edge, the overall volume of traffic flowing to and from central servers is reduced. That frees up much-needed bandwidth throughout the system, eliminating troublesome bottlenecks and unnecessary processing tasks. For the growing number of organizations managing data-intensive digital media services, the ability to cache high-demand content in regional edge servers puts far less strain on the broader network. End users get the benefit of faster performance since their local network isn’t competing with other regions for limited bandwidth resources.

IMPROVED DATA MANAGEMENT

Data gathered on the network edge is valuable because it contains insights into user behavior. Unfortunately, much of that information is also useless “noise,” which is why powerful analytics tools are needed to process that unstructured data and identify meaningful trends. Networks typically transmit all information gathered on the edge back to centralized servers capable of sifting through massive troves of big data. A well-designed edge computing network, however, can use a combination of local devices and edge data center resources to better manage that data. Rather than transmitting all of that data back to the core, edge networks can process some of it locally and only pass on certain types of information. This frees up valuable processing resources throughout the network and greatly improves the quality of data insights generated by big data applications.
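The "pass on only certain types of information" step can be sketched as a triage function on the edge node. The record kinds and forwarding rules below are invented for the example; a real deployment would drive them from configuration.

```python
# Hypothetical sketch: an edge node separating signal from "noise" before
# anything crosses the network -- only records matching the forwarding rules
# go upstream; the rest are counted and discarded locally.

FORWARD_KINDS = {"error", "threshold_breach"}   # illustrative rule set

def triage(records):
    forwarded = [r for r in records if r["kind"] in FORWARD_KINDS]
    dropped = len(records) - len(forwarded)
    return forwarded, {"dropped_locally": dropped}

records = [
    {"kind": "heartbeat", "value": 1},
    {"kind": "error", "value": "sensor 7 offline"},
    {"kind": "heartbeat", "value": 1},
    {"kind": "threshold_breach", "value": 88.2},
]
upstream, local_summary = triage(records)
```

Routine heartbeats never leave the edge; the central big-data pipeline only ever sees records worth analyzing.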

BETTER SECURITY

Although edge computing expands the overall network surface area and increases the number of endpoints, this doesn’t necessarily mean there are more vulnerabilities to exploit. While it’s obviously important that IoT edge devices are properly secured, the distributed nature of edge networks makes them much more difficult to compromise. If a breach occurs in one area, the compromised portions of the network can be cordoned off without having to shut everything else down. Organizations can also leverage the additional processing resources of the edge network to improve their threat analysis data, which allows them to identify and respond to potential cybersecurity threats much more quickly.

IMPROVED RELIABILITY

Since edge computing architecture distributes processing tasks throughout the network, it tends to be more resilient than more centralized systems. In a traditional network, everything goes down when the main servers experience downtime because all services and applications rely on them for instructions and processing. Edge computing frameworks, on the other hand, are far less consolidated. Even if the core servers are forced to go offline briefly, many essential services can still be delivered on the edge thanks to a combination of local processing and regional edge data centers. This is incredibly important for use cases involving healthcare and autonomous vehicles, where even a few seconds of downtime could quite literally cost lives.


HOW TO BUY AND DEPLOY EDGE COMPUTING SYSTEMS?

IMPLEMENTING STRATEGY:

DECIDE HOW MUCH INTELLIGENCE THERE WILL BE IN YOUR IoT DEVICES

The more intelligence per device, the less intelligence is needed in the edge servers themselves. This is due to data already being filtered at the source: the IoT device. Intelligent, standardized IoT devices will provide lower volumes of data in more easily managed formats. However, intelligent IoT devices have an associated higher cost; it's important to find a happy medium.

Bear in mind that adding a few dollars to each IoT device to gain small amounts of additional intelligence can lead to an aggregate cost of thousands of dollars, which might be better spent on intelligent edge computing systems.
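A quick back-of-the-envelope makes the tradeoff concrete. All figures below are illustrative, not benchmarks: the point is that a few dollars per device multiplies across a fleet.

```python
# Hypothetical cost comparison: extra per-device intelligence versus
# redirecting that budget into smarter edge servers.

def fleet_cost(devices, extra_per_device, edge_server_budget):
    """Total spend for a given split between device and server intelligence."""
    return devices * extra_per_device + edge_server_budget

# Option A: $4 of extra smarts in each of 5,000 devices, basic edge servers.
option_a = fleet_cost(5000, extra_per_device=4, edge_server_budget=10_000)
# Option B: dumb devices, with the savings redirected into edge servers.
option_b = fleet_cost(5000, extra_per_device=0, edge_server_budget=25_000)
```

Here option B is cheaper in aggregate, even though its edge servers cost far more individually -- the "happy medium" the text describes is found by running exactly this kind of calculation against real prices.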

DECIDE HOW YOU ARE GOING TO GROUP YOUR DEVICES

This might be decided to some extent by how you deal with the first step -- a disparate grouping of intelligent IoT devices might be easier to manage than a collection of relatively dumb devices, as less filtering, analysis, and reporting will have to be done on the data streams.

Don't group all similar IoT devices on a network through an edge device. This will still result in massive data transfers across the network. The idea is to group devices creating data into manageable areas and capture the data as close to the group as possible to minimize cross-platform data traffic.

Therefore, it might be preferable to group devices by proximity, rather than capability. Proximity grouping also lowers latency and enables a far faster response to identified events. Again, this means that the edge server must be more intelligent, as it will deal with different data streams reporting on different events.

DEFINE CAREFULLY WHAT THE PREFERABLE OUTCOMES ARE

It's tempting to try to use edge systems to completely carve out areas of a platform and to use these servers to fully manage the IoT devices, but you shouldn't. A monitored high reading in one IoT device through one edge server might be meaningless on its own, for example, yet when compared to similar devices being monitored through all edge servers, it might be incredibly important. As such, it's necessary to define what a true exception is and how such exceptions must be handled.

Maintain a good rules engine, and update it as required to reflect how priorities and needs are changing. And make sure that a full and real-time inventory of the environment is in place to ensure that the edge servers are operating against the real environment -- not something that was in place some time ago.
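A rules engine of the kind described can be sketched as a list of condition/action pairs that is updated at runtime. The `RulesEngine` class, rule names, and readings are all hypothetical, invented for illustration.

```python
# Hypothetical sketch: a minimal, updatable rules engine for an edge server.

class RulesEngine:
    def __init__(self):
        self.rules = []

    def add_rule(self, name, condition, action):
        """Rules can be added (or a fresh list swapped in) at any time,
        reflecting changing priorities and needs."""
        self.rules.append((name, condition, action))

    def evaluate(self, reading):
        """Run the action of every rule whose condition matches;
        return the names of the rules that fired."""
        fired = []
        for name, condition, action in self.rules:
            if condition(reading):
                action(reading)
                fired.append(name)
        return fired

engine = RulesEngine()
alerts = []
engine.add_rule("high_temp",
                condition=lambda r: r["temp_c"] > 70,
                action=lambda r: alerts.append(("high_temp", r["sensor"])))

fired = engine.evaluate({"sensor": "rack-12", "temp_c": 74.5})
```

Because the rules live in data rather than code, they can be revised against the current inventory of the environment without redeploying the edge server itself.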

USE A HUB-AND-SPOKE APPROACH

To manage the flow of data that is required, you must have an edge infrastructure that consists of different edge servers placed across a network with a hierarchical manner of dealing with the data between them. The optimum way to deal with such a complex system is to have the lowest-cost, least-intelligent edge servers -- a relative use of terminology, these systems might be quite intelligent and costly in themselves -- as close to the IoT devices as possible.

Where these edge servers identify events that might be of further interest, they must be able to send the relevant data to a more intelligent, more central server that is managing a group of or all the edge servers. This central system can then apply more intelligence to data analysis and better decide what actions are required. It's also necessary for the edge servers to work bilaterally: The outer edge servers must be able to identify events and send data to the center, while the center also must be able to demand data in real-time from the outer edge servers to bolster the data it's dealing with.

An example here could be in a data center where one edge server picks up a high-temperature reading. As far as it's concerned, it's a local event, but it sends that event through to the central server. That server requests that all other outer edge servers send through readings from all appropriate temperature monitors. If they are all within limits, then yes, it's a local problem, probably due to a single item overheating. However, if other reports come through of even minor temperature increases, it might mean that the cooling system for the whole data center has failed -- requiring a much different set of events to correct.

Where proximity grouping lowers direct data latency for all IoT devices under the control of the one edge server, a hub-and-spoke model lowers the time needed for analyzing data as the central server only needs to deal with known, high-priority data.
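The data center temperature scenario above can be sketched as a small classification function run by the central server once it has polled the other edge servers. The function name, thresholds, and readings are illustrative assumptions.

```python
# Hypothetical sketch of hub-and-spoke escalation: an outer edge server
# reports a local high-temperature event, and the central server polls its
# peers to decide whether the problem is local or facility-wide.

def classify_event(reporting_reading, peer_readings, limit=35.0):
    """Local overheating if peers are within limits; cooling failure if not."""
    if reporting_reading <= limit:
        return "normal"
    if all(r <= limit for r in peer_readings):
        return "local_overheat"      # probably a single item overheating
    return "cooling_failure"         # whole-datacenter cooling problem

# One hot rack while everyone else is within limits -> a local problem.
verdict_a = classify_event(41.0, peer_readings=[28.0, 30.5, 29.2])
# Peers also warming up -> escalate as a cooling-system failure.
verdict_b = classify_event(41.0, peer_readings=[36.5, 37.1, 30.0])
```

The bilateral flow matters here: the outer server pushes the initial event upward, and the center pulls fresh readings from the other spokes on demand before choosing a response.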

EMPLOY ADVANCED DATA ANALYTICS AND REPORTING

Although automation has come a long way, it's still not 100% accurate. As such, there will be cases -- and, given the current immaturity of the area, possibly quite a lot of them -- where edge servers must alert a human to an action to be taken. False positives and negatives must be avoided, and possible routes to remediation must be shown to any human who gets involved. So don't scrimp on the analytics tools used, and make sure that reporting is carried out in a clear and meaningful manner.

CONSIDERATIONS FOR AN EDGE COMPUTING IMPLEMENTATION

Edge computing is likely to become as necessary to an organization's overall platform as the cloud is. It's not going to be a case of "either/or" but rather "where does edge fit best within the overall architecture?" For many organizations, it will be in managing the growth of IoT, whether this is for production lines, control systems, intelligent building management, or whatever else. For others, it might be in more highly specialized use cases, such as dealing with highly "chatty" data systems that sit badly on standard cloud platforms. However, edge computing should be an area that organizations are looking into now. Leaving it for a future time could result in major changes being required to existing cloud architectures and more time and money spent on trying to overlay and integrate more physical systems into a highly virtualized environment.

Edge computing is still in its early days, and it is easy to get things wrong by not applying the right amount of forethought upfront. These points should help in ensuring that the right edge deployment strategy is put in place.


EDGE COMPUTING DEPLOYMENT OPTIONS

So how should CSPs go about addressing the edge computing opportunity? Based on their enterprise strategies, the use cases addressed, and the underlying business case, CSPs can take different roles in the value chain, or use different deployment options. The roles are categorized by whether the CSP wants to build edge infrastructure and whether it wants to front the enterprise. Fronting the enterprise means that the CSP holds not just the connectivity relationship, but also the relationship that influences the enterprise's choice of edge deployment option. A CSP can take a single role, or a combination of the roles described below as part of its strategy -- for example, toward different industry verticals depending on how strong its relations are. Ericsson has identified four main roles for CSPs:

Which role, or roles, suit you and your enterprise strategy the best? To analyze that and select a suitable deployment strategy, one can go through a set of questions addressing areas like competence in enterprise verticals, capacity to provide complex solutions, and go-to-market capabilities.


CONCLUSION

Edge computing gained notice with the rise of IoT and the sudden glut of data such devices produce. But with IoT technologies still in relative infancy, the evolution of IoT devices will also have an impact on the future development of edge computing. One example of such future alternatives is the development of micro modular data centers (MMDCs). The MMDC is basically a data center in a box, putting a complete data center within a small mobile system that can be deployed closer to data -- such as across a city or a region -- to get computing much closer to data without putting the edge at the data proper.



I am a writer, editor, and content strategist with a passion for technology, hacks, tips, and tricks, and a deep understanding of how to create engaging content. I write about the latest in tech news and trends.
