A form of distributed computing whereby data storage and processing shift closer to the devices that generate and use the data, rather than relying on a central location.
Edge computing is often seen in IoT deployments, as well as in applications that depend on real-time data, where decisions must be made close to where the data is created.
You can save money and boost performance for certain applications, such as autonomous systems, where you need near real-time decisions.
Edge computing may increase the complexity of your overall architecture.
Edge computing is often used to support Internet of Things deployments.
What is it?
A form of distributed computing whereby data storage and processing shift closer to the devices that generate and use the data, rather than relying on a central location.
Edge computing is particularly suited to applications where latency has serious performance implications, for example in Industrial IoT, medical imaging, or VR/AR. By shifting computation closer to the edges of the network, you cut network round trips and improve responsiveness.
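A minimal sketch of the idea, in Python: the latency-critical decision runs on the device itself, so no network round trip sits in the critical path. The function name and threshold below are hypothetical illustrations, not part of any real framework.

```python
# Edge-local decision making: the urgent check happens on-device;
# only non-urgent data is deferred to the central/cloud service.
# LOCAL_LIMIT and handle_reading are illustrative, made-up names.

LOCAL_LIMIT = 90.0  # e.g. a temperature threshold for immediate shutdown


def handle_reading(value: float) -> str:
    """Decide locally whether a reading needs an immediate response."""
    if value > LOCAL_LIMIT:
        # Decision made at the edge: no round trip to a data centre.
        return "shutdown"
    # Non-critical readings can be batched and sent upstream later.
    return "queue_for_cloud"


print(handle_reading(95.0))  # shutdown
print(handle_reading(50.0))  # queue_for_cloud
```

The point is architectural rather than algorithmic: anything on the time-critical path stays local, while everything else can tolerate the latency of a central service.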
What’s in it for you?
If you have applications that are highly sensitive to data latency, edge computing can provide a significant performance boost.
Businesses can save money by performing processing locally, minimizing the amount of data that needs to be stored centrally or in the cloud. Because less energy is used to transport the data, edge computing is potentially more sustainable too.
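One common way this saving shows up is edge-side aggregation: raw readings are reduced to a compact summary on the device, and only the summary crosses the network. The sketch below assumes hypothetical names (`summarise`, a 600-reading window); it illustrates the pattern, not any particular product.

```python
# Edge-side aggregation sketch: reduce a window of raw sensor
# readings to a few summary statistics locally, and transmit only
# the summary upstream instead of every raw value.
import statistics


def summarise(readings: list) -> dict:
    """Collapse a window of raw readings into a compact summary."""
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
    }


# One minute of readings sampled at 10 Hz: 600 raw values on-device.
raw_window = [20.0 + (i % 7) * 0.1 for i in range(600)]
summary = summarise(raw_window)

# Only this 3-field summary leaves the device, not 600 readings.
print(summary["count"])  # 600
```

Trading 600 values for three fields per window is where both the bandwidth and the central-storage savings come from.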
What are the trade-offs?
Edge computing introduces more diverse and complex deployment scenarios for your organization. Think about management, monitoring, and testing challenges associated with complex and remote architectures.
You’ll need to consider how you deal with data privacy on the edges of your network — this is particularly true if you’re using resource-constrained IoT sensors that may limit your ability to encrypt data.
How is it being used?
Edge computing is often found in Internet of Things (IoT) deployments, where the bandwidth costs of transporting large amounts of data long distances could be prohibitive.
Increasingly, we’re seeing edge computing used for applications that exploit real-time data, such as video processing and analytics, autonomous vehicles, and robotics. This is supported by faster networking technologies, such as 5G wireless.