Considering Edge Computing? Here are 5 questions you should ask. (IT Toolbox Blogs)

IoT (Internet of Things) cloud-based platform solutions can be excessively slow, expensive and inefficient. Due to bandwidth restrictions, by the time data from hundreds or thousands of IoT devices reaches the cloud for processing, it is often too late to act on it.

Cloud vs Edge
Cloud computing is not always efficient for processing data produced at the edge of the network, as is common in IoT applications. Response times of a fraction of a second are crucial for many IoT applications, particularly those involving driverless cars, security cameras and critical patient monitors, and these applications struggle to push the massive volumes of data they collect through the bandwidth bottleneck to the cloud for processing. To bypass the cloud's bandwidth limitations, many organizations are turning to edge computing.

Edge computing is a distributed information technology architecture in which device data is processed at or near its source. Depending on the implementation, data for real-time decision-making may be processed (analyzed) at the point of origin by the device collecting the data, or sent to an intermediary server in close geographical proximity to the device. In either case, response times are no longer subject to network outages or limited connectivity to the cloud. Only data not intended for local processing is sent to the cloud, for later historical analysis, big data analytics or long-term storage.
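The split described above can be sketched in a few lines. This is a hypothetical illustration, not an implementation from the article: the `EdgeGateway` class, its thresholds and batch size are all invented for the example. The point is the pattern: each reading is acted on immediately at the edge, while only compact summaries are queued for the cloud.

```python
from statistics import mean

class EdgeGateway:
    """Hypothetical edge node: decides locally, ships only summaries upstream."""

    def __init__(self, alarm_threshold, batch_size=5):
        self.alarm_threshold = alarm_threshold  # act locally above this value
        self.batch_size = batch_size            # readings per cloud summary
        self._buffer = []
        self.cloud_queue = []                   # stands in for an upload channel
        self.local_alarms = 0

    def ingest(self, reading):
        # Real-time decision made at the point of origin -- no cloud round trip.
        if reading > self.alarm_threshold:
            self.local_alarms += 1
        self._buffer.append(reading)
        # Only aggregated data leaves the edge, conserving uplink bandwidth.
        if len(self._buffer) == self.batch_size:
            self.cloud_queue.append({
                "min": min(self._buffer),
                "max": max(self._buffer),
                "mean": mean(self._buffer),
            })
            self._buffer.clear()

gw = EdgeGateway(alarm_threshold=80.0)
for r in [72.0, 75.5, 81.2, 70.0, 74.3]:
    gw.ingest(r)
```

Here one out-of-range reading triggers a local alarm instantly, while the cloud receives a single three-number summary instead of five raw readings.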

While reducing the amount of data sent over cloud-based networks can improve performance and reduce storage and infrastructure costs, there are still costs to consider when moving data processing to the edge. Boston CIO Jascha Franklin-Hodge admits that "there are some very specific use cases where edge computing is the antidote to not enough bandwidth and not enough connectivity," but adds, "What cloud infrastructure has taught us over the last 10 years is that centralized, high-efficiency computing infrastructure in most use cases is going to outperform distributed, lower-efficiency systems in price, performance, scalability, resiliency and all the other things we value. I think a lot of the use cases of edge are going to fall off as we build more robust networks."

So with Franklin-Hodge's cautionary words in mind, here are five questions that should help clarify whether easing bandwidth constraints with edge computing is right for your applications, both now and in the future:

How much computing is really needed at the edge?
Another way to ask this question is: Which data-intensive tasks would benefit most from being offloaded from the network? Not all applications will qualify, and many will still require data aggregation broader than local computing can provide. Look for cases where it would be more efficient for data to be processed nearer to the user or data source. According to Steven Carlini, these three are the primary candidates for edge computing:

IoT aggregation and control points
Intelligent IoT devices use edge computing to automatically collect massive amounts of data about physical assets (such as machines, equipment, vehicles and facilities) to monitor status or behavior, then analyze that data in real time to provide visibility and control over local processes and resources.

High-bandwidth Content Distribution
Here, edge computing is used to relieve network congestion to improve streaming of high bandwidth content. Content is cached on multiple servers and directed to users based on their proximity to the servers.
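The "directed to users based on their proximity" step can be sketched as a simple nearest-server lookup. Everything here is illustrative: the server names, coordinates and the `nearest_server` function are invented for the example, and straight-line distance on latitude/longitude is a rough stand-in for the real metrics CDNs use (latency probes, anycast, GeoDNS).

```python
import math

# Hypothetical cache sites holding replicated content (lat, lon).
SERVERS = {
    "us-east": (40.7, -74.0),
    "us-west": (37.8, -122.4),
    "eu-west": (51.5, -0.1),
}

def nearest_server(user_lat, user_lon):
    """Direct a request to the geographically closest cache server."""
    def dist(coords):
        lat, lon = coords
        # Euclidean distance on coordinates -- a crude proximity proxy.
        return math.hypot(lat - user_lat, lon - user_lon)
    return min(SERVERS, key=lambda name: dist(SERVERS[name]))
```

A request from a user near Paris, for instance, would resolve to the `eu-west` cache rather than crossing the Atlantic, which is exactly the congestion relief the paragraph describes.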

On-premise applications
Sometimes described as an on-premise cloud, these are business-critical applications duplicated on-site, so that any network disruption affects only the edge computing devices and the local applications running on them.

Is the necessary bandwidth even available locally?
Edge computing brings bandwidth-intensive content and latency-sensitive applications closer to the user or data source. Whether there will be sufficient bandwidth depends on how much data the application or device generates. One option is to install high-performance servers at remote locations to replicate cloud services locally, reducing the demand on the wide-area link.
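A quick back-of-the-envelope calculation helps answer this question. The figures below are assumed for illustration, not taken from the article: estimate the aggregate uplink demand of a site's devices, then compare it against the bandwidth actually available locally.

```python
def uplink_demand_mbps(devices, readings_per_sec, bytes_per_reading):
    """Aggregate uplink demand if every raw reading were sent off-site."""
    bits_per_sec = devices * readings_per_sec * bytes_per_reading * 8
    return bits_per_sec / 1_000_000  # convert bits/s to Mbps

# Assumed example site: 1,000 sensors, 10 readings/s, 200 bytes per reading.
demand = uplink_demand_mbps(1000, 10, 200)  # 16.0 Mbps of raw data
```

If that demand exceeds the local link, it is a signal that raw data should be filtered or aggregated at the edge before anything is sent upstream.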

How much storage is appropriate at the edge?
With edge computing, large amounts of data that would have been stored in the cloud will now be stored locally. While storage hardware is cheap, the cost of managing it is not. Will the cost of maintaining and managing device data locally justify moving it to the edge?

How will devices at the edge be secured?
Compared to uploading raw data directly to the cloud, processing data at the edge could better protect user privacy. But, the distributed architecture of edge computing makes intelligent edge devices potentially more vulnerable to malware infections and security breaches.

Will you be able to update edge devices in a consistent manner?
Because they are no longer part of the cloud infrastructure, edge devices will require on-site management. Updates must be deployed consistently across all devices to maintain adequate security and performance.


Disclaimer: Blog contents express the viewpoints of their independent authors and are not reviewed for correctness or accuracy by Toolbox for IT. Any opinions, comments, solutions or other commentary expressed by blog authors are not endorsed or recommended by Toolbox for IT or any vendor.

Source: SANS ISC SecNewsFeed @ January 31, 2017 at 02:12PM