Smart Devices and Services Are Driving Edge Computing. Is Your Organization Ready?
- October 18, 2018
The fourth wave of computing is being driven by a combination of forces: organizational demands to increase revenue and profitability and to serve customers better, as well as robust use cases that can be solved by the latest technology advances. The first wave of computing started with centralized mainframes, the second wave swung to distributed client-server PCs, and the third wave shifted back toward centralized cloud computing. Now the fourth wave, also called “distributed core” or “fog computing”, distributes computing back to the sources of data and the points where services are consumed.
Operational efficiency leads the first phase of fourth-wave deployments. These include Internet of Things (IoT) sensors installed in high-value, mission-critical products deployed in the field. Adding sensors to field devices is a low-cost entry point that cuts across many industry verticals. Early adopters such as manufacturing and facilities management benefit economically from monitoring and acting on critical or expensive end-point data, generated by everything from manufacturing platforms, IT equipment, electrical grids, oil and gas infrastructure, and shipping containers to transportation fleet equipment. Another low-risk, high-return application is building automation: HVAC, lighting, and other facilities equipment.
Extending monitoring to predictive maintenance is the logical next step. This requires building machine learning infrastructure and applications in the back-end data center to predict upcoming failure conditions in field equipment. Predictive maintenance is a natural upsell: a value-added service that creates an additional revenue stream for the provider and improves customer satisfaction. The Dell Technologies Support and Deployment Services (SDS) group is applying this through SupportAssist. Over the last several years, the SDS team has collected more than 260 attributes from 50 million hard drives. Using this data, they have trained neural network models to make highly accurate predictions of drive failures. Those models are then pushed to the edge to monitor real-time attributes of active in-service drives and to generate alerts, with associated metadata, for imminent drive failures weeks in advance. This allows customers to take corrective action, such as backing up data or replacing drives dispatched by the services team.
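The train-in-the-core, infer-at-the-edge pattern can be sketched in a few lines. This is purely illustrative: the attribute name, the toy “training” rule, and the alert fields below are hypothetical stand-ins for SupportAssist’s actual neural network pipeline.

```python
# Hypothetical sketch of the core-train / edge-infer pattern.
# Attribute names, the threshold rule, and alert fields are illustrative,
# NOT the actual SupportAssist model.

def train_threshold(healthy, failed, attr):
    """'Train' in the core data center: pick an alert threshold for one
    SMART-style attribute as the midpoint between the average healthy
    and average failed value (a stand-in for a real ML pipeline)."""
    avg_h = sum(d[attr] for d in healthy) / len(healthy)
    avg_f = sum(d[attr] for d in failed) / len(failed)
    return (avg_h + avg_f) / 2

def edge_monitor(model, telemetry):
    """Run at the edge: score live drive telemetry against the pushed
    model and yield alerts with associated metadata."""
    for reading in telemetry:
        if reading[model["attr"]] >= model["threshold"]:
            yield {"drive": reading["id"],
                   "attr": model["attr"],
                   "value": reading[model["attr"]],
                   "action": "schedule backup / dispatch replacement"}

# Core: historical fleet data (toy numbers)
healthy = [{"realloc_sectors": 0}, {"realloc_sectors": 2}]
failed = [{"realloc_sectors": 40}, {"realloc_sectors": 60}]
model = {"attr": "realloc_sectors",
         "threshold": train_threshold(healthy, failed, "realloc_sectors")}

# Edge: live telemetry from in-service drives
telemetry = [{"id": "disk-a", "realloc_sectors": 1},
             {"id": "disk-b", "realloc_sectors": 55}]
alerts = list(edge_monitor(model, telemetry))
print(alerts)  # only disk-b crosses the learned threshold
```

The key design point survives the simplification: the expensive learning step runs once in the core against fleet-wide history, while the cheap scoring step runs continuously at the edge against each drive’s live attributes.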
The next round of IoT deployments centers on security. These include analysis of sound, images, and video from cameras and surveillance equipment deployed in many private and public spaces. Today this analysis happens in batch mode, retrospectively or with a significant time lag. Infrastructure with machine learning intelligence can detect unusual or abnormal situations in these physical spaces. Automating this repetitive and mundane process reduces human labor and decreases the risk of overlooking real anomalies, while freeing people to respond to the abnormal situations themselves.
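To make the automation concrete, here is a minimal sketch of flagging abnormal readings in a sensor stream. A real surveillance deployment would use trained ML models on audio or video; the rolling z-score over sound levels below is only an illustrative stand-in.

```python
# Illustrative anomaly detection on a sensor stream (e.g., ambient sound
# level). The rolling z-score is a stand-in for a trained ML model.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(stream, window=5, z_thresh=3.0):
    """Flag readings that deviate sharply from the recent baseline."""
    history = deque(maxlen=window)
    anomalies = []
    for t, value in enumerate(stream):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > z_thresh:
                anomalies.append((t, value))
        history.append(value)
    return anomalies

# Steady ambient levels with one abnormal spike at index 7
levels = [50, 51, 50, 49, 50, 51, 50, 95, 50, 51]
print(detect_anomalies(levels))  # [(7, 95)]
```

Each reading is compared only against a short local history, so the detector can run on modest edge hardware and raise an alert immediately rather than after a batch pass hours later.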
In the initial phases of these use cases, compute and storage are deployed at the back-end data center (aka the core) or in the cloud. Scaling out these applications increases the volume of round-trip data transfer, and adding a real-time component overloads the end-to-end infrastructure. Real-time security alerts need very low-latency decision making. To satisfy that, compute (along with storage and communication) is pushed closer to the action, driving the need for a new layer of computing. This layer is called edge computing.
There are two broad categories of edge computing: “Internet of Things,” described above, and “Smart Services”. While IoT primarily consists of interacting with inanimate “things” like field equipment, smart services add a layer of rich interactions with humans as well. Examples of smart services at the edge include smart multimedia services, first-responder services during emergencies, multi-player gaming with real-time, richer audio-visual experiences, and immersive experiences like AR/VR. This new class of “born in the edge and for the edge” services demands faster response times, high data bandwidth, and predictable quality of service. These cannot be well served by long round trips between the point of consumption and a centralized data center. The majority of data created at the edge needs to be computed on and acted upon locally.
In smart multimedia services, real-time analysis of audio and video can be extended to object recognition, human sentiment analysis, intent analysis, and more. These can be used to provide improved, personalized services or to take precautionary actions, as the case may be.
In retail, object recognition can be used for inventory management and faster checkouts. In manufacturing, image analysis of final product quality at the end of the assembly line can improve yield and lower cost, at higher speeds and with less human involvement.
More viewpoints at The Hitchhiker’s Guide to Edge Computing – by Moor Insights, posted on Forbes
Dell Technologies and Microsoft have kicked off a joint proof of concept to bring Microsoft Brainwave to the edge. Brainwave improves quality by processing product images in manufacturing use cases. Brainwave executing on PowerEdge servers with custom FPGAs will improve accuracy, speed up the process, and reduce the amount of human intervention, improving both operational efficiency and customer experience. More info is available here: Microsoft-unveils-project-brainwave, view the video, and Open to External Testers
Today, augmented reality (AR) and virtual reality (VR) add immersive experiences to a mostly two-dimensional internet. They provide distinct competitive advantages by changing the experience for your users, customers, or partners. Even today’s content delivery requires caching content at edge cloud locations to keep latency acceptable. Immersive experiences, however, are far more computationally intensive, requiring high data bandwidth and low-latency responses matching human reaction time. It is infeasible to drive this from centralized data centers or distantly located clouds. Practical applications such as consumer shopping, virtual real estate visits, and remote physical examinations will require a combination of powerful edge computing distributed close to consumers and 5G mobile networks. On the consumer side, these need to be supported with affordable, lightweight headsets. To learn more, refer to Enabling Mobile Augmented and Virtual Reality with 5G Networks.
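A back-of-envelope calculation shows why distance alone rules out distant clouds for immersive workloads. The numbers below are assumptions for illustration: signal speed in fiber is roughly two-thirds of the speed of light (about 200 km per millisecond), and comfortable VR is commonly held to need a motion-to-photon budget of around 20 ms.

```python
# Back-of-envelope latency budget. Assumptions (not from the article):
# ~200 km/ms propagation speed in fiber, ~20 ms motion-to-photon target.
FIBER_KM_PER_MS = 200.0  # roughly 2/3 of the speed of light

def round_trip_ms(distance_km):
    """Pure propagation delay there and back, ignoring processing,
    queuing, and radio-access time (all of which only add to it)."""
    return 2 * distance_km / FIBER_KM_PER_MS

print(round_trip_ms(2000))  # distant cloud: 20.0 ms -- the entire budget
print(round_trip_ms(50))    # nearby edge site: 0.5 ms
```

A data center 2,000 km away consumes the entire budget on propagation alone, before any rendering or network processing; an edge site 50 km away leaves essentially the whole budget for useful work.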
Several technology improvements are key drivers for moving compute closer to the edge, since servers and accelerators can now be deployed away from traditional data centers: hardware for training and inferencing that is more power efficient, lower cost, secure, and resilient; fast NVMe storage; storage-class memory; software-defined paradigms; and cheaper sensors and AR/VR headsets. 5G promises theoretical bandwidth measured in gigabits and will reduce latency to sub-millisecond levels – ideal when data needs to be transferred back to the core.
Are the groups inside your organization ready for the fourth wave? Strong collaboration is required between Operational Technology (OT) groups and IT groups. OT teams from lines of business (LoB) lead the charge on use cases and business cases, while IT delivers on its strengths in software, infrastructure, security, and analytics. 451 Research reports that OT budgets will rise by an average of 49% in 2018, higher than IT’s 35% gains, even though IT still commands the bigger budget. The good news is that both groups rate the business value of such projects very similarly.
Organizations leading the change work through longer investigation and deployment cycles, but will gain competitive advantages that let them serve their customers better and improve revenue streams.
The fourth wave will require modernization of IT infrastructures.
Read more at Edge Computing – the Fourth Wave Rises – by Moor Insights
To participate in proofs of concept, contact your Dell EMC sales team.