In 2015, there were 15.41 billion connected Internet of Things (IoT) devices around the world. By 2020, just two years from now, that number is projected to nearly double to 30.73 billion. Manufacturing, healthcare, and insurance are the three industries with the most to gain from IoT, together generating a petabyte of data every single day.
Dealing with the data from these devices has become one of IT's and IoT's biggest challenges, says Intel. In manufacturing, for example, industrial IoT alone will generate a petabyte of data per day by 2020, including a new, highly valuable data type: video.
Video, as in vision technology, is widely considered the eye of IoT. Across transportation, public services, retail, industrial manufacturing, healthcare, and more, tens of millions of connected video devices will generate enormous volumes of data: a single raw 4K UHD frame is 8 MB. With cameras streaming at 30 frames per second (fps) or more, that adds up quickly, even with today's high-efficiency codecs. Cisco predicts that by 2021, video will account for 82 percent of all IP traffic.
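To put those figures in perspective, the quick back-of-envelope calculation below uses only the numbers quoted above (8 MB per raw frame, 30 fps); the per-hour and per-day totals follow directly from them.

```python
# Back-of-envelope data rate for a single uncompressed 4K camera,
# using the figures quoted in the article: 8 MB per raw frame, 30 fps.
FRAME_MB = 8   # raw 4K UHD frame size in megabytes
FPS = 30       # frames per second

mb_per_second = FRAME_MB * FPS
gb_per_hour = mb_per_second * 3600 / 1000
tb_per_day = gb_per_hour * 24 / 1000

print(f"{mb_per_second} MB/s, {gb_per_hour:.0f} GB/h, {tb_per_day:.1f} TB/day")
# 240 MB/s, 864 GB/h, 20.7 TB/day
```

Over 20 TB per day from one camera, before any other sensor data, is why compression and local processing are not optional.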
IoT applications and architectures are on the front lines of dealing with increasing data variety, volume, and velocity. The challenges include latency that affects the availability of data, the security of information, and the cost of managing and moving the data. Intel, for its part, is pushing its Xeon D processors for edge designs.
When data has to be analysed and responded to in real time, any delay is a formula for failure. Even on the fastest networks, massive amounts of data converging on a local network and then a backbone can take many seconds to reach a data centre thousands of miles away, be analysed, and have the response returned. Even with traffic prioritisation, volume and distance to destination can delay critical information. When the analytical response involves human safety or precision machinery, a delay can be the difference between success and disaster, so many architectures keep time-critical analytics close to the source.
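As an illustration of why distance alone matters, the sketch below computes the physical lower bound on round-trip time over long-haul fibre. The 3,200 km distance and the signal speed (roughly two-thirds the speed of light in a vacuum) are illustrative assumptions, not figures from the article.

```python
# Rough lower bound on network round-trip time to a distant data centre.
# Both numbers below are illustrative assumptions.
distance_km = 3200          # assumed camera-to-data-centre distance
fibre_speed_km_s = 200_000  # signal speed in fibre, roughly 2/3 of c

one_way_ms = distance_km / fibre_speed_km_s * 1000
round_trip_ms = 2 * one_way_ms
print(f"propagation alone: {round_trip_ms:.0f} ms round trip")
# propagation alone: 32 ms round trip
```

That is 32 ms before a single packet is queued, routed, or processed; real round trips are considerably longer, which is the core argument for keeping time-critical analytics at the edge.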
Some industries face strict regulatory requirements, and companies that generate highly sensitive Intellectual Property (IP) or operational data must secure and protect it: exposure or theft can mean a breach of personal-information protection rules or an unacceptable business risk. Cloud-based IoT solutions do not make sense for these companies; they need an in-house solution, with designed-in security and protection, located where their business and operations execute, even while handling the massive amounts of data generated locally.
5G mobile deployments will increase mobile network speeds, but when every byte carries a price, transporting large amounts of video across metered networks quickly becomes cost-prohibitive.
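A rough sketch of that transport cost, even for a well-compressed stream rather than raw video: the bitrate and per-gigabyte price below are assumptions chosen for illustration, not figures from the article.

```python
# Illustrative transport cost for one compressed 4K stream over a
# metered link. Bitrate and per-GB price are assumed values.
bitrate_mbps = 15    # assumed bitrate for a compressed 4K stream
price_per_gb = 0.10  # assumed metered-network rate in USD per GB

gb_per_day = bitrate_mbps / 8 * 3600 * 24 / 1000
cost_per_day = gb_per_day * price_per_gb
print(f"{gb_per_day:.0f} GB/day, about ${cost_per_day:.2f}/day per camera")
# 162 GB/day, about $16.20/day per camera
```

Multiply that by a fleet of hundreds or thousands of cameras and the case for filtering and analysing video at the edge, sending only results upstream, makes itself.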
With issues like these, analysts predict that 45 percent of generated data will be processed, stored, and acted on at the edge by the end of next year.
The availability of advanced processing capabilities designed for the edge enables data to be handled locally instead of in the cloud, or before being sent to the cloud. These latest-generation technologies provide high-performance inferencing, analytics, general-purpose compute, and Artificial Intelligence/Deep Learning (AI/DL) at the edge for the unique use cases that IoT solution and application architects face.
Designing an edge compute solution for emerging applications is driven largely by the type, size, and volume of data to be processed, and by how fast it needs to be analysed. For combining inferencing, general compute, data storage and security, and analytics at the edge, Intel highlights its Xeon processors, which offer two to twenty cores to match the performance needs of edge systems, with extensibility to eight processors on a platform.
These CPUs perform well for AI/DL applications and are optimised for emerging use cases involving video analytics, pattern recognition, output prediction, and operational efficiency. Built-in security technologies let developers design in security from the architectural stage down to implementation, using advanced encryption and compression acceleration, platform and boot integrity, and a host of other security capabilities built into the processor silicon.
It also points to the OpenVINO toolkit to fast-track the development of computer vision and deep learning inference in vision applications. Intel claims the broadest range of vision products and software tools to help OEMs, ODMs, ISVs, and system integrators scale vision technology across infrastructure, matching specific needs with the right performance, cost, and power efficiency at every point in an artificial intelligence (AI) architecture.
The toolkit supports everything from algorithm development to platform optimisation using industry-standard APIs, frameworks, and libraries. OpenVINO lets developers take networks created in common frameworks, such as Caffe, TensorFlow, and MXNet, and optimise them for heterogeneous hardware engines.