Thursday, April 27, 2017

Micron sees IoT security through memory (via Microsoft Azure)

By Nick Flaherty

Memory maker Micron has teamed up with Microsoft to add authentication technology to its memory devices to boost the security of the Internet of Things. This is linked to Microsoft's move to provide IoT-as-a-service on the Azure cloud.
The technology uses a hardware 'root of trust' integrated into Micron's flash memory in the IoT device, along with the Microsoft Azure IoT cloud, to establish a strong, trusted link between that IoT device and the cloud.

Micron has also launched Authenta, adding strong cryptographic identity and device health management to its flash memory. Monitoring persistent memory storage is becoming increasingly critical to understanding a device's health. By using Microsoft's support for the Device Identity Composition Engine (DICE), an upcoming standard from the Trusted Computing Group (TCG), the combination of the Azure IoT cloud and Authenta helps ensure that only trusted hardware gains access to the IoT cloud.

The key aspect of the combined solution is that the health and identity of an IoT device are verified in hardware on the device, where critical code is typically stored. This enables more advanced functionality such as hardware-based device attestation and provisioning.
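The DICE idea behind this kind of hardware identity can be sketched in a few lines: the device derives its identity by mixing a hash (measurement) of the first mutable code it boots with a per-device secret that is locked away after early boot. This is a minimal illustrative sketch of the concept, not Micron's or Microsoft's implementation; the function and variable names are invented:

```python
import hashlib
import hmac

def derive_cdi(unique_device_secret: bytes, first_boot_code: bytes) -> bytes:
    """Derive a DICE-style Compound Device Identifier (CDI):
    HMAC the measurement (hash) of the first mutable boot code
    with a per-device secret."""
    measurement = hashlib.sha256(first_boot_code).digest()
    return hmac.new(unique_device_secret, measurement, hashlib.sha256).digest()

# Two devices running the same firmware derive different identities...
uds_a = b"\x01" * 32
uds_b = b"\x02" * 32
firmware = b"bootloader v1.0"
assert derive_cdi(uds_a, firmware) != derive_cdi(uds_b, firmware)

# ...and any change to the boot code changes the derived identity,
# so tampered firmware cannot impersonate the healthy device.
assert derive_cdi(uds_a, firmware) != derive_cdi(uds_a, b"tampered")
```

Because the identity is a function of both the secret and the measured code, the cloud can distinguish healthy, genuine devices from modified ones without the secret ever leaving the chip.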

Authenta protects the lowest layers of IoT device software, starting with the boot process. This enables system developers to harden system-level security without adding hardware components, leading to a more affordable and robust IoT solution. It also means that IoT devices using standard flash memory chips (which is most of them) can now be enhanced with this combined approach to improve their cyber-security.

Microsoft and Micron will offer software development kits (SDKs) that make it easier to provide secure IoT cloud management and connectivity for new platforms and devices, as well as to retrofit legacy systems.

Expect Micron to be in discussions with other cloud providers on SDKs linking Authenta to other services, so that hardware designs are not locked into one cloud provider.
"Microsoft and Micron are collaborating to provide customers with a unified approach to improve IoT security. This capability will speed up adoption of the latest IoT concepts by enabling customers to broaden their IoT connectivity while decreasing the investment of implementation," said Sam George, director of Azure IoT cloud services. "Combining these technologies will enable critical security competencies to be underpinned at a low-level in both hardware and software so that users can quickly begin to add their value to these solutions without many of the resource burdens that have been repressing innovation in the industry."

"A secure Internet of Things requires an always-on trust between billions of end-points and cloud management services. Anchors of this trust must be rooted in hardware and be scalable to even the smallest embedded devices," said Amit Gattani, senior director of Segment Marketing, Embedded Business at Micron. "We are pleased to see Microsoft extending their Azure IoT platform to include such trust services and creating an ecosystem with partners like Micron that provide hardware root of trust building blocks for end-devices. This will significantly ease developments and deployments for our customers across Industrial, Automotive and Consumer IoT markets."

Authenta is initially available in the Serial NOR product family and is sampling now to select customers. Users of Microsoft's DICE technology and Azure IoT services can now contact Micron and Microsoft to begin evaluation and integration of these security and identity solutions. 

Wednesday, April 26, 2017

System-in-package delivers low cost sub-GHz wireless connections

By Nick Flaherty

Microchip has launched a System in Package (SiP) that combines an ultra-low power microcontroller with an 802.15.4 sub-GHz radio to provide multi-year battery life in a 5 x 5 mm package.

Rather than integrating RF and digital functions into a single chip, Microchip has put two chips in a single package for the SAM R30 SiP. It uses the same protocol as Zigbee but in the 915MHz or 868MHz ISM bands, giving longer range and lower power consumption but lower data rates, making it suitable for connected home, smart city and industrial applications in the Internet of Things (IoT).

The SiP is built around the SAM L21 MCU (acquired with Atmel), which is based on the ARM Cortex-M0+ architecture and features ultra-low power sleep modes, waking from serial communication or General-Purpose Input/Output (GPIO) activity while consuming just 500nA.

With the radio chip operating in the 769-935 MHz range, the SAM R30 SiP gives developers the flexibility to implement a point-to-point, star or mesh network. Microchip helps developers get started immediately with the free MiWi point-to-point/star network protocol stack. Mesh networking capabilities will be available later this year. Nodes outfitted with the SiP can be positioned as far as one kilometer apart, with the ability to double the range in a star topology. When used in a mesh network, the SAM R30 delivers reliable wide-area coverage for applications such as street lighting or wind and solar farms.

Developers can begin prototyping immediately with the ATSAMR30-XPRO development board, priced at $65. This USB-interfaced development board is supported by the easy-to-use Atmel Studio 7 Software Development Kit (SDK).

The SAM R30 SiP is available in 33-pin and 48-pin QFN packages, for sampling or purchase in volume production quantities.


Tuesday, April 25, 2017

Microsoft boosts security to offer IoT-as-a-service

By Nick Flaherty

Microsoft is offering its IoT capability on its Azure cloud as software-as-a-service (SaaS) to speed up deployments and has boosted its security provision as a result.

Microsoft IoT Central is a fully managed SaaS offering that enables powerful IoT scenarios without requiring cloud solution expertise. Built on the Azure cloud, it simplifies the development process and makes it easy and fast for customers to get started.

To do this, Azure IoT now supports the Device Identity Composition Engine (DICE) and many different kinds of Hardware Security Modules (HSMs), says Arjmand Samuel, Principal Program Manager at Microsoft. DICE is an upcoming standard at the Trusted Computing Group (TCG) for device identification and attestation, which enables manufacturers to use silicon gates to create device identification based in hardware, making security hardware part of new devices from the ground up. HSMs are the core security technology used to secure device identities and provide advanced functionality such as hardware-based device attestation and zero touch provisioning.

The Azure IoT team is also working with standards organizations and major industry partners to employ the latest security best practices and deploy support for a wide variety of hardware security modules (HSMs). HSMs offer a resistant and resilient hardware root of trust in IoT devices, and Azure integrates HSM support with new platform services such as Hub Device Provisioning and Management, enabling developers to focus more on identifying the specific risks associated with their applications and less on security deployment tactics.

IoT device deployments can be remote, autonomous, and open to threats like spoofing, tampering, and displacement. In this case HSMs offer a major defence layer to raise trust in authentication, integrity, confidentiality, privacy, and more. The minimalist DICE approach is an alternative path to more traditional security standards such as the Trusted Computing Group's Trusted Platform Module (TPM), which is also supported on the Azure IoT platform.
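A rough sketch of why a hardware-held key raises trust in authentication: the cloud challenges the device with a fresh random nonce, and the device answers with a keyed MAC computed inside the "HSM", so the key itself never leaves the hardware and a recorded response cannot be replayed. This toy model uses symmetric HMAC for brevity, whereas real HSM/TPM attestation typically uses asymmetric keys and signed quotes; all class and function names here are invented:

```python
import hashlib
import hmac
import os

class HsmSim:
    """Toy stand-in for an HSM: holds a device key that never leaves it
    and answers signing requests."""
    def __init__(self, device_key: bytes):
        self._key = device_key  # in real hardware this is not readable
    def sign(self, data: bytes) -> bytes:
        return hmac.new(self._key, data, hashlib.sha256).digest()

def cloud_verifies(hsm: HsmSim, registered_key: bytes) -> bool:
    """Cloud sends a fresh nonce; the device proves possession of its
    key without revealing it. Replay is defeated by the random nonce."""
    nonce = os.urandom(32)
    response = hsm.sign(nonce)
    expected = hmac.new(registered_key, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(response, expected)

key = os.urandom(32)
device = HsmSim(key)
assert cloud_verifies(device, key)                       # genuine device passes
assert not cloud_verifies(HsmSim(os.urandom(32)), key)   # impostor fails
```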

The move also includes analytics with Azure Stream Analytics on edge devices, a new feature that extends from the cloud down to the device level.

Azure Stream Analytics on edge devices has the same unified cloud-management for stream analytics running across edge devices and the cloud. This approach enables organizations to use streaming analytics in scenarios where connectivity to the cloud is limited or inconsistent, but the need for quick insight and proactive actions are essential to run the business.
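The kind of job such an edge analytics service runs can be illustrated with a tumbling-window aggregation: raw readings are summarised locally so that only the summaries need to cross a limited or inconsistent uplink. This is a minimal conceptual sketch, not the Stream Analytics API (whose jobs are actually written in a SQL-like query language):

```python
from collections import defaultdict

def tumbling_window_avg(events, window_secs):
    """Group (timestamp, value) readings into fixed non-overlapping
    windows and emit the average per window -- the kind of aggregation
    a streaming job runs on the gateway so only summaries go upstream."""
    buckets = defaultdict(list)
    for ts, value in events:
        buckets[ts // window_secs].append(value)
    return {w * window_secs: sum(v) / len(v) for w, v in sorted(buckets.items())}

readings = [(0, 20.0), (3, 22.0), (7, 30.0), (11, 28.0)]
print(tumbling_window_avg(readings, 5))
# four raw readings collapse to one summary per 5-second window:
# {0: 21.0, 5: 30.0, 10: 28.0}
```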

50 companies join EdgeX open framework for IoT edge computing

By Nick Flaherty

The Linux Foundation has launched an open source project to build a common open framework for Internet of Things (IoT) edge computing and an ecosystem of interoperable components for Industrial IoT.

The EdgeX Foundry aims to simplify and standardise Industrial IoT edge computing, although this is still at the level of the intelligent gateway rather than further down into the edge of the network. 

So far 50 companies, including AMD, Analog Devices, Dell and sensor company RFMicron, as well as the energy harvesting EnOcean Alliance, have signed up, although Intel, ARM and the board and gateway makers are conspicuous by their absence at this point.

The project, however, is naturally dominated by the IoT software services companies, as it aims to develop a range of microservices written in Java, JavaScript, Python, Go or C/C++ (see figure) that can sit on a range of operating systems and hardware (whether x86 or ARM). The choice of operating systems (Windows, Linux of course, and even MacOS) highlights the gateway focus of the project. However, an OS-agnostic project lends itself to porting to real-time operating systems further towards the network edge.
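The microservice idea can be sketched with a toy device service: one protocol-specific adapter normalizes readings into a common event format, which a protocol-agnostic core-data service stores for any northbound consumer. This is an illustrative sketch of the pattern only, not the actual EdgeX APIs; all class, field and device names are invented:

```python
import json
import time

class ModbusDeviceSim:
    """Hypothetical south-side device speaking its own protocol."""
    def read_register(self, addr):
        return 215  # temperature in tenths of a degree, say

class DeviceService:
    """Protocol-specific adapter: the only component that knows the
    device protocol. It normalizes readings into a common event format."""
    def __init__(self, device, name):
        self.device, self.name = device, name
    def poll(self):
        raw = self.device.read_register(0)
        return {"device": self.name, "resource": "temperature",
                "value": raw / 10.0, "origin": int(time.time())}

class CoreDataService:
    """Protocol-agnostic store that any northbound service can query."""
    def __init__(self):
        self.events = []
    def add_event(self, event):
        self.events.append(event)
        return json.dumps(event)

svc = DeviceService(ModbusDeviceSim(), "boiler-sensor-01")
core = CoreDataService()
core.add_event(svc.poll())
print(core.events[0]["value"])   # 21.5
```

Swapping the adapter for one speaking BACnet or OPC UA leaves everything north of the common event format untouched, which is the interoperability the framework is after.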

"EdgeX Foundry is part of our commitment to playing a major role in providing solutions to help customers bridge the physical and digital world through IoT," said Michael Murray, General Manager of Industrial Sensing Products at Analog Devices. "We want to reduce complexity, democratize IoT standards and provide trusted data for customers, and we look forward to working with the EdgeX community to achieve those goals."

The Linux Foundation points to widespread fragmentation and the lack of a common IoT solution framework that are hindering broad adoption and stalling market growth. The complexity of the current landscape and the wide variety of components creates paralysis, and EdgeX is intended to solve this by making it easy to quickly create IoT edge solutions that have the flexibility to adapt to changing business needs.

"Success in the Internet of Things is dependent on having a healthy ecosystem that can deliver interoperability and drive digital transformation," said Jim Zemlin, Executive Director of The Linux Foundation. "EdgeX Foundry is aligning market leaders around a common framework, which will drive IoT adoption and enable businesses to focus on developing innovative use cases that impact the bottom line."

EdgeX Foundry is unifying the marketplace around a common open framework and building an ecosystem of companies offering interoperable plug-and-play components. Designed to run on any hardware or operating system and with any combination of application environments, EdgeX can quickly and easily deliver interoperability between connected devices, applications, and services, across a wide range of use cases. Interoperability between community-developed software will be maintained through a certification program.

Dell is seeding EdgeX Foundry with its FUSE source code base under Apache 2.0. The contribution consists of more than a dozen microservices and over 125,000 lines of code, and was designed following feedback from hundreds of technology providers and end users to facilitate interoperability between existing connectivity standards and commercial value-adds such as edge analytics, security, system management and services.

"One of the key factors holding back IoT designs in the enterprise is that there are too many choices to safely and easily implement a system that will provide a return on investment in a reasonable timeframe," said Mike Krell, Lead IoT Analyst at Moor Insights & Strategy. "EdgeX Foundry will fundamentally change the market dynamic by allowing enterprise IoT applications to choose from a myriad of best-in-class software, hardware and services providers based on their specific needs."

According to a Gartner report, there will be 20.4 billion connected things in use globally by 2020. The sheer quantity of data that will be transmitted from these devices is driving adoption of edge computing, where connected devices and sensors transmit data to a local gateway device instead of sending it back to the cloud or a central data center. Edge computing is ideal for deploying IoT applications because it allows for quicker data analytics and reduced network traffic. This is essential for applications which require localized, real-time data analysis for decision making such as factory optimization, predictive maintenance, remote asset management, building automation, fleet management and logistics.

"Businesses currently have to invest a lot of time and energy into developing their own edge computing solutions, before they can even deploy IoT solutions to address business challenges," said Dr Philip DesAutels, Senior Director of IoT at The Linux Foundation. "EdgeX will foster an ecosystem of interoperable components from a variety of vendors, so that resources can be spent on driving business value instead of combining and integrating IoT components."

Adopting an open source edge software platform allows hardware makers to scale faster with an interoperable partner ecosystem and more robust security and system management. Sensor and device makers can write an application-level device driver once, with a selected protocol, using the SDK, and system integrators can get to market faster with plug-and-play ingredients combined with their own proprietary inventions.

The Linux Foundation will establish a governance and membership structure for EdgeX Foundry and a technical steering committee will provide leadership on the code and guide the technical direction of the project.

The full list of founding members includes: 
  1. Advanced Micro Devices (AMD), 
  2. Alleantia, 
  3. Analog Devices, 
  4. Bayshore Networks, 
  5. Beechwoods Software, 
  6. Canonical, 
  7. ClearBlade, 
  8. CloudPlugs, 
  9. Cloud of Things, 
  10. Cumulocity, 
  11. Davra Networks, 
  12. Dell, 
  13. Device Authority, 
  14. Eigen Innovations, 
  15. EpiSensor, 
  16. FogHorn Systems, 
  17. ForgeRock, 
  18. Great Bay Software, 
  19. IMS Evolve, 
  20. IOTech, 
  21. IoTium, 
  22. KMC Controls, 
  23. Kodaro, 
  24. Linaro, 
  25. MachineShop, 
  26. Mobiliya, 
  27. Mocana, 
  28. Modius, 
  29. NetFoundry, 
  30. Neustar, 
  31. Opto 22, 
  32. relayr, 
  33. RevTwo, 
  34. RFMicron, 
  35. Sight Machine, 
  36. SoloInsight, 
  37. Striim, 
  38. Switch Automation, 
  39. Two Bulls, 
  40. V5 Systems, 
  41. Vantiq, 
  42. VMware
  43. ZingBox. 
Industry affiliate members include: Cloud Foundry Foundation, EnOcean Alliance, Mainflux, Object Management Group, Project Haystack and ULE Alliance.


Monday, April 24, 2017

Bots move into industrial IoT

By Nick Flaherty

US IoT software developer OSIsoft is extending its partnership with Rockwell Automation to integrate its PI System technology into a new bot-based appliance for the Industrial Internet of Things (IIoT).

Rockwell's FactoryTalk Analytics for Devices automatically discovers devices on industrial networks to conduct diagnostics and monitor their health, providing early warnings, diagnosing problems and giving insight to take action, all to improve the uptime of processes and machines using the PI System software. Users of the system can receive “action cards” on their smartphones, tablets or a web browser, or engage with the device through “Shelby,” a natural language voice-activated bot.

The PI System technology embedded in FactoryTalk Analytics for Devices captures and organizes the vast amount of data generated by these networks so it can serve customers immediately from the appliance, or later have the data delivered to Microsoft Azure via FactoryTalk Cloud for further Big Data analytics. Worldwide, the PI System manages over 1.5 billion sensor-based data streams, making it one of the most widely used IIoT technologies.

“Industrial customers need deep, detailed insight into their operations in real-time to stay competitive: that is what drives our FactoryTalk strategy,” said John Genovesi, Vice President of Information Software and Process Business at Rockwell Automation. 

The PI System captures data from sensors, manufacturing equipment and other devices and transforms it into rich, real-time insights that engineers, executives and partners can use to reduce costs, dramatically improve overall productivity, and create new connected services and smart devices.

OSIsoft and Rockwell Automation have collaborated for over a decade on the technology. The PI System powers the FactoryTalk Historian embedded in many Rockwell Automation systems. BHP Billiton, for instance, manages millions of data tags across mines, transportation assets and production facilities to reduce variability and increase quality. PI System technology ships in approximately 1,800 Rockwell Automation systems per year, and by 2020 OSIsoft anticipates that hundreds of thousands of devices from various vendors with PI System technologies will be shipping.

“Right now fewer than 14 percent of companies have completely connected their production data to the rest of their enterprise,” said Martin Otterson, Senior Vice President of Customer Success at OSIsoft. “Our relationship with Rockwell Automation will fuel the development of products and solutions that will let more people take advantage of machine and operational data for more projects in more ways than ever before.”


Power news this week


World's first battery turbine hybrid powers up
Williams leads consortium to build major UK battery factory
Excelsys boss retires

Coating boosts lithium metal battery performance
Researchers use paper to create lightweight energy generator
AMS Technologies opens thermal design centre in Poland

Power controller with wide input range operates up to 175°C
H-bridge drivers for 2.5V motors
Low cost 1W and 2W AC-DC converters target the smart home and office

Coilcraft: Choosing Inductors for Energy Efficient Power Applications
Intersil: Putting Safety into Li-ion Battery Packs
Exar: Solving the Power-Up Challenge for SmartFusion2 SoC FPGAs

By Nick Flaherty

Thursday, April 20, 2017

Xilinx pushes dynamic reconfiguration technology

By Nick Flaherty

Being able to add new hardware to a design in the field just with a software download is one of the huge advantages of using a field programmable gate array (FPGA). Being able to do this for one element of the design without impacting on the rest - partial reconfiguration - is a key capability that has been many years coming.

Now the latest update of the Vivado design tool from leading FPGA maker Xilinx has included Partial Reconfiguration technology. This enables dynamic field updates and increased systems integration in a broad range of applications such as wired & wireless networking, test & measurement, aerospace & defense, automotive, and data centres. 

Designers can now change functionality on the fly, eliminating the need to fully reconfigure the device and re-establish links, dramatically enhancing the flexibility of All Programmable devices. System upgradeability and reliability are greatly enhanced by the ability to update feature sets in deployed systems, fix bugs and migrate to new standards while critical functions remain active.

“The use of Partial Reconfiguration in Xilinx devices allowed us to optimize the size of the FPGA, and provide complete flexibility to maintain system connectivity while independently reconfiguring multiple ports in our design,” said Craig Palmer, senior engineering manager, Viavi Solutions.

The Partial Reconfiguration technology enables dynamic configurability by swapping portions of the design while the rest remains operational, requiring zero downtime and little impact to cost or development time.
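A software analogy may help picture this: think of a reconfigurable partition as a slot whose implementation can be swapped under a lock while the static part of the design keeps calling into it without interruption. This is purely a conceptual illustration in Python, not how FPGA bitstream loading actually works:

```python
import threading

class ReconfigurablePartition:
    """Software analogy of a reconfigurable partition: a slot whose
    implementation can be swapped while the rest of the 'design'
    keeps processing without downtime."""
    def __init__(self, impl):
        self._impl = impl
        self._lock = threading.Lock()
    def swap(self, new_impl):
        # analogous to loading a partial bitstream into one region
        with self._lock:
            self._impl = new_impl
    def process(self, x):
        with self._lock:
            return self._impl(x)

partition = ReconfigurablePartition(lambda x: x + 1)
static_results = []

def static_region():
    # the 'static' part of the design keeps running throughout
    for i in range(100):
        static_results.append(partition.process(i))

t = threading.Thread(target=static_region)
t.start()
partition.swap(lambda x: x * 2)   # reconfigure mid-operation
t.join()
print(len(static_results))        # 100 -- every call was serviced
```

Every call is handled by whichever implementation is loaded at that moment; none are dropped, which mirrors the "zero downtime" claim for the static portion of the design.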

“Partial Reconfiguration of FPGAs is a key element in Keysight’s toolbox for creating the next generation of test and measurement solutions. Partial Reconfiguration enables us to manage the ever increasing need for flexibility and complexity of test systems,” said Tom Vandeplas, senior researcher at test equipment maker Keysight Laboratories.

The Vivado Design Suite HLx Editions 2017.1 release is now available for download. Partial Reconfiguration functionality is now included at no additional cost with the Vivado HL Design Edition and HL System Edition. In-warranty users can regenerate their licenses to gain access to this feature. Partial Reconfiguration is available for Vivado WebPACK Edition at a reduced price. 

Wednesday, April 19, 2017

Researchers find flaws in RISC-V core

By Nick Flaherty

Researchers at Princeton University have found a number of significant flaws in the memory ordering rules of the open source RISC-V processor architecture. The specification is set to be finalised later this year, although some companies such as SiFive are already using it.

The researchers, testing a technique they created for analyzing computer memory use, found over 100 errors involving incorrect orderings in the storage and retrieval of information from memory in variations of the RISC-V processor architecture. The researchers warned that, if uncorrected, the problems could cause errors in software running on RISC-V chips. Officials at the RISC-V Foundation said the errors would not affect most versions of RISC-V but would have caused problems for higher-performance systems.

"Incorrect memory access orderings can result in software performing calculations using the wrong values," said Margaret Martonosi, Professor of Computer Science at Princeton and the leader of the Princeton team that also includes Ph.D. students Caroline Trippel and Yatin Manerkar. "These in turn can lead to hard-to-debug software errors that either cause the software to crash or to be vulnerable to security exploits. With RISC-V processors often envisioned as control processors for real-world physical devices (i.e., internet of things devices) these errors can cause unreliability or security vulnerabilities affecting the overall safety of the systems."

Krste Asanović, the chair of the RISC-V Foundation, welcomed the researchers' contributions. He said the RISC-V Foundation has formed a working group, headed by Martonosi's former graduate student and co-researcher Daniel Lustig, to solve the memory-ordering problems. Asanović, a professor of electrical engineering and computer science at the University of California-Berkeley, said the RISC-V project was looking for input from the design community to "fill the gaps and the holes and getting a spec that everyone can agree on."

"The goal is to ratify the spec in 2017," he said. "The memory model is part of that."

Lustig, a co-author of Martonosi's recent paper and now a research scientist at NVIDIA, said work was underway to improve the RISC-V memory model.

"RISC-V is in the fortunate position of being able to look back on decades' worth of industry and academic experience," he said. "It will be able to learn from all of the insights and mistakes made by previous attempts."

The RISC-V instruction set was first developed at UC-Berkeley, with the idea that any designer could use the instruction set to create processor cores and software compilers. The project is now run by the RISC-V Foundation, whose membership includes a roster of universities, nonprofit organizations and top technology companies, including Google, IBM, Microsoft, NVIDIA and Oracle.

Martonosi's team discovered the problems when testing their new system to check memory operations across any computer architecture. The system, called TriCheck, allows designers, and others interested in working with a design, to detect memory ordering errors before they become a problem. The tool spans three general levels of computing: the high-level programs that create modern applications from web browsers to word processors; the instruction set architecture that functions as the basic language of the machine; and the underlying hardware implementation, a particular microprocessor designed to execute the instruction set.

The memory ordering challenge stems from the complexity of modern computers. As designers squeeze more performance out of computer systems, they rely on many concurrent operations sharing the same sections of computer memory. This parallel, shared-memory operation is extremely efficient, both for speed and power usage, but it puts a heavy demand on the computer's ability to interleave and properly order memory usage. If, for example, several processes are using the same section of memory, the computer needs to make sure that operations are applied to memory in the correct order, which may not always be the order in which they arrive from different concurrently running processors.

Subtle changes in any of the three computing levels — the machine level, the compiler and the high-level programming languages — can have unintended effects on the other layers. All three have to work together seamlessly to make sure memory errors don't crop up. One advantage of TriCheck is that it allows experts in one of these layers to avoid conflicts with the other two layers, even if they do not have expertise in them.

"If I write a program in C, it makes some assumptions about memory ordering," said Martonosi. "Subsequently, a different set of memory ordering rules are defined by the instruction-set architecture. We need to check that the high-level program's assumptions are accurately supported by the underlying instruction set and processor design."

However, the researchers said the TriCheck's greatest strength is its ability to give designers a broad view of memory usage. Although designers have long been interested in this perspective, previous attempts to comprehensively analyze memory operations have been too slow to be practical.

TriCheck is able to check memory ordering efficiently by using succinct formal specifications of memory ordering rules, known as axioms. For a given program, compiler, instruction set and hardware implementation, TriCheck can enumerate many ordering possibilities from these axioms, and then check for errors. By expressing the memory-ordering possibilities as connected graphs, TriCheck can identify potential errors by looking for cycles in the graphs. These checks can be done very efficiently on modern high-performance computers, and TriCheck's speed has allowed it to explore larger and more complex designs than prior work. 
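The cycle check itself is simple graph code. In the sketch below (an illustration of the general idea, not TriCheck's implementation), memory events are nodes, required orderings are edges, and a depth-first search looks for a cycle; the classic "store buffering" litmus test shows how a forbidden outcome produces one:

```python
def has_cycle(edges):
    """Detect a cycle in a happens-before graph by depth-first search.
    Nodes are memory events; an edge (a, b) means 'a must happen
    before b'. A cycle means no valid ordering exists -- the kind of
    contradiction that marks an execution as forbidden."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, []).append(b)
        graph.setdefault(b, [])
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {n: WHITE for n in graph}
    def dfs(n):
        colour[n] = GREY
        for m in graph[n]:
            if colour[m] == GREY or (colour[m] == WHITE and dfs(m)):
                return True   # reached a node on the current path
        colour[n] = BLACK
        return False
    return any(dfs(n) for n in graph if colour[n] == WHITE)

# 'Store buffering' litmus test: each thread writes one flag then reads
# the other. If both reads see 0, the required orderings form a cycle,
# so that outcome is forbidden on a strongly ordered architecture.
forbidden = [("Wx1", "Ry0"),   # thread 0 program order: write x, then read y
             ("Ry0", "Wy1"),   # read saw y == 0, so it preceded the write of y
             ("Wy1", "Rx0"),   # thread 1 program order: write y, then read x
             ("Rx0", "Wx1")]   # read saw x == 0, so it preceded the write of x
print(has_cycle(forbidden))    # True -- this execution is illegal
```

Whether that outcome is actually forbidden depends on the memory model: a weakly ordered model may drop some of these edges, breaking the cycle and permitting the result, which is exactly the kind of ISA-level choice the Princeton work probes.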

"TriCheck is an important step in our overall goal of verifying correct memory orderings comprehensively across complex hardware and software systems," she said. "Given the increased reliance on computer systems everywhere — including finance, automobiles and industrial control systems — moving towards verifiably correct operation is important for their reliability and safety."

The TriCheck project culminates four years of work by Martonosi's group developing checks across various layers of hardware, memory and software.


Friday, April 14, 2017

Actility raises $75m for wide area IoT

By Nick Flaherty

French low power wide area network (LPWAN) technology developer Actility has raised $75m to expand its delivery of the industrial Internet of Things (IoT) using a wide range of technologies.

The Series D funding round included Creadev, Bosch and Inmarsat, alongside telecoms operators KPN, Orange Digital Ventures, Swisscom and equipment maker Foxconn. A second closing later this month will see additional strategic investors joining the company without involving banks.

Actility's ThingPark platform is used for large-scale LPWA rollouts worldwide with the LoRaWAN LPWAN protocol that Actility co-developed, as well as LTE-M and NB-IoT. A software stack with the OS service and business support manager, application integration enabler, and e-commerce platform provides a turn-key IoT platform supporting sensor to cloud applications.

“This funding will enable us to grow our IoT technology and ecosystem platform faster to meet the needs of service providers, solution providers and enterprises in large industry verticals, for example rolling out our disruptive global location and tracking service more quickly,” said Actility CEO Mike Mulica. “It will also allow us to accelerate our strategy for the US, and build strength in China. And last but by no means least, it will enable us to look at strategic acquisitions to broaden our technology portfolio and cement our leadership in LPWA.”

“We have been looking for the best project in the business of connectivity for the IoT for a while," said Florent Thomann, a member of Creadev’s management board, and in charge of new digital models. "In Actility, we found a company that offers an ideal solution and has made the perfect technology choices in LPWA and LTE-M to meet that growing connectivity market. We are convinced by both the company and its management, which shows a real visionary insight into the technology and business models and the way that connectivity will evolve. Furthermore, Actility’s team is proving to be particularly agile at innovation, adapting to new technologies very efficiently. We are pleased to bring our culture of ambition, support and sharing best business practice to help nurture Actility’s long-term growth.”

Having a satellite operator such as Inmarsat is a significant boost. “Inmarsat sees a great deal of potential in Actility, and its expertise in global IoT networks, based on LoRaWAN, makes it a natural fit for our investment," said Paul Gudonis, President of Inmarsat Enterprise. "There are clear synergies between us, namely the ability to deliver innovative connectivity services to customers in remote locations, creating the potential for a global IoT network. To this end, we recently developed our LoRaWAN-based network in partnership with Actility to enable IoT to reach every corner of the globe. We have many more projects planned with Actility and we are excited to support the company’s rapid growth as it continues to make great strides in the IoT arena. This market is rapidly maturing and Actility, with its growing ecosystem of partners, is ideally positioned to take advantage of this for the benefit of businesses across a variety of industries.”


Friday, April 07, 2017

Cognex buys machine learning developer for vision systems

US machine vision giant Cognex has bought a Swiss developer of machine learning software, highlighting the increasing interest in artificial intelligence in embedded systems.

ViDi Systems, based in Villaz-St.-Pierre, Switzerland, was founded in 2012 by computational neuroscientist Dr Reto Wyss and the CPA Group, a Swiss industrial holding company and business incubator. This follows two machine vision acquisitions late last year.

ViDi’s deep learning software uses artificial intelligence techniques to improve image analysis in applications where it is difficult to predict the full range of image variations that might be encountered. Using feedback, ViDi’s software trains the system to distinguish between acceptable variations and defects.
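The feedback loop described here can be illustrated with a deliberately simple sketch: a hypothetical nearest-centroid classifier (not ViDi's actual algorithm, whose details are not public in this story) that learns from labelled operator feedback which feature vectors count as acceptable variation and which count as defects.

```python
import math

class DefectClassifier:
    """Toy binary classifier: learns one centroid per class from
    labelled operator feedback, then assigns new samples to the
    class with the nearest centroid."""

    def __init__(self):
        self.samples = {"acceptable": [], "defect": []}

    def give_feedback(self, features, label):
        # label is "acceptable" or "defect", supplied by a human reviewer
        self.samples[label].append(list(features))

    def _centroid(self, label):
        vecs = self.samples[label]
        return [sum(col) / len(vecs) for col in zip(*vecs)]

    def classify(self, features):
        # Pick the labelled class whose centroid is closest
        return min(
            (label for label in self.samples if self.samples[label]),
            key=lambda label: math.dist(features, self._centroid(label)),
        )

clf = DefectClassifier()
clf.give_feedback([1.0, 1.0], "acceptable")
clf.give_feedback([1.2, 0.9], "acceptable")
clf.give_feedback([5.0, 5.0], "defect")
print(clf.classify([1.1, 1.0]))  # -> "acceptable"
```

The point of the sketch is the workflow, not the model: every round of feedback refines the class boundaries without anyone enumerating the full range of image variations up front.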

EnShape in Jena, Germany, was acquired in October for its patented 3D area-scan technology, which captures images quickly at high resolution and eliminates the need to mechanically move objects in front of the device as laser line scanners require. The acquisition created a new Cognex engineering centre in Jena.

In August Cognex also completed the acquisition of 3D vision software developer AQSense in Girona, Spain. AQSense develops and sells a library of field-tested 3D vision tools, and the company’s software engineers joined Cognex’s 3D engineering team upon the closing of the acquisition.

The story is here: Cognex continues European buying spree with machine learning developer

By Nick Flaherty

Related stories:

Thursday, April 06, 2017

NXP combines Kinetis and LPC development tools

By Nick Flaherty

NXP has combined the development tools for two of the most popular embedded microcontrollers in the industry, giving designers dramatically more flexibility in system implementation.

The MCUXpresso Integrated Development Environment (IDE) unifies development support for thousands of LPC and Kinetis (formerly Freescale) MCUs based on ARM Cortex-M cores using the same software suite.

The MCUXpresso IDE features simple, scalable and user-friendly interfaces and tools, and is built to leverage the capabilities of the MCUXpresso SDK and Config Tools. The feature-rich, Eclipse-based framework completes the trio of MCUXpresso software development solutions and provides access to thousands of new project wizards and clone projects, saving designers valuable time by giving them a head start on customising their own designs.

“If design tools are simple, yet comprehensive, our customers stand a much better chance of designing tomorrow’s next amazing innovation,” said Geoff Lees, senior vice president and general manager of the microcontroller business line at NXP. “This unified software enablement gives developers more choice in high-quality controller solutions to fit their design needs. NXP will continue to stay ahead of the design trends and expand our MCUXpresso software and tools to support a variety of products in the future, ensuring our customers have access to the most comprehensive design tools on the market.”

Available in full-featured free and professional upgrade editions, the MCUXpresso IDE unifies Kinetis and LPC microcontrollers under a set of compatible tools. With a dedicated quickstart panel, automatic probe detection and configuration, and intuitive project creation and cloning wizards, the MCUXpresso IDE is designed to guide developers from project setup and optimisation through application design and even multicore development. The MCUXpresso IDE supports full-featured, advanced debugging with unlimited code size and code profiling in the free offering, adds advanced trace features in the professional edition, and preserves hardware investments by supporting the former Freescale Freedom and Tower System boards, as well as LPCXpresso boards and custom hardware platforms.

This MCUXpresso SDK release adds new device support and includes examples and project files for use in the new MCUXpresso IDE. The MCUXpresso SDK also now includes support for NXP’s NTAG I2C Plus connected NFC tag for home-automation and consumer applications and will soon support the FRDM-KW41Z board designed for portable, extremely low power applications requiring Bluetooth® low energy (BLE) v4.2 and IEEE 802.15.4 RF connectivity. The MCUXpresso Config Tools offers a single powerful configuration environment with pins and clocks tool for dynamic generation of initialisation C code, and quickly guides users to example projects and web-based tools for rapid board bring-up.
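To illustrate what "dynamic generation of initialisation C code" from a pins tool amounts to, here is a hypothetical sketch: a small declarative pin table turned into a C init function. The `PORT_SetPinMux`/`kPORT_MuxAlt` names echo NXP SDK conventions, but the table format and generated output here are assumptions for illustration, not the real Config Tools format.

```python
# Hypothetical pin configuration, as a pins tool might hold it internally.
PIN_CONFIG = [
    {"port": "A", "pin": 1, "mux": 2, "signal": "UART0_TX"},
    {"port": "A", "pin": 2, "mux": 2, "signal": "UART0_RX"},
]

def generate_pin_init(config):
    """Emit a C initialisation function from the declarative pin table."""
    lines = ["void BOARD_InitPins(void) {"]
    for p in config:
        lines.append(
            f"    PORT_SetPinMux(PORT{p['port']}, {p['pin']}U, "
            f"kPORT_MuxAlt{p['mux']}); /* {p['signal']} */"
        )
    lines.append("}")
    return "\n".join(lines)

print(generate_pin_init(PIN_CONFIG))
```

The appeal of this style is that the board description lives in one editable table, and the boilerplate C is regenerated rather than hand-maintained.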

Related stories:

Wednesday, April 05, 2017

Intel McAfee security deal is all about the IoT this time

By Nick Flaherty

Intel buying McAfee in 2011 for $7.7bn was all about the enterprise. Now, Intel is spinning out McAfee into a separate company in a $4.2bn deal that is all about the Internet of Things.
Back in 2011, Intel was aiming to secure the enterprise alongside its PC and server processors in a market it dominated. Now that it needs to secure the IoT, it needs cooperation from companies that license the ARM architecture. Hence the need for an independent venture.
The key change is the McAfee Data Exchange Layer (DXL), the industry-endorsed communication fabric providing real-time interaction between applications. This needs to be taken down the stack to the gateway, where Intel processors are already used, and further down still to the node. That is the challenge. Another Intel company, Wind River, is already taking it up, pushing the VxWorks real-time operating system further into the IoT.
The McAfee Security Innovation Alliance has over 135 partners around the world, and 30 of these are using the DXL connection as an API.
The giveaway is in the new strapline for McAfee: innovation, trust and collaboration. The new company is 49% owned by Intel, with the remainder held by equity house TPG and private equity investment firm Thoma Bravo, but it has to demonstrate that it can work well with the rest of the industry that does not rely on Intel. Intel Senior Vice President and General Manager Chris Young will lead the new McAfee as Chief Executive Officer, and TPG partner Bryan Taylor has been named Chairman of the Board.
“Cybersecurity is the greatest challenge of the connected age, weighing heavily on the minds of parents, executives and world leaders alike,” said Christopher Young, CEO of McAfee. “As a standalone company with a clear purpose, McAfee gains the agility to unite people, technology and organizations against our common adversaries and ensure our technology-driven future is safe.”
“We offer Chris Young and the McAfee team our full support as they establish themselves as one of the largest pure-play cybersecurity companies in the industry,” said Brian Krzanich, CEO of Intel. “Security remains important to Intel, and in addition to our equity position and ongoing collaboration with McAfee, Intel will continue to integrate industry-leading security and privacy capabilities in our products from the cloud to billions of smart, connected computing devices.”
The advantage of DXL is that it is an open standard. Unlike typical point-to-point integrations, each application connects once to the universal DXL communication fabric, so there is just one integration process instead of multiple efforts, which makes it suitable for enterprise-scale IoT deployments.
OpenDXL will support a broad range of languages, enabling developers to create integrations using their favourite development environment. One app publishes a message or calls a service; one or more apps consume the message or respond to the service request.
As is the goal for any standard, the interaction is independent of the underlying proprietary architecture of each integrating technology and integrations are much simpler because of this abstraction from vendor-specific APIs and requirements.
In addition to creating native DXL integrations, developers can also wrap their services to interact or wrap the API of a commercial product to publish data onto DXL. Other services can listen to DXL messages and calls to enrich their functionality with the latest data, or take appropriate action. For a more sophisticated app reflecting orchestration, these sorts of actions can be scripted together to drive a waterfall—or simultaneous set—of actions.
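The one-integration-per-app model can be sketched in miniature. This toy fabric (hypothetical code, not the OpenDXL client API) shows the two interaction styles described above: topic-based publish/subscribe and named service requests, with each app touching only the fabric, never another app's vendor-specific API.

```python
class Fabric:
    """Toy message fabric in the spirit of DXL: each app makes one
    connection, then publishes events to topics or invokes services
    by name, independent of the other apps' implementations."""

    def __init__(self):
        self.subscribers = {}   # topic -> list of callbacks
        self.services = {}      # service name -> handler

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        # Fan out: every subscriber on the topic sees the message
        for cb in self.subscribers.get(topic, []):
            cb(payload)

    def register_service(self, name, handler):
        self.services[name] = handler

    def invoke(self, name, request):
        # One app calls a service; the registered app responds
        return self.services[name](request)

# One threat-intel app publishes; two consumers react independently.
fabric = Fabric()
seen = []
fabric.subscribe("threat/detected", lambda p: seen.append(("quarantine", p)))
fabric.subscribe("threat/detected", lambda p: seen.append(("log", p)))
fabric.register_service("reputation/lookup",
                        lambda host: {"host": host, "score": 30})

fabric.publish("threat/detected", {"host": "10.0.0.5"})
result = fabric.invoke("reputation/lookup", "10.0.0.5")
```

Orchestration, in this picture, is just scripting several publishes and service calls in sequence, which is why abstracting away each vendor's native API matters.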

The challenge now is to persuade the wider embedded industry that the new McAfee is truly independent of Intel in order to use the technology.

Related stories on the Embedded Blog: 

