AIoT and Machine Learning
Machine Learning is progressing rapidly, with new Deep Learning algorithms able to solve historically difficult problems using data-driven design principles. This is especially exciting in IoT, where the rapid increase in connected devices has led to an explosion in the amount of data generated at the edge. These new algorithms are playing a critical role in advancing the next phase of the IoT revolution.
This approach is being embraced across markets such as smart cities, smart homes, the Industrial Internet of Things (IIoT), wearables, and more. According to one market research firm’s projections, the AIoT market is expected to grow from US$5.1 billion in 2019 to US$16.2 billion by 2024, a compound annual growth rate (CAGR) of 26 percent.1)
A recently published PricewaterhouseCoopers (PwC) article identifies the drivers of IoT growth and the benefits of artificial intelligence: decreasing costs, device proliferation, increased venture capital (VC) spending, and the convergence of information technology (IT) with operational technology (OT), and of big data with the cloud/fog.
With decreasing cost in several areas as a benefit, it is no surprise that many systems developers are interested in taking advantage of these combined capabilities in their next designs. To accelerate the development of differentiated AIoT products, Infineon Technologies has released ModusToolbox™ Machine Learning.
AIoT addresses system barriers in IoT. The original design concept of simply moving all of the data generated at the edge to the cloud for analysis and machine learning has run into three fundamental barriers: privacy, reliability, and latency. To reduce these barriers, system designers have moved the ML algorithms, which traditionally run in the cloud, to the edge. Voice-based smart assistants are a great example.
First, when you interact with an assistant, the time it takes to make a round-trip to the cloud to get the answer generally makes for a poor user experience, since that is not the natural way for humans to interact. Second, the reliability and bandwidth of the internet connection are also critical, especially when these assistants run on wearable devices such as smart watches, which don’t always have a perfect, reliable connection to the cloud. Third, with the proliferation of these assistants everywhere, privacy is always top-of-mind, and trusting service providers with sensitive voice data is always a challenge. Running these algorithms efficiently at the edge eliminates these barriers and allows AIoT products to scale much more rapidly.
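The edge-first pattern described above can be sketched as follows. This is a minimal, hypothetical illustration (not code from any Infineon product): the function names and the trivial "model" are placeholders, and the point is the data flow, in which raw audio is processed on the device and only a compact result ever leaves it.

```python
# Hypothetical sketch of the edge-first pattern: inference runs locally,
# and only a small event message (if anything) is sent to the cloud.

def detect_wake_word(audio_frame: bytes) -> bool:
    """Stand-in for an on-device ML model. A real model would run a
    neural network on the audio; this placeholder just checks a prefix."""
    return audio_frame.startswith(b"WAKE")

def handle_audio(audio_frame: bytes, send_to_cloud) -> bool:
    """Run inference locally; contact the cloud only on a detection.
    Raw audio never leaves the device (privacy, bandwidth), and the
    common negative case avoids the round-trip latency entirely."""
    if detect_wake_word(audio_frame):
        send_to_cloud({"event": "wake_word", "confidence": 1.0})
        return True
    return False

sent = []
handle_audio(b"WAKE...", sent.append)   # detection -> one small message
handle_audio(b"noise...", sent.append)  # no detection -> nothing sent
```

Note that the cloud transport is passed in as a callable, so the same on-device logic works whether the uplink is Wi-Fi, cellular, or temporarily unavailable.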
ModusToolbox™ Machine Learning
To reduce AIoT development time, ModusToolbox™ Machine Learning enables system designers to take a pretrained deep learning model and apply optimization techniques, such as quantization, that shrink it to fit resource-constrained edge devices.
Another key feature the toolset brings is helping system designers visualize how these optimization techniques impact model performance, so they can make the right trade-offs between accuracy and the size and complexity of running the model efficiently on a PSoC™ MCU.
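To make the trade-off concrete, here is a minimal sketch of the kind of optimization involved. This is illustrative only, not ModusToolbox code: it shows symmetric 8-bit post-training quantization, which stores each weight in 1 byte instead of the 4 bytes of a float32, shrinking the model roughly 4x at the cost of a small rounding error.

```python
# Illustrative 8-bit post-training quantization: ~4x smaller weights,
# with a bounded rounding error. Not taken from any vendor toolchain.

def quantize_int8(weights):
    """Symmetric quantization: map floats onto integers in [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 representation."""
    return [v * scale for v in q]

weights = [0.52, -1.27, 0.003, 0.98, -0.41]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

size_f32 = len(weights) * 4   # 4 bytes per float32 weight
size_i8 = len(q) * 1          # 1 byte per int8 weight
max_err = max(abs(a - b) for a, b in zip(weights, restored))
```

The worst-case error per weight is half the quantization step (`scale / 2`), which is exactly the accuracy-versus-footprint trade-off a designer would want to visualize before committing a model to a small MCU.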
To help system designers get started quickly, code examples and IoT-focused development kits are provided for a smooth developer experience that reduces the complexities system developers face when developing AIoT applications. These applications typically require seamless integration of the Machine Learning workload with the compute, connectivity, and cloud domains, and ModusToolbox™ brings these domains together in a single development environment.
Click here for more information about Infineon’s machine learning solutions.
Click here for more information about Infineon IoT solutions.
1) “AI in IoT Market worth $16.2 billion by 2024,” MarketsandMarkets, https://www.marketsandmarkets.com/PressReleases/ai-in-iot.asp