IoT edge computing: how to pan for gold in the next billion-dollar market?

Edge intelligence is profoundly changing the IoT industry, making real-time data analysis easier and business operations more efficient. Customers increasingly favor IoT devices that integrate edge computing, and chip companies’ R&D investment in edge AI chips is rising year by year.

Author: Doctor M

Perhaps you have already noticed that, in both personal consumption and industrial applications, today’s Internet of Things is getting smarter. The reason is the collaboration between edge computing and the Internet of Things. Edge computing is where computing, processing, and storage are headed; although still in its infancy, the technology is growing rapidly. According to Gartner, as of 2018 about 10% of enterprise data was generated and processed at the “edge”; by 2025, that number will reach a staggering 75%. Gartner also predicts that within the next 3-5 years, edge computing will become the next blue-ocean market, worth tens of billions of dollars.

How should we understand the development trend of this market and related technologies? What technical resources are currently available to help us dig for gold in this huge potential market? Let’s talk today.

Why do IoT and edge computing work together?

The Internet of Things (IoT) refers to connecting physical objects to the Internet so that they can receive and transmit data over the network without human intervention. Its ecosystem consists of network-enabled smart devices that use embedded systems (processors, sensors, and communication hardware) to collect, send, and process data from their environment. IoT devices share the sensor data they collect by connecting to IoT gateways or other edge devices, where the data is either sent to the cloud for analysis or processed locally. Increasingly, artificial intelligence (AI) and machine learning (ML) also make the data-collection process easier.

In operation, the massive amounts of data generated by the IoT need to be processed and analyzed quickly, and edge computing brings computing services closer to the end user or the data source, such as an IoT device. This allows IoT data to be collected and processed at the edge where the device sits, rather than shipped back to a data center or cloud, which matters for workloads that require fast startup or real-time operation. As numerous examples show, IoT devices increasingly gain value from local computing power as a means of analyzing data in real time.

Autonomous driving is a prime example of why IoT and edge computing need to work together. Self-driving vehicles must collect and process massive amounts of real-time data about traffic, pedestrians, street signs, and stop lights. If a vehicle needs to stop or turn quickly, shuttling data back and forth between the vehicle and the cloud takes too long to meet the real-time processing needs of a moving vehicle, and it poses a serious safety risk. Edge computing brings the equivalent of cloud computing into the vehicle, enabling its IoT sensors to process data locally in real time and avoid accidents.
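The round-trip argument above is easy to make concrete with a little arithmetic. The sketch below estimates how far a vehicle travels while a decision is in flight; the latency figures are illustrative assumptions, not measurements of any real system.

```python
# Rough latency-budget sketch: distance a vehicle covers while a
# control decision is in flight. Latency values are assumptions.

def distance_travelled_m(speed_kmh: float, latency_ms: float) -> float:
    """Metres covered during `latency_ms` at a constant speed."""
    speed_ms = speed_kmh / 3.6          # km/h -> m/s
    return speed_ms * (latency_ms / 1000.0)

SPEED_KMH = 100.0
CLOUD_ROUND_TRIP_MS = 150.0   # assumed cellular round trip to a data center
EDGE_LATENCY_MS = 10.0        # assumed on-vehicle inference time

cloud_m = distance_travelled_m(SPEED_KMH, CLOUD_ROUND_TRIP_MS)
edge_m = distance_travelled_m(SPEED_KMH, EDGE_LATENCY_MS)
print(f"Cloud round trip: vehicle moves {cloud_m:.1f} m before reacting")
print(f"On-vehicle edge:  vehicle moves {edge_m:.2f} m before reacting")
```

Under these assumptions the cloud round trip costs roughly four metres of travel at highway speed, an order of magnitude more than deciding on the vehicle itself.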

How does edge intelligence benefit IoT?

The advantages of introducing edge computing into the Internet of Things are mainly reflected in five aspects.

• Reduced latency

Edge computing can be described as a distributed computing approach that brings computing power and data storage closer to the primary source of data. Its primary goal is to reduce network latency, and for the IoT this reduction is arguably the single biggest benefit edge computing delivers.

• Reduce bandwidth

With edge intelligence, data that must be processed in real time is handled locally, and the volume of data sent over the Internet for post-processing by cloud-based services is greatly reduced, effectively cutting network bandwidth requirements.

• Increase flexibility

Edge intelligence can be deployed either centrally or as a distributed solution, giving it ample flexibility, and the resulting economies of scale are attractive for cloud and edge providers.

• Predictive and analytical capabilities

Through edge computing, businesses can combine locally collected data with the visibility and analytics provided by the cloud to run operations around the globe. Real-time insight into operations also lets companies predict future demand and introduce service innovations that improve operational efficiency.

• Cost advantage

Edge computing can reduce the footprint of expensive in-house enterprise network deployments. For end users the benefit is nearly invisible, but they are the ultimate beneficiaries as the overall cost of IoT services falls.

The rise of edge computing chips

Today’s IoT devices are getting smarter with the advent of edge computing and artificial intelligence (AI). For example, edge IoT devices installed in factories can track the working condition of machines and perform predictive maintenance to prevent a failure from damaging the entire system, while smart cameras equipped with edge AI chips can not only capture video but also identify people, monitor traffic flow, track driver behavior, and more.

These are just a microcosm of the benefits edge computing brings to the IoT. Driven by this huge application market, the edge computing market is expanding rapidly. Grand View Research estimated in a market analysis report that the global edge computing market was worth about 4.7 billion US dollars in 2020 and will grow at a compound annual growth rate (CAGR) of 38.4% from 2021 to 2028. This huge demand has in turn fueled a boom in edge computing chips.

Cloud computing has existed for more than a decade since its conception, and it remains a development hotspot in the electronic information industry. The rise of cloud-based machine learning was driven largely by GPUs, with NVIDIA the main force. That success quickly caught the attention of other chip makers: AI-specific processors followed, pushed by the likes of Google, AWS, and Microsoft, and established players such as AMD, Intel, Qualcomm, and Arm joined the AI-chip fray. As demand for real-time data processing has grown, the edge computing industry has risen rapidly. However, the GPUs and CPUs originally built for cloud computing are poorly suited to small edge devices; GPUs in particular suffer from high power consumption.

When choosing a hardware architecture for edge computing, FPGAs and MCUs are good options. In particular, FPGA SoCs that integrate Arm processors offer great application flexibility and are well suited to AI inference at the real-time network edge, where performance is limited and power budgets are strict. Because MCU-based edge devices exist in huge numbers, integrating AI functions into these general-purpose MCUs has become a direction for chip manufacturers, and companies such as Maxim, NXP, Silicon Labs, and STMicroelectronics have each launched a range of microcontroller products for edge computing.

Xilinx’s Versal AI Edge series integrates application processors, AI engines, programmable FPGA fabric, and DSP engines. The family accelerates the entire application from sensor to AI, enabling real-time control, delivering computing up to four times faster than previous AI processor architectures, and meeting key functional-safety standards such as ISO 26262 and IEC 61508. To cover the performance requirements of different scenarios, the series offers seven models to choose from, ranging from the VE2002 to the VE2802.

Figure 1: There are 7 models of Versal series edge AI-specific processors for different applications (Source: Xilinx)

Maxim’s new neural network accelerator, the MAX78000 SoC, integrates two MCU cores for system control, an Arm Cortex-M4 processor and a 32-bit RISC-V processor, combined with an ultra-low-power deep neural network accelerator that provides the computing power required for high-performance AI applications. It is an ideal choice for edge computing applications such as machine vision, facial recognition, object detection and classification, time-series data processing, and audio processing. Once weights are configured and loaded, the MAX78000’s convolutional neural network (CNN) accelerator, with 442KB of weight storage, can run AI inferences up to 100 times faster than software solutions on low-power microcontrollers while consuming less than 1% of the energy.

Figure 2: Maxim’s neural network accelerator MAX78000 SoC (Credit: Mouser)
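A fixed weight budget like the MAX78000’s 442KB is exactly why low-bit quantization matters at the edge. The back-of-the-envelope sketch below shows how many parameters fit at different weight widths; the set of bit widths shown is illustrative, not a claim about what any specific accelerator supports.

```python
# Back-of-the-envelope check: how many CNN weights fit in a 442 KB
# weight memory at different quantization widths. The bit widths are
# illustrative; check the target part's datasheet for what it supports.

WEIGHT_STORAGE_BYTES = 442 * 1024

def max_params(bits_per_weight: int) -> int:
    """Number of weights that fit in the accelerator's weight memory."""
    return WEIGHT_STORAGE_BYTES * 8 // bits_per_weight

for bits in (32, 8, 4, 2, 1):
    print(f"{bits:>2}-bit weights: up to {max_params(bits):,} parameters")
```

At 32-bit floating point only about 113K parameters fit, while 8-bit quantization quadruples that to roughly 452K, which is why quantized models are the norm on MCU-class accelerators.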

The i.MX RT series is a line of crossover MCUs that NXP has promoted in recent years. It combines the rich functionality of high-performance application processors with real-time behavior and MCU-style ease of use, and it is designed for low-cost, high-performance, highly integrated edge computing. Part of the NXP EdgeVerse edge computing platform, the i.MX RT series offers Arm Cortex-M cores, real-time capability, and MCU usability at an affordable price. NXP’s MCU-based EdgeReady face recognition solution makes full use of the i.MX RT106F crossover MCU, completely replacing the traditional “MPU + PMIC” hardware architecture without expensive DDR. Developers can quickly and easily add face recognition with liveness detection to their products using low-cost IR and RGB cameras, eliminating the cost of expensive 3D cameras.

Figure 3: i.MX RT106F block diagram (Source: NXP)

Epilogue

Chip-enabled edge intelligence increases the value of IoT devices in a number of ways:

• First, edge AI chips generate less heat and consume less power, so they can be integrated into handheld devices such as smartphones as well as non-consumer devices such as robots.

• Second, edge AI chips reduce or eliminate the need to send bulk data to cloud services or data centers, so processor-intensive machine learning calculations can run locally, improving data security while increasing processing speed.

• Third, edge AI chips simplify how enterprises collect and process data. As companies gather data from connected devices, they can analyze it directly on the device in real time, reducing the complexity of decision-making.

Computing chips are moving closer to where data resides, and both established chipmakers and startups are focused on adding AI capabilities at the edge. According to Verified Market Research (VMR), the edge AI chip market will expand at a compound annual growth rate (CAGR) of 2.27% from 2021, reaching $2.09 billion by 2028. Within edge computing, computer vision has become a prominent AI application, especially deep learning, which uses multi-layer neural networks and unsupervised techniques to perform image pattern recognition.
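The image pattern recognition mentioned above rests on one core operation, the convolution. The toy sketch below applies a single hand-written 3x3 filter to a tiny synthetic image with a vertical edge; real networks learn many such filters across many layers, so this only illustrates the building block.

```python
# Toy illustration of the convolution at the heart of CNN-based vision:
# one hand-written 3x3 filter responding to a vertical edge.

import numpy as np

def conv2d_valid(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Naive 'valid' 2-D convolution (cross-correlation, as in CNNs)."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# 6x6 image: dark left half (0), bright right half (1) -> a vertical edge
image = np.zeros((6, 6))
image[:, 3:] = 1.0

# Sobel-like vertical-edge detector
kernel = np.array([[-1.0, 0.0, 1.0],
                   [-2.0, 0.0, 2.0],
                   [-1.0, 0.0, 1.0]])

response = conv2d_valid(image, kernel)
print(response)  # strong response only where the window straddles the edge
```

The filter responds strongly only where its window straddles the brightness boundary and is zero over uniform regions, which is exactly the localized feature detection that stacked CNN layers build on.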


