How AI is reshaping the edge computing landscape

How much computing power is needed at the edge? How much memory and storage is enough for AI at the edge? Minimum requirements are increasing as AI opens the door to innovative applications that need more and faster processing, storage and memory. How can today’s memory and storage technologies meet the stringent requirements of these challenging new edge applications?

What do we mean by “the edge”?

The edge includes any distributed application where at least some of the processing happens outside the server, even if the data is ultimately sent to a data center. The big idea is to avoid sending all the data over the internet for processing on a server and instead process it closer to where it is collected, avoiding the latency of long data round trips and allowing near real-time response on site.
The edge is roughly divided by distance from the server to the endpoint. The "near edge" can include applications close to the data center, perhaps even within the same building. The "far edge" sits at the other extreme, in applications like autonomous vehicles. The common feature is that the edge system processes data that would traditionally have been sent to a data center. This has practical applications in many industries.

Data latency and bandwidth at the industrial edge

In industrial applications, edge computers are typically designed to take input from sensors or other devices and act on it. For example, predictive maintenance takes readings from acoustic, vibration, temperature, or pressure sensors and analyzes them for anomalies that signal early-stage machine faults. Machines can then be taken offline immediately, or scheduled for service, so maintenance happens before a catastrophic failure. Reaction times need to be fast, but the volume of data is low. AI, however, is putting new pressure on these edge systems.
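To make the contrast concrete, a traditional sensor check of this kind is computationally light. The sketch below is a minimal illustration rather than a production pipeline; the sensor read function, window size, and threshold are assumptions standing in for a real data-acquisition setup.

```python
# Minimal sketch of sensor-based anomaly detection for predictive maintenance.
# The sensor source, window size, and threshold are illustrative assumptions.
from collections import deque
import statistics
import random  # stands in for a real vibration sensor feed

WINDOW = 200        # recent readings kept for the baseline
THRESHOLD = 4.0     # flag readings more than 4 standard deviations from the mean

def read_vibration_mm_s() -> float:
    """Placeholder for a real sensor read (e.g. over Modbus or OPC UA)."""
    return random.gauss(2.0, 0.2)

history = deque(maxlen=WINDOW)

while True:  # runs continuously on the edge device
    value = read_vibration_mm_s()
    if len(history) >= 30:                      # wait for a usable baseline
        mean = statistics.fmean(history)
        stdev = statistics.pstdev(history) or 1e-9
        z = abs(value - mean) / stdev
        if z > THRESHOLD:
            print(f"Anomaly: {value:.2f} mm/s (z={z:.1f}) - schedule inspection")
    history.append(value)
```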

The impact of AI on edge processing loads

AI places a different kind of load on computer systems. AI workloads require faster processors, more memory, and powerful GPUs. Automated optical inspection (AOI), for example, has seen widespread adoption for PCB inspection, using video input from high-speed cameras to identify missing components and quality defects. Similar visual inspection technology is being adopted in industries as diverse as agriculture, where it can identify defects and discoloration in produce.
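As a rough sketch of what such a vision workload looks like in code, the loop below grades each camera frame with a neural network. The camera index and the off-the-shelf ResNet-18 stand-in model are assumptions for illustration; a real AOI system would run a model trained on pass/fail board images.

```python
# Minimal sketch of a per-frame inspection loop at the edge.
# The camera index and model are illustrative stand-ins, not a real AOI pipeline.
import cv2
import torch
from torchvision import models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).to(device).eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Resize((224, 224)),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

cap = cv2.VideoCapture(0)          # high-speed inspection camera (index assumed)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    batch = preprocess(rgb).unsqueeze(0).to(device)
    with torch.no_grad():
        scores = torch.softmax(model(batch), dim=1)
    # A production system would map classes to pass/fail decisions;
    # here we simply report the top class and its confidence.
    conf, cls = scores.max(dim=1)
    print(f"class={cls.item()} confidence={conf.item():.2f}")
cap.release()
```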
Running complex algorithms on every video frame requires the parallel processing of power-hungry GPU cards, more memory for efficient and accurate AI inference, and more storage for the additional data. But don't these resources already exist in data centers?

Bringing a small part of the data center power to the edge

Essentially, to address AI tasks at the edge, we're bridging the gap between the edge and the data center. Servers tucked away in temperature-controlled data centers have terabytes of memory and vast amounts of storage to handle heavy workloads and keep systems running fast. But when it comes to inference that happens far from the data center, it's a different story. Edge computers don't enjoy such idyllic surroundings and must be built to withstand harsh environments. The edge needs hardware that strives for maximum performance while accounting for less-than-ideal conditions.

Edge hardware

Adding AI at the industrial edge requires hardware suited to the task. An industrial computer that can handle extreme temperatures, vibration, and space constraints is a must. In particular, vision systems, the most prolific edge AI application to date, need three things: memory to support efficient AI inference, storage for incoming data, and PoE to support adding cameras.
You can get more memory in a smaller footprint with the latest DDR5. Delivering more memory capacity at the edge at higher speeds, with up to twice the speed and four times the capacity of DDR4 in the same footprint, it makes more efficient use of available space and resources.
Storage capacity also needs to grow for edge applications: data either waits to be sent to the server or stays at the edge for some time, so SSDs are needed for staging. The move from SATA to NVMe has opened the door to higher speeds and performance, and the soon-to-be-available NVMe PCIe Gen4 x4 SSD, the latest addition to the Cervoz portfolio, provides industry-leading performance for these applications.
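A minimal sketch of that staging pattern is shown below; the staging path, batch size, and upstream endpoint are assumptions for illustration, not part of any specific product.

```python
# Minimal sketch of staging data on a local NVMe SSD before forwarding it upstream.
# The staging path, batch size, and endpoint are illustrative assumptions.
import json
import time
from pathlib import Path
from urllib import request

STAGING_DIR = Path("/mnt/nvme/staging")   # fast local SSD
UPSTREAM = "https://example.com/ingest"   # hypothetical data-center endpoint

def stage(record: dict) -> None:
    """Write each record to local storage immediately, so nothing is lost if the link drops."""
    STAGING_DIR.mkdir(parents=True, exist_ok=True)
    path = STAGING_DIR / f"{time.time_ns()}.json"
    path.write_text(json.dumps(record))

def forward_batch(limit: int = 100) -> None:
    """Push staged records to the server when bandwidth allows, then delete local copies."""
    for path in sorted(STAGING_DIR.glob("*.json"))[:limit]:
        req = request.Request(UPSTREAM, data=path.read_bytes(),
                              headers={"Content-Type": "application/json"})
        try:
            request.urlopen(req, timeout=10)
        except OSError:
            return                        # keep the file and retry on the next pass
        path.unlink()
```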
Vision systems need cameras. PoE+ is the easiest and most efficient way to add high-speed cameras to a system, delivering both power and data over a single cable. Cervoz's PoE Ethernet Modular PCIe Expansion Card adds this capability as a compact plug-in card.

Get a head start on AI at the edge

For businesses looking to gain an edge, the combination of rugged industrial computers with the right memory and storage provides the reliability to withstand harsh edge environments and the power to enable next-generation AI technologies at the network edge.

About Cervoz
Based in Taiwan, Cervoz Technology supplies embedded components for the industrial PC market. The company has nearly two decades of experience designing and developing high-performance memory and storage solutions for industrial applications.
