Dusty construction sites. Fog-covered fields. Crowded warehouses. Heavy rain. Uneven terrain. What does it take for an autonomous machine to perceive and navigate challenging real-world environments like these – reliably, in real time? Together with Au-Zone Technologies, we set out to build a perception system that performs under operational stress but is fast to integrate and easy to scale.
The result: the Raivin module, a 3D perception system that fuses radar sensing, vision processing and edge AI inference into a single, production-ready unit. Designed for operational complexity, the Raivin enables machines to process and act on complex environmental data in real time.
With pre-trained AI perception models and a unified hardware-software stack, the Raivin simplifies the deployment of intelligent perception, marking a step forward in bringing scalable autonomy to the edge.
The push toward autonomy and physical AI is outpacing the readiness of traditional perception solutions. Many still rely on single-sensor stacks that falter in complex, unpredictable environments.
Together with Au-Zone, we set out to solve the problem — co-developing an edge AI sensor fusion system designed to deliver high-confidence, low-latency perception.
Vision delivers rich semantic understanding: object detection, classification and segmentation. Radar adds continuous depth and motion tracking, even through obscured environments. Fused with AI inference, these signals form a synchronized, context-aware 3D model of the world. This enables real-time decision making with a high degree of confidence.
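To make the fusion idea concrete, here is a minimal late-fusion sketch in Python: camera detections supply the class and 2D location, radar returns supply range and radial velocity, and a simple association step produces depth- and motion-tagged objects. This is an illustrative toy, not the Raivin's actual pipeline; the camera intrinsics, detection format and radar returns are assumed placeholders.

```python
# Toy late fusion of camera detections and radar returns (illustrative only).
import numpy as np

FX, FY, CX, CY = 600.0, 600.0, 320.0, 240.0   # assumed pinhole camera intrinsics


def project_radar_to_image(points_xyz):
    """Project radar returns (x right, y down, z forward, in metres) to pixels."""
    x, y, z = points_xyz[:, 0], points_xyz[:, 1], points_xyz[:, 2]
    return np.stack([FX * x / z + CX, FY * y / z + CY], axis=1)


def fuse(detections, radar_xyz, radar_velocity):
    """Attach radar range and radial velocity to each 2D camera detection."""
    pixels = project_radar_to_image(radar_xyz)
    fused = []
    for det in detections:                    # det: {'label': str, 'box': (u1, v1, u2, v2)}
        u1, v1, u2, v2 = det["box"]
        inside = ((pixels[:, 0] >= u1) & (pixels[:, 0] <= u2)
                  & (pixels[:, 1] >= v1) & (pixels[:, 1] <= v2))
        if not inside.any():
            continue                          # no radar evidence for this detection
        fused.append({
            "label": det["label"],
            "box": det["box"],
            "range_m": float(np.median(radar_xyz[inside, 2])),
            "radial_velocity_mps": float(np.median(radar_velocity[inside])),
        })
    return fused


# Toy inputs: one "person" detection and a small cluster of radar returns.
detections = [{"label": "person", "box": (280.0, 180.0, 360.0, 300.0)}]
radar_xyz = np.array([[0.10, 0.20, 8.0], [0.15, 0.25, 8.2], [3.0, 0.0, 20.0]])
radar_velocity = np.array([-0.5, -0.6, 12.0])
print(fuse(detections, radar_xyz, radar_velocity))
```

In a deployed system, this association step would be replaced by calibrated extrinsics, time-synchronized sensor streams and a tracking filter; packaging exactly that complexity into one unit is what the Raivin is designed to do.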
But even with the benefits of multi-sensor systems, autonomous applications are still limited by how quickly they can respond to real-world stimuli. Latency matters, and these workloads can’t tolerate delays caused by cloud processing or slow sensor refresh rates.
Only by combining radar and vision with edge AI processing in one unit could we deliver a system fast enough, reliable enough and robust enough to meet the demands of next-generation autonomy.
From the start, the Raivin was a co-development effort: a full-stack design process in which every layer, from silicon to software, was developed collaboratively to deliver unified performance at the edge.
The Raivin module, built on a Toradex Verdin iMX8M Plus, is a commercially available AI perception solution that combines low-level radar cube and vision data processing with edge AI inference in a single, deployable unit.
Quad-core Arm Cortex-A53 and an NPU delivering up to 2.3 TOPS for AI inference, vision-based classification, segmentation and scene understanding (see the inference sketch after these specifications)
ISO 26262 ASIL D compliant; handles real-time radar signal processing and fusion
Operates in the 76-81 GHz band; enables spatial awareness in dynamic and degraded conditions
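On the compute side, the quad-core Cortex-A53 plus NPU combination is typically programmed through a standard inference runtime. The hedged sketch below shows one common pattern on the i.MX 8M Plus: a quantized TensorFlow Lite model dispatched to the NPU through a delegate library. The delegate path and model file are assumptions that depend on the BSP and eIQ release actually installed.

```python
# Hedged sketch: running a quantized model on the i.MX 8M Plus NPU via
# TensorFlow Lite. Delegate path and model filename are assumptions that
# vary with the installed BSP/eIQ release.
import numpy as np
import tflite_runtime.interpreter as tflite

VX_DELEGATE = "/usr/lib/libvx_delegate.so"   # typical NXP delegate location (assumed)
MODEL_PATH = "detector_int8.tflite"          # placeholder quantized model

interpreter = tflite.Interpreter(
    model_path=MODEL_PATH,
    experimental_delegates=[tflite.load_delegate(VX_DELEGATE)],
)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Feed one camera frame (random data shaped to the model's input for this sketch).
frame = np.random.randint(0, 256, size=inp["shape"], dtype=np.uint8)
interpreter.set_tensor(inp["index"], frame)
interpreter.invoke()                         # inference is offloaded to the NPU
print(interpreter.get_tensor(out["index"]).shape)
```

Dropping the delegate argument makes the same script fall back to CPU execution, which is a simple way to compare NPU-accelerated and CPU-only latency on the target.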
The Raivin module was developed using Au-Zone's EdgeFirst Studio™, a platform that simplifies multimodal data collection, AI-assisted labeling, training, validation and deployment, without requiring deep ML expertise.
Within EdgeFirst Studio, the EdgeFirst Perception Stack helps developers accelerate sensor fusion design through pre-trained models and a workflow-optimized environment. Teams can label datasets, fine-tune models and validate performance within a single toolchain, significantly reducing development time and lowering the barrier to entry.
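As a rough stand-in for that fine-tune-and-validate loop, the sketch below uses plain PyTorch/torchvision rather than EdgeFirst Studio's own tooling; the synthetic dataset, model choice and single-class labels are illustrative assumptions only.

```python
# Generic fine-tune/validate loop (NOT EdgeFirst Studio's API): a pre-trained
# detector adapts to a small labeled dataset, then runs in eval mode for checks.
import torch
from torch.utils.data import Dataset, DataLoader
from torchvision.models.detection import fasterrcnn_mobilenet_v3_large_fpn


class SyntheticBoxes(Dataset):
    """Stand-in for a labeled dataset exported from a data-collection tool."""
    def __len__(self):
        return 16

    def __getitem__(self, idx):
        image = torch.rand(3, 240, 320)                        # fake camera frame
        target = {"boxes": torch.tensor([[40.0, 40.0, 120.0, 160.0]]),
                  "labels": torch.tensor([1])}                 # single object class
        return image, target


def collate(batch):
    return tuple(zip(*batch))


model = fasterrcnn_mobilenet_v3_large_fpn(weights="DEFAULT")
loader = DataLoader(SyntheticBoxes(), batch_size=4, collate_fn=collate)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

# Fine-tune: in train mode the detector returns a dict of loss terms.
model.train()
for images, targets in loader:
    losses = model(list(images), list(targets))
    loss = sum(losses.values())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Validate: switch to eval mode and inspect predictions on a held-out frame.
model.eval()
with torch.no_grad():
    preds = model([torch.rand(3, 240, 320)])
print(preds[0]["boxes"].shape, preds[0]["scores"][:3])
```

In practice, the validated model would then be quantized and exported for the target NPU, the deployment step illustrated in the inference sketch above.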
The result is a tightly integrated 3D perception system optimized for low latency and low power, ready for deployment in edge environments.
The Raivin was put to the test in a live demo replicating the types of environmental stressors autonomous machines face every day, from weather and motion to visual obstructions.
Historically, sensor fusion has been complex, requiring fragmented tools, custom pipelines and deep domain expertise. The Raivin changes that.
With pretrained AI models integrated into Au-Zone’s EdgeFirst Studio, engineers can implement radar and vision integration without starting from scratch. The software supports dataset management, training and validation, enabling fast iteration with minimal coding or ML infrastructure. It can also be used as a data collection platform to explore custom solutions for different objects and working environments.
The ready-built hardware solution is optimized for edge AI processing, eliminating concerns about custom implementations and hardware tradeoffs.
The Raivin is already commercially available, giving OEMs a validated 3D perception system that can scale. Whether deployed in mobile robots, precision agriculture or fleet vehicles, the Raivin module enables fast integration of AI-powered perception through a single platform.
Au-Zone sets the benchmark for visual perception at the edge. With over two decades of innovation since its founding in 2001, Au-Zone has emerged as a trusted leader in embedded AI and machine learning for edge computing.
Solution: The Raivin, 3D Perception Sensor Fusion
Learn more about Au-Zone
Director, Industrial Segment Marketing for Transportation and Mobility, NXP Semiconductors
Altaf has more than 30 years of experience in applications engineering, product marketing and business development roles for enterprise, service provider and industrial applications. He is the segment lead at NXP for Transportation and Mobility, which includes mobile robotics, machine vision and warehouse logistics automation, and is responsible for defining system solutions to accelerate automation using autonomous mobile robots. He graduated from the University of South Bank in London, UK, with a BS in Electrical and Electronic Engineering.