Infineon accelerates deployment of safe and secure robots using digital twins in collaboration with NVIDIA

Infineon has announced the expansion of its collaboration with NVIDIA to advance system architectures for Physical AI, with a focus on humanoid robots. Building on the collaboration announced in August 2025, the companies intend to combine Infineon’s strengths in motor control, microcontrollers, power systems and security with NVIDIA’s AI, robotics and simulation platforms to help the ecosystem design and deploy humanoid robots. Infineon will also join the NVIDIA Halos AI Systems Inspection Lab to examine the design of robust hardware and software safety foundations, ensuring that robots can operate safely and securely in real-world environments.

Humanoid robots are complex systems that must perceive their surroundings, make decisions in real time and act safely – often in workplaces designed for humans. To enable this, they rely on a chain of semiconductor-based functions: sensing, processing, actuation, connectivity and energy management. Infineon is bringing its semiconductor solutions to NVIDIA’s simulation and robotics platforms to accelerate this chain of sensing, thinking, and acting safely and securely so that humanoid robots can move more quickly from lab pilots into deployment at scale.

A key element of the collaboration is the use of digital twins of Infineon’s smart actuators and selected sensors. These virtual models are deployed in NVIDIA Isaac Sim and NVIDIA Isaac Lab, open robotic learning and simulation frameworks, so developers can test and fine-tune robot motion control and perception in realistic simulation before hardware is built or integrated. By identifying and resolving issues earlier in the development cycle, customers can shorten time‑to‑market and reduce integration risk for humanoids used in applications such as logistics, manufacturing and service robotics.
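The core idea of an actuator digital twin is that a numerical model of the motor joint can be stepped inside a simulation loop long before hardware exists. The minimal sketch below is purely illustrative: the class, its parameters and the first-order lag model are assumptions for explanation, not Infineon or NVIDIA APIs.

```python
# Minimal sketch of an actuator "digital twin" stepped in a simulation loop.
# All names and parameter values are illustrative, not real product APIs.

class SmartActuatorTwin:
    """First-order model of a motor joint: velocity lags the commanded target."""

    def __init__(self, time_constant_s: float = 0.05, max_velocity: float = 2.0):
        self.tau = time_constant_s        # response time constant (s), assumed
        self.max_velocity = max_velocity  # velocity limit (rad/s), assumed
        self.velocity = 0.0
        self.position = 0.0

    def step(self, target_velocity: float, dt: float) -> float:
        """Advance the model by dt seconds and return the new joint position."""
        target = max(-self.max_velocity, min(self.max_velocity, target_velocity))
        # First-order lag: dv/dt = (target - v) / tau
        self.velocity += (target - self.velocity) * (dt / self.tau)
        self.position += self.velocity * dt
        return self.position


twin = SmartActuatorTwin()
for _ in range(1000):                  # simulate 1 s at a 1 kHz control rate
    twin.step(target_velocity=1.0, dt=0.001)
print(round(twin.velocity, 3))         # velocity has converged to the 1.0 rad/s target
```

In a real workflow the same loop would run inside the simulator, letting a controller be tuned against the model and any integration issue surface before the physical actuator is available.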

Building on their existing collaboration, Infineon and NVIDIA will work on a common system architecture for humanoid robots that delivers ultra‑low latency, compact form factors and high power density. Infineon will provide motor‑control solutions powered by NVIDIA Holoscan Sensor Bridge that interface with the NVIDIA Jetson Thor developer platform, using Infineon AURIX microcontrollers and PSOC devices and supporting post‑quantum cryptography (PQC) for firmware and system protection.

Security is a key part of the collaboration. NVIDIA Jetson Thor pairs a compact compute module with a carrier board that provides power and interfaces to sensors, networking, and actuators. Infineon will provide hardware TPM (Trusted Platform Module) chips and other security components as reference designs to protect AI models and data and secure the system from the NVIDIA Jetson module all the way to the cloud. The collaboration will also focus on the NVIDIA Halos safety framework, enabling the design of certifiable systems for Level 4 autonomous vehicles and robotics. Infineon will provide hardware and software safety foundations, integrating hardware platforms and operating systems to ensure rigorous safety and systematic cybersecurity by design across the entire stack. This helps companies designing Jetson carrier boards build in stronger security, including secure boot, encrypted communications, and safe over-the-air updates.
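The secure-boot idea mentioned above reduces to one check: refuse to run firmware whose measurement does not match an authenticated manifest. The sketch below is a deliberate simplification. Real secure boot uses asymmetric signatures anchored in a hardware root of trust such as a TPM; here an HMAC with a device key stands in for that, and the key and firmware bytes are invented for illustration.

```python
# Simplified illustration of the secure-boot check: hash the firmware image and
# verify it against an authenticated manifest tag before allowing it to run.
# HMAC with a symmetric key is a stand-in for a real asymmetric signature
# rooted in hardware (e.g. a TPM); all values here are illustrative.
import hashlib
import hmac

DEVICE_KEY = b"example-device-root-key"  # hypothetical, never hardcode real keys

def sign_manifest(firmware: bytes) -> bytes:
    """Producer side: authenticate the firmware's digest."""
    digest = hashlib.sha256(firmware).digest()
    return hmac.new(DEVICE_KEY, digest, hashlib.sha256).digest()

def verify_firmware(firmware: bytes, manifest_tag: bytes) -> bool:
    """Boot side: recompute the digest and compare tags in constant time."""
    digest = hashlib.sha256(firmware).digest()
    expected = hmac.new(DEVICE_KEY, digest, hashlib.sha256).digest()
    return hmac.compare_digest(expected, manifest_tag)


fw = b"\x7fELF...humanoid-controller-v1"        # stand-in firmware image
tag = sign_manifest(fw)
print(verify_firmware(fw, tag))                  # True: image matches the manifest
print(verify_firmware(fw + b"tampered", tag))    # False: boot would be refused
```

The same verify-before-execute pattern, chained from an immutable boot ROM upward, is what lets a carrier-board design guarantee that only authorised firmware and over-the-air updates ever run.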

infineon.com

> Read More

ST and Leopard accelerate robotics vision with NVIDIA multi-sensor module

ST and Leopard Imaging have introduced an all-in-one multimodal vision module for humanoid and other advanced robotics systems. Combining ST imaging, 3D scene-mapping, and motion sensing with the NVIDIA Holoscan Sensor Bridge technology, the module integrates natively with NVIDIA Jetson and NVIDIA Isaac open robot development platform, simplifying and accelerating vision system design within the size, weight, and power constraints of humanoid robots.

Powered by the NVIDIA Holoscan Sensor Bridge, the new module integrates seamlessly with NVIDIA Jetson over Ethernet for real-time sensor data ingestion and with the NVIDIA Isaac open robot development platform, which offers open AI models, simulation frameworks and libraries for developers. The new module includes a build system and application programming interfaces (APIs), artificial intelligence (AI) algorithms curated for mobile robots, sample applications, domain randomisation, and a simulation environment containing sensor models.

As a key NVIDIA robotics and edge AI partner, ST continues to integrate its sensors, drivers, actuators, controllers, and development tools into the NVIDIA robotics ecosystem, including high-fidelity models and proof-of-concept modules.

For vision-based sensing, the module uses the ST VB1940, an automotive-grade RGB-IR 5.1-megapixel image sensor with combined rolling-shutter and global-shutter modes. ST has also released a mass-market and industrial version, the V**943, part of the ST BrightSense product family, available in monochrome or RGB-IR variants, as bare die or packaged sensor.
For motion sensing, the LSM6DSV16X 6-axis inertial measurement unit (IMU) embeds ST’s machine-learning core (MLC) for AI at the edge, sensor-fusion low power (SFLP), and Qvar electrostatic sensing for user-interface detection.

For 3D depth sensing, the VL53L9CX dToF all-in-one LiDAR module, part of the ST FlightSense product family, provides accurate ranging up to 9 metres. With a resolution of 54 x 42 zones (nearly 2,300 zones) and a wide 55° x 42° FoV giving roughly 1° angular resolution, short- and long-distance measurements and small-object detection are achievable at up to 100 fps.
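The quoted figures are internally consistent, as a quick check shows: 54 x 42 zones is 2,268 (the "nearly 2,300 zones"), and dividing the field of view by the zone grid gives about one degree per zone.

```python
# Check the VL53L9CX figures quoted above: total zone count and per-zone
# angular pitch implied by the field of view.
zones_h, zones_v = 54, 42
fov_h_deg, fov_v_deg = 55.0, 42.0

total_zones = zones_h * zones_v
pitch_h = fov_h_deg / zones_h   # degrees per zone, horizontal
pitch_v = fov_v_deg / zones_v   # degrees per zone, vertical

print(total_zones)                            # 2268, i.e. nearly 2,300 zones
print(round(pitch_h, 2), round(pitch_v, 2))   # 1.02 and 1.0 degrees per zone
```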

www.st.com

www.leopardimaging.com

> Read More

TI accelerates the next generation of physical AI with NVIDIA

TI has announced it is accelerating the safe deployment of humanoid robots into the real world with NVIDIA. By combining TI’s real-time motor control, sensing, radar and power technologies with NVIDIA’s advanced robotics compute, Ethernet-based sensing and simulation technologies, robotics developers can validate perception, actuation and safety earlier and more accurately. TI connects NVIDIA physical AI compute to real-world applications with deterministic control, sensing, power, and safety at every joint and subsystem. This partnership will help developers move faster from virtual development to production-ready, scalable and safety-compliant systems.

As part of this collaboration, TI designed a sensor fusion solution by integrating its mmWave radar technology with NVIDIA Jetson Thor using NVIDIA Holoscan Sensor Bridge to enable low-latency, 3D perception and safety awareness for humanoid robots.

TI’s mmWave radar sensor, the IWR6243, connected via Ethernet to NVIDIA Jetson Thor, enables scalable, low-latency 3D perception and safety awareness for physical AI applications. By fusing camera and radar data, the solution improves object detection, localisation, and tracking while reducing false positives for confident, real-time decision-making in humanoid robots.

This solution enables human-like perception that works reliably in challenging conditions – from low light and bright glare to fog and dust, indoors and outdoors – and addresses a critical safety gap that has limited real-world deployment of humanoid robots. For example, while cameras may not reliably detect glass doors or reflective surfaces, radar provides consistent detection of these transparent obstacles, enabling smooth navigation in places like office buildings, hospitals and retail environments.
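The glass-door example above can be made concrete with a toy late-fusion rule: accept an obstacle when either sensor is highly confident, or when both agree at moderate confidence. Everything in this sketch – the thresholds, field names and decision logic – is a hypothetical illustration, not the TI or NVIDIA fusion pipeline.

```python
# Illustrative late-fusion rule for camera + radar detections. Thresholds and
# names are hypothetical, chosen only to show why fusion cuts false positives
# while still catching camera-blind obstacles such as glass doors.
from dataclasses import dataclass

@dataclass
class Detection:
    camera_conf: float  # 0..1 confidence from the vision pipeline
    radar_conf: float   # 0..1 confidence from the radar pipeline

def is_obstacle(d: Detection,
                solo_threshold: float = 0.9,
                joint_threshold: float = 0.5) -> bool:
    # Either modality alone may confirm an obstacle if it is very confident.
    if d.camera_conf >= solo_threshold or d.radar_conf >= solo_threshold:
        return True
    # Agreement between modalities relaxes the bar for each one, which is how
    # fusion suppresses single-sensor false positives.
    return d.camera_conf >= joint_threshold and d.radar_conf >= joint_threshold


print(is_obstacle(Detection(camera_conf=0.2, radar_conf=0.95)))  # glass door: True
print(is_obstacle(Detection(camera_conf=0.3, radar_conf=0.3)))   # weak clutter: False
```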

ti.com

> Read More

Mouser’s autonomous vehicle resource centre addresses real-world deployment challenges

Mouser has expanded its Autonomous Vehicle (AV) resource centre focused on the system architectures and design constraints shaping production-ready autonomy. The hub examines how sensing, in-vehicle networking, and vehicle-to-everything (V2X) communications feed real-time decision systems, and why safety, cybersecurity, and ethical edge cases increasingly define what “deployable” means in the field.

For widespread, real-world deployment of autonomous vehicles, engineers must balance deterministic performance, functional safety, and cybersecurity while ensuring that vehicles can make safe and ethical decisions on the road. Robotaxi services have clearly illustrated the challenges of real-world use, with continued technical barriers. To address these issues, AV designers are integrating high-bandwidth sensing, in-vehicle networking, and continuous over-the-air (OTA) software updates into architectures that can be certified, serviced, and evolved over time. These pressures are accelerating the move toward software-defined vehicles and zonal architectures, which separate sensing and actuation from centralised compute to reduce wiring complexity, improve fault isolation, and enable modular system evolution beyond initial deployment.

Developed with input from Mouser’s technical team and manufacturer partners, the AV Resource Hub provides a curated library of articles, blogs, eBooks, and product information designed to help engineers evaluate real-world deployment tradeoffs. Topics span perception and sensor fusion, deterministic networking, functional safety and cybersecurity, ethical decision logic, and regulatory considerations, framed through the lens of practical system integration rather than lab-only performance.

resources.mouser.com

> Read More

About Smart Cities

This news story is brought to you by smartcitieselectronics.com, the specialist site dedicated to delivering information about what’s new in the Smart City Electronics industry, with daily news updates, new products and industry news. To stay up-to-date, register to receive our weekly newsletters and keep yourself informed on the latest technology news and new products from around the globe. Simply register here: Smart Cities Registration