5Mpixel RGB-IR global shutter sensor is a first for in-cabin monitoring

Omnivision has introduced the OX05B1S for simultaneous driver and occupant monitoring in vehicles. The sensor is claimed to have the industry’s smallest pixel, at 2.2 micron, combined with the highest near infrared (NIR) sensitivity.

It is also believed to be the automotive industry’s first 5Mpixel RGB-IR BSI global shutter sensor for in-cabin driver monitoring systems (DMS). Despite its pixel size of just 2.2 micron, it offers 940nm NIR sensitivity for performance in extremely low light conditions. It has a wide field of view and enough pixels to view both the driver and occupants, adds the company. It is also claimed to be the first RGB-IR sensor for in-cabin monitoring to feature integrated cybersecurity.

“The introduction supports the automotive industry’s transition to 5Mpixel operation, and only one camera is needed instead of two for simultaneous driver and occupant monitoring systems, reducing complexity, cost and space,” points out Andy Hanvey, director of automotive marketing at Omnivision. The company adds that the OX05B1S was developed with “strong ecosystem partner support, working with companies such as Seeing Machines, to enable a complete, seamless solution for automotive OEMs”.

The sensor is based on Omnivision’s Nyxel NIR technology, which uses novel silicon semiconductor architectures and processes to achieve the world’s best quantum efficiency (QE) at the 940nm NIR wavelength, claims the company. The OX05B1S has an NIR QE of 35 per cent, four times that of the previous generation of sensor. This enables the OX05B1S to detect and recognise objects that other image sensors would miss under extremely low lighting conditions, enhancing in-cabin camera capabilities for occupant and driver monitoring and security. In non-automotive systems, it can be used to enhance selfie images and videoconferencing.
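Quantum efficiency is the fraction of incident photons converted into signal electrons. Working backwards from the figures quoted here (an inference from this article, not a published specification), the previous generation would have had a 940nm QE of roughly 9 per cent:

```latex
% Back-of-envelope reading of the quoted figures (inferred, not a published spec)
\[
\mathrm{QE}(940\,\mathrm{nm}) \;=\; \frac{\text{signal electrons generated}}{\text{incident photons}},
\qquad
\mathrm{QE}_{\text{previous}} \;\approx\; \frac{\mathrm{QE}_{\text{OX05B1S}}}{4} \;=\; \frac{35\%}{4} \;\approx\; 8.75\%.
\]
```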

The sensor comes in Omnivision’s stacked a‑CSP package that is 50 per cent smaller than competitive products and allows for higher-performance image sensors in tighter camera spaces. It is also available in a reconstructed wafer option for designers who want to customise their own package.

The OX05B1S also has an optional always-on feature.

Samples are available now and the sensors will be in mass production in Q1 2023.

http://www.ovt.com

Infineon expands Aurix microcontroller family with electrification in mind

At CES 2022, Infineon introduced the TC4x series of Aurix microcontrollers. The devices target trends in e-mobility, advanced driver assistance systems (ADAS), automotive electrical/electronic (E/E) architectures and affordable AI applications.

The scalable family allows for a common software architecture, adds Infineon, for dependable electronics and software-based applications. The Aurix TC4x offers enhanced connectivity and advanced safety and security, says Infineon, while new software over-the-air (SOTA) features are intended to meet car manufacturers’ demands for fast and secure car-to-cloud connection, with updates in the field, as well as diagnosis and analysis when the vehicle is in use.

Infineon Technologies manufactures semiconductors “that make life easier, safer and greener”. 

http://www.infineon.com

High resolution radar sensor monitors vehicle’s blind spots

Texas Instruments has announced the AWR2944, a 77GHz radar sensor that uses a Doppler division multiple access (DDMA)-based signal processing method to help vehicle manufacturers implement systems that can detect vehicles further away than is possible today. It is supplied in a form factor the company claims is approximately 30 per cent smaller than today’s radar sensors.

The AWR2944 sensor integrates a fourth transmitter to provide 33 per cent higher resolution than existing radar sensors, enabling vehicles to detect obstacles more clearly and avoid collisions. The DDMA signal processing improves the ability to sense oncoming vehicles at distances up to 40 per cent farther away than is currently possible. 
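One way to read the resolution figure is through standard MIMO radar arithmetic: each additional transmitter multiplies the number of virtual receive channels formed by the array. The sketch below illustrates that reasoning only; it is not TI documentation, and the four-receiver channel count and the resolution rule of thumb are assumptions made for this example.

```python
# Illustrative sketch (not TI's implementation): reading the "fourth transmitter
# gives 33 per cent higher resolution" claim via MIMO virtual-array size.
# Channel counts are assumptions for illustration; consult the datasheet for actual figures.
import math

def virtual_channels(n_tx: int, n_rx: int) -> int:
    """A TDM/DDMA MIMO radar forms n_tx * n_rx virtual receive channels."""
    return n_tx * n_rx

def angular_resolution_deg(n_virtual: int) -> float:
    """Rule-of-thumb azimuth resolution for a half-wavelength-spaced
    uniform linear array: roughly 2/N radians."""
    return math.degrees(2.0 / n_virtual)

n_rx = 4                                   # assumed receive channels
old = virtual_channels(3, n_rx)            # three-transmitter device: 12 channels
new = virtual_channels(4, n_rx)            # device with a fourth TX: 16 channels

print(f"virtual channels: {old} -> {new} "
      f"({100 * (new - old) / old:.0f}% more)")           # 33% more
print(f"azimuth resolution: {angular_resolution_deg(old):.1f} deg -> "
      f"{angular_resolution_deg(new):.1f} deg")
```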

The high resolution sensor will enable advanced driver assistance systems (ADAS) to monitor blind spots more accurately and to navigate turns and corners efficiently and safely, avoiding collisions, says the company.

According to the Federal Highway Administration, more than half of fatal and injury crashes occur at or near intersections or junctions. Texas Instruments says that the AWR2944 radar sensor can help vehicle manufacturers meet new safety regulations, enabling vehicles to detect obstacles more clearly to avoid collisions.    

“Visibility around corners has historically been challenging for autonomous and semiautonomous vehicles,” concedes Curt Moore, manager for Jacinto processors at TI. “For automated parking and driving, being able to see farther with devices like the AWR2944 sensor – and then seamlessly process that data with our Jacinto processors – leads to improved awareness and safety,” he believes.

The AWR2944 and an AWR2944 evaluation module (AWR2944EVM) are available now. 

Texas Instruments (TI) designs, manufactures, tests and sells analogue and embedded processing chips for markets such as industrial, automotive, personal electronics, communications equipment and enterprise systems. 

http://www.TI.com

Radar scene emulator brings automakers closer to vehicle autonomy

Keysight Technologies has introduced the Radar Scene Emulator, which provides automotive OEMs with full-scene emulation and allows them to lab-test complex, real-world scenarios. This accelerates overall test speed and progress towards full vehicle autonomy.

Full-scene emulation in the lab is critical to developing the robust radar sensors and algorithms needed for advanced driver assistance systems (ADAS)/autonomous driving (AD) capabilities. Keysight’s full-scene emulator combines hundreds of miniature radio frequency (RF) front ends into a scalable emulation screen representing up to 512 objects and distances as close as 1.5 metres.

Using full scene rendering that emulates near and far targets across a wide continuous field of view (FOV), Keysight’s Radar Scene Emulator enables customers to rapidly test automotive radar sensors integrated in autonomous driving systems with highly complex multi-target scenes. 

Its patented technology shifts emulation away from target simulation for object detection towards traffic scene emulation. This approach allows automotive OEMs to see more, with a wider, continuous FOV, and supports both near and far targets. In this way, gaps in a radar’s vision are eliminated, while enabling improved training of algorithms to detect and differentiate multiple objects in dense, complex scenes. As a result, autonomous vehicle decisions can be made based on the complete picture, not just what the test equipment sees, explained Keysight.

Traditionally, radar sensors could only be tested against a limited number of targets, providing an incomplete view of driving scenarios and masking the complexity of the real world. Keysight’s radar scene emulator allows OEMs to emulate real-world driving scenes in the lab with variations in traffic density, speed, distance and total number of targets. Testing can be completed early, for common through to corner-case scenes, while minimising risk, added the company.

It also provides a deterministic real-world environment for lab testing complex scenes that can presently only be tested on the road. OEMs can “significantly accelerate ADAS/AD algorithm learning by testing scenarios earlier with complex repeatable high-density scenes, with objects stationary or in motion, varying environmental characteristics, while eliminating inefficiencies from manual or robotic automation,” said the company.

The emulator also supports point clouds (multiple reflections per object), which improve resolution for each object, for example when distinguishing between obstacles on the road, a capability required for Level 4 and 5 vehicle autonomy as designated by the Society of Automotive Engineers (SAE).
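As a purely illustrative sketch of the difference between single-target simulation and scene-level emulation, the fragment below models each object as a small cluster of reflection points, within the stated limits of 512 objects and a 1.5 metre minimum distance. The class and function names are invented for this example and are not Keysight’s API.

```python
# Hypothetical illustration of scene-level emulation with point clouds.
# Names and parameters are invented for this sketch; they are not Keysight's API.
from dataclasses import dataclass
import random

MAX_OBJECTS = 512        # stated capacity of the emulation screen
MIN_RANGE_M = 1.5        # closest emulated distance

@dataclass
class Reflection:
    range_m: float       # radial distance to this reflection point
    azimuth_deg: float   # angle within the radar's field of view
    rcs_dbsm: float      # radar cross-section of the point

def object_point_cloud(centre_m: float, azimuth_deg: float,
                       extent_m: float = 2.0, points: int = 5) -> list[Reflection]:
    """Model one object (e.g. a car) as several reflections rather than one,
    which is what lets the radar under test resolve shape and separate
    closely spaced obstacles."""
    return [
        Reflection(
            range_m=max(MIN_RANGE_M, centre_m + random.uniform(-extent_m, extent_m)),
            azimuth_deg=azimuth_deg + random.uniform(-1.0, 1.0),
            rcs_dbsm=random.uniform(-10.0, 10.0),
        )
        for _ in range(points)
    ]

# A tiny "scene": two vehicles and a pedestrian, well within the object budget.
objects = [
    object_point_cloud(20.0, -5.0),                        # car ahead in the next lane
    object_point_cloud(45.0, 3.0),                         # distant vehicle
    object_point_cloud(8.0, 10.0, extent_m=0.3, points=3), # pedestrian at the kerb
]
assert len(objects) <= MAX_OBJECTS
reflections = [r for cloud in objects for r in cloud]
print(f"{len(objects)} objects rendered as {len(reflections)} reflection points")
```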

Keysight will demonstrate the Radar Scene Emulator at CES 2022 (5 to 8 January) at Booth 4169, Las Vegas Convention Center, West Hall.

Keysight’s radar scene emulator is part of the company’s Autonomous Drive Emulation (ADE) platform, created through a multi-year collaboration between Keysight, IPG Automotive and Nordsys. The ADE platform exercises ADAS and AD software through the rendering of pre-defined use cases that apply time-synchronised inputs to the actual sensors and sub-systems in a car, such as the global navigation satellite system (GNSS), vehicle to everything (V2X), camera and radar. As an open platform, ADE enables automotive OEMs and their partners to focus on the development and testing of ADAS/AD systems and algorithms, including sensor fusion and decision-making algorithms. Automotive OEMs can integrate the platform with commercial 3D modelling, hardware-in-the-loop (HIL) systems and existing test and simulation environments.

http://www.keysight.com
