Solid-state lidar improves detection distance for vehicles

At this week’s CES 2019 (8 to 12 January) in Las Vegas, USA, RoboSense will demonstrate an upgraded version of its MEMS solid-state lidar, an automotive-grade version designed for mass-produced autonomous vehicles. The RS-LiDAR-M1 uses patented MEMS technology and offers environment perception at a level that fully supports Level 5 driverless automated driving. The company also claims a breakthrough in the measurement range limit of 905nm lidar, with a detection distance of 200m. As a result, says the company, the upgraded optical system and signal processing technology can now clearly recognise even small objects, such as railings and fences.

The first-generation MEMS solid-state lidar, the RS-LiDAR-M1Pre, was launched at last year’s CES and was deployed on the Cainiao unmanned logistics vehicle in May 2018. This year the company will be showcasing the potential of its MEMS optomechanical system design, with improvements in detection distance, resolution, field of view (FoV) and reliability.

The RS-LiDAR-M1 MEMS optomechanical lidar provides an increased horizontal field of view compared with the previous generation, reaching 120 degrees; only a few RS-LiDAR-M1 units are needed to cover a full 360-degree field of view. With only five RS-LiDAR-M1 units there is no blind zone around the car, and dual lidar sensing redundancy is provided in front of the car for Level 5, i.e. fully driverless, driving.
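The coverage claim can be sanity-checked with simple arithmetic: with a 120-degree horizontal FoV per unit, three sensors are the geometric minimum for 360-degree coverage, and the five-unit layout described above spends the extra two sensors on forward redundancy. A minimal sketch (the function name is illustrative, not a RoboSense API):

```python
import math

def min_sensors(horizontal_fov_deg: float, full_circle_deg: float = 360.0) -> int:
    """Minimum number of sensors needed to tile a full horizontal sweep,
    assuming ideal placement and no required overlap between units."""
    return math.ceil(full_circle_deg / horizontal_fov_deg)

# Each RS-LiDAR-M1 covers 120 degrees, so three units give bare
# 360-degree coverage; the article's five-unit layout adds dual
# lidar redundancy in front of the vehicle.
print(min_sensors(120))  # 3
```

In practice sensor placement needs overlap to avoid seams, which is one reason the described configuration uses more than the geometric minimum.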

The company believes that the battle between 1550nm and 905nm lidar is about cost and performance. When aiming for a low-cost 905nm lidar, it is necessary to overcome the technical difficulty of achieving sufficient measurement range. The RS-LiDAR-M1 achieves what RoboSense describes as a breakthrough in the measurement range limit of 905nm lidar, with a detection distance of 200m.

The final output point cloud demonstrates the RS-LiDAR-M1’s improved detection capability: via the upgraded optical system and signal processing technology, it can now clearly resolve even small objects, such as railings and fences.

http://www.robosense.ai


Developer toolbox supports STM32 microcontrollers at the edge

Driving artificial intelligence (AI) to edge and node embedded devices, STMicroelectronics has introduced the STM32 neural network developer toolbox.

AI uses trained artificial neural networks to classify data signals from motion and vibration sensors, environmental sensors, microphones and image sensors more quickly and efficiently than conventional handcrafted signal processing.

The STM32Cube.AI extension (X-Cube-AI) software tool generates optimised code to run neural networks on STM32 microcontrollers. It can be downloaded inside ST’s STM32CubeMX MCU configuration and software code-generation ecosystem.

Today, the tool supports the Caffe, Keras (with TensorFlow backend), Lasagne and ConvnetJS frameworks, and integrated development environments (IDEs) including those from Keil, IAR, and System Workbench.

The FP-AI-Sensing1 software function pack provides examples of code to support end-to-end motion (human-activity recognition) and audio (audio-scene classification) applications based on neural networks. This function pack leverages ST’s SensorTile reference board to capture and label the sensor data before the training process. The board can then run inferences of the optimised neural network.

The ST Bluetooth low energy (BLE) Sensor mobile app acts as the SensorTile’s remote control and display.

The toolbox consists of the STM32Cube.AI mapping tool and application software examples running on the small form factor, battery-powered SensorTile hardware; together with the partner program and dedicated community support, it offers a fast and easy path to neural network implementation on STM32 devices.

The extension is supplied with ready-to-use software function packs containing code examples for human activity recognition and audio scene classification that are immediately usable with ST's reference sensor board and mobile app.

Developer support is provided through qualified partners in the ST Partner Program and dedicated AI/machine learning (ML) STM32 community, assures the company.

ST explains that STM32Cube.AI can be used by developers to convert pre-trained neural networks into C-code that calls functions in optimised libraries that can run on STM32 microcontrollers.
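The arithmetic the generated code performs on the microcontroller is ordinary neural-network inference: each layer is a weighted sum plus bias, passed through an activation. The sketch below illustrates that computation conceptually in plain Python; it is not the X-Cube-AI generated C API, and all names and values are illustrative:

```python
import math

def dense(x, weights, bias, activation=None):
    """One fully connected layer: y = act(W.x + b).
    This is the arithmetic an optimised MCU kernel carries out;
    the optimised C libraries perform the same maths far more efficiently."""
    y = [sum(w * xi for w, xi in zip(row, x)) + b
         for row, b in zip(weights, bias)]
    if activation == "relu":
        y = [max(0.0, v) for v in y]
    elif activation == "softmax":
        m = max(y)                      # subtract max for numeric stability
        e = [math.exp(v - m) for v in y]
        s = sum(e)
        y = [v / s for v in e]
    return y

# Toy two-layer classifier on a 3-value sensor reading (made-up weights).
x = [0.5, -1.0, 0.25]
h = dense(x, [[0.2, 0.4, -0.1], [0.7, -0.3, 0.5]], [0.1, 0.0], "relu")
probs = dense(h, [[1.0, -1.0], [-1.0, 1.0]], [0.0, 0.0], "softmax")
print(probs)  # class probabilities from the softmax output
```

STM32Cube.AI’s value is that the developer never writes this inner loop by hand: the tool maps a pre-trained network onto equivalent optimised C kernels sized for the target STM32 device.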

ST will demonstrate applications developed using STM32Cube.AI running on STM32 microcontrollers in a private suite at CES, the Consumer Electronics Show, in Las Vegas (8 to 12 January).

http://www.st.com


Mali image signal processors enhance image quality for drones and security

Arm has released the Mali-C52 and Mali-C32 image signal processors (ISPs). They are claimed to provide class-leading image quality, a complete software package, and a full set of calibration and tuning tools, bringing higher image quality to a range of everyday devices including drones, smart home assistants, security and internet protocol (IP) cameras.

Based on Arm Iridix technology, the ISPs apply over 25 processing steps to each pixel, three of which – high-dynamic range (HDR), noise reduction and colour management – deliver key differentiation in terms of image output quality, says Arm.

By incorporating Arm’s market-leading Iridix technology and algorithms for noise and colour management, the Mali-C52 and Mali-C32 ISPs efficiently deliver all three at high resolution and in real time (e.g. 4K resolution at 60 frames per second).

The company estimates that demand for IoT devices with embedded vision capabilities is among the highest across connected devices. For example, the security (surveillance) and IP camera market is expected to see an annual growth rate of 20 per cent through to 2021, reaching over 500 million shipped units. Personal robots are growing at a year-on-year rate of 75 per cent and are expected to reach two million units shipped in 2021. Similarly, the smart home market is growing at 14 per cent year on year, to reach 88 million units in 2021, while other devices such as drones, augmented reality (AR)/mixed reality (MR) equipment and action cameras are being widely adopted.

The Arm Mali-C52 and Mali-C32 ISPs both have Arm’s dynamic range management and tone mapping technology, to enable viewers to see enhanced shadows, without touching or changing the highlights of the image.

The Mali-C52 can be configured to optimise for image quality or silicon area. Arm’s silicon partners can use the same IP and software across a range of products and use cases. The Mali-C32 is optimised specifically for area in lower-power, cost-sensitive embedded vision devices such as entry-level access control or hobby drones.

Both ISPs include the hardware IP along with ISP software drivers, including the 3A libraries (auto-exposure, auto-white balance, auto-focus) and the calibration and tuning tools. They feature Iridix technology, designed to deliver a precise model of human retina contrast adaptation, enabling cameras to see like the human eye. They can process 600 megapixels per second, which is essentially professional photography quality (e.g. DSLR) at premium smartphone-level frame rates, says Arm.
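The 600 megapixels-per-second figure is consistent with the 4K-at-60fps claim made earlier: a quick back-of-envelope check (the function name is illustrative, not an Arm API) shows 4K UHD at 60 frames per second needs roughly 498 MP/s, comfortably inside that budget.

```python
def pixel_rate_mps(width: int, height: int, fps: int) -> float:
    """Pixel throughput in megapixels per second for a given video mode."""
    return width * height * fps / 1e6

# 4K UHD (3840 x 2160) at 60 frames per second:
rate = pixel_rate_mps(3840, 2160, 60)
print(round(rate, 1))  # ~497.7 MP/s, within the stated 600 MP/s capability
```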

A software package for controlling the ISP, sensor, auto-white balance, auto-focus and auto-exposure is included. Both bare-metal and Linux (Video4Linux framework, V4L2) software are provided.

http://www.arm.com


About Smart Cities

This news story is brought to you by smartcitieselectronics.com, the specialist site dedicated to delivering information about what’s new in the Smart City Electronics industry, with daily news updates, new products and industry news. To stay up to date, register to receive our weekly newsletters and keep yourself informed on the latest technology news and new products from around the globe.