LoRaWAN geolocation module is compact for asset tracking

Abeeway, a company specialising in low power indoor and outdoor geolocation, has partnered with Murata to develop the LBEU5ZZ1WL geolocation module. The low power wide area network (LPWAN) module is designed to maximise battery life and is claimed to be sensitive and flexible enough to build customised, inexpensive IoT tracking devices. Applications, both indoors and outdoors, include asset recovery, traceability, inventory management and theft prevention.

The modules integrate LoRaWAN (long range wide area network) connectivity; the low power technology has become the de facto standard for connectivity over public and private networks. In addition to a dual-core, multi-protocol wireless microcontroller, the modules contain a high-performance multi-constellation GNSS receiver (GPS, GLONASS, BeiDou and Galileo) and support a patented low power (LP) localisation mode.

On-board Bluetooth Low Energy (BLE) 5.2 connectivity (Bluetooth SIG certification is underway) allows for short range communication with smartphone apps and other devices, for applications such as contact tracing or monitoring BLE-tagged equipment. The multi-technology modules ship with a fully power-optimised FreeRTOS, drivers and a LoRaWAN stack.
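LoRaWAN uplinks are limited to a few dozen bytes at the lowest data rates, so trackers built on modules like this typically compress a GNSS fix into a compact binary payload rather than sending text. The sketch below illustrates the idea with a hypothetical 9-byte layout; the field structure is an assumption for illustration, not Abeeway's actual payload format.

```python
import struct

def encode_fix(lat_deg: float, lon_deg: float, battery_pct: int) -> bytes:
    """Pack a GNSS fix into 9 bytes: latitude and longitude as signed
    32-bit integers scaled by 1e7 (~1cm resolution), plus one battery
    byte. This layout is a hypothetical example, not a vendor format."""
    return struct.pack(">iiB",
                       int(round(lat_deg * 1e7)),
                       int(round(lon_deg * 1e7)),
                       battery_pct)

def decode_fix(payload: bytes):
    """Reverse of encode_fix: recover degrees and battery percentage."""
    lat, lon, batt = struct.unpack(">iiB", payload)
    return lat / 1e7, lon / 1e7, batt

payload = encode_fix(48.8566, 2.3522, 87)  # a fix in Paris
assert len(payload) == 9                   # fits easily in one uplink
```

Scaling to fixed-point integers keeps the payload well under the LoRaWAN maximum at any data rate while preserving metre-level accuracy.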

“The key requirement is to minimise time to market for device manufacturers embracing LoRaWAN for industrial or consumer applications. The new geolocation module . . .  brings a market-ready reference for anyone wanting to develop a customised tracking device,” said Olivier Hersent, CEO at Actility, Abeeway’s parent company.

The module pairs Semtech’s latest transceiver with an ST STM32 wireless microcontroller, providing security features and device management for LoRaWAN deployment, he added.

According to Akira Sasaki, general manager – connectivity module marketing, Murata: “This exciting collaboration greatly simplifies the development of new LoRaWAN tracking devices and offers customers faster time to market by minimising BoM [bill of materials] size, TCO [total cost of ownership] and complexity”.

The modules are designed and manufactured by Murata Manufacturing, which encapsulated an array of enabling devices within one system in package (SiP). The compact module helps engineers save valuable board space in space-restricted tracking applications, and helps them reduce certification risk and budget, as well as test time and cost for mass production, said Murata.

The Abeeway-Murata Geolocation Module with a fully featured evaluation kit will be available in production quantities after September 2022. The modules will be demonstrated at Embedded World (21 to 23 June) in Nuremberg, Germany and LoRaWAN World Expo in Paris, France (6 to 7 July).

http://www.murata.com


ToF sensor advances use less energy or double the range

STMicroelectronics has introduced metasurface lens technology in its second-generation FlightSense multi-zone direct time of flight (ToF) sensor. The VL53L8 uses less energy and can range twice as far as existing products, said the company.

The FlightSense ToF ranging sensor, for smartphone camera management and augmented reality / virtual reality (AR/VR), offers up to four metres of range in all zones indoors, while halving power consumption compared with the previous-generation device operating in common conditions.

The ToF sensor’s metasurface lens technology and power-efficient architecture reduces battery loading, extends camera autofocus ranging and enhances scene-understanding features, explained Eric Aussedat, executive vice president and general manager of the imaging sub-group within ST’s Analog, MEMS and Sensors group.

What is claimed to be the world’s first optical metasurface technology was developed in partnership with Metalenz. It enables optical systems to collect more light, provide multiple functions in a single layer and deliver new forms of sensing in smartphones and other devices, in the form factor of a single, compact package, added ST.

This second-generation ranging sensor incorporates an efficient optically diffractive metasurface lens technology, manufactured at the company’s 300mm fab in Crolles, France. The VCSEL driver is three times more capable than the previous generation and is paired with an efficient VCSEL, enabling the VL53L8 to deliver twice the ranging performance of the earlier VL53L5, or to reduce power consumption by 50 per cent when operating in comparable conditions. It delivers this performance while maintaining the same field of view and discrete output-ranging zones (4×4 at 60 frames per second or 8×8 at 15 frames per second).
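A quick check of the two quoted zone configurations shows they trade spatial resolution against frame rate while keeping the total measurement throughput constant; the arithmetic below verifies this (the helper function is illustrative, not part of any ST driver API).

```python
def ranges_per_second(zones_x: int, zones_y: int, fps: int) -> int:
    """Total discrete range measurements produced per second
    for a given zone grid and frame rate."""
    return zones_x * zones_y * fps

# Both quoted modes yield the same total throughput:
assert ranges_per_second(4, 4, 60) == 960  # 4x4 zones at 60 fps
assert ranges_per_second(8, 8, 15) == 960  # 8x8 zones at 15 fps
```

This suggests the sensor's measurement budget is fixed, with the host choosing between finer spatial sampling and faster temporal updates.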

The module embeds a high output 940nm VCSEL light source, an SoC sensor with an embedded VCSEL driver, the receiving array of SPADs (single photon avalanche diodes) and a low power, 32-bit microcontroller core. 

The VL53L8 adopts a metasurface lens technology in both the transmit and receive apertures and delivers 16 or 64 discrete ranging zones with stable and accurate ranging, said ST.

The sensor is housed in a single reflowable component that offers 1.2V and 1.8V I/O compatibility for ease of system integration. It is also claimed to significantly reduce the host processor loading over the demands of the first-generation sensor.

Like all other FlightSense ToF proximity sensors, the VL53L8 retains IEC 60825-1 Class 1 certification and is fully eye-safe for consumer products; it also includes an advanced lens-detach detection system.

The VL53L8 can be used in smartphones and tablets in both user-facing applications, such as object tracking and gesture recognition, and world-facing applications, such as laser autofocus, camera selection, touch-to-focus and flash dimming. It is also suitable for accessories in personal electronics equipment (e.g. smart speakers, AR/VR and mixed reality (MR) headsets). The VL53L8 will deliver even greater benefit to these features in low-light conditions, said ST. It can also support indoor/outdoor detection and smart-focus bracketing, as well as consumer lidar, where depth mapping is required.

The VL53L8 is entering mass production now for select customers. 

http://www.st.com


3D image sensor is ISO26262-compliant 

3D image sensors can differentiate a vehicle’s dashboard and monitoring systems, said Infineon, as it introduced its ISO26262-compliant, high resolution 3D image sensor, the IRS2877A. Developed in collaboration with 3D time of flight system specialist pmdtechnologies, it is the second generation of the REAL3 automotive image sensor.

“We are now offering high resolution with a tiny image circle to the automotive world,” said Christian Herzum, vice president of 3D Sensing at Infineon. “This enables cars with functions from the consumer world, while maintaining automotive standards and even improving passive safety,” he added. The sensor can be used to integrate secure facial authentication for any service that requires it, such as payment, battery charging or accessing private data.

In addition, the same camera meets all requirements for driver monitoring to detect driver distraction and fatigue. This enables a driver monitoring system with secure 3D facial recognition using only one time of flight (ToF) camera.

“From the start we were focussed on improving the robustness of the underlying pmd-based ToF technology against external influences such as sunlight or other disturbing light sources. For this reason, the new imager shows excellent and cutting edge performance even under harsh conditions”, said Bernd Buxbaum, CEO pmdtechnologies.

The REAL3 sensor comes in a 9.0 x 9.0mm plastic BGA package and offers a VGA system resolution of 640 x 480 pixels with an image circle of 4.0mm. This allows lens sizes similar to those on smartphones to be used for automotive applications.
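From the quoted VGA resolution and 4.0mm image circle, the implied pixel pitch can be estimated with back-of-envelope arithmetic, assuming the image circle spans the sensor diagonal. The actual pitch is not stated in the release; this is only an illustrative estimate.

```python
import math

def pixel_pitch_um(width_px: int, height_px: int, image_circle_mm: float) -> float:
    """Estimate pixel pitch in micrometres, assuming the image circle
    diameter spans the active array's diagonal (an assumption)."""
    diagonal_px = math.hypot(width_px, height_px)  # 640x480 -> 800 px
    return image_circle_mm / diagonal_px * 1000.0  # mm -> micrometres

# 640 x 480 pixels across a 4.0mm image circle implies a ~5 um pitch
print(pixel_pitch_um(640, 480, 4.0))  # -> 5.0
```

A ~5µm pitch is consistent with the claim that smartphone-class lenses can be used, since it sits in the range typical of mobile depth sensors.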

The high resolution of the REAL3 sensor also makes it suitable for camera applications with a wide field of view, such as complete front-row occupant monitoring systems. The 3D body models enable accurate estimates of occupant size and weight, as well as precise passenger and seat position data. These figures are required for intelligent airbag deployment and restraint systems. 

The latest REAL3 sensor is qualified to AEC-Q100 Grade 2 and is the first of its kind to be developed according to the ISO26262 (ASIL-B) standard, claimed Infineon.

3D data also allows for comfort features such as gesture control or intuitive interior lighting that follows passengers’ movements. The imager may also be used in environmental perception scenarios as a flash-lidar. 

In addition to automotive designs, it may also find applications in mobile robotics, drones and other autonomous use cases for operator safety.

Development samples of the new 3D image sensor chip (IRS2877A) are available now and series production has begun. The ISO26262 compliant variant IRS2877AS will be available for series production by the end of 2022. 

http://www.infineon.com


Sensor tracks motion and orientation for IoT and metaverse

Sensor fusion for motion tracking, heading and orientation detection is provided by the FSP201, a low power sensor hub microcontroller released by Ceva. It is designed for use in consumer robotics and smart devices which use motion tracking and orientation, such as 3D audio headsets or XR (extended reality, encompassing augmented, virtual and mixed reality) glasses. It can also be used in six-axis motion use cases across the IoT and the metaverse.

The FSP201 is based on Ceva’s proprietary MotionEngine sensor processing software and a low power, 32bit Arm Cortex-M23 microcontroller. Manufacturers can choose from a pre-qualified list of external six-axis IMU (inertial measurement unit) sensors (i.e. accelerometer plus gyroscope) from different sensor suppliers, ensuring supply chain flexibility. The sensor options can be used to provide, for example, correction smoothing and auto-centring. The former corrects orientation drift slowly and transparently, allowing head and body tracking for an immersive XR or 3D audio experience; the latter dynamically recentres the soundstage in 3D audio applications based on the user’s gaze, to maintain immersion in dynamic conditions and eliminate drift. Other options include sensors with tilt-independent heading, which maintains a correct heading output even when a robot’s driving surface is uneven, enabling rapid adjustment to obstacles or changes in floor type, and sensors with inclination detection, which provide three degrees of freedom robot orientation to detect surface issues that could cause the robot to become stuck or damaged.
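The "correction smoothing" behaviour described above can be pictured as a rate-limited slew of the reported heading toward a corrected reference, so drift is removed without the user ever perceiving a jump. The sketch below is a simplified illustration of that idea, not Ceva's actual MotionEngine algorithm.

```python
def smooth_correction(heading: float, reference: float, max_step: float) -> float:
    """Move heading toward reference by at most max_step degrees per
    update, wrapping so the correction takes the short way around the
    circle. A simplified illustration of drift-correction smoothing."""
    error = (reference - heading + 180.0) % 360.0 - 180.0  # shortest signed error
    step = max(-max_step, min(max_step, error))            # rate-limit the fix
    return (heading + step) % 360.0

# A 10-degree drift is removed invisibly at 0.5 degrees per update:
heading = 10.0
for _ in range(20):
    heading = smooth_correction(heading, 0.0, 0.5)
# after 20 updates the drift is fully corrected (heading == 0.0)
```

Keeping each per-update correction tiny relative to natural head motion is what makes the adjustment imperceptible in 3D audio or XR use.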

Proprietary algorithms are able to monitor changes in sensor performance and temperature during live operation for dynamic calibration. Several low-cost MEMS sensors from leading suppliers are pre-qualified and have drivers pre-integrated to accelerate development and ensure supply chain flexibility, added Ceva.

The FSP201 is designed to be versatile, using I2C and UART interfaces for chip connectivity and can be placed directly on the target product’s main circuit board or designed into a separate module. It is also code-compatible with the BNO08X series of nine-axis sensor system in package (SiP) products. 

The FSP201 and evaluation tools are available for immediate sampling via designated distributors. 

https://www.ceva-dsp.com


About Smart Cities

This news story is brought to you by smartcitieselectronics.com, the specialist site dedicated to delivering information about what’s new in the Smart City Electronics industry, with daily news updates, new products and industry news. To stay up-to-date, register to receive our weekly newsletters and keep yourself informed on the latest technology news and new products from around the globe. Simply click this link to register here: Smart Cities Registration