Integrated IP and software streamline development of contextually-aware IoT devices

At CES this week, Ceva will demonstrate its SenslinQ integrated hardware IP and software platform, designed to streamline the development of contextually-aware IoT devices.

By aggregating sensor fusion, sound and connectivity technologies, the platform collects, processes and links data from multiple sensors, enabling intelligent devices to understand their surroundings, explains the company.

Contextual awareness adds value and enhances the user experience of smartphones, laptops, augmented reality/virtual reality (AR/VR) headsets, robots, hearables and wearables. The SenslinQ platform centralises the workloads that require an intimate understanding of the physical behaviours and anomalies of sensors. It collects data from multiple sensors within a device, including microphones, radars, inertial measurement units (IMUs), environmental sensors and time-of-flight (ToF) sensors, and conducts front-end signal processing, such as noise suppression and filtering, on this data. It then applies algorithms to create “context enablers” such as activity classification, voice and sound detection, and presence and proximity detection. These context enablers can be fused on the device or sent wirelessly, via Bluetooth, Wi-Fi or NB-IoT, to a local edge computer or the cloud, to determine the device’s context and adapt it to its environment.
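As a rough illustration of that collect, filter, classify flow, the Python sketch below stubs out a single-sensor pipeline. Every function name, threshold and data value here is hypothetical, chosen for illustration; none of it is Ceva's API.

```python
from statistics import mean

# Hypothetical sketch of a SenslinQ-style flow: raw sensor samples ->
# front-end filtering -> a "context enabler" label that could be fused
# on-device or sent to an edge computer or the cloud.

def moving_average(samples, window=4):
    """Simple front-end noise suppression: smooth raw samples."""
    return [mean(samples[max(0, i - window + 1):i + 1])
            for i in range(len(samples))]

def classify_activity(imu_magnitudes):
    """Toy context enabler: label motion from smoothed IMU magnitude."""
    level = mean(imu_magnitudes)
    if level < 0.1:
        return "stationary"
    return "walking" if level < 1.0 else "running"

raw_imu = [0.02, 0.05, 1.3, 1.1, 0.9, 1.2, 1.4, 1.0]  # made-up samples
context = classify_activity(moving_average(raw_imu))
print(context)  # -> "walking"
```

A real pipeline would run many such enablers in parallel (voice detection, proximity, and so on) before fusion, but the shape of the data flow is the same.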

The customisable hardware reference design comprises an Arm or RISC-V microcontroller, CEVA-BX DSPs and a wireless connectivity island, such as the RivieraWaves Bluetooth, Wi-Fi or Dragonfly NB-IoT platforms, or other connectivity standards provided by the customer or third parties. These three components are connected using standard system interfaces.

The SenslinQ software comprises a portfolio of ready-to-use software libraries from CEVA and its ecosystem partners. Libraries include the Hillcrest Labs MotionEngine software packages for sensor fusion and activity classification in mobiles, wearables and robots, ClearVox front-end voice processing, WhisPro speech recognition, and DSP and artificial intelligence (AI) libraries. There are also third-party software components for active noise cancellation (ANC), sound sensing and 3D audio.

The accompanying SenslinQ framework provides Linux-based hardware abstraction layer (HAL) reference code and application programming interfaces (APIs) for data and control exchange between the multiple processors and sensors.

https://www.ceva-dsp.com


Wireless VR/AR haptic glove allows gamers to “feel” digital objects

At CES next week, BeBop Sensors will announce the Forte Data Glove, claimed to be the first virtual reality (VR) haptic glove designed and integrated exclusively for Oculus Quest, Oculus Link, Oculus Rift S, Microsoft Windows Mixed Reality, HTC Vive Cosmos, HTC Vive Pro, HTC Focus Plus and Varjo VR headsets. It is also the first haptic glove for the HTC Cosmos and for Microsoft Windows Mixed Reality headsets, including those from HP, Lenovo, Acer, Dell and Samsung, through integration with the HP Reverb. In addition, it is believed to be the first haptic VR glove to fully support Oculus Quest Link, which allows Oculus Quest to leverage the graphics capabilities and processing power of a VR computer for higher-end VR interaction, says BeBop Sensors.

Described as the first affordable, all-day wireless VR/AR (augmented reality) data glove, the VR headset and data glove fit in a small bag for portability and require almost no set-up, bringing VR enterprise training, maintenance and gaming to new areas. The Forte Data Glove ushers in the next generation of VR, says BeBop Sensors, by allowing people to do practical things in the virtual world with natural hand interactions and to feel different textures and surfaces.

A nine-degree-of-freedom inertial measurement unit (IMU) is integrated to provide low-drift, reliable, pre-blended accelerometer and gyroscope sensor data. Six haptic actuators are located on the four fingertips, the thumb and the palm.
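"Pre-blended" accelerometer and gyroscope data typically comes from a sensor-fusion filter inside or alongside the IMU. The one-axis complementary filter below is a minimal, generic sketch of that blending idea; it is not BeBop Sensors' actual algorithm, and all values are illustrative.

```python
# One-axis complementary filter: blend fast-but-drifting gyro
# integration with noisy-but-stable accelerometer tilt. This is the
# generic technique behind "pre-blended", low-drift orientation data.

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """alpha weights the gyro path; (1 - alpha) pulls toward the
    accelerometer's absolute tilt estimate, correcting drift."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

angle = 0.0
dt = 1 / 160  # matching the glove's 160Hz sensor rate
for _ in range(160):  # one second of samples
    gyro_rate = 0.0      # deg/s: gyro reports no rotation
    accel_angle = 30.0   # deg: accelerometer says we are tilted 30 degrees
    angle = complementary_filter(angle, gyro_rate, accel_angle, dt)
print(round(angle, 1))  # converges toward the accelerometer's 30 degrees
```

The design choice is the usual trade-off: a higher alpha trusts the gyro more (smoother, faster response), a lower alpha corrects drift more aggressively.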

Up to 16 haptic sound files can reside on the glove and new files can be rapidly uploaded over Bluetooth or USB.

The sensors are fast, operating at 160Hz with near-instantaneous (sub-six-millisecond) response. By providing touch feedback, the user experiences more realistic, safer training for business and enhanced VR gaming experiences, says the company.
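A quick sanity check on those two figures: at 160Hz a new sample arrives every 1/160 of a second, so a sub-six-millisecond response fits within roughly one sample period.

```python
# 160Hz sampling -> sample period in milliseconds.
sample_period_ms = 1000 / 160
print(sample_period_ms)        # 6.25 ms between samples
print(sample_period_ms > 6.0)  # True: a <6ms response beats the sample interval
```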

Hand tracking ties natively into each system’s translation system, with top-of-the-line finger tracking supplied by BeBop Sensors’ fabric sensors. Haptic sensations in VR/AR include hitting buttons, turning knobs and opening doors.

The universal open-palm design fits most people, and the glove is hygienic, breathable and easy to clean, with waterproof sensors.

The glove targets enterprise, as well as location-based entertainment (LBE) gaming markets, including VR enterprise training, VR medical trials/rehabilitation, robotics and drone control, VR CAD design and review, and gaming.

BeBop Sensors will be at CES in Las Vegas (7 to 10 January 2020), Booth 22032, LVCC South Hall.

http://www.bebopsensors.com


Image sensor evaluation kits support VR and smart buildings

Evaluation kits from ams support eye tracking, presence detection and object recognition in virtual reality headsets, smart lighting, and home and building automation products.

The Raspberry Pi- and Arduino-based evaluation kits are built around the ams NanEyeC miniature image sensor.

The NanEyeC camera is an image sensor supplied as a 1.0 x 1.0mm surface-mount module. It produces 100kpixel resolution at up to 58 frames per second and can be used for video applications where the camera needs to fit into an extremely small space, such as eye tracking in virtual reality (VR) headsets. It can also be applied to user presence detection, supporting automatic power on/off controls in home and building automation (HABA) applications such as air conditioning, home robotics, appliances and smart lighting.

The NanoVision demo kit for the NanEyeC is based on an Arduino development platform. It includes all drivers needed to interface the sensor’s single-ended interface mode (SEIM) output to an Arm Cortex-M7 microcontroller, and supports image processing functions such as colour reconstruction and white-point balancing.
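To make "white-point balancing" concrete, the sketch below applies a grey-world balance to a tiny RGB image: each channel is scaled so the channel means become equal, neutralising a colour cast. This is a generic textbook method written in plain Python for illustration, not ams's firmware.

```python
# Grey-world white-point balance on a toy image of RGB tuples.

def grey_world_balance(pixels):
    """Scale each channel so all three channel means become equal."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    grey = sum(means) / 3
    gains = [grey / m for m in means]
    return [tuple(p[c] * gains[c] for c in range(3)) for p in pixels]

# A flat grey patch with a warm (red) cast...
image = [(200.0, 100.0, 100.0)] * 4
balanced = grey_world_balance(image)
print(balanced[0])  # ...comes out neutral: equal R, G and B
```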

The second board is the NanoBerry, designed for more demanding operations such as eye tracking or stereo vision systems. This evaluation kit adds a NanEyeC image sensor board to the Raspberry Pi port and includes firmware to interface to the Raspberry Pi host processor. Engineers can use the Arm Cortex-A53-based processor to perform demanding operations such as object detection, object tracking and the computer vision functions provided by the OpenCV library.
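The simplest form of the detection work such a host might run is frame differencing for presence detection. The dependency-free sketch below illustrates the idea on flattened greyscale frames; a real NanoBerry application would instead use OpenCV, and every name and threshold here is hypothetical.

```python
# Frame-difference presence detection: flag presence when enough
# pixels change between consecutive frames. Illustrative only.

def presence_detected(prev_frame, curr_frame, pixel_thresh=20, count_thresh=3):
    """Count pixels whose brightness changed by more than pixel_thresh;
    report presence when at least count_thresh pixels changed."""
    changed = sum(1 for a, b in zip(prev_frame, curr_frame)
                  if abs(a - b) > pixel_thresh)
    return changed >= count_thresh

static = [10] * 16             # flattened 4x4 greyscale frame, empty room
moved = [10] * 12 + [200] * 4  # someone enters one corner
print(presence_detected(static, static))  # False: nothing changed
print(presence_detected(static, moved))   # True: four pixels changed
```

This is the logic behind the automatic power on/off use case: no change between frames means the lights or appliance can power down.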

The NanoBerry kit is suitable for high frame-rate and low-latency applications such as eye tracking. When integrated into the NanEye PC viewer, it enables full evaluation of the NanEyeC with access to all registers and raw image data.

The NanoVision board is available now to customers on request and the NanoBerry kit will be demonstrated at CES 2020 (the Venetian Tower, Suite 30-236) and will be available to customers in Q1 2020.

https://ams.com


Fingerprint sensor authenticates via smartphone’s display

Believed to be the largest-surface fingerprint-on-display (FoD) module designed with printed organic electronics, a four-finger authentication module from Isorg will be demonstrated at CES 2020.

It is expected to bring higher data security to mobile devices, says the company, which specialises in organic photodetectors (OPDs) and large-area image sensors. The full-screen FoD module supports up to four fingers simultaneously touching a smartphone display.

Isorg’s FoD module responds to demands from OEMs and end-users for higher security technologies that can support large surface area fingerprint sensing. Solutions currently available are restricted to single finger identification within a surface area of less than 10 x 10mm. Isorg’s FoD module supports one- to four-finger authentication across the entire dimensions of the six-inch (152mm) smartphone display (or even larger). The module is very thin, less than 0.3mm thick, so integration into smartphones is made easy for OEMs, added the company.

In addition to the image sensor, the module includes optimised thin-film optical filters developed in-house and driving electronics, as well as software from Isorg’s industrial partners covering the interface with the smartphone OS and the matching algorithm. The scalable FoD is compatible with foldable displays, added the company.

Isorg’s four-finger authentication capability paves the way for secure smartphone mobile banking and payments, personal health monitoring and remote home control applications, including enhanced data protection for wearables, such as access control. By enabling more identification data to be captured from multiple fingers, it significantly reduces the risk of identity theft via fake fingers, while letting users place their fingers anywhere on the display for ease of use.
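A back-of-envelope calculation shows why requiring more fingers reduces spoofing risk: if a single-finger spoof succeeds with probability p, and all four fingers must match (treated as independent here for simplicity), the attack must succeed four times. The rate below is an assumed illustrative figure, not a measured Isorg specification.

```python
# Assumed single-finger false-accept probability (illustrative only).
p_single = 1e-3

# All four independent matches must succeed for the spoof to pass.
p_four = p_single ** 4
print(p_four)  # about 1e-12: nine orders of magnitude below p_single
```

Real fingers are not fully independent samples, so the true gain is smaller, but the direction of the effect is what matters.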

Smartphone OEMs will be able to sample Isorg’s Fingerprint-on-Display module in spring 2020. In parallel, Isorg is also extending development of its FoD for application in the biometrics security market, aimed at meeting growing security needs in access control, border control and other identity access management areas, including mobile ID applications.

The live demo will take place at Eureka Park Tech West, Sands Expo, Level 1, Hall G, booth #50463, at CES Las Vegas (7 to 10 January 2020).

http://www.isorg.fr


About Smart Cities

This news story is brought to you by smartcitieselectronics.com, the specialist site dedicated to delivering information about what’s new in the Smart City Electronics industry, with daily news updates, new products and industry news. To stay up-to-date, register to receive our weekly newsletters and keep yourself informed on the latest technology news and new products from around the globe. Simply click this link to register here: Smart Cities Registration