Sensor fusion software places control in the palm of the consumer’s hand

Software from Ceva enables motion-based gesture control, 3D motion tracking and pointing for wireless, handheld controllers for consumer devices.

The Hillcrest Labs MotionEngine Air algorithms and software stack will be demonstrated at CES in Las Vegas (7 to 10 January 2020). The production-ready software delivers low power, motion-based gesture control, 3D motion tracking and pointing for consumer handheld devices in high volume markets, such as smartphone and PC stylus pens, smart TV and over-the-top (OTT) remote controls, game controllers, augmented reality and virtual reality (AR and VR) controllers, and PC peripherals.

Ceva reports that advances in low power inertial sensors and Bluetooth Low Energy, combined with the MotionEngine Air sensor fusion software, yield sub-mA power draw for the whole system, delivering precise, interactive and intuitive motion control for always-on, always-aware devices.

MotionEngine Air software is optimised for use in embedded devices. It provides precise pointing, gesture and motion control algorithms; a cursor for point-and-click control of an onscreen user interface; gestures such as flick, shake, tap, tilt, rotate and circle for intuitive user interface control; six-axis sensor fusion enabling 3D motion tracking for gaming and VR; motion events (pick-up, flip and stability detection) to enable power savings; and patented orientation compensation and adaptive tremor removal.
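Ceva's proprietary algorithms are not public, but the six-axis sensor fusion mentioned above is commonly built on a complementary filter, which blends short-term gyroscope integration with the accelerometer's long-term gravity reference. The sketch below illustrates the principle for a single axis; the function name, the blend factor `alpha` and the sample values are illustrative assumptions, not MotionEngine Air's implementation.

```python
import math

def complementary_filter(pitch, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """One update step of a complementary filter: integrate the gyro
    rate for short-term accuracy, then blend in the accelerometer's
    gravity-based pitch estimate to cancel long-term gyro drift."""
    gyro_pitch = pitch + gyro_rate * dt         # integrate angular rate
    accel_pitch = math.atan2(accel_y, accel_z)  # pitch from gravity direction
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

# With the device at rest (gravity on +z, no rotation), an initial
# pitch error decays toward zero instead of drifting.
pitch = 0.5
for _ in range(500):
    pitch = complementary_filter(pitch, gyro_rate=0.0,
                                 accel_y=0.0, accel_z=9.81, dt=0.01)
```

The same blend, applied across all three axes with quaternions, yields the 3D orientation tracking used for gaming and VR controllers.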

The software stack has host drivers and sensor management to streamline software integration with popular operating systems including Android, Windows, MacOS and Linux.

The MotionEngine Air software is flexible, with a low power and small memory footprint, and can run on a variety of processors, including Arm Cortex-M, RISC-V and the CEVA-BX and CEVA-TeakLite DSP families. It can be delivered in multiple configurations, including one that requires an accelerometer plus gyroscope (IMU) and a gesture- and motion-event-based configuration that requires only an accelerometer.
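The accelerometer-only configuration is feasible because many gestures and motion events can be recognised from acceleration magnitude alone. The sketch below shows a generic shake detector of the kind such a configuration might use; the function, thresholds and sample windows are illustrative assumptions, not Ceva's algorithm.

```python
def detect_shake(samples, threshold=15.0, min_crossings=4):
    """Accelerometer-only shake detector: flag a shake when enough
    samples in the window deviate strongly from the window mean.
    No gyroscope is required."""
    mean = sum(samples) / len(samples)
    crossings = sum(1 for s in samples if abs(s - mean) > threshold)
    return crossings >= min_crossings

# Acceleration magnitudes in m/s^2 over a short window.
still  = [9.8, 9.9, 9.7, 9.8, 9.9, 9.8, 9.7, 9.8]           # device at rest
shaken = [9.8, 30.0, -12.0, 28.0, -10.0, 31.0, 9.8, -15.0]  # vigorous shake
```

Events such as pick-up or stability detection follow the same pattern with different features, which is why they fit on a low-power accelerometer-only budget.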

MotionEngine Air software is pre-qualified for use with sensors from the leading inertial sensor suppliers, says Ceva.

The company will demonstrate its MotionEngine Air software and evaluation board during CES 2020 at its hospitality suite in the Westgate Hotel.

http://www.ceva-dsp.com 


Partners to develop AI hardware and software for autonomous vehicles

Videantis, which provides automotive deep learning, computer vision and video coding solutions, has announced that it will partner with the Fraunhofer Institute for Integrated Circuits IIS, Infineon and other leading companies and universities to develop an artificial intelligence (AI) ASIC and software development tools specifically for intelligent autonomous vehicles.

The Videantis AI multi-core processor platform and tool flow has been selected for the KI-Flex autonomous driving chip project.

Autonomous driving relies on fast and reliable processing and merging of data from several lidar, camera and radar sensors in the vehicle. This data can provide an accurate picture of the traffic conditions and environment to allow the vehicle to make intelligent decisions when driving. The process of intelligently analysing these volumes of sensor data requires high-performance, efficient, and versatile compute solutions.
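A minimal illustration of the sensor merging described above is inverse-variance weighted fusion, a standard building block of Kalman-style fusion. The sketch below is a generic example under assumed noise figures, not the KI-Flex project's actual method.

```python
def fuse_measurements(estimates):
    """Inverse-variance weighted fusion: combine range estimates from
    independent sensors into one estimate whose variance is lower than
    any single sensor's. Each estimate is a (value, variance) pair."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused = sum(w * x for w, (x, _) in zip(weights, estimates)) / total
    return fused, 1.0 / total

# Hypothetical range readings to the same obstacle, in metres.
lidar = (24.9, 0.01)  # lidar: precise at this range
radar = (25.6, 0.25)  # radar: noisier, but robust in bad weather
fused, var = fuse_measurements([lidar, radar])
```

The fused estimate sits close to the more trustworthy lidar reading while still incorporating the radar, and its variance is lower than either sensor's alone; scaling this up to many lidar, camera and radar streams in real time is what drives the need for a dedicated AI processor.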

Under the KI-Flex project, Videantis, Fraunhofer IIS and partners will develop a powerful system-on-chip and associated software development tools. The Videantis AI multi-core processor will run the algorithms used for sensor signal processing and sensor data fusion, enabling the vehicle’s exact position and environment to be determined. The chip will integrate next-generation Videantis processors and technology to run these demanding artificial intelligence algorithms in real time and with low power consumption. It will also be supported by an automated tool flow from Videantis, which automatically distributes and maps AI workloads onto the parallel architecture.

Under the KI-Flex program, Videantis will integrate its multi-core processor into a software-programmable and reconfigurable chip that processes the sensor data using AI-based methods for autonomous driving.

The project, which runs until August 2022, is funded by the German Federal Ministry of Education and Research (BMBF). Fraunhofer IIS leads the project consortium, which comprises Ibeo Automotive Systems, Infineon Technologies, Videantis, Technical University of Munich (Chair of Robotics, Artificial Intelligence and Real-time Systems), Fraunhofer Institute for Open Communication Systems FOKUS, Daimler Center for Automotive IT Innovations (DCAITI, Technical University of Berlin) and FAU Erlangen-Nürnberg (Chair of Computer Science 3: Computer Architecture).

Videantis is headquartered in Hannover, Germany. It provides deep learning, computer vision and video processor IP for flexible computer vision, imaging and multi-standard hardware/software video coding for automotive, mobile, consumer, and embedded markets. Based on a unified processor platform approach that is licensed to chip manufacturers, Videantis provides tailored solutions to meet the specific needs of its customers. Its core competencies are deep camera and video application knowledge combined with SoC design and system architecture expertise. Target applications are advanced driver assistance systems (ADAS) and autonomous driving, mobile phones, augmented reality / virtual reality (AR/VR), IoT, gesture interfacing, computational photography, in-car infotainment, and over-the-top (OTT) TV.

http://www.videantis.com


Image sensor evaluation kits support VR and smart buildings

Evaluation kits from ams support eye tracking, presence detection and object recognition in virtual reality headsets, smart lighting and home and building automation products.

The Raspberry Pi and Arduino-based NanEyeC evaluation kits are based around the ams NanEyeC miniature image sensor.

The NanEyeC image sensor is supplied as a 1.0 x 1.0mm surface-mount camera module. It delivers 100kpixel resolution at up to 58 frames per second and can be used for video applications where the camera needs to be accommodated in an extremely small space, such as eye tracking in virtual reality (VR) headsets. It can also be applied in user presence detection, to support automatic power on/off controls in home and building automation (HABA) applications such as air conditioning, home robotics, appliances and smart lighting.
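Presence detection from a low-resolution sensor like this is often done with simple frame differencing: compare consecutive frames and report presence when enough pixels change. The sketch below illustrates the idea on flat pixel lists; the function name and thresholds are illustrative assumptions, not part of the ams software.

```python
def presence_detected(prev_frame, frame, pixel_delta=25, changed_fraction=0.05):
    """Frame-differencing presence detector: report presence when the
    fraction of pixels that changed by more than pixel_delta between
    two consecutive frames exceeds changed_fraction."""
    changed = sum(1 for a, b in zip(prev_frame, frame) if abs(a - b) > pixel_delta)
    return changed / len(frame) >= changed_fraction

# Grayscale frames as flat pixel lists (0-255).
empty_room = [50] * 100
same_room  = [52] * 100              # only sensor noise between frames
person     = [50] * 90 + [200] * 10  # someone enters part of the frame
```

Because only a change fraction is computed, this kind of check runs comfortably on a small microcontroller, which is what makes always-on power on/off control practical.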

The NanoVision demo kit for the NanEyeC is based on an Arduino development platform. It includes all necessary drivers to interface the sensor’s single-ended interface mode (SEIM) output to an Arm Cortex-M7 microcontroller. It also supports image processing functions such as colour reconstruction and white-point balancing.
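White-point balancing of the kind mentioned above is often implemented with the gray-world assumption: scale each colour channel so the channel means become equal. The sketch below is a generic illustration, not necessarily the method the NanoVision kit uses.

```python
def gray_world_white_balance(pixels):
    """Gray-world white balance: scale each RGB channel so all three
    channel means become equal, assuming the scene averages to gray."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(means) / 3.0
    gains = [gray / m for m in means]
    return [tuple(min(255.0, p[c] * gains[c]) for c in range(3)) for p in pixels]

# A frame with a bluish cast: the blue channel mean exceeds red and green.
frame = [(100, 100, 140), (50, 50, 70), (150, 150, 210)]
balanced = gray_world_white_balance(frame)
```

After balancing, the red, green and blue means coincide, removing the colour cast; on the demo kit this step would run on the Cortex-M7 after colour reconstruction from the raw sensor data.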

The second board is the NanoBerry, designed for more demanding operations like eye tracking or stereo vision systems. This evaluation kit uses a NanEyeC image sensor add-on board to the Raspberry Pi port and includes firmware to interface to the Raspberry Pi host processor. Engineers can use the Arm Cortex-A53-based processor to perform demanding operations such as object detection, object tracking and computer vision functions provided by the OpenCV library.

The NanoBerry kit is suitable for high frame-rate and low-latency applications such as eye tracking. When integrated into the NanEye PC viewer, it enables full evaluation of the NanEyeC with access to all registers and raw image data.

The NanoVision board is available now to customers on request and the NanoBerry kit will be demonstrated at CES 2020 (the Venetian Tower, Suite 30-236) and will be available to customers in Q1 2020.

https://ams.com


Flexible tags communicate with standard touchscreens

Research hub imec, together with TNO and Cartamundi, has developed a flexible capacitive identification tag that communicates with standard touchscreens. The C-touch tags can be integrated in a range of paper- and plastic-based objects such as tickets, certified documents and payment cards. Connection to the internet is established simply by placing the tagged object on the touchscreen or vice-versa.

C-touch tags are thin and flexible chips that have a unique identifier which can communicate via any touchscreen. Smart cards or other objects with embedded C-touch tags can securely interact with mobile phones used worldwide, as well as with the large number of touchscreens integrated in cars, booths, walls, coffee machines and everyday objects, says imec.

No additional hardware, major reconfiguration or extra cost is incurred by users, confirms imec. The tags offer security thanks to the very short communication range and have the potential to be produced at low cost thanks to the monolithically integrated antenna. Unlike existing RFID technologies such as NFC, the C-touch tag does not require an external antenna: the tiny antenna is part of the chip, making the tag much smaller than current NFC tags. The small size enables C-touch tags to be integrated in all use cases where interaction via touchscreens is feasible but RFID/NFC tags are either too large or too expensive, or where contactless reading is a disadvantage. Applications range from board games to higher-security payment cards, or to replacing difficult-to-service hardware readers and access control points with easy-to-update apps on standard mobile devices.

The C-touch tag is based on thin-film transistor technology and is powered by a thin-film battery or a thin-film photovoltaic cell that converts light from the touchscreen. The 12-bit thin-film capacitive identification tag achieves data transfer rates of up to 36 bits per second at a 0.6V supply voltage, which is compatible with commercially available touchscreen devices without requiring modifications. The flexible thin-film integrated circuit has a 0.8cm2 on-chip monolithic antenna and dissipates only 38nW of power at the 0.6V supply.
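At 36 bits per second, a 12-bit identifier can be transmitted three times per second. The sketch below shows one generic way a fixed-length ID could be signalled as timed coupling pulses and recovered by thresholding; imec's actual modulation scheme is not public, so the pulse durations, threshold and function names here are purely illustrative.

```python
def encode_tag_id(tag_id, short=10, long=20):
    """Encode a 12-bit tag ID as a sequence of coupling-pulse durations
    (ms): a short pulse signals 0, a long pulse signals 1, MSB first."""
    assert 0 <= tag_id < 4096  # must fit in 12 bits
    return [long if (tag_id >> i) & 1 else short for i in range(11, -1, -1)]

def decode_tag_id(pulses, threshold=15):
    """Recover the tag ID by thresholding each measured pulse duration."""
    tag_id = 0
    for p in pulses:
        tag_id = (tag_id << 1) | (1 if p > threshold else 0)
    return tag_id

pulses = encode_tag_id(0xABC)  # hypothetical 12-bit identifier
recovered = decode_tag_id(pulses)
```

On the receiving side, the touchscreen's standard capacitive controller sees the pulses as touch events, which is why no hardware changes are needed on the host device.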

http://www.imec.be


About Smart Cities

This news story is brought to you by smartcitieselectronics.com, the specialist site dedicated to delivering information about what’s new in the Smart City Electronics industry, with daily news updates, new products and industry news. To stay up-to-date, register to receive our weekly newsletters and keep yourself informed on the latest technology news and new products from around the globe.