Image sensors for computer vision use ST’s die-stacking technology

Two high-speed image sensors released by STMicroelectronics use a global shutter to capture images. Global-shutter operation captures distortion-free images when the scene is moving or when near-infrared illumination is needed, making the sensors suitable for the next generation of smart computer vision applications, says ST.

Global-shutter sensors save all pixel data in each frame simultaneously, contrasting with rolling-shutter operation that captures pixel data sequentially, which makes moving images vulnerable to distortion or in need of additional corrective processing.
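The difference can be illustrated with a toy simulation (hypothetical scene and timing values, not ST's implementation): a global shutter samples every row at the same instant, while a rolling shutter samples each row slightly later than the last, so a moving subject appears skewed.

```python
# Toy model: a vertical bar moves right at `speed` pixels per time
# unit while the sensor reads out. All values are hypothetical.

def bar_x(t, start=0, speed=2):
    """Horizontal position of the moving bar at time t."""
    return start + speed * t

def capture(rows, shutter, row_delay=1):
    """Return the bar's x position as seen by each sensor row.

    A global shutter samples every row at t = 0; a rolling shutter
    samples row r at t = r * row_delay, so a moving subject is
    recorded at a different position in each row.
    """
    if shutter == "global":
        return [bar_x(0) for _ in range(rows)]
    return [bar_x(r * row_delay) for r in range(rows)]

g = capture(4, "global")   # every row sees x = 0: a straight bar
r = capture(4, "rolling")  # rows see x = 0, 2, 4, 6: a skewed bar
```

The skewed column positions in the rolling-shutter result are what corrective post-processing would have to undo; the global-shutter result needs no such correction.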

The company’s image sensor process technologies are claimed to enable class-leading pixel size while offering both high sensitivity and low crosstalk. The silicon process innovation and advanced pixel architecture allow a smaller sensor pixel array on the top die, while keeping more silicon area on the bottom die to increase digital-processing capability and features.

ST’s advanced pixel technology, including full deep trench isolation (DTI), enables extremely small 2.61 x 2.61 micron pixels that combine low parasitic light sensitivity (PLS), high quantum efficiency (QE), and low crosstalk in a single die layer.

The VD55G0 sensor has 640 x 600 pixels and the VD56G3 sensor has 1.5Mpixels (1124 x 1364). The VD55G0 measures 2.6 x 2.5mm and the VD56G3 measures 3.6 x 4.3mm, making them the smallest image sensors available today relative to their resolution, says ST.

They also have low pixel-to-pixel crosstalk at all wavelengths, particularly in the near-infrared, which ensures the high contrast needed for image clarity. Embedded optical-flow processing in the VD56G3 calculates movement vectors without the need for host computer processing.
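ST does not document how the VD56G3's embedded engine works, but the underlying idea of optical flow, estimating per-block movement vectors between consecutive frames, can be sketched with a minimal block-matching search (all names and values here are hypothetical):

```python
# Minimal block-matching optical flow sketch: find the displacement
# (dy, dx) that moves a block from the previous frame onto its best
# match in the current frame, by exhaustive search over a small window.

def sad(a, b):
    """Sum of absolute differences between two equal-size blocks."""
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def block(frame, y, x, size):
    """Extract a size x size block with top-left corner (y, x)."""
    return [row[x:x + size] for row in frame[y:y + size]]

def motion_vector(prev, curr, y, x, size, search=2):
    """Best (dy, dx) displacement of prev's block at (y, x) into curr."""
    ref = block(prev, y, x, size)
    best_cost, best_v = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ny, nx = y + dy, x + dx
            if 0 <= ny <= len(curr) - size and 0 <= nx <= len(curr[0]) - size:
                cost = sad(ref, block(curr, ny, nx, size))
                if best_cost is None or cost < best_cost:
                    best_cost, best_v = cost, (dy, dx)
    return best_v

# Example: a bright 2x2 patch shifts one pixel right and one pixel
# down between two 5x5 frames.
prev = [[0] * 5 for _ in range(5)]
curr = [[0] * 5 for _ in range(5)]
for yy in (1, 2):
    for xx in (1, 2):
        prev[yy][xx] = 9
        curr[yy + 1][xx + 1] = 9
```

Here `motion_vector(prev, curr, 1, 1, 2)` returns `(1, 1)`. Doing this on-sensor, as the VD56G3 does, spares the host processor this per-block search.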

The sensors are intended for a wide range of applications including augmented reality / virtual reality (AR/VR), simultaneous localisation and mapping (SLAM), and 3D scanning.

According to Eric Aussedat, imaging sub-group general manager and executive vice president of the Analog, MEMS and Sensors Group, STMicroelectronics: “They are enabling another step forward in computer-vision applications, empowering designers to create tomorrow’s smart, autonomous industrial and consumer devices.”

Samples are shipping now to lead customers.

http://www.st.com


ETSI announces step towards AR interoperability

The ETSI Industry Specification Group on the Augmented Reality Framework (ISG ARF) has announced ETSI GS ARF 003, describing it as a key specification towards the interoperability of augmented reality (AR) components. Rather than relying on a single provider to deploy AR applications and services, the ETSI framework makes provision for components from different providers to interoperate via the defined interfaces, allowing broader and quicker adoption of AR technology.

ETSI GS ARF 003 introduces the characteristics of an AR system and describes the functional building blocks of a generic AR reference architecture and their relationships. The global architecture gives an overview of an AR system based on a set of hardware and software components, together with data describing the real world and virtual content. The functional architecture applies both to fully embedded AR systems and to implementations spread over IP networks in a scalable manner, with sub-functions that can either be deployed on the AR device or be provided via cloud technology.

Muriel Deschanel, chair of the ETSI ISG ARF, said: “AR can be a real asset for many use cases in industry 4.0 or in the medical sector. With the significant improvement to network performance brought by 5G, in particular in terms of bandwidth and latency, cloud services will become essential to a larger number of AR use cases.”

An example of AR in industry 4.0: if extra staff are employed to cope with peak activity, they may not have the expertise for the job, or there may not be time to train them. AR enables an experienced operator in another area to train, guide and give precise instructions to the new operator.

http://www.etsi.org


Voice cores from CEVA support TensorFlow Lite for Microcontrollers

CEVA’s WhisPro speech recognition software now brings machine learning to the edge, as it is available with the open-source TensorFlow Lite for Microcontrollers framework. TensorFlow Lite for Microcontrollers, from Google, is already optimised for and available on CEVA-BX DSP cores, targeting low-power artificial intelligence (AI) in conversational and contextual-awareness applications, says CEVA.

The licensor of wireless connectivity and smart sensing technologies targets conversational AI and contextual-awareness applications with its support for TensorFlow Lite for Microcontrollers, a cross-platform framework for deploying tiny machine learning on power-efficient processors in edge devices.

Tiny machine learning brings AI to low-power, always-on, battery-operated IoT devices for on-device sensor-data analytics in areas such as audio, voice, image and motion. Customers using TensorFlow Lite for Microcontrollers can use a unified processor architecture to run both the framework and the associated neural-network workloads required to build intelligent connected products. CEVA’s WhisPro speech recognition software and custom command models are integrated with the TensorFlow Lite framework to accelerate the development of small-footprint voice assistants and other voice-controlled IoT devices.

The CEVA-BX DSP family comprises highly programmable hybrid DSP/controllers offering high efficiency for a broad range of signal-processing and control workloads in real-time applications. Using an 11-stage pipeline and a five-way VLIW micro-architecture, it offers parallel processing with dual scalar compute engines, load/store units and program control, reaching a CoreMark/MHz score of 5.5, making it suitable for real-time signal control. Its support for SIMD instructions suits it to a variety of signal-processing applications, and its double-precision floating-point units efficiently handle contextual-awareness and sensor-fusion algorithms with a wide dynamic range. It also facilitates simultaneous processing of front-end voice, sensor fusion, audio processing and general DSP workloads in addition to AI runtime inferencing.

http://www.ceva-dsp.com


Ethernet 800G Verification IP meets networking demands

To meet the bandwidth increases driven by video-on-demand, social networking and cloud services, Synopsys has introduced a native SystemVerilog Ethernet VIP to complement its 112G high-speed SerDes PHY IP and enable high-performance cloud computing. It is claimed to be the industry’s first verification IP (VIP) and Universal Verification Methodology (UVM) source-code test suite for Ethernet 800G.

The Synopsys VC VIP for Ethernet 800G is based on the Ethernet Technology Consortium (ETC) specification. It enables SoC teams to design next-generation networking chips for data centres with ease of use and fast integration, to accelerate verification closure and time-to-market. The VC VIP is used to verify Synopsys’ DesignWare 56G Ethernet, 112G Ethernet, and 112G USR/XSR PHYs for FinFET processes, which designers can integrate into 800G SoCs.

The ETC standard provides specifications for an 800G implementation based on eight-lane x 100Gbit/s technology, enabling adopters to deploy advanced, high-bandwidth, interoperable Ethernet technologies.

Francois Balay, president of MorethanIP, believes the Synopsys VC VIP will prove an advantage to developers. “Being first in the industry, Synopsys VIP, source code test suite and DesignWare IP for Ethernet 800G strengthens the ecosystem and facilitates early adoption of the technology and fast development of high-speed networking applications,” he said.

Synopsys VC VIP for Ethernet uses a native SystemVerilog UVM architecture, protocol-aware debug and source-code test suites. It can switch speed configurations dynamically at run time and includes an extensive and customisable set of frame-generation and error-injection capabilities. Source-code UNH-IOL test suites are available for key Ethernet features and clauses, to facilitate custom testing and accelerate verification closure.

Synopsys VC VIP and source code test suite for Ethernet 800G are both available today as early access standalone products. The DesignWare 56G and 112G Ethernet PHYs are available now and the silicon design kit for the DesignWare USR/XSR PHY IP in 7nm FinFET process is also available.

http://www.synopsys.com


About Smart Cities

This news story is brought to you by smartcitieselectronics.com, the specialist site dedicated to delivering information about what’s new in the Smart City Electronics industry, with daily news updates, new products and industry news. To stay up to date, register to receive our weekly newsletters and keep yourself informed on the latest technology news and new products from around the globe.