Mouser ships Microchip’s SAM R34 SiPs to deliver low-power LoRa for edge devices

Distributor Mouser Electronics is now stocking the SAM R34 LoRa sub-GHz system-in-package (SiP) family from Microchip Technology. The SAM R34 SiP devices are claimed to deliver industry-leading low-power performance for a range of IoT applications. The SiP family integrates a 32-bit microcontroller, software stack, and sub-GHz LoRa transceiver in a compact 6.0 x 6.0mm package.

The Microchip SAM R34 LoRa sub-GHz SiPs, available from Mouser Electronics, incorporate a Microchip SAM L21 microcontroller based on a 32-bit Arm Cortex-M0+ core with up to 256kbytes of flash and 40kbytes of RAM. The onboard UHF transceiver supports LoRa and FSK modulation and covers frequencies from 137 to 1020MHz with maximum transmit power up to +20dBm without external amplification. The SAM R34 family is supported by Microchip’s LoRaWAN protocol stack and supports Class A and Class C end devices, as well as proprietary point-to-point connections.

The SiPs offer sleep modes drawing as little as 790nA, which help to extend battery life and reduce power consumption in smart devices. The SAM R34 SiP includes a USB interface, making it suitable for USB dongle applications or for software updates via USB. The SiPs are suitable for battery-powered and sensor-based connected applications, including smart agriculture, smart city devices, and tracking devices for supply chain management.
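As a rough illustration of how such a device is typically programmed, the sketch below shows a Class A end-node duty cycle in C: join the network once over the air, then alternate short sensor uplinks with long periods in deep sleep. The lorawan_* and mcu_* functions are hypothetical placeholders, stubbed here so the example compiles and runs on a host; they stand in for, but are not, the actual API of Microchip’s LoRaWAN stack or the SAM L21 drivers.

```c
/*
 * Minimal sketch of a Class A LoRaWAN end-node duty cycle for a SAM R34-class
 * SiP. The lorawan_* and mcu_* functions are hypothetical placeholders for the
 * equivalent calls in Microchip's LoRaWAN stack and the SAM L21 power drivers;
 * the stub bodies only let the sketch compile and run on a host.
 */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* --- Hypothetical hooks (assumptions, not Microchip's actual API) -------- */
static bool lorawan_join_otaa(const uint8_t *dev_eui, const uint8_t *app_eui,
                              const uint8_t *app_key)
{
    (void)dev_eui; (void)app_eui; (void)app_key;
    return true;                           /* pretend the OTAA join succeeded */
}

static bool lorawan_send_unconfirmed(uint8_t port, const uint8_t *payload,
                                     uint8_t len)
{
    (void)payload;
    printf("uplink on port %u, %u bytes\n", (unsigned)port, (unsigned)len);
    return true;
}

static uint16_t sensor_read_millivolts(void) { return 3300; }

static void mcu_deep_sleep_seconds(uint32_t s)
{
    /* On real hardware this would enter the sub-microamp sleep mode. */
    printf("sleeping for %lu s\n", (unsigned long)s);
}
/* ------------------------------------------------------------------------- */

int main(void)
{
    /* Device credentials would normally come from secure provisioning. */
    static const uint8_t dev_eui[8]  = {0};
    static const uint8_t app_eui[8]  = {0};
    static const uint8_t app_key[16] = {0};

    /* Join once via over-the-air activation, backing off between attempts. */
    while (!lorawan_join_otaa(dev_eui, app_eui, app_key))
        mcu_deep_sleep_seconds(60);

    for (int i = 0; i < 3; i++) {          /* bounded so the host demo exits */
        /* Class A: every exchange starts with an uplink; the stack then opens
         * two short receive windows for any downlink from the network. */
        uint16_t mv = sensor_read_millivolts();
        uint8_t payload[2] = { (uint8_t)(mv >> 8), (uint8_t)mv };
        lorawan_send_unconfirmed(1, payload, sizeof payload);

        /* Spend the rest of the cycle asleep to exploit the low-power mode. */
        mcu_deep_sleep_seconds(15 * 60);
    }
    return 0;
}
```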

Mouser also stocks the SAM R34 Xplained Pro evaluation kit, supported by the Atmel Studio 7 integrated development environment. The kit includes reference designs and software examples that enable engineers to develop SAM R34-based LoRa end-node applications. The kit is certified by the Federal Communications Commission (FCC) and Industry Canada (IC), and compliant with the Radio Equipment Directive (RED), so designs can meet regulatory requirements across geographies.

Mouser claims to stock the world’s widest selection of the latest semiconductors and electronic components for the newest design projects. Mouser Electronics’ website is continually updated and offers advanced search methods to help customers quickly locate inventory. Mouser.com also houses data sheets, supplier-specific reference designs, application notes, technical design information, and engineering tools.

http://www.mouser.com

> Read More

Cortex processor adds to automotive IP from Arm

Designed to process multiple streams of sensor data, the Arm Cortex-A65AE has been added to Arm’s Automotive Enhanced IP portfolio.

The Cortex-A65AE processor delivers enhanced multi-threading capability with integrated safety through Arm’s Split-Lock technology.

The processor is optimised for 7nm processes and is Arm’s first multi-threaded processor with integrated safety, targeting the handling of sensor data in autonomous applications and the high-throughput needs of in-vehicle infotainment (IVI) and cockpit systems.

For autonomous driving, multiple sensor inputs allow cars to view their environment, perceive what is happening, plan possible paths ahead, and deliver commands to actuators on the determined path. As more sensors are added, the requirement for multi-threaded processing increases. With data being collected at different points of the vehicle, high data throughput capability is a key part of the heterogeneous processing mix required to enable advanced driving assistance systems (ADAS) and autonomous applications. The Cortex-A65AE manages the high throughput requirement for gathering sensor data and can be used in lock-step mode connected to accelerators, such as machine learning (ML) or computer vision, to help process the data efficiently. This has to be done with a high level of safety capability.
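To make the multi-threading requirement concrete, the sketch below is a generic, host-side illustration of handling two sensor streams in parallel with POSIX threads and then combining the per-stream results. It is not Arm- or Cortex-A65AE-specific code, and the radar/camera samples and the simple fusion step are invented for the example.

```c
/*
 * Generic host-side sketch of multi-threaded sensor-stream processing, the
 * kind of workload described above. POSIX threads only; nothing here is
 * Arm- or Cortex-A65AE-specific, and the data and fusion step are invented.
 * Build with: cc -pthread sensor_streams.c
 */
#include <pthread.h>
#include <stdio.h>

#define N_SAMPLES 5

typedef struct {
    const char *name;               /* e.g. "radar" or "camera"           */
    double samples[N_SAMPLES];      /* pretend input from one sensor      */
    double average;                 /* per-stream result from the worker  */
} sensor_stream_t;

/* Each worker thread independently processes one sensor stream. */
static void *process_stream(void *arg)
{
    sensor_stream_t *s = arg;
    double sum = 0.0;
    for (int i = 0; i < N_SAMPLES; i++)
        sum += s->samples[i];
    s->average = sum / N_SAMPLES;
    return NULL;
}

int main(void)
{
    sensor_stream_t streams[2] = {
        { "radar",  {1.0, 1.2, 0.9, 1.1, 1.0}, 0.0 },
        { "camera", {4.0, 4.2, 3.9, 4.1, 4.0}, 0.0 },
    };
    pthread_t workers[2];

    /* One thread per incoming stream keeps overall throughput up. */
    for (int i = 0; i < 2; i++)
        pthread_create(&workers[i], NULL, process_stream, &streams[i]);
    for (int i = 0; i < 2; i++)
        pthread_join(workers[i], NULL);

    /* Simple "fusion" step: combine per-stream results downstream. */
    printf("%s avg=%.2f, %s avg=%.2f\n",
           streams[0].name, streams[0].average,
           streams[1].name, streams[1].average);
    return 0;
}
```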

In addition, greater autonomy and advancing driver aids will mean that drivers are informed through augmented reality (AR) head-up displays, alerts and improved maps. Sensors will be able to monitor eyelid movement to detect tiredness, as well as body temperature, vital signs and behavioural patterns to personalise the in-car experience. These capabilities require high throughput, ML processing and a lot of heterogeneous compute.

This requires a heterogeneous compute cluster. The Cortex-A65AE is a throughput-focused, application-class core with Split-Lock to enable the highest safety integrity levels with leading performance and power efficiency, claims Arm.

http://www.arm.com

> Read More

AI analysis software improves facial classification accuracy

Neuromorphic computing company BrainChip has released BrainChip Studio 2018.3, an update to its artificial intelligence (AI)-powered video analysis software. The update improves the software’s facial classification accuracy by 10 to 30 per cent, according to the company.

BrainChip Studio is primarily used by law enforcement, intelligence, and counter-terrorism agencies that use existing CCTV infrastructure.

BrainChip Studio uses spiking neural networks to enable facial classification on partial faces, which is useful where the probe image or the extracted faces are obscured by hats, masks, scarves or camera angle. BrainChip Studio 2018.3 builds on this with a full-face mode: in situations where the entire face is visible in the probe image or in the extracted faces, the full-face mode provides a significant increase in facial classification accuracy without impacting throughput.

BrainChip Studio’s facial classification technology works in environments where traditional biometric-based face recognition systems fail, for example low-light, low-resolution and visually noisy environments.

BrainChip Studio 2018.3 is currently available.

BrainChip provides neuromorphic computing solutions, AI that is inspired by the biology of the human neuron. The company’s spiking neural network technology can learn autonomously, evolve and associate information just like the human brain. The proprietary technology is fast, completely digital and consumes very low power. The company provides software and hardware that address the high-performance requirements in civil surveillance, gaming, financial technology, cybersecurity, ADAS, autonomous vehicles, and other advanced vision systems.

http://www.brainchip.com

> Read More

Autonomous driving IP core will be demoed at CES 2019

At CES 2019 in Las Vegas (8 to 11 January 2019), AImotive will showcase its automated driving technology, displaying aiDrive2, aiSim2 and the silicon-proven aiWare hardware IP core.

The modular self-driving software stack aiDrive will be demoed alongside aiSim2, the autonomous technology simulator running on AImotive’s purpose-built simulation engine.

Demonstrating its highway autopilot capabilities, aiDrive2 will run on Nvidia’s Drive PX2 embedded platform. Budapest-based AImotive aims to encourage wider collaboration in the autonomous industry by providing a modular and customisable platform for the development of automated driving systems.

The aiSim2 simulator will also be on display, running on AImotive’s proprietary simulation engine on a single-GPU and a multi-GPU set-up side by side. The ability to ensure deterministic, physically-based rendering on any hardware set-up enables aiSim2 to drastically accelerate the development of autonomous technologies while overcoming the limitations of game engine-based simulators, explains AImotive.

The aiWare test chip will be on display in Las Vegas, running AImotive’s own algorithms to prove the capabilities of the hardware IP core when implemented in silicon. Created through a partnership between AImotive, VeriSilicon and GlobalFoundries, the chip runs aiWare1. AImotive is currently offering the scalable aiWare2 and aiWare3 architectures to customers looking to create smart sensors or centralised AI acceleration clusters for automotive use.

Visit AImotive at CES 2019, at Tech East North Hall, booth 7538.

AImotive is one of the largest independent teams in the world working towards fully self-driving car technology. It addresses challenges of autonomous mobility, powered by AI, simulation technology, and supporting hardware architectures.

It has partnered with the Khronos Group to develop the Neural Network Exchange Format (NNEF), the first neural network data exchange standard to make communication easier and more reliable between AI toolsets and inference engines.

The company has been granted licences to test its self-driving vehicle fleet in Hungary, Finland and the states of California and Nevada.

http://www.aimotive.com

> Read More

About Smart Cities

This news story is brought to you by smartcitieselectronics.com, the specialist site dedicated to delivering information about what’s new in the Smart City Electronics industry, with daily news updates, new products and industry news. To stay up-to-date, register to receive our weekly newsletters and keep yourself informed on the latest technology news and new products from around the globe. Simply click this link to register here: Smart Cities Registration