Company Will Conduct a Development Workshop for Qualified Developers and Data Scientists in Perth, Western Australia on 1 November 2019
BrainChip Holdings Ltd (ASX: BRN), a leading provider of ultra-low-power, high-performance edge AI technology, today announced a neural networking workshop using the Akida™ Development Environment (ADE). Qualified developers and data scientists are invited to attend the workshop on 1 November 2019 at the Ernst & Young facility at Tank Stream Labs, EY Office, Level 5, 11 Mounts Bay Road, in Perth, Western Australia. The workshop will focus on implementations of Convolutional Neural Networks converted to Event-Based Neural Networks and on the development of Native Spiking Neural Networks.
The workshop, led by BrainChip founder and CTO Peter van der Made, will introduce developers and data scientists to the Akida device and development environment in advance of the product introduction. Mr van der Made will be accompanied by several research scientists and applications engineers to facilitate the installation and execution of the ADE. The focus of the workshop will include the novelty of the design, the utilization of Akida as a complete network, and the application of the ultra-low-power Akida design to edge applications. The Akida Development Environment is a full-featured suite of tools including TensorFlow and Keras for deep learning neural networks, the BrainChip SNN development tool, and the Akida hardware simulator.
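To give a flavour of the "CNN converted to Event-Based Neural Network" theme, the sketch below illustrates rate coding, one common textbook technique for mapping a CNN's continuous activations onto discrete spike events. It is purely illustrative: BrainChip's actual conversion pipeline is not described in this announcement, and the function names here are invented for the example.

```python
import numpy as np

def rate_code(activations, n_steps=100, rng=None):
    """Illustrative only: map activations in [0, 1] to a (n_steps, n)
    binary spike train. Each neuron fires at each time step with
    probability equal to its activation, so the spike count over the
    window approximates the original value."""
    rng = np.random.default_rng(0) if rng is None else rng
    acts = np.clip(np.asarray(activations, dtype=float), 0.0, 1.0)
    return (rng.random((n_steps, acts.size)) < acts).astype(np.uint8)

def decode(spikes):
    """Recover approximate activations as the mean firing rate."""
    return spikes.mean(axis=0)

# Example: three neuron activations, rate-coded over 1000 time steps.
acts = np.array([0.1, 0.5, 0.9])
spikes = rate_code(acts, n_steps=1000)
print(decode(spikes))  # roughly [0.1, 0.5, 0.9]
```

The key trade-off this sketch makes visible is that an event-based representation exchanges continuous values for sparse binary events over time, which is what enables the very low power consumption of spiking hardware.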
Participation in the workshop is limited, and prospective attendees can apply for admission at www.brainchip.com/workshop beginning on 15 October. Laptop computers will be provided for all attendees, and the Company encourages attendees to bring existing CNNs or SNNs for conversion or execution on the ADE.
“We are excited to invite developers and data scientists to participate in the Akida Workshop,” said Peter van der Made. “Akida is a revolutionary edge AI network technology, and the AKD1000 is the first in a family of devices that is ultra-low power, high performance, and a complete reconfigurable network providing inference, training and learning. The reduction in system latency provides a faster response and a more power-efficient system that can reduce the large carbon footprint of data centers.”