Posts Tagged ‘ai’

Deep Learning

Sunday, December 6th, 2015

I recently attended a Deep Learning (DL) meetup hosted by Nervana Systems. Deep learning is essentially a technique that allows machines to interpret sensory data. DL attempts to classify unstructured data (e.g. images or speech) by mimicking the way the brain does, using artificial neural networks (ANNs).

A more formal definition of deep learning is:

DL is a branch of machine learning based on a set of algorithms that attempt to model high-level abstractions in data by using multiple processing layers with complex structures.

I like the description from Watson Adds Deep Learning to Its Repertoire:

Deep learning involves training a computer to recognize often complex and abstract patterns by feeding large amounts of data through successive networks of artificial neurons, and refining the way those networks respond to the input.

This article also presents some of the DL challenges and the importance of its integration with other AI technologies.

From a programming perspective, constructing, training, and testing DL systems starts with assembling ANN layers.

For example, categorization of images is typically done with Convolutional Neural Networks (CNNs, see Introduction to Convolution Neural Networks). The general approach stacks convolution and pooling layers, followed by fully connected layers that perform the final classification.

Construction of a similar network using the neon framework looks something like this:
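(A minimal sketch modeled on neon's example networks; the layer shapes, sizes, and initializer below are illustrative, not the exact network from the meetup.)

    # Assemble a small CNN for image classification with neon.
    # Illustrative layer sizes; assumes the neon package is installed.
    from neon.backends import gen_backend
    from neon.initializers import Gaussian
    from neon.layers import Affine, Conv, Pooling
    from neon.models import Model
    from neon.transforms import Rectlin, Softmax

    # neon requires a backend before the model can be used
    be = gen_backend(backend='cpu', batch_size=128)

    init = Gaussian(scale=0.01)
    layers = [Conv(fshape=(5, 5, 16), init=init, activation=Rectlin()),  # feature extraction
              Pooling(fshape=(2, 2)),                                    # downsampling
              Conv(fshape=(5, 5, 32), init=init, activation=Rectlin()),
              Pooling(fshape=(2, 2)),
              Affine(nout=500, init=init, activation=Rectlin()),         # fully connected
              Affine(nout=10, init=init, activation=Softmax())]          # 10-way classifier

    model = Model(layers=layers)

The convolution/pooling pairs extract and condense visual features; the fully connected (Affine) layers at the end map those features to class probabilities.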

Properly training an ANN involves processing very large quantities of data. Because of this, most frameworks (see below) utilize GPU hardware acceleration, typically via the NVIDIA CUDA Toolkit.
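With neon, for example, choosing GPU acceleration is a one-line backend selection (assuming a CUDA-capable NVIDIA card is present; otherwise 'cpu' works as a fallback):

    # Select the GPU backend before building and fitting the model;
    # requires an NVIDIA GPU with the CUDA Toolkit installed.
    from neon.backends import gen_backend

    be = gen_backend(backend='gpu', batch_size=128)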

Each application of DL (e.g. image classification, speech recognition, video parsing, big data, etc.) has its own idiosyncrasies that are the subject of extensive research at many universities. And of course large companies are leveraging machine intelligence for commercial purposes (Siri, Cortana, self-driving cars).

Popular DL/ANN frameworks include:

Many good DL resources are available at Deep Learning.

Here's a good introduction: Deep Learning: An MIT Press book in preparation

Brain-Like Chip With 4000 Processor Cores

Saturday, August 9th, 2014

IBM Unveils a ‘Brain-Like’ Chip With 4,000 Processor Cores. The TrueNorth chip mimics 1 million neurons and 256 million synapses using what IBM calls “spiking neurons.”

...the chip can encode data as patterns of pulses, which is similar to one of the many ways neuroscientists think the brain stores information.
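As a rough illustration of this kind of pulse coding (a generic rate code sketched in Python, not IBM's actual scheme), a scalar value can be represented by the firing rate of a binary spike train:

    import numpy as np

    np.random.seed(0)  # reproducible spike pattern

    def rate_encode(value, timesteps=100):
        # Encode a value in [0, 1] as a binary spike train whose
        # average firing rate approximates the value.
        return (np.random.rand(timesteps) < value).astype(np.uint8)

    spikes = rate_encode(0.3)
    print(spikes[:20])    # a pattern of pulses, e.g. [0 1 1 0 0 ...]
    print(spikes.mean())  # close to 0.3

Decoding amounts to counting spikes over the window: the information is carried by the pattern and rate of pulses rather than by static binary values.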

IBM Research: Neurosynaptic chips provides more information on the low-power system architecture and potential applications.

This is similar to Qualcomm's Brain-Inspired Computing effort.

Brain-Inspired Computing

Saturday, October 19th, 2013

Bringing artificial intelligence to mobile computing is a significant challenge. That's the goal of Qualcomm's new Zeroth Processors.

Mimicking the human nervous system and brain, allowing computers to learn about their environment and modify their behavior based on this information, has long been the goal of artificial neural networks. Whatever computing model is used to achieve this capability, the real problem is one of scale. The human brain is estimated to have 100 billion neurons -- with 100 trillion connections, at least 1,000 times the number of stars in our galaxy.

These computational models can be implemented in software (e.g. Grok), but the ability to scale to the levels required for even simple human-like interactions is severely limited by conventional computing platforms. The Zeroth Neural Processing Unit (NPU) is a hardware implementation of the brain's spiking neural network (SNN) method of information transmission. Integrating the NPU into computing platforms at the chip level would begin to address the computational and power requirements of these types of applications.
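To get a feel for how an SNN carries information, here is a textbook leaky integrate-and-fire neuron sketched in Python (a conceptual illustration only, not Qualcomm's NPU design):

    def lif_neuron(input_current, threshold=1.0, leak=0.9):
        # Leaky integrate-and-fire: the membrane potential v accumulates
        # input and decays each step; when v crosses the threshold the
        # neuron emits a spike (1) and resets.
        v = 0.0
        spikes = []
        for i in input_current:
            v = leak * v + i
            if v >= threshold:
                spikes.append(1)
                v = 0.0
            else:
                spikes.append(0)
        return spikes

    # A stronger constant input drives a higher firing rate:
    print(lif_neuron([0.3] * 20))
    print(lif_neuron([0.6] * 20))

Stronger or more frequent inputs produce denser spike trains, which is how downstream neurons "read" the signal.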

The goals of the Zeroth* platform are:

  1. Biologically Inspired Learning
  2. Enable Devices To See and Perceive the World as Humans Do
  3. Creation and definition of a Neural Processing Unit (NPU)

Achieving "human-like interaction and behavior" is an ambitious goal, but it seems like this is a good first step.

UPDATE (25-Oct-13): Good overview here: Chips 'Inspired' By The Brain Could Be Computing's Next Big Thing.

UPDATE (1-Jan-14): CES 2014: Intel launches RealSense brand, aims to interface with your brain in the long run
___________

* The name Zeroth comes from Isaac Asimov's science-fiction Three Laws of Robotics. The First Law states that "A robot may not harm a human being."

Asimov once added a "Zeroth Law"—so named to continue the pattern where lower-numbered laws supersede the higher-numbered laws—stating that a robot must not harm humanity.

We'll have to wait and see, but let's hope so!