Building truly strong AI is a fascinating area. Companies like Numenta are working to understand the neocortex of the human brain in order to make great strides in AI. Matt Taylor from Numenta will be speaking at AI with the Best about their work on strong AI built on principles learnt from neuroscience. Here are some of his insights.
There are two different approaches to artificial intelligence out there today:
Artificial neural networks are likely to be the sort of AI technology you’ve heard about before. Matt explains that “the current AI landscape is dominated by technologies rooted in Artificial Neural Networks (ANNs). The neuronal model used by these systems was architected decades ago before we understood much about how neurons really worked.”
“This simplistic neuronal model has been a hindrance to the creation of intelligent software systems, because it doesn’t contain the biological realism necessary for truly intelligent non-biological systems.” — Matt Taylor
ANNs have provided some astonishingly good results so far in the field of AI and machine learning. However, their techniques don’t quite relate to how the brain really works. They aren’t really replicating neurons in the neocortex, despite the “neural network” name.
Hierarchical Temporal Memory (HTM) involves studying how the neocortex of the human brain works and using the same principles to create smarter systems. As Matt explains, it is “a theory of intelligence based on observations of the mammalian neocortex”. The neocortex is where all of your thoughts and memories are. It is, basically, your intelligence. That’s a pretty important part of the brain right there. Understanding it, and being able to apply what we learn from it in our own systems, would be undeniably beneficial!
“We’re not trying to recreate a brain, we’re just trying to learn how intelligence works in the neocortex and build systems on the same principles.” — Matt Taylor
The neocortex is fascinating because its structure is the same throughout, no matter what function it is performing, whether reasoning, language, or anything else. This shows us that the brain uses the same concepts throughout to deal with a range of different tasks and processes. Matt explains that their model “attempts to recreate the information processing flow of one layer of one region of neocortex by modelling neurons and their interactions as realistically as possible.”
The hierarchy part of the model comes from the principle that the various parts of the neocortex are linked together to form a hierarchy. Lower-level functionality, such as data from your senses, sits at the bottom of the hierarchy, and the output is continually passed upwards to higher-level functions of the brain. The higher-level functions interpret the concepts at a much more intelligent level (e.g. the difference between raw visual input, analysis of that input to recognise its general shape, and then actually recognising the object, and so on).
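To make the hierarchy idea concrete, here is a toy sketch (not Numenta’s actual model or API; all function names are illustrative assumptions) of information flowing upward: raw sensory input enters at the bottom, and each level passes a more abstract summary to the level above.

```python
# Toy sketch of hierarchical processing: each level consumes the
# output of the level below and produces something more abstract.

def edges(pixels):          # low level: raw input -> local features
    return ["edge" for p in pixels if p > 0.5]

def shapes(features):       # mid level: features -> general shapes
    return ["shape"] if len(features) >= 3 else []

def objects(found_shapes):  # high level: shapes -> recognised objects
    return ["object"] if found_shapes else []

def up_the_hierarchy(pixels):
    """Pass each level's output up to the next, bottom to top."""
    data = pixels
    for level in (edges, shapes, objects):
        data = level(data)
    return data

print(up_the_hierarchy([0.9, 0.8, 0.7, 0.1]))  # ['object']
```

The real neocortex is far richer than this, of course, but the shape of the computation is the same: the “intelligent” recognition at the top only exists because each layer below it abstracted the raw input a little further.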
The model used in HTM is one that actually came about from Numenta and its co-founder, as Matt explains, “We use a biologically constrained model of neurons and their interactions informed by years of neuroscience research by Numenta and (previously) the Redwood Center for Theoretical Neuroscience (founded by Numenta co-founder Jeff Hawkins).”
HTM is the model that truly suits Numenta’s dual mission: understanding how intelligence works in the neocortex, and building intelligent systems on the same principles.
Matt has a great video explaining the concepts behind Hierarchical Temporal Memory here:
Their YouTube channel on HTM Open Source has a tonne of great resources to explain some pretty fascinating concepts. Definitely worth checking out!
Matt says that the core of the argument is that “pyramidal neurons in your neocortex are much more complex than ANN neurons”. For those keen on the technical specifics, Matt explains HTM with the following points:
The following image from Matt and Numenta visually compares the neuron models used in ANNs and HTM with a biological neuron:
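The contrast Matt draws can be sketched in code. This is an illustrative simplification, not Numenta’s implementation: an ANN “neuron” collapses all its inputs into a single weighted sum, while an HTM-style pyramidal neuron has many independent dendritic segments, and any one segment that detects enough of its pattern can put the cell into a predictive state.

```python
import math

def ann_neuron(inputs, weights, bias=0.0):
    """Classic ANN point neuron: one weighted sum through a sigmoid."""
    s = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-s))

def htm_neuron(active_inputs, segments, threshold=2):
    """HTM-style neuron (simplified): each segment is a set of synapses
    (input indices). The cell becomes predictive if ANY single segment
    sees at least `threshold` of its synapses active."""
    active = set(active_inputs)
    return any(len(seg & active) >= threshold for seg in segments)

# ANN: one graded output computed from all inputs at once
print(ann_neuron([1, 0, 1], [0.5, -0.2, 0.8]))  # ~0.79

# HTM: a binary predictive state from pattern detection on segments
segments = [{0, 1, 2}, {5, 6, 7}]
print(htm_neuron(active_inputs=[5, 6], segments=segments))  # True
```

The key difference in spirit: the ANN neuron has one “view” of its input, while the HTM-style neuron can independently recognise many different patterns, one per dendritic segment, which is closer to how biological pyramidal neurons behave.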
A major theme in Matt’s talk at AI with the Best this year is the difference between “Weak AI” and “Strong AI”. Matt points out that weak AI has given us a whole lot of benefits so far, so it isn’t necessarily bad,
“ANNs have produced a myriad of Weak AI functionality that has become extremely good at understanding and processing narrow information domains. Deep Learning has recently leapt forward as the primary method of understanding the world’s data and with wonderful functionality that enhances our lives and helps us understand our world.”
However, there’s a limit to what we can achieve with ANNs. Matt and Numenta believe we need to move beyond that. Matt says that he believes that “in order to produce Strong AI, we must understand how biological intelligence works. In my talk, I argue that our work at Numenta is moving in that direction with our research and technology, but that current ANN-based machine learning is not.”
“I don’t believe AI technology based upon the simplistic neuronal model of ANNs will lead to Strong AI, which is really the thing we’re striving toward. Strong AI is the “next step” in the evolution of life as a progression of information systems, and our current techniques will (IMO) not bring us there.” — Matt Taylor
Matt is keen to clear up one very common misconception — artificial intelligence is not already here in the way that so many people are reporting. In fact, Matt doesn’t even like the term “artificial intelligence”.
“I don’t like the term AI. There is nothing artificial about intelligence. It either is or is not. We are all working on non-biological intelligence, not artificial intelligence.” — Matt Taylor
He also believes we should all try to agree on what “intelligence” means. He covers what they mean when they say AI in the video below:
Matt says that “too many people are already reporting that AI is here today, and making it ubiquitous is simply a collection of engineering problems. I think the correction should be that Weak AI is here today, and most of the AI work ongoing is scaling it and moving it to the “edge” of different platforms.
Frankly, the development and expansion of Weak AI does not lead to Strong AI. For that, we need to take a different approach. And we need to agree on what intelligence is.”
If you’re looking to get into this area and have more of an understanding of it, Matt had a few books which helped him in his early days,
“In 2006, I was working as a software contractor in St. Louis, MO. I was really getting interested in the nature of my work and how it related to the overall evolution of technology in general, and I remember buying the books “On Intelligence” and “The Singularity is Near” to get a better perspective on the big picture of things as it related to humanity. This led to more books like “The Selfish Gene” and “What Technology Wants” for continued philosophical exploration, not only about artificial intelligence but about the relationship between human evolution and technological evolution.”
“I realized that life could be represented as “the progression of information systems”, and that technological evolution was simply the continuation of biological evolution” — Matt Taylor (he gave a talk on this very topic in 2012)
The interaction between technology and humankind has helped free information in a way that we’ve never seen before, but the amount of data we’re producing requires a new approach if we’re ever going to understand this world that is evolving today. Matt explains his own learning and understanding of this,
“The most important function of technology for the progression of “life as an information system” was the ability for information to store itself in non-living material. Before technology, information contained within “life as we know it” was restricted within biological systems. With the emergence of technology, humans opened a door that freed information from those restraints and allowed it to write itself into the non-physical world.
But at this stage of the evolution of information systems, we were dealing with a “data overload”. We have written so much information about our universe into non-biological structures that we cannot process it efficiently. Most of it is being wasted.”
“Even if we dedicate every human brain to understanding this data we are storing, we will never catch up. We are producing data faster than anything can process it.” — Matt Taylor
“This is when I realized that the next big step in human evolution was the creation of non-biological intelligent systems. We need these systems to understand our world. We simply don’t have the computational power in our biological brains to understand the data firehoses that have evolved from biological systems. Most of this data is not even being stored. We will soon have billions of IoT devices all producing streams of data that we won’t even have the capacity to store, let alone analyze and understand.”
“We need non-biological intelligence to understand this data. We need systems that are not restricted by physical and biological limitations, systems that can explore this data landscape and understand it. We need non-biological intelligence to continue evolving.” — Matt Taylor
“Intelligent biological systems (brains) facilitated a major acceleration in the pace of evolution by allowing an information pathway via living organisms through time and space. The major limitation of biological intelligence is its physical constraints. If we could identify the makings of intelligence by studying the only thing we all agree is intelligent (the neocortex), we could potentially replicate our biological intelligence in systems that are unrestricted by biology.”
Also, remember that there’s a whole heap of information available to help you learn all of these concepts over at the HTM Open Source YouTube channel.
A huge thank you to Matt Taylor for taking some time to share his passion for HTM with us! If you’d like to see his talk at AI with the Best, a totally online conference on April 29th (watch from anywhere with an Internet connection!), you can get tickets here (use the code WTBVIP or the link right there to get a discount!).