© 2016 International Business Machines Corporation


From Cognitive Computing to Artificial Intelligence

Dr Peter Waggett, Director, Emerging Technology

September 2016
Follow us @IBMWatson


About Me: Dr Peter Waggett

PhD in Rocket Science

Failed Astronaut

Over 20 Years of IT Research Experience


Smarter Planet Vision

Instrumented · Interconnected · Intelligent · Interactive


Automating the World
Understanding the World

Moore's Law


Cognitive Computing

Grand Challenge


IBM Watson and Jeopardy!


Person        Organization
L. Gerstner   IBM
J. Welch      GE
W. Gates      Microsoft

"If leadership is an art then surely Jack Welch has proved himself a master painter during his tenure at GE."
Welch ran this?

Noses that run and feet that smell?
How can a house burn up as it burns down?
Does COPD represent a complex comorbidity of lung cancer?
What mix of zero-coupon, non-callable, A+ munis fits my risk tolerance?
Why is it so hard for computers to understand us, and vice versa?


Main point: So why is it so hard for IT to meet this urgent need? Much of the difficulty is rooted in the fact that computers speak a different language than humans. Computers handle precise numbers and everything they do is built on this root capability. People handle words and ideas which in and of themselves have no inherent meaning. Think about the seeming contradiction of phrases like this and the inherent difficulty in making sense of statements like this for computers that are built to be great at crunching numbers.

Further speaking points: Natural language is implicit: the exact meaning is not completely and exactly indicated. It is highly dependent on context: what has been said before, the topic, and how it is being discussed (factually, figuratively, fictionally, etc.). A literal reading of the sentence at the bottom might lead to the conclusion that Jack Welch and Renoir were colleagues.

Moreover, natural language is often imprecise: it does not have to treat a subject with numerical precision. Humans naturally interact and operate all the time with different degrees of uncertainty and fuzzy associations between words and concepts. We use huge amounts of background knowledge to reconcile and interpret what we read.

Consider these examples. It is one thing to build a database table to exactly answer the question "Where was someone born?". The computer looks up the name in one column and is programmed to know that the other column contains the birthplace. STRUCTURED information, like this database table, is designed for computers to make simple comparisons and is exactly as accurate as the data entered into the database. Natural language is created and used by humans, for humans. It lacks the exact structure and meaning that computer programs typically use to answer questions. Understanding what is being represented is a whole other challenge for computer programs.

Consider this sentence: it implies that Albert Einstein was born in Ulm, but there is a whole lot the computer has to do to figure that out with any degree of certainty. It has to understand sentence structure, parts of speech, the possible meanings of words and phrases, and how they relate to the words and phrases in the question. What do a remembrance, a watercolor, and an Otto have to do with where someone was born?

Additional information: Consider another question in the Jeopardy! style, "X ran this?", and this potentially answer-bearing sentence. Read the sentence. Does it answer the question for Jack Welch? What does "ran" have to do with leadership or painting? How would a computer confidently infer from this sentence that Jack Welch ran GE? It might be easier to deduce that he was at least a painter there.
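The difficulty these notes describe can be made concrete with a toy experiment. This minimal sketch (the helper and scoring scheme are hypothetical, not Watson's code) shows that naive keyword overlap cannot connect "Welch ran this?" to the painter sentence: nothing in the sentence literally says "ran".

```python
import re

def tokens(text):
    """Lowercase word set, punctuation stripped."""
    return set(re.findall(r"[a-z]+", text.lower()))

def keyword_overlap(question, sentence):
    """Score a candidate sentence by the words it shares with the question."""
    return len(tokens(question) & tokens(sentence))

question = "Welch ran this?"
evidence = ("If leadership is an art then surely Jack Welch has proved "
            "himself a master painter during his tenure at GE.")

# Only "welch" overlaps; "ran" never appears, so a literal matcher cannot
# see that "tenure at GE" implies Welch led the company.
print(keyword_overlap(question, evidence))  # 1
```

A system that answers such questions has to reason over meaning ("tenure" implies leadership) rather than over surface strings, which is exactly the gap Watson's evidence-scoring approach targets.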


Watson is a massively parallel system

Answer Scoring

Models

Responses with Confidence

Inquiry

Evidence Sources


Primary Search

Candidate Answer Generation
Hypothesis Generation
Hypothesis and Evidence Scoring
Final Confidence Merging & Ranking
Synthesis

Answer Sources
Inquiry/Topic Analysis

Evidence Retrieval
Deep Evidence Scoring
Learned Models help combine and weigh the evidence

Hypothesis Generation
Hypothesis and Evidence Scoring

Inquiry Decomposition
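The stages named in this diagram can be sketched end to end. The following is a deliberately naive, hypothetical stand-in for the DeepQA-style flow (the sources, helpers, and scoring are made up; the real system runs hundreds of parallel scorers):

```python
def analyze_inquiry(inquiry):
    """Inquiry/topic analysis stand-in: extract lowercase keywords."""
    return [w.strip("?.,").lower() for w in inquiry.split()]

def primary_search(keywords, sources):
    """Primary search: keep passages that share at least one keyword."""
    return [p for p in sources if any(k in p.lower() for k in keywords)]

def generate_candidates(passages):
    """Candidate answer generation: capitalized words as naive candidates."""
    cands = set()
    for p in passages:
        for w in p.split():
            if w[0].isupper():
                cands.add(w.strip(".,"))
    return cands

def score_candidate(candidate, passages):
    """Deep evidence scoring stand-in: count supporting passages."""
    return sum(candidate in p for p in passages)

def answer(inquiry, sources):
    """Final confidence merging & ranking: best-supported candidates first."""
    keywords = analyze_inquiry(inquiry)
    passages = primary_search(keywords, sources)
    cands = generate_candidates(passages)
    return sorted(cands, key=lambda c: -score_candidate(c, passages))

sources = [
    "Einstein was born in Ulm, in Germany.",
    "Ulm lies on the Danube.",
]
print("Ulm" in answer("Where was Einstein born?", sources))  # True
```

Note that this single naive scorer cannot break the tie between "Ulm", "Germany", and even "Einstein" itself; that is precisely why DeepQA merges many independent evidence scores with learned models before ranking.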


Attribute extraction
Medications
Symptoms

Diseases

Modifiers
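The attribute types listed on this slide can be illustrated with a minimal dictionary-based extractor. This is a hypothetical sketch, not IBM's clinical NLP; the vocabularies and the example note are invented for illustration:

```python
# Tiny hand-built vocabularies (made up for this sketch).
MEDICATIONS = {"metformin", "aspirin"}
SYMPTOMS = {"cough", "fatigue"}
DISEASES = {"diabetes", "copd"}
MODIFIERS = {"chronic", "severe", "mild"}

def extract_attributes(note):
    """Bucket the words of a free-text note into the four attribute types."""
    words = {w.strip(".,;").lower() for w in note.split()}
    return {
        "medications": sorted(words & MEDICATIONS),
        "symptoms": sorted(words & SYMPTOMS),
        "diseases": sorted(words & DISEASES),
        "modifiers": sorted(words & MODIFIERS),
    }

note = "Patient with chronic COPD reports severe cough; taking metformin."
print(extract_attributes(note))
# {'medications': ['metformin'], 'symptoms': ['cough'],
#  'diseases': ['copd'], 'modifiers': ['chronic', 'severe']}
```

A production extractor works on context, negation, and ontologies rather than bare word lists, but the output shape (typed attributes pulled from narrative text) is the same idea the slide shows.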


Imagine if you could read, understand and recall all relevant data


Ingest relevant data across a broad domain to create a repository
Understand complex English-language inquiries
Identify potential responses
Rank each option with confidence and supporting evidence
Learn and adapt
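The "rank each option with confidence" step can be sketched as a learned model that combines evidence features, as the diagram's "learned models help combine and weigh the evidence" caption describes. This hypothetical logistic scorer uses invented feature names and weights purely for illustration:

```python
import math

# Hand-set weights standing in for a trained model (made up for this sketch).
WEIGHTS = {"keyword_match": 1.5, "passage_support": 2.0, "type_match": 1.0}
BIAS = -2.0

def confidence(features):
    """Combine evidence-feature scores into a single confidence in (0, 1)."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))  # logistic squashing

strong = {"keyword_match": 0.9, "passage_support": 0.8, "type_match": 1.0}
weak = {"keyword_match": 0.2, "passage_support": 0.1, "type_match": 0.0}

print(round(confidence(strong), 2))  # well-supported candidate scores high
print(round(confidence(weak), 2))    # poorly supported candidate scores low
```

In the real system the weights come from training against known question/answer pairs, which is what lets Watson "learn and adapt" as the list above says.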


Artificial Intelligence?


Artificial Intelligence

"the theory and development of computer systems able to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages."


Today's computing systems are challenged to extract real-time actionable information from complex and cluttered sensor data at very low power. Biological systems, on the other hand, have evolved to be extremely efficient at making sense of raw sensory information in real time, at power levels orders of magnitude lower than conventional computing.

IBM's SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics) chip, with a brain-inspired computer architecture, is powered by an unprecedented one million neurons and 256 million synapses. Such a differentiated architecture, in both hardware and software, can enable neural computation to be deployed where the sensory data is: at the edge, or at massive scale within the cloud.


Emerging Compute Fabrics

2D Grid of Qubits
Cryptography

Physics and Quantum Chemistry

Material and Drug Design

Challenge exactness: binary synapse, 8-bit weight
Challenge synchrony: asynchronous, event-driven design
Challenge error-free computing: training in probability space
Push the limits of energy-efficient implementations (customized)
Result: 100× power reduction

IBM TrueNorth: 1M neurons, 256M synapses, 5.4B transistors, real-time, 73 mW
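The event-driven, low-precision design the slide lists can be illustrated with a toy spiking neuron. This leaky integrate-and-fire sketch is illustrative only; TrueNorth's actual neuron model and parameters are more elaborate, and the weights and thresholds here are invented:

```python
def lif_run(spike_trains, weights, threshold=100, leak=5):
    """Toy leaky integrate-and-fire neuron over a sequence of ticks.

    Binary input spikes are weighted by small integer synapse weights,
    the membrane potential leaks toward rest each tick, and the neuron
    emits an output spike (and resets) when it crosses the threshold.
    """
    potential = 0
    out = []
    for spikes in spike_trains:  # one entry per tick
        potential += sum(w for s, w in zip(spikes, weights) if s)
        potential = max(0, potential - leak)  # leak toward rest
        if potential >= threshold:
            out.append(1)   # fire...
            potential = 0   # ...and reset
        else:
            out.append(0)
    return out

weights = [60, 40, 30]                                # three input synapses
ticks = [[1, 1, 0], [0, 0, 0], [1, 0, 1], [1, 1, 1]]  # binary spike trains
print(lif_run(ticks, weights))  # [0, 0, 1, 1]
```

Because the neuron only does work when spikes arrive and never uses high-precision arithmetic, this style of computation is what lets a chip like TrueNorth run in real time at tens of milliwatts.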


An example of an emerging compute fabric is the SyNAPSE neuromorphic chip. This brain-inspired chip enables sensory perception in mobile and IoT applications by implementing a low-power, scalable architecture. The most recent version is the IBM TrueNorth chip, which contains 1M neurons and 256M synapses implemented using 5.4B transistors while consuming only 73 mW. TrueNorth achieves this by challenging several requirements that constrain traditional compute fabrics, including exactness of data representation, synchronicity of events, error tolerance, and energy/frequency optimization. As part of a cognitive hardware and software ecosystem, this technology creates new possibilities for transformative applications and devices with sensory perception. These include applications in robotics, healthcare, public safety, environmental protection, and many others.

Another example of an emerging compute fabric is the quantum computer. Quantum computers promise exponential leaps in speed and power by exploiting quantum superposition to represent multiple states simultaneously. Some of the advantages are obvious: speed to handle process-intensive workloads and the power to scale out depending on business need. But the real differentiator is that these benefits compound quantum computing's true strength, which is an entirely new way to tackle problems. Harnessing such capabilities would provide extraordinary business advantages by accelerating innovation and solving problems that are unsolvable today in areas as diverse as pharmaceuticals, encryption, and materials discovery.
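The superposition idea mentioned above can be sketched in a few lines. This toy statevector example (pure Python, not IBM's quantum stack) applies a Hadamard gate to a single qubit, after which both basis states are represented simultaneously with equal measurement probability:

```python
import math

def hadamard(state):
    """Apply a Hadamard gate to a one-qubit statevector (a0, a1)."""
    a, b = state  # amplitudes of |0> and |1>
    r = 1 / math.sqrt(2)
    return (r * (a + b), r * (a - b))

state = hadamard((1.0, 0.0))              # start in |0>, gate creates superposition
probs = [abs(amp) ** 2 for amp in state]  # Born rule: probability = |amplitude|^2
print(probs)                              # both outcomes approximately 0.5
```

With n qubits the statevector has 2^n amplitudes, which is the source of the "multiple states simultaneously" scaling the note describes, and also why simulating quantum machines classically becomes intractable.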


Brain-inspired systems will allow better analysis of sensory data


Thus far, we have talked about opportunities to create new value by connecting, curating, and analyzing data.

But if you remember where we started, we spoke about the growth of data at the edge and noted that this data, which is rapidly growing, has tremendous, untapped value. We noted that much of it, arguably the majority, is sensory data, which is growing from the proliferation of cameras, smart phones, and instrumented vehicles. And, that it is most useful if it can be acted upon extremely quickly - within milliseconds.

The new compute platform will create one way to utilize the resources of the collective to act upon this data in near-real-time fashion.

But real-time reaction to sensory data is quite challenging today, since it is very different from the data that typical computers are designed to process. It has little inherent structure, a great deal of noise, and an extremely high ratio of raw bits to useful information.

I'd like to close by giving you a peek a little further into the future. Personally, I find this one mind-blowing, and in a moment you'll understand why.

IBM Research is investing deeply in the development of a new form of computing called neuromorphic. Or, in plain language, brain-inspired systems.

People have struggled for decades to make conventional computers do the things that animal brains can do. This has proven to be extremely challenging. That's because the brain has evolved to receive and handle humongous amounts of real-time, noisy spatio-temporal data from a wide variety of sensors. It recognizes patterns in these inputs and executes a range of actions, mostly autonomically.

Amazingly, the human brain does all of this while consuming only about 20W of power, processing complex images and making complex decisions in real time.

We have already built a new chip with a brain-inspired computer architecture, powered by an unprecedented one million neurons and 256 million synapses. It is really a network of very specialized, extremely low-power cores trained, or imprinted, with specific tasks like recognizing colors, sounds and patterns.

This new chip consumes only 73 mW during real-time operation. That is still not as efficient as the brain, but it uses orders of magnitude less energy than traditional chips.

Our long-term goal is to build a neuro-synaptic chip system with ten billion neurons and one hundred trillion synapses, all while consuming only one kilowatt of power and occupying less than two liters of volume.

Neuromorphic Image Matching


Recognition


Brain-inspired devices will play diverse, critical roles


Here is just a small sampling of ways that neuromorphic, or brain-inspired, computing may be applied in the real world:

Public cameras are now widely considered an important part of public safety and crime prevention, especially in large urban centers. But they still rely heavily on human monitoring, which becomes more challenging as input sources multiply exponentially, compounded by the fact that the information contained in these video feeds often needs to be understood and acted upon within a matter of seconds. What if the system could recognize patterns, shapes, images and behaviors in real time, and automatically trigger preventative or rescue actions? Brain-inspired chips hold the potential to do that on our behalf and to augment public safety measures.

Mobile devices are a rich source of sensor data. But, while mobile processors are power-efficient, they must remain powered down most of the time. A mobile processor will drain a typical (1440mAh) battery in 2-5 hours if continually active. Brain-inspired hardware could enable always-on operation for sensory processing and context generation.

Today's most fully instrumented automobiles generate ~1 gigabyte/second of sensor data. A brain-chip-equipped vehicle could develop far more complex models of a scene for driver assistance and self-driving, sensing and alerting car control systems to dangerous situations that can be handled autonomically. For this to make a difference, such systems would need to respond in as little as 200 milliseconds. Neuromorphic chips are capable of that. (Think about how you sense not to step off the curb into oncoming traffic; to be clear, the human brain does it before you are conscious of it!)

Looking just a bit farther ahead, into the not-too-distant future, we see an unlimited number of ways to use brain-inspired devices.

For example, for a visually impaired individual, a simple pair of glasses can become a mobile sensing platform, capable of object recognition and scene understanding, and going beyond that to understanding human gestures, emotional state and affect, prompting the wearer with rich audio cues. For a hearing-impaired individual, the glasses could process the auditory environment in the same way, providing visual prompts.

There are myriad applications for neuromorphic computing in sensing the environment. For example, we envision jellyfish-like sensory buoys that float on the ocean and collect a wide range of information, such as temperature, air pressure and humidity, and also conduct duties such as tsunami monitoring.

And what about sensing areas that are too dangerous for people to enter? We can imagine a digital tumbleweed that could roll itself into a disaster scene, perhaps the aftermath of a forest fire or a chemical incident, and report back extensive data in real time, based on a host of onboard sensors.

We are tremendously excited about this area of innovation, and truly see no end to possible applications.

Leading in the next era of computing
Advanced Image Recognition

Personality Analytics
Symbiotic Cognitive Platforms

Neurosynaptic Brain-inspired Computing

Advanced Dialogue & Reasoning

Exascale Data-centric Systems
Quantum Computing
Hybrid Cloud
Physical Analytics
Curation at Scale
Personalized Learning

0-18 months / 18 months-3 years / 3-10+ years

Ad Hoc Systems


What we've covered in this presentation is just the beginning. IBM is doing forward-looking work in a diverse range of areas that will radically change and improve the ways we manage and curate data, analyze and understand it, and ultimately extract value from it.

Timeframe: 0-18 months
Personality analytics: use of psycho-linguistic analytics on the ways people express themselves (e.g., in social media) to develop very accurate personality profiles.
Hybrid cloud: seamless composition of cloud computing services across public and private resources.
Personalized learning: cross-linked models of learner, content, learning and pedagogy, providing insights to content designers and supporting personalized approaches to learning.
Curation at scale: reducing time-to-value for insights by integrating, enriching, aligning, reclassifying and using other techniques on data, tailoring it for multiple use cases.

Timeframe: 18 months-3 years
Physical analytics: creating a deeper understanding of the physical nature of things through modeling, and using underlying principles of the physics of real-world systems to make sense of data.
Symbiotic cognitive platforms: rich human interaction with big data in a cognitive environment, leading to accelerated decisions and reduced bias.
Advanced image recognition: giving systems vision more like ours, only tireless and less easily distracted.
Exascale data-centric systems: a new architecture that moves compute to data at all levels of the systems hierarchy, reaching exascale (one quintillion operations per second) before the end of the decade (~2018).
Advanced dialogue and reasoning: systems that can debate a human being using natural language, able to interact, reason and persuade.

Timeframe: 3-10+ years
Quantum computing: an entirely new type of technology with estimated speed-up factors that defy belief (~40 orders of magnitude), running algorithms that would take decades on today's processors in only a matter of minutes.
Ad hoc systems: dynamically creating the equivalent of a cloud in every user's pocket, based on dramatically increasing processing power and networking capacity in mobile devices.
Neurosynaptic brain-inspired systems: advanced work to create cognitive technologies with performance and efficiency approaching that of the human brain.

Wetware


Prediction is difficult... especially about the future. (Niels Bohr)


Main point: Join the conversation and take the next step.

Further speaking points: Get involved and learn more about the ways Watson can help your business today. Learn more on the web. Join the conversation on Twitter and Facebook. See how Watson was created and the real impact it is having on YouTube. And above all, contact your IBM representative to discuss your priorities and goals and how Watson can play a part in meeting them.