The document summarizes the evolution of artificial intelligence (AI) from the 1950s to the present. It discusses three waves of AI development: handcrafted knowledge, statistical learning, and contextual adaptation. Recent advances are driven by increased computing power, data availability, and new algorithms. Deep learning is increasingly important, with applications in voice control, natural language processing, and computer vision. While AI has great potential, a shortage of talent and data is creating a bifurcated ecosystem with large technology firms at the top.
2. The evolution of Artificial Intelligence (AI) since the 1950s has gone through several phases
Source: Artificial Intelligence by Giorgio Fumera from University of Cagliari | AI: 15 Key Moments in the Story of Artificial Intelligence by BBC | Future Progress in Artificial Intelligence: A
Survey of Expert Opinion
Early Explorations (1950s–1960s)
Foundational ideas and first programs, from the Turing test to the Dartmouth conference
Great Expansion (mid 1960s–early 1980s)
Moving from solving toy problems to real-world tasks, including game playing, theorem proving, natural language processing (NLP) and recognizing objects in images
Further Advancement (mid 1980s–1990s)
Research funding from DARPA's Strategic Computing program in the US, the Fifth Generation Computer Systems project in Japan and ESPRIT in Europe; technical and theoretical advances, including the rise of machine learning
Recent Developments (mid 1990s– )
Many successful commercial applications, including games, robots, driverless vehicles, smart homes, recommender systems, automated trading, translation systems, aircraft autopilots, fraud detectors, search engines, etc.
Key milestones:
1950: Alan Turing introduced the Turing test
1956: The term "Artificial Intelligence" was coined at the Dartmouth conference
1966: MIT's "summer vision project"
1969: Shakey mobile robot
1973: The "AI winter" set in
1981: AI's commercial value started to be realized
1990: Rodney Brooks was inspired by advances in neuroscience, which had started to explain the mysteries of human cognition
1997: IBM's Deep Blue defeated the world chess champion
2002: iRobot released the first home robot
2011: IBM's Watson won "Jeopardy!"
2016: Google's AlphaGo defeated the Go world champion
2012: Google Brain recognised a cat from millions of unlabeled videos
The Future: AI experts estimate a 90% chance of machines achieving human-level intelligence by 2075, and superintelligence within 30 years of attaining human-level intelligence
3. These phases can be clustered into three waves of AI development: handcrafted knowledge, statistical learning and contextual adaptation
Source: A DARPA Perspective on AI
The DARPA framework evaluates each wave along four capabilities: perceiving, learning, abstracting and reasoning.
AI Wave 1: Handcrafted Knowledge
Systems that have established sets of rules to represent knowledge in well-defined domains
• Examples: logistics program scheduling, game-playing programs
• Features: Enables reasoning over narrowly defined problems. No learning capability and poor handling of uncertainty
• Challenges: The structure of the knowledge is defined by humans, and only the specifics are explored by the machine; e.g. the failure of autonomous cars in the DARPA Grand Challenge
AI Wave 2: Statistical Learning
Systems based on statistical models developed to address specific challenges and trained using big data
• Examples: voice recognition, face recognition
• Features: Nuanced classification and prediction capabilities, but no contextual capability and minimal reasoning ability
• Challenges: Statistically impressive but individually unreliable; inherent flaws can be exploited; skewed training data creates maladaptation; "black box" behaviour
AI Wave 3: Contextual Adaptation
Systems that construct contextual explanatory models for classes of real-world phenomena
• Examples: image recognition
• Features: Ability to perceive, learn, abstract and reason
• Models generate explanations of how an object might have been created, to explain and drive decisions
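The contrast between the first two waves can be sketched in a few lines of Python (a minimal sketch; the filter words, rules and training messages are all invented for illustration): a wave-1 system applies rules a human wrote down, while a wave-2 system estimates statistics from labeled examples.

```python
# Wave 1: handcrafted knowledge - a human writes the rules directly.
def rule_based_spam_filter(message):
    # The knowledge ("these words indicate spam") is hand-coded, not learned.
    spam_words = {"prize", "winner", "free"}
    return any(word in spam_words for word in message.lower().split())

# Wave 2: statistical learning - estimate word statistics from labeled data
# instead of relying on hand-written rules.
def train_word_spam_rates(examples):
    counts = {}  # word -> (times seen in spam, times seen overall)
    for message, is_spam in examples:
        for word in set(message.lower().split()):
            spam_count, total = counts.get(word, (0, 0))
            counts[word] = (spam_count + int(is_spam), total + 1)
    return counts

def statistical_spam_filter(message, counts):
    # Spam if the message's known words appeared mostly in spam during training.
    rates = []
    for word in message.lower().split():
        if word in counts:
            spam_count, total = counts[word]
            rates.append(spam_count / total)
    return bool(rates) and sum(rates) / len(rates) > 0.5

training = [("win a free prize now", True),
            ("free prize waiting for you", True),
            ("meeting moved to tuesday", False),
            ("see you at lunch", False)]
counts = train_word_spam_rates(training)
print(rule_based_spam_filter("claim your prize"))         # → True
print(statistical_spam_filter("free prize inside", counts))  # → True
```

The statistical version also inherits wave-2 weaknesses: with tiny or skewed training data, its per-word rates are individually unreliable even when aggregate accuracy looks good.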
4. These waves of AI development are advancing rapidly, driven by machine learning and deep learning
Factors driving the rapid advancement of AI:
• Faster and more powerful computation (GPUs)
• Greater data availability
• Development of new algorithms
• Tech giants opening up resources to enable others to develop better AI (e.g. TensorFlow, Amazon AI)
• Availability of cloud-based infrastructure
Source: The Fourth Industrial Revolution: a Primer on Artificial Intelligence by David Kelnar
Artificial Intelligence: human intelligence exhibited by machines
Focal Areas of AI:
• Reasoning
• Knowledge
• Planning (including navigation)
• Natural language processing
• Perception
Machine Learning: statistical techniques that enable machines' predictions to improve with experience
Beyond deep learning, it includes various approaches:
• Random forests: create multitudes of decision trees to optimise a prediction
• Bayesian networks: use a probabilistic approach to analyze variables and the relationships between them
• Support vector machines: are fed categorized examples and build models that assign new inputs to one of the categories
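To make the random-forest idea concrete, here is a minimal sketch in pure Python (the 1-D toy data and one-split "stumps" are a deliberate simplification of real decision trees): each stump is trained on a bootstrap resample, and the forest predicts by majority vote.

```python
import random

def train_stump(sample):
    # A "stump" is a one-split decision tree: pick the threshold on x that
    # best separates class 0 (small x) from class 1 (large x) in this sample.
    def accuracy(t):
        return sum(int(x > t) == y for x, y in sample) / len(sample)
    return max((x for x, _ in sample), key=accuracy)

def train_forest(data, n_trees=25, seed=0):
    # Random-forest idea: train each stump on a bootstrap resample of the
    # data, so the trees differ and their individual errors partly cancel.
    rng = random.Random(seed)
    return [train_stump([rng.choice(data) for _ in data])
            for _ in range(n_trees)]

def predict(forest, x):
    # Majority vote across the ensemble.
    votes = sum(int(x > t) for t in forest)
    return int(votes * 2 > len(forest))

# toy 1-D data: class 0 below ~5, class 1 above
data = [(1, 0), (2, 0), (3, 0), (4, 0), (6, 1), (7, 1), (8, 1), (9, 1)]
forest = train_forest(data)
print(predict(forest, 2), predict(forest, 8))  # → 0 1
```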
Deep Learning:
• Models the brain with an artificial 'neural network', a collection of neurons connected together
• Useful because the algorithm itself undertakes feature specification (defining the features to analyze from the data) and optimization (weighing the data to deliver an accurate prediction)
How the terms nest: Artificial Intelligence is the broadest term; Machine Learning is a subset of AI; Deep Learning is a subset of machine learning.
5. Trend 1: Looking ahead, deep learning is expected to be an important technique in AI
• Deep learning uses neural networks loosely analogous to the observed behaviour of a biological brain's axons
• A deep network consists of multiple hidden layers between the input and output layers, which are trained to break the characteristics of the data down into multiple parts and combine all the layers at the end to provide the output
• Input layer: where data is fed into the network
• Hidden layers: where information is processed
• Output layer: where results come out
Source: Computer Science: The Learning Machines by Nature | The Fourth Industrial Revolution: a Primer on Artificial Intelligence by David Kelnar | From not Working to Neural Networking by The Economist | Analytics Vidhya | Artificial Intelligence: 10 Trends to Watch in 2017 and Beyond by Tractica | Learning Deep Architectures for AI by Yoshua Bengio
Neural networks: an illustration of using deep learning neural networks for facial recognition
1. Identify pixels of light and dark
2. Learn to identify edges and shapes
3. Learn to identify more complex shapes and objects
4. Learn which shapes and objects define a human face
Deep Learning Characteristics
• Deep learning helps reduce the time and effort spent on feature engineering. It is increasingly used in conjunction with machine learning, natural language processing (NLP), computer vision or machine reasoning
• Deep learning is typically employed for feature extraction on a larger or more complex set of data, while machine learning algorithms perform basic clustering or regression learning tasks once the features of the data have been determined
• The performance of deep neural networks (DNNs) has been demonstrated to increase linearly with the number of DNN layers, necessitating that the hardware used to process and train these algorithms also grow in scale
Image source: Andrew Ng
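The input → hidden → output flow described above can be sketched as a single forward pass (the weights below are arbitrary illustrative numbers, not a trained model):

```python
import math

def sigmoid(x):
    # Nonlinear activation squashing any real number into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, w_hidden, w_output):
    # Hidden layer: each neuron takes a weighted sum of the inputs,
    # then applies the nonlinearity.
    hidden = [sigmoid(sum(wi * xi for wi, xi in zip(w, x))) for w in w_hidden]
    # Output layer: combine the hidden activations into a single result.
    return sigmoid(sum(wo * h for wo, h in zip(w_output, hidden)))

# toy network: 2 inputs -> 3 hidden neurons -> 1 output
w_hidden = [[0.5, -0.6], [0.1, 0.8], [-0.3, 0.2]]
w_output = [0.7, -0.4, 0.9]
y = forward([1.0, 0.0], w_hidden, w_output)
print(round(y, 3))
```

Real deep networks simply stack many such hidden layers and learn the weights from data rather than fixing them by hand.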
6. Trend 2: There are different types of learning, with semi-supervised
and reinforcement learning gaining traction
Source: Three Things You Need to Know About Machine Learning by Medha Agarwal | Machine Learning by TechJini
Learning Types
Supervised
• Given pre-determined features and labeled data
• Direct feedback
• Predicts an outcome
• Example: traditional insurance underwriting
• Challenge: needs a large amount of labeled data, which is costly and time-consuming
Semi-supervised
• A blend of supervised and unsupervised learning
• Used for situations in which there is some labelled data, but not a lot
• Example: Gmail spam filtering
• Expected to see increasing usage for large data sets where data labelling is an issue
Unsupervised
• Given unlabelled and unstructured data
• No feedback
• Finds hidden structure
• Example: customer segmentation
• Challenge: tends to be less accurate
Reinforcement
• Experience-driven sequential decision-making
• Occasional feedback in the form of a reward
• Examples: games (AlphaGo), robots, autonomous driving
• Challenge: requires a significant amount of data
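Reward-driven learning can be illustrated in miniature with an epsilon-greedy agent on a two-armed bandit (a hedged sketch; the arm reward probabilities are made up for the example). The agent never sees labels, only occasional rewards, and must balance exploring with exploiting its current estimates.

```python
import random

def run_bandit(reward_probs, steps=5000, epsilon=0.1, seed=42):
    # Epsilon-greedy reinforcement learning on a multi-armed bandit.
    rng = random.Random(seed)
    estimates = [0.0] * len(reward_probs)  # estimated value of each arm
    counts = [0] * len(reward_probs)
    for _ in range(steps):
        if rng.random() < epsilon:                 # explore: random arm
            arm = rng.randrange(len(reward_probs))
        else:                                      # exploit: best estimate so far
            arm = max(range(len(reward_probs)), key=lambda a: estimates[a])
        reward = 1.0 if rng.random() < reward_probs[arm] else 0.0
        counts[arm] += 1
        # incremental mean update of the arm's value estimate
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
    return estimates

# hypothetical arms: arm 1 pays off 80% of the time, arm 0 only 30%
estimates = run_bandit([0.3, 0.8])
print(max(range(2), key=lambda a: estimates[a]))  # → 1 (the better arm)
```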
Learning Techniques
Shallow
• Algorithms with fewer layers, for instance logistic regression and support vector machines
• Better for relatively less complex and smaller datasets
Deep
• A newer technique that uses many layers of neural networks
• Useful for complex target functions and large datasets
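As a sketch of a shallow technique, here is logistic regression fitted by gradient descent on a tiny invented 1-D dataset; note there are no hidden layers between input and output.

```python
import math

def train_logistic(xs, ys, lr=0.5, epochs=2000):
    # Logistic regression: a "shallow" model, just a weight and a bias.
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted probability
            # gradient of the log-loss for this single example
            w -= lr * (p - y) * x
            b -= lr * (p - y)
    return w, b

def predict(w, b, x):
    return int(1.0 / (1.0 + math.exp(-(w * x + b))) >= 0.5)

# tiny linearly separable data: small x -> class 0, large x -> class 1
xs, ys = [0.0, 1.0, 3.0, 4.0], [0, 0, 1, 1]
w, b = train_logistic(xs, ys)
print([predict(w, b, x) for x in [0.5, 3.5]])  # → [0, 1]
```

A deep model would replace the single weighted sum with many stacked layers, which is what lets it fit far more complex target functions.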
7. Trend 3: Shift from GPUs to AI-optimized hardware
Source: Expect Deeper and Cheaper Machine Learning by IEEE Spectrum | How AI is Shaking up the Chip Market by Wired | A Machine Learning Landscape by Karl Freund from Moor Insights & Strategy | Artificial Intelligence: 10 Trends to Watch in 2017 and Beyond by Tractica
• Graphics processing units (GPUs) have been the dominant hardware platform for AI applications and are expected to drive advances in performance,
especially for high-performance deep learning systems
• At the same time, alternative hardware platforms like field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), and specialized processor architectures are emerging to compete with GPUs on performance, cost, and power consumption
• As AI algorithms advance to handle applications with dynamic inputs (e.g. autonomous driving, personalized medicine), the evolving nature of algorithms and workloads will determine the best-suited architecture. Processors will increasingly be "right-sized" to align capabilities and cost with specific workloads.
Hardware across the Machine Learning Landscape
There are two key aspects:
Training refers to training the neural network with
massive amounts of sample data. It is typically
performed in large datacenters on GPUs, almost
exclusively provided by NVIDIA for the time being.
Inference refers to using a trained model to produce outputs on real-world data.
• It is usually done at the application or client end point, rather than on the server or cloud
• Inference requires fewer hardware resources and, depending on the application, can be performed using a central processing unit or non-specialized hardware such as FPGAs, ASICs, digital signal processors, etc.
• There are rising expectations that inference will move locally to mobile devices
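The training/inference split above can be sketched in miniature (the model and data are invented for illustration): training is the expensive step that fits parameters from data, while inference is a cheap function application that can run wherever the parameters are shipped.

```python
# Training: the expensive part - process the data to fit parameters.
# In practice this runs on GPUs in a datacenter.
def train(points):
    # Fit y = w * x by least squares (closed form for this toy model).
    num = sum(x * y for x, y in points)
    den = sum(x * x for x, _ in points)
    return num / den  # the learned parameter "w"

# Inference: the cheap part - apply the trained parameter to new input.
# Only this function (and w) needs to ship to a phone or edge device.
def infer(w, x):
    return w * x

w = train([(1, 2.1), (2, 3.9), (3, 6.0)])  # done once, centrally
print(round(infer(w, 10), 1))  # deployed prediction for a new input
```

This asymmetry is why inference can tolerate smaller, cheaper, lower-power chips than training.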
Image source: Moor Insights & Strategy
8. • Rapid advances in machine learning and NLP have also enabled voice control to become more practical
• Companies like Baidu, Apple, Google, Microsoft, and Amazon are making significant progress in voice recognition using DNNs
• Coupled with open-source tools, voice control may soon be sufficiently reliable for interacting with an array of devices, robots, and home appliances
But some challenges remain:
• The complexity, subtlety and power of language
• An analysis of the way people use Alexa and Google's assistant platforms shows that third-party apps have not been well used, nor particularly sticky
Trend 4: Machine learning and natural language processing give rise to voice as the next conversational interface
[Chart: improvement of word-accuracy rates by voice platform, in the past vs. now & future]
Developers published over 10,000 skills on Amazon Alexa by 23 Feb 2017:
May 2016: 950 | Jun 2016: 1,000 | Jul 2016: 1,400 | Aug 2016: 2,000 | Sep 2016: 3,000 | Oct 2016: 4,000 | Nov 2016: 5,191 | Dec 2016: 6,068 | Jan 2017: 7,053 | Feb 2017: 10,000
Source: Amazon Alexa Now has 10k Skills, Including Europe by Voicebot.ai | MIT Technology Review
Image source: Internet Trends 2016 by KPCB
9. Trend 5: AI is especially important given its ability to solve challenging problems and potentially impact almost every industry
Source: The Fourth Industrial Revolution: a Primer on AI by David Kelnar | AI, Deep Learning, and Machine Learning: A Primer by Frank Chen
Goals of AI (the challenge-resolution capabilities AI brings):
• Generalized Intelligence: broad mental capacity that influences performance on cognitive-ability measures (Examples: emotional intelligence, creativity, intuition)
• Reasoning: solving problems through logical deduction (Examples: legal assessment, financial asset management, games)
• Knowledge: representing knowledge about the world (Examples: medical diagnosis, drug creation, fraud prevention)
• Planning: establishing and achieving goals (Examples: logistics, physical and digital network optimization, predictive maintenance)
• Communication: understanding written and spoken language (Examples: voice control, real-time translation and transcription)
• Perception: deducing things about the world from visual images, sounds, and other sensory inputs (Examples: autonomous vehicles, medical diagnosis, surveillance)
Selected Vertex Portfolio Companies
• Horizon Robotics uses large-scale cloud-based deep neural network algorithms on high-performance, low-power brain processing units for applications in smart homes and autonomous cars (www.horizon-robotics.com)
• Dynamic Yield delivers an end-to-end personalization platform for the eCommerce, media and travel industries, using machine learning to improve targeting capabilities (www.dynamicyield.com)
• Xuebajun is a mobile application that helps students solve homework questions, using Scene Text Recognition (STR) technology and deep learning to improve character-recognition rates (www.xueba100.com)
• LightCyber uses machine learning to map out and monitor all users and devices on a company network, detecting behavioural anomalies and selecting meaningful, actionable alerts for escalation (www.lightcyber.com)
• Maxent provides anti-fraud software as a service based on machine learning techniques (www.maxent-inc.com)
10. Trend 6: At the same time, a dearth of talent and data is driving the emergence of a bifurcated AI industry ecosystem that is top-heavy and long-tailed
A dearth of talent and data in AI is a critical challenge that plays to the strengths of dominant players like big technology firms and financial services companies, creating a bifurcated AI industry ecosystem comprising:
Dominant Players | Technology & Financial Services Companies
• Have access to very large training datasets and the ability to drive the advancement of algorithms
• Focus on highly scalable use cases, like image recognition or patient-data processing, which offer the most significant revenue opportunities or help enhance existing services and products to gain a competitive edge
• Are expected to lead the top-heavy AI ecosystem because of their specialty in high-performance computing systems that power advanced use cases such as predictive maintenance, algorithmic trading or static image recognition
Small & Medium-sized Enterprises (SMEs) | Startups
• Possess relatively smaller data sets
• Focus on using or enhancing existing algorithms (many are increasingly open-sourced) as well as employing high-performance cloud services
• There are many niche applications where AI is expected to add value and startups can compete effectively with the bigger players
Source: Artificial Intelligence: 10 Trends to Watch in 2017 and Beyond by Tractica | Artificial Intelligence is the New Electricity by Andrew Ng | Banks and Tech Firms Battle Over
Something Akin to Gold: Your Data by the New York Times| JPMorgan Software Does in Seconds What Took Lawyers 360,000 Hours by Bloomberg | Wells Fargo Increases Emphasis on
Emerging Technologies by Wells Fargo
11. Final Comments
Vertex has invested in companies across geographies addressing different industry
applications leveraging AI to transform their service offerings. These include Xuebajun in
education, Horizon Robotics in autonomous cars and smart homes, Maxent in fraud
detection, Dynamic Yield in customer personalization and optimization, and LightCyber in
cybersecurity (recently acquired by Palo Alto Networks).
With each investment, we're learning more about successful strategies for using AI to transform industries. We're excited to be active investors in this space and look forward to the journey ahead.
12. Thanks for reading!
Disclaimer
This presentation has been compiled for informational purposes only. It does not constitute a recommendation to any party. The presentation relies on data and
insights from a wide range of sources including public and private companies, market research firms, government agencies and industry professionals. We cite
specific sources where information is public. The presentation is also informed by non-public information and insights.
Information provided by third parties may not have been independently verified. Vertex Holdings believes such information to be reliable and adequately
comprehensive but does not represent that such information is in all respects accurate or complete. Vertex Holdings shall not be held liable for any information
provided.
Any information or opinions provided in this report are as of the date of the report and Vertex Holdings is under no obligation to update the information or
communicate that any updates have been made.
About Vertex Holdings
Vertex Holdings, a member of Temasek Holdings, focuses on venture capital investment opportunities in the information
technology and healthcare markets, primarily through our global family of direct investment venture funds. Headquartered
in Singapore, we collaborate with a network of global investors who specialize in local markets. The Vertex Global Network
encompasses Silicon Valley, China, Israel, India, Taiwan and Southeast Asia.
Contact us: Brian Toh
btoh@vertexholdings.com
Tracy Jin
tjin@vertexholdings.com
James Lee
jlee@vertexholdings.com