Let Android dream electric sheep:
Making emotion model for chat-bot with
Python3, NLTK and TensorFlow
Jeongkyu Shin
Lablup Inc.
Illustration * © Idol M@ster / Bandai Namco Games. All rights reserved.
–Bryce Courtenay's The Power of One
“First with the head, then with the heart.”
Today’s focus
▪ NLP and Sentiment: Big problems when making chatbots
▪ Natural Language Understanding
▪ SyntaxNet and DRAGNN
▪ Emotion reading
▪ SentiWordNet and SentiSpace[1]
▪ Emotion simulation
▪ ML Sentiment engine
[1] Our own definition for sentimental state space
Illustration: http://www.eetimes.com/document.asp?doc_id=1324302
I'm not sure, but I'll try to explain the whole process I went through.
And I assume that you already have experience with / knowledge of machine learning.
Illustration © marioandluigi97.deviantart.com
Things that will not be covered today
▪ Phase space / embedding
dimension
▪ Recurrent Neural Network
(RNN)
▪ GRU cell / LSTM cell
▪ Multi-layer stacking
▪ Batch process for training
▪ Vector representation of
language sentence
▪ Sequence-to-sequence
model
▪ Word2Vec
So we will go through
▪ Understanding Language
▪ Reading Emotion
▪ Mimicking Sentiment
▪ And make it more complex for fun
© Sega. http://www.sega.com
Image from Sonic Wikia : http://sonic.wikia.com/wiki/Sonic_the_Hedgehog
It's hard even for human beings.
Let’s start from
Language Understanding:
Chat-bots with Machine Learning
[Diagram] Lexical Input → Natural Language Processor (deep-learning model: SyntaxNet / NLU, Natural Language Understanding; today's focus!) → Context Analyzer (per-user context memory) → Decision maker (knowledgebase, useful with TF/IDF ask bots) → Response Generator (sentence-to-vector converter; deep-learning model: RNN / sentence-to-sentence) → Lexical Output
Understanding Languages
▪ The structure of language
▪ “Noun” and “Verb”
▪ “Context”
▪ POS (Part-of-speech)
▪ Roles for the words
▪ Added as tags
▪ Only one meaning in the current sentence context
▪ Generalized POS tags
▪ Some POS tags are very common (noun, verb, …)
▪ Others? Quite complicated! (a quick NLTK example below)
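To see POS tags concretely, here is a minimal NLTK sketch (my addition; it assumes the punkt and averaged_perceptron_tagger resources have been downloaded):

import nltk

# One-time downloads for the tokenizer and tagger models:
# nltk.download('punkt'); nltk.download('averaged_perceptron_tagger')
tokens = nltk.word_tokenize("I ate miso soup this morning")
print(nltk.pos_tag(tokens))
# e.g. [('I', 'PRP'), ('ate', 'VBD'), ('miso', 'NN'), ('soup', 'NN'),
#       ('this', 'DT'), ('morning', 'NN')]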
SyntaxNet (2016)
▪ Transition-based framework for natural language processing
▪ Feature extraction
▪ Representing annotated data
▪ Evaluation
▪ End-to-end implementation using deep learning
▪ No language-awareness/dependencies: data-driven
▪ Interesting points
▪ Found general graph structure between different human languages (2016-7)
▪ http://universaldependencies.org
*github.com/tensorflow/tensorflow
Generating NLP with SyntaxNet
[Pipeline] Obtaining data → POS tagging → Training the SyntaxNet POS tagger → Dependency parsing → Transition-based parsing → Training the parser
SyntaxNet implementation
▪ Transition-based dependency parser
▪ SHIFT, LEFT ARC, RIGHT ARC
▪ "derivation": a sequence of transitions
▪ Configuration + Action
▪ Training
▪ Local pre-training / global training
(a toy parser sketch follows)
*Kong et al., (2017)
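To make SHIFT / LEFT ARC / RIGHT ARC concrete, here is a toy arc-standard parser (my own illustration, not SyntaxNet's code); a derivation is simply a sequence of these actions:

def parse(tokens, actions):
    # Apply a derivation (a sequence of transitions) to a sentence.
    stack, buffer, arcs = [], list(tokens), []
    for action in actions:
        if action == 'SHIFT':        # move the next word onto the stack
            stack.append(buffer.pop(0))
        elif action == 'LEFT_ARC':   # stack[-1] becomes the head of stack[-2]
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif action == 'RIGHT_ARC':  # stack[-2] becomes the head of stack[-1]
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

# "I ate soup": 'ate' heads both 'I' and 'soup'.
print(parse(['I', 'ate', 'soup'],
            ['SHIFT', 'SHIFT', 'LEFT_ARC', 'SHIFT', 'RIGHT_ARC']))
# [('ate', 'I'), ('ate', 'soup')]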
SyntaxNet Architecture
*github.com/tensorflow/tensorflow
NMT / PBMT / GNMT
▪ Neural Machine Translation
▪ End-to-end learning for automated translation
▪ Requires extensive computation for both training and inference
▪ GNMT
▪ LSTM network with 8 encoder & 8 decoder layers
▪ Much better than PBMT (Phrase-Based Machine Translation)
*Wu et al. (2016)
DRAGNN (2017)
▪ Dynamic Recurrent Acyclic Graphical Neural Networks (Mar. 2017)
▪ A framework for building multi-task, fully dynamically constructed computation graphs
▪ Not GAN (Generative Adversarial Network)!
▪ Supports
▪ Training and evaluating models
▪ Pre-trained analysis models (McParsey) for 40 languages
▪ Except Korean. (of course; )
▪ Problem?
▪ Python 2 (like most NLP toolkits)
*Kong et al. (2017)
*Android meets TensorFlow, Google I/O (2017)
Model differences
▪ DRAGNN[1]: end-to-end, deep recurrent models
▪ Used to extend SyntaxNet[2] into an end-to-end deep learning model
▪ TBRU: Transition-Based Recurrent Unit
▪ Uses both an encoder and a decoder
▪ TBRU-based multi-task learning: DRAGNN
▪ SyntaxNet: transition-based NLP
▪ Can train SyntaxNet using the DRAGNN framework
[1] Kong et al. (2017)
[2] Andor et al. (2016)
[Figure 1, Kong et al. (2017): high-level schematic of a Transition-Based Recurrent Unit (TBRU): a discrete state, a recurrence function, input embeddings, and a network cell producing activations; common architectures built from multiple TBRUs include Encoder/Decoder (2 TBRUs), Bi-LSTM tagging (3 TBRUs), and Stack-LSTM (2 TBRUs).]
TBRU
▪ Transition-based recurrent unit
▪ Discrete state dynamics: allow network connections to be built dynamically as a function of intermediate activations
▪ Potential of TBRU: extension and combination
▪ Sequence-to-sequence
▪ Attention mechanisms
▪ Recursive tree-structured models
Parsey McParseface
▪ Parsey McParseface (2016)
▪ State-of-the-art deep learning-based text parser
▪ Performance comparison

Dependency parsing accuracy:
Model                     | News  | Web   | Questions
Martins et al. (2013)     | 93.10 | 88.23 | 94.21
Zhang and McDonald (2014) | 93.32 | 88.65 | 93.37
Weiss et al. (2015)       | 93.91 | 89.29 | 94.17
Andor et al. (2016)*      | 94.44 | 90.17 | 95.40
Parsey McParseface        | 94.15 | 89.08 | 94.77

POS (part-of-speech) tagging accuracy, for different language domains:
Model                     | News  | Web   | Questions
Ling et al. (2015)        | 97.44 | 94.03 | 96.18
Andor et al. (2016)*      | 97.77 | 94.80 | 96.86
Parsey McParseface        | 97.52 | 94.24 | 96.45
*github.com/tensorflow/tensorflow
McParseface model / DRAGNN framework
*github.com/tensorflow/tensorflow
Where is Korean?
▪ Korean language-specific characteristics
▪ Order / sequence is not important; distance is important
▪ However, SyntaxNet works even with raw Korean sequences!
▪ Even Japanese translation is supported
▪ Better solution?
▪ Now working on a preprocessing NM to increase DRAGNN performance
▪ With `Josa` ML
© Flappy Pepe by NADI games
Now, let's move to the emotion-reading part.
It looks easier, but in fact it's hard.
Problems for next-gen chatbots
▪ Hooray! Deep-learning based chat bots work well in Q&A scenarios!
▪ General problems
▪ Inhuman: restricted to the model's training sets
▪ Cannot "start" a conversation
▪ Cannot handle continuous conversational context and its changes
▪ And, the "Uncanny Valley"
▪ Inhuman speech / conversation. Why? How?
© http://www.fanuc.eu
© http://muscleduck.deviantart.com
So, let's just focus on the details of human beings: what makes us human?
imdb.com
ingorae.tistory.com/1249
I've seen things you people wouldn't believe.
Attack ships on fire off the shoulder of Orion.
I watched C-beams glitter in the dark near the Tannhäuser Gate.
All those moments will be lost in time, like tears in rain. Time to die.
[Diagram] Lexical Input → Disintegrator → NLP + StV → Context analyzer + Decision maker (context parser, context memory, knowledge engine, emotion engine: today's focus!) → Response generator (deep-learning model: sentence-to-sentence + context-aware word generator; grammar generator; tone generator) → Sentence generator → Lexical Output
Conversational context locator
▪ Using Skip-gram and bidirectional 1-gram distribution in recent text
▪ I ate miso soup this morning. => Disintegrate first
▪ Bidirectional 1-gram set (reversible trigram): {(I,miso soup),eat}, {(eat,today),miso soup}, {(miso soup,morning),today}
▪ Simplifying: {(<I>,<FOOD>),<EAT>}, {(<EAT>,today),<FOOD>}, {(<FOOD>,morning),today}
▪ Distribution: more simplification is needed
▪ {(<I>,<FOOD>), <EAT>}, {(<TIME:DATE>,<EAT>), <FOOD>}, {(<FOOD>,<TIME:DAY>),<TIME:DATE>}
▪ Now we can calculate a multinomial distribution (a sketch follows)
I / eat / miso soup / today / morning
<I> / <EAT> / <FOOD> / today / morning
<I> / <EAT> / <FOOD> / <TIME:DATE> / <TIME:DAY>
*I'll use "trigram" as an abbreviation of "reversible trigram"
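A minimal sketch of the reversible-trigram extraction as I read the slide (the function name is mine):

def reversible_trigrams(tokens):
    # Pair each inner token's two neighbors with the token itself:
    # {(left, right), center} in the slide's notation.
    return [((tokens[i - 1], tokens[i + 1]), tokens[i])
            for i in range(1, len(tokens) - 1)]

tokens = ['<I>', '<EAT>', '<FOOD>', '<TIME:DATE>', '<TIME:DAY>']
for neighbors, center in reversible_trigrams(tokens):
    print(neighbors, '->', center)
# ('<I>', '<FOOD>') -> <EAT>
# ('<EAT>', '<TIME:DATE>') -> <FOOD>
# ('<FOOD>', '<TIME:DAY>') -> <TIME:DATE>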
Conversational context locator
▪ Using Skip-gram and bidirectional 1-gram distribution in recent text
▪ 나는 오늘 아침에 된장국을 먹었습니다. ("I ate miso soup this morning.") => Disintegrate first
▪ Bidirectional 1-gram set: {(나,아침),오늘}, {(오늘,된장국),아침}, {(아침,먹다),된장국}
▪ Simplifying: {(<I>,아침),오늘}, {(오늘,<FOOD>),아침}, {(아침,<EAT>),<FOOD>}
▪ Distribution: more simplification is needed
▪ {(<I>,<TIME:DAY>), <TIME:DATE>}, {(<TIME:DATE>,<FOOD>), <TIME:DAY>}, {(<TIME:DAY>
,<EAT>),<FOOD>}
▪ Now we can calculate multinomial distribution
나 오늘 아침 된장국 먹다
<I> 오늘 아침 <FOOD> <EAT>
<I> <TIME:DATE> <TIME:DAY> <FOOD> <EAT>
*I’ll use trigram as abbreviation of reversible trigram
Conversational context locator
▪ Training the context space
▪ Context-marked sentences (>20,000)
▪ Contexts: LIFE / CHITCHAT / SCIENCE / TASK
▪ Prepare generated trigram sets with a context bit
▪ Train an RNN with 1-gram-2-vec
▪ Matching the context space
▪ Input a trigram sequence to the context space
▪ Take the dominant axis
▪ Using Skip-gram and trigram distribution in recent text
▪ {(<I>,<TIME:DAY>), <TIME:DATE>}
▪ {(<TIME:DATE>,<FOOD>), <TIME:DAY>}
▪ {(<TIME:DAY>,<EAT>),<FOOD>}
▪ With the distribution
▪ Calculate maximum-likelihood significance and get the significant n-grams
▪ Uses the last 5 sentences
For better performance
▪ Characteristics of the Korean language
▪ Distance between words: important
▪ Sequence between words: not important
▪ Different from English
▪ How to read more contextual information from longer text (e.g. documents)?
▪ Change from trigrams to in-range tri pairs (see the sketch below)
▪ I ate miso soup this morning:
▪ In range 1: {(<I>,<FOOD>), <EAT>}
▪ In range 2: {(<TIME:DATE>), <EAT>}
▪ In range 3: {(<TIME:DAY>), <EAT>}
▪ Heavily depends on the length of the original sentence: short? long?
<I> / <EAT> / <FOOD> / <TIME:DATE> / <TIME:DAY>
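And the in-range variant, again a sketch under my reading of the slide:

def in_range_pairs(tokens, center, max_range=3):
    # Pair the center token with its neighbors at each distance r.
    i = tokens.index(center)
    pairs = []
    for r in range(1, max_range + 1):
        neighbors = tuple(tokens[j] for j in (i - r, i + r)
                          if 0 <= j < len(tokens))
        if neighbors:
            pairs.append((r, neighbors, center))
    return pairs

tokens = ['<I>', '<EAT>', '<FOOD>', '<TIME:DATE>', '<TIME:DAY>']
for r, neighbors, center in in_range_pairs(tokens, '<EAT>'):
    print('range', r, ':', neighbors, '->', center)
# range 1 : ('<I>', '<FOOD>') -> <EAT>
# range 2 : ('<TIME:DATE>',) -> <EAT>
# range 3 : ('<TIME:DAY>',) -> <EAT>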
Emotion engine
▪ Input: a text sequence
▪ Output: an emotion flag (6-type / 3-bit)
▪ Training set
▪ Sentences with 6-type categorized emotion
▪ Positivity (2), negativity (2), objectivity (2)
▪ Uses SentiWordNet to extract emotion
▪ A 6-axis emotion space built with the Word2Vec model
▪ Current emotion indicator: the most-weighted emotion axis in the Word2Vec model
Illustration © http://ontotext.fbk.eu/
Position in senti-space: [0.95, 0.05, 0.11, 0.89, 0.92, 0.08] → one-hot [1, 0, 0, 0, 0, 0] → flag 0x01 (axis index: 1 2 3 4 5 6; a sketch follows)
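One plausible reading of that mapping, as a tiny sketch (the dominant axis is one-hot encoded, then packed into a bit flag):

def emotion_flag(senti_vector):
    # One-hot the most-weighted axis, then pack it as a small integer flag.
    dominant = max(range(len(senti_vector)), key=lambda i: senti_vector[i])
    onehot = [1 if i == dominant else 0 for i in range(len(senti_vector))]
    return onehot, 1 << dominant

print(emotion_flag([0.95, 0.05, 0.11, 0.89, 0.92, 0.08]))
# ([1, 0, 0, 0, 0, 0], 1)  ->  flag 0x01, as on the slide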
Making emotional context locator
▪ Similar to the conversational context locator
▪ Just use 1-grams from the input
▪ Add the corresponding word vector onto the emotion space
▪ How to?
▪ Use the NLTK Python library
▪ NLTK has corpora / data for SentiWordNet
▪ It also gives a download option!

import nltk
nltk.download()  # opens the interactive downloader; fetch the SentiWordNet corpus there

Downloading the NLTK dataset
Making emotional context locator
▪ Get an emotional flag from a sentence

Sample test routine for the sentimental state:

from nltk.corpus import sentiwordnet as swn

def get_senti_vector(sentence, pos=None):
    result = dict()
    for s in sentence.split(' '):
        if s not in result.keys():
            senti = list(swn.senti_synsets(s.lower(), pos))
            if len(senti) > 0:
                mostS = senti[0]  # take the first (most common) sense
                result[s] = [mostS.pos_score(), 1.0 - mostS.pos_score(),
                             mostS.neg_score(), 1.0 - mostS.neg_score(),
                             mostS.obj_score(), 1.0 - mostS.obj_score()]
    return result

sentence = "Hello I am happy I was super surprised"
result = get_senti_vector(sentence)

Adjectives only:
{'I': [0.0, 1.0, 0.25, 0.75, 0.75, 0.25],
 'happy': [0.875, 0.125, 0.0, 1.0, 0.125, 0.875],
 'super': [0.625, 0.375, 0.0, 1.0, 0.375, 0.625],
 'surprised': [0.125, 0.875, 0.25, 0.75, 0.625, 0.375]}

All morphemes:
{'Hello': [0.0, 1.0, 0.0, 1.0, 1.0, 0.0],
 'I': [0.0, 1.0, 0.0, 1.0, 1.0, 0.0],
 'am': [0.0, 1.0, 0.0, 1.0, 1.0, 0.0],
 'happy': [0.875, 0.125, 0.0, 1.0, 0.125, 0.875],
 'was': [0.0, 1.0, 0.0, 1.0, 1.0, 0.0],
 'super': [0.0, 1.0, 0.0, 1.0, 1.0, 0.0],
 'surprised': [0.125, 0.875, 0.0, 1.0, 0.875, 0.125]}
Creating Korean SentiWordNet
▪ Procedure to generate a Korean SentiWordNet corpus (a compressed sketch follows)

1. Get every synset from the SentiWordNet data:

for i in swn.all_senti_synsets():
    data.append(i)

<maimed.s.01: PosScore=0.0 NegScore=0.0>
<fit.a.01: PosScore=0.5 NegScore=0.0>
<acceptable.s.04: PosScore=0.25 NegScore=0.0>
<suitable.s.01: PosScore=0.125 NegScore=0.0>
<worthy.s.03: PosScore=0.875 NegScore=0.0>
<unfit.a.01: PosScore=0.25 NegScore=0.0>

2. Translate the words into Korean:

<불구의.s.01: PosScore=0.0 NegScore=0.0>
<알맞다.a.01: PosScore=0.5 NegScore=0.0>
<만족스럽다.s.04: PosScore=0.25 NegScore=0.0>
<적합하다.s.01: PosScore=0.125 NegScore=0.0>
<훌륭하다.s.03: PosScore=0.875 NegScore=0.0>
<부적합하다.a.01: PosScore=0.25 NegScore=0.0>

3. Treat synonyms:

<불구의.s.01: PosScore=0.0 NegScore=0.0>
<알맞다.a.01: PosScore=0.5 NegScore=0.0>
<적합하다.a.01: PosScore=0.5 NegScore=0.0>
<어울리다.a.01: PosScore=0.5 NegScore=0.0>
<만족스럽다.s.04: PosScore=0.25 NegScore=0.0>
<적합하다.s.01: PosScore=0.125 NegScore=0.0>
<훌륭하다.s.03: PosScore=0.875 NegScore=0.0>
<부적합하다.a.01: PosScore=0.25 NegScore=0.0>

4. Choose the score from the 'representative word'.
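A compressed sketch of steps 1-4; the translation table here is a hypothetical stand-in for the dictionary-based translation described on the next slide:

from nltk.corpus import sentiwordnet as swn

# Hypothetical translation table (step 2); real data needs heavy curation.
ko_synonyms = {'fit': ['알맞다', '적합하다', '어울리다']}

korean_corpus = []
for synset in swn.all_senti_synsets():           # step 1: every synset
    word = synset.synset.name().split('.')[0]    # e.g. 'fit' from 'fit.a.01'
    for ko in ko_synonyms.get(word, []):         # steps 2-3: expand synonyms
        # Step 4: each synonym inherits the representative word's scores.
        korean_corpus.append((ko, synset.pos_score(), synset.neg_score()))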
Tips to create Korean SentiWordNet
▪ Get open dictionary files (stardict (.ifo) / mdict (.mdx) format)
▪ You will find non-free dictionary data files on the web. Use them at your own risk.
▪ Convert them to .csv format using the pyglossary package (https://github.com/ilius/pyglossary)
▪ https://github.com/ilius/pyglossary/blob/master/doc/Octopus%20MDict/README.rst helps with understanding and manipulating .mdx files.
▪ Use the converted file for automatic translation of the SentiWordNet data
Reading emotion with SentimentSpace
▪ Creating the emotion space
▪ 1. Generate a word space using the word2vec model
▪ 2. Substitute words with their SentiWordNet sets
▪ 3. Now we get SentimentSpace!
▪ 4. Get the emotion state by feeding a disintegrated word set into SentimentSpace (a sketch follows)
▪ Focuses on reading emotion
▪ Final location in the WordVec space = average senti-vector of the nearest neighbors
*SentimentSpace: our definition / approach to simulate emotion.
[Figure: SentimentSpace, a WordVec space with a folded 2-sentiment dimension; nodes carry senti-vectors such as [.85, .15, .0], [.75, .05, .20], [.65, .15, .20], [.25, .10, .65]]
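A minimal sketch of step 4, assuming gensim for the word2vec space; the model path and the senti_of lookup (built e.g. with get_senti_vector above) are hypothetical:

import numpy as np
from gensim.models import KeyedVectors

wv = KeyedVectors.load('word2vec.kv')  # hypothetical pretrained word space
senti_of = {}                          # word -> 6-dim senti-vector lookup

def senti_state(word, topn=5):
    # Emotion state of a word = average senti-vector of its nearest neighbors.
    neighbors = [w for w, _ in wv.most_similar(word, topn=topn)
                 if w in senti_of]
    if not neighbors:
        return senti_of.get(word)
    return np.mean([senti_of[w] for w in neighbors], axis=0)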
Unfolding SentimentSpace
▪ Unfolded SentimentSpace
▪ Near nodes = similar sentiments
▪ A great representation, with a serious problem:
▪ Value resolution
▪ `Forgotten emotion`
[Figure: the space unfolded along the objectivity axis, with nodes such as acceptable, fit, unfavorable, and unsatisfactory carrying senti-vectors like [.85, .15, .0], [.75, .05, .20], [.25, .15, .20], [.25, .10, .65]]
Reading emotion from sentence
▪ 1. Extract senti-words from the sentence
▪ 2. Locate the sentence in the senti-space via a vector sum of the senti-words
▪ 3. Average the senti-state values at that point (a simplified sketch follows)
▪ Why use the wordvec space instead of the senti-space directly?
▪ The resolution problem in the sentiword dataset

sentence = "Hello I am happy I was super surprised"
fragments = ('I', 'happy', 'super', 'surprised')
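Putting the steps together as a simplified sketch (it reuses get_senti_vector from earlier and skips the wordvec-space lookup):

import numpy as np

def read_emotion(sentence):
    # 1. extract senti-words; 2-3. average their senti-state values.
    vectors = get_senti_vector(sentence)
    if not vectors:
        return None
    return np.mean(list(vectors.values()), axis=0)

print(read_emotion("Hello I am happy I was super surprised"))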
Tips for SentimentSpace
▪ When picking the best match from candidates, e.g. fit ➜
<fit.a.01: PosScore=0.5 NegScore=0.0>
<acceptable.s.04: PosScore=0.25 NegScore=0.0>
<suitable.s.01: PosScore=0.125 NegScore=0.0>
<worthy.s.03: PosScore=0.875 NegScore=0.0>
▪ 1. Just pick the first candidate from the senti sets, or
▪ 2. Calculate the average Pos/Neg scores: [0.25, 0]
▪ When generating the Korean SentiWordNet corpus
▪ 1. Do not trust the result blindly. You will need a tremendous amount of pre-/post-processing.
▪ SentimentSpace is very rough. Keep that in mind when modeling the emotion engine.
Do we really know what emotion is?
And, mimic the sentiment.
© Sports chosun
–Ridley Scott's Blade Runner
"More human than human" is our motto.
(c) http://www.westword.com/arts/the-top-five-best-futurama-episodes-ever-5800102
Modeling the emotion model
▪ Goal: simulating the 'emotion' of a bot
▪ What is emotion? How does emotion change?
▪ How?
▪ Define 'emotion changes' as changes of SentimentSpace parameters
▪ A deep-learning model of parameter transitions driven by conversation
Simulating Sentiment
▪ The next goal: let the machine have emotion
▪ We can read emotion from text now. How can we `mimic` emotion?
▪ What is emotion? How do we teach sentiment to a machine?
▪ Sentiment Engine: a machine learning model to mimic emotional states and transitions
Teaching the sentiment engine
▪ Characteristics of emotion
▪ Memory effect: affected by both short- and long-range memory
▪ Flip-flop: can be reversed by small artifacts / events
▪ Mutuality: self and partner affect each other
▪ The ML model structure therefore requires:
▪ Memory
▪ Allowing catastrophes
▪ On-demand input with self-sustained information
We will use an RNN to simulate sentimentality.
Sentiment engine structure
▪ Input
▪ Prior emotion parameter sequences (p, n)_{i<t}
▪ The current emotion parameters (p_c, n_c)_{t-1} of the chatting partner (user)
▪ Output
▪ Emotion parameters (p, n)_t
▪ Q) Avalanche? Hard problem. Why? Because we do not know why!
▪ Let's simulate with a joke. (a minimal model sketch follows)
[Diagram: (p, n)_{i<t} and (p_c, n_c)_{t-1} feed the engine, which emits (p, n)_t]
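A minimal sketch of such an engine (assumptions: the tf.keras API rather than the TF 1.2-era code used in the talk, a hypothetical window length, and toy data):

import numpy as np
import tensorflow as tf

WINDOW = 5  # hypothetical number of prior steps fed to the RNN

# Each timestep carries (p, n) of the bot and (p_c, n_c) of the partner.
model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(WINDOW, 4)),
    tf.keras.layers.Dense(2, activation='sigmoid'),  # next (p, n) in [0, 1]
])
model.compile(optimizer='adam', loss='mse')

# Toy data only; real targets come from labeled conversations.
X = np.random.rand(100, WINDOW, 4).astype('float32')
y = np.random.rand(100, 2).astype('float32')
model.fit(X, y, epochs=2, verbose=0)
print(model.predict(X[:1]))  # predicted (p, n)_t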
Training the sentiment engine
▪ Best part: it does not depend on language!
▪ I already made an emotion-reading machine (as you saw)
▪ My sentiment simulation engine takes sentiment parameters as input, not sentences.
▪ Scenario
▪ Train the sentiment engine with English
▪ Use the sentiment engine with Korean
Bonus: we need two of them. Why? To make things more interesting!
Double-identity chat-bot engines
▪ Why a double identity?
▪ Bot talks to bot: Siri vs. Siri / Google Home talks to Google Home
▪ It's funny!
▪ But is it worth it for me? Do I need more than one bot?
▪ How can we make 'multiple' bots with the same context? The same context with different opinions?
▪ Making one engine with a dual identity is easier than running two engines.
…Still testing / working on this topic now.
One page summary
The simplest is the best
NOT SIMPLE ANYMORE T_T
[Example] "Are you caring me now?" → Disintegrator → You / Care / I / Now → Context analyzer → [GUESS] I [CARE] [PRESENT] → Decision maker → You / Look / Tired / Yesterday → Grammar generator → "You looked tired yesterday." → Tone generator → "Ah, you looked very tired yesterday."
[Diagram] Full pipeline: Lexical Input → Disintegrator → NLP + StV → Context analyzer + Decision maker (context parser, context memory, knowledge engine, emotion engine) → Response generator (deep-learning model: sentence-to-sentence + context-aware word generator; grammar generator; tone generator) → Sentence generator → Lexical Output
[Diagram] Zoom on the emotion engine: Emotion Reader (Reader ML) + Sentiment Engine (Sentiment ML), both built on SentimentSpace, with the context memory feeding the context parser
Zoom-in emotion
▪ Emotion Engine summary
▪ Emotion Reader (ER)
▪ Sentiment Engine (SE)
▪ Sentiment ML: an RNN for sentiment parameter estimation
▪ SentimentSpace: the WordVec space shared by ER / SE
With nostalgic tools (to someone)
Demonstration
Data source
▪ Prototyping
▪ Idol master conversation script (translated by online fans)
▪ Field tests
▪ Animations only with female characters
▪ New data!
▪ Communication script from Idol master 2 / OFA
▪ Script from Idol master PS
Data converter (from PyCon APAC 2016!)
▪ .smi to .srt (subtitle_converter.py)
▪ Join the .srt files into one .txt: cat *.srt >> data.txt
▪ Remove timestamps and blank lines
▪ Remove logo / ending song scripts: lines with Japanese characters, and the lines right after them
▪ Fetch character names, nouns, and numbers using a custom dictionary (anime characters, locations, specific nouns)
*The .smi file format is the de facto standard for movie caption files in Korea
Extract Conversations
▪ Reformat: merge sliced captions into one line
▪ Extract conversation pairs where the previous sentence is a question:

if last_sentence[-1] == '?':
    conversation.add((last_sentence, current_sentence))

▪ Remove too-short sentences and duplicates
▪ Conversation data for the sequence-to-sequence bot model → train the bot model
▪ Sentence data for the disintegrator / grammar model / tone model → train the disintegrator and integrator with the grammar and tone models
(subtitle_converter.py, with pandas + a DBMS for intermediate storage; a reconstruction sketch follows the repo link)
https://github.com/inureyes/pycon-kr-2017-demo-nanika
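A reconstruction of the pair-extraction step as a self-contained sketch (the function name and min_len threshold are mine):

def extract_pairs(lines, min_len=4):
    # Pair each question with the line that follows it, skipping
    # too-short sentences and duplicates.
    pairs, seen = [], set()
    for last_sentence, current_sentence in zip(lines, lines[1:]):
        if (last_sentence.endswith('?')
                and len(last_sentence) >= min_len
                and len(current_sentence) >= min_len
                and (last_sentence, current_sentence) not in seen):
            seen.add((last_sentence, current_sentence))
            pairs.append((last_sentence, current_sentence))
    return pairs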
Simplification for the demo
▪ No context engine: a simple seq2seq-based chat-bot model
▪ github.com/tensorflow/models/tutorials/rnn/translate
▪ No knowledge engine
▪ Labeled data for training the sentiment engine
▪ Target parameter values are generated from the response sentence
▪ Unreal: not recommended; far from realistic mimicking
▪ In fact, the SE is not needed in this case (but this is a demo)
…with support from a backbone machine, via the internet, through the lablup.AI PaaS.
2017-08-11 14:44:17.817210: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.1 instructions, but these are available on your machine and could speed up CPU computations.
2017-08-11 14:44:17.817250: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.2 instructions, but these are available on your machine and could speed up CPU computations.
2017-08-11 14:44:17.817262: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX instructions, but these are available on your machine and could speed up CPU computations.
2017-08-11 14:44:17.817271: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX2 instructions, but these are available on your machine and could speed up CPU computations.
2017-08-11 14:44:17.817280: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use FMA instructions, but these are available on your machine and could speed up CPU computations.
2017-08-11 14:44:18.826396: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:893] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2017-08-11 14:44:18.827380: I tensorflow/core/common_runtime/gpu/gpu_device.cc:940] Found device 0 with properties: name: Tesla K80 major: 3 minor: 7 memoryClockRate (GHz) 0.8235 pciBusID 0000:00:17.0 Total memory: 11.17GiB Free memory: 11.11GiB
…
2017-08-11 14:44:19.804221: I tensorflow/core/common_runtime/gpu/gpu_device.cc:940] Found device 7 with properties: name: Tesla K80 major: 3 minor: 7 memoryClockRate (GHz) 0.8235 pciBusID 0000:00:1e.0 Total memory: 11.17GiB Free memory: 11.11GiB
2017-08-11 14:44:19.835747: I tensorflow/core/common_runtime/gpu/gpu_device.cc:961] DMA: 0 1 2 3 4 5 6 7
2017-08-11 14:44:19.835769: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 0: Y Y Y Y Y Y Y Y
2017-08-11 14:44:19.835775: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 1: Y Y Y Y Y Y Y Y
2017-08-11 14:44:19.835781: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 2: Y Y Y Y Y Y Y Y
2017-08-11 14:44:19.835785: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 3: Y Y Y Y Y Y Y Y
2017-08-11 14:44:19.835790: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 4: Y Y Y Y Y Y Y Y
2017-08-11 14:44:19.835794: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 5: Y Y Y Y Y Y Y Y
2017-08-11 14:44:19.835800: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 6: Y Y Y Y Y Y Y Y
2017-08-11 14:44:19.835805: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 7: Y Y Y Y Y Y Y Y
Bot training procedure (initialization)
A long time ago, there was a Clip, a.k.a. the Office Assistant.
Note: it is NOT a joke.
© Clippit, Microsoft Office 97
It was not fun.
© Yotsuba&!, ASCII Mediaworks
Back to the basics
Go back to the past
You can make AI even with Windows!
Windows!
…Windows? © Retrobot / Youngtoys (영실업)
If you were crazy about desktop eye candy,
▪ SSP / Nanika (나니카)
▪ A compatible baseware for Ukagaka (伺か)
▪ Ukagaka (伺か): a desktop accessory for geeks in the 2000s.
http://ssp.shillest.net
https://plaza.rakuten.co.jp/iiwisblog/diary/200602250000/
SSP structure / modules
▪ Structure
▪ Baseware (SSP): execution engine
▪ Shell: character image files / animations
▪ Ghost: Shiori + Shell
▪ Shiori: script execution module
▪ Makoto: translating module
[Diagram: the Baseware hosts a Ghost (Makoto, Shiori, script engine, script data) and a Shell (image resources, configurations), which together produce the result]
Poster * seanloh.wordpress.com/2013/07/03/ghost-in-the-shell-1995/
The problem
▪ How to connect my chatbot engine to SSP?
▪ Candidate 1: use the headline feature (webpage/RSS parser) to read answers from the ML engine
▪ Rejected: the reading interval is too slow.
▪ Candidate 2: make a new plugin
▪ Candidate 3: create a ghost DLL
▪ Rejected: more than 10 years have passed since I last used Windows.
▪ So, I decided to write a pseudo-SSP presentation (shell) layer
▪ The problem: I decided it yesterday.
First with the head, and with the heart
▪ Making a web-based Shell for the demonstration
▪ Started at 18:00
▪ Could record the demo at 1:30 AM
▪ https://github.com/inureyes/pycon-kr-2017-demo-nanika
▪ Will be public after this talk.
[Diagram] Web-based SSP Shell (Node.js + Express.js, Polymer 2, browser CLI, polling) ↔ REST API ↔ Ghost engine (Python 3, TensorFlow 1.2)
Pseudo-SSP demonstration structure
▪ Shell
▪ Written in Node.js
▪ Why not django-rest?
▪ Ghost
▪ Written in Python 3 / TensorFlow
▪ Context: seq2seq
▪ Emotion: Sentiment Engine / Emotion Reader
Demo time
The end is near. (In many ways)
Summary
▪ Today
▪ A dive into SyntaxNet and DRAGNN
▪ The emotion-reading procedure using SentiWordNet and deep learning
▪ Emotion simulation with the Sentiment Engine
▪ My contributions / insights for you
▪ Dodging Korean-specific problems when using SyntaxNet
▪ My own emotion reading / simulation algorithm
▪ A dig into communication with multiple chat-bots
▪ And a heartbreaking demonstration!
And next...
▪ A public service to read emotion from conversations
▪ Emobot: it will read emotional context changes
▪ Invite it to Telegram, Slack, etc.
▪ Talkativot.ai
▪ Tired of the request wave…
▪ A pipeline for garage bots
▪ Also as a service
–Bryce Courtenay's The Power of One
“First with the head, then with the heart.”
github.com/lablup/talkativot
And I've prepared the shovel for you:
…maybe, from late this month?
Thank you for listening :)
@inureyes
jeongkyu.shin
github.com/inureyes
Leverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage CostLeverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage Cost
Leverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage Cost
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck Presentation
 
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdf
 
Anypoint Exchange: It’s Not Just a Repo!
Anypoint Exchange: It’s Not Just a Repo!Anypoint Exchange: It’s Not Just a Repo!
Anypoint Exchange: It’s Not Just a Repo!
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easy
 
Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!
 
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks..."LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
 
Vector Databases 101 - An introduction to the world of Vector Databases
Vector Databases 101 - An introduction to the world of Vector DatabasesVector Databases 101 - An introduction to the world of Vector Databases
Vector Databases 101 - An introduction to the world of Vector Databases
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?
 
Story boards and shot lists for my a level piece
Story boards and shot lists for my a level pieceStory boards and shot lists for my a level piece
Story boards and shot lists for my a level piece
 
Vertex AI Gemini Prompt Engineering Tips
Vertex AI Gemini Prompt Engineering TipsVertex AI Gemini Prompt Engineering Tips
Vertex AI Gemini Prompt Engineering Tips
 

  • 18. DRAGNN (2017) ▪ Dynamic Recurrent Acyclic Graphical Neural Networks (Mar. 2017) ▪ Framework for building multi-task, fully dynamically constructed computation graphs ▪ Not GAN (Generative Adversarial Network)! ▪ Supports ▪ Training and evaluating models ▪ Pre-trained analysis models (McParsey) for 40 languages ▪ Except Korean (of course ;) ▪ Problem? ▪ Python 2 (like most NLP toolkits) *Kong et al., (2017)
  • 19. *Android meets TensorFlow, Google I/O (2017)
  • 20.
  • 21. Model differences ▪ DRAGNN[1]: end-to-end, deep recurrent models ▪ Used to extend SyntaxNet[2] into an end-to-end deep learning model ▪ TBRU: Transition-Based Recurrent Unit ▪ Uses both encoder and decoder ▪ TBRU-based multi-task learning: DRAGNN ▪ SyntaxNet: transition-based NLP ▪ Can train SyntaxNet using the DRAGNN framework [1] Kong et al., (2017) [2] Andor et al., (2016)
  • 22. ▪ Transition-based recurrent unit (TBRU) ▪ Discrete state dynamics: allow network connections to be built dynamically as a function of intermediate activations ▪ Potential of TBRU: extension and combination ▪ Sequence-to-sequence ▪ Attention mechanisms ▪ Recursive tree-structured models [Figure 1, Kong et al. (2017): high-level schematic of a Transition-Based Recurrent Unit — network cell, discrete state, recurrence function, input embeddings, network activations — and common architectures built from multiple TBRUs: Encoder/Decoder (2 TBRU), Bi-LSTM Tagging (3 TBRU), Stack-LSTM (2 TBRU)] *Kong et al., (2017)
  • 23. Parsey McParseface ▪ Parsey McParseface (2017) ▪ State-of-the-art deep learning-based text parser ▪ Performance comparison for different language domains:

Dependency parsing accuracy:
Model | News | Web | Questions
Martins et al. (2013) | 93.10 | 88.23 | 94.21
Zhang and McDonald (2014) | 93.32 | 88.65 | 93.37
Weiss et al. (2015) | 93.91 | 89.29 | 94.17
Andor et al. (2016)* | 94.44 | 90.17 | 95.40
Parsey McParseface | 94.15 | 89.08 | 94.77

POS (part-of-speech) tagging accuracy:
Model | News | Web | Questions
Ling et al. (2015) | 97.44 | 94.03 | 96.18
Andor et al. (2016)* | 97.77 | 94.80 | 96.86
Parsey McParseface | 97.52 | 94.24 | 96.45

*github.com/tensorflow/tensorflow
  • 24. McParseface model / DRAGNN framework *github.com/tensorflow/tensorflow
  • 25. Where is Korean? ▪ Korean language-specific characteristics ▪ Order / sequence is not important; distance is important ▪ However, SyntaxNet works even with raw Korean sequences! ▪ Even Japanese translation is supported ▪ Better solution? ▪ Now working on creating a preprocessing NM to increase DRAGNN performance ▪ With `Josa` ML © Flappy Pepe by NADI games
  • 26. Now, let's move on to the emotion reading part. It looks easier, but it is hard in fact.
  • 27. Problems for next-gen chatbots ▪ Hooray! Deep-learning based chat-bots work well in Q&A scenarios! ▪ General problems ▪ Inhuman: restricted to model training sets ▪ Cannot "start" a conversation ▪ Cannot handle continuous conversational context and its changes ▪ And, the "Uncanny Valley" ▪ Inhuman speech / conversation ▪ Why? How?
  • 28.
  • 30. So, let's just focus on the details of human beings: What makes us human? imdb.com ingorae.tistory.com/1249
  • 31. I've seen things you people wouldn't believe. Attack ships on fire off the shoulder of Orion. I watched C-beams glitter in the dark near the Tannhäuser Gate. All those moments will be lost in time, like tears in rain. Time to die.
  • 32. [Architecture diagram] Lexical Input → Context analyzer + Decision maker (NLP + StV / Context parser / Context memory / Knowledge engine / Emotion engine) → Response generator (Deep-learning model: sentence-to-sentence + context-aware word generator / Disintegrator / Grammar generator / Tone generator) → Sentence generator → Lexical Output — Today's focus!
  • 33. Conversational context locator ▪ Using Skip-gram and bidirectional 1-gram distribution in recent text ▪ I ate miso soup this morning. => Disintegrate first ▪ Bidirectional 1-gram set (reversible trigram): {(I,miso soup),Eat}, {(eat,today),miso soup}, {(miso soup,morning),today} ▪ Simplifying: {(<I>,<FOOD>),<EAT>}, {(<EAT>,Today),<FOOD>}, {(<FOOD>,morning),Today} ▪ Distribution: more simplification is needed ▪ {(<I>,<FOOD>), <EAT>}, {(<TIME:DATE>,<EAT>), <FOOD>}, {(<FOOD>,<TIME:DAY>),< TIME:DATE>} ▪ Now we can calculate multinomial distribution I Today MorningMiso soupEat <I> Today Morning<FOOD><EAT> <I> <TIME:DATE> <TIME:DAY><FOOD><EAT> *I’ll use trigram as abbreviation of reversible trigram
  • 34. Conversational context locator ▪ Using Skip-gram and bidirectional 1-gram distribution in recent text ▪ 나는 오늘 아침에 된장국을 먹었습니다. => Disintegrate first ▪ Bidirectional 1-gram set: {(나,아침),오늘}, {(오늘,된장국),아침}, {(아침,먹다),된장국} ▪ Simplifying: {(<I>,아침),오늘}, {(오늘,<FOOD>),아침}, {(아침,<EAT>),<FOOD>} ▪ Distribution: more simplification is needed ▪ {(<I>,<TIME:DAY>), <TIME:DATE>}, {(<TIME:DATE>,<FOOD>), <TIME:DAY>}, {(<TIME:DAY> ,<EAT>),<FOOD>} ▪ Now we can calculate multinomial distribution 나 오늘 아침 된장국 먹다 <I> 오늘 아침 <FOOD> <EAT> <I> <TIME:DATE> <TIME:DAY> <FOOD> <EAT> *I’ll use trigram as abbreviation of reversible trigram
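To make the trigram step concrete, here is a minimal Python sketch (my own illustration, not the talk's code) of reversible-trigram extraction; the SIMPLIFY table is a hypothetical stand-in for the disintegrator's entity/tag output:

# Minimal sketch of reversible trigram (bidirectional 1-gram) extraction.
# SIMPLIFY is a hypothetical stand-in for the disintegrator's tag output.
SIMPLIFY = {
    'I': '<I>', 'eat': '<EAT>', 'miso soup': '<FOOD>',
    'today': '<TIME:DATE>', 'morning': '<TIME:DAY>',
}

def reversible_trigrams(tokens):
    """Yield {(left, right), center} triples; frozenset makes the pair order-free."""
    tags = [SIMPLIFY.get(t, t) for t in tokens]
    for i in range(1, len(tags) - 1):
        yield (frozenset((tags[i - 1], tags[i + 1])), tags[i])

for tri in reversible_trigrams(['I', 'eat', 'miso soup', 'today', 'morning']):
    print(tri)
# (frozenset({'<I>', '<FOOD>'}), '<EAT>')
# (frozenset({'<EAT>', '<TIME:DATE>'}), '<FOOD>')
# (frozenset({'<FOOD>', '<TIME:DAY>'}), '<TIME:DATE>')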
  • 35. Conversational context locator ▪ Training context space ▪ Context-marked sentences (>20000) ▪ Context: LIFE / CHITCHAT / SCIENCE / TASK ▪ Prepare generated trigram sets with a context bit ▪ Train an RNN with 1-gram-2-vec ▪ Matching context space ▪ Input trigram sequence to context space ▪ Take the dominant axis ▪ Using skip-gram and trigram distribution in recent text ▪ {(<I>,<TIME:DAY>), <TIME:DATE>} ▪ {(<TIME:DATE>,<FOOD>), <TIME:DAY>} ▪ {(<TIME:DAY>,<EAT>),<FOOD>} ▪ With the distribution ▪ Calculate maximum likelihood significance and get significant n-grams ▪ Uses the last 5 sentences
  • 36. For better performance ▪ Characteristics of Korean Language ▪ Distance between words: important ▪ Sequence between words: not important ▪ Different from English ▪ How to read more contextual information from longer text? (e.g. Documents) ▪ Change from trigram to in-range tri pairs ▪ I ate miso soup this morning: ▪ In range 1: {(<I>,<FOOD>), <EAT>} ▪ In range 2: {(<TIME:DATE>), <EAT>} ▪ In range 3: {(<TIME:DAY>), <EAT>} ▪ Heavily depends on the length of original sentence ▪ Short? ▪ Long? <I> <TIME:DATE> <TIME:DAY><FOOD><EAT>
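A small sketch of the in-range tri-pair variant under the same assumptions (a hand-written illustration, not the talk's code); it reproduces the range-1/2/3 sets above for the anchor <EAT>:

def in_range_pairs(tokens, anchor_index, max_range=3):
    """Yield (range, ({context tokens}, anchor)) pairs around one anchor token."""
    anchor = tokens[anchor_index]
    for r in range(1, max_range + 1):
        context = set()
        if anchor_index - r >= 0:
            context.add(tokens[anchor_index - r])
        if anchor_index + r < len(tokens):
            context.add(tokens[anchor_index + r])
        if context:
            yield r, (frozenset(context), anchor)

tokens = ['<I>', '<EAT>', '<FOOD>', '<TIME:DATE>', '<TIME:DAY>']
for r, pair in in_range_pairs(tokens, anchor_index=1):
    print(r, pair)
# 1 (frozenset({'<I>', '<FOOD>'}), '<EAT>')
# 2 (frozenset({'<TIME:DATE>'}), '<EAT>')
# 3 (frozenset({'<TIME:DAY>'}), '<EAT>')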
  • 37. Emotion engine ▪ Input: text sequence ▪ Output: emotion flag (6-type / 3-bit) ▪ Training set ▪ Sentences with 6-type categorized emotion ▪ Positivity (2), negativity (2), objectivity (2) ▪ Uses SentiWordNet to extract emotion ▪ 6-axis emotion space built with the Word2Vec model ▪ Current emotion indicator: the most weighted emotion axis in the Word2Vec model (see the sketch below) Illustration *(c) http://ontotext.fbk.eu/ Position in senti-space: [0.95, 0.05, 0.11, 0.89, 0.92, 0.08] → [1, 0, 0, 0, 0, 0] → 0x01 (index: 1 2 3 4 5 6)
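As a rough illustration of the flag encoding above (the exact encoding is my assumption; the slide only shows the example values), one could take the most weighted axis and emit its 1-based index as the 3-bit flag:

def emotion_flag(senti_vector):
    """One-hot the dominant axis of a 6-dim senti-vector; flag = 1-based index."""
    dominant = max(range(len(senti_vector)), key=lambda i: senti_vector[i])
    one_hot = [1 if i == dominant else 0 for i in range(len(senti_vector))]
    return one_hot, dominant + 1   # 1..6 fits in 3 bits

one_hot, flag = emotion_flag([0.95, 0.05, 0.11, 0.89, 0.92, 0.08])
print(one_hot, hex(flag))   # [1, 0, 0, 0, 0, 0] 0x1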
  • 38. Making emotional context locator ▪ Similar to the conversational context locator ▪ Just use 1-grams from the input ▪ Add the corresponding word vector onto the emotion space ▪ How to? ▪ Use the NLTK Python library ▪ NLTK has corpora / data for SentiWordNet ▪ Also gives a download option! Downloading the NLTK dataset:

import nltk
nltk.download()
  • 39. Making emotional context locator ▪ Get emotional flag from a sentence — sample test routine for sentimental state:

from nltk.corpus import sentiwordnet as swn

def get_senti_vector(sentence, pos=None):
    result = dict()
    for s in sentence.split(' '):
        if s not in result.keys():
            senti = list(swn.senti_synsets(s.lower(), pos))
            if len(senti) > 0:
                mostS = senti[0]
                result[s] = [mostS.pos_score(), 1.0 - mostS.pos_score(),
                             mostS.neg_score(), 1.0 - mostS.neg_score(),
                             mostS.obj_score(), 1.0 - mostS.obj_score()]
    return result

sentence = "Hello I am happy I was super surprised"
result = get_senti_vector(sentence)

Adjectives only:
{'I': [0.0, 1.0, 0.25, 0.75, 0.75, 0.25], 'happy': [0.875, 0.125, 0.0, 1.0, 0.125, 0.875], 'super': [0.625, 0.375, 0.0, 1.0, 0.375, 0.625], 'surprised': [0.125, 0.875, 0.25, 0.75, 0.625, 0.375]}

All morphemes:
{'Hello': [0.0, 1.0, 0.0, 1.0, 1.0, 0.0], 'I': [0.0, 1.0, 0.0, 1.0, 1.0, 0.0], 'am': [0.0, 1.0, 0.0, 1.0, 1.0, 0.0], 'happy': [0.875, 0.125, 0.0, 1.0, 0.125, 0.875], 'was': [0.0, 1.0, 0.0, 1.0, 1.0, 0.0], 'super': [0.0, 1.0, 0.0, 1.0, 1.0, 0.0], 'surprised': [0.125, 0.875, 0.0, 1.0, 0.875, 0.125]}
  • 40. Creating Korean SentiWordNet ▪ Procedure to generate a Korean SentiWordNet corpus (see the sketch below):

1. Get every synset from the SentiWordNet data:
for i in swn.all_senti_synsets():
    data.append(i)
<maimed.s.01: PosScore=0.0 NegScore=0.0> <fit.a.01: PosScore=0.5 NegScore=0.0> <acceptable.s.04: PosScore=0.25 NegScore=0.0> <suitable.s.01: PosScore=0.125 NegScore=0.0> <worthy.s.03: PosScore=0.875 NegScore=0.0> <unfit.a.01: PosScore=0.25 NegScore=0.0>

2. Translate the words into Korean:
<불구의.s.01: PosScore=0.0 NegScore=0.0> <알맞다.a.01: PosScore=0.5 NegScore=0.0> <만족스럽다.s.04: PosScore=0.25 NegScore=0.0> <적합하다.s.01: PosScore=0.125 NegScore=0.0> <훌륭하다.s.03: PosScore=0.875 NegScore=0.0> <부적합하다.a.01: PosScore=0.25 NegScore=0.0>

3. Treat synonyms / 4. Choose the score from the 'representative word':
<불구의.s.01: PosScore=0.0 NegScore=0.0> <알맞다.a.01: PosScore=0.5 NegScore=0.0> <적합하다.a.01: PosScore=0.5 NegScore=0.0> <어울리다.a.01: PosScore=0.5 NegScore=0.0> <만족스럽다.s.04: PosScore=0.25 NegScore=0.0> <적합하다.s.01: PosScore=0.125 NegScore=0.0> <훌륭하다.s.03: PosScore=0.875 NegScore=0.0> <부적합하다.a.01: PosScore=0.25 NegScore=0.0>
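A hedged sketch of steps 1-2 in code; en_ko is a tiny hypothetical stand-in for the dictionary built on the next slide, and the real pipeline needs heavy postprocessing:

from nltk.corpus import sentiwordnet as swn

en_ko = {'fit': '알맞다', 'acceptable': '만족스럽다'}   # hypothetical en→ko dictionary

korean_swn = []
for ss in swn.all_senti_synsets():
    lemma = ss.synset.name().split('.')[0]   # e.g. 'fit' from 'fit.a.01'
    if lemma in en_ko:
        # keep the representative (English) word's scores for the Korean entry
        korean_swn.append((en_ko[lemma], ss.pos_score(), ss.neg_score()))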
  • 41. Tips to create Korean SentiWordNet ▪ Get open dictionary files (stardict (.ifo) / mdict (.mdx) format) ▪ You will find non-free dictionary data files on the web. Use them at your own risk. ▪ Convert them to .csv format using the pyglossary package (https://github.com/ilius/pyglossary) ▪ https://github.com/ilius/pyglossary/blob/master/doc/Octopus%20MDict/README.rst will help with understanding and manipulating .mdx files ▪ Use the file for automatic translation of the SentiWordNet data
  • 42. Reading emotion with SentimentSpace ▪ Creating emotion space ▪ 1. Generate a word space using the word2vec model ▪ 2. Substitute words with their SentiWordNet sets ▪ 3. Now we get SentimentSpace! ▪ 4. Get the emotion state by feeding a disintegrated word set into SentimentSpace (see the sketch below) ▪ Focuses on reading emotion ▪ Final location in WordVec space = average senti-vector of nearest neighbors (example senti-vectors: [.85, .15, .0], [.75, .05, .20], [.65, .15, .20], [.25, .10, .65]) *SentimentSpace: our definition / approach to simulate emotion. SentimentSpace: WordVec space with folded 2-sentiment dimension
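A minimal sketch of the nearest-neighbor averaging in step 4, assuming a trained gensim word2vec model (`model`) and a `senti_of` dict from word to senti-vector built from SentiWordNet; the names and the k=5 choice are mine, not the talk's:

import numpy as np

def senti_at(word, model, senti_of, k=5):
    """Senti-state at a word's location: average over its k nearest neighbors."""
    neighbors = [w for w, _ in model.wv.most_similar(word, topn=k)]
    vectors = [senti_of[w] for w in neighbors if w in senti_of]
    if not vectors:
        return None   # no annotated neighbors: the 'forgotten emotion' case
    return np.mean(vectors, axis=0).tolist()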
  • 43. Unfolding SentimentSpace ▪ Unfolded SentimentSpace ▪ Near nodes = similar sentiments ▪ Great representation, with a serious problem ▪ Value resolution ▪ `Forgotten emotion` [Figure: unfolded space with nodes such as 'acceptable', 'fit', 'unfavorable', 'unsatisfactory' along an objectivity axis; example senti-vectors: [.85, .15, .0], [.75, .05, .20], [.25, .15, .20], [.25, .10, .65]]
  • 44. Reading emotion from a sentence 1. Extract senti-words from the sentence 2. Locate the sentence in senti-space via vector-sum of the senti-words 3. Average the senti-state values at that point ▪ Why use the wordvec space instead of senti-space directly? ▪ Resolution problem in the sentiword dataset

sentence = "Hello I am happy I was super surprised"
fragments = ('I', 'happy', 'super', 'surprised')
  • 45. Tips for SentimentSpace ▪ When picking the best match from candidates ▪ e.g. fit ➜ <fit.a.01: PosScore=0.5 NegScore=0.0> <acceptable.s.04: PosScore=0.25 NegScore=0.0> <suitable.s.01: PosScore=0.125 NegScore=0.0> <worthy.s.03: PosScore=0.875 NegScore=0.0> ▪ 1. Just pick the first candidate from the senti sets, or ▪ 2. Calc the average Pos/Neg scores: [0.25, 0] ▪ When generating the Korean SentiWordNet corpus ▪ 1. Do not believe the result. You will need a tremendous amount of pre-/post-processing ▪ SentimentSpace is very rough. Keep that in mind when modeling the emotion engine
  • 46.
  • 47. Do we really know what emotion is? And, mimic the sentiment. © Sports chosun
  • 48. –Ridley Scott's Blade Runner "More human than human" is our motto. (c) http://www.westword.com/arts/the-top-five-best-futurama-episodes-ever-5800102
  • 49. Modeling the emotion model ▪ Goal: ▪ Simulating the 'emotion' of a bot ▪ What is emotion? ▪ How does emotion change? ▪ How? ▪ Define 'emotion changes' as changes of SentimentSpace parameters ▪ Deep-learning model of parameter transitions driven by conversation
  • 50. Simulating Sentiment ▪ The next goal: let the machine have emotion ▪ We can read emotion from text now. ▪ How can we `mimic` the emotion? ▪ What is emotion? ▪ How to teach sentiment to a machine? ▪ Sentiment Engine: ▪ a machine learning model to mimic emotional states and transitions
  • 51. Teaching the sentiment engine ▪ Characteristics of emotion ▪ Memory effect: affected by both short- and long-range memory ▪ Flip-flop: can be reversed by small artifacts / events ▪ Mutuality: both self and partner affect each other ▪ The ML model structure requires: ▪ Memory ▪ Allows catastrophe ▪ On-demand input with self-sustained information ▪ We will use an RNN to simulate sentimentality
  • 52. Sentiment engine structure ▪ Input ▪ Prior emotion parameter sequences (p, n)_{i<t} ▪ Current emotion parameter (p_c, n_c)_{t-1} of the chatting partner (user) ▪ Output ▪ Emotion parameter (p, n)_t ▪ Q) Avalanche? ▪ Hard problem. ▪ Why? Because we do not know why! ▪ Let's simulate with a joke. (see the sketch below)
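A hedged TensorFlow 1.x sketch of what such a sentiment engine could look like; the shapes, the GRU cell, and the sigmoid output are my assumptions, not the talk's actual model:

import tensorflow as tf

SEQ_LEN, DIM_IN, DIM_OUT, HIDDEN = 10, 4, 2, 32

# Each input step: own (p, n) plus the partner's (p_c, n_c)
inputs = tf.placeholder(tf.float32, [None, SEQ_LEN, DIM_IN])
targets = tf.placeholder(tf.float32, [None, DIM_OUT])     # next (p, n)

cell = tf.contrib.rnn.GRUCell(HIDDEN)                     # memory effect
_, state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
prediction = tf.sigmoid(tf.layers.dense(state, DIM_OUT))  # keep values in [0, 1]

loss = tf.losses.mean_squared_error(targets, prediction)
train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)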
  • 53. Training the sentiment engine ▪ Best point: it does not depend on language! ▪ I already made an emotion-reading machine (as you saw) ▪ My sentiment simulation engine requires sentiment parameters as input, not sentences. ▪ Scenario ▪ Train the sentiment engine with English ▪ Use the sentiment engine with Korean
  • 54. Bonus: We need two of them. Why? To make it more interesting!
  • 55. Double-identity chat-bot engines ▪ Why double identity? ▪ Bot talks to bot: Siri vs. Siri / Google Home talks to Google Home ▪ It's funny! ▪ But is it worth it for me? ▪ Do I need more than one bot? ▪ How can we make 'multiple' bots with the same context? ▪ Same context with different opinions? ▪ Making one engine with dual identity is easier than running two engines.
  • 56. …Still testing / working on the topic now.
  • 57. One page summary The simplest is the best NOT SIMPLE ANYMORE T_T
  • 58. [Pipeline example] Lexical input: "Are you caring me now?" → Disintegrator: You / Care / I / Now → Context analyzer + Decision maker: [GUESS] I [CARE] [PRESENT] → response words: You / Look / Tired / Yesterday → Grammar generator: "You looked tired yesterday." → Tone generator: "Ah, you looked very tired yesterday." → Lexical output (full pipeline: NLP + StV, Context parser, Context memory, Knowledge engine, Emotion engine, deep-learning model (sentence-to-sentence + context-aware word generator), Disintegrator, Grammar generator, Tone generator, Sentence generator)
  • 59. Zoom-in: emotion ▪ Emotion Engine summary ▪ Emotion Reader (ER) ▪ Sentiment Engine (SE) ▪ Sentiment ML: RNN for sentiment parameter estimation ▪ SentimentSpace: WordVec space for ER / SE [Diagram: inside the Context analyzer + Decision maker (Context parser, Context memory, Knowledge engine), the Emotion engine expands into the Emotion Reader and Sentiment Engine, built from Sentiment ML, SentimentSpace, and Reader ML, feeding the deep-learning model (sentence-to-sentence + context-aware word generator)]
  • 60. With nostalgic tools (to someone) Demonstration
  • 61. Data source ▪ Prototyping ▪ Idol master conversation script (translated by online fans) ▪ Field tests ▪ Animations only with female characters ▪ New data! ▪ Communication script from Idol master 2 / OFA ▪ Script from Idol master PS
  • 62. Data converter (from PyCon APAC 2016!) ▪ .smi to .srt ▪ Join .srt files into one .txt (cat *.srt >> data.txt) ▪ Remove timestamps and blank lines ▪ Remove logo / ending-song scripts: lines with Japanese characters and the lines next to them ▪ Fetch character names / nouns / numbers using a custom dictionary (anime characters, locations, specific nouns) — subtitle_converter.py ▪ The *.smi file format is the de facto standard for movie caption files in Korea
  • 63. Extract conversations ▪ Reformat: merge sliced captions into one line ▪ Remove too-short sentences and duplicates ▪ Conversation data for the sequence-to-sequence bot model: if last_sentence[-1] == '?': conversation.add((last_sentence, current_sentence)) (see the sketch below) ▪ Sentence data for the disintegrator / grammar model / tone model ▪ Train the disintegrator and integrator with the grammar model and tone model ▪ Train the bot model — subtitle_converter.py, pandas + DBMS https://github.com/inureyes/pycon-kr-2017-demo-nanika
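The '?'-pairing rule above, as a minimal self-contained sketch (the real logic lives in subtitle_converter.py in the linked repo):

def extract_pairs(lines):
    """Pair a question-like line (ending with '?') with the next line."""
    pairs = []
    for prev, cur in zip(lines, lines[1:]):
        prev, cur = prev.strip(), cur.strip()
        if prev and prev[-1] == '?':
            pairs.append((prev, cur))
    return pairs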
  • 64. Simplification for the demo ▪ No context engine: a simple seq2seq-based chat-bot model ▪ github.com/tensorflow/models/tutorials/rnn/translate ▪ No knowledge engine ▪ Labeled data for training the sentiment engine ▪ Target parameter values are generated from the response sentence ▪ Unreal: not recommended ▪ Far from realistic mimicking ▪ In fact, SE is not needed in this case (but this is a demo)
  • 65. …with support from a backbone machine via the internet, through the lablup.AI PaaS.
  • 66. Bot training procedure (initialization):

2017-08-11 14:44:17.817210: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.1 instructions, but these are available on your machine and could speed up CPU computations.
2017-08-11 14:44:17.817250: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.2 instructions, but these are available on your machine and could speed up CPU computations.
2017-08-11 14:44:17.817262: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX instructions, but these are available on your machine and could speed up CPU computations.
2017-08-11 14:44:17.817271: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX2 instructions, but these are available on your machine and could speed up CPU computations.
2017-08-11 14:44:17.817280: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use FMA instructions, but these are available on your machine and could speed up CPU computations.
2017-08-11 14:44:18.826396: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:893] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2017-08-11 14:44:18.827380: I tensorflow/core/common_runtime/gpu/gpu_device.cc:940] Found device 0 with properties: name: Tesla K80, major: 3, minor: 7, memoryClockRate (GHz) 0.8235, pciBusID 0000:00:17.0, Total memory: 11.17GiB, Free memory: 11.11GiB
…
2017-08-11 14:44:19.804221: I tensorflow/core/common_runtime/gpu/gpu_device.cc:940] Found device 7 with properties: name: Tesla K80, major: 3, minor: 7, memoryClockRate (GHz) 0.8235, pciBusID 0000:00:1e.0, Total memory: 11.17GiB, Free memory: 11.11GiB
2017-08-11 14:44:19.835747: I tensorflow/core/common_runtime/gpu/gpu_device.cc:961] DMA: 0 1 2 3 4 5 6 7
2017-08-11 14:44:19.835769: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 0: Y Y Y Y Y Y Y Y
2017-08-11 14:44:19.835775: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 1: Y Y Y Y Y Y Y Y
2017-08-11 14:44:19.835781: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 2: Y Y Y Y Y Y Y Y
2017-08-11 14:44:19.835785: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 3: Y Y Y Y Y Y Y Y
2017-08-11 14:44:19.835790: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 4: Y Y Y Y Y Y Y Y
2017-08-11 14:44:19.835794: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 5: Y Y Y Y Y Y Y Y
2017-08-11 14:44:19.835800: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 6: Y Y Y Y Y Y Y Y
2017-08-11 14:44:19.835805: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 7: Y Y Y Y Y Y Y Y
  • 67. A long time ago, there was a Clip, a.k.a. Office Assistant. It was not fun. Note: It is NOT a joke. © Clippit, Microsoft Office 97
  • 68. © Yotsuba&!, ASCII Mediaworks
  • 69. Back to the basic
  • 70. Back to the basic Go back to the past
  • 71. You can make AI even with Windows! Windows! …Windows? © Retrobot / 영실업
  • 72. If you were crazy about desktop eye candy, ▪ SSP / 나니카 (Nanika) ▪ A compatible baseware of 伺か (Ukagaka) ▪ 伺か: a desktop accessory for geeks in the 2000s. http://ssp.shillest.net https://plaza.rakuten.co.jp/iiwisblog/diary/200602250000/
  • 73. SSP structure / modules ▪ Structure ▪ Baseware (SSP): execution engine ▪ Shell: character image files / animations ▪ Ghost: Shiori + Shell ▪ Makoto: translating module ▪ Shiori: script execution module [Diagram: the Baseware hosts a Ghost (Makoto, Shiori with script engine / script data) and a Shell (image resources, configurations), producing the on-screen result]
  • 75. The problem ▪ How to connect my chatbot engine to SSP? ▪ Candidate 1: use the headline feature (webpage/RSS parser) to read answers from the ML engine ▪ Rejected: reading interval too slow. ▪ Candidate 2: make a new plugin ▪ Candidate 3: create a ghost DLL ▪ Rejected: more than 10 years have passed since I last used Windows. ▪ So, I decided to write a pseudo-SSP presentation (shell) layer ▪ The problem: I decided it yesterday.
  • 76. First with the head, and with the heart ▪ Making a web-based shell for demonstration ▪ Started at 18:00 ▪ Could record the demo at 1:30 AM ▪ https://github.com/inureyes/pycon-kr-2017-demo-nanika ▪ Will be public after this talk.
  • 77. [Demo architecture] Browser → Web-based SSP Shell (Node.js: Express.js + Polymer 2) ⇄ polling / REST API ⇄ Ghost (Engine) (Python 3 + TensorFlow 1.2, CLI)
  • 78. Pseudo-SSP demonstration structure ▪ Shell ▪ Written in Node.js ▪ Why not django-rest? ▪ Ghost ▪ Written in Python 3 / TensorFlow ▪ Context: seq2seq ▪ Emotion: Sentiment Engine / Emotion Reader (a sketch of such a polling endpoint follows)
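A standard-library-only sketch of a ghost-side endpoint that a polling shell could call; the endpoint shape, port, and JSON format are assumptions — the actual wiring lives in the demo repo above:

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def generate_reply(text):
    # Placeholder for the seq2seq model + sentiment engine
    return {'reply': '...', 'emotion': 0x01}

class GhostHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the user's line from the request body, answer with JSON
        length = int(self.headers['Content-Length'])
        text = self.rfile.read(length).decode('utf-8')
        body = json.dumps(generate_reply(text)).encode('utf-8')
        self.send_response(200)
        self.send_header('Content-Type', 'application/json')
        self.end_headers()
        self.wfile.write(body)

HTTPServer(('', 8080), GhostHandler).serve_forever()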
  • 79. Demo time The end is near. (In many ways)
  • 80.
  • 81. Summary ▪ Today ▪ Dive into SyntaxNet and DRAGNN ▪ Emotion reading procedure using SentiWordNet and deep learning ▪ Emotion simulation with the Sentiment Engine ▪ My contributions / insights for you ▪ Dodging Korean-specific problems when using SyntaxNet ▪ My own emotion reading / simulation algorithm ▪ Dig into communication with multiple chat-bots ▪ And a heartbreaking demonstration!
  • 82. And next... ▪ Public service to read emotion from conversations ▪ Emobot: it will read emotional context changes ▪ Invite it to Telegram, Slack, etc. ▪ Talkativot.ai ▪ Tired of the request wave… ▪ Pipeline for garage bots ▪ Also as a service
  • 83. –Bryce Courtenay's The Power of One “First with the head, then with the heart.”
  • 84. And I prepared the shovel for you: github.com/lablup/talkativot …Maybe from late this month?
  • 85. Thank you for listening :) @inureyes jeongkyu.shin github.com/inureyes