Neural Network and NLP
Mark Chang
Neural Network and NLP Tutorial http://coscup2015.kktix.cc/events/handson-nndl
1.
Neural Network and Natural Language Processing
Mark Chang
2.
Outline
• How can a computer learn the meaning of words? Semantic vectors
• Using a neural network to turn words into semantic vectors
• Using a Neural Turing Machine to build an acrostic-poem generator
3.
How can a computer learn the meaning of words?
• In fact, a word's meaning can be inferred from the words that appear around it
• Ex:
祭止兀 是 最佳 助選員 (祭止兀 is the best campaign helper)
蔡正元 是 最佳 助選員 (蔡正元 is the best campaign helper)
罷免 祭止兀 失敗 (the recall of 祭止兀 failed)
罷免 蔡正元 失敗 (the recall of 蔡正元 failed)
→ 祭止兀 and 蔡正元 have similar meanings
4.
Semantic Vectors
• Map each word to an n-dimensional vector, with one dimension per vocabulary word
• A non-zero dimension means that the corresponding word appears near the target word
From the sentences above, with dimensions (x1 = 罷免, x2 = 助選員, ..., xn):
祭止兀 (1, 1, 0, ..., xn)
蔡正元 (1, 1, 0, ..., xn)
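This construction is easy to make concrete. Below is a minimal sketch that builds binary context vectors from the four example sentences; the pre-tokenized corpus, the window size of 2, and all variable names are illustrative assumptions, not code from the talk.

```python
from collections import defaultdict

# Toy corpus from slides 3-4, already split into words.
corpus = [
    ["祭止兀", "是", "最佳", "助選員"],
    ["蔡正元", "是", "最佳", "助選員"],
    ["罷免", "祭止兀", "失敗"],
    ["罷免", "蔡正元", "失敗"],
]

vocab = sorted({w for sent in corpus for w in sent})
index = {w: i for i, w in enumerate(vocab)}

# One vector per word: dimension j is 1 if vocab[j] ever appears
# within `window` positions of that word.
window = 2
vectors = defaultdict(lambda: [0] * len(vocab))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if j != i:
                vectors[w][index[sent[j]]] = 1

print(vectors["祭止兀"])
print(vectors["蔡正元"])  # near-identical context vectors
```

On a real corpus the vocabulary runs to tens of thousands of words, so these vectors become huge and sparse; that is the problem the neural-network encoding in slide 8 addresses.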
5.
Semantic Vectors
蔡正元 (1, 1, ..., xn)
祭止兀 (1, 1, ..., xn)
石內卜 (0, 1, ..., xn)
6.
Computing Semantic Similarity Between Words
• Cosine similarity
• The cosine similarity of vectors A and B is: (A · B) / (|A||B|)
• For 蔡正元 (a1, a2, ..., an) and 祭止兀 (b1, b2, ..., bn), the cosine similarity is:
(a1b1 + a2b2 + ... + anbn) / (√(a1² + a2² + ... + an²) · √(b1² + b2² + ... + bn²))
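The formula transcribes directly into code. The sketch below is a plain-Python version; the example vectors are made-up values in the spirit of slides 4 and 5.

```python
import math

def cosine_similarity(a, b):
    """(A · B) / (|A| |B|) for equal-length vectors a and b."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

cai   = [1, 1, 0, 1]  # 蔡正元 (illustrative context vector)
ji    = [1, 1, 0, 1]  # 祭止兀
snape = [0, 1, 0, 0]  # 石內卜

print(cosine_similarity(cai, ji))     # 1.0  -- identical contexts
print(cosine_similarity(cai, snape))  # ~0.58 -- much less similar
```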
7.
Semantic Vector Arithmetic
女 (woman) + 父 (father) - 男 (man) = 母 (mother)
[Diagram: the four word vectors in the plane; the offset 父 - 男 added to 女 lands on 母]
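The analogy can be checked with a few lines of vector arithmetic. The 2-D embeddings below are hypothetical values chosen so the gender offset is consistent, mirroring the slide's diagram; trained word2vec vectors show the same behaviour in high dimensions.

```python
import numpy as np

# Hypothetical 2-D embeddings: the offset 父 - 男 equals 母 - 女.
emb = {
    "男": np.array([1.0, 0.0]),  # man
    "女": np.array([0.0, 0.0]),  # woman
    "父": np.array([1.0, 1.0]),  # father
    "母": np.array([0.0, 1.0]),  # mother
}

query = emb["女"] + emb["父"] - emb["男"]  # woman + father - man

# The nearest embedding to the query vector answers the analogy.
nearest = min(emb, key=lambda w: np.linalg.norm(emb[w] - query))
print(nearest)  # 母 (mother)
```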
8.
Using a Neural Network to Turn Words into Semantic Vectors
• Using surrounding words as vector dimensions, e.g. 祭止兀 as (x1 = 罷免, x2 = 助選員, ..., xn): the dimensionality equals the vocabulary size (tens of thousands)
• Using a neural network to compute semantic vectors: the dimensionality can be chosen freely (tens to hundreds)
9.
Using a Neural Network to Turn Words into Semantic Vectors
word 祭止兀 → one-hot encoding (1, 0, 0, 0) → neural network (auto-encoder) → semantic vector (1.2, 0.7, 0.5)
10.
One-Hot Encoding
• Map each word to an n-dimensional vector, with one dimension per vocabulary word
• In each word's vector, exactly one dimension is 1 and all others are 0
• The vectors of any two distinct words are orthogonal
祭止兀 (1, 0, 0, 0)   蔡正元 (0, 1, 0, 0)   罷免 (0, 0, 1, 0)   失敗 (0, 0, 0, 1)
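A one-hot encoder is a single lookup. A minimal sketch with the four-word vocabulary from the slide; the helper name is mine.

```python
import numpy as np

vocab = ["祭止兀", "蔡正元", "罷免", "失敗"]

def one_hot(word, vocab):
    """n-dimensional vector with a single 1 at the word's index."""
    vec = np.zeros(len(vocab))
    vec[vocab.index(word)] = 1.0
    return vec

a = one_hot("祭止兀", vocab)
b = one_hot("蔡正元", vocab)
print(a)             # [1. 0. 0. 0.]
print(np.dot(a, b))  # 0.0 -- distinct one-hot vectors are orthogonal
```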
11.
Auto-Encoder
• Input 祭止兀 (1, 0, 0, 0): the larger-dimensional encoding
• Hidden layer (1.2, 0.7, 0.5): the smaller-dimensional encoding
• Output 祭止兀 (1, 0, 0, 0): trained to equal the input
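Here is a minimal auto-encoder sketch using the Keras functional API (this assumes TensorFlow is installed; the talk does not prescribe a framework). The 4-to-3-to-4 layer sizes match the slide's example, while the activations, optimizer, and epoch count are illustrative choices.

```python
import numpy as np
from tensorflow import keras

vocab_size, code_dim = 4, 3

# Auto-encoder: train the output to reproduce the input; the narrow
# hidden layer is the learned low-dimensional code.
inputs  = keras.Input(shape=(vocab_size,))
code    = keras.layers.Dense(code_dim, activation="sigmoid", name="code")(inputs)
outputs = keras.layers.Dense(vocab_size, activation="softmax")(code)

autoencoder = keras.Model(inputs, outputs)
autoencoder.compile(optimizer="adam", loss="categorical_crossentropy")

x = np.eye(vocab_size)  # the four one-hot word vectors
autoencoder.fit(x, x, epochs=500, verbose=0)

# Read out the 3-dimensional code for 祭止兀 (row 0).
encoder = keras.Model(inputs, code)
print(encoder.predict(x[:1], verbose=0))
```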
12.
Word2Vec
• If the words around a word are encoded in along with it, the result is a semantic vector
• Input: 祭止兀 (1, 0, 0, 0); targets: its neighbours 罷免 (0, 0, 1, 0) and 失敗 (0, 0, 0, 1), alternated by random swap
• The hidden code (1.2, 1.3, 0.2) is 祭止兀's semantic vector (it includes the meaning of the surrounding words)
13.
Word2Vec
[Diagram: 祭止兀 (1, 0, 0, 0) as input, 罷免 (0, 0, 1, 0) and 失敗 (0, 0, 0, 1) as outputs, with the labels "semantic vector of 罷免" and "semantic vector of 失敗"]
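The random-swap picture amounts to generating (centre word, nearby word) training pairs, as in skip-gram word2vec. Below is a toy sketch of the pair generation; the corpus, window size, and names are assumptions for illustration.

```python
import random

corpus = [["罷免", "祭止兀", "失敗"], ["罷免", "蔡正元", "失敗"]]
window = 2

# For each centre word, pick one word from its window at random;
# the network is then trained to map centre -> neighbour.
pairs = []
for sent in corpus:
    for i, center in enumerate(sent):
        neighbors = [sent[j]
                     for j in range(max(0, i - window),
                                    min(len(sent), i + window + 1))
                     if j != i]
        pairs.append((center, random.choice(neighbors)))

print(pairs)  # e.g. [('罷免', '失敗'), ('祭止兀', '罷免'), ...]
```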
14.
Hands-On
• word2vec
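For the hands-on part, the quickest route today is gensim's Word2Vec (pip install gensim). The sketch below uses the gensim 4.x parameter names (older releases spell vector_size as size); the toy sentences are from slide 3.

```python
from gensim.models import Word2Vec

sentences = [
    ["祭止兀", "是", "最佳", "助選員"],
    ["蔡正元", "是", "最佳", "助選員"],
    ["罷免", "祭止兀", "失敗"],
    ["罷免", "蔡正元", "失敗"],
]

# sg=1 selects skip-gram; a real run needs far more text than this.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

print(model.wv["祭止兀"][:5])                  # learned semantic vector
print(model.wv.similarity("祭止兀", "蔡正元"))  # cosine similarity
```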
15.
Using a Neural Turing Machine to Build an Acrostic-Poem Generator
• Raw data: 20,000 poems from the Complete Tang Poems (全唐詩), one-hot encoded
• Training data: each poem with blanks punched in, giving only one character per line
• Standard answer: the full poem, which the Neural Turing Machine learns to fill in
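To make the data preparation concrete, here is a hypothetical sketch of the blank-punching step: keep one character per line (the acrostic heads) and let the full poem be the target. The poem, the blank marker, and the helper are my illustrations; the talk's actual preprocessing may differ.

```python
BLANK = "_"

def make_training_pair(poem_lines, keep_index=0):
    """Blank out every character except one per line; the full poem
    is the answer the Neural Turing Machine must reconstruct."""
    masked = [
        "".join(ch if i == keep_index else BLANK for i, ch in enumerate(line))
        for line in poem_lines
    ]
    return masked, poem_lines

poem = ["床前明月光", "疑是地上霜", "舉頭望明月", "低頭思故鄉"]
x, y = make_training_pair(poem)
print(x)  # ['床____', '疑____', '舉____', '低____']
```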
16.
Using a Neural Turing Machine to Build an Acrostic-Poem Generator
[Diagram: Neural Turing Machine architecture]
17.
Using a Neural Turing Machine to Build an Acrostic-Poem Generator
N-gram Language Model
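The slide names an N-gram language model as the generation component. As a reference point, here is a minimal bigram (N = 2) model over characters: P(wᵢ | wᵢ₋₁) is estimated by counting adjacent pairs. The two-line corpus is a toy stand-in for the 20,000 poems.

```python
from collections import Counter, defaultdict

def train_bigram(sequences):
    """Count how often each character follows each other character."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for prev, cur in zip(seq, seq[1:]):
            counts[prev][cur] += 1
    return counts

def prob(counts, prev, cur):
    """Maximum-likelihood estimate of P(cur | prev)."""
    total = sum(counts[prev].values())
    return counts[prev][cur] / total if total else 0.0

model = train_bigram(["床前明月光", "疑是地上霜"])
print(prob(model, "明", "月"))  # 1.0 in this toy corpus
```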
18.
Hands-On
• Acrostic-poem generator
19.
Follow-Up Work
• Pre-training with word2vec
• Word segmentation for Classical Chinese
• A JavaScript version of the acrostic-poem generator
20.
Further Reading
• Neural network training, with derivations:
• http://cpmarkchang.logdown.com/posts/277349-neural-network-backward-propagation
• Neural language models:
• http://cpmarkchang.logdown.com/posts/255785-neural-network-neural-probabilistic-language-model
• http://cpmarkchang.logdown.com/posts/276263--hierarchical-probabilistic-neural-networks-neural-network-language-model
• Word2vec:
• http://arxiv.org/pdf/1301.3781.pdf
• http://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf
• http://www-personal.umich.edu/~ronxin/pdf/w2vexp.pdf
21.
Speaker Contact
• Mark Chang
• Facebook: https://www.facebook.com/ckmarkoh.chang
• Github: http://github.com/ckmarkoh
• Blog: http://cpmarkchang.logdown.com
• Email: ckmarkoh at gmail.com
• Fumin
• Github: https://github.com/fumin
• Email: awawfumin at gmail.com