Paper sharing: Deep Learning for Smart Manufacturing: Methods and Applications
1. Deep learning for smart manufacturing:
Methods and applications
From Journal of Manufacturing Systems
Jinjiang Wang, Yulin Ma, Laibin Zhang, Robert X. Gao, Dazhong Wu (2018)
Presenter: 陳佑昇
2021/08/06
4. Vocabularies 2/3
P. English Chinese
146 contour 輪廓
146 spurious redundancy 干擾冗餘
147 sensory data 感知資料
147 aggregated 聚合
147 prescriptive analytics 指示性分析
147 up to date 最新的
147 building blocks 基石
148 backpropagation 反向傳播算法
149 contractive 收縮的
149 principal component analysis (PCA) 主成分分析
149 stochastic gradient descent (SGD) 隨機梯度下降法
149 denoising 去噪
149 Gaussian noise 高斯噪聲
149 resistant 抵抗
149 perturbation 擾動
150 vanishing 消失
150 forget gate 遺忘閥
150 prognostics 預測
151 arbitrary 任意
152 thresholding 門檻
5. Vocabularies 3/3
P. English Chinese
152 assessment 判定
152 fracture 斷裂
152 incipient 初期的
152 spectrum 頻譜
152 vibration 震動
152 permutation 排列
152 energy operator 能量算子
152 planetary gearbox 行星齒輪變速箱
152 corruption 腐壞
152 wind turbine 風力渦輪機
152 rolling bearing 滾動軸承
152 propagation 傳播
152 remaining useful life (RUL) 剩餘使用壽命
152 turbofan 渦輪發動機
152 polishing 拋光
152 semiconductor 半導體
152 ceramic bearing 陶瓷軸承
153 matter 問題
153 curse of dimensionality 維度災難
153 fusion 融合
6. Content
1. Introduction
2. Overview of data driven intelligence
3. Deep learning for smart manufacturing
4. Applications to smart manufacturing
5. Discussions and outlook
7. Introduction
• Various countries have developed
strategic roadmaps to transform
manufacturing to take advantage
of the emerging infrastructure
• According to an SMLC survey, 82% of the companies using smart manufacturing technologies have experienced increased efficiency, and 45% have experienced increased customer satisfaction
Germany (2010): Industry 4.0
United States (2011): created a systematic framework
China (2015): Made in China 2025
8. Introduction
• The massive data in smart manufacturing imposes a variety of challenges
• Data-driven intelligence needs to extract more actionable and insightful information
• Deep learning supports highly nonlinear and complex feature abstraction
10. The evolution of data-driven artificial intelligence 1/8
Timeline: Infancy period (1940s)
• MP model: McCulloch WS, Pitts WH. A logical calculus of the ideas immanent in nervous activity. Bull Math Biophys 1943;5(4):115–33.
• Hebb rule: Samuel AL. Some studies in machine learning using the game of checkers II—recent progress. Annu Rev Autom Program 2010;44(1–2):206–26.
• MP model (1943) and Hebb rule (1949) were proposed to discuss how neurons work in the human brain
• Significant artificial intelligence capabilities like playing chess games and solving
simple logic problems were developed
11. The evolution of data-driven artificial intelligence 2/8
Timeline: First upsurge period (1960s)
• Perceptron: Rosenblatt F. Perceptron simulation experiments. Proc IRE 1960;48(3):301–9.
• Adaptive Linear Unit: Widrow B, Hoff ME. Adaptive switching circuits. Cambridge: MIT Press; 1960.
• Perceptron (1956) was proposed to simulate human learning in the nervous system using linear optimization
• Adaptive Linear Unit (1959) had been successfully used in practical applications
• Both were criticized for their difficulty in handling non-linear problems, such as XOR (or XNOR) classification
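The XOR limitation can be verified numerically. The sketch below (illustrative, not from the paper) brute-forces a grid of perceptron weights and confirms that no single linear threshold unit classifies XOR correctly:

```python
import itertools

import numpy as np

# XOR truth table: inputs and target labels
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])

def perceptron_output(w, b, x):
    """A single linear threshold unit."""
    return 1 if np.dot(w, x) + b > 0 else 0

# Brute-force search over a grid of weights and biases:
# no (w1, w2, b) reproduces the XOR labels.
grid = np.linspace(-2, 2, 21)
separable = any(
    all(perceptron_output(np.array([w1, w2]), b, x) == t for x, t in zip(X, y))
    for w1, w2, b in itertools.product(grid, grid, grid)
)
print(separable)  # False: XOR is not linearly separable
```

The search fails for any grid, since the four XOR points cannot be split by a single line; a hidden layer (as in later multi-layer networks) is required.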
12. The evolution of data-driven artificial intelligence 3/8
Timeline: Second upsurge period (1980s)
• Hopfield network circuit: Tank DW, Hopfield JJ. Neural computation by concentrating information in time. Proc Natl Acad Sci USA 1987;84(7):1896.
• Back Propagation: Werbos PJ. Backpropagation through time: what it does and how to do it. Proc IEEE 1990;78(10):1550–60.
• Boltzmann Machine: Sussmann HJ. Learning algorithms for Boltzmann machines. 27th IEEE conference on decision and control 1988;1:786–91.
• Support Vector Machine: Vapnik VN. An overview of statistical learning theory. IEEE Trans Neural Netw 1998;10(5):988–99.
• Hopfield networks (1982) serve as associative memory systems with binary threshold nodes
• The BP (1974) algorithm was proposed to solve non-linear problems in complex neural networks, and the BM (1985) was put forward
• SVM (1997) showed decent performance on classification and regression
13. The evolution of data-driven artificial intelligence 4/8
Timeline: Second upsurge period (1980s)
• Restricted Boltzmann Machine: Smolensky P. Information processing in dynamical systems: foundations of harmony theory. Parallel distributed processing: explorations in the microstructure of cognition. Cambridge: MIT Press; 1986.
• Auto Encoder: Rumelhart DE, Hinton GE, Williams RJ. Learning representations by back-propagating errors. Nature 1986;323(6088):533–6.
• The traditional machine learning techniques discussed above require human expertise for feature extraction and rely heavily on engineered features; RBM and AE are deep learning models that instead learn data representations
• RBM (1986) was developed from the probability distribution of the Boltzmann Machine
• AE (1986) was proposed using a layer-by-layer greedy learning algorithm to minimize the loss function
14. The evolution of data-driven artificial intelligence 5/8
Timeline: Third boom period (after 2000s)
• Recurrent Neural Network: Hihi SE, Bengio Y. Hierarchical recurrent neural networks for long-term dependencies. Adv Neural Inf Process Syst 1995;8:493–9.
• Long Short-Term Memory: Hochreiter S, Schmidhuber J. Long short-term memory. Neural Comput 1997;9(8):1735.
• Convolutional Neural Network: LeCun Y, Bottou L, Bengio Y, Haffner P. Gradient-based learning applied to document recognition. Proc IEEE 1998;86(11):2278–324.
• RNN(1995) was proposed for feature learning from sequence data
• LSTM(1997) was proposed to tackle the vanishing gradient problem and deal with
complex time sequence data
• CNN(1998) was put forward to handle two dimensional inputs
• Despite many attempts, no satisfactory performance was reported before 2006
15. The evolution of data-driven artificial intelligence 6/8
Timeline: Third boom period (after 2000s)
• Deep Belief Network: Hinton GE, Salakhutdinov RR. Reducing the dimensionality of data with neural networks. Science 2006;313(5786):504–7. Hinton GE, Osindero S, Teh YW. A fast learning algorithm for deep belief nets. Neural Comput 2006;18(7):1527–54.
• Deep Auto Encoder: Deng L, Seltzer M, Yu D, Acero A, Mohamed A, Hinton GE. Binary coding of speech spectrograms using a deep auto-encoder. Proceedings of the 11th annual conference of the international speech communication association 2010;3:1692–5.
• Sparse Auto Encoder: Schölkopf B, Platt J, Hofmann T. Efficient learning of sparse representations with an energy-based model. Proceedings of advances in neural information processing systems 2006:1137–44. Ranzato MA, Boureau YL, LeCun Y. Sparse feature learning for deep belief networks. Proceedings of international conference on neural information processing systems 2007;20:1185–92.
• DBN (2006) was proposed with reduced computational complexity, and its parameters were successfully learned through layer-wise pre-training and fine-tuning
• Deep Auto Encoder (2005) was proposed by adding more hidden layers to deal with highly nonlinear input
• SAE (2006) was put forward to reduce dimensionality and learn sparse representations
• Deep learning gained increasing popularity
16. The evolution of data-driven artificial intelligence 7/8
Timeline: Third boom period (after 2000s)
• Deep Boltzmann Machine: Salakhutdinov RR, Hinton GE. Deep Boltzmann machines. J Mach Learn Res 2009;5(2):1967–2006.
• Denoising Auto Encoder: Larochelle H, Lajoie I, Bengio Y, Manzagol PA. Stacked denoising autoencoders: learning useful representations in a deep network with a local denoising criterion. J Mach Learn Res 2010;11(12):3371–408.
• Deep Convolutional Neural Network: Krizhevsky A, Sutskever I, Hinton GE. ImageNet classification with deep convolutional neural networks. International conference on neural information processing systems 2012;25:1097–105.
• DBM(2009) was proposed to learn ambiguous input data robustly, and the model
parameters were optimized using layer-wise pre-training
• DAE(2010) was presented to reconstruct the stochastically corrupted input data, and force
the hidden layer to discover more robust features
• DCNN(2012) was introduced with deep structure of Convolutional Neural Network, and it
showed superior performance in image recognition
17. The evolution of data-driven artificial intelligence 8/8
Timeline: Third boom period (after 2000s)
• Generative Adversarial Network: Goodfellow IJ, Pouget-Abadie J, Mirza M, Xu B, Warde-Farley D, Ozair S, et al. Generative adversarial nets. Int Conf Neural Inf Process Syst 2014;3:2672–80.
• Attention-based LSTM: Wang Y, Huang M, Zhao L, Zhu X. Attention-based LSTM for aspect-level sentiment classification. Proceedings of conference on empirical methods in natural language processing 2016:606–15.
• GAN(2014) contained two independent models acting as adversaries
• Attention-based LSTM model(2016) was proposed by integrating attention
mechanism with LSTM
• Nowadays, new models are being developed at a rapid pace, with new variants appearing almost weekly
19. Comparison between deep learning and traditional
machine learning
• Deep learning can more easily model nonlinear relationships using compositional functions
• The high level abstract representation in feature learning makes deep learning more
flexible and adaptable to data variety
• The ability to avoid feature engineering is regarded as a great advantage in smart
manufacturing
21. Deep learning for smart manufacturing
Key capabilities: data modelling, analysis, and support for real-time data processing
22. Convolutional neural network
CNN is a multi-layer feed-forward artificial neural network that was first proposed for two-dimensional image processing
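To make the convolution operation concrete, here is a minimal NumPy sketch of the 2-D (cross-correlation style) filtering that a CNN layer performs; the image and kernel values are illustrative assumptions, not from the paper:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation, the core operation of a CNN layer."""
    h, w = kernel.shape
    H, W = image.shape
    out = np.zeros((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Each output cell is the sum of an elementwise product
            # between the kernel and one image patch.
            out[i, j] = np.sum(image[i:i + h, j:j + w] * kernel)
    return out

image = np.array([[1., 0., 1., 0.],
                  [0., 1., 0., 1.],
                  [1., 0., 1., 0.],
                  [0., 1., 0., 1.]])
edge = np.array([[1., -1.],
                 [-1., 1.]])   # a simple 2x2 pattern detector
fmap = conv2d(image, edge)
print(fmap.shape)  # (3, 3): a 4x4 input and 2x2 kernel give a 3x3 feature map
```

Stacking such filtered (and subsampled) maps is what lets a CNN learn abstract features with far fewer parameters than a fully connected network.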
23. Restricted Boltzmann machine and its variant
RBM is a two-layer neural network consisting of a visible and a hidden layer. Its deep variants:
• Deep Belief Network (DBN): the highest layers are undirected while the lower layers are directed
• Deep Boltzmann Machine (DBM): the hidden units are grouped into a hierarchy of layers
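As a rough illustration of how the RBM's bipartite structure is used, the sketch below (hypothetical weights and sizes, not from the paper) computes the factorized conditional probabilities and one Gibbs sampling step:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical tiny RBM: 4 visible units, 3 hidden units
W = rng.normal(scale=0.1, size=(4, 3))   # visible-hidden weights
b_v = np.zeros(4)                        # visible biases
b_h = np.zeros(3)                        # hidden biases

v = np.array([1., 0., 1., 1.])           # a binary visible vector

# Because the two layers are bipartite (no intra-layer connections),
# the conditional distributions factorize per unit:
p_h = sigmoid(v @ W + b_h)               # P(h_j = 1 | v)
h = (rng.random(3) < p_h).astype(float)  # sample hidden states
p_v = sigmoid(h @ W.T + b_v)             # P(v_i = 1 | h): one Gibbs step back
print(p_h.shape, p_v.shape)  # (3,) (4,)
```

Repeating this back-and-forth sampling is the basis of contrastive-divergence training, and stacking trained RBMs yields the DBN and DBM architectures above.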
24. Auto encoder and its variants
AE is an unsupervised learning algorithm that extracts features from input data without requiring label information. Its variants:
• DAE (Denoising): denoises corrupted input to discover more robust features
• SAE (Sparse): imposes sparsity constraints on the learned code
• CAE (Contractive): forces the model to be resistant to small perturbations
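A minimal NumPy sketch can illustrate how the DAE and SAE objectives differ from a plain AE; all weights, sizes, and penalty values here are illustrative assumptions, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = rng.random(8)                        # clean input vector
W = rng.normal(scale=0.1, size=(8, 3))   # encoder weights (decoder tied: W.T)

# Plain AE: encode then reconstruct x from its low-dimensional code
code = sigmoid(x @ W)
x_hat = sigmoid(code @ W.T)

# DAE: corrupt the input with Gaussian noise, but compare the
# reconstruction against the CLEAN x, forcing robust features
x_noisy = x + rng.normal(scale=0.1, size=8)
x_hat_dae = sigmoid(sigmoid(x_noisy @ W) @ W.T)
loss_dae = np.mean((x_hat_dae - x) ** 2)

# SAE: add a sparsity penalty on the code activations
sparsity_penalty = 0.01 * np.sum(np.abs(code))
loss_sae = np.mean((x_hat - x) ** 2) + sparsity_penalty
print(loss_dae > 0, loss_sae > 0)
```

Training would minimize these losses by gradient descent; the point of the sketch is only where each variant changes the objective (corrupted input for DAE, an extra penalty term for SAE).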
25. Recurrent neural network and its variants
RNN has the unique characteristic that topological connections between neurons form directed cycles, which suits sequence data
• RNN: difficulty in dealing with long-term sequence data
• LSTM: allows information to flow down the sequence through linear interactions
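The forget gate and the linear cell-state interaction can be sketched as a single LSTM cell step in NumPy (weights and sizes are illustrative assumptions, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_in, n_hid = 4, 3
# One weight matrix per gate; input x is stacked with hidden state h
Wf, Wi, Wo, Wc = (rng.normal(scale=0.1, size=(n_in + n_hid, n_hid))
                  for _ in range(4))

def lstm_step(x, h, c):
    z = np.concatenate([x, h])
    f = sigmoid(z @ Wf)                  # forget gate: what to drop from the cell
    i = sigmoid(z @ Wi)                  # input gate: what new info to write
    o = sigmoid(z @ Wo)                  # output gate: what to expose
    c_new = f * c + i * np.tanh(z @ Wc)  # linear interaction keeps gradients alive
    h_new = o * np.tanh(c_new)
    return h_new, c_new

h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(5):                       # run over a short random sequence
    h, c = lstm_step(rng.random(n_in), h, c)
print(h.shape, c.shape)  # (3,) (3,)
```

The additive update of `c_new` is what mitigates the vanishing-gradient problem that plagues a plain RNN on long sequences.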
26. Model comparison
Table 3. Comparison between different deep learning models

CNN
• Principle: abstracted features are learned by stacked convolutional and sampling layers
• Pros: reduced parameter number; invariance to shift, scale, and distortion
• Cons: high computational complexity when training highly hierarchical models

RNN
• Principle: temporal patterns are stored in recurrent neuron connections and distributed hidden states for time-series data
• Pros: short-term information is retained and temporal correlations are captured in sequence data
• Cons: difficult to train the model and preserve long-term dependencies

RBM
• Principle: the hidden layer describes variable dependencies and connections between input and output layers as representative features
• Pros: robust to ambiguous input; training labels are not required in the pre-training stage
• Cons: time-consuming joint parameter optimization

AE
• Principle: unsupervised feature learning and data dimensionality reduction are achieved through encoding
• Pros: irrelevance in the input is eliminated and meaningful information is preserved
• Cons: errors propagate layer by layer, and sparse representations are not guaranteed
28. Applications to smart manufacturing
• Computational intelligence is an essential part of smart manufacturing, enabling accurate insights for better decision making
• Product quality inspection, fault diagnosis, and defect prognosis have recently been investigated for a wide range of manufacturing systems
Table 5. A list of deep learning models with applications [reference (application count / total count)]
• CNN: surface integration inspection (1/4) [72–75]; machinery fault diagnosis (8/8) [77–84]
• DBN: machinery fault diagnosis (8/8) [85–92]; predictive analytics & defect prognosis (2/4) [109–112]
• AE: machinery fault diagnosis (3/11) [93–103]
• RNN: predictive analytics & defect prognosis (4/5) [104–108]
29. Descriptive analytics for product quality inspection
Products are inspected employing machine vision and image processing techniques to detect surface defects for enhanced product quality in manufacturing
Deep learning (mainly CNN) has been investigated to learn high-level generic features and has been applied to a wide range of textures and difficult-to-detect defect cases
30. Diagnostic analytics for fault assessment
Monitor machinery conditions,
identify the incipient defects,
diagnose the root cause of
failures, and then incorporate the
information into manufacturing
production and control
Deep learning models outperform traditional machine learning techniques in terms of classification accuracy
• CNN: bearing, gearbox, wind generator, and rotor
• DBN: aircraft engine, chemical process, reciprocating compressor, rolling element bearing, high-speed train, and wind turbine
• AE: planetary gearbox, wind turbine, and rolling bearing
31. Predictive analytics for defect prognosis
Develop and implement an
intelligent maintenance strategy
that allows manufacturers to
determine the condition of in-
service systems in order to predict
when maintenance should be
performed
Deep learning has been presented for machine anomaly prediction, job schedule optimization, and computational load balancing
• DBN: material removal rate, chemical mechanical polishing, and ceramic bearing
• RNNs (LSTM): rolling bearing, machine health monitoring, tool wear prediction, and aircraft turbofan engine
33. Discussions and outlook
Most companies do not know
what to do with the data they
have, and they lack software
and modelling to interpret
and analyze them
To address these challenges, deep learning for smart manufacturing is discussed in terms of data matter, model selection, model visualization, generic models, and incremental learning
Five gaps are identified in smart manufacturing (from Kusiak A. Smart manufacturing must embrace big data. Nature 2017;544(7648):23–5):
• Improved data collection
• Use and sharing
• Predictive model design
• Generalized predictive models
• Connected factories and control processes
34. Discussions and outlook
Challenge: Data matter
• Description: models heavily depend on the scale and quality of datasets; class imbalance is a problem
• Solution: extract the relevant data and apply it to the appropriate task; take appropriate measures and integrate bootstrapping

Challenge: Model selection
• Description: complexity in the manufacturing process poses different problems
• Solution: supervised learning for data-rich but knowledge-sparse problems, i.e. where labelled data are available

Challenge: Model visualization
• Description: models need to be understood by manufacturing engineers
• Solution: visualization and fusion may contribute to a more effective model

Challenge: Generic model
• Description: models should not be bound to specific machines
• Solution: increase model width or depth; apply to large-scale and real-time analytics using GPUs; choose appropriate models

Challenge: Incremental learning
• Description: learning algorithms are not fundamentally built to learn incrementally and are therefore susceptible to data velocity issues
• Solution: necessary to enable deep learning with incremental learning capabilities
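The bootstrapping remedy for class imbalance can be sketched as resampling the minority class with replacement; the fault-diagnosis labels and sizes below are hypothetical, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical imbalanced fault-diagnosis set: 95 healthy, 5 faulty samples
y = np.array([0] * 95 + [1] * 5)
X = rng.random((100, 3))

# Bootstrap (resample with replacement) the minority class up to parity
minority = np.where(y == 1)[0]
extra = rng.choice(minority, size=90, replace=True)
X_bal = np.vstack([X, X[extra]])
y_bal = np.concatenate([y, y[extra]])
print(np.bincount(y_bal))  # [95 95]
```

Balancing the classes this way keeps a model from trivially predicting "healthy" everywhere, at the cost of duplicated minority samples; synthetic oversampling is a common alternative.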
35. Conclusions
• Deep learning provides advanced analytics and offers great potential for smart manufacturing in the age of big data
• The emerging research effort on deep learning in manufacturing applications is also summarized
• Deep learning may be pushed into the cloud, enabling more convenient and on-demand computing services for smart manufacturing