CNN-Syllabus
CNN: Past and Present (Mind Map)
Definition
Uses the convolution operation in place of general matrix multiplication
Neural Network
Originally designed to process visual imagery
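A minimal NumPy sketch of the definition above (function and variable names are mine): a convolution slides a small set of shared weights over the input, where a fully connected layer would need one weight per input-output pair.

```python
import numpy as np

def conv2d(image, kernel):
    """Slide a small kernel over the image (cross-correlation, as in
    deep-learning 'convolution'); every output pixel reuses the same
    few kernel weights instead of a full weight matrix."""
    H, W = image.shape
    k, _ = kernel.shape
    out = np.zeros((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + k, j:j + k] * kernel)
    return out

image = np.arange(25, dtype=float).reshape(5, 5)
kernel = np.array([[1., 0.], [0., -1.]])  # 4 shared weights
print(conv2d(image, kernel).shape)        # (4, 4)

# The fully connected equivalent maps all 25 inputs to all 16 outputs:
# 25 * 16 = 400 weights versus the 4 shared weights above.
```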
Structure Timeline
Channel Boosting
Channel Boosted CNN
2018
Attention
CBAM
2018
Attention
Residual Attention Module
2018
Feature Map Exploitation
SE Net
2018
Width Exploitation
Pyramidal Net
2017
Width Exploitation
Poly Net
2017
Width Exploitation
Wide ResNet
2017
Width Exploitation
ResNeXt
2017
Depth Revolution
FractalNet
2017
Depth Revolution
DenseNet
2016
Feature Map Exploitation
CMPE-SE
2018
Multi-Path Connectivity
Depth Revolution
Highway Net
2015
Skip Connections
FractalNet
2017
Parameter Optimisation
Feature Visualisation
ZFNet
2013
Skip Connections
DenseNet
2016
Depth Exploitation
Spatial Exploitation
ImageNet
2010
NVIDIA
2007
GPU Applied
2006
Max Pooling
2006
CNN Stagnation
Early 2000s
Depth Revolution
Skip Connection
2015
ResNet
ResNet18
ENet
SegNet
FCN
DeconvNet
DeepLab
GCN
ResNet34
ResNet50
ResNet101
ResNet152
VGG
2014
Effective Receptive Field (Small Size Filters)
VGG-19
VGG-16
GoogLeNet
2014
Factorization
Inception-ResNet-v2
Inception-ResNet-v1
Inception V4
Inception V3
Bottleneck
Inception V2
Inception V1
Inception Block
Parallelism
Spatial Exploitation
AlexNet
2012
SqueezeNet
ShuffleNet
LeNet-5
1998
ConvNet
1989
Neocognitron
1979
Programming
PyTorch
Keras
Tensorflow
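As an illustration of the frameworks listed above, a minimal LeNet-5-style network in PyTorch; the layer sizes follow the classic 32x32 grayscale setup, and the class name is mine.

```python
import torch
import torch.nn as nn

class LeNet5(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5),   # 32x32 -> 28x28
            nn.Tanh(),
            nn.AvgPool2d(2),                  # 28x28 -> 14x14
            nn.Conv2d(6, 16, kernel_size=5),  # 14x14 -> 10x10
            nn.Tanh(),
            nn.AvgPool2d(2),                  # 10x10 -> 5x5
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120), nn.Tanh(),
            nn.Linear(120, 84), nn.Tanh(),
            nn.Linear(84, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

x = torch.randn(1, 1, 32, 32)
print(LeNet5()(x).shape)  # torch.Size([1, 10])
```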
Problems
Overfitting
Buzz words
Pooling
Max Pooling
Average Pooling
Stochastic Pooling
Pooling Size
Mask Matrix
Feature Map
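A minimal NumPy sketch of the three pooling variants listed above, downsampling a feature map with non-overlapping windows; the stochastic case assumes non-negative (post-ReLU) activations, and all names are mine.

```python
import numpy as np

rng = np.random.default_rng(0)

def pool2d(fmap, size=2, mode="max"):
    """Downsample a 2D feature map with non-overlapping size x size windows."""
    H, W = fmap.shape
    out = np.zeros((H // size, W // size))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            window = fmap[i*size:(i+1)*size, j*size:(j+1)*size]
            if mode == "max":
                out[i, j] = window.max()
            elif mode == "average":
                out[i, j] = window.mean()
            elif mode == "stochastic":
                # sample an activation with probability proportional to
                # its value (assumes non-negative activations)
                p = window.ravel() / window.sum()
                out[i, j] = rng.choice(window.ravel(), p=p)
    return out

fmap = rng.random((4, 4))  # toy 4x4 feature map
for mode in ("max", "average", "stochastic"):
    print(mode, pool2d(fmap, 2, mode))
```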
Convolutional layers
an input layer
hidden layers
an output layer
Filter
Beyond 2D; normally 3D (2D spatial extent x input channels)
Kernel Size
2D
Stride
Padding
Dilation
Weights
Parameters
Parameter sharing
Local connectivity
Spatial arrangement
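The spatial-arrangement hyperparameters above (kernel size k, stride s, padding p, dilation d) fix the output size via the standard formula o = floor((i + 2p - d*(k - 1) - 1) / s) + 1. A small PyTorch check of the formula and of parameter sharing (the helper name is mine):

```python
import torch
import torch.nn as nn

def out_size(i, k, s=1, p=0, d=1):
    # standard convolution output-size formula
    return (i + 2 * p - d * (k - 1) - 1) // s + 1

# 3-channel 32x32 input; 16 filters of spatial size 3x3.
# Each filter is 3D (3 x 3 x 3) and is shared across all positions.
conv = nn.Conv2d(in_channels=3, out_channels=16,
                 kernel_size=3, stride=2, padding=1, dilation=1)
x = torch.randn(1, 3, 32, 32)
print(conv(x).shape)                     # torch.Size([1, 16, 16, 16])
print(out_size(32, k=3, s=2, p=1, d=1))  # 16

# parameter sharing: 16 filters * (3*3*3 weights) + 16 biases = 448
print(sum(p.numel() for p in conv.parameters()))  # 448
```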
Early stopping
Added regularizer
weight decay
max norm constraints
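A sketch of two of the regularizers above in PyTorch: weight decay is a built-in optimizer argument, while the max-norm constraint is applied by hand after each update step (the helper function is my own, not a library API).

```python
import torch
import torch.nn as nn

layer = nn.Linear(100, 10)

# weight decay: an L2 penalty folded into the SGD update rule
opt = torch.optim.SGD(layer.parameters(), lr=0.01, weight_decay=1e-4)

def apply_max_norm(module, max_norm=3.0):
    """Max-norm constraint: after each update, clip each unit's
    incoming-weight vector to an L2 norm of at most max_norm."""
    with torch.no_grad():
        w = module.weight  # shape (out_features, in_features)
        norms = w.norm(dim=1, keepdim=True).clamp(min=1e-12)
        w.mul_(norms.clamp(max=max_norm) / norms)
```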
Receptive Field
Devices
GPU
Xeon Phi
CPU
Fine-tuning
Human interpretable explanations
Residual Connection
Factorization
Downsampling
Upsampling
Attention
Feature Invariance
Normalisation
Local Response Normalization
Data Augmentation
Optimizer
Exponentially weighted average
bias correction in exponentially weighted average
momentum
Nesterov Momentum
Adagrad
Adadelta
RMSprop
Adam
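The exponentially weighted average and its bias correction, listed above, are the moving-average machinery behind momentum, RMSprop and Adam; a toy NumPy trace (values are illustrative):

```python
import numpy as np

beta = 0.9
v, t = 0.0, 0
history = []
for g in [1.0, 1.0, 1.0, 1.0, 1.0]:  # toy gradient stream
    t += 1
    v = beta * v + (1 - beta) * g     # raw EWA, biased toward 0 at small t
    v_hat = v / (1 - beta ** t)       # bias-corrected estimate
    history.append((round(v, 4), round(v_hat, 4)))
print(history)  # v_hat is ~1.0 from step 1; the raw v warms up slowly

# Adam combines a first-moment EWA (momentum) with a second-moment EWA
# (RMSprop-style), both bias-corrected:
#   m_hat = m / (1 - beta1**t);  s_hat = s / (1 - beta2**t)
#   w -= lr * m_hat / (sqrt(s_hat) + eps)
```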
Convergence
Transfer Learning
Gradient Descent
Batch gradient descent
Mini-batch gradient descent
Stochastic gradient descent
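A minimal NumPy sketch of the three gradient-descent variants above on a toy linear-regression problem; a single batch_size knob switches between them, and all names are mine.

```python
import numpy as np

rng = np.random.default_rng(0)
X, y = rng.random((1000, 5)), rng.random(1000)  # toy dataset
w = np.zeros(5)

def grad(Xb, yb, w):
    # gradient of mean squared error for a linear model
    return 2 * Xb.T @ (Xb @ w - yb) / len(yb)

# batch_size = len(X): batch GD; 1: stochastic GD; in between: mini-batch
batch_size, lr = 32, 0.1
for epoch in range(10):
    idx = rng.permutation(len(X))  # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        b = idx[start:start + batch_size]
        w -= lr * grad(X[b], y[b], w)  # one (mini-batch) update
```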
Activation Function
ReLU
ReLU6
SoftPlus
SoftMax
Tanh
Sigmoid
Bias_add
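The activation functions named above, written as plain NumPy expressions (a sketch; softmax subtracts the max for numerical stability):

```python
import numpy as np

relu     = lambda x: np.maximum(0, x)
relu6    = lambda x: np.minimum(np.maximum(0, x), 6)  # clipped at 6
softplus = lambda x: np.log1p(np.exp(x))              # smooth ReLU
tanh     = np.tanh
sigmoid  = lambda x: 1 / (1 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())  # shift by max for numerical stability
    return e / e.sum()

x = np.array([-2.0, 0.0, 3.0])
print(relu(x), relu6(x), softmax(x))
```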
Dropout (Neurons)
dropout rate
range=[0,1)
empirically set to [0.3, 0.5]
before dropout
after dropout
rescale rate
rescale rate = 1 / (1 - dropout rate)
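A minimal inverted-dropout sketch in NumPy showing the rescale rate from the formula above; the function name is mine.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(a, rate=0.5, training=True):
    """Inverted dropout: zero each activation with probability `rate`,
    then rescale survivors by 1 / (1 - rate) so the expected layer
    output is unchanged and no rescaling is needed at test time."""
    if not training or rate == 0.0:
        return a
    mask = rng.random(a.shape) >= rate  # keep with probability 1 - rate
    return a * mask / (1.0 - rate)

a = np.ones(8)               # activations before dropout
print(dropout(a, rate=0.5))  # after dropout: survivors rescaled to 2.0
```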
DropConnect
DepthConcat
Forward
Backpropagation
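A minimal forward/backward round trip in PyTorch: the forward pass builds the computation graph, and backpropagation fills each parameter's .grad (the model and shapes are illustrative).

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Conv2d(1, 4, 3), nn.ReLU(),
                      nn.Flatten(), nn.Linear(4 * 26 * 26, 10))
x, target = torch.randn(8, 1, 28, 28), torch.randint(0, 10, (8,))

logits = model(x)                                   # forward pass
loss = nn.functional.cross_entropy(logits, target)  # scalar loss
loss.backward()                                     # backpropagation
print(model[0].weight.grad.shape)                   # torch.Size([4, 1, 3, 3])
```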
Applications
Image recognition
Video analysis
Natural language processing
Anomaly Detection
Drug discovery
Health risk assessment
Biomarkers of aging discovery
Checkers game
Go
Time series forecasting
Cultural Heritage and 3D-datasets
Pooling layer
Loss layer
Activation layer
Fully connected layer