Densely Connected Convolutional Networks: Beyond ResNet

Topic: Densely connected convolutional networks

Dosudo deep learning newsletter #4


Editor:  George.Wu

Resources:    Paper link     Github      Tensorflow code

Label: Network architecture

2017 CVPR Best Paper

Generally speaking, deeper convolutional neural networks (CNNs) achieve better performance, but they suffer from the vanishing gradient problem: during training, the gradient signal diminishes or dissipates as it passes through more and more layers. A number of studies have tried to solve this problem so that the gradient signal survives even through networks hundreds of layers deep. These include Residual Networks (ResNets) [1], which build shortcut connections between layers; Highway Networks [2], which go further and use learnable gates to decide which layer-to-layer shortcuts are active; and Stochastic Depth [3], which randomly drops layers during training, letting the identity shortcut carry the signal. What these improvements have in common is that they all create short paths between non-adjacent layers.

The Dense Convolutional Network (DenseNet) does not merely add a few such short paths between non-adjacent layers; it connects every layer to every other layer. During training, this ensures that the gradient signal propagates correctly to every layer. Intuitively, it also means that higher-level features benefit not only from the immediately preceding layer: low-level features can contribute directly to high-level features as well. In image processing, for example, the first few layers typically produce edge-detection results, and those features may also provide direct help for face detection.
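The connect-everything-to-everything idea can be sketched with plain NumPy: each layer receives the channel-wise concatenation of the input and all preceding layers' outputs. This is a minimal illustration, not the paper's implementation; the toy "layers" here are simple 1x1-conv-like linear maps, an assumption made just to keep the example self-contained.

```python
import numpy as np

def dense_connectivity(x, layer_fns):
    """Sketch of DenseNet-style connectivity: layer l computes
    H_l([x0, x1, ..., x_{l-1}]), the concatenation of all prior features."""
    features = [x]
    for fn in layer_fns:
        out = fn(np.concatenate(features, axis=-1))
        features.append(out)
    return np.concatenate(features, axis=-1)

# Toy example: each "layer" is a linear map over channels producing k=2 new channels.
rng = np.random.default_rng(0)
k = 2                                      # growth rate: new channels per layer
x = rng.standard_normal((1, 8, 8, 4))      # NHWC input with 4 channels

def make_layer(in_ch, out_ch):
    w = rng.standard_normal((in_ch, out_ch)) * 0.1
    return lambda f: f @ w                 # 1x1-conv-like map over the channel axis

# Layer l sees 4 + l*k input channels, since every earlier layer added k channels.
layers = [make_layer(4 + l * k, k) for l in range(3)]
out = dense_connectivity(x, layers)
print(out.shape)  # (1, 8, 8, 10): 4 input channels + 3 layers * 2 new channels
```

Note how the input channel count of each layer grows linearly: this is exactly why DenseNet can use a small growth rate k per layer yet still give every layer access to all earlier features.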

In practice, a DenseNet is composed of multiple dense blocks. Within each dense block, several convolutional layers with the same feature-map size and filter count are densely connected to one another; between dense blocks sit transition layers, each consisting of a convolutional layer followed by a pooling layer. DenseNet's advantages are that it alleviates the vanishing gradient problem, strengthens feature propagation and feature reuse, and greatly reduces the number of parameters.
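The block/transition structure above implies simple channel-count bookkeeping: a dense block of L layers with growth rate k adds L*k channels, and the transition layer's convolution can shrink the channels before pooling. A small sketch, assuming the halving compression factor theta=0.5 used in the DenseNet-BC variant (an assumption; the text above only says the transition is a conv plus pooling):

```python
def densenet_channel_counts(c0, block_sizes, k, theta=1.0):
    """Track feature-map channel counts through dense blocks and transitions.
    After a dense block of L layers: c + L*k (each layer adds k channels).
    After a transition conv: floor(theta * c); theta=0.5 is the DenseNet-BC
    compression, assumed here for illustration."""
    c = c0
    counts = [c]
    for i, num_layers in enumerate(block_sizes):
        c = c + num_layers * k
        counts.append(c)
        if i < len(block_sizes) - 1:   # transition after every block but the last
            c = int(theta * c)
            counts.append(c)
    return counts

# DenseNet-121-like layout: 4 blocks of (6, 12, 24, 16) layers, growth rate 32.
print(densenet_channel_counts(64, [6, 12, 24, 16], k=32, theta=0.5))
# [64, 256, 128, 512, 256, 1024, 512, 1024]
```

Even with hundreds of layers' worth of connections, each layer only produces k channels, which is where the parameter savings over wide ResNet-style layers come from.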


Reference:

[1] He, Kaiming, et al. “Deep residual learning for image recognition.” Proceedings of the IEEE conference on computer vision and pattern recognition. 2016.    Github

[2] Srivastava, Rupesh Kumar, Klaus Greff, and Jürgen Schmidhuber. “Highway networks.” arXiv preprint arXiv:1505.00387 (2015).  Github

[3] Huang, Gao, et al. “Deep networks with stochastic depth.” European Conference on Computer Vision. Springer International Publishing, 2016. Github
