Abstract:
In convolutional neural networks, downsampling is typically performed with average-pooling, which treats all activations of a pooling region equally, or with max-pooling, which retains only the element with the maximum activation and discards the others. Both operations are restrictive and have previously been shown to be sub-optimal. To address this issue, this work introduces a novel pooling scheme, named ordinal pooling. Ordinal pooling sorts all the elements of a pooling region into a sequence and assigns a different weight to each element according to its rank in that sequence. The pooling output is then computed as a weighted sum of the ranked elements. The weights are learned via standard gradient-based training, which allows the network to learn, in a differentiable manner, any behavior in the spectrum between average-pooling and max-pooling. Our experiments suggest that it is advantageous for networks to perform different types of pooling operations within a pooling layer and that a hybrid behavior between average- and max-pooling is often beneficial. More importantly, they also demonstrate that ordinal pooling leads to consistent accuracy improvements over average- or max-pooling while speeding up training and reducing sensitivity to the choice of pooling operations and activation functions used in the network. In particular, ordinal pooling mainly helps in lightweight or quantized deep learning architectures, as typically considered, e.g., for embedded applications.
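The operation described in the abstract, sorting the elements of each pooling region and combining them as a weighted sum over ranks, can be sketched as follows. This is a minimal NumPy illustration, not the authors' released implementation; the function name, interface, and the choice of non-overlapping 2x2 regions on a single 2D feature map are assumptions made for clarity.

```python
import numpy as np

def ordinal_pool(x, weights, size=2):
    """Illustrative ordinal pooling over non-overlapping regions.

    x: 2D feature map of shape (H, W), with H and W divisible by `size`.
    weights: 1D array of length size*size, one weight per rank.
    In a real network, `weights` would be learned by gradient descent;
    here they are fixed to show the spectrum of behaviors.
    """
    H, W = x.shape
    k = size
    # Split the map into (H//k, W//k) non-overlapping k x k regions,
    # then flatten each region into a vector of k*k activations.
    regions = (x.reshape(H // k, k, W // k, k)
                .transpose(0, 2, 1, 3)
                .reshape(H // k, W // k, k * k))
    # Sort each region's activations in descending order (rank 0 = max).
    ordered = np.sort(regions, axis=-1)[..., ::-1]
    # Weighted sum over the ranked elements.
    return ordered @ weights

x = np.array([[1., 2., 5., 6.],
              [3., 4., 7., 8.]])
w_max = np.array([1., 0., 0., 0.])  # all weight on rank 0: max-pooling
w_avg = np.full(4, 0.25)            # equal weights: average-pooling
print(ordinal_pool(x, w_max))  # [[4. 8.]]
print(ordinal_pool(x, w_avg))  # [[2.5 6.5]]
```

With `w_max` the weighted sum reduces to max-pooling and with `w_avg` to average-pooling; any intermediate weight vector yields the hybrid behaviors the paper reports as beneficial, and the whole operation stays differentiable with respect to the weights.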
Research center:
Montefiore Institute of Electrical Engineering and Computer Science - ULiège, Telim
Disciplines:
Electrical & electronics engineering
Author, co-author:
Deliège, Adrien ; Université de Liège - ULiège > Dept. of Electrical Engineering and Computer Science (Montefiore Institute) > Telecommunications
Istasse, Maxime
Kumar, Ashwani
De Vleeschouwer, Christophe
Van Droogenbroeck, Marc ; Université de Liège - ULiège > Dept. of Electrical Engineering and Computer Science (Montefiore Institute) > Telecommunications
Document language:
English
Title:
Ordinal Pooling
Publication date:
2020
Event name:
British Machine Vision Conference (BMVC)
Event organizer:
British Machine Vision Association
Event location:
Cardiff, United Kingdom
Event dates:
from 09-09-2019 to 12-09-2019
Event scope:
International
Main work title:
30th British Machine Vision Conference
Peer reviewed:
Peer reviewed
Research project title:
DeepSport
Funding body:
DGTRE - Région wallonne. Direction générale des Technologies, de la Recherche et de l'Énergie [BE]
Comment:
Code will be available at
https://github.com/mistasse/ordinal-pooling-layers.