seminar on optimization for deep neural networks


Radu Horaud

Jun 15, 2018, 5:39:00 AM
to diffusion-montbonnot, ljk-...@univ-grenoble-alpes.fr, smile-in...@googlegroups.com, FERIELLE PODGORSKI

Sparse representation, dictionary learning, and deep neural networks: their connections and new algorithms

Seminar by Mostafa Sadeghi, Sharif University of Technology, Tehran

Tuesday 19 June 2018, 14:30 – 15:30, room F107, INRIA Montbonnot Saint-Martin


Abstract. Over the last decade, sparse representation, dictionary learning, and deep artificial neural networks have dramatically impacted the signal processing and machine learning fields, yielding state-of-the-art results in a variety of tasks, including image enhancement and reconstruction, pattern recognition and classification, and automatic speech recognition. In this talk, we give a brief introduction to these subjects and present new algorithms and perspectives. Specifically, we introduce efficient algorithms for sparse recovery and dictionary learning, mostly based on proximal methods in optimization. Furthermore, we present a new algorithm for systematically designing large artificial neural networks using a progression property: a greedy algorithm that progressively adds nodes and layers to the network. We also discuss an effective method, inspired by existing dictionary learning techniques, for reducing the number of training parameters in neural networks, thereby facilitating their use in applications with limited memory and computational resources. Further connections among sparse representation, dictionary learning, and deep neural networks will also be discussed.


More information here:

-----------------------
Radu HORAUD  
Assistant: Nathalie Gillot (Nathali...@inria.fr)
