Free Tutorial

Pre-Workshop Tutorial: Deep Neural Networks Demystified


Tagungsraum (conference room), March 9, 2017, 10:30 - 12:15


Dr. Ralph Grothmann

The tutorial deals with a recent hype wave in neural network modeling called Deep Learning or Deep Neural Networks. We will look behind the scenes and explain the differences between “standard” feedforward and deep neural network models. The decay of gradient information over long chains of hidden layers can be mitigated by, for example, multiple outputs, information highways or shortcuts, as well as stochastic learning rules. Auto-associators and convolutions enable us to model high-dimensional input spaces through the automated generation of features. Besides deep feedforward neural networks we will also deal with time-delay recurrent neural network architectures, where deepness is a natural feature when learning techniques like error backpropagation through time are used. Simple recurrent neural networks, long short-term memory (LSTM) networks, echo state networks and large recurrent neural networks are popular examples. We will give examples of the application of deep neural networks from recent project work and show the merits of the “new” approach in comparison to non-deep modeling techniques.
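
To make the shortcut idea concrete, here is a minimal sketch of a deep feedforward network with skip connections, one of the remedies for decaying gradients mentioned above. The framework (PyTorch), layer widths and depth are assumptions for illustration, not material from the tutorial itself.

```python
import torch
import torch.nn as nn

class SkipBlock(nn.Module):
    """One hidden layer whose output is added to its input (a shortcut)."""
    def __init__(self, width: int):
        super().__init__()
        self.linear = nn.Linear(width, width)

    def forward(self, x):
        # The identity path in "x + ..." gives the gradient a direct route
        # backwards, so it does not have to pass through every nonlinearity.
        return x + torch.tanh(self.linear(x))

class DeepSkipNet(nn.Module):
    """A deep feedforward network built from shortcut blocks."""
    def __init__(self, n_inputs: int, width: int, depth: int, n_outputs: int):
        super().__init__()
        self.embed = nn.Linear(n_inputs, width)
        self.blocks = nn.Sequential(*[SkipBlock(width) for _ in range(depth)])
        self.head = nn.Linear(width, n_outputs)

    def forward(self, x):
        return self.head(self.blocks(torch.tanh(self.embed(x))))

# Usage: a 20-hidden-layer network that still trains with plain backpropagation.
model = DeepSkipNet(n_inputs=10, width=32, depth=20, n_outputs=1)
y = model(torch.randn(8, 10))  # batch of 8 samples
```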
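The remark that deepness is a natural feature of time-delay recurrent networks can likewise be illustrated with a small sketch: unfolding the state transition over T time steps produces T stacked layers with shared weights, trained by error backpropagation through time. Again, the framework and all sizes are assumptions chosen for illustration, not the presenter's code.

```python
import torch
import torch.nn as nn

class SimpleRecurrentNet(nn.Module):
    """A simple recurrent network, written as an explicit unfolding in time."""
    def __init__(self, n_inputs: int, n_state: int, n_outputs: int):
        super().__init__()
        self.in_to_state = nn.Linear(n_inputs, n_state, bias=False)
        self.state_to_state = nn.Linear(n_state, n_state)
        self.state_to_out = nn.Linear(n_state, n_outputs)

    def forward(self, x_seq):
        # x_seq: (T, batch, n_inputs); the loop below is the unfolding in time.
        state = torch.zeros(x_seq.size(1), self.state_to_state.in_features)
        outputs = []
        for x_t in x_seq:  # one "layer" per time step, all sharing the same weights
            state = torch.tanh(self.state_to_state(state) + self.in_to_state(x_t))
            outputs.append(self.state_to_out(state))
        return torch.stack(outputs)

# Usage: a sequence of 50 steps behaves like a 50-layer network with shared weights.
net = SimpleRecurrentNet(n_inputs=4, n_state=16, n_outputs=1)
y_seq = net(torch.randn(50, 8, 4))
loss = y_seq.pow(2).mean()
loss.backward()  # error backpropagation through time over the unfolded graph
```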