Study Group "Deep Learning"

  • General Information
  • Sessions
  • Literature

General Information

Lecturer: (various)
Venue: Thursday 15:30 – 17:00, Mediathek, B11
First Session: November 12th, 2015

Sessions

  • [November 12th, 2015]
    Introduction and Planning.
  • [November 19th, 2015]
    Reading: (Bengio, 2009) chapters 1-3.
  • [December 17th, 2015]
    Reading: (Socher, 2015) Lecture Notes.

Upcoming Sessions

  • [tbd]
    Distributed Representations of Words and Phrases and their Compositionality (Johannes)
  • [tbd]
    Ad Hoc Monitoring of Vocabulary Shifts over Time (Michael)
  • [tbd]
    Overview: Recursive and recurrent neural networks (Jonas)
  • [tbd]
    GloVe: Global Vectors for Word Representation (Tim)
  • [tbd]
    Improving Word Representations via Global Context and Multiple Word Prototypes (Martin)
  • [tbd]
    Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank (Henning)

Literature

Video Lectures:

  • Richard Socher, CS224d: Deep Learning for Natural Language Processing. Stanford University, 2015.
  • Geoffrey Hinton, Neural Networks for Machine Learning. Coursera, 2012.


Textbooks and Chapters:

  • Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning. Book in preparation for MIT Press, 2015.
  • Yoshua Bengio, Learning Deep Architectures for AI. Foundations and Trends in Machine Learning, Vol. 2, No. 1 (2009), pp. 1–127.
  • Hal Daumé III, A Course in Machine Learning (preprint), 2015.


Surveys and Collections: 

  • Jürgen Schmidhuber, Deep Learning in Neural Networks: An Overview. Neural Networks, Vol. 61, January 2015, pp. 85–117.
  • Jürgen Schmidhuber, Literature Recommendations for New Researchers. Reddit.com, 2015.
  • Yoshua Bengio et al., Deep Learning Tutorials. Deeplearning.net, 2015.
  • Wojciech Samek, Hot Topics in Machine Learning: Deep Learning (Reading Material). TU Berlin, 2015.
  • Christopher D. Manning, Last Words: Computational Linguistics and Deep Learning. Computational Linguistics, MIT Press, 2015.
  • Myungsub Choi, Jiwon Kim, Awesome Recurrent Neural Networks: A curated list of resources dedicated to recurrent neural networks, github.com.


Articles and Applications:

  • Richard Socher et al., Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank. EMNLP'13.
  • Giuseppe Attardi, DeepNL: a Deep Learning NLP pipeline. NAACL'15.
  • Andrej Karpathy, The Unreasonable Effectiveness of Recurrent Neural Networks. 2015.


Software: 

  • Theano and pylearn2 (Python)
  • Torch7 (Lua)
  • Caffe (C++)
  • TensorFlow (Python/C++)
  • Keras (Python)
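
To give a concrete impression of how these toolkits are used, here is a minimal sketch of a small feed-forward network in Keras (written against the tensorflow.keras API; the standalone Keras with a Theano backend, as available in 2015, looks very similar). The XOR toy data, layer sizes, and training settings are illustrative choices only, not part of the study group material.

    # Minimal sketch: a two-layer feed-forward network in Keras (tf.keras).
    # Toy data and hyperparameters are illustrative only.
    import numpy as np
    import tensorflow as tf

    # Toy dataset: XOR of two binary inputs.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype="float32")
    y = np.array([[0], [1], [1], [0]], dtype="float32")

    # One hidden layer with tanh units, sigmoid output for binary classification.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(2,)),
        tf.keras.layers.Dense(8, activation="tanh"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(0.05),
                  loss="binary_crossentropy",
                  metrics=["accuracy"])

    # Fit on the four examples and print the learned predictions.
    model.fit(X, y, epochs=500, verbose=0)
    print(model.predict(X, verbose=0).round(3))

The other toolkits in the list express the same model with varying amounts of explicit graph construction, most in Theano and least in Keras and Torch7's nn module.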