Information Theory, Pattern Recognition, and Neural Networks

David MacKay, University of Cambridge

A series of sixteen lectures covering the core of the book "Information Theory, Inference, and Learning Algorithms" (Cambridge University Press, 2003), which can be bought at Amazon and is available free online. A subset of these lectures used to constitute a Part III Physics course at the University of Cambridge.

Introduction to information theory
* The possibility of reliable communication over unreliable channels. The (7,4) Hamming code and repetition codes.
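
A minimal Python sketch of the (7,4) code, assuming MacKay's parity conventions (the variable names and test values are mine, not from the lectures): four source bits are protected by three parity bits, and syndrome decoding corrects any single flipped bit.

```python
import numpy as np

# Parity-check matrix H for the (7,4) Hamming code, using the convention
# t5 = s1+s2+s3, t6 = s2+s3+s4, t7 = s1+s3+s4 (mod 2).
H = np.array([[1, 1, 1, 0, 1, 0, 0],
              [0, 1, 1, 1, 0, 1, 0],
              [1, 0, 1, 1, 0, 0, 1]])

def encode(s):
    """Encode 4 source bits into a 7-bit codeword (4 data bits + 3 parity bits)."""
    s = np.asarray(s)
    parity = np.array([s[0] ^ s[1] ^ s[2],
                       s[1] ^ s[2] ^ s[3],
                       s[0] ^ s[2] ^ s[3]])
    return np.concatenate([s, parity])

def decode(r):
    """Syndrome decoding: correct at most one flipped bit, then return the data bits."""
    z = H @ r % 2
    if z.any():
        # A single-bit error produces a syndrome equal to that bit's column of H.
        flipped = np.where((H.T == z).all(axis=1))[0][0]
        r = r.copy()
        r[flipped] ^= 1
    return r[:4]

s = np.array([1, 0, 1, 1])
r = encode(s)
r[2] ^= 1                       # the noisy channel flips one bit
assert (decode(r) == s).all()   # the flip is detected and corrected
```

For a repetition code such as R3, the decoder is simply a majority vote over the three received copies of each source bit.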

Entropy and data compression
* Entropy, conditional entropy, mutual information, Shannon information content. The idea of typicality and the use of typical sets for source coding. Shannon's source coding theorem. Codes for data compression. Uniquely decodeable codes and the Kraft-McMillan inequality. Completeness of a symbol code. Prefix codes. Huffman codes. Arithmetic coding.
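
A short sketch of two of these items, assuming a known symbol distribution (the four-symbol alphabet and its probabilities are invented for illustration): the Shannon entropy of the source and a Huffman code built with a binary heap. For this dyadic distribution the expected code length equals the entropy, 1.75 bits per symbol.

```python
import heapq
from math import log2

def entropy(p):
    """Shannon entropy H = -sum p_i log2 p_i, in bits per symbol."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def huffman(probs):
    """Build a binary Huffman code; probs maps symbol -> probability."""
    # Heap entries carry a unique counter so ties never compare the code dicts.
    heap = [(p, i, {s: ''}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)   # the two least probable subtrees...
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: '0' + w for s, w in c0.items()}
        merged.update({s: '1' + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))   # ...are merged
        count += 1
    return heap[0][2]

probs = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}
code = huffman(probs)
mean_length = sum(probs[s] * len(code[s]) for s in probs)
print(entropy(probs.values()), mean_length, code)   # 1.75, 1.75, a prefix code
```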

Communication over noisy channels
* Definition of channel capacity. Capacity of binary symmetric channel; of binary erasure channel; of Z channel. Joint typicality, random codes, and Shannon's noisy channel coding theorem. Real channels and practical error-correcting codes. Hash codes.
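
The three capacities named above reduce to short calculations; here is a sketch under the usual definitions: binary entropy H2, BSC capacity 1 - H2(f), BEC capacity 1 - f, and a one-dimensional grid search over the input distribution for the Z channel. The example noise levels are arbitrary choices for illustration.

```python
from math import log2

def H2(p):
    """Binary entropy function, in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def capacity_bsc(f):
    """Binary symmetric channel with flip probability f: C = 1 - H2(f)."""
    return 1 - H2(f)

def capacity_bec(f):
    """Binary erasure channel with erasure probability f: C = 1 - f."""
    return 1 - f

def capacity_z(f, grid=10000):
    """Z channel (a transmitted 1 is received as 0 with probability f): maximise
    I(X;Y) = H2(p1 (1 - f)) - p1 H2(f) over the input probability p1."""
    return max(H2((i / grid) * (1 - f)) - (i / grid) * H2(f) for i in range(1, grid))

print(capacity_bsc(0.1))    # ~0.531 bits per channel use
print(capacity_bec(0.1))    # 0.9 bits per channel use
print(capacity_z(0.15))     # ~0.685 bits per channel use
```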

Statistical inference, data modelling and pattern recognition
* The likelihood function and Bayes' theorem. Clustering as an example.
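
A minimal numerical illustration of the likelihood function and Bayes' theorem, assuming a made-up bent-coin experiment (the counts, grid, and uniform prior are illustrative choices, not values from the course): the posterior over the coin's bias p is proportional to likelihood times prior, normalised by the evidence.

```python
import numpy as np

# Posterior over the bias p of a bent coin after observing 7 heads and 3 tails,
# evaluated on a discrete grid:  P(p | data)  is proportional to  P(data | p) P(p).
p_grid = np.linspace(0.01, 0.99, 99)
prior = np.full_like(p_grid, 1.0 / len(p_grid))    # uniform prior over the grid
heads, tails = 7, 3                                # hypothetical data
likelihood = p_grid**heads * (1 - p_grid)**tails   # the likelihood function of p
posterior = likelihood * prior
posterior /= posterior.sum()                       # normalise by the evidence P(data)
print(p_grid[posterior.argmax()])                  # posterior mode, ~0.7
```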

Approximation of probability distributions
* Laplace's method. (Approximation of probability distributions by Gaussian distributions.)
* Monte Carlo methods: Importance sampling, rejection sampling, Gibbs sampling, Metropolis method. (Slice sampling, Hybrid Monte Carlo, Overrelaxation, exact sampling)
* Variational methods and mean field theory. Ising models.
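
To make the Monte Carlo bullet concrete, here is a sketch of the Metropolis method applied to the Ising model mentioned above (J = 1, no external field, periodic boundaries; the lattice size, inverse temperature, and number of sweeps are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)

def metropolis_ising(n=32, beta=0.5, sweeps=200):
    """Metropolis sampling of an n-by-n Ising model with periodic boundaries."""
    s = rng.choice([-1, 1], size=(n, n))
    for _ in range(sweeps * n * n):
        i, j = rng.integers(n, size=2)
        # Energy change from flipping spin (i, j): dE = 2 J s_ij * (sum of neighbours).
        nb = s[(i + 1) % n, j] + s[(i - 1) % n, j] + s[i, (j + 1) % n] + s[i, (j - 1) % n]
        dE = 2 * s[i, j] * nb
        # Accept the flip with probability min(1, exp(-beta * dE)).
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            s[i, j] *= -1
    return s

spins = metropolis_ising()
print(abs(spins.mean()))   # magnetisation per spin of the final sample
```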

Neural networks and content-addressable memories
* The Hopfield network.
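
A minimal sketch of a Hopfield network used as a content-addressable memory, with invented sizes (100 neurons, 3 random stored patterns): Hebbian weights are formed from the outer products of the patterns, and asynchronous sign updates then recover a corrupted pattern.

```python
import numpy as np

rng = np.random.default_rng(2)

def train_hopfield(patterns):
    """Hebbian learning: W = (1/N) * sum of outer products of +/-1 patterns, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, sweeps=10):
    """Asynchronous updates: each neuron takes the sign of its weighted input."""
    x = x.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(x)):
            x[i] = 1.0 if W[i] @ x >= 0 else -1.0
    return x

patterns = rng.choice([-1.0, 1.0], size=(3, 100))   # three random 100-bit memories
W = train_hopfield(patterns)
probe = patterns[0].copy()
probe[:10] *= -1                                    # corrupt 10 bits of the first memory
restored = recall(W, probe)
print((restored == patterns[0]).mean())             # fraction of bits recovered (typically 1.0)
```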

Dates:
  • Free schedule
Course properties:
  • Language: English

Included in selections:
Machine Learning
Machine learning: from the basics to advanced topics. Includes statistics...
More on this topic:
Monte Carlo Methods in Finance
Non-bankers can learn to understand the mathematical models that have made the...
Information Theory (6.441)
6.441 offers an introduction to the quantitative theory of information and its...
Atomistic Computer Modeling of Materials (SMA 5107)
This course uses the theory and application of atomistic computer simulations...
Statistical Mechanics: Algorithms and Computations
In this course you will learn a whole lot of modern physics (classical and quantum...
Signals, Systems and Information for Media Technology
This class teaches the fundamentals of signals and information theory with emphasis...
More from 'Mathematics, Statistics and Data Analysis':
Derivatives Markets: Advanced Modeling and Strategies
Financial derivatives are ubiquitous in global capital markets. Students will...
Doing Journalism with Data: First Steps, Skills and Tools
This free 5-module online introductory course gives you the essential concepts...
Big Data Mini Course: AMP Camp 4 hands-on exercises
The exercises we cover today will have you working directly with the Spark specific...
Digital Analytics Fundamentals
This three-week course provides a foundation for marketers and analysts seeking...
Introduction to Machine Learning (CSC2515, Fall 2008)
Introductory course in machine learning by world leading expert Geoffrey Hinton...
