• Taming wave functions with neural networks
    The wave function is essential to most calculations in quantum mechanics, and yet it's a difficult beast to tame. Can neural networks help?
  • Differentiable memory and the brain
    DeepMind's Differentiable Neural Computer (DNC) represents the state of the art in differentiable memory models. I introduce an analogy between the DNC and human memory, then discuss where it breaks down.
  • Learning the Enigma with Recurrent Neural Networks
Recurrent Neural Networks (RNNs) are Turing-complete: in principle, they can simulate any algorithm a computer can run. As a tip of the hat to Alan Turing, let's see if we can use them to learn the Nazi Enigma.
  • A bird's eye view of synthetic gradients
    Synthetic gradients achieve the perfect balance of crazy and brilliant. In a 100-line Gist I'll introduce this exotic technique and use it to train a neural network.
  • The art of regularization
Regularization seems fairly insignificant at first glance, but it has a huge impact on deep models. I'll use a one-layer neural network trained on the MNIST dataset to give an intuition for how common regularization techniques affect learning (a minimal sketch follows this list).
  • Scribe: realistic handwriting with TensorFlow
In this post, I will demonstrate the power of deep learning by using it to generate human-like handwriting. This work is based on Generating Sequences With Recurrent Neural Networks by Alex Graves.
  • What is deep learning?
    After being obsessed with this field for more than a year, I should have a concise and satisfying answer. Strangely, I have three.
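
Since the regularization post above describes a concrete setup (a one-layer network on MNIST), here is a minimal sketch of that kind of experiment. It is not the post's actual code: random arrays stand in for the MNIST data, and only one common technique, an L2 penalty (weight decay), is shown.

```python
import numpy as np

# Minimal sketch: one-layer softmax classifier with an L2 penalty.
# Random data stands in for MNIST here; swap in real images
# (784-dim inputs, 10 classes) to reproduce the post's setup.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 784))     # stand-in for flattened MNIST images
y = rng.integers(0, 10, size=256)   # stand-in for digit labels

W = 0.01 * rng.normal(size=(784, 10))
b = np.zeros(10)
lr, lam = 0.5, 1e-3                 # learning rate and L2 strength

for step in range(100):
    logits = X @ W + b
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

    # Cross-entropy loss plus the L2 term (lam / 2) * ||W||^2
    loss = -np.log(probs[np.arange(len(y)), y]).mean() + 0.5 * lam * (W ** 2).sum()
    if step % 20 == 0:
        print(f"step {step:3d}  loss {loss:.3f}")

    # Gradient of the softmax cross-entropy w.r.t. the logits
    dlogits = probs.copy()
    dlogits[np.arange(len(y)), y] -= 1
    dlogits /= len(y)

    # The L2 penalty adds lam * W to the weight gradient ("weight decay")
    W -= lr * (X.T @ dlogits + lam * W)
    b -= lr * dlogits.sum(axis=0)
```

Sweeping `lam` over a few orders of magnitude and comparing train versus test accuracy on real data is the quickest way to see the effect the post explores.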