July 2017

Senior Honors Thesis

Quantum mechanics with James Whitfield.

Fall 2015

Semester at CERN

Simulating Higgs radiation.

2013

Dartmouth College

Major in physics.

Outside of research, I spend time with family and friends, run endurance races, and make things out of wood. Ask me about raising pigs.

Lagrangian Neural Networks

Miles Cranmer, Sam Greydanus, Stephan Hoyer, Peter Battaglia, David Spergel, Shirley Ho

Neural Reparameterization Improves Structural Optimization

Stephan Hoyer, Jascha Sohl-Dickstein, Sam Greydanus

Hamiltonian Neural Networks

Sam Greydanus, Misko Dzamba, Jason Yosinski

Metalearning Biologically Plausible Semi-Supervised Update Rules

Keren Gu, Sam Greydanus, Luke Metz, Niru Maheswaranathan, Jascha Sohl-Dickstein

The Paths Perspective on Value Learning

Sam Greydanus, Chris Olah

Learning Finite State Representations of Recurrent Policy Networks

Anurag Koul, Sam Greydanus, Alan Fern

Visualizing and Understanding Atari Agents

Sam Greydanus, Anurag Koul, Jonathan Dodge, Alan Fern

Learning the Enigma with Recurrent Neural Networks

Sam Greydanus

Approximating Matrix Product States with Neural Networks

Sam Greydanus, James Whitfield

Machine Learning for fMRI Data

Traditionally, neuroscientists have used a simple (but computationally demanding) technique called searchlight analysis to understand fMRI data. Newer machine learning approaches have also shown great success. The goal of this project was to compare the performance of the two approaches.

Sam Greydanus, Luke Chang

Higgs Electroweak Calibration

My CERN research project. Particle physicists use a theory called the Standard Model to predict physics in the Large Hadron Collider. I used two Monte Carlo physics simulators to see what happens to numerical models of Higgs physics when we leave out the Electroweak force.

Sam Greydanus, Andre Mendes

Image Classification at BigML

BigML is a machine learning startup which aims to make machine learning accessible to people without specialized backgrounds. As an intern, I tested the feasibility of using their online interface to detect cat heads from the Microsoft cat head database.

Sam Greydanus, Poul Peterson

Nearshore Physics and Automated Ship Wake Detection

Salt water and fresh water have different densities and form layers. Special waves, called internal waves, exist between these layers. In this project, I tracked internal waves in time-lapse footage from the Columbia River Estuary.

Sam Greydanus, Robert Holman

AGU 2014 (Oral Presentation)

Fractal Tree

The physics of a tree blowing in the wind is very difficult to model because the system is so nonlinear. I made a numerical model of the system by expressing it as a fractal.

Baby A3C

It's amazing that deep RL agents can master complex environments using just pixels and a few rewards. While learning about these agents, I built a high-performance Atari A3C agent in just 180 lines of PyTorch.

Friendly qLearning

What happens when Q-learning agents interact with one another? The goal of this toy JavaScript model is to create emergent social behavior. I am especially interested in discovering "social neurons" in the model's network.
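The update each agent runs is ordinary tabular Q-learning. Here is a generic illustration in Python rather than the project's JavaScript, on a toy corridor environment of my own choosing (the environment and hyperparameters are illustrative, not the project's):

```python
import numpy as np

def q_learning(n_states=5, n_episodes=500, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning on a 1-D corridor: move left/right, reward 1 at the right end."""
    rng = np.random.default_rng(seed)
    Q = np.zeros((n_states, 2))  # actions: 0 = left, 1 = right
    for _ in range(n_episodes):
        s = 0
        while s < n_states - 1:
            # epsilon-greedy action selection
            a = int(rng.integers(2)) if rng.random() < eps else int(np.argmax(Q[s]))
            s2 = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
            r = 1.0 if s2 == n_states - 1 else 0.0
            # the core update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
            Q[s, a] += alpha * (r + gamma * np.max(Q[s2]) - Q[s, a])
            s = s2
    return Q
```

After training, the greedy policy (argmax over each row of `Q`) walks straight to the rewarding end of the corridor.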

Pythonic OCR

I trained a deep Convolutional Neural Network in Keras, then rewrote it in numpy to run as a web demo.

Full Tilt

When I was first teaching myself to program, I wrote a few simple apps for Android. My best one is a spinoff of the popular iOS game Tilt to Live.

Pong

I used a policy gradient method written in TensorFlow to beat the Atari Pong AI.

Cartpole

I also solved the Cartpole control problem using policy gradients.
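Both the Pong and Cartpole projects rest on the same score-function trick: nudge the policy's parameters in the direction that makes rewarded actions more likely. A minimal REINFORCE sketch on a two-armed bandit shows the mechanic (this is a toy of my own, not the projects' code; the payouts and learning rate are arbitrary):

```python
import numpy as np

def reinforce_bandit(steps=2000, lr=0.1, seed=0):
    """REINFORCE on a two-armed bandit: arm 1 pays 1.0, arm 0 pays 0.2."""
    rng = np.random.default_rng(seed)
    theta = np.zeros(2)                    # logits over the two arms
    payouts = np.array([0.2, 1.0])
    for _ in range(steps):
        p = np.exp(theta - theta.max())
        p /= p.sum()                       # softmax policy
        a = int(rng.choice(2, p=p))
        r = payouts[a]
        grad = -p
        grad[a] += 1.0                     # grad of log pi(a) w.r.t. the logits
        theta += lr * r * grad             # ascend the expected-reward gradient
    return theta
```

The policy drifts toward the better-paying arm; in the real environments the scalar reward is replaced by the (discounted) return of each episode.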

Dartbook

While working for the Digital Arts Lab at Dartmouth, I wrote an iOS textbook exchange app. We launched it on the App Store but disbanded soon afterwards. It is a cool little project!

Stereograms

When I discovered stereograms, I got so excited that I wrote code to make my own.
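The core trick behind a single-image random-dot stereogram fits in a few lines of numpy: repeat a random strip, shifting each pixel by an amount proportional to the desired depth. This is a generic illustration rather than the project's code; `pattern_width` and the depth scaling are arbitrary choices:

```python
import numpy as np

def autostereogram(depth, pattern_width=16, seed=0):
    """Random-dot autostereogram: copy pixels from one pattern-width back,
    shifted by an amount proportional to the depth map (values in [0, 1])."""
    rng = np.random.default_rng(seed)
    h, w = depth.shape
    img = rng.random((h, w))                       # random dots seed the pattern
    for y in range(h):
        for x in range(pattern_width, w):
            shift = int(depth[y, x] * (pattern_width // 2))
            img[y, x] = img[y, x - pattern_width + shift]
    return img
```

With a flat (zero) depth map the output is simply periodic; wherever the depth map is nonzero, the local period shrinks, which the eyes fuse into apparent height.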

MrLondon

While learning about recurrent neural networks I trained a deep character-level model to write in the style of one of my favorite authors, Jack London.

I wrote a math + code introduction to neural networks and backpropagation. It uses pure numpy and a Jupyter notebook.

An article I wrote about depth perception (page 27) for the Dartmouth Undergraduate Journal of Science.

A class project about numerically modeling quantum systems in MATLAB.

A gist for dynamic plotting with matplotlib. Useful for live-updating loss curves within a training loop.

A quick derivation of advantage actor-critic policy gradient methods, written for an AI seminar I gave at Oregon State.
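For reference, the headline result of that derivation in its standard textbook form (this is the generic statement, not a transcription of the seminar notes):

```latex
\nabla_\theta J(\theta)
  = \mathbb{E}_{\pi_\theta}\!\left[\, \nabla_\theta \log \pi_\theta(a_t \mid s_t)\; A(s_t, a_t) \,\right],
\qquad
A(s_t, a_t) = Q(s_t, a_t) - V(s_t).
```

Subtracting the critic's baseline $V(s_t)$ leaves the gradient unbiased, since $\mathbb{E}_{\pi_\theta}\!\left[\nabla_\theta \log \pi_\theta(a_t \mid s_t)\right] = 0$, while substantially reducing its variance.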

I wrote a numpy model of synthetic gradients for a neural network MNIST classifier. Inspired by this DeepMind paper.

A Jupyter notebook about Mixture Density Networks, implemented in Google's TensorFlow library.

Some other Android apps I wrote a few years ago are on Google Play.

I wrote a Generative Adversarial Network repo for MNIST in PyTorch.

I did a Sudoku on a plane and then decided to solve the general case.
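The general case yields to plain depth-first backtracking. A minimal sketch of that approach (my own illustration, not necessarily the repo's exact implementation or heuristics):

```python
def valid(board, i, j, d):
    """Check that digit d can go at (i, j) without repeating in its row, column, or 3x3 box."""
    if any(board[i][c] == d for c in range(9)):
        return False
    if any(board[r][j] == d for r in range(9)):
        return False
    bi, bj = 3 * (i // 3), 3 * (j // 3)
    return all(board[bi + r][bj + c] != d for r in range(3) for c in range(3))

def solve(board):
    """Solve a 9x9 Sudoku in place by backtracking; 0 marks an empty cell."""
    for i in range(9):
        for j in range(9):
            if board[i][j] == 0:
                for d in range(1, 10):
                    if valid(board, i, j, d):
                        board[i][j] = d
                        if solve(board):
                            return True
                        board[i][j] = 0  # undo and try the next digit
                return False             # no digit fits here: backtrack
    return True                          # no empty cells left: solved
```

Feeding it an all-zeros grid makes it generate a complete valid Sudoku from scratch, which is a fun way to convince yourself the constraint checks are right.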