US Team, Gothenburg, Sweden
Microsoft New England
Azure Machine Learning Team
Research Semester at CERN
Monte Carlo simulation of Higgs radiation.
My advisor is Luke Chang.
Oceanography REU at Oregon State
Nearshore physics and video feature extraction
Internship at BigML
Machine Learning for image classification.
Major in Physics.
Bio
I am an undergraduate physics student at Dartmouth College. My research experience covers particle physics, deep learning, and the neuroscience of human interactions. I am particularly excited about recurrent neural networks, deep generative models, and the scientific applications of deep learning.
I am one of the captains of the Dartmouth Endurance Racing Team and Vice President of the Dartmouth Physics Society. In my free time, I like climbing, fishing, and being outdoors. Before Dartmouth, I raised pigs in the countryside around Corvallis, Oregon.
Learning the Enigma with Recurrent Neural Networks
Recurrent Neural Networks (RNNs) are Turing-complete: in principle, they can simulate arbitrary programs. As a tip of the hat to Alan Turing, I formulate the Enigma's decryption function as a sequence-to-sequence translation task and learn it with a large RNN.
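A minimal sketch of the framing, with a toy polyalphabetic cipher standing in for the Enigma (the delimiter, lengths, and cipher below are illustrative choices, not the actual setup):

```python
# Toy seq2seq framing: each example maps (key + ciphertext) -> plaintext.
import random
import string

ALPHABET = string.ascii_uppercase

def toy_encrypt(plaintext, key):
    # Vigenere-style shift cipher as a stand-in for the Enigma rotors
    out = []
    for i, c in enumerate(plaintext):
        shift = ALPHABET.index(key[i % len(key)])
        out.append(ALPHABET[(ALPHABET.index(c) + shift) % 26])
    return "".join(out)

def make_example(msg_len=14, key_len=6):
    plaintext = "".join(random.choice(ALPHABET) for _ in range(msg_len))
    key = "".join(random.choice(ALPHABET) for _ in range(key_len))
    # input sequence: key, delimiter, ciphertext; target: plaintext
    return key + "-" + toy_encrypt(plaintext, key), plaintext

src, tgt = make_example()
print(src, "->", tgt)
```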
Approximating Matrix Product States with Neural Networks
The wave function is essential to most calculations in quantum mechanics and yet it’s a difficult beast to tame. In this project, I trained a neural network to approximate the ground state wave function of a many-body quantum system. This was my senior honors thesis.
Sam Greydanus, James Whitfield
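For flavor, here is a toy neural-network wavefunction ansatz for a spin chain in the spirit of this project; the restricted-Boltzmann-machine form and all sizes below are illustrative, not the thesis architecture:

```python
# A neural ansatz assigns an (unnormalized) amplitude to each spin
# configuration; training would tune the parameters to lower the energy.
import numpy as np

rng = np.random.default_rng(0)
n_spins, n_hidden = 8, 16
a = rng.normal(scale=0.01, size=n_spins)              # visible biases
b = rng.normal(scale=0.01, size=n_hidden)             # hidden biases
W = rng.normal(scale=0.01, size=(n_hidden, n_spins))  # couplings

def psi(s):
    """Unnormalized amplitude for a spin configuration s in {-1,+1}^n."""
    theta = b + W @ s
    return np.exp(a @ s) * np.prod(2 * np.cosh(theta))

s = rng.choice([-1.0, 1.0], size=n_spins)
print("amplitude:", psi(s))
```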
Machine Learning for fMRI data
Traditionally, neuroscientists have used a simple (but computationally demanding) technique called searchlight analysis to understand fMRI data. Newer machine learning approaches have also shown strong results. The goal of this project was to compare the performance of the two approaches.
Sam Greydanus, Luke Chang
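A sketch of the searchlight idea on synthetic data (real fMRI volumes, brain masking, and labels are considerably more involved):

```python
# Slide a small sphere over the volume and cross-validate a classifier
# on just the voxels inside it; the score map shows where signal lives.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
vol = rng.normal(size=(40, 6, 6, 6))  # (trials, x, y, z) -- synthetic
y = rng.integers(0, 2, size=40)       # binary condition labels
radius = 1

scores = np.zeros(vol.shape[1:])
xs, ys, zs = np.indices(vol.shape[1:])
for cx, cy, cz in zip(xs.ravel(), ys.ravel(), zs.ravel()):
    mask = (xs - cx) ** 2 + (ys - cy) ** 2 + (zs - cz) ** 2 <= radius ** 2
    X = vol[:, mask]                  # trials x voxels-in-sphere
    scores[cx, cy, cz] = cross_val_score(LinearSVC(), X, y, cv=4).mean()
print("peak accuracy:", scores.max())
```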
Higgs Electroweak Calibration
My CERN research project. Particle physicists use a theory called the Standard Model to predict physics in the Large Hadron Collider. I used two Monte Carlo physics simulators to see what happens to numerical models of Higgs physics when we leave out the electroweak force.
Sam Greydanus, Andre Mendes
Image Classification project at BigML
BigML is a machine learning startup that aims to make machine learning accessible to people without specialized backgrounds. As an intern, I tested the feasibility of using their online interface to detect cat heads in the Microsoft cat head database.
Sam Greydanus, Poul Petersen
Nearshore physics and automated ship wake detection
Internal waves are waves that travel along the interface between layers of water with slightly different densities. In this project, I wrote an algorithm that automatically recognized and tracked internal waves in time-lapse footage from the Columbia River Estuary.
Sam Greydanus, Robert Holman
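An illustrative sketch, not the original algorithm: track a bright propagating front in a synthetic time-stack image and estimate its speed:

```python
# Rows are time steps, columns are cross-shore position; the wave shows
# up as a bright diagonal streak whose slope is its propagation speed.
import numpy as np

t, x = np.meshgrid(np.arange(200), np.arange(300), indexing="ij")
front = 40 + 0.8 * t                         # synthetic front position
noise = 0.1 * np.random.default_rng(0).random((200, 300))
stack = np.exp(-((x - front) ** 2) / 50) + noise

crest = stack.argmax(axis=1)                 # brightest pixel per time step
speed = np.polyfit(np.arange(200), crest, 1)[0]
print(f"estimated propagation speed: {speed:.2f} px/frame")
```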
AGU 2014 (Oral Presentation)
The physics of a tree blowing in the wind is very difficult to model because the system is so nonlinear. I made a numerical model of the system by expressing it as a fractal.
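The fractal idea in miniature: generate the tree as recursive, self-similar branches (a dynamical model would then couple wind forcing to each segment; the angles and scale factor below are arbitrary):

```python
# Each branch spawns two smaller copies of itself, rotated and shrunk.
import numpy as np

def branch(x, y, angle, length, depth, segments):
    if depth == 0:
        return
    x2, y2 = x + length * np.cos(angle), y + length * np.sin(angle)
    segments.append(((x, y), (x2, y2)))
    for da in (-0.4, 0.4):                   # two self-similar children
        branch(x2, y2, angle + da, 0.7 * length, depth - 1, segments)

segments = []
branch(0.0, 0.0, np.pi / 2, 1.0, depth=8, segments=segments)
print(len(segments), "branch segments")
```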
It's amazing that deep RL agents can master complex environments using just pixels and a few rewards. While learning about these agents, I built a high-performance Atari A3C agent in just 180 lines of PyTorch.
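The core actor-critic loss, sketched in PyTorch on random tensors; the full agent adds asynchronous workers, an Atari conv net, and generalized advantage estimation:

```python
# One network, two heads: a policy (actor) and a value estimate (critic).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ActorCritic(nn.Module):
    def __init__(self, obs_dim=64, n_actions=4):
        super().__init__()
        self.body = nn.Linear(obs_dim, 128)
        self.pi = nn.Linear(128, n_actions)  # policy head
        self.v = nn.Linear(128, 1)           # value head

    def forward(self, x):
        h = torch.relu(self.body(x))
        return F.log_softmax(self.pi(h), dim=-1), self.v(h)

model = ActorCritic()
obs = torch.randn(5, 64)                 # a 5-step rollout (stand-in)
logp, values = model(obs)
actions = torch.multinomial(logp.exp(), 1).squeeze(1)
returns = torch.randn(5)                 # discounted returns (stand-in)
advantage = returns - values.squeeze(1)
policy_loss = -(logp[torch.arange(5), actions] * advantage.detach()).mean()
value_loss = advantage.pow(2).mean()
(policy_loss + 0.5 * value_loss).backward()
```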
I trained a deep Convolutional Neural Network in Keras, then rewrote it in numpy to run as a web demo.
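The crux of a numpy rewrite is a hand-rolled forward pass; here is a plain conv2d layer with random weights (the real demo loads the trained Keras weights):

```python
# Valid-padding, stride-1 convolution followed by a ReLU, in pure numpy.
import numpy as np

def conv2d(x, w, b):
    """x: (H, W, Cin); w: (kh, kw, Cin, Cout); b: (Cout,)."""
    kh, kw, _, cout = w.shape
    H, W, _ = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1, cout))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = x[i:i+kh, j:j+kw, :]
            out[i, j] = np.tensordot(patch, w, axes=3) + b
    return np.maximum(out, 0)  # ReLU

x = np.random.rand(28, 28, 1)
w = np.random.rand(3, 3, 1, 8) - 0.5
print(conv2d(x, w, np.zeros(8)).shape)   # (26, 26, 8)
```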
When I was first teaching myself to program, I wrote a few simple apps for Android. My best one is a spinoff of the popular iOS game Tilt to Live.
I used a policy gradient method written in TensorFlow to beat the Atari Pong AI.
I also solved the CartPole control problem using policy gradients.
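The heart of both projects is the same REINFORCE computation: discount the rewards, then weight log-probability gradients by the returns. A pure-numpy sketch on a fake rollout (no emulator wired in):

```python
import numpy as np

def discount(rewards, gamma=0.99):
    """Turn per-step rewards into discounted returns."""
    out = np.zeros_like(rewards)
    running = 0.0
    for t in reversed(range(len(rewards))):
        running = rewards[t] + gamma * running
        out[t] = running
    return out

rng = np.random.default_rng(0)
T, obs_dim, n_actions = 20, 4, 2
W = rng.normal(scale=0.1, size=(obs_dim, n_actions))  # linear policy
obs = rng.normal(size=(T, obs_dim))                   # fake rollout
logits = obs @ W
probs = np.exp(logits) / np.exp(logits).sum(1, keepdims=True)
actions = np.array([rng.choice(n_actions, p=p) for p in probs])
rewards = rng.normal(size=T)

returns = discount(rewards)
returns = (returns - returns.mean()) / (returns.std() + 1e-8)  # normalize
# grad of log-softmax is (one_hot(action) - probs), weighted by returns
one_hot = np.eye(n_actions)[actions]
grad_W = obs.T @ ((one_hot - probs) * returns[:, None]) / T
W += 1e-2 * grad_W                                    # gradient ascent
```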
While working for the Digital Arts Lab at Dartmouth, I wrote an iOS textbook-exchange app. We launched it on the App Store but the team disbanded soon afterward. It was a cool little project!
When I discovered stereograms, I got so excited that I wrote code to make my own.
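The algorithm fits in a few lines: repeat a random pattern across each row, shifting it by an amount proportional to the depth map. A grayscale sketch (display code omitted; parameters are illustrative):

```python
import numpy as np

def autostereogram(depth, pattern_width=60, max_shift=12):
    """depth: 2D array in [0, 1]; returns a grayscale stereogram."""
    h, w = depth.shape
    img = np.random.default_rng(0).random((h, w))
    for y in range(h):
        for x in range(pattern_width, w):
            # repeat the pattern, pulled closer where the scene is nearer
            shift = int(depth[y, x] * max_shift)
            img[y, x] = img[y, x - pattern_width + shift]
    return img

# toy depth map: a raised square floating above the background
depth = np.zeros((120, 200))
depth[40:80, 80:140] = 1.0
print(autostereogram(depth).shape)
```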
While learning about recurrent neural networks, I trained a deep character-level model to write in the style of one of my favorite authors, Jack London.
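The sampling loop at the heart of a character-level model, sketched with a small untrained GRU in PyTorch (the vocabulary and sizes are illustrative):

```python
# Feed each sampled character back in as the next input.
import torch
import torch.nn as nn

vocab = list("abcdefghijklmnopqrstuvwxyz ")
emb = nn.Embedding(len(vocab), 32)
rnn = nn.GRU(32, 64, batch_first=True)
head = nn.Linear(64, len(vocab))

h = None
idx = torch.tensor([[vocab.index("t")]])
chars = ["t"]
for _ in range(40):
    out, h = rnn(emb(idx), h)
    probs = torch.softmax(head(out[:, -1]), dim=-1)
    idx = torch.multinomial(probs, 1)
    chars.append(vocab[idx.item()])
print("".join(chars))   # gibberish until the model is trained
```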
I wrote a math + code introduction to neural networks and backpropagation. It uses pure numpy and a Jupyter notebook.
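The flavor of the tutorial in a dozen lines: a two-layer network trained on XOR with hand-written backprop in pure numpy:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)
W1, W2 = rng.normal(size=(2, 8)), rng.normal(size=(8, 1))

for step in range(5000):
    h = np.tanh(X @ W1)                   # forward pass
    out = 1 / (1 + np.exp(-(h @ W2)))
    d_out = out - y                       # sigmoid cross-entropy gradient
    d_h = (d_out @ W2.T) * (1 - h ** 2)   # backprop through tanh
    W2 -= 0.1 * h.T @ d_out
    W1 -= 0.1 * X.T @ d_h
print(out.round(2).ravel())               # approaches [0, 1, 1, 0]
```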
An article I wrote about depth perception (page 27) for the Dartmouth Undergraduate Journal of Science.
A class project about numerically modeling quantum systems in MATLAB.
A gist for dynamic plotting with matplotlib, useful for updating loss curves within a training loop.
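The idea in brief: redraw the loss curve inside the training loop. (In a Jupyter notebook, IPython.display's clear_output achieves a similar effect.)

```python
import matplotlib.pyplot as plt
import numpy as np

plt.ion()                          # interactive mode: draw without blocking
fig, ax = plt.subplots()
losses = []
for step in range(100):
    losses.append(np.exp(-step / 30) + 0.05 * np.random.rand())  # fake loss
    ax.clear()
    ax.plot(losses)
    ax.set_xlabel("step"); ax.set_ylabel("loss")
    plt.pause(0.01)                # let the GUI event loop redraw
plt.ioff(); plt.show()
```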
A Jupyter notebook about Mixture Density Networks implemented in Google's TensorFlow library.
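The heart of a Mixture Density Network is its loss: the network emits mixture weights, means, and scales, and we minimize the negative log-likelihood of the target under that Gaussian mixture. A numpy sketch on random inputs (the notebook builds this in TensorFlow):

```python
import numpy as np

def mdn_nll(pi_logits, mu, log_sigma, y):
    """pi_logits, mu, log_sigma: (batch, K); y: (batch, 1)."""
    log_pi = pi_logits - np.log(np.exp(pi_logits).sum(1, keepdims=True))
    sigma = np.exp(log_sigma)
    log_norm = (-0.5 * ((y - mu) / sigma) ** 2
                - log_sigma - 0.5 * np.log(2 * np.pi))
    joint = log_pi + log_norm
    m = joint.max(1, keepdims=True)        # log-sum-exp over components
    return -(m + np.log(np.exp(joint - m).sum(1, keepdims=True))).mean()

rng = np.random.default_rng(0)
print(mdn_nll(rng.normal(size=(5, 3)), rng.normal(size=(5, 3)),
              rng.normal(size=(5, 3)), rng.normal(size=(5, 1))))
```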
Some other Android apps I wrote a few years ago are on Google Play.
I wrote a Generative Adversarial Network repo for MNIST in PyTorch.
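A minimal GAN training step in PyTorch, with random tensors standing in for MNIST batches (the repo wires in real data, conv layers, and logging):

```python
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(64, 256), nn.ReLU(), nn.Linear(256, 784), nn.Tanh())
D = nn.Sequential(nn.Linear(784, 256), nn.LeakyReLU(0.2), nn.Linear(256, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

real = torch.rand(32, 784) * 2 - 1      # stand-in for an MNIST batch
z = torch.randn(32, 64)

# discriminator step: push real -> 1, fake -> 0
fake = G(z).detach()
d_loss = bce(D(real), torch.ones(32, 1)) + bce(D(fake), torch.zeros(32, 1))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# generator step: fool the discriminator into predicting 1
g_loss = bce(D(G(z)), torch.ones(32, 1))
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
print(float(d_loss), float(g_loss))
```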