rage against the machine learning
personal website of ryan r. curtin

I am a standard-issue carbon-based meat popsicle produced in the late 1980s in the general vicinity of Washington, D.C., and then introduced to the general patterns of human activity in an artificially designed suburb called Columbia. After developing basic motor skills and an adeptness at video games, I relocated to Portland, Oregon, and attempted to learn social skills with a very low level of success. As a result, I enrolled in undergraduate studies at Georgia Tech in the fall of 2005 and did not manage to escape until the fall of 2015, having collected three degrees.

My time at Tech was varied and unplanned, but it did (mostly) follow the laws of causality. As an undergraduate I was an electrical engineering student with an interest in digital signal processing. For some years I worked as a sysadmin in the School of Math, a primarily Linux-based department. At some points I was a TA, usually not by choice. As time progressed I found myself more interested in the algorithmic aspects of signal processing, which led me to machine learning.

After a year's incarceration at GTRI ELSYS, I worked for Dr. David Anderson's research group, where my interest in machine learning deepened; I then found myself happily a part of the FASTLab, led by Dr. Alex Gray. But the lab dissolved, and since that time I have found temporary homes with the chicken people at GTRI, Dr. Charles Isbell's pfunk research group, and Dr. Rich Vuduc's HPC Garage.

Dismayed by the prospect of spending an entire decade of my life at Tech, I chose to graduate in August 2015, limiting my tenure to precisely nine years and 362 days. After two weeks of grueling temporary semi-retirement, I was granted refuge at Symantec's new Center for Advanced Machine Learning (CAML), where I continue my research on fast algorithms.

A large part of my research is developing the fast C++ machine learning library mlpack, which contains a growing collection of high-quality implementations of cutting-edge machine learning algorithms and is in use by research groups and industry worldwide. My particular focus is a niche class of algorithms called dual-tree algorithms. These algorithms are widely applicable to numerous machine learning problems: nearest neighbor search, kernel density estimation, minimum spanning tree calculation, maximum inner product search (or maximum kernel search), k-means, and even approximate matrix multiplication. In short, I'm interested in making slow algorithms fast. And fast algorithms faster.
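
To give a flavor of what that looks like in practice, here is a minimal sketch of dual-tree nearest neighbor search with mlpack. The class and header names reflect the mlpack 2.x-era API and may differ in other versions, so treat it as illustrative rather than gospel.

    #include <mlpack/core.hpp>
    #include <mlpack/methods/neighbor_search/neighbor_search.hpp>

    using namespace mlpack::neighbor;

    int main()
    {
      // mlpack stores data column-major: each column is a point.
      // Here, 1000 random points in 3 dimensions.
      arma::mat data(3, 1000, arma::fill::randu);

      // Build a kd-tree on the dataset (the default tree type) and set up
      // a nearest neighbor search object; dual-tree search is the default mode.
      NeighborSearch<NearestNeighborSort> nn(data);

      // Find the 5 nearest neighbors of every point in the dataset.
      arma::Mat<size_t> neighbors;
      arma::mat distances;
      nn.Search(5, neighbors, distances);

      // neighbors(j, i) is the index of the (j + 1)'th nearest neighbor of
      // point i, and distances(j, i) is the corresponding distance.
      return 0;
    }

Compile it against mlpack and Armadillo (e.g. g++ example.cpp -o example -lmlpack -larmadillo); the single Search() call is where the dual-tree traversal does its work, which is what keeps this fast even on large datasets.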

But that only paints a picture of what I do to pay my bills. Although I enjoy that very much, it is only a sampling of the activities I pursue. At other times I may be found driving (or fixing) a 1930 Model A around the country and documenting historically relevant bridges, operating a blast furnace, winning kart races, interacting with my cats, turning cardboard into fashionable clothing, over-engineering seating solutions, or spending time with my wife, for whom I learned metallurgy. Also, I have yet to meet someone who can beat me at Double Dash.
