Outside of my research I spend a lot of time on projects in robotics, app development, and machine learning. These are a couple of my projects from the last year or so, but there are more on my GitHub and Bitbucket.
Inspired by the work of Seb Madgwick, I got super interested in motion tracking and position determination with MEMS devices. As a (bad) skateboarder, I often wondered what I was doing wrong when I attempted certain tricks, so I wanted to see if I could actually SEE what my feet were doing. Having access to a 3D printer, I enclosed a microcontroller, an IMU, and a Bluetooth transmitter in a package you could attach to your shoes:
Data from these is recorded by either a computer or iPhone, then processed into visualizations of the motions of your feet (and skateboard). The end product would be a 3D rendering of your shoes and skateboard, which you could watch from any angle to help you learn. A demonstration of this concept is shown in a video here.
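Turning raw IMU data into foot orientation comes down to sensor fusion. Madgwick's full quaternion filter is the real workhorse, but the core idea can be sketched with a simple complementary filter: integrate the gyroscope for smooth short-term tracking, and nudge the result toward the accelerometer's gravity-based tilt estimate to cancel drift. This is a minimal one-axis sketch, not the actual code from the project:

```python
import math

def complementary_filter(accel, gyro_rate, dt, prev_angle, alpha=0.98):
    """Fuse gyro integration with an accelerometer tilt estimate (one axis).

    accel: (ax, ay, az) in g; gyro_rate: angular rate in rad/s.
    Returns the updated tilt angle in radians.
    """
    # Tilt implied by the gravity direction: noisy, but drift-free
    accel_angle = math.atan2(accel[1], accel[2])
    # Integrated gyro rate: smooth, but drifts over time
    gyro_angle = prev_angle + gyro_rate * dt
    # Blend: trust the gyro short-term, the accelerometer long-term
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# Example: a stationary sensor lying flat, gravity along +z
angle = 0.0
for _ in range(100):
    angle = complementary_filter((0.0, 0.0, 1.0), 0.0, 0.01, angle)
```

For a stationary, level sensor the estimate stays pinned at zero; in the real project, three of these (or a quaternion filter) run per sensor to recover full 3D orientation for the rendering.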
I also wanted to be able to identify specific tricks from a reduced number of measurements. Borrowing from the handwriting-recognition community, I developed a classification technique that looked for temporal features unique to each maneuver. Recording 6 skateboarders performing an identical trick set, I was able to determine whether a specific trick had been performed to some confidence level. I was also able to see the unique style of each skateboarder, which was really cool.
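The feature-based approach can be sketched in a few lines: reduce each recorded trace to a handful of temporal features, then match against labeled templates with a nearest-neighbor rule. The feature choices and template values below are hypothetical illustrations, not the project's actual ones:

```python
import math

def extract_features(gyro_trace):
    """Reduce a time series of angular rates to a few temporal features:
    peak rate, peak position (fraction of the trace), and mean rate."""
    peak = max(gyro_trace, key=abs)
    t_peak = gyro_trace.index(peak) / (len(gyro_trace) - 1)
    mean = sum(gyro_trace) / len(gyro_trace)
    return (peak, t_peak, mean)

def classify(trace, templates):
    """Nearest-neighbor match of a trace's features to labeled templates."""
    feats = extract_features(trace)
    return min(templates, key=lambda label: math.dist(feats, templates[label]))

# Hypothetical templates: (peak rate, peak position, mean rate) per trick
templates = {"ollie": (2.0, 0.3, 0.5), "kickflip": (8.0, 0.5, 1.5)}
trace = [0.5, 1.0, 7.9, 3.0, 1.0]  # fast spike mid-trace
print(classify(trace, templates))  # -> kickflip
```

The distance to each template doubles as a crude confidence score, and per-rider offsets in the same feature space are one way the "unique style" of each skateboarder shows up.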
The entire project is available here on my Bitbucket.
I’m a dedicated user of Jupyter (IPython) notebooks for my research. I like to have several Jupyter servers running for different projects, so I made a macOS app to manage opening these sessions and to remember previous ones. The simple interface looks like this:
This is the first piece of software I’ve written that I use every day, dozens of times. It’s super simple and only saves ~10 seconds each time, but seconds add up… It’s not App Store approved yet, but if you’d like to try it out, a build of the app is available here on my GitHub. My original implementation was actually written completely in Python with PyQt, and is available here.
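The core of such a session manager is small: launch a Jupyter server in a chosen directory and persist the directory list so it reappears next launch. This is a hedged sketch of that idea (file names and function names are my own, not from the app):

```python
import json
import subprocess
from pathlib import Path

def load_sessions(sessions_file: Path):
    """Return previously opened project directories, oldest first."""
    if sessions_file.exists():
        return json.loads(sessions_file.read_text())
    return []

def remember_session(project_dir: str, sessions_file: Path):
    """Record project_dir so it shows up in the session list next time."""
    sessions = load_sessions(sessions_file)
    if project_dir not in sessions:
        sessions.append(project_dir)
        sessions_file.write_text(json.dumps(sessions))

def open_session(project_dir: str, sessions_file: Path):
    """Launch a Jupyter server in project_dir and remember it."""
    remember_session(project_dir, sessions_file)
    # Start the server without blocking the caller (e.g. a GUI event loop)
    return subprocess.Popen(["jupyter", "notebook"], cwd=project_dir)
```

The macOS app wraps the same launch-and-remember loop in a native interface; the PyQt version drives it from a Qt window instead.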
Test Chamber Controller
We use the Elephant Plasma Chamber at Dartmouth College to test the sensors we build for rockets in a controlled laboratory environment. The system is composed of a large vacuum chamber with a plasma source at one end. Various power supplies, diagnostics, and a motion table are used to calibrate the diagnostics we build. The motion table is used to adjust the orientation of the diagnostic to the plasma source, and can tilt with two degrees of freedom.
When I first arrived at Dartmouth I took on the task of controlling all of these devices with one system. I built a Python-based GUI which interfaces with all of these devices, adjusting the orientation of the sensor, controlling voltage and current levels, and recording/plotting data. The software also allows for predefined scans of orientation or parameter settings, increasing our accuracy and speed in diagnostic calibration dramatically! Here’s a pretty crappy screenshot:
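A predefined scan boils down to stepping through every combination of settings and logging a reading at each point. Here is a minimal sketch of that pattern; the instrument callables are stand-ins, not the lab's actual driver API:

```python
import itertools

def run_scan(tilt_angles_deg, bias_voltages, set_tilt, set_bias, read_current):
    """Step through every (tilt, bias) combination and record a reading.

    set_tilt / set_bias / read_current are callables wrapping the real
    instrument drivers (hypothetical placeholders here).
    """
    results = []
    for tilt, bias in itertools.product(tilt_angles_deg, bias_voltages):
        set_tilt(tilt)     # reorient the motion table
        set_bias(bias)     # program the power supply
        results.append({"tilt": tilt, "bias": bias, "current": read_current()})
    return results

# Stand-in instrument functions with a fake linear response, for demonstration
state = {}
log = run_scan(
    [0, 15, 30], [-5.0, 0.0, 5.0],
    set_tilt=lambda a: state.update(tilt=a),
    set_bias=lambda v: state.update(bias=v),
    read_current=lambda: state["bias"] * 0.1,
)
```

Automating the loop is what buys the accuracy and speed: every calibration point is visited in the same order with the same settling behavior, with no manual dial-turning between readings.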
Snakes From Space!
I wanted to make an iOS game, and after my terrible Android game, I figured I’d stick with a classic. I think the only person who plays is my girlfriend… but it’s not totally unfun.
This one’s actually available on the App Store!
These were some of my earlier robotics projects. Super fun and hacky.