This is a list of the software projects I have worked on and contributed to in the past.


Phonemic

Phonemic is a cross-platform, open-source text-to-speech library that bridges a wide variety of text-to-speech technologies into a single API for Java applications. Phonemic also offers a simplified TCP protocol, allowing multiple clients to share one text-to-speech service.
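To illustrate the idea of sharing one speech service over TCP, here is a minimal sketch of a line-based client talking to a local stand-in server. The SPEAK command, the message format, and all names here are illustrative assumptions, not Phonemic's actual wire protocol.

```java
import java.io.*;
import java.net.*;

// Hypothetical line-based client for a shared text-to-speech service.
// SPEAK and the one-command-per-line format are assumptions for illustration.
class SpeechClient {
    static String buildCommand(String verb, String text) {
        // One command per line keeps the protocol trivial to parse on the server.
        return verb + " " + text.replace("\n", " ") + "\n";
    }

    public static void main(String[] args) throws IOException {
        // A loopback server stands in for the real speech service.
        try (ServerSocket server = new ServerSocket(0)) {
            Thread service = new Thread(() -> {
                try (Socket s = server.accept();
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(s.getInputStream()))) {
                    System.out.println("service received: " + in.readLine());
                } catch (IOException ignored) { }
            });
            service.start();
            try (Socket client = new Socket("localhost", server.getLocalPort());
                 Writer out = new OutputStreamWriter(client.getOutputStream())) {
                out.write(buildCommand("SPEAK", "Hello, world"));
                out.flush();
            }
            try { service.join(); } catch (InterruptedException ignored) { }
        }
    }
}
```

Because the protocol is a single socket connection, any number of clients can take turns sending commands to the same speech engine.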

In the press: Announcing Phonemic 1.0: Cross Platform Speaking Library

Project home: Bitbucket


Sodbeans

Sodbeans is a derivative of the NetBeans development environment with significant modifications that make it easier to use for blind and visually impaired individuals. These enhancements include text-to-speech (through Phonemic) and magnification services. Our team won the Oracle Java Innovation Award in 2011 for our work on Sodbeans.

My work on this project spanned a wide variety of sub-projects. I re-developed the text-to-speech system to use Phonemic, implemented magnification tools, and refactored a variety of interfaces to be more accessible. I helped design and edit the Sodbeans curriculum in use at a large number of schools for the blind across the United States. I also implemented an omniscient debugger (a debugger that can step both backward and forward) running on the Java Virtual Machine, using a heavily modified version of the scalable TOD omniscient debugger. TOD required significant modification to achieve our goals of backward execution and an accessible speaking interface for the debugger.
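The core idea behind an omniscient debugger can be sketched in a few lines: record every execution event, then move a cursor backward and forward over the recording. This is only a conceptual sketch; TOD itself stores events far more scalably, and the event strings here are invented.

```java
import java.util.ArrayList;
import java.util.List;

// Conceptual sketch of omniscient debugging: record events, then replay
// them in either direction by moving a cursor over the recording.
class EventTrace {
    private final List<String> events = new ArrayList<>();
    private int cursor = -1; // index of the event we are currently "at"

    void record(String event) { events.add(event); cursor = events.size() - 1; }

    String stepBack()    { return cursor > 0 ? events.get(--cursor) : null; }
    String stepForward() { return cursor < events.size() - 1 ? events.get(++cursor) : null; }
    String current()     { return cursor >= 0 ? events.get(cursor) : null; }

    public static void main(String[] args) {
        EventTrace trace = new EventTrace();
        trace.record("enter main");
        trace.record("x = 1");
        trace.record("x = 2");
        // Stepping back makes execution appear to run in reverse.
        System.out.println(trace.stepBack());    // x = 1
        System.out.println(trace.stepForward()); // x = 2
    }
}
```

In a real debugger each recorded event carries enough state (variables, stack, line number) to reconstruct the program at that moment; the cursor mechanics stay the same.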

Project home: QuorumLanguage.com

In the press: Sodbeans Wins JavaOne 2011 Duke's Choice Award

The following video shows Sodbeans and other tools in use at our coding summer camp at the Washington State School for the Blind. I narrated this video in 2012.


Quorum

Quorum is the first evidence-based programming language: every aspect of Quorum is studied and peer-reviewed before being accepted into the language. This produces syntax that is easier for novices to understand without sacrificing any of the power needed by experts. Quorum is fully object-oriented and bootstrapped, with its underlying functionality provided by the Java Virtual Machine.

I worked on this project from 2011 to 2013, helping implement the bytecode emission layer for the Java Virtual Machine along with many other aspects of the compiler. During that time, we transitioned Quorum from a slow interpreted language to a fast, JVM-enabled language.

Project home: QuorumLanguage.com


Campimetre

Campimetre is a tool developed as part of my senior design class at Southern Illinois University. Campimetre combines input from a MIDI keyboard and the ASL Eye Tracker to assess the behavior of musicians while they sight-read. It reports where a user is looking when they press a particular note. With this information, cognitive psychologists can determine how far ahead a sight-reading musician is looking while playing, which is useful for developing a variety of cognitive theories, including the influence of practice on sight-reading performance.
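The "how far ahead" measure can be sketched as a simple alignment of two timestamped streams: for each key press, find the gaze sample nearest in time and compare which note the eyes were on with which note was played. The parallel-array data layout and the numbers below are assumptions for illustration, not Campimetre's actual data model.

```java
// Illustrative eye-hand span computation: for each key press, find the gaze
// sample nearest in time and report how many notes ahead the eyes were.
class EyeHandSpan {
    static int spanAt(long pressTime, int playedNote,
                      long[] gazeTimes, int[] gazeNotes) {
        int nearest = 0;
        for (int i = 1; i < gazeTimes.length; i++)
            if (Math.abs(gazeTimes[i] - pressTime) < Math.abs(gazeTimes[nearest] - pressTime))
                nearest = i;
        return gazeNotes[nearest] - playedNote; // positive = looking ahead
    }

    public static void main(String[] args) {
        long[] gazeTimes = {0, 100, 200, 300};   // milliseconds
        int[] gazeNotes  = {2, 3, 5, 6};         // note index on the score
        // While playing note 1 at t=110 ms, gaze was on note 3: two notes ahead.
        System.out.println(spanAt(110, 1, gazeTimes, gazeNotes)); // 2
    }
}
```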

View the final project poster and the final project presentation.

Bird's the Word

Bird's the Word is an application written in PHP to assist with the analysis of the most frequently used words on Twitter, based on geolocation information. Tweet text and geolocation information are provided by the Twitter REST APIs. Data from these APIs is imported into a MySQL database, which can then be queried to gather word-usage data. The application includes a dictionary of common words to filter out frequently used words such as "the", allowing users to determine what topics are popular without relying on trending hashtags. Optionally, a full dictionary can also be imported to analyze the frequency and kinds of typos Twitter users make. I completed this project as part of my Database Programming class in 2012 at Southern Illinois University.
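The core analysis step, counting word frequencies while filtering a dictionary of common words, can be sketched as follows. In the real application this ran as PHP and SQL against a MySQL database; the in-memory version below just shows the logic, and the sample tweets and stopword list are invented.

```java
import java.util.*;

// Sketch of stopword-filtered word-frequency counting over tweet text.
class WordCounts {
    static Map<String, Integer> topWords(List<String> tweets, Set<String> common) {
        Map<String, Integer> counts = new HashMap<>();
        for (String tweet : tweets)
            for (String word : tweet.toLowerCase().split("\\W+"))
                if (!word.isEmpty() && !common.contains(word))
                    counts.merge(word, 1, Integer::sum);
        return counts;
    }

    public static void main(String[] args) {
        Set<String> common = Set.of("the", "a", "is", "to", "on");
        List<String> tweets = List.of("The game is on", "on to the game");
        // "game" survives the filter; "the", "is", "on", "to" do not.
        System.out.println(WordCounts.topWords(tweets, common));
    }
}
```

The SQL equivalent is a GROUP BY over a words table with a NOT IN (or anti-join) against the common-words dictionary.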


MagnifyWedge

MagnifyWedge is a research project I proposed for the National Science Foundation's Graduate Research Fellowship Program in 2012; the grant was approved in 2013. I completed this project while working under Dr. Robert St. Amant at North Carolina State University. The project assessed the feasibility of directing full-screen magnification users to off-screen targets using a visualization technique known as Wedge. I completed the project in 2014 and defended it as my written qualifier. Results are currently awaiting academic publication.
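The geometric core of a Wedge-style cue can be sketched simply: the wedge's apex sits at the off-screen target, and its base is anchored where the line to the target meets the viewport edge, with the wedge's legs lengthened to convey distance. The clamping approximation and the viewport numbers below are assumptions for illustration, not the actual MagnifyWedge implementation.

```java
// Sketch of anchoring an off-screen cue: clamp the target into the viewport
// to find the edge anchor, and measure how far beyond the edge the target is.
class WedgeAnchor {
    // Returns {anchorX, anchorY, distanceBeyondEdge} for a target at (tx, ty).
    static double[] anchor(double tx, double ty,
                           double left, double top, double right, double bottom) {
        double ax = Math.max(left, Math.min(tx, right));
        double ay = Math.max(top, Math.min(ty, bottom));
        double dist = Math.hypot(tx - ax, ty - ay); // 0 means on-screen
        return new double[] {ax, ay, dist};
    }

    public static void main(String[] args) {
        // Target 150 px beyond the right edge of a 1024x768 viewport, mid-height.
        double[] a = anchor(1174, 384, 0, 0, 1024, 768);
        System.out.printf("anchor=(%.0f, %.0f), distance=%.0f%n", a[0], a[1], a[2]);
    }
}
```

For a full-screen magnification user, the "viewport" is the magnified region rather than the physical screen, which is what makes the off-screen problem so acute.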


DriveBy

DriveBy is a tool that allows users to generate videos of driving directions retrieved from Google Maps. Using the Google Maps and Google Street View APIs, this software not only creates videos of routes but also allows the videos to be explored one driving step at a time. I completed this project for my Human-Computer Interaction class in 2014 at North Carolina State University.
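One way to produce such frames is to request one Street View image per route step, with the camera heading pointed toward the next step so frames face the direction of travel. The size, location, heading, and key parameters are real parameters of the Google Street View Static API, but the coordinates, key, and helper names below are placeholders, and this is not necessarily how DriveBy itself was built.

```java
// Sketch: one Street View Static API frame request per route step.
class FrameUrls {
    static String frameUrl(double lat, double lng, double heading, String key) {
        return "https://maps.googleapis.com/maps/api/streetview"
                + "?size=640x480"
                + "&location=" + lat + "," + lng
                + "&heading=" + heading
                + "&key=" + key;
    }

    // Initial bearing from one point toward the next (standard great-circle formula),
    // so each frame looks down the road rather than in a fixed direction.
    static double headingTo(double lat1, double lng1, double lat2, double lng2) {
        double dLng = Math.toRadians(lng2 - lng1);
        double y = Math.sin(dLng) * Math.cos(Math.toRadians(lat2));
        double x = Math.cos(Math.toRadians(lat1)) * Math.sin(Math.toRadians(lat2))
                 - Math.sin(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2)) * Math.cos(dLng);
        return (Math.toDegrees(Math.atan2(y, x)) + 360) % 360;
    }

    public static void main(String[] args) {
        double heading = headingTo(35.7796, -78.6382, 35.7800, -78.6382); // due north
        System.out.println(frameUrl(35.7796, -78.6382, heading, "YOUR_KEY"));
    }
}
```

Stitching the resulting images in route order yields the video, and keeping one image per step is what makes step-at-a-time exploration possible.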

Live demo coming soon, when time permits.


TactPlot

TactPlot is my latest project, investigating the use of the palm for providing driving directions to the blind and visually impaired. Using a modified 3D printer carrying a stylus, routes are drawn on the palm, which the user then interprets to navigate an area such as a building.
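Before the stylus can trace a route, the route's coordinates have to be scaled uniformly into the small palm-sized drawing region. The 60 mm region and the sample route below are assumptions for illustration, not TactPlot's actual parameters.

```java
// Sketch: fit a route polyline into a palm-sized drawing area, preserving
// aspect ratio so turns on the palm match turns on the route.
class PalmScale {
    // Scales points uniformly so the route's longer dimension spans [0, size] mm.
    static double[][] fit(double[][] pts, double size) {
        double minX = Double.MAX_VALUE, minY = Double.MAX_VALUE;
        double maxX = -Double.MAX_VALUE, maxY = -Double.MAX_VALUE;
        for (double[] p : pts) {
            minX = Math.min(minX, p[0]); maxX = Math.max(maxX, p[0]);
            minY = Math.min(minY, p[1]); maxY = Math.max(maxY, p[1]);
        }
        double scale = size / Math.max(maxX - minX, maxY - minY);
        double[][] out = new double[pts.length][2];
        for (int i = 0; i < pts.length; i++) {
            out[i][0] = (pts[i][0] - minX) * scale;
            out[i][1] = (pts[i][1] - minY) * scale;
        }
        return out;
    }

    public static void main(String[] args) {
        // A 200 m x 100 m building route scaled onto a 60 mm palm region.
        double[][] route = { {0, 0}, {200, 0}, {200, 100} };
        for (double[] p : fit(route, 60))
            System.out.printf("stylus move to (%.1f, %.1f) mm%n", p[0], p[1]);
    }
}
```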