Go Wasm: Compiling code in the browser with WebAssembly October 2020

Browsers have become powerful beasts. First used to share research papers at CERN, the browser can now run Google Earth, play Unity 3D games, and even design buildings in AutoCAD.

With this kind of power, could a browser compile and run your code too? Ridiculous. Surely that couldn’t work…

Then again, why not? There’s no way I could ignore such a fascinating challenge. After four months of punching the keys and poring over documentation, I’ve finally created my answer: Go Wasm.


Staff Software Engineer IBM June 2018 – October 2020

I develop and maintain IBM Cloud’s Kubernetes Service cluster management API microservices & CLI, the Web Terminal cluster addon & proxy microservice, and the Kubernetes Dashboard proxy microservice to name a few.

Over the past year I overhauled the CLI user experience and created the Web Terminal cluster addon & its proxy. Within my larger team, I’ve also helped launch a global API to unify the IKS region-specific APIs and cluster management APIs for both VPC and Red Hat OpenShift.

Most of our services are written in Go and some, like the Web Terminal, use JavaScript/CSS/HTML for the front-end.

Noonian September 2014 – January 2019

Noonian is a research tool for finding the laptop that fits your needs. You can search for laptops and then run comparisons to narrow down your choices quickly. Check it out at noonian.com.

Later on, the plan is to open this up to comparing builds of different computers and their components. With Noonian, I aim to make it easier to shop smarter by avoiding the dark patterns and shortcomings typical of product search.

Bashtion May 2018

A framework for writing reusable and testable modules in Bash. Sometimes using Bash is simply inescapable, but large legacy scripts can be hard to manage. I sought to create a framework that would make it easy to gradually improve legacy code.

Bashtion encourages modular, reusable, and testable code. It does this by setting up Bash with sensible defaults, a built-in error tracing system, support for writing tests, and warnings for deviations from best practices. The project is open source and you can check it out on GitHub.

IKS Kubernetes Dashboard Proxy IBM April 2018

The Kubernetes Dashboard is a great way to see what’s going on in your Kubernetes cluster. We created a service that uses IBM Cloud’s authentication to provide convenient access to that dashboard in IKS. My co-worker, Chris Kirkland, wrote a great blog post to announce the new service.

I added monitoring and alerting, and improved the user experience for each of our failure modes. Today, my team and I continue to maintain and further develop the service.

Software Engineer IBM June 2017 – May 2018

I worked to improve our users’ experiences while using the IBM Cloud Container Service. I helped build and maintain a metrics and log-forwarding service, a Kubernetes Dashboard proxy, and various other internal services.

Throughout my experience in this role I strove for high code readability, test coverage, and service robustness to deliver top-notch experiences for our users. Our services have had strong positive feedback for ease of use and reliability.

We have had negative feedback, of course, so we tackle these issues promptly. Typically we can address the customer’s issue within one or two 1-week sprints. For important changes we have pushed out code changes the same day. To double-check ourselves, we also run automatic integration tests on all changes along the way to production.


IKS Logs & Metrics Forwarder IBM June 2017

On IBM Cloud’s Kubernetes Service, you can create a cluster and tick a box to enable logging. The magic behind that little box is a logs and metrics forwarder that my team builds and maintains.

The metrics and log-forwarding service captures various stats and logs generated in a user’s Kubernetes cluster, then forwards them to the IBM Log Analysis and Monitoring services by default. We picked up the service in its proof-of-concept stage and completely revamped it to support newer features and meet our rigorous quality standards for production.


Python Pool Analysis August 2016

It can be difficult to find the best method for parallel data processing. In Python, I came across six different options to run the same code concurrently, some truly in parallel and some interleaved as green threads. Overwhelmed with options, I decided to dig into the details of each pool and really test them.

I created a CLI to test each of the six Python pool implementations on both CPU-bound and I/O-bound workloads. I open sourced the CLI and my test results on GitHub so other developers can run their own tests, or skip the dirty details of benchmarking and use my conclusions as a reference.
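The full harness lives in the repo, but the core idea is simple: map the same task over each pool and time it. A minimal sketch using the standard library’s ThreadPoolExecutor (the task and timings here are illustrative, not from the original benchmarks):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def io_task(delay):
    # Simulated I/O-bound work: the thread just waits.
    time.sleep(delay)
    return delay

def benchmark(pool_cls, fn, args, workers=4):
    # Time how long a pool takes to map fn over args.
    start = time.perf_counter()
    with pool_cls(max_workers=workers) as pool:
        results = list(pool.map(fn, args))
    return time.perf_counter() - start, results

# Eight 50 ms waits take ~0.4 s sequentially but ~0.1 s with 4 threads,
# since threads overlap I/O waits (they would not help CPU-bound work).
elapsed, results = benchmark(ThreadPoolExecutor, io_task, [0.05] * 8)
print(f"4 threads, 8 I/O-bound tasks: {elapsed:.2f}s")
```

Swapping in a process-based pool flips the picture: processes sidestep the GIL for CPU-bound work, but pay serialization overhead on everything else, which is exactly the trade-off the benchmarks expose.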

Software Engineering Intern IBM January 2016 – May 2017

After I returned to IBM, my first project was to help develop a metric reporting service for my squad’s auto-scaling cloud services. Second, I co-developed a continuous-integration Jenkins job generator that saves our team many hours by keeping the pipeline consistent and easy to use. Lastly, I developed a Python library that made one of our most critical services at least 8 times faster.

Throughout my work at IBM, I helped write and maintain a data collection application that reports metrics on IBM Bluemix Autoscaling services.

I also co-developed our current continuous integration pipelines. By writing a Jenkins job generator, I prepared the pipeline deployment jobs with end-to-end tests. All of our services have switched to the generated jobs, automatically building our app images and deploying those apps on successful builds. This saves us hours each week that were once spent digging through Jenkins jobs for bugs, and hours more training newcomers on the system.

In my last few months I wrote a Message Hub library for Python clients to support IBM’s Kafka messaging service. This library made one of our most important services at least 8 times faster. I increased performance over our old client by wrapping a Kafka C library in Python, controlling resources with context managers to ensure proper cleanup, and providing a simpler API for our Python apps to connect to. In this project, I learned how to develop Python libraries for installation into other projects, how to wrap lower-level languages in Python, and strategies for improving the performance of our application services.
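The library itself is internal, but the resource-management pattern is standard Python. A minimal sketch of the idea, with a hypothetical FakeProducer standing in for the real C-backed Kafka client:

```python
from contextlib import contextmanager

class FakeProducer:
    # Stand-in for a C-backed Kafka producer (hypothetical).
    def __init__(self):
        self.closed = False
        self.sent = []

    def send(self, topic, msg):
        self.sent.append((topic, msg))

    def flush(self):
        pass  # a real client would drain its internal buffers here

    def close(self):
        self.closed = True

@contextmanager
def producer():
    # Guarantee the underlying client is flushed and closed,
    # even if the caller's code raises an exception.
    p = FakeProducer()
    try:
        yield p
    finally:
        p.flush()
        p.close()

with producer() as p:
    p.send("metrics", b"cpu=42")
# After the with-block exits, the client has been flushed and closed.
```

Because the native library holds resources that Python’s garbage collector can’t see, tying cleanup to a `with` block rather than to object destruction is what keeps the wrapper safe.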

Software Engineering Intern IBM May – August 2015

With my OpenStack Innovation squad, I developed a monitoring and alerting system for the servers running the Bluemix Container Cloud. The system detects not only current problems but also future ones, alerting the right people before a problem actually occurs. This shifted our maintenance from reactive to proactive.

Shortly before I returned to school, this project was picked up by multiple teams. The monitoring and alerting system became a hot topic in IBM Cloud because of its ease of use, simple deployment, and proactive alerting.

Within three weeks I had the monitoring and alerting stack designed, deployed, and running in our development environment. My intern partner and I made significant strides toward predictive alerts during and after deployment. We also took steps to get it into staging (and soon after, production) just before school began for me again. The project was built from custom and public Docker containers and deployed via docker-compose.

In addition to the above, my team and I moved on to auto scaling for virtual machines in OpenStack. I built a small OpenStack Heat template parser and modifier utility in Python, then helped integrate it into the auto scaling API web server.
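That utility was internal, but the core idea, treating the template as structured data and returning a modified copy, is easy to sketch. Assuming a JSON-style Heat template with an `OS::Heat::AutoScalingGroup` resource (the resource name `web_group` and the function here are made up for illustration):

```python
import copy

def set_group_size(template, resource_name, size):
    # Return a modified copy of a Heat-style template dict, bumping the
    # scaling group's desired_capacity while respecting its bounds.
    updated = copy.deepcopy(template)
    props = updated["resources"][resource_name]["properties"]
    if not props["min_size"] <= size <= props["max_size"]:
        raise ValueError(f"size {size} outside "
                         f"[{props['min_size']}, {props['max_size']}]")
    props["desired_capacity"] = size
    return updated

template = {
    "heat_template_version": "2015-04-30",
    "resources": {
        "web_group": {
            "type": "OS::Heat::AutoScalingGroup",
            "properties": {"min_size": 1, "max_size": 10,
                           "desired_capacity": 2},
        }
    },
}
scaled = set_group_size(template, "web_group", 5)
```

Real Heat templates are usually YAML, so a full version would parse the YAML first, but once the template is a dict the modification logic is the same.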

I worked closely with my team members, advising them on tools like Docker and how to use containers in deployment, helping with Apache server configuration, and sharing my experience building RESTful APIs.

Web Developer Intern Apple June – August 2014

With Mac Quality I worked on two utility websites to enable others in Product Integrity to access, manipulate, and customize the data collected from the machine testing process.

In two weeks, I built a front end web service for Product Integrity personnel to access individual test runs and all associated data in one place.

I worked closely with another intern to create a second web service, which enables anyone in QA with the right permissions to edit a project’s information used in reports. I created the interface for QA to customize test run summary output, which increased the division’s efficiency. We made the project modification process significantly faster and condensed the detail page information so it is far more readable and concise.

Home Server November 2013

I regularly maintain my Ubuntu server, running a wide variety of applications inside virtual machines. More recently I’ve also been deploying some internal services with Docker.

I run multiple websites on the LAMP stack (Linux, Apache, MySQL, PHP). I also run instances of MongoDB, Cassandra, and GitLab for personal development.

Web & iOS Developer Smarter Homes 2013 – 2014

As the webmaster for Smarter Homes of Austin, I was responsible for adding content to the site and optimizing it to draw search engine traffic. As the iOS developer, I worked on a point-of-sale application that helped the company keep its books and make sales more effectively by keeping prices and inventory readily available.


Arc October 2012 – June 2014

Initially an attempt to build a game from the ground up, this project became a game engine that gave us valuable insight into how games ought to be developed. In the process of creating the Arc reactor, we learned how to work as a tight team with a strong focus on making an intuitive game library API.

We built the reactor in Java with OpenGL, and we probably spent more time arguing about the right name for a method than actually writing the code; however, this did result in a beautifully crafted API! In the end, we realized the project's complexity was over our heads: we lacked the know-how for performance optimization, and our skills were not yet ready for making the aforementioned game. Still, we learned a great deal about teamwork, code management techniques, and API development. Soon after wrapping up our ideas, we moved on to Noonian.