Welcome to my website!
Here, you can find my code, art, and links to my various outposts on the World Wide Web.
I am a programmer and artist.
Professionally, I've written software for a wide variety of purposes:
(While my professional experience is chiefly with C++, Rust, and C#, I can solve problems using just about anything approaching a Turing machine.)
Unprofessionally, I like to make art, watch college football, play Dungeons and Dragons, read and write, and create fun bits of software.
This is a subset of the code I have written professionally, academically, or in an occasional fit of unrestrained productivity.
Fourier Synthesis of Ocean Waves: Blizzard Entertainment (C++, Python, Houdini HDK + VEX, Katana Op API) (2023)
Ocean surface evaluation library based on prior work by Tessendorf, Horvath, and others, with integration into the FX team's workflow across Houdini and Katana. Intended for use in Blizzard cinematics.
Created for a research and development internship with Blizzard; the above sample render is their property, posted with permission.
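For the curious, the core idea is Fourier synthesis: the surface is a sum of sinusoids whose amplitudes come from an oceanographic spectrum and whose phases evolve under the deep-water dispersion relation. Here is a minimal, direct-summation sketch in Rust for illustration only; the actual library is C++, draws its amplitudes from a proper spectrum, and evaluates the sum with inverse FFTs rather than a loop.

```rust
/// A single spectral wave component: wavevector (rad/m), amplitude, and phase.
struct WaveComponent {
    kx: f64,
    ky: f64,
    amplitude: f64, // in a real spectrum this comes from e.g. Phillips/TMA
    phase: f64,     // random initial phase
}

/// Deep-water dispersion relation: omega = sqrt(g * |k|).
fn dispersion(k_len: f64) -> f64 {
    const G: f64 = 9.81;
    (G * k_len).sqrt()
}

/// Surface height at (x, y) and time t, as a direct sum over components.
/// An FFT-based evaluator computes this same sum for a whole grid at once.
fn height(components: &[WaveComponent], x: f64, y: f64, t: f64) -> f64 {
    components
        .iter()
        .map(|c| {
            let k_len = (c.kx * c.kx + c.ky * c.ky).sqrt();
            let omega = dispersion(k_len);
            c.amplitude * (c.kx * x + c.ky * y - omega * t + c.phase).cos()
        })
        .sum()
}

fn main() {
    let comps = vec![WaveComponent { kx: 0.5, ky: 0.0, amplitude: 0.3, phase: 0.0 }];
    println!("h(0, 0, 1s) = {}", height(&comps, 0.0, 0.0, 1.0));
}
```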
Shimmer: Open Source Spectral Rendering System (Rust) (GitHub) (2023)
Renders from Shimmer. (Left) Kroken scene by Angelo Ferretti. (Right) Crown scene by Martin Lubich.
Shimmer is a physically based, spectral rendering system built on ray tracing and written in pure Rust. It is principally based on the architecture described in PBRT, 4th edition, but it is not meant to be a 1:1 port of that project to Rust, and so may differ where desired.
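To give a flavor of what "spectral" means here: instead of RGB, each path carries radiance at a handful of sampled wavelengths, which are only converted to XYZ/RGB at the very end. A rough Rust sketch of that idea (illustrative types, not Shimmer's actual API):

```rust
const N_SPECTRUM_SAMPLES: usize = 4;

/// Wavelengths (in nm) carried along a single camera path.
struct SampledWavelengths {
    lambda: [f64; N_SPECTRUM_SAMPLES],
}

/// Per-wavelength values (radiance, reflectance, path throughput, ...).
#[derive(Clone, Copy)]
struct SampledSpectrum {
    values: [f64; N_SPECTRUM_SAMPLES],
}

impl SampledSpectrum {
    fn constant(v: f64) -> Self {
        Self { values: [v; N_SPECTRUM_SAMPLES] }
    }

    /// Element-wise product, e.g. attenuating path throughput by a BSDF value.
    fn mul(self, other: Self) -> Self {
        let mut values = [0.0; N_SPECTRUM_SAMPLES];
        for i in 0..N_SPECTRUM_SAMPLES {
            values[i] = self.values[i] * other.values[i];
        }
        Self { values }
    }
}

fn main() {
    // Wavelength samples spread over the visible range (roughly 360-830 nm).
    let lambdas = SampledWavelengths { lambda: [420.0, 540.0, 660.0, 780.0] };
    let throughput = SampledSpectrum::constant(1.0).mul(SampledSpectrum::constant(0.8));
    println!("lambdas: {:?}, throughput: {:?}", lambdas.lambda, throughput.values);
}
```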
Reference Rover (Under Development) (Next.js, Rust, Tauri, Python, Tailwind) (GitHub) (2024)
I am currently developing ReferenceRover, a tool for locally organizing and searching artistic reference material. It is a desktop application built with Rust and Tauri, with a Next.js frontend.
The key feature is private, local, GPU-accelerated semantic search of image data. Semantic image and text encodings are generated by a CLIP model running on the user's machine via the ONNX Runtime. The resulting vectors are indexed and searched via the hierarchical navigable small world (HNSW) algorithm.
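Conceptually, a search is just a nearest-neighbor query over those embedding vectors: the text query is encoded into the same space as the images and matched by cosine similarity. Here is a brute-force Rust sketch of that matching step; the real application swaps the linear scan for an HNSW index, and the embeddings come from CLIP rather than being hand-written:

```rust
/// Cosine similarity between two embedding vectors of equal length.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    dot / (norm_a * norm_b + 1e-12)
}

/// Return indices of the top-k image embeddings most similar to the query.
/// The real application replaces this linear scan with an HNSW index.
fn top_k(query: &[f32], image_embeddings: &[Vec<f32>], k: usize) -> Vec<usize> {
    let mut scored: Vec<(usize, f32)> = image_embeddings
        .iter()
        .enumerate()
        .map(|(i, e)| (i, cosine_similarity(query, e)))
        .collect();
    scored.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    scored.into_iter().take(k).map(|(i, _)| i).collect()
}

fn main() {
    // Toy 3-dimensional "embeddings"; real CLIP embeddings are 512+ dimensional.
    let images = vec![vec![1.0, 0.0, 0.0], vec![0.7, 0.7, 0.0], vec![0.0, 0.0, 1.0]];
    let query = vec![0.9, 0.1, 0.0];
    println!("best matches: {:?}", top_k(&query, &images, 2));
}
```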
Ray Tracing in One Weekend, in Rust: Yet Another Ray Tracer (YART) (Rust) (GitHub) (2022)
A Rust implementation of Peter Shirley's books on ray tracing, with some extensions such as parallelization via rayon and tiled rendering. I've also used it to experiment with hash-based ray path prediction (HRPP). I've since directed my interest in light transport towards Shimmer.
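The tiling extension is straightforward: carve the image into independent tiles and let rayon's work-stealing scheduler hand them out across cores. A sketch of that structure (illustrative names, not YART's actual types):

```rust
use rayon::prelude::*;

/// One rectangular region of the output image, with its own pixel buffer.
struct Tile {
    x0: usize,
    y0: usize,
    width: usize,
    height: usize,
    pixels: Vec<[f32; 3]>,
}

/// Shade the pixels of one tile. A placeholder gradient stands in for the
/// per-pixel ray tracing an actual renderer would do here.
fn render_tile(tile: &mut Tile, image_width: usize, image_height: usize) {
    for j in 0..tile.height {
        for i in 0..tile.width {
            let u = (tile.x0 + i) as f32 / image_width as f32;
            let v = (tile.y0 + j) as f32 / image_height as f32;
            tile.pixels[j * tile.width + i] = [u, v, 0.25];
        }
    }
}

fn main() {
    let (image_width, image_height, tile_size) = (256, 256, 64);
    let mut tiles: Vec<Tile> = (0..image_height / tile_size)
        .flat_map(|ty| (0..image_width / tile_size).map(move |tx| (tx, ty)))
        .map(|(tx, ty)| Tile {
            x0: tx * tile_size,
            y0: ty * tile_size,
            width: tile_size,
            height: tile_size,
            pixels: vec![[0.0; 3]; tile_size * tile_size],
        })
        .collect();

    // Each tile is independent, so rayon can render them on all cores.
    tiles
        .par_iter_mut()
        .for_each(|tile| render_tile(tile, image_width, image_height));

    println!("rendered {} tiles", tiles.len());
}
```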
Feriphys: Physics Simulation for Computer Graphics (Rust, wgpu) (GitHub) (2022)
Feriphys is a crate for physics simulations in computer graphics applications. Features include deformable meshes, cloth simulation, flocking, CPU-bound particle simulations, smoothed particle hydrodynamics, and rigid body dynamics. It is based on “Foundations of Physically Based Modeling and Animation” by Donald House and John C. Keyser; I developed it under the tutelage of the latter.
The simulations are rendered in a custom engine built using wgpu.
This project is pedagogical, with some known issues that I don't have plans to fix (if I return to this code, it will be for a rewrite in a new engine). For production applications, prefer an established engine like rapier.
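Most of these simulations share the same skeleton: accumulate forces, integrate the state one timestep, resolve collisions. A minimal sketch of that loop for a gravity-driven particle system using symplectic Euler integration (illustrative only, and far simpler than what the crate actually does):

```rust
#[derive(Clone, Copy)]
struct Particle {
    position: [f32; 3],
    velocity: [f32; 3],
    mass: f32,
}

/// Advance every particle one timestep with symplectic (semi-implicit) Euler:
/// update velocity from forces first, then position from the new velocity.
fn step(particles: &mut [Particle], dt: f32) {
    const GRAVITY: [f32; 3] = [0.0, -9.81, 0.0];
    for p in particles.iter_mut() {
        for axis in 0..3 {
            let force = GRAVITY[axis] * p.mass; // F = m * g
            p.velocity[axis] += force / p.mass * dt;
            p.position[axis] += p.velocity[axis] * dt;
        }
        // Crude ground-plane collision: clamp to the plane and damp the bounce.
        if p.position[1] < 0.0 {
            p.position[1] = 0.0;
            p.velocity[1] = -0.5 * p.velocity[1];
        }
    }
}

fn main() {
    let mut particles =
        vec![Particle { position: [0.0, 2.0, 0.0], velocity: [1.0, 0.0, 0.0], mass: 1.0 }; 10];
    for _ in 0..120 {
        step(&mut particles, 1.0 / 60.0);
    }
    println!("first particle after 2s: {:?}", particles[0].position);
}
```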
PixelFluidEngine: Fluid Surface Simulation via Heightfield Approximation (C++) (GitHub) (2020)
This is an older project of mine: an implementation of the iWave algorithm. It works well on the CPU, and has a functional-but-slow CUDA implementation (which was my introduction to programming on the GPU). I wouldn't necessarily use this implementation in production, but it is neat enough to share.
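The heightfield idea is to treat the water surface as a 2D grid of heights and propagate waves directly on it. The sketch below shows the flavor of it using a plain damped finite-difference wave equation; the iWave algorithm proper replaces the Laplacian with a vertical-derivative convolution kernel, which I've omitted here:

```rust
/// One timestep of a damped 2D wave equation on a heightfield.
/// h_prev and h hold the two previous states; returns the next state.
fn step_heightfield(
    h_prev: &[Vec<f32>],
    h: &[Vec<f32>],
    c: f32,
    dt: f32,
    dx: f32,
    damping: f32,
) -> Vec<Vec<f32>> {
    let (rows, cols) = (h.len(), h[0].len());
    let mut next = vec![vec![0.0f32; cols]; rows];
    for i in 1..rows - 1 {
        for j in 1..cols - 1 {
            // Discrete Laplacian of the height at (i, j).
            let lap = (h[i - 1][j] + h[i + 1][j] + h[i][j - 1] + h[i][j + 1]
                - 4.0 * h[i][j])
                / (dx * dx);
            // Verlet-style update from the current state, previous state, and curvature.
            let v = 2.0 * h[i][j] - h_prev[i][j] + c * c * dt * dt * lap;
            next[i][j] = v * damping;
        }
    }
    next
}

fn main() {
    let n = 64;
    let mut h_prev = vec![vec![0.0f32; n]; n];
    let mut h = vec![vec![0.0f32; n]; n];
    h[n / 2][n / 2] = 1.0; // a "droplet" disturbance in the middle

    for _ in 0..100 {
        let next = step_heightfield(&h_prev, &h, 1.5, 0.016, 0.1, 0.998);
        h_prev = h;
        h = next;
    }
    println!("center height after 100 steps: {}", h[n / 2][n / 2]);
}
```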
PalettePuzzle.com: Daily Color Mixing Puzzles (TypeScript, React + Next.js, HTML, CSS, PostgreSQL) (Website, GitHub) (2024)
A puzzle game similar to Wordle where the goal is to mix a target color from a palette in as few steps as possible. A cron job supplies new puzzles each day.
Color of Connection: Immersive Live Dance Performance Tech (TouchDesigner, Python, GLSL) (2023)
"The Color of Connection" was a performance done in collaboration with dance faculty at Texas A&M. The project explores choreography in an immersive 360 environment from Igloo Vision. Pairs of dancers interact with each other and their digital counterparts across tangible and virtual space.
I was responsible for all technical aspects of the production. Dancer motion was captured using a pair of Kinects, from which optical flow was calculated and used to drive a fluid simulation written in GLSL based on the stable fluids model. This fluid simulation was then displayed via the Igloo system to complement the dance performance.
(This is an interactive 360-degree video; scroll around!)
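The heart of the stable fluids model is semi-Lagrangian advection: each grid cell pulls its new value from wherever the velocity field says it came from one timestep ago, which is what makes the method unconditionally stable. A small Rust sketch of that step for illustration (the production version is GLSL inside TouchDesigner and includes the full diffusion and pressure-projection passes):

```rust
/// Bilinearly sample a scalar field at a (possibly fractional) grid position.
fn sample(field: &[Vec<f32>], x: f32, y: f32) -> f32 {
    let (rows, cols) = (field.len() as f32, field[0].len() as f32);
    let x = x.clamp(0.0, cols - 1.001);
    let y = y.clamp(0.0, rows - 1.001);
    let (i, j) = (y.floor() as usize, x.floor() as usize);
    let (fx, fy) = (x - j as f32, y - i as f32);
    let top = field[i][j] * (1.0 - fx) + field[i][j + 1] * fx;
    let bottom = field[i + 1][j] * (1.0 - fx) + field[i + 1][j + 1] * fx;
    top * (1.0 - fy) + bottom * fy
}

/// Semi-Lagrangian advection: each cell pulls its new value from where the
/// velocity field (u = x-velocity, v = y-velocity, in cells per unit time)
/// says the quantity was dt ago.
fn advect(field: &[Vec<f32>], u: &[Vec<f32>], v: &[Vec<f32>], dt: f32) -> Vec<Vec<f32>> {
    let (rows, cols) = (field.len(), field[0].len());
    let mut out = vec![vec![0.0f32; cols]; rows];
    for i in 0..rows {
        for j in 0..cols {
            let src_x = j as f32 - dt * u[i][j];
            let src_y = i as f32 - dt * v[i][j];
            out[i][j] = sample(field, src_x, src_y);
        }
    }
    out
}

fn main() {
    let n = 32;
    let mut dye = vec![vec![0.0f32; n]; n];
    dye[16][8] = 1.0;
    // Constant rightward flow of one cell per unit time.
    let u = vec![vec![1.0f32; n]; n];
    let v = vec![vec![0.0f32; n]; n];
    for _ in 0..8 {
        dye = advect(&dye, &u, &v, 1.0);
    }
    println!("dye at (16, 16): {}", dye[16][16]);
}
```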
360 Pedagogy: Interactive Narrative in Immersive 360 Environments for Education (Unity, C#, Igloo Core Engine) (2023)
In collaboration with Dr. Hwaryoung Seo and Michael Bruner, I created a Unity-based system for building interactive narrative projects for a 360-degree environment. Users interact with the 360-degree screen through a bespoke point-and-click system driven by VIVE controllers, and student developers can use our scaffolding to build arbitrary narrative graphs for 360 videos, with triggers for arbitrary effects and interactions. Students at Texas A&M used this system in the “Immersive Virtual Environments” course in 2023 to create narrative works around a theme of accessibility. A paper on the project and the associated pedagogy was accepted to ISEA 2024.