I am currently engaged in new project planning with multiple existing clients. I am also available for new projects and proposals. Please get in touch via the “Contacts” page.
Role: Software Lead
Location: Google Headquarters, Mountain View, CA
Client: SenovvA
Work: A custom sign for Google in the lobby of one of the buildings at their campus headquarters. The sign uses individually controlled LEDs embedded in the iconic Google “G” to deliver a unique visual experience.
Throughout the day, the sign changes appearance based on the current time: a sunrise in the morning, animated rainbows during the day, a sunset at dusk, and starshine throughout the night.
Technology: I architected and developed all of the software for the project, using Node.js and DMX to drive the sign via the ENTTEC Pixelator.
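As a rough sketch (not the production code), the time-of-day behavior reduces to an hour-based scene lookup driving a render loop. The scene names below come from the project; the scheduling logic and the stubbed-out DMX frame output are illustrative assumptions.

const SCENES = [
  { name: 'sunrise',   from: 5,  to: 9 },
  { name: 'rainbows',  from: 9,  to: 17 },
  { name: 'sunset',    from: 17, to: 21 },
  { name: 'starshine', from: 21, to: 29 }, // wraps past midnight, back to sunrise
];

// Treat early-morning hours as hour + 24 so the overnight "starshine"
// range is one contiguous window.
function currentScene(date = new Date()) {
  const h = date.getHours() < 5 ? date.getHours() + 24 : date.getHours();
  return SCENES.find(s => h >= s.from && h < s.to).name;
}

// Stub: in the real installation, a frame of per-LED channel values would be
// rendered for the active scene and pushed to the Pixelator over DMX.
function sendFrame(scene) {
  console.log(`rendering a "${scene}" frame`);
}

setInterval(() => sendFrame(currentScene()), 1000 / 30); // ~30 fps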
Role: Software Lead
Location: Red Hook, Brooklyn, NY
Client: A_DA
Work: Technoechophenomena is an interactive audio and light experience, based on the music and work of Moses Sumney. It was created by Sumney in concert with A_DA, the interactive arm of Listen, with technical sponsorship from Microsoft.
I contributed to the project as the Kinect Developer and Software Systems Architect. Using four Azure Kinect cameras, I parsed data about user location and body position, including writing a custom pose detector with the C# Azure Kinect API. User data from each camera was then merged into a single unified feed.
That unified feed drove a custom cue management system, written in Node.js, which forwarded cues in real time to the audio and lighting systems using OSC and sACN, respectively.
Technology: C#, Azure Kinect API, Node.js, OSC, sACN
Publication: Moses Sumney’s technoechophenomena As Therapy
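A simplified sketch of the cue manager's fan-out pattern, with the OSC and sACN transports stubbed out; the cue names and shapes here are hypothetical, not the production protocol.

const EventEmitter = require('node:events');
const cues = new EventEmitter();

// Every cue is forwarded to both outputs: audio over OSC, lighting over sACN.
cues.on('cue', cue => {
  sendOsc(`/cue/${cue.name}`, cue.intensity); // stub for the OSC client
  sendSacn(cue.universe, cue.channels);       // stub for the sACN sender
});

function sendOsc(address, ...args) { console.log('OSC ', address, args); }
function sendSacn(universe, channels) { console.log('sACN', universe, channels); }

// The unified body-tracking feed (merged from the four cameras) raises cues:
cues.emit('cue', { name: 'arms-raised', intensity: 0.8, universe: 1, channels: [255, 128, 0] });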
Role: Technology Lead
Location: National Cowboy Museum, Oklahoma City, OK
Client: Superfly
Work: An interactive mosaic wall consisting of a large Planar LED array, two photo kiosks, and an array of ZED spatial sensors. Users can either take their portrait at a kiosk and send it to the mosaic on the wall, or simply walk in front of the wall to interact with it in real time.
I managed a team of two additional interactive developers, who created the wall and the kiosk visual content. I designed the software architecture, wrote the software to parse and use the spatial sensor data, and created a middleware layer to allow all of these components to communicate in real time.
Technologies: Python, Node.js, and numerous Linux-based admin scripts. WebSockets were used to communicate with the HTML/CSS kiosks and the TouchDesigner-based wall.
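A minimal sketch of that middleware layer, assuming the Node.js `ws` package; the real message schema and sensor parsing are omitted.

const { WebSocket, WebSocketServer } = require('ws');

const wss = new WebSocketServer({ port: 8080 });

// The kiosks and the TouchDesigner wall all connect as WebSocket clients;
// each parsed sensor event is broadcast to every connected client.
function broadcast(event) {
  const msg = JSON.stringify(event);
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) client.send(msg);
  }
}

// Illustrative: a parsed spatial-sensor reading pushed to all displays.
broadcast({ type: 'presence', sensor: 3, x: 0.42, y: 0.17 });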
Role: Project Lead, Concept and Software Development
Location: YouTube
Client: The Dandy Warhols
Work: I created the concept, design, and implementation for The Dandy Warhols’ video “Motor City Steel”. With the help of editor Dan Scofield, I used green screen footage of the band in combination with OpenFrameworks, camera vision, and a number of photo and video sources to create an animated vision of the story of “Travis and Ricki” and their attempts at love. The video received over 15,000 views in its first week and over 285,000 views so far, and marked the start of the band’s 25th anniversary tour.
Video: The Dandy Warhols - Motor City Steel
Technology: OpenFrameworks, Camera Vision, Face Tracking
As Lead Interactive Developer, I architected and designed the technology platform for a mosaic "wall" of 86 Samsung devices for the Green Room at the 86th Academy Awards. The devices ran a custom Android app that synchronized their output and allowed for unified content transitions across devices.
Technologies Used: Android, Java, Processing
Role: Software Lead
Client: Office Of Things
Location: Charlottesville, VA
Work: With JT Bachman and Katie Stranix of Office of Things, I created a new immersive experience for the campus of the University of Virginia’s School of Architecture. “Stella” provides the user with music and lighting in an enclosed space, intended to engender safety, relaxation, and thought.
Input from the user via physical push buttons triggers LED animations in Stella’s immersive dome, while an audio port offers sonic isolation via compositions created by Bachman. The animations were defined in a collaborative, iterative process between Bachman and me.
Technology: Arduino Mega, NeoPixel LED strands, MP3 shield
Publication: Stranix and Bachman's Office of Things Present Stella, a Study of Immersive Restorative Space
Role: Software Lead
Location: Brookfield Place, New York, NY
Client: Rockwell Group LAB
Work: Brookfield Place Luminaries is an iconic holiday light installation designed by the LAB at Rockwell Group. As Lead Interactive Developer on the project, I designed the software architecture, capacitive cube interaction, and the content for the 2016 and 2017 seasons.
Publications:
The New York Times - Basking in a New Holiday Glow, No Evergreen Needed
Architectural Digest - David Rockwell on the Importance of Human Connection in Design
Inhabitat - Dazzling canopy of holiday lanterns returns to the Winter Garden at Brookfield Place
Technology: DMX, Processing
Role: Software Lead
Location: YouTube Headquarters, San Bruno, CA
Client: Office Of Things
Work: Soft Screen is a responsive architectural partition designed for the main lobby space of the offices of a media company in California. The project, which measures 120’ wide and 40’ tall, features over 30,000 LEDs hidden behind a fuzzy acoustical membrane. The overall effect is a seamless integration of the analog architecture of the lobby space and the digital platform of the company it represents. By standing on medallions distributed throughout the lobby, visitors trigger interactive features within the wall, making the experience their own for a few moments.
Technology: Electron.js, Node.js, Processing.js, YouTube iFrame API
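As an illustration of the playback side (the wiring below is an assumption, not the project's actual code), an Electron renderer can drive a player through the YouTube iFrame API, with a medallion event cueing an interactive moment.

let player;

// Standard iFrame API entry point, called once the API script has loaded.
function onYouTubeIframeAPIReady() {
  player = new YT.Player('wall-panel', {
    videoId: 'VIDEO_ID', // placeholder
    events: { onReady: () => player.mute() },
  });
}

// Hypothetical feed from the floor medallions: stepping on one cues the wall.
const medallions = new WebSocket('ws://localhost:8080');
medallions.onmessage = ({ data }) => {
  if (player && JSON.parse(data).type === 'medallion-step') player.playVideo();
};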
Based on an original design from Hecho, Inc., and in collaboration with Dan Scofield and Tucker Viemeister, we created two iconic LED walls for the Brooklyn nightclub Baby’s All Right.
Technologies Used: Addressable LEDs, Processing
Role: Software Lead, Senior Developer
Location: Xfinity Retail Stores
Client: Elephant
Work: A suite of games for Xfinity retail stores. The games recall 1980s arcade video game experiences, letting users learn about Xfinity products while having fun at the same time.
In the first version of the app, I was a senior developer and worked on implementation and optimization of the games themselves. In the second version, I took on the role of lead developer. In addition to managing the software team and its deliverables, I spent considerable time optimizing and rearchitecting the React codebase.
Technology: HTML, CSS, and the React framework.
Role: Software Lead
Location: YouTube Headquarters, San Bruno, CA
Client: Office Of Things
Work: The Meditation Chambers – Arches, Skydome & Horizon – are a series of approximately 100-square-foot spaces completed for a tech company in San Francisco.
They provide an intense immersive experience: a concentration of color, light, and sound in spaces made specifically for workplace escape, both mental and physical. They feel bigger than they seem from the outside, a byproduct of their unassuming exterior, compressed entry sequence, and the visual interplay of light and sound within.
Each room is a built illustration of the expansive magic that punctuates daily life for those who manage to slow down and open themselves to quiet contemplation and subtle shifts of the environment.
Technology: Arduino, Addressable LEDs, MP3 Shield
Photos: Tom Harris
Role: Software Lead
Location: YouTube Headquarters, San Bruno, CA
Client: Office Of Things
Work: How does a media company create separation between spaces? How can a wall be both visible with video content and invisible when turned off? How does a constantly changing company use a screen without looking immediately dated? Screen Play is an immersive and interactive architectural partition designed for tech offices in California. Its purpose is multifold but ultimately quite simple: when a design can remove the frame from a screen and from the imagery it contains, the technology fades and only the video remains.
Screen Play proposes that by breaking the image into many vertical screens, what we see can be boundless, provide transparency, and remain relevant; it can be a partition, a video wall, and with reflective materials, disappear altogether.
Technology: Electron.js, Node.js, Processing.js, YouTube iFrame API
Photos: Tom Harris
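A sketch of the core idea, with all dimensions invented for illustration: each vertical panel crops its own slice of one full-width frame, and the slice offsets account for the physical gaps so imagery appears to pass behind them rather than stretch across them.

const PANELS = 12;   // number of vertical screens (illustrative)
const PANEL_W = 160; // visible width of each panel, in source pixels
const GAP = 40;      // physical gap between panels, in source pixels

// The source-x offset for panel i includes the gaps, so one boundless image
// spans the whole partition without distortion.
function sourceOffset(i) {
  return i * (PANEL_W + GAP);
}

for (let i = 0; i < PANELS; i++) {
  console.log(`panel ${i}: crop x=${sourceOffset(i)}, w=${PANEL_W}`);
}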
Role: Creative Technology Advisor
Location: Los Angeles, CA
Client: Superfly
Work: I provided creative technology consulting to the Superfly executive team for the implementation of the Netflix Squid Game Experience.
Responsibilities included reviewing budget and software design documents, identifying potential discrepancies, and meeting with development teams to review software architecture decisions and intentions.
My feedback and review allowed the Superfly executive team to demonstrate to their client (Netflix) sufficient third-party oversight of the project’s creation process.
I was the Lead Interactive Developer for the Rockwell Group technology team that implemented multiple galleries for the Hudson Yards Experience Center. This included software architecture, coordination of multiple large screen displays, inter-room communication, dynamic DMX LED lighting, and more.
Technologies Used: Node.js, DMX, HTML5, CSS
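One plausible shape for the inter-room messaging (an illustrative assumption, not the production design): each room's machine joins a UDP multicast group and publishes cue events that displays and lighting controllers react to.

const dgram = require('node:dgram');

const GROUP = '239.1.1.1';
const PORT = 5000;

const socket = dgram.createSocket({ type: 'udp4', reuseAddr: true });
socket.bind(PORT, () => socket.addMembership(GROUP));

// React to cues announced by other rooms.
socket.on('message', buf => {
  const cue = JSON.parse(buf.toString());
  console.log(`room cue: ${cue.room} -> ${cue.event}`);
});

// Announce a cue to every room at once.
function announce(room, event) {
  const msg = Buffer.from(JSON.stringify({ room, event }));
  socket.send(msg, PORT, GROUP);
}

announce('gallery-2', 'visitors-entered');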
The Docks is a site-specific installation created by the Rockwell Group LAB. It uses LED panels and Surface tablets to create an interactive "dock" setting, where users can "make" virtual fish and flowers. As the Lead Interactive Developer on The Docks, I planned the software architecture and wrote four custom applications: a tablet-based app for creating virtual fish and flowers, a geometrically mapped animation of the river, animated trees with changing seasons, and a real-time camera vision engine.
Technologies Used: Processing, Spacebrew, Java, Camera Vision
Role: Senior Software Developer
Client: Elephant
Location: US XFinity Retail Stores
Work: With Elephant, I implemented an in-browser AR experience using the 8th Wall framework. A large-scale poster in Xfinity stores nationwide allowed users to take an augmented reality photo with the stars of the film Sing 2.
Technology: Backend Software Architecture, 8th Wall API, CSS, HTML, Asynchronous Messaging
While working at Rockwell Group, I helped create a hand-wired LED array for their concrete tray design for Wallpaper: Handmade 2016. Using programmable LEDs and a Teensy microcontroller, the tray could display multiple patterns corresponding to predetermined glassware locations.
Technologies Used: Teensy, Addressable LED Strips
My first project with Rockwell Group LAB, the DigiTree uses custom hanging enclosures to turn a large ficus tree into an interactive social network. Via a local photo booth, or through Twitter hashtags, users can see their content appear on screens in the tree in real time. As an Interactive Developer, my roles on the project included planning the software architecture, building a custom real-time content server, and managing a third-party developer in the creation of a custom CMS.
Technologies Used: Java, Node.js, Spacebrew
Role: Senior Creative Technologist
Location: Hudson Yards, New York, NY
Client: Rockwell Group LAB
Work: Over three years, between 2017 and 2020, I worked with the Rockwell Group LAB as the Senior Creative Technologist on the Hudson Yards Observation Deck. My roles included specifying and overseeing technology choices, programming a custom large-scale, multi-format LED array for the entry experience, participating in large cross-team organizational meetings, and helping to plan and supervise on-site installation.
Technology: Processing (Entry LED Control)
In 2012, I worked with Tucker Viemeister and JCDecaux on a winning technology bid for Los Angeles International Airport's Bradley Terminal. The work included ideation and brainstorming, as well as the formulation of a comprehensive document outlining the proposal.
Since 2012, I have worked on various occasions with the band Lucius to create reactive projections to accompany their music. The work has involved audio reactivity and projection mapping, and has appeared in music videos, as well as at live performances at New York’s Mercury Lounge and Bowery Ballroom.
Video: Don’t Just Sit There
Video: Turn It Around
Technologies Used: Processing, OpenFrameworks, MadMapper
I worked with Arisohn and Murphy to create multiple prototypes for an artist’s installation in New York. Both prototypes involved site-specific movement and presence detection, one using a light-based sensor and the other using camera vision. Both implementations then drove DMX controllers to trigger a spotlight when presence was detected.
Technologies Used: Arduino, DMX, OpenCV, Processing
I created GridMusic as a real-time interactive sound installation. Using a virtual grid overlaid on a public space, the system interprets movement in each node of the grid. Each node is also associated with a sine wave: when movement occurs in a node, that wave’s volume increases, and when the node is idle, it decreases. The result is a sonic landscape that represents the movement throughout the space and changes throughout the day.
Technologies Used: Java, Camera Vision, MIDI, OpenCV
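A browser-based sketch of the mapping, using the Web Audio API as a stand-in for the original Java and MIDI stack; grid size, frequencies, and thresholds are illustrative.

const ctx = new AudioContext();
const GRID = 4; // a 4x4 grid of nodes

// One oscillator and gain stage per grid cell, starting silent.
const cells = Array.from({ length: GRID * GRID }, (_, i) => {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.frequency.value = 110 * Math.pow(2, i / 12); // rising chromatic spread
  gain.gain.value = 0;
  osc.connect(gain).connect(ctx.destination);
  osc.start();
  return gain;
});

// Called once per video frame with per-cell motion amounts in [0, 1]:
// movement raises a cell's volume; idle cells decay back toward silence.
function update(motion) {
  cells.forEach((gain, i) => {
    const target = motion[i] > 0.1 ? 0.2 : 0;
    gain.gain.setTargetAtTime(target, ctx.currentTime, 0.5); // smooth ramp
  });
}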
Sine Beats is a live music performance and immersive experience. In it, I combine sound and light, using custom controllers to mix six pairs of closely attuned sine waves and their corresponding light sources. The volume of the two waves in each pair, and their proximity in frequency, creates an unmistakable "beating" effect that forms the backbone of the piece. As the volume and beating build, each controller also adjusts the strobing of a halogen lamp, creating a visual effect that mimics the generated audio in real time.
Technologies Used: Arduino, PureData
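Two sine waves a few hertz apart interfere to produce a slow amplitude swell at their difference frequency, so a 220 Hz wave and a 223 Hz wave beat three times per second. A browser sketch of one such pair, with the Web Audio API standing in for my actual signal chain:

const ctx = new AudioContext();

function sine(freq) {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.frequency.value = freq;
  gain.gain.value = 0.3;
  osc.connect(gain).connect(ctx.destination);
  osc.start();
  return osc;
}

// The pair's outputs sum at the destination, producing a 3 Hz beat.
sine(220);
sine(223);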
In the Headroom projects, I explored the synthesis of two subjects’ facial features in real time. The first (Mk. I) used digital cameras mounted on helmets, with a custom controller to adjust resolution in real time. The second (Mk. II), a collaboration with Luis Violante, explored the same principles, but with laser-cut, two-sided mirrors.
Technologies Used: OpenFrameworks, Arduino
With my collaborator Dan Scofield, I conceived an outdoor interactive soundscape that changed in real time based on movement within the space. Using multiple cameras for movement detection in a custom software platform, as well as a large-scale surround sound array routed through Max/MSP, we deployed the installation in a public garden as part of The Switched On Garden.
Technologies Used: Camera Vision, Processing, Max/MSP
Between 2006 and 2009, I was a Software Design Engineer for Dolby’s Digital Cinema platform, in the ShowStore group. Key responsibilities included adding support for GPIO, 3D content, and serial automation.
Technologies Used: Java, C, PostgreSQL