Tom Shannon

Machine Learning, Technology and Art, Impactful Software, Social Justice



I'm Tom — a multi-disciplinary technologist and artist. I currently work as a software developer and consultant at ThoughtWorks, based in Chicago. I specialize in machine learning, cloud computing, and helping other artists create technology-driven art installations.

I attended Carthage College for my undergraduate degree and received my B.A. in physics and computer science. During my studies, I helped lead several NASA-funded research projects involving suborbital rocketry payloads, high-powered rockets, and a small Earth-observing satellite.

Since then, I have collaborated with other artists to create innovative software for their art installations, including an emotion-driven film and a piece recreating lost ancestry records with generative adversarial networks. Some of these pieces have gone on to exhibit at the Cooper Hewitt Smithsonian Design Museum, Ars Electronica, and BOZAR. For future projects, I am interested in creating artwork that addresses social, economic, and cultural issues through new media techniques that mediate human experience.


Face Values: Exploring Artificial Intelligence
New York City, New York
September 2019 - March 2020

STARTS Prize — Honorary Mention
Linz, Austria
September 9th - 13th, 2020

BOZAR 2020
Speculating on the Future Through Art And Science
Brussels, Belgium
September 26th - October 25th, 2020

SXSW 2020 (Cancelled due to COVID)
Immersive Futures Lab
Austin, Texas
March 17th - 18th, 2020

Project Credits

Director & Producer: Karen Palmer

Commissioned by: Ellen Lupton, Senior Curator, Contemporary Design, The Cooper Hewitt Smithsonian Design Museum

Software Development Coordination: ThoughtWorks Arts Program Director Andrew McWilliams, Julien Deswaef & The ThoughtWorks Arts Team

Development Team: Tom Shannon, Emilio Escobedo, Lauren O’Neal, Dan Lewis-Toakley, J.C. Holder, Stephanie Weber, Peter Graves, Lee Faria, Diana Gámez Díaz, Emily Sachs, Whelan Workmaster, Andrew Zou, Ling Tran, Margaret Plumley, Megan Andrea Louw

R&D: Emily Balcetis, Associate Professor of Psychology NYU, Lab Director The SPAM (Social Perception Action & Motivation) Lab

Film Production Team: She Shot Me Films

Sound Design: Mike Wyeld

Actors: Police Officer Michael Mirlas, Black Male Hassan Farrow, White Male Jeremie Egiazarian, Background Female Christin Johnson

Perception iO: Emotion Driven Film

Perception iO is an immersive storytelling experience that reveals how your emotions can influence your perception of reality. It places you in the point of view of a police officer wearing a body camera, whose footage will be used to train an artificial intelligence system for the future of law enforcement. In the film you are put into two volatile situations with characters of different races. Based on your emotional response, the interactive narrative changes, which ultimately affects the characters you see in the film. Through this experience, participants become aware of their own subconscious bias, specifically toward race, and how it can affect the training of the artificial intelligence of the future. In doing so, Perception iO calls for regulation and democratization in the artificial intelligence sector today.
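The core mechanic described above — an interactive film that branches on the viewer's dominant detected emotion — can be sketched minimally. This is an illustrative toy, not the actual Perception iO codebase; the `Scene`, `pick_next_scene`, and scene names here are all hypothetical:

```python
# Hypothetical sketch of emotion-driven narrative branching: an engine
# selects the next film segment based on which emotion the recognition
# model scored highest for the viewer.
from dataclasses import dataclass, field

@dataclass
class Scene:
    name: str
    branches: dict = field(default_factory=dict)  # emotion -> next scene name

def dominant_emotion(scores: dict) -> str:
    """Return the emotion label with the highest classifier score."""
    return max(scores, key=scores.get)

def pick_next_scene(current: Scene, scores: dict) -> str:
    """Branch the narrative on the viewer's dominant emotional response."""
    emotion = dominant_emotion(scores)
    # Fall back to the "calm" branch if an emotion has no mapped scene.
    return current.branches.get(emotion, current.branches["calm"])

scene = Scene("traffic_stop", branches={
    "anger": "escalation_cut",
    "fear": "hesitation_cut",
    "calm": "deescalation_cut",
})
print(pick_next_scene(scene, {"anger": 0.1, "fear": 0.7, "calm": 0.2}))
# -> hesitation_cut
```

In the real installation the scores would come from an emotion recognition model running on a camera feed; here they are hard-coded to show only the branching logic.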

Personal Accomplishments

  • Led software development on the project from inception alongside other core developers at ThoughtWorks Arts Research Labs

  • Installed Perception iO at BOZAR, Ars Electronica, and Cooper Hewitt Smithsonian Design Museum exhibitions

  • Built emotion recognition models to detect anger, fear, and calm responses in participants

  • Working with the artist to expand Perception iO to web and mobile versions following increased demand for exhibitions

Learn More

Inquisitive Introspection

Showcasing November 29th

Inquisitive Introspection is an installation that confronts participants with difficult questions about their lives, posed by their future self. The installation consists of a double-sided mirror that the participant approaches. As they approach, an image is taken of them. Using a generative adversarial model, an image of their future self is generated and projected through the double-sided mirror by an interior display. This image is animated with a first-order motion model to ask questions such as: Are you happy with what you are doing in your life? If not, what could you do to change it? After this line of conversation, the projection of the participant's future self vanishes, and the participant is left looking at themselves in the mirror to reflect on these questions.
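The generation stages described above — capture an image, age the face with a generative model, animate it with a first-order motion model — can be sketched as a simple pipeline. The function names and the stubbed models below are assumptions for illustration, not the installation's real code:

```python
# Hypothetical sketch of the future-self pipeline: the captured face is
# first aged by a generative model, then the aged image is animated by a
# first-order motion model using a driving video of the spoken questions.
def future_self_pipeline(image, aging_model, motion_model, driving_video):
    """Generate animated frames of a participant's 'future self'."""
    aged = aging_model(image)                    # GAN: age the captured face
    frames = motion_model(aged, driving_video)   # first-order motion: animate it
    return frames

# Stub demonstration: real models would be pretrained neural networks.
aged_frames = future_self_pipeline(
    image="face.png",
    aging_model=lambda img: f"aged({img})",
    motion_model=lambda img, drv: [f"{img}+{f}" for f in drv],
    driving_video=["frame1", "frame2"],
)
print(aged_frames)
# -> ['aged(face.png)+frame1', 'aged(face.png)+frame2']
```

Keeping each stage a plain function makes it easy to swap in a different aging model or driving video without touching the rest of the pipeline.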

Personal Accomplishments

  • Architected a cloud-based software pipeline to create animations of a participant's future self using open source libraries and generative models

  • Integrated a Raspberry Pi into a double-sided smart mirror that interacts with the cloud-based pipeline to relay videos of a participant's future self

  • Designing an intimate experience for participants to encounter their future selves, including an emotional decompression conversation afterward to discuss the experience with the participant
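The mirror-side integration above can be sketched as a small client: the Raspberry Pi submits the captured photo to the cloud pipeline, then polls until the rendered video is ready. The `api` object, its `submit`/`poll` methods, and the job-status protocol are all assumptions standing in for the real service:

```python
# Hypothetical sketch of the Raspberry Pi client for the cloud pipeline:
# upload a photo, then poll until the future-self video has been rendered.
import time

def fetch_future_self(api, photo_bytes, poll_interval=2.0, timeout=120.0):
    """Upload a photo and poll the cloud pipeline until the video is ready."""
    job_id = api.submit(photo_bytes)            # start the generation job
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status, video = api.poll(job_id)        # e.g. ("done", bytes) when finished
        if status == "done":
            return video
        time.sleep(poll_interval)               # wait before asking again
    raise TimeoutError("future-self video was not ready in time")
```

Polling with a timeout keeps the installation responsive: if the cloud job stalls, the mirror can fall back gracefully instead of leaving the participant waiting indefinitely.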


Click for More Details

A link to my undergraduate thesis for physics and computer science

besopy: A Python Library for 3D Topology Optimization Problems

For my undergraduate thesis, I created besopy, a Python package for solving 3D structural topology optimization problems with the bi-directional evolutionary structural optimization (BESO) method. The code builds on the 2D implementation by Huang and Xie in their book Evolutionary Topology Optimization of Continuum Structures: Methods and Applications. This work included building a finite element analysis suite for hexahedral elements. Future work is planned to improve performance using cloud computing and generative models.
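The heart of BESO is a design update that ranks elements by sensitivity and keeps the most efficient ones. The sketch below is a deliberately simplified, hypothetical version of one such update; besopy's real loop also filters sensitivities across neighboring elements and evolves the volume fraction gradually over iterations:

```python
# Highly simplified sketch of one BESO design update (after Huang & Xie):
# rank elements by sensitivity (roughly, strain energy per unit volume)
# and keep the highest-ranked ones until the target volume is reached.
import numpy as np

def beso_update(sensitivities, target_fraction):
    """Return a 0/1 element design keeping the most efficient elements."""
    n_keep = int(round(target_fraction * sensitivities.size))
    # Elements with the largest sensitivity contribute most to stiffness.
    keep = np.argsort(sensitivities)[::-1][:n_keep]
    design = np.zeros(sensitivities.size, dtype=int)
    design[keep] = 1
    return design

sens = np.array([0.9, 0.1, 0.5, 0.7])
print(beso_update(sens, 0.5))  # keeps the two stiffest elements -> [1 0 0 1]
```

In the full method this update alternates with a finite element solve: the new design changes the stiffness matrix, which changes the sensitivities for the next iteration, "evolving" material away from low-stress regions.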

Personal Accomplishments

  • Built finite element analysis to calculate stiffness for different materials in hexahedral elements using Lagrange shape functions

  • Implemented 3D case for BESO topology optimization that exported 3D VTK and STL models

  • Integrated solution to produce animations of design process and compared solution with an alternative method of optimization

  • Wrote undergraduate thesis demonstrating results of comparison
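The hexahedral stiffness work mentioned above rests on the standard trilinear Lagrange shape functions for an 8-node brick element. As an illustration (not besopy's actual code), they can be evaluated at natural coordinates (xi, eta, zeta) in [-1, 1]^3 like this:

```python
# Sketch of the trilinear Lagrange shape functions for an 8-node
# hexahedral element: N_i = (1/8)(1 + xi_i*xi)(1 + eta_i*eta)(1 + zeta_i*zeta),
# where (xi_i, eta_i, zeta_i) are the natural coordinates of node i.
import numpy as np

# Natural coordinates of the 8 corner nodes of the reference hexahedron
NODES = np.array([[-1, -1, -1], [ 1, -1, -1], [ 1,  1, -1], [-1,  1, -1],
                  [-1, -1,  1], [ 1, -1,  1], [ 1,  1,  1], [-1,  1,  1]],
                 dtype=float)

def shape_functions(xi, eta, zeta):
    """Evaluate all 8 shape functions at one natural-coordinate point."""
    return 0.125 * ((1 + NODES[:, 0] * xi)
                    * (1 + NODES[:, 1] * eta)
                    * (1 + NODES[:, 2] * zeta))

N = shape_functions(0.0, 0.0, 0.0)
print(N)        # at the element centre each node contributes 1/8
print(N.sum())  # partition of unity: the functions always sum to 1.0
```

The element stiffness matrix is then assembled by integrating the shape-function derivatives (via the Jacobian) over Gauss quadrature points of the reference cube.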


Click for More Details

A link with description of RockSat-X program on NASA page

Personal Accomplishments

  • Designed an experimental payload that flew aboard a NASA Improved Malemute sounding rocket and recorded very low frequency (VLF) electromagnetic waves from lightning strikes in the upper atmosphere

  • Built high-bandwidth, multi-channel data acquisition software that sampled analog signals from magnetic loop and electric plate antennas

  • Integrated payload into rocket with NASA engineers at Wallops Flight Facility

Suborbital Rocketry Payload Development: Observing VLF Electromagnetic Waves from Lightning

This was a project conducted in my first year of undergraduate study, where I helped build an experimental payload that flew aboard a NASA Malemute sounding rocket. The experiment my team and I built was a receiver that collected very low frequency electromagnetic waves from lightning signals in the upper atmosphere.
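The core of the acquisition software described above is a loop that samples several analog channels at a fixed rate and stores timestamped frames. This is a hypothetical, simplified sketch; `read_channels` is a stub standing in for the real ADC driver on the payload:

```python
# Sketch of a multi-channel acquisition loop: each frame pairs a nominal
# sample timestamp with one reading per antenna channel (e.g. the
# magnetic loop and electric plate signals).
def acquire(read_channels, n_samples, sample_period):
    """Collect n_samples timestamped frames of multi-channel data."""
    frames = []
    for i in range(n_samples):
        t = i * sample_period                  # nominal time of this sample
        frames.append((t, read_channels()))    # one value per channel
    return frames

# Stub "ADC" returning fixed (magnetic, electric) channel values
data = acquire(lambda: (0.1, -0.2), n_samples=3, sample_period=1e-5)
print(len(data))  # -> 3
```

On real flight hardware the sampling would be driven by a hardware timer or DMA rather than a Python loop, so that the inter-sample spacing is deterministic at VLF bandwidths.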


Click for More Details

A link to the poster I presented at the 33rd Annual American Society for Gravitational and Space Research Conference, Seattle, WA (2017)

Personal Accomplishments

  • Helped lead design and procurement of off-the-shelf electronics for a NASA-funded CubeSat

  • Developed software for the push-broom multispectral camera's in-orbit operational modes

  • Wrote isothermal simulations of the electronics and satellite shell in low Earth orbit

Canopy Near Infrared Observing Project (CaNOP): Small Satellite Development

The CaNOP CubeSat mission is a student-led project to build a small satellite that takes multispectral images with resolution similar to that of large-scale satellites such as Landsat. The system is composed of off-the-shelf components integrated together. During this project I played many roles, most notably creating thermal simulations of the CubeSat and its electronic parts. I also helped create software for the operational modes the multispectral camera uses to determine its behavior in orbit.
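The isothermal simulations mentioned above amount to solving an energy balance for the satellite. As a back-of-the-envelope illustration (the real simulations track orbit geometry, eclipses, albedo, and Earth IR), absorbed solar power can be equated with radiated power to get an equilibrium shell temperature; the areas and coating properties below are illustrative guesses, not CaNOP's actual values:

```python
# Toy isothermal balance: alpha * S * A_sun = eps * sigma * A_rad * T^4,
# solved for the equilibrium temperature T of the satellite shell.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0         # solar constant at Earth, W m^-2

def equilibrium_temp(alpha, eps, area_sun, area_rad):
    """Isothermal equilibrium temperature in kelvin."""
    return (alpha * S * area_sun / (eps * SIGMA * area_rad)) ** 0.25

# Illustrative 3U CubeSat: ~0.03 m^2 sunlit face, ~0.14 m^2 radiating area
T = equilibrium_temp(alpha=0.6, eps=0.8, area_sun=0.03, area_rad=0.14)
print(round(T, 1))  # ~249 K for these illustrative numbers
```

Even this crude balance shows why surface coatings matter: the absorptivity-to-emissivity ratio alpha/eps sets the temperature scale, which is what the full simulations refine with time-varying orbital heat loads.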