Tom Shannon

Machine Learning, Technology and Art, Generative Installations, Impactful Software, Social Justice



I'm Tom — a multi-disciplinary technologist and artist. I currently work as a software developer and consultant at ThoughtWorks, based in Chicago. I specialize in machine learning, cloud computing, mobile development, and technology-driven art installations.

I attended Carthage College for my undergraduate degree and received my B.A. in physics and computer science. During my studies, I helped lead several NASA-funded research projects, which involved building suborbital rocketry payloads, high-powered rockets, and a small Earth-observing satellite.

Since then, I have collaborated with artists to create unique technology-driven art installations that push the boundaries of what art can do. These works include an emotion-driven film, eye gaze tracking on mobile devices, and the re-creation of lost ancestry records using generative adversarial networks. Some of these pieces have gone on to exhibit at places such as the Cooper Hewitt Smithsonian Museum, ARS Electronica, BOZAR, and Science Gallery Dublin. For future projects, I am interested in continuing to create generative artwork that addresses social, economic, and cultural issues in our society.


Here are a few projects I'm proud to have worked on!


Face Values: Exploring Artificial Intelligence
New York City, New York
September 2019 - March 2020

STARTS Prize — Honorary Mention
Linz, Austria
September 9th - 13th, 2020

BOZAR 2020
Speculating on the Future Through Art And Science
Brussels, Belgium
September 26th - October 25th, 2020

SXSW 2020 (Cancelled due to COVID)
Immersive Futures Lab
Austin, Texas
March 17th - 18th, 2020

Project Credits

Director & Producer: Karen Palmer

Commissioned by: Ellen Lupton, Senior Curator of Contemporary Design, The Cooper Hewitt Smithsonian Design Museum

Software Development Coordination: ThoughtWorks Arts Program Director Andrew McWilliams, Julien Deswaef & The ThoughtWorks Arts Team

Development Team: Tom Shannon, Emilio Escobedo, Lauren O’Neal, Dan Lewis-Toakley, J.C. Holder, Stephanie Weber, Peter Graves, Lee Faria, Diana Gámez Díaz, Emily Sachs, Whelan Workmaster, Andrew Zou, Ling Tran, Margaret Plumley, Megan Andrea Louw

R&D: Emily Balcetis, Associate Professor of Psychology at NYU and Lab Director of The SPAM (Social Perception Action & Motivation) Lab

Film Production Team: She Shot Me Films

Sound Design: Mike Wyeld

Actors: Police Officer Michael Mirlas, Black Male Hassan Farrow, White Male Jeremie Egiazarian, Background Female Christin Johnson

Perception iO: Emotion Driven Film

Perception iO is an immersive storytelling experience that reveals how your emotions can influence your perception of reality. Perception iO puts you in the point of view of a police officer wearing a body camera. The footage from the body camera will be used to train an artificial intelligence system for the future of law enforcement. In this film you are put into two volatile situations with characters of different races. Based on your emotional response, the interactive film narrative changes, which ultimately affects the characters you see in the film. Through this experience, participants become aware of their own subconscious bias, specifically towards race, and how it can affect the training of the artificial intelligence of the future. In doing so, Perception iO calls for regulation and democratization in the artificial intelligence sector today.
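The branching described above — an emotion reading steering the film — can be sketched in a few lines. This is purely illustrative: the function name, segment labels, and scores are hypothetical stand-ins, not the production code; only the three emotion labels (anger, fear, calm) come from the project itself.

```python
# Illustrative sketch of emotion-driven film branching. The segment
# names and the select_branch helper are hypothetical; only the emotion
# labels (anger, fear, calm) match the actual project.

def select_branch(emotion_scores):
    """Pick the next film segment based on the dominant detected emotion."""
    dominant = max(emotion_scores, key=emotion_scores.get)
    branches = {
        "anger": "escalation_cut",
        "fear": "retreat_cut",
        "calm": "deescalation_cut",
    }
    return branches.get(dominant, "neutral_cut")

print(select_branch({"anger": 0.1, "fear": 0.2, "calm": 0.7}))
# A calm viewer is routed to a different narrative cut than a fearful one.
```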

Personal Accomplishments

  • Led software development on the project from inception, alongside other core developers at ThoughtWorks Arts Research Labs

  • Installed Perception iO at BOZAR, ARS Electronica, and Cooper Hewitt Smithsonian Museum exhibitions

  • Built emotion recognition models to detect anger, fear, and calm responses in participants

  • Working with the artist to expand Perception iO to web and mobile versions after increased demand for exhibitions

Learn More


Machine Learning for Creativity and Design
September 2019 - March 2020

Winning Entry
September 9th - 13th, 2020

Project Credits

ThoughtWorks Artist in Residence: Nouf Aljowaysir

ThoughtWorks Arts Residency Director: Andy McWilliams

Development Team: Austin Garrard, Vini Macedo, Nikola Savic, Tom Shannon, Rohit Naidu, Alwina Oyewoleturner, Ellen Pearlman, Shraddha Surana

Learn More

Salaf [سلف]

Salaf [سلف], a piece envisioned by Nouf Aljowaysir and created through the ThoughtWorks Arts Residency Program, is an ongoing exploration of using generative adversarial networks (GANs) to investigate cultural transmission across generations and to help construct a story of the artist's heritage. Over the course of the exploratory research period, it became evident that many of the open source datasets we were initially using were biased and white-washed. With the limited availability of images of non-western Arab people, it was difficult to train high-resolution models that could generate the scenes needed to convey the artist's vision. While experimenting, we found it easier to generate interesting images when training on images that cut the individuals out using person segmentation; what remained was a silhouette of past records of her ancestral lineage. This additionally helped depict the eradication of the artist's ancestral collective memory.
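The silhouette step can be sketched in miniature. In the real project the per-pixel person mask came from a pretrained segmentation model and the images were full photographs; here the image and mask are tiny hard-coded grids, and the `cut_out_person` helper is an illustrative name, not the project's code.

```python
# Minimal sketch of the silhouette effect: given a per-pixel person
# segmentation mask, the person's pixels are blanked out, leaving a
# silhouette against the original background. Image, mask, and the
# cut_out_person helper are all illustrative stand-ins.

def cut_out_person(image, person_mask, fill=255):
    """Replace every pixel flagged as 'person' with a flat fill value."""
    return [
        [fill if person_mask[y][x] else image[y][x]
         for x in range(len(image[0]))]
        for y in range(len(image))
    ]

# 3x3 grayscale image with a "person" occupying the center column.
image = [[10, 20, 30],
         [40, 50, 60],
         [70, 80, 90]]
mask = [[0, 1, 0],
        [0, 1, 0],
        [0, 1, 0]]

silhouette = cut_out_person(image, mask)
print(silhouette)  # center column becomes a blank silhouette (255)
```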

Personal Accomplishments

  • Trained generative adversarial network (GAN) models on images of non-western Arab figures across five generations to help the artist in residence explore ways of generating images that represent her family's ancestral story

  • Identified shortcomings in open source image recognition models such as AWS Rekognition, which classified many images of non-western figures as "soldier," "armored," and "military" — highlighting the danger of biased models

  • Acted as a technical consultant to guide the artist in residence through different ways of using GANs to convey the message she wanted to share with her work

Inquisitive Introspection

Inquisitive Introspection is an installation that forces participants to ask difficult questions about their lives, posed by their future self. The installation consists of a double-sided mirror that the participant approaches. When they approach, an image is taken of them. A generative adversarial model then produces an image of their future self, which is projected through the double-sided mirror by an interior display. This image is animated with a first-order motion model to ask questions such as: Are you happy with what you are doing in your life? If not, what could you do to change it? After the line of conversation, the projection of the participant's future self vanishes. The participant is left looking at themselves in the mirror to think about these questions.

Personal Accomplishments

  • Architected a cloud-based software pipeline to create animations of a participant's future self using open source libraries and generative models

  • Integrated a Raspberry Pi into a double-sided smart mirror that interacts with the cloud-based pipeline to relay videos of a participant's future self

  • Designed an intimate experience for participants to encounter their future selves, including an emotional decompression conversation to discuss the participant's experience


Click for More Details

A link to my undergraduate thesis in physics and computer science

besopy: A Python Library for 3D Topology Optimization Problems

For my undergraduate thesis, I created software to solve 3D structural topology optimization problems using the bi-directional evolutionary structural optimization (BESO) method. This code was based on work done for the 2D case by Xie and Huang in the book Evolutionary Topology Optimization of Continuum Structures: Methods and Applications. This work included building a finite element analysis suite for hexahedral elements as well as the optimization methods. Future work is planned to improve performance using cloud computing GPUs and to explore the use of generative models to create similar structures in far less time.
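The core BESO idea — removing inefficient elements and re-adding efficient ones each iteration — can be shown with a toy update rule. This is a sketch, not besopy itself: it assumes per-element sensitivities (e.g., strain energy from an FEA solve) are already computed, works on a flat list rather than a 3D hexahedral mesh, and omits the sensitivity filtering and evolution-ratio scheduling a real BESO implementation uses.

```python
# Toy sketch of one BESO design update. Assumes element sensitivities
# (e.g., strain energy per element from a prior FEA solve) are given.
# Real BESO on a 3D hex mesh also filters sensitivities and moves the
# volume target gradually; both are omitted here for clarity.

def beso_update(design, sensitivities, target_solid):
    """Keep the target_solid highest-sensitivity elements solid.

    design:        list of 0/1 flags (1 = solid, 0 = void)
    sensitivities: one sensitivity value per element
    target_solid:  number of solid elements to keep this iteration
    """
    # Rank all elements by sensitivity, highest (most load-bearing) first.
    ranked = sorted(range(len(design)),
                    key=lambda i: sensitivities[i], reverse=True)
    new_design = [0] * len(design)
    # Solidifying the top-ranked elements both removes low-sensitivity
    # solids and re-adds high-sensitivity voids — the "bi-directional" part.
    for i in ranked[:target_solid]:
        new_design[i] = 1
    return new_design

design = [1, 1, 1, 1, 0, 0]
sens = [9.0, 1.0, 8.0, 2.0, 7.0, 0.5]
print(beso_update(design, sens, 3))  # -> [1, 0, 1, 0, 1, 0]
```

Note that element 4 starts void but is re-added because its sensitivity outranks two existing solids — the behavior that distinguishes BESO from the remove-only ESO method.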

Personal Accomplishments

  • Built finite element analysis to calculate stiffness for different materials in hexahedral elements using Lagrange shape functions

  • Implemented 3D case for BESO topology optimization that exported 3D VTK and STL models

  • Integrated solution to produce animations of design process and compared solution with an alternative method of optimization

  • Wrote undergraduate thesis demonstrating results of comparison


Click for More Details

A link with description of RockSat-X program on NASA page

Personal Accomplishments

  • Designed an experimental payload that flew aboard a NASA Improved Malemute Sounding Rocket

  • Recorded very low frequency (VLF) electromagnetic waves from lightning strikes in the upper atmosphere

  • Built high-bandwidth, multi-channel data acquisition software that sampled analog signals from magnetic loop and electric plate antennas

  • Integrated payload into rocket with NASA engineers at Wallops Flight Facility

Suborbital Rocketry Payload Development: Observing VLF Electromagnetic Waves from Lightning

This was a project conducted in my first year of undergraduate study, during which I helped build an experimental payload that flew aboard a NASA Malemute Sounding Rocket. The experiment my team and I built was a receiver that collected very low frequency electromagnetic waves from lightning signals in the upper atmosphere.


Click for More Details

A link to the poster I presented at the 33rd Annual American Society for Gravitational and Space Research Conference, Seattle, WA (2017)

Personal Accomplishments

  • Helped lead the design and procurement of off-the-shelf electronics for a NASA-funded CubeSat

  • Developed software for the push-broom multispectral camera's in-orbit operational modes

  • Wrote isothermal simulations of the electronics and satellite shell in low Earth orbit

Canopy Near Infrared Observing Project (CaNOP): Small Satellite Development

The CaNOP CubeSat mission is a student-led project to build a small satellite that takes multispectral images of the Earth with resolution similar to large-scale satellites such as Landsat. The system is composed of off-the-shelf units integrated together. During this project, I played many roles, including creating thermal simulations of the CubeSat and its electronic parts. I also helped create software for the operational modes that the multispectral camera uses to determine its behavior in orbit.