Machine Learning, Technology and Art, Impactful Software, Social Justice
I'm Tom — a multi-disciplinary technologist and artist. I currently work as a software developer and consultant at ThoughtWorks, based in Chicago. I specialize in machine learning, cloud computing, and helping other artists create technology-driven art installations.
I attended Carthage College for my undergraduate degree, receiving a B.A. in physics and computer science. During my studies, I helped lead several NASA-funded research projects involving suborbital rocketry payloads, high-powered rockets, and a small Earth-observing satellite.
Since then, I have collaborated with other artists, creating innovative software for their art installations, including an emotion-driven film and the recreation of lost ancestry records with generative adversarial networks. Some of these pieces have gone on to exhibit at the Cooper Hewitt Smithsonian Museum, Ars Electronica, and BOZAR. For future projects, I am interested in creating artwork that addresses social, economic, and cultural issues through new media techniques that mediate human experience.
COOPER HEWITT SMITHSONIAN MUSEUM
Face Values: Exploring Artificial Intelligence
New York City, New York
September 2019 - March 2020
Speculating on the Future Through Art and Science
September 26th - October 25th, 2020
Director & Producer: Karen Palmer
Commissioned by: Ellen Lupton, Senior Curator of Contemporary Design, The Cooper Hewitt Smithsonian Design Museum
Software Development Coordination: ThoughtWorks Arts Program Director Andrew McWilliams, Julien Deswaef & The ThoughtWorks Arts Team
Development Team: Tom Shannon, Emilio Escobedo, Lauren O’Neal, Dan Lewis-Toakley, J.C. Holder, Stephanie Weber, Peter Graves, Lee Faria, Diana Gámez Díaz, Emily Sachs, Whelan Workmaster, Andrew Zou, Ling Tran, Margaret Plumley, Megan Andrea Louw
R&D: Emily Balcetis, Associate Professor of Psychology at NYU and Director of the SPAM (Social Perception Action & Motivation) Lab
Film Production Team: She Shot Me Films
Sound Design: Mike Wyeld
Actors: Police Officer Michael Mirlas, Black Male Hassan Farrow, White Male Jeremie Egiazarian, Background Female Christin Johnson
Perception iO is an immersive storytelling experience that reveals how your emotions can influence your perception of reality. Perception iO puts you in the point of view of a police officer wearing a body camera; the body-camera footage will be used to train an artificial intelligence system for the future of law enforcement. In the film, you are placed in two volatile situations with characters of different races. Based on your emotional response, the interactive narrative changes, which ultimately affects the characters you see in the film. Through this experience, participants become aware of their own subconscious bias, specifically towards race, and how it can affect the training of the artificial intelligence of the future. In this way, Perception iO calls for regulation and democratization in today's artificial intelligence sector.
Showcasing November 29th
A link to my undergraduate thesis for physics and computer science
For my undergraduate thesis, I created besopy, a Python package for running 3D structural topology optimization problems using the bi-directional evolutionary structural optimization (BESO) method. The code builds on the 2D implementation by Huang and Xie in their book, Evolutionary Topology Optimization of Continuum Structures: Methods and Applications. This work included building a finite element analysis suite for hexahedral elements. Future work will focus on improving performance using cloud computing and on the use of generative models.
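The core of a BESO scheme is a simple update loop: each iteration, an FEA solve assigns every element a sensitivity (typically its strain energy density), and the design keeps only the most sensitive elements while the allowed material volume shrinks toward a target fraction. Here is a minimal sketch of that update, with random placeholder sensitivities standing in for the FEA solve — the names and parameters are illustrative, not the besopy API:

```python
import numpy as np

def beso_update(alpha, x, v_target, er=0.02):
    """One BESO design update: keep the elements with the highest
    sensitivities `alpha`, shrinking the solid volume fraction by the
    evolution ratio `er` each step until `v_target` is reached."""
    v_next = max(v_target, x.mean() * (1.0 - er))    # volume fraction for next step
    n_keep = max(1, int(round(v_next * alpha.size))) # number of solid elements
    threshold = np.sort(alpha)[-n_keep]              # sensitivity cutoff
    return (alpha >= threshold).astype(float)        # 1 = solid, 0 = void

# Toy run: 100 elements, starting fully solid. Real code would recompute
# sensitivities from an FEA solve each iteration; random values stand in here.
rng = np.random.default_rng(0)
x = np.ones(100)
for _ in range(40):
    alpha = rng.random(100)  # placeholder for strain-energy-density sensitivities
    x = beso_update(alpha, x, v_target=0.5)
print(x.mean())  # -> 0.5, the target volume fraction
```

Because the top-n selection ignores an element's current state, a void element whose sensitivity rises can be re-admitted as solid — this is what makes the method bi-directional rather than purely evolutionary (removal-only).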
A link with description of RockSat-X program on NASA page
This was a project conducted during my first year of undergraduate study, in which I helped build an experimental payload that flew aboard a NASA Malemute sounding rocket. The experiment my team and I built was a receiver that collected very-low-frequency electromagnetic waves produced by lightning in the upper atmosphere.
A link to the poster I presented at the 33rd Annual American Society for Gravitational and Space Research Conference, Seattle, WA (2017)
The CaNOP CubeSat mission is a student-led project to build a small satellite that takes multispectral images with resolution similar to large-scale satellites such as Landsat. The system is composed of commercial off-the-shelf units integrated together. During this project I played many roles, including creating thermal simulations of the CubeSat and its electronics. I also helped create software for the operational modes the multispectral camera uses to determine its behavior in orbit.