A surveillance installation that confronts the audience with the panopticon of the digital age

Highlight: Communication Arts 2019, Interactive shortlist

Collaboration - Tong Wu & Barak Chamo


In the age of mass surveillance and commoditized information, we are all, willingly or not, constantly watched. Our devices, online presence, transaction history and even physical presence have become assets for social networks, governments and corporations to own - we live in a digital panopticon devoid of consent and, in many cases, even awareness. As in Bentham’s architecture, the watchman’s all-seeing eye keeps us from misstepping and compels us to police ourselves.

Through Panopticon, we've created an experience of being both the observed and the observer. It is a critique and a wake-up call to the alarming prevalence of mass surveillance around us, to how tracking and surveillance devices have become largely invisible, and to how insensitive we have become toward the invasion of our privacy.

Inspired by the system of control of the same name, we created “Panopticon”, an interactive media sculpture that breaks the illusion of privacy and of control over our digital identity and physical presence. Multiple camera rigs monitor visitors throughout the exhibition space, tracking and capturing faces and collecting the facial data to be projected onto a semi-torus sculpture.
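The capture stage reduces to a simple per-camera loop: detect face bounding boxes, crop them out, and forward the crops for projection. A minimal sketch of that collection step, assuming a detector (such as OpenCV's Haar cascade) has already produced (x, y, w, h) boxes - the function names here are illustrative, not the installation's actual code:

```python
def crop_faces(frame, boxes):
    """Crop each detected face region out of a frame (a 2D grid of pixel rows).

    `boxes` are (x, y, w, h) tuples, e.g. the kind of output an OpenCV
    face detector like cv2.CascadeClassifier.detectMultiScale returns.
    """
    return [[row[x:x + w] for row in frame[y:y + h]] for (x, y, w, h) in boxes]


def collect_faces(camera_frames, detect):
    """Gather face crops from every camera rig for projection.

    `detect` is any function mapping a frame to a list of boxes, so the
    same loop works whether detection runs locally (OpenCV) or remotely
    (e.g. a cloud face-detection API).
    """
    faces = []
    for frame in camera_frames:
        faces.extend(crop_faces(frame, detect(frame)))
    return faces
```

In the real installation each rig feeds its own detector, but the fan-in pattern - many frames, one pool of face crops driving the projection - is the same.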




  • Laser-cut foam core

  • Acrylic thread

  • Apple Mac Minis

  • Logitech c920 webcams

  • LG Short-throw projector

  • DMX spotlights

  • Vectorworks

  • Max/MSP

  • MadMapper

  • Adobe Premiere CC

  • Amazon Rekognition (AWS)

  • OpenCV

  • Python 3

The Hand

1. Final Presentation

As a continuation of their midterm project, Tong and Nick decided to develop their idea into a rock-paper-scissors (RPS) robotic hand (and arm) that is more interactive and has more personality. The final presentation for this project was an installation in which users could simply walk up and play RPS against the robotic hand without wearing any extra devices.



2. Process Explanation


3. Documentation

(1) 3D-Printed Robotic Hand & Forearm

To build the hand and forearm, we used models from InMoov, an open-source model library for a 3D-printed, life-size robot. We used superglue and 3 mm filament to connect the finger joints, 3D-printed bolts to connect the forearm to the hand, and fishing line and servos to control finger movement.
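Since each finger is pulled by a servo via fishing line, every RPS pose reduces to a set of per-finger servo angles. A hypothetical sketch of that mapping - the 0°/180° open/curled endpoints and finger names are our assumptions for illustration, not InMoov's calibrated values:

```python
FINGERS = ("thumb", "index", "middle", "ring", "pinky")
OPEN, CURLED = 0, 180  # assumed servo end angles; real rigs need per-finger calibration


def gesture_angles(gesture):
    """Map a rock/paper/scissors pose to per-finger servo angles."""
    open_fingers = {
        "rock": set(),                    # all fingers curled
        "paper": set(FINGERS),            # all fingers open
        "scissors": {"index", "middle"},  # only index and middle open
    }[gesture]
    return {f: (OPEN if f in open_fingers else CURLED) for f in FINGERS}
```

On the actual hardware these angles would be written to the servos (e.g. over a PWM driver); the table-driven mapping keeps adding new poses trivial.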



(2) Robotic Hand & Forearm Assembly

Images: assembling the hand and forearm

The Original Version (November 2017):

(3) RPS Robotic Hand w/ Flex Sensitive Glove

We designed a flex-sensitive glove for this installation. Two flex sensors, a "start the game" button, and a "throw" button are all sewn onto a right-hand glove and connected to an Arduino Nano board, which uses an nRF24L01 wireless transceiver to talk to a matching transceiver on the robotic hand.

To play the game, the player puts on the glove, presses the "start the game" button, then presses the "throw" button twice and makes a hand gesture indicating their choice of rock, paper or scissors.
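The button sequence above amounts to a small state machine: start, two throws, then a gesture. The actual logic runs as Arduino firmware; this is a sketch of the same protocol in Python with illustrative names:

```python
from enum import Enum, auto


class Phase(Enum):
    IDLE = auto()   # waiting for "start the game"
    ARMED = auto()  # start pressed, counting throws
    READY = auto()  # two throws registered, awaiting a gesture


class RPSGame:
    """Tracks the glove's button protocol: start, throw twice, then gesture."""

    def __init__(self):
        self.phase = Phase.IDLE
        self.throws = 0

    def press_start(self):
        self.phase = Phase.ARMED
        self.throws = 0

    def press_throw(self):
        if self.phase is Phase.ARMED:
            self.throws += 1
            if self.throws >= 2:
                self.phase = Phase.READY

    def submit_gesture(self, gesture):
        """Accept rock/paper/scissors only after the full button sequence."""
        if self.phase is Phase.READY and gesture in ("rock", "paper", "scissors"):
            self.phase = Phase.IDLE
            return gesture
        return None
```

Gating the gesture behind the READY phase is what lets the robot's own countdown stay in sync with the player's throws.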


Images: sewing the sensors and other components onto the glove

Final Version (December 2017):

(4) Leap Motion

Image: Leap Motion

For a better user experience, we used a Leap Motion to replace the flex-sensor glove from the previous version of the RPS robotic hand. With it, users can activate the robotic hand simply by making certain gestures, such as waving.
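One common way to detect a wave is to count direction reversals in the palm's horizontal position over recent frames. A hedged sketch of that heuristic - the thresholds and the plain list-of-positions input are our assumptions, not the Leap Motion SDK's API or the project's actual values:

```python
def is_wave(xs, min_reversals=2, min_travel=30.0):
    """Heuristic wave detector over a sequence of palm x-positions (mm).

    A wave is registered when the hand changes horizontal direction at
    least `min_reversals` times, with each swing covering at least
    `min_travel` millimetres (filtering out small jitter).
    """
    reversals = 0
    direction = 0  # -1 = moving left, +1 = moving right, 0 = unknown
    anchor = xs[0] if xs else 0.0
    for x in xs[1:]:
        delta = x - anchor
        if abs(delta) < min_travel:
            continue  # ignore motion below the jitter threshold
        new_dir = 1 if delta > 0 else -1
        if direction and new_dir != direction:
            reversals += 1
        direction = new_dir
        anchor = x
    return reversals >= min_reversals
```

In practice the positions would come from the sensor's per-frame palm data; the reversal count is what distinguishes a wave from the hand simply passing through the field of view.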

(5) Redesigning the Forearm

Images: forearm design variations

We also redesigned the cover for the wrist and rearranged the motors so that a steel tube could run through the arm, allowing the hand to be mounted inside an acrylic enclosure and move up and down freely.

Images: running the tube through, drilling through the hand, building the enclosure

Two other elements were added to the design: a gesture display board and a set of LED lights. The display board lights up the icon corresponding to the user's gesture during the game, helping users see whether their gestures were detected correctly. The LEDs light up in turn as the user throws twice before making an RPS gesture.

Flappy Shadow

1. Final Presentation

"Flappy Shadow" is a project by Kai Hung and Tong Wu that allows users to create and fly a virtual bird from shadow puppets.

2. Idea Development

Kai was mainly responsible for designing the project's visuals. We went through several versions of the design and finally decided that the virtual bird should fly over a black background and mimic the hand gesture of a shadow bird. I focused on the coding side of the project. Originally we planned to use a Leap Motion combined with Wekinator to detect and differentiate users' hand gestures, but we found its detection range so limited that it severely hurt users' engagement and interaction. We finally settled on a Kinect and Kinectron combined with p5.js to achieve the expected outcome. The Kinect does an excellent job of tracking users' hand positions, but its sensitivity also gave us trouble, given that the show environment was very dark and crowded. Although we programmed the sketch to track only the closest body's hand position, it was still easily interrupted when people crowded into one small room.
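Filtering to the closest body amounts to picking the minimum-depth skeleton from the tracked bodies each frame. Our project did this in p5.js; here is the same idea as a simplified Python sketch, where each body dict is a stand-in for the joint data a Kinect/Kinectron feed provides (the field names are illustrative):

```python
def closest_hand(bodies):
    """Return the hand position of the body nearest the sensor, or None.

    Each body is a dict like {"z": depth_in_metres, "hand": (x, y)} -
    a simplified stand-in for a tracked skeleton's joint data.
    Bodies without a tracked hand are skipped.
    """
    tracked = [b for b in bodies if b.get("hand") is not None]
    if not tracked:
        return None
    return min(tracked, key=lambda b: b["z"])["hand"]
```

The failure mode we hit at the show follows directly from this rule: when several people stand at nearly the same depth, the minimum flips between bodies frame to frame, and the bird's controlling hand jumps with it.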

3. System Diagram


4. Source Code on GitHub

"Flappy Shadow"