Action: Jem-Jem
Experience Design: Exploring Tactile Communication through Repetitive Gestures (2023)
Rhino, Silicone Casting, Arduino, Force Sensor, TouchDesigner, After Effects

Concept Development & Material Exploration
In this phase, the "jem jem" gesture is transformed into a tactile device that facilitates visual communication. The design process involves selecting a finger puppet-like prototype, as it allows users to intuitively understand how to grip and interact with the device. Material exploration is also crucial, focusing on finding the right balance between softness and durability to replicate the gesture’s tactile feel while ensuring comfort and ease of use.

Prototype 1
In the concept development phase, Prototype 4 was chosen because its finger puppet-like design made it easy for users to intuitively understand how to grip and interact with the device.

3D Modeling in Rhino & Mold 3D Printing
The scale of the device was carefully considered to ensure it fits comfortably in the hand, with four indentations designed to position the fingers securely, providing both stability and a natural feel during use.

Prototype 4-1
Casting a Single Part

Prototype 4-2
Casting Two Parts
During the silicone casting process, several experiments were conducted to determine the most effective method. We tested casting the device as a single piece with the sensors embedded, as well as casting it in two separate parts that would be bonded together afterward. Additionally, we experimented with mixing glitter into the silicone to control firmness and tested using threads to achieve similar control.
The results showed that casting in two parts and then bonding them provided more stability, particularly in adjusting the sensor connections and placement. The glitter mixture proved challenging, as it tended to shift due to gravity, making precise positioning difficult. In contrast, mixing threads into the silicone was more stable and offered better control over the firmness of the device.

Project Overview
"Action: JemJem" is inspired by the traditional Korean gesture "jem jem," where parents repeatedly clench and unclench a baby's hand to encourage hand mobility and communication. This project explores how such simple yet meaningful actions can be translated into visual communication through a tactile device. Using TouchDesigner, Arduino, and tactile device, the design encourages repetitive actions of holding and releasing, making it suitable for individuals with handy-scale capabilities.
Prototype 2

Prototype 3

Prototype 4



Prototype 4-3
Casting with Thread


Prototype 4-4
Casting with Glitter


Checking Data Values with the Silicone Device
Using Data to Change Visuals

Sensor Integration and Testing
During this stage, the silicone device was connected to an Arduino and to TouchDesigner so that the values from the force sensors embedded within it could be translated into visual changes. This setup aimed to encourage users to engage in the repetitive hand-clenching and unclenching motion by providing immediate, responsive visual feedback. The integration of the sensors allowed for real-time interaction, ensuring that the tactile experience of "jem jem" was reflected through dynamic visual outputs, further enhancing the immersive experience.
User Testing and Insights
User testing was conducted to observe how people interact with the prototype, with the feedback informing further refinements. As participants tested the device, they instinctively gauged the range of their grip strength, from weakest to strongest, and repeatedly performed the clenching action to see how the visual feedback on the screen responded to the force they applied. This natural exploration demonstrated how effectively the device could capture and reflect varying degrees of force, helping to ensure that the final product is both functional and engaging, and that it enhances communication through a simple yet meaningful tactile gesture.
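One way to turn that observed grip range into proportional feedback (a sketch under assumed values, not the project's implementation) is a per-user calibration that tracks the weakest and strongest clench seen so far and remaps each new reading into that personal span:

```python
class GripCalibrator:
    """Remap raw force readings into a user's personal grip range,
    so both weak and strong grips can drive the full visual span.
    The raw scale (ADC counts) is an assumption for illustration."""

    def __init__(self):
        self.lo: int | None = None  # weakest reading observed so far
        self.hi: int | None = None  # strongest reading observed so far

    def update(self, raw: int) -> float:
        self.lo = raw if self.lo is None else min(self.lo, raw)
        self.hi = raw if self.hi is None else max(self.hi, raw)
        if self.hi == self.lo:
            return 0.0  # no range established yet
        return (raw - self.lo) / (self.hi - self.lo)
```

Because the range updates continuously, a participant's exploratory squeezes from weakest to strongest automatically widen the mapping, matching the self-calibrating behavior observed during testing.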

Application Possibilities
The "Action: JemJem" device has the potential to bridge the gap between digital and physical worlds by providing tactile feedback that directly matches the user’s physical input. This precise alignment ensures that digital responses feel natural and intuitive, making the interaction more immersive. Similar to how double-clicking has become second nature, this device could evolve to enable more complex digital reactions based on the force applied, reducing the disconnect between physical actions and their digital outcomes. This capability opens up a range of applications, from enhancing gaming experiences with more responsive controls to aiding in therapeutic exercises by training fine motor skills.
