Developing whole-body tactile skins for robots remains a challenging task, as existing solutions often prioritize modular, one-size-fits-all designs, which, while versatile, fail to account for the robot's specific shape and the unique demands of its operational context. In this work, we introduce the GenTact Toolbox, a computational pipeline for creating versatile whole-body tactile skins tailored to both robot shape and application domain. Our pipeline includes procedural mesh generation for conforming to a robot's topology, task-driven simulation to refine sensor distribution, and multi-material 3D printing for shape-agnostic fabrication. We validate our approach by creating and deploying six capacitive sensing skins on a Franka Research 3 robot arm in a human-robot interaction scenario. This work represents a shift from "one-size-fits-all" tactile sensors toward context-driven, highly adaptable designs that can be customized for a wide range of robotic systems and applications.
The GenTact Toolbox provides a comprehensive pipeline for designing and fabricating custom tactile skins:

1. Procedural mesh generation that conforms sensor layouts to the robot's topology.
2. Task-driven simulation that refines the sensor distribution for the target application.
3. Multi-material 3D printing for shape-agnostic fabrication.
This approach allows for rapid prototyping and iteration of tactile sensor designs that are specifically tailored to both the robot's physical form and its intended use case.
GenTact comes with an addon for Blender and an extension for Isaac Sim, for generating and simulating the sensors respectively. The final output is a separable mesh file containing the mold of the skin and the embedded sensors. This multi-layered mesh can then be printed on a 3D printer with multiple filament heads to create the final skin.
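As a rough illustration of this handoff, a multi-layered design can be exported from Blender as one STL per layer so a multi-material slicer can assign a filament to each part. This is a minimal sketch, not GenTact's actual export code: the object names are hypothetical, and it assumes a Blender release that still ships the legacy STL exporter (4.0 or earlier).

```python
# Hypothetical sketch: export the skin mold and the embedded sensor layer as
# separate STLs so a multi-material slicer can assign a filament to each.
# Object names ("SkinMold", "SensorTraces") are illustrative, not from GenTact.
import bpy

for name in ("SkinMold", "SensorTraces"):
    obj = bpy.data.objects[name]
    bpy.ops.object.select_all(action='DESELECT')
    obj.select_set(True)
    bpy.context.view_layer.objects.active = obj
    # Legacy STL exporter (Blender <= 4.0); newer releases use wm.stl_export.
    bpy.ops.export_mesh.stl(
        filepath=f"/tmp/{name}.stl",
        use_selection=True,  # export only the selected layer
    )
```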
Our Blender addon provides an intuitive interface for designing tactile skins. It allows users to generate sensor layouts that conform to any 3D mesh, with controls for sensor density, distribution patterns, and connection routing. The primary constraints that make creating tactile skins difficult, such as surface coverage, geometry conformity, wiring, and reproducibility, are effectively removed by procedurally generating the skin design.
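For readers unfamiliar with Blender addons, the sketch below shows the general shape of an operator that would expose such controls in the UI. The identifier, label, and `density` parameter are illustrative assumptions, not GenTact's actual API.

```python
# Minimal sketch of a Blender operator in the style of such an addon.
# bl_idname and the property name are hypothetical, not GenTact's interface.
import bpy

class GENTACT_OT_generate_skin(bpy.types.Operator):
    """Generate a tactile skin layout on the active mesh."""
    bl_idname = "gentact.generate_skin"
    bl_label = "Generate Tactile Skin"
    bl_options = {'REGISTER', 'UNDO'}

    density: bpy.props.FloatProperty(
        name="Sensor Density", default=1.0, min=0.01,
        description="Target sensors per unit of surface area",
    )

    def execute(self, context):
        obj = context.active_object
        if obj is None or obj.type != 'MESH':
            self.report({'ERROR'}, "Select a mesh to cover with sensors")
            return {'CANCELLED'}
        # ... instantiate sensor geometry over obj, scaled by self.density ...
        return {'FINISHED'}

def register():
    bpy.utils.register_class(GENTACT_OT_generate_skin)
```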
The skin is procedurally generated using Blender's geometry nodes architecture. Data from the mesh, user-defined parameters, and a painted heat map are used to algorithmically define and instantiate a tactile skin mesh.
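To make the heat-map-driven placement concrete, the sketch below samples sensor centers on a triangle mesh with probability proportional to face area times a painted per-face "heat" weight. This is a plain NumPy illustration of the idea, not the geometry-nodes implementation; the array shapes are assumptions.

```python
# Hedged sketch: heat-map-weighted sensor placement on a triangle mesh.
import numpy as np

def sample_sensor_centers(vertices, faces, heat, count, rng=None):
    """vertices: (V, 3) floats, faces: (F, 3) ints, heat: (F,) weights in [0, 1]."""
    rng = np.random.default_rng() if rng is None else rng
    tri = vertices[faces]                                   # (F, 3, 3)
    # Face areas from the cross product of two edge vectors.
    areas = 0.5 * np.linalg.norm(
        np.cross(tri[:, 1] - tri[:, 0], tri[:, 2] - tri[:, 0]), axis=1)
    weights = areas * heat
    weights /= weights.sum()
    picked = rng.choice(len(faces), size=count, p=weights)
    # Uniform barycentric coordinates inside each chosen triangle.
    u, v = rng.random((2, count))
    flip = u + v > 1.0
    u[flip], v[flip] = 1.0 - u[flip], 1.0 - v[flip]
    t = tri[picked]
    return t[:, 0] + u[:, None] * (t[:, 1] - t[:, 0]) \
                   + v[:, None] * (t[:, 2] - t[:, 0])
```

Faces painted with higher heat receive proportionally more sensor centers, which is the same effect the painted heat map has on the generated skin.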
Our Isaac Sim extension enables 1:1 simulation of the procedurally generated tactile skins. The extension supports importing the procedurally generated sensors, simulating contact data, connecting to real fabricated sensors via ROS 2, and optimizing sensor placement for a given task.
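A minimal sketch of the ROS 2 side of that bridge is shown below: a node subscribing to per-taxel readings from the sensor driver. The topic name and message type are assumptions for illustration, not GenTact's actual interface.

```python
# Hedged sketch: receive real taxel readings over ROS 2 for use in simulation.
import rclpy
from rclpy.node import Node
from std_msgs.msg import Float32MultiArray

class SkinBridge(Node):
    def __init__(self):
        super().__init__('skin_bridge')
        # One capacitance value per taxel; topic name is hypothetical.
        self.sub = self.create_subscription(
            Float32MultiArray, '/gentact/taxels', self.on_reading, 10)

    def on_reading(self, msg):
        # Forward readings to the simulated skin, e.g. to color taxels by contact.
        self.get_logger().debug(f'{len(msg.data)} taxel readings received')

def main():
    rclpy.init()
    rclpy.spin(SkinBridge())

if __name__ == '__main__':
    main()
```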
Simulated contact data can be used to generate new sensor layouts that are optimized for a given task. Below is an example simulation of a Unitree H1 robot performing a box-moving task. In this example, contact is consistently observed on the bottom right of the H1's chest plate. To capture higher-fidelity data in this concentrated region, rather than spending compute resources on the full chest plate, we can apply a Butterworth filter to the contact data and quickly iterate an algorithmically defined sensor layout map.
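The sketch below shows the filtering step under stated assumptions: contact events are binned into a 2D frequency map, low-pass filtered with a Butterworth filter along each axis, and renormalized into a density map that can be fed back as a heat map. The grid resolution and cutoff are illustrative, not values from the paper.

```python
# Hedged sketch: smooth a 2D contact-frequency map with a Butterworth filter,
# then renormalize it into a [0, 1] sensor-density map.
import numpy as np
from scipy.signal import butter, filtfilt

def refine_density_map(contact_counts, cutoff=0.15, order=2):
    """contact_counts: (H, W) array of per-cell contact event counts.

    Assumes H and W exceed filtfilt's default padding length for the filter.
    """
    b, a = butter(order, cutoff)               # normalized cutoff in (0, 1)
    smoothed = filtfilt(b, a, contact_counts, axis=0)
    smoothed = filtfilt(b, a, smoothed, axis=1)
    smoothed = np.clip(smoothed, 0.0, None)    # filtering can ring below zero
    return smoothed / smoothed.max()           # heat-map-style density in [0, 1]
```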
The Franka Research 3 robot arm was fully covered with procedurally generated tactile skin units. The sensors were designed for a human-robot interaction scenario and inform the robot's motion planner of spatial regions where contact is observed, so that it can avoid collisions.
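One plausible way to realize this feedback loop, sketched below, is to insert a small collision sphere into the MoveIt planning scene at each taxel pose that reports contact. The topic, frame name, and sphere size are assumptions about the deployment, not the paper's actual integration.

```python
# Hedged sketch: publish contact regions as MoveIt collision objects so the
# planner routes around them. Names here are assumptions, not GenTact's code.
import rclpy
from rclpy.node import Node
from moveit_msgs.msg import PlanningScene, CollisionObject
from shape_msgs.msg import SolidPrimitive
from geometry_msgs.msg import Pose

class ContactAvoidance(Node):
    def __init__(self):
        super().__init__('contact_avoidance')
        self.pub = self.create_publisher(PlanningScene, '/planning_scene', 10)

    def add_contact_region(self, x, y, z, radius=0.05):
        obj = CollisionObject()
        obj.header.frame_id = 'fr3_link0'   # robot base frame (assumed name)
        obj.id = f'contact_{x:.2f}_{y:.2f}_{z:.2f}'
        sphere = SolidPrimitive(type=SolidPrimitive.SPHERE, dimensions=[radius])
        pose = Pose()
        pose.position.x, pose.position.y, pose.position.z = x, y, z
        pose.orientation.w = 1.0
        obj.primitives.append(sphere)
        obj.primitive_poses.append(pose)
        obj.operation = CollisionObject.ADD
        # Publish as a diff so existing scene contents are preserved.
        scene = PlanningScene(is_diff=True)
        scene.world.collision_objects.append(obj)
        self.pub.publish(scene)
```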
@article{kohlbrenner2024gentact,
  title={GenTact Toolbox: A Computational Design Pipeline to Procedurally Generate Context-Driven 3D Printed Whole-Body Tactile Skins},
  author={Kohlbrenner, Carson and Escobedo, Caleb and Bae, S Sandra and Dickhans, Alexander and Roncone, Alessandro},
  journal={arXiv preprint arXiv:2412.00711},
  year={2024}
}