[ICRA2025] Integrates the vision, touch, and common-sense information of foundational models, customized to the agent's perceptual needs.
Open simulation tool for large-scale vision-based tactile sensing devices, based on a proposed SOFA-GAZEBO-GAN framework.
Extended Sensing via Dynamic Tactile Sensors
DexSkin — High-Coverage Conformable Robotic Skin for Contact-Rich Manipulation
A visual tool for placing predefined patterns onto a surface and generating the resulting G-code.
IX-HapticSight: an open, safety-first optical–haptic interaction protocol for robots/XR. Maps vision to touch with consent-aware social gestures, tri-level hazard maps, force envelopes, a state machine, culture profiles, simulation scenes, and tested reference code. MIT license + Responsible Use terms.
Investigating the efficacy of haptic feedback modalities during delicate object manipulation by integrating stress-sensor data with an ESP32-bound multi-actuator array.
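As a rough illustration of how stress-sensor data might drive a multi-actuator haptic array, the sketch below maps a scalar stress reading to per-actuator PWM levels; the function name, range bounds, and mapping scheme are assumptions, not the repository's actual design.

```python
def actuator_levels(stress: float, n_actuators: int = 4,
                    s_min: float = 0.0, s_max: float = 100.0) -> list:
    """Map a scalar stress reading to 8-bit PWM duty cycles for a row of
    actuators, activating more of them as stress grows (hypothetical scheme)."""
    # Normalize the reading into [0, 1], clamping out-of-range values.
    frac = min(max((stress - s_min) / (s_max - s_min), 0.0), 1.0)
    # Light a proportional number of actuators at full intensity.
    active = round(frac * n_actuators)
    return [255 if i < active else 0 for i in range(n_actuators)]
```

On an ESP32 the returned duty cycles would typically be written out via its PWM peripheral, one channel per actuator.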
This repo contains all the ROS2 packages developed at the AI4CE lab for interfacing with various specialized sensors.
Vision-based tactile sensing gripper integrating computer vision and deep learning for slip detection in robotic grasping.
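For intuition, slip in a vision-based tactile sensor shows up as frame-to-frame motion in the contact image. The minimal sketch below flags slip from mean intensity change between consecutive frames; the repository itself uses a learned deep model, so the function and threshold here are purely illustrative assumptions.

```python
import numpy as np

def detect_slip(prev_frame: np.ndarray, curr_frame: np.ndarray,
                motion_threshold: float = 5.0) -> bool:
    """Flag slip when the mean absolute intensity change between two
    consecutive tactile-image frames exceeds a tuned threshold
    (hypothetical baseline, not the repo's deep-learning detector)."""
    diff = np.abs(curr_frame.astype(np.float32) - prev_frame.astype(np.float32))
    return float(diff.mean()) > motion_threshold
```

A learned detector replaces the fixed threshold with features that distinguish slip-induced shear motion from ordinary contact changes.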
Touch & pressure tracking system using a Velostat-based resistive matrix with dynamic noise calibration, adaptive thresholds, and real-time data visualization.
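The idea of dynamic noise calibration with adaptive thresholds can be sketched as follows: track a running baseline and variance for each cell of the resistive matrix while idle, then flag a touch when a reading exceeds the baseline by a few standard deviations. Class and parameter names are assumptions for illustration, not the project's API.

```python
import numpy as np

class VelostatMatrix:
    """Toy model of a resistive pressure matrix with a running noise
    baseline and adaptive per-cell thresholds (hypothetical sketch)."""

    def __init__(self, rows: int, cols: int, k_sigma: float = 3.0,
                 alpha: float = 0.05):
        self.baseline = np.zeros((rows, cols))  # per-cell idle level
        self.var = np.ones((rows, cols))        # per-cell noise variance
        self.k_sigma = k_sigma                  # threshold in std devs
        self.alpha = alpha                      # EMA smoothing factor

    def calibrate(self, frame: np.ndarray) -> None:
        # Exponential moving average of the idle signal and its variance.
        err = frame - self.baseline
        self.baseline += self.alpha * err
        self.var = (1 - self.alpha) * (self.var + self.alpha * err ** 2)

    def touches(self, frame: np.ndarray) -> np.ndarray:
        # A cell counts as touched when it exceeds baseline by k·sigma.
        return frame > self.baseline + self.k_sigma * np.sqrt(self.var)
```

Recalibrating continuously while the matrix is idle lets the thresholds adapt to drift in the Velostat's resistance over time.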
Code for an autonomous walking robot
MATLAB code for simulating and analyzing the static performance of Variable Stiffness Soft Pneumatic Sensing Chambers (VSSPSCs).