Our research is intended to advance the design and control of human-machine systems as well as autonomous robotic systems. To this end, we study the areas described below, with example applications including anthropomorphic robotic and prosthetic hands and teleoperated manipulators.
We are characterizing the reflex-like grip responses of the human hand to rotational disturbances of a grasped object. The goal is to better understand grip responses that require simultaneous coordination of adduction/abduction and flexion/extension across fingers. By better understanding patterned responses in humans, we can develop bio-inspired artificial reflexes to enhance the functionality of anthropomorphic robotic and prosthetic hands without adding to the user’s cognitive burden.
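The idea of a bio-inspired artificial reflex can be sketched in a few lines. This is an illustration only: the threshold, gain, and control loop are hypothetical placeholders, not the lab's actual implementation.

```python
# Minimal sketch of a bio-inspired artificial grip reflex.
# All thresholds, gains, and limits are hypothetical values for illustration.

def reflex_grip_response(rotation_rate, grip_force,
                         slip_threshold=0.5, gain=2.0, max_force=10.0):
    """Increase grip force when a rotational disturbance exceeds a threshold.

    rotation_rate : measured angular velocity of the grasped object (rad/s)
    grip_force    : current commanded grip force (N)
    Returns the updated grip force command.
    """
    if abs(rotation_rate) > slip_threshold:
        # Reflex-like response: scale the force increase with disturbance
        # magnitude, saturating at the hand's force limit.
        grip_force += gain * (abs(rotation_rate) - slip_threshold)
    return min(grip_force, max_force)

print(reflex_grip_response(1.5, 3.0))  # disturbance exceeds threshold: force increases
print(reflex_grip_response(0.2, 3.0))  # below threshold: force unchanged
```

Because the response is triggered automatically by the sensed disturbance, it demands no attention from the user, which is the sense in which such reflexes avoid adding to the cognitive burden.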
The exploration and manipulation of unstructured environments via tactile feedback still present many challenges. In collaboration with the ASU Micro/Nanofluidics Laboratory, we are developing a capacitance-based MEMS tactile sensor skin capable of detecting normal and shear forces and local vibrations. By using an array configuration and tuning the material properties of the polymer PDMS at different sites, localized measurements of various magnitudes can be obtained simultaneously. PDMS has the desirable properties of being waterproof, chemically inert, and non-toxic. In addition, the sensor will be conformable to the curvilinear and deformable fingertip, which is crucial for performance. Computer simulations are being developed, and prototypes are under construction. Mapping of capacitance readings to external stimuli will be performed using nonlinear regression models, such as artificial neural networks.
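The calibration step can be illustrated for a single sensing element. A simple linear least-squares fit stands in here for the nonlinear regression models (e.g., artificial neural networks) mentioned above, and all calibration numbers are made up for illustration.

```python
# Sketch of calibrating one taxel: map capacitance readings to applied
# normal force. A closed-form linear fit stands in for the nonlinear
# regression (e.g., neural network) that the actual mapping would use.

def fit_linear(capacitance, force):
    """Closed-form least squares for force ~ a * capacitance + b."""
    n = len(capacitance)
    mean_c = sum(capacitance) / n
    mean_f = sum(force) / n
    cov = sum((c - mean_c) * (f - mean_f) for c, f in zip(capacitance, force))
    var = sum((c - mean_c) ** 2 for c in capacitance)
    a = cov / var
    b = mean_f - a * mean_c
    return a, b

# Hypothetical calibration data: capacitance (pF) vs. applied force (N).
cap = [1.0, 1.2, 1.4, 1.6, 1.8]
force = [0.0, 0.5, 1.0, 1.5, 2.0]
a, b = fit_linear(cap, force)
predict = lambda c: a * c + b
print(round(predict(1.5), 2))  # interpolated force estimate
```

In the array configuration, one such mapping would be fit per site, since each site's tuned PDMS properties give it a different capacitance-to-force relationship.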
Human users of teleoperated manipulators lack a conscious perception of rich tactile feedback, even when using devices as intimately connected to the human body as a neuroprosthesis. In collaboration with the ASU Sensorimotor Research Group, we are working to close the sensory portion of the human-machine loop. Using nonhuman primates and a sensorized robot hand, we are mapping the relationships between biological somatosensory cortex recordings and artificial tactile sensor readings. The ultimate goal is to understand how and when to stimulate the brain to provide the user with a conscious perception of tactile events occurring at an artificial fingertip.
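Part of deciding *when* to stimulate is detecting tactile events in the artificial sensor stream. A minimal sketch of contact-onset detection follows; the threshold and signal values are hypothetical.

```python
# Illustrative sketch: find contact onsets in an artificial fingertip's
# tactile signal, as a stand-in for timing cortical stimulation.
# Threshold and sample values are hypothetical.

def contact_onsets(signal, threshold=0.8):
    """Return sample indices where the signal rises through the threshold."""
    onsets = []
    for i in range(1, len(signal)):
        if signal[i - 1] < threshold <= signal[i]:
            onsets.append(i)
    return onsets

tactile = [0.0, 0.1, 0.2, 0.9, 1.0, 0.5, 0.1, 0.85, 0.9]
print(contact_onsets(tactile))  # indices of the two contact events
```

Aligning such detected events with simultaneous somatosensory cortex recordings is one way to map the relationship between biological and artificial tactile signals.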
We are currently developing a multi-level control system that maintains voluntary control at the higher human level but implements autonomous “survival behaviors” at the lower machine level to address communication delays between human and machine. These survival behaviors can be implemented with sensory-event-driven artificial reflexes inspired by human grip responses. The current research testbed uses two of the three fingers on a robot hand (Barrett Technology) with flexion/extension and spread capabilities to mimic the thumb and index finger of a human hand. An anthropomorphic research testbed is forthcoming.
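The division of labor between the two levels can be sketched as a single control tick: a sensory event triggers a machine-level reflex immediately, while in the absence of an event the (possibly delayed) human command passes through. Event names and behaviors here are hypothetical.

```python
# Sketch of the multi-level control idea: a low-level machine loop triggers
# autonomous "survival behaviors" on sensory events, without waiting for the
# delayed human channel. Event names and behaviors are hypothetical.

def low_level_step(sensor_event, human_command):
    """One control tick: a reflex overrides the (possibly stale) human command."""
    reflexes = {
        "slip_detected": "increase_grip",
        "impact_detected": "hold_position",
    }
    if sensor_event in reflexes:
        # Survival behavior: act immediately at the machine level.
        return reflexes[sensor_event]
    # Otherwise defer to voluntary control at the human level.
    return human_command

print(low_level_step("slip_detected", "open_hand"))  # reflex takes priority
print(low_level_step(None, "open_hand"))             # human command passes through
```

Giving the reflex priority during the brief window when the human command is stale is what lets the system survive communication delays without removing the human from the loop.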
We are characterizing digit control in a three-fingered grasp for a task that closely relates to an activity of daily living. The pouring task requires coordination of adduction/abduction and flexion/extension degrees of freedom for successful completion. Dynamic stability is required for purposeful translation and rotation of the fluid-filled container against gravity.