
An Overview of Dexterous Manipulation

Allison M. Okamura, Niels Smaby and Mark R. Cutkosky

Dexterous Manipulation Laboratory

Stanford University

[email protected]

Abstract

This paper for the ICRA 2000 Symposium on Dexterous Manipulation presents an overview of research in dexterous manipulation. We first define robotic dexterous manipulation in comparison to traditional robotics and human manipulation. Next, kinematics, contact types and forces are used to formulate the dexterous manipulation problem. Dexterous motion planning is described, which includes grasp planning and quality measures. We look at various mid- and low-level control frameworks, and then compare manipulation versus exploration. Finally, we list what we see as the current limiting factors in dexterous manipulation, and review the state of the art and future of the field.

1 Introduction

Dexterous manipulation is an area of robotics in which multiple manipulators, or fingers, cooperate to grasp and manipulate objects. Dexterous manipulation differs from traditional robotics primarily in that the manipulation is object-centered. That is, the problem is formulated in terms of the object to be manipulated, how it should behave, and what forces should be exerted upon it. Dexterous manipulation, requiring precise control of forces and motions, cannot be accomplished with a conventional robotic gripper; fingers or specialized robotic hands must be used.

Although humans are not the only creatures capable of manipulation, it is quintessentially a human activity. The fraction of the human motor cortex devoted to manipulation and the number and sensitivity of mechanoreceptors in our palms and fingertips are indications of the importance of manipulation in humans. Within a year of birth, a human infant is clearly more dexterous than today's robots, though far short of adult skill in grasping and handling objects.

It should come as no surprise then that the majority of robot hands designed for dexterous manipulation are anthropomorphic in design. Research has also been done to classify human grasping and manipulation with an eye to providing a knowledge-based approach to grasp choice for robots [5, 6, 23, 15]. While this approach has had some success in emulating human grasp choices for particular objects and tasks, other researchers have argued that robot hands, and the circumstances in which they work, are fundamentally different from the human condition. A model-based approach, based on the kinematics and dynamics of manipulating an object with the fingertips, has therefore dominated the field. The results of this approach are now adequate for manipulations of objects in controlled environments. As discussed in Section 6, the main limitation appears to be a lack of adequate tactile sensing for robust manipulation control.

Examples of autonomous, robotic dexterous manipulation are still confined to the research laboratory. However, the model-based approach has already provided considerable insight into the nature of dexterous manipulation, both in robots and in humans. Some of these results are now being applied to reconstructive surgery, in which hand surgeons perform tendon transfer surgeries on patients with quadriplegia and/or nerve palsies to improve their grasping ability [39].

Future applications of robotic dexterous manipulation may include tasks where fine manipulation is required, yet it is infeasible or dangerous for a human to perform the task. Examples include underwater salvage and recovery, remote planetary exploration, and retrieval of objects from hazardous environments.

2 Formulation of the Dexterous Manipulation Problem

The first step in moving an object from one configuration to another using robotic fingers is to formulate the dexterous manipulation (DM) problem (Figure 1). This problem sets the framework for determining the required actuator forces/torques to produce the desired motions of the object. In keeping with an object-centered approach, we work "backwards" from the object to the manipulators. The development of the kinematic portion of the DM problem, done here from force/torque relationships, can also be accomplished from linear and angular velocity relationships.

This development requires knowing the geometric relationships of the dexterous manipulator-object system (i.e., contact locations), object geometry, fingertip and link geometry, and the kinematics of the manipulator. In addition, it is assumed that contact is maintained throughout the manipulation.

Figure 1: A typical dexterous manipulation problem: Moving an object from configuration A to configuration B.

2.1 Kinematics

2.1.1 Jacobian Relationships

The first step in developing the kinematics of the DM problem is to calculate the required fingertip forces from a desired force/torque wrench on the object. The basis for this calculation is the grasp Jacobian relationship:

f_{obj} = G f_{tip}    (1)

The grasp Jacobian (or grasp map), G, can be obtained by resolving each fingertip force to a common coordinate frame embedded in the object. For each fingertip i, this force resolution results in the mapping matrix G_i.

f_{obj,i} = G_i f_{tip,i}    (2)

In Equation 2, the force vectors f are generalized vectors: they may include both forces and torques. The individual mapping matrices G_i are concatenated to form the grasp map G, and the fingertip force vectors are also grouped into one vector.

f_{obj} = \begin{bmatrix} G_1 & G_2 & \cdots & G_m \end{bmatrix} \begin{bmatrix} f_{tip,1} \\ f_{tip,2} \\ \vdots \\ f_{tip,m} \end{bmatrix}    (3)
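The assembly in Equation 3 can be sketched numerically. Below is a minimal planar example (assuming NumPy; the disk geometry and contact positions are illustrative, not from the text): each G_i maps a 2-D contact force to a 3-D object wrench (F_x, F_y, τ), and the G_i are stacked side by side to form G.

```python
import numpy as np

def contact_map(r):
    """G_i for a planar point contact with friction at position r
    relative to the object frame: maps a contact force (fx, fy)
    to an object wrench (Fx, Fy, tau)."""
    rx, ry = r
    return np.array([[1.0, 0.0],
                     [0.0, 1.0],
                     [-ry,  rx]])

# Two fingertips pressing on opposite sides of a unit-radius disk.
G1 = contact_map((-1.0, 0.0))
G2 = contact_map((1.0, 0.0))
G = np.hstack([G1, G2])          # grasp map, Eq. (3)

# Equal and opposite squeezing forces: a pure internal force,
# so the resultant wrench on the object should be zero.
f_tips = np.array([1.0, 0.0, -1.0, 0.0])
f_obj = G @ f_tips
print(f_obj)                     # [0. 0. 0.]
```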

Note that Equation 3 is a simplified treatment of the problem. Typically the fingertip forces are represented in a coordinate frame at the contact point on the surface of the object. Then, knowing each contact type (see the sections below and [22]), the number of allowable force directions at each contact is reduced, minimizing the dimension of the problem. For a detailed treatment of this topic see [19, 28, 30].

Figure 2: The role of the hand and grasp Jacobians. For the fingers, τ is the vector of joint torques and θ̇ is the vector of joint velocities. For the contacts, f_c is the vector of contact forces and ẋ_c is the vector of contact point velocities. For the object, the resultant force vector and the vector of object velocities are shown.

The grasp Jacobian developed above allows us to calculate the required contact forces from the desired force on the object. In order to produce these forces at the fingertips, we now develop a hand Jacobian, which will allow us to calculate the joint torques from the contact forces [19]. The hand Jacobian, J_h, is based on the standard Jacobian, which relates end effector forces to individual joint torques for a robotic manipulator (in this case one for each finger).

\tau_i = J_i^T f_{tip,i}    (4)

In the DM problem, these individual Jacobians are brought together to form the hand Jacobian.

\begin{bmatrix} \tau_1 \\ \tau_2 \\ \vdots \\ \tau_m \end{bmatrix} = \begin{bmatrix} J_1^T & 0 & \cdots & 0 \\ 0 & J_2^T & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & \cdots & 0 & J_m^T \end{bmatrix} \begin{bmatrix} f_{tip,1} \\ f_{tip,2} \\ \vdots \\ f_{tip,m} \end{bmatrix}    (5)
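The structure of Equations 4-5 can be sketched as follows (assuming NumPy; the two-link planar finger and its unit link lengths are illustrative assumptions): the individual finger Jacobians are placed on the block diagonal to form J_h, and joint torques follow from the transpose relation of Equation 6.

```python
import numpy as np

def finger_jacobian(theta1, theta2, l1=1.0, l2=1.0):
    """Standard Jacobian of a planar two-link finger (illustrative
    link lengths l1, l2): maps joint velocities to fingertip velocity."""
    s1, c1 = np.sin(theta1), np.cos(theta1)
    s12, c12 = np.sin(theta1 + theta2), np.cos(theta1 + theta2)
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

# Hand Jacobian for two identical fingers, Eq. (5) block structure:
J1 = finger_jacobian(0.3, 0.8)
J2 = finger_jacobian(-0.3, -0.8)
Jh = np.block([[J1, np.zeros((2, 2))],
               [np.zeros((2, 2)), J2]])

# Joint torques needed to exert given contact forces, Eq. (6):
f_c = np.array([0.0, 1.0, 0.0, -1.0])   # opposed vertical forces
tau = Jh.T @ f_c
```

Because the two fingers here are mirror images and the forces are opposed, the second finger's torques come out as the negative of the first's.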

A nice conceptual picture of the roles of the grasp Jacobian and the hand Jacobian is shown in Figure 2 [25]. Given a set of contact forces, the individual joint torques can be obtained by multiplying by the transpose of the hand Jacobian, J_h, and the forces on the object can be obtained by multiplying by the grasp Jacobian, G.

\tau = J_h^T f_c    (6)

f_{obj} = G f_c    (7)

Alternatively, from the kinematic point of view, the contact point velocities can be obtained from the finger joint velocities by multiplying by the hand Jacobian or by multiplying the object velocity by the grasp Jacobian:



J_h \dot{\theta} = \dot{x}_c = G^T v_{obj}    (8)

The kinematic and force relationships described by the above equations and Figure 2 are not necessarily one to one. The system may be over-constrained (i.e., the fingers may not be able to accommodate or resist all object motions or forces) or the system may be under-constrained (i.e., there are multiple choices for finger joint velocities or torques). Typically, an under-constrained system is desired for dexterous manipulation tasks. The detection of these conditions can be accomplished by treating the combination of fingers and object as a parallel-chain mechanism and evaluating the manipulability of the object with respect to the palm [19]. A summary of kinematic measures useful in dexterous manipulation is provided in [6].

2.1.2 Rolling and Sliding

The previous development of the kinematics is for point contacts on the object, which do not move during the manipulation. However, the geometry of typical robotic fingers causes the contact points to travel on the surface of the object as the configuration of the fingers changes during the manipulation. This is typically a non-holonomic constraint: the rolling of a fingertip on an object requires that the velocities of the contact points on each of the two objects must remain the same. Including rolling and sliding constraints in the DM problem kinematics involves the application of differential geometry and a parameterization of both the fingertip and object surfaces. This analysis takes as inputs the relative velocities (linear and angular) of the contact points on the objects (in this case on the finger and the object), and outputs the parameterized contact point velocities on the surfaces of the objects (Figure 3). Figure 4 shows a typical example of the progression of the finger and object contact coordinate frames for the rolling of a finger on the surface of an object. For a detailed treatment of the differential geometry involved in rolling see [29].

We will now look at the contact constraints and reduce a general rolling and sliding problem to pure rolling in the contact plane. When the only constraint is to maintain contact, the relative velocity of the contact points can have no component in the surface normal direction (v_z = 0); this is assumed for most DM problems. Adding the rolling constraint means that there can be no relative linear velocity between the two contact points, therefore v_x = 0 and v_y = 0. Finally, if we specify soft finger contact (we do not allow spin of the fingertip about the contact normal) in the plane, there are two more constraints on angular velocity (ω_x = 0 and ω_z = 0). A summary of the velocity constraints for pure rolling in the plane is shown in Table 1.
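The Table 1 constraints are simple to check directly on a relative contact velocity. A minimal sketch (the tolerance and the (v, ω) argument convention are illustrative choices, not from the text):

```python
def is_pure_planar_rolling(v, w, tol=1e-9):
    """Check the pure-rolling-in-the-plane constraints of Table 1.
    v = (vx, vy, vz): relative linear velocity of the contact points.
    w = (wx, wy, wz): relative angular velocity.
    Only wy (rolling about the in-plane tangent) may be nonzero."""
    vx, vy, vz = v
    wx, wy, wz = w
    maintain_contact = abs(vz) < tol
    no_sliding = abs(vx) < tol and abs(vy) < tol
    no_spin = abs(wz) < tol
    planar = abs(wx) < tol
    return maintain_contact and no_sliding and no_spin and planar

print(is_pure_planar_rolling((0, 0, 0), (0, 0.5, 0)))    # True: pure rolling
print(is_pure_planar_rolling((0.1, 0, 0), (0, 0.5, 0)))  # False: sliding
```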

Figure 3: Contact variables (adapted from [29]).

Figure 4: The contact frames for rolling. u_1 and u_2 are the contact frames for objects 1 and 2, respectively. After rolling occurs, the new contact frames will be u'_1 and u'_2.

However, dexterous manipulation is not confined to pure rolling. Slip often occurs, and is useful when exploring an unknown object or when changing the pose of a grasp to maintain control of the object [34]. One difficulty with slip is that fingertip contact sensors are often necessary to determine contact location, whereas with pure rolling one can determine the current contact location from relative motions and starting contact locations. For a more detailed treatment of sliding manipulation see [18, 35].

2.2 Contact Types and Force Closure

Early in the study of dexterous manipulation it was recognized that the kinematics and dynamics are strongly influenced, even dominated, by the contact conditions at the fingertips [7]. At a basic level, there are three representative contact types: point contact without friction, point contact with friction, and soft finger contact (Figure 5). The point contact without friction can only resist a unidirectional force normal to the surface. Adding friction will allow it to resist tangential forces, and the soft finger can, in addition, resist a torque about the normal to the surface.

Table 1: Contact velocity constraints for pure rolling in the plane

  Maintain contact:  v_z = 0
  No sliding:        v_x = 0, v_y = 0
  No spin:           ω_z = 0
  Planar rolling:    ω_x = 0

Figure 5: Common contact types and their associated forces/moments.

Figure 6: A representation of contact forces, with and without slip.

When analyzing grasps which involve frictional contacts, it is also important to consider the friction cone. This is a geometric representation of the frictional force limits due to the static coefficient of friction. In order for no slippage to occur, the contact force must lie within the friction cone (Figure 6). For a soft finger, the friction cone can be replaced by a limit surface that includes torsional friction [12]. For a more detailed treatment of contact types and force closure see references [28, 22].
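The friction cone condition can be sketched as a direct Coulomb check (assuming NumPy; the forces, normal, and coefficient below are illustrative): a contact force lies inside the cone iff its tangential component does not exceed μ times its normal component.

```python
import numpy as np

def in_friction_cone(f, n, mu):
    """True iff contact force f lies inside the friction cone of a
    point contact with friction: unit surface normal n, static
    friction coefficient mu. The normal component must push."""
    n = np.asarray(n, dtype=float) / np.linalg.norm(n)
    f = np.asarray(f, dtype=float)
    f_n = f @ n                        # normal component
    f_t = np.linalg.norm(f - f_n * n)  # tangential magnitude
    return f_n > 0 and f_t <= mu * f_n

print(in_friction_cone([0.2, 0.0, 1.0], [0, 0, 1], mu=0.5))  # True
print(in_friction_cone([0.8, 0.0, 1.0], [0, 0, 1], mu=0.5))  # False: slips
```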

Choosing contact locations is an important part of dexterous manipulation. Typically, it is important to achieve force closure on an object when manipulating it (there are also important non-force closure manipulations: pushes, tips, etc.). Force closure requires that the grasp of the object can resist any possible direction of force/torque perturbation to the object. When a grasp has force closure on an object it is referred to as being a stable grasp.¹

It is also important to distinguish between a force closure grasp and a "manipulable" grasp. Force closure requires only that the fingers can resist an externally applied force, i.e., the opposing resistive force can be passive or structural. A manipulable grasp requires that the manipulator can actively accommodate all object motion directions while maintaining contact.

¹There are also other definitions of a stable grasp [6].

2.3 Internal and External Forces

If frictional forces are relied upon to achieve a stable grasp, it is important to provide sufficient contact normal forces. To increase these normal forces, an internal force is supplied. This is a vector of contact force magnitudes that impart no resultant force to the object and thus lie in the null space of the grasp map, G. If a vector of contact force magnitudes is decomposed into those producing external forces and those producing internal forces, each separate force magnitude vector must satisfy the unidirectional constraints required by the contact types (i.e., fingers can only push, not pull, on the object surface). In general, there are many solutions of grasp forces that satisfy grasp stability while keeping each contact force inside its friction limits and supplying some internal force. This leads to an optimization problem, as discussed in the next section. One treatment of this problem is the "virtual linkage" concept developed in [41], where virtual links are imagined between each unique pair of contact points. The internal forces are then the forces that each virtual link would experience during the manipulation. For other, more detailed treatments of the internal/external force concept see [42, 28].
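Internal force directions can be extracted as the null space of the grasp map. A sketch assuming NumPy, for a hypothetical planar grasp with two opposed point contacts with friction (columns of G ordered fx1, fy1, fx2, fy2):

```python
import numpy as np

# Grasp map for two opposed contacts at (-1, 0) and (1, 0) on a disk:
# each G_i maps (fx, fy) to the object wrench (Fx, Fy, tau).
G = np.array([[1,  0, 1, 0],
              [0,  1, 0, 1],
              [0, -1, 0, 1]], dtype=float)

# Null space via SVD: rows of Vt beyond the numerical rank of G span
# the internal-force directions (forces with no resultant wrench).
U, s, Vt = np.linalg.svd(G)
rank = int(np.sum(s > 1e-9))
null = Vt[rank:]                       # basis of internal forces
f_int = null[0] / np.abs(null[0]).max()
print(f_int)       # proportional to (1, 0, -1, 0): an opposed squeeze
print(G @ f_int)   # ~[0, 0, 0]: no resultant wrench on the object
```

The single null-space direction here is the equal-and-opposite squeeze, which is exactly the internal force one would use to increase the contact normal forces.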

3 Dexterous Motion Planning

In dexterous motion planning, there are two main objectives to consider: planning the motion of the object to achieve a desired configuration or accomplish a task, and planning the grasp or motion of the fingers required to impart this motion. This discussion overlaps with the grasp planning and optimization work reviewed in the Contact and Grasp Symposium at this conference [22].

3.1 Grasp Planning for Desired Object Motion

There are two main questions that can be used to address the problems of grasp and grasp gait planning. These are: How can a sequence of local motions and regrasps be found which will result in the final desired object motion? Is there a guarantee that a new stable grasp will always exist given a specific object geometry, finger workspace and grasp such that motion can continue? These questions are answered using the kinematics of the robot hand, a reachability analysis, and consideration of regrasping and gaiting.



Given a desired object motion, one must first determine whether the fingers can move the object without regrasping. This can be accomplished by a reachability analysis [19]. Using the range of possible manipulations for a given hand and object, a workspace can be constructed that is a function not only of the geometry of the hand and object, but also of the rolling or sliding that may occur at the contacts. If the fingers must disconnect from the object in order to move the object to the desired position, then one can consider regrasping. A sequence of finger motions and regrasps is known as a grasp gait [24].

When moving or reorienting an object with a hand, only a finite amount of local motion can be imparted to the object before a new grasp must be found. This can be due to the limited workspace of the fingers of any hand (human or robot), or collisions among the finger links, the environment and the object being manipulated. One method to determine whether local motions will suffice to reorient the object is the grasp map, a graphical representation of all stable grasps [24]. In planning, it is also important to realize that a new grasp cannot always be found if the object is moved locally until a finger reaches a workspace limit; often a grasp gait must occur before the limit is reached.

3.2 Grasp Optimizations and Quality Measures

Manipulators used for dexterous manipulation typically have kinematic redundancy. In addition, there are usually multiple choices for contact locations that achieve force closure on an object. Therefore, there can be an infinite number of possible grasps for a manipulation. We want to pick the "best" grasp. That is, we want to choose the optimal contact locations, contact forces, and finger poses for a particular manipulator, object and task combination.

In order to choose the best grasp, we need to develop a metric that will measure the "quality" of a given grasp. It is common for this measure to depend on the task requirements. An example of such a measure was developed by Li and Sastry [25], who separated the task requirements into two parts: wrench (or force) requirement and twist (or motion) requirement. Each is represented by task ellipsoids, whose axes indicate the relative magnitude requirements for the elements of the wrench or twist vector.

While researchers have formulated good conceptual quality measures for grasps, using these measures for automatic grasp choice remains difficult. Many successful optimization techniques have been developed for specifying contact forces given known contact locations and task requirements. Some of these are efficient enough for real time computation (for example, [4]). Searching for the optimal contact locations is inherently more difficult, because the quality measure is typically a non-convex (and non-linear) function over the search space and thus standard convex optimization techniques cannot be used. There have been attempts at contact location synthesis, but no algorithm has been widely adopted. (An investigation of this problem is presented in [16].) One notable contact location choice algorithm, by Nguyen, uses an analytic geometry approach to choose positions for opposing grasps that are robust with respect to errors in position [31]. Some of the more successful autonomous contact location choice systems are knowledge based [23, 37].

Table 2: Levels of Control for Dexterous Manipulation

  High-Level: Task planning, discrete event systems, grasp choice
  Mid-Level:  Phases, transitions, event detection
  Low-Level:  Operational space dynamics, cooperative object impedance control, kinematics, forces
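As one concrete illustration, a simple task-independent quality measure (distinct from the Li-Sastry task-ellipsoid measure described above) is the smallest singular value of the grasp map G: it vanishes when some object wrench direction cannot be resisted and grows as the grasp transmits forces more uniformly. A sketch assuming NumPy and hypothetical planar grasp maps:

```python
import numpy as np

def min_singular_quality(G):
    """Smallest singular value of the grasp map G: zero for a grasp
    that cannot resist some wrench direction, larger for grasps that
    resist all directions more uniformly."""
    return np.linalg.svd(G, compute_uv=False)[-1]

# Two opposed planar contacts (columns: fx1, fy1, fx2, fy2):
G_opposed = np.array([[1,  0, 1, 0],
                      [0,  1, 0, 1],
                      [0, -1, 0, 1]], dtype=float)

# Two contacts whose torque contributions coincide: the torque row
# duplicates the Fy row, so one wrench direction cannot be resisted.
G_degenerate = np.array([[1, 0, 1, 0],
                         [0, 1, 0, 1],
                         [0, 1, 0, 1]], dtype=float)

print(min_singular_quality(G_opposed))     # > 0: full-rank grasp
print(min_singular_quality(G_degenerate))  # ~0: rank-deficient grasp
```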

4 Control Frameworks

The control of dexterous manipulation can be decomposed into three main levels, as shown in Table 2. High-level control includes task and motion planning and grasp choice. Most of these issues have been discussed already in this paper. Mid-level control includes manipulation phases, for example whether the fingers are operating independently or cooperatively, and whether force or motion control is required. Transitions between these phases must be accomplished smoothly, and events must be sensed to trigger the transitions. Low-level control includes basic strategies (e.g., force or impedance control) and the formulations of the control problem in the appropriate space (e.g., the operational space of the grasped object).

4.1 Mid-Level Control

The middle level of control for dexterous manipulation has received relatively little attention compared to the high and low levels. One approach is proposed by Hyde and Cutkosky [13]. The middle level of control involves the management of the various phases of a task, as well as the specification of control laws to be used when transitioning between phases. Research in neurophysiology has shown that humans grasping and manipulating objects use a similar approach, receiving signals from specialized skin cells that trigger shifts between phases of a manipulation task [17].

Different phases are likely to use different control laws, as dexterous manipulation is characterized by changing kinematic and dynamic configurations and constraints. If the transitions between phases are not managed carefully, these changes can cause undesirable behavior. Smoothness can be critical when event detection relies on dynamic sensors, which are inherently sensitive to vibrations and small motion discontinuities. Smooth transitioning can be accomplished by modifying the start and end portions of phases [13]. Figure 7 shows how events are used to trigger transitions between phases, as well as the actions taken at the beginning and end of a phase.

Figure 7: Events and transitions between phases (adapted from [13]).

Knowing when a particular event, such as a finger making contact with an object, has occurred is often a difficult task in the presence of other events that result in similar sensor signals. Thus, context-sensitive event detection and sensor fusion are used to define the probability that a particular event has occurred [9, 14, 38].

4.2 Low-Level Control

Unlike the mid-level control strategies described above, low-level control problems have received thorough attention in previous work. This is primarily because the strategies for low-level control of dexterous manipulators are a direct extension of those used for single and cooperating robot arms.

The formulation of the control has taken several forms. One that stands out is the operational space formulation, which has been used in object-centered dexterous manipulation [20]. The basic idea of this approach is to control motions and contact forces through the use of control forces that act at the operational point of a grasped object. Through joint space/operational space relationships formed using Jacobians and a Lagrange or Newton-Euler formulation, an operational space dynamic model of the system is created:

\Lambda(x)\ddot{x} + \mu(x, \dot{x}) + p(x) = F,    (9)

where \Lambda(x) is the mass matrix, \mu(x, \dot{x}) is the term for centrifugal and Coriolis forces, p(x) is the term for gravity, and F is the operational space force.

Using this framework, a control structure can be chosen for dynamic decoupling and motion control. For example, a basic PD (proportional-derivative) controller of the form

F^* = -k_v(\dot{x} - \dot{x}_d)    (10)

\dot{x}_d = \frac{k_p}{k_v}(x_g - x)    (11)

may be used to move the operational point to a goal position x_g when a particular trajectory is not required.
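The behavior of Equations 10-11 can be sketched with a simple discrete-time simulation on a unit point mass (the gains, time step, and goal below are illustrative choices, not values from the text):

```python
# Goal-position PD controller of Eqs. (10)-(11) on a unit point mass.
kp, kv, dt = 4.0, 4.0, 0.001   # illustrative gains and time step
x, xdot = 0.0, 0.0             # initial state
xg = 1.0                       # goal position

for _ in range(20000):         # 20 s of simulated time
    xd_dot = (kp / kv) * (xg - x)   # Eq. (11): desired velocity toward goal
    F = -kv * (xdot - xd_dot)       # Eq. (10): damp toward that velocity
    xdot += F * dt                  # unit mass: acceleration = F
    x += xdot * dt

print(round(x, 3))             # settles at the goal, 1.0
```

With these gains the closed loop is critically damped (the characteristic equation is s^2 + k_v s + k_p = 0 with a double root at -2), so the operational point reaches x_g without overshoot.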

The concept of impedance control is also important in dexterous manipulation. In this type of control, we specify the desired impedance of the object being manipulated [10]. That is, the object in the grasp of the robot can have an apparent mass, as well as stiffness and damping when subjected to external forces. The control block diagram in Figure 8 shows a control framework using object impedance control for rolling manipulation. This diagram shows the path of information from commands, through control laws, to application on the dexterous hand [14]. Another control law, with explicit control of the contact trajectory, is presented in [36].
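The impedance idea can be sketched in one degree of freedom (the gains, external force, and unit-mass object are illustrative assumptions, not the object impedance controller of [14]): the commanded force makes the grasped object behave like a virtual spring-damper about a desired position, so a constant external force produces a steady deflection of f_ext / K_d.

```python
# 1-DOF impedance control sketch: the object should present apparent
# stiffness Kd and damping Bd about the desired position xd.
Kd, Bd, dt = 100.0, 20.0, 0.001   # illustrative impedance and time step
xd = 0.0                          # desired (rest) position
f_ext = 5.0                       # constant external push on the object
x, xdot = 0.0, 0.0

for _ in range(10000):            # 10 s, unit-mass object dynamics
    F = Kd * (xd - x) + Bd * (0.0 - xdot)   # impedance control law
    xdot += (F + f_ext) * dt                # control force + disturbance
    x += xdot * dt

print(round(x, 3))                # steady deflection f_ext / Kd = 0.05
```

At equilibrium the virtual spring balances the disturbance, K_d (x - x_d) = f_ext, which is exactly the "apparent stiffness" behavior described above.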

5 Manipulation Versus Exploration

As shown by Klatzky and Lederman [21], manipulation and exploration go hand in hand. We can obtain a precise definition for each separately: "Pure" manipulation occurs when the object is completely known. "Pure" exploration happens when the object is fixtured and is not known. Most dexterous manipulation is a combination of manipulation and exploration.



Figure 8: Control block diagram for manipulation with rolling.

And in a non-ideal world, we need manipulation for exploration and vice versa.

There are a number of examples of simultaneous manipulation and exploration. Bicchi, et al. [3] manipulate by rolling and also build up a model of the object at the same time. Okamura, et al. manipulate and explore an object where the general shape is assumed but small features may be explored [34, 33]. Allen primarily uses exploration with robot fingers to build up an object model using mathematical shape descriptions (superquadrics) [1].

6 Current Limiting Factors

Several robotic hands have been developed in the past two decades, although at the time of this writing they are used mainly in research laboratories. Limitations in hardware and in software (algorithms) hamper their application outside the laboratory.

6.1 Hardware

High bandwidth and well-controlled actuation is necessary for dexterous manipulation. In addition, actuators should be small and lightweight in comparison to the hand they are actuating. Currently, small but powerful actuators are rare and expensive. However, new technologies such as voice-coil actuation and rare earth magnets are enabling smaller construction. Backdrivability is another issue; in order to increase power, designers of robotic hands often use transmission systems that cause small motion errors to result in high interaction forces, reducing backdrivability.

Sensors are essential for dexterous manipulation, as feedback of contact information is vital to accomplish any task where the environment is unknown. Tactile sensors are the primary sources of information for most dexterous manipulation tasks (although some work has been done on combining vision sensors with dexterous manipulation/haptic exploration [2]). Tactile sensors can be divided into two categories: extrinsic and intrinsic. Extrinsic sensors can sense actual contact location. They are often array sensors and are limited in resolution and accuracy. Examples include capacitive arrays, pin arrays, micromachined force sensor arrays, and optical measurement systems (i.e., CCD). Problems with these sensors include sensitivity to noise, delicacy, poor resolution, slow data acquisition and processing, difficulties in manufacturing, and high cost [11].

Intrinsic sensors use global measurements from which contact information can be extracted. Types of intrinsic sensors include force/strain sensors, optical sensors (i.e., PSD), and joint torque sensing. While some progress has been made, these sensors are often too large for dexterous robotic fingers, are expensive for multiple degrees of freedom, and provide limited information. For example, the optical waveguide tactile sensor designed by Maekawa, et al. gives only the centroid of all contact locations, which is affected by the intensity of the different contacts [26]. Intrinsic sensors also cannot tell the difference between one contact and multiple contacts.

6.2 Software

The mathematical complexity involved in spatial rolling and sliding manipulations has discouraged the use of 3-D multi-fingered hands. While some investigators do work with 3-D manipulators [36], others have chosen to limit dexterous manipulation to 2-D. Another approach is to design a simplified manipulator that uses planar fingertips for 3-D rolling [3, 8].

There is certainly a gap between theoretical promise and practical delivery due to the complexity of specifying and controlling automated grasping and manipulation tasks. There are several areas of dexterous manipulation in which better algorithms are required before significant improvement can be made. Currently, most grasp choice and optimization systems that use multiple fingers are quite slow and the calculations must be done off-line, particularly when contact locations must be determined. Another area for improvement is motion planning. Similar to grasp choice, the algorithms are slow and cannot be accomplished during the manipulation task. One final area is the use of tactile sensing in control. Understanding and using tactile sensor output for direct servoing has been the subject of some recent work [33, 27, 43]; however, improvements in tactile sensing and data interpretation are needed to accomplish this in less controlled conditions.

7 Discussion

At present, autonomous, real time dexterous manipulation in unknown environments still eludes us. In much of the current research, it appears that we have given up on anthropomorphic hands because of difficulties in hardware development and autonomous control. The recent trend has been to break the dexterous manipulation problem into small parts that can be studied separately with specialized hardware. In many cases the research is done in simulation rather than experimentally.

Although autonomous dexterous manipulation remains impractical outside of the laboratory, a promising interim solution is supervised manipulation. In this approach, a human provides the high-level grasp and manipulation planning, while the robot performs fine (dexterous) manipulations [32]. Another method is teaching by demonstration (gesture-based programming) [40]. The human may also perform the interpretation of tactile information in supervised remote exploration [32].

The miniaturization of manipulation is another area with promise. However, manipulations occurring on a very small scale are dominated by friction and van der Waals forces. Stable grasping is often not necessary; the objects will stick directly to the manipulator.

References

[1] Peter K. Allen and Paul Michelman, Acquisition and interpretation of 3-d sensor data from touch, IEEE Transactions on Robotics and Automation 6 (1990), no. 4, 397–404.

[2] Peter K. Allen, Andrew T. Miller, Paul Y. Oh, and Brian S. Leibowitz, Using tactile and visual sensing with a robotic hand, Proceedings of the IEEE International Conference on Robotics and Automation (1997), 676–681.

[3] Antonio Bicchi, Alessia Marigo, and Domenico Prattichizzo, Dexterity through rolling: Manipulation of unknown objects, Proceedings of the IEEE International Conference on Robotics and Automation (1999), 1583–1588.

[4] Martin Buss, Hideki Hashimoto, and John B. Moore, Dextrous hand grasping force optimization, IEEE Transactions on Robotics and Automation 12 (1996), no. 3, 406–418.

[5] Mark R. Cutkosky, Robotic grasping and fine manipulation, Kluwer Academic Publishers, 1985.

[6] Mark R. Cutkosky and Robert D. Howe, Human grasp choice and robotic grasp analysis, Dextrous Robot Hands (Subramanian T. Venkataraman and Thea Iberall, eds.), Springer-Verlag, 1990, pp. 5–31.

[7] Mark R. Cutkosky and Paul Wright, Friction, stability and the design of robot fingers, International Journal of Robotics Research 5 (1986), no. 4, 20–37.

[8] P. Datseris and W. Palm, Principles on the development of mechanical hands which can manipulate objects by means of active control, Journal of Mechanisms, Transmissions, and Automation in Design 107 (1985), no. 2, 148–156.

[9] B. Eberman and J. Kenneth Salisbury, Application of change detection to dynamic contact sensing, International Journal of Robotics Research 13 (1994), no. 5, 369–394.

[10] Neville Hogan, Impedance control: An approach to manipulation: Parts I, II, and III, ASME Journal of Dynamic Systems, Measurement, and Control 107 (1985), 1–24.

[11] Robert D. Howe and Mark R. Cutkosky, Touch sensing for robotic manipulation and recognition, The Robotics Review 2 (O. Khatib, J. Craig, and T. Lozano-Perez, eds.), MIT Press, 1992, pp. 55–112.

[12] ———, Practical force-motion models for sliding manipulation, International Journal of Robotics Research 15 (1996), no. 6, 557–572.

[13] James M. Hyde and Mark R. Cutkosky, Phase management framework for event-driven dextrous manipulation, IEEE Transactions on Robotics and Automation 14 (1998), no. 6, 978–985.

[14] James M. Hyde, Marc Tremblay, and Mark R. Cutkosky, An object-oriented framework for event-driven dexterous manipulation, Experimental Robotics IV, Springer-Verlag, 1995.


[15] Thea Iberall, A neural network for planning hand shapes in human prehension, Proceedings of the 1988 Automatic Controls Conference (1988), 2288–2293.

[16] John W. Jameson, Analytic techniques for automated grasp, Ph.D. thesis, Stanford University, Department of Mechanical Engineering, 1985.

[17] R. S. Johansson and G. Westling, Afferent signals during manipulative tasks in man, Information Processing in the Somatosensory System: Proceedings of an International Seminar at the Wenner-Gren Center (O. Franzen and J. Westman, eds.), Macmillan, 1991.

[18] Imin Kao and Mark R. Cutkosky, Quasistatic manipulation with compliance and sliding, International Journal of Robotics Research 11 (1992), no. 1, 20–40.

[19] Jeffrey Kerr and Bernard Roth, Analysis of multifingered hands, International Journal of Robotics Research 4 (1986), no. 4, 3–17.

[20] Oussama Khatib, Unified approach for motion and force control of robot manipulators: The operational space formulation, IEEE Journal of Robotics and Automation 3 (1987), no. 1, 43–53.

[21] Robert L. Klatzky and Susan Lederman, Intelligent exploration by the human hand, Dextrous Robot Hands (Subramanian T. Venkataraman and Thea Iberall, eds.), Springer-Verlag, 1990, pp. 66–81.

[22] Vijay Kumar and Antonio Bicchi, Review paper for contact and grasp symposium, Symposium Proceedings of the IEEE International Conference on Robotics and Automation (2000, in review).

[23] C. Laugier and J. Pertin, Automatic grasping: A case study in accessibility analysis, Advanced Software in Robotics (A. Danthine and M. Geradin, eds.), Elsevier Science Publishers, 1984, pp. 201–214.

[24] Susanna R. Leveroni, Grasp gaits for planar object manipulation, Ph.D. thesis, Massachusetts Institute of Technology, Department of Mechanical Engineering, 1997.

[25] Zexiang Li and Shankar S. Sastry, Issues in dextrous robot hands, Dextrous Robot Hands (Subramanian T. Venkataraman and Thea Iberall, eds.), Springer-Verlag, 1990, pp. 154–186.

[26] H. Maekawa, K. Tanie, and K. Komoriya, A finger-shaped tactile sensor using an optical waveguide, Proceedings of the IEEE International Conference on Systems, Man and Cybernetics (1993), 403–408.

[27] ———, Tactile sensor based manipulation of an unknown object by a multifingered hand with rolling contact, Proceedings of the IEEE International Conference on Robotics and Automation (1995), 743–750.

[28] Matthew T. Mason and J. Kenneth Salisbury, Robot hands and the mechanics of manipulation, The MIT Press, 1985.

[29] D. J. Montana, The kinematics of contact and grasp, International Journal of Robotics Research 7 (1988), no. 3, 17–32.

[30] R. Murray, Zexiang Li, and Shankar Sastry, A mathematical introduction to robotic manipulation, CRC Press, 1994.

[31] Van-Duc Nguyen, Constructing force closure grasps, International Journal of Robotics Research 7 (1988), no. 3, 3–16.

[32] Allison M. Okamura, Michael A. Costa, Michael L. Turner, Christopher Richard, and Mark R. Cutkosky, Haptic exploration of surfaces, Experimental Robotics VI (Peter Corke and James Trevelyan, eds.), Lecture Notes in Control and Information Sciences, vol. 250, Springer-Verlag, 2000, ISBN: 1 85233 210 7, pp. ??–??.

[33] Allison M. Okamura and Mark R. Cutkosky, Haptic exploration of fine surface features, Proceedings of the IEEE International Conference on Robotics and Automation 3 (1999), 2930–2936.

[34] Allison M. Okamura, Michael L. Turner, and Mark R. Cutkosky, Haptic exploration of objects with rolling and sliding, Proceedings of the IEEE International Conference on Robotics and Automation 3 (1997), 2485–2490.

[35] Daniela Rus, Analysis of musical-instrument tones, International Journal of Robotics Research 18 (1999), no. 4, ??–??.

[36] N. Sarkar, X. Yun, and V. Kumar, Dynamic control of 3-d rolling contacts in two-arm manipulation, IEEE Transactions on Robotics and Automation 13 (1997), no. 3, 364–376.

[37] Sharon Stansfield, A robotic perceptual system utilizing passive vision and active touch, International Journal of Robotics Research 7 (1988), no. 6, 138–161.

[38] Marc R. Tremblay, Using multiple sensors and contextual information to detect events during a manipulation task, Ph.D. thesis, Stanford University, Department of Mechanical Engineering, 1995.


[39] Francisco Valero-Cuevas, Clinical applications of dexterous manipulation and grasp, Symposium Proceedings of the IEEE International Conference on Robotics and Automation (2000, in review).

[40] Richard M. Voyles and Pradeep K. Khosla, Gesture-based programming: A preliminary demonstration, Proceedings of the IEEE International Conference on Robotics and Automation (1999), 708–713.

[41] David Williams and Oussama Khatib, The virtual linkage: A model for internal forces in multi-grasp manipulation, Proceedings of the IEEE International Conference on Robotics and Automation 1 (1993), 1025–1030.

[42] Tsuneo Yoshikawa and Kiyoshi Nagai, Analysis of multi-fingered grasping and manipulation, Dextrous Robot Hands (Subramanian T. Venkataraman and Thea Iberall, eds.), Springer-Verlag, 1990, pp. 5–31.

[43] H. Zhang, H. Maekawa, and K. Tanie, Sensitivity analysis and experiments of curvature estimation based on rolling contact, Proceedings of the IEEE International Conference on Robotics and Automation (1996), 3514–3519.
