ABSTRACT
In this work we propose a method to extract visual features from a tool held in a robot's hand and to derive from them basic properties of how to handle the tool correctly. We show how a robot can improve its accuracy in certain tasks through visual exploration of geometric features. We also present methods to extend the proprioception of the robot's arm to the new end-effector formed by the gripper and the tool. By combining 3D and 2D data, we extract features such as geometric edges, flat surfaces, and concavities. From these features we distinguish several classes of objects and make basic measurements of potential contact areas and other properties relevant for performing tasks. We further present a controller that uses the relative position or orientation of such features as constraints for manipulation tasks in the world. This controller makes it easy to model complex tasks such as pancake flipping or sausage fishing. The extension of proprioception is achieved by a generalized filter setup for a set of force-torque sensors, which allows the detection of indirect contacts made through a tool and extracts basic information, such as the approximate contact direction, from the sensor data.