Despite the increasing number of collaborative robots in human-centered manufacturing, industrial robots are still largely preprogrammed, with few autonomous features. In this context, it is paramount that robot planning and motion-generation strategies can account for changes in the production line in a timely and easy-to-implement fashion. The same requirements hold for service robotics in unstructured environments, where an explicit definition of a task and its underlying path and constraints is often hard to characterize. To this end, this paper presents a real-time point-to-point kinematic task-space planner based on screw interpolation that implicitly follows the geometric constraints underlying a user demonstration. We demonstrate through example scenarios that our approach captures the implicit task constraints in a single user demonstration. Notably, the proposed planner does not learn a trajectory or attempt to imitate a human trajectory; rather, it extracts the geometric features of a one-time guidance and enforces those features as constraints in a generalized path generator. The framework thus generalizes over initial and final configurations, accommodates path disturbances, and is agnostic to the robot being used. We evaluate our approach on the 7-DOF Baxter robot across a multitude of common tasks and show the generalization ability of our method under different conditions.
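As background for the screw-interpolation idea the abstract refers to, the sketch below shows screw linear interpolation (ScLERP) between two end-effector poses, implemented via the SE(3) exponential and logarithm on homogeneous transforms. This is only a minimal illustration of the interpolation primitive; the paper's actual planner, its constraint extraction from demonstrations, and its real-time machinery are not reproduced here, and the function name `sclerp` is our own.

```python
import numpy as np
from scipy.linalg import expm, logm

def sclerp(T0, T1, t):
    """Screw linear interpolation between homogeneous transforms T0 and T1.

    Follows the constant-screw path T(t) = T0 @ expm(t * log(T0^{-1} T1)),
    i.e., the motion along a single screw axis connecting the two poses.
    """
    rel = np.linalg.solve(T0, T1)   # relative transform T0^{-1} @ T1
    xi = logm(rel).real             # se(3) twist as a 4x4 matrix
    return T0 @ expm(t * xi)

# Illustrative goal pose: rotate 90 degrees about z while translating.
T0 = np.eye(4)
T1 = np.eye(4)
theta = np.pi / 2
T1[:3, :3] = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0,            0.0,           1.0]])
T1[:3, 3] = [0.2, 0.0, 0.1]

T_mid = sclerp(T0, T1, 0.5)  # midpoint pose along the screw
```

Because the pose varies along a single screw, intermediate poses couple rotation and translation consistently, which is what lets a screw-based planner preserve geometric features of a demonstrated motion rather than interpolating position and orientation independently.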
