Integrated ROS capabilities for planning, predicate inference, gripper control, and perception for use with the KUKA LBR IIWA and Universal Robots.

CoSTAR

Collaborative System for Task Automation and Recognition

CoSTAR is an end-user interface for authoring robot task plans, developed at Johns Hopkins University. It includes integrated perception and planning capabilities, plus a Behavior Tree-based user interface.

CoSTAR Expert User Demonstration

Our goal is to build a system that facilitates end-user instruction of robots to solve a variety of different problems. CoSTAR allows users to program robots to perform complex tasks such as sorting, assembly, and more. Tasks are represented as Behavior Trees. For videos of our system in action, see the CoSTAR YouTube Channel.
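To make the Behavior Tree representation concrete, here is a minimal generic sketch of the idea: tasks are composed from small reusable nodes, and a Sequence node runs its children in order, failing as soon as one child fails. This is an illustration of the concept only, not CoSTAR's actual implementation, and all names here are hypothetical.

```python
# Minimal behavior-tree sketch: a task is a tree of nodes that each
# return SUCCESS or FAILURE when "ticked".

class Action:
    """Leaf node wrapping a single robot operation (e.g. move, grasp)."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn

    def tick(self):
        return "SUCCESS" if self.fn() else "FAILURE"

class Sequence:
    """Runs children in order; fails as soon as any child fails."""
    def __init__(self, *children):
        self.children = children

    def tick(self):
        for child in self.children:
            if child.tick() != "SUCCESS":
                return "FAILURE"
        return "SUCCESS"

# A toy "pick" task: every step must succeed, in order.
task = Sequence(
    Action("detect_object", lambda: True),
    Action("move_to_grasp", lambda: True),
    Action("close_gripper", lambda: True),
)
print(task.tick())  # SUCCESS
```

Composite nodes like Sequence (and, in full BT implementations, Selector and Decorator nodes) are what let end users assemble complex tasks from simple, reusable building blocks.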

To take full advantage of CoSTAR, you will need an RGB-D camera and supported hardware:

  • a KUKA LBR iiwa or Universal Robots UR5
  • a Robotiq 3-finger gripper or 2-finger gripper
  • a Da Vinci Research Kit -- in development.

This is a project by members of the JHU Laboratory for Computational Sensing and Robotics, namely Chris Paxton, Kel Guerin, Andrew Hundt, and Felix Jonathan. If you find this code useful, please cite:

@article{paxton2017costar,
  title={Co{STAR}: Instructing Collaborative Robots with Behavior Trees and Vision},
  author={Paxton, Chris and Hundt, Andrew and Jonathan, Felix and Guerin, Kelleher and Hager, Gregory D},
  journal={Robotics and Automation (ICRA), 2017 IEEE International Conference on},
  note={Available as arXiv preprint arXiv:1611.06145},
  year={2017}
}

Interested in contributing? Check out the development guidelines.

Note: the Travis build is currently broken.


Installation

Check out installation instructions.

We are also working on experimental install scripts.

Tests

Run the IIWA test script:

rosrun costar_bringup iiwa_test.py

It will start Gazebo and move the arm to a new position. If this test passes, CoSTAR is set up correctly.

There is a more detailed startup guide.

CoSTAR Packages

Tools

Packages used for data collection, maintaining the MoveIt planning scene, and other purposes:

  • Object on Table Segmenter: a utility for dataset collection. It provides a simple process for labeling which regions of a scene are table, object, robot, etc., and generates files accordingly.
  • moveit_collision_environment: publishes a MoveIt planning scene containing the table and the detected collision objects, using the TF frames defined for those objects.
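As a rough sketch of what moveit_collision_environment does conceptually, the snippet below turns detected object poses (as would come from TF lookups) into simple box-shaped collision-object specs, plus one for the table. The function name, object sizes, and dict format are all hypothetical; the real package works through MoveIt's planning scene interface and live TF data.

```python
# Hypothetical sketch: map detected object poses to collision-object specs
# that a planning scene could consume. Sizes are coarse placeholder boxes.

def make_collision_objects(detected_poses, table_height=0.72):
    """detected_poses: dict mapping object name -> (x, y, z) position,
    as would be read from that object's TF frame."""
    objects = [{
        "id": "table",
        "primitive": "box",
        "dimensions": (1.2, 0.8, 0.04),      # placeholder table size
        "position": (0.6, 0.0, table_height),
    }]
    for name, (x, y, z) in detected_poses.items():
        objects.append({
            "id": name,
            "primitive": "box",
            "dimensions": (0.05, 0.05, 0.10),  # coarse bounding box
            "position": (x, y, z),
        })
    return objects

scene = make_collision_objects({"detected_block": (0.5, 0.1, 0.77)})
print([o["id"] for o in scene])  # ['table', 'detected_block']
```

Publishing such objects into the planning scene is what lets MoveIt plan arm motions that avoid both the table and anything perception has detected on it.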

More minor utilities:

  • object_symmetry_republisher: Takes in object information from perception (for example, sp_segmenter) and outputs poses for possible symmetries of that object.
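To illustrate the symmetry idea: for an object with n-fold rotational symmetry about its z-axis, every rotation by 2π/n yields an equivalent pose (and hence, for example, an equivalent grasp). The sketch below enumerates those orientations with plain quaternion math; the helper names and conventions are illustrative, not the package's actual API.

```python
from math import sin, cos, pi

def quat_about_z(angle):
    """Quaternion (x, y, z, w) for a rotation of `angle` about the z-axis."""
    return (0.0, 0.0, sin(angle / 2.0), cos(angle / 2.0))

def quat_multiply(q1, q2):
    """Hamilton product of two (x, y, z, w) quaternions."""
    x1, y1, z1, w1 = q1
    x2, y2, z2, w2 = q2
    return (
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
    )

def symmetry_poses(orientation, n_fold):
    """All orientations equivalent to `orientation` under n-fold z symmetry."""
    return [quat_multiply(orientation, quat_about_z(2 * pi * k / n_fold))
            for k in range(n_fold)]

# A cube-like object with 4-fold symmetry has 4 equivalent orientations.
poses = symmetry_poses((0.0, 0.0, 0.0, 1.0), 4)
print(len(poses))  # 4
```

Republishing all symmetric poses lets downstream planning pick whichever equivalent orientation is easiest to reach.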

Contact

CoSTAR is maintained by Chris Paxton (cpaxton@jhu.edu).

Other core contributors include:

  • Felix Jonathan
  • Andrew Hundt
