surfertas/amr_core

ROS packages related to an autonomous mobile robot (AMR) powered by a Raspberry Pi

Related Posts:

  1. Autonomous Mobile Robot #1: Data collection to a trained model
  2. Autonomous Mobile Robot #2: Inference as a ROS service
  3. Autonomous Mobile Robot #3: Pairing with a PS3 Controller for teleop
  4. Autonomous Mobile Robot #4: Using GCP Storage

Note: This README is incomplete; requirements will be detailed over time.

Setup

  1. Clone the amr_core repository.
$ git clone https://github.com/surfertas/amr_core.git
  2. Install the joy package (used for teleoperation) into amr_worker.
$ sudo apt-get install ros-indigo-joy
  3. Install the video_stream_opencv package into amr_worker.
$ git clone https://github.com/ros-drivers/video_stream_opencv.git
  4. Configure data_storage.yaml found here. The setup requires an external SSD connected to the Raspi via the specified device driver (see the mount sketch below).
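
A minimal sketch of preparing the external SSD, assuming the drive shows up as /dev/sda1 and is mounted at /mnt/amr_data; both names are placeholders and should match whatever device and path data_storage.yaml refers to.

$ lsblk                          # find the SSD's device name (assumed /dev/sda1 below)
$ sudo mkdir -p /mnt/amr_data    # example mount point, not prescribed by the repo
$ sudo mount /dev/sda1 /mnt/amr_data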

Launch files are found here.

Training

On the Raspi:

$ roslaunch amr_bringup amr_teleop_bringup.launch
  • Teleop assumes a PS3 DualShock 3 controller.
  • Bluetooth setup instructions for the PS3 controller. Pairing needs to be done once amr_teleop_bringup.launch has been launched in order to control the robot wirelessly.

At this point, your robot should be subscribing to topics related to images and commands (throttle and steer). Confirm that these topics are being published by running rostopic list on the command line.
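
A minimal check using standard ROS command-line tools; the topic names below are assumptions for illustration, so match them against the actual output of rostopic list.

$ rostopic list                      # all topics currently advertised
$ rostopic hz /camera/image_raw      # publish rate of the image topic (name is an assumption)
$ rostopic echo -n 1 /cmd_vel        # one sample of the command topic (name is an assumption)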

In the data_storage.yaml configuration file, you can set the frequency at which the system saves the features (images) and labels (commands) to disk. Note that you also need to specify where you want to store the data.
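
A minimal sketch of what such a configuration might contain; the key names below are hypothetical and should be checked against the actual data_storage.yaml in the repository.

# data_storage.yaml (hypothetical keys, for illustration only)
save_frequency: 10              # how often features/labels are written to disk
storage_path: /mnt/amr_data     # where the data is written (external SSD mount)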

Once data has been collected, you can retrieve the pickle file and use the training repository found in amr_models to train the appropriate model.
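
One way to retrieve the collected data, assuming the Raspi is reachable over SSH; the hostname, paths, and file name are placeholders.

$ scp pi@raspi.local:/mnt/amr_data/amr_data.pickle ./data/   # paths and file name are assumptions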

Using the trained model

Once the model has been trained, place the saved model file in the models folder of the appropriate package in amr_master (e.g. place the controller model in /models of amr_nn_controller_service). Make sure the path to the model file is updated in ./config.
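
For example, assuming the trained weights were saved as model.pkl; the file name and exact directory layout are assumptions.

$ cp model.pkl amr_master/amr_nn_controller_service/models/   # hypothetical file name and layout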

On the Jetson TX, initiate the service by:

$ roslaunch amr_nn_controller_service.launch

On the Raspi, launch the neural-network-driven controller by:

$ roslaunch amr_bringup amr_nn_bringup.launch

The intended setup is that amr_worker runs on an edge machine (e.g. a Raspberry Pi) while amr_master runs on a more powerful master resource (e.g. a Jetson TX).
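
For topics to flow between the two machines, both need standard ROS network configuration. A minimal sketch, assuming the Jetson TX runs the ROS master at 192.168.1.10 and the Raspi sits at 192.168.1.11 (both addresses are placeholders):

On the Jetson TX (running roscore):
$ export ROS_MASTER_URI=http://192.168.1.10:11311   # 11311 is the default ROS master port
$ export ROS_IP=192.168.1.10

On the Raspi:
$ export ROS_MASTER_URI=http://192.168.1.10:11311   # point at the Jetson's master
$ export ROS_IP=192.168.1.11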

License

MIT
