Source
@inproceedings{ichnowski_2023_fogros2,
author={Ichnowski, Jeffrey and Chen, Kaiyuan and Dharmarajan, Karthik and Adebola, Simeon and Danielczuk, Michael and Mayoral-Vilches, Víctor and Jha, Nikhil and Zhan, Hugo and Llontop, Edith and Xu, Derek and Buscaron, Camilo and Kubiatowicz, John and Stoica, Ion and Gonzalez, Joseph and Goldberg, Ken},
booktitle={IEEE International Conference on Robotics and Automation (ICRA)},
title={{FogROS2}: An Adaptive Platform for Cloud and Fog Robotics Using {ROS} 2},
year={2023},
pages={5493-5500},
doi={10.1109/ICRA48891.2023.10161307}
}
(UC Berkeley) | IEEE | arXiv
TL;DR
ROS 2 + Cloud computing

Flash Reading
- Abstract: Cloud and fog computing for robotics with ROS 2 (successor of FogROS1 [1]).
- Introduction: FogROS1 has limitations in latency, usability, and automation, which FogROS2 aims to address. FogROS2 is also integrated with Foxglove for remote visualization and monitoring. FogROS1 needs ~4 min of startup time and ~5 s of image round-trip time; FogROS2 lowers these latencies (by 63% for startup and 97% for image round-trip, i.e., to roughly 1.5 min and 0.15 s) by using application-specific cloud computing images, a consistent cloud-machine environment via a Kubernetes backend, switching the secured networking from TCP to UDP, and adding H.264 video compression. Three applications are tested: visual SLAM, grasp planning, and motion planning.
- Related Work: FogROS1 [1] and cloud robotics [2]. Compared to previous work (RoboEarth, Rapyuta, AWS Greengrass, etc.), FogROS2 pushes ROS 2 nodes from the robot to the cloud, so ROS 2 users can access cloud resources without learning additional tools. FogROS2 uses VPNs to connect robots to the cloud.
- Background on ROS: A core improvement in ROS 2 is replacing ROS 1's custom publish-subscribe transport with DDS (Data Distribution Service) [3], an open standard for distributed systems (see the minimal publisher sketch after this list).
- Approach: The frontend is a launch system that specifies which nodes to launch and where; it provides a command-line interface (CLI) integrated with ROS 2. The new launch system enables custom logic, for example selecting the nearest cloud machine based on the robot's location, or choosing the right image and machine type for the application. FogROS2 automates the process of creating images and supports Kubernetes for managing cloud machines. At launch time, FogROS2 can set up transparent streaming compression (middleware/system-level compression) between robot and cloud via H.264. Finally, it supports Foxglove for remote visualization and monitoring, taking advantage of the cloud. A demo launch script is given under Extensions below.
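To recall what the DDS-backed ROS 2 API looks like, below is a minimal publisher node using the standard rclpy API (not FogROS2-specific); the node name, topic, and message type are arbitrary choices for illustration. Because the transport is DDS, the same code runs unchanged whether its subscribers live on the robot or, as with FogROS2, in the cloud.

import rclpy
from rclpy.node import Node
from std_msgs.msg import String

class Talker(Node):
    def __init__(self):
        super().__init__("talker")
        # Publish on the "chatter" topic; DDS handles discovery and transport.
        self.pub = self.create_publisher(String, "chatter", 10)
        self.timer = self.create_timer(1.0, self.tick)

    def tick(self):
        msg = String()
        msg.data = "hello over DDS"
        self.pub.publish(msg)

def main():
    rclpy.init()
    rclpy.spin(Talker())
    rclpy.shutdown()

if __name__ == "__main__":
    main()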
References
- [1] FogROS: An adaptive framework for automating fog robotics deployment, CASE 2021. Online.
- [2] A Survey of Research on Cloud Robotics and Automation, T-ASE 2015. Online.
- [3] Data Distribution Service 1.4, Object Management Group (OMG) 2015. Online.
Extensions
Demo code for the frontend launch script:
# Import paths below are an assumption about the fogros2 package layout.
from fogros2 import AWSCloudInstance, CloudNode, FogROSLaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    ld = FogROSLaunchDescription()
    ### Cloud machine
    machine1 = AWSCloudInstance(
        region="us-west-1",  # or find_nearest_aws_region()
        ec2_instance_type="g4dn.xlarge")
    ### Grasp motion on the robot
    grasp_motion_node = Node(
        package="fogros2_examples",
        executable="grasp_motion",
        output="screen")
    ### Grasp planning on the cloud
    grasp_planning_node = CloudNode(
        package="fogros2_examples",
        executable="grasp_planner",
        output="screen",
        machine=machine1)
    ld.add_action(grasp_motion_node)
    ld.add_action(grasp_planning_node)
    return ld
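The find_nearest_aws_region() helper referenced in the comment above is not spelled out in this summary, so here is a minimal hypothetical sketch: it probes a fixed list of candidate AWS regions and returns the one whose EC2 endpoint completes a TCP connection fastest. The region list, endpoint hostnames, and function name are illustrative assumptions, not part of the FogROS2 API.

import socket
import time

# Candidate regions to probe; extend as needed. (Assumed list, for illustration.)
CANDIDATE_REGIONS = ["us-west-1", "us-west-2", "us-east-1"]

def find_nearest_aws_region(port=443, timeout=2.0):
    """Return the candidate region whose EC2 endpoint answers a TCP connect fastest."""
    best_region, best_latency = CANDIDATE_REGIONS[0], float("inf")
    for region in CANDIDATE_REGIONS:
        host = f"ec2.{region}.amazonaws.com"
        start = time.monotonic()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                latency = time.monotonic() - start
        except OSError:
            continue  # skip regions that are unreachable
        if latency < best_latency:
            best_region, best_latency = region, latency
    return best_region

In a launch file like the one above, the result would simply replace the hard-coded region string passed to AWSCloudInstance.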