This post is about connecting ROS2 robots to the cloud with a quick and simple Greengrass setup. I'll guide you through building a sample component that can be hosted on a server connected to your robot fleet, which will connect the robots to AWS IoT Core without modifying the robot software in any way. Setup requires installing only a few prerequisites. Let's see how it works!
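To give a flavour of the approach, here's a minimal sketch of the bridging idea in Python: a ROS2 node running on the Greengrass host subscribes to a robot topic and republishes each message to AWS IoT Core over Greengrass IPC. The topic names and message type here are hypothetical - the real component in the post handles configuration and packaging.

```python
# Minimal sketch: relay ROS2 messages to AWS IoT Core via Greengrass IPC.
# Topic names and message type are illustrative, not the real component's.
import json

import rclpy
from rclpy.node import Node
from std_msgs.msg import String

from awsiot.greengrasscoreipc.clientv2 import GreengrassCoreIPCClientV2
from awsiot.greengrasscoreipc.model import QOS


class RosToIotBridge(Node):
    def __init__(self):
        super().__init__('ros_to_iot_bridge')
        # The IPC client is available because this runs as a Greengrass component
        self.ipc = GreengrassCoreIPCClientV2()
        self.create_subscription(String, 'robot_status', self.relay, 10)

    def relay(self, msg):
        # Forward each ROS2 message to an IoT Core MQTT topic
        self.ipc.publish_to_iot_core(
            topic_name='robots/status',
            qos=QOS.AT_LEAST_ONCE,
            payload=json.dumps({'data': msg.data}).encode())


def main():
    rclpy.init()
    rclpy.spin(RosToIotBridge())


if __name__ == '__main__':
    main()
```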
This post is also available in video form, if you want to follow along:
Simulators such as Gazebo, NVIDIA Isaac Sim, and O3DE are incredibly helpful tools for robotics development. Robots are slow and expensive to develop, in large part because testing on real hardware means slow, laborious test setup and execution. With simulation, we can run a robot through exactly the same scenario every time to see how it behaves, letting us modify the robot software and test again far more quickly.
There's one issue - the more realistic the simulation, the more powerful the computer needed to run it. That's where the cloud comes in; instead of buying the hardware outright and running the computer in-house, you can run simulations on demand in the cloud, and only pay for the server while you're using it!
If you'd like to follow along, this post is also available in video form using the link below. In the post, I'll show you how to run a multi-robot sample simulation with O3DE in the cloud, on an on-demand EC2 instance with graphics hardware attached.
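As a rough preview of the cloud side, here's a minimal boto3 sketch of launching an on-demand GPU instance. The AMI ID, key pair, and instance type are placeholders; the post covers the real setup in detail.

```python
# Minimal sketch: launch an on-demand EC2 instance with a GPU for simulation.
# AMI ID, key name, and instance type below are placeholders.
import boto3

ec2 = boto3.client('ec2', region_name='us-east-1')

response = ec2.run_instances(
    ImageId='ami-0123456789abcdef0',  # placeholder: an AMI with O3DE installed
    InstanceType='g4dn.xlarge',       # NVIDIA GPU instance for rendering
    KeyName='my-key-pair',            # placeholder key pair for SSH access
    MinCount=1,
    MaxCount=1,
)
print('Launched:', response['Instances'][0]['InstanceId'])
```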
This post is about how to build an AWS Step Functions state machine and how you can use it to interact with IoT edge devices. In this case, we are sending a smoothie order to a "robot" and waiting for it to make that smoothie.
The state machine works by chaining together a series of Lambda functions and defining how data should be passed between them (if you're not sure about Lambda functions, take a look at this blog post!). There's also a step where the state machine needs to wait for the smoothie to be made, which is slightly more complicated - we'll cover that later in this post.
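To sketch how that kind of wait can work: Step Functions supports a task-token callback pattern, where the state machine pauses until something calls back with the token it was handed. Below is a minimal Lambda handler, in Python with boto3, that completes such a wait - the event shape and names are illustrative assumptions, not the post's actual code.

```python
# Minimal sketch: complete a Step Functions task-token wait from a Lambda.
# One way to implement the "wait for the smoothie" step; the event shape
# and field names here are illustrative assumptions.
import json

import boto3

sfn = boto3.client('stepfunctions')


def handler(event, context):
    # Hypothetical event: the robot reports back with the task token it
    # was given when the order was sent.
    task_token = event['taskToken']
    smoothie = event['smoothie']

    # Tell the state machine the smoothie is done so it can move on.
    sfn.send_task_success(
        taskToken=task_token,
        output=json.dumps({'smoothie': smoothie, 'status': 'ready'}),
    )
    return {'ok': True}
```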
This post is also available in video form - check the video link below if you want to follow along!
This is the second part of the "ROS2 Control with the JetBot" series, where I show you how to get a JetBot working with ROS2 Control! It follows on from the part 1 blog post, where I showed how to drive the JetBot's motors using I2C and PWM with code written in C++.
In this post, I show the next step in making ROS2 Control work with the WaveShare JetBot - wrapping the motor control code in a System. I'll walk through some concepts, take a look at the example repository for ROS2 Control implementations, and then implement the System for the JetBot and see it running.
This post is also available in video form - check the video link below if you want to follow along!
This post shows how to build two simple functions, running in the cloud, using AWS Lambda. Both functions have the same purpose: to update the status of a robot, identified by name, in a database - allowing us to view the current statuses or build tools on top of them. This is one way we could coordinate robots in one or more fleets - using the cloud to store the state and run the logic that coordinates those robots.
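As a rough sketch of what one of these functions might look like, here's a minimal Lambda handler that upserts a robot's status into a DynamoDB table. The table name, attribute names, and event shape are assumptions for illustration; the post's functions may differ.

```python
# Minimal sketch: a Lambda handler that upserts a robot's status.
# Table name, attribute names, and event shape are assumptions.
import boto3

table = boto3.resource('dynamodb').Table('RobotStatuses')


def handler(event, context):
    # Hypothetical input: {"name": "robot-1", "status": "CHARGING"}
    table.update_item(
        Key={'name': event['name']},
        UpdateExpression='SET #s = :status',
        ExpressionAttributeNames={'#s': 'status'},  # 'status' is reserved in DynamoDB
        ExpressionAttributeValues={':status': event['status']},
    )
    return {'updated': event['name']}
```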
This post is also available in video form - check the video link below if you want to follow along!
Welcome to a new series - setting up the JetBot to work with ROS2 Control interfaces! Previously, I showed how to set up the JetBot to work from ROS commands, but that was a very basic motor control method. It didn't need to be advanced because a human was remote-controlling it. However, if we want autonomous control, we need to be able to travel a specific distance or follow a defined path, like a spline. A better way to move a robot with ROS is through the ROS Control interfaces; done right, this means your robot can autonomously follow a path sent by the ROS navigation stack. That's our goal for this series: move the JetBot using RViz!
The first step towards this goal is giving ourselves the ability to control the motors using C++, because implementing controllers for ROS Control requires extending C++ classes. Unfortunately, the existing drivers are in Python, meaning we'll need to rewrite them in C++ - which is a good opportunity to learn how the serial control works. We use I2C to talk to the motor controller chip, an Adafruit DC Motor + Stepper FeatherWing, which sets the PWM duty cycle that makes the motors move. I'll refer to this chip as the FeatherWing for the rest of this article.
First, we'll look at how I2C works in general. Strictly speaking we don't need this background, but understanding the serial communication makes the function calls in the code much easier to follow.

Once we've seen how I2C works, we'll look at the commands sent to set up and control the motors. This will show us how to translate ROS commands into something the motors can act on.
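To make that concrete, here's a minimal sketch of the kind of I2C conversation involved, written in Python with smbus2 for brevity (the post itself does this in C++). The address and register values are the PCA9685 defaults, used here as illustrative assumptions.

```python
# Minimal sketch of an I2C conversation with the FeatherWing's PWM chip,
# in Python with smbus2 for brevity (the post implements this in C++).
# Address and register values are PCA9685 defaults - assumptions here.
from smbus2 import SMBus

ADDRESS = 0x60      # default I2C address of the FeatherWing
MODE1 = 0x00        # PCA9685 mode register
LED0_ON_L = 0x06    # first register of PWM channel 0

with SMBus(1) as bus:                          # I2C bus 1 on the Jetson Nano
    bus.write_byte_data(ADDRESS, MODE1, 0x00)  # wake the chip from sleep

    # Set channel 0 to ~50% duty cycle: counts run 0-4095, so turn the
    # signal on at count 0 and off at count 2048. Each channel has four
    # byte registers: ON low/high, then OFF low/high.
    on, off = 0, 2048
    bus.write_i2c_block_data(
        ADDRESS, LED0_ON_L,
        [on & 0xFF, on >> 8, off & 0xFF, off >> 8])
```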
The stage after this will be covered in another article in this series, so stay tuned!
This post is also available in video form - check the video link below if you want to follow along!
The Docker Compose files in this post can be used to run public Docker images, or to pull private images from Amazon ECR. This means that you can deploy your own system of microservices to any platform compatible with AWS Greengrass.
This post is also available in video form - check the video link below if you want to follow along!
This post shows how to build a Robot Operating System 2 node using Rust, a systems programming language built for safety, security, and performance. In the post, I'll tell you about Rust - the programming language, not the video game! I'll tell you why I think it's useful in general, then specifically in robotics, and finally show you how to run a ROS2 node written entirely in Rust that will send messages to AWS IoT Core.
This post is also available in video form - check the video link below if you want to follow along!