Recently we did a two-week visit, as part of the Horizon Europe CSA project Strengthening Research and Innovation Excellence in Autonomous Aerial Systems (AeroSTREAM), to the University of New Mexico's Agile Manufacturing Lab, led by professor Rafael Fierro. Apart from meeting new people and seeing the charms of New Mexico, we had a unique opportunity to work with multiple industrial-grade robotic arms and state-of-the-art manufacturing equipment.
The purpose of our visit was to use the robotic arms in a novel and innovative way. Despite the time constraint of just two weeks, we knew the perfect use case – on-orbit inspection and repair, specifically completing the deployment of partially unfolded solar panels.

The plan was easy to follow. First, the arm moves to the inspection position, where we grab an image and detect the satellite using a machine learning model. Second, we estimate the angle of the solar panel relative to the satellite body; if the panel isn't fully opened, we move to the repair step. The repair step consists of moving the panel to its fully open position, then returning to the inspection position and repeating the process if the panel is still not successfully deployed.

With the help of the awesome team working in the lab, we managed to control the WAM Arm from Barrett Technology in no time. Unfortunately, there was an issue with the WAM's power supply, so we quickly swapped over to the UR5e. For those unfamiliar, the UR5e is one of the most popular robotic arms out there, and this was a great opportunity to learn its limitations and how we might use it in some of our future projects.
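The inspect-then-repair loop above can be sketched in a few lines. This is a minimal illustration, not the actual lab code: the function names, the fully-open angle, the tolerance, and the attempt limit are all assumptions made for the sketch.

```python
FULLY_OPEN_DEG = 180.0   # assumed angle of a fully deployed panel
TOLERANCE_DEG = 2.0      # assumed acceptance tolerance
MAX_ATTEMPTS = 5         # assumed cap on repair attempts

def inspect_and_repair(arm, estimate_panel_angle):
    """Repeat inspect -> estimate -> repair until the panel is fully open."""
    for _ in range(MAX_ATTEMPTS):
        arm.move_to_inspection_pose()                 # step 1: inspection position
        angle = estimate_panel_angle(arm.capture())   # steps 2-3: image + estimate
        if abs(angle - FULLY_OPEN_DEG) <= TOLERANCE_DEG:
            return True                               # panel deployed, done
        arm.push_panel_open(angle)                    # step 4: nudge the panel
        arm.retract()                                 # back off before re-checking
    return False                                      # give up after MAX_ATTEMPTS
```

The key property is that the loop always re-inspects after a push, so a partially successful repair simply triggers another, smaller correction.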
The first problem was that the wrist-mounted camera had no ROS support and no documentation to follow. So we developed a small script that let us pull images from the camera at a reasonable frame rate, the only restriction being the time the camera takes to focus once stable, which can be up to 1.5 s. With the script finished, we captured our first dataset and got to training the model.
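The shape of that capture loop looks roughly like the sketch below. The frame source and the stability check are injected as callables because the real camera interface was undocumented; the polling interval and timeout here are illustrative assumptions, while the 1.5 s settle time comes from what we observed.

```python
import time

FOCUS_SETTLE_S = 1.5  # worst-case autofocus time once the arm stops moving

def capture_when_stable(read_frame, is_arm_stable, timeout_s=10.0):
    """Wait until the arm is stable, let autofocus settle, then grab one frame.

    `read_frame` and `is_arm_stable` are injected callables, so the same loop
    works with any camera backend.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if is_arm_stable():
            time.sleep(FOCUS_SETTLE_S)   # give the lens time to refocus
            return read_frame()
        time.sleep(0.05)                 # poll the arm state at ~20 Hz
    raise TimeoutError("arm never stabilized before timeout")
```

The focus delay is the dominant cost per capture, which is why it, rather than the raw frame rate, set the pace of dataset collection.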
The image above shows results from our validation batch. Our final dataset had around 300 images, of which more than 70% belonged to the "bad" classes. We also had to annotate some images by hand before the model was good enough to do that for us automatically.

The easy part was done; now we had to control the arm itself. Fortunately, everything could be controlled using ROS, so at least part of the learning curve was familiar. The position of the arm relative to the object was predefined, which simplified our use case a lot, but there was still the problem of moving the satellite wing just enough, without overdoing it and possibly breaking the satellite. The first step, as mentioned before, was to move the arm into a position where it could capture and process a picture of the satellite. A pipeline then estimated the solar panel's angle relative to the satellite body and moved the arm to a predefined position corresponding to that angle. The arm moved just enough to fix the solar panel, then retracted a bit so as not to damage the satellite.
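The geometry behind "move just enough, then retract" can be illustrated with a simple 2-D hinge model. Everything here is a made-up sketch: the hinge position, panel length, and retract distance are assumed numbers, not the lab's calibrated values.

```python
import math

HINGE = (0.40, 0.00)     # assumed hinge position in the arm's base frame [m]
PANEL_LEN = 0.25         # assumed panel length [m]
FULLY_OPEN_DEG = 180.0   # panel angle when fully deployed
RETRACT_M = 0.03         # assumed back-off distance after the push [m]

def panel_tip(angle_deg):
    """Position of the panel tip for a given hinge angle (2-D sketch)."""
    a = math.radians(angle_deg)
    return (HINGE[0] + PANEL_LEN * math.cos(a),
            HINGE[1] + PANEL_LEN * math.sin(a))

def repair_waypoints(estimated_deg):
    """Waypoints: contact at the current tip, push to fully open, retract."""
    contact = panel_tip(estimated_deg)
    target = panel_tip(FULLY_OPEN_DEG)
    # Retract backwards along the push direction so the gripper clears the panel.
    dx, dy = target[0] - contact[0], target[1] - contact[1]
    norm = math.hypot(dx, dy) or 1.0
    retract = (target[0] - RETRACT_M * dx / norm,
               target[1] - RETRACT_M * dy / norm)
    return [contact, target, retract]
```

Because the satellite's pose was predefined, the estimated angle alone was enough to pick the contact and target points; the retract waypoint is what keeps the arm from pressing on the panel after it reaches the open position.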
That summarizes our job in the lab, but there are still things worth mentioning.
One of those was our visit to the NewSpace Nexus conference, where we had a chance to network with people who live and breathe space: current and retired NASA engineers, scientists working for the US Air Force Research Laboratory, and small startups who were more than happy to share their experiences with us.
It wasn't just work – we also had a great time traveling around New Mexico and visiting iconic filming locations from Breaking Bad. The experience was amazing, and we can't wait to do it all over again.