Sensors and Simulation Project

During the first semester of 2023, I took Programming for Mechatronic Systems, a subject focused on using sensors to control simulated robots. The subject was taught on Linux (Ubuntu 20.04) with ROS, and a GitHub repository was used for version control. Throughout the semester we worked on a tiered assessment. In the first assignment we designed an Ackermann steering model for a simulated Audi R8, which was required to drive to and reach specific goals. The second assignment built on the first, adding a quadcopter that used simulated lidar sensors to control its position above the car; both platforms were required to follow a path and complete several missions.

The final project used the Audi R8 platform and involved the sensing and control of an Ackermann platform in a track-racing-inspired scenario. The Ackermann is an autonomous robot equipped with two sensors, including a laser scanner with a 180° field of view attached at the front of the platform. The missions involved determining goals for the platform to reach while moving around the track, and controlling the Audi R8 to achieve those goals, thereby completing a lap. This assessment involved publishing and subscribing to several ROS topics to exchange data; a sketch of that structure is shown below.
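As a rough illustration of that publish/subscribe structure, a minimal node might look like the sketch below. This is my own reconstruction in C++ for ROS 1: the /orange/laserscan and /orange/cmd_vel topic names and the message types are assumptions for illustration, while /orange/goals is the goal topic named in the brief.

    #include <ros/ros.h>
    #include <sensor_msgs/LaserScan.h>
    #include <geometry_msgs/PoseArray.h>
    #include <geometry_msgs/Twist.h>

    // Skeleton of the control node: subscribes to the laser and the goals,
    // publishes drive commands. Topic names other than /orange/goals are
    // hypothetical placeholders.
    class AckermannNode {
    public:
      explicit AckermannNode(ros::NodeHandle& nh) {
        laser_sub_ = nh.subscribe("/orange/laserscan", 10,
                                  &AckermannNode::laserCallback, this);
        goals_sub_ = nh.subscribe("/orange/goals", 10,
                                  &AckermannNode::goalsCallback, this);
        cmd_pub_ = nh.advertise<geometry_msgs::Twist>("/orange/cmd_vel", 10);
      }

    private:
      void laserCallback(const sensor_msgs::LaserScan::ConstPtr& scan) {
        // Cone detection and road-centre estimation would happen here.
      }
      void goalsCallback(const geometry_msgs::PoseArray::ConstPtr& goals) {
        // Store the supplied goals for mission planning.
      }
      ros::Subscriber laser_sub_, goals_sub_;
      ros::Publisher cmd_pub_;
    };

    int main(int argc, char** argv) {
      ros::init(argc, argv, "ackermann_control");
      ros::NodeHandle nh;
      AckermannNode node(nh);
      ros::spin();
      return 0;
    }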

The inner workings of the node are as follows (a sketch of the service-activation logic appears after the list):

  • Is idle until activated via a service call, commencing the mission from the Ackermann's current location when the /orange/mission service is called with its data field set to true.

  • Uses the goals provided in /orange/goals for planning a mission to drive around the track.

  • In addition to these goals, detects and uses further goals derived from the cones seen by the laser.

  • If no cone pairs are available, the car stops immediately and abandons the mission.
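The activation behaviour in the first point maps naturally onto a ROS service server. The sketch below assumes a std_srvs/SetBool service, since the brief specifies only the /orange/mission name and a boolean data field; the actual service type used in the assignment may differ.

    #include <ros/ros.h>
    #include <std_srvs/SetBool.h>
    #include <atomic>

    // Mission flag shared with the control loop.
    std::atomic<bool> mission_active{false};

    // Start the mission when data is true; stop and report otherwise.
    bool missionCallback(std_srvs::SetBool::Request& req,
                         std_srvs::SetBool::Response& res) {
      mission_active = req.data;
      res.success = true;
      res.message = req.data ? "mission started from current pose"
                             : "mission stopped";
      return true;
    }

    int main(int argc, char** argv) {
      ros::init(argc, argv, "mission_service");
      ros::NodeHandle nh;
      // /orange/mission comes from the brief; SetBool is an assumed type.
      ros::ServiceServer srv =
          nh.advertiseService("/orange/mission", missionCallback);
      ros::spin();
      return 0;
    }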

I also subscribed to the laser topic to receive scan data and used it to drive the platform. With the laser sensor on the front of the Ackermann I could locate cone points and the centre of the road, and from there determine whether a goal lay on the road centre. The same readings let me detect when the fire truck was moved in front of the car and stop the platform, and I published markers on the road for the cone points and the road centre. Finally, I combined unit testing with ROS bags: by playing back the bags I could test whether a goal point lay on the road and whether the truck was in front of the car. Simplified sketches of the cone detection and of a bag-based test follow.
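The cone and road-centre step can be sketched like this: group consecutive laser returns into segments split on large range jumps, treat small segments as cone candidates, and take the midpoint of a cone pair as the road centre. The 0.3 m jump threshold and five-return segment limit are placeholder values, not the ones actually tuned for the assignment.

    #include <sensor_msgs/LaserScan.h>
    #include <geometry_msgs/Point.h>
    #include <cmath>
    #include <vector>

    // Convert the laser return at index i to a point in the sensor frame.
    geometry_msgs::Point toPoint(const sensor_msgs::LaserScan& scan, size_t i) {
      geometry_msgs::Point p;
      const double angle = scan.angle_min + i * scan.angle_increment;
      p.x = scan.ranges[i] * std::cos(angle);
      p.y = scan.ranges[i] * std::sin(angle);
      return p;
    }

    // Group consecutive returns into segments split on large range jumps;
    // small segments are treated as cone candidates.
    std::vector<geometry_msgs::Point> detectCones(
        const sensor_msgs::LaserScan& scan) {
      std::vector<geometry_msgs::Point> cones;
      std::vector<size_t> segment;
      auto closeSegment = [&]() {
        if (!segment.empty() && segment.size() <= 5)  // placeholder size limit
          cones.push_back(toPoint(scan, segment[segment.size() / 2]));
        segment.clear();
      };
      for (size_t i = 0; i < scan.ranges.size(); ++i) {
        if (!std::isfinite(scan.ranges[i])) continue;  // skip invalid returns
        if (!segment.empty() &&
            std::fabs(scan.ranges[i] - scan.ranges[segment.back()]) > 0.3)
          closeSegment();  // 0.3 m jump: close and start a new segment
        segment.push_back(i);
      }
      closeSegment();
      return cones;
    }

    // Midpoint between a pair of cones: an estimate of the road centre.
    geometry_msgs::Point roadCentre(const geometry_msgs::Point& left,
                                    const geometry_msgs::Point& right) {
      geometry_msgs::Point c;
      c.x = 0.5 * (left.x + right.x);
      c.y = 0.5 * (left.y + right.y);
      return c;
    }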
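For the bag-based tests, the rosbag C++ API can read recorded scans directly inside a gtest, without a running ROS master. The bag path below is hypothetical, and detectCones refers to the helper sketched above.

    #include <gtest/gtest.h>
    #include <rosbag/bag.h>
    #include <rosbag/view.h>
    #include <sensor_msgs/LaserScan.h>
    #include <geometry_msgs/Point.h>
    #include <vector>

    // Helper from the detection sketch above.
    std::vector<geometry_msgs::Point> detectCones(
        const sensor_msgs::LaserScan& scan);

    TEST(ConeDetection, FindsConePairsInRecordedScans) {
      rosbag::Bag bag;
      bag.open("test/data/track_lap.bag", rosbag::bagmode::Read);  // hypothetical bag
      rosbag::View view(bag, rosbag::TopicQuery("/orange/laserscan"));
      for (const rosbag::MessageInstance& m : view) {
        sensor_msgs::LaserScan::ConstPtr scan =
            m.instantiate<sensor_msgs::LaserScan>();
        ASSERT_TRUE(scan != nullptr);
        // Every scan in this recording should contain at least one cone pair.
        EXPECT_GE(detectCones(*scan).size(), 2u);
      }
      bag.close();
    }

    int main(int argc, char** argv) {
      testing::InitGoogleTest(&argc, argv);
      return RUN_ALL_TESTS();
    }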

Overall, the project was very enjoyable and began to tie together several concepts I had been learning throughout my degree. I particularly enjoyed its practical nature, as it let me work hands-on with ROS, Linux and GitHub. I received a Distinction for this project.