Overview
Using OM1, the TurtleBot4 (TB4) can autonomously explore spaces such as your home. Several parts work together to provide this capability. To get started, launch OM1:
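The exact command depends on how OM1 is installed; assuming a `uv`-managed checkout, a typical invocation (the configuration name here is a placeholder, substitute the TB4 config used by your install) would look something like:

```bash
uv run src/run.py turtlebot4_lidar
```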
TB4 RPLIDAR Laserscan Data

OM1 uses the TB4's RPLIDAR to tell the core LLMs about nearby objects. This information flows to the core LLMs from `/input/plugins/rplidar.py`. The RPLIDAR data are also used in the action driver to check for viable paths right before motions are executed. See the RPLidar setup documentation for more information.
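To make that data flow concrete, here is a minimal sketch of how a laser scan can be reduced to a short natural-language summary for an LLM prompt. It is illustrative only; the function name, sector boundaries, and output wording are assumptions, not the actual contents of `/input/plugins/rplidar.py`.

```python
import math

def summarize_scan(ranges, angle_min, angle_increment, max_range=1.0):
    """Illustrative reduction of a laser scan to a short text description
    of blocked sectors, suitable for injection into an LLM prompt.
    (Hypothetical sketch; the real plugin is /input/plugins/rplidar.py.)"""
    sectors = {"ahead": (-30.0, 30.0),
               "to the left": (30.0, 90.0),
               "to the right": (-90.0, -30.0)}
    blocked = set()
    for i, r in enumerate(ranges):
        if not (0.05 < r < max_range):        # discard invalid / distant returns
            continue
        ang = math.degrees(angle_min + i * angle_increment)
        ang = (ang + 180.0) % 360.0 - 180.0   # normalize to [-180, 180)
        for name, (lo, hi) in sectors.items():
            if lo <= ang <= hi:
                blocked.add(name)
    if not blocked:
        return "All paths are clear."
    return "Obstacles within 1 m: " + ", ".join(sorted(blocked)) + "."

# Example: a single return 0.4 m dead ahead of the robot
print(summarize_scan([0.4], angle_min=0.0, angle_increment=0.0))
```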
Core LLM Directed Motion
Depending on the environment of the TB4, the core LLMs can generate contextually appropriate motion commands. These commands are defined in `actions/move_turtle/interface.py` and are converted to TB4 zenoh/cycloneDDS `cmd_vel` motions in `/actions/move_turtle/connector/zenoh.py`.
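As a rough sketch of that pipeline, the snippet below maps a discrete command set to a CDR-serialized `Twist` published over zenoh, following the pattern used in the zenoh ROS 2 demos. The command names, velocity values, and the `cmd_vel` key expression are assumptions; the authoritative definitions live in `actions/move_turtle/interface.py` and `/actions/move_turtle/connector/zenoh.py`.

```python
from dataclasses import dataclass
from enum import Enum

import zenoh
from pycdr2 import IdlStruct
from pycdr2.types import float64

class MovementAction(str, Enum):
    # Hypothetical command set; the real one is in actions/move_turtle/interface.py
    MOVE_FORWARDS = "move forwards"
    TURN_LEFT = "turn left"
    TURN_RIGHT = "turn right"
    STAND_STILL = "stand still"

# Minimal CDR-serializable stand-ins for geometry_msgs/msg/Twist
@dataclass
class Vector3(IdlStruct, typename="Vector3"):
    x: float64
    y: float64
    z: float64

@dataclass
class Twist(IdlStruct, typename="Twist"):
    linear: Vector3
    angular: Vector3

def to_twist(action: MovementAction) -> Twist:
    """Map a discrete LLM command to continuous cmd_vel values."""
    linear, angular = {
        MovementAction.MOVE_FORWARDS: (0.3, 0.0),
        MovementAction.TURN_LEFT: (0.0, 0.8),
        MovementAction.TURN_RIGHT: (0.0, -0.8),
        MovementAction.STAND_STILL: (0.0, 0.0),
    }[action]
    return Twist(linear=Vector3(x=linear, y=0.0, z=0.0),
                 angular=Vector3(x=0.0, y=0.0, z=angular))

session = zenoh.open(zenoh.Config())
# The key expression must match the topic exposed by the zenoh/DDS bridge
# on your TB4 (often "cmd_vel" or a namespaced form).
session.put("cmd_vel", to_twist(MovementAction.MOVE_FORWARDS).serialize())
```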
TB4 Physical Collision Switches
In addition to LIDAR data, the TB4 also uses collision switches to detect hazards. When those switches are triggered, two things happen:
- TB4 Basic Low Level (Firmware) Collision Avoidance: this response is built into the Create3 base and cannot be changed by a user.
- TB4 Enhanced Collision Avoidance: collision switch data are also used in the OM1 `action` driver to ensure prompt responses to physical collisions, as sketched below.
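A minimal sketch of this kind of pre-emption, with hypothetical class and method names rather than OM1's actual driver internals:

```python
# Illustrative only: a latched bump flag lets a collision override whatever
# the core LLMs requested on the very next control cycle.
class MoveTurtleConnector:
    def __init__(self) -> None:
        self.bump_detected = False

    def on_hazard(self, hazards: list) -> None:
        # Callback for hazard reports; latch a flag the moment any
        # collision switch closes so the next control tick sees it.
        if "BUMP" in hazards:
            self.bump_detected = True

    def tick(self, llm_command: str) -> None:
        # Runs every control cycle. A latched bump pre-empts the LLM
        # command, so the physical response is immediate.
        if self.bump_detected:
            self.bump_detected = False
            print("collision switch closed: rotating away from the obstacle")
            return
        print(f"executing LLM command: {llm_command}")

connector = MoveTurtleConnector()
connector.on_hazard(["BUMP"])
connector.tick("move forwards")  # pre-empted by the collision response
connector.tick("move forwards")  # executes normally afterwards
```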
Object Avoidance and Collision Switch States
Normal
- The LIDAR does not sense anything in close proximity (within 1 m).
- The collision switches are open.
Object Nearby and Possible Moves are Constrained
- The LIDAR senses objects in proximity and informs the core LLMs about which paths are possible.
- The collision switches are open.
Collision Switches Triggered
- The collision switches are triggered.
- In this case, the `action`-level collision avoidance code will command a 100 deg avoidance rotation (sketched below). Once this rotation is complete, the system reverts to responding to commands from the core LLMs.
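A minimal open-loop sketch of such a rotation, assuming a hypothetical `publish` callable that streams `cmd_vel` values; a real driver would more likely close the loop on odometry rather than wall-clock time:

```python
import math
import time

def avoidance_rotation(publish, omega: float = 1.0, angle_deg: float = 100.0):
    """Open-loop sketch: spin in place at `omega` rad/s long enough to
    cover roughly angle_deg, then stop."""
    duration = math.radians(angle_deg) / omega   # seconds needed at omega
    deadline = time.monotonic() + duration
    while time.monotonic() < deadline:
        publish(linear_x=0.0, angular_z=omega)   # keep streaming cmd_vel
        time.sleep(0.05)                         # ~20 Hz command rate
    publish(linear_x=0.0, angular_z=0.0)         # explicit stop command

# Example: print the commands instead of publishing them
avoidance_rotation(lambda linear_x, angular_z: print(linear_x, angular_z))
```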