In this configuration, the quadruped observes its environment, listens, and speaks, but there is no AI-controlled movement.

```bash
uv run src/run.py unitree_go2_basic
```
In this mode, the quadruped is configured to (1) use a small local VLM, (2) listen to you, and (3) speak to you. The amount of speech ("always") is set via `"silence_rate": 0` (vocalize all speech outputs) in `actions:speak:config`.
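The exact layout of the configuration file may differ between OM1 versions; based on the setting quoted above, the relevant fragment of the speak action's config might look like:

```json
{
  "actions": [
    {
      "name": "speak",
      "config": {
        "silence_rate": 0 // vocalize all speech outputs
      }
    }
  ]
}
```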
OM1 will provide LIDAR and other data to a system of LLMs, allowing them to autonomously explore indoor and outdoor environments. In this mode, the quadruped is configured to (1) use a cloud VLM, (2) listen to you, and (3) speak to you occasionally, unless you spoke first, in which case it will always respond. The amount of speech ("sometimes") is set via `"silence_rate": 6` (vocalize every 6th speech output) in `actions:speak:config`.
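OM1's actual implementation is not shown here, but the gating behavior described above (speak every Nth output, except always reply when spoken to) can be sketched in a few lines of Python with hypothetical names:

```python
class SpeechGate:
    """Decide whether a speech output should be vocalized.

    silence_rate=0 vocalizes every output ("always");
    silence_rate=N vocalizes every Nth output ("sometimes"),
    except that a reply to the user is always vocalized.
    Hypothetical sketch, not OM1's actual code.
    """

    def __init__(self, silence_rate: int = 0):
        self.silence_rate = silence_rate
        self._count = 0  # outputs seen since the last reset

    def should_speak(self, user_spoke_first: bool = False) -> bool:
        if user_spoke_first:
            return True  # always respond when spoken to
        if self.silence_rate == 0:
            return True  # silence_rate 0 means vocalize everything
        self._count += 1
        return self._count % self.silence_rate == 0  # every Nth output
```

With `silence_rate=6`, five consecutive unprompted outputs stay silent and the sixth is vocalized, while a user-initiated exchange is always answered.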
In addition, the system will log position (local odometry and GPS data) and Bluetooth data as the basis for reliable navigation and path planning. In this mode, the quadruped is configured to use a cloud VLM; there is no speech in this configuration.
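The document does not specify OM1's log format; as an illustration only, a single navigation log record combining the data sources named above could be modeled like this (all field names hypothetical):

```python
import json
import time
from dataclasses import dataclass, asdict


@dataclass
class PoseLogEntry:
    """One navigation log record.

    Hypothetical schema for illustration; OM1's actual log
    format is not described in this document.
    """
    t: float        # wall-clock timestamp, seconds
    odom_xy: tuple  # local odometry position, meters
    gps: tuple      # (latitude, longitude)
    bt_rssi: dict   # Bluetooth beacon id -> RSSI, dBm


entry = PoseLogEntry(
    t=time.time(),
    odom_xy=(1.2, -0.4),
    gps=(37.7749, -122.4194),
    bt_rssi={"beacon-01": -62},
)
line = json.dumps(asdict(entry))  # one JSON line per record
```

Appending one JSON object per line keeps the log stream-friendly, so a planner can replay position history without loading the whole file.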