Quick Start
Learn how to install, set up and configure OM1.
System Requirements
Operating System
Linux (Ubuntu 20.04, 22.04, 24.04)
macOS 12.0+
Hardware
Sufficient memory to run vision and other models
Reliable WiFi or other networking
Sensors such as cameras, microphones, LIDAR units, IMUs
Actuators and outputs such as speakers, visual displays, and movement platforms (legs, arms, hands)
Hardware connected to the "central" computer via Zenoh, CycloneDDS, serial, USB, or custom APIs/libraries
Software
Ensure you have the following installed on your machine:
Python >= 3.10
uv >= 0.6.2 as the Python package manager and virtual environment
portaudio for audio input and output
ffmpeg for video processing
Get your OpenMind API key here
uv (a fast Python package and environment manager, written in Rust)
PortAudio Library
For audio functionality, install portaudio:
Install python3-dev
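On Ubuntu, a typical way to install the PortAudio development headers together with the Python headers is via apt; on macOS, Homebrew is the usual route. Package names below reflect the standard distribution packages:

```shell
# Ubuntu / Debian: PortAudio development headers plus Python headers
sudo apt-get update
sudo apt-get install -y portaudio19-dev python3-dev

# macOS (Homebrew)
brew install portaudio
```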
ffmpeg
For video functionality, install FFmpeg:
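FFmpeg is packaged for both supported platforms; a typical install looks like:

```shell
# Ubuntu / Debian
sudo apt-get install -y ffmpeg

# macOS (Homebrew)
brew install ffmpeg

# confirm the binary is on your PATH
ffmpeg -version
```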
CLI
OM1 provides a command-line interface (CLI). The main entry point is src/run.py which provides the following commands:
start: Start an agent with a specified config
config_name: Name of the config file (without the .json5 extension) in the /config directory.
--log-level: Optional log level (default: INFO). Use DEBUG for detailed logs.
--log-to-file: Optional flag to log to logs/{config_name}.log (default: False).
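Putting these options together, an invocation might look like the following (assuming uv is used as the runner, as in the setup below):

```shell
# start the "spot" agent with verbose logging, written to logs/spot.log
uv run src/run.py start spot --log-level DEBUG --log-to-file
```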
Installation and Setup
Clone the repository
Run the following commands to clone the repository and set up the environment:
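A typical sequence looks like the following; the repository URL and the submodule step are assumptions based on the project's GitHub organization, so verify them against the official repository:

```shell
# clone the repository (URL is an assumption -- check the official repo)
git clone https://github.com/OpenmindAGI/OM1.git
cd OM1

# fetch any submodules and create the virtual environment with uv
git submodule update --init
uv venv
```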
Set the configuration variables
Locate the config folder and add your OpenMind API key to /config/spot.json5 (for example). If you do not already have one, you can obtain a free access key at https://portal.openmind.org/.
Or, create a .env file in the project directory and add the following:
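A minimal .env sketch is shown below; the variable name OM_API_KEY is an assumption, so check the variable your config file actually reads:

```shell
# .env -- variable name is an assumption; confirm against your config
OM_API_KEY=your_openmind_api_key
```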
Note: Using the placeholder key openmind_free will generate errors.
Run the Spot Agent
Run the following command to start the Spot Agent:
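Based on the CLI described above (entry point src/run.py, start command, config name without the .json5 extension), the command is presumably:

```shell
uv run src/run.py start spot
```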
Note: Agent configuration names are only required when switching between different agents. Once an agent has been run, it becomes the default for subsequent executions.
Spot is just an example agent configuration.
If you want to interact with the agent and see how it works, make sure ASR and TTS are configured in spot.json5.
ASR configuration (check in agent_inputs)
TTS configuration (check in agent_actions)
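As a hypothetical sketch only, the relevant entries in spot.json5 might look like this; the plugin and field names here are assumptions, so check the actual file for the real input and action types:

```json5
{
  // ASR: a speech-to-text input under agent_inputs (type name is illustrative)
  agent_inputs: [
    { type: "GoogleASRInput" },
  ],
  // TTS: a speak action under agent_actions (connector name is illustrative)
  agent_actions: [
    { name: "speak", connector: "elevenlabs_tts" },
  ],
}
```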
During the first execution, the system will automatically resolve and install all project dependencies. This process may take several minutes to complete before the agent becomes operational.
Runtime Configuration
Upon successful initialization, a .runtime.json5 file will be generated in the config/memory directory. This file serves as a snapshot of the agent configuration used in the current session.
Subsequent Executions
After the initial run, you can start the agent using the simplified command:
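Since the system reloads the most recent configuration from memory, the simplified command presumably omits the config name:

```shell
uv run src/run.py start
```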

The system will automatically load the most recent agent configuration from memory. Additionally, a .runtime.json5 file will be created in the root config directory, which persists across sessions unless a different agent configuration is specified.
Switching Agent Configurations
To run a different agent (for example, the conversation agent), specify the configuration name explicitly:
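Following the same CLI pattern, switching to the conversation agent would look like:

```shell
uv run src/run.py start conversation
```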
WebSim to check input and output
Go to http://localhost:8000 to see real-time logs along with the input and output in the terminal. For easier debugging, add --debug to see additional logging information.
Understanding the Log Data
The log data provides insight into how the Spot agent makes sense of its environment and decides on its next actions:
First, the agent detects a person using vision.
It communicates with an external AI API for response generation.
The LLM(s) decide on a set of actions (dancing and speaking).
The simulated robot expresses emotions via a front-facing display.
The system logs latency and processing times to monitor performance.
More Examples
There are more pre-configured agents in the /config folder. They can be run with the following command:
For example, to run the cubly agent:
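Using the same start command with the cubly config name, this is presumably:

```shell
uv run src/run.py start cubly
```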
If you configure a custom agent, replace <agent_name> with your agent and run the below command:
To get started with development, refer here