OTA Setup
Cloud Docker Management Service
The cloud Docker management service allows remote management of Docker containers via a web interface. To enable this service, follow these steps:
Step 1: Sign up
Sign up for an account on OpenMind Portal.
Step 2: Get OpenMind API key
Get your API key from the Dashboard page.
Step 3: Set the API key
Set your API key as an environment variable in your Bash profile:
```
vim ~/.bashrc
```

Add the following line:

```
export OM_API_KEY="your_api_key_here"
```

Step 4: Get the API Key ID
Get the API_KEY_ID from the Dashboard page. The API Key ID is a 16-character string, such as om1_live_<16 characters>. Now, export the API Key ID as an environment variable:
```
vim ~/.bashrc
```

Add the following line:

```
export OM_API_KEY_ID="your_api_key_id_here"
```

Reload your Bash profile with `source ~/.bashrc`.
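As a quick sanity check, you can confirm both variables are visible in a fresh shell; the placeholder values below stand in for your real credentials:

```shell
# Placeholder values; in practice these come from your ~/.bashrc.
export OM_API_KEY="your_api_key_here"
export OM_API_KEY_ID="your_api_key_id_here"

# Confirm both variables are non-empty before starting the OTA services.
for var in OM_API_KEY OM_API_KEY_ID; do
  if [ -n "$(printenv "$var")" ]; then
    echo "$var is set"
  else
    echo "$var is MISSING"
  fi
done
```

If either variable reports MISSING, re-check the export lines in `~/.bashrc` and reload the profile.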
Step 5: Set the robot type you are using
Setup OTA Update Services
To enable the Over-The-Air (OTA) update service for Docker containers, you need to set up two Docker services: ota_agent and ota_updater. These services let you manage and update your Docker containers remotely via the OpenMind Portal.
To create an ota_updater.yml file, follow these steps:
Copy the content from the ota_updater.yml template into your ota_updater.yml file.
Note: You can use the stable version as well. The file example provided is the latest version.
Save and close the file (:wq in vim).
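For orientation, the template defines the ota_updater service in a compose file of roughly this shape; the image tag and environment entries below are illustrative placeholders, not the real template contents:

```shell
# Write a skeleton ota_updater.yml (illustrative only; use the real template).
mkdir -p ~/.ota
cat > ~/.ota/ota_updater.yml <<'EOF'
services:
  ota_updater:
    image: openmindagi/ota-updater:latest   # placeholder image tag
    restart: unless-stopped
    environment:
      - OM_API_KEY=${OM_API_KEY}
      - OM_API_KEY_ID=${OM_API_KEY_ID}
EOF

# Start it once Docker is available on the device:
# docker compose -f ~/.ota/ota_updater.yml up -d
```

The real template supplies the correct image and settings; this sketch only shows where the OM_API_KEY and OM_API_KEY_ID variables from the earlier steps plug in.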
Start OTA Updater Service
A .ota directory will be automatically created in your home directory to store OTA configuration files.
Now, you can set up the ota_agent service. Create an ota_agent.yml file:
Navigate to the OTA directory:
Copy the content from the ota_agent.yml template into your ota_agent.yml file.
Note: You can use the stable version as well. The file example provided is the latest version.
Save and close the file.
Start OTA Agent Service
Verify both services are running:
Expected output: Both ota_updater and ota_agent containers listed.
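A minimal check along these lines confirms both containers are up (the container names assume the compose files above):

```shell
# Simulated output of `docker ps --format '{{.Names}}'` for illustration;
# on a real host, capture it with: running=$(docker ps --format '{{.Names}}')
running="ota_updater
ota_agent"

for svc in ota_updater ota_agent; do
  if echo "$running" | grep -qx "$svc"; then
    echo "$svc: running"
  else
    echo "$svc: NOT running"
  fi
done
```

If either service is reported as not running, inspect its logs with `docker logs <container_name>` before retrying.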
You can now manage and update your Docker containers remotely via the OpenMind Portal.
Model Downloads
Riva Models
Riva models are encrypted and require authentication to download. To download Riva models, you need to set up the NVIDIA NGC CLI tool.
Install NGC CLI
⚠️ Warning: Run the following commands from your home directory (`cd ~`). Otherwise, Docker Compose may not locate the required files.
To generate your own NGC API key, check this video.
This will ask several questions during the install. Choose these values:
⚠️ Warning: NGC CLI creates a `.bash_profile` file if it doesn't exist. If you already have a `.bashrc` file, merge the two manually to avoid losing your Bash configuration.
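If the `ngc` command is not found after installation, the usual cause is PATH; a minimal fix, assuming the CLI was unpacked to `~/ngc-cli` (adjust if you chose a different location):

```shell
# Stand-in for the unpacked NGC CLI directory (created here so the
# PATH check below is demonstrable without the real installer).
mkdir -p "$HOME/ngc-cli"

# Prepend the CLI directory to PATH; add this line to ~/.bashrc to persist it.
export PATH="$HOME/ngc-cli:$PATH"

# The directory should now appear in PATH.
echo "$PATH" | grep -q "$HOME/ngc-cli" && echo "ngc-cli is on PATH"
```

This is also the first thing to check under the Troubleshooting section's "NGC CLI not found" entry.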
Download Riva Models
Download Riva Embedded version models for Jetson 7.0:
This will prompt for your NGC API key to download the models; use <YOUR_API_KEY>. The download will take a while.
Note: The following command is for testing.
Run Riva locally:
Now, export these environment variables in your ~/.bashrc file to use the Riva service:
OpenMind Riva Docker Image for Jetson
We created an openmindagi/riva-speech-server:2.24.0-l4t-aarch64 Docker image that provides Riva ASR and TTS endpoints, along with example code to run Riva services on Jetson devices. You can pull the image directly without downloading the models from NGC:
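Assuming Docker is installed on the Jetson, pulling the prebuilt image is a single command; the sketch stores the image name in a variable, and the actual pull (commented out) requires network access and an aarch64 host:

```shell
# Image name as published by OpenMind; pulling requires Docker on the device.
IMAGE="openmindagi/riva-speech-server:2.24.0-l4t-aarch64"
# docker pull "$IMAGE"
echo "Image to pull: $IMAGE"
```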
The Dockerfile can be found here, and the docker-compose file can be found here.
Note: Once you download the models from NGC and export the environment variables, you can use OpenMind Portal to download Riva dockerfile and run Riva services.
Test Riva Services
Once you have Riva services running, you can use the following script to test the ASR and TTS endpoints:
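As a minimal stand-in for that script, the ports from the Port Reference section can be probed once the services are up; the `nc` checks below test TCP connectivity only and are assumptions, not the Riva API itself:

```shell
# Endpoint URLs built from the documented ports (localhost assumes the
# services run on the same device; substitute the robot's IP otherwise).
ASR_WS="ws://localhost:6790"     # Riva ASR WebSocket
TTS_HTTP="http://localhost:6791" # Riva TTS HTTP

echo "ASR endpoint: $ASR_WS"
echo "TTS endpoint: $TTS_HTTP"

# With the services running, check that the ports accept TCP connections:
# nc -z localhost 6790 && echo "ASR port open"
# nc -z localhost 6791 && echo "TTS port open"
```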
Port Reference
Services use the following ports:

| Port | Service | Purpose |
| --- | --- | --- |
| 1935 | MediaMTX RTMP Server | Video streaming |
| 6790 | Riva ASR WebSocket | Speech recognition API |
| 6791 | Riva TTS HTTP | Text-to-speech API |
| 8000 | MediaMTX RTMP API | RTMP control |
| 8001 | MediaMTX HLS API | HLS streaming |
| 8554 | MediaMTX RTSP | RTSP streaming |
| 8860 | Qwen 30B Quantized | LLM inference |
| 8880 | Kokoro TTS | TTS engine |
| 8888 | MediaMTX Streaming | Streaming control |
| 50000 | Riva Server API | Internal Riva API |
| 50051 | Riva NMT Remote API | Remote TTS/ASR APIs |
Troubleshooting
| Issue | Solution |
| --- | --- |
| OTA services won't start | Check that the API key is correct: `echo $OM_API_KEY` |
| NGC CLI not found | Verify that `echo $PATH` includes the ngc-cli directory |
| Riva models download fails | Confirm your NGC API key is valid and you have quota |
| Port already in use | Check what's running: `sudo lsof -i :PORT_NUMBER` |
| Docker permission denied | Add your user to the docker group: `sudo usermod -aG docker $USER` |