
Building a Low‑Cost Autonomous Lego Tank with NVIDIA Jetson Nano & Isaac SDK

In this guide I demonstrate how to transform a standard LEGO EV3 kit into a fully autonomous tracked robot, powered by an NVIDIA Jetson Nano, the Isaac SDK, a YDLIDAR X4 LiDAR, and a Pixy2 vision camera. The result is a budget‑friendly alternative to the Carter and Kaya reference designs that still retains the perception and planning capabilities of NVIDIA's robotics stack.

Jump to Part 8 or Part 10 for a complete autonomous navigation demo.

The project consists of the following components:

Why Isaac SDK and not ROS?

Why LEGO parts?

Choosing this path introduces a few challenges:

Part 1: Getting Started

1. Install Isaac SDK

2. Voice Recognition (Optional)

3. EV3dev Image

Flash the latest ev3dev‑stretch image onto a microSD or microSDHC card. microSDXC is not supported by the EV3 brick.

4. ARM Cross‑Compiler for ev3dev

sudo apt-get install gcc-arm-linux-gnueabi g++-arm-linux-gnueabi

Because Ubuntu 18.04 (host and Jetson) ships GLIBC 2.27 while ev3dev (based on Debian Stretch) ships GLIBC 2.24, binaries built on the host will not run on the brick unless we link statically against everything except the math library. Adjust the jetson-ev3/toolchain/CROSSTOOL file accordingly, or build inside a Debian 9 Docker image whose GLIBC matches the target.

5. Jetson + EV3 Workspace

git clone https://github.com/andrei-ace/jetson-ev3.git

Edit jetson-ev3/WORKSPACE to point to your Isaac SDK path:

local_repository(
    name = "com_nvidia_isaac",
    path = "/home/andrei/ml/isaac",
)

And update the toolchain path in jetson-ev3/toolchain/CROSSTOOL:

# edit with your path to the toolchain
linker_flag: "-L/home/andrei/ml/jetson-ev3/toolchain"

6. Connect Jetson Nano with EV3

Use a USB‑A to mini‑USB cable. SSH into the EV3 from the Jetson:

ssh robot@ev3dev.local

The default user is robot and the default password is maker.

7. The Ping‑Pong Tutorial

Isaac’s “Ping‑Pong” Codelet demonstrates inter‑process communication. Adapt it to send messages between the Jetson and the EV3 brick.
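Conceptually, each codelet ticks on a schedule and exchanges messages over named channels (the edges of the application graph). The pattern can be sketched with a toy Python mock; the class and method names here are illustrative, not Isaac's actual C++ API:

```python
# Toy mock of the tick-based ping-pong exchange (illustrative only;
# real Isaac codelets are C++ classes with tx/rx message hooks).
class PingCodelet:
    def __init__(self, channel):
        self.channel = channel

    def tick(self):
        self.channel.append("ping")  # publish one message per tick


class PongCodelet:
    def __init__(self, channel):
        self.channel = channel
        self.heard = []

    def tick(self):
        while self.channel:
            # Consume each pending message; the real EV3 app speaks it aloud.
            self.heard.append(self.channel.pop(0))


channel = []  # stands in for an Isaac edge between the two nodes
ping, pong = PingCodelet(channel), PongCodelet(channel)
for _ in range(3):  # three scheduler ticks
    ping.tick()
    pong.tick()
print(pong.heard)  # ['ping', 'ping', 'ping']
```

In the real app the two codelets run in separate processes on separate machines, with a TCP publisher/subscriber pair standing in for the shared list.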

Build the EV3 server:

bazel build --config=ev3dev //apps/ev3/ping_pong:ev3_pong

Deploy the client on the Jetson:

<YOUR-ISAAC-INSTALL>/engine/build/deploy.sh --remote_user <YOUR-USER-JETSON> -p //apps/ev3/ping_pong:ping_pong-pkg -d jetpack43 -h <YOUR-JETSON-IP>

Run both binaries. The EV3 should speak the “Ping” messages.

Part 2: Controlling a Motor from Isaac

Using the same principles, control the EV3 motors via a custom Isaac driver that mimics the Segway RMP base. The Ev3ControlServer runs on the EV3:

bazel build --config=ev3dev //packages/ev3/ev3dev:ev3_control_server
ev3_control_server ev3dev.local:9000

Integrate the server with the Sight virtual joystick for manual teleoperation.
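The virtual joystick publishes a desired linear and angular velocity, which the driver must split into per-track motor speeds. A minimal differential-drive sketch of that conversion (the function name and the 0.12 m track width are illustrative, not values from the project):

```python
def twist_to_track_speeds(v, w, track_width):
    """Differential-drive inverse kinematics: split a commanded linear
    velocity v (m/s) and angular velocity w (rad/s) into per-track speeds."""
    v_left = v - w * track_width / 2
    v_right = v + w * track_width / 2
    return v_left, v_right

# Pure rotation in place: the tracks run at equal and opposite speeds.
print(twist_to_track_speeds(0.0, 1.0, 0.12))  # (-0.06, 0.06)
```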

Part 3: Building the Robot

The chassis closely follows LEGO’s official EV3 Track3r design. The Jetson Nano case is sourced from this GitHub repo.

Part 4: Isaac Apps Architecture

An Isaac App consists of three layers:

Below is a concise example of a voice‑controlled navigation app. (See the full voice_control folder for all JSON files.)

{
  "name": "voice_control",
  "modules": [
    "//apps/ev3/voice_control:voice_control_goal_generator",
    "@com_nvidia_isaac//packages/navigation",
    "@com_nvidia_isaac//packages/planner"
  ],
  "config_files": [
    "apps/ev3/voice_control/model/isaac_vcd_model.metadata.json"
  ],
  "config": {
    ...
  },
  "graph": {
    "nodes": [...],
    "edges": [...]
  }
}

Subgraphs are reusable; the navigation stack is shared across all robot variants.
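In the app JSON, a subgraph is pulled in as a node with a subgraph field, and its interface is then wired up like any other component. A sketch of the idea (the subgraph path below is illustrative):

```json
{
  "graph": {
    "nodes": [
      {
        "name": "navigation",
        "subgraph": "packages/navigation/apps/differential_base_navigation.subgraph.json"
      }
    ]
  }
}
```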

Part 5: Running Isaac Apps on EV3

The EV3 driver implements the same API as the Segway RMP base, enabling direct reuse of existing Isaac apps such as the joystick controller, distributed GMapping, and full navigation stack.

Part 6: Odometry Calibration

Accurate odometry is critical for autonomous operation. The Ev3ControlServer exposes two RPC calls:

Using the wheel radius, convert the reported motor speeds into track speeds in m/s; the base's linear velocity is then simply their average:

def linear_velocity(left, right):
    # Equal speeds give v == left; opposite speeds give v == 0;
    # in general, v is the mean of the two track speeds.
    return (left + right) / 2

Angular velocity is the difference of the track speeds divided by the track width.
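With the linear and angular velocities in hand, the robot's pose can be dead-reckoned between control ticks. A minimal Euler-integration sketch (assuming small time steps dt; the step sizes are illustrative):

```python
import math

def integrate_pose(x, y, theta, v, w, dt):
    """Advance a 2D pose (x, y, heading) by one odometry tick (Euler step)."""
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += w * dt
    return x, y, theta

# Drive straight along +x for one second at 0.1 m/s, in ten 0.1 s steps.
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = integrate_pose(*pose, v=0.1, w=0.0, dt=0.1)
print(pose)  # x is approximately 0.1 m; y and theta stay 0
```

Small per-tick errors in the wheel speeds accumulate in this integral, which is why the calibration in this part matters for the navigation stack downstream.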

Part 7: Voice‑Controlled Autonomous Navigation

The voice detection subgraph listens to a microphone and emits a voice_command_id when one of the trained keywords is detected. I trained an RNN on three words—“jetson”, “left”, and “right”—using the Isaac Speech Toolkit. The resulting model can be downloaded from the project repository.
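The post-processing on top of the RNN is conceptually simple: pick the highest-scoring keyword for each audio window and only emit a command when it clears a confidence threshold. A toy sketch of that logic (the threshold and scores are illustrative, not the trained model's actual outputs):

```python
KEYWORDS = ["jetson", "left", "right"]  # the three trained classes

def to_command_id(scores, threshold=0.8):
    """Return the index of the winning keyword, or None if no score
    is confident enough to act on."""
    best = max(range(len(scores)), key=scores.__getitem__)
    return best if scores[best] >= threshold else None

print(to_command_id([0.05, 0.90, 0.05]))  # 1 -> "left"
print(to_command_id([0.40, 0.35, 0.25]))  # None: too ambiguous to act on
```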

When a command is received, the GoalGenerator publishes a goal to the global planner. For example, to rotate the robot 90° left:

auto proto = rx_detected_command().getProto();
int id = proto.getCommandId();
auto goal_proto = tx_goal().initProto();
goal_proto.setStopRobot(true);
goal_proto.setTolerance(0.1);
goal_proto.setGoalFrame("robot");
// Pose2d::Rotation takes radians: 90 degrees is pi/2
ToProto(Pose2d::Rotation(M_PI / 2), goal_proto.initGoal());
tx_goal().publish();
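The mapping from a detected command to a relative rotation goal can be sketched in Python; the command ids and angles below are a hypothetical mapping mirroring the keyword set above, not the project's actual ids:

```python
import math

# Hypothetical command-id-to-rotation table: positive angles rotate
# counter-clockwise (left), negative angles clockwise (right).
COMMAND_TO_ROTATION = {
    1: math.pi / 2,    # "left":  rotate 90 degrees counter-clockwise
    2: -math.pi / 2,   # "right": rotate 90 degrees clockwise
}

def goal_for_command(command_id):
    """Return the relative rotation goal in radians, or None if the
    command does not map to a navigation goal."""
    return COMMAND_TO_ROTATION.get(command_id)

print(goal_for_command(1))  # 1.5707963267948966 (pi/2)
```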

In practice, the system works reliably in the 10 W power mode. At 5 W the inference latency (≈30 ms) is too high for smooth operation.

Part 8: Mapping with Distributed GMapping

Mapping is performed on a host PC while the Jetson streams LiDAR scans and odometry from the robot over the network. Deploy the distributed GMapping app on the Jetson:

./engine/build/deploy.sh --remote_user andrei -p //apps/ev3:gmapping_distributed_ev3-pkg -d jetpack43 -h 192.168.0.218

On the host run:

bazel run apps/ev3:gmapping_distributed_host

Ensure the host JSON file points to the correct Jetson IP:

"tcp_subscriber": {
  "isaac.alice.TcpSubscriber": {
    "port": 5000,
    "host": "192.168.0.218"
  }
}

After a few minutes of exploration, a high‑resolution occupancy grid is generated.
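Under the hood, grid mappers like GMapping maintain a per-cell occupancy estimate that is commonly updated in log-odds form as LiDAR evidence arrives. A minimal sketch of that idea (the evidence constants are illustrative, not GMapping's actual parameters):

```python
import math

def logodds_update(cell, hit, l_occ=0.85, l_free=-0.4):
    """Accumulate log-odds evidence for one grid cell from one reading:
    a LiDAR 'hit' raises the estimate, a pass-through lowers it."""
    return cell + (l_occ if hit else l_free)

def to_probability(cell):
    """Convert a log-odds value back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(cell))

cell = 0.0                  # unknown: probability 0.5
for _ in range(3):          # three consecutive "hit" observations
    cell = logodds_update(cell, hit=True)
print(round(to_probability(cell), 3))  # 0.928
```

The additive update is why repeated consistent observations quickly push a cell toward "occupied" or "free" while a single noisy reading barely moves it.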

Conclusion

With a modest investment in LEGO Technic parts, a YDLIDAR, and a Pixy2 camera, you can build a fully autonomous tracked robot that leverages the full power of NVIDIA’s Isaac SDK. The system supports voice commands, LiDAR‑based mapping, and GPU‑accelerated perception, making it a versatile platform for research, education, and hobby projects.

All code, configuration files, and trained models are available in the project repository.
