
Real‑Time Motion Planning for Autonomous Vehicles in Urban Environments: A Scaled‑Down Prototype Study

Advanced autonomous vehicles promise a safer, more efficient future for urban mobility. Yet, despite significant industry investment, no fully autonomous system has entered the consumer market. A key bottleneck is the lack of a robust, real‑time motion‑planning engine that can safely navigate complex city streets.

Project Objective

The goal of this research is to design and implement a reliable real‑time motion‑planning system that reduces accident rates for autonomous cars. The system incorporates lane keeping, obstacle avoidance, moving‑car avoidance, adaptive cruise control, and basic accident avoidance.

Hardware Architecture

We constructed a scaled‑down ego vehicle (EGO) from an Adafruit 3‑layer round robot chassis kit with dual DC motors, a Raspberry Pi 3 B+, an RPLIDAR‑A1 laser scanner, two HC‑SR04 ultrasonic sensors, and an Anker PowerCore 13000 battery pack.

The Raspberry Pi serves as the processing hub, receiving raw data from the LIDAR and ultrasonic sensors via GPIO pins and delivering motor commands to a DRV8833 motor driver board.
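As one illustration of the sensor path, the HC‑SR04 reports distance as the width of an echo pulse on a GPIO pin; converting that pulse duration to centimetres is a pure calculation that can be sketched independently of the wiring. The function name and speed‑of‑sound constant below are our own, not taken from the build.

```python
# Convert an HC-SR04 echo pulse duration to a distance estimate.
# The sensor emits an ultrasonic ping; the echo pin stays high for the
# round-trip time of the sound wave, so distance = duration * speed / 2.

SPEED_OF_SOUND_CM_S = 34300.0  # approximate speed of sound in air at 20 °C

def echo_to_distance_cm(pulse_duration_s: float) -> float:
    """Distance in cm for an echo pulse lasting pulse_duration_s seconds."""
    if pulse_duration_s <= 0:
        raise ValueError("pulse duration must be positive")
    # Divide by 2 because the pulse covers the out-and-back path.
    return pulse_duration_s * SPEED_OF_SOUND_CM_S / 2.0
```

A 1 ms echo corresponds to roughly 17 cm, which is the scale of gaps the prototype has to judge.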

Software Stack

All software runs on Raspbian. Python 3 is the primary language, leveraging the official LIDAR SDK and the VNC Viewer for remote monitoring. SSH enables wireless command‑line access.

Methodology

  1. Phase 1 – Hardware Build: Assemble the EGO vehicle and install the OS, drivers, and connectivity tools.
  2. Phase 2 – Control Development: Implement lane‑keeping, obstacle avoidance, moving‑car avoidance, and adaptive cruise control modules using sensor fusion.
  3. Phase 3 – Validation: Conduct a battery of reliability tests in a scaled‑down urban mock‑up, iteratively refine algorithms, and integrate all modules into a single motion‑planning stack.
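The integration step in Phase 3 amounts to a single sense‑decide‑actuate loop in which each module contributes to one motor command per cycle. A minimal sketch, assuming illustrative thresholds and a simple fixed priority order (obstacle handling first, then lane keeping, then cruising) that are not the project's actual code:

```python
# Minimal sketch of an integrated motion-planning cycle: read the sensors,
# resolve module priorities, emit one command. Thresholds are illustrative.

def plan_step(lidar_forward_cm: float, side_clear: bool,
              lane_offset_cm: float) -> str:
    """Return a single motor command for this control cycle."""
    # Highest priority: an obstacle directly ahead.
    if lidar_forward_cm < 20.0:
        return "change_lane" if side_clear else "stop"
    # Next: correct drift out of the lane (positive offset = drifted right).
    if abs(lane_offset_cm) > 5.0:
        return "steer_left" if lane_offset_cm > 0 else "steer_right"
    # Default: keep cruising under adaptive cruise control.
    return "cruise"
```

Keeping each module's decision behind one priority ladder makes the later phases easy to test in isolation before integration.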

Key Algorithms

Obstacle avoidance is based on LIDAR data segmented into forward and rear sectors; if an obstacle appears in the forward sector, the system checks the opposite lane via ultrasonic sensors before deciding to stop or change lanes. Adaptive cruise control adjusts speed by ±5 % or ±1 % of the current velocity to maintain a safe following distance.
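Both decisions can be sketched in a few lines. The rule for choosing between the ±5 % and ±1 % steps (a coarse correction when the gap error is large, a fine one when it is small) and the specific gap thresholds are our own assumptions for illustration:

```python
def avoid_obstacle(forward_blocked: bool, opposite_lane_clear: bool) -> str:
    """If the forward LIDAR sector is blocked, consult the side ultrasonics."""
    if not forward_blocked:
        return "continue"
    return "change_lane" if opposite_lane_clear else "stop"

def acc_speed(current: float, gap_cm: float, target_gap_cm: float = 50.0) -> float:
    """Adjust speed by ±5 % (large gap error) or ±1 % (small gap error)."""
    error = gap_cm - target_gap_cm        # positive: lead car is too far ahead
    step = 0.05 if abs(error) > 20.0 else 0.01
    if error > 0:
        return current * (1.0 + step)     # speed up to close the gap
    if error < 0:
        return current * (1.0 - step)     # slow down to open the gap
    return current
```

The two‑step scheme keeps corrections aggressive when the following distance is badly off while avoiding oscillation once the vehicle is near the target gap.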

Results & Improvements

Initial tests exposed three weaknesses: a top‑heavy chassis at risk of flipping over, limited side‑lane detection, and unstable speed adjustments in cruise control.

We mitigated the top‑heavy design by removing the secondary Raspberry Pi and lowering the chassis, reducing flip‑over risk. Adding ultrasonic sensors improved side‑lane detection, and a smoother speed‑adjustment routine stabilized cruise control.

Future Work

Future iterations will explore machine‑learning‑based path planning, real‑time traffic‑signal integration, and multi‑vehicle coordination through V2X communication. Enhancing the adaptive cruise algorithm with predictive models based on distance‑time calculations is also planned.
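One reading of the planned "predictive models based on distance‑time calculations" is a time‑to‑collision estimate: when the gap to the lead car is shrinking, the gap divided by the closing speed tells the controller how long it has to react. This interpretation, with assumed SI units, is ours:

```python
def time_to_collision(gap_m: float, closing_speed_m_s: float) -> float:
    """Seconds until contact if the closing speed stays constant.

    Returns infinity when the gap is holding steady or opening, i.e.
    when the closing speed is zero or negative.
    """
    if closing_speed_m_s <= 0:
        return float("inf")
    return gap_m / closing_speed_m_s
```

A predictive cruise controller could begin braking when this value drops below a safety horizon rather than waiting for the gap itself to shrink.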

Conclusion

By combining sensor fusion, modular control logic, and iterative testing, we have produced a prototype real‑time motion‑planning system that operates safely in a controlled urban environment. While not yet production‑ready, the architecture lays a solid foundation for Level 4 autonomous driving.

Acknowledgements

Special thanks to Dr. Lisa Fiorentini for her mentorship, to my family for unwavering support, and to my peers for constructive feedback.

