
Automated Vision‑Based Object Tracking with Raspberry Pi and OpenCV

A pan/tilt servo system that automatically tracks color objects using computer vision.

Story

Introduction

In the previous tutorial we learned how to control a pan/tilt servo to position a Raspberry Pi camera. Today we’ll extend that setup so the camera can autonomously track a colored object.

This is my first hands‑on experience with OpenCV, and I’m already enamoured with this open‑source computer‑vision library.

OpenCV is free for academic and commercial use, offers C++, C, Python, and Java interfaces, and runs on Windows, Linux, macOS, iOS, and Android. In this series we’ll focus on the Raspberry Pi (Raspbian) and Python, because OpenCV is built for computational efficiency and real‑time applications—exactly what you need for physical‑computing projects.

Step 1: BOM – Bill of Materials

Main parts:

(*) You can buy a complete pan/tilt platform with servos or build your own.

Step 2: Installing OpenCV 3

On a Raspberry Pi V3 running the latest Raspbian Stretch, the most reliable method is to follow Adrian Rosebrock’s tutorial (Raspbian Stretch: Install OpenCV 3 + Python on your Raspberry Pi). It guides you through building OpenCV from source and sets up a dedicated Python virtual environment.

After completing the tutorial, activate the virtual environment and confirm the installation:

source ~/.profile
workon cv
python
import cv2
print(cv2.__version__)

You should see version 3.3.0 or newer. The output confirms that OpenCV is ready for use inside the cv virtual environment.

Step 3: Testing Your Camera

With OpenCV installed, verify that the PiCamera streams correctly. Run the following script (or download simpleCamTest.py):

import cv2

cap = cv2.VideoCapture(0)
while True:
    ret, frame = cap.read()
    if not ret:  # stop if the camera returns no frame
        break
    frame = cv2.flip(frame, -1)  # flip vertically if the camera is mounted upside down
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    cv2.imshow('frame', frame)
    cv2.imshow('gray', gray)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()

Press q or Ctrl+C to exit. A successful run shows both color and grayscale feeds.

Step 4: Color Detection with OpenCV

Color detection is much easier in the HSV (Hue, Saturation, Value) space, which aligns better with human color perception than the BGR space OpenCV uses by default. To detect a specific color, first determine its BGR values, then convert them to HSV and establish lower/upper bounds around the hue.

Example: tracking a yellow object. Suppose a color picker (e.g., PowerPoint's eyedropper) gives the BGR values (71, 234, 213). Convert them to HSV with the script below (or download bgr_hsv_converter.py):

import sys
import numpy as np
import cv2

# Read the BGR components from the command line
blue, green, red = int(sys.argv[1]), int(sys.argv[2]), int(sys.argv[3])
color = np.uint8([[[blue, green, red]]])      # a 1x1-pixel BGR image
hsv = cv2.cvtColor(color, cv2.COLOR_BGR2HSV)
hue = int(hsv[0][0][0])                       # OpenCV hue range is 0-179
print('Lower bound:', [hue - 10, 100, 100])
print('Upper bound:', [hue + 10, 255, 255])

Running python bgr_hsv_converter.py 71 234 213 prints:

Lower bound: [24, 100, 100]
Upper bound: [44, 255, 255]

Use these bounds to mask the color in an image:

import cv2, numpy as np
img = cv2.imread('yellow_object.JPG', 1)
img = cv2.resize(img, (0,0), fx=0.2, fy=0.2)
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
lower = np.array([24, 100, 100], dtype=np.uint8)
upper = np.array([44, 255, 255], dtype=np.uint8)
mask = cv2.inRange(hsv, lower, upper)
cv2.imshow('mask', mask)
cv2.imshow('image', img)
while True:
    k = cv2.waitKey(0)
    if k == 27:  # Esc key closes the windows
        break
cv2.destroyAllWindows()

Save the script as colorDetection.py and run python colorDetection.py (ensure the image is in the same directory).

Step 5: Real‑Time Object Tracking

Building on the mask, we can track the object’s centroid frame‑by‑frame. The code below borrows from Adrian Rosebrock’s Ball Tracking tutorial and incorporates the HSV bounds we defined earlier.

First, ensure the imutils library is installed:

pip install imutils

Then run ball_tracking.py:

python ball_tracking.py

The script flips the frame vertically (if needed) and applies the mask. When the object is detected, a circle and centroid are drawn.

Step 6: Testing the GPIOs

Now let’s add a red LED to the Pi and confirm that we can control it via GPIO.

Wiring: connect the LED anode to GPIO 21 through a 220 Ω resistor, and the cathode to ground.

If RPi.GPIO is not yet installed in the cv environment, install it:

pip install RPi.GPIO

Test script (LED_simple_test.py):

import sys
import time
import RPi.GPIO as GPIO

led = int(sys.argv[1])    # BCM pin number
freq = int(sys.argv[2])   # seconds between toggles
GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)
GPIO.setup(led, GPIO.OUT)
print(f'Blinking LED on GPIO {led} every {freq}s, 5 times.')
for _ in range(5):
    GPIO.output(led, GPIO.HIGH)  # LED on
    time.sleep(freq)
    GPIO.output(led, GPIO.LOW)   # LED off
    time.sleep(freq)
GPIO.cleanup()

Run with python LED_simple_test.py 21 1 to blink the LED five times every second.

Step 7: Color Recognition & GPIO Interaction

Combine the tracking logic with GPIO control so that the LED lights up whenever the target color is detected. The core additions are:

import RPi.GPIO as GPIO
led = 21
GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)
GPIO.setup(led, GPIO.OUT)
GPIO.output(led, GPIO.LOW)
ledOn = False

Inside the detection loop, after confirming a valid radius, add:

if not ledOn:
    GPIO.output(led, GPIO.HIGH)
    ledOn = True

Full script: object_detection_LED.py. Run with python object_detection_LED.py. The LED turns on only when the object’s color falls within the HSV bounds.
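The script also needs the complementary branch: when the object is lost, the LED should go off again. One way to keep that logic testable without hardware is to factor it into a tiny state helper; the name `update_led` and the `set_output` callback are my own (in the real script, `set_output` would wrap `GPIO.output(led, ...)`):

```python
def update_led(detected, led_on, set_output):
    """Drive the LED only on state changes; return the new LED state.

    `set_output` receives True/False; in the real script it would wrap
    GPIO.output(led, GPIO.HIGH) / GPIO.output(led, GPIO.LOW).
    """
    if detected and not led_on:
        set_output(True)       # object just appeared: LED on
        return True
    if not detected and led_on:
        set_output(False)      # object just disappeared: LED off
        return False
    return led_on              # no change: don't touch the pin
```

Calling it once per frame with the current detection result replaces both the `if not ledOn:` block above and its mirror-image off branch.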

Step 8: The Pan/Tilt Mechanism

With OpenCV and GPIO working, attach the servos to an external 5 V supply and wire each servo's data pin to a Pi GPIO (this tutorial drives the tilt servo from GPIO 17).

Share the ground between the Pi, the servos, and the power supply. Optionally place a 1 kΩ resistor in series between each Pi GPIO and servo data pin to protect the Pi from current spikes.

Test servo control with angleServoCtrl.py:

from time import sleep
import RPi.GPIO as GPIO

GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)

def setServoAngle(servo, angle):
    pwm = GPIO.PWM(servo, 50)   # standard 50 Hz servo signal
    pwm.start(8)
    # Map 0-180 degrees to roughly a 3-13 % duty cycle (~0.6-2.6 ms pulse)
    duty = angle / 18. + 3.
    pwm.ChangeDutyCycle(duty)
    sleep(0.3)                  # give the servo time to reach the angle
    pwm.stop()

if __name__ == '__main__':
    import sys
    servo = int(sys.argv[1])
    GPIO.setup(servo, GPIO.OUT)
    setServoAngle(servo, int(sys.argv[2]))
    GPIO.cleanup()

Example: python angleServoCtrl.py 17 45 moves the tilt servo to 45°.

Step 9: Determining the Object’s Real‑Time Position

To center the camera on the target, we first need the object’s centroid coordinates. Modify object_detection_LED.py to print the x and y values:

def mapObjectPosition(x, y):
    print(f'[INFO] Object center at X={x}, Y={y}')

The coordinates range from 0–500 (left‑to‑right) for x and 0–350 (top‑to‑bottom) for y. Move the object and observe the live output.

Step 10: Object Position Tracking System

Define a “centered” region (e.g., 220 < x < 280 and 160 < y < 210). If the object falls outside, adjust the pan and tilt servos accordingly:

def mapServoPosition(x, y):
    global panAngle, tiltAngle
    if x < 220:                               # object too far left in the frame
        panAngle = min(panAngle + 10, 140)
        positionServo(panServo, panAngle)
    if x > 280:                               # object too far right
        panAngle = max(panAngle - 10, 40)
        positionServo(panServo, panAngle)
    if y < 160:                               # object too high
        tiltAngle = min(tiltAngle + 10, 140)
        positionServo(tiltServo, tiltAngle)
    if y > 210:                               # object too low
        tiltAngle = max(tiltAngle - 10, 40)
        positionServo(tiltServo, tiltAngle)

The helper positionServo(servo, angle) simply calls angleServoCtrl.py:

import os

def positionServo(servo, angle):
    os.system(f'python angleServoCtrl.py {servo} {angle}')
    print(f'[INFO] Positioning servo GPIO {servo} to {angle}°')

All scripts are in the same directory; download the complete implementation from objectDetectTrack.py.
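Because positionServo shells out on every call, it helps to keep the angle-stepping logic pure so it can be unit-tested without servos attached. A sketch of that refactoring follows; the function name `step_angles` and its keyword defaults are my own, but the thresholds, step size, and 40–140° clamp match the mapServoPosition code above:

```python
def step_angles(x, y, pan, tilt, step=10, lo=40, hi=140,
                x_band=(220, 280), y_band=(160, 210)):
    """Return updated (pan, tilt) angles for an object centroid at (x, y).

    Angles move in `step`-degree increments, clamped to [lo, hi]; nothing
    changes while the centroid stays inside the centered dead band.
    """
    if x < x_band[0]:            # object too far left in the frame
        pan = min(pan + step, hi)
    elif x > x_band[1]:          # object too far right
        pan = max(pan - step, lo)
    if y < y_band[0]:            # object too high
        tilt = min(tilt + step, hi)
    elif y > y_band[1]:          # object too low
        tilt = max(tilt - step, lo)
    return pan, tilt
```

The main loop then computes `new_pan, new_tilt = step_angles(x, y, panAngle, tiltAngle)` and calls positionServo only for the angle that actually changed.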

Conclusion

This project demonstrates how to combine OpenCV, Raspberry Pi GPIO, and a pan‑tilt servo mount to create an autonomous color‑object tracking system. Feel free to experiment with different colors, shapes, and servo speeds.

Full code and additional resources are available on GitHub and the MJRoBot.org blog.

Thank you for reading—see you in the next tutorial!
