Walabasquiat: Interactive Generative Art Powered by Walabot, Raspberry Pi & Android
Walabasquiat is a live generative art installation that turns motion detected by a Walabot 3D imaging sensor into dynamic visual patterns. The system runs on a Raspberry Pi, uses Processing for rendering, and streams sensor data via a lightweight Flask RESTful API to an Android live‑wallpaper app.
Story
Idea 🤔 💡
Since the 1990s, generative art has fascinated me—especially William Latham’s Organic Art PC, which used genetic algorithms to morph simple shapes into intricate, life‑like forms. I envisioned a public installation where visitors could directly influence those algorithms through their own movement. Walabasquiat realizes that vision by feeding real‑time positional data from the Walabot sensor into a Processing sketch that continuously mutates its visual output.
Getting Started 🔰 👩‍💻
Connecting the Walabot to a Raspberry Pi is plug‑and‑play. Plug the sensor into one of the Pi's USB ports (power the Pi with at least a 2.5 A supply), then install the official Walabot API:
cd ~
wget https://s3.eu-central-1.amazonaws.com/walabot/WalabotInstaller/Latest/walabot_maker_1.0.34_raspberry_arm32.deb
sudo dpkg -i walabot_maker_1.0.34_raspberry_arm32.deb
After that, install the Python bindings:
pip install WalabotAPI --no-index --find-links="/usr/share/walabot/python/"
Run the bundled example to verify the sensor is communicating:
cd /usr/share/doc/walabot/examples/python
python SensorApp.py
You’ll see live X/Y/Z coordinates and amplitude values stream to your terminal. With the sensor working, the real creative work begins.
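If you prefer to poll the sensor from your own script rather than the bundled example, the loop looks roughly like this. The call names below follow the pattern used in the SDK's Python examples (`Init`, `ConnectAny`, `SetProfile`, `Trigger`, `GetSensorTargets`); treat them as assumptions and check them against SensorApp.py on your install:

```python
def target_to_dict(target):
    # Flatten one sensor target into a plain dict (pure, hardware-free),
    # rounding positions to one decimal place of centimetres.
    return {
        "x": round(target.xPosCm, 1),
        "y": round(target.yPosCm, 1),
        "z": round(target.zPosCm, 1),
        "amplitude": target.amplitude,
    }

def poll_targets():
    # Hardware loop -- only runs when the Walabot SDK and device are present.
    import WalabotAPI as wb
    wb.Init()
    wb.ConnectAny()
    wb.SetProfile(wb.PROF_SENSOR)
    wb.Start()
    try:
        while True:
            wb.Trigger()
            for t in wb.GetSensorTargets():
                print(target_to_dict(t))
    finally:
        wb.Stop()
        wb.Disconnect()

if __name__ == "__main__":
    try:
        poll_targets()
    except ImportError:
        print("WalabotAPI not installed; run this on the Pi.")
```

Keeping the formatting step (`target_to_dict`) separate from the hardware loop makes it easy to reuse when serialising readings for the API later.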
Development Process 💪 💻
The core challenge was exposing the Walabot data to Processing. Rather than embed the Walabot SDK directly in a Python Mode sketch—which can trigger version conflicts—I built a lightweight RESTful API with Flask. The API serialises sensor readings into JSON, enabling any network‑capable client to consume them.
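To make the serialisation step concrete, here is a minimal sketch of turning sensor readings into the kind of JSON payload the API serves. The field names and wrapper key are illustrative; the adapted walabot‑web‑api defines the actual schema:

```python
import json

def targets_to_json(targets):
    """Serialise a list of (x, y, z, amplitude) tuples into a JSON
    payload. Field names here are illustrative, not the API's schema."""
    payload = {
        "sensortargets": [
            {"x": x, "y": y, "z": z, "amplitude": a}
            for (x, y, z, a) in targets
        ]
    }
    return json.dumps(payload)

# Example: two detected targets
print(targets_to_json([(10.0, -3.5, 52.0, 0.81), (1.2, 4.4, 60.3, 0.42)]))
```

Because the payload is plain JSON over HTTP, the Processing sketch, the Android wallpaper, or any other client can consume it without linking against the Walabot SDK.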
I adapted walabot‑web‑api for Linux and the Creator firmware, then deployed it on the Pi. With the API running, I leveraged the HYPE Processing library (still fully functional despite not having been updated in two years) to create a responsive, particle‑style visualisation of the sensor data. The result is a living tapestry that swarms around each detected target.
For visitors, I packaged the same logic into Walabasquiandroid, an Android live wallpaper. It keeps the visual experience lightweight, so users can enjoy the generative patterns on their phones long after leaving the exhibition.
Steps to Reproduce 📑 🚀
To duplicate the installation, follow these steps:
- Attach the Walabot to the Raspberry Pi and install the API as described above.
- Download and launch the Flask server:
cd /usr/share/doc/walabot/examples/python
sudo wget https://raw.githubusercontent.com/ishotjr/walabot-web-api/rpi/app.py
python3 app.py
Verify connectivity with a simple curl request:
curl -i http://192.168.1.69:5000/walabot/api/v1.0/sensortargets
Replace 192.168.1.69 with your Pi's local IP (discoverable via ip addr show). Note that the Flask development server speaks plain HTTP. The response is a JSON payload listing the detected sensor targets.
- Install Processing on the Pi (the command‑line installer is the easiest route):
curl https://processing.org/download/install-arm.sh | sudo sh
- Clone the project repositories and install the HYPE library:
cd ~/sketchbook
git clone https://github.com/ishotjr/Walabasquiat.git
git clone https://github.com/hype/HYPE_Processing.git
unzip HYPE_Processing/distribution/HYPE.zip -d ~/sketchbook/libraries/HYPE
Launch Processing from the Raspberry Pi’s application menu, open the Walabasquiat sketch, and let the generative art respond to live sensor data.
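Inside the sketch, the essential transform is mapping the Walabot's arena coordinates (centimetres) onto canvas pixels so the particles can swarm around each target. A minimal, hardware‑free sketch of that linear mapping; the arena bounds here are illustrative placeholders, not the values the actual sketch uses:

```python
def arena_to_canvas(x_cm, y_cm, width, height,
                    x_range=(-30.0, 30.0), y_range=(-30.0, 30.0)):
    """Linearly map arena centimetres onto pixel coordinates, with the
    arena's minimum corner landing at pixel (0, 0)."""
    x_min, x_max = x_range
    y_min, y_max = y_range
    px = (x_cm - x_min) / (x_max - x_min) * width
    py = (y_cm - y_min) / (y_max - y_min) * height
    return px, py

# A target at the arena's centre lands at the canvas centre:
print(arena_to_canvas(0.0, 0.0, 800, 600))  # (400.0, 300.0)
```

The same mapping works in Processing's Java syntax; only the function signature changes.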
For a portable experience, build the Android live wallpaper using the same RESTful interface. The result is a vibrant, real‑time visual narrative that bridges hardware, software, and human interaction.
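Whatever the client (the Processing sketch, the Android wallpaper, or a quick test script), consuming the endpoint boils down to an HTTP GET plus JSON parsing. A hedged Python sketch, assuming the endpoint URL above and a payload that is either a bare list of targets or an object wrapping one (the adapted walabot‑web‑api defines the real schema):

```python
import json
from urllib.request import urlopen

# Assumed endpoint; substitute your Pi's IP.
API_URL = "http://192.168.1.69:5000/walabot/api/v1.0/sensortargets"

def parse_targets(raw):
    # Pure parsing step, separated so it can be exercised without a network.
    data = json.loads(raw)
    # Accept either a bare list or an object wrapping the target list.
    targets = data if isinstance(data, list) else next(iter(data.values()))
    return [(t.get("x"), t.get("y"), t.get("z")) for t in targets]

def fetch_targets(url=API_URL):
    # Live call -- requires the Flask server to be reachable.
    with urlopen(url, timeout=2) as resp:
        return parse_targets(resp.read().decode("utf-8"))
```

Polling `fetch_targets()` a few times per second is enough to drive a smooth animation, since the Walabot reports only a handful of targets per frame.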
Visual Highlights
Walabasquiat swarms around three Walabot targets in real time.
Walabasquiandroid – the Android live‑wallpaper companion.
Read More
Explore the full code and documentation on GitHub and learn how to extend the installation to new sensors or visual styles.