Overview:
When we approach an intersection as pedestrians, we generally make eye contact with the approaching or waiting driver to show our intention to cross before stepping into the crosswalk.
What if there is no driver and the vehicle is autonomous? We are told these cars are smart, but how do we REALLY know they see us?
Inspired by the Larson Scanner (the sweeping red lights from Knight Rider and Battlestar Galactica, named after TV producer Glen A. Larson), I thought it would be a good starting point since it is instantly recognizable as robotic. Without adding extra hardware, I want to use the existing LEDs to accomplish the goal of "seeing" the pedestrian.
The red light on the Larson Scanner does not convey a friendly feeling; furthermore, we associate red with "stop" in a traffic scenario. A green light conveys "proceed" and is friendlier. Once the pedestrian is spotted, we will switch to a green light that follows the pedestrian as they make their way across the front of the vehicle.
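For reference, here is a minimal sketch of the idle red Larson-scanner sweep on a NeoPixel strip. It assumes the Adafruit NeoPixel library; the data pin, pixel count, brightness, and timing are placeholders, not the values used on the prototype.

```cpp
// Minimal Larson-scanner sweep on a NeoPixel strip (sketch only).
// Assumes the Adafruit NeoPixel library; LED_PIN, NUM_PIXELS and the
// timing values are placeholders, not the prototype's actual values.
#include <Adafruit_NeoPixel.h>

const int LED_PIN = 6;        // data pin for the strip (assumption)
const int NUM_PIXELS = 60;    // length of the strip (assumption)

Adafruit_NeoPixel strip(NUM_PIXELS, LED_PIN, NEO_GRB + NEO_KHZ800);

int eyePos = 0;               // current position of the red "eye"
int eyeDir = 1;               // +1 sweeping right, -1 sweeping left

void setup() {
  strip.begin();
  strip.setBrightness(80);
  strip.show();               // start with all pixels off
}

void loop() {
  strip.clear();
  // Bright red "eye" with a dimmer trailing pixel on each side
  strip.setPixelColor(eyePos, strip.Color(255, 0, 0));
  if (eyePos > 0)              strip.setPixelColor(eyePos - 1, strip.Color(40, 0, 0));
  if (eyePos < NUM_PIXELS - 1) strip.setPixelColor(eyePos + 1, strip.Color(40, 0, 0));
  strip.show();

  // Bounce the eye back and forth along the strip
  eyePos += eyeDir;
  if (eyePos <= 0 || eyePos >= NUM_PIXELS - 1) eyeDir = -eyeDir;

  delay(30);                   // sweep speed (tune to taste)
}
```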
For the detection part of the system I was limited to what I had on hand, and I settled on using five Sharp distance sensors. These sensors are quite limited in their range and FOV, but since this is a proof of concept they are adequate.
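Each Sharp sensor can be treated as a simple "pedestrian present" flag by thresholding its analog output. This is only a rough sketch: the analog pin and threshold value are assumptions and need calibration for the specific sensor model and mounting distance.

```cpp
// Rough sketch: reading one Sharp analog IR sensor as a presence flag.
// The pin and threshold are assumptions; the raw reading rises as an
// object gets closer, so calibrate the threshold in place.
const int SENSOR_PIN = A0;           // analog pin (assumption)
const int PRESENCE_THRESHOLD = 300;  // raw 10-bit ADC value, calibrate in place

bool pedestrianAt(int pin) {
  return analogRead(pin) > PRESENCE_THRESHOLD;
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.println(pedestrianAt(SENSOR_PIN) ? "pedestrian" : "clear");
  delay(100);
}
```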
The sensors and NeoPixel strip are connected to an Arduino, then mounted to the front of a car.
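Putting the pieces together, the main loop can be a simple mode switch: run the red scanner while no sensor sees anyone, and light a green segment in front of whichever sensor is triggered once someone steps in. The sketch below is under the same assumptions as above (Adafruit NeoPixel library, placeholder pins, uncalibrated threshold, sensors ordered left to right along the strip); it is not the exact code from the prototype.

```cpp
// Sketch of the overall behavior: red Larson sweep while idle, and a
// green segment that follows the pedestrian across the five sensor zones.
// Pins, pixel count, and threshold are assumptions, not the prototype's values.
#include <Adafruit_NeoPixel.h>

const int LED_PIN = 6;
const int NUM_PIXELS = 60;
const int NUM_SENSORS = 5;
const int SENSOR_PINS[NUM_SENSORS] = {A0, A1, A2, A3, A4};  // left to right
const int PRESENCE_THRESHOLD = 300;   // raw ADC value, needs calibration

Adafruit_NeoPixel strip(NUM_PIXELS, LED_PIN, NEO_GRB + NEO_KHZ800);

int eyePos = 0;
int eyeDir = 1;

void setup() {
  strip.begin();
  strip.setBrightness(80);
  strip.show();
}

// Returns the index of the first sensor that sees something, or -1 if none do.
int activeSensor() {
  for (int i = 0; i < NUM_SENSORS; i++) {
    if (analogRead(SENSOR_PINS[i]) > PRESENCE_THRESHOLD) return i;
  }
  return -1;
}

void loop() {
  int zone = activeSensor();
  strip.clear();

  if (zone < 0) {
    // Idle: red Larson-scanner sweep
    strip.setPixelColor(eyePos, strip.Color(255, 0, 0));
    eyePos += eyeDir;
    if (eyePos <= 0 || eyePos >= NUM_PIXELS - 1) eyeDir = -eyeDir;
  } else {
    // Pedestrian spotted: light a green segment in front of that sensor's zone
    int zoneWidth = NUM_PIXELS / NUM_SENSORS;
    int start = zone * zoneWidth;
    for (int i = start; i < start + zoneWidth; i++) {
      strip.setPixelColor(i, strip.Color(0, 255, 0));
    }
  }

  strip.show();
  delay(30);
}
```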
Issues / improvements:
Given the short time frame and the limitations of the hardware I had access to, here are some of the issues I came across and the improvements I would like to make:
- Sensors: The five sensors I used had a very narrow field of view and limited range, and were very finicky. For version 2 I would use a Kinect for smoother tracking.
- Code: Better code for smoother green LED tracking behavior (one possible approach is sketched after this list).
- "Looking" Behavior: Explore various way the LEDs "look" are you.
Bonus:
After I created my prototype, I came across two versions of the same idea done by SEMCOM and RISE Viktoria.