By David Thames and Noah Miller
Ever wondered how Google’s Self-Driving Car or Tesla Autopilot works? We did, so we decided to spend a Saturday trying to make one ourselves. We began working on this last weekend at a hackathon sponsored by our living and learning community here at Virginia Tech. It was only 7 hours or so on Saturday afternoon, but we got to play with some fun technology and will be continuing this work in the coming weeks.
Our idea was to replicate, on a small scale and with cheap hardware, the ideas the big companies use in their self-driving cars. We want the software to provide the real improvement over your average obstacle-avoidance robot.
For our rapid hackathon prototyping, we decided to use our Lego NXT set to build a quick car. We Amazon Primed a few extra NXT cables and a couple of cheap female ports for the NXT wire connectors so we could make the NXT components Arduino/Pi compatible.
We wanted to replicate the driving of a real car, so we wouldn’t settle for the tread-style steering you typically see at robotics competitions. We set up two NXT motors to drive the back wheels and one NXT motor to steer the front wheels.
An ultrasonic sensor spins 180 degrees on a servo on top of the car. Most of the self-driving cars out there use similar technology: a spinning LiDAR sensor. LiDAR uses lasers in much the same way ultrasonic sensors use sound waves: both send out a signal, bounce it off objects, and calculate distance from how long the reflection takes to return. LiDAR allows for much finer depth sensing, but ultrasonic will do for our purposes. With this, we should theoretically be able to obtain the distance of every object (on the same horizontal plane as the sensor) within a semicircle of 400cm (~13ft) radius around the sensor. In addition to this data, we have a 720p webcam for extra information directly in front of the vehicle.
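The distance math behind this is simple enough to sketch. Assuming a typical hobby ultrasonic sensor that reports the echo’s round-trip time in microseconds (the function name here is our own, not from any library), the conversion looks like:

```cpp
#include <cassert>
#include <cmath>

// Convert an ultrasonic echo round-trip time (microseconds) to a distance
// in centimeters. Sound travels at roughly 343 m/s (~0.0343 cm/us) at room
// temperature, and the pulse covers the distance twice (out and back),
// so we halve the result. Typical hobby sensors top out around 400 cm.
double echoToDistanceCm(double echoMicros) {
    const double speedOfSoundCmPerUs = 0.0343;
    return (echoMicros * speedOfSoundCmPerUs) / 2.0;
}
```

A 1000-microsecond echo works out to about 17 cm, which gives a feel for how fast these pings come back at indoor distances.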
For ease of connection, we planned to hook our motors and sensors up to the Arduino. The Arduino then connects to the Pi over a serial port, and the Pi does the real processing.
We will define our ‘smart car’ as a vehicle that can autonomously do the following: determine its location on its own given a simple blueprint-like floor map of a building, route itself based on the floor map to given coordinates, and intelligently deal with obstacles (finding a completely new route through the building if a whole hallway is blocked off, waiting for a person to walk by before continuing, etc.).
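The routing piece of that definition can at least be sketched on a blueprint-like grid map. This is a hypothetical sketch of the idea, not code we wrote at the hackathon: a breadth-first search over a grid where `.` is open floor and `#` is a wall, returning -1 when a route doesn’t exist (the blocked-hallway case that would force a full replan):

```cpp
#include <cassert>
#include <queue>
#include <string>
#include <utility>
#include <vector>

// Breadth-first search over a blueprint-like occupancy grid: '.' is open
// floor, '#' is a wall. Returns the number of steps in the shortest route
// from start to goal, or -1 if the goal is unreachable (e.g. a whole
// hallway is blocked off and the car must find a completely new route).
int shortestRoute(const std::vector<std::string>& grid,
                  std::pair<int,int> start, std::pair<int,int> goal) {
    int rows = grid.size(), cols = grid[0].size();
    std::vector<std::vector<int>> dist(rows, std::vector<int>(cols, -1));
    std::queue<std::pair<int,int>> q;
    dist[start.first][start.second] = 0;
    q.push(start);
    const int dr[] = {1, -1, 0, 0}, dc[] = {0, 0, 1, -1};
    while (!q.empty()) {
        auto [r, c] = q.front();
        q.pop();
        if (r == goal.first && c == goal.second) return dist[r][c];
        for (int i = 0; i < 4; ++i) {
            int nr = r + dr[i], nc = c + dc[i];
            if (nr >= 0 && nr < rows && nc >= 0 && nc < cols &&
                grid[nr][nc] == '.' && dist[nr][nc] == -1) {
                dist[nr][nc] = dist[r][c] + 1;
                q.push({nr, nc});
            }
        }
    }
    return -1;  // goal unreachable: time to replan or wait
}
```

Real hallways aren’t grid cells, of course, but a coarse grid over the floor plan is probably good enough for picking which corridors to take.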
Unfortunately, we were unable to finish the hardware in time for the end of the hackathon, leaving our whole project dead in the water for demos.
Very early on we got the ultrasonic sensor working. We scrapped our Lego ultrasonic sensor in favor of another sensor available at the hackathon (which I want to thank Scott Ziv for letting us keep). The sensor sits on a servo which rotates it 180 degrees back and forth (using the very technical components of cardboard and hot glue). It outputs data from the Arduino’s serial port in bursts: an array containing data points from the entire 180-degree range.
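On the Pi side, each burst is just distances indexed by angle, so turning a sweep into 2D points in the car’s frame is straightforward. A minimal sketch (our own naming, assuming one reading per degree and treating max-range readings as “nothing seen”):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Point { double x, y; };

// Convert one 180-degree sweep of ultrasonic readings into 2D points in
// the car's frame. readings[i] is the distance (cm) measured at i degrees,
// where 0 points right, 90 straight ahead, and 180 left. Readings at or
// beyond maxRange are treated as "no obstacle seen" and skipped.
std::vector<Point> sweepToPoints(const std::vector<double>& readings,
                                 double maxRange = 400.0) {
    std::vector<Point> pts;
    const double pi = 3.14159265358979323846;
    for (std::size_t deg = 0; deg < readings.size(); ++deg) {
        double d = readings[deg];
        if (d <= 0 || d >= maxRange) continue;  // nothing within range
        double rad = deg * pi / 180.0;
        pts.push_back({d * std::cos(rad), d * std::sin(rad)});
    }
    return pts;
}
```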
One major pitfall: the ports we bought for the NXT cables were not breadboard compatible, nor did any female wires we had fit them. After going through 4 ports, a bunch of cardboard, some melted plastic, and way too much time, we had one port working, so we ended up just stripping NXT wires for the rest.
We were able to get the two rear motors driving forward on Arduino power. However, when we wired them to the Arduino’s logic pins, the Arduino refused to output more than 2V rather than the full 5V it should. With little time left, we tried at least three different methods using relays, capacitors, and transistors, all to no avail; we were always missing one component or needed one more volt. And that leaves the steering motor: the steering needed to turn both ways, and since none of our H-bridges worked properly, the car never gained the ability to turn.
On the software side, we made some good progress. While we couldn’t test much without the hardware, we worked on some algorithms and theories which I believe can help guide this project.
We focused mainly on helping the robot determine its location. In our whiteboarding session, we realized that although the obstacles the car sees will not necessarily be on its map, wherever the car sees blank space, the map should also show blank space (barring sensing errors and major map discrepancies). So we set about devising an algorithm to find the areas where one shape fits inside the other. We coded some quick components of these algorithms in C++, but in the end spent most of our time trying to get the hardware working before demos, leaving us with bits and pieces of programming ideas more than actual code.
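To make the “blank space must match blank space” idea concrete, here is a brute-force sketch of what we were aiming at (again our own illustration, not the hackathon code): slide the set of cells the car observed as free over the floor map, score each candidate position by how many observed-free cells land on free map cells, and take the best-scoring offset as the location guess. Real scan matching would also need to handle rotation and sensor noise.

```cpp
#include <cassert>
#include <string>
#include <utility>
#include <vector>

// Brute-force localization sketch. 'map' is the blueprint-like floor map
// ('.' = free, '#' = wall); 'freeCells' are cells the car observed as free,
// given as (row, col) offsets relative to a candidate position. We score
// every candidate position by how many observed-free cells line up with
// free map cells and return the best one.
std::pair<int,int> bestFit(const std::vector<std::string>& map,
                           const std::vector<std::pair<int,int>>& freeCells) {
    int rows = map.size(), cols = map[0].size();
    int bestR = 0, bestC = 0, bestScore = -1;
    for (int r = 0; r < rows; ++r) {
        for (int c = 0; c < cols; ++c) {
            int score = 0;
            for (auto [dr, dc] : freeCells) {
                int mr = r + dr, mc = c + dc;
                if (mr >= 0 && mr < rows && mc >= 0 && mc < cols &&
                    map[mr][mc] == '.')
                    ++score;  // observed blank space matches map blank space
            }
            if (score > bestScore) {
                bestScore = score;
                bestR = r;
                bestC = c;
            }
        }
    }
    return {bestR, bestC};
}
```

This is O(map size × scan size) per fix, which is fine for a small floor plan; on an ambiguous map (a long symmetric hallway, say) several positions can tie, and the car would need to keep moving to break the tie.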
We got some good direction from this experience even though it wasn’t much of a success on its own. We know what changes to make to the hardware moving forward. We will have to acquire an H-bridge (and learn how to use it) for steering. We decided to scrap the Arduino in favor of just a Raspberry Pi powered by a phone power brick, to reduce the code we need for communication between devices. In terms of the code, most of the work will be applying our ideas in a concrete and efficient manner. Then, once we have the basics out of the way, we can work on the cool stuff: analyzing how the big players (Google, Uber, etc.) make their cars react intelligently to obstacles, maybe throwing in some machine learning or crowdsourcing here and there, and so on.
Finally, where do we go from here? Well, for anyone interested in hearing more about this project, you will be happy to hear that this weekend we will be at BitCamp, a full 36-hour hackathon, to rebuild this project from the ground up.
Update: After realizing BitCamp was hosting an autonomous car workshop, we decided to work on a more novel project for that hackathon. More on this soon!