[Photo: The completed propulsion chassis]
[Photo: Close-up view of the gear reduction]
[Photo: Under view showing the drag wheel and sensor mounting rails]
[Photo: Completed sensor package; note the light sensor block (blue) at the right, on the bottom of the package]
[Photo: Side-rear view of the completed robot]
[Photo: Using the Lego test course to debug the program and adjust the sensor]
[Photo: Completed model on the outdoor course]

This line-following robot is an on-the-fly design of our own. The project is built using parts from the Lego Mindstorms Robotics Invention System, along with a few borrowed parts from the Discovery Set.

The propulsion chassis is our own design. We had several goals in mind as we put it together. The robot had to have enough torque to move additional weight, such as a trailer or load. It also needed a larger wheelbase to navigate slightly rough terrain (such as carpeting) gracefully.

To move larger loads, we employed a double gear reduction, providing more torque at the cost of speed. A close-up view of the gear system is shown at the left. The control module is mounted externally to the propulsion system to lengthen the wheelbase and keep the design low-profile. The propulsion chassis and the control module were snapped together, then held in place with several reinforcing strips.
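The torque gain from a double reduction is just the product of the per-stage ratios. The tooth counts below are hypothetical (the write-up does not list the actual gears used); this is only a sketch of the arithmetic:

```python
# Hypothetical tooth counts -- the actual Lego gears used are not
# specified in the write-up.
def reduction_ratio(stages):
    """Overall reduction: product of (driven / driver) for each stage."""
    ratio = 1.0
    for driver_teeth, driven_teeth in stages:
        ratio *= driven_teeth / driver_teeth
    return ratio

# Two stages of an 8-tooth gear driving a 24-tooth gear: 3:1 each.
overall = reduction_ratio([(8, 24), (8, 24)])
print(overall)  # 9.0 -- torque scales up ~9x, wheel speed down 9x
```

Doubling the reduction rather than using one large stage keeps each gear pair small enough to fit within the chassis frame.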

Note the dual wheel construction on the propulsion chassis. We found during the preliminary testing stage that the model had trouble turning on linoleum due to the weight of the control module on the drag wheel. We corrected this by adding traction in the form of the dual wheels.

Thought was then given to the design of the sensor array. Our initial goal was to create a line-following robot for a closed-loop delivery role, which made the photo-sensor block mandatory. The touch sensor system had two fundamental design criteria. First, the unit should stop if it contacted a stationary object. Second, the unit needed a way to recognize that it was at each station on its route, and to know when it was time to leave that station.

Our touch sensor module is a modified version of the "bug" double switch sensor from the Discovery Set. We increased the width and changed the mounting method, but the mechanical operation of the touch switches and trigger arms remains virtually unchanged. We decided to use control module programming to set the left sensor as the station stop sensor, and to set the right sensor as the resume or start sensor.

Once we completed assembly of the sensor module, preliminary model testing began. As soon as the model moved under its own power, we noticed several weak mechanical attachment points in the assembly. To correct these problems we added reinforcing bars on the sides, immediately to the rear of the main drive wheels. Clearance was available there because of where the drive gears and wheels sit relative to the chassis frame.

We also strengthened the sensor unit by adding supports perpendicular to the sensor package at both the top and bottom of its mount.

We became more excited as the programming began. First, the robot waits for its right touch sensor to be pressed before starting. It then moves forward as long as it senses the black line under the light sensor. When the robot strays from the path, it corrects by reversing one of its wheels and turning. If it does not re-acquire the line, it reverses both wheels to turn in the opposite direction for a little more than twice as long. This continues until it finds the line, at which point it resumes forward motion.
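The search behavior above can be sketched as a small decision function. This is a Python stand-in, not the actual program (which was built in the drag-and-drop RCX environment); the sweep timings and action names are assumptions for illustration:

```python
# Sketch of the line re-acquisition logic described above.
# Timings and action names are assumed, not taken from the RCX program.
def next_action(sees_line, search_phase):
    """Decide the next drive action.

    sees_line    -- True if the light sensor reads the black line
    search_phase -- 0: line just lost; 1: first sweep done, line still lost
    """
    if sees_line:
        return ("forward", 0)        # on the line: drive straight ahead
    if search_phase == 0:
        return ("pivot_left", 300)   # reverse one wheel: sweep one way
    # First sweep failed: sweep back the other way a little more than
    # twice as long, covering the original arc plus the far side.
    return ("pivot_right", 650)

print(next_action(False, 1))  # ('pivot_right', 650)
```

Sweeping back for slightly more than double the first duration guarantees the second arc passes through the starting heading and searches an equal angle on the opposite side.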

The unit continues along the path until either touch sensor encounters an obstacle. At that point the unit stops, backs up slightly, then waits for its right sensor to be engaged before continuing. All of the programming was done with the standard drag-and-drop RCX interface software.
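The stop-and-resume protocol amounts to a two-state machine. A minimal sketch, with state and event names assumed for illustration (the real program was the drag-and-drop RCX software):

```python
# Tiny state machine for the station stop/resume protocol.
# States: 'driving', 'waiting'.
# Events: 'bump'  -- either trigger arm touches an obstacle
#         'start' -- right touch sensor pressed at a station
def handle(state, event):
    """Return (new_state, list of drive commands issued)."""
    if state == "driving" and event == "bump":
        return "waiting", ["stop", "back_up"]
    if state == "waiting" and event == "start":
        return "driving", ["forward"]
    return state, []  # all other events are ignored in that state
```

Backing up slightly before waiting keeps the trigger arms clear of the obstacle, so the next press of the right sensor is read as a start command rather than another bump.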

We found that the robot actually follows the line in a crab-like fashion, never traveling quite straight but constantly adjusting its position. The side it faces during this crab-like motion changes only when the robot encounters a curve or corner, or when it has some difficulty re-acquiring the guide line.

As you can see from the picture on the indoor test course, this robot was ideal for delivering small messages to stops along its path. We marked its stops by placing soda pop cans on the left side of the line where we wanted it to halt. In a practical real-world situation we would have added a second optical sensor, offset perpendicular to the direction of motion from the first, and placed cross strips of tape at the desired stops.

We really had a great time on the outdoor course for several hours until the batteries finally gave out!