Department of Information Technology

Lab 2: Programming in LEGO Mindstorms NXT

Introduction

See also the lab introduction slides.

Assignment Goals

In this assignment, you will learn how to do basic real-time programming on an embedded device with a runtime that supports real-time tasking. In particular, you will use Ada to program the microcontroller of a LEGO Mindstorms NXT control brick so that it interacts with its environment. The brick will run an Ada runtime system based on the Ravenscar Small Footprint Profile.

Working in Groups

Solve this assignment in your groups. The lab should be done in groups of 4 people, or in exceptional cases in groups of 3 people. Submissions by a single student will normally not be accepted. All students participating in the group must be able to describe all parts of the solution. Each group will receive a LEGO Mindstorms package during the lab period. The box includes all the parts necessary for solving the assignment, and the group is responsible for handing the package back at the end of the course. All hardware issues (such as handing out and taking back the boxes) are handled by the lab assistants.

Report Hand-in

The report must consist of the following:

  1. Listings of well-commented and well-structured programs for Parts 2, 3 and 4 (Ada projects),
  2. General descriptions of your solutions, explaining how your code is supposed to work and why,
  3. Answers to all parts of this assignment.

Solutions have to be submitted via the student portal; the deadline for submission is October 2nd, 23:59. No submissions will be accepted after this point.

Note: To get this assignment approved, you must return the LEGO Mindstorms package in the same condition as it was given to you (check the instruction pages in the box). Make sure that the package still contains the NXT unit, all sensors (light sensor, 2x touch sensors, sound sensor, distance sensor), all three motors and all the cables. Please also flash the original firmware back onto the brick right after you finish the demonstration; a TA will help you with that.

Note/2: Please make sure that you use clear, concise and fluent language, that your report is clearly structured and laid out (which includes good code comments!) and that the technical quality of your solution is sufficient. Hand-ins with non-indented code will be discarded without further consideration.

Demonstration

The last part of this assignment consists of building a robot car that can follow a line on a track (see below). All groups have to show that their car is able to complete the lap.

Please sign up for the demonstration times on this Doodle poll.

LEGO Mindstorms NXT

The LEGO Mindstorms Robotics Invention System consists of a collection of LEGO pieces and the NXT unit with some sensors and motors; see the picture on the left. The NXT unit is an autonomous programmable microcomputer, using an Atmel 32-bit ARM7 processor (specifically, the AT91SAM7S256) running at 48 MHz together with an 8-bit AVR co-processor. The NXT brick can be used to control actuators, like an integrated sound generator, lights, and motors, and to read input from various sensors, like light sensors, touch sensors, rotation sensors, and distance sensors. The NXT brick also has an LCD display (useful for printing information) and USB and Bluetooth communication ports. The NXT unit is built for the easy attachment of LEGO building blocks and pieces.

Instead of the standard firmware and default programming platform of the NXT, we will use the Ada Ravenscar SFP runtime system, which is a port of the original Ada runtime system to the LEGO NXT platform. The Ravenscar Small Footprint Profile (SFP) supports a subset of the Ada language suitable for predictable execution of real-time tasks in memory-constrained embedded systems. The Ada runtime system it uses is very small (4186 lines!) but still supports features like tasking, fixed-priority preemptive scheduling and resource sharing using the Immediate Ceiling Priority Protocol (ICPP). As a result, an Ada program together with its runtime system and drivers is small enough to run inside the NXT's RAM. For more details on the Ravenscar SFP and the Ada NXT runtime system, please check the NXT runtime and Ravenscar profile pages.

Note: In this lab, we will not use a real-time operating system, only a runtime system supporting Ada tasking and scheduling features. Ada programs run in RAM, so after turning off the robot the program is gone and you will need to upload it again for the next run.

Our Ada programs for the NXT will contain the following parts:

  • The Ada main procedure file, e.g., helloworld.adb
  • The Ada package specification for the tasks, e.g., tasks.ads
  • The Ada package body with the task implementations, e.g., tasks.adb

The NXT Ada implementation uses drivers written in Ada. Unfortunately, there is no proper API documentation for this driver library. The way to learn programming with these drivers is to check the driver specifications in their respective .ads files and the example code. You can find some packages of drivers and example code in the Getting Started section below.

The compilation toolchain first compiles the Ada file into an ARM binary and then generates the whole system's binary by merging the driver binaries with it. This includes definitions of all tasks, resources, event objects, etc.

Getting Started with NXT using Ada

All software necessary to work with Ada and the NXT platform is installed on the Windows lab machines in lab 2315. This includes software for flashing the firmware, compiling programs and uploading them.

Program Compilation: To compile and upload Ada NXT code in Windows we use the Cygwin terminal (link on the desktop). Cygwin is a shell environment that emulates a Unix environment inside Windows. Windows paths inside Cygwin are prefixed with /cygdrive/. For example, on the lab machines you can go to the demos folder with the command cd /cygdrive/c/gnat/2012/share/examples/mindstorms-nxt/demos.

In order to compile Ada NXT programs, all you need is an appropriate makefile in the current directory. (It is recommended that you use a separate subdirectory for each part of the assignment.) The PRG field in the makefile should contain the name of the main procedure, and RAVENSCAR_SRC should point to the relative path of the folder gnat/2012/lib/mindstorms-nxt/ravenscar. If you are not sure how to configure the RAVENSCAR_SRC path, you can place your code directory inside the demos folder (C:\GNAT\2012\share\examples\mindstorms-nxt\demos\) and use the same RAVENSCAR_SRC path as in the given makefile. To compile, run the "make all" command. The compiler will build all the required drivers and at the end generate a compressed file with the same name as the main procedure (no extension in Windows, .bin in Linux).

Preparing the robot: In order to start with the lab, you first need to change a setting in the original firmware of the robot. Switch the NXT brick on and change the "Settings/sleep" option to "never" so that the robot does not go into sleep mode automatically. Now put it into reset mode by pressing the reset button (at the back of the NXT, upper left corner beneath the USB connector) for more than 5 seconds. The brick will start ticking shortly afterwards. This means your robot is ready for uploading code into its RAM. In this lab, the robot must always be in reset mode when you upload a program, since the code of the previous run cannot remain in RAM after the brick is turned off.
The original firmware can be flashed back with the help of a TA; please do this before handing back the box.

Program Upload: In Windows, NXT program uploading is done with a SAM-BA client. Make sure that the NXT brick is connected to a USB port of your PC and that it is turned on (and ticking). Now we want to upload the compiled "tests" program (example code in demos/basic_tests). Change your current directory to the program folder using the Cygwin terminal and then run the following command (the general form is samba_run <name_of_main_procedure>):

samba_run tests

A successful upload will print something similar to the following (the address may differ):
Image download complete.
Image started at 0x0020235c

Now you can disconnect the robot and try testing it. You can turn off the robot by pressing the middle orange button.

Drivers and Examples:

  • Ada drivers for NXT, originally located in C:/gnat/2012/lib/mindstorms-nxt/drivers
  • Ada demos for NXT, originally located in C:/gnat/2012/share/examples/mindstorms-nxt/demos
  • A complete example of motor_test application
  • Some examples of Ada periodic and sporadic tasks, originally located in C:/gnat/2012/mindstorms-nxt/facilities
  • A set of programs for low-level testing of sensors. These programs were tested with the Linux-based compiler but should also work with the Windows-based compiler (often with small changes).

Working at home: If you would like to work at home, you can install the compilation and upload toolchain yourself. Since this depends heavily on your setup, we cannot give you any direct support. However, installation in Windows is fairly simple, and instructions for Windows and Linux installation can be found in the instruction file.


Part 1: Warm-Up

This first part is supposed to get you used to compiling and uploading programs, together with simple input/output operations on the NXT platform. The program you will write is a simple "hello world!" that additionally prints a sensed light value on the LCD display of the NXT brick.

Program Skeleton

Create a helloworld.adb file with the main procedure:

with Tasks;
with System;

procedure helloworld is

   pragma Priority (System.Priority'First);

begin

   Tasks.Background;

end helloworld;

Note that we assign the lowest priority to this procedure by using the attribute 'First, which gives the first value of a range. The procedure calls the procedure Background (the main procedure of the Tasks package).
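
For example, a task can be given a priority one level above the main procedure like this (a minimal sketch, assuming the runtime's default priority range; the task name is just illustrative):

task ExampleTask is
   pragma Priority (System.Priority'First + 1);  --  one level above the main procedure
   pragma Storage_Size (4096);
end ExampleTask;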

Next, we need to define the Tasks package in tasks.ads:

with Ada.Real_Time;       use Ada.Real_Time;
with NXT;                 use NXT;
-- Add required sensor and actuator package --

package Tasks is

   procedure Background;

   private

   --  Define periods and times  --

   --  Define the sensor ports used  --

   --  Init sensors --

end Tasks;

Finally, we need to implement Ada tasks in tasks.adb file:

with System;
with NXT.AVR;                 use NXT.AVR;
with NXT.Buttons;             use NXT.Buttons;
with NXT.Display;             use NXT.Display;

package body Tasks is

   ----------------------------
   --  Background procedure  --
   ----------------------------
   procedure Background is
   begin
      loop
         null;
      end loop;
   end Background;

   -------------
   --  Tasks  --
   -------------   
   task HelloworldTask is
      -- define its priority higher than the main procedure --
      pragma Storage_Size (4096); --  task memory allocation --
   end HelloworldTask;

   task body HelloworldTask is
      Next_Time : Time := Time_Zero;
 
   begin      
      -- task body starts here ---

      loop
         -- read light sensor and print --

         if Current_Button = Power_Button then
            Power_Down;
         end if;

         Next_Time := Next_Time + Period_Display;
         delay until Next_Time;
      end loop;
   end HelloworldTask;
    
end Tasks;

Writing The Code

Now, attach a light sensor to the NXT brick and fill in the rest of HelloworldTask. Your code should do the following:

  1. Display "Hello World!"
  2. Read the light sensor value and display it repeatedly with a delay of 100ms in between.

Consult the examples and the drivers package to get more information about API usage.

Note: Light sensors are a bit tricky to initialize. Check the make() function in nxt-light_sensors_ctors.ads to understand how to use it. For the light sensor you need both the nxt-light_sensors_ctors package (to initialize it) and the nxt-light_sensors package.

Note/2: Try the different procedures of the nxt-display package to master output on the display. For more advanced display output you can use the nxt-display-concurrent package from the facilities folder.

Make sure your code compiles without errors and executes as desired on the NXT brick. Try measuring the light values of different surfaces (light ones, dark ones, ...).


Part 2: Event-driven Scheduling

In this part, you will learn how to program event-driven schedules with the NXT. The target application is a LEGO car that drives forward as long as you press a touch sensor and, with the help of a light sensor, it senses a table underneath its wheels. For this purpose, build a LEGO car that can drive on its wheels. You may find inspiration in the manual included in the LEGO box. Further, connect a touch sensor (using a standard sensor cable) to one of the sensor inputs.

Handling Events

Ideally, events generated by external sources are detected by interrupt service routines (ISRs). This allows the system to react immediately to signals from various sources. Unfortunately, most of the sensors on the NXT work in polling mode: they need to be asked for their state again and again, instead of becoming active themselves when something interesting happens.

Our workaround for this is to create a small second task that checks the sensors periodically (about every 10 ms). If the state of a sensor has changed, it generates the appropriate event for us.

First we need to create a protected object named "Event" with a single entry, as follows:

protected Event is
   entry Wait(event_id : out Integer);
   procedure Signal(event_id : in Integer);
private
   -- assign a priority that is the ceiling of the user tasks' priorities --
   Current_event_id : Integer;     -- Event data declaration
   Signalled : Boolean := False;   -- Flag used to signal the event
end Event;

protected body Event is
   entry Wait(event_id : out Integer) when Signalled is
   begin
      event_id := Current_event_id;
      Signalled := False;
   end Wait;

   procedure Signal(event_id : in Integer) is
   begin
      Current_event_id := event_id;
      Signalled := True;
   end Signal;
end Event;

This protected object can be used by different tasks to communicate with each other. For example, a task can block waiting for an event:

Event.wait(received_event);

An event dispatcher task can notify the blocked task by sending an event:

Event.signal(event_id);

In order to do this, declare and implement a task "EventdispatcherTask". It should call the appropriate API function to read the touch sensor and compare it to its old state. (A static variable may be useful for that.) If the state has changed, it should release the corresponding event by calling the Signal procedure of the Event protected object. You may define two event ids, such as "TouchOnEvent" and "TouchOffEvent". Just as in Part 1, put your code in an infinite loop with a delay at the end of the loop body.
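
A minimal sketch of the dispatcher loop is shown below. Touch_Is_Pressed is a hypothetical stand-in for the real touch-sensor driver call, TouchOnEvent and TouchOffEvent are assumed to be integer constants, and the 10 ms period uses Milliseconds from Ada.Real_Time:

task body EventdispatcherTask is
   Next_Time : Time    := Clock;
   Old_State : Boolean := False;
   New_State : Boolean;
begin
   loop
      New_State := Touch_Is_Pressed;   --  hypothetical: replace with the real driver call
      if New_State /= Old_State then   --  generate events only on state changes
         if New_State then
            Event.Signal (TouchOnEvent);
         else
            Event.Signal (TouchOffEvent);
         end if;
         Old_State := New_State;
      end if;
      Next_Time := Next_Time + Milliseconds (10);
      delay until Next_Time;
   end loop;
end EventdispatcherTask;

Note that the sketch only signals when the state actually changes, which matches the "no event spamming" rule discussed further below.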

Now create a new task "MotorcontrolTask" that does the following in an infinite loop:

  1. Wait for event "TouchOnEvent"
  2. Make the car move forward by activating the motors
  3. Wait for event "TouchOffEvent"
  4. Make the car stop

As suggested by the names of the events, the idea is that they should occur as soon as the user presses and releases the attached touch sensor. In order for MotorcontrolTask to have priority over EventdispatcherTask, make sure to assign a lower priority to the latter. Otherwise, the infinite loop containing the sensor reading would keep the system completely busy and it could never react to the generated events. Further, add some nice status output on the LCD.
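
A minimal sketch of the MotorcontrolTask loop could look as follows; Start_Driving_Forward and Stop_Driving are hypothetical placeholders for the real motor driver calls:

task body MotorcontrolTask is
   Received_Event : Integer;
begin
   loop
      Event.Wait (Received_Event);              --  block until the dispatcher signals an event
      if Received_Event = TouchOnEvent then
         Start_Driving_Forward;                 --  hypothetical motor helper
      elsif Received_Event = TouchOffEvent then
         Stop_Driving;                          --  hypothetical motor helper
      end if;
   end loop;
end MotorcontrolTask;

The task never polls the sensors; it simply blocks in Event.Wait until the dispatcher signals a state change.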

This should complete your basic event-driven program. Compile and upload the program and test whether the car reacts to your commands. In case of errors, read the error messages of the compiler and/or consult the driver code descriptions.

Extending The Program

Attach a light sensor to your car somewhere in front of the wheel axis, close to the ground, pointing downwards. Extend the program to also react to this light sensor. The car should stop not only when the touch sensor is released, but also when the light sensor detects that the car is very close to the edge of a table. (You may need to play a little with the "Hello World!" program in order to find appropriate light levels.) The car should only start moving again when it is back on the table and the touch sensor is pressed (again).

The edge detection should happen in EventdispatcherTask and be communicated to MotorcontrolTask via the event protected object. Use two new events for that purpose. Make sure you define and use all events properly. Further, the display should provide some useful information about the state of the car.

Important Notes:

  • The job of EventdispatcherTask is just to create the events signaling button and light behavior, independently of each other.
  • All "logic" should happen in MotorcontrolTask, i.e., deciding when to move or not to move, depending on the current state. The task must not read the sensors itself, nor communicate with EventdispatcherTask by any means other than the Event protected object; shared variables etc. are not allowed!
  • A frequent mistake is that the periodic EventdispatcherTask generates an event each time it executes, reporting the current state of the sensors. Instead, it should only generate events at state changes: one (and only one) at the moment the touch sensor is pressed down and one when it is released. The same holds for the light sensor: when the table edge is detected, generate one event. Don't "spam" the event system with redundant information. (Of course, when the touch sensor is pressed again, a new event needs to be generated.)

What To Hand In

Please hand in only the source of the full (second) program that includes the light sensor code. Make sure you include brief explanations and that your source is well-commented. (Note that hand-ins without meaningful comments will be directly discarded.)


Part 3: Periodic Scheduling

Real-time schedulers usually schedule most of their tasks periodically. This usually fits the application: sensor data needs to be read periodically, and reactions in control loops are also calculated periodically and depend on a constant sampling period. Another advantage over purely event-driven scheduling is that the system becomes much more predictable, since load bursts are avoided and very sophisticated techniques exist to analyze periodic schedules. (You will learn about response-time analysis later in the course.)

The target application in this part will make your car move forward until the end of the table is reached and then start moving backward. Additionally, the touch sensor is used to tell the car to stop moving backwards and to move forward again. (Note that this is again a new program, so for now do not just extend the program from the event-driven assignment part; create a new program instead.)

Periodic Tasks

The structure of the system in this part is as follows: We have three tasks that are scheduled periodically, with different periods:

  1. A task "MotorcontrolTask" that only takes care of controlling the motors and receives commands from the other tasks.
  2. A task "ButtonpressTask" that senses the state of the buttons and sends new commands to MotorcontrolTask.
  3. A task "DisplayTask" that displays some interesting information about what is going on currently.

(Note that there is no task sensing the distance yet. This will come later.)

Obviously, the tasks can have different periods: while we would like the car to react quickly to a button press, we (as humans) cannot read updated information from the display faster than at certain intervals anyway.

Basic Periodic Schedule

Before we implement the actual tasks, we need a way for them to communicate. We will use a data structure through which the sensing tasks communicate their movement wishes to MotorcontrolTask. Define a "driving_command" record with fields for driving direction, car speed and update_priority. We will use this global variable to pass driving information between tasks.

Define update_priorities (actually integer values) PRIO_DEFAULT and PRIO_BUTTON with values 1 and 2. These are not the priorities for executing the tasks; they are only used when a task wants to update "driving_command".

The sensing tasks can at any time write a new "command" into this (global) record, and the idea is that they should only succeed in doing so if they do not already "see" a higher update_priority in the global record. For this reason, each task that uses this structure should be assigned its own update priority (some integer value, like PRIO_BUTTON for ButtonpressTask). Further, write a function change_driving_command with update_priority, speed and driving_direction as parameters that tasks can use to update the record.
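
A minimal sketch of the record and the guarded update is shown below; the Direction_Type enumeration and the exact way the stored priority is reset again are assumptions that you will have to adapt to your own design:

PRIO_DEFAULT : constant Integer := 1;
PRIO_BUTTON  : constant Integer := 2;

type Direction_Type is (Forward, Backward);

type Driving_Command_Type is record
   Driving_Direction : Direction_Type := Forward;
   Speed             : Integer        := 0;
   Update_Priority   : Integer        := PRIO_DEFAULT;
end record;

Driving_Command : Driving_Command_Type;   --  global record shared by the tasks

procedure Change_Driving_Command (Update_Priority   : Integer;
                                  Speed             : Integer;
                                  Driving_Direction : Direction_Type) is
begin
   --  Only overwrite the command if the caller does not see a higher
   --  update priority already stored in the record.
   if Update_Priority >= Driving_Command.Update_Priority then
      Driving_Command := (Driving_Direction, Speed, Update_Priority);
   end if;
end Change_Driving_Command;

Whether this plain procedure is safe when several tasks call it concurrently is exactly the question raised at the end of this part.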

Using this, we will now define the tasks in our system:

  • Define a task "MotorcontrolTask" that "executes" the "driving command". The default starting "driving command" is to move forward. The task should set the speed of the motors accordingly and update the driving direction based on the priority of the driving command. Give this task a period of 50 ms.
  • Define a task "ButtonpressTask" that reads the touch sensor. If the button is in the "pressed down" state and the car is moving backwards, it should try to set the driving command to drive forwards, using update_priority PRIO_BUTTON. Give this task a period of 10 ms.
  • Finally, define a third task "DisplayTask" that outputs some useful information on the LCD and is periodic with a period of 100 ms. (The periods are sketched as constants below this list.)
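
The periods can be kept as constants based on Ada.Real_Time time spans, as in this sketch (the names are just illustrative; Ada.Real_Time is assumed to be visible as in Part 1):

Period_Motorcontrol : constant Time_Span := Milliseconds (50);
Period_Buttonpress  : constant Time_Span := Milliseconds (10);
Period_Display      : constant Time_Span := Milliseconds (100);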

Before we are done with the program, there is a potential problem to be taken care of. The global record "driving_command" is shared by several tasks that are writing to and reading from it. Do you need to protect it in any way? If so, do that.

Finally, compile and run the program. So far, the behavior is not too exciting, since all the car can do is move forwards. The exciting part comes next.

Add a light sensor as in Part 1. Using the above structure of periodic tasks, extend your program with a fourth periodic task "EdgeDetectionTask". It should have a period of 100 ms and read the light sensor value in each instance. Using the sensor reading, it should try to set the driving command to a value that makes the car drive backwards when the edge of the table is reached. Define the update_priority PRIO_EDGE used for writing to "driving_command" higher than PRIO_BUTTON, so that the car keeps moving backwards even if the touch sensor is pressed while the car is at the edge of the table. This task should send "driving_command" with PRIO_DEFAULT when the car is on the table, so that a button press can override it.
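
The decision made in each instance of EdgeDetectionTask could be sketched like this; Read_Light_Value, Edge_Threshold, Forward_Speed and Backward_Speed are hypothetical names, PRIO_EDGE is assumed to be defined like the other update priorities, Forward and Backward follow the Direction_Type sketched earlier, and whether the edge shows up as a brighter or darker reading depends on your table and calibration:

if Read_Light_Value < Edge_Threshold then
   --  Edge of the table detected: request driving backwards.
   Change_Driving_Command (PRIO_EDGE, Backward_Speed, Backward);
else
   --  Still on the table: fall back to the default command so that
   --  a button press (PRIO_BUTTON) can take effect again.
   Change_Driving_Command (PRIO_DEFAULT, Forward_Speed, Forward);
end if;

Note that with a naive "only overwrite at equal or higher priority" rule, the priority stored in the record also has to be lowered again at some point; how you handle that is part of your design.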

The task displaying useful information should be extended to display even more useful information. Further, the following hints may help:

  • You need to calibrate the light sensor for detecting the table edge.
  • You may play with the task periods in order to experiment with reaction times.

When you are done, compile, upload and test your program. Make sure that the behavior is as desired.

What To Hand In

Please hand in only the source of the full (second) program that includes the light sensor code. Make sure you include brief explanations and that your source is well-commented. (Again, note that hand-ins without meaningful comments will be directly discarded.)


Part 4: Line Tracker

In the last part you will use all the knowledge you acquired in the above parts in order to create a car that can simultaneously:

  • Follow a line that is drawn on the floor (not necessarily a straight one!)
  • Complete the lap as fast as possible; the maximum allowed time is 1 minute

You may use any of the techniques you learned above to define and schedule tasks, read sensors and send commands to the motors. The line tracking should be done with the light sensor.
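
As a starting point only, here is a deliberately simplistic bang-bang sketch of the line-tracking loop. Read_Light_Value, Set_Motor_Speeds, Threshold, Fast, Slow and Period_Linefollow are hypothetical names, it assumes the sensor rides along one edge of the line, and you will certainly want something smarter (and faster):

loop
   if Read_Light_Value < Threshold then
      --  Reading is on the "line" side of the calibrated threshold: arc one way.
      Set_Motor_Speeds (Left => Slow, Right => Fast);
   else
      --  Reading is on the "background" side: arc back the other way.
      Set_Motor_Speeds (Left => Fast, Right => Slow);
   end if;
   Next_Time := Next_Time + Period_Linefollow;
   delay until Next_Time;
end loop;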

The line to be followed will have the following shape:
(see lego-track.png)

A test track is available from Jakaria's office and, during the labs, in the corresponding lab rooms. For accurate sensing you will have to recalibrate the reflection values of the track and the background before each race. The reflection value depends on the ambient lighting and the track condition. The race will most likely take place on a fresh new track.

In order to pass the lab, all teams need to demonstrate a working car that can do both jobs accurately. This has to be demonstrated on Monday, 02.10., see the schedule for details. The procedure will be as follows:

  • First, each team needs to demonstrate that its car can follow the line and complete a full lap. The car should not leave the track completely. Each team has 3 tries to complete this task. If all 3 tries fail, the team is disqualified.
  • Second, we measure the time that your car takes to complete a lap. A car should not take more than 1 minute to complete the challenge.
  • If you fail to complete the jobs above, you may fail the lab.
  • The group with the fastest lap (challenge 2) will get some bonus points on the final exam.

Clarified rules:

  1. You may only use 1 light sensor. In principle, I don't have anything against you playing with/implementing advanced approaches, but this rule is supposed to keep conditions equal for all groups.
  2. Same holds for the other sensors: You may only use what *one* LEGO box provides, i.e., 3 motors, 1 light sensor, 1 distance sensor, etc.
  3. You do not have to drive backwards. Your car should make corrections if it is going away from the line.
  4. Don't assume a direction on the track. Your car must work both clock- and counterclockwise equally well (it's up to you how well...)

Useful Advice

Some issues arise for some of the groups every year. Experience shows that you should take care of the following things:

  • Build a physically robust car. Solving the labs and passing the race challenge will be difficult otherwise.
  • Do not hard-code the thresholds for the values of the light sensor, i.e., for classification of "track", "off track" or anything else. Demonstrating the car will most likely happen under different light conditions and will fail if you do that. Note that this is the single most frequent reason for students to have serious problems in the demonstration. (Some groups did not respect this issue and failed the whole course because of it!) Instead, build some simple-but-smart sensor calibration into the beginning of your program, so your car can adapt to different environments; a minimal calibration sketch follows this list. You may assume that the lighting conditions do not change during the demonstration.
  • You have two weeks from hardware hand-out to the car competition. Start early, work intensely and try to finish on time.
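
A minimal calibration sketch is shown below; Wait_For_Button_Press and Read_Light_Value are hypothetical helpers, and Line_Value, Background_Value and Threshold are assumed to be package-level Integer variables. The idea is simply to sample the line and the background once at start-up and classify later readings against the midpoint:

procedure Calibrate is
begin
   --  Place the sensor over the line, then press a button.
   Wait_For_Button_Press;                 --  hypothetical helper
   Line_Value := Read_Light_Value;        --  hypothetical driver call
   --  Place the sensor over the background, then press a button again.
   Wait_For_Button_Press;
   Background_Value := Read_Light_Value;
   --  Later readings are compared against the midpoint of the two samples.
   Threshold := (Line_Value + Background_Value) / 2;
end Calibrate;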

Demonstration Results

Everyone did a good job! We hope you had fun!


Group #   Best Timing (seconds)
1         20
2         21
3         16
4         16
5         20
6         24
7         14.9
11        19
12        40

Updated  2017-10-09 13:27:36 by Syed Md Jakaria Abdullah.