Saturday 26 May 2012

It's all hooked up!

I've been slacking a little bit on keeping this blog up to date, but I'm still progressing robot-wise. My latest report is that I finally wired up the final components to MmBot - the Sabre Tooth 2x5 motor controller, and the 2 quadrature encoders I have been building. Together they allow me to control MmBot's speed and direction quite accurately, and as a result I will be able to reliably issue commands such as 'turn 90 degrees' or 'go forwards 3.5 metres'. First up though, here's MmBot in all her completeness:


And another one from above:


All parts are now attached and wired up. That's:
  • 2 infrared range finders (front left/right)
  • 2 LinkSprite JPEG colour cameras (the eyes)
  • 1 infrared motion detector (the nose)
  • 1 ultrasound range finder (on top of the head)
  • 2 quadrature encoders, each containing 2 infrared reflectivity sensors (next to the wheels)
  • 1 BlueSMiRF Bluetooth modem
  • 1 Sabre Tooth 2x5 motor controller
  • 2 18V DC geared motors
  • 1 Arduino Mega
The only bits missing are the interactive bits, which I may or may not add to this version of MmBot. These would be:
  • 1 SpeakJet voice/sound synthesizer (tested but not connected to MmBot)
  • 1 voice recognition chip
  • 2 LED matrices
  • 1 speaker
  • Any extra LEDs I want to stick on!
While it'd be nice to get the interactive bits on as well, the circuitry is becoming a bit messy and I don't need them to achieve my initial goal of wandering around the office, identifying people or points of interest and looking cute. Plus, my Raspberry Pi finally has a delivery date (3 weeks), and the cooler interactive stuff will be much easier and more powerful once it's hooked up.

The main thing I got working today though was the quadrature encoder and motor controller. This first fairly boring video shows the motor controller in action, gradually swinging the motors between full reverse and full forwards:



Next, things get a little more interesting. I start by asking both motors to go at 75% power and plonk MmBot down on the floor. Now you might hope that she would go in a straight line - after all, I'm sending the same power to each motor. Unfortunately, even in the best of scenarios motors aren't perfectly matched, and if you then introduce things like friction or wobbly wheels resulting from my supreme workmanship, MmBot drives around in circles.

Not to worry though - that's why I built my encoders in the first place. These devices allow me to measure the actual speed the wheels are turning at. For MmBot to travel in a straight line both wheels need to turn at the same rate, so all I need to do is write some code which:
  • Supply a fixed amount of power to the left wheel (say 75%)
  • Initially supply the same amount of power to the right wheel
  • If the right wheel is going slower than the left wheel, supply it with more power
  • If the right wheel is going faster than the left wheel, supply it with less power
Sounds very simple, and it is in fact fairly simple. The only problem with this feedback-style code is that the results aren't instantaneous - supplying more power to a wheel will allow it to reach a higher speed, but it takes time to have an effect. You have to be careful that your code accounts for this, or you'll find yourself constantly overcompensating, and will end up driving all over the place!

You've seen my earlier code to read from the quadrature encoders, and other code to drive the Sabre Tooth motor controller. This tiny bit of new code in the loop achieves speed control:

      //check where left encoder has got to relative to right encoder
      if(left_encoder_pos < right_encoder_pos)
      {
        //left is behind right, so we need to increase right motor speed
        motor_right_speed = min(255,motor_right_speed+1);
        MotorSerial.write(20);
        MotorSerial.write(motor_right_speed);    
        
        //and reset encoder positions (to avoid constant over compensation)
        left_encoder_pos = 0;
        right_encoder_pos = 0;
      }
      else if(left_encoder_pos > right_encoder_pos)
      {
        //left is ahead of right, so need to decrease right motor speed
        motor_right_speed = max(128,motor_right_speed-1);
        MotorSerial.write(20);
        MotorSerial.write(motor_right_speed);         

        //and reset encoder positions (to avoid constant over compensation)
        left_encoder_pos = 0;
        right_encoder_pos = 0;
      }
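
For the curious, the natural next step beyond these +/-1 nudges would be to scale the correction by the size of the error - a simple proportional controller. This is just a hedged sketch, not what's running on MmBot: it borrows the variable names and the two serial writes from the snippet above, and the gain and cap are made-up starting points you'd tune on the real robot.

      //sketch only: a proportional version of the speed control above
      void updateSpeedControl()
      {
        //positive diff matches the first branch of the code above
        long diff = right_encoder_pos - left_encoder_pos;
        if(diff != 0)
        {
          const long KP = 1;                         //hypothetical gain - tune by hand
          long step = constrain(diff * KP, -10, 10); //cap the nudge applied per update
          motor_right_speed = constrain(motor_right_speed + (int)step, 128, 255);
          MotorSerial.write(20);                     //mirrors the two writes in the snippet above
          MotorSerial.write(motor_right_speed);

          //and reset encoder positions, as above, to avoid constant over compensation
          left_encoder_pos = 0;
          right_encoder_pos = 0;
        }
      }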

I wire up some switches on MmBot to turn the motors on/off, and to enable/disable speed control. This video shows the difference. Disclaimer: this is one of the worst filmed videos in the world, and my video editing abilities range from 0 to 0. I clearly need a tripod, a good camera and a lot of practice with AfterFx. Check out the start bit and the end bit, and pretend you never saw the middle bit.



And that's that! MmBot V1 is pretty much hardware complete. I'm in 2 minds now - I could go on and try to get some better code in there and take some steps towards autonomy. On the other hand, I now have a Raspberry Pi in the post. Needs a bit of thought :)

-Chris

Sabre Tooth Motor Controller

I've got quadrature encoders working to measure speed and they're now attached to MmBot (although not wired to the Arduino yet). Next I need to replace my very basic home-made motor controller with the nice Sabre Tooth one I have. In a much earlier post I had a first blast at this, but it turned out I had the Sabre Tooth RC. Fortunately, my standard Sabre Tooth 2x5 has now arrived, which supports serial communication, and here it is:

Sabre Tooth 2x5 Motor Controller
It's a neat little piece of kit. On the left you can see the motor power connection, with the motor connectors top and bottom. On the right is GND and Vcc coming from Arduino, plus a white signal cable. This controller can run in lots of different modes which are configured with the switches at the bottom.

After a bit of experimentation I find the right setup for a simple serial connection at 9600 baud. In this mode you send the controller a value from 1 to 127 to control motor A (1 = full reverse, 64 = stop, 127 = full forwards), and 128 to 255 to control motor B in a similar fashion. In addition, sending a 0 instantly stops both motors.
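
Since I'll be driving both motors from code shortly, here's a tiny helper illustrating that mapping - a sketch only (the function name is mine, and it assumes a SoftwareSerial instance called MotorSerial like the one in the code further down):

//sketch: map a signed percentage onto the Sabre Tooth's serial value ranges
//motor: 0 = motor A, 1 = motor B; percent: -100 (full reverse) to +100 (full forwards)
void setMotorPercent(int motor, int percent)
{
  percent = constrain(percent, -100, 100);
  int value = 64 + (percent * 63) / 100; //1..127 for motor A, 64 = stop
  if(motor == 1)
    value += 128;                        //motor B lives in 128..255, 192 = stop
  MotorSerial.write(value);
}

So setMotorPercent(0, 100) sends 127 (motor A full forwards), and setMotorPercent(1, 0) sends 192 (motor B stop).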

To get things going, here's the first basic circuit I build:

Circuit diagram of Arduino connected to Sabre Tooth
It's a pretty simple setup. On the left I have the Sabre Tooth connected to a 12V battery, a 12V DC motor, and (instead of another motor) the oscilloscope. On the right you can see the Arduino wired up, with GPIO3 going to the signal 1 connector. Signal 2 is not needed for serial communication. With this setup I can control the actual motor by sending values from 1 to 127, and I can monitor the signals that would get sent to a second motor by sending values from 128 to 255.

So that's the circuit, here it is built:

Arduino connected to Sabre Tooth, controlling a 12V DC motor.
All pretty simple so far, and now for some equally simple code:


#include <SoftwareSerial.h>

SoftwareSerial MotorSerial(2, 3); //RX on pin 2 (unused), TX on pin 3 to the Sabre Tooth signal 1

void setup()  
{
  MotorSerial.begin(9600);

  Serial.begin(9600);
  Serial.println("Hello"); 
}

void loop()
{
  //gradually take motors from full stop to full reverse
  for(int i = 64; i >= 1; i--)
  {
    MotorSerial.write(i);
    MotorSerial.write(i+128);
    delay(100);
  }
  
  //take motors from full reverse, back to stop and then to full forwards
  for(int i = 1; i <= 127; i++)
  {
    MotorSerial.write(i);
    MotorSerial.write(i+128);
    delay(100);
  }
  
  //take motors back down to full stop
  for(int i = 127; i >= 64; i--)
  {
    MotorSerial.write(i);
    MotorSerial.write(i+128);
    delay(100);
  }
}

In the setup function I initialize a software serial connection, transmitting via GPIO 3 at 9600 baud. The main loop then sends different values to the motors to slowly get to full speed in one direction, then gradually go to full speed in the other direction, and eventually come back to a stop.

And finally, a video of it in action:


All good. Massive thumbs up to Dimension Engineering - this piece of kit isn't just really powerful, it's really easy to use. Not the cheapest of controllers, but I'd highly recommend it if you're willing to spend a few pounds.


Some Code And Lots Of Graphs

Over the past few days I've been experimenting with ways to improve my new home-made quadrature encoder. By the end of my previous post I was successfully taking readings and converting them into rpm, which were then printed out to the serial monitor and finally graphed in Excel. However my initial results were fairly noisy, as you can see from this graph:

Initial spikey encoder results - rpm plotted against time
In theory that's the rpm of the wheel over time; however, the wheel is turning at a roughly constant speed, so in a perfect world the graph would be a nice straight line. Now that's not necessarily going to be achievable (especially with my fairly rugged home-made solution), however it's worth having a go at improving my results. Be forewarned: this post is gonna be very graphy and probably close to non-understandable, especially if you didn't catch the last one - read on if you dare!

The first thing I do is sit down and think of what might be causing these inaccuracies. I'm not convinced it's a sensor issue, and the Arduino is easily fast enough to keep up. After some pondering, the first few potential culprits I come up with are:
  • If the sensors are not out of phase by exactly 1/4 of a wavelength then I'd expect readings to alternate between slightly too fast and slightly too slow every interrupt. They should however average out, making every 2nd interrupt (half a wave) or 4th interrupt (a whole wave) right on the money. Note: not worked out which one yet, but the key is, it'd be very very regular :).
  • The wheel is a little wobbly, which means it's getting closer to and farther from the sensors each revolution. If the sensors were perfectly aligned this wouldn't be a problem, however if they were at a slight angle it could turn the square wave into a slightly sine-wave-ish thing. I'd expect to see fairly regular inaccuracies in a pattern that repeats once per revolution (every 32 interrupts) if this was the issue.
  • It could be the time difference between when my interrupt triggers to tick the encoder counter, and when I actually do my rpm calculations (every ms) in the main loop. If this were the issue I'd expect to see fairly random inaccuracies within a threshold of roughly 5ms (the time it takes for a single encoder tick at 360rpm).
To track down the culprit, I print out rpm calculations every 50ms, then every 150ms, then every 250ms, then every 1000ms and plot them against encoder position:

Encoder results at different intervals - still very spiky
Longer gaps will inevitably give flatter lines as they allow more interrupts to pass between readings, so the data will naturally be smoothed. However the difference in patterns between the graphs should still indicate which of the above problems (if any) are the issue. As you can see, regardless of interval time, we get extremely spiky results with no discernible pattern. Before getting clever it occurs to me that I'm measuring times in milliseconds, but a single encoder tick can take around 5ms - that gives me at best a 20% error margin. I make a very simple change so the code does calculations at a microsecond level of accuracy instead, and look what happens:

Encoder results using microsecond accuracy instead of millisecond
Wow! What a difference. I'm still getting spiky results, but there's clearly a median line for each interval time, and the errors that bounce either side of it are extremely regular.

Now that I'm convinced I'm not losing data due to accuracy (as even the errors are predictable!), I switch to running at constant 150ms intervals, printing out the rpm using 4 different calculations:

  • No error compensation
  • Compensation for time difference between interrupt and calculation time
  • Compensation for out of phase sensors
  • Combined both settings above

I make 2 changes to achieve this. First, my interrupt function now records the time at which it took its last reading:

long encoder_time = 0; //time (in microseconds) of the most recent encoder interrupt

void updateEncoder()
{
  encoder_time = micros(); //record time in microseconds of reading
  byte new_encoder = readEncoder();
  int dir = QEM[current_encoder*4+new_encoder];
  encoder_pos += dir;
  current_encoder = new_encoder;
}

Then I update the loop function as follows:

  //record current time, the current encoder position, and the time the last encoder reading occurred
  long new_time         = micros();
  long new_encoder_pos  = encoder_pos;
  long new_encoder_time = encoder_time;
  
  //calculate rpm with no compensation
  current_rpm_nocomp    = calcrpm(last_time, new_time, last_encoder_pos, new_encoder_pos);
  
  //calculate rpm using time compensation (i.e. we use the last encoder time rather than current time in calculations)
  current_rpm_timecomp  = calcrpm(last_encoder_time, new_encoder_time, last_encoder_pos, new_encoder_pos);
  
  //calculate rpm, only updating if it's an even numbered reading
  current_rpm_evencomp  = (new_encoder_pos & 1) ? current_rpm_evencomp : calcrpm(last_time, new_time, last_encoder_pos, new_encoder_pos);
  
  //calculate rpm if even numbered reading, using time compensation
  current_rpm_allcomp   = (new_encoder_pos & 1) ? current_rpm_allcomp : calcrpm(last_encoder_time, new_encoder_time, last_encoder_pos, new_encoder_pos);
  
  //record last readings to use in next calculations
  last_time             = new_time;
  last_encoder_pos      = new_encoder_pos;
  last_encoder_time     = new_encoder_time;


Printing out the results and plotting against time, I now get the following graph:

Different approaches to error compensation for the encoder
The purple line contains the fully compensated data - and look how flat it is! I decide that, given this is a prototype (in MmBot V2 I'll use motors with built-in encoders), this is good enough for now. So, as one final step, I attach the encoder to the inside of MmBot, as you can see here:

Black/white wheel disk on inside of wheel, and 2 IR reflectivity sensors facing it to make a quadrature encoder
And with it attached, I plot encoder position against time, with the motor turning on/off at regular intervals:

Encoder position plotted against time (in microseconds), with motor turning on/off at regular intervals
Pretty slick!

That's it for now - next up I'll properly attach both encoders, connect them to the Arduino Mega inside MmBot, wire up my Sabre Tooth motor controller and get some real navigation going!

Saturday 19 May 2012

Quadrature Encoder

In the last post I began adding some more sensors, the last of which was a rotary encoder. This device is used to measure the speed of a spinning wheel, using a black+white striped disk and an infrared reflectivity sensor:

Wheel with disk
Sensor pointing at disk
Connected to oscilloscope

On the left you can see the coloured disk, in the centre is a cross section showing the sensor and wheel, and on the right is the actual square wave signal coming out as I spin the wheel. For more info on the basic setup, see my last post (More Sensors).

Now it's time for something more interesting. My next goal is to be able to identify whether the motor is going forwards or backwards. With just 1 sensor this isn't possible, as I just get a binary value back. However it occurred to me that with 2 out-of-phase waves I'd be able to work out which way the wheel is turning. My original idea was to have a disk with 2 waves on it, and 2 sensors - one for each wave. You can see this disk on the right-hand side below:

3 potential wheel disks - single wave low res (left), single wave high res (centre), dual wave (right)
However, after I talked to Alex about it, he pointed out that you can achieve the same result with a single wave disk, provided you have 2 IR sensors that are positioned out of phase with the square wave printed on the disk. I put together 2 extra disks as you can see above - a hi-res one (middle), and in case the sensor can't handle it, a lower resolution one (left).

This diagram shows how I'll mount the sensors to be 'out of phase' with the wave represented by the black/white stripes on the disk:

2 sensors for quadrature encoder mounted out of phase with the pattern on the disk
You can see that when the disk is lined up so the left sensor is in the centre of a black section, the right sensor is exactly on the boundary between that black section and the next white section. So how does this help? Well, hopefully this diagram shows things a bit more clearly:

Waves generated from each sensor by the turning wheel
Here you can see the waves generated by both sensors as the wheel rotates. Note how they are the same wave, just out of phase by 1/4 of a wavelength. The red line shows where the wheel is right now, and you can see from the reading that Wave 1 is low (over black), and Wave 2 is on the boundary between black and white - just like in the earlier cross section. Now, if the wheel is turning, one of 2 things can happen:
  • The wheel can turn left (red line moves towards green dashed line). When the red line hits the green line, a change in wave 1 is detected (goes from low to high), and wave 2 is high
  • The wheel can turn right (red line moves towards blue dashed line). When the red line hits the blue line, a change in wave 1 is detected (goes from low to high), and wave 2 is low
So, we were in the centre of a low point in wave 1. We waited until a change was detected in wave 1, and then the reading from wave 2 told us which direction we went in! This principle can be applied at any point in time - wait for a change in one wave, and the reading from the other wave gives you direction. On top of that, the frequency of either wave tells you speed. So... speed and direction! Hurray.
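
In code, that rule boils down to almost nothing. Here's a hedged sketch of the 'wait for a change on wave 1, then read wave 2' idea (the real code further down uses a lookup table, which generalises this to changes on either wave):

//sketch: direction from a change on wave 1 - not MmBot's final code
//call when wave 1 changes; returns +1 for one direction, -1 for the other
int directionOnWave1Change(int wave1, int wave2)
{
  //turning one way wave 2 leads wave 1, turning the other way it lags;
  //after any change on wave 1 that collapses to: equal readings mean one
  //direction, differing readings mean the other
  return (wave1 == wave2) ? 1 : -1;
}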

Next step, build something. To start, I solder cables to 4 sensors (2 for each wheel):

2 sensors mounted on wood (right), my awesome soldering (left)
The sensors on the right are mounted on some wood ready for attaching. The ones on the left are just there so you can admire my awesome soldering :)

Sensors connected to oscilloscope
Close up of sensors pointing at disk
I now build a simple construction (mostly out of blu-tac, cardboard and tape) to mount a motor on the side of my desk, with the 2 sensors held directly in front of it. The motor has virtually no resistance and a full 12V going through, so is running at around 360rpm. This video shows the system at work, with one of the infra red sensors connected to an oscilloscope:


That's step 1 done - I can take readings from 2 sensors in front of a wheel. Next up, I connect it to an Arduino - initially just into analogue inputs 1 and 2:

Sensors connected to analog inputs on Arduino
This simple program is the first test I do (I left out setup as it doesn't do anything except init the serial connection):

int left_buff[256];
int buff_pos = 0;

void loop()
{
  left_buff[buff_pos] = analogRead(1);
  buff_pos++;
  if(buff_pos >= 256)
  {
     for(int i = 0; i < buff_pos; i++)
       Serial.println(left_buff[i]);
     buff_pos = 0;
  }
 
  delay(5);
}

This is basically reading values from the left sensor every 5ms (significantly less than 1/4 of a wavelength at 360rpm) into a buffer. When the buffer fills, it gets dumped to the serial monitor and the process begins again. The resulting output isn't very pretty:

721
584
32
195
827
38
47
858
376
43
698
752
43
233
879
... and lots more!...

But what happens if we plonk it into Excel and draw a graph?

Output from Arduino when reading a sensor as the wheel slows down
This shows the output as the motor slows down - we're still getting that wave. It's not square as we aren't sampling often enough, so I decide to try another approach:

void loop()
{
  int new_left_state = analogRead(1) > 100 ? 1 : 0;
  int new_right_state = analogRead(2) > 100 ? 1 : 0;
  
  digitalWrite(3,new_left_state);
  digitalWrite(4,new_right_state); 
}

Now I'm reading the wave constantly, with no delays whatsoever. I couldn't output to serial fast enough to display these readings, however what I can do is output digital results to the digital pins, and monitor the output with an oscilloscope:

Square wave coming out of digital pin, generated by Arduino reading sensors
As you can see, I now get a perfectly regular square wave. If you can read the settings you'll see the wave alternates about once every 10ms. With 16 coloured strips on the card this means it takes 160ms for the wheel to turn once, thus the wheel turns 6.25 times per second. Multiply by 60 and you're at 375 rpm - just 15 off what I expected!

The Arduino is reading sensor data now, and I've proved it works with the short program above. However this code (and anything that used it) needs to be updating constantly so it can detect changes in sensor output as soon as they occur. This won't fit into any useful program, so I need to switch the code to use interrupts. An interrupt is an event triggered by the hardware that interrupts whatever is happening and runs some code.

Unfortunately interrupts on the Arduino are only available for digital IO, so the first problem is how to convert the analog output from the sensors into a digital input for the Arduino. After worrying about this for a while, I just plug the sensors directly into the digital IO. Fortunately the range coming from the analog sensor is wide enough to cross the digital IO reference voltage, so I get a nice solid square wave straight away. Now for some modifications to the code to use interrupts instead of constant polling.

//PIN Definitions
#define PIN_IRLEFT 2
#define PIN_IRRIGHT 3
#define PIN_OUTLEFT 4
#define PIN_OUTRIGHT 5

void leftChange()
{
  digitalWrite(PIN_OUTLEFT,digitalRead(PIN_IRLEFT));
}

void rightChange()
{
  digitalWrite(PIN_OUTRIGHT,digitalRead(PIN_IRRIGHT));
}

///////////////////////////////////////////////////////////////////
//init
///////////////////////////////////////////////////////////////////
unsigned long last_print = 0;
void setup()  
{
  //init pins
  pinMode(PIN_IRLEFT,INPUT);
  pinMode(PIN_IRRIGHT,INPUT);
  pinMode(PIN_OUTLEFT,OUTPUT);
  pinMode(PIN_OUTRIGHT,OUTPUT);
    
  //init serial port for debugging and print message  
  Serial.begin(115200);
  Serial.println("Hello");
  
  attachInterrupt(0, leftChange, CHANGE);
  attachInterrupt(1, rightChange, CHANGE);
}

//the loop function has nothing to do - all the work happens in the interrupt handlers
void loop() {}

As you can see here, there's no actual need for the loop function to do anything at all. I attach interrupt 0 (which is on pin 2) to leftChange, and interrupt 1 (which is on pin 3) to rightChange. When an interrupt is triggered I read the corresponding pin and output its value just as in the earlier example. Once again, the oscilloscope shows me a square wave - things are working!

Now to get some useful data out (with the help of this tutorial by OddBot). The main thing he points out is how to convert the encoder signal into a direction. I end up with this code, which is based on his examples:

byte current_encoder = 0; //previous 2-bit encoder reading
//quadrature encoder matrix: indexed by [previous*4 + current] reading, gives
//-1 or 1 for a valid step in either direction, 0 for no change, and 2 for an
//invalid transition (both sensors changed at once)
int QEM [16] = {0,-1,1,2,1,0,2,-1,-1,2,0,1,2,1,-1,0};
long encoder_pos = 0; //accumulated encoder position in ticks
byte readEncoder()
{
   byte left = digitalRead(PIN_IRLEFT);
   byte right = digitalRead(PIN_IRRIGHT);
   digitalWrite(PIN_OUTLEFT,left);
   digitalWrite(PIN_OUTRIGHT,right);
   return left | (right << 1);
}

void updateEncoder()
{
  byte new_encoder = readEncoder();
  int dir = QEM[current_encoder*4+new_encoder];
  encoder_pos += dir;
  current_encoder = new_encoder;
}

I'm doing 3 things here:
  • Reading a number from 0 to 3 to indicate the 2-bit encoder value (1 bit per sensor)
  • Using the QEM (quadrature encoder matrix) lookup table to convert a previous+current encoder value into a direction
  • Incrementing or decrementing the current encoder position based on direction
In other words, each time the interrupt is triggered, I either increment or decrement the current encoder position.

Next up, I add my loop function as follows:

long last_time = 0;        //time of previous reading (milliseconds)
long last_encoder_pos = 0; //encoder position at previous reading

void loop()
{  
  long new_time         = millis();
  long new_encoder_pos  = encoder_pos;
  long pos_delta        = new_encoder_pos - last_encoder_pos;
  long time_delta       = new_time-last_time;
  
  last_time             = new_time;
  last_encoder_pos      = new_encoder_pos;
  
  long rpm              = ((pos_delta * 60000) / time_delta)/32;
  
  Serial.println(rpm);
  
  delay(100);
}

This is basically taking the change in encoder position and the time passed to work out the speed the wheel is turning. Just pos_delta / time_delta would give me ticks-per-millisecond, which isn't too useful, but multiply it by 60000 (milliseconds -> minutes), divide by 32 (number of interrupts per revolution) and you get rpm! And surprise surprise, if I print out the rpm and graph it in Excel:

RPM graph coming from Arduino
It's spikier than I'd like (although that's probably due to badly positioned sensors and a wobbly blu-tac-attached wheel), but it's clearly giving a solid reading of just around 400rpm - probably what I'd expect from this motor given all that's currently attached is a small wooden disk!

Well, that's it for now. I have a quadrature encoder built. Next up - fit the disk and sensors to MmBot. Once that's done I can take the rpm of each wheel and use it to work out the actual velocity of the robot at the point the wheel is touching the ground, eventually giving me the ability to reliably say 'go forwards 5m', or 'turn 90 degrees'.

-Chris


Thursday 17 May 2012

Even More Sensors!

I've been pondering how to push forwards with the AI side of this project, and have come to the conclusion that I really need a good set of sensors to have much success. If I had my Raspberry Pi and a couple of web cams it might be possible to do everything using just vision (although I doubt it), but realistically the more sensors the better. So, with that in mind, I set about adding more sensors to MmBot - hurray!

You already heard about the PING ultrasound sensor in earlier posts, but as a recap, it's a little device that sends an ultrasound pulse and waits for it to come back (aka sonar), which is one way of measuring distance. Another approach is to use infrared range finders. These devices shine an infrared LED in a fairly tight beam, and measure the strength of the beam that comes back. Each has its own faults and benefits, so I'm gonna use both.
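
For reference, reading one of these IR range finders is just an analogRead plus a conversion. A hedged sketch - I've not named MmBot's exact sensor here, so this assumes a Sharp GP2Y0A21-style part whose output voltage falls off roughly as 1/distance, and the constants are ballpark figures that would need calibrating against a ruler:

//sketch: rough distance from an analog IR range finder (Sharp-style part assumed)
float readIRDistanceCm(int pin)
{
  int raw = analogRead(pin);   //0..1023 on a 5V Arduino
  if(raw < 20)
    return 999.0;              //too far away / no return signal
  return 4800.0 / (raw - 20);  //approximate inverse fit - calibrate!
}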

First up, here's my newly soldered and ready to mount set of sensors:

New sensors ready to attach
The ultrasound and motion sensors will be mounted on the head (partly because it's a good place - high up and movable - but mostly because they're cute and the motion sensor makes a good nose). I'll attach the infrared sensors to the front of the robot, one on each side, so they can detect oncoming obstacles.

After a bit of fiddling and a load of tape, I get the sensors temporarily attached to MmBot in a layout I'm happy with:

New sensors taped to MmBot
I'm not convinced the ultrasound sensor looks great on top of the head - something I'll need to fix in the final design. However the IR sensors on the sides look fine, and as I said earlier - the motion sensor still makes a good nose. It even glows red when it detects motion!

Once things are wired up I'm not in the mood to start programming much (partly cos I have a cold), so I get my new oscilloscope and attach it to read the output from one of the IR range finders:

Wave being generated by moving my hand in front of the IR range finder
 This video shows the whole thing in motion:


After testing all the sensors it's time to properly attach them. For the IR sensors I simply drill a hole and bolt them to the chassis. Similarly, the motion sensor goes on the front of the head, so a bit of wood and a couple of bolts later it's easily attached. However the ultrasound sensor needs to sit in a frame, as you can see here:

Ultrasound sensor in a frame on top of the head
There's nothing to bolt it to, and I'm not convinced I can simply glue it to the top of the head. There'll be too much torque when the head turns and the frame will quickly break off. To solve this, I cut 2 grooves into the head frame that the sensor frame can slot into:

Ultrasound sensor frame slotting into the head
Now I can glue it, and with a bit of the old sawdust + superglue = instant-hardening mega-solid resin trick, the bond will be stronger than the wood!

That's the first set of sensors in place, but while fitting them it occurred to me that a key part of knowing where you are is the ability to take a guess at your current location based on your movement since your last known location. For example, if I know there's a bar 10 miles away at 45 degrees north, and I turn to 45 degrees north and walk for 10 miles, I can place a good bet that I'm somewhere near the bar.

The predictive approach detailed above gets less and less accurate over time, however when combined with other senses it is extremely useful and can massively reduce computation time. You can see why by considering a scenario we all know too well - being lost on a long car journey. When this happens I generally try and find a street sign, then try to find that sign on the map. However, as I know roughly where I am, I only have to scan a small section of map (hopefully) to find the street. Without this predictive ability I'd need to scan the entire UK road atlas every time I needed to find a street.
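
To make that concrete for MmBot, here's a hedged sketch of the dead reckoning update for a two wheeled robot - standard differential drive maths, with the wheel spacing as a made-up placeholder:

#include <math.h>

//sketch: dead reckoning for a differential drive robot. left/right speeds are
//how fast the ground contact point of each wheel is moving, in m/s, and dt is
//the time since the last update, in seconds
#define WHEEL_SPACING 0.2f //metres between the wheels - made up, measure it!

float pos_x = 0, pos_y = 0, heading = 0; //current pose estimate

void updatePose(float left_speed, float right_speed, float dt)
{
  float v = (left_speed + right_speed) * 0.5f;          //forward speed
  float w = (right_speed - left_speed) / WHEEL_SPACING; //turn rate (radians/s)
  heading += w * dt;
  pos_x   += v * cosf(heading) * dt;
  pos_y   += v * sinf(heading) * dt;
}

Run that every few milliseconds and pos_x/pos_y/heading give you the 'rough guess' that the other senses can then correct.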

All of MmBot's motion is driven by 2 wheels. If I can measure the speed each wheel is turning, I can derive the speed each side of the robot is moving at, and thus its path across the floor. So how do I measure wheel speed? Well, it turns out this is a solved problem, using a device called an encoder. This device has infrared reflectivity sensors at its heart. They're just like the range finders mentioned earlier, but much less accurate, and they only operate within around 1.5cm. This video shows one attached to an oscilloscope (sorry for rotatedness):



Despite the odd angle, you can hopefully see the signal changing as I move my finger over the sensor.

To make an encoder you take advantage of the fact that black reflects less IR than white. If you attach a disk like this to the face of a wheel or cog:

Disk attached to inside of wheel
then attach an IR reflectivity sensor opposite it, as you can see in this cross section of MmBot:

Cross section of MmBot, showing IR encoder and wheel disk

As the wheel turns, the amount of IR reflected back will alternate between low (black strip) and high (white strip). By measuring the speed at which the signal alternates, I can derive the speed at which the wheel is rotating, and thus (assuming there's no slip) the speed at which the robot is moving! The alternating signal generates a square wave, as this diagram illustrates:

Wave (top) being generated from alternating colour (bottom) as the wheel slows down

And here it is in real life!

Oscilloscope showing signal from rotating wheel
As the wheel turns it generates a signal, which you can see displayed on my oscilloscope. This video shows it in action, with the wheel turning alternately on/off:


That's where I got to today. Next up I'm going to refine my wheel encoder design and try to get it bidirectional (i.e. it can tell the difference between forwards/backwards).

-Chris







Saturday 5 May 2012

Debugging the eyes

The cameras are mounted on a head, they're talking to the Arduino, it's talking to the Bluetooth modem, the Bluetooth modem is talking to a PC, and the PC is displaying stereo images on screen:

Stereo image from MmBot's eyes
Unfortunately I'm still hitting an issue where, after a few photos, something goes wrong and the PC stops getting data over Bluetooth. This post is gonna be another one of the debugging ones - I'll document my progress as I work out the issue, so it's gonna be rambling :)

At the moment, I have some basic code on the PC that:
  • Requests an image be taken
  • Waits until it has an image size
  • Repeatedly reads content until image is finished
On the Arduino side I've added some debug prints. The first bit of logging looks like this:

Requesting camera take picture
Unprocessed bytes: 5: 0x76 0x00 0x36 0x00 0x00 
Camera picture taken
Requesting camera file size
Camera file size received: 6244
Requesting camera take picture
Unprocessed bytes: 5: 0x76 0x00 0x36 0x00 0x00 
Camera picture taken
Requesting camera file size
Camera file size received: 6824

Here the PC has requested a picture from each eye, and then requested the file size. The unprocessed bytes come from the camera system - it prints them when processing commands other than 'get content', where it doesn't need to parse the response. Here, the unprocessed bytes are the response from the take picture request.

Next up, the Arduino starts printing out a log of what's being sent to the PC:

L[0] sz=32, strm=384
R[0] sz=32, strm=416
L[32] sz=32, strm=608
R[32] sz=32, strm=640
L[64] sz=32, strm=832
R[64] sz=32, strm=832
L[96] sz=32, strm=1024
R[96] sz=32, strm=1024
L[128] sz=32, strm=1024
R[128] sz=32, strm=1024
L[160] sz=32, strm=1024
R[160] sz=32, strm=1024

This shows the camera (L or R), the address in the image being sent inside '[ ]', the size of the chunk, and how full the stream buffer is. Currently I can read from the cameras faster than I can send to the PC, so you see the buffers gradually fill up to 1024 bytes (the total buffer size).

After a while, a camera will reach the end of the image, and its stream buffer will start emptying as the PC reads the remaining bytes:

R[5184] sz=32, strm=1024
L[5216] sz=32, strm=1024
Unprocessed bytes: 5: 0x76 0x00 0x36 0x00 0x00 
R[5216] sz=32, strm=1024
L[5248] sz=32, strm=1023
R[5248] sz=32, strm=1024
L[5280] sz=32, strm=991
R[5280] sz=32, strm=1024

Here you can see the 'unprocessed bytes' from the 'stop picture' command that the left camera received when the left image was complete. Following that, the left stream buffer begins emptying.

A little while later, the right camera does the same:

L[5792] sz=32, strm=479
R[5792] sz=32, strm=1024
Unprocessed bytes: 5: 0x76 0x00 0x36 0x00 0x00 
L[5824] sz=32, strm=447
R[5824] sz=32, strm=1011
L[5856] sz=32, strm=415
R[5856] sz=32, strm=979

And eventually both buffers empty, the PC shows the image, and the process repeats:

R[6784] sz=32, strm=51
R[6816] sz=19, strm=19
Requesting camera take picture
Unprocessed bytes: 5: 0x76 0x00 0x36 0x00 0x00 
Camera picture taken

OK, potentially not the most interesting overview, but it's important for understanding what's going on. In this example everything went fine and I acquired an image similar to the one at the top.

Now, it's been going wrong in a few different ways, all of which end with the PC waiting for data that isn't coming. Whether it's all the same problem showing different symptoms, or several problems, I don't yet know. Here's an example of something going wrong:

R[7232] sz=32, strm=102
R[7264] sz=32, strm=70
R[7296] sz=32, strm=38
Requesting camera take picture                        <--- successfully requested the left camera take a picture
Unprocessed bytes: 5: 0x76 0x00 0x36 0x00 0x00 
Camera picture taken
Requesting camera file size
Camera file size received: 7828                        <--- got the file size
L[0] sz=32, strm=352                                <--- begin reading left stream
R[7328] sz=6, strm=6                                <--- eh? right feed still reading???
L[32] sz=32, strm=544                                
L[64] sz=32, strm=736
L[96] sz=32, strm=928
L[128] sz=32, strm=1024
L[160] sz=32, strm=1024
L[192] sz=32, strm=1024

What appears to have happened is the PC assumed the right image was finished before it was. I'm guessing the series of events went:
  • PC continues reading left/right images until it has got all data from both
  • PC then requests a new picture be taken from both cameras
  • For some reason, the assumption that the right camera was finished was incorrect!
  • As a result, the left state machine restarts as normal, but the right state machine does nothing, as it's still in the 'read content' state
  • The PC now requests file sizes from both cameras. It gets the new size from the left camera, but the size of the previous image (as the Arduino hasn't finished sending it yet) from the right camera
  • We now start reading content. The left works as normal - starting at address 0 and filling up the buffer. However the right simply reads the final few bytes, and then, having not got a command to take another picture, simply stops.
  • Finally, the PC reads the entire left image, then sits there waiting for the right image to come through - which isn't happening as it was never sent!
Phew! So, err, why? Well, first I do a few more tests to see if the symptoms change. Previously I've seen 2 other symptoms - sometimes the stream values seem to go completely corrupt (and I start reading from position 1827361928), and occasionally a byte just seems to get lost - the PC receives 63 bytes when it was supposed to get 64. After a few tests I see both problems occur again.

This could be to do with loss of data over Bluetooth, but some of these problems stink of a logic bug somewhere - especially the first one. I make a couple of tweaks to the PC code:
  • I make sure it waits until the message comes back that the 'start picture' was successful - my Arduino code already supported this (by sending back 1 or 0), so I may as well use it
  • I print out the image sizes it's expecting
The PC quickly gets stuck after a couple of images, failing to start taking the right photo. Basically the first problem I mentioned has occurred, but I've caught it early. I check the print outs and the right camera was expecting 6176 bytes, which the Arduino agrees with. The last block requested and sent to the camera was at address 6144, of size 32 bytes, but there were 38 in the buffer. Clearly the PC made the right call - it got exactly 6176 bytes and then moved on to the next shot. So where did the extra 6 bytes come from?

Then I notice a really scary line in the Arduino output:

R[5152] sz=32, strm=1030

My 1024-byte stream buffer has 1030 bytes in it. Oh dear. It doesn't matter what the Bluetooth is doing - there is no way this should ever happen if the code is correct. This explains the getting stuck due to 6 bytes, and also explains the corrupt data - memory trampling. It's time to revisit my stream filling and see how on earth this can happen.

At first glance, I can see a line of code that would cause the problem if something else had gone wrong:


          //this is the main content bit
          //first, we bail out if we haven't yet sent the remote the data that is in the picture buffer
          if(PictureStreamUsed >= PICTURE_STREAM_SIZE)
            break;

This 'bail out' is fine provided you always read the same-sized chunks (except the very last one), and PICTURE_STREAM_SIZE is divisible by the chunk size. In theory I do exactly this - only getting 32 bytes at a time. However, in order for my stream to work I must ensure that:
  • The 'PictureStreamWrite' is always a multiple of 32, and never goes above 1024-32
  • The 'PictureStreamUsed' is always a multiple of 32, and never goes above 1024
If any chunk other than the last one is not 32 bytes, this code will fail. I could change the above code to handle it, but that'd just be hiding the deeper problem. Why am I getting chunks that aren't 32 bytes? First thing to do - print out what I am getting and see!

Requesting 32 bytes to camera stream at 992
Received 32 bytes to camera stream at 992
L[5120] sz=32, strm=1024
Requesting 32 bytes to camera stream at 0                     <----- left requests 32 bytes
Received 31 bytes to camera stream at 0                       <----- left gets 31 bytes back - finished?
Requesting 32 bytes to camera stream at 31                    <----- no! left requests 32 bytes again
Received 7 bytes to camera stream at 31                       <----- left gets 7 bytes back!
R[5120] sz=32, strm=1024
Requesting 32 bytes to camera stream at 0
Received 32 bytes to camera stream at 0
L[5152] sz=32, strm=1030
Unprocessed bytes: 5: 0x76 0x00 0x36 0x00 0x00 
.........
Requesting 32 bytes to camera stream at 320
Received 32 bytes to camera stream at 320
L[5472] sz=32, strm=710
R[5472] sz=32, strm=1024
Requesting 32 bytes to camera stream at 352                  <---- right requests 32 bytes
Received 11 bytes to camera stream at 352                    <---- right gets 11 bytes back - finished?
Requesting 32 bytes to camera stream at 363                  <---- no! right requests 11 bytes again
Received 3 bytes to camera stream at 363                     <---- right gets 3 bytes back
Unprocessed bytes: 5: 0x76 0x00 0x36 0x00 0x00 

Well, it seems my assumption that the camera would keep giving me the data I asked for until there was none left was incorrect. There are 2 chunks at the end of each image that aren't multiples of 32 bytes. This is already bad and explains the corruption, but in theory, provided I end up with the right amount of data, it should work most of the time. Unless...

Requesting 32 bytes to camera stream at 928
Received 23 bytes to camera stream at 928 total read 7095 of 7096
Requesting 32 bytes to camera stream at 951
Received 7 bytes to camera stream at 951 total read 7102 of 7096
Unprocessed bytes: 5: 0x76 0x00 0x36 0x00 0x00 
R[6048] sz=32, strm=1024

Yes. It appears the camera happily returns more data than is actually in the image, and the code they suggest to mark the actual end (detect 0xff,0xd9 - the JPEG end-of-image marker) doesn't actually... well... work properly. I guess at this point I'd expect no less from this camera. Can you tell it's not my favourite thing in the world?

So there are 2 problems to handle:
  • The fact that I get additional bytes
  • The non-32-byte results.
The first issue is trivial. I'm already recording how many bytes have been sent, so I simply change the wait condition at the end of the camera state machine to be:

          //finally, wait for remote to drain the camera stream
          if(PictureSendAddress < PictureSize)
            break;

And for good measure, adjust the command code to avoid sending extra data:

          //clamp the amount to discard extra bytes at the end of the image
          if( (CS1.PictureSendAddress+bytes_to_send) > CS1.PictureSize )
          {
            bytes_to_send = CS1.PictureSize - CS1.PictureSendAddress;
          }

Things are much more stable now, and the code gets loads of pictures. Now all I have to deal with is the much less common issue of data corruption. This only occurs if the non-32-byte chunks arrive right at the end of the stream buffer. This diagram shows the problem (it's got 8-byte chunks, not 32-byte, but the principle is the same):


Each row shows the buffer filling up, initially in full-sized chunks nice and uniformly. However on row 4 we get a smaller chunk back, leaving a gap at the end. This stops the write pointer wrapping round. Now if the data stopped there this would be fine, but the image is not finished and another chunk is requested. The chunk is not full size, but it does go off the end of the buffer, corrupting any data that follows it. As it happens, the data following the stream buffers in my code is the image size and stream information, which explains why I occasionally end up reading from insane stream positions.
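
For reference, the textbook fix here would be a wrap-aware write that splits each chunk at the buffer edge - sketched below using my variable names. Note it would also mean reading the camera into a small temporary buffer first (since GetContent currently writes straight into the stream), which is more surgery than I fancy right now:

//sketch: write 'count' bytes into the circular stream, splitting the copy
//at the end of the buffer instead of running past it
void streamWrite(const byte* data, int count)
{
  int to_end = PICTURE_STREAM_SIZE - PictureStreamWrite;
  int first = min(count, to_end);
  memcpy(PictureBuffer + PictureStreamWrite, data, first);
  memcpy(PictureBuffer, data + first, count - first); //wrapped remainder (may be 0 bytes)
  PictureStreamWrite = (PictureStreamWrite + count) % PICTURE_STREAM_SIZE;
  PictureStreamUsed += count;
}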

So what to do? Well first, I need to make the problem happen again. I've not seen it since the last fix, which is mildly concerning - have I accidentally made it go away? If so I need to know why. If not, I need to fix it. The issue occurs due to a buffer overrun, so I can make it more common by reducing the buffer size. I shrink it down to 96 bytes (from 1024). This means there's a 1 in 3 chance that a read will be at the end of the buffer. It takes a few photos to occur, but eventually I get:

Requesting 32 bytes to camera stream at 64
Received 15 bytes to camera stream at 64 total read 6319 of 6320
Requesting 32 bytes to camera stream at 79
Received 7 bytes to camera stream at 79 total read 3732081255 of 1000798054
Unprocessed bytes: 5: 0x76 0x00 0x36 0x00 0x00 

The read at byte 79 can cause up to 32 bytes to arrive (even if the camera API only gives you back 7, more can still be written). As a result, the stream buffer could be written up to byte 114, and it's only 96 bytes in size. The question is, do I actually need those extra bytes? i.e. the final 2 reads (the ones that are non-uniform) can be up to 64 bytes in total, but it's conceivable that I only need the first 32 bytes of them. It sounds weird, but print outs like this make me think it's possible:

Received 31 bytes to camera stream at 64 total read 7103 of 7104
Received 7 bytes to camera stream at 95 total read 7110 of 7104

Look carefully and you'll see the first non-uniform chunk is 31 bytes in size (1 byte off a full chunk), and we end up with 7103 bytes of a 7104-byte image. All I need is 1 byte to finish the image, and all I need is 1 byte to finish the chunk. This pattern seems to repeat - they almost always match up, and even when they don't, I never end up needing more than 1 full chunk to finish the image. If that's the case, I can simply extend the stream buffer to have 32 bytes of padding, and allow it to overflow. I decide to try it and see what happens, as this will make life much easier! You can see the small change here:

    byte PictureBuffer[PICTURE_STREAM_SIZE+32];

This seems to work and my images come through fine. The whole system is much more stable now, and I head upstairs for a cup of tea. When I come back, it's still taking pictures happily!

However I'm still getting occasional image corruption, and once or twice the PC code stops responding, waiting for data that doesn't come. My gut says this is due to overloading the Bluetooth link, and I recall reading a small note in the modem docs saying that wiring up the ready-to-receive and ready-to-send pins is advisable. This is because Bluetooth is wireless, so isn't a perfectly stable (in terms of timing) connection, and I could easily end up overloading the transmit buffer. To test this, I remove a lot of the serial print outs (which always slow things down), increase the stream buffer, and turn up the rate at which I attempt to feed data back to the PC. With both commands set to transmit up to 256 bytes back to the PC at a time and a stream buffer of 1024 bytes, I start getting print outs like this:

Unprocessed bytes: 5: 0x76 0x00 0x36 0x00 0x00 
Camera picture taken
Requesting camera file size
Camera file size received: 7200
L[0] sz=256, strm=288
R[0] sz=256, strm=320
L[256] sz=256, strm=256
R[256] sz=256, strm=320
L[512] sz=256, strm=288
R[512] sz=256, strm=288
L[768] sz=256, strm=256
R[768] sz=256, strm=288
L[1024] sz=256, strm=256
R[1024] sz=256, strm=288
L[1280] sz=256, strm=256
R[1280] sz=256, strm=288

What you see here is very good! The stream buffer is no longer filling up, meaning I am transmitting to the PC at the same rate as I receive from the cameras - exactly what you want from a streaming system - and I'm back to my 1 frame every 5 to 10 seconds rate. It's still very stable, so I'm tempted to say I'll leave the fancy Bluetooth flow control for another day and simply make my PC code more robust against lost data. Not by getting all clever with non-blocking code - simply by making it time out if data hasn't arrived, and tweaking the PC+MmBot code so it's able to give up and restart a picture.
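
The timeout idea is simple enough. In sketch form, on the PC side (tryReadByte here is a hypothetical stand-in for whatever the real serial API offers - it should return true and fill 'out' if a byte was waiting):

#include <chrono>
#include <functional>

//sketch: read a byte, but give up after timeout_ms so the caller can
//abandon the current picture and re-request it rather than blocking forever
bool readByteWithTimeout(std::function<bool(unsigned char&)> tryReadByte,
                         unsigned char& out, int timeout_ms)
{
  using clock = std::chrono::steady_clock;
  auto deadline = clock::now() + std::chrono::milliseconds(timeout_ms);
  while(clock::now() < deadline)
  {
    if(tryReadByte(out))
      return true;
  }
  return false; //timed out - restart the picture
}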

Wow! Eyes now work. Given image processing will be limited with this version (until I can do it onboard - either on a Raspberry Pi or some other more hardcore processor), I think I'll say that's as good as the camera feed is getting for now. 0.1fps isn't great, but I can still do stereo imaging of the background and basic face recognition.

Next up I'll get some nice controls in the PC app for driving MmBot around manually and getting the camera feed. Then I'll add the infrared range finders and ultrasound to the equation and get MmBot wandering randomly around the office, taking pictures and feeding them back. Not long before the first hints of autonomy!

-Chris


First Sight

Back on Thursday I managed to wire up both cameras to serial connections 2 and 3 on the Arduino, and began the process of getting the images over to the PC for display / processing. Over the past few days I've been gradually progressing, but getting the code to work reliably has been a hard task. Initially I was just gonna send them over USB serial and handle Bluetooth later, but as it turned out, using Bluetooth for communication and the normal USB serial for debugging was much more useful. This is the pair of cameras wired up:

The cameras connected to the prototyping board inside MmBot
So, over the past few blogs I've ended up with:
  • A system for remote communications over Bluetooth
  • 2 LinkSprite cameras, mounted on a nice 'head'
  • A nice API for using the cameras
The first step was to work out how to allow the PC to send a command to request a camera image, get its information and eventually retrieve the data. Remember this whole thing has to be totally non-blocking. In other words, it needs to be a piece of code that runs once per frame, checks the state of things, and takes action if there's something to do. I settled upon writing a state machine to achieve this, which is shown below:

State machine for reading data from cameras
You can see how much more complex things get when you want them to be non-blocking!

Anyhoo, the key bit is really the blue section. Here we're repeatedly grabbing some data from the camera, waiting for it to get to the PC, and then bailing out if we've reached the end of the photo.

With a bit of debugging this process actually worked fairly well, but I wasn't entirely satisfied. The issue is that the system has to wait for the next chunk of the image to get to the PC before being allowed to read more from the camera. Surely it'd be better if we could be busy reading the next chunk while simultaneously sending the current chunk to the PC!

Doing the sending/receiving simultaneously isn't actually that tricky - it's basically a case of streaming. At the moment I just have 1 buffer, so after reading it from the camera, I have to wait until it's been requested by the PC and transferred into the serial transmit buffer before refilling. However if I make that buffer big enough to contain 2, 3 or even 32 chunks of image, then I can be reading from one and writing to another! This code can easily get tricky to write, but I've written it wrong enough times over the years to finally know how to get it right first time :) Here's the state machine code I ended up with:

    void UpdateCamera()
    {
      switch(CameraStage)
      {
        case 0:
          //idle - i.e. nothing requested
          break;
        case 1:
          //remote has requested a picture be taken, so tell camera to begin taking picture
          Serial.println("Requesting camera take picture");
          Cam->StartTakingPicture();
          CameraStage++;
          //fall through
        case 2:
          //wait for camera response
          if(!Cam->Update())
            break;
          Serial.println("Camera picture taken");
          CameraStage++;
          //fall through
        case 3:
          //request the file size
          Serial.println("Requesting camera file size");
          Cam->GetFileSize(&PictureSize);
          CameraStage++;
          //fall through
        case 4:
          //wait for camera response
          if(!Cam->Update())
            break;
          Serial.print("Camera file size received: ");
          Serial.println(PictureSize);
          CameraStage++;
          //fall through
        case 5:
          //this is the main content bit
          //first, we bail out if we haven't yet sent the remote the data that is in the picture buffer
          if(PictureStreamUsed >= PICTURE_STREAM_SIZE)
            break;
          //next, we check if we have reached the end of the photo
          if(PictureReadAddress >= PictureSize)
          {
            //reached the end, so jump to stage 7 (where we stop taking the picture)
            CameraStage=7;
            break;
          }  
          //need more data, so begin getting content
          Cam->GetContent(PictureBuffer+PictureStreamWrite,&PictureAmountRead,32,PictureReadAddress);
          CameraStage++;
           //fall through
        case 6:
          //wait for camera response
          if(!Cam->Update())
            break;
          //got response, so increment the read address and loop back to stage 5
          PictureStreamWrite = (PictureStreamWrite + PictureAmountRead) % PICTURE_STREAM_SIZE;
          PictureStreamUsed += PictureAmountRead;
          PictureReadAddress += PictureAmountRead;
          CameraStage=5;
          break;
        case 7:
          //all done, so tell camera to stop taking picture
          Cam->StopTakingPicture();
          CameraStage++;
          //fall through
        case 8:
          //wait for camera
          if(!Cam->Update())
            break;    
          CameraStage++;
          //fall through
        case 9:
          //finally, wait for remote to drain the camera stream
          if(PictureStreamUsed > 0)
            break;
          //reset camera state
          CameraStage=0;
          PictureSize = 0;
          PictureAmountRead = 0;
          PictureReadAddress = 0;
          PictureSendAddress = 0;
          PictureStreamWrite = 0;
          PictureStreamRead = 0;
          PictureStreamUsed = 0;
          //fall through
        default:
          //anything that hits here just goes back to stage 0
          CameraStage = 0;
          break;  
      }
    }  

It's quite a common model in C++: a switch statement that falls through from case to case, only breaking out when some required condition isn't met yet (such as the camera not having responded yet). The code above contains the stream-filling part (see cases 5 and 6), while the stream reading is in the command processing here:

    case COMMAND_PHOTO_DATA_LEFT:
      {
        //check if there's any data available
        if(CS1.PictureStreamUsed > 0)
        {
          //got data, so work out how much to send, up to a maximum of (currently) 32 bytes
          int bytes_to_send = min(CS1.PictureStreamUsed,32);
          
          //clamp the amount to avoid overrunning the end of the stream buffer
          if( (CS1.PictureStreamRead+bytes_to_send) > PICTURE_STREAM_SIZE )
          {
            bytes_to_send = PICTURE_STREAM_SIZE-CS1.PictureStreamRead;
          }
           
          //write out number of bytes to send, followed by the actual data
          WriteWord(bytes_to_send);
          Comms->write(CS1.PictureBuffer+CS1.PictureStreamRead,bytes_to_send);
          //update stream position
          CS1.PictureStreamRead = (CS1.PictureStreamRead+bytes_to_send)%PICTURE_STREAM_SIZE;
          CS1.PictureStreamUsed -= bytes_to_send;
          CS1.PictureSendAddress += bytes_to_send;
        }
        else
        {
          //no data, so write '0' (to indicate 0 bytes)
          WriteWord(0);
        }
        GCurrentCommand = TOTAL_COMMANDS;
      }
      break;

I've got one set of those commands for the left camera, and one for the right. The main command (get content) is sent by the PC to request the next chunk of the photo. Provided data is available, we pull it out of the picture buffer and send it out the serial port. Note that it's all wrapped up in a simple class - this allows me to have 2 cameras and 2 streams, 1 for each eye!
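
For context, CS1 is one of 2 instances of the per-camera state the class wraps up. I haven't shown its declaration, but based on the fields used above it looks roughly like this (a sketch - the exact types are assumptions):

    //sketch of the per-camera state, one instance per eye
    struct CameraState
    {
      int  CameraStage;        //state machine stage (0 = idle)
      long PictureSize;        //file size reported by the camera
      long PictureReadAddress; //next address to request from the camera
      long PictureSendAddress; //how many bytes have been sent to the PC
      int  PictureAmountRead;  //bytes returned by the last GetContent
      int  PictureStreamWrite; //write cursor into the circular stream buffer
      int  PictureStreamRead;  //read cursor into the circular stream buffer
      int  PictureStreamUsed;  //bytes currently sat in the stream buffer
      byte PictureBuffer[PICTURE_STREAM_SIZE];
    };

    CameraState CS1, CS2; //left and right eyes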

With a few tweaks to the PC side code I was able to show both images on screen. This is a photo of MmBot looking at me:

MmBot looking at me

And this is what MmBot could see (for some reason I have grey hair here - I really don't btw):

What MmBot can see when looking at me!
Awesome!

I won't bother showing the new PC code - it's basically the same stuff that I've already posted a few times. A thread sends commands to MmBot over Bluetooth, waits for data to come back, and once a whole photo is received it tells the main thread to show it in an image box. The only difference is that it now reads 2 pictures instead of 1.

It's now Saturday and I've come a fair way with it, but I'm still occasionally losing data, and eventually entering some form of deadlock where the PC is waiting for data that isn't coming. Hopefully a few hours' debugging today will solve the problem. My gut right now is that I'm overflowing the buffer on the Bluetooth modem, and need to figure out / wire up the 'ready to send' pin.

Unfortunately, once this is working I doubt I'm gonna be getting more than 1 frame every 5 to 10 seconds - the fact is the Arduino is cool, but doesn't have the raw power needed for image processing. Even so, it's a good experiment, and once I get the Raspberry Pi I'll be able to do the work on-board in something closer to real-time.