Line avoiding maze with object retrieval and deposit – MiniBloq and Sparkiduino

Hi 🙂

A few days ago I posted about the way Sparki can sense lines, or changes underneath him:

https://ashbotandsparki.wordpress.com/2015/06/26/how-does-sparki-sense-lines/

Here is a program that makes use of this information (potential flaws included).

In this program Sparki is intended to move forward until he reads a change in the reflected light underneath him (so reaches a darker surface) and then turn 90 degrees away from it. So if the sensor on the far left detects the change, Sparki turns right, and vice versa. This leaves the middle sensors free for picking up an object with the grippers and depositing it at the end of the maze.

Here is a screenshot of part of the MiniBloq code (it won't all fit on one screen):

Untitled12

And here is the Arduino version:

#include <Sparki.h> // include the Sparki library

void setup()
{
  float threshold = 500; // readings below this mean a darker surface under Sparki
  while(true)
  {
    int lineCenter = sparki.lineCenter();
    int edgeLeft = sparki.edgeLeft();
    int edgeRight = sparki.edgeRight();
    sparki.servo(0);
    sparki.RGB(0,255,0);
    if(edgeLeft < threshold) // far-left sensor sees the dark line: turn right, away from it
    {
      sparki.servo(90);
      delay(2000);
      sparki.RGB(0,0,255);
      sparki.moveRight(90);
    }
    if(edgeRight < threshold) // far-right sensor sees the dark line: turn left, away from it
    {
      sparki.servo(-(90));
      delay(2000);
      sparki.RGB(0,0,255);
      sparki.moveLeft(90);
    }
    if(edgeLeft > threshold) // left edge is clear: keep moving forward
    {
      sparki.moveForward(0);
    }
    if(edgeRight > threshold) // right edge is clear: keep moving forward
    {
      sparki.moveForward(0);
    }
    if(lineCenter < threshold) // centre sensor sees the line: close the grippers on the object
    {
      delay(1000);
      sparki.moveForward(3);
      sparki.gripperClose();
      delay(4000);
      sparki.gripperStop();
      delay(2000);
      sparki.RGB(255,0,0);
    }
    if((edgeRight < threshold) && (edgeLeft < threshold)) // both edge sensors see the line: end of the maze, so drop the object and turn around
    {
      sparki.gripperOpen();
      delay(4000);
      sparki.gripperStop();
      delay(2000);
      sparki.RGB(0,0,255);
      sparki.beep(540, 1000);
      sparki.moveBackward(3);
      sparki.moveLeft(180);
    }
    delay(2000);
  }
}

void loop()
{
}

Here is a video of Sparki completing a maze by staying inside the lines:

https://drive.google.com/file/d/0B6y9ITteBcqLLThlN3JMZElIYm8/view?usp=sharing

And finally, a picture of the awesome bot getting ready to start the maze:

DSC_8101

Let me know what you think 🙂 🙂

Light Following and Avoiding – MiniBloq and Sparkiduino

This week I decided to include an example of the code in both MiniBloq and Sparkiduino.

Sparkiduino is a programming environment that uses Arduino syntax and is designed for programming Sparki, so it comes with Sparki's library and example code built in.

sparkiduino

I played around with light following and avoiding in this code, and found that trying to guide Sparki with a torch is really fun and unpredictable, as I only had control of the torch and not the other light sources in the room.

In this code Sparki's light sensors, positioned along the front of his shell, measure the light level, print the readings on the screen (Sparkiduino version only) and then conditional if statements move Sparki towards the light. Obviously, to avoid the light you simply program the robot to move in the opposite direction (there's a sketch of this after the Arduino code below).

In the Sparkiduino code Sparki prints out the light readings on the LCD screen

I used if statements instead of if/else statements as I wanted the light measurements to be updated often to help Sparki respond quickly. With an if/else statement the code would jump straight to the else branch based on the initial reading, rather than taking a new reading and then deciding what to do.

Here is a screenshot of the MiniBloq code:

Untitled3

And here is the Arduino-based code:

#include <Sparki.h> // include the sparki library

void setup() 
{
}

void loop() {
  int left   = sparki.lightLeft();   // measure the left light sensor
  int center = sparki.lightCenter(); // measure the center light sensor
  int right  = sparki.lightRight();  // measure the right light sensor
  sparki.clearLCD();
  sparki.print("left = ");
  sparki.println(left);
  sparki.print("center = ");
  sparki.println(center);
  sparki.print("right = ");
  sparki.print(right);
  sparki.updateLCD();
  delay(2000); // pause so the readings can be read on the LCD
  
  if ( (center > left) && (center > right) ) // if the center light is the strongest
  {  
    sparki.moveForward(); // move forward
  }

  if ( (left > center) && (left > right) )  // if the left light is the strongest
  {   
    sparki.moveLeft(); // turn left
  }

  if ( (right > center) && (right > left) )  // if the right light is the strongest
  {  
    sparki.moveRight(); // turn right
  }
  
  delay(100); // wait 0.1 seconds
}
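
As mentioned above, turning this into light avoiding just means driving away from the strongest reading instead of towards it. Here's a minimal sketch of what that could look like, with each movement flipped; the sensor calls are the same as above, and I've left out the LCD printing to keep it short:

#include <Sparki.h> // include the sparki library

void setup()
{
}

void loop() {
  int left   = sparki.lightLeft();   // measure the left light sensor
  int center = sparki.lightCenter(); // measure the center light sensor
  int right  = sparki.lightRight();  // measure the right light sensor

  if ( (center > left) && (center > right) ) // if the center light is the strongest
  {
    sparki.moveBackward(); // back away from it
  }

  if ( (left > center) && (left > right) )  // if the left light is the strongest
  {
    sparki.moveRight(); // turn away to the right
  }

  if ( (right > center) && (right > left) )  // if the right light is the strongest
  {
    sparki.moveLeft(); // turn away to the left
  }

  delay(100); // wait 0.1 seconds
}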

Finally, here's a video of Sparki in action following the light source. WARNING: at the end of the video my torch makes the light flash.

https://drive.google.com/file/d/0B6y9ITteBcqLc1BBOGZNMzVTaUk/view?usp=sharing

Let me know what you think 🙂 🙂

Follow Hand Program

Now, this idea stemmed from the WowWee MiP robot.

I'm quite a fan of this little guy as a fun toy to mess around with. It's advertised on its two-wheel self-balancing tech and its gesture sensing. In the following program I attempted to recreate the gesture-sensing tech with Sparki, which up to a point was fairly straightforward and successful. Here is a screenshot of some of the blocks used in MiniBloq to program this:

Untitled

I started off with a while loop: while an object such as my hand was less than 10 cm away from his ultrasonic range finder, Sparki would move forward, and he would continue until the object could no longer be detected.

Now this is great, but MiP can interpret gestures to turn around. This does not always go according to plan, but that adds to the fun of an entertaining toy robot that doesn't always need to be accurate and precise. To mimic this with Sparki I used another while loop: while an object or hand is more than 10 cm away, he rotates on the spot. Once an object or a hand is back in range, Sparki beeps, changes his LED colour and follows the object or hand again.

This is a really fun way to interact with Sparki, especially for younger people who are maybe too young to code. Also, it's a great way to experiment with the while loop, which I often overlook when programming.

There is one problem with using MiniBloq to code this, and that is the lack of an increment counter. When using a while loop the program will repeat infinitely, so most of the time we add an increment counter so that it only repeats up to a certain point and doesn't break the computer. This is quite easy in languages similar to C++, including JavaScript and Arduino, but it is something missing from MiniBloq.
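
For comparison, here is a minimal sketch of what an increment counter could look like in Sparkiduino. The count variable and the limit of 50 repeats are just my own illustration, not something generated by MiniBloq:

#include <Sparki.h> // include the Sparki library

void setup()
{
  int count = 0; // the increment counter
  while((sparki.ping() < 10) && (count < 50)) // follow for at most 50 passes
  {
    sparki.moveForward(); // keep moving towards the hand
    delay(100);
    count = count + 1; // add one to the counter each time round the loop
  }
  sparki.moveStop(); // stop once the hand is gone or the counter runs out
}

void loop()
{
}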

Here is the generated code:

#include <Sparki.h> // include the Sparki library

void setup()
{
  while(true)
  {
    sparki.servo(0); // point the range finder straight ahead
    while(sparki.ping() < 10) // an object or hand is within 10 cm: follow it
    {
      delay(2000);
      sparki.RGB(0,255,0);
      sparki.moveForward(0);
      sparki.beep(540, 1000);
    }
    sparki.servo(0);
    while(sparki.ping() > 10) // nothing in range: rotate on the spot until something appears
    {
      delay(2000);
      sparki.moveRight(0);
    }
  }
}

void loop()
{
}

Let me know what you think 🙂 🙂

Oh, and a pic of the two bro-bots to finish:

 DSC_0003

EDIT:

As really helpfully suggested by franzcalvo, here is a link to a video showing Sparki running the follow hand program.

 

https://drive.google.com/file/d/0B6y9ITteBcqLWmp5aTdPNURWUzg/edit?usp=docslist_api

 

MiniBloq line following and gripper use

This was an awesome piece of code to mess around with and it taught me a lot.

Untitled

Initially I thought that Sparki could use his ultrasonic sensor to find out whether there was an object in his path or not, but I found that this made it difficult for him to continue following the line (especially at corners), as the object would block the light, making the line sensor detect less reflected light and conning Sparki into thinking he was still on the line.
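
For reference, my first attempt was roughly along these lines. This is a reconstruction rather than the actual code, and the 700 threshold and 5 cm trigger distance are just example values:

#include <Sparki.h> // include the Sparki library

void setup()
{
}

void loop()
{
  int lineCenter = sparki.lineCenter(); // centre line sensor reading
  if(lineCenter < 700) // dark reading: assume the centre sensor is still over the line
  {
    sparki.moveForward();
  }
  else // lost the line: turn until the sensor finds it again
  {
    sparki.moveRight();
  }
  if(sparki.ping() < 5) // the ultrasonic sensor reports an object just ahead: grab it
  {
    sparki.moveStop();
    sparki.gripperClose();
    delay(2000);
    sparki.gripperStop();
  }
  delay(100);
}

The problem is exactly what is described above: once the object is in front of the grippers it shades the centre sensor, so the dark reading can no longer be trusted.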

This threw a spanner in the works, but Sparki actually has 5, yes 5, line sensors, so I decided I was willing to lose a bit of accuracy: instead of using the middle sensor to follow the line (as this is the one that gets blocked by the object), I used an edge sensor to follow it, keeping the line on the left or right of Sparki.

Even though this solves the problem, I decided instead to use the line sensors individually, with commands for Sparki to pick up an object and drop an object depending on the line sensors' readings. This worked really well and was fun to put together and play around with. It also means that Sparki can pick up more than one object throughout the maze, which gives a lot more freedom.

20150123_165834

Here is the MiniBloq-generated code that I ended up with:

#include <Sparki.h>

void setup()
{
  float threshold = 800; // readings below this mean the sensor is over the dark line
  float lineLeft = 0;
  float lineRight = 0;
  float edgeRight = 0;
  while(true)
  {
    sparki.servo(0);
    lineLeft = sparki.lineLeft();
    lineRight = sparki.lineRight();
    edgeRight = sparki.edgeRight();
    if(edgeRight < threshold) // right edge sensor is over the line: keep driving forward
    {
      sparki.RGB(0,250,0);
      sparki.moveForward(0);
    }
    else // off the line: turn until the sensor finds it again
    {
      sparki.RGB(250,0,0);
      sparki.moveRight(0);
    }
    if((edgeRight < threshold) && (lineRight < threshold)) // right line sensor also sees dark: pick up the object
    {
      delay(5000);
      sparki.beep(440, 1000);
      sparki.RGB(0,0,250);
      sparki.gripperClose();
      delay(2000);
      sparki.gripperStop();
      delay(1000);
    }
    if((edgeRight < threshold) && (lineLeft < threshold)) // left line sensor also sees dark: drop the object
    {
      delay(3000);
      sparki.gripperOpen();
      delay(2000);
      sparki.gripperStop();
      delay(1000);
    }
    delay(100);
  }
}

void loop()
{
}

And here is a link to a video of Sparki running the completed program:

https://drive.google.com/file/d/0B6y9ITteBcqLSnVidmdFNkZhUU0/view?usp=sharing

Let me know what you think 🙂 🙂

Using MiniBloq to solve a maze

This was one of the first things I worked on with Sparki.

I used MiniBloq, which is a graphical programming tool that helps n00bs like me get used to programming and thinking like a programmer before we start with Arduino.

The first thing I realised was how much I would need to tell Sparki to help him navigate and how much we as humans rely on our senses to understand our environment.

This is what the MiniBloq software looks like; I found it simple to use and easy to grasp.

minibloq

As you can see, the generated code is shown on the right-hand side, so it is easy to make the leap from graphical programming to C/C++.

This is the generated code from this program:

#include <Sparki.h> // include the Sparki library

void setup()
{
  while(true)
  {
    sparki.servo(0);        // range finder pointing straight ahead
    sparki.RGB(0,255,0);
    sparki.moveForward(10); // move forward 10 cm
    if(sparki.ping() < 10)  // something is blocking the path ahead
    {
      sparki.servo(90);     // swing the range finder to one side
      delay(1000);
      if(sparki.ping() < 10) // that side is blocked too
      {
        sparki.servo(-(90)); // check the other side
        delay(1000);
        if(sparki.ping() < 10) // blocked ahead and on both sides: turn all the way around
        {
          sparki.servo(90);
          delay(1000);
          sparki.moveRight(180);
        }
        else
        {
          sparki.servo(-(90));
          sparki.RGB(0,0,255);
          delay(1000);
          sparki.moveLeft(90);
        }
      }
      else
      {
        sparki.servo(-(90));
        delay(1000);
        if(sparki.ping() < 10)
        {
          sparki.servo(90);
          sparki.RGB(0,0,255);
          delay(1000);
          sparki.moveRight(90);
        }
        else
        {
          sparki.servo(90);
          sparki.RGB(0,0,255);
          delay(1000);
          sparki.moveRight(90);
        }
      }
    }
    else // path ahead is clear: glance to each side, then recentre the range finder
    {
      sparki.servo(-(90));
      delay(1000);
      if(sparki.ping() < 10)
      {
        sparki.servo(90);
        delay(1000);
        if(sparki.ping() < 10)
        {
          sparki.servo(0);
          delay(1000);
        }
        else
        {
          sparki.servo(0);
          delay(1000);
        }
      }
      else
      {
        sparki.servo(90);
        delay(1000);
        if(sparki.ping() < 10)
        {
          sparki.servo(0);
          delay(1000);
        }
        else
        {
          sparki.servo(0);
          delay(2000);
        }
      }
    }
  }
}

void loop()
{
}

In this program Sparki relies on his ultrasonic range finder to detect whether his path is clear and which direction he should travel in.
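
If the nesting above is hard to follow, the core idea boils down to something like this simplified sketch. It's my own condensed version, using Arduino's loop(), assuming the SERVO_LEFT and SERVO_CENTER constants from Sparki.h, and leaving out the case where every direction is blocked (which the full program handles by turning 180 degrees):

#include <Sparki.h> // include the Sparki library

void setup()
{
}

void loop()
{
  sparki.moveForward(10);       // move forward 10 cm
  if(sparki.ping() < 10)        // something is blocking the path ahead
  {
    sparki.servo(SERVO_LEFT);   // swing the range finder to the side
    delay(1000);
    if(sparki.ping() > 10)      // that side is clear
    {
      sparki.moveLeft(90);      // turn towards the clear side
    }
    else
    {
      sparki.moveRight(90);     // otherwise try the other way
    }
    sparki.servo(SERVO_CENTER); // look straight ahead again
  }
}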

Here is a video of the program in action:

https://drive.google.com/file/d/0B6y9ITteBcqLYlU2aXpPaDJkTUk/view?usp=sharing

Check it out and let me know what you think 🙂 🙂