Thursday, July 14, 2011


KINECT SERVICES FOR MRDS AVAILABLE!

After the release of the Kinect SDK, here is what many people were waiting for: Kinect Services for MRDS.



This package allows you to use your Kinect within Microsoft Robotics Developer Studio and, moreover, to use it in a simulated environment!

Good job MRDS team!

Thursday, June 16, 2011


MICROSOFT TO RELEASE A SIMULATED VERSION OF KINECT FOR MRDS

Microsoft launched a beta version of an SDK for Kinect, which aims to motivate people to develop crazy Kinect-based applications for PCs running Windows (check out ROS for the open-source alternative).

The good news is that, recently, Trevor Taylor (Program Manager in the Microsoft Robotics Group and co-author of the book "Professional Microsoft Robotics Developer Studio") confirmed to me on the MRDS forum that the guys at Microsoft are working on providing a simulated version of Kinect for Robotics Developer Studio.

In my humble opinion, this will be a major contribution to the robotics community, as you will be able to easily run very advanced experiments in a completely simulated environment!

Trevor promised that it will be ready soon, so get ready!

Tuesday, June 14, 2011


OPENCV: NEW COOKBOOK

Last week a new book about OpenCV came to my attention: OpenCV 2 Computer Vision Application Programming Cookbook. Stay tuned for the upcoming review on this blog!

Monday, May 30, 2011


LIBSSC32: ARDUINO + SSC32

Today I would like to share a library for Arduino that makes it easy to control an SSC32.

But what is an SSC32?
The SSC32 is a serial servo controller, capable of controlling up to 32 servo motors at a time. It is very useful in robotics projects such as arms or humanoids.


You can buy it at http://www.lynxmotion.com for about $40. The great thing about this piece of hardware is its flexibility. It is not only a servo controller; it also provides some pins where you can connect sensors and read their values. And the most useful of its features (in my humble opinion) is that you can specify a group of servos, set each servo in the group to move to a different position, and specify how long it should take for all the servos to finish the movement. It doesn't matter whether each servo is far from or near its final position: the SSC32 will calculate and apply the speed for each servo so that they all finish at the same time!!

This is very powerful for creating complex movements, and they will be performed smoothly.
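For reference, the wire protocol behind all this is quite simple: each servo in the group gets a `#<channel>P<pulse-width>` token, and a single trailing `T<milliseconds>` applies the shared duration to the whole group, terminated by a carriage return. Here is a minimal sketch of how such a command string is built (plain C++, not part of the library; `groupMove` is a name I made up for illustration):

```cpp
#include <cstdio>
#include <string>

// Build an SSC32 group-move command: one "#<ch>P<pw>" token per servo,
// a single shared "T<ms>" duration, terminated by a carriage return.
std::string groupMove(const int channels[], const int pulses[], int n, int timeMs) {
    std::string cmd;
    char buf[32];
    for (int i = 0; i < n; ++i) {
        std::snprintf(buf, sizeof(buf), "#%dP%d", channels[i], pulses[i]);
        cmd += buf;
    }
    std::snprintf(buf, sizeof(buf), "T%d\r", timeMs);
    cmd += buf;
    return cmd;
}
```

For example, moving servos 0 and 1 to position 750 over 5 seconds produces `#0P750#1P750T5000` followed by a carriage return, which is exactly what the library's group commands send over serial.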

Let's get down to business.
In my case, I am using the SSC32 to control an arm with 6 servos (more about this in future posts) and I thought it would be nice to be able to command it from an Arduino.


The only "problem" is that the serial protocol defined by the SSC32 is a bit "uncomfortable" to handle, so I created a class that wraps it and makes it a bit more comfortable.

You can find the Arduino library here (decompress it inside the "libraries" folder of your Arduino environment):




And here is an example of how to use the library:

#include <SSC32.h>

/*
  Tests the SSC32 library. By Martin Peris (http://www.martinperis.com)
  This example code is in the public domain.
 */

SSC32 myssc = SSC32();

void setup() {


  //Start communications with the SSC32 device  
  myssc.begin(9600);



}

void loop() {

  //Move motor 0 to position 750
  //The first command for a servo should not define any speed or time; it is used as initialization by the SSC32
  myssc.servoMove(0,750);

  delay(1000);

  //Move motor 1 to position 750
  //Again no speed or time: this is the first (initialization) command for this servo
  myssc.servoMove(1,750);

  delay(1000);

  //Move motor 0 to position 1500. It will take 5 seconds to finish the movement.
  myssc.servoMoveTime(0,1500,5000);

  delay(5500);

  //Move motor 1 to position 900. It will take 5 seconds to finish the movement
  myssc.servoMoveTime(1,900,5000);

  //Move both servos to position 750 at the same time. 
  //The movement will take 5 seconds to complete.
  //Notice that currently motor 0 is at position 1500 and motor 1 is at position 900,
  //but they will reach position 750 at the same time
  myssc.beginGroupCommand(SSC32_CMDGRP_TYPE_SERVO_MOVEMENT);
  myssc.servoMoveTime(0,750,5000);
  myssc.servoMoveTime(1,750,5000);
  myssc.endGroupCommand();

  delay(5500);

}



[EDIT 2011/11/24]: The library has been adapted to the new version of the Arduino IDE. Thanks to Marco Schwarz.

Monday, April 11, 2011


ARDUINO + WIICHUCK

Lately I have been playing around with a WiiChuck connected to an Arduino. I bought an adapter to connect the WiiChuck to the Arduino and downloaded the source code, by Tim Hirzel, from the Arduino playground.


I compiled the example and uploaded it to the arduino, but there was no response from the Wiichuck. The example was supposed to print all the data coming from the Wiichuck on the serial port, but it was not working.

I found the solution in this forum: you need to define the "Power" and "Gnd" pins for the WiiChuck in order to power it up. So here is the modified WiiChuck class:

/*
 * Nunchuck -- Use a Wii Nunchuck
 * Tim Hirzel http://www.growdown.com
 * 
 notes on Wii Nunchuck behavior:
 This library provides an improved derivation of rotation angles from the nunchuck accelerometer data.
 The biggest difference from existing libraries (that I know of) is the full 360 degrees of roll data
 from the combination of the x and z axis accelerometer data using the math library function atan2. 

 It is accurate over 360 degrees of roll (rotation around the axis coming out of the C button, the front of the wii),
 and about 180 degrees of pitch (rotation about the axis coming out of the side of the wii).  (read more below)

 In terms of mapping the wii position to angles, it's important to note that while the Nunchuck
 senses pitch and roll, it does not sense yaw, the compass direction.  This creates an important
 disparity where the nunchuck only works within one hemisphere.  As a result, when the pitch values are 
 less than about 10, or greater than about 170, the roll data gets very unstable.  Essentially, the roll
 data flips over 180 degrees very quickly.   To understand this property better, rotate the wii around the
 axis of the joystick.  You will see the sensor data stays constant (with noise).  Because of this, it can't know
 the difference between arriving upside down via 180 degrees of roll or 180 degrees of pitch.  It just assumes it is
 always 180 roll.


 * 
 * This file is an adaptation of the code by these authors:
 * Tod E. Kurt, http://todbot.com/blog/
 *
 * The Wii Nunchuck reading code is taken from Windmeadow Labs
 * http://www.windmeadow.com/node/42


 * Modified by Martin Peris, http://blog.martinperis.com to declare which are the power pins
 * for the wiichuck, otherwise it will not be powered up
 */

#ifndef WiiChuck_h
#define WiiChuck_h

#include "WProgram.h"
#include <Wire.h>
#include <math.h>


// these may need to be adjusted for each nunchuck for calibration
#define ZEROX 510  
#define ZEROY 490
#define ZEROZ 460
#define RADIUS 210  // probably pretty universal

#define DEFAULT_ZERO_JOY_X 124
#define DEFAULT_ZERO_JOY_Y 132

//Set the power pins for the wiichuck, otherwise it will not be powered up
#define pwrpin PORTC3
#define gndpin PORTC2


class WiiChuck {
    private:
        byte cnt;
        uint8_t status[6];              // array to store wiichuck output
        byte averageCounter;
        //int accelArray[3][AVERAGE_N];  // X,Y,Z
        int i;
        int total;
        uint8_t zeroJoyX;   // these are about where mine are
        uint8_t zeroJoyY; // use calibrateJoy when the stick is at zero to correct
        int lastJoyX;
        int lastJoyY;
        int angles[3];

        boolean lastZ, lastC;


    public:

        byte joyX;
        byte joyY;
        boolean buttonZ;
        boolean buttonC;
        void begin()
        {
            //Set power pins
            DDRC |= _BV(pwrpin) | _BV(gndpin);

            PORTC &=~ _BV(gndpin);

            PORTC |=  _BV(pwrpin);

            delay(100);  // wait for things to stabilize   


            //send initialization handshake
            Wire.begin();
            cnt = 0;
            averageCounter = 0;
            Wire.beginTransmission (0x52);      // transmit to device 0x52
            Wire.send (0x40);           // sends memory address
            Wire.send (0x00);           // sends memory address
            Wire.endTransmission ();    // stop transmitting
            update();
            for (i = 0; i<3;i++) {
                angles[i] = 0;
            }
            zeroJoyX = DEFAULT_ZERO_JOY_X;
            zeroJoyY = DEFAULT_ZERO_JOY_Y;
        }


        void calibrateJoy() {
            zeroJoyX = joyX;
            zeroJoyY = joyY;
        }

        void update() {

            Wire.requestFrom (0x52, 6); // request data from nunchuck
            while (Wire.available ()) {
                // receive byte as an integer
                status[cnt] = _nunchuk_decode_byte (Wire.receive()); //
                cnt++;
            }
            if (cnt > 5) {
                lastZ = buttonZ;
                lastC = buttonC;
                lastJoyX = readJoyX();
                lastJoyY = readJoyY();
                //averageCounter ++;
                //if (averageCounter >= AVERAGE_N)
                //    averageCounter = 0;

                cnt = 0;
                joyX = (status[0]);
                joyY = (status[1]);
                for (i = 0; i < 3; i++)
                    //accelArray[i][averageCounter] = ((int)status[i+2] << 2) + ((status[5] & (B00000011 << ((i+1)*2) ) >> ((i+1)*2))); 
                    angles[i] = (status[i+2] << 2) + ((status[5] & (B00000011 << ((i+1)*2))) >> ((i+1)*2));

                //accelYArray[averageCounter] = ((int)status[3] << 2) + ((status[5] & B00110000) >> 4); 
                //accelZArray[averageCounter] = ((int)status[4] << 2) + ((status[5] & B11000000) >> 6); 

                buttonZ = !( status[5] & B00000001);
                buttonC = !((status[5] & B00000010) >> 1);
                _send_zero(); // send the request for next bytes

            }
        }


    // UNCOMMENT FOR DEBUGGING
    //byte * getStatus() {
    //    return status;
    //}

    float readAccelX() {
       // total = 0; // accelArray[xyz][averageCounter] * FAST_WEIGHT;
        return (float)angles[0] - ZEROX;
    }
    float readAccelY() {
        // total = 0; // accelArray[xyz][averageCounter] * FAST_WEIGHT;
        return (float)angles[1] - ZEROY;
    }
    float readAccelZ() {
        // total = 0; // accelArray[xyz][averageCounter] * FAST_WEIGHT;
        return (float)angles[2] - ZEROZ;
    }

    boolean zPressed() {
        return (buttonZ && ! lastZ);
    }
    boolean cPressed() {
        return (buttonC && ! lastC);
    }

    // for using the joystick like a directional button
    boolean rightJoy(int thresh=60) {
        return (readJoyX() > thresh and lastJoyX <= thresh);
    }

    // for using the joystick like a directional button
    boolean leftJoy(int thresh=60) {
        return (readJoyX() < -thresh and lastJoyX >= -thresh);
    }


    int readJoyX() {
        return (int) joyX - zeroJoyX;
    }

    int readJoyY() {
        return (int)joyY - zeroJoyY;
    }


    // R, the radius, generally hovers around 210 (at least it does with mine)
   // int R() {
   //     return sqrt(readAccelX() * readAccelX() +readAccelY() * readAccelY() + readAccelZ() * readAccelZ());  
   // }


    // returns roll degrees
    int readRoll() {
        return (int)(atan2(readAccelX(),readAccelZ())/ M_PI * 180.0);
    }

    // returns pitch in degrees
    int readPitch() {
        return (int) (acos(readAccelY()/RADIUS)/ M_PI * 180.0);  // optionally swap 'RADIUS' for 'R()'
    }

    private:
        byte _nunchuk_decode_byte (byte x)
        {
            x = (x ^ 0x17) + 0x17;
            return x;
        }

        void _send_zero()
        {
            Wire.beginTransmission (0x52);      // transmit to device 0x52
            Wire.send (0x00);           // sends one byte
            Wire.endTransmission ();    // stop transmitting
        }

};


#endif


Works like a charm for me :)
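If you are curious about what `_nunchuk_decode_byte` and the bit shuffling in `update()` actually do: the nunchuck obfuscates every byte it sends with a fixed XOR-and-add step, and the two least significant bits of each 10-bit accelerometer reading are packed into byte 5 of the status array. Here is a standalone sketch of both steps (plain C++, function names and sample values are mine, just for illustration):

```cpp
#include <cstdint>

// Undo the nunchuck's byte obfuscation: XOR with 0x17, then add 0x17.
uint8_t nunchukDecodeByte(uint8_t x) {
    return (x ^ 0x17) + 0x17;
}

// Rebuild the 10-bit accelerometer reading for axis i (0=X, 1=Y, 2=Z):
// the 8 high bits live in status[i+2], and the 2 low bits are packed
// into bits (i+1)*2 and (i+1)*2+1 of status[5].
int accelAxis(const uint8_t status[6], int i) {
    int shift = (i + 1) * 2;
    return (status[i + 2] << 2) | ((status[5] >> shift) & 0x03);
}
```

So a raw high byte of 128 with low bits 11 packed into byte 5 yields a reading of 515, which is then zeroed against the calibration constants (ZEROX, ZEROY, ZEROZ) before the atan2/acos math turns it into roll and pitch.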

Wednesday, March 2, 2011


ARDUINO NIGHT RIDER

Today I would like to share an idea that I had some time ago. Since I came to Tsukuba I've met a lot of nice people, one of them is my good friend Rob Howland. You see, he is a skater. And he loves going around on his skateboard, sometimes even late at night.

The problem is that the area where we live is kind of dark at night and sometimes you can't even see the road. Not to mention the potential danger of cars not being able to see you.


So, that is why I designed the prototype board for the "Arduino Night Rider":



It will have 30 LEDs, 24 of which will be blue; they will be placed around the skateboard and used to make cool effects. There will be a white one at the front (I sketched it as a single white LED, but it will be a group of white LEDs so it can light up the way in front of you); this LED will always be on.

There will be a red LED at the back that will continuously blink (like in F1 cars) so you will be easily noticed from behind. There will be 4 yellow LEDs, one on each corner of the skateboard, used as "direction lights".

3 tilt sensors will be placed along the skateboard. Two of them will sense which direction you are turning (left or right): if you turn left, the 2 yellow LEDs on the left side will blink, and the same goes for the right side when you turn right.

The last tilt sensor will sense when you lift up the skateboard and trigger some cool stuff with the blue LEDs.

The electronics are quite simple and programming the Arduino would take me a couple of afternoons, but I think the most difficult part will be making it "look professional".
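The turn-signal part of that program really is simple. A hypothetical sketch of the decision logic (the function and its encoding are mine, just to illustrate the idea; the real sketch would call it from `loop()` and drive the yellow LED pins accordingly):

```cpp
// Hypothetical turn-signal logic for the "Arduino Night Rider": given the
// states of the two direction tilt sensors, decide which side's yellow
// LEDs should blink. Returns -1 for left, +1 for right, 0 for straight.
int turnSignal(bool tiltLeft, bool tiltRight) {
    if (tiltLeft && !tiltRight)  return -1;  // blink the 2 left yellow LEDs
    if (tiltRight && !tiltLeft)  return +1;  // blink the 2 right yellow LEDs
    return 0;                                // straight (or ambiguous): no blink
}
```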

I don't know if we will have the time, but it would be very cool if we finally do this :) 

Sunday, February 6, 2011


MRDS: HOW TO ADD YOUR CUSTOM SIMULATED OBJECTS (AND DON'T DIE ON THE ATTEMPT)

As you might know, the simulator integrated in Microsoft Robotics Developer Studio can import 3D objects into your simulated world. The most common format used to import those objects is ".obj", a simple format that most 3D design programs can export to.

For the project I am working on right now, I needed to insert a realistic 3D model of a human face into the simulation. The model of the 3D face was stored in 2 different sets of files:

Head 1.obj
Head 1.mtl
Head 1.bmp

Eyeballs 1.obj
Eyeballs 1.mtl
Eyeballs 1.bmp

The first set of files corresponds to the model of the head and the second set corresponds to the model of the eyes. The .obj files define the geometry of the objects, the .mtl define the properties of the materials of the objects and the .bmp is the texture.

So, first let's try to insert the head. The source code would look like this; add it to the definition of the simulated world:

//Insert the head 
SingleShapeEntity head =
 new SingleShapeEntity(
  new SphereShape(
   new SphereShapeProperties(
    0,
    new Pose(),
    0.1f)),
  new Vector3(0.0f, 0.5f, -2f));
head.State.Assets.Mesh = "Head 1.obj";
head.SphereShape.SphereState.Material = new MaterialProperties("sphereMaterial", 0.5f, 0.4f, 0.5f);
head.State.Name = "Head";

SimulationEngine.GlobalInstancePort.Insert(head);
 


Ok, now compile and execute the simulation:


Uhmmm... WHERE IS MY HEAD? OK, don't panic! Let's try to find out if it is out there. Select "edit mode" in the simulator window:


Well, it looks like the head was inserted into the simulation, but for some esoteric reason we are not able to see it. Let's look around and try to find it. Use the mouse to rotate the camera in the simulated environment.


Did you see that?


There are some vertices there... wait a minute! Let's move the camera away about 20 meters.


There it is!! But it is a huge head!! No problem... just resize it. To do that, add the proper line of code before inserting the object into the simulation engine:

head.MeshScale = new Vector3(0.01f, 0.01f, 0.01f);



Great, but where is the texture? Hmmm, that's odd. Why is it not showing the texture?

If you open the file "Head 1.obj" you will see something like this:

# 3ds Max Wavefront OBJ Exporter v0.97b - (c)2007 guruware
# File Created: 30.01.2011 20:28:48

mtllib Head 1.mtl

...

In the .obj file there is a reference to "Head 1.mtl" (I'd bet the spaces in the file names freak out MRDS), so let's open that file in a text editor:

# 3ds Max Wavefront OBJ Exporter v0.97b - (c)2007 guruware
# File Created: 30.01.2011 20:28:48

newmtl defaultMat
 Ns 30.0000
 Ni 1.5000
 d 1.0000
 Tr 0.0000
 Tf 1.0000 1.0000 1.0000 
 illum 2
 Ka 0.5500 0.5500 0.5500
 Kd 0.5500 0.5500 0.5500
 Ks 0.0000 0.0000 0.0000
 Ke 0.0000 0.0000 0.0000
 map_Ka C:\Users\sama-sama\Desktop\Head 1\Head 1.bmp
 map_Kd C:\Users\sama-sama\Desktop\Head 1\Head 1.bmp

Hmmm, interesting: the file "Head 1.mtl" has an absolute path reference to the file "Head 1.bmp" (which, by the way, doesn't even exist on my computer; it is a path on the computer of the designer that modeled the head. Thanks a lot, 3ds Max Studio). Let's try something: remove the spaces from the file names and get rid of the absolute paths.
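The fix is pure string surgery: keep only the file name after the last path separator and drop the spaces. A quick sketch of that transformation (plain C++; `fixTexturePath` is my own helper name, not an MRDS API, and in practice you would just edit the .mtl by hand as shown below):

```cpp
#include <string>

// Turn an absolute texture reference like
//   "C:\Users\someone\Desktop\Head 1\Head 1.bmp"
// into a bare, space-free file name like "Head1.bmp".
std::string fixTexturePath(const std::string& path) {
    // keep only the part after the last slash or backslash
    std::size_t pos = path.find_last_of("/\\");
    std::string name = (pos == std::string::npos) ? path : path.substr(pos + 1);
    // drop the spaces that confuse the .obj/.mtl loading
    std::string out;
    for (char c : name)
        if (c != ' ') out += c;
    return out;
}
```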

Now the files should be named as follows:

Head1.obj
Head1.mtl
Head1.bmp

Eyeballs1.obj
Eyeballs1.mtl
Eyeballs1.bmp

Now the content of Head1.obj should look like:

# 3ds Max Wavefront OBJ Exporter v0.97b - (c)2007 guruware
# File Created: 30.01.2011 20:28:48

mtllib Head1.mtl

...

And the content of Head1.mtl should look like this:

# 3ds Max Wavefront OBJ Exporter v0.97b - (c)2007 guruware
# File Created: 30.01.2011 20:28:48

newmtl defaultMat
 Ns 30.0000
 Ni 1.5000
 d 1.0000
 Tr 0.0000
 Tf 1.0000 1.0000 1.0000 
 illum 2
 Ka 0.5500 0.5500 0.5500
 Kd 0.5500 0.5500 0.5500
 Ks 0.0000 0.0000 0.0000
 Ke 0.0000 0.0000 0.0000
 map_Ka Head1.bmp
 map_Kd Head1.bmp

Save the changes to all the files and run again:


Yeah! Much better!! Now we only have to make the same changes in the files corresponding to the eyes and insert them into the simulation. The final source code would be:

//Insert the head 
SingleShapeEntity head =
 new SingleShapeEntity(
  new SphereShape(
   new SphereShapeProperties(
    0,
    new Pose(),
    0.1f)),
  new Vector3(0.0f, 0.5f, -2f));
head.State.Assets.Mesh = "Head1.obj";
head.SphereShape.SphereState.Material = new MaterialProperties("sphereMaterial", 0.5f, 0.4f, 0.5f);
head.State.Name = "Head";
head.MeshScale = new Vector3(0.01f, 0.01f, 0.01f);
SimulationEngine.GlobalInstancePort.Insert(head);
 
SingleShapeEntity eyeballs =
 new SingleShapeEntity(
  new SphereShape(
   new SphereShapeProperties(
    0,
    new Pose(),
    0.01f)),
  new Vector3(0.0f, 0.5f, -2f));
eyeballs.State.Assets.Mesh = "Eyeballs1.obj";
eyeballs.SphereShape.SphereState.Material = new MaterialProperties("sphereMaterial", 0.5f, 0.4f, 0.5f);
eyeballs.State.Name = "Eyeballs";
eyeballs.MeshScale = new Vector3(0.01f, 0.01f, 0.01f);
SimulationEngine.GlobalInstancePort.Insert(eyeballs);


Let me introduce you to Mary:




I would especially like to thank Sama-Sama Studio for providing several realistic head models for the experiments of my project.