Thursday, December 30, 2010

Full amBX SPOT (lights, fan and rumble pad)

Today I had some time to finish the SunSPOT amBX controller. I modified the controls of the fan and the rumble pad so that they can be toggled on and off by pressing a button once. The lights are controlled by the tilt of the SPOT's accelerometer: the x-axis controls the intensity of red, the y-axis the green spectrum, and the z-axis the blue spectrum.
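
The axis-to-color mapping can be sketched as follows. This is a minimal illustration of the idea, not code from my wrapper: `tiltToIntensity` and `toRgb` are my own helper names, and the actual SunSPOT accelerometer and amBX light calls are omitted.

```java
// Sketch of the tilt-to-color mapping: each accelerometer axis reading
// (roughly -1.0 .. +1.0 g when tilting the SPOT) becomes one 8-bit
// color channel. Helper names are my own, not part of either SDK.
public class TiltColorMapper {

    /** Maps a tilt reading in g to an 8-bit color intensity (0-255). */
    public static int tiltToIntensity(double g) {
        // Clamp to the expected range, then shift/scale to 0..255.
        double clamped = Math.max(-1.0, Math.min(1.0, g));
        return (int) Math.round((clamped + 1.0) / 2.0 * 255.0);
    }

    /** Packs the three axis readings into one 0xRRGGBB color value. */
    public static int toRgb(double ax, double ay, double az) {
        return (tiltToIntensity(ax) << 16)   // x-axis -> red
             | (tiltToIntensity(ay) << 8)    // y-axis -> green
             |  tiltToIntensity(az);         // z-axis -> blue
    }
}
```

Holding the SPOT flat gives a mid-gray; tilting it fully along one axis drives that channel to 0 or 255.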

Tuesday, December 28, 2010

amBX SPOT (fan and rumble pad)

A year ago, I wrote a Java wrapper library for my amBX gaming system. This gaming system allows games to give the user visual and haptic/force feedback: fans blow when you drive a car, a rumble pad vibrates when you get shot, and brilliant lights change to the main color of your current screen to set an ambient mood :).

For a long time there weren't any open SDKs; only licensed game developers could benefit from the system. After a while an open SDK written in C was released. Since I work mostly, and preferably, with Java, I decided to write a wrapper to control the system from my own applications. I used JNA, which has some advantages over JNI: for example, you don't need to write C code stubs and header files.

I refactored some of that code and connected it with my SunSPOTs. Now I can control the fans and the rumble pad with one of my SPOTs. If I press and hold a button on the SPOT, a request is transmitted to the SunSPOT basestation, which triggers the fan or the pad. When I release the button, a request is transmitted to turn the amBX system off. If I have time over the next few days, I will try to control the lights with the accelerometer of the SPOT.

Friday, December 24, 2010

A SPOTlight for Christmas

A while ago I modified the tip of our Christmas tree with one of my SunSPOTs. As you might recall, it had a program running which indicated whether it was Christmas yet. On that particular day it wasn't; the SPOT responded with a red warning light. Let's check again today to see if something has changed. (In Germany, Christmas is celebrated on the 24th of December.)

Wednesday, December 22, 2010

ProximitySPOT / Sniffing Crawly

Yesterday I wanted to implement a proximity checker for my SunSPOT devices, using the radio to determine the distance between two SPOTs. Well, as happens from time to time, halfway through your implementation you stumble across some source code or library and realize that someone has already implemented it. In this case it was already on my hard drive, because the SunSPOT SDK ships with demos, one of which shows the signal strength of a SPOT. Since there was already some nice code and an easy method to get the signal strength, I decided to modify my LEGO bot. His name is Crawly, as the title already implies. Once he had a name, it was even harder to take his head off. To make things right with Crawly, I gave him something close to a sense of smell: now he can move towards a signal source and stop once he is close enough.

This can be done in the following way: one SPOT continuously broadcasts packets of data, while another receives them and checks the signal strength of each packet. As the distance between the two SPOTs shrinks, the signal strength rises.

The SPOT mounted on Crawly broadcasts continuously. The basestation you can see lying on the ground receives the signal and evaluates its strength, which is then compared with the previously received value. If the new signal strength is higher than the last one, Crawly is on the right track and a command to move the servos forward is sent to him via Bluetooth. If the signal strength drops, a command to turn left is sent instead.
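
The decision step boils down to a single comparison. Here is a minimal sketch of that logic; the RSSI value comes from the SunSPOT demo code in the real setup, and the command names are my own placeholders:

```java
// Sketch of the steering decision: compare the latest signal strength
// with the previous one. Stronger means we are getting closer, so keep
// going; weaker means turn left and try again.
public class CrawlySteering {
    public static final String FORWARD = "FORWARD";
    public static final String LEFT = "LEFT";

    // Start below any real reading so the first decision is FORWARD.
    private int lastRssi = Integer.MIN_VALUE;

    /** Returns the command to send to Crawly for the latest reading. */
    public String decide(int rssi) {
        String command = (rssi > lastRssi) ? FORWARD : LEFT;
        lastRssi = rssi;
        return command;
    }
}
```

Because a weaker reading always triggers a left turn, the bot corrects its heading in one direction only, which is exactly why it circles towards the source.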

As you can see, this is not a great pathfinding algorithm, but I'm not a mathematician and this was just a little proof-of-concept experiment. So Crawly finds his way to the basestation by circling towards it.

Monday, December 20, 2010

Binary SPOTClock

After I modified my ChristmasSPOT to display a longer light show on Christmas, I decided to develop another little LED gimmick with the SPOT. I built a binary SPOTClock. The video quality is really bad because the monitor and the LEDs kind of dazzled the camera. A SunSPOT has an array of eight LEDs, which is perfect for displaying an eight-bit number, i.e. the range 0-255. The array is read from right to left, from the least significant bit to the most significant. Blue symbolizes the current hour, green the minutes and red the seconds. I can change the display mode with the onboard switches.
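
The value-to-LED mapping can be sketched like this. It is just the bit logic; the actual SunSPOT LED calls are omitted, and `toLeds` is my own helper name:

```java
// Sketch of the binary clock mapping: an 8-bit value becomes eight
// on/off LED states. Index 0 is the rightmost LED, i.e. the least
// significant bit, matching the right-to-left reading order above.
public class BinaryClockMapper {

    public static boolean[] toLeds(int value) {
        if (value < 0 || value > 255)
            throw new IllegalArgumentException("needs an 8-bit value (0-255)");
        boolean[] leds = new boolean[8];
        for (int bit = 0; bit < 8; bit++) {
            // Test each bit; leds[0] corresponds to the rightmost LED.
            leds[bit] = ((value >> bit) & 1) == 1;
        }
        return leds;
    }
}
```

For 23 (binary 00010111) the four LEDs for bits 0, 1, 2 and 4 light up; the same mapping is applied once per color channel for hours, minutes and seconds.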

Wednesday, December 15, 2010

SunSPOTs are thrilled for Christmas

Since I had abandoned my SunSPOTs for a while, I thought I'd use my free time to dust them off and let them take part in the Christmas fuss. Despite my girlfriend shaking her head, I temporarily hung one of the SPOTs at the top of our Christmas tree. After the SPOT has started, I can press one of the switches and the SPOT checks whether it is Christmas yet. Sadly, today's response was negative. It doesn't look fancy and is very simple; you can even hear the crickets chirp :). As puzzled as my girlfriend is right now, she will like the animation once the SPOT announces that it is indeed Christmas. We'll see :).
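
The check itself is a one-liner on the date. Here is a sketch of it, assuming the German date of December 24th; the real program drives the LEDs instead of returning a boolean, and `isChristmas` is my own helper name:

```java
import java.util.Calendar;

// Sketch of the ChristmasSPOT's date check: is it December 24th?
// (Christmas is celebrated on the 24th in Germany.)
public class ChristmasCheck {

    public static boolean isChristmas(Calendar now) {
        return now.get(Calendar.MONTH) == Calendar.DECEMBER
            && now.get(Calendar.DAY_OF_MONTH) == 24;
    }
}
```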

Sunday, December 12, 2010

Camera Bot

Some days have passed now, and I spent some time on the video streaming. Well, more like "image streaming". As it turns out, Android is not streaming-friendly yet. With the release of 2.3 they already support audio streaming to the device; streaming from the device to a server, however, is a different story. The typical supported video containers are 3gp and mp4, and there is no actual streaming container implemented. The problem with both of these containers is that the header and metadata are written to the file only after the recording is done. So I could stream the raw data to my server, but without the additional information it's worth nothing. There are some workarounds, like implementing a custom video container of your own, but that's too much effort for my timeframe. Another workaround would be to save the video to a small file, transmit it to the server for processing, and repeat the whole thing in a loop, but the latency would be too high to be useful.

These platform restrictions and workarounds are described in a diploma thesis I found while researching. There are app providers promoting video conferencing apps, but as the platform doesn't support streaming natively, my guess is that they had to implement it as a workaround too.

I used the approach of transmitting single snapshot images from the camera, which I can stream over a socket connection. On the server side I extended my JSF application with components from ICEfaces and PrimeFaces: the DynaImage from PrimeFaces, which can handle streamed content, and the ICEfaces Ajax Push mechanism to update the image periodically.

The server and client applications still have to be tweaked to avoid crashes and to provide faster image updates, but here is a first impression of the result. Yeah, I know the tab is oversized for the robot, but it's my only Android device :).

Sunday, December 5, 2010

Web Bot

Legen...wait for it....daaary. I already set up my web application to control the bot. I still need to make some adjustments, like sending Ajax requests instead of reloading the whole page, but the communication between the browser and the bot is already working. What I did was deploy a JSF webapp on my local JBoss server, which integrates yesterday's SocketConnector as a managed bean. So at the press of each button I send a message via socket to my Android device, which sends a command via Bluetooth to the LEGO bot. This may sound like overhead, but I want to take advantage of the longer range of my WiFi network to control the bot throughout the whole house. If my server controlled the bot directly via Bluetooth, the connection would break after a few meters. In the example video you can see that I control the bot simultaneously by browser and by Android device. I will need the next days for some refactoring and for the next big task of streaming video from the device to the server and displaying it to the user. I'm pretty excited...

Saturday, December 4, 2010

Console Bot

I managed to set up the socket communication pretty quickly. The Android device serves as the host, so I had to implement the ServerSocket part there. A simple console application acts as the socket client. I integrated the whole socket communication part into yesterday's remote app, so now the bot can be controlled via the onscreen buttons of the Android device as well as via a simple console application. Some basic groundwork is done. The next step is to set up the server, design some simple Ajax direction buttons and integrate the SocketConnector I wrote for the console application. I think the hardest part later on will be handling the video stream. If I keep moving this quickly, I might just get the whole thing done by the end of December.
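
The host/client split can be sketched with plain `java.net` classes. This is a loopback illustration of the protocol, not my actual code: the newline-terminated command format and the method names are my own assumptions.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

// Loopback sketch of the socket communication: the Android device plays
// the ServerSocket host, the console app is the client. One command per
// connection, terminated by a newline.
public class SocketConnectorDemo {

    /** Reads a single newline-terminated command, as the host would. */
    public static String receiveCommand(ServerSocket server) throws IOException {
        try (Socket client = server.accept();
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(client.getInputStream()))) {
            return in.readLine();
        }
    }

    /** Sends one command, as the console client would. */
    public static void sendCommand(String host, int port, String command)
            throws IOException {
        try (Socket socket = new Socket(host, port);
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true)) {
            out.println(command);
        }
    }
}
```

On the real device the host side would loop on `accept()` and translate each received command into a Bluetooth message for the bot.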

I apologize for the bad video quality, but I wanted to keep the files small as I don't have unlimited storage space on this blog. I hope you can still see what I described.

Friday, December 3, 2010

My BotRemote

Today I implemented my own version of a LEGO NXT remote. I had a look at the MINDroid code and used its Bluetooth communication classes for my own remote implementation. So now the bot is no longer steered by tilting the device but by pressing simple direction buttons. This might sound inconvenient, but remember that the ultimate goal is to steer the bot via a browser; keys or buttons are a better fit for that scenario. As this worked out pretty well so far, the next step will be to set up the local server and provide a web interface to steer the bot.

ButtonRemote in action:

Thursday, December 2, 2010

Android Remote Bot

I just downloaded the MINDroid app for Android to see how the crawling bot can be controlled via Bluetooth. It works pretty nicely so far, so the whole Bluetooth communication shouldn't be that hard to implement in my own app later on. However, I want to control the robot via a browser, so the Android device's tilt control is not of interest right now. Over the next couple of days I need to implement my own version of a controlling app. Luckily, the MINDroid app is open source and available on GitHub.

Here is a small demo of the remote controlled crawler.

Wednesday, December 1, 2010

Child's play: Or how a grown man plays with LEGO

Since I started my professional education by building a LEGO robot and learning the fundamentals of programming, I thought: let's reflect on what we have learned over the last three years and build a robot again. But this time it should do a heck of a lot more.
The ultimate goal here is to build a moving robot which processes its connected sensors' data and sends it to a mobile device via Bluetooth. The device should be an Android-powered phone which provides additional sensor data but, more importantly, streams the current video camera data to a webserver via Wi-Fi. By using Wi-Fi and socket communication I hope to keep the latency to a minimum. If everything works out fine, I should be able to control the robot via a browser.
Let's see how we get there... :)

Here is a first prototype of a crawling bot which tries to avoid obstacles when they come into ultrasonic range. I don't know if I'll keep that design, because a bot on wheels would be a lot faster...mwuahahah