Archive for April, 2008

Unscientific GPS note

April 28, 2008

Last week I charged the various batteries and took a GTA01 Neo, a GTA02 Neo and a Nokia N810 with me, enabling their GPSes on my way home from school. Then I saved the traces they logged and loaded them into JOSM to have a look (GTA01, GTA02, N810 – GPX files converted from NMEA using gpsbabel).

The devices logged routes of 11.28 km, 12.12 km and 11.07 km respectively (while sitting in the same bag the whole time).

All in all I like the GTA01's accuracy the most, although all three sometimes make horrible errors. All three have accuracy near the bottom line of usability for OSM mapping of a city, so if you get one of these with that in mind, it may be slightly disappointing. All three are quite good at keeping the fix while indoors, but every time there's not enough real input available they will invent their own rather than admit it (if you had physics experiments at high school and had to prove theories that way, you know how this works), resulting in run-offs into alternative realities – the N810 especially likes to make virtual trips. All three apparently do advanced extrapolation and most of the time get things right, but the GTA01 GPS (the Hammerhead) very notably assumes in all its calculations that the vehicle in which you move has a certain inertia, and treats tight turns as errors. I'm on a bike most of the time and can turn very quickly, and it feels as if the firmware was made for a car (SpeedEvil thinks rather a supertanker).

It's surprising how well all three can determine the direction in which they're pointing even when not moving (the GTAs more so). The firmwares seem to rely on that more than on the actual position data sometimes. This results in a funny effect: the errors they make are very consistent even when very big – once the GPS thinks it's on the other side of a river from you (or worse, in the middle), it will stay there as long as you keep going along the river.

I’m curious to see what improvement the Galileo system brings over GPS.

UPDATE: I was curious about the precision with which the altitude is reported, which can’t be seen in JOSM.  First I found that the $GPGGA sentences from my GTA01 always have 000.0 in the elevation field, but the field before it (normally containing HDOP) has a value that kind of makes sense as an altitude, so I swapped the two fields (the HDOP value should be < 20.0, I believe?).  Then I loaded the data into gnuplot to generate this chart:

The horizontal axis is longitude and the vertical axis is elevation in metres above mean sea level.  Err, sure?  I might have screwed something up, but I checked everything twice.  The exception is the GTA01, which might be reporting a completely different value – but there is some correlation.  I’m not sure which one to trust now.
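The field swap itself is a one-liner.  A sketch with awk (the sample sentence and its values are made up for illustration – in a real $GPGGA sentence field 9 is HDOP and field 10 is the altitude; on the GTA01 they appear transposed).  The NMEA checksum is left untouched, which gnuplot doesn’t care about:

```shell
# Swap fields 9 (HDOP slot) and 10 (altitude slot) in every $GPGGA sentence;
# all other sentences pass through unchanged.
printf '$GPGGA,123519,4807.038,N,01131.000,E,1,08,545.4,000.0,M,46.9,M,,*47\n' |
awk -F, 'BEGIN{OFS=","} /^\$GPGGA/{t=$9; $9=$10; $10=t} {print}'
```

Running the trace through this before gpsbabel/gnuplot puts the plausible altitude value where the tools expect to find it.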


Panoramic photos revisited

April 21, 2008

Every so often there’s a place that would look great in a panoramic picture, but I only have a plain tourist camera with me, so I take a series of photos in all directions from one point with a plan to later glue them together into something real. Then I either forget about it completely, or sometimes I load Gimp and glue the pictures together. This can take an hour or two and gets boring, but the result is often OK (digital example, analog camera example). This time I decided to try to get my PC to do the gluing for me, using one of the tools that appeared in Gentoo’s portage tree. This didn’t go faster than doing it manually (perhaps five times longer), but now that I know about all the quirks it may actually go faster next time, and the result is comparable with manual stitching. So let me write down the things I wish I had known when I started. The package that is now in most distros’ repositories, and that I used, is libpano12, together with the various tools associated with it.

The first part is selecting the individual pictures and telling the computer how they are oriented in relation to each other so it knows how to glue them. Hugin (a GUI frontend for libpano) lets you select the shots and input the data necessary for libpano to make sense of the individual pictures, which is a list of common points on the overlapping parts of the photos. This part is quite intuitive. Remember to build the latest hugin and the latest wxGTK or hugin will segfault. Before loading pictures into hugin, rotate them in Gimp so they are at least more or less straight (if they aren’t). Have 2 GB of free disc space. Choose one of the rectangular projection types, because the fancy ones are not supported by the other tools we’ll need to use. The picking of common points can be done automatically by autopano-sift, but that pulls in mono and other heavy packages as dependencies, so I avoided it.

When you’re done placing the points and playing with all the other parameters, hugin will want to generate a script for the stitching program that does the heavy work. Hugin knows about two such programs: PTstitcher (part of panotools), which sucks because it’s closed-source and only works on one arch, and nona, which sucks because it doesn’t support most of the formats, projections, blending or auto-adjusting. So we choose “multiple TIFF” as the output format, set all the other parameters, save the project, tell hugin to write the script to a text file and quit hugin.

Now we have another program to do the stitching: PTmender, an open-source replacement for PTstitcher, also part of panotools. This one supports the formats we want and some automatic colour balance / exposure correction, but if you choose any of the plain bitmap output formats it won’t blend the pieces together nicely, because that’s not supported yet. If you instead want a format with the individual pieces on separate layers (XCF is not supported yet, so PSD) to do the merging in Gimp, PTmender will segfault in some doubly-linked-list code. So we choose the “multiple TIFF” output format and get TIFFs that are ready to merge, except they don’t have any transparency mask set. We could now use Gimp’s “alpha to selection” and “feather selection”, but there’s a program, enblend, that does just this merging, and does it really well (using multi-resolution splines). It only operates on TIFFs, and seems to get all the parameters right if you don’t give it any.
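Assuming hugin saved its stitcher script as pano.txt and the remapped layers come out as pano0000.tif, pano0001.tif, … (all filenames here are hypothetical), the non-GUI half of the workflow boils down to something like:

```shell
# Remap each source photo into the output projection using the script
# hugin generated ("multiple TIFF" output: one TIFF per source image).
PTmender pano.txt

# Blend the remapped TIFFs into one seamless panorama using
# multi-resolution splines; enblend's defaults are usually fine.
enblend -o panorama.tif pano0000.tif pano0001.tif pano0002.tif
```

Then the resulting panorama.tif is what you load into Gimp for the final rotate / crop / scale.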

You may need to repeat part of the process if you see something really bad in the enblend output; if not, you just need to load the result in Gimp and rotate / crop / scale the picture. To load TIFFs in Gimp you may need to rebuild it with USE="tiff" if you’re on Gentoo.

Trip

April 21, 2008

So I went to Brazil last month but had no time to put any pictures online; now I have uploaded them here. I also uploaded some pictures from a trip to Spain just before that.

Brazilian cities reminded me a lot of Peru, which was the only other place I had seen in the Americas (this comparison must seem awfully ignorant to anyone who lives somewhere between Brazil and Peru). We spent one week in the Ceará region seeking out the best places for paragliding. One of the spots was the launch pad near the Nossa Senhora Imaculada sanctuary near Quixadá, where the Brazilian “Sol” team took off last year and set the current world record for straight-distance paraglider flight, landing over 460 km away. (Obviously ours was a different season with incomparable weather conditions.)  I made an attempt to adapt my Neo1973 Linux phone to double as a variometer using the altitude data from the built-in GPS.  Impressively, the measurements are somewhere on the edge of being accurate enough for that purpose, but the time resolution is way too low (normal variometers use air-pressure changes rather than GPS).  The speaker is loud enough to emit the familiar beeping of a variometer (so good enough for showing off, even if inaccurate).  The GTA02 should be much better with its 3D accelerometers, but I haven’t had time to play with it yet.
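The core of the variometer idea is trivial: vertical speed is just the altitude delta over the time delta between GPS fixes. A sketch (the sample data is made up for illustration; this is not the actual code that ran on the Neo1973):

```shell
# Input: "seconds altitude_m" pairs, one per GPS fix.
# For each fix after the first, print the climb/sink rate in m/s.
printf '0 120.0\n1 121.5\n2 123.0\n3 122.0\n' |
awk 'NR>1 { printf "%.1f m/s\n", ($2-alt)/($1-t) } { t=$1; alt=$2 }'
```

With one fix per second this already shows why the time resolution is the problem: a pressure-based variometer reacts many times per second, while GPS altitude gives you one noisy sample per fix.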

The second week the group split up and I went to the Bossa ’08 conference, which was held in a fantastic setting and from which I brought home a collection of five geeky t-shirts.

I could’ve written a novel with all the keystrokes

April 18, 2008

In other words, if you printed all the vim commands I have typed in 15 pt font along Earth’s equator, you would get a very repetitive string printed on the equator line (which could in no way be seen from outer space).

$ history | awk '{a[$2]++ } END{for(i in a){print a[i] " " i}}' | sort -rn | head
2673 vim
1475 cd
694 screen
632 make
475 man
329 ls
261 grep
231 cg-diff
161 aoss
149 cg-patch

The numbers sum to > 1000 because I use HISTSIZE="10000" and HISTCONTROL="ignoredups" in my bashrc.

OMAP3 resources opened

April 9, 2008

Texas Instruments’ OMAP series of mobile CPUs has for some time had okay Linux support, with parts of the code coming from the community, parts from TI and parts from Nokia, one of the vendors. This month we are starting to see the results of TI’s recent efforts to improve this support by opening various technical resources that were previously available only to vendors. Yesterday the announcement of their DSP-bridge framework release under the GPL was posted to the linux-omap list, and as of this week you can download the entire TRMs (a 35 MB PDF each) for various OMAP3 CPUs from ti.com. Added to this are various kinds of manuals and example code, also covering the recently announced 35xx models.

I had an occasion to be at TI’s Rishi Bhattacharya’s talk at the Bossa Conference last month, with a sneak peek at the process of opening OMAP3-related resources that had been ongoing internally for some time. Apparently more releases are planned, including among other things some GPLed sources (and some freeware binaries) of DSP codecs for use on OMAP. This should also make life a fair bit easier. One of the interesting points was the evaluation board for the new processors, which looks a bit more like a final product than previously made evaluation boards. It’s called the Zoom MDK and is sold by a third party. It includes a modem, an optional battery and a neat case, so it can potentially be used as a (only slightly oversized by today’s standards) phone, and it comes equipped with a full Linux SDK. Another goal is to make it more affordable so that individual developers are not excluded (it’s currently only available through a beta programme, but the final price was said to be aiming at below $900). There’s an effort to have Openmoko running on the thing. I’m looking forward to that and to the rest of the releases from TI.

ZoomMDK external view