OpenStreetMap

amillar's Diary

Recent diary entries

Garmin Nuvi 200

Posted by amillar on 1 October 2010 in English.

I got another GPS for the car, a Garmin Nuvi 200. I picked it up cheap on eBay for $20 because the mini-USB port had come loose. I managed to re-solder the USB port onto the board.

The USB port doesn't work for data transfer, but it does charge the battery. I don't know if it was my soldering job, but I suspect the port was already dead; the chip may have shorted out when the connector was stressed and broken before I got it.

I haven't tried recording tracks, but it does display the maps from mkgmap, including routing, using the SD card. I like the rendering better than the StreetPilot; the lines are better defined.

JOSM Java 1.6 on Mac OSX 10.4 Tiger

Posted by amillar on 12 September 2010 in English.

The latest JOSM upgrade requires Java 1.6. My Intel iMac is running 10.4 Tiger, and Apple doesn't supply Java 1.6 for it. So I had to find another solution.

I found an alternative JVM called SoyLatte, and it works for me to run the latest JOSM.

I followed the instructions at

and it works.

The main surprise that I found was that it uses X11, so I have to run that first. I'm a Linux user, so it isn't a problem for me. I always ran JOSM from the command-line in the OSX Terminal window, so now I run it from the X11 xterm window instead.

I don't have a PPC Mac, but I did see that there is a version of SoyLatte for PPC listed on their site.

Walking Papers

Posted by amillar on 9 July 2010 in English.

I finally got around to trying out Walking Papers.

It is a great way to make edits and annotations to an existing area. In my case I had all the streets in the area, but I needed to mark some construction changes and turn restrictions. Walking Papers is perfect for this.

If you haven't tried it, you need to :-)

Location: Buckman, Portland, Multnomah County, Oregon, 97214, United States

Fixing I-84 Oregon

Posted by amillar on 7 June 2010 in English.

I'm fixing I-84 in central and eastern Oregon. I'm converting the single way to dual one-way ways. I'm also fixing the route relation.

Anyone working on US Interstates should see the wiki page

and especially the relation analyzer tool

I got data out of Wikipedia for about 3800 airports. I have entered about 2300 in OSM so far. For each one, I'm finding it in the Landsat photos, adding the aerodrome node, and adding the main runway(s). And, of course, eliminating any duplication. I've learned the name for airport in quite a few languages now.

I'm getting good at reading the false-color images in some Landsat areas. It looks like some multi-spectral combination is used in a lot of areas.

For some areas, the Landsat data contrasts better than the Yahoo visible-spectrum-only photos, even seeing through some light cloud cover occasionally. In fewer cases, the Yahoo photo has better contrast.

Yahoo doesn't have very good resolution for much outside the US, at least in the images available to JOSM through YWMS. Landsat is usually a little sharper. They aren't always aligned with each other, but for something as coarse as a runway in a sparsely-mapped area, either is good enough.

Only 1500 more airports to go....

Wikipedia data for airports

Posted by amillar on 13 May 2009 in English.

Wikipedia is changing to an OSM-compatible Creative Commons license. Although it isn't officially final yet, it is pretty clear that the licensing will be settled very soon. This means we will be able to use WP data in OSM.

We don't want to make OSM a POI dumping ground for every geocoded Wikipedia article. But some data makes sense to use as a source for permanent OSM map data. The Wikipedia infoboxes are moderately consistent codifications of data, and extracts of them are available. These can be used, with some pain, to create data to load into OSM.

The first thing I'm looking at is airports. They tend to be permanent landmarks, and have map-worthy data of both name labels and runways, which makes them good candidates for import.

Using this, I have been working on some scripts to turn the airport infobox data into OSM aerodrome nodes, and approximately-located runway ways. I create an aerodrome node with the designated lat/lon, and related information. I download surrounding existing OSM data, and merge the new node with any existing aerodrome node within 0.0100 degrees. I keep name variations as alternate ("name_1", "name_2") node names. I check for runways nearby, and if there aren't any, and the WP data has runway info, I create new runway ways. I use the length in meters, and the angle from the runway name (runway "3/21" = 30 degrees/210 degrees), and center it on the designated point.
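The runway-placement math above can be sketched briefly. This is my own reconstruction of the idea, not the actual script: the function name is made up, and it uses a flat-earth approximation around the center point, which is fine for a placeholder way that will be hand-aligned to photos anyway.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius; spherical approximation

def runway_endpoints(lat, lon, designator, length_m):
    """Approximate endpoints of a runway centered on (lat, lon).

    The heading comes from the runway designator: runway "3/21" means
    headings of 30 and 210 degrees (tens of degrees; magnetic in real
    life, treated as true here since the way gets hand-aligned later).
    """
    heading = math.radians(int(designator.split("/")[0]) * 10)
    half = length_m / 2.0
    # Flat-earth approximation around the center point: good enough
    # at runway scale for a placeholder way.
    dlat = math.degrees(half * math.cos(heading) / EARTH_RADIUS_M)
    dlon = math.degrees(half * math.sin(heading) /
                        (EARTH_RADIUS_M * math.cos(math.radians(lat))))
    return (lat + dlat, lon + dlon), (lat - dlat, lon - dlon)
```

For runway "3/21" the parsed heading is 30 degrees, so the two endpoints fall roughly north-northeast and south-southwest of the center.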

I do these five at a time and load the results into JOSM. I use the aerial photos (Landsat or Yahoo as available) to move each runway into the right place. The angles and locations are close, so it doesn't take long. It is annoying that the runways are often (always?) named for magnetic north rather than true north, but with the aerial photos it is easy enough to line them up correctly anyway.

I've fixed 200 or so airports so far. Only a few thousand to go.

I-395 near Pentagon

Posted by amillar on 6 April 2009 in English.

I did some motorway fixups on I-395 near the Pentagon and Arlington cemetery, based on aerial photos. There are definitely some complicated links there. I hope I got it right, since I can see there has been recent construction in the area. Some of it was clearly poor TIGER data; for the rest, it was hard to say. Somebody with local knowledge will have to sort out a lot of those links and lanes for directions and connections. What is that lane in the middle? I can't tell which way it goes. I wonder if it is a commuter lane that changes direction based on time of day... I'll leave that for someone who knows.

As a US user, I don't run into many accented characters. I'm usually blissfully ignorant of character encoding issues, happily using the same ASCII codes as I did 25 years ago on my Apple II.

In loading the USGS points of interest, especially for Puerto Rico, I am encountering various accented vowels and ñ (n~, hope I got it right here :-). The source data is encoded as iso-8859-1, while JOSM works in utf-8. I noticed this when I loaded my OSM file into JOSM, and got little squares instead of legible letters.

No problem, I thought: I would add an encoding declaration at the start of the OSM file saying it is iso-8859-1. But no variation of upper/lower case or with/without dashes made any difference; JOSM still showed little boxes for the accented characters. (This is JOSM version 1504.) I think JOSM only reads disk files as UTF-8 and ignores the XML encoding declared in the file.

Doing a hex dump on the file showed single-byte values for the special characters, so I was sure it was iso-8859-1 and not utf-8.

Plan B: make the file really UTF-8. In Vim, I loaded the file, then did

:set fileencoding=utf-8

and saved the file. That did the trick. Hexdump showed multi-byte characters, and JOSM showed them correctly on the screen. Problem solved.
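The same conversion the Vim command performs can also be scripted. A minimal sketch, assuming the input really is iso-8859-1 (the function name is mine):

```python
def recode_to_utf8(src_path, dst_path):
    """Re-encode an ISO-8859-1 text file as UTF-8.

    Equivalent to loading the file in Vim and running
    ":set fileencoding=utf-8" before saving.
    """
    with open(src_path, encoding="iso-8859-1") as f:
        text = f.read()
    with open(dst_path, "w", encoding="utf-8") as f:
        f.write(text)
```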

Merging POIs and fixing addresses

Posted by amillar on 25 March 2009 in English.

Before the recent GNIS mass load, I had created quite a few point of interest nodes for schools, churches, libraries, and fire stations in my area. Many of them, including all of the public schools, were duplicated with GNIS nodes. The JOSM node merge solved that easily where I found them. I used XAPI with wget to pull down an OSM file containing just the nodes of those types (search by amenity), which I could edit in JOSM and upload.
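An XAPI amenity query of this kind was built from a bracket-predicate URL. This sketch reconstructs the URL format from memory of the (now historical) XAPI service, so treat the host and syntax as an illustration rather than a stable API:

```python
def xapi_node_url(amenity, bbox):
    """Build an XAPI query URL for nodes carrying a given amenity tag.

    bbox is (left, bottom, right, top) in degrees.  The host and the
    bracket-predicate syntax are recalled from the old OSM XAPI
    service and may not match any live endpoint.
    """
    left, bottom, right, top = bbox
    return ("http://xapi.openstreetmap.org/api/0.6/"
            f"node[amenity={amenity}][bbox={left},{bottom},{right},{top}]")
```

The resulting URL can be handed straight to wget.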

When I first created many of them, I put the street address in a tag because I figured it would be useful later. I couldn't find any consensus on address tags when I started, so I just put the address as one long string in a "street_address" tag. Nobody else uses that tag, but it was at least readable. I have since found that the "addr:" namespace used in the Karlsruhe addressing scheme fits well for these POI nodes.

I wrote a Perl script to rewrite the tags. It parses the street_address tag, creates the number, street, city, state, and postal code tags, and removes the street_address tag. I had a few hundred of them, and this fixed them in short order. It probably took more time to write than fixing them by hand would have, but it was more fun :-)
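The original Perl script isn't shown, but the parsing step might look something like this Python sketch. The assumed address format, the regex, and the function name are my own; anything that doesn't match is left for hand editing:

```python
import re

# Assumed one-line format: "1234 NW Example St, Portland, OR 97210".
ADDRESS_RE = re.compile(
    r"^(?P<number>\d+)\s+(?P<street>[^,]+),\s*"
    r"(?P<city>[^,]+),\s*(?P<state>[A-Z]{2})\s+(?P<postcode>\d{5})$"
)

def split_street_address(value):
    """Turn a one-line street_address value into Karlsruhe-style addr:* tags."""
    m = ADDRESS_RE.match(value.strip())
    if not m:
        return None  # leave unparseable values for hand editing
    return {
        "addr:housenumber": m.group("number"),
        "addr:street": m.group("street"),
        "addr:city": m.group("city"),
        "addr:state": m.group("state"),
        "addr:postcode": m.group("postcode"),
    }
```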

TIGER artifacts

Posted by amillar on 16 August 2008 in English.

I just finished two trips, one from Portland to Seattle and one from Portland to Newport, OR on the coast. I recorded tracks from my GPS logger, and now I'm cleaning up the TIGER data for the covered areas.

Most of the trips were highway driving, which has given me some interesting insight into the TIGER data. The US Census Bureau documentation says that TIGER is a composite of multiple sources, and the highways show it.

US highway 20 in Oregon had some interesting artifacts. In Lincoln county, the way was a rough approximation of the road, with typical TIGER misalignment. As soon as it crossed into Benton county, the way was an exact match with the GPS track and the Yahoo aerial photos.

Interstate highway 5 in Washington state was interesting too. In Cowlitz county, I-5 was a single way with poor accuracy, but as soon as it crossed into Lewis county, it became dual ways with accurate placement. Based on the tiger:tlid tags, it looks like the dual ways were original Tiger data and not later OSM edits.

I expected that US highways and Interstate highways would be consistent for their whole length. But in retrospect, I imagine that not only is the data exported at the county level, much of the TIGER database was probably built by importing data at the county level too.

Location: Lewis County, Washington, United States

OSMXAPI has been dead or useless recently, apparently a victim of popularity overload.

I needed to grab a whole city in the U.S. to check for braided streets. I've been using OSMXAPI in the past because the main API only allows small areas. But recently OSMXAPI has been overloaded or unresponsive. Even worse, when I have gotten through, it has given me error messages saying that a new restriction mechanism is in place. It has rejected my requests even when they are ridiculously small (one was only 250 kB when fetched through the main API).

The osmxapi error message suggested using the planet file instead. This sounded bad, because I did not want to download 4GB just to get a few megabytes of city data. But I found a great set of extract files for the US at

It looks like they update them weekly. Thanks guys!

I am using Osmosis to pull out my desired city by bounding box. This looks like a good solution.
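Osmosis is the right tool for this, but the core idea of a bounding-box extract can be illustrated with a toy Python version. This is my own sketch, nodes only; a real extract also needs the ways and relations that reference those nodes, which Osmosis handles properly:

```python
import xml.etree.ElementTree as ET

def nodes_in_bbox(osm_xml, left, bottom, right, top):
    """Toy stand-in for an Osmosis bounding-box extract: return the
    ids of nodes that fall inside the box."""
    root = ET.fromstring(osm_xml)
    ids = []
    for node in root.iter("node"):
        lat = float(node.get("lat"))
        lon = float(node.get("lon"))
        if bottom <= lat <= top and left <= lon <= right:
            ids.append(node.get("id"))
    return ids
```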

TaH Lowzoom progress

Posted by amillar on 6 July 2008 in English.

As Frederik Ramm put it recently, "Look at our current map at zoom level 2 and try not to run away crying." I think I'm finally making progress.

I've built some shell script hacks to work my way through a long list of z8 tiles. It is going to take weeks at my current rate, but that's OK.

I take a swath of z8 tiles, hundreds at a time, and rebuild them. Then I rebuild the z4s they inhabit, and then finally z0. This takes several days per cycle.
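The bookkeeping behind that z8 → z4 → z0 cycle is ordinary slippy-map tile arithmetic: each step down in zoom halves the tile coordinates. A sketch, with helper names of my own choosing:

```python
def parent_tile(x, y, z, parent_z):
    """Tile at zoom parent_z that contains tile (x, y) at zoom z."""
    shift = z - parent_z
    return x >> shift, y >> shift, parent_z

def child_tiles(x, y, z, child_z):
    """All tiles at zoom child_z covered by tile (x, y) at zoom z."""
    n = 1 << (child_z - z)
    return [(x * n + dx, y * n + dy, child_z)
            for dx in range(n) for dy in range(n)]
```

So each z8 tile covers a 16×16 block of z12 tiles, and each z4 tile covers a 16×16 block of z8 tiles, which is why one full cycle takes days.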

I also built some scripts to find and clean up "unknown tile" artifacts. I'm better, or at least more comfortable, writing shell scripts than Perl, so my scripts tend to be slow and kludgy, but I think they are working.

The "unknown type" tile has a certain color in the text, so I do a quick check for it at z8 using ppmhist. If I find it, then I loop through the z12 tiles inside to see if any match the unknown-type tile. If I find one, I replace the image and then use the API to flag it for re-rendering. Replacing the image is a short-term fix, but it makes the map at least look a little better, a little sooner.

I'm checking both the "tile" layer and the "captionless" layer. If one is unknown and the other is not, I just copy one to the other. If both are unknown, I copy in the mapnik tile. In any case, the image will get replaced by a properly rendered tile soon.
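One way to script the matching step is a byte-for-byte comparison against a known copy of the unknown-tile image. This is a stand-in for the ppmhist color check described above, and the paths and directory layout are hypothetical:

```python
import hashlib
from pathlib import Path

def file_digest(path):
    """SHA-1 of a tile image, for exact-match comparison."""
    return hashlib.sha1(Path(path).read_bytes()).hexdigest()

def find_unknown_tiles(tile_dir, unknown_tile_path):
    """Return tiles that are byte-identical to the 'unknown tile' image.

    Exact hash matching only works if every unknown tile is the same
    file; the ppmhist color check above is more forgiving.
    """
    target = file_digest(unknown_tile_path)
    return [p for p in sorted(Path(tile_dir).glob("*.png"))
            if file_digest(p) == target]
```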

I think I can look at z2 without crying so much now :-)