
Diary Entries in English

Recent diary entries

From the Build Plan developed with Claude

Project Purpose: A web application enabling mappers and data quality analysts to select a geographic area, fetch Overture Maps POI data and OpenStreetMap data in parallel, algorithmically compare them, and produce two outputs: a reviewed, selective upload to OSM via OAuth2, and an annotated GeoJSON export classifying each Overture POI by assessment category.

No edits to OSM happen without manual review. Overture is more of an attention guide (at least that’s my hypothesis).

Here’s the chat

Lots of UX fine-tuning to come. Quick observations:

  • Overture is very noisy, with a lot of irrelevant data (business mailing addresses at private houses, mislocated POIs, closed places, and duplicates that differ across the Meta/Microsoft/4SQ sources)
  • Matching is hard but can be improved
  • There is useful signal in Overture, such as places in new developments that need adding to OSM, but you need a way to sort through it easily and keep a record.
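For illustration, the core of such a matching step can be sketched as a distance gate plus fuzzy name comparison. This is a minimal sketch, not the app’s actual code; the POI fields, weights, and 75 m threshold are my own illustrative assumptions:

```python
import math
from difflib import SequenceMatcher

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    r = 6371000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def match_score(overture_poi, osm_poi, max_dist_m=75):
    """Combine name similarity and proximity into a 0..1 score.

    A score of 0.0 means 'no plausible match' (too far apart)."""
    dist = haversine_m(overture_poi["lat"], overture_poi["lon"],
                       osm_poi["lat"], osm_poi["lon"])
    if dist > max_dist_m:
        return 0.0
    name_sim = SequenceMatcher(None, overture_poi["name"].lower(),
                               osm_poi["name"].lower()).ratio()
    dist_sim = 1 - dist / max_dist_m
    return 0.6 * name_sim + 0.4 * dist_sim

# Toy pair: same cafe, slightly different name spelling and position.
ov = {"name": "Cafe Rosa", "lat": 47.3769, "lon": 8.5417}
osm = {"name": "Café Rosa", "lat": 47.3770, "lon": 8.5418}
print(round(match_score(ov, osm), 2))
```

Real matching would also need to handle the duplicate-source problem noted above, e.g. by first deduplicating Overture candidates against each other.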

See full entry

In May 2025 the OSM website introduced two new optional fields in user profiles: company and location. I recently analyzed whether these fields could be useful for detecting organized (paid) editing accounts for my How Did You Contribute (HDYC) pages. Short summary:

  • Around 66k active user profiles analyzed
  • About 1,000 unique company entries
  • About 2,200 location values

Interestingly, large companies such as Apple, Amazon, or Meta still mostly appear in the profile description, not in the dedicated company field. I wrote a more detailed blog post here.
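The field-vs-description distinction can be illustrated with a toy version of the counting step. The profile records and company watchlist below are made up; the real analysis ran over the ~66k profiles mentioned above:

```python
from collections import Counter

# Toy profile records; each has the dedicated 'company' field plus free text.
profiles = [
    {"company": "Acme Maps", "description": "Mapping for fun"},
    {"company": "", "description": "I map for Apple's data team"},
    {"company": "acme maps ", "description": ""},
    {"company": "", "description": "Meta mapping team"},
]

def normalize(name):
    return name.strip().lower()

# Count distinct values of the dedicated company field.
field_companies = Counter(
    normalize(p["company"]) for p in profiles if p["company"].strip()
)

# Companies that only show up in the free-text description.
WATCHLIST = ["apple", "amazon", "meta"]
mentioned_only = Counter()
for p in profiles:
    if p["company"].strip():
        continue
    text = p["description"].lower()
    for company in WATCHLIST:
        if company in text:
            mentioned_only[company] += 1

print(field_companies)   # Counter({'acme maps': 2})
print(mentioned_only)    # Counter({'apple': 1, 'meta': 1})
```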

While working on this analysis I also added username history to HDYC, derived from the full changeset replication history. I am not fully happy with the current presentation and would appreciate feedback:

  1. Move the username section further down?
  2. Add a time-based filter (e.g. last 5–10 years)?
  3. Remove it entirely?

Until recently, I mainly used the opening_hours evaluation tool to quickly generate valid OSM opening hours. However, it often requires some manual work to simplify the syntax afterwards.

That’s why I tried using ChatGPT instead, and it works surprisingly well. You can simply copy and paste opening hours from websites, or even upload an image, and ask it to format them for the opening_hours tag.

Example

Scapino opening hours, Winschoten

LLM query

please format the opening hours in the attached image for the OSM 'opening_hours' tag.

Output

Mo 13:00-18:00; Tu-Th 09:30-18:00; Fr 09:30-21:00; Sa 09:00-17:00; Su off

This is a rather simple example, but it also works well with more complex opening hours.
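To catch obviously malformed LLM output before pasting it into OSM, a quick sanity check of the simple subset used above can help. This is not a full opening_hours parser; the real grammar is far richer (months, public holidays, comments), so this hedged regex only accepts plain day-range plus time-range rules:

```python
import re

DAY = r"(?:Mo|Tu|We|Th|Fr|Sa|Su)"
RULE = re.compile(rf"{DAY}(?:-{DAY})?\s+(?:\d{{2}}:\d{{2}}-\d{{2}}:\d{{2}}|off)")

def looks_like_simple_opening_hours(value):
    """Accept only the simple 'Day[-Day] HH:MM-HH:MM' / 'Day off' subset."""
    return all(RULE.fullmatch(part.strip()) for part in value.split(";"))

print(looks_like_simple_opening_hours(
    "Mo 13:00-18:00; Tu-Th 09:30-18:00; Fr 09:30-21:00; Sa 09:00-17:00; Su off"))  # True
print(looks_like_simple_opening_hours("open whenever"))  # False
```

For anything beyond this subset, the opening_hours evaluation tool remains the authoritative validator.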

Taking info from

https://www.rigacci.org/wiki/doku.php/doc/appunti/hardware/gps_logger_i_blue_747

and

https://www.technologyblog.de/2019/05/gps-rollover-zerstoert-gps-logger/

and

https://wiki.openstreetmap.org/wiki/Holux_M-241

a modified mtkbabel can set the time when reading from the device. That should probably be guarded by a check that it really is an M-241 that is connected, but as long as the battery lasts, the device shows the correct time again and logs tracks in the current year instead of dating them to 2006.

$ diff -u /usr/bin/mtkbabel ./mtkbabel
--- /usr/bin/mtkbabel   2019-10-12 12:23:29.000000000 +0200
+++ ./mtkbabel  2026-03-03 19:37:37.482923895 +0100
@@ -166,7 +166,7 @@
 #-------------------------------------------------------------------------
 my $debug    = $LOG_WARNING;     # Default loggin level.
 my $port     = '/dev/ttyUSB0';   # Default communication port.
-my $baudrate = 115200;           # Default port speed.
+my $baudrate = 38400;           # Default port speed.
 my $ro_weeks = 0;                # Weeks offset to fix Weeks Rollover Bug

 # GPX global values.
@@ -356,6 +356,18 @@
     set_data_types($model_id);
 }

+# Set time to work around week rollover bug
+
+my ($sec,$min,$hour,$mday,$mon,$year) = gmtime;
+
+$year += 1900;      # year is years since 1900
+$mon  += 1;         # month is 0-11
+
+packet_send(sprintf('PMTK335,%04d,%02d,%02d,%02d,%02d,%02d', $year, $mon, $mday, $hour, $min, $sec));
+$ret = packet_wait('PMTK001');
+printf "Set time string: $year, $mon, $mday, $hour, $min, $sec\n";
+printf "Return for setting time: $ret\n";
+
 #-------------------------------------------------------------------------
 # Erase memory.
 #------------------------------------------------------------------------- 
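The underlying arithmetic is the 10-bit GPS week counter wrapping every 1024 weeks: a timestamp the logger stamps as 2006 can be recovered by adding exactly one rollover period. The patch above instead sets the clock on the device, but the correction itself can be sketched like this (the example timestamp is made up):

```python
from datetime import datetime, timedelta

GPS_ROLLOVER = timedelta(weeks=1024)  # the 10-bit GPS week counter wraps here

def fix_rollover(ts, periods=1):
    """Shift a timestamp forward by whole 1024-week rollover periods."""
    return ts + periods * GPS_ROLLOVER

# A track point the logger stamped in 2006 is really from 2026.
logged = datetime(2006, 7, 23, 14, 30)
print(fix_rollover(logged))  # 2026-03-08 14:30:00
```

This is also what the `$ro_weeks` offset visible in the diff context is for: mtkbabel can apply a weeks offset when decoding the log.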

I have created a tag, diet:excipient_free=*, which is about finding clean supplements, i.e., ones without harmful ingredients that can make us infertile, inflamed, obese or even epileptic.

For example, whenever we look for magnesium (bis)glycinate, we want one thing, but many so-called “magnesium” supplements come with a lot more ingredients that might reduce the price or enhance the appearance, but of course at a cost: they hurt us and make us need yet another supplement to compensate for the side effects. (Maybe those “magnesium” supplements should be renamed corn syrup supplements instead.)

Almost all of those ingredients, if not all, fall into one category: excipients. Let’s use diet:excipient_free=* on pharmacies and nutrition supplement stores to promote a healthier future without dyes, fillers, flavorants, preservatives and other inactive ingredients that can cost us our health.
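On a suitable shop the tagging could look like this (the name is made up, and shop=nutrition_supplements is just one commonly used shop value; the point is the diet:excipient_free=* key):

```
shop=nutrition_supplements
name=Example Supplements
diet:excipient_free=yes
```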

Posted by SimonPoole on 4 March 2026 in English. Last updated on 6 March 2026.

OpenStreetMap old timers know about the infamous 2009 TIGER import of road data in the US that continues giving to this day. Our story has none of the, maybe deliberate, shenanigans (see the TIGER improvement project) that went on back then in the US, but there are clearly some similarities in the lessons that should be learnt.

Back in July of 2011 the Swiss community undertook a big effort to import municipality boundaries from swisstopo (the marketing name of the federal Swiss GIS department) (see osm.wiki/Switzerland/swissBOUNDARIES3D). Being an OSM n00b at the time with just a bit over a year editing experience I didn’t really do anything useful for the import proper, but I did organise the explicit permission needed from swisstopo as this was many years before their data would become available for use on open terms for us in September 2021. With a couple of technical hiccups along the way that are not really documented, we finally managed to complete the work by early August.

Fast forward to today: I’ve been going on for a few years now about how we really need a quality assurance process so that we can discover and track differences between swisstopo’s data and what is in OSM. We knew and fully expected that there would be differences, because:

  • at the time of the import we simplified the boundaries quite significantly because of the resource constraints of the computer hardware available to us,
  • over the last 14 years we independently followed the mergers and other changes of the municipalities (see osm.wiki/Switzerland/2026_Municipality_Mergers), and didn’t expect this to improve the accuracy of the boundaries, and we knew that now and then there would be associated minor geometry changes that we wouldn’t be able to track,
  • and then just general decay due to glueing and accidental modifications.

Thanks to work by our community member habi, inspired by earlier work by Branko Kokanović from Serbia, we now have daily QA data and, boy, were we wrong.
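This is not the actual QA pipeline, but one basic check can be sketched in a few lines: compare the area of the reference boundary against the OSM version and flag large deviations. Real QA would work on projected geometries and proper overlays such as symmetric differences; the toy polygons below just mimic the simplification problem described above:

```python
def shoelace_area(ring):
    """Planar polygon area via the shoelace formula (vertices as (x, y))."""
    area = 0.0
    for (x1, y1), (x2, y2) in zip(ring, ring[1:] + ring[:1]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2

def area_drift(reference_ring, osm_ring, tolerance=0.01):
    """Flag boundaries whose OSM area deviates from the reference by > tolerance."""
    ref, osm = shoelace_area(reference_ring), shoelace_area(osm_ring)
    return abs(osm - ref) / ref > tolerance

# A square reference boundary vs a simplified OSM version with a shaved corner.
reference = [(0, 0), (10, 0), (10, 10), (0, 10)]
simplified = [(0, 0), (10, 0), (10, 9), (0, 10)]
print(area_drift(reference, simplified))  # True: about 5% smaller
```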

See full entry

Posted by aleesteele on 3 March 2026 in English. Last updated on 6 March 2026.

3 March 2026: Writing this at a Missing Maps “London” remote meeting, realizing that I’d never written an OSM diary about the research I did within the ecosystem. I’m so late! But I’d still love to write this down. This placeholder is cross-linked with my blog.

From October 2020 to June 2021, I conducted ethnographic research within the (humanitarian) OpenStreetMap universe, trying to understand how communities, crises, and corporations came together on OSM. My thesis was ultimately about how humanitarian technologies like open source maps are used and created in response to crisis, and the convoluted mix of humanitarian values, corporate interests, and international networks that intersect on the OpenStreetMap platform.

The project and community is incredibly complex, a confluence of humanitarian actors, technology workers, and crowdsourced labor. My initial questions focused on why people contribute to open-source platforms like OSM (and Wikipedia for that matter), but they later evolved into what role humanitarian mapping plays within the wider ecosystem of geospatial and mapping technologies it is a part of.

Increasingly, as this was just before the wave of new AI technologies, I found that OSM data was being used to train AI systems, such as those for road detection.

While the written work is in the process of publication (eventually!), there are a number of public videos that share some of my public-facing findings on the subject.

Crisis Maps, Community, and Corporations (an Anthropologist’s perspective)

This talk shares my initial findings from this period, drawing from interviews and studies of political economy, science and technology studies, and humanitarianism. Social science methods might help us to better understand this changing period of OSM and HOT history, as it heads into the future.

Mapping crises, communities and capitalism on OpenStreetMap: situating humanitarian mapping in the (open source) mapping supply chain

See full entry

Posted by jwheare on 3 March 2026 in English.

I started a new wiki talk page discussion on the conflicting/controversial usage of the wetland=tidalflat tag regarding implied and explicit surface types:

Also posted a comment on positive related changes being worked on by the carto team:

Every map tells a story. Some stories are drawn with roads and buildings. Others are written through people, voices, and lived experiences. This is the story of how mapping became a bridge between climate vulnerability and community resilience in the heart of Dhaka. Under the Climate Resilience Fellowship, proudly supported by OpenMappingHub Asia Pacific, our Team 8 embarked on a journey called “Healthy Homes, Safer Futures.” Our goal was simple yet powerful: to strengthen climate awareness and resilience among vulnerable communities living in Dhaka’s urban informal settlements.

Where It All Began

In early May, all ten fellowship teams gathered in Dhaka, sharing ideas and aspirations for climate action. We were two coordinators:

  • Mohammad Azharul Islam — Oceanographer and GIS Analyst at the Center for Geoservice and Research
  • Ahsan Habib Saimon — Capacity Building Officer at Christian Commission for Development in Bangladesh

Together, they envisioned a project that would connect data, digital tools, and grassroots knowledge to create safer living environments.

See full entry

Location: Duaripara, Pallabi, Dhaka, Dhaka Metropolitan, Dhaka District, Dhaka Division, Bangladesh

Nimman Road in Chiang Mai (Thailand) is a well-mapped, high-traffic corridor. It scores a B on network density: good intersection frequency, reasonable block lengths. But it scores near zero on crossing coverage, because there are no highway=crossing nodes tagged within the 800 m analysis radius. The street has physical crossings. They’re just invisible to any tool that relies on OSM, which is most tools.
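A crossing-coverage check like the one described here could, for example, be implemented with an Overpass query. This is a sketch, not SafeStreets’ actual code, and the coordinates are an arbitrary point near Nimman Road:

```python
def crossing_count_query(lat, lon, radius_m=800):
    """Overpass QL: count highway=crossing nodes within radius_m of a point."""
    return (
        "[out:json][timeout:25];"
        f'node(around:{radius_m},{lat},{lon})["highway"="crossing"];'
        "out count;"
    )

query = crossing_count_query(18.7995, 98.9676)
print(query)
# To actually run it (requires network access):
#   import urllib.request, urllib.parse
#   url = "https://overpass-api.de/api/interpreter"
#   data = urllib.parse.urlencode({"data": query}).encode()
#   with urllib.request.urlopen(url, data) as r:
#       print(r.read().decode())
```

A zero count from such a query is exactly the data gap the score surfaces: it cannot distinguish "no crossings exist" from "crossings not yet mapped".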

That’s what SafeStreets shows: not just a score, but which data gap is causing it.

Nimman Road, Chiang Mai — SafeStreets walkability analysis showing 4.6/10 Car-dependent score with Street Grid 2.8, Tree Canopy 5.5, Destinations 7.2

What is SafeStreets?

A free tool that scores the walkability and pedestrian safety of any street address globally, graded out of 10. No account required, 190+ countries. OSM is the backbone, and the only data source that works everywhere.

How OSM powers it: three functions

See full entry

Location: Chiang Mai City Municipality, Fa Ham, Mueang Chiang Mai District, Chiang Mai Province, Thailand

Portal North Bridge construction and study documents

https://archive.org/details/@isstatenisland/lists/7/portal-bridge-documents?sort=date

I gathered and uploaded documents relating to the Portal Bridge capacity enhancement project and its replacement, Portal North Bridge. The documents (except the Amtrak bulletins) come from NJDEP’s DocMiner. The Amtrak bulletins were retrieved by FOIA request. It appears the FEIS disappeared off the web many years ago.

The original plans intended to build a 3-track fixed span to the north. The documents from 2019 and later depict the currently chosen plan, the two-track fixed structure to the north. The south structure is not funded.

https://archive.org/details/portal-bridge-project-feis-final-4f-october-2008 Portal Bridge Capacity Enhancement Project - Final Environmental Impact Statement and Final Section 4(f) Evaluation, October 2008

https://archive.org/details/portal-bridge-project-feis-final-4f-appendix-vol1-october-2008 Portal Bridge Capacity Enhancement Project - Final Environmental Impact Statement and Final Section 4(f) Evaluation, October 2008: Appendix Volume 1

https://archive.org/details/portal-bridge-project-feis-final-4f-appendix-vol2-october-2008 Portal Bridge Capacity Enhancement Project - Final Environmental Impact Statement and Final Section 4(f) Evaluation, October 2008: Appendix Volume 2

https://archive.org/details/portal-bridge-project-relocation-study-january-2010 Portal Bridge Capacity Enhancement Project - Relocation Feasibility Study, January 2010

https://archive.org/details/portal-bridge-project-gc02-construction-plan-sheets-2019 Portal Bridge Capacity Enhancement GC.02 Contract - Construction Plan Sheets, August 15th 2019

https://archive.org/details/portal-bridge-project-environmental-impact-sheets-2020-2025 Portal Bridge Capacity Enhancement Project - Environmental Impact Sheets, January 2020 with November 2025 modifications

See full entry

Location: Kearny, Hudson County, New Jersey, 07032, United States
Posted by SirfHaru on 24 February 2026 in English.

OK. Last year I wrote a short guide on mapping Indian addresses, but I lost it in my tiny pursuit to delete myself. Today I suddenly came across the fact that the guide was actually being used by mappers, so I am now writing this post as a replacement for that old guide. Since this is a new one, I don’t want to just rehash the old stuff; instead, this time I am going to take a simple problem and show how I would solve it from scratch.

A1, Tower 2, Sector 11, RK Puram, South West District, Delhi, India

A problem very similar to this one came up in OSM India’s XMPP channel today. So, how does one go about mapping this address?

As is usually the case, we can ignore the district, state, and country parts, as they are all very well mapped in India. This leaves us with everything up to RK Puram.

If you are thinking that something as big as RK Puram should surely be on the map already, then you are wrong; in my “career” I have actually seen larger areas without any nodes for them. So we will in fact check whether it’s already on the map and, guess what, it actually is already mapped as a suburb, so that’s one less step for us! I should mention that in OSM there are three “neighbourhood” levels below the district: quarter, suburb, and neighbourhood, in decreasing order of size. In most cases suburb and neighbourhood should be enough for you, but it is important to be aware of quarter for special situations.

Now let’s check for Sector 11. As of writing this, Sector 11 isn’t on the map. So I will put a neighbourhood node at the approximate centre of Sector 11. (Remember that neighbourhood is smaller than suburb.) We are making good progress.
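For reference, the neighbourhood node placed at the approximate centre of Sector 11 would carry tags along these lines:

```
place=neighbourhood
name=Sector 11
```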

See full entry

Location: Sector 12, Ramakrishna Puram, Vasant Vihar Tehsil, New Delhi, Delhi, India
Posted by GanderPL on 24 February 2026 in English. Last updated on 4 March 2026.

Introduction: What is the Model Context Protocol (MCP)?

To make it easier for AI assistants to communicate with databases and various external systems, the Model Context Protocol (MCP) was created – a kind of API for AI that describes how to use a given service.

MCP works a bit like Swagger / OpenAPI for developers: it precisely specifies which “tools” are available, what parameters they accept, and what responses they return, so that an AI assistant knows how to query a given server correctly. The difference is that MCP is designed exclusively for AI, not for humans – it does not provide a traditional user interface, only a contract that a language model can use.

This post is therefore mainly aimed at developers of AI applications and assistants: it describes a new tool they can integrate into their projects to work more effectively with OpenStreetMap tagging data.


A few months ago, I worked on a new project: the OSM Tagging Schema MCP — a Model Context Protocol (MCP) server built for AI assistants and LLM applications that interact with OpenStreetMap tagging data.

It serves as a bridge between AI systems and the official OpenStreetMap tagging schema, allowing agents to validate tags, query values, search presets, and suggest improvements using the structured knowledge from the @openstreetmap/id-tagging-schema library.
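Under the hood, MCP uses JSON-RPC 2.0, and a client invokes a server tool via the tools/call method. A minimal sketch of such a request follows; the tool name and argument shape here are hypothetical, and the server’s tools/list response gives the real contract:

```python
import json

def mcp_tool_call(request_id, tool_name, arguments):
    """Build an MCP 'tools/call' request envelope (JSON-RPC 2.0)."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical tool name and arguments -- query the server's 'tools/list'
# method for the actual tools it exposes.
request = mcp_tool_call(1, "validate_tags",
                        {"tags": {"amenity": "cafe", "cuisine": "coffee_shop"}})
print(json.dumps(request, indent=2))
```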

The current 3.x release is technically stable — all tools and endpoints work reliably without errors — but it should still be considered experimental. Active development on version 3 has ended; for now, I only maintain it through dependency updates.

The next major step will be version 4, a complete rewrite developed with AI-assisted coding, focusing on a cleaner architecture, long-term maintainability, and deeper MCP integration.

You can try the service live here: mcp.gander.tools/osm-tagging

See full entry

Posted by pointblue on 23 February 2026 in English.

I successfully put Novato Baylands Point Blue Conservation Science as a pin on the map. However, I have not had success with editing the directions that the map provides to get you to the site. The directions still route you past the facility, rather than stopping right at it. They should tell you to go down Aberdeen Rd, and then the location is on your right. Thanks for any assistance with editing the route.

Location: Ignacio, Novato, Marin County, California, 94949, United States
Posted by marcie39 on 23 February 2026 in English.

I’m new to editing OpenStreetMap, so this is my first change! I noticed that most neighbourhood areas in Lethbridge, my local city, don’t have a name shown in OSM. However, they’re all neatly shown on an official 2024 map from the government of Lethbridge, so I used it as a source. I did notice that some areas are already named in other ways, but I couldn’t find the item that holds the name. This induced visual clutter by doubling some names (those of the industrial parks, Copperwood, and seemingly Paradise Canyon), but I still added the names to the neighbourhood areas for consistency anyway. If anyone around knows how to get rid of this without removing the naming consistency, it would be great if this slight issue could be resolved. I haven’t actually tested the map yet, since I just uploaded the edit, but if what I’m describing is actually a problem, please help? Anyway, I intend to update and add a lot of things to Lethbridge (like addresses and new buildings) in the near-ish future, so it’d be fun to get to know the local OSM community.

I have a large set of photographs I made while running. They are geotagged, as I took them with my phone camera. The compass direction is completely unreliable, but lat/lon is more trustworthy. I thought it would be an interesting experiment to extract greenery like grass and trees from these photographs. It can be a useful addition for creating routes that are more pleasant to walk, since the eye-level point of view is not available in OSM. As this is based on my personal photographs, it has the additional benefit of recommending routes that I tend to use.

The first challenge I encountered is that out of a few thousand photographs, only a handful were taken during the daytime. After deduplicating and dropping all photos that contain no greenery, this becomes a relatively small set of waypoints. I decided not to extrapolate additional points along OSM ways, to keep the dataset small and avoid adding misleading info.

The greenery detection works well enough with the SegFormer model, although it is somewhat slow locally. My plan is to select waypoints from this dataset before calling OSRM. This way I get routes that are more enjoyable to walk and run, but are generally longer than the default shortest route. You can find my dataset on Kaggle.
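The waypoint-selection step before the OSRM call could be sketched as follows. The greenery scores, coordinates, and threshold are made up; the URL follows OSRM’s /route/v1 format with lon,lat coordinate order, and the profile name depends on what the routing instance actually serves:

```python
def pick_waypoints(points, min_greenery=0.3, limit=5):
    """Keep the greenest geotagged photos as routing waypoints."""
    keep = [p for p in points if p["greenery"] >= min_greenery]
    keep.sort(key=lambda p: p["greenery"], reverse=True)
    return keep[:limit]

def osrm_route_url(start, waypoints, end, profile="foot"):
    """Build an OSRM /route/v1 URL; coordinates are 'lon,lat' joined by ';'."""
    stops = [start, *[(p["lat"], p["lon"]) for p in waypoints], end]
    coords = ";".join(f"{lon},{lat}" for lat, lon in stops)
    return f"https://router.project-osrm.org/route/v1/{profile}/{coords}?overview=full"

# Toy dataset: greenery fraction per photo, e.g. from segmentation masks.
points = [
    {"lat": 21.036, "lon": 105.834, "greenery": 0.62},  # lots of trees in frame
    {"lat": 21.041, "lon": 105.829, "greenery": 0.08},  # mostly buildings
]
url = osrm_route_url((21.033, 105.836), pick_waypoints(points), (21.045, 105.824))
print(url)
```

The low-greenery photo is dropped, so the route is nudged through the greener spot rather than past the built-up one.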

Location: Ba Dinh Ward, Hà Nội, 11120, Vietnam