
fititnt's Diary Comments


OpenStreetMap NextGen Benchmark 1 of 4: Static and unauthenticated requests

(which is not the case here, web frameworks are unlikely to be exactly identical and have different features, etc…).

Yep. This part is relevant. Simplistic benchmarks, such as serving static content, can vary more between frameworks of the same language than between typical frameworks of two different languages (given a reasonable effort to optimize how they run in production). Django (Python) and rails-api (Ruby), for example, are both the most popular among developers, yet on benchmarks they are among the slowest.

Typical production use of web applications always involves some kind of web framework (even a minimalistic one), especially if the language is interpreted (which is the case for Python and Ruby, but less so for C, Go, Rust, etc.). One reason for this is that the alternative of not using any framework requires not only the main developer but also contributors to know how to safely implement things such as authentication, session management, tokens, and so on. It also means documenting (and keeping up to date) the code conventions for how to organize the code.

I tried to find comparisons between the two frameworks and the difference is small (maybe there are too few comparisons; however, in the ones I found, Ruby on Rails is a bit faster than Django on the overly simplistic tests). So I think the assumption that a full application written in modern Python is necessarily faster than one in modern Ruby becomes less significant the more features are added. While the current version of his code is not using Django, even if we take as baseline any higher-performance Python framework, the more features he adds, the less prominent the difference will be.
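To make the comparison idea concrete, here is a minimal sketch of how I would time such things myself: best-of-N wall-clock timing of a callable. The workload below is a placeholder; a real framework comparison would hit both servers over HTTP, in production mode, with a dedicated tool.

```python
# Minimal best-of-N timing helper (a sketch; the workload is a
# placeholder standing in for "render a static page").
import time

def best_of(fn, runs: int = 5, loops: int = 1000) -> float:
    """Return the best average seconds-per-call over several runs.
    Taking the minimum reduces noise from background processes."""
    results = []
    for _ in range(runs):
        start = time.perf_counter()
        for _ in range(loops):
            fn()
        results.append((time.perf_counter() - start) / loops)
    return min(results)

t = best_of(lambda: "".join(str(n) for n in range(100)))
assert t > 0.0
```

The best-of-N approach matters because a single run of an overly simplistic test is dominated by warm-up and scheduling noise, which is part of why such micro-benchmarks disagree so much.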

That is not true. There can be massive differences in speed between programming languages (…)

Trivia: at least in some algorithm implementations (even without a web framework, database access, etc.), Ruby (using Ruby 3.2.0 with YJIT) is more efficient than the Python equivalent.

OpenStreetMap NextGen Benchmark 1 of 4: Static and unauthenticated requests

the benchmark is mostly measuring docker overhead, not the ruby code in production.

I cannot confirm this. When running the Rails server in development mode, you will notice a similar increased runtime even without Docker in place. This is expected behavior as mentioned before. Rails developer mode is not suitable for performance testing.

Hummm. So the benchmark is running both inside Docker and in development mode.

One reason such a difference felt strange to me is that, in general, the same algorithm gives similar performance across similar programming languages (e.g. interpreted vs interpreted), so unless one alternative is doing more work, Ruby vs Python on recent versions would likely have similar results. So I assumed it was Docker.

By the way, this benchmark is doing something too simple (a static page). As soon as it starts to work with real data, a heavy part of the work will come from the database (which I assume will be the same for Ruby and Python), which means any performance difference is likely to be smaller. And if it is not smaller, it might be easy to optimize the queries in the rails-port.
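The dilution effect above can be shown with simple arithmetic (the millisecond figures are invented for illustration, not measurements of either port):

```python
# Illustrative arithmetic: how a shared cost (e.g. the database)
# dilutes a framework-level speed difference. Numbers are hypothetical.

def relative_difference(app_ms_a: float, app_ms_b: float, shared_ms: float) -> float:
    """Ratio of total request times once a shared per-request cost
    is added on top of each framework's own overhead."""
    return (app_ms_b + shared_ms) / (app_ms_a + shared_ms)

# Static page only: framework overhead dominates, a 3x gap.
assert relative_difference(5.0, 15.0, 0.0) == 3.0
# Same frameworks, but each request now spends 50 ms in the database:
# the gap shrinks to about 1.18x.
assert round(relative_difference(5.0, 15.0, 50.0), 2) == 1.18
```

This is why a static-page benchmark overstates the difference that would survive in an application doing real database work.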

OpenStreetMap NextGen Benchmark 1 of 4: Static and unauthenticated requests

There’s at least one flaw in the methodology: the benchmark is mostly measuring Docker overhead, not the Ruby code as it runs in production.

For fast-running operations such as these, this overhead becomes significant.

But I understand this might be the first time you are doing this type of benchmark, so it’s okay to make this mistake.

Help purchase 1:50k Topographic series of Swaziland?

Bump! (there’s no like button here, so I will leave a comment instead)

Perhaps this type of map is unusual among mappers more accustomed to aerial imagery, but they are useful!!

Here in Brazil I often use the “Cartas Topográficas do Exército Brasileiro”. Although it is outdated (some parts are as old as 30 years; it is not easy to keep updated because it requires field surveys), there is a lot that does not usually change (although it still needs to be reconfirmed with more sources, such as aerial imagery). Also, sometimes it is not evident from aerial imagery what a feature is, and this kind of map helps a lot.

So yes, this kind of map is useful!

Numeração 100%

And I noticed you are even adding operator:wikidata=, for example. How lovely!

Look, besides Overpass queries, some things also have Wikidata queries. For the police ones in Brazil I am using this one.

In the future I may try to find some way to make it easier to interlink things better. But in general, it is much more manual per exact point in OpenStreetMap than it would be in Wikidata. In fact, in OpenStreetMap positioning errors tend to be more obvious than in Wikidata (which, for example, accepts items both without address/coordinates and with only a free-text address).
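For readers curious what such a Wikidata query looks like, here is an illustrative sketch. The QIDs are assumptions on my part (Q861911 for "police station", Q155 for "Brazil"); double-check them on wikidata.org before relying on them.

```python
# Illustrative SPARQL for query.wikidata.org. The QIDs below are
# assumptions (Q861911 = police station, Q155 = Brazil); verify them.
POLICE_STATIONS_BR = """
SELECT ?item ?itemLabel ?coord WHERE {
  ?item wdt:P31/wdt:P279* wd:Q861911 .  # instance of (a subclass of) police station
  ?item wdt:P17 wd:Q155 .               # country: Brazil
  OPTIONAL { ?item wdt:P625 ?coord . }  # coordinate location, if present
  SERVICE wikibase:label { bd:serviceParam wikibase:language "pt,en". }
}
"""
```

The OPTIONAL clause is the interesting part for the point above: Wikidata happily returns items with no coordinates at all, which is exactly why positioning errors are less obvious there than in OpenStreetMap.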

Numeração avenida angelica

Look, if you have Telegram, add me there and say you found me through this link (my username there is fititnt). You can also use direct messages here on OSM. I am helping out with these things.

But about your comment: there are things that may be possible to get help with, but even if it is just approving (a quick one-by-one review), the things related to adding an address or a reference to something important are somewhat manual.

For example, if the address is already almost perfect, it is possible to speed things up with whatever an external source says is at that address. But sometimes the address is a building, while what a human would add is a point inside the building (instead of editing more tags on the building itself).

In the US, where they work a lot with POIs, they prefer to keep the amenity separate from the buildings, because it makes things much easier.

Numeração avenida angelica


Address data is very useful!

Moving Python scripts to OAuth2

By the way, looking at it, it is clear there are several providers.

Do you know if the OpenStreetMap Wiki (MediaWiki) and Wikidata (MediaWiki/Wikibase) have OAuth2? If yes, maybe consider implementing it.

Moving Python scripts to OAuth2

Having this kind of thing already done is helpful, especially if it is in the same programming language that a dev like me would likely use for CLI tools.

(All my tools are still read-only and, if anything, they export files to be used with OSM editors.)

Numeração avenida angelica

How cool!

How did you collect the numbers?

헤어질 결심[Decision To Leave] with OSM...

Oh, stay here! There is no problem in resting for a while.

You are one of the most fantastic people I know within OpenStreetMap, and you are super active. We need more people like you.

Generalization of extraction of example codes, tabular data and Infoboxes from MediaWikis such as

Wikitext is only one of the page content models that MediaWiki supports.

Good to know about the other content models! Maybe I will also create some syntax sugar (e.g. instead of a raw string, return something else).

But for data-like content beyond wikitext (in particular the tabular data), Wikibase JSON could be abstracted to return at least the labels, which could later be used, for example, for translations. Note that it is complex to convert between an RDF-like dataset and other datasets, but the translation part of items might be so common that it could be worth an abstraction.
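The labels abstraction mentioned above can be sketched in a few lines; the dictionary shape below mirrors the public Wikibase entity JSON format (the sample entity itself is invented for illustration):

```python
# Sketch: abstract a Wikibase entity JSON document down to its labels.

def entity_labels(entity: dict) -> dict:
    """Return {language_code: label} from a Wikibase entity document."""
    return {lang: info["value"] for lang, info in entity.get("labels", {}).items()}

sample = {
    "id": "Q1",
    "labels": {
        "en": {"language": "en", "value": "universe"},
        "pt": {"language": "pt", "value": "universo"},
    },
}
assert entity_labels(sample) == {"en": "universe", "pt": "universo"}
```

A tool that only needs translations can stop here, without ever touching the RDF-like claims part of the document.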

(…) you can write a wiki page using simple wikitext syntax as long as you avoid breaking several lightly documented tools that place arbitrary constraints on exactly how you write it (e.g., whitespace and capitalization) due to assumptions they make. Writing for the renderer, in other words. (emphasis mine)

Yes, the challenging part of parsing wikitext is exactly this. This is one of the reasons the tool (at least when used to extract data) is more forgiving toward whoever writes, and strict about what output it generates.

(…) I also appreciate your emphasis on reusing existing content without creating extra maintenance overhead. However, we should view this kind of tooling as being complementary to structured data, not in competition with it.

I agree on the complementary part. In fact, the implementation cited here is intentionally lower level, without the database part (it does have SQLite, but only to cache requests). The fun part is done outside.

On reusing existing content without creating extra maintenance overhead: this is really a focus. While the tool is not self-sufficient as a full solution, keeping it focused on parsing (and allowing it to be reused with other wikis) could encourage some extra conventions where wikitext alone is insufficient.

Generalization of extraction of example codes, tabular data and Infoboxes from MediaWikis such as

Hmm, it is interesting that the first comment is about how Wikibase abstracts the data! And yes, I found it relevant to explain this internal part, because it really depends on external storage to provide the “true” linked data storage.

I mean, when Wikibase stores item data as JSON on a single page, these pages live in a big textarea, so by default the SQL database cannot really understand their internals. It is not even like MongoDB, where there is native support for JSON fields.

Another bit of trivia: when doing data mining via the MediaWiki Wikibase API, it is likely to be item by item (maybe it allows pre-fetching related items, so it is still very useful); however, if you somewhat brute-force the wikitext (which, for Wikibase pages, will be JSON) through the vanilla MediaWiki API, then even without a special user account (admin or bot) it is possible to fetch 50 pages at once. I know this may sound a bit low level, but if we are talking about synchronizing content, it matters, as long as the content stored on the MediaWiki can be exported without always needing to work with full wiki dumps.
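The 50-pages-at-once idea can be sketched like this. The endpoint URL below assumes the OSM wiki (adjust it for other MediaWiki installations), and the page titles are invented placeholders:

```python
# Sketch: batch page titles for the vanilla MediaWiki action API,
# which accepts up to 50 titles per `action=query` call for
# anonymous clients. Endpoint URL is an assumption (OSM wiki).
from urllib.parse import urlencode

API = "https://wiki.openstreetmap.org/w/api.php"

def batches(titles, size=50):
    """Split a list of page titles into API-sized chunks."""
    for i in range(0, len(titles), size):
        yield titles[i:i + size]

def query_url(batch):
    """Build a revisions query fetching the wikitext of a whole batch."""
    params = {
        "action": "query",
        "prop": "revisions",
        "rvprop": "content",
        "rvslots": "main",
        "format": "json",
        "titles": "|".join(batch),
    }
    return API + "?" + urlencode(params)

pages = [f"Page {n}" for n in range(120)]
chunks = list(batches(pages))
assert [len(c) for c in chunks] == [50, 50, 20]
```

So 120 pages become 3 HTTP requests instead of 120 item-by-item calls, which is the whole point when synchronizing content without full dumps.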

About your comment comparing it with VisualEditor: I guess the Wikibase interface is more of a form-like interface (it enforces some structure; not very advanced integrity checks, but it does some). I have not fully tested the alternatives, but I am sure there are other MediaWiki extensions which could enforce form-like entry to restrict what users can do. So the analogy with VisualEditor is not perfect, because judging by the link you shared about VisualEditor, it still allows the user more freedom, with greater challenges to parse (compared to any form-like interface).

Maybe a closer analogy than the MediaWiki VisualEditor: the Wikibase editing page is similar to how the iD editor lets users edit an already well-detailed item (depending on the tag, the field changes appearance, suggests different values, etc.).

(…) It’s JSON, which explains just how disconnected it actually is to the MediaWiki experience. That’s why it feels so foreign and disorienting, and functions like the completely tacked-onto experience it provides.

I think that from the perspective of a “MediaWiki experience”, even when trying not to break the mental flow of editing as text (while remaining fully machine readable), at least some trade-offs are necessary. Wikibase (and any other MediaWiki with a form-like UI) explicitly enforces how to add/edit data (sometimes too much, or sometimes without allowing fully strict validation; I know, both ideals are contradictory), but even if we parse wikitext directly, the parser could still benefit from hints (such as the suggested filename of a code sample) which might not be worth showing to a user who only cares about the visual text, not the metadata.

This part is briefly commented on in the diary, but both conversion tables (e.g. {{yes}} => true) and explanations of what the parameters of the most important infoboxes mean may not be on the same page (that would also be too redundant), but they would still be somewhere (preferably on the same wiki). And where no syntax can express these instructions, the only option left is natural language.
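A conversion table like the one mentioned could be as simple as the sketch below. The template names are examples; a real table would live on the wiki itself so the community can maintain it.

```python
# Sketch of a {{yes}} => true style conversion table:
# forgiving on input (case, whitespace), strict on output.
TEMPLATE_TO_VALUE = {
    "yes": True,
    "no": False,
    "maybe": None,  # an explicit "unknown"
}

def template_value(raw: str):
    """Map wikitext like '{{Yes}}' to a machine-readable value."""
    name = raw.strip().removeprefix("{{").removesuffix("}}").strip().lower()
    if name not in TEMPLATE_TO_VALUE:
        raise ValueError(f"no conversion rule for template: {name!r}")
    return TEMPLATE_TO_VALUE[name]

assert template_value("{{Yes}}") is True
assert template_value("  {{no}} ") is False
```

Raising an error on unknown templates (instead of guessing) keeps the strict-output guarantee: whatever cannot be converted stays as natural language for a human to handle.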

But it isn’t a duck.

Yes, I also liked the analogy! But again, “Wikidata” is the project, while “Wikibase” is an extension for MediaWiki. So Wikidata as a project actually is full linked data. Another interesting fact is that Wikibase (even without a triplestore) is still somewhat linked data, because it exposes persistent URLs and is still fast. So the self-description, “Wikibase is open-source software for creating collaborative knowledge bases, opening the door to the Linked Open Data web.”, is still very true.

The diary could get more complex, but in theory a future proxy for each page could still be somewhat linked data as soon as the client requests a format like RDF/Turtle. The same principle could apply to the main API, which today returns XML; a pull request started in 2019 by the Overpass main developer added JSON output, so in theory even the main API, the rails-port, could also be explicitly “linked data”. I started an early draft of that in 2022 which does a very rudimentary conversion from the XML to RDF/Turtle as a proxy (if the client requests XML or JSON, it does nothing, just passes through the true output of the de facto API). That was the very easy part; the true challenge (beyond the slow process of agreeing on a good-enough schema) would be to start building the endpoints for every tag, so that if a tool tried to fetch PREFIX osmt: <>, it would work.
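The rudimentary XML-to-Turtle conversion idea can be sketched as below. The prefixes (osmnode:, osmt:, geo:) are assumptions in the spirit of the draft, not an agreed schema, and tag keys containing characters like “:” would still need escaping in a real converter:

```python
# Sketch: convert a single OSM node XML element into Turtle triples.
# Prefixes are assumptions; tag keys with ':' etc. would need escaping.
import xml.etree.ElementTree as ET

def node_to_turtle(xml_fragment: str) -> str:
    node = ET.fromstring(xml_fragment)
    subject = f"osmnode:{node.attrib['id']}"
    lines = [f'{subject} geo:lat "{node.attrib["lat"]}" ;']
    lines.append(f'    geo:long "{node.attrib["lon"]}" ;')
    for tag in node.findall("tag"):
        lines.append(f'    osmt:{tag.attrib["k"]} "{tag.attrib["v"]}" ;')
    # Turtle ends a predicate-object list with '.' instead of ';'.
    lines[-1] = lines[-1].rstrip(";").rstrip() + " ."
    return "\n".join(lines)

sample = '<node id="1" lat="-30.0" lon="-51.2"><tag k="amenity" v="cafe"/></node>'
print(node_to_turtle(sample))
```

As a pure syntactic rewrite this really is the easy part; the hard part, as said above, is agreeing on the vocabulary behind those prefixes and serving dereferenceable endpoints for every tag.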

Olha quem voltou!

Hey, even though you were not away for that long, welcome back.

I am from Brazil too. I started less than a year ago.

Notas do OpenStreetMap

Not all heroes wear capes.


Problema em visualizar a camada Maxar. Mais alguém?

Use ESRI. Maxar used to show neither metadata nor the date of the photo (Bing and ESRI do say), but sometimes it was noticeable that ESRI and Maxar were the same background imagery (though Maxar did not cite the date).

And depending on the state where you live, a state government agency may offer online background layers.

Drive-by surveying of road side fuel filling stations and use of satellite images for map update

Great guide!

Analyzing OSM's Tile Logs

Fantastic work!

Could you post the code somewhere? It could be in the gist itself; right now it is shared as an image.

Fédé des pros d'OSM, la fausse bonne idée / OSM pros' federation, the wrong, good idea

Okay, I just had an idea which might help mitigate issues for any local group, existing or future, that is allowed to seek money under the trademark: every organization with explicit rights to use the trademark should have a contact point acting as an ombudsman mechanism for how they operate as a local chapter, and that mechanism should be a contact point which goes directly to the OSMF (or any relevant working group). It must appear near the places that ask for donations, and be a requirement whenever the donation campaign uses the OpenStreetMap trademark or the organization has OpenStreetMap trademarks in its name (since that implies a relationship).

The rationale here covers both corruption-related complaints (e.g. an organization telling its public that it did something which was actually the work of others) and abuse of the trademark rights to antagonize valid initiatives in the region (especially ones that are truly volunteer based; in the case of being the first in a region, an official chapter could start kicking members out of its own organization or simply blocking them on every communication channel in the country).

Such a complaint mechanism could be more than good enough as a deterrent. Any organization with explicit trademark rights (even if it starts out democratic, it could still suffer a takeover) would have a realistic fear of losing those rights. It would also make it easier to accept new chapters in regions without one, when it is understood the status can be revoked: if the intent from the start is to profit from the trademark without any meaningful impact, then eventually complaints would arise. Having local chapter status explicitly revoked is worse than being an informal local chapter, so yes, this is why I think it is a good deterrent.

As I implied in the previous comment, there must be a reasonable explanation for why a local chapter (or, in this case, an organization which I suppose is not looking to be a local chapter) could be somewhat of an “economic operator” in Europe while that would not be allowed in Africa. The fear here is not average disputes between mappers of a region, but the single official representative in a region abusing its powers for pure financial gain, as has already happened with informal ones. One direct consequence would be organizations or local chapters being conservative in their promises of impact when seeking donations, because claiming credit for others’ work (including people who are not members of the organization) could get their rights revoked.

PS: Looking at it (archive page), and aware that an OSMF-approved local chapter already exists in France (one which uses different wording), the fear in the diary post about the use of “OSM” in the name makes even more sense to me. I do understand this might contradict my advice to avoid exclusive rights in a region while focusing on the new group; however, the way the FPOSM describes itself could easily be confused with a local chapter! Moreover, its self-description as “experts”, likely soon implying services as well, means it might sooner or later become an organization allowed by the OSMF to use the trademark, one that will use OpenStreetMap data from others in France while claiming to be “fixing it”, investing massive effort in marketing to raise more money, which deviates from the community (OpenStreetMap mappers and open source software developers, not mere “community data users”). To be clear, I cannot discuss this case in particular (I am less interested in Europe than in the global south), but the OSMF could try to investigate the context behind this better with the local French community. Maybe there is more going on.

Fédé des pros d'OSM, la fausse bonne idée / OSM pros' federation, the wrong, good idea

Greetings from the lusophony community!

I think I got the idea of what he fears in the formalization of exclusive rights (see PS2) to the OpenStreetMap trademark in France itself.

For context, see this eye-opening report on what happened with the francophone community, which goes into detail about the disastrous consequences it had in Haiti; to quote part of the text:

“(….) “Haiti provided an example of how bad things can go when one association (COSMHA - Communauté OSM Haiti mostly based in Port-Au-Prince area) had been active as a de facto LC and a de facto “economic operator” providing paid services around OSM. Over time, volunteerism tended to disappear or be very limited to the extent that the association operated solely under a business logic for the only benefits of some of its members. In parallel, tensions grew within the membership resulting into its shrinking and its control by a few. Entry in the association was made difficult. The internal democracy was limited. The association through its de facto OSMF chapter role seeked control over all OSM activities (community, association and business) in the island. This resulted into violent relations with individuals and other groups (in Port-Au-Prince, Saint-Marc or North/North-East) around any community volunteering activities as well as around economic opportunities. Tensions were such that certain mappers stopped their OSM activity or left the island in 2013. The overall resulted into less volunteer community-based activities, a dependence on economic project for any activity and a shrinking of the number of active local mappers (…)

While (despite massive money from international aid projects) there is no Haiti community anymore, consider francophone Africa: despite occasional well-marketed projects and calls to action to ease the creation of local chapters, transparency is minimal and the focus is not on volunteering, but mostly on economic incentives to bootstrap adding data, with no care for the impact of crowding out volunteers. So it is unclear whether it is just a matter of time until francophone Africa also has only outsiders doing the volunteering, without any true involvement from people in the region, because the existing groups have incentives to antagonize any perceived “competition” that can do better. And the real threat is that this includes locals improving the map because they want the map to be better, not for mere payment (if no money delivers better results than paying does, then it becomes a threat when justifying donations).

So, with all this said, if a group in France is to have any special right to exploit the trademark and speak in the name of OpenStreetMap as a local representative at a very official level, then the OSMF, for the sake of consistency, would need to review statuses elsewhere, including the very commercial ones not yet allowed, for example in Africa.

Whatever happens, if the group in France gets such a right (which it now seems to want explicitly), then it must be clear what makes it different from the others.

PS: I am not so extreme as to say that no local group may use “OpenStreetMap” or terms closely related to OSMF trademarks. However, being able to seek money while using the OpenStreetMap trademark must come with minimal safeguards.

PS2: I just noticed that there is already a local chapter in France which, unlike the organization mentioned to be presented at the next OSMF board meeting, does not use a focus like “experts” or “services”. So I am sorry if my comments here may be perceived as targeting the wrong, older group, but it must be said that yes, attempts at being economic operators are known to cause harm to existing contributors.