Viewing infrared imagery for any place on Earth (tutorial)
Posted by fghj753 on 14 September 2025 in English.

First post in what I hope will be a series of entries where I share various OSM-related experiments I have conducted over the years.
A mapper asked whether an area should be mapped as grassland, scrub or heath. Since this kind of information is easier to extract from infrared imagery, and without knowing where the mapper is from, I pointed them to the global satellite dataset provided by the European Space Agency (OSM wiki link), viewable in the Copernicus browser.
The two most commonly used IR renderings are the CIR-NRG and CIR-NGR styles. Compared to a regular RGB (red, green, blue) picture, these false-color images drop the blue channel entirely and shift everything else down: the infrared band is shown as red, and the original red and green bands fill the remaining green and blue slots. In NRG, the original red becomes green and the original green becomes blue; NGR is the other way around, with green staying green and red becoming blue.
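To make the remapping concrete, here is a small sketch of both styles as plain functions (the helper names are mine, not part of any Copernicus API); each takes the visible red, green and near-infrared reflectances of a pixel and returns the displayed RGB triple:

```javascript
// NRG: NIR -> red, original red -> green, original green -> blue
function toNRG(red, green, nir) {
  return [nir, red, green];
}

// NGR: NIR -> red, green stays green, original red -> blue
function toNGR(red, green, nir) {
  return [nir, green, red];
}
```

In both styles the displayed red channel carries the near-infrared signal, which is why healthy vegetation shows up red.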
It turned out that while the Copernicus browser does have multiple infrared layers (including one simply called False Color, which is NGR), the infrared channel is relatively overexposed compared to visible light, so the default configuration showed nothing but red (IR) patches on black earth. The solution was to build a custom rendering with linear adjustments.
In hindsight, the Copernicus browser has a button called “Effects and advanced options applied” where one could apply most of those transformations directly, without a custom layer.
Anyway, a custom layer is basically JavaScript code that runs in the client’s browser for every single pixel of the image and calculates the RGB values for each. Here’s the tutorial.
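For context, the general shape of such a script (an “evalscript”, following the Sentinel Hub V3 conventions) is two functions: `setup()` declares which bands are needed, and `evaluatePixel()` is called once per pixel. This minimal true-color sketch is not the tutorial’s script, just the contract; the 2.5 gain is the conventional brightening factor for Sentinel-2 true-color display:

```javascript
//VERSION=3
// Minimal evalscript sketch: setup() declares inputs/outputs,
// evaluatePixel() computes the display values for one pixel.
function setup() {
  return {
    input: ["B04", "B03", "B02"], // red, green, blue reflectances
    output: { bands: 3 }
  };
}

function evaluatePixel(sample) {
  // Multiply by 2.5 so typical land reflectances fill the 0..1 display range
  return [2.5 * sample.B04, 2.5 * sample.B03, 2.5 * sample.B02];
}
```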
I haven’t figured out the most feasible way to add pictures to diary posts yet.
- Go to https://browser.dataspace.copernicus.eu/
- Click on the green diagonal arrow (↗️) on the side panel to see the latest images
- Select Layers -> Custom (at the bottom of the list) -> Custom script (rightmost tab)
- Paste the script and click Apply below the textbox.
- Colour balance and lightness can be tweaked using the sliders under the “Effects and advanced options applied” button
Credit for writing the function goes to a Large Language Model.
```javascript
//VERSION=3
// Forestry false-color (NGR -> RGB) with percentile-based piecewise stretch

// ---- Paste percentile values from histograms here ----
const P = {
  B08: { p10: 0.03, p50: 0.30,  p90: 0.64 }, // NIR
  B03: { p10: 0.02, p50: 0.063, p90: 0.15 }, // Green
  B04: { p10: 0.01, p50: 0.041, p90: 0.12 }  // Red
};

function setup() {
  return {
    input: [{ bands: ["B08", "B03", "B04", "dataMask"], units: "REFLECTANCE" }],
    output: [
      { id: "default", bands: 4 }, // RGB + alpha
      // bands must match the length of the array returned for "index" below
      { id: "index", bands: 3, sampleType: "FLOAT32" }
    ]
  };
}

/**
 * Percentile stretch maps v so p10 -> 0, p50 -> 0.5, p90 -> 1.
 */
let p = (v, p10, p50, p90) =>
  v <= p10 ? 0 :
  v >= p90 ? 1 :
  v <= p50 ? 0.5 * (v - p10) / (p50 - p10)
           : 0.5 + 0.5 * (v - p50) / (p90 - p50);

function evaluatePixel(s) {
  // NGR -> RGB mapping for forestry false color:
  // R <- NIR (B08), G <- Green (B03), B <- Red (B04)
  let r = p(s.B08, P.B08.p10, P.B08.p50, P.B08.p90);
  let g = p(s.B03, P.B03.p10, P.B03.p50, P.B03.p90);
  let b = p(s.B04, P.B04.p10, P.B04.p50, P.B04.p90);
  return {
    default: [r, g, b, s.dataMask],
    // Index is used for building the histogram - helps to guess 10-50-90 cutoffs
    index: [
      //Math.min(70, Math.floor(s.B08 * 100)),
      //Math.min(16, Math.floor(s.B03 * 100)),
      //Math.min(16, Math.floor(s.B04 * 100)),
      r, g, b // Sanity check on output
      // The histogram does support a multi-element array, but the elements are
      // later simply concatenated into a single array
    ]
  };
}
```
The function turns the three input channels into adjusted RGB channels. Each channel is linearly stretched so that the value at p10 becomes 0.0, p50 becomes 0.5, and p90 becomes 1.0 (pX stands for the Xth percentile). There is also a second output series called index, which was used for manually determining the percentile values, since the application supports calculating a histogram for only a single series at a time.
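The stretch can also be tried in isolation, outside the Copernicus browser. This is the same piecewise mapping as the `p` function in the script above, runnable in plain Node.js:

```javascript
// Piecewise percentile stretch: p10 -> 0.0, p50 -> 0.5, p90 -> 1.0,
// linear between those anchors, clamped outside the p10..p90 range.
const stretch = (v, p10, p50, p90) =>
  v <= p10 ? 0 :
  v >= p90 ? 1 :
  v <= p50 ? 0.5 * (v - p10) / (p50 - p10)
           : 0.5 + 0.5 * (v - p50) / (p90 - p50);
```

For example, with anchors (0, 0.5, 1) the mapping is the identity on [0, 1], while with the skewed anchors used for the green band above, a reflectance equal to p50 = 0.063 still lands exactly at mid-grey 0.5.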
Originally published on OSM World discord on 2025-09-14
Discussion
Comment from rphyrin on 16 September 2025 at 09:16
Thank you for the script. I tried it myself, and it looks pretty cool. I’ve never actually coded a script to manipulate raw satellite imagery before. (In the past, I experimented with Google Earth Engine, but I didn’t really understand how to manipulate the raw data through scripting)
Now I see some interesting colors, but I don’t yet know how to interpret them. So I consulted ChatGPT.
It explained that red indicates healthy vegetation (which reflects strongly in near-infrared); bare soil or built-up areas appear bluish or cyan (since they reflect more in the red and green bands than in near-infrared); and water usually shows up dark (low reflectance in all bands).
In some cases, there are exceptions, such as clouds, which appear as very bright white (reflecting all bands) with dark patches nearby, caused by the cloud’s shadow.
Comment from fghj753 on 16 September 2025 at 19:26
I have used the IR layer mostly to get better insight into what kind of foliage is there - most commonly to tell whether a green patch in the visible light spectrum is mowed grass or longer grassland, and whether a forest is broadleaved or needleleaved.
My objective was to replicate the colours I have seen on the Estonian Land Board’s forestry layer, but apparently I couldn’t get the colours to match.
The sky often contains clouds, but at the upper left edge there should be a slider to adjust the cloud coverage limit. That way you would get a clearer picture, though it might be a view that is a couple of weeks or months older.
PS. I found a documentation page describing what bands Sentinel-2 has and how they are useful. The same documentation also covers other satellites and their capabilities. https://custom-scripts.sentinel-hub.com/custom-scripts/sentinel-2/bands/