
Making environments load faster in three.js with FastHDR

Published by Felix Herbst
  • rendering
  • three.js
  • optimization
  • HDR

Ballroom Equirectangular Panorama loaded as FastHDR file

Most 3D scenes today use image-based lighting (IBL). They rely on a high‑dynamic‑range image (HDRI) to approximate environment light and reflections for physically based rendering (PBR). We found a way to make HDRI loading in three.js and Needle Engine fast, smooth, and memory‑friendly.

HDR environments in the FastHDR format can load 10× faster than EXR and up to 5× faster than UltraHDR in our tests. They also use around 95% less GPU memory and entirely avoid main‑thread stalls during loading. They’re faster, lighter, and built on open standards.

FastHDR is designed to work seamlessly with existing three.js and Needle Engine workflows, making it easy to integrate into your projects. If you want to get started quickly, you can download ready-to-use files from our website:

View and download FastHDR environments

The correct technical term for this format is “KTX2-supercompressed Prefiltered Mipmapped Radiance Environment Maps in UASTC HDR format”. We’ll go over what that means later in the article. It combines two main ideas:

  1. Pre-calculating diffuse lighting from an environment map.
  2. Compressing that texture in a GPU-friendly way.

FastHDR is just a name for a combination of technical concepts.

We chose the name “FastHDR” because the technical term is long, complex, and hard to remember. It’s not a trademark or proprietary tech!

The concepts behind image-based lighting

Equirectangular panoramas as environment maps

Environment maps are a bit different from regular textures: instead of describing how a surface looks, they describe how an environment looks – the entirety of a scene around the viewer. Another word for that is “panorama image”, or 360° image. There are different formats for storing such images, but the most widely used today is the equirectangular 360° image.

Basically, all 360 degrees around the viewer are unwrapped onto a rectangle – so they look like this:

Equirectangular panorama example – Ballroom

And here’s another example:

Equirectangular panorama example – San Giuseppe Bridge
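Conceptually, the unwrap is a simple direction-to-UV mapping: longitude becomes the horizontal axis of the image, latitude the vertical one. Here’s a minimal sketch of one common convention (three.js’s own shader code may differ in axis orientation):

```javascript
// Map a normalized 3D view direction to equirectangular UV coordinates.
// Assumes a right-handed coordinate system with +Y up and -Z forward;
// other conventions flip or offset the result.
function directionToEquirectUV(x, y, z) {
  // Longitude: angle around the vertical axis → horizontal coordinate [0..1]
  const u = 0.5 + Math.atan2(x, -z) / (2 * Math.PI);
  // Latitude: angle above/below the horizon → vertical coordinate [0..1]
  const v = 0.5 + Math.asin(y) / Math.PI;
  return [u, v];
}
```

Looking straight ahead lands in the center of the image, and looking straight up samples its top edge.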

Using an environment image as light source

These images can be loaded into 3D applications as environments, so they “wrap around” the scene again. Objects in the scene can then reflect what’s on that environment, and also be lit by it.

In technical terms, this is called image-based lighting (IBL), and one can differentiate between radiance (specular reflections) and irradiance (diffuse lighting). IBL is popular because it delivers high visual quality at low runtime cost and works great across devices.

This is how a scene, with spheres made of different materials, looks with image-based lighting from the Ballroom panorama above. You can see the reflections of the scene in the mirror in the middle, and you can also see that the lighting and reflections on the other spheres are consistent with the environment.

Scene lit with image-based lighting

And here’s the same scene with the Giuseppe Bridge applied to it. You can see that lighting comes from a different direction and has a different color temperature, and again the scenery is picked up correctly by the reflections.

Scene lit with a different image

Without an IBL, lights would have to be placed manually to illuminate the scene. Here, instead of an IBL we use one directional light and one spot light. It immediately looks “unrealistic”: the mirror in the middle doesn’t even feel like a mirror anymore. It takes a lot of work to get this to look “realistic”. Famously, for the first “Jurassic Park” movie, scenes were lit with over 50 individual lights – this was before the invention of image-based lighting.

Scene lit without image-based lighting

With image-based lighting, different lighting scenarios can be applied to scenes, just by changing the environment image. And when both the materials and the environment follow the physical principles used in physically-based rendering (PBR), an object always looks “plausible” under different environments:

ShaderBall object lit with various different environment images

Capturing lights with high-dynamic range images

High-dynamic range (HDR) images capture a wider range of light intensities than standard images. This is important for realistic lighting in 3D scenes, as in this way, the difference between, say, a lamp and the sun, can be expressed in the image data.

HDR images are typically stored in formats like OpenEXR (.exr) or Radiance HDR (.hdr), which can represent these bright values. Image formats like PNG and JPEG traditionally can’t do that, as they’re made to represent images for displays, but there are newer approaches that allow some dynamic range to be expressed in them as well.

Here’s how the Ballroom looks at lower exposure – you can see that the lamps and windows stay bright even when the rest of the scene is almost black. This high dynamic range enables accurate reflections and lighting in digital scenes. Compare this with the image at the beginning of the article, where the lamps don’t look particularly bright.

Reducing exposure on the Ballroom image, we can see the lamps and windows have bright intensities in this image

Rough reflections and blurry backgrounds

When working with physically-based materials, there is often a mix of surfaces in a scene: some are metallic, some are rough, others are smooth, with varying degrees of reflectivity and metalness. They interact with light and environments in different ways through reflections, coatings, matte surfaces, and transmission:

Scene lit with image-based lighting

Especially noticeable is that rough surfaces scatter light more, creating softer reflections, while smooth surfaces produce sharper, mirror‑like reflections. Getting that right in real time is computationally expensive; instead, we can precompute various filtered versions of the original image, and then blend between those versions depending on the roughness (scattering) of the material.

This technique is called Pre‑filtered Mipmapped Environment Maps (PMREM), and in Needle and three.js these prefiltered maps look like this:

A PMREM environment map with mipmaps in the CubeUV mapping. The metallic sphere on the right has varying roughness levels to demonstrate how the different levels from the PMREM are applied.

In three.js, this is called a CubeUV mapping; the equirectangular image is converted into the faces of a cube at various degrees of blur. When a shader wants to get the “right” reflection and lighting for a specific roughness, it can look into the right cube face at the right mipmap level to get a pretty accurate result.
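To illustrate the lookup idea, here’s a deliberately simplified roughness-to-mip mapping. This is only a sketch of the concept – three.js internally uses a more elaborate fitted curve, not this linear one:

```javascript
// Pick a mip level from material roughness: rougher surfaces sample
// blurrier (higher) mip levels of the prefiltered environment map.
function roughnessToMipLevel(roughness, maxMipLevel) {
  const clamped = Math.min(1, Math.max(0, roughness));
  return clamped * maxMipLevel;
}
```

A perfect mirror (roughness 0) samples the sharpest level, and a fully rough surface samples the blurriest one.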

Environments in three.js

In three.js, the environment texture drives lighting and reflections (scene.environment), while the background (scene.background) controls what you see behind your objects. You can use the same texture for both, or choose a different background for artistic reasons, like a gradient or a static photo backdrop. Often, the background uses the same texture as the environment, blurred via scene.backgroundBlurriness.

Image with blurred background

Making environments load faster

The current approach in three.js

If you look at three.js examples, almost all of them follow the same pattern:

  1. They load an environment in EXR or HDR format.
  2. They run it through a PMREM generator, which creates the prefiltered map layout shown above.
  3. The output of that generator is used as the environment, and often also as the background.

PMREM for backgrounds allows for pretty blur

The advantage of using a prefiltered map as a scene background is that the background can be efficiently blurred at runtime. Three.js has a property backgroundBlurriness for that purpose, and the Needle web component also has a handy background-blurriness attribute. It looks nice!

This approach has a number of downsides:

  1. EXR files are very large by web standards, and HDR files are even larger. A 2k environment map is easily 8–10 MB, and a 4k environment may be 20 MB. This results in longer loading times and increased bandwidth usage. Often, people end up using lower-resolution images to counter the extra size.

  2. PMREM generation is GPU-accelerated, but still takes additional computation time. This can be a bottleneck for real-time applications that need to load fast.

  3. EXR, HDR and UltraHDR are CPU-compressed formats. To upload them to the graphics card, they need to be unpacked into a full bitmap with zero compression. Together with the additional memory needed during PMREM generation, this means that a single 4k environment map can peak at and exceed 250 MB of GPU memory in “half float” format, and roughly double that for full float accuracy.

Calculating GPU memory size

As a rough guide, an unpacked 4k equirectangular HDR texture (4096×2048) at RGB half‑float (16 bits per channel) is ~48–50 MB just for the base image. PMREM generation then renders a CubeUV mip chain (six faces, multiple levels) with multiple intermediate render targets. Including those temporary buffers, memory use during generation can reach into the hundreds of megabytes on typical GPUs.
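The base-image figure above is simple arithmetic; a quick sketch:

```javascript
// Estimate the raw (uncompressed) size of a texture on the GPU.
function textureBytes(width, height, channels, bytesPerChannel) {
  return width * height * channels * bytesPerChannel;
}

// 4k equirectangular, RGB, half-float (2 bytes per channel)
const baseMB = textureBytes(4096, 2048, 3, 2) / (1024 * 1024);
// baseMB === 48 (MB) – before any mipmaps or PMREM render targets
```

Add mipmaps, six cube faces, and the intermediate render targets PMREM generation needs, and the peak climbs quickly from there.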

The current approach in Unity

Unity takes a somewhat different approach: it pre-convolves the skybox material whenever lighting for a scene is rebaked, and puts the results into cubemap mipmaps. This also takes time to precompute, and makes it non-trivial to switch to a different environment at runtime.

Let’s look at options to make this faster!

Making downloads faster: UltraHDR

Traditional HDR formats like EXR and Radiance HDR compress poorly over the network. LDR formats like JPEG/WebP compress extremely well but can’t store true HDR values.

A relatively new format, UltraHDR, bridges that gap by storing:

  • a base JPEG (LDR), plus
  • a log‑encoded gainmap JPEG and EXIF metadata describing how to reconstruct a single HDR image from these two LDR images.

UltraHDR does indeed significantly reduce download size. However, reconstructing the HDR image requires a considerable amount of CPU work to combine base + gainmap, and the final texture is still uploaded as uncompressed bitmap to the GPU. So UltraHDR improves download time, but neither loading time nor GPU memory are improved.
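To give a feeling for that CPU work: per pixel, the decoder roughly applies the gainmap as a multiplicative boost in log2 space. The sketch below is heavily simplified – the parameter names and defaults are illustrative, not the exact UltraHDR metadata fields:

```javascript
// Simplified per-pixel UltraHDR-style reconstruction:
// the gain value (0..1, from the gainmap JPEG) interpolates a log2 boost
// that is then applied multiplicatively to the LDR base color.
function reconstructHDR(sdr, gain, { minLog2 = 0, maxLog2 = 4, gamma = 1 } = {}) {
  const g = Math.pow(gain, 1 / gamma);              // undo the gainmap's gamma encoding
  const logBoost = minLog2 * (1 - g) + maxLog2 * g; // interpolate in log2 space
  return sdr * Math.pow(2, logBoost);               // apply the HDR boost
}
```

Doing something like this for every pixel of a 4k image on the CPU is exactly the kind of work you’d rather skip at load time.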

Reducing GPU memory: KTX2

KTX2 is a container for GPU‑ready (or GPU‑transcodable) textures. It compresses pretty well (but is still larger than JPEG in HDR use cases), so downloads are faster.

The biggest improvement is that KTX2 files stay compressed when uploaded to the GPU, they don’t need to be “unpacked” into a bitmap. They may have to be transcoded depending on the device, but by design those transcoded formats will still be GPU-compressed.

The result is much much lower GPU memory usage (95% less!), and almost zero time to upload to the GPU. Additionally, all the processing and transcoding happens in a separate thread (called a worker in the web), so the main thread and everything else going on while loading a scene simply happens in parallel.

So, KTX2 is awesome! Needle uses it by default to automatically optimize and compress all textures in glTF files for our users, making files smaller and more efficient. There are also multiple compression formats to choose from: either ETC1S (for smaller files and somewhat lower quality) or UASTC (for higher quality, but also larger files). As of 2025, KTX2 can also contain HDR content in the UASTC HDR format.

Turns out PMREM generation is part of the problem

This all sounds great – but when you load an environment map in KTX2 format, you’ll still be hit with PMREM generation. You end up using only about 20% less GPU memory, and still take some performance hit. For larger environments, you may use hundreds of MB of GPU memory, which is especially precious on mobile platforms like iOS and Quest, and the generation step may stutter.

Making loading even faster: precalculating PMREM

The PMREM generation process is deterministic: the same input produces the same filtered result. That means we can precompute the PMREM once, and ship the prefiltered map directly, in well-compressed KTX2 format.

Bringing precomputing and compressing together

Combine both ideas and you get FastHDR: a prefiltered PMREM stored as KTX2 with HDR data. This format brings together the best of all worlds: the file downloads quickly, decodes on a worker, and is guaranteed to stay compressed on the GPU. Startup is smooth, memory usage is low, and you skip runtime PMREM generation entirely.

In practice this also parallelizes better: the worker does the texture work while the main thread and GPU focus on loading your models.

So here it is – the technological marvel that are KTX2‑supercompressed prefiltered mipmapped radiance environment maps in UASTC HDR 4×4! And because that’s a rather hard‑to‑remember name, we have started calling this particular way of encoding an environment FastHDR.

Here’s a loading speed comparison, including download, transcoding, and GPU upload, between FastHDR and EXR at 4k resolution. FastHDR takes 1.4s, most of which is the file download. EXR takes its time, and ends after a whopping 8.5s.

Watch the full comparison video on YouTube.

Some cool numbers and performance comparisons

With FastHDR, higher resolutions become feasible for interactive use — where previously a 1k EXR was the norm, we can now use 2k or even 4k FastHDR files, at better performance and vastly reduced GPU memory usage.

FastHDR loading performance statistics

More statistics can be found on our FastHDR page: cloud.needle.tools/hdris.

Using FastHDR in your projects

Using FastHDR in Needle Engine

Needle Engine supports KTX2 environments out of the box. If you’re using our web component, you can point it at a FastHDR map directly:

<needle-engine
  background-image="https://cdn.needle.tools/static/hdris/ballroom_2k.pmrem.ktx2"
  environment-image="https://cdn.needle.tools/static/hdris/ballroom_2k.pmrem.ktx2">
</needle-engine>

You can learn more about the <needle-engine> web component and its handy attributes in our docs.

Why use Needle Engine?

Needle Engine adds a component system and lots of well-crafted opinions to three.js: how to load and optimize 3D models, attach components to objects, manage memory and scene complexity, physics and multimedia, and much more.

It also seamlessly handles IBL, progressive loading, workers and more out of the box, so you don’t have to worry about getting the details right. Plus, if you’re starting from Unity or Blender, Needle provides a familiar workflow and easy integration.

Using FastHDR in a three.js app

FastHDR works today – if you’re using a recent enough three.js version, you’re good to go! Let’s look at what you can do today to speed up environment loading in your app.

  1. Go to https://cloud.needle.tools/hdris
  2. Pick an environment you like
  3. Change your environment code to the following:
import { KTX2Loader } from 'three/examples/jsm/loaders/KTX2Loader.js';
import { CubeUVReflectionMapping } from 'three';

// ...

const loader = new KTX2Loader();
loader.setTranscoderPath('three/examples/jsm/libs/basis/');
loader.detectSupport(renderer);
loader.load('https://cdn.needle.tools/static/hdris/ballroom_2k.pmrem.ktx2', (texture) => {
    // This is important, as it tells three.js this file is already a PMREM
    texture.mapping = CubeUVReflectionMapping;
    scene.environment = texture;
    scene.background = scene.environment;
});

That’s it! This simple change gives you 50–90% faster loads, ~70% smaller downloads, dramatically lower GPU memory usage, and smoother startup (work happens off the main thread). In many cases, replacing a 1k EXR with a 2k or even 4k FastHDR still results in faster, smoother loads.

How to find the environment code in your three.js app

Quite likely, you’re already using image‑based lighting, because most three.js examples do.

The code you should replace with the lines above probably looks like this:

import { PMREMGenerator } from 'three';
import { EXRLoader } from 'three/examples/jsm/loaders/EXRLoader.js';

// ...

// EXR is a high‑dynamic‑range (HDR) image format
const loader = new EXRLoader();
loader.load('path/to/your.exr', (texture) => {
  // PMREM prefilters the environment for rough/smooth reflections
  const pmremGenerator = new PMREMGenerator(renderer);
  pmremGenerator.compileEquirectangularShader();
  const envRT = pmremGenerator.fromEquirectangular(texture);
  scene.environment = envRT.texture;
  // Optional: also show it as blurred background
  scene.background = envRT.texture;
  texture.dispose();
  pmremGenerator.dispose();
});

Creating files in the FastHDR format

We’re working on some tools to make it super easy to create files in the FastHDR format.

The goal is to provide a web tool for small-ish files (1k, 2k) that can generate the PMREM texture and compress it, all in the browser. Additionally, we’re going to release a command-line tool for large textures (4k and up), because compressing those doesn’t fit into browser memory limitations.

In the meantime, the three.js example for the EXRExporter is a good starting point to load an EXR file, and export a PMREM texture in the right format: three.js Example: EXRExporter.

Once you have that texture, you can run it through a built version of https://github.com/BinomialLLC/basis_universal to compress it into the KTX2 format with the following command:

./basisu -hdr_4x4 my_pmrem.exr

This is for developers

Building a program yourself can be scary – if you’re not familiar with that, you can just use the preprocessed files from https://cloud.needle.tools/hdris for now, and wait for better tooling to magically appear.

Current state and future work

We’ve published a website with 20+ FastHDR maps, preprocessed in 1k, 2k, 4k. They are ready to be used with your three.js projects, and of course also work in Needle and React-Three-Fiber. You can find the environment maps here:

View Environments at cloud.needle.tools/hdris

On the platform side, Needle Engine already supports loading FastHDR files, and our build pipeline will gain support for processing it as well. We’re also looking into making an easier browser‑based tool, but it might need some server-side processing due to browser memory limits.

We’re also reaching out across the three.js and KTX2/Basis ecosystem to support additional HDR transcode targets in the future (for example, UASTC HDR 6×6), which will further reduce file sizes. And beyond the web, it’s interesting to consider how we can bring the same fast, efficient runtime loading of environments to engines like Unity or Unreal.

Additional notes

Why the name “FastHDR”?

Technically, all of these pieces of technology have existed for a (relatively short) while now. Everything is standardized, implemented, and ready to use. But we think giving a specific way of encoding specific textures a name helps to create a common understanding and vocabulary around these concepts, making it easier for developers to communicate and collaborate.

It worked for UltraHDR – an arcane and complex file format became something you can discuss, share, and implement more easily. We hope the same will happen for FastHDR in the context of three.js and beyond.

HDR environment file types and how to load them in three.js

Image-based lighting works best with high-dynamic range images (HDR). Typical image formats like PNG or JPEG have a limited dynamic range – there’s a definition of what “white” is, and nothing can be brighter than that.

In contrast, high-dynamic range images contain a much wider range of brightness levels, allowing for more realistic lighting and reflections.

Brightness in HDR files

Brightness in HDR images goes beyond “white”: instead of being stored as a byte (a number from 0 to 255), it’s stored as a floating-point value (like 15.03, or 0.4). This is also called half-float or float format, depending on the accuracy used. As a point of reference, the sun is approximately 100,000× brighter than “white”. Some HDR formats can express that; others can “only” go up to around 65,000 for technical reasons.
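That ~65,000 ceiling comes from the half-float format itself: the largest finite 16-bit float value is 65504. A quick check:

```javascript
// Largest finite half-float (IEEE 754 binary16) value:
// (2 - 2^-10) * 2^15 = 65504
const halfFloatMax = (2 - Math.pow(2, -10)) * Math.pow(2, 15);
// halfFloatMax === 65504
```

Full 32-bit floats, as used by OpenEXR’s float mode, can go far beyond that – at the cost of doubled storage.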

File types for HDR images include:

  • .exr – OpenEXR is a high dynamic range (HDR) image file format developed by Industrial Light & Magic (ILM) for use in visual effects and computer graphics. It supports various lossless compression methods like ZIP and RLE, making it suitable for professional workflows. Loader: EXRLoader
  • .hdr – Radiance HDR is a simpler HDR image format, often resulting in larger file sizes due to less efficient compression. Loader: HDRLoader
  • .jpeg with gainmap – While JPEG is typically a low dynamic range (LDR) format, the addition of a gainmap (a second JPEG carrying brightness information) allows it to approximate HDR data. This format is increasingly used on Android devices, notably by Google as “UltraHDR”. Loader: UltraHDRLoader
  • .ktx2 – This is the format FastHDR uses. KTX2 is a container format for GPU textures, supporting various compression methods, including Basis Universal. It is designed for efficient transmission and storage of texture data, making it ideal for real-time applications. Loader: KTX2Loader

Each of these formats has a corresponding loader in three.js, such as EXRLoader, HDRLoader, and UltraHDRLoader. For best performance, consider using KTX2 files with KTX2Loader as described earlier in this article.

References and further reading

View and download FastHDR environments