Nov 24, 2013

Last time I covered HDR render targets, tone mapping and automatic exposure control. Now it’s time to simulate some camera imperfections that give the illusion that something in an image is brighter than it actually is on the screen.


The first effect to simulate is bloom. This is when the light from a really bright object appears to “bleed” over the rest of the image. This is an image with no bloom – the sun is just a white circle, and doesn’t look particularly bright:


No bloom

With a bloom effect the sun looks a lot brighter, even though the central pixels are actually the same white colour:


With bloom

Theory of bloom

Why does this happen? There is a good explanation on Wikipedia but this is the basic idea.

Camera lenses can never perfectly focus light from a point onto another point. My previous diagrams had straight lines showing the path of the light through the lens. What actually happens is that light (being a wave) diffracts through the aperture, creating diffraction patterns. This means the light from a single point lands on the sensor as a bright central spot surrounded by much fainter concentric rings, called the Airy pattern (the rings have been brightened in this picture so you can see them more easily):


Airy disk
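For reference, the intensity profile of the Airy pattern for an ideal circular aperture is a standard optics result (you never need this in the renderer, but it shows where the central spot and the rings come from):

```latex
I(\theta) = I_0 \left( \frac{2\,J_1(ka\sin\theta)}{ka\sin\theta} \right)^2,
\qquad k = \frac{2\pi}{\lambda}
```

where \(J_1\) is the Bessel function of the first kind, \(a\) is the aperture radius and \(\lambda\) is the wavelength. The zeros of \(J_1\) give the dark gaps between the rings.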

Usually this isn’t a problem – at normal light levels the central peak is the only thing bright enough to be picked up by the sensor, and it fits within one pixel. However, with very bright lights, the diffraction pattern is bright enough to be detected. For anything other than a really tiny light source the individual rings won’t be visible because they’ll all overlap and blur together, and what you get is the appearance of light leaking from bright areas to dark areas.

This effect is pretty useful for us. Because people are used to seeing bright objects bloom, by drawing the bloom ourselves we make the object appear brighter than it really is on screen.


The idea of rendering bloom is the same as Bokeh depth of field. Recall from the depth of field post that each pixel is actually the shape of the aperture, drawn at varying sizes depending on how in focus it is. So to draw Bokeh ‘properly’ each pixel should be drawn as a larger texture. To draw bloom ‘properly’ you would instead draw each pixel with a texture of the Airy pattern. For dim pixels you would only see the bright centre spot, and for very bright pixels you would see the rings as well.

That’s not very practical though so we can take shortcuts which make it much quicker to draw at the expense of physical accuracy. The main optimisation is to do away with the Airy pattern completely and use a Gaussian blur instead. When you draw many Airy patterns in neighbouring pixels the rings average out and you are left with something very similar to a Gaussian blur:


Gaussian blur

The effect we are trying to simulate is bright pixels bleeding over darker neighbours, so what we’ll do is find the bright pixels in the image, blur them and then add them back onto the original image.

To find the bright pixels in the image we take the frame buffer, subtract a threshold value based on the exposure and copy the result into a new texture:


The extracted bloom – the original image with a threshold value subtracted
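The bright-pass step can be sketched like this (a minimal CPU-side sketch in Python; in practice this runs in a pixel shader, and how the threshold is derived from the exposure is up to you):

```python
def extract_bright(pixels, threshold):
    """Bright-pass: subtract a threshold from each channel and clamp at zero.

    pixels is a list of (r, g, b) tuples in linear HDR space.
    Pixels dimmer than the threshold contribute nothing to the bloom;
    brighter pixels keep only the excess above it.
    """
    return [tuple(max(0.0, c - threshold) for c in px) for px in pixels]

# A dim pixel, a slightly-over-threshold pixel, and a very bright one:
frame = [(0.2, 0.2, 0.2), (1.5, 1.4, 1.0), (5.0, 5.0, 4.0)]
bloom = extract_bright(frame, threshold=1.0)
```

Only the excess brightness feeds into the blur, which is why dark parts of the scene stay crisp.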

Then we create more textures, each half the size of the previous one, scaling down the brightness slightly with each one (depending on how far you want the bloom to spread). Here are two of the downsized textures from a total of eight:


The 1/8th size downsized extracted bloom


The 1/64th size downsized extracted bloom
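The downsampling chain above can be sketched as follows (single-channel and pure Python for clarity; the 0.9 darkening factor is a made-up example of the kind of tweakable magic number discussed below):

```python
def downsample_half(img, w, h, darken=0.9):
    """Box-filter a w-by-h image (flat row-major list of floats) to half
    size, darkening slightly so the bloom fades as it spreads wider."""
    out = []
    for y in range(h // 2):
        for x in range(w // 2):
            s = (img[(2 * y) * w + 2 * x] + img[(2 * y) * w + 2 * x + 1] +
                 img[(2 * y + 1) * w + 2 * x] + img[(2 * y + 1) * w + 2 * x + 1])
            out.append(0.25 * s * darken)
    return out

def build_chain(img, w, h, levels=8, darken=0.9):
    """Build the chain of progressively half-sized, slightly darkened copies."""
    chain = [img]
    for _ in range(levels - 1):
        img = downsample_half(img, w, h, darken)
        w, h = w // 2, h // 2
        chain.append(img)
    return chain
```

On the GPU each level is just a render to a half-sized target sampling the previous one, so the whole chain is cheap.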

Because we’re not simulating bloom completely accurately, there are a few magic numbers we can tweak (like the threshold and downscaling darkening) to control the overall size and brightness of the bloom effect. Ideally we would work it all out automatically from the Airy disk and camera properties, but this method looks good enough and is more controllable to give the type of image you want.

Now that we have all the downsized textures, we need to blur each of them. I’m using an 11×11 Gaussian blur, which is soft enough to give an almost completely smooth image when they’re all added up again. A larger blur would give smoother results but would take longer to draw. The reason for doing the downscaling into multiple textures is that it is much quicker to perform small blurs on multiple small textures than it is to perform a massive blur on the original sized image.

After blurring, the two textures above look like this (and similarly for all the others):


Blurred 1/8th size bloom


Blurred 1/64th size bloom
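A Gaussian blur is separable, so the 11×11 filter is normally done as an 11-tap horizontal pass followed by an 11-tap vertical pass. A sketch of the kernel and one row pass (sigma of 2.0 is an assumed value, not from the post):

```python
import math

def gaussian_kernel(radius=5, sigma=2.0):
    """Build a normalised 1D Gaussian kernel; radius 5 gives the
    11-tap filter used for the 11x11 blur."""
    k = [math.exp(-(i * i) / (2.0 * sigma * sigma))
         for i in range(-radius, radius + 1)]
    total = sum(k)
    return [v / total for v in k]

def blur_row(row, kernel):
    """Convolve one row with the kernel, clamping at the edges.
    Running this over rows and then columns gives the full 2D blur."""
    r = len(kernel) // 2
    n = len(row)
    return [sum(kernel[j + r] * row[min(max(i + j, 0), n - 1)]
                for j in range(-r, r + 1)) for i in range(n)]
```

On the GPU the two passes are two fullscreen draws, each sampling 11 texels per pixel instead of 121.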

Then to get the final image we simply add up all of the blurred textures (simple bilinear filtering is enough to get rid of the blockiness), scale the result by some overall brightness value and add it back on top of the tonemapped image from last time. The end result will then be something like this, with obvious bloom around the sun but also some subtle bleeding from other bright areas, such as the bright floor:
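The compositing step can be sketched as below (single-channel; the blurred levels are assumed to have already been upsampled back to full resolution with bilinear filtering, and bloom_strength is another hypothetical tweakable):

```python
def composite(base, blurred_levels, bloom_strength=0.3):
    """Sum the blurred bloom levels, scale by an overall strength,
    and add the result onto the tonemapped base image.

    base and each level are flat lists of floats at full resolution.
    """
    out = []
    for i, b in enumerate(base):
        bloom = sum(level[i] for level in blurred_levels)
        out.append(b + bloom_strength * bloom)
    return out
```

Because the bloom is purely additive, pixels with no extracted brightness nearby are left exactly as the tonemapper produced them.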


The great thing about this is that you don’t need to do anything special to make the sun or other bright lights bloom – it’s all just handled automatically, even for ‘accidental’ bright pixels like intense specular highlights.

That’s not quite everything that you can do when rendering bright things. Next time I’ll describe that scourge of late-90s games – lens flare. (It looks better these days…)

Nov 13, 2013

Last week we went to Reykjavik in the hope of finally spotting the aurora borealis. It was my fourth time in Iceland with no luck on the previous visits (it was generally cloudy) so I was hoping for good weather. Luckily the day we arrived it was completely clear all night, so we were hopeful as we headed out in the car after dinner. As soon as we’d left the lights of the town it was apparent that there was something happening as there was a faint, slightly green, streak across the sky, a little way above the northern horizon.

There were still street lights on the road at this point, so we carried on out into the wilderness and found a layby to stop in (where there was already a coach tour parked up, but it was all we could find in the dark). Away from all the light pollution the aurora was much more visible, and started to grow and move around a bit. Success at last!

There was something in the sky for the whole three hours we were out there, sometimes getting stronger and sometimes fading away. It’s a lot fainter in real life than you see in photos, and the movement is a lot slower than you see in videos, but it’s still really impressive and beautiful.

I wasn’t planning on taking any photos as it’s really hard without decent equipment, but as the display was lasting so long I decided to have a go anyway. This was using a shutter time of 6 seconds, and maximum aperture and ISO. The photos are really quite bad, but they give an idea of what it looked like! The reflections are from the roof of the car, which made do as a poor tripod substitute…