Decoration, that’s what you need
As usual, I had a sense of guilt. On this occasion, it came from my neighbours putting up exuberant Christmas decorations: illuminated inflatable snowmen and Santas (slightly horrifying when they go flat), “meteor shower rain” lights, a faintly projected reindeer: that kind of thing. Historically, we’ve restricted ourselves to a tastefully decorated fake tree in the living room, which the cats can push over or chew. It’s a bit late to keep up with the Joneses, but I should show willing. For once, I remembered this in October, before decoration prices went up.
Being lazy and not fond of heights, the obvious solution involved decorations that project onto the side of the house. I checked my wall for darkness, and it seemed workable – although, being a genius, I’d ignored a tree blocking a streetlight, and (deciduous) trees are unaccountably less opaque in October (when I was planning) than at Christmas. Still, it would do. But I had a problem: I couldn’t find suitable decorations that weren’t expensive, rubbish, or both – and most needed mains power outside my house, which is somewhere between impossible and a massive pain in the socket. I briefly investigated building a snowfall laser projector, but I object to the aesthetics of green snow, and it seems impossible to buy cheap blue laser pointers: most blue lasers are the “burn a hole in things” variety. (I have a purple 50mW for astronomy, but it’ll take someone’s eye out, and I’m not using it near Heathrow – even if buying lots of them weren’t out of budget.)
The project
Then I had a light bulb moment (well, LED…). Online stores are flooded with portable video projectors cheap enough that putting one where it could be stolen in front of my house was an acceptable risk (it’s a fairly safe neighbourhood and I can point a security camera at it). Of course, their image quality sucks: they’re low resolution (480 x 272 in the case of one I picked up, and I’ve seen 320 x 240), maximum brightness is usually very limited, etc. – but for decorations, none of that matters. Some can run from a USB power brick, and for a small premium some have internal batteries, which avoids trailing a cable outside; the 2-3 hour run time is plenty for evening decorations. Most importantly, they’ll play a video from flash memory (microSD, USB, etc.). These are cheaper than some dedicated Christmas projectors, and much more flexible.
All I needed was an effect that would look adequate when thrown onto the side of my house by an iffy projector, in video form. That’s where CGI came in: I decided to render a snowstorm, and have the snowflakes settle on some geometry in the scene to produce an abstract effect. First up, research: I spent some time playing the Frozen Wilds part of Horizon Zero Dawn to see what makes for a plausible snowflake. Conclusion: you needn’t render a snowflake as a fractal (Frozen’s fractals everywhere notwithstanding), which got me out of texturing; a monochrome blob will do. They do need to vary, though, so a lot of slightly different blobs it was. I’d also need proper depth perspective, so the distant ones looked smaller, and a bit of a depth of field blur. And then the complicated bit: I needed the projection keystone-adjusted, to compensate for the projector pointing up at the wall at an angle (cheap projectors don’t do keystone, and don’t have the pixels to do it right anyway). I could render to a rectangle and then keystone-stretch that, but it’s wasteful and probably won’t sample very nicely, so I wanted to render warped.
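Concretely, “rendering warped” just means pushing each vertex through a second perspective transform after the usual 3D projection: a 2D homography fitted so that the corners of the ideal rectangle land on the corners of the keystoned quad the projector actually throws. A minimal sketch – the matrix here is illustrative, not fitted to my wall:

```cpp
// Sketch: keystone-warp a projected vertex with a 2D homography.
// In practice H would be fitted so the four corners of the rendered
// rectangle map to the four corners of the keystoned quad on the wall.
struct Vec2 { float x, y; };

Vec2 keystone(const Vec2 &p, const float H[3][3])
{
    float X = H[0][0] * p.x + H[0][1] * p.y + H[0][2];
    float Y = H[1][0] * p.x + H[1][1] * p.y + H[1][2];
    float W = H[2][0] * p.x + H[2][1] * p.y + H[2][2];
    return { X / W, Y / W };   // the second perspective divide
}
```

Because this happens per vertex, before rasterisation, the triangles are drawn directly into the warped shape and every projector pixel is spent on actual image.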
DIY
Never one to do things the easy way when I can reinvent the wheel, the lack of texture and the need for variety made me think “software rasteriser”. I only needed to draw triangles, and I didn’t need homogeneous depth interpolation – which is good, because I couldn’t get my head around how to do that with two independent perspective transforms – but I could work out how to transform the vertices, and getting a triangle into a buffer isn’t rocket science. So, I wrote a trivial triangle rasteriser, setting opacity in a single-channel (because snowflakes are white) float (fortunately that patent expired) frame buffer. I’ll see about using a GPU next year.
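The core of such a rasteriser fits in a few dozen lines – bounding box, edge-function tests at pixel centres, and an “over” blend of white into the float buffer. This is an illustrative reconstruction of the approach, not my exact code:

```cpp
#include <algorithm>
#include <cmath>
#include <utility>
#include <vector>

struct Vec2 { float x, y; };

// Signed area test: positive when p is to the left of edge a->b.
static float edge(const Vec2 &a, const Vec2 &b, const Vec2 &p)
{
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

// Accumulate a triangle's opacity into a single-channel float buffer.
void rasterise(std::vector<float> &fb, int w, int h,
               Vec2 v0, Vec2 v1, Vec2 v2, float opacity)
{
    if (edge(v0, v1, v2) < 0.0f) std::swap(v1, v2);  // force CCW winding

    // Clamp the triangle's bounding box to the buffer.
    int x0 = std::max(0, (int)std::floor(std::min({v0.x, v1.x, v2.x})));
    int x1 = std::min(w - 1, (int)std::ceil(std::max({v0.x, v1.x, v2.x})));
    int y0 = std::max(0, (int)std::floor(std::min({v0.y, v1.y, v2.y})));
    int y1 = std::min(h - 1, (int)std::ceil(std::max({v0.y, v1.y, v2.y})));

    for (int y = y0; y <= y1; ++y) {
        for (int x = x0; x <= x1; ++x) {
            Vec2 p = { x + 0.5f, y + 0.5f };      // sample pixel centres
            if (edge(v0, v1, p) >= 0.0f &&
                edge(v1, v2, p) >= 0.0f &&
                edge(v2, v0, p) >= 0.0f) {
                float &dst = fb[y * w + x];
                dst += (1.0f - dst) * opacity;    // "over" blend of white
            }
        }
    }
}
```

No depth interpolation and no clipping beyond the bounding-box clamp: snowflakes are small, so brute force over the bounding box is cheap.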
Now the snowflakes: for traditional reasons I made them hexagonal. Each vertex had a random radius and a random offset from 60° spacing, for a more organic look. I kept them planar, with a rotation around the vertical axis and a (random) rate of rotation. I gave them a central position and a speed vector (pointing down, obviously). To render, I overlapped a few copies of the same shape, with scale determined by Z distance, to give a subtle depth of field blur. That was a snowflake. They were small, so rendering them didn’t take many cycles.
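A flake generator in this spirit might look like the following – the jitter ranges are guesses for illustration; mine were tuned by eye:

```cpp
#include <cmath>
#include <random>
#include <vector>

struct Vec2 { float x, y; };

// A "wonky hexagon": six vertices near 60-degree spacing, each with
// its own jittered angle and radius.
std::vector<Vec2> makeFlake(std::mt19937 &rng, float meanRadius)
{
    constexpr float kPi = 3.14159265f;
    std::uniform_real_distribution<float> angleJitter(-0.2f, 0.2f);  // radians
    std::uniform_real_distribution<float> radiusJitter(0.7f, 1.3f);

    std::vector<Vec2> verts(6);
    for (int i = 0; i < 6; ++i) {
        float a = i * kPi / 3.0f + angleJitter(rng);
        float r = meanRadius * radiusJitter(rng);
        verts[i] = { r * std::cos(a), r * std::sin(a) };
    }
    return verts;   // drawn as six triangles (centre, v[i], v[(i+1)%6])
}
```

For the depth of field, the same shape is drawn a few times at slightly different scales and reduced opacity, with the spread growing with distance from the focal plane.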
Rendering small things at varying sub-pixel offsets between frames was likely to make them shimmer a bit more than would be nice. So, I used 16x (4 x 4) oversampling, with a pyramid filter so tiny objects moving between pixels would gradually contribute to both and blend between them rather than popping.
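Concretely: render at four times the resolution in each axis, then gather each output pixel from the 8 × 8 block of samples whose tent weight is non-zero, weighted by a pyramid centred on the output pixel. A sketch, assuming a 4× supersampled single-channel buffer:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Downsample a 4x4-oversampled buffer with a pyramid (tent) filter.
// Each output pixel gathers the 8x8 samples within one output pixel
// of its centre, so a tiny flake sliding between pixels fades from
// one to the next instead of popping.
void downsample4x4(const std::vector<float> &src, int outW, int outH,
                   std::vector<float> &dst)
{
    const int srcW = outW * 4, srcH = outH * 4;
    dst.assign(outW * outH, 0.0f);
    for (int oy = 0; oy < outH; ++oy) {
        for (int ox = 0; ox < outW; ++ox) {
            float acc = 0.0f, wsum = 0.0f;
            for (int dy = -2; dy <= 5; ++dy) {
                for (int dx = -2; dx <= 5; ++dx) {
                    int sx = ox * 4 + dx, sy = oy * 4 + dy;
                    if (sx < 0 || sy < 0 || sx >= srcW || sy >= srcH)
                        continue;
                    // Sample position in output-pixel coordinates.
                    float fx = (sx + 0.5f) / 4.0f - (ox + 0.5f);
                    float fy = (sy + 0.5f) / 4.0f - (oy + 0.5f);
                    float w = std::max(0.0f, 1.0f - std::fabs(fx)) *
                              std::max(0.0f, 1.0f - std::fabs(fy));
                    acc += w * src[sy * srcW + sx];
                    wsum += w;
                }
            }
            dst[oy * outW + ox] = acc / wsum;
        }
    }
}
```

A box filter over just the 4 × 4 block would still pop – a sample either belongs to a pixel or it doesn’t – whereas the overlapping tent footprints are what buy the smooth hand-over.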
I like to move it (move it)
For each frame, I added the direction delta to the centre position of the flake, rotated it by its rotation speed, and added a small random amount to the direction delta, giving something a bit like Brownian motion. The result… well, it’s not absolutely convincing, but it looks vaguely like a snowstorm. I kept 50,000 snowflakes in the scene, renewing 20 per frame (by randomly positioning them in the air above the frame and randomising everything again), which left enough settled snow while keeping plenty in flight. I’d intended to make flakes blend out before I reset them, but the twinkling effect isn’t objectionable, so I didn’t bother.
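Per flake, the update step amounts to a few lines; the structure and constants below are illustrative rather than my exact values:

```cpp
#include <random>

struct Flake {
    float x, y, z;      // centre position
    float vx, vy, vz;   // direction delta (vy points down)
    float angle, spin;  // rotation about the vertical axis
    bool  settled;      // landed flakes stop moving
};

void stepFlake(Flake &f, std::mt19937 &rng)
{
    if (f.settled) return;
    std::normal_distribution<float> jitter(0.0f, 0.002f);
    f.x += f.vx; f.y += f.vy; f.z += f.vz;  // drift along the direction delta
    f.angle += f.spin;                      // keep it twirling
    f.vx += jitter(rng);                    // random walk on the delta,
    f.vz += jitter(rng);                    // for the Brownian-ish wobble
}
```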
Now they had to hit something. I had the cunning plan of two maps: a height map covering the ground plan of the rendered box, and a depth map at screen resolution. Height would stop snowflakes from moving when they landed; depth made solid objects opaque, so flakes could disappear behind them. I considered drawing the depth map in Photoshop and deriving height from it, but I wanted snow to fall behind things, and I also didn’t trust my ability to draw. I therefore needed 3D geometry – but fortunately not much accuracy, so I could just keep triangles small and constant-shade them rather than interpolating depth. Generating the maps once and using them for all the frames of snow rendering meant that the rendering speed wasn’t relevant, so I could over-tessellate.
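The per-flake test against the two maps is then simple – something like this, where the height and depth lookups (and the field names) are placeholders for the real sampling code:

```cpp
struct FlakeState {
    float x, y, z;   // y up, z into the scene
    bool settled;
};

// Returns true if the flake should be drawn this frame.
bool testAgainstMaps(FlakeState &f, float heightAtXZ, float depthAtPixel)
{
    if (!f.settled && f.y <= heightAtXZ) {
        f.y = heightAtXZ;    // land on the surface...
        f.settled = true;    // ...and the height map stops it moving
    }
    // The depth map makes solid objects opaque: flakes that have
    // fallen behind the geometry at this pixel are not drawn.
    return f.z <= depthAtPixel;
}
```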
Shaping up (and back)
Fortunately, I only needed abstract shapes for snowflakes to land on, not photorealism – and certainly not shading. I planned several scenes over Advent – originally one a day, but that was ambitious, given that I wrote this on the 9th of December and had only just started. Triangles, cylinders and cones are enough to draw obvious scenes like houses, trees, etc., so I wrote a tessellator for those, extending the rasteriser to write to the depth and height maps (with a min/max update instead of blending).
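As an example, a cone tessellator in this vein is a dozen lines – a fan of thin side triangles around the apex, with the segment count cranked up since the maps are only built once. Illustrative, not my exact code:

```cpp
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };
struct Tri  { Vec3 a, b, c; };

// Split a cone into 'segments' side triangles fanned around the apex.
// Small triangles plus constant shading avoid interpolating depth.
std::vector<Tri> tessellateCone(Vec3 base, float radius, float height,
                                int segments)   // large, e.g. 256
{
    constexpr float kPi = 3.14159265f;
    Vec3 apex = { base.x, base.y + height, base.z };
    std::vector<Tri> tris;
    tris.reserve(segments);
    for (int i = 0; i < segments; ++i) {
        float a0 = 2.0f * kPi * i / segments;
        float a1 = 2.0f * kPi * (i + 1) / segments;
        Vec3 p0 = { base.x + radius * std::cos(a0), base.y,
                    base.z + radius * std::sin(a0) };
        Vec3 p1 = { base.x + radius * std::cos(a1), base.y,
                    base.z + radius * std::sin(a1) };
        tris.push_back({ apex, p0, p1 });
    }
    return tris;
}
```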
The flat ground was boring, so I initialised the height map with Perlin noise (thanks, Wikipedia, for the source). Real depth for that is complicated but, fortunately, the ground is nearly planar, so I could ignore it – it never occludes anything. Finally, I tweaked the flake motion tests, because drifting flakes stuck to the “side” of things that weren’t there, and added the ability to “step” the flakes across frames without rendering, so I could start several frames in without the rendering overhead – and then split up the rendering task across cores (or computers).
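The stepping trick makes the parallel split trivial: every worker runs the same simulation, fast-forwards to its first frame without drawing, and renders only its own slice. Schematically (step and renderFrame stand in for the real routines):

```cpp
void step();               // advance all flakes one frame, draw nothing
void renderFrame(int f);   // rasterise, downsample and write frame f

// Each worker (core or machine) renders frames [firstFrame, lastFrame).
// Every worker must seed its RNG identically, so the deterministic
// simulations agree and the slices join up seamlessly.
void renderSlice(int firstFrame, int lastFrame)
{
    for (int f = 0; f < firstFrame; ++f)
        step();                        // cheap fast-forward
    for (int f = firstFrame; f < lastFrame; ++f) {
        step();
        renderFrame(f);
    }
}
```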
Faux flakes, fir reels
My first (real) case is a tree made of a few cones and a cylindrical trunk. I’ll try to be clever in the future and give it some more realistic branches, but that’ll do for a start. I could add something moving to the scene (other than the falling flakes, little else moves after the first few seconds), but the map rendering is currently very simplistic and slow, so the more static I can keep it the better. I might keep the option to composite something moving with a static object, although I’ll need the ability to move “stationary” flakes along with whatever’s moving, if that’s what they hit. A fun project to follow up with.
Wrapping up: the present
And that was it: draw a simple scene to height/depth, generate random snowflakes, drop them on it, render them to a warped framebuffer, and write the anti-aliased result to a lot of .ppms (about 20,000) – fortunately at low resolution. (Important lesson: don’t leave a file browser window open on the directory where you’re generating that many files, or it’ll go into meltdown and saturate a CPU). A quick run through FFmpeg let me turn the result into a .mp4 file, and I was pleasantly surprised to find everything just worked when I stuck the USB stick in the projector. Fingers crossed it doesn’t get rained (or actually snowed) on and short out – tonight is soggy, so my experiment was brief.
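For reference, the FFmpeg step is a one-liner along these lines – the frame-name pattern and frame rate here are placeholders rather than my exact settings, and -pix_fmt yuv420p just keeps fussy players happy:

```
ffmpeg -framerate 25 -i snow%05d.ppm -c:v libx264 -pix_fmt yuv420p snow.mp4
```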
And that is how we celebrate the holidays, Imagination style.
Merry Christmas, everyone!