Teapot Rendering – 2016

Most of my programming work for this submission actually went into writing a Blender plugin so I can set up scenes and animations in Blender and export them to my renderer's file format for rendering. Without this plugin it would have been very difficult to set up an animation as complicated as this year's submission. The plugin is open source and available on GitHub; it still has many limitations and bugs, but I'm working on improving it.

I’ve also implemented the Beckmann and GGX microfacet models described in Microfacet Models for Refraction through Rough Surfaces by Walter et al. to improve my material model. Over the winter break last year I spent some time implementing image-parallel distributed rendering, where parts of the image are rendered in parallel by different computers, which I discuss in detail here.
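
To give a concrete sense of what those microfacet models involve, here is a small sketch of the GGX and Beckmann normal distribution functions D(m) from the Walter et al. paper, written as standalone Rust functions. The function names and the scalar `cos_theta_h`/`alpha` parameterization are illustrative assumptions, not tray_rust's actual API.

```rust
use std::f32::consts::PI;

/// GGX normal distribution function D(m) from Walter et al. 2007.
/// `cos_theta_h` is the cosine of the angle between the surface normal and
/// the microfacet (half) direction, `alpha` is the roughness parameter.
fn ggx_d(cos_theta_h: f32, alpha: f32) -> f32 {
    if cos_theta_h <= 0.0 {
        return 0.0; // back-facing microfacets contribute nothing
    }
    let c2 = cos_theta_h * cos_theta_h;
    let denom = c2 * (alpha * alpha - 1.0) + 1.0;
    alpha * alpha / (PI * denom * denom)
}

/// Beckmann normal distribution function D(m) with the same parameterization.
fn beckmann_d(cos_theta_h: f32, alpha: f32) -> f32 {
    if cos_theta_h <= 0.0 {
        return 0.0;
    }
    let c2 = cos_theta_h * cos_theta_h;
    let tan2 = (1.0 - c2) / c2;
    (-tan2 / (alpha * alpha)).exp() / (PI * alpha * alpha * c2 * c2)
}

fn main() {
    // Evaluate both distributions at a 30 degree half-angle for a
    // moderately rough surface.
    let cos_theta_h = 30.0f32.to_radians().cos();
    println!("GGX: {}", ggx_d(cos_theta_h, 0.3));
    println!("Beckmann: {}", beckmann_d(cos_theta_h, 0.3));
}
```

At the same roughness GGX has a heavier tail than Beckmann, which gives rough specular highlights a softer falloff.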
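
And as a minimal sketch of the image-parallel work division: split the frame into small blocks and give each node its own set of them to render. The block size, function names, and contiguous block assignment below are assumptions for illustration, not the actual scheduling code.

```rust
const BLOCK_DIM: usize = 8;

/// Return the (x, y) pixel origins of the image blocks assigned to `node_id`
/// out of `num_nodes` nodes, for an image of `width` x `height` pixels.
fn blocks_for_node(width: usize, height: usize, node_id: usize, num_nodes: usize) -> Vec<(usize, usize)> {
    let blocks_x = (width + BLOCK_DIM - 1) / BLOCK_DIM;
    let blocks_y = (height + BLOCK_DIM - 1) / BLOCK_DIM;
    let total = blocks_x * blocks_y;
    // Give each node a contiguous range of blocks; the first `remainder`
    // nodes each take one extra block so the whole image is covered.
    let per_node = total / num_nodes;
    let remainder = total % num_nodes;
    let start = node_id * per_node + node_id.min(remainder);
    let count = per_node + if node_id < remainder { 1 } else { 0 };
    (start..start + count)
        .map(|b| ((b % blocks_x) * BLOCK_DIM, (b / blocks_x) * BLOCK_DIM))
        .collect()
}

fn main() {
    // Example: which blocks of a 1920x1080 frame does node 2 of 4 render?
    let blocks = blocks_for_node(1920, 1080, 2, 4);
    println!("node 2 renders {} blocks, starting at {:?}", blocks.len(), blocks[0]);
}
```

However the work is divided, some process still has to gather the finished blocks back into the final frame; the details of how that works in tray_rust are in the post linked above.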

The animation contains a few different standard models in addition to ones I created:

I also use a mix of analytic and measured materials; the measured materials come from the MERL BRDF Database. The different Blender scenes I made to set up and animate the short can be downloaded from my Google Drive. They won't have the same materials or light settings as in the video, though, since I still have to set those up by hand in the scene file, but the exported scene files are also in the drive, as the JSON/OBJ file pairs in the subdirectories.

Open Source!

This animation was rendered with tray_rust, my open source path tracer written entirely in Rust. Check out the source on GitHub, where you can also find the Blender plugin I wrote to export the scenes for this short.

Stills

Here are some stills highlighting a few of the more interesting frames from the short.