Archive for April, 2010

RENDERING LIMITATIONS / LIGHTING RIG

April 29, 2010

Over the last two weeks I have explored the possibility of using Maxwell Render to output my short film. My tests have shown positive results; however, if I choose this rendering engine, some major elements of the animation will need to be reconsidered. This sudden change has raised some technical issues surrounding my lighting rig that must be addressed immediately if I am to stay on track.

Because of Mental Ray’s long-standing integration into Maya and its resulting flexibility, I could essentially rig any shader, geometry or particle to multiple light sources in world space. In Maxwell, lights are created by applying emitter materials to geometry, from which real-world photometric data is used to light the scene. Unfortunately this does not allow me to connect or ‘rig’ a light source to drive particles or simulations in the way I have been developing. At first this looked like a major problem. Then, on gathering my thoughts and conscientiously going over my intentions, I realised not only the potential but also the benefits of using only environmental Image Based Lighting in my scene.

Reading back over my blog, I revisited my Research Paper on HDRI and perceptual realism. I realised that this method was very much present in my practical work, as almost all of my scenes are lit with environmental IBL alongside standard Maya lights. In Maxwell I found that creating an emitting light source in the scene doubled the render times, which at this point in production is too heavy a hit to my timetabled workflow. When using only IBL the renders are quicker and the illumination is highly effective. So the conclusion is that I will light my scene with a single HDRI of an indoor environment containing multiple artificial light sources. Getting the right amount of perceptually artificial light in my scene is very important, particularly because the film deals with the prolonged effects of unnatural light.

I found these stats, which highlight the amount of light around us in different environments; they are taken from:
http://jrscience.wcp.muohio.edu/nsfall00/FinalArticles/Isourmodernizingcultureki.html

Normal room 100-200 lux*
Brightly lit office or grocery store 1,000-2,000 lux*
Cloudy, January noon in Chicago 4,000 lux*
Spring sunrise 10,000 lux*
Summer early afternoon 100,000 lux*

*Lux is a measure of illuminance, the intensity of light falling on a surface (Rea 1993).


IS IT OVER

April 28, 2010

I’m involved in a group exhibition called Is It Over, held in a WW2 bunker behind Cafe Otto in Dalston. It opens next Thursday, 6th May, and will run for 5 days. The premise of the show is to invite artists to create work responding site-specifically to the disused bunker, which has lain untouched for 60 years. So far my work has taken 105 hours to render 220 frames (out of 400); I’m just hoping it will finish without error by next Wednesday. I will post an entry explaining the work and its processes shortly.

MAXWELL RENDER / PRODUCTION WORKFLOW

April 28, 2010

The process of rendering within computer graphics has a long history of attempting to simulate real-world objects. For this to occur, interactions of light and object have been at the centre of generating believable imagery that adheres to the physical laws of electromagnetic radiation. My whole area of study has been consistently focused on this process. Therefore, when I discovered an advanced lighting renderer for Maya, it felt only natural to focus my attention on it.

Maxwell Render is a rendering engine based on the mathematical equations governing light transport, meaning that all elements, such as emitters, materials and cameras, are derived from physically accurate models. Maxwell Render is unbiased, so no tricks are used to calculate the lighting solution in every pixel of a scene; the result will always be a correct solution, as it would be in the real world. Maxwell Render can fully capture all light interactions between all elements in a scene, and all lighting calculations are performed using spectral information and high dynamic range data.

I found that when using Mental Ray to create photorealistic materials I was spending an extensive amount of time building complex shaders through sprawling node trees in Maya. Even after hours of testing and tweaking, the results were varied and generally left me feeling a little unsatisfied. With Maxwell, the results are far easier to achieve and, in my opinion, appear more perceptually real. This can be put down to many things, including linear gamma correction, physical camera distortions and real global illumination effects. However, with all these positive points come a variety of performance downsides. The render times are considerably longer than Mental Ray’s, which is a major factor, especially in the world of animation. Even the 2:30min film I am currently working on will need a considerable length of time to render; here’s my breakdown:

– Approximate length of short film: 2:30 mins, therefore 150 seconds.
– 150 s at 24 frames per second = 3,600 frames in all.
– Averaging 30 minutes of render time per frame (720p), i.e. 2 frames p/hr = 48 frames p/day.
– 3,600 / 48 = 75 days of render time.

Realistically I have under 60 days to finish this film, so already I’ve overshot the mark. If I bring it down to, say, 2 minutes and shorten the render time to 20 minutes per frame, I think this could be done in around 40 days of constant rendering on one machine. To break this down further: I already have one i7 quad-core Mac at home, which will do the majority of these batch renders. Luckily the guys at work have offered a machine or two, when they are not being used over the weekends, to help finish the job; excellent news that may save my life.
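The arithmetic above reduces to one small calculation. As a quick back-of-the-envelope sketch in Python (the figures are simply the estimates from the text, not measured render times):

```python
def render_days(length_s: float, fps: float, minutes_per_frame: float) -> float:
    """Days of continuous rendering for a film of the given length."""
    frames = length_s * fps                      # total frames to render
    frames_per_day = 24 * 60 / minutes_per_frame # frames one machine finishes per day
    return frames / frames_per_day

# Original plan: 2:30 film at 24 fps, ~30 min per 720p frame.
print(render_days(150, 24, 30))  # 75.0 days

# Revised plan: 2:00 film, ~20 min per frame.
print(render_days(120, 24, 20))  # 40.0 days
```

Borrowed weekend machines divide the day count further, since the frames split cleanly across computers.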

LIGHT DRIVEN PARTICLE SIMULATION / V.02

April 8, 2010

In order to take the lighting rig a step further I needed to consider the inverse-square falloff of real-life illumination when affecting the particles. With help from a colleague at work we scripted a creation expression within the per-particle attributes that defined the scale of each particle based on the position of the light and its falloff intensity. As the moving light approaches the particles they balloon in size and change colour, only returning to their original state when outside the falloff radius.

While this all currently appears rather abstract, the intention is to use a simulation like this within my film to illustrate the transformative effects of photoreception in biological organisms. These tests are effectively helping to further develop the physical interaction of a light source with any given object in my scene. I think the next phase will be for me to demonstrate how texture and geometry can be driven in a similar way to the particles shown here.

The expression below shows how the position and intensity of the light source (pointLightShape1) control the size (radiusPP) and colour (rgbPP) of the particles (nParticleShape1).
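The original Maya expression has not survived in this archive, so here is a minimal sketch of the logic it describes, written in Python rather than MEL. The base radius, colour, falloff radius and the 5x ballooning factor are hypothetical placeholders, not values from the original rig:

```python
import math

def particle_attrs(particle_pos, light_pos, intensity, falloff_radius,
                   base_radius=0.1, base_rgb=(1.0, 1.0, 1.0)):
    """Per-particle radius (radiusPP) and colour (rgbPP) driven by a light.

    Inside the falloff radius the particle balloons in size and shifts
    colour in proportion to the inverse-square attenuated intensity;
    outside it, the particle keeps its original state.
    """
    d = math.dist(particle_pos, light_pos)
    if d >= falloff_radius:
        return base_radius, base_rgb  # untouched outside the falloff radius

    # Inverse-square attenuation, clamped to avoid blow-up as d -> 0.
    atten = intensity / max(d, 1e-3) ** 2
    influence = min(atten, 1.0)

    radius = base_radius * (1.0 + 4.0 * influence)  # balloon up to 5x
    rgb = (base_rgb[0],
           base_rgb[1] * (1.0 - influence),
           base_rgb[2] * (1.0 - influence))         # shift white toward red
    return radius, rgb
```

In Maya this logic would live in a per-particle expression evaluated for every particle each frame, reading the light's translate attributes and writing radiusPP and rgbPP.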