July 3, 2010

For the show I’ve printed a catalogue of images and thoughts based on my exploration of perceptual realism. It is a visual journal investigating the dichotomy between the photographic and the computer-generated image, of which I’m printing 100 copies to hand out instead of postcards at the degree show.

Each page shows a three-tone render of the synthetic model on one side and its integration into the photograph on the other. It is a practical example of how the indexical nature of the photographic image helps position synthetic objects within the same ontological claim to truth. I collected and modelled these elements during the research for my thesis, so it’s pleasing to showcase them in printed form.



July 2, 2010

I’ve always had a clear idea about how to present my final piece; however simple it may be, it is important to explain my proposed methods.

The animation is being rendered at 720p HD rather than full 1080p because of the increased render times such a jump in resolution would incur. The frames are already taking an hour each to render, and that is only possible by outsourcing to Rendernet, a render farm in Sweden. With my project exploring hyper-realism in computer graphics it would make sense to showcase the work at the highest possible resolution. I could use a full-HD iMac, but in my opinion computer monitors are a far less engaging medium for an audience than a projector. Unfortunately a high-definition projector is too expensive to hire, so I’ve decided to use a standard-definition projector hired from the central loans store. I feel that the loss in resolution will be compensated for by the scale and impact of the projection. A pair of headphones plugged into the DVD player will allow the viewer to fully immerse themselves in the short film.

The film is 1:14 long and will loop continuously throughout the exhibition. Because it documents the 24 hours of light in Piccadilly Circus, this loop should feel rhythmic and circadian. It will be interesting to see how the loop disrupts the linear disposition of the work, giving the audience no beginning or end. Considering I only have half of my renders back so far, it’s unfortunate that I won’t have much time to examine and reflect on this looping technique before next week.


June 24, 2010

24-hour time-lapse of Piccadilly Circus reflected by the HDRI sphere.


June 21, 2010

For the texturing I used 4K and even 8K UV maps, knowing full well I would be positioning the camera very close to the meter. This level of detail has succeeded in creating a believable physical object, made plausible by Maxwell’s simulation of a real film camera.

Rather than animate a digital lux meter in the film, I was keen to remain true to the dialogue between photography and CGI. The light meter is loosely based on an old General Electric model, connecting analogue functions to the digital present. I changed the logo to EV (exposure value) to stay in keeping with the photographic disposition of the film. When illuminating the scene I had to manually adjust the f-stop, ISO, shutter speed and exposure values of the Maxwell camera to simulate the photographic action. From these settings I animated the light meter’s needle for each sequential HDRI. It seemed quite strange that I was using a virtual camera to calculate the response of a light meter, an action that would normally happen in reverse.
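The relationship between those camera settings and the meter reading can be sketched with the standard exposure-value formula. This is not my actual Maxwell or Maya setup, just a minimal Python illustration; the dial range and sweep angle in `needle_angle` are hypothetical.

```python
import math

def exposure_value(f_number: float, shutter_s: float, iso: float = 100.0) -> float:
    """EV relative to ISO 100: EV = log2(N^2 / t) + log2(ISO / 100)."""
    return math.log2(f_number ** 2 / shutter_s) + math.log2(iso / 100.0)

def needle_angle(ev: float, ev_min: float = -2.0, ev_max: float = 18.0,
                 sweep_deg: float = 120.0) -> float:
    """Map an EV reading onto a hypothetical dial sweeping sweep_deg degrees."""
    t = (ev - ev_min) / (ev_max - ev_min)
    return max(0.0, min(1.0, t)) * sweep_deg

# 'Sunny 16' as a sanity check: f/16 at 1/125 s, ISO 100 gives roughly EV 15
```

Once each HDRI’s camera settings are dialled in, a function like this would give the needle rotation to keyframe for that segment.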




June 19, 2010

I’ve completed a final edit of the film, which can be seen as a playblast below. The film is 1:14 long, with each HDRI lasting 4 seconds and sequentially lighting both the dynamic wave and the light meter. In the end I decided to overlap each HDRI by a second, allowing me to blend the renders together in post for a seamless change in environmental illumination.
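The one-second overlap amounts to a simple linear cross-fade in the edit. A rough sketch of the timing logic (segment length, overlap and count are taken from above; the linear ramp itself is my assumption):

```python
def hdri_weights(t: float, n: int = 24, length: float = 4.0, overlap: float = 1.0):
    """For a time t in seconds, return ((idx_a, w_a), (idx_b, w_b)): the two
    segments visible at t and their blend weights. Segments are `length` s
    long and start every `length - overlap` s, so consecutive segments share
    a 1 s linear cross-fade."""
    stride = length - overlap
    i = min(int(t // stride), n - 1)
    u = t - i * stride                 # local time within segment i
    if i > 0 and u < overlap:          # still fading out the previous segment
        w = u / overlap
        return (i - 1, 1.0 - w), (i, w)
    return (i, 1.0), (i, 0.0)

# total running time: 23 strides of 3 s plus the final 4 s segment = 73 s,
# which matches the 1:14 cut to within a second
```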

I’ve archived the whole project within Maya and will shortly upload it to Rendernet, who are fortunately rendering it for me. With all the high-res textures and dynamics caches needed in my scenes, the file exceeds a massive 2.5 GB. If all goes to plan I should have the raw renders back mid next week to edit, comp and lay sound.

I’m collaborating with Zai Tang, who is very kindly creating the sound design. He was on the MADA course last year, where he investigated the relationship between soundscapes and our perception of the urban environment. Although based in Singapore, he has some sound recordings of Piccadilly Circus that will help drive the experiential and immersive quality of the film.


June 9, 2010

Here are some early developments of the wavelength simulation for Photoperiod Piccadilly. The main objective is to simulate a natural, organic motion while maintaining control of the peaks and frequencies of each wave. From the red neon-lit night to the bright midday environment, this crescendo of illumination is being visualized through the dynamically changing wavelengths.

In Maya, dynamic curves are created with linear and smooth stiffness values, allowing a fluid rippling movement when turbulence and gravity are applied. As I mentioned before, the simulation cannot be driven directly by the varying luminance values in the scene because of Maxwell’s limited integration with Maya’s programming. While this removes a direct connection between physical light and virtual space, I am still happy with the conceptual process of simulating such a response.
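Since the renderer can’t drive the dynamics, the lux-to-amplitude mapping has to be keyed by hand. Conceptually it might look like the following sketch, where the logarithmic response (perceived brightness scales roughly with log illuminance) and the normalisation bounds are my assumptions, taken from the extremes of the hourly readings:

```python
import math

def lux_to_amplitude(lux: float, lux_min: float = 25.0,
                     lux_max: float = 118000.0, amp_max: float = 1.0) -> float:
    """Normalise an illuminance reading onto [0, amp_max] on a log scale,
    suitable for keyframing wave amplitude. The bounds come from the
    night-time floor and midday peak; the log mapping is an assumption."""
    span = math.log10(lux_max) - math.log10(lux_min)
    t = (math.log10(lux) - math.log10(lux_min)) / span
    return max(0.0, min(1.0, t)) * amp_max
```

Each hourly reading would then yield one amplitude keyframe, rising and falling with the day.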


June 4, 2010


The film uses real-world lighting data over a single day to show the fluctuating intensity of environmental light in Piccadilly Circus. Twenty-four High Dynamic Range Images are used sequentially to illuminate a perceptually real synthetic environment, simulating the length of day and night: the Photoperiod Effect. Through synthetic manipulation, a dynamic wave responds to the changes in lux (illuminance) values throughout the day as both artificial and natural light fill the space. Photographic exposure values are calculated by a virtual light meter, connecting physical actions with simulated experiences. With photography’s ontological claim to truth, HDRI creates a physically accurate lighting probe that in turn renders a simulacrum of hyper-real imagery. The indexical nature of photographic realism becomes inherently problematic as the analogue and digital become perceptually unified. Site-specific environmental illumination is captured and recorded with the intention of simulating a physical response within a virtual space. It is this methodological approach to lighting as a creative process that drives my work.


May 22, 2010

By the time I got home early on Thursday morning I had not slept in 41 hours. By then I was in a sort of dreamlike state, going over the events of the previous day in my head, hoping I had collected enough data for my project and wondering if the experience had been worthwhile. All in all it was extremely interesting, particularly the time between 4:30 and 6:00 a.m. when I was the only person in the junction. Then seeing how busy it became during the afternoon made me realise how apt the phrase ‘it’s like Piccadilly Circus’ is, commonly used in the UK to describe a place or situation that is extremely busy with people. At one point a curious man asked whether I worked for Google Earth; at least it’s good to know my actions appeared genuinely professional.

Looking over the photographs I’ve noticed a couple of variables I should consider if the results are to remain consistent. The first is a slight unbalancing of the camera during each exposure which, if ignored, will create a blurred, jittery HDRI, which is not what I want. I was also very conscious of the sphere getting dirty over the 24 hours as I constantly moved it around. The results of the last few hours are a little murky; however, this wasn’t as bad as I had predicted and shouldn’t affect the illumination of my scene in 3D.

I’m using a program called Photomatix to convert the images into HDRIs (also referred to as radiance maps). I had intended to do this in Photoshop, but there is no function that allows me to ‘unwrap’ the mirrorball into a flat 360-degree panoramic image. After photographing the sphere from the side closest to the neon signs I did the same for the other side, facing the memorial. I thought this would allow for a fairly decent 360-degree view of Piccadilly Circus when both sides were unwrapped and stitched together. However, with the final HDRIs next to each other the stitching became a little tricky, as both images were warped and felt out of place when aligned. Ultimately I was pretty happy with how both images had unwrapped from the mirrorball in Photomatix. The ‘pinching’ of the corners wasn’t as severe as I expected, leading me to the conclusion that I only need to use the images from one side, not both. In the end I have decided to use the side closest to the neon screens, as it plays such an important role in the illumination of my scene during the night-time.
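The unwrap Photomatix performs rests on simple geometry: each point on the ball reflects a known direction, which is why a single view covers almost the full 360 degrees. A minimal sketch of that mapping, assuming an orthographic camera on the +z axis looking along -z at a perfect mirror sphere:

```python
import math

def mirrorball_direction(x: float, y: float):
    """Map normalised mirror-ball image coordinates (x, y) inside the unit
    disc to the world-space direction reflected at that point. Uses the
    reflection formula r = v - 2(v.n)n with view vector v = (0, 0, -1)
    and sphere normal n = (x, y, nz)."""
    r2 = x * x + y * y
    if r2 > 1.0:
        raise ValueError("point lies outside the mirror ball")
    nz = math.sqrt(1.0 - r2)
    return (2.0 * nz * x, 2.0 * nz * y, 2.0 * nz * nz - 1.0)

# the centre of the ball reflects the camera itself; the rim reflects the
# scene directly behind the ball, where the 'pinching' concentrates
```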

I positioned the camera exactly 1 metre from the sphere, photographing exposures of 1 s, 1/8, 1/250, 1/2000 and 1/4000. After that I used the Digital Lux Meter to record the surrounding environmental illumination. My results can be seen in the table below.


Time – Illuminance (lux)

00:00 – 25-50
01:00 – 25-50
02:00 – 25-50
03:00 – 25-50
04:00 – 25-50
05:00 – 180
06:00 – 4000
07:00 – 19000
08:00 – 44000
09:00 – 58000
10:00 – 68000
11:00 – 86000
12:00 – 97000
13:00 – 118000
14:00 – 19000
15:00 – 12000
16:00 – 9600
17:00 – 7600
18:00 – 10300
19:00 – 4000
20:00 – 3000
21:00 – 180
22:00 – 25-50
23:00 – 25-50
24:00 – 25-50
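As a cross-check, these readings can be translated into the exposure values a light meter would display, using the standard incident-light relation. The calibration constant C = 250 is a common choice but varies between meters, so treat this as an approximation rather than my meter’s actual behaviour:

```python
import math

def lux_to_ev(lux: float, iso: float = 100.0, c: float = 250.0) -> float:
    """Incident-light metering: E = (C / S) * 2^EV, so EV = log2(E * S / C),
    where E is illuminance in lux, S the ISO speed and C the meter
    calibration constant (commonly around 240-250)."""
    return math.log2(lux * iso / c)

# the midday peak (~97,000 lx) lands near EV 15, classic bright daylight;
# the neon-lit night floor (25-50 lx) sits around EV 3-4
```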


May 16, 2010

Not long after the Bunker exhibition I started to feel increasingly drawn towards the conceptual relevance of using site-specific Image Based Lighting within a virtual environment. For the majority of this degree I have researched, questioned and utilized such a technique to incorporate perceptually real synthetic objects into live-action scenes. I think it’s important to reiterate how my project has maintained this consistent method of practice, which at its core has been responsible for examining hyper-real computer-generated images. As a point of departure, these notions of simulating realism have questioned the relationship between a real physical space and the virtual world.

As a result I have slightly altered the nature of my final film, taking into consideration the importance of exploring artificially illuminated real-world places. Following my interest in photobiology, I’ve been observing how the human circadian system is entrained to a 24-hour light-dark pattern that mimics the earth’s natural cycle. We are constantly surrounded by environments that disrupt this pattern, particularly in urban spaces where the bright lights of the city never sleep. I began thinking of certain places in London which never completely fall to the darkness of night, whose surroundings are artificially lit 24/7. Lux is the SI unit of illuminance, used in photometry as a measure of the intensity of light as perceived by the human eye. Using this measure I hope to investigate the difference in environmental illumination over the course of a day.


The idea is to spend 24 consecutive hours in Piccadilly Circus, recording the surrounding environmental light every hour. First, this will be measured with a Digital Lux Meter, which should give me an accurate reading of light intensity; noting this down, I am interested to see how it changes through the night and day. Second, a panoramic photograph of the busy junction will be taken every hour and later converted into 24 separate High Dynamic Range Images. By sampling the illumination and lighting data of Piccadilly Circus over a full day I intend to examine the changes in the quality and intensity of light from the sky as well as from the emitting neon signs.

My intention is to document and redefine elements of Piccadilly Circus through photographic texture mapping, creating an abstract representation of the space in Maya. The 24 HDRIs will be used sequentially to illuminate the scene, replaying the entire day over a two-minute animation. As the intensity and quality of light changes from night to day, a dynamic particle simulation will respond to and visualize the results collected by the Digital Lux Meter.