Computer Graphics Rendering Competition at Saarland University
Suhas Gopal - Team Name: bidoof
Links:
The idea was mostly inspired by anime scenes of a water stream flowing around a house, for example the stills from the movie “Kimi no Na wa” shown below.
I wanted to depict a similar night-time scene with a small stream and a bridge, lit by lamps and by lights from inside the house. Most of the lights are essentially just floating, as I didn’t know exactly how to model them with proper 3D meshes. 😁
I started off by modelling the ground, river, street lamp and bridge in Blender. After exporting them to OBJ and MTL files and enabling material loading, I ran into issues getting the materials to actually show up. So, using the material names I had assigned in Blender, I added the materials to the MatLib manually.
The texture UV mapping I set up in Blender also did not transfer properly when loading into my renderer, so I removed the ground and river (which were modified cubes in the Blender scene) and replaced them with quads added programmatically to the scene. I also downloaded a model of a Japanese temple from the internet, applied some Lambertian materials to it, and added it to my scene.
Then I worked on the lighting. I added different kinds of lights to the scene: an area light (in the form of a box), multiple point lights, and a spotlight for the street lamp. (The street lamp is not actually a light source itself; it is just an emissive Lambertian with a spotlight placed right below it, as I could not figure out how to put a light inside a solid.)
Even with multiple lights the scene was quite dark, so I had to add an extra point light high up in the scene to provide something like an ambient light.
Water
Reflection, specularity and water waves (normal bumps) on the water.
Spheres
It’s a combination of a dielectric sphere, an emissive Lambertian and a point light, stacked vertically.
Moon texture and starry environment map
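The write-up does not say exactly how the water waves were generated, only that they are normal bumps. One common analytic way to get this effect, sketched here with made-up amplitude and frequency values (the structs and function names are illustrative, not the renderer's actual API), is to tilt the flat water normal by the partial derivatives of a crossed sine-wave height field:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// Perturb an upward-facing water normal (0,1,0) with two crossed sine waves.
// For a height field h(u,v) = a * (sin(f*u) + sin(f*v)), the perturbed
// normal is (-dh/du, 1, -dh/dv), renormalized.
Vec3 waterNormal(float u, float v, float amplitude = 0.1f, float freq = 20.0f) {
    float du = amplitude * freq * std::cos(freq * u);  // dh/du
    float dv = amplitude * freq * std::cos(freq * v);  // dh/dv
    return normalize({ -du, 1.0f, -dv });
}
```

With amplitude 0 this degenerates to the flat normal, which makes it easy to blend the wave strength in and out.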
Multi-threaded rendering: I collect all the samples that have to be computed for each pixel and start the maximum possible number of hardware threads in parallel, with each thread getting an almost equal number of samples. When all threads are complete, I read the colors stored from each computation out of a vector and write the result to the image.
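The scheme above can be sketched as follows. This is a standalone illustration, not the renderer's actual code: `sampleOnce` stands in for tracing one camera ray, and each thread fills its own result vector so no locking is needed.

```cpp
#include <algorithm>
#include <cassert>
#include <thread>
#include <vector>

struct Color { float r, g, b; };

// Placeholder for tracing one camera ray through pixel (px, py);
// the real version would jitter the ray and call the integrator.
Color sampleOnce(int /*px*/, int /*py*/, int /*s*/) {
    return { 1.0f, 0.5f, 0.25f };
}

Color renderPixel(int px, int py, int totalSamples) {
    unsigned nThreads = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::vector<Color>> results(nThreads);  // one bucket per thread
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < nThreads; ++t) {
        // Give each thread an almost equal contiguous share of the samples.
        int begin = (int)((long long)totalSamples * t / nThreads);
        int end   = (int)((long long)totalSamples * (t + 1) / nThreads);
        pool.emplace_back([=, &results]() {
            for (int s = begin; s < end; ++s)
                results[t].push_back(sampleOnce(px, py, s));
        });
    }
    for (auto& th : pool) th.join();
    // After all threads finish, average the stored colors into one pixel value.
    Color sum{ 0, 0, 0 };
    int n = 0;
    for (auto& bucket : results)
        for (auto& c : bucket) { sum.r += c.r; sum.g += c.g; sum.b += c.b; ++n; }
    return { sum.r / n, sum.g / n, sum.b / n };
}
```

Splitting by sample index rather than by image region keeps the per-thread workload balanced even when one pixel is much more expensive than its neighbors.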
Box light: I used the makeBox function from the assignments to build an area light out of the quads of the box. All the yellow-orange boxes in the image are such lights.
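The makeBox helper comes from the course assignments; this standalone sketch only shows the geometric idea behind it, building the six quads of an axis-aligned box so each can then be wrapped as an area light. The Vec3/Quad structs and boxQuads name are assumptions for illustration, not the framework's types.

```cpp
#include <cassert>
#include <vector>

struct Vec3 { float x, y, z; };

// One rectangular face: an origin corner plus two edge vectors,
// the shape an area light can be wrapped around.
struct Quad { Vec3 origin, edge1, edge2; };

// Build the six faces of an axis-aligned box from two opposite corners.
std::vector<Quad> boxQuads(const Vec3& lo, const Vec3& hi) {
    Vec3 dx{ hi.x - lo.x, 0, 0 };
    Vec3 dy{ 0, hi.y - lo.y, 0 };
    Vec3 dz{ 0, 0, hi.z - lo.z };
    return {
        { lo, dx, dy },                     // back   (z = lo.z)
        { { lo.x, lo.y, hi.z }, dx, dy },   // front  (z = hi.z)
        { lo, dx, dz },                     // bottom (y = lo.y)
        { { lo.x, hi.y, lo.z }, dx, dz },   // top    (y = hi.y)
        { lo, dy, dz },                     // left   (x = lo.x)
        { { hi.x, lo.y, lo.z }, dy, dz },   // right  (x = hi.x)
    };
}
```

In the actual scene, each quad would additionally get an emissive material and be registered as an area light with the integrator.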
Normal maps: I collected all the objects loaded from the OBJ file into a hashmap by name, and added normal maps as a new type of map for a group identified by its name. (Just as we have a CoordMapper to get texture coordinates, I use a NormalMapper to get the normal in the integrator.) I implemented an image normal mapper that takes an ImageTexture as input and uses its RGB values to modify the normal at the intersection. I don’t think this came out entirely well, but the effect can be seen on the ground as well as on the river.
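The core of such an image normal mapper is decoding an RGB texel into a tangent-space direction. This is a minimal sketch of that standard decoding step, assuming the usual [0,1] → [-1,1] convention; the NormalMapper/ImageTexture names come from the write-up, but the structs and signature here are illustrative:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// Decode one normal-map texel: each RGB channel in [0,1] maps to a
// component in [-1,1]. A "flat" texel (0.5, 0.5, 1.0) decodes to (0,0,1).
Vec3 decodeNormal(float r, float g, float b) {
    return normalize({ 2.0f * r - 1.0f, 2.0f * g - 1.0f, 2.0f * b - 1.0f });
}
```

In the integrator, the decoded vector would then be rotated from tangent space into the surface's local frame before replacing the geometric normal.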
a. Before bump map
b. After bump map
The textures applied in Blender could not be transferred to the renders; the regular textures in particular look worse, so the texture on the wall is not nice. I also had to change the river and the ground to quads so that I could texture them manually.
The environment map appears much brighter than the original image.