
WEEK 11 Destroying the city in Houdini

First, I needed to do camera tracking inside Nuke so that I could determine the location of the building and match the scene to it. A big thank-you to Minhan for using Mocha to make a character roto, which helped with the masks used for tracking.

I brought the camera into Houdini by exporting it from Nuke; the locators below are the axes that were in Nuke, which makes it tremendously easier for me to position things.

Next, I needed to group the models I downloaded online so that I could assign the materials afterwards, and then fracture them.

I ran into a huge problem with the fracturing, which took me a week to fix. Many models downloaded from the internet have no thickness, only single-sided faces, and Houdini fractures those faces inwards. Initially I tried converting to VDB, but that takes a lot of time, tends to crash, and in the end doesn’t even render. After going through a lot of tutorials, I decided it would be better to process the faces in Maya and give them thickness. This is a very tedious step, so if you have the chance it is better to model everything yourself.

I managed to break the staircase in its entirety; next I needed to create the smoke produced as the staircase breaks apart.

I used the “debrissource” node, which helps me identify the points generated during the break-up; the smoke is emitted from them.

Still running simulations; I’ve tested at least four versions.

After importing into Nuke, I checked and found a few frames that were missing or had extreme camera-tracking problems, so I needed to fix them:
1. Modify the scatter
2. Adjust the camera tracking
3. Redo the smoke spread
4. Re-render

WEEK 10 Group programme discussion

First, we discussed several versions of the story; for example, we considered making the theme a lonely boy, but the story scenes would have been difficult to create VFX for and the logic was weak. So after a lot of discussion we decided to go with “The Choice of the Plastic Bag” and create VFX for the last part of the video.

With the theme chosen, we explored the plot. The next step was to find a coffee shop where we could film, and to draw the storyboard. The following storyboard was drawn by Minhan Lv.

storyboard by Minhan

On Wednesday this week we gathered at a restaurant near King’s Cross railway station for a three-hour shoot. As I had injured my shoulder, I took on the role of director to help out; Sakshi did the filming, and Minhan and Mahadev played the roles.

Our plan was to create special effects in this environment, such as a city that starts out ruined and destroyed and gradually turns into an environmentally friendly green city.

A very enjoyable shoot, and my first experience of the rigours of shooting: you need to shoot a take many times to get the effect you want.

Here’s some of our dialogue:

Minhan: Hi bro, how are you doing?

Mahadev: Good, good. Is it a plastic bag?

Minhan: Yeah. (Pause) You know, these plastic bags are just so convenient. Who cares about reusable ones?

Mahadev: I do. Using reusable bags is so important for the environment. It’s about making small changes for a better future.

Minhan: Oh, come on. What’s the harm in using plastic once in a while?

Mahadev: Imagine a world where everyone thought that way. Our cities would turn into chaotic messes, covered in plastic waste. Pollution would choke our air, and wildlife would suffer immensely.

Minhan: That’s so true. What have we done?

Mahadev: Small choices can make a big difference. Choosing reusable over plastic isn’t just about convenience; it’s about preserving the beauty of our planet for generations to come.

Minhan: You’re right. I never looked at it that way. Maybe it’s time for a change.

WEEK 13 Photoshop composite background shot & Nuke

A few weeks ago I finished the flower-bloom animation effect; now I need to composite the environment in which the flower grows. As I didn’t shoot this shot on location but found an image to use as the background, I need to change its environment, and I chose Photoshop to help me do this.

Photoshop camera raw

The ground in this scene was too smooth and didn’t look like soil that flowers could grow in, so I added brown soil with stones to the image.

When I finished the background for the first shot, I had a discussion with Manos. One idea he offered me was to add an additional top-down shot in front of mine, which would illustrate the story better.

So, still using Photoshop, I extended this image to give the camera plenty of room to move!

To make the plants grow more realistically into the real environment, I used the roto tool in Nuke to add shadows to the soil and ground.

Photoshop (beta) AI to expand the scene images, then make a video
Final result (flower growth)

Additional Narrative Footage

I’ve been thinking about how to show that the world has been invaded by zombies. In addition to a lot of ruins, I think I need to add some dried blood, handprints, or similar patterns. I therefore downloaded material from the internet for study and imported it into Nuke for compositing, and I added the element of grass, which also symbolises the blossoming of life in no man’s land.

roto in Nuke
grass with car
The final result composited in Nuke (scene 1)
The final result composited in Nuke (scene 2)

WEEK 11 CopyCat & particles in Nuke

Batch Size: Leave it automatic.

Crop size: a smaller crop is faster to train but only aware of smaller details.
Use 256 or 512 for better quality.

Log2Lin: typically used to convert an image from a logarithmic color space to a linear color space. This is a common operation in compositing, as logarithmic encoding suits footage with a wide dynamic range, while linear color is better suited for compositing operations and adjustments.
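To make the idea concrete, here is a minimal Python sketch of the standard Cineon-style log-to-lin conversion (assuming the usual defaults of black point 95, white point 685, and gamma 0.6; the exact parameters of a given Log2Lin node may differ):

```python
def log2lin(log_val, black=95.0, white=685.0, gamma=0.6):
    """Cineon-style log -> linear, using Log2Lin-like default parameters.

    log_val is the 0-1 log-encoded value; black and white are the
    10-bit code values for the black and white points."""
    # gain normalises the curve so the white point maps to linear 1.0
    gain = 1.0 / (1.0 - 10 ** ((black - white) * 0.002 / gamma))
    # offset shifts the curve so the black point maps to linear 0.0
    offset = gain - 1.0
    return gain * 10 ** ((1023.0 * log_val - white) * 0.002 / gamma) - offset

# The white point maps to 1.0 and the black point to (very nearly) 0.0:
print(round(log2lin(685 / 1023.0), 4))  # 1.0
print(round(abs(log2lin(95 / 1023.0)), 4))  # 0.0
```

Values above the white point go over 1.0, which is why log footage keeps its highlight detail through this conversion.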

Checkpoint: you can link another dataset with similar content into your training.

Change the RGB channels in the Shuffle and connect the mask to the Grade, so that the roto of the eye takes effect.

Particle in nuke:

  • By changing the axis of the card, you can change the direction of particle emission.
  • Different options can change the emission state and shape of the particles
  • Adjusting these nodes can change the size of the particles

ParticleTurbulence: change the particle direction and add some blur.
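As a rough illustration of what a turbulence force does (this is not Nuke’s actual implementation), the Python sketch below perturbs particle velocities with a smooth, time-varying pseudo-noise field built from layered sines:

```python
import math
import random

def turbulence(pos, t, strength=1.0, frequency=1.0):
    """Smooth, position- and time-varying force vector.  A real
    ParticleTurbulence node uses proper 3D noise; layered sines are
    just enough here to show the idea."""
    x, y, z = (c * frequency for c in pos)
    return (
        strength * math.sin(y + 1.7 * t) * math.cos(z + 0.3 * t),
        strength * math.sin(z + 1.3 * t) * math.cos(x + 0.7 * t),
        strength * math.sin(x + 1.1 * t) * math.cos(y + 0.5 * t),
    )

def step(particles, t, dt=0.04, strength=0.5):
    """Add the turbulence force to each velocity, then integrate
    position with forward Euler."""
    for p in particles:
        f = turbulence(p["pos"], t, strength)
        p["vel"] = tuple(v + fi * dt for v, fi in zip(p["vel"], f))
        p["pos"] = tuple(x + v * dt for x, v in zip(p["pos"], p["vel"]))

random.seed(7)
particles = [{"pos": (random.random(), 0.0, random.random()),
              "vel": (0.0, 1.0, 0.0)} for _ in range(3)]
for frame in range(5):
    step(particles, t=frame * 0.04)
# Particles still drift upward overall, but with a turbulent wobble.
print(particles[0]["pos"])
```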

Test of head:

  • First, bring in the head file and change the texture.
  • Create another Transform that rotates the head, just like the head particles.
  • In this part, focus on adjusting the noise, blur, and some distortion when exporting the last step.

WEEK 10 Gizmo emulation & CopyCat in Nuke

EdgeDetect: This node helps me extract the edges of an object, but only if the object has an alpha channel.

Connect this node to a Grade and adjust the gain value to change the brightness and color of the edges, which is very suitable for some cool video effects.

IDistort: This node adjusts the shape of the processed border, deforming it and giving it a dynamic feel.

Glint effect

Glint:This node is used to help the overall glow and to change the direction of the light source.

First this was done for the characters in the foreground, and then for the background. Next, the different ways of processing are described (regarding the Gizmo).

These are two very important nodes.

  1. HueShift: adjusts the local color of the scene, adding multiple colors.
  2. DirectionalBlur wrapper: similar to motion blur.

However, the Gizmo can replace the nodes above (EdgeDetect, Grade, and so on).
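The core of a HueShift-style operation is just a rotation of the hue channel; a tiny Python sketch of that idea using the standard library’s colorsys module:

```python
import colorsys

def hue_shift(rgb, degrees):
    """Rotate the hue of an RGB color (components in 0-1) by the
    given number of degrees, leaving saturation and value alone."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return colorsys.hsv_to_rgb((h + degrees / 360.0) % 1.0, s, v)

# Pure red rotated 120 degrees becomes pure green:
shifted = hue_shift((1.0, 0.0, 0.0), 120)
print(tuple(round(c, 6) for c in shifted))  # (0.0, 1.0, 0.0)
```

Applying several different rotations and mixing the results back over the original is one way to get the multi-colored look described above.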

WEEK 12 Simulate the growth of flowers in houdini

reference video

First, I categorized my flower models into flowers, roots, and leaves. I started with the roots: I needed to simulate the growth process and the curve of each root, so I first used “add” to create its line, and then added a “foreach” to create the effect for every root.

The “pathdeform” node is very useful for me. The principle is that I first create the points of the rhizome, then combine a “tube” with the path through those points, which lets me simulate the flower’s rhizome.

The final “foreach” combines the roots of each plant; the process is the same when I make the leaves.
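A growth effect like this boils down to trimming a curve at an animated parameter (which is what Houdini’s Carve SOP does); here is a hypothetical pure-Python version of that trim-by-length idea:

```python
import math

def carve(points, u):
    """Trim a polyline at parameter u in [0, 1] of its total arc
    length; animating u from 0 to 1 'grows' a root along its curve."""
    seg = [math.dist(a, b) for a, b in zip(points, points[1:])]
    target = u * sum(seg)
    out, walked = [points[0]], 0.0
    for (a, b), length in zip(zip(points, points[1:]), seg):
        if walked + length >= target:
            t = (target - walked) / length if length else 0.0
            # interpolate the cut point within this segment
            out.append(tuple(pa + t * (pb - pa) for pa, pb in zip(a, b)))
            return out
        out.append(b)
        walked += length
    return out

path = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0)]  # an L-shaped root curve
print(carve(path, 0.5))  # [(0.0, 0.0), (0.0, 1.0)] -- half the length
```

Sweeping a tube along the carved portion each frame, as the “tube” plus path combination does, is what produces the growing-root look.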

This was the most challenging step in making the leaves, because I wanted them to look more realistic. I tried adding Vellum to make the leaves wiggle randomly, but Vellum takes a long time to compute and can’t make leaves growing on one side wiggle, so that method failed. It did remind me, though, that I could use the “noise” node.

On top of that, I needed to make fertile soil that the flowers grow out of and that creates some collisions. At the most basic level, the first step is to create a sphere and change its shape, then fill this irregular sphere with particles. Then I set up the colliders (flowers and leaves) so the leaves get more realistic feedback when they grow and touch the soil.

Successfully, I did it. It was not an easy challenge for me: I have never learned to code, so it was very hard, and I had to refer to other people’s videos for the code. But I tried to figure out what the code meant and which parts were the controls, which was very helpful. Next, I’ll move on to live-action footage and 3D tracking in Nuke.

WEEK 11 matte painting in nuke (personal work)

reference video

At the very beginning, I looked for a background reference image: a panoramic view of London.

I tried importing the models into maya to adjust their perspective as well as to adjust the camera angle.

This step was easier for me because I downloaded these models from the internet; since I wanted to portray a surrealistic future world, this saved me a lot of time.

Some of the models had problems with their maps, so I imported them into Substance Painter to recreate the UVs and textures. I could really feel how much more proficient I am with these programs compared to the first semester, which made this a memorable process for me.

For this step I used the Arnold skydome light in Maya, which is more realistic than area lights and spotlights.

I rendered in the TIFF format because I found that exporting to PNG gave me color formatting issues that seriously distorted the image.

I then used Photoshop to adjust the tones to make them look more coordinated.

In this step I wanted to add a hazy mask (roto) behind the entire environment; this makes my scene look more spatial and gives the foreground and background models a clear sense of primary and secondary.

Add clouds – this is a very critical step. It will make the whole scene look more atmospheric!

I used Project3D and the Card node in conjunction with camera movement to create a subtle parallax effect in the scene.
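The parallax that sells this kind of 2.5D setup comes straight from pinhole projection: cards at different depths shift by different amounts when the camera moves. A small illustrative sketch (the focal length and units here are made up):

```python
def project(point, cam_x=0.0, focal=50.0):
    """Project a 3D point through a pinhole camera that has been
    translated cam_x along X (focal length and units are arbitrary)."""
    x, y, z = point
    return (focal * (x - cam_x) / z, focal * y / z)

# Two cards straight ahead of the camera: one near (z=2), one far (z=10).
near, far = (0.0, 0.0, 2.0), (0.0, 0.0, 10.0)
for cam_x in (0.0, 0.2):
    print(project(near, cam_x), project(far, cam_x))
# After the camera slides 0.2 units, the near card has moved five
# times further on screen than the far one; that difference is the
# parallax that makes the projected scene feel three-dimensional.
```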

Overall, I don’t think matte painting is very difficult, although this was my first try at the method. Probably this is because I had used Photoshop to adjust images before, and all the node work inside Nuke came from what I learned in the first semester. The experience was very useful for me: matte painting lets me create realistic wide shots that would be difficult to photograph.

Complete video (first shot)

WEEK 10 reference pictures and storyboard

The first shot: abandoned urban buildings, tall buildings, wide-angle shooting, big scenes

The second shot: The street scene, the building pasted a huge poster, the big title reads “People and zombies coexist peacefully”

The third shot: the city bridge, traffic flowing in both directions (symmetrical shot); the background is the huge night scene of the London Eye

The fourth shot: In a ruins, a plant is growing (simulated time-lapse photography)

WEEK 9&10 COLLABORATE WITH OUR GROUP

I have to say that the biggest problem I had with this project was the UE5 background turning black when rendered.

When I render my scene the background comes out black, but in the UE5 viewport it displays normally.

I consulted Emily, and she advised me not to use a SkyLight when I have an HDRI background; that is what renders the background black.

good display
bad display

On top of that, I created a script for an introductory video; here are my lines:

My name is Liuyifang, from the Visual Effects program. I was inspired by the many different colors of coral, so I looked for a lot of reference pictures. I tried to imagine an ocean filled with vivid turquoise and blue. Coral reefs flourish in this area, filled with colorful fish and other ocean creatures. Another shot focuses on the waning sunlight, with the ocean taking on a deeper blue and indigo color. Here are some unique creatures that illuminate the darkness with a fascinating glow. This part was mainly done by me and Negin respectively. For the building aspect of the scene, I first used cubes to block out the overall layout and shots. I used Maya and C4D to create the basic shape of the coral; in particular, the volume modelling in C4D was very useful for me. I then used ZBrush to add detail to the coral, which makes the object more refined. I then checked the general layout in Maya, imported it into UE5, and put all the models together. Finally, I used Nuke to create bubbles, lights, and other effects.

I made two versions of the scene: one to make it look a little more realistic, and another to make my scene more in keeping with the style of the group as a whole.
personal video (the last choice)
personal video
introduction video
breakdown video (personal)

I had a great time working with the group and would like to work with them again if the opportunity arises! In the beginning, we presented our ideas to each other, and we respected each other’s ideas while modifying and improving them, which is the most important thing. We didn’t argue or just give up; rather, we improved each other’s ideas and provided each other with good advice.

making of video

Here’s a compilation of what we all worked on, with a full showcase video at the end

final video

WEEK 9 F_ReGrain in Nuke

Use KeyMix, which allows you to blend two input images together based on a third matte input.

1. Import Footage: Import the footage of the person walking in front of the green screen into Nuke. This will be your Foreground (A).

2. Import Background: Import the background image or footage that you want to replace the green screen with. This will be your Background (B).

3. Create a Matte: Use a keyer node (like Primatte or Keylight) to key out the green screen and generate a matte that separates the person from the background. This will be your Matte (mask) input.
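The blend that KeyMix performs per pixel is simple to sketch; assuming premultiplication has already been handled, it is just a matte-weighted mix of A over B:

```python
def keymix(a, b, matte):
    """Per-pixel matte-weighted mix: where the matte is 1 show A (the
    keyed foreground), where it is 0 show B (the new background)."""
    return tuple(fa * matte + fb * (1.0 - matte) for fa, fb in zip(a, b))

fg = (0.9, 0.2, 0.1)  # keyed actor pixel
bg = (0.1, 0.3, 0.8)  # replacement background pixel
print(keymix(fg, bg, 1.0))  # (0.9, 0.2, 0.1) -- fully inside the matte
print(tuple(round(c, 3) for c in keymix(fg, bg, 0.5)))  # soft matte edge
```

Grey matte values at the edges of the key are what give a soft, believable transition between the actor and the new background.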

F_ReGrain

I need to match the grain pattern of CGI elements or other composited elements with the grain of the original footage.

Input: F_ReGrain takes several inputs:

  • Source Image: The image or footage that you want to regrain.
  • Reference Image: The reference image or footage from which you want to sample the grain.
  • Grain Map: A grayscale map that controls the strength of the regraining effect in different areas of the image. White areas apply more grain, while black areas apply less grain.

When I need to add visual effects such as explosions, fire, smoke, or particles: these elements are often created digitally and may lack the natural grain present in live-action footage. F_ReGrain can apply a matching grain pattern to them, ensuring they blend seamlessly with the original footage.
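The underlying idea can be sketched in a few lines of Python: treat the grain as whatever a low-pass blur removes from the reference plate, then add it back onto the clean element, weighted by the grain map (a conceptual stand-in, not F_ReGrain’s actual algorithm):

```python
import random

def box_blur(img, radius=1):
    """1D box blur (clamped at the edges): a crude low-pass filter."""
    out = []
    for i in range(len(img)):
        lo, hi = max(0, i - radius), min(len(img), i + radius + 1)
        out.append(sum(img[lo:hi]) / (hi - lo))
    return out

def regrain(source, reference, grain_map):
    """Take the grain to be what the blur removes from the reference
    plate, then add it to the clean source, weighted by the grain map
    (white = more grain, black = less)."""
    grain = [r - b for r, b in zip(reference, box_blur(reference))]
    return [s + g * m for s, g, m in zip(source, grain, grain_map)]

random.seed(1)
flat_cg = [0.5] * 8  # a clean CG "scanline" with no grain at all
plate = [0.5 + random.uniform(-0.05, 0.05) for _ in range(8)]  # grainy plate
grained = regrain(flat_cg, plate, grain_map=[1.0] * 8)
print([round(v, 3) for v in grained])  # 0.5 plus the plate's grain pattern
```

A real regrain also matches grain size and per-channel response, but the blur-difference trick is the heart of why the CG stops looking suspiciously clean next to the plate.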