Advanced Modeling and Animation: Weekly Learning Journal

 

WEEK 1 - Brief introduction to the course requirements

The teacher began this class with a basic overview of the course. The class was divided into three parts: the first hour was devoted to theory, the middle two hours were dedicated to practical tutorials, and the final hour was dedicated to hands-on practice. The teacher explained that the entire course would last 14 weeks, with the initial learning involving disassembling the model, followed by a gradual integration process.

The teacher also asked us to introduce ourselves.

  • The teacher discussed the key software tools we would learn:

1. Blender

This software is free and open source, suitable for both beginners and professionals. The teacher explained that some game companies are switching from Maya and 3ds Max to Blender. It's incredibly versatile, covering modeling, sculpting, animation, rendering, and game asset development. Its polygon modeling tools allow for both simple objects and highly detailed characters and scenes.

However, the teacher repeatedly emphasized the importance of simplicity when modeling with Blender: avoid overly complex geometry, as it can overwhelm a VR device's GPU. Later, we would learn how to use Blender for UV unwrapping and texturing, for example, tiling a single texture repeatedly across a wall material to conserve resources. For this assignment, we also need to use it to design models and then import them into the game engine.

2. Unreal Engine

The teacher required that we install version 5.5.5.4, saying that this would ensure a consistent operating environment for the entire class, as otherwise, we might encounter problems when working on projects later. To install, we first need to download the Epic Games launcher, then download the engine from within the launcher. The teacher has already sent two links in the Microsoft Teams general channel, and I saved them for after class.

This engine is incredibly convenient. There's no need to switch back and forth between programs: you can build models directly within it, without exiting to Blender and returning to add content. You can also adjust scene levels and crop scenes without leaving the engine. The teacher also mentioned Nanite, a virtualized geometry system, saying it can render extremely dense geometry with little impact on performance, but that it's important to master the basics before diving into it.

3. Other Support Software

There's also Adobe Animate, professional animation software that allows for real-time authoring, such as programming a car to move or staging an accident scene and recording it as an animation in real time. Mixamo (also owned by Adobe) is a free online animation platform: by importing a rigged model, you can directly use its library of existing animation assets instead of starting from scratch, which saves a lot of time.

  • The teacher explained the course assignments and homework requirements.

We were required to design two interactive space models of a future London, created in Blender and then imported into the game engine. We also needed to come up with two interactive strategies, such as making lights flash when people approach or moving objects up and down. The teacher specifically reminded us not to create overly complex models, such as flashing neon signs, as these might overwhelm the VR device's GPU. Client samples would also be provided for reference.

We had to create animations based on the assignment models, importing them into animation software, adding keyframes, and arranging the animation sequences on a timeline. Furthermore, we had to write reflection reports after each assignment and lecture.

WEEK 2

This week, the teacher covered the following key points:

1. Modeling Basics and Unit Settings

• Introduced how to set units (meters, centimeters, etc.) in modeling software to ensure that model dimensions are consistent with reality.

• Emphasized the importance of scaling to avoid errors when importing into other software or engines due to incorrect proportions.
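The unit and scaling point above can be made concrete with a small sketch (plain Python; the helper names are my own, not any tool's API). Blender works in meters by default while Unreal Engine's native unit is the centimeter, so a factor of 100 is involved whenever a model crosses between them:

```python
# Assumed convention: Blender scene unit = meters, Unreal unit = centimeters.
BLENDER_TO_UE_SCALE = 100.0  # 1 m = 100 cm

def to_unreal_units(meters: float) -> float:
    """Convert a size in meters to Unreal Engine centimeters."""
    return meters * BLENDER_TO_UE_SCALE

def to_blender_units(centimeters: float) -> float:
    """Convert a size in UE centimeters back to meters."""
    return centimeters / BLENDER_TO_UE_SCALE

# A 2.5 m tall doorway should measure 250 units after import:
print(to_unreal_units(2.5))   # 250.0
```

If this factor is not applied consistently at export time, a model that looks correct in Blender arrives in the engine 100 times too small or too large, which is exactly the proportion error the teacher warned about.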

2. Coordinate Axes and Object Positioning

• Explained the role of the X, Y, and Z axes in modeling.

• Mentioned that objects need to be placed at the correct coordinate origin (0,0,0) to facilitate subsequent adjustments and export.

3. Common Modifiers

• Boolean Operations: Introduced the use of Difference, Union, and Intersection.

• Arrays: Quickly copy objects and create regular arrangements.

• Mirrors: Improve efficiency through symmetrical modeling.
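As a rough numeric sketch of what the Array modifier computes (plain Python, illustrative names, not Blender's API): N copies of an object, each displaced by a constant offset from the previous one.

```python
def array_copies(origin, offset, count):
    """Positions of `count` copies starting at `origin`, each shifted
    by `offset` relative to the previous copy (like an Array modifier
    with a constant offset)."""
    x, y, z = origin
    dx, dy, dz = offset
    return [(x + i * dx, y + i * dy, z + i * dz) for i in range(count)]

# Four fence posts spaced 2 m apart along the X axis:
print(array_copies((0, 0, 0), (2, 0, 0), 4))
# [(0, 0, 0), (2, 0, 0), (4, 0, 0), (6, 0, 0)]
```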

4. Export and Compatibility

• Explained how to export models to FBX format, emphasizing the importance of selecting options such as scaling and applying modifiers when exporting.

• Emphasized the importance of UV unwrapping and material setup for proper display on other platforms.

5. Materials and Node Expressions

• Mentioned the use of parameters (such as scalar and vector parameters) in material nodes to control color, texture, or material properties.

• Explained the meaning of RGB/vector values in materials.

Class Exercises

The teacher demonstrated how to build the basic structure of a Ferris wheel and how to use Blender's functions. We needed to learn and practice using methods such as Booleans, arrays, and scaling.
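The Ferris wheel exercise can be sketched numerically: placing cabins evenly around the wheel is just trigonometry, which is essentially what a circular array arrangement does. A plain-Python illustration (not Blender code):

```python
import math

def cabin_positions(radius, count):
    """Place `count` cabins evenly around a wheel of the given radius,
    in the X-Z plane (an upright wheel)."""
    positions = []
    for i in range(count):
        angle = 2 * math.pi * i / count
        positions.append((radius * math.cos(angle), 0.0, radius * math.sin(angle)))
    return positions

for x, y, z in cabin_positions(10.0, 8):
    print(f"({x:.2f}, {y:.2f}, {z:.2f})")
```

Each cabin sits at an angle of 360°/count from its neighbor, which is the same regular spacing a radial array produces.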

WEEK 3

This week, the teacher focused on the core concepts and operations of the Unreal Engine Material System. The class was divided into two parts: explanation and practical application.

  • Material System Basics (Material Domain)

The teacher explained that Unreal Engine has three main types of materials:

1. Surface: The most common type, used for surfaces (characters, models, etc.).

2. Deferred Decal: Used to add dynamic details to surfaces, such as bullet holes or stains after a shot.

3. Post-Process: Applied to the entire scene, used to adjust visual effects such as color grading, blurring, or distortion.

  • Blend Mode

Material transparency and blending methods are mainly divided into three categories:

1. Opaque: Completely opaque, commonly used for metal or solid surfaces.

2. Masked: Uses a black and white map to control transparency (black = transparent, white = visible), suitable for leaves, gratings, etc.

3. Translucent: Used for light-transmitting objects such as glass, water, and smoke, with adjustable transparency.
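The Masked mode described above can be reduced to a single comparison: each pixel's mask value is tested against an opacity clip threshold (Unreal's default Opacity Mask Clip Value is 0.3333). A toy Python version, not engine code:

```python
def is_pixel_visible(mask_value: float, clip_threshold: float = 0.3333) -> bool:
    """Masked blend mode in miniature: the pixel survives only if its
    mask value reaches the clip threshold (black = 0 clips away,
    white = 1 stays)."""
    return mask_value >= clip_threshold

assert is_pixel_visible(1.0)       # white -> visible
assert not is_pixel_visible(0.0)   # black -> clipped (transparent)
assert is_pixel_visible(0.5)       # mid-grey above threshold -> visible
```

This is why Masked is cheap compared to Translucent: there is no partial transparency to sort and blend, only a yes/no test per pixel.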

  • Textures and Parameter Control

The instructor demonstrated how to add different textures in the Material Editor:

• Base Color

• Normal Map

• Roughness

• Metallic

Two parameter types were also introduced:

• Scalar Parameter: Controls a single value (such as brightness, roughness, etc.).

• Vector Parameter: Controls color or direction (such as RGB values).

These parameters can be used with nodes such as "Multiply" and "Add" to dynamically adjust the material's appearance.
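The Scalar/Vector parameter idea with a Multiply node can be sketched as plain arithmetic (illustrative Python, not a real material graph): a Vector Parameter tints the base color channel by channel, and a Scalar Parameter scales the overall brightness.

```python
def apply_material_params(base_color, tint, brightness):
    """Mimic a Multiply chain: base color (r, g, b in 0-1) times a
    Vector Parameter tint, times a Scalar Parameter brightness."""
    return tuple(c * t * brightness for c, t in zip(base_color, tint))

# A grey base tinted red and brightened by a scalar of 2:
print(apply_material_params((0.5, 0.5, 0.5), (1.0, 0.2, 0.2), 2.0))
# (1.0, 0.2, 0.2)
```

Because the tint and brightness are parameters rather than hard-coded values, the same material can be re-colored per instance without editing the node graph, which is the point of exposing them.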

Practical part (hands-on exercises)


During the class, students attempted the following:

• Importing a model (.fbx) file into Unreal Engine;

• Creating a new material and applying it to the model;

• Using the Time node and Sine function to create flickering or breathing lighting effects;

• Understanding Double-Sided rendering settings;

• Adjusting the Sky Light and Directional Light to control ambient light and shadow effects;

• Creating a new Level (scene) and saving the project.
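The Time-plus-Sine flicker from the exercise list is, underneath the nodes, a simple oscillation: the emissive strength swings around a base value. A plain-Python sketch (parameter names are illustrative, not Unreal's):

```python
import math

def breathing_intensity(time_s, base=1.0, amplitude=0.5, frequency=1.0):
    """Light intensity at `time_s` seconds: oscillates between
    base - amplitude and base + amplitude, `frequency` cycles per second.
    This mirrors a Time -> Sine -> Multiply -> Add node chain."""
    return base + amplitude * math.sin(2 * math.pi * frequency * time_s)

# Over one cycle the value rises to 1.5, returns to 1.0, dips to 0.5:
for t in (0.0, 0.25, 0.5, 0.75):
    print(round(breathing_intensity(t), 3))
```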

But during the practice, my copy of the software could not be opened. The teacher said it might be because I had not downloaded it on the same network, and asked me to download Unreal Engine again from the same network.

Because of this, I did not keep up with the teacher's progress during the practice, so I planned to go home and find some teaching videos to make up for what I missed in class.

Week 4 — Sky and Material Setup Practice


This week's Advanced Modeling class primarily expanded upon the previous class. The instructor guided us through setting up the sky and lighting system in Unreal Engine and further explained how to create and apply material maps. The focus of this class was to help us understand how the "ambiance" of a virtual scene is created.

  • Review and Initial Environment Setup

At the beginning of the class, the instructor had us check that the scene setup from last week was correct and that the base map opened successfully. We then learned:

• How to select a starting map from Project Settings;

• How to create a new scene using Ctrl+N;

• Creating a basic object (such as a cube) in a new map and using it as the ground plane;

• Learning the function of the Reset Location button: returning objects to the center of the scene (0,0,0).

  •  Lighting System

The instructor emphasized the importance of lighting in spatial representation and walked us through setting up several common light sources:

1. Directional Light—simulates sunlight;

2. Sky Light—simulates ambient diffuse reflection;

3. Exponential Height Fog—adds a sense of spatial layering and depth.

We also learned to use the WASD keys to move the camera and adjust the camera speed (controlled by the scroll wheel). If the camera gets lost, double-clicking an object quickly brings it back to its original position.

  • Daytime and Nighttime Sky System Setup

After completing the lighting, we moved on to setting up the sky system.

The instructor demonstrated two environments:

• Daytime scene: using Directional Light and Sky Atmosphere;

• Night scene: using BP_SkySphere (from the engine content) to create a starry sky effect.

In the night sky settings, we also adjusted:
• Star Brightness;

• Cloud Density;

• Sky Color Variation (controlling the day-night transition by adjusting the sun's angle).

The instructor mentioned that combining these parameters can significantly change the atmosphere of a space and will be very helpful for subsequent project presentations.
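A toy version of the day-night control described above: the sky blend can be driven by the sun's elevation angle, with a twilight band around the horizon. This is plain Python with an assumed illustrative twilight range, not the engine's actual formula:

```python
def night_blend(sun_pitch_degrees, twilight_range=12.0):
    """0.0 = full day, 1.0 = full night. The transition is a linear ramp
    across a `twilight_range`-degree band centered on the horizon
    (pitch 90 = noon, 0 = horizon, negative = below the horizon)."""
    t = (twilight_range / 2 - sun_pitch_degrees) / twilight_range
    return min(1.0, max(0.0, t))

print(night_blend(45.0))   # 0.0  (sun high: day)
print(night_blend(0.0))    # 0.5  (horizon: twilight)
print(night_blend(-45.0))  # 1.0  (sun below horizon: night)
```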

  • Material & Texture

The second half of the course focused on material creation. The instructor explained how to:

1. Download material textures from websites such as 3DTextures.com;

2. Import them into Unreal Engine's Content Browser;

3. Create standard PBR materials using nodes such as Base Color, Metallic, Roughness, and Normal Map;

4. Adjust node connections to achieve more realistic reflections and textures on the surface.

In addition, we learned how to create a Multiply node to control material intensity and use Parameter nodes to make materials adjustable in real time, facilitating subsequent experiments.

Week 5: Submission of Assignment 1



Week 6 — Creating Animations and Interactions Using Level Sequence


Class Topic Introduction

This week's advanced modeling course focused on how to create animations using Level Sequences in Unreal Engine and implement interactive control through Level Blueprints. Using a "door opening animation" as an example, the instructor guided us through the entire process from model import and animation creation to trigger logic settings. This learning experience helped us further understand the relationship between Unreal Engine's animation system and Blueprints.

Project Structure and Resource Import

Before starting, the instructor emphasized the importance of a good file structure and proper saving habits. We needed to create a new folder in the Content Browser to manage models, textures, and other resources. Next, we imported the externally downloaded .fbx model files into the corresponding folder. The instructor specifically reminded us to save immediately after importing (Ctrl + S or Ctrl + Shift + S) to prevent data loss due to project crashes.

Creating Level Sequence Animations

After importing the resources, the instructor demonstrated how to create a basic Level Sequence. Right-click in the Content Browser, select Cinematics > Level Sequence, name it "Door_Sequence," and double-click to open it. Next, drag the door model into the Sequencer timeline, expand the Transform property, add keyframes for different time points, and adjust the rotation or position to create the door opening animation. The instructor also demonstrated how to adjust the animation duration to make the movement look more natural.
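What the Sequencer does between two Transform keyframes is interpolation. A minimal sketch of the door animation (plain Python; the 0-to-90-degree yaw over two seconds is an assumed example, and real Sequencer curves can also ease in and out rather than stay linear):

```python
def interpolate_rotation(t, key_times=(0.0, 2.0), key_values=(0.0, 90.0)):
    """Door yaw in degrees at time t, linearly interpolated between two
    keyframes and clamped outside the keyed range."""
    t0, t1 = key_times
    v0, v1 = key_values
    if t <= t0:
        return v0
    if t >= t1:
        return v1
    alpha = (t - t0) / (t1 - t0)
    return v0 + alpha * (v1 - v0)

print(interpolate_rotation(0.0))  # 0.0   (door closed)
print(interpolate_rotation(1.0))  # 45.0  (halfway open)
print(interpolate_rotation(2.5))  # 90.0  (fully open, holds)
```

Stretching the keyframes further apart (a longer `key_times` span) is exactly the "adjust the duration to make the movement look more natural" step from class.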

Adding a Trigger Box

After the animation was complete, the instructor guided us to add a Trigger Box to the scene. By searching for "Trigger Box" in the "Place Actors" panel on the left, dragging it to the front of the door, and adjusting its size and position using the scaling tool, the subsequent animation events will be triggered when the player approaches and enters this area. This step adds basic interactivity to our scene.

Blueprint Logic and Animation Triggering

Next, the instructor taught us how to use Level Blueprints to control the animation playback logic. First, select the Trigger Box, then create a "Begin Overlap" event node in the Blueprint, and drag in the "Door_Sequence" we just created as a reference. Connect the node: OnActorBeginOverlap → Play (Level Sequence) to implement the door opening animation when the player enters the trigger area. Finally, compile and save the Blueprint to test the effect.
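The Blueprint wiring above (OnActorBeginOverlap → Play) can be modeled as a tiny event-driven sketch. The classes and method names below are illustrative Python, not Unreal's API; they only mirror the control flow:

```python
class LevelSequence:
    """Stand-in for a Level Sequence asset that can be played."""
    def __init__(self, name):
        self.name = name
        self.playing = False

    def play(self):
        self.playing = True

class TriggerBox:
    """Stand-in for a Trigger Box wired to a sequence in the Level Blueprint."""
    def __init__(self, sequence):
        self.sequence = sequence

    def on_actor_begin_overlap(self, actor):
        # Mirrors the node connection: OnActorBeginOverlap -> Play (Level Sequence)
        self.sequence.play()

door_sequence = LevelSequence("Door_Sequence")
trigger = TriggerBox(door_sequence)
trigger.on_actor_begin_overlap("Player")  # player walks into the box
print(door_sequence.playing)  # True -> the door animation starts
```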

Running Tests and Debugging

After completing the Blueprint logic, the instructor reminded us to save and compile frequently. Click "Play" to enter game mode for testing. When the character walks into the Trigger Box area in front of the door, the door will automatically open. If the animation does not play, check if the Level Sequence was saved correctly or if any node connections were missing. Through repeated testing, we gradually mastered the debugging method.

WEEK 7

In class, the teacher had us try using VR to view Unreal Engine scenes.



The teacher then asked us to do Project 2 in this class, and told us we could ask him questions if we had any.

WEEK 8

I didn't go to class this week because I was sick. This week is the submission deadline for assignment 2.

WEEK 9

We missed class this week, but the teacher said he would make it up for us later.

WEEK 10

This week's course focused on Unreal Engine's Control Rig system, animation blueprints, skeletal control logic, and the final project export method. The instructor demonstrated step-by-step how to create and edit controllers in UE, how to bind bones, how to make controllers drive animations, and explained the requirements for the final VR project.

🔧 01. What is Control Rig? What can it do?

Control Rig is a built-in procedural animation tool in Unreal Engine. It allows us to design and control animations directly within UE, without needing to complete them in Maya or Blender first and then import them.

The instructor repeatedly emphasized:

• It allows animation logic to be generated in real-time within UE, making it more dynamic and better suited for interactive and game development.

• We can define skeletal behavior and controller logic, including position, rotation, and other data, through script nodes (Blueprint-style nodes).

• Control Rig supports physical animation and mechanical structure animation, such as robotic arms and linkage systems, allowing different bones to move in proportional motion.

In short, Control Rig = an "animation generator" within UE.

🔩 02. Creating a Control and Binding it to Bones in UE


The instructor demonstrated how to have one Control control multiple bones and explained the control flow logic:

1. Create a new control for the bones.

2. Set the "Get → Set" control flow in the diagram.

3. The controller's Transform value drives the target bone.

4. Different bones can be set with different rotation ratios, for example:

• First bone segment: 100%

• Second bone segment: 50%

• Third bone segment: 35%

→ This creates a natural linkage effect.
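The ratio scheme from class can be checked with a few lines of arithmetic (plain Python; the 40-degree controller input is just an example value):

```python
# Rotation ratios for the three bone segments, as set in class.
RATIOS = [1.00, 0.50, 0.35]

def bone_rotations(controller_degrees):
    """Rotation applied to each bone segment for one controller rotation:
    the first segment follows fully, the others at decreasing ratios."""
    return [controller_degrees * r for r in RATIOS]

print([round(v, 2) for v in bone_rotations(40.0)])  # [40.0, 20.0, 14.0]
```

Rotating the single controller therefore bends the whole chain at once, with each successive segment moving less, which is the natural linkage effect described above.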


The instructor pointed out: Even if the bones are not completely connected, control relationships can be artificially established in the Control Rig, making the mechanical structure logic more flexible.

🧩 03. Animation Blueprint + Control Rig

After the controller is set up, it can be integrated into the Animation Blueprint for real-time animation.

The general process is as follows:

• Call the Control Rig in the Anim Blueprint

• Connect the controller parameters to the logic diagram

• You can adjust, blend, and synchronize the animation in real time.

The instructor mentioned:

“Control Rigs provide a more dynamic development approach, allowing us to integrate with interactive systems, which is what we'll be using in our projects.”

🛠 04. Detailed Controller Operations (Class Demonstration)

The instructor assisted students step by step, including:

• Finding the Skeleton

• Creating a Control Rig Blueprint

• Creating controllers for parts such as the head and arms

• Adjusting the controller's color, rotation axis, and position alignment

• Allowing the controller to affect multiple bones

• Using the Execute Context node to drive the logic

The instructor also reminded:

• The controller's rotation axis must be correctly aligned with the skeleton

• If the controller cannot drive the skeleton, it may be due to incorrectly setting the "Get Transform / Set Transform" process or not binding the target bone.

During class, students continuously tried the operations, and the instructor provided guidance one by one.

🎬 05. Using Sequencer

The instructor then explained how to import animations into the Sequencer:

• Add the created animation to the Sequence

• Run the animation logically in the Level Blueprint

• The Sequence is a crucial part of the subsequent VR demonstration.

📦 06. Project Export (APK) and VR Testing

The instructor explained the final steps:

• Completely set up the scene

• Add animations, lighting, and environment

• Then export as an APK

• The instructor will test it on a VR device.

This is a crucial step in the final project.

🎯 07. Final Assignment Requirements (Very Important)

The teacher reiterated:

You need to:

✔ Environment Modeling

✔ Basic Animation

✔ Lighting Design

✔ A VR-compatible scene demonstration

You do not need to:

✘ Complex character movements

✘ High-difficulty simulations

The key is: a complete, VR-compatible environment + basic interactive or animated elements.

WEEK 11

I didn't go to class this week because I had a stomachache and stayed home. I studied the materials the teacher sent out on Teams this week.


I taught myself the Unreal Engine 5 lighting baking process at home using instructional videos and PowerPoint presentations uploaded by my teacher. This resource included demonstration videos for multiple steps, including "Example of Baked Lighting," "Example of Realtime Lighting," "Step 1 – Add Lightmass Importance Volume," "Step 2 – Set All Non-Movable Objects to Static," and "Step 3 – Bake Lighting," allowing me to practice step-by-step.

First, I learned the basic principles of lighting baking from the PowerPoint presentations: Light baking pre-calculates lighting effects and stores them in a lightmap, thus reducing the amount of real-time computation during game execution. Baked scenes can maintain realistic lighting and good performance, making them especially suitable for architectural scenes and static objects.


Next, I watched the demonstration videos "Example Realtime Lighting" and "Example of Baked Lighting," which clearly compared the differences between Realtime and Baked lighting. Real-time lighting consumes more resources, while baked lighting results in more stable lighting and softer shadows, without flickering due to camera movement. Through the videos, I could visually see the advantages of baking.




In the video "Step 1 – Add Lightmass Importance Volume", I learned that adding a Lightmass Importance Volume before baking allows Unreal Engine to calculate light more accurately within a specified area. Without this step, the baked result may be blurry or unstable.


Continuing with "Step 2 – Set All Non-Movable Objects to Static," the video explains that all non-movable objects must be set to Static so that Unreal Engine includes them in the baking calculations. If objects remain Movable or Stationary, they cannot be accurately baked, resulting in incorrect or missing shadows.


After completing the setup, I watched "Step 3 – Bake Lighting," a video demonstrating how to actually bake lights. Once baking begins, the system calculates data such as lighting, shadows, indirect lighting, and ambient occlusion (AO). The video also demonstrates post-baking checks, including observing whether the lighting and shadows are clean, whether there are seams, and whether the UVs are correctly unfolded.


Finally, I combined the video content with the PowerPoint presentation to summarize the entire light baking process: first, understand the principles of lighting; then, arrange the lights, prepare the model and UVs; next, set the Lightmass Volume, adjust the object states, and finally, perform the baking. Although I couldn't attend the class in person, through the video and PowerPoint presentation, I still fully understood the workflow of baking lights in Unreal Engine. After learning, I have more confidence in controlling the lighting and can understand why certain scenes need baking and in which situations it is appropriate to use real-time lighting.

I also reviewed the UX environment development tutorial and final result example for a future-oriented theme park that my teacher provided. This gave me a lot of inspiration for my final project.




The teacher gave extra lessons for Week 8 this week.

I attended a supplementary course that focused on Blender model optimization and Unreal Engine 5 scene performance optimization. In class, through demonstration videos and documents provided by the instructor, I learned how to improve the efficiency of my work from the modeling stage and further optimize it within the engine. The content of this supplementary course was very practical and closely related to future project production.

The first part of the course was about Blender model simplification methods. The class demonstrated how to use the Decimate tool to reduce the polygon count of a model. By adjusting the Ratio and observing the shape retention of the model, we learned how to make the model lighter without affecting the visual effect. This technique is particularly crucial for optimizing VR or large scenes because fewer polygons result in faster rendering speeds.
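The Decimate ratio translates directly into a target face count, which is a quick way to estimate how aggressive a given setting is (plain Python sketch, not Blender's API; the 200k-face example is illustrative):

```python
def decimated_face_count(face_count: int, ratio: float) -> int:
    """Approximate face count after Blender's Decimate (Collapse) modifier:
    the target is roughly the original count times the ratio."""
    return round(face_count * ratio)

# A 200k-face sculpt reduced at ratio 0.1 leaves about 20k faces:
print(decimated_face_count(200_000, 0.1))  # 20000
```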

Next, we learned how to set LOD (Level of Detail) for a model in UE5. LOD automatically adjusts a model's complexity based on distance, ensuring that distant objects do not consume excessive resources. The class demonstrated how to add LODs, set parameters, and the actual switching effects in a scene. Through this exercise, I understood that LOD is a crucial aspect of real-time rendering optimization, and especially indispensable in VR environments.
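The distance-based switching can be sketched as a threshold lookup (plain Python; the thresholds are assumed illustrative values, and real engines typically switch on projected screen size rather than raw distance):

```python
# (lod_index, max_distance_m): LOD 0 is the full-detail mesh,
# higher indices are progressively cheaper versions.
LOD_THRESHOLDS = [(0, 10.0), (1, 30.0), (2, float("inf"))]

def select_lod(distance_m: float) -> int:
    """Pick the highest-detail LOD whose distance band the camera is in."""
    for lod, max_dist in LOD_THRESHOLDS:
        if distance_m <= max_dist:
            return lod
    return LOD_THRESHOLDS[-1][0]

print(select_lod(5.0))    # 0  (close: full detail)
print(select_lod(20.0))   # 1
print(select_lod(500.0))  # 2  (far: cheapest mesh)
```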

The class also included steps on how to display FPS in UE5. FPS is a key metric for scene performance, so we need to constantly monitor frame rate stability. The instructor demonstrated several ways to enable the FPS display (for example, the "stat fps" console command), allowing us to quickly assess the scene's optimization status. Maintaining at least 90 FPS is particularly essential in VR environments; otherwise, noticeable discomfort may occur.
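The 90 FPS VR target is easier to reason about as a per-frame time budget: game logic, rendering, and both eye views must all fit in this many milliseconds.

```python
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available per frame at a given target frame rate."""
    return 1000.0 / target_fps

print(round(frame_budget_ms(90), 2))  # 11.11 ms per frame for VR
print(round(frame_budget_ms(60), 2))  # 16.67 ms for a typical desktop target
```

Seen this way, the VR budget is roughly a third shorter than a desktop 60 FPS budget, which is why the polygon reduction, LOD, and material simplification steps above matter so much.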

One of the key focuses of this supplementary lesson was a PowerPoint presentation on "Optimizing Immersive VR Environments." The document introduced the specific requirements of VR environments from theory to practice, including high frame rates, low latency, simplified materials and shadows, reduced physics calculations, and the importance of occlusion culling. The class also explained the principles of Foveated Rendering, which gave me a deeper understanding of how VR rendering is optimized for human eye characteristics.

Overall, this supplementary lesson provided me with a systematic understanding of how to create high-performance 3D scenes, from modeling and texturing to LOD and engine optimization. Whether creating VR environments or general UE5 projects, these techniques directly impact the smoothness of the final product. The supplementary course was very comprehensive and helped me strengthen my overall understanding of the optimization process.

WEEK 12

In this lesson, the instructor primarily guided us through the entire process of creating and exporting a complete project. Through a series of teaching examples and demonstrations of operational steps, the instructor aimed to help us clearly grasp the sequence and logic of project creation, rather than simply completing individual operations.

At the beginning of the lesson, the instructor introduced the project creation phase, emphasizing the importance of establishing a solid foundation before formal production, including file organization and basic project settings. The instructor reminded us that an unclear initial structure can lead to many unnecessary problems during later modifications and exporting; therefore, establishing good work habits from the outset is crucial.

Subsequently, the instructor explained the project content creation process through operational examples. This part focused on how to add and adjust various elements within the project. The instructor repeatedly stressed the importance of continuous testing during the creation process to ensure each step functions correctly, rather than checking for problems all at once after project completion.

In the latter part of the process, the instructor discussed the project integration and connection steps. At this stage, it's necessary to confirm that all content is correctly linked and functions as expected. This stage prepares for the final output, and the instructor reminded us to carefully check details to avoid export failures due to small errors.

Finally, the instructor focused on explaining the project export process and demonstrated how to export the project as an APK file using a tutorial video. The instructor explained that exporting is the last step in the entire process and should only be performed after confirming that the project is running correctly. The relevant tutorial videos and steps were also uploaded to the school's SharePoint site for our convenience in reviewing and following along after class.

WEEK 13

The teacher announced the date for the final assignment submission and VR game recording: December 31st.

This final assignment requires each student to submit a VR game recording video. During class on December 31st, submit your APK file via Microsoft Teams. The teacher will install your APK file on your VR device, then ask you to play the game and record the video. Please ensure you are ready to export your APK file.

If you are unable to attend class on December 31st, please inform him in advance. He will assist us with recording the game video on campus on December 27th.


WEEK 14

This week was the Christmas holiday, so we didn't have classes. I worked on my final project at home. I encountered some problems exporting the APK and kept getting errors. So, on Sunday, I went to school to ask my teacher for help and successfully exported the APK, solving the problem.


WEEK 15

This week we finally recorded our VR game videos.


Self-Reflection

This 15-week course provided me with a systematic and in-depth understanding of advanced modeling and animation techniques, which has played a crucial role in enhancing my future design capabilities.

📍 Learning Attitude and Adaptation Process

At the beginning of the course, I was unfamiliar with the course tools and workflows. The software introduced by the instructor, such as Blender and Unreal Engine, has extremely rich functionality, which initially excited me but also put some pressure on me. Through in-class exercises and after-class research, I gradually shifted from an observer's perspective to active practice, which made my learning methods more mature.

📍 Growth in Technical Skills

Model Setup and Scene Construction

• When learning the basic modeling settings (units/axis/basic modifiers), I became more attentive to the rigor and standardization of the scene.

• I understood how to use tools such as Boolean and mirroring to quickly construct complex objects and gradually improved the clarity and logical organization of the models.

Materials and Environmental Atmosphere Design

• I realized that the parameters of material nodes not only affect surface styles but also directly impact visual expression. When adjusting lighting and sky systems in Unreal Engine, I focused more on creating the overall atmosphere.

Animation and Interaction Design

• Learning to create simple animations using Level Sequences and Blueprints, and implementing animation triggering and interaction logic, greatly broadened my spatial expression abilities.

📍 Problems Encountered and Solutions

During the course, I encountered a problem where the Unreal Engine software wouldn't open, causing me to fall behind in class. Although this affected my immediate learning, I proactively searched for instructional videos and resources at home to catch up, which made me realize the importance of independent problem-solving.

Furthermore, due to the long course duration, I missed several classes due to illness. To avoid knowledge gaps, I proactively reviewed the course materials and instructional videos shared by the instructor in Teams after missing classes, which honed my self-learning abilities.

📍 Insights from the Course

This course made me realize that creating virtual spaces is not about creating isolated static models, but rather a process of integrating technology, interaction, and expressiveness. Through Blender's modeling and Unreal Engine's real-time rendering and control logic, I witnessed the immense potential of 3D design in VR and future interactive experiences.

Simultaneously, I realized that each step of technical accumulation is not isolated: standardized modeling methods, detailed handling of lighting and materials, and even the setting of animation triggering logic all contribute to forming a more complete and realistic spatial expression system.

📍 Future Directions

My next plans are:

✔ To further study advanced modeling and texturing techniques in Blender.

✔ To enhance my design capabilities for interactive blueprints and dynamic systems in Unreal Engine.

✔ To apply the learning outcomes of this course to future spatial design projects, improving the overall expressiveness and interactivity of my work.
