WEEK 1 - Brief Introduction to the Course Requirements
The teacher began this class with a basic overview of the course. Each session is divided into three parts: the first hour is devoted to theory, the middle two hours to guided tutorials, and the final hour to hands-on practice. The teacher explained that the entire course would last 14 weeks, with the initial weeks spent breaking models down into their parts, followed by a gradual integration process.
The teacher also asked us to introduce ourselves.
- The teacher discussed the key software tools we would learn:
1. Blender
This software is free and open source, and suitable for both beginners and professionals. The teacher explained that some game companies are switching from Maya and 3ds Max to Blender. It's incredibly versatile, covering modeling, sculpting, animation, rendering, and game asset development. Its polygon modeling tools can handle everything from simple objects to highly detailed characters and scenes.
However, the teacher repeatedly emphasized keeping things simple when modeling with Blender: avoid overly complex geometry, as it can overwhelm the VR device's GPU. Later we would learn how to use Blender for UV unwrapping and texturing, for example tiling a single texture across a wall material to conserve resources. For the assignments, we also need to use it to design models and then import them into the game engine.
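To make sure I remember the tiling idea, here is a minimal Blender Python (bpy) sketch of it: one image is reused across a wall by scaling its UV mapping instead of authoring a larger texture. The material name, image path, and scale values are placeholders of my own, not anything given in class.

```python
import bpy

# Hypothetical wall material; the name and the image path are placeholders.
mat = bpy.data.materials.new(name="Wall_Tiled")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links

tex_coord = nodes.new("ShaderNodeTexCoord")   # provides the UV output
mapping = nodes.new("ShaderNodeMapping")      # scales the UVs so the image repeats
image_tex = nodes.new("ShaderNodeTexImage")   # the single reusable wall texture
image_tex.image = bpy.data.images.load("//textures/wall_diffuse.png")

# Repeat the texture 4x horizontally and 2x vertically instead of using a bigger image.
mapping.inputs["Scale"].default_value = (4.0, 2.0, 1.0)

links.new(tex_coord.outputs["UV"], mapping.inputs["Vector"])
links.new(mapping.outputs["Vector"], image_tex.inputs["Vector"])
links.new(image_tex.outputs["Color"], nodes["Principled BSDF"].inputs["Base Color"])
```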
2. Unreal Engine
The teacher required that we install version 5.5.5.4, explaining that keeping the whole class on the same version would ensure a consistent working environment; otherwise we might run into problems when working on projects later. To install it, we first need to download the Epic Games Launcher and then download the engine from within the launcher. The teacher had already posted two links in the Microsoft Teams general channel, and I saved them for after class.
The engine is incredibly convenient. There's no need to switch back and forth between software: you can build models directly inside it without exiting and returning to Blender to add content, and you can adjust levels and edit scenes without leaving the engine. The teacher also mentioned Nanite, a virtualized geometry system that can handle extremely dense geometry without a major performance cost, but stressed that it's important to master the basics before diving into it.
3. Other Support Software
There's also Adobe Animate, a professional animation tool that allows for real-time work, such as programming a car to move or staging an accident scene and recording it as an animation in real time. Mixamo (also owned by Adobe) is a free online animation platform: by importing a rigged model, you can use its existing animation assets directly, eliminating the need to start from scratch. This saves a lot of time.
- The teacher explained the course assignments and homework requirements.
We were required to design two interactive space models of a future London, created in Blender and then imported into the game engine. We also needed to come up with two interaction ideas, such as lights that flash when people approach or objects that move up and down. The teacher specifically reminded us not to create overly complex models, such as flashing neon signs, as these might overwhelm the VR device's GPU. Client samples would also be provided for reference.
We had to create animations based on the assignment models, importing them into animation software, adding keyframes, and arranging the animation sequences on a timeline. Furthermore, we had to write reflection reports after each assignment and lecture.
WEEK 2
This week, the teacher covered the following key points:
1. Modeling Basics and Unit Settings
• Introduced how to set units (meters, centimeters, etc.) in modeling software to ensure that model dimensions are consistent with reality.
• Emphasized applying scale correctly to avoid proportion errors when importing into other software or engines (see the sketch after this list).
2. Coordinate Axes and Object Positioning
• Explained the role of the X, Y, and Z axes in modeling.
• Mentioned that objects need to be placed at the correct coordinate origin (0,0,0) to facilitate subsequent adjustments and export.
3. Common Modifiers
• Boolean Operations: Introduced the use of Difference, Union, and Intersection.
• Array: quickly copies objects to create regular arrangements.
• Mirror: improves efficiency through symmetrical modeling.
4. Export and Compatibility
• Explained how to export models to FBX format, emphasizing the importance of selecting options such as scaling and applying modifiers when exporting.
• Emphasized the importance of UV unwrapping and material setup for proper display on other platforms.
5. Materials and Node Expressions
• Mentioned the use of parameters (such as scalar and vector parameters) in material nodes to control color, texture, or material properties.
• Explained the meaning of RGB/vector values in materials.
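To tie points 1 and 4 together, here is a minimal Blender Python sketch I wrote for myself (the file path and option values are my own assumptions, not the teacher's settings): it sets metric units and exports an FBX with the scale applied and modifiers baked in.

```python
import bpy

# Point 1: work in real-world metric units so model sizes match reality.
scene = bpy.context.scene
scene.unit_settings.system = 'METRIC'
scene.unit_settings.scale_length = 1.0   # 1 Blender unit = 1 metre

# Point 4: export to FBX, baking the scale and applying modifiers
# so the model arrives in the engine at the correct proportions.
bpy.ops.export_scene.fbx(
    filepath="//exports/week2_model.fbx",    # placeholder path
    use_selection=True,                      # export only the selected objects
    apply_scale_options='FBX_SCALE_ALL',     # bake the unit/object scale into the file
    use_mesh_modifiers=True,                 # apply Boolean/Array/Mirror modifiers on export
)
```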
Class Exercises
The teacher demonstrated how to build the basic structure of a Ferris wheel using Blender's tools. We needed to learn and practice techniques such as Booleans, arrays, and scaling; a rough scripted version of the idea is sketched below.
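As a sketch of the exercise only (the cabin size, count, and rotation axis are my own assumptions), the cabins of a Ferris wheel can be duplicated around the hub with an Array modifier driven by a rotated Empty:

```python
import bpy
import math

CABIN_COUNT = 12  # assumption: 12 cabins around the wheel

# A simple cabin placed below the hub (the hub sits at the world origin).
bpy.ops.mesh.primitive_cube_add(size=0.8, location=(0.0, 0.0, -4.0))
cabin = bpy.context.active_object

# An Empty rotated by 360/N degrees acts as the array offset, so the Array
# modifier copies the cabin around a circle instead of along a straight line.
bpy.ops.object.empty_add(type='PLAIN_AXES', location=(0.0, 0.0, 0.0))
pivot = bpy.context.active_object
pivot.rotation_euler[1] = math.radians(360.0 / CABIN_COUNT)  # rotate around Y

arr = cabin.modifiers.new(name="RadialArray", type='ARRAY')
arr.count = CABIN_COUNT
arr.use_relative_offset = False
arr.use_object_offset = True
arr.offset_object = pivot

# A Boolean Difference could then cut openings, e.g.:
#   cut = cabin.modifiers.new(name="DoorCut", type='BOOLEAN')
#   cut.operation = 'DIFFERENCE'
#   cut.object = door_cutter   # hypothetical cutter mesh
```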
WEEK 3
This week, the teacher focused on the core concepts and operations of the Unreal Engine Material System. The class was divided into two parts: explanation and practical application.
- Material System Basics (Material Domain)
The teacher explained that Unreal Engine has three main types of materials:
1. Surface: The most common type, used for surfaces (characters, models, etc.).
2. Deferred Decal: Used to project details onto existing surfaces, such as bullet holes or stains left after a shot.
3. Post-Process: Applied to the entire scene, used to adjust visual effects such as color grading, blurring, or distortion.
- Blend Mode
Material transparency and blending methods are mainly divided into three categories:
1. Opaque: Completely opaque, commonly used for metal or solid surfaces.
2. Masked: Uses a black and white map to control transparency (black = transparent, white = visible), suitable for leaves, gratings, etc.
3. Translucent: Used for light-transmitting objects such as glass, water, and smoke, with adjustable transparency.
- Textures and Parameter Control
The instructor demonstrated how to add different textures in the Material Editor:
• Base Color
• Normal Map
• Roughness
• Metallic
Two parameter types were also introduced:
• Scalar Parameter: Controls a single value (such as brightness, roughness, etc.).
• Vector Parameter: Controls color or direction (such as RGB values).
These parameters can be used with nodes such as "Multiply" and "Add" to dynamically adjust the material's appearance.
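Out of curiosity I looked into how the same node setup could be scripted. The sketch below uses Unreal's Python editor scripting (it assumes the Python Editor Script Plugin is enabled; the asset path, parameter names, and default values are placeholders of my own): a Vector Parameter multiplied by a Scalar Parameter feeds the Base Color.

```python
import unreal

# Create a new material asset (path and name are placeholders).
asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
material = asset_tools.create_asset(
    asset_name="M_ParamDemo",
    package_path="/Game/Materials",
    asset_class=unreal.Material,
    factory=unreal.MaterialFactoryNew(),
)

mel = unreal.MaterialEditingLibrary

# Vector Parameter: the tint colour (RGB).
tint = mel.create_material_expression(
    material, unreal.MaterialExpressionVectorParameter, -600, 0)
tint.set_editor_property("parameter_name", "TintColor")
tint.set_editor_property("default_value", unreal.LinearColor(0.8, 0.2, 0.2, 1.0))

# Scalar Parameter: a single brightness value.
brightness = mel.create_material_expression(
    material, unreal.MaterialExpressionScalarParameter, -600, 200)
brightness.set_editor_property("parameter_name", "Brightness")
brightness.set_editor_property("default_value", 1.0)

# Multiply the two and feed the result into Base Color.
multiply = mel.create_material_expression(
    material, unreal.MaterialExpressionMultiply, -300, 100)
mel.connect_material_expressions(tint, "", multiply, "A")
mel.connect_material_expressions(brightness, "", multiply, "B")
mel.connect_material_property(multiply, "", unreal.MaterialProperty.MP_BASE_COLOR)

mel.recompile_material(material)
unreal.EditorAssetLibrary.save_loaded_asset(material)
```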
Practical part (hands-on exercises)
During the class, students attempted the following:
• Importing a model (.fbx) file into Unreal Engine;
• Creating a new material and applying it to the model;
• Using the Time node and Sine function to create flickering or breathing lighting effects (sketched after this list);
• Understanding Double-Sided rendering settings;
• Adjusting the Sky Light and Directional Light to control ambient light and shadow effects;
• Creating a new Level (scene) and saving the project.
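The Time + Sine combination mentioned above is easy to reason about outside the editor: sine of the running time swings between -1 and 1, and remapping it to the 0-1 range gives the breathing pulse that scales the emissive intensity. A plain-Python illustration, with speed and intensity values chosen arbitrarily by me:

```python
import math

def breathing_intensity(time_seconds, speed=2.0, max_intensity=5.0):
    """Roughly mirrors the Time -> Sine -> remap chain in the material graph."""
    s = math.sin(time_seconds * speed)   # swings between -1 and 1
    pulse = 0.5 + 0.5 * s                # remapped to the 0..1 range
    return pulse * max_intensity         # scaled emissive strength

# Sample the curve over a few seconds to see the breathing pattern.
for t in range(8):
    print(t, round(breathing_intensity(float(t)), 2))
```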
During the practice, however, my software would not open.
The teacher said it might be because I had downloaded it on a different network, and asked me to download Unreal Engine again over the same network.
Because of this, I did not keep up with the teacher's progress during the practice, so I planned to find some tutorial videos at home to make up for what I missed in class.
Week 4 — Sky and Material Setup Practice
This week's Advanced Modeling class primarily expanded upon the previous class. The instructor guided us through setting up the sky and lighting system in Unreal Engine and further explained how to create and apply material maps. The focus of this class was to help us understand how the "ambiance" of a virtual scene is created.
- Review and Initial Environment Setup
At the beginning of the class, the instructor had us check that the scene setup from last week was correct and that the base map opened successfully. We then learned:
• How to select a starting map from Project Settings;
• How to create a new scene using Ctrl+N;
• Creating a basic object (such as a cube) in a new map and using it as the ground plane;
• Learning the function of the Reset Location button: returning objects to the center of the scene (0,0,0).
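So that I can redo this setup quickly at home, here is a small Unreal Python sketch of the same steps (the map path, ground scale, and labels are my own placeholders, and it assumes editor Python scripting is enabled): create a new level, spawn a cube at the origin (0,0,0), and flatten it into a ground plane.

```python
import unreal

# Create and open a new empty level (path is a placeholder).
unreal.EditorLevelLibrary.new_level("/Game/Maps/Week4_Practice")

# Spawn a StaticMeshActor at the origin (0,0,0) to act as the ground.
ground = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.StaticMeshActor, unreal.Vector(0.0, 0.0, 0.0))

# Assign the engine's basic cube mesh and stretch it into a flat floor slab.
cube_mesh = unreal.EditorAssetLibrary.load_asset("/Engine/BasicShapes/Cube.Cube")
mesh_comp = ground.get_editor_property("static_mesh_component")
mesh_comp.set_static_mesh(cube_mesh)
ground.set_actor_scale3d(unreal.Vector(50.0, 50.0, 0.1))
ground.set_actor_label("GroundPlane")
```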
- Lighting System
The instructor emphasized the importance of lighting in spatial representation and walked us through setting up several common light sources:
1. Directional Light—simulates sunlight;
2. Sky Light—simulates ambient diffuse reflection;
3. Exponential Height Fog—adds a sense of spatial layering and depth.
We also learned to use the WASD keys to move the camera and the scroll wheel to adjust the camera speed. If the camera gets lost in the scene, double-clicking an object quickly frames the view on it again.
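To remember the three light types, here is another short Unreal Python sketch (positions, rotation, and labels are arbitrary placeholders of mine) that drops them into the current level:

```python
import unreal

ell = unreal.EditorLevelLibrary

# Directional Light: the "sun", so only its rotation really matters.
sun = ell.spawn_actor_from_class(
    unreal.DirectionalLight, unreal.Vector(0.0, 0.0, 500.0),
    unreal.Rotator(0.0, -45.0, 0.0))   # pitch the light down 45 degrees
sun.set_actor_label("Sun_DirectionalLight")

# Sky Light: captures the sky and provides soft ambient/diffuse lighting.
sky = ell.spawn_actor_from_class(unreal.SkyLight, unreal.Vector(0.0, 0.0, 400.0))
sky.set_actor_label("Ambient_SkyLight")

# Exponential Height Fog: adds depth and atmospheric layering to the space.
fog = ell.spawn_actor_from_class(unreal.ExponentialHeightFog, unreal.Vector(0.0, 0.0, 0.0))
fog.set_actor_label("SceneFog")
```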
- Daytime and Nighttime Sky System Setup
After completing the lighting, we moved on to setting up the sky system.
The instructor demonstrated two environments:
• Daytime scene: using Directional Light and Sky Atmosphere;
• Night scene: using BP_SkySphere (from the engine content) to create a starry sky effect.
In the night sky settings, we also adjusted:
• Star Brightness;
• Cloud Density;
• Sky Color Variation (controlling the day-night transition by adjusting the sun's angle).
The instructor mentioned that combining these parameters can significantly change the atmosphere of a space and will be very helpful for subsequent project presentations.
- Material & Texture
The second half of the course focused on material creation. The instructor explained how to:
1. Download material textures from websites such as 3DTextures.com;
2. Import them into Unreal Engine's Content Browser;
3. Create standard PBR materials using nodes such as Base Color, Metallic, Roughness, and Normal Map;
4. Adjust node connections to achieve more realistic reflections and textures on the surface.
In addition, we learned how to create a Multiply node to control material intensity and use Parameter nodes to make materials adjustable in real time, facilitating subsequent experiments.
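Importing the downloaded texture set by hand works, but it can also be scripted. Here is a minimal sketch using Unreal's Python AssetImportTask (the folder paths are placeholders of my own):

```python
import unreal
import os

SOURCE_DIR = r"C:/Downloads/Brick_Wall_Textures"   # placeholder folder of downloaded maps
DEST_PATH = "/Game/Textures/BrickWall"             # placeholder Content Browser path

tasks = []
for filename in os.listdir(SOURCE_DIR):
    if not filename.lower().endswith((".png", ".jpg", ".tga")):
        continue
    task = unreal.AssetImportTask()
    task.set_editor_property("filename", os.path.join(SOURCE_DIR, filename))
    task.set_editor_property("destination_path", DEST_PATH)
    task.set_editor_property("automated", True)   # no import dialogs
    task.set_editor_property("save", True)        # save the new assets immediately
    tasks.append(task)

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks(tasks)
```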
Week 5: Submit Assignment 1
Week 6 — Creating Animations and Interactions Using Level Sequence
Class Topic Introduction
This week's advanced modeling course focused on how to create animations using Level Sequences in Unreal Engine and implement interactive control through Level Blueprints. Using a "door opening animation" as an example, the instructor guided us through the entire process from model import and animation creation to trigger logic settings. This learning experience helped us further understand the relationship between Unreal Engine's animation system and Blueprints.
Project Structure and Resource Import
Before starting, the instructor emphasized the importance of a good file structure and proper saving habits. We needed to create a new folder in the Content Browser to manage models, textures, and other resources. Next, we imported the externally downloaded .fbx model files into the corresponding folder. The instructor specifically reminded us to save immediately after importing (Ctrl + S or Ctrl + Shift + S) to prevent data loss due to project crashes.
Creating Level Sequence Animations
After importing the resources, the instructor demonstrated how to create a basic Level Sequence. Right-click in the Content Browser, select Cinematics > Level Sequence, name it "Door_Sequence," and double-click to open it. Next, drag the door model into the Sequencer timeline, expand the Transform property, add keyframes for different time points, and adjust the rotation or position to create the door opening animation. The instructor also demonstrated how to adjust the animation duration to make the movement look more natural.
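For my own reference, the same starting point can be scripted with Unreal's Sequencer Python API. This is only a sketch: the asset path is a placeholder, it assumes the door actor is already selected in the level, and the actual door-opening keyframes are still added by hand in the Sequencer window.

```python
import unreal

# Create the Level Sequence asset (equivalent to Cinematics > Level Sequence).
sequence = unreal.AssetToolsHelpers.get_asset_tools().create_asset(
    asset_name="Door_Sequence",
    package_path="/Game/Sequences",          # placeholder folder
    asset_class=unreal.LevelSequence,
    factory=unreal.LevelSequenceFactoryNew(),
)

# Bind the currently selected door actor to the sequence
# (equivalent to dragging the door into the Sequencer timeline).
door_actor = unreal.EditorLevelLibrary.get_selected_level_actors()[0]
binding = sequence.add_possessable(door_actor)

# Add a Transform track and one section spanning the animation range;
# the door-opening keyframes themselves are then set in the Sequencer UI.
transform_track = binding.add_track(unreal.MovieScene3DTransformTrack)
section = transform_track.add_section()
section.set_range(0, 120)   # roughly a 4-second animation at 30 fps

unreal.EditorAssetLibrary.save_loaded_asset(sequence)
```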
Adding a Trigger Box
After the animation was complete, the instructor guided us to add a Trigger Box to the scene. By searching for "Trigger Box" in the "Place Actors" panel on the left, dragging it to the front of the door, and adjusting its size and position using the scaling tool, the subsequent animation events will be triggered when the player approaches and enters this area. This step adds basic interactivity to our scene.
Blueprint Logic and Animation Triggering
Next, the instructor taught us how to use Level Blueprints to control the animation playback logic. First, select the Trigger Box, then create a "Begin Overlap" event node in the Blueprint, and drag in the "Door_Sequence" we just created as a reference. Connect the node: OnActorBeginOverlap → Play (Level Sequence) to implement the door opening animation when the player enters the trigger area. Finally, compile and save the Blueprint to test the effect.
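Since Blueprints are visual, there is nothing to paste here, but the wiring is easy to restate as ordinary code. The plain-Python sketch below is not Unreal API, just my illustration of the event flow behind the OnActorBeginOverlap → Play connection:

```python
class DoorSequencePlayer:
    """Stands in for the Door_Sequence Level Sequence player."""
    def play(self):
        print("Door_Sequence playing: the door swings open.")

class TriggerBox:
    """Stands in for the Trigger Box placed in front of the door."""
    def __init__(self, sequence_player):
        self.sequence_player = sequence_player

    def on_actor_begin_overlap(self, actor):
        # Equivalent of the OnActorBeginOverlap -> Play (Level Sequence) wire.
        print(f"{actor} entered the trigger volume.")
        self.sequence_player.play()

# The player walking into the box fires the overlap event, which plays the sequence.
trigger = TriggerBox(DoorSequencePlayer())
trigger.on_actor_begin_overlap("PlayerCharacter")
```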
Running Tests and Debugging
After completing the Blueprint logic, the instructor reminded us to save and compile frequently. Click "Play" to enter game mode for testing. When the character walks into the Trigger Box area in front of the door, the door will automatically open. If the animation does not play, check if the Level Sequence was saved correctly or if any node connections were missing. Through repeated testing, we gradually mastered the debugging method.
WEEK 7
In class, the teacher had us try using VR to view Unreal Engine scenes.
The teacher then asked us to work on Project 2 during this class and told us we could ask him questions if we had any.








