Virtual Reality II

Advanced virtual reality course. The course covers further virtual reality techniques and features, as well as combining VR with 3D reconstruction.

1. Introduction

Course overview and news

2. Source control - git & UE

Exercise to demonstrate UE5 integration with a version control system (git in this case). We use the UE5 environment, the git bash console, and a basic git GUI. The demo covers project initialization, commits, a remote repository, multi-user collaboration, and merge conflicts and their resolution.
Let's create a simple UE5 project. We will look at the structure and form of the data (text vs. binary).
Initializing a git repository, initial commit. Learning about Git LFS (Large File Storage) and the settings stored in .gitattributes and .gitignore -...
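The LFS and ignore settings described above can be sketched as plain files. The patterns and directory names below are typical for a UE5 project but illustrative; with git-lfs installed, `git lfs track "*.uasset"` would generate the .gitattributes entries shown here by hand:

```shell
# Hypothetical UE5 project directory; the name is illustrative.
mkdir -p MyVRProject && cd MyVRProject
git init -q

# Track UE binary assets with Git LFS. These lines are what
# `git lfs track` would write; created manually here so the
# sketch does not require git-lfs to be installed.
cat > .gitattributes <<'EOF'
*.uasset filter=lfs diff=lfs merge=lfs -text
*.umap   filter=lfs diff=lfs merge=lfs -text
EOF

# Ignore UE's generated and derived directories.
cat > .gitignore <<'EOF'
Binaries/
DerivedDataCache/
Intermediate/
Saved/
EOF

git add .gitattributes .gitignore
```

Binary .uasset/.umap files cannot be merged line-by-line, which is why they go through LFS while the generated directories are excluded entirely.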
Introducing other git clients with a GUI, and git bash. Basic git commands (git status, add, commit). Git's current main branch...
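The basic command sequence above can be tried on a throwaway repository; the repository and file names are illustrative, and `git init -b main` assumes git 2.28 or newer:

```shell
# Minimal status/add/commit workflow on a demo repository.
mkdir -p demo-repo && cd demo-repo
git init -q -b main                 # create repo with main as the default branch
git config user.email "you@example.com"
git config user.name  "Demo User"

echo "VR course demo" > README.md
git status --short                  # shows README.md as untracked (??)
git add README.md                   # stage the file
git commit -q -m "Initial commit"   # record the first commit
git log --oneline                   # one-line history
```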
A look at where all the information about the local repository is stored (the .git directory). A note about the usefulness...
Working with project branches demonstrated on a scene with two cubes. Creating a new branch from the master (cubes), switching...
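The branching workflow from the two-cube scene can be mirrored on plain files standing in for the cube assets; names are illustrative, and `git switch` assumes git 2.23 or newer:

```shell
# Branch/switch/merge demo mirroring the two-cube scene.
mkdir -p cubes-demo && cd cubes-demo
git init -q -b main
git config user.email "you@example.com"
git config user.name  "Demo User"

echo "CubeA" > CubeA.txt
git add CubeA.txt && git commit -q -m "Add CubeA"

git switch -q -c cubes          # new branch "cubes" from main
echo "CubeB" > CubeB.txt
git add CubeB.txt && git commit -q -m "Add CubeB on branch cubes"

git switch -q main              # back on main, CubeB.txt disappears
git merge -q cubes              # fast-forward merge brings it back
git log --oneline
```

Switching branches swaps the working tree between the two states of the scene, which is exactly what the UE5 demonstration shows with the cube assets.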

3. Texturing

Advanced texturing workflow: 3D painting and smart materials with Quixel Mixer.

4. 3D reconstruction

We prepare the scene using a 3D reconstruction technique called photogrammetry. We can either use the resulting models directly in the scene (as point clouds or meshes), or use them for basic insight into the scale of the scene and the placement of objects.
Overview of available tools for 3D reconstruction, both paid and free: RealityCapture, Meshroom, COLMAP, OpenDroneMap, 3DF Zephyr.
Sample input data: more than 300 photos of an attic; the photos are not optimally captured. Loading into RealityCapture, and...
Coloring the dense point cloud. Setting the reconstruction region and trimming the model. Creating a model of the...
Exporting the point cloud (points and colors) to the .las format. A sample of the exported model in another program (CloudCompare), proof that...
Create a standard VR template or use an older project with a VR setup. Create a new scene, enable the...
Eye-Dome Lighting improves the visualization of point clouds by emphasizing edges created by depth discontinuities. We set up the PostProcessVolume,...
In this section, we'll demonstrate the point cloud in VR and allow you to walk around the scene using teleportation....
The current point cloud orientation is inappropriate for further editing. For this reason, we will fix the point cloud alignment...

5. Interaction interface for VR

A common interface for trace interactions, i.e. ray casting for object interaction.
Project setup, directory structure, Blueprint names, git init, git push.
Interface declaration, applying the interface to a cube object, implementation of the first-hit function, testing via the O key press.
Cube extension - setting the material color for the different interface functions: TraceHitObject, TraceLeaveObject, TraceActiveDown, TraceActiveUp.
Setting up the interaction component (functions LineTrace, InteractWithHit), implementation of the LineTrace function.
Implementation of InteractWithHit: storing local variables, introducing collapsed nodes, deciding between the old and new trace-hit actor. Extending the Pawn with a TraceInteractionComponent and a simple...
Implementation of InteractWithHit: if the hit contains a new object, stop tracing the old object and start tracing the new one. Debugging LineTrace by turning...
Implementation of InteractWithHit: calling TraceMove if the object is the same, and checking whether the hit component has changed.
Changing the state of the InteractiveObject (Cube) when a mouse button is pressed: adding events in TraceInteractionComponent, reacting to the mouse click in the Pawn, and...
TraceMove implementation (arrow orientation). Implementation of a Sphere based on the Cube.

6. Template analysis and basic use

UE5 comes with a very well prepared template for creating a VR world. An important part of it is the so-called VRPawn, i.e. the implementation of the user's player. This VRPawn implements the basic abilities to move, grab things, and trigger actions. We will examine how the simpler scripts of this object are implemented, describe the more complex ones, and show how to use them in our projects.
Introducing this and other tutorials in this section.
We'll connect the Oculus Quest VR headset using the Quest app and, for developers, the Meta Developer Hub. We will make...
Basic connection to a VR headset and reading data from the device. We will create our own Pawn and prepare...
Implementation of VRpawn rotation by a fixed number of degrees. We read the change of the analog stick on the...
Teleportation is an extension of our previous experiments in the chapter on Interaction Interface. Therefore, here we will only study...
Transferring the Teleportation functionality from the VR template Pawn to our VRPawn. A demonstration of copying the necessary variables and functions. Debugging of...
Analysis of the functions providing object grabbing and activation (grab and trigger) in the basic VR template.

7. Locomotion and navigation adjustment

Movement in virtual reality (locomotion) is often implemented by means of so-called teleportation. We will demonstrate the use of UE5's actor navigation objects for the VRPawn. We will also look at the options for editing and modifying the navigation mesh: creating a restricted area, controlling progress through the world, and dynamic updating.
Demonstration of the use of teleportation in a new scene. Starting with an empty scene, adding a NavMeshBoundsVolume, sample modifications (NavModifierVolume)...
Other NavMesh visualization options. Modification of the generated navigation network to reach/restrict to the desired areas. NavMesh quality discussion.
Changing NavMesh generation from static to dynamic (Dynamic Modifiers Only and Dynamic), creating objects that dynamically affect the generated...
Blocking the maze area for navigation. If the user approaches from behind the maze entrance, the teleportation option...

8. Buttons on VR controllers

Virtual reality controllers are equipped with several buttons, joysticks, and other input elements. We will demonstrate binding the device's inputs to basic actions such as trigger, grip, and menu, and also create custom actions for buttons that are not directly used in the VR template.
Switching to the interaction branch and checking out the start tag: git checkout tags/step_interaction_start -b upTo_vid06. Adding the ability to grab objects. Demonstration...
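The tag checkout above can be reproduced on a small local repository. The tag and branch names (step_interaction_start, upTo_vid06) come from the course; the repository, file, and commit names below are illustrative stand-ins for the course project:

```shell
# Demo repo with a tagged "starting point" commit.
mkdir -p tag-demo && cd tag-demo
git init -q -b main
git config user.email "you@example.com"
git config user.name  "Demo User"

echo "v0" > scene.txt
git add scene.txt && git commit -q -m "Interaction start"
git tag step_interaction_start      # mark the chapter's starting state

echo "v1" > scene.txt
git commit -qam "Later work"        # history moves on past the tag

# Jump back to the tagged state on a fresh working branch:
git checkout -q tags/step_interaction_start -b upTo_vid06
cat scene.txt                       # back at the "v0" content
```

Creating a branch at the tag avoids a detached HEAD, so work done from this starting point can be committed normally.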
Creating an Actor (bp_tool) to demonstrate the trigger function. The tool needs to be held in hand and shows the...
Extending the previous example by reading a smooth trigger change. Fix VRtemplate to read TriggerAxis on the right controller.
Haptic feedback when the trigger is pressed on the corresponding controller.
Using the menu from VRtemplate and extending it with a new item. Demonstration of displaying the rendering speed in FPS.
Extension of input settings for triggering an event after pressing other buttons on the controller (ABXY). Extension of VRInteractionBPI with...

9. Virtual controls

VR worlds are full of functional objects that closely resemble what we are used to from reality. From an interaction perspective, these objects should behave very similarly to the real world. Therefore, here we will try our hand at implementing basic "constrained" interaction features such as a button (translational movement in one direction, activated by pressing), a slider (controlled by grabbing and dragging along one axis), and a lever (rotational movement by grabbing around an axis). A wide range of elements such as doors, drawers, buttons, rotary controls and many more can then be implemented using these examples.
Virtual button - part 1 - change material
Virtual button - part 2 - physical behaviour
Virtual button - part 3 - set state and turn on light
Virtual slider - grab move limit
Virtual slider - estimate value <0;1>
Virtual slider - control two lights - events and level blueprint
Virtual slider - other event and dispatcher possibilities

Lever I

Virtual lever - bp_preparation
Virtual lever - lever drive motion control