diff --git a/index.rst b/index.rst index 099008b..d293e91 100644 --- a/index.rst +++ b/index.rst @@ -108,7 +108,6 @@ The main documentation for the site is organized into the following sections: tutorials/scripting/index tutorials/shaders/index tutorials/ui/index - tutorials/vr/index .. toctree:: diff --git a/tutorials/vr/img/minimum_setup.png b/tutorials/vr/img/minimum_setup.png deleted file mode 100644 index d1acd19..0000000 Binary files a/tutorials/vr/img/minimum_setup.png and /dev/null differ diff --git a/tutorials/vr/index.rst b/tutorials/vr/index.rst deleted file mode 100644 index 94c6949..0000000 --- a/tutorials/vr/index.rst +++ /dev/null @@ -1,11 +0,0 @@ -XR (AR/VR) -========== - -.. toctree:: - :maxdepth: 1 - :name: toc-tutorials-vr - - xr_primer - openxr/index - oculus_mobile/index - openvr/index diff --git a/tutorials/vr/oculus_mobile/developing_for_oculus_quest.rst b/tutorials/vr/oculus_mobile/developing_for_oculus_quest.rst deleted file mode 100644 index 146298a..0000000 --- a/tutorials/vr/oculus_mobile/developing_for_oculus_quest.rst +++ /dev/null @@ -1,115 +0,0 @@ -.. _doc_developing_for_oculus_quest: - -Developing for Oculus Quest -=========================== - -Introduction ------------- - -This tutorial goes over how to get started developing for the -*Meta Quest* with the Godot Oculus Mobile plugin. - -Before starting, there are two things you need to do: - -First you need to go through the steps on the :ref:`doc_exporting_for_android` -page. This leads you through installing the toolset that Godot -needs to export to Android devices. - -Next you need the Quest plugin. You can get it from the Asset -Library or manually download it from `here `__. - -Setting Up Godot ----------------- - -To get started open Godot and create a new project. - -.. image:: img/quest_new_project.png - -Make sure to choose the ``GLES2`` renderer. Due to the -Quest's GPU this backend is far better suited for the Quest. - -Copy the addons folder from the Oculus Mobile asset into your Godot -project. Your project tree should look similar to this: - -.. image:: img/quest_project_tree.png - -Now you can start building the main scene: - -- Add an :ref:`ARVROrigin ` node first. -- Then add three child nodes to the origin node, one :ref:`ARVRCamera ` and two :ref:`ARVRController ` nodes. -- Assign controller ID 1 to the first :ref:`ARVRController ` and rename that to ``LeftHand``. -- Assign controller ID 2 to the second :ref:`ARVRController ` and rename that to ``RightHand``. -- Finally add a :ref:`MeshInstance ` as a child node to our first :ref:`ARVRController ` and create a box shape, resize the box so each side is set to 0.1. Now duplicate the :ref:`MeshInstance ` and move it to the second :ref:`ARVRController ` node. These will stand in for our controllers. - -.. image:: img/quest_scene_tree.png - -Now add a script to the main node and add the following code: - -.. tabs:: - .. 
code-tab:: gdscript GDScript - - extends Spatial - - - var perform_runtime_config = false - - - onready var ovr_init_config = preload("res://addons/godot_ovrmobile/OvrInitConfig.gdns").new() - onready var ovr_performance = preload("res://addons/godot_ovrmobile/OvrPerformance.gdns").new() - - - func _ready(): - var interface = ARVRServer.find_interface("OVRMobile") - if interface: - ovr_init_config.set_render_target_size_multiplier(1) - - if interface.initialize(): - get_viewport().arvr = true - - - func _process(_delta): - if not perform_runtime_config: - ovr_performance.set_clock_levels(1, 1) - ovr_performance.set_extra_latency_mode(1) - perform_runtime_config = true - -Before you can export this project to the Quest you need to do three -more things. - -First go into the project settings and make sure that the main scene -is the scene we run. Godot does not ask you to set this on export. - -.. image:: img/quest_project_settings.png - -Then go into the export menu and configure a new Android export. If -you still haven't gone through the :ref:`doc_exporting_for_android` -page do it now. If you didn't you'll have some red messages on this -screen. - -If you did you can forge ahead and make a few small changes to the -export settings. First change the XR Mode to ``Oculus Mobile VR``. -Then change the Degrees of Freedom mode to ``6DOF``. - -.. image:: img/quest_export_settings.png - -Now save and close the export window. - -Setting Up Your Quest ---------------------- - -Follow `these instructions `__ to -setup your device for development. - -Once your device is set up and connected, click the **Android logo** that should be visible in the top-right corner of the Godot editor. -When clicked, it exports your project and runs it on the connected device. -If you do not see this Android logo, make sure you have create an Android export preset -and that the preset is marked as **Runnable** in the Export dialog. - -The above does the bare minimum to get your project running on the Quest, -it's not very exciting. Holger Dammertz has made a great toolkit for the -quest that contains a lot of scenes to get help you on your way including -really nice controller meshes. - -You can find the toolkit `here `__. - -If you want to help out with improving the plugin please join us `here `__. 
diff --git a/tutorials/vr/oculus_mobile/img/quest_export_settings.png b/tutorials/vr/oculus_mobile/img/quest_export_settings.png deleted file mode 100644 index 091cff3..0000000 Binary files a/tutorials/vr/oculus_mobile/img/quest_export_settings.png and /dev/null differ diff --git a/tutorials/vr/oculus_mobile/img/quest_new_project.png b/tutorials/vr/oculus_mobile/img/quest_new_project.png deleted file mode 100644 index f7a2ec1..0000000 Binary files a/tutorials/vr/oculus_mobile/img/quest_new_project.png and /dev/null differ diff --git a/tutorials/vr/oculus_mobile/img/quest_project_settings.png b/tutorials/vr/oculus_mobile/img/quest_project_settings.png deleted file mode 100644 index 597ff95..0000000 Binary files a/tutorials/vr/oculus_mobile/img/quest_project_settings.png and /dev/null differ diff --git a/tutorials/vr/oculus_mobile/img/quest_project_tree.png b/tutorials/vr/oculus_mobile/img/quest_project_tree.png deleted file mode 100644 index 7468e2d..0000000 Binary files a/tutorials/vr/oculus_mobile/img/quest_project_tree.png and /dev/null differ diff --git a/tutorials/vr/oculus_mobile/img/quest_scene_tree.png b/tutorials/vr/oculus_mobile/img/quest_scene_tree.png deleted file mode 100644 index 2bbc026..0000000 Binary files a/tutorials/vr/oculus_mobile/img/quest_scene_tree.png and /dev/null differ diff --git a/tutorials/vr/oculus_mobile/index.rst b/tutorials/vr/oculus_mobile/index.rst deleted file mode 100644 index 5497479..0000000 --- a/tutorials/vr/oculus_mobile/index.rst +++ /dev/null @@ -1,8 +0,0 @@ -Oculus mobile plugin (deprecated) -================================= - -.. toctree:: - :maxdepth: 1 - :name: toc-tutorials-vr-oculus_mobile - - developing_for_oculus_quest diff --git a/tutorials/vr/openvr/index.rst b/tutorials/vr/openvr/index.rst deleted file mode 100644 index 38c1b03..0000000 --- a/tutorials/vr/openvr/index.rst +++ /dev/null @@ -1,8 +0,0 @@ -OpenVR plugin -============= - -.. toctree:: - :maxdepth: 1 - :name: toc-tutorials-vr-openvr - - vr_starter_tutorial/index diff --git a/tutorials/vr/openvr/vr_starter_tutorial/img/starter_vr_tutorial_hands.png b/tutorials/vr/openvr/vr_starter_tutorial/img/starter_vr_tutorial_hands.png deleted file mode 100644 index 45f036e..0000000 Binary files a/tutorials/vr/openvr/vr_starter_tutorial/img/starter_vr_tutorial_hands.png and /dev/null differ diff --git a/tutorials/vr/openvr/vr_starter_tutorial/img/starter_vr_tutorial_pistol.png b/tutorials/vr/openvr/vr_starter_tutorial/img/starter_vr_tutorial_pistol.png deleted file mode 100644 index 972c9d8..0000000 Binary files a/tutorials/vr/openvr/vr_starter_tutorial/img/starter_vr_tutorial_pistol.png and /dev/null differ diff --git a/tutorials/vr/openvr/vr_starter_tutorial/img/starter_vr_tutorial_sword.png b/tutorials/vr/openvr/vr_starter_tutorial/img/starter_vr_tutorial_sword.png deleted file mode 100644 index 629f75a..0000000 Binary files a/tutorials/vr/openvr/vr_starter_tutorial/img/starter_vr_tutorial_sword.png and /dev/null differ diff --git a/tutorials/vr/openvr/vr_starter_tutorial/index.rst b/tutorials/vr/openvr/vr_starter_tutorial/index.rst deleted file mode 100644 index b7cff42..0000000 --- a/tutorials/vr/openvr/vr_starter_tutorial/index.rst +++ /dev/null @@ -1,9 +0,0 @@ -VR starter tutorial -=================== - -.. 
toctree:: - :maxdepth: 1 - :name: doc_vr_starter_tutorial - - vr_starter_tutorial_part_one - vr_starter_tutorial_part_two diff --git a/tutorials/vr/openvr/vr_starter_tutorial/vr_starter_tutorial_part_one.rst b/tutorials/vr/openvr/vr_starter_tutorial/vr_starter_tutorial_part_one.rst deleted file mode 100644 index f509b1b..0000000 --- a/tutorials/vr/openvr/vr_starter_tutorial/vr_starter_tutorial_part_one.rst +++ /dev/null @@ -1,1146 +0,0 @@ -.. _doc_vr_starter_tutorial_part_one: - -VR starter tutorial part 1 -========================== - -Introduction ------------- - -.. image:: img/starter_vr_tutorial_sword.png - -This tutorial will show you how to make a beginner VR game project in Godot. - -Keep in mind, **one of the most important things when making VR content is getting the scale of your assets correct**! -It can take lots of practice and iterations to get this right, but there are a few things you can do to make it easier: - -- In VR, 1 unit is typically considered 1 meter. If you design your assets around that standard, you can save yourself a lot of headache. -- In your 3D modeling program, see if there is a way to measure and use real world distances. In Blender, you can use the MeasureIt add-on; in Maya, you can use the Measure Tool. -- You can make rough models using a tool like `Google Blocks `__, and then refine in another 3D modelling program. -- Test often, as the assets can look dramatically different in VR than on a flat screen! - -Throughout the course of this tutorial, we will cover: - -- How to tell Godot to run in VR. -- How to make a teleportation locomotion system that uses the VR controllers. -- How to make a artificial movement locomotion system that uses the VR controllers. -- How to create a :ref:`RigidBody `-based system that allows for picking up, dropping, and throwing RigidBody nodes using the VR controllers. -- How to create simple destroyable target. -- How to create some special :ref:`RigidBody `-based objects that can destroy the targets. - -.. tip:: While this tutorial can be completed by beginners, it is highly - advised to complete :ref:`doc_your_first_2d_game`, - if you are new to Godot and/or game development. - - **Some experience with making 3D games is required** before going through this tutorial series. - This tutorial assumes you have experience with the Godot editor, GDScript, and basic 3D game development. - A OpenVR-ready headset and two OpenVR-ready controllers are required. - - This tutorial was written and tested using a Windows Mixed Reality headset and controllers. This project has also been tested on the HTC Vive. Code adjustments may be required - for other VR Headsets, such as the Oculus Rift. - -The Godot project for this tutorial is found on the `OpenVR GitHub repository `__. The starter assets for this tutorial can be found in the releases -section on the GitHub repository. The starter assets contain some 3D models, sounds, scripts, and scenes that are configured for this tutorial. - -.. note:: **Credits for the assets provided**: - - - The sky panorama was created by `CGTuts `__. - - - The font used is Titillium-Regular - - - The font is licensed under the SIL Open Font License, Version 1.1 - - - The audio used are from several different sources, all downloaded from the Sonniss #GameAudioGDC Bundle (`License PDF `__) - - - The folders where the audio files are stored have the same name as folders in the Sonniss audio bundle. - - - The OpenVR addon was created by `Bastiaan Olij `__ and is released under the MIT license. 
It can be found both on the `Godot Asset Library `__ and on `GitHub `__. *3rd party code and libraries used in the OpenVR addon may be under a different license.* - - - The initial project, 3D models, and scripts were created by `TwistedTwigleg `__ and is released under the MIT license. - -.. tip:: You can find the finished project on the `OpenVR GitHub repository `__. - - -Getting everything ready ------------------------- - -If you have not already, go to the `OpenVR GitHub repository `__ and download the "Starter Assets" file from the releases. Once you have the -starter assets downloaded, open up the project in Godot. - -.. note:: The starter assets are not required to use the scripts provided in this tutorial. - The starter assets include several premade scenes and scripts that will be used throughout the tutorial. - -When the project is first loaded, the Game.tscn scene will be opened. This will be the main scene used for the tutorial. It includes several nodes and scenes already placed -throughout the scene, some background music, and several GUI-related :ref:`MeshInstance ` nodes. - -_________________ - -The GUI-related :ref:`MeshInstance ` nodes already have scripts attached to them. These scripts will set the texture of a :ref:`Viewport ` -node to the albedo texture of the material of the :ref:`MeshInstance ` node. This is used to display text within the VR project. Feel free to take a look -at the script, ``GUI.gd``, if you want. We will not be going over how to to use :ref:`Viewport ` nodes for displaying UI on :ref:`MeshInstance ` -nodes in this tutorial . - -If you are interested in how to use :ref:`Viewport ` nodes for displaying UI on :ref:`MeshInstance ` nodes, see the :ref:`doc_viewport_as_texture` -tutorial. It covers how to use a :ref:`Viewport ` as a render texture, along with how to apply that texture onto a :ref:`MeshInstance ` node. - -_________________ - -Before we jump into the tutorial, let's take a moment to talk about how the nodes used for VR work. - -The :ref:`ARVROrigin ` node is the center point of the VR tracking system. The position of the :ref:`ARVROrigin ` is the position -the VR system considers the 'center' point on the floor. The :ref:`ARVROrigin ` has a `world scale` property that effects the size of the user within -the VR scene. For this tutorial, it is set to `1.4`, as the world was originally just a tad to big. As mentioned earlier, keeping the scale relatively consistent is -important in VR. - -The :ref:`ARVRCamera ` is the player's headset and view into the scene. The :ref:`ARVRCamera ` is offset on the Y axis by the VR user's height, -which will be important later when we add teleportation locomotoin. If the VR system supports room tracking, then the :ref:`ARVRCamera ` will move as the player moves. -This means that the :ref:`ARVRCamera ` is not guaranteed to be in the same position as the :ref:`ARVROrigin ` node. - -The :ref:`ARVRController ` node represents a VR controller. The :ref:`ARVRController ` will follow the position and rotation of the VR -controller relative to the :ref:`ARVROrigin ` node. All of the input for the VR controllers happens through the :ref:`ARVRController ` node. -An :ref:`ARVRController ` node with an ``ID`` of ``1`` represents the left VR controller, while an :ref:`ARVRController ` controller with an -``ID`` of ``2`` represents the right VR controller. - -To summarize: - -- The :ref:`ARVROrigin ` node is the center of the VR tracking system and is positioned on the floor. 
- -- The :ref:`ARVRCamera ` is the player's VR headset and view into the scene. - -- The :ref:`ARVRCamera ` node is offset on the Y axis by the user's height. - -- If the VR system supports room tracking, then the :ref:`ARVRCamera ` node may be offset on the X and Z axes as the player moves. - -- The :ref:`ARVRController ` nodes represent the VR controllers and handle all of the input from the VR controllers. - - -Starting VR ------------ - -Now that we have gone over the VR nodes, let's start working on the project. While in ``Game.tscn``, select the ``Game`` node and make a new script called ``Game.gd``. -In the ``Game.gd`` file, add the following code: - -.. tabs:: - .. code-tab:: gdscript GDScript - - extends Spatial - - func _ready(): - var VR = ARVRServer.find_interface("OpenVR") - if VR and VR.initialize(): - get_viewport().arvr = true - - OS.vsync_enabled = false - Engine.target_fps = 90 - # Also, the physics FPS in the project settings is also 90 FPS. This makes the physics - # run at the same frame rate as the display, which makes things look smoother in VR! - - .. code-tab:: csharp - - using Godot; - using System; - - public class Game : Spatial - { - public override void _Ready() - { - var vr = ARVRServer.FindInterface("OpenVR"); - if (vr != null && vr.Initialize()) - { - GetViewport().Arvr = true; - - OS.VsyncEnabled = false; - Engine.TargetFps = 90; - // Also, the physics FPS in the project settings is also 90 FPS. This makes the physics - // run at the same frame rate as the display, which makes things look smoother in VR! - } - } - } - -Let's go over what this code does. - -_________________ - -In the ``_ready`` function, we first get the OpenVR VR interface using the ``find_interface`` function in the :ref:`ARVRServer ` and assign it to a variable -called `VR`. If the :ref:`ARVRServer ` finds an interface with the name OpenVR, it will return it, otherwise it will return ``null``. - -.. note:: The OpenVR VR interface is not included with Godot by default. You will need to download the OpenVR asset from the - `Asset Library `__ or `GitHub `__. - -The code then combines two conditionals, one to check if the `VR` variable is NOT null (``if VR``) and another calls the initialize function, which returns a boolean based on -whether the OpenVR interface was able to initialize or not. If both of these conditionals return true, then we can turn the main Godot :ref:`Viewport ` into -an ARVR viewport. - -If the VR interface initialized successfully, we then get the root :ref:`Viewport ` and set the `arvr` property to ``true``. This will tell Godot to use the initialized -ARVR interface to drive the :ref:`Viewport ` display. - -Finally, we disable VSync so the Frames Per Second (FPS) is not capped by the computer monitor. After this we tell Godot to render at ``90`` frames per second, which is the -standard for most VR headsets. Without disabling VSync, the normal computer monitor may limit the frame rate of the VR headset to the frame rate of the computer monitor. - -.. note:: In the project settings, under the ``Physics->Common`` tab, the physics FPS has been set to ``90``. This makes the physics engine run at the same frame rate as - the VR display, which makes physics reactions look smoother when in VR. - -_________________ - -That is all we need to do for Godot to launch OpenVR within the project! Go ahead and give it a try if you want. Assuming everything works, you will be able to look around -the world. 
If you have a VR headset with room tracking, then you will be able to move around the scene within the limits of the room tracking. - -Creating the controllers ------------------------- - -.. image:: img/starter_vr_tutorial_hands.png - -Right now all that the VR user can do is stand around, which isn't really what we are going for unless we are working on a VR film. Lets write the code for the -VR controllers. We are going to write all the code for the VR controllers in one go, so the code is rather long. That said, once we are finished you will be -able to teleport around the scene, artificially move using the touchpad/joystick on the VR controller, and be able to pick up, drop, and throw -:ref:`RigidBody `-based nodes. - -First we need to open the scene used for the VR controllers. ``Left_Controller.tscn`` or ``Right_Controller.tscn``. Let's briefly go over how the scene is setup. - -How the VR controller scene is setup -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -In both scenes the root node is a ARVRController node. The only difference is that the ``Left_Controller`` scene has the ``Controller Id`` property set to ``1`` while -the ``Right_Controller`` has the ``Controller Id`` property set to ``2``. - -.. note:: The :ref:`ARVRServer ` attempts to use these two IDs for the left and right VR controllers. For VR systems that support more than 2 - controllers/tracked-objects, these IDs may need adjusting. - -Next is the ``Hand`` :ref:`MeshInstance ` node. This node is used to display the hand mesh that will be used when the VR controller is not holding onto a -:ref:`RigidBody ` node. The hand in the ``Left_Controller`` scene is a left hand, while the hand on the ``Right_Controller`` scene is a right hand. - -The node named ``Raycast`` is a :ref:`Raycast ` node that is used for aiming where to teleport to when the VR controller is teleporting. -The length of the :ref:`Raycast ` is set to ``-16`` on the Y axis and is rotated so that it points out of the pointer finger of the hand. The ``Raycast`` node has -a single child node, ``Mesh``, that is a :ref:`MeshInstance `. This is used for visually showing where the teleportation :ref:`Raycast ` is aiming. - -The node named ``Area`` is a :ref:`Area ` node will be used for grabbing :ref:`RigidBody `-based nodes when the VR controller grab mode is set to ``AREA``. -The ``Area`` node has a single child node, ``CollisionShape``, that defines a sphere :ref:`CollisionShape `. When the VR controller is not holding any objects and the grab button is pressed, -the first :ref:`RigidBody `-based node within the ``Area`` node will be picked up. - -Next is a :ref:`Position3D ` node called ``Grab_Pos``. This is used to define the position that grabbed :ref:`RigidBody ` nodes will follow then -they are held by the VR controller. - -A large :ref:`Area ` node called ``Sleep_Area`` is used to disable sleeping for any RigidBody nodes within its :ref:`CollisionShape `, -simple called ``CollisionShape``. This is needed because if a :ref:`RigidBody ` node falls asleep, then the VR controller will be unable to grab it. -By using ``Sleep_Area``, we can write code that makes any :ref:`RigidBody ` node within it not able to sleep, therefore allowing the VR controller to grab it. - -An :ref:`AudioStreamPlayer3D ` node called ``AudioStreamPlayer3D`` has a sound loaded that we will use when an object has been picked up, dropped -or thrown by the VR controller. 
While this is not necessary for the functionality of the VR controller, it makes grabbing and dropping objects feel more natural. - -Finally, the last nodes are the ``Grab_Cast`` node and it's only child node, ``Mesh``. The ``Grab_Cast`` node will be used for grabbing :ref:`RigidBody `-based -nodes when the VR controller grab mode is set to ``RAYCAST``. This will allow the VR controller to grab objects that are just slightly out of reach using a Raycast. The ``Mesh`` -node is used for visually showing where the teleportation :ref:`Raycast ` is aiming. - -That is a quick overview of how the VR controller scenes are setup, and how we will be using the nodes to provide the functionality for them. Now that we have looked at the -VR controller scene, let's write the code that will drive them. - -The code for the VR controllers -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -Select the root node of the scene, either ``Right_Controller`` or ``Left_Controller``, and make a new script called ``VR_Controller.gd``. Both scenes will be using -the same script, so it doesn't matter which you use first. With ``VR_Controller.gd`` opened, add the following code: - -.. tip:: You can copy and paste the code from this page directly into the script editor. - - If you do this, all the code copied will be using spaces instead of tabs. - - To convert the spaces to tabs in the script editor, click the ``Edit`` menu and select ``Convert Indent To Tabs``. - This will convert all the spaces into tabs. You can select ``Convert Indent To Spaces`` to convert tabs back into spaces. - -.. tabs:: - .. code-tab:: gdscript GDScript - - extends ARVRController - - var controller_velocity = Vector3(0,0,0) - var prior_controller_position = Vector3(0,0,0) - var prior_controller_velocities = [] - - var held_object = null - var held_object_data = {"mode":RigidBody.MODE_RIGID, "layer":1, "mask":1} - - var grab_area - var grab_raycast - - var grab_mode = "AREA" - var grab_pos_node - - var hand_mesh - var hand_pickup_drop_sound - - var teleport_pos = Vector3.ZERO - var teleport_mesh - var teleport_button_down - var teleport_raycast - - # A constant to define the dead zone for both the trackpad and the joystick. - # See https://web.archive.org/web/20191208161810/http://www.third-helix.com/2013/04/12/doing-thumbstick-dead-zones-right.html - # for more information on what dead zones are, and how we are using them in this project. - const CONTROLLER_DEADZONE = 0.65 - - const MOVEMENT_SPEED = 1.5 - - const CONTROLLER_RUMBLE_FADE_SPEED = 2.0 - - var directional_movement = false - - - func _ready(): - # Ignore the warnings the from the connect function calls. 
- # (We will not need the returned values for this tutorial) - # warning-ignore-all:return_value_discarded - - teleport_raycast = get_node("RayCast") - - teleport_mesh = get_tree().root.get_node("Game/Teleport_Mesh") - - teleport_button_down = false - teleport_mesh.visible = false - teleport_raycast.visible = false - - grab_area = get_node("Area") - grab_raycast = get_node("Grab_Cast") - grab_pos_node = get_node("Grab_Pos") - - grab_mode = "AREA" - grab_raycast.visible = false - - get_node("Sleep_Area").connect("body_entered", self, "sleep_area_entered") - get_node("Sleep_Area").connect("body_exited", self, "sleep_area_exited") - - hand_mesh = get_node("Hand") - hand_pickup_drop_sound = get_node("AudioStreamPlayer3D") - - connect("button_pressed", self, "button_pressed") - connect("button_release", self, "button_released") - - - func _physics_process(delta): - if rumble > 0: - rumble -= delta * CONTROLLER_RUMBLE_FADE_SPEED - if rumble < 0: - rumble = 0 - - if teleport_button_down == true: - teleport_raycast.force_raycast_update() - if teleport_raycast.is_colliding(): - if teleport_raycast.get_collider() is StaticBody: - if teleport_raycast.get_collision_normal().y >= 0.85: - teleport_pos = teleport_raycast.get_collision_point() - teleport_mesh.global_transform.origin = teleport_pos - - - if get_is_active() == true: - _physics_process_update_controller_velocity(delta) - - if held_object != null: - var held_scale = held_object.scale - held_object.global_transform = grab_pos_node.global_transform - held_object.scale = held_scale - - _physics_process_directional_movement(delta); - - - func _physics_process_update_controller_velocity(delta): - controller_velocity = Vector3(0,0,0) - - if prior_controller_velocities.size() > 0: - for vel in prior_controller_velocities: - controller_velocity += vel - - controller_velocity = controller_velocity / prior_controller_velocities.size() - - var relative_controller_position = (global_transform.origin - prior_controller_position) - - controller_velocity += relative_controller_position - - prior_controller_velocities.append(relative_controller_position) - - prior_controller_position = global_transform.origin - - controller_velocity /= delta; - - if prior_controller_velocities.size() > 30: - prior_controller_velocities.remove(0) - - - func _physics_process_directional_movement(delta): - var trackpad_vector = Vector2(-get_joystick_axis(1), get_joystick_axis(0)) - var joystick_vector = Vector2(-get_joystick_axis(5), get_joystick_axis(4)) - - if trackpad_vector.length() < CONTROLLER_DEADZONE: - trackpad_vector = Vector2(0,0) - else: - trackpad_vector = trackpad_vector.normalized() * ((trackpad_vector.length() - CONTROLLER_DEADZONE) / (1 - CONTROLLER_DEADZONE)) - - if joystick_vector.length() < CONTROLLER_DEADZONE: - joystick_vector = Vector2(0,0) - else: - joystick_vector = joystick_vector.normalized() * ((joystick_vector.length() - CONTROLLER_DEADZONE) / (1 - CONTROLLER_DEADZONE)) - - var forward_direction = get_parent().get_node("Player_Camera").global_transform.basis.z.normalized() - var right_direction = get_parent().get_node("Player_Camera").global_transform.basis.x.normalized() - - # Because the trackpad and the joystick will both move the player, we can add them together and normalize - # the result, giving the combined movement direction - var movement_vector = (trackpad_vector + joystick_vector).normalized() - - var movement_forward = forward_direction * movement_vector.x * delta * MOVEMENT_SPEED - var movement_right = right_direction * 
movement_vector.y * delta * MOVEMENT_SPEED - - movement_forward.y = 0 - movement_right.y = 0 - - if movement_right.length() > 0 or movement_forward.length() > 0: - get_parent().global_translate(movement_right + movement_forward) - directional_movement = true - else: - directional_movement = false - - - func button_pressed(button_index): - if button_index == 15: - _on_button_pressed_trigger() - - if button_index == 2: - _on_button_pressed_grab() - - if button_index == 1: - _on_button_pressed_menu() - - - func _on_button_pressed_trigger(): - if held_object == null: - if teleport_mesh.visible == false: - teleport_button_down = true - teleport_mesh.visible = true - teleport_raycast.visible = true - else: - if held_object is VR_Interactable_Rigidbody: - held_object.interact() - - - func _on_button_pressed_grab(): - if teleport_button_down == true: - return - - if held_object == null: - _pickup_rigidbody() - else: - _throw_rigidbody() - - hand_pickup_drop_sound.play() - - - func _pickup_rigidbody(): - var rigid_body = null - - if grab_mode == "AREA": - var bodies = grab_area.get_overlapping_bodies() - if len(bodies) > 0: - for body in bodies: - if body is RigidBody: - if !("NO_PICKUP" in body): - rigid_body = body - break - - elif grab_mode == "RAYCAST": - grab_raycast.force_raycast_update() - if grab_raycast.is_colliding(): - var body = grab_raycast.get_collider() - if body is RigidBody: - if !("NO_PICKUP" in body): - rigid_body = body - - - if rigid_body != null: - - held_object = rigid_body - - held_object_data["mode"] = held_object.mode - held_object_data["layer"] = held_object.collision_layer - held_object_data["mask"] = held_object.collision_mask - - held_object.mode = RigidBody.MODE_STATIC - held_object.collision_layer = 0 - held_object.collision_mask = 0 - - hand_mesh.visible = false - grab_raycast.visible = false - - if held_object is VR_Interactable_Rigidbody: - held_object.controller = self - held_object.picked_up() - - - func _throw_rigidbody(): - if held_object == null: - return - - held_object.mode = held_object_data["mode"] - held_object.collision_layer = held_object_data["layer"] - held_object.collision_mask = held_object_data["mask"] - - held_object.apply_impulse(Vector3(0, 0, 0), controller_velocity) - - if held_object is VR_Interactable_Rigidbody: - held_object.dropped() - held_object.controller = null - - held_object = null - hand_mesh.visible = true - - if grab_mode == "RAYCAST": - grab_raycast.visible = true - - - func _on_button_pressed_menu(): - if grab_mode == "AREA": - grab_mode = "RAYCAST" - if held_object == null: - grab_raycast.visible = true - - elif grab_mode == "RAYCAST": - grab_mode = "AREA" - grab_raycast.visible = false - - - func button_released(button_index): - if button_index == 15: - _on_button_released_trigger() - - - func _on_button_released_trigger(): - if teleport_button_down == true: - - if teleport_pos != null and teleport_mesh.visible == true: - var camera_offset = get_parent().get_node("Player_Camera").global_transform.origin - get_parent().global_transform.origin - camera_offset.y = 0 - - get_parent().global_transform.origin = teleport_pos - camera_offset - - teleport_button_down = false - teleport_mesh.visible = false - teleport_raycast.visible = false - teleport_pos = null - - - func sleep_area_entered(body): - if "can_sleep" in body: - body.can_sleep = false - body.sleeping = false - - - func sleep_area_exited(body): - if "can_sleep" in body: - # Allow the CollisionBody to sleep by setting the "can_sleep" variable to true - body.can_sleep = true 
- -This is quite a bit of code to go through. Let's go through what the code does step-by-step. - -Explaining the VR controller code -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -First, let's go through all the class variables in the script: - -* ``controller_velocity``: A variable to hold a rough approximation of the VR controller's velocity. -* ``prior_controller_position``: A variable to hold the VR controller's last position in 3D space. -* ``prior_controller_velocities``: An Array to hold the last 30 calculated VR controller velocities. This is used to smooth the velocity calculations over time. -* ``held_object``: A variable to hold a reference to the object the VR controller is holding. If the VR controller is not holding any objects, this variable will be ``null``. -* ``held_object_data``: A dictionary to hold data for the :ref:`RigidBody ` node being held by the VR controller. This is used to reset the :ref:`RigidBody `'s data when it is no longer held. -* ``grab_area``: A variable to hold the :ref:`Area ` node used to grab objects with the VR controller. -* ``grab_raycast``: A variable to hold the :ref:`Raycast ` node used to grab objects with the VR controller. -* ``grab_mode``: A variable to define the grab mode the VR controller is using. There are only two modes for grabbing objects in this tutorial, ``AREA`` and ``RAYCAST``. -* ``grab_pos_node``: A variable to hold the node that will be used to update the position and rotation of held objects. -* ``hand_mesh``: A variable to hold the :ref:`MeshInstance ` node that contains the hand mesh for the VR controller. This mesh will be shown when the VR controller is not holding anything. -* ``hand_pickup_drop_sound``: A variable to hold the :ref:`AudioStreamPlayer3D ` node that contains the pickup/drop sound. -* ``teleport_pos``: A variable to hold the position the player will be teleported to when the VR controller teleports the player. -* ``teleport_mesh``: A variable to hold the :ref:`MeshInstance ` node used to show where the player is teleporting to. -* ``teleport_button_down``: A variable used to track whether the controller's teleport button is held down. This will be used to detect if this VR controller is trying to teleport the player. -* ``teleport_raycast``: A variable to hold the :ref:`Raycast ` node used to calculate the teleport position. This node also has a :ref:`MeshInstance ` that acts as a 'laser sight' for aiming. -* ``CONTROLLER_DEADZONE``: A constant to define the deadzone for both the trackpad and the joystick on the VR controller. See the note below for more information. -* ``MOVEMENT_SPEED``: A constant to define the speed the player moves at when using the trackpad/joystick to move artificially. -* ``CONTROLLER_RUMBLE_FADE_SPEED``: A constant to define how fast the VR controller rumble fades. -* ``directional_movement``: A variable to hold whether this VR controller is moving the player using the touchpad/joystick. - -.. note:: You can find a great article explaining all about how to handle touchpad/joystick dead zones `here `__. - - We are using a translated version of the scaled radial dead zone code provided in that article for the VR controller's joystick/touchpad. - The article is a great read, and I highly suggest giving it a look! - -That is quite a few class variables. Most of them are used to hold references to nodes we will need throughout the code. Next let's start looking at the functions, starting -with the ``_ready`` function. 
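To make the dead zone handling easier to follow on its own, here is a minimal sketch of the scaled radial dead zone technique described in the note above. The ``apply_deadzone`` helper name is hypothetical and is not part of the starter assets; the VR controller script applies the same logic inline in ``_physics_process_directional_movement``.

.. tabs::
 .. code-tab:: gdscript GDScript

    # A minimal sketch of the scaled radial dead zone used for the trackpad and joystick.
    # `apply_deadzone` is a hypothetical helper; the project applies this logic inline.
    const CONTROLLER_DEADZONE = 0.65

    func apply_deadzone(input_vector, deadzone):
        # Ignore input entirely while it is inside the dead zone.
        if input_vector.length() < deadzone:
            return Vector2(0, 0)
        # Rescale the remaining range so the output ramps smoothly
        # from 0 at the dead zone edge to 1 at full deflection.
        return input_vector.normalized() * ((input_vector.length() - deadzone) / (1 - deadzone))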
- -_________________ - -``_ready`` function step-by-step explanation -"""""""""""""""""""""""""""""""""""""""""""" - -First we tell Godot to silence the warnings about not using the values returned by the ``connect`` function. We will not need the returned -values for this tutorial. - -Next we get the :ref:`Raycast ` node we are going to use for determining the position for teleporting and assign it to the ``teleport_raycast`` variable. -We then get the :ref:`MeshInstance ` node that we will use to show where the player will be teleporting to. The node we are using for teleporting -is a child of the ``Game`` scene. We do this so the teleport mesh node is not effected by changes in the VR controller, and so the teleport mesh can be used by both VR controllers. - -Then the ``teleport_button_down`` variable is set to false, ``teleport_mesh.visible`` is set to ``false``, and ``teleport_raycast.visible`` is set to ``false``. This sets up the variables -for teleporting the player into their initial, not teleporting the player, state. - -The code then gets the ``grab_area`` node, the ``grab_raycast`` node, and the ``grab_pos_node`` node and assigns them all to their respective variables for use later. - -Next the ``grab_mode`` is set to ``AREA`` so the VR controller will attempt to grab objects using the :ref:`Area ` node defined in ``grab_area`` when the VR controller's -grab/grip button is pressed. We also set the ``grab_raycast`` node's ``visible`` property to ``false`` so the 'laser sight' child node of ``grab_raycast`` is not visible. - -After that we connect the ``body_entered`` and ``body_exited`` signals from the ``Sleep_Area`` node in the VR controller to the ``sleep_area_entered`` and ``sleep_area_exited`` functions. -The ``sleep_area_entered`` and ``sleep_area_exited`` functions will be used to make :ref:`RigidBody ` nodes unable to sleep when nearby the VR controller. - -Then the ``hand_mesh`` and ``hand_pickup_drop_sound`` nodes are gotten and assigned them to their respective variables for use later. - -Finally, the ``button_pressed`` and ``button_release`` signals in the :ref:`ARVRController ` node, which the VR controller extends, are connected to the -``button_pressed`` and ``button_released`` functions respectively. This means that when a button on the VR controller is pressed or released, the ``button_pressed`` or ``button_released`` -functions defined in this script will be called. - - -``_physics_process`` function step-by-step explanation -"""""""""""""""""""""""""""""""""""""""""""""""""""""" - -First we check to see if the ``rumble`` variable is more than zero. If the ``rumble`` variable, which is a property of the :ref:`ARVRController ` node, is more -than zero then the VR controller rumbles. - -If the ``rumble`` variable is more than zero, then we reduce the rumble by ``CONTROLLER_RUMBLE_FADE_SPEED`` every second by subtracting ``CONTROLLER_RUMBLE_FADE_SPEED`` multiplied by delta. -There is then a ``if`` condition to check if ``rumble`` is less than zero, which sets ``rumble`` to zero if its value is less than zero. - -This small section of code is all we need for reducing the VR controller's rumble. Now when we set ``rumble`` to a value, this code will automatically make it fade over time. - -_________________ - -The first section of code checks to see if the ``teleport_button_down`` variable is equal to ``true``, which means this VR controller is trying to teleport. 
- -If ``teleport_button_down`` is equal to ``true``, we force the ``teleport_raycast`` :ref:`Raycast ` node to update using the ``force_raycast_update`` function. The ``force_raycast_update`` function will update the properties within the :ref:`Raycast ` node with the latest version of the physics world. - -The code then checks to see if the ``teleport_raycast`` collided with anything by checking if the ``is_colliding`` function in ``teleport_raycast`` returns ``true``. If the :ref:`Raycast ` collided with something, we then check to see if the :ref:`PhysicsBody ` the raycast collided with is a :ref:`StaticBody ` or not. We then check to see if the collision normal vector returned by the raycast is greater than or equal to ``0.85`` on the Y axis. - -.. note:: We do this because we do not want the user to be able to teleport onto RigidBody nodes and we only want the player to be able to teleport on floor-like surfaces. - -If all these conditions are met, then we assign the ``teleport_pos`` variable to the ``get_collision_point`` function in ``teleport_raycast``. This will assign ``teleport_pos`` to the position the raycast collided at in world space. We then move the ``teleport_mesh`` to the world position stored in ``teleport_pos``. - -This section of code will get the position the player is aiming at with the teleportation raycast and update the teleportation mesh, giving a visual update on where the user will be teleporting to when they release the teleport button. - -_________________ - -The next section of code first checks to see if the VR controller is active through the ``get_is_active`` function, which is defined by :ref:`ARVRController `. If the VR controller is active, then it calls the ``_physics_process_update_controller_velocity`` function. - -The ``_physics_process_update_controller_velocity`` function will calculate the VR controller's velocity through changes in position. It is not perfect, but this process gets a rough idea of the velocity of the VR controller, which is fine for the purposes of this tutorial. - -_________________ - -The next section of code checks to see if the VR controller is holding an object by checking to see if the ``held_object`` variable is not equal to ``null``. - -If the VR controller is holding an object, we first store its scale in a temporary variable called ``held_scale``. We then set the ``global_transform`` of the held object to the ``global_transform`` of the ``grab_pos_node`` node. This will make the held object have the same position, rotation, and scale as the ``grab_pos_node`` node in world space. - -However, because we do not want the held object to change in scale when it is grabbed, we need to set the ``scale`` property of the ``held_object`` node back to ``held_scale``. - -This section of code will keep the held object in the same position and rotation as the VR controller, keeping it synced with the VR controller. - -_________________ - -Finally, the last section of code simply calls the ``_physics_process_directional_movement`` function. This function contains all the code for moving the player when the touchpad/joystick on the VR controller moves. - - -``_physics_process_update_controller_velocity`` function step-by-step explanation -""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""" - -First this function resets the ``controller_velocity`` variable to zero :ref:`Vector3 `.
- -_________________ - -Then we check to see if there are any stored/cached VR controller velocities saved in the ``prior_controller_velocities`` array. We do this by checking to see if the ``size()`` function -returns a value greater than ``0``. If there are cached velocities within ``prior_controller_velocities``, then we iterate through each of the stored velocities using a ``for`` loop. - -For each of the cached velocities, we simply add its value to ``controller_velocity``. Once the code has gone through all of the cached velocities in ``prior_controller_velocities``, -we divide ``controller_velocity`` by the size of the ``prior_controller_velocities`` array, which will give us the combined velocity value. This helps take the previous velocities into -account, making the direction of the controller's velocity more accurate. - -_________________ - -Next we calculate the change in position the VR controller has taken since the last ``_physics_process`` function call. We do this by subtracting ``prior_controller_position`` from the -global position of the VR controller, ``global_transform.origin``. This will give us a :ref:`Vector3 ` that points from the position in ``prior_controller_position`` to -the current position of the VR controller, which we store in a variable called ``relative_controller_position``. - -Next we add the change in position to ``controller_velocity`` so the latest change in position is taken into account in the velocity calculation. We then add ``relative_controller_position`` -to ``prior_controller_velocities`` so it can be taken into account on the next calculation of the VR controller's velocity. - -Then ``prior_controller_position`` is updated with the global position of the VR controller, ``global_transform.origin``. We then divide ``controller_velocity`` by ``delta`` so the velocity -is higher, giving results like those we expect, while still being relative to the amount of time that has passed. It is not a perfect solution, but the results look decent most of the time -and for the purposes of this tutorial, it is good enough. - -Finally, the function checks to see if the ``prior_controller_velocities`` has more than ``30`` velocities cached by checking if the ``size()`` function returns a value greater than ``30``. -If there are more than ``30`` cached velocities stored in ``prior_controller_velocities``, then we simply remove the oldest cached velocity by calling the ``remove`` function and passing in -a index position of ``0``. - -_________________ - -What this function ultimately does is that it gets a rough idea of the VR controller's velocity by calculating the VR controller's relative changes in position -over the last thirty ``_physics_process`` calls. While this is not perfect, it gives a decent idea of how fast the VR controller is moving in 3D space. - - -``_physics_process_directional_movement`` function step-by-step explanation -""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""" - -First this function gets the axes for the trackpad and the joystick and assigns them to :ref:`Vector2 ` variables called ``trackpad_vector`` and ``joystick_vector`` respectively. - -.. note:: You may need to remap the joystick and/or touchpad index values depending on your VR headset and controller. The inputs in this tutorial are the index values of a - Windows Mixed Reality headset. - -Then ``trackpad_vector`` and ``joystick_vector`` have their deadzones account for. 
The code for this is detailed in the article below, with slight changes as the code is converted from -C# to GDScript. - -Once the ``trackpad_vector`` and ``joystick_vector`` variables have had their deadzones account for, the code then gets the forward and right direction vectors relative to the -global transform of the :ref:`ARVRCamera `. What this does is that it gives us vectors that point forward and right relative to the rotation of the user camera, -the :ref:`ARVRCamera `, in world space. These vectors point in the same direction of the blue and red arrows when you select an object in the Godot editor with -the ``local space mode`` button enabled. The forward direction vector is stored in a variable called ``forward_direction``, while the right direction vector is stored in a variable -called ``right_direction``. - -Next the code adds the ``trackpad_vector`` and ``joystick_vector`` variables together and normalizes the results using the ``normalized`` function. This gives us the -combined movement direction of both input devices, so we can use a single :ref:`Vector2 ` for moving the user. We assign the combined direction to a variable called ``movement_vector``. - -Then we calculate the distance the user will move forward, relative to the forward direction stored in ``forward_direction``. To calculate this, we multiply ``forward_direction`` by ``movement_vector.x``, -``delta``, and ``MOVEMENT_SPEED``. This will give us the distance the user will move forward when the trackpad/joystick is pushed forward or backwards. We assign this to a variable called -``movement_forward``. - -We do a similar calculation for the distance the user will move right, relative to the right direction stored in ``right_direction``. To calculate the distance the user will move right, -we multiply ``right_direction`` by ``movement_vector.y``, ``delta``, and ``MOVEMENT_SPEED``. This will give us the distance the user will move right when the trackpad/joystick is pushed right or left. -We assign this to a variable called ``movement_right``. - -Next we remove any movement on the ``Y`` axis of ``movement_forward`` and ``movement_right`` by assigning their ``Y`` values to ``0``. We do this so the user cannot fly/fall simply by moving the trackpad -or joystick. Without doing this, the player could fly in the direction they are facing. - -Finally, we check to see if the ``length`` function on ``movement_right`` or ``movement_forward`` is greater than ``0``. If it is, then we need to move the user. To move the user, we perform a global -translation to the :ref:`ARVROrigin ` node using ``get_parent().global_translate`` and pass in the ``movement_right`` variable with the ``movement_forward`` variable added to it. This -will move the player in the direction the trackpad/joystick is pointing, relative to the rotation of the VR headset. We also set the ``directional_movement`` variable to ``true`` so the code knows this -VR controller is moving the player. - -If the ``length`` function on ``movement_right`` or ``movement_forward`` is less than or equal to ``0``, then we simply set the ``directional_movement`` variable to ``false`` so the code knows this VR -controller is not moving the player. - - -_________________ - -What this function ultimately does is takes the input from the VR controller's trackpad and joystick and moves the player in the direction the player is pushing them. 
Movement is relative to the rotation of the VR headset, so if the player pushes forward and turns their head to the left, they will move to the left. - - -``button_pressed`` function step-by-step explanation -"""""""""""""""""""""""""""""""""""""""""""""""""""" - -This function checks to see if the VR button that was just pressed is equal to one of the VR buttons used in this project. The ``button_index`` variable is passed in by the ``button_pressed`` signal in :ref:`ARVRController `, which we connected in the ``_ready`` function. - -There are only three buttons we are looking for in this project: the trigger button, the grab/grip button, and the menu button. - -.. note:: You may need to remap these button index values depending on your VR headset and controller. The inputs in this tutorial are the index values of a - Windows Mixed Reality headset. - -First we check if the ``button_index`` is equal to ``15``, which should map to the trigger button on the VR controller. If the button pressed is the trigger button, then the ``_on_button_pressed_trigger`` function is called. - -If the ``button_index`` is equal to ``2``, then the grab button was just pressed. If the button pressed is the grab button, the ``_on_button_pressed_grab`` function is called. - -Finally, if the ``button_index`` is equal to ``1``, then the menu button was just pressed. If the button pressed is the menu button, the ``_on_button_pressed_menu`` function is called. - - -``_on_button_pressed_trigger`` function step-by-step explanation -"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""" - -First this function checks to see if the VR controller is not holding anything by checking if ``held_object`` is equal to ``null``. If the VR controller is not holding anything, then we assume that the trigger press on the VR controller was for teleporting. We then make sure that ``teleport_mesh.visible`` is equal to ``false``. We use this to tell if the other VR controller is trying to teleport or not, as ``teleport_mesh`` will be visible if the other VR controller is teleporting. - -If ``teleport_mesh.visible`` is equal to ``false``, then we can teleport with this VR controller. We set the ``teleport_button_down`` variable to ``true``, set ``teleport_mesh.visible`` to ``true``, and set ``teleport_raycast.visible`` to ``true``. This will tell the code in ``_physics_process`` that this VR controller is going to teleport. It will also make the ``teleport_mesh`` visible so the user knows where they are teleporting to, and make ``teleport_raycast`` visible so the player has a 'laser sight' they can use to aim where they will be teleporting. - -_________________ - -If ``held_object`` is not equal to ``null``, then the VR controller is holding something. We then check to see if the object that is being held, ``held_object``, extends a class called ``VR_Interactable_Rigidbody``. We have not made ``VR_Interactable_Rigidbody`` yet, but ``VR_Interactable_Rigidbody`` will be a custom class we will use on all of the special/custom :ref:`RigidBody `-based nodes in the project. - -.. tip:: Don't worry, we will cover ``VR_Interactable_Rigidbody`` after this section! - -If the ``held_object`` extends ``VR_Interactable_Rigidbody``, then we call the ``interact`` function, so the held object can do whatever it is supposed to do when the trigger is pressed and the object is held by the VR controller.
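For reference while reading the rest of the VR controller code, here is a rough sketch of the interface that the explanations above assume ``VR_Interactable_Rigidbody`` will have. This outline is only inferred from how the VR controller uses the class (the ``controller`` variable and the ``interact``, ``picked_up``, and ``dropped`` functions); the actual class is written in part 2 of this tutorial series.

.. tabs::
 .. code-tab:: gdscript GDScript

    # A rough outline of the interface the VR controller code expects.
    # The real VR_Interactable_Rigidbody class is created in part 2 of this series.
    class_name VR_Interactable_Rigidbody
    extends RigidBody

    # Set by the VR controller while it is holding this object, null otherwise.
    var controller = null

    func interact():
        # Called when the trigger is pressed while this object is held.
        pass

    func picked_up():
        # Called when a VR controller picks this object up.
        pass

    func dropped():
        # Called when the VR controller drops or throws this object.
        pass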
- - -``_on_button_pressed_grab`` function step-by-step explanation -"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""" - -First this function checks to see if ``teleport_button_down`` is equal to ``true``. If it is, then it calls ``return``. We do this because we do not want the user to be -able to pick up objects while teleporting. - -Then we check to see if the VR controller is currently not holding anything by checking if ``held_object`` is equal to ``null``. If the VR controller is not holding anything, -then the ``_pickup_rigidbody`` function is called. If the VR controller is holding something, ``held_object`` is not equal to ``null``, then the ``_throw_rigidbody`` function is called. - -Finally, the pick-up/drop sound is played by calling the ``play`` function on ``hand_pickup_drop_sound``. - - -``_pickup_rigidbody`` function step-by-step explanation -""""""""""""""""""""""""""""""""""""""""""""""""""""""" - -First the function makes a variable called ``rigid_body``, which we'll be using to store the :ref:`RigidBody ` that the VR controller is going to -pick up, assuming there is a RigidBody to pick up. - -_________________ - -Then the function checks to see if the ``grab_mode`` variable is equal to ``AREA``. If it is, then it gets all of the :ref:`PhysicsBody ` nodes within the ``grab_area`` using -the ``get_overlapping_bodies`` functions. This function will return an array of :ref:`PhysicsBody ` nodes. We assign the array of :ref:`PhysicsBody ` to a new -variable called ``bodies``. - -We then check to see if the length of the ``bodies`` variable is more than ``0``. If it is, we go through each of the :ref:`PhysicsBody ` nodes in ``bodies`` using a for loop. - -For each :ref:`PhysicsBody ` node, we check if it is, or extends, a :ref:`RigidBody ` node using ``if body is RigidBody``, which will return ``true`` if the -:ref:`PhysicsBody ` node is or extends the :ref:`RigidBody ` node. If the object is a :ref:`RigidBody `, then we check to make sure there is not -a variable/constant called ``NO_PICKUP`` defined in the body. We do this because if you want to have :ref:`RigidBody ` nodes that cannot be picked up, all you have to do is -define a constant/variable called ``NO_PICKUP`` and the VR controller will be unable to pick it up. If the :ref:`RigidBody ` node does not have a variable/constant defined with -the name ``NO_PICKUP``, then we assign the ``rigid_body`` variable to the :ref:`RigidBody ` node and break the for loop. - -What this section of code does is goes through all of the physics bodies within the ``grab_area`` and grabs the first :ref:`RigidBody ` node that does not have a -variable/constant named ``NO_PICKUP`` and assigns it to the ``rigid_body`` variable so we can do some additional post processing later in this function. - -_________________ - -If the ``grab_mode`` variable is not equal to ``AREA``, we then check to see if it is equal to ``RAYCAST`` instead. If it is equal to ``RAYCAST``, we force the ``grab_raycast`` node to update -using the ``force_raycast_update`` function. The ``force_raycast_update`` function will update the :ref:`Raycast ` with the latest changes in the physics world. We then check -to see if the ``grab_raycast`` node collided with something using the ``is_colliding`` function, which will return true if the :ref:`Raycast ` hit something. - -If the ``grab_raycast`` hit something, we get the :ref:`PhysicsBody ` node hit using the ``get_collider`` function. 
The code then checks to see if the node hit is a :ref:`RigidBody ` node using ``if body is RigidBody``, which will return ``true`` if the :ref:`PhysicsBody ` node is or extends the :ref:`RigidBody ` node. Then the code checks to see if the :ref:`RigidBody ` node does not have a variable named ``NO_PICKUP``, and if it does not, then it assigns the :ref:`RigidBody ` node to the ``rigid_body`` variable. - -What this section of code does is send the ``grab_raycast`` :ref:`Raycast ` node out and check if it collided with a :ref:`RigidBody ` node that does not have a variable/constant named ``NO_PICKUP``. If it collided with a RigidBody without ``NO_PICKUP``, it assigns the node to the ``rigid_body`` variable so we can do some additional post-processing later in this function. - -_________________ - -The final section of code first checks to see if ``rigid_body`` is not equal to ``null``. If ``rigid_body`` is not equal to ``null``, then the VR controller found a :ref:`RigidBody `-based node that can be picked up. - -If there is a :ref:`RigidBody ` node to pick up, we assign ``held_object`` to the :ref:`RigidBody ` node stored in ``rigid_body``. We then store the :ref:`RigidBody ` node's ``mode``, ``collision_layer``, and ``collision_mask`` in ``held_object_data`` using ``mode``, ``layer``, and ``mask`` as keys for the respective values. This is so we can reapply them later when the object is dropped by the VR controller. - -We then set the :ref:`RigidBody `'s mode to ``MODE_STATIC``, its ``collision_layer`` to zero, and its ``collision_mask`` to zero. This makes it so the held :ref:`RigidBody ` cannot interact with other objects in the physics world while it is held by the VR controller. - -Next the ``hand_mesh`` :ref:`MeshInstance ` is made invisible by setting the ``visible`` property to ``false``. This is so the hand does not get in the way of the held object. Likewise, the ``grab_raycast`` 'laser sight' is made invisible by setting the ``visible`` property to ``false``. - -Then the code checks to see if the held object extends a class called ``VR_Interactable_Rigidbody``. If it does, then it sets a variable called ``controller`` on ``held_object`` to ``self``, and calls the ``picked_up`` function on ``held_object``. While we haven't made ``VR_Interactable_Rigidbody`` just yet, what this will do is tell the ``VR_Interactable_Rigidbody`` class that it is being held by a VR controller, where a reference to the controller is stored in the ``controller`` variable, through calling the ``picked_up`` function. - -.. tip:: Don't worry, we will cover ``VR_Interactable_Rigidbody`` after this section! - - The code should make more sense after completing part 2 of this tutorial series, where we will actually be using ``VR_Interactable_Rigidbody``. - -What this section of code does is that, if a :ref:`RigidBody ` was found using the grab :ref:`Area ` or :ref:`Raycast `, it sets it up so that it can be carried by the VR controller. - -``_throw_rigidbody`` function step-by-step explanation -"""""""""""""""""""""""""""""""""""""""""""""""""""""" - -First the function checks to see if the VR controller is not holding any object by checking if the ``held_object`` variable is equal to ``null``. If it is, then it simply calls ``return`` so nothing happens. While this shouldn't be possible, as the ``_throw_rigidbody`` function should only be called if an object is held, this check helps ensure that if something strange happens, this function will react as expected.
- -After checking if the VR controller is holding an object, we assume it is and set the stored :ref:`RigidBody ` data back to the held object. We take the ``mode``, ``layer`` and -``mask`` data stored in the ``held_object_data`` dictionary and reapply it to the object in ``held_object``. This will set the :ref:`RigidBody ` back to the state it was prior to -being picked up. - -Then we call ``apply_impulse`` on the ``held_object`` so that the :ref:`RigidBody ` is thrown in the direction of the VR controller's velocity, ``controller_velocity``. - -We then check to see if the object held extends a class called ``VR_Interactable_Rigidbody``. If it does, then we call a function called ``dropped`` in ``held_object`` and set -``held_object.controller`` to ``null``. While we have not made ``VR_Interactable_Rigidbody`` yet, but what this will do is call the ``droppped`` function so the :ref:`RigidBody ` -can do whatever it needs to do when dropped, and we set the ``controller`` variable to ``null`` so that the :ref:`RigidBody ` knows that it is not being held. - -.. tip:: Don't worry, we will cover ``VR_Interactable_Rigidbody`` after this section! - - The code should make more sense after completing part 2 of this tutorial series, where we will actually be using ``VR_Interactable_Rigidbody``. - -Regardless of whether ``held_object`` extends ``VR_Interactable_Rigidbody`` or not, we then set ``held_object`` to ``null`` so the VR controller knows it is no longer holding anything. -Because the VR controller is no longer holding anything, we make the ``hand_mesh`` visible by setting ``hand_mesh.visible`` to true. - -Finally, if the ``grab_mode`` variable is set to ``RAYCAST``, we set ``grab_raycast.visible`` to ``true`` so the 'laser sight' for the :ref:`Raycast ` in ``grab_raycast`` is visible. - - -``_on_button_pressed_menu`` function step-by-step explanation -""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""" - -First this function checks to see if the ``grab_mode`` variable is equal to ``AREA``. If it is, then it sets ``grab_mode`` to ``RAYCAST``. It then checks to see if the VR controller is not -holding anything by checking to see if ``held_object`` is equal to ``null``. If the VR controller is not holding anything, then ``grab_raycast.visible`` is set to ``true`` so the -'laser sight' on the grab raycast is visible. - -If the ``grab_mode`` variable is not equal to ``AREA``, then it checks to see if it is equal to ``RAYCAST``. If it is, then it sets the ``grab_mode`` to ``AREA`` and sets ``grab_raycast.visible`` -to ``false`` so the 'laser sight' on the grab raycast is not visible. - -This section of code simply changes how the VR controller will grab :ref:`RigidBody `-based nodes when the grab/grip button is pressed. If ``grab_mode`` is set to ``AREA``, then -the :ref:`Area ` node in ``grab_area`` will be used for detecting :ref:`RigidBody ` nodes, while if ``grab_mode`` is set to ``RAYCAST`` the :ref:`Raycast ` -node in ``grab_raycast`` will be used for detecting :ref:`RigidBody ` nodes. - - -``button_released`` function step-by-step explanation -""""""""""""""""""""""""""""""""""""""""""""""""""""" - -The only section of code in this function checks to see if the index of the button that was just released, ``button_index``, is equal to ``15``, which should map to the trigger button -on the VR controller. The ``button_index`` variable is passed in by the ``button_release`` signal in :ref:`ARVRController `, which we connected in the ``_ready`` function. 
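-
-Because the function is so short, here is a sketch of the whole thing, assuming the ``button_release`` signal was connected to a ``button_released`` function in ``_ready`` as
-described above:
-
-.. tabs::
- .. code-tab:: gdscript GDScript
-
-    func button_released(button_index):
-        # Button index 15 should map to the trigger on the VR controller.
-        if button_index == 15:
-            _on_button_released_trigger()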
- -If the trigger button was just released, then the ``_on_button_released_trigger`` function is called. - - -``_on_button_released_trigger`` function step-by-step explanation -""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""" - -The only section of code in this function first checks to see if the VR controller is trying to teleport by checking if the ``teleport_button_down`` variable is equal to ``true``. - -If the ``teleport_button_down`` variable is equal to ``true``, the code then checks if there is a teleport position set and whether the teleport mesh is visible. It does this by -checking to see if ``teleport_pos`` is not equal to ``null`` and if ``teleport_mesh.visible`` is equal to ``true``. - -If there is a teleport position set and the teleport mesh is visible, the code then calculates the offset from the camera to the :ref:`ARVROrigin ` node, which is assumed to be the -parent node of the VR controller. To calculate the offset, the global position (``global_transform.origin``) of the ``Player_Camera`` node has the global position of the :ref:`ARVROrigin ` -subtracted from it. This will result in a vector that points from the :ref:`ARVROrigin ` to the :ref:`ARVRCamera `, which we store in a variable called ``camera_offset``. - -The reason we need to know the offset is because some VR headsets use room tracking, where the player's camera can be offset from the :ref:`ARVROrigin ` node. Because of this, when we teleport we want to -keep the offset created by room tracking so that when the player teleports, the offset created by the room tracking is not applied. Without this, if you moved in a room and then teleported, instead -of appearing at the position you wanted to teleport at, your position would be offset by the amount of distance you have from the :ref:`ARVROrigin ` node. - -Now that we know the offset from the VR camera to the VR origin, we need to remove the difference on the ``Y`` axis. We do this because we do not want to offset based on the user's height. -If we did not do this, when teleporting the player's head would be level with the ground. - -Then we can 'teleport' the player by setting the global position (``global_transform.origin``) of the ARVROrigin node to the position stored in ``teleport_pos`` with ``camera_offset`` subtracted from it. -This will teleport the player and remove the room tracking offset, so the user appears exactly where they want when teleporting. - -Finally, regardless of whether the VR controller teleported the user or not, we reset the teleport related variables. ``teleport_button_down`` is set to ``false``, ``teleport_mesh.visible`` is -set to ``false`` so the mesh is invisible, ``teleport_raycast.visible`` is set to ``false``, and ``teleport_pos`` is set to ``null``. - - -``sleep_area_entered`` function step-by-step explanation -"""""""""""""""""""""""""""""""""""""""""""""""""""""""" - -The only section of code in this function checks to see if the :ref:`PhysicsBody ` node that entered the ``Sleep_Area`` node -has a variable called ``can_sleep``. If it does, then it sets the ``can_sleep`` variable to ``false`` and sets the ``sleeping`` variable to ``false``. - -Without doing this, sleeping :ref:`PhysicsBody ` nodes would not be able to be picked up by the VR controller, even if the VR controller -is at the same position as the :ref:`PhysicsBody ` node. To work around this, we simply 'wake up' :ref:`PhysicsBody ` nodes -that are close to the VR controller. 
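-
-If you are writing ``VR_Controller.gd`` by hand rather than copying it, a minimal sketch of this 'wake up' logic might look like the following; the function name assumes the
-``Sleep_Area`` signal connections set up earlier in this tutorial:
-
-.. tabs::
- .. code-tab:: gdscript GDScript
-
-    func sleep_area_entered(body):
-        # Only wake up bodies that actually have a can_sleep property.
-        if "can_sleep" in body:
-            body.can_sleep = false
-            body.sleeping = false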
- - -``sleep_area_exited`` function step-by-step explanation -""""""""""""""""""""""""""""""""""""""""""""""""""""""" - -The only section of code in this function checks to see if the :ref:`PhysicsBody ` node that entered the ``Sleep_Area`` node -has a variable called ``can_sleep``. If it does, then it sets the ``can_sleep`` variable to ``true``. - -This allows :ref:`RigidBody ` nodes that leave the ``Sleep_Area`` to sleep again, saving performance. - -_________________ - -Okay, whew! That was a lot of code! Add the same script, ``VR_Controller.gd`` to the other VR controller scene so both VR controllers have the same script. - -Now we just need to do one thing before testing the project! Right now we are referencing a class called ``VR_Interactable_Rigidbody``, but we have not defined it yet. -While we will not be using ``VR_Interactable_Rigidbody`` in this tutorial, let's create it real quick so the project can be run. - - - -Creating a base class for interactable VR objects -------------------------------------------------- - -With the ``Script`` tab still open, create a new GDScript called ``VR_Interactable_Rigidbody.gd``. - -.. tip:: You can create GDScripts in the ``Script`` tab by pressing ``File -> New Script...``. - -Once you have ``VR_Interactable_Rigidbody.gd`` open, add the following code: - -.. tabs:: - .. code-tab:: gdscript GDScript - - class_name VR_Interactable_Rigidbody - extends RigidBody - - # (Ignore the unused variable warning) - # warning-ignore:unused_class_variable - var controller = null - - - func _ready(): - pass - - - func interact(): - pass - - - func picked_up(): - pass - - - func dropped(): - pass - - -Let's quickly go through what this script. - -_________________ - -First we start the script with ``class_name VR_Interactable_Rigidbody``. What this does is that it tells Godot that this GDScript is a new class that called ``VR_Interactable_Rigidbody``. -This allows us to compare nodes against the ``VR_Interactable_Rigidbody`` class in other script files without having to load the script directly or do anything special. We can compare -the class just like all of the built-in Godot classes. - -Next is a class variable called ``controller``. ``controller`` will be used to hold a reference to the VR controller that is currently holding the object. If a VR controller is not -holding the object, then the ``controller`` variable will be ``null``. The reason we need to have a reference to the VR controller is so held objects can access VR controller specific -data, like ``controller_velocity``. - -Finally, we have four functions. The ``_ready`` function is defined by Godot and all we do is simply have ``pass`` as there is nothing we need to do when the object is added to the scene -in ``VR_Interactable_Rigidbody``. - -The ``interact`` function is a stub function that will be called when the interact button on the VR controller, the trigger in this case, is pressed while the object is held. - -.. tip:: A stub function is a function that is defined but does not have any code. Stub functions are generally designed to be overwritten or extended. In this project, we are using - the stub functions so there is a consistent interface across all interactable :ref:`RigidBody ` objects. - -The ``picked_up`` and ``dropped`` functions are stub functions that will be called when the object is picked up and dropped by the VR controller. - -_________________ - -That is all we need to do for now! 
In the next part of this tutorial series, we'll start making special interactable :ref:`RigidBody ` objects. - -Now that the base class is defined, the code in the VR controller should work. Go ahead and try the game again, and you should find you can teleport around by pressing the touch pad, -and can grab and throw objects using the grab/grip buttons. - -Now, you may want to try moving using the trackpads and/or joysticks, but **it may make you motion sick!** - -One of the main reasons this can make you feel motion sick is because your vision tells you that you are moving, while your body is not moving. -This conflict of signals can make the body feel sick. Let's add a vignette shader to help reduce motion sickness while moving in VR! - - - -Reducing motion sickness ------------------------- - -.. note:: There are plenty of ways to reduce motion sickness in VR, and there is no one perfect way to reduce motion sickness. See - `this page on the Oculus Developer Center `__ - for more information on how to implement locomotion and reducing motion sickness. - -To help reduce motion sickness while moving, we are going to add a vignette effect that will only be visible while the player moves. - -First, quickly switch back to ``Game.tscn``. Under the :ref:`ARVROrigin ` node there is a child node called ``Movement_Vignette``. This node is going to apply a simple -vignette to the VR headset when the player is moving using the VR controllers. This should help reduce motion sickness. - -Open up ``Movement_Vignette.tscn``, which you can find in the ``Scenes`` folder. The scene is just a :ref:`ColorRect ` node with a custom -shader. Feel free to look at the custom shader if you want, it is just a slightly modified version of the vignette shader you can find in the -`Godot demo repository `__. - -Let's write the code that will make the vignette shader visible when the player is moving. Select the ``Movement_Vignette`` node and create a new script called ``Movement_Vignette.gd``. -Add the following code: - -.. tabs:: - .. code-tab:: gdscript GDScript - - extends Control - - var controller_one - var controller_two - - - func _ready(): - yield(get_tree(), "idle_frame") - yield(get_tree(), "idle_frame") - yield(get_tree(), "idle_frame") - yield(get_tree(), "idle_frame") - - var interface = ARVRServer.primary_interface - - if interface == null: - set_process(false) - printerr("Movement_Vignette: no VR interface found!") - return - - rect_size = interface.get_render_targetsize() - rect_position = Vector2(0,0) - - controller_one = get_parent().get_node("Left_Controller") - controller_two = get_parent().get_node("Right_Controller") - - visible = false - - - func _process(_delta): - if controller_one == null or controller_two == null: - return - - if controller_one.directional_movement == true or controller_two.directional_movement == true: - visible = true - else: - visible = false - -Because this script is fairly brief, let's quickly go over what it does. - - -Explaining the vignette code -^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -There are two class variables, ``controller_one`` and ``controller_two``. These variables will hold references to the left and right VR controllers. - -_________________ - -In the ``_ready`` function first waits for four frames using ``yield``. The reason we are waiting four frames is because we want to ensure the VR interface is ready -and accessible. 
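-
-If you prefer, the four identical ``yield`` calls at the top of ``_ready`` can be written as a small loop instead; the following is equivalent:
-
-.. tabs::
- .. code-tab:: gdscript GDScript
-
-    # Wait four frames so the VR interface has time to become ready.
-    for _i in range(4):
-        yield(get_tree(), "idle_frame")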
- -After waiting the primary VR interface is retrieved using ``ARVRServer.primary_interface``, which is assigned to a variable called ``interface``. -The code then checks to see if ``interface`` is equal to ``null``. If ``interface`` is equal to ``null``, then ``_process`` is disabled using ``set_process`` with a value of ``false``. - -If ``interface`` is not ``null``, then we set the ``rect_size`` of the vignette shader to the render size of the VR viewport so it takes up the entire screen. We need to do this because -different VR headsets have different resolutions and aspect ratios, so we need to resize the node accordingly. We also set the ``rect_position`` of the vignette shader to zero so it -is in the correct position relative to the screen. - -The left and right VR controllers are then retrieved and assigned to ``controller_one`` and ``controller_two`` respectively. Finally, the vignette shader is made invisible by default -by setting it's ``visible`` property to ``false``. - -_________________ - -In ``_process`` the code first checks if either ``controller_one`` or ``controller_two`` are equal to ``null``. If either node is equal to ``null``, then ``return`` is called so -nothing happens. - -Then the code checks to see if either of the VR controllers are moving the player using the touchpad/joystick by checking if ``directional_movement`` is equal to ``true`` in -``controller_one`` or ``controller_two``. If either of the VR controllers are moving the player, then the vignette shader makes itself visible by setting it's ``visible`` property -to ``true``. If neither VR controller is moving the player, so ``directional_movement`` is ``false`` in both VR controllers, than the vignette shader makes itself invisible by setting -it's ``visible`` property to ``false``. - -_________________ - -That is the whole script! Now that we have written the code, go ahead and try moving around with the trackpad and/or joystick. You should find that it is less motion sickness-inducing -then before! - -.. note:: As previously mentioned, there are plenty of ways to reduce motion sickness in VR. Check out - `this page on the Oculus Developer Center `__ - for more information on how to implement locomotion and reducing motion sickness. - - - -Final notes ------------ - -.. image:: img/starter_vr_tutorial_hands.png - -Now you have fully working VR controllers that can move around the environment and interact with :ref:`RigidBody `-based objects. -In the next part of this tutorial series, we will be creating some special :ref:`RigidBody `-based objects for the player to use! - -.. warning:: You can download the finished project for this tutorial series on the Godot OpenVR GitHub repository, under the releases tab! diff --git a/tutorials/vr/openvr/vr_starter_tutorial/vr_starter_tutorial_part_two.rst b/tutorials/vr/openvr/vr_starter_tutorial/vr_starter_tutorial_part_two.rst deleted file mode 100644 index 465bc9a..0000000 --- a/tutorials/vr/openvr/vr_starter_tutorial/vr_starter_tutorial_part_two.rst +++ /dev/null @@ -1,1035 +0,0 @@ -.. _doc_vr_starter_tutorial_part_two: - -VR starter tutorial part 2 -========================== - -Introduction ------------- - -.. image:: img/starter_vr_tutorial_sword.png - -In this part of the VR starter tutorial series, we will be adding a number of special :ref:`RigidBody `-based nodes that can be used in VR. 
- -This continues from where we left on in the last tutorial part, where we just finished getting the VR controllers working and defined a custom -class called ``VR_Interactable_Rigidbody``. - -.. tip:: You can find the finished project on the `OpenVR GitHub repository `__. - - -Adding destroyable targets --------------------------- - -Before we make any of the special :ref:`RigidBody `-based nodes, we need something for them to do. Let's make a simple sphere target that will break into a bunch of pieces -when destroyed. - -Open up ``Sphere_Target.tscn``, which is in the ``Scenes`` folder. The scene is fairly simple, with just a :ref:`StaticBody ` with a sphere shaped -:ref:`CollisionShape `, a :ref:`MeshInstance ` node displaying a sphere mesh, and an :ref:`AudioStreamPlayer3D ` node. - -The special :ref:`RigidBody ` nodes will handle damaging the sphere, which is why we are using a :ref:`StaticBody ` node instead of something like -an :ref:`Area ` or :ref:`RigidBody ` node. Outside of that, there isn't really a lot to talk about, so let's move straight into writing the code. - -Select the ``Sphere_Target_Root`` node and make a new script called ``Sphere_Target.gd``. Add the following code: - -.. tabs:: - .. code-tab:: gdscript GDScript - - extends Spatial - - var destroyed = false - var destroyed_timer = 0 - const DESTROY_WAIT_TIME = 80 - - var health = 80 - - const RIGID_BODY_TARGET = preload("res://Assets/RigidBody_Sphere.scn") - - - func _ready(): - set_physics_process(false) - - - func _physics_process(delta): - destroyed_timer += delta - if destroyed_timer >= DESTROY_WAIT_TIME: - queue_free() - - - func damage(damage): - if destroyed == true: - return - - health -= damage - - if health <= 0: - - get_node("CollisionShape").disabled = true - get_node("Shpere_Target").visible = false - - var clone = RIGID_BODY_TARGET.instance() - add_child(clone) - clone.global_transform = global_transform - - destroyed = true - set_physics_process(true) - - get_node("AudioStreamPlayer").play() - get_tree().root.get_node("Game").remove_sphere() - - -Let's go over how this script works. - -Explaining the Sphere Target code -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -First, let's go through all the class variables in the script: - -* ``destroyed``: A variable to track whether the sphere target has been destroyed. -* ``destroyed_timer``: A variable to track how long the sphere target has been destroyed. -* ``DESTROY_WAIT_TIME``: A constant to define the length of time the target can be destroyed for before it frees/deletes itself. -* ``health``: A variable to store the amount of health the sphere target has. -* ``RIGID_BODY_TARGET``: A constant to hold the scene of the destroyed sphere target. - -.. note:: Feel free to check out the ``RIGID_BODY_TARGET`` scene. It is just a bunch of :ref:`RigidBody ` nodes and a broken sphere model. - - We'll be instancing this scene so when the target is destroyed, it looks like it broke into a bunch of pieces. - - -``_ready`` function step-by-step explanation -"""""""""""""""""""""""""""""""""""""""""""" - -All the ``_ready`` function does is that it stops the ``_physics_process`` from being called by calling ``set_physics_process`` and passing ``false``. -The reason we do this is because all the code in ``_physics_process`` is for destroying this node when enough time has passed, which we only want to -do when the target has been destroyed. 
- - -``_physics_process`` function step-by-step explanation -"""""""""""""""""""""""""""""""""""""""""""""""""""""" - -First this function adds time, ``delta``, to the ``destroyed_timer`` variable. It then checks to see if ``destroyed_timer`` is greater than or equal to -``DESTROY_WAIT_TIME``. If ``destroyed_timer`` is greater than or equal to ``DESTROY_WAIT_TIME``, then the sphere target frees/deletes itself by calling -the ``queue_free`` function. - -``damage`` function step-by-step explanation -"""""""""""""""""""""""""""""""""""""""""""" - -The ``damage`` function will be called by the special :ref:`RigidBody ` nodes, which will pass the amount of damage done to the target, which is a function argument -variable called ``damage``. The ``damage`` variable will hold the amount of damage the special :ref:`RigidBody ` node did to the sphere target. - -First this function checks to make sure the target is not already destroyed by checking if the ``destroyed`` variable is equal to ``true``. If ``destroyed`` is equal to ``true``, then -the function calls ``return`` so none of the other code is called. This is just a safety check so that if two things damage the target at exactly the same time, the target cannot be -destroyed twice. - -Next the function removes the amount of damage taken, ``damage``, from the target's health, ``health``. If then checks to see if ``health`` is equal to zero or less, meaning that the -target has just been destroyed. - -If the target has just been destroyed, then we disable the :ref:`CollisionShape ` by setting it's ``disabled`` property to ``true``. We then make the ``Sphere_Target`` -:ref:`MeshInstance ` invisible by setting the ``visible`` property to ``false``. We do this so the target can no longer effect the physics world and so the non-broken target mesh is not visible. - -After this the function then instances the ``RIGID_BODY_TARGET`` scene and adds it as a child of the target. It then sets the ``global_transform`` of the newly instanced scene, called ``clone``, to the -``global_transform`` of the non-broken target. This makes it where the broken target starts at the same position as the non-broken target with the same rotation and scale. - -Then the function sets the ``destroyed`` variable to ``true`` so the target knows it has been destroyed and calls the ``set_physics_process`` function and passes ``true``. This will start -executing the code in ``_physics_process`` so that after ``DESTROY_WAIT_TIME`` seconds have passed, the sphere target will free/destroy itself. - -The function then gets the :ref:`AudioStreamPlayer3D ` node and calls the ``play`` function so it plays its sound. - -Finally, the ``remove_sphere`` function is called in ``Game.gd``. To get ``Game.gd``, the code uses the scene tree and works its way from the root of the scene tree to the root of the -``Game.tscn`` scene. - - -Adding the ``remove_sphere`` function to ``Game.gd`` -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -You may have noticed we are calling a function in ``Game.gd``, called ``remove_sphere``, that we have not defined yet. Open up ``Game.gd`` and -add the following additional class variables: - -.. tabs:: - .. code-tab:: gdscript GDScript - - var spheres_left = 10 - var sphere_ui = null - -- ``spheres_left``: The amount of sphere targets left in the world. In the provided ``Game`` scene, there are ``10`` spheres, so that is the initial value. -- ``sphere_ui``: A reference to the sphere UI. 
We will use this later in the tutorial to display the amount of spheres left in the world. - -With these variables defined, we can now add the ``remove_sphere`` function. Add the following code to ``Game.gd``: - -.. tabs:: - .. code-tab:: gdscript GDScript - - func remove_sphere(): - spheres_left -= 1 - - if sphere_ui != null: - sphere_ui.update_ui(spheres_left) - - -Let's go through what this function does real quick: - -First, it removes one from the ``spheres_left`` variable. It then checks to see if the ``sphere_ui`` variable is not equal to ``null``, and if it is not -equal to ``null`` it calls the ``update_ui`` function on ``sphere_ui``, passing in the number of spheres as an argument to the function. - -.. note:: We will add the code for ``sphere_ui`` later in this tutorial! - -Now the ``Sphere_Target`` is ready to be used, but we don't have any way to destroy it. Let's fix that by adding some special :ref:`RigidBody `-based nodes -that can damage the targets. - - -Adding a pistol ---------------- - -Let's add a pistol as the first interactable :ref:`RigidBody ` node. Open up ``Pistol.tscn``, which you can find in the ``Scenes`` folder. - -Let's quickly go over a few things of note in ``Pistol.tscn`` real quick before we add the code. - -All of the nodes in ``Pistol.tscn`` expect the root node are rotated. This is so the pistol is in the correct rotation relative to the VR controller when it is picked up. The root node -is a :ref:`RigidBody ` node, which we need because we're going to use the ``VR_Interactable_Rigidbody`` class we created in the last part of this tutorial series. - -There is a :ref:`MeshInstance ` node called ``Pistol_Flash``, which is a simple mesh that we will be using to simulate the muzzle flash on the end of the pistol's barrel. -A :ref:`MeshInstance ` node called ``LaserSight`` is used to as a guide for aiming the pistol, and it follows the direction of the :ref:`Raycast ` node, -called ``Raycast``, that the pistol uses to detect if its 'bullet' hit something. Finally, there is an :ref:`AudioStreamPlayer3D ` node at the end of the -pistol that we will use to play the sound of the pistol firing. - -Feel free to look at the other parts of the scene if you want. Most of the scene is fairly straightforward, with the major changes mentioned above. Select the :ref:`RigidBody ` -node called ``Pistol`` and make a new script called ``Pistol.gd``. Add the following code: - -.. tabs:: - .. 
code-tab:: gdscript GDScript - - extends VR_Interactable_Rigidbody - - var flash_mesh - const FLASH_TIME = 0.25 - var flash_timer = 0 - - var laser_sight_mesh - var pistol_fire_sound - - var raycast - const BULLET_DAMAGE = 20 - const COLLISION_FORCE = 1.5 - - - func _ready(): - flash_mesh = get_node("Pistol_Flash") - flash_mesh.visible = false - - laser_sight_mesh = get_node("LaserSight") - laser_sight_mesh.visible = false - - raycast = get_node("RayCast") - pistol_fire_sound = get_node("AudioStreamPlayer3D") - - - func _physics_process(delta): - if flash_timer > 0: - flash_timer -= delta - if flash_timer <= 0: - flash_mesh.visible = false - - - func interact(): - if flash_timer <= 0: - - flash_timer = FLASH_TIME - flash_mesh.visible = true - - raycast.force_raycast_update() - if raycast.is_colliding(): - - var body = raycast.get_collider() - var direction_vector = raycast.global_transform.basis.z.normalized() - var raycast_distance = raycast.global_transform.origin.distance_to(raycast.get_collision_point()) - - if body.has_method("damage"): - body.damage(BULLET_DAMAGE) - elif body is RigidBody: - var collision_force = (COLLISION_FORCE / raycast_distance) * body.mass - body.apply_impulse((raycast.global_transform.origin - body.global_transform.origin).normalized(), direction_vector * collision_force) - - pistol_fire_sound.play() - - if controller != null: - controller.rumble = 0.25 - - - func picked_up(): - laser_sight_mesh.visible = true - - - func dropped(): - laser_sight_mesh.visible = false - -Let's go over how this script works. - - -Explaining the pistol code -^^^^^^^^^^^^^^^^^^^^^^^^^^ - -First, notice how instead of ``extends RigidBody``, we instead have ``extends VR_Interactable_Rigidbody``. This makes it where the pistol script extends the -``VR_Interactable_Rigidbody`` class so the VR controllers know this object can be interacted with and that the functions defined in ``VR_Interactable_Rigidbody`` -can be called when this object is held by a VR controller. - -Next, let's look at the class variables: - -* ``flash_mesh``: A variable to hold the :ref:`MeshInstance ` node that is used to simulate muzzle flash on the pistol. -* ``FLASH_TIME``: A constant to define how long the muzzle flash will be visible. This will also define how fast the pistol can fire. -* ``flash_timer``: A variable to hold the amount of time the muzzle flash has been visible for. -* ``laser_sight_mesh``: A variable to hold the :ref:`MeshInstance ` node that acts as the pistol's 'laser sight'. -* ``pistol_fire_sound``: A variable to hold the :ref:`AudioStreamPlayer3D ` node used for the pistol's firing sound. -* ``raycast``: A variable to hold the :ref:`Raycast ` node that is used for calculating the bullet's position and normal when the pistol is fired. -* ``BULLET_DAMAGE``: A constant to define the amount of damage a single bullet from the pistol does. -* ``COLLISION_FORCE``: A constant that defines the amount of force that is applied to :ref:`RigidBody ` nodes when the pistol's bullet collides. - - -``_ready`` function step-by-step explanation -"""""""""""""""""""""""""""""""""""""""""""" - -This function gets the nodes and assigns them to their proper variables. For the ``flash_mesh`` and ``laser_sight_mesh`` nodes, both have their ``visible`` property set to ``false`` -so they are not visible initially. 
- -``_physics_process`` function step-by-step explanation -"""""""""""""""""""""""""""""""""""""""""""""""""""""" - -The ``_physics_process`` function first checks to see if the pistol's muzzle flash is visible by checking if ``flash_timer`` is more than zero. If ``flash_timer`` is more than -zero, then we remove time, ``delta`` from it. Next we check if the ``flash_timer`` variable is zero or less now that we removed ``delta`` from it. If it is, then the pistol -muzzle flash timer just finished and so we need to make ``flash_mesh`` invisible by setting it's ``visible`` property to ``false``. - -``interact`` function step-by-step explanation -"""""""""""""""""""""""""""""""""""""""""""""" - -The interact function first checks to see if the pistol's muzzle flash is invisible by checking to see if ``flash_timer`` is less than or equal to zero. We do this so we -can limit the rate of fire of the pistol to the length of time the muzzle flash is visible, which is a simple solution for limiting how fast the player can fire. - -If ``flash_timer`` is zero or less, we then set ``flash_timer`` to ``FLASH_TIME`` so there is a delay before the pistol can fire again. After that we set ``flash_mesh.visible`` -to ``true`` so the muzzle flash at the end of the pistol is visible while ``flash_timer`` is more than zero. - -Next we call the ``force_raycast_update`` function on the :ref:`Raycast ` node in ``raycast`` so that it gets the latest collision info from the physics world. -We then check if the ``raycast`` hit something by checking if the ``is_colliding`` function is equal to ``true``. - -_________________ - -If the ``raycast`` hit something, then we get the :ref:`PhysicsBody ` it collided with through the ``get_collider`` function. We assign the -hit :ref:`PhysicsBody ` to a variable called ``body``. - -We then get the direction of the :ref:`Raycast ` by getting it's positive ``Z`` directional axis from the :ref:`Basis ` on the ``raycast`` node's ``global_transform``. -This will give us the direction the raycast is pointing on the Z axis, which is the same direction as the blue arrow on the :ref:`Spatial ` gizmo when -``Local space mode`` is enabled in the Godot editor. We store this direction in a variable called ``direction_vector``. - -Next we get the distance from the :ref:`Raycast ` origin to the :ref:`Raycast ` collision point by getting the distance from the global position, ``global_transform.origin`` -of the ``raycast`` node to the collision point of the :ref:`Raycast `, ``raycast.get_collision_point``, using the ``distance_to`` function. This will give us the distance the -:ref:`Raycast ` traveled before it collided, which we store in a variable called ``raycast_distance``. - -Then the code checks if the :ref:`PhysicsBody `, ``body``, has a function/method called ``damage`` using the ``has_method`` function. If the :ref:`PhysicsBody ` -has a function/method called ``damage``, then we call the ``damage`` function and pass ``BULLET_DAMAGE`` so it takes damage from the bullet colliding into it. - -Regardless of whether the :ref:`PhysicsBody ` has a ``damage`` function, we then check to see if ``body`` is a :ref:`RigidBody `-based node. If ``body`` is a -:ref:`RigidBody `-based node, then we want to push it when the bullet collides. - -To calculate the amount of force applied, we simply take ``COLLISION_FORCE`` and divide it by ``raycast_distance``, then we multiply the whole thing by ``body.mass``. We store this calculation in -a variable called ``collision_force``. 
This will make collisions over a shorter distance apply move force than those over longer distances, giving a *slightly* more realistic collision response. - -We then push the :ref:`RigidBody ` using the ``apply_impulse`` function, where the position is a zero Vector3 so the force is applied from the center, and the collision force is the ``collision_force`` variable we calculated. - -_________________ - -Regardless of whether the ``raycast`` variable hit something or not, we then play the pistol shot sound by calling the ``play`` function on the ``pistol_fire_sound`` variable. - -Finally, we check to see if the pistol is being held by a VR controller by checking to see if the ``controller`` variable is not equal to ``null``. If it is not equal to ``null``, -we then set the ``rumble`` property of the VR controller to ``0.25``, so there is a slight rumble when the pistol fires. - - -``picked_up`` function step-by-step explanation -""""""""""""""""""""""""""""""""""""""""""""""" - -This function simply makes the ``laser_sight_mesh`` :ref:`MeshInstance ` visible by setting the ``visible`` property to ``true``. - -``dropped`` function step-by-step explanation -""""""""""""""""""""""""""""""""""""""""""""" - -This function simply makes the ``laser_sight_mesh`` :ref:`MeshInstance ` invisible by setting the ``visible`` property to ``false``. - - -Pistol finished -^^^^^^^^^^^^^^^ - -.. image:: img/starter_vr_tutorial_pistol.png - - -That is all we need to do to have working pistols in the project! Go ahead and run the project. If you climb up the stairs and grab the pistols, you can fire them at the sphere -targets in the scene using the trigger button on the VR controller! If you fire at the targets long enough, they will break into pieces. - - - -Adding a shotgun ----------------- - -Next let's add a shotgun to the VR project. - -Adding a special shotgun :ref:`RigidBody ` should be fairly straightforward, as almost everything with the shotgun is the same as the pistol. - -Open up ``Shotgun.tscn``, which you can find in the ``Scenes`` folder and take a look at the scene. Almost everything is the same as in ``Pistol.tscn``. -The only thing that is different, beyond name changes, is that instead of a single :ref:`Raycast `, there are five :ref:`Raycast ` nodes. -This is because a shotgun generally fires in a cone shape, so we are going to emulate that effect by having several :ref:`Raycast ` nodes that will rotate -randomly in a cone shape when the shotgun fires. - -Outside of that, everything is more or less the same as ``Pistol.tscn``. - -Let's write the code for the shotgun. Select the :ref:`RigidBody ` node called ``Shotgun`` and make a new script called ``Shotgun.gd``. Add the following code: - -.. tabs:: - .. 
code-tab:: gdscript GDScript - - extends VR_Interactable_Rigidbody - - var flash_mesh - const FLASH_TIME = 0.25 - var flash_timer = 0 - - var laser_sight_mesh - var shotgun_fire_sound - - var raycasts - const BULLET_DAMAGE = 30 - const COLLISION_FORCE = 4 - - - func _ready(): - flash_mesh = get_node("Shotgun_Flash") - flash_mesh.visible = false - - laser_sight_mesh = get_node("LaserSight") - laser_sight_mesh.visible = false - - raycasts = get_node("Raycasts") - shotgun_fire_sound = get_node("AudioStreamPlayer3D") - - - func _physics_process(delta): - if flash_timer > 0: - flash_timer -= delta - if flash_timer <= 0: - flash_mesh.visible = false - - - func interact(): - if flash_timer <= 0: - - flash_timer = FLASH_TIME - flash_mesh.visible = true - - for raycast in raycasts.get_children(): - - if not raycast is RayCast: - continue - - raycast.rotation_degrees = Vector3(90 + rand_range(10, -10), 0, rand_range(10, -10)) - - raycast.force_raycast_update() - if raycast.is_colliding(): - - var body = raycast.get_collider() - var direction_vector = raycasts.global_transform.basis.z.normalized() - var raycast_distance = raycasts.global_transform.origin.distance_to(raycast.get_collision_point()) - - if body.has_method("damage"): - body.damage(BULLET_DAMAGE) - - if body is RigidBody: - var collision_force = (COLLISION_FORCE / raycast_distance) * body.mass - body.apply_impulse((raycast.global_transform.origin - body.global_transform.origin).normalized(), direction_vector * collision_force) - - shotgun_fire_sound.play() - - if controller != null: - controller.rumble = 0.25 - - - func picked_up(): - laser_sight_mesh.visible = true - - - func dropped(): - laser_sight_mesh.visible = false - - -The majority of this code is exactly the same as the code for the pistol with just a few *minor* changes that are primarily just different names. -Due to how similar these scripts are, let's just focus on the changes. - -Explaining the shotgun code -^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -Like with the pistol, the shotgun extends ``VR_Interactable_Rigidbody`` so the VR controllers know that this object can be interacted with and what functions are -available. - -There is only one new class variable: - -* ``raycasts``: A variable to hold the node that has all of the :ref:`Raycast ` nodes as its children. - -The new class variable replaces the ``raycast`` variable from ``Pistol.gd``, because with the shotgun we need to process multiple :ref:`Raycast ` nodes -instead of just one. All of the other class variables are the same as ``Pistol.gd`` and function the same way, some just are renamed to be non-pistol specific. - -``interact`` function step-by-step explanation -"""""""""""""""""""""""""""""""""""""""""""""" - -The interact function first checks to see if the shotgun's muzzle flash is invisible by checking to see if ``flash_timer`` is less than or equal to zero. We do this so we -can limit the rate of fire of the shotgun to the length of time the muzzle flash is visible, which is a simple solution for limiting how fast the player can fire. - -If ``flash_timer`` is zero or less, we then set ``flash_timer`` to ``FLASH_TIME`` so there is a delay before the shotgun can fire again. After that we set ``flash_mesh.visible`` -to ``true`` so the muzzle flash at the end of the shotgun is visible while ``flash_timer`` is more than zero. - -Next we call the ``force_raycast_update`` function on the :ref:`Raycast ` node in ``raycast`` so that it gets the latest collision info from the physics world. 
-We then check if the ``raycast`` hit something by checking if the ``is_colliding`` function is equal to ``true``. - -Next we go through each of the child nodes of the ``raycasts`` variable using a for loop. This way the code will go through each of the :ref:`Raycast ` nodes -that are children of the ``raycasts`` variable. - -_________________ - -For each node, we check to see if ``raycast`` is *not* a :ref:`Raycast ` node. If the node is not a :ref:`Raycast ` node, we simply use ``continue`` to skip it. - -Next we rotate the ``raycast`` node randomly around a small ``10`` degrees cone by settings the ``rotation_degrees`` variable of the ``raycast`` to a Vector3 where the X and Z axis -are a random number from ``-10`` to ``10``. This random number is selected using the ``rand_range`` function. - -Then we call the ``force_raycast_update`` function on the :ref:`Raycast ` node in ``raycast`` so that it gets the latest collision info from the physics world. -We then check if the ``raycast`` hit something by checking if the ``is_colliding`` function is equal to ``true``. - -The rest of the code is exactly the same, but this process is repeated for each :ref:`Raycast ` node that is a child of the ``raycasts`` variable. - -_________________ - -If the ``raycast`` hit something, then we get the :ref:`PhysicsBody ` it collided with through the ``get_collider`` function. We assign the -hit :ref:`PhysicsBody ` to a variable called ``body``. - -We then get the direction of the raycast by getting it's positive ``Z`` directional axis from the :ref:`Basis ` on the ``raycast`` node's ``global_transform``. -This will give us the direction the raycast is pointing on the Z axis, which is the same direction as the blue arrow on the :ref:`Spatial ` gizmo when -``Local space mode`` is enabled in the Godot editor. We store this direction in a variable called ``direction_vector``. - -Next we get the distance from the raycast origin to the raycast collision point by getting the distance from the global position, ``global_transform.origin`` of the ``raycast`` -node to the collision point of the raycast, ``raycast.get_collision_point``, using the ``distance_to`` function. This will give us the distance the :ref:`Raycast ` -traveled before it collided, which we store in a variable called ``raycast_distance``. - -Then the code checks if the :ref:`PhysicsBody `, ``body``, has a function/method called ``damage`` using the ``has_method`` function. If the :ref:`PhysicsBody ` -has a function/method called ``damage``, then we call the ``damage`` function and pass ``BULLET_DAMAGE`` so it takes damage from the bullet colliding into it. - -Regardless of whether the :ref:`PhysicsBody ` has a ``damage`` function, we then check to see if ``body`` is a :ref:`RigidBody `-based node. If ``body`` is a -:ref:`RigidBody `-based node, then we want to push it when the bullet collides. - -To calculate the amount of force applied, we simply take ``COLLISION_FORCE`` and divide it by ``raycast_distance``, then we multiply the whole thing by ``body.mass``. We store this calculation in -a variable called ``collision_force``. This will make collisions over a shorter distance apply move force than those over longer distances, giving a *slightly* more realistic collision response. - -We then push the :ref:`RigidBody ` using the ``apply_impulse`` function, where the position is a zero Vector3 so the force is applied from the center, -and the collision force is the ``collision_force`` variable we calculated. 
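-
-To make the cone spread easier to see at a glance, here is just the per-raycast portion of the shotgun's ``interact`` function from above; the collision handling inside the
-``is_colliding`` check is left out here because it is identical to the pistol's:
-
-.. tabs::
- .. code-tab:: gdscript GDScript
-
-    for raycast in raycasts.get_children():
-
-        if not raycast is RayCast:
-            continue
-
-        # Randomly rotate the raycast within a roughly 10 degree cone.
-        raycast.rotation_degrees = Vector3(90 + rand_range(10, -10), 0, rand_range(10, -10))
-
-        raycast.force_raycast_update()
-        if raycast.is_colliding():
-            # Same damage and impulse handling as the pistol.
-            pass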
- -_________________ - -Once all of the :ref:`Raycast `\s in the ``raycast`` variable have been iterated over, we then play the shotgun shot sound by calling the ``play`` function on the ``shotgun_fire_sound`` variable. - -Finally, we check to see if the shotgun is being held by a VR controller by checking to see if the ``controller`` variable is not equal to ``null``. If it is not equal to ``null``, -we then set the ``rumble`` property of the VR controller to ``0.25``, so there is a slight rumble when the shotgun fires. - -Shotgun finished -^^^^^^^^^^^^^^^^ - -Everything else is exactly the same as the pistol, with at most just some simple name changes. - -Now the shotgun is finished! You can find the shotgun in the sample scene by looking around the back of one of the walls (not in the building though!). - - - -Adding a bomb -------------- - -Okay, let's add a different special :ref:`RigidBody `. Instead of adding something that shoots, let's add something we can throw - a bomb! - -Open up ``Bomb.tscn``, which is in the ``Scenes`` folder. - -The root node is a :ref:`RigidBody ` node that we'll be extending to use ``VR_Interactable_Rigidbody``, which has a :ref:`CollisionShape ` -like the other special :ref:`RigidBody ` nodes we've made so far. Likewise, there is a :ref:`MeshInstance ` called ``Bomb`` that is used to -display the mesh for the bomb. - -Then we have an :ref:`Area ` node simply called ``Area`` that has a large :ref:`CollisionShape ` as its child. We'll use this :ref:`Area ` -node to effect anything within it when the bomb explodes. Essentially, this :ref:`Area ` node will be the blast radius for the bomb. - -There is also a couple :ref:`Particles ` nodes. One of the :ref:`Particles ` nodes are for the smoke coming out of the bomb's fuse, while another -is for the explosion. You can take a look at the :ref:`ParticlesMaterial ` resources, which define how the particles work, if you want. We will not be covering -how the particles work in this tutorial due to it being outside of the scope of this tutorial. - -There is one thing with the :ref:`Particles ` nodes that we need to make note of. If you select the ``Explosion_Particles`` node, you'll find that its ``lifetime`` property -is set to ``0.75`` and that the ``one shot`` checkbox is enabled. This means that the particles will only play once, and the particles will last for ``0.75`` seconds. -We'll need to know this so we can time the removal of the bomb with the end of the explosion :ref:`Particles `. - -Let's write the code for the bomb. Select the ``Bomb`` :ref:`RigidBody ` node and make a new script called ``Bomb.gd``. Add the following code: - -.. tabs:: - .. 
code-tab:: gdscript GDScript - - extends VR_Interactable_Rigidbody - - var bomb_mesh - - const FUSE_TIME = 4 - var fuse_timer = 0 - - var explosion_area - const EXPLOSION_DAMAGE = 100 - const EXPLOSION_TIME = 0.75 - var explosion_timer = 0 - var exploded = false - - const COLLISION_FORCE = 8 - - var fuse_particles - var explosion_particles - var explosion_sound - - - func _ready(): - - bomb_mesh = get_node("Bomb") - explosion_area = get_node("Area") - fuse_particles = get_node("Fuse_Particles") - explosion_particles = get_node("Explosion_Particles") - explosion_sound = get_node("AudioStreamPlayer3D") - - set_physics_process(false) - - - func _physics_process(delta): - - if fuse_timer < FUSE_TIME: - - fuse_timer += delta - - if fuse_timer >= FUSE_TIME: - - fuse_particles.emitting = false - - explosion_particles.one_shot = true - explosion_particles.emitting = true - - bomb_mesh.visible = false - - collision_layer = 0 - collision_mask = 0 - mode = RigidBody.MODE_STATIC - - for body in explosion_area.get_overlapping_bodies(): - if body == self: - pass - else: - if body.has_method("damage"): - body.damage(EXPLOSION_DAMAGE) - - if body is RigidBody: - var direction_vector = body.global_transform.origin - global_transform.origin - var bomb_distance = direction_vector.length() - var collision_force = (COLLISION_FORCE / bomb_distance) * body.mass - body.apply_impulse(Vector3.ZERO, direction_vector.normalized() * collision_force) - - exploded = true - explosion_sound.play() - - - if exploded: - - explosion_timer += delta - - if explosion_timer >= EXPLOSION_TIME: - - explosion_area.monitoring = false - - if controller != null: - controller.held_object = null - controller.hand_mesh.visible = true - - if controller.grab_mode == "RAYCAST": - controller.grab_raycast.visible = true - - queue_free() - - - func interact(): - set_physics_process(true) - - fuse_particles.emitting = true - - -Let's go over how this script works. - - -Explaining the bomb code -^^^^^^^^^^^^^^^^^^^^^^^^ - -Like with the other special :ref:`RigidBody ` nodes, the bomb extends ``VR_Interactable_Rigidbody`` so the VR controllers know this object can be interacted with and -that the functions defined defined in ``VR_Interactable_Rigidbody`` can be called when this object is held by a VR controller. - -Next, let's look at the class variables: - -* ``bomb_mesh``: A variable to hold the :ref:`MeshInstance ` node that is used for the non-exploded bomb. -* ``FUSE_TIME``: A constant to define how long the fuse will 'burn' before the bomb explodes -* ``fuse_timer``: A variable to hold the length of time that has passed since the bomb's fuse has started to burn. -* ``explosion_area``: A variable to hold the :ref:`Area ` node used to detect objects within the bomb's explosion. -* ``EXPLOSION_DAMAGE``: A constant to define how much damage is applied with the bomb explodes. -* ``EXPLOSION_TIME``: A constant to define how long the bomb will last in the scene after it explodes. This value should be the same as the ``lifetime`` property of the explosion :ref:`Particles ` node. -* ``explosion_timer`` A variable to hold the length of time that has passed since the bomb exploded. -* ``exploded``: A variable to hold whether the bomb has exploded or not. -* ``COLLISION_FORCE``: A constant that defines the amount of force that is applied to :ref:`RigidBody ` nodes when the bomb explodes. -* ``fuse_particles``: A variable to hold a reference to the :ref:`Particles ` node used for the bomb's fuse. 
-* ``explosion_particles``: A variable to hold a reference to the :ref:`Particles ` node used for the bomb's explosion. -* ``explosion_sound``: A variable to hold a reference to the :ref:`AudioStreamPlayer3D ` node used for the explosion sound. - - -``_ready`` function step-by-step explanation -"""""""""""""""""""""""""""""""""""""""""""" - -The ``_ready`` function first gets all of the nodes from the bomb scene and assigns them to their respective class variables for later use. - -Then we call ``set_physics_process`` and pass ``false`` so ``_physics_process`` is not executed. We do this because the code in ``_physics_process`` will start burning -the fuse and exploding the bomb, which we only want to do when the user interacts with the bomb. If we did not disable ``_physics_process``, the bomb's fuse would start -before the user has a chance to get to the bomb. - - -``_physics_process`` function step-by-step explanation -"""""""""""""""""""""""""""""""""""""""""""""""""""""" - -The ``_physics_process`` function first checks to see if ``fuse_timer`` is less than ``FUSE_TIME``. If it is, then the bomb's fuse is still burning. - -If the bomb's fuse is still burning, we then add time, ``delta``, to the ``fuse_timer`` variable. We then check to see if ``fuse_timer`` is more than or equal to ``FUSE_TIME`` -now that we have added ``delta`` to it. If ``fuse_timer`` is more than or equal to ``FUSE_TIME``, then the fuse has just finished and we need to explode the bomb. - -To explode the bomb, we first stop emitting particles for the fuse by setting ``emitting`` to ``false`` on ``fuse_particles``. We then tell the explosion :ref:`Particles ` -node, ``explosion_particles``, to emit all of its particle in a single shot by setting ``one_shot`` to ``true``. After that, we set ``emitting`` to ``true`` on ``explosion_particles`` so it looks -like the bomb has exploded. To help make it look like the bomb exploded, we hide the bomb :ref:`MeshInstance ` node by setting ``bomb_mesh.visible`` to ``false``. - -To keep the bomb from colliding with other objects in the physics world, we set the ``collision_layer`` and ``collision_mask`` properties of the bomb to ``0``. We also -change the :ref:`RigidBody ` mode to ``MODE_STATIC`` so the bomb :ref:`RigidBody ` does not move. - -Then we need to get all of the :ref:`PhysicsBody ` nodes within the ``explosion_area`` node. To do this, we use the ``get_overlapping_bodies`` in a for loop. The ``get_overlapping_bodies`` -function will return an array of :ref:`PhysicsBody ` nodes within the :ref:`Area ` node, which is exactly what we are looking for. - -_________________ - -For each :ref:`PhysicsBody ` node, which we store in a variable called ``body``, we check to see if it is equal to ``self``. We do this so the bomb does not accidentally explode -itself, as the ``explosion_area`` could potentially detect the ``Bomb`` :ref:`RigidBody ` as a PhysicsBody within the explosion area. - -If the :ref:`PhysicsBody ` node, ``body``, is not the bomb, then we first check to see if the :ref:`PhysicsBody ` node has a function -called ``damage``. If the :ref:`PhysicsBody ` node has a function called ``damage``, we call it and pass ``EXPLOSION_DAMAGE`` to it so it takes damage from the explosion. - -Next we check to see if the :ref:`PhysicsBody ` node is a :ref:`RigidBody `. If ``body`` is a :ref:`RigidBody `, we want to move it -when the bomb explodes. 
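-
-The next few paragraphs walk through the push calculation, so for reference, these are the relevant lines from the ``_physics_process`` function shown above:
-
-.. tabs::
- .. code-tab:: gdscript GDScript
-
-    var direction_vector = body.global_transform.origin - global_transform.origin
-    var bomb_distance = direction_vector.length()
-    var collision_force = (COLLISION_FORCE / bomb_distance) * body.mass
-    body.apply_impulse(Vector3.ZERO, direction_vector.normalized() * collision_force)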
-
-To move the :ref:`RigidBody ` node when the bomb explodes, we first need to calculate the direction from the bomb to the :ref:`RigidBody ` node. To do this
-we subtract the global position of the bomb, ``global_transform.origin``, from the global position of the :ref:`RigidBody `. This will give us a :ref:`Vector3 `
-that points from the bomb to the :ref:`RigidBody ` node. We store this :ref:`Vector3 ` in a variable called ``direction_vector``.
-
-We then calculate the distance the :ref:`RigidBody ` is from the bomb by using the ``length`` function on ``direction_vector``. We store the distance in a variable called
-``bomb_distance``.
-
-We then calculate the amount of force the bomb will apply to the :ref:`RigidBody ` node when it explodes by dividing ``COLLISION_FORCE`` by
-``bomb_distance``, and multiplying that by the body's mass, ``body.mass``. This will make it so that the closer the :ref:`RigidBody ` node is to the bomb, the harder it will be pushed.
-
-Finally, we push the :ref:`RigidBody ` node using the ``apply_impulse`` function, with a :ref:`Vector3 ` position of zero and ``collision_force``
-multiplied by ``direction_vector.normalized`` as the force. This will send the :ref:`RigidBody ` node flying when the bomb explodes.
-
-_________________
-
-After we have looped through all of the :ref:`PhysicsBody ` nodes within the ``explosion_area``, we set the ``exploded`` variable to ``true`` so the code knows the bomb
-exploded and call ``play`` on ``explosion_sound`` so the sound of an explosion is played.
-
-_________________
-
-The next section of code starts by checking if ``exploded`` is equal to ``true``.
-
-If ``exploded`` is equal to ``true``, then that means the bomb is waiting for the explosion particles to finish before it frees/destroys itself. We add time, ``delta``, to
-``explosion_timer`` so we can track how long it has been since the bomb exploded.
-
-If ``explosion_timer`` is greater than or equal to ``EXPLOSION_TIME`` after we added ``delta``, then the explosion timer just finished.
-
-If the explosion timer just finished, we set ``explosion_area.monitoring`` to ``false``. We do this because there was a bug that would print an error when you
-freed/deleted an :ref:`Area ` node while its ``monitoring`` property was ``true``. To make sure this doesn't happen, we simply set ``monitoring`` to ``false`` on ``explosion_area``.
-
-Next we check to see if the bomb is being held by a VR controller by checking to see if the ``controller`` variable is not equal to ``null``. If the bomb is being held by a VR controller,
-we set the ``held_object`` property of the VR controller, ``controller``, to ``null``. Because the VR controller is no longer holding anything, we make the VR controller's hand mesh
-visible by setting ``controller.hand_mesh.visible`` to ``true``. Then we check to see if the VR controller's grab mode is ``RAYCAST``, and if it is, we set ``controller.grab_raycast.visible`` to
-``true`` so the 'laser sight' for the grab raycast is visible.
-
-Finally, regardless of whether the bomb is being held by a VR controller or not, we call ``queue_free`` so the bomb scene is freed/removed from the scene.
-
-``interact`` function step-by-step explanation
-""""""""""""""""""""""""""""""""""""""""""""""
-
-First the ``interact`` function calls ``set_physics_process`` and passes ``true`` so the code in ``_physics_process`` starts executing. This will start the bomb's fuse and
-eventually lead to the bomb exploding.
-
-Finally, we start the fuse's smoke particles by setting ``fuse_particles.emitting`` to ``true``.
-
-
-Bomb finished
-^^^^^^^^^^^^^
-
-Now the bomb is ready to go! You can find the bombs in the orange building.
-
-Because of how we are calculating the VR controller's velocity, it is easiest to throw the bombs using a thrusting-like motion instead of a more natural throwing-like motion.
-The smooth curve of a throwing-like motion is harder to track with the code we are using for calculating the velocity of the VR controllers, so it does not always work correctly
-and can lead to inaccurately calculated velocities.
-
-
-
-Adding a sword
---------------
-
-Let's add one last special :ref:`RigidBody `-based node that can destroy targets. Let's add a sword so we can slice through the targets!
-
-Open up ``Sword.tscn``, which you can find in the ``Scenes`` folder.
-
-There is not a whole lot going on here. All of the child nodes of the root ``Sword`` :ref:`RigidBody ` node are rotated so they are positioned correctly when the
-VR controller picks the sword up, there is a :ref:`MeshInstance ` node for displaying the sword, and there is an :ref:`AudioStreamPlayer3D `
-node that holds a sound for the sword colliding with something.
-
-There is one thing that is slightly different though. There is a :ref:`KinematicBody ` node called ``Damage_Body``. If you take a look at it, you'll find that it
-is not on any collision layers, and is instead only on a single collision mask. This is so the :ref:`KinematicBody ` will not affect other
-:ref:`PhysicsBody ` nodes in the scene, but it will still be affected by :ref:`PhysicsBody ` nodes.
-
-We are going to use the ``Damage_Body`` :ref:`KinematicBody ` node to detect the collision point and normal when the sword collides with something in the scene.
-
-.. tip:: While this is perhaps not the best way of getting the collision information from a performance point of view, it does give us a lot of information we can use for post-processing!
-         Using a :ref:`KinematicBody ` this way means we can detect exactly where the sword collided with other :ref:`PhysicsBody ` nodes.
-
-That is really the only noteworthy thing about the sword scene. Select the ``Sword`` :ref:`RigidBody ` node and make a new script called ``Sword.gd``.
-Add the following code:
-
-.. tabs::
- .. code-tab:: gdscript GDScript
-
-    extends VR_Interactable_Rigidbody
-
-    const SWORD_DAMAGE = 2
-
-    const COLLISION_FORCE = 0.15
-
-    var damage_body = null
-    # The AudioStreamPlayer3D node used for the sword's collision sound.
-    var sword_noise = null
-
-
-    func _ready():
-        damage_body = get_node("Damage_Body")
-        damage_body.add_collision_exception_with(self)
-        sword_noise = get_node("AudioStreamPlayer3D")
-
-
-    func _physics_process(_delta):
-
-        # A zero-length test-only move: we only want the collision information.
-        var collision_results = damage_body.move_and_collide(Vector3.ZERO, true, true, true)
-
-        if collision_results != null:
-            if collision_results.collider.has_method("damage"):
-                collision_results.collider.damage(SWORD_DAMAGE)
-
-            if collision_results.collider is RigidBody:
-                if controller == null:
-                    collision_results.collider.apply_impulse(
-                        collision_results.position,
-                        collision_results.normal * linear_velocity * COLLISION_FORCE)
-                else:
-                    collision_results.collider.apply_impulse(
-                        collision_results.position,
-                        collision_results.normal * controller.controller_velocity * COLLISION_FORCE)
-
-            sword_noise.play()
-
-Let's go over how this script works!


Explaining the sword code
^^^^^^^^^^^^^^^^^^^^^^^^^

Like with the other special :ref:`RigidBody ` nodes, the sword extends ``VR_Interactable_Rigidbody`` so the VR controllers know this object can be interacted with and
that the functions defined in ``VR_Interactable_Rigidbody`` can be called when this object is held by a VR controller.

Next, let's look at the class variables:

* ``SWORD_DAMAGE``: A constant to define the amount of damage the sword does. This damage is applied to every object the sword collides with on every ``_physics_process`` call.
* ``COLLISION_FORCE``: A constant that defines the amount of force applied to :ref:`RigidBody ` nodes when the sword collides with a :ref:`PhysicsBody `.
* ``damage_body``: A variable to hold the :ref:`KinematicBody ` node used to detect whether the sword is stabbing a :ref:`PhysicsBody ` node or not.
* ``sword_noise``: A variable to hold the :ref:`AudioStreamPlayer3D ` node used to play a sound when the sword collides with something.


``_ready`` function step-by-step explanation
""""""""""""""""""""""""""""""""""""""""""""

All we are doing in the ``_ready`` function is getting the ``Damage_Body`` :ref:`KinematicBody ` node and assigning it to ``damage_body``.
Because we do not want the sword to detect a collision with the root :ref:`RigidBody ` node of the sword, we call
``add_collision_exception_with`` on ``damage_body`` and pass ``self`` so the sword will not be detected.

Finally, we get the :ref:`AudioStreamPlayer3D ` node for the sword collision sound and assign it to the ``sword_noise`` variable.


``_physics_process`` function step-by-step explanation
""""""""""""""""""""""""""""""""""""""""""""""""""""""

First we need to determine whether the sword is colliding with something or not. To do this, we use the ``move_and_collide`` function of the ``damage_body`` node.
Unlike how ``move_and_collide`` is normally used, we are not passing a velocity and instead are passing an empty :ref:`Vector3 `. Because we do not
want the ``damage_body`` node to move, we set the ``test_only`` argument (the fourth argument) to ``true`` so the :ref:`KinematicBody ` generates
collision info without actually causing any collisions within the collision world.

The ``move_and_collide`` function returns a :ref:`KinematicCollision ` object that has all of the information we need for detecting collisions
on the sword. We assign the return value of ``move_and_collide`` to a variable called ``collision_results``.

Next we check whether ``collision_results`` is not equal to ``null``. If ``collision_results`` is not equal to ``null``, then we know that the sword has collided with something.

We then check whether the :ref:`PhysicsBody ` the sword collided with has a function/method called ``damage`` using the ``has_method`` function. If the
:ref:`PhysicsBody ` has a function called ``damage``, we call it and pass the amount of damage the sword does, ``SWORD_DAMAGE``, to it.

Next we check whether the :ref:`PhysicsBody ` the sword collided with is a :ref:`RigidBody `. If what the sword collided with is a
:ref:`RigidBody ` node, we then check whether the sword is being held by a VR controller by checking whether ``controller`` is equal to ``null``.

If the sword is not being held by a VR controller (``controller`` is equal to ``null``), then we move the :ref:`RigidBody ` node the sword collided with using
the ``apply_impulse`` function.
For the ``position`` of the ``apply_impulse`` function, we use ``collision_position`` variable stored within the :ref:`KinematicCollision ` -class in ``collision_results``. For the ``velocity`` of the ``apply_impulse`` function, we use the ``collision_normal`` multiplied by the ``linear_velocity`` of the sword's -:ref:`RigidBody ` node multiplied by ``COLLISION_FORCE``. - -If the sword is being held by a VR controller, ``controller`` is not equal to ``null``, then we move the :ref:`RigidBody ` node the sword collided with using -the ``apply_impulse`` function. For the ``position`` of the ``apply_impulse`` function, we use ``collision_position`` variable stored within the :ref:`KinematicCollision ` -class in ``collision_results``. For the ``velocity`` of the ``apply_impulse`` function, we use the ``collision_normal`` multiplied by the VR controller's velocity multiplied by ``COLLISION_FORCE``. - -Finally, regardless of whether the :ref:`PhysicsBody ` is a :ref:`RigidBody ` or not, we play the sound of the sword colliding with -something by calling ``play`` on ``sword_noise``. - - -Sword finished -^^^^^^^^^^^^^^ - -.. image:: img/starter_vr_tutorial_sword.png - -With that done, you can now slice through the targets! You can find the sword in the corner in between the shotgun and the pistol. - - - -Updating the target UI ----------------------- - -Let's update the UI as the sphere targets are destroyed. - -Open up ``Main_VR_GUI.tscn``, which you can find in the ``Scenes`` folder. -Feel free to look at how the scene is setup if you want, but in an effort to keep this tutorial from becoming too long, we will not be covering the scene setup in this tutorial. - -Expand the ``GUI`` :ref:`Viewport ` node and then select the ``Base_Control`` node. Add a new script called ``Base_Control.gd``, and add the following: - -.. tabs:: - .. code-tab:: gdscript GDScript - - extends Control - - var sphere_count_label - - func _ready(): - sphere_count_label = get_node("Label_Sphere_Count") - - get_tree().root.get_node("Game").sphere_ui = self - - - func update_ui(sphere_count): - if sphere_count > 0: - sphere_count_label.text = str(sphere_count) + " Spheres remaining" - else: - sphere_count_label.text = "No spheres remaining! Good job!" - -Let's go over how this script works real quick. - -First, in ``_ready``, we get the :ref:`Label ` that shows how many spheres are left and assign it to the ``sphere_count_label`` class variable. -Next, we get ``Game.gd`` by using ``get_tree().root`` and assign ``sphere_ui`` to this script. - -In ``update_ui``, we change the sphere :ref:`Label `'s text. If there is at least one sphere remaining, we change the text to show how many spheres are still -left in the world. If there are no more spheres remaining, we change the text and congratulate the player. - - - -Adding the final special RigidBody ----------------------------------- - -Finally, before we finish this tutorial, let's add a way to reset the game while in VR. - -Open up ``Reset_Box.tscn``, which you will find in ``Scenes``. Select the ``Reset_Box`` :ref:`RigidBody ` node and make a new script called ``Reset_Box.gd``. -Add the following code: - -.. tabs:: - .. 
code-tab:: gdscript GDScript - - extends VR_Interactable_Rigidbody - - var start_transform - - var reset_timer = 0 - const RESET_TIME = 10 - const RESET_MIN_DISTANCE = 1 - - - func _ready(): - start_transform = global_transform - - - func _physics_process(delta): - if start_transform.origin.distance_to(global_transform.origin) >= RESET_MIN_DISTANCE: - reset_timer += delta - if reset_timer >= RESET_TIME: - global_transform = start_transform - reset_timer = 0 - - - func interact(): - # (Ignore the unused variable warning) - # warning-ignore:return_value_discarded - get_tree().change_scene("res://Game.tscn") - - - func dropped(): - global_transform = start_transform - reset_timer = 0 - - -Let's quickly go over how this script works. - - -Explaining the reset box code -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -Like with the other special :ref:`RigidBody `-based objects we've created, the reset box extends ``VR_Interactable_Rigidbody``. - -The ``start_transform`` class variable will store the global transform of the reset box when the game starts, the ``reset_timer`` class variable will hold the length of -time that has passed since the reset box's position has moved, the ``RESET_TIME`` constant defines the length of time the reset box has to wait before being reset, and -the ``RESET_MIN_DISTANCE`` constant defines how far the reset box has to be away from it's initial position before the reset timer starts. - -In the ``_ready`` function all we are doing is storing the ``global_transform`` of the reset position when the scene starts. This is so we can reset the position, rotation, and scale -of the reset box object to this initial transform when enough time has passed. - -In the ``_physics_process`` function, the code checks to see if the reset box's initial position to the reset box's current position is farther than ``RESET_MIN_DISTANCE``. If it is -farther, then it starts adding time, ``delta``, to ``reset_timer``. Once ``reset_timer`` is more than or equal to ``RESET_TIME``, we reset the ``global_transform`` to the ``start_transform`` -so the reset box is back in its initial position. We then set ``reset_timer`` to ``0``. - -The ``interact`` function simply reloads the ``Game.tscn`` scene using ``get_tree().change_scene``. This will reload the game scene, resetting everything. - -Finally, the ``dropped`` function resets the ``global_transform`` to the initial transform in ``start_transform`` so the reset box has its initial position/rotation. Then ``reset_timer`` is -set to ``0`` so the timer is reset. - - -Reset box finished -^^^^^^^^^^^^^^^^^^ - -With that done, when you grab and interact with the reset box, the entire scene will reset/restart and you can destroy all the targets again! - -.. note:: Resetting the scene abruptly without any sort of transition can lead to discomfort in VR. - - - -Final notes ------------ - -.. image:: img/starter_vr_tutorial_pistol.png - -Whew! That was a lot of work. - -Now you have a fully working VR project with multiple different types of special :ref:`RigidBody `-based nodes that can be used and extended. Hopefully this will -help serve as an introduction to making fully-featured VR games in Godot! The code and concepts detailed in this tutorial can be expanded on to make puzzle games, action games, -story-based games, and more! - -.. warning:: You can download the finished project for this tutorial series on the `OpenVR GitHub repository `__, under the releases tab! 
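As a small closing illustration of how these pieces extend, here is a hypothetical new interactable. Only ``VR_Interactable_Rigidbody``, the ``controller`` variable, and the ``interact``/``dropped`` hooks come from this tutorial; the child node name and the "toy horn" behaviour are invented for the example.

.. tabs::
 .. code-tab:: gdscript GDScript

    extends VR_Interactable_Rigidbody

    # A toy horn that plays a sound when interacted with and stops when dropped.
    var horn_sound = null


    func _ready():
        horn_sound = get_node("AudioStreamPlayer3D")


    func interact():
        # Called by the VR controller holding this object when the interaction button is pressed.
        if horn_sound and not horn_sound.playing:
            horn_sound.play()


    func dropped():
        # Called when the VR controller lets go of this object.
        if horn_sound:
            horn_sound.stop()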
diff --git a/tutorials/vr/openxr/deploy_on_quest.rst b/tutorials/vr/openxr/deploy_on_quest.rst deleted file mode 100644 index 6695f73..0000000 --- a/tutorials/vr/openxr/deploy_on_quest.rst +++ /dev/null @@ -1,38 +0,0 @@ -.. _doc_deploy_on_quest: - -Deploying on Quest -================== - -The OpenXR plugin makes developing for Android seamless with developing desktop XR experiences. -Note that many Android based devices are very constrained performance-wise. Therefore, -**we highly recommend using the GLES2 renderer.** - -.. note:: - - Currently, the only Android-based device supported is the Meta Quest. - As Khronos is finalizing the official Android support for OpenXR, we will be able to offer further support soon. - -.. seealso:: - - As with any other Android device, please follow - `the instructions in the official Godot documentation for deploying to Android `__. - -Enable developer mode on the Quest ----------------------------------- - -You can only deploy games to the Meta Quest if developer mode is enabled. -You can do this from the Oculus support application installed on your phone. -Please `follow the instructions on the Oculus developer site `__. - -Setting up the export template ------------------------------- - -The instruction in the official Godot documentation already has you configuring an export template in Godot. However, a few extra settings are needed for XR deployment. - -Open the export settings again by opening the **Project > Export...** menu and select the Android export template you created. -If you haven't created it yet, do so now by pressing **Add...** and selecting **Android**. - -Scroll down to the **Xr Features** section. Here, the important setting is the **Xr Mode** which should be set to **OpenXR**. -Note that the other options shown here should be set according to your project's needs. - -.. image:: img/android_xr_features.png diff --git a/tutorials/vr/openxr/enable_plugin.rst b/tutorials/vr/openxr/enable_plugin.rst deleted file mode 100644 index 04f7f68..0000000 --- a/tutorials/vr/openxr/enable_plugin.rst +++ /dev/null @@ -1,13 +0,0 @@ -.. _doc_enable_plugin: - -Enabling the OpenXR plugin -========================== - -Due to the design of Godot's XR system, the plugin will always automatically load the OpenXR interface. -However, additional editor features will not be available unless the OpenXR plugin is enabled. - -For this, go to **Project > Project Settings** and select the **Plugins** tab: - -.. image:: img/enable_plugin.png - -Make sure the **Enable** checkbox is checked. diff --git a/tutorials/vr/openxr/handtracking.rst b/tutorials/vr/openxr/handtracking.rst deleted file mode 100644 index 8768dda..0000000 --- a/tutorials/vr/openxr/handtracking.rst +++ /dev/null @@ -1,121 +0,0 @@ -.. _doc_handtracking: - -Hand tracking -============= - -.. note:: - - Only available in versions **1.1.0 and later** of the OpenXR plugin. - -The hand tracking API was originally added to OpenXR by Microsoft to make the tracking information for the users hands and fingers available to the XR client. The API provides pose data for all the bones in the players hands but leaves some room for interpretation in how the API is implemented by the XR runtime. - -In SteamVR support was added based on Valves existing hand tracking system that also provides fully rigged bone data extrapolated from controller inputs and proximity sensors if hand tracking is not supported natively on the system used. 
Meta added support for this API to their mobile OpenXR runtime, tied into their existing hand tracking functionality on the Quest. Note that you do need to enable hand tracking in the export settings for this to be active. The hand tracking API is only used for pure hand tracking; no pose data is presented when controllers are used.

.. note::

   When using the hand tracking API it is thus important that the capabilities of the target platform are taken into account.
   This may improve in time as feedback is provided to the OpenXR working group.

The hand tracking API defines the bone structure that all XR runtimes must abide by; however, it doesn't dictate the orientation of the bones in rest or any size requirements.

.. image:: img/hand_tracking_bones.png

Image courtesy of Khronos OpenXR specification.

The hand tracking API is independent of the action system and doesn't make use of its poses. Hand tracking data is provided internally in global space (relative to the tracking volume's origin point), and the hand tracking nodes should thus have the :ref:`ARVROrigin ` node as their parent, not an :ref:`ARVRController ` node.

The plugin exposes the hand tracking API as two separate systems: one that updates the positions of a tree of nodes, and one that updates the bones of a :ref:`Skeleton ` so mesh deformation can be used.

Node based hand tracking
------------------------

This implementation is the most versatile as it doesn't require any knowledge of the hand model in rest pose. Note that the plugin comes with two example scenes called `left_hand_nodes.tscn` and `right_hand_nodes.tscn` that you can instance as child nodes of the `ARVROrigin` node. These scenes contain logic to automatically resize the used meshes to fit the size of the provided bone data.

.. image:: img/arvr_nodes_example.png

At the root of this scene is a :ref:`Spatial ` node that has `config/OpenXRHand.gdns` assigned as its script. This class from the plugin will position the spatial node at the base of the hand (see Palm in our image up above) and will update the position and orientation of its children.

.. image:: img/arvr_openxr_hand.png

There are two properties here:

* `Hand` identifies whether we are tracking the position of the left or right hand.
* `Motion Range` is only available on SteamVR and limits how far the hand can close; it is only used in conjunction with inferred hand tracking based on controller input.

Our spatial node needs a number of child nodes with hardcoded names that will be updated by our hand tracking system. The type of the nodes is not important; our example script uses :ref:`MeshInstance ` nodes to also visualise the bones.

.. image:: img/hand_tracking_nodes.png

First we find the child node `Wrist`; underneath it there are nodes for each finger. Each node starts with the name of the finger followed by the name of the bone. The finger names are `Thumb`, `Index`, `Middle`, `Ring` and `Little`. The bone names are `Metacarpal`, `Proximal`, `Intermediate`, `Distal` and `Tip`. For example, `IndexDistal` is the distal bone of the index finger.

.. note::
   The thumb is the only finger that does not have an intermediate bone!

The parent-child relationships of these nodes are important and the hand will only look correct if this structure is followed exactly. Note that adding extra nodes isn't a problem; the example scenes add a number of extra bone meshes to complete the look of the hand. A short sketch of reading one of these tracked nodes from a script follows below.
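This is only a sketch: the exported path and the idea of tracking the index fingertip are assumptions for illustration, while the node name `IndexTip` follows the naming convention described above.

.. code::

    extends Spatial

    # Point this at the hand scene's root (the node with OpenXRHand.gdns assigned).
    export(NodePath) var hand_root_path
    onready var hand_root = get_node(hand_root_path)
    # Look the fingertip node up by name so we don't depend on the exact nesting.
    onready var index_tip = hand_root.find_node("IndexTip", true, false)

    var tip_position = Vector3()


    func _process(_delta):
        if index_tip:
            # Global position of the tracked fingertip, e.g. to drive a touch probe.
            tip_position = index_tip.global_transform.origin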
Note also that the example scenes have scripts attached to the wrist node that update the size and positions of these extra nodes.

Skeleton based hand tracking
----------------------------

The second method supported by the OpenXR plugin is exposing the bone data as a :ref:`Skeleton ` node. In this case the solution is divided into two classes: one for placing the hand in space, and one for animating the aforementioned skeleton by updating its bone poses.

This approach allows deforming a mesh, which is a visually more pleasing solution; however, differences in implementation between the platforms do pose some problems.

.. note::
   Microsoft has added another API to OpenXR that allows for retrieving a properly skinned hand mesh; however, as they are currently the only platform supporting this API, it has not yet been added to the plugin.

At this point in time the plugin only exposes the data as it is provided by the OpenXR runtime. The plugin has an example implementation based on meshes that Valve has made publicly available; however, these work most reliably when used in conjunction with SteamVR.
These scenes are `scenes/left_hand_mesh.tscn` and `scenes/right_hand_mesh.tscn` and can be added as children of the :ref:`ARVROrigin ` node.

.. image:: img/arvr_mesh_example.png

Below is an overview of the steps needed to implement your own version.

.. note::
   The best way to implement this logic is to ask an artist to model a hand in 3D software using real hand dimensions, and to create an armature for the hand that follows the bone structure exactly as the OpenXR specification dictates in the image at the top of this article. When skinning, special care needs to be taken, keeping in mind that if full hand tracking is available, the distance between joints will be determined by the actual size of the player's hand and may thus be different to the 3D model. After importing the model into Godot you can add the required scripts to make everything work.

To place the hand mesh in space, a node needs to be added as a child to the :ref:`ARVROrigin ` node; this node needs to have the `config/OpenXRPose.gdns` script attached. When importing a 3D file you can add this script to the root node of the imported model.

The `OpenXRPose` script isn't just used by the hand logic, but also exposes other pose locations configured in the action map.

.. image:: img/arvr_openxr_pose.png

The following properties can be set on this node:

* `Invisible If Inactive` enables logic that will automatically make this node invisible if the hand is not being tracked.
* `Action` specifies which action in the action map is being tracked; this needs to be set to the special type `SkeletonBase`.
* `Path` specifies the OpenXR input path; this is `/user/hand/left` for the left hand and `/user/hand/right` for the right hand.

The next step is adding the script `config/OpenXRSkeleton.gdns` to the skeleton node of the 3D model. This script has the same two properties as the `OpenXRHand` script, namely `Hand` and `Motion Range`, and they have the same use.
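As a small illustration, and as an alternative to the :ref:`BoneAttachment ` approach mentioned at the end of this article, a bone pose can also be read from the skeleton directly in code. This is a sketch only: it assumes the script is attached to the :ref:`Skeleton ` node itself and uses the standardised bone names listed below.

.. code::

    extends Skeleton

    # find_bone returns -1 if the bone does not exist.
    onready var index_tip_id = find_bone("Index_Tip_L")


    func _process(_delta):
        if index_tip_id != -1:
            # Pose of the left index fingertip, relative to the Skeleton node.
            var tip_pose = get_bone_global_pose(index_tip_id)
            # Convert to world space if needed.
            var tip_world = global_transform * tip_pose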
- -Note that the bone names are standardised, the list of bone names is presented below and need to be suffixed with either `_L` or `_R` depending on whether the bone is for respectively the left hand or the right hand: - -* Palm -* Wrist -* Thumb_Metacarpal -* Thumb_Proximal -* Thumb_Distal -* Thumb_Tip -* Index_Metacarpal -* Index_Proximal -* Index_Intermediate -* Index_Distal -* Index_Tip -* Middle_Metacarpal -* Middle_Proximal -* Middle_Intermediate -* Middle_Distal -* Middle_Tip -* Ring_Metacarpal -* Ring_Proximal -* Ring_Intermediate -* Ring_Distal -* Ring_Tip -* Little_Metacarpal -* Little_Proximal -* Little_Intermediate -* Little_Distal -* Little_Tip - -Finally, and this is standard Godot functionality, a common addition to hand tracking is to track the location of the tip of a finger for physics interaction. This can be accomplished with the :ref:`BoneAttachment ` node. Simply add this as a child node to the :ref:`Skeleton ` node and select the bone you want to track. Now you can add the desired physics object as a child to this node. diff --git a/tutorials/vr/openxr/img/android_xr_features.png b/tutorials/vr/openxr/img/android_xr_features.png deleted file mode 100644 index 779a38f..0000000 Binary files a/tutorials/vr/openxr/img/android_xr_features.png and /dev/null differ diff --git a/tutorials/vr/openxr/img/arvr_mesh_example.png b/tutorials/vr/openxr/img/arvr_mesh_example.png deleted file mode 100644 index 14a0b9b..0000000 Binary files a/tutorials/vr/openxr/img/arvr_mesh_example.png and /dev/null differ diff --git a/tutorials/vr/openxr/img/arvr_nodes_example.png b/tutorials/vr/openxr/img/arvr_nodes_example.png deleted file mode 100644 index 2106312..0000000 Binary files a/tutorials/vr/openxr/img/arvr_nodes_example.png and /dev/null differ diff --git a/tutorials/vr/openxr/img/arvr_openxr_hand.png b/tutorials/vr/openxr/img/arvr_openxr_hand.png deleted file mode 100644 index b97d407..0000000 Binary files a/tutorials/vr/openxr/img/arvr_openxr_hand.png and /dev/null differ diff --git a/tutorials/vr/openxr/img/arvr_openxr_pose.png b/tutorials/vr/openxr/img/arvr_openxr_pose.png deleted file mode 100644 index 0acab06..0000000 Binary files a/tutorials/vr/openxr/img/arvr_openxr_pose.png and /dev/null differ diff --git a/tutorials/vr/openxr/img/default_scene.png b/tutorials/vr/openxr/img/default_scene.png deleted file mode 100644 index d03e2e3..0000000 Binary files a/tutorials/vr/openxr/img/default_scene.png and /dev/null differ diff --git a/tutorials/vr/openxr/img/editable_children.png b/tutorials/vr/openxr/img/editable_children.png deleted file mode 100644 index 3fc8a8b..0000000 Binary files a/tutorials/vr/openxr/img/editable_children.png and /dev/null differ diff --git a/tutorials/vr/openxr/img/enable_plugin.png b/tutorials/vr/openxr/img/enable_plugin.png deleted file mode 100644 index 2d1a09d..0000000 Binary files a/tutorials/vr/openxr/img/enable_plugin.png and /dev/null differ diff --git a/tutorials/vr/openxr/img/hand_tracking_bones.png b/tutorials/vr/openxr/img/hand_tracking_bones.png deleted file mode 100644 index 0cfaae7..0000000 Binary files a/tutorials/vr/openxr/img/hand_tracking_bones.png and /dev/null differ diff --git a/tutorials/vr/openxr/img/hand_tracking_nodes.png b/tutorials/vr/openxr/img/hand_tracking_nodes.png deleted file mode 100644 index 365eeed..0000000 Binary files a/tutorials/vr/openxr/img/hand_tracking_nodes.png and /dev/null differ diff --git a/tutorials/vr/openxr/img/start_passthrough.png b/tutorials/vr/openxr/img/start_passthrough.png deleted file mode 100644 
index 710de7b..0000000 Binary files a/tutorials/vr/openxr/img/start_passthrough.png and /dev/null differ diff --git a/tutorials/vr/openxr/img/switch_runtime.png b/tutorials/vr/openxr/img/switch_runtime.png deleted file mode 100644 index b07fd9e..0000000 Binary files a/tutorials/vr/openxr/img/switch_runtime.png and /dev/null differ diff --git a/tutorials/vr/openxr/index.rst b/tutorials/vr/openxr/index.rst deleted file mode 100644 index 53050c5..0000000 --- a/tutorials/vr/openxr/index.rst +++ /dev/null @@ -1,72 +0,0 @@ -.. _doc_openxr_introduction: - -OpenXR plugin -============= - -Welcome to the Godot OpenXR documentation! - -Introduction ------------- - -This is the documentation for the `Godot OpenXR plugin `__. - -The plugin is supported on Godot 3.4 and later. However, it does **not** support the upcoming Godot 4.0 release. - -Getting started ---------------- - -To start a new project that supports OpenXR, start by opening up the Godot editor and creating a new project. - -Copy the plugin into this new project in the subfolder ``addons/godot_openxr/`` using your operating system's file manager. -It is important that the plugin is placed in this **exact** location in your project folder. - -Back in Godot, create a new 3D scene and press the **Instance Child Scene** button -(represented by a chain link icon) in the scene tree dock. -Select the ``addons/godot_openxr/scenes/first_person_controller_vr.tscn`` subscene -and add it to your scene. -Right-click the added node and select **Editable Children** to gain access -to some of the nodes in this subscene: - -.. image:: img/editable_children.png - -This is the bare minimum you need. However, for good measure, we suggest adding -a DirectionalLight node and a few MeshInstance nodes so you have something to see. -If you add those to the hand nodes, you can visualize where your controllers -are tracking. - -Your scene should now look something like this: - -.. image:: img/default_scene.png - -Now you can press the **Run** button in the top-right corner of the editor -to start your project and you should be able to look around. - -Next steps ----------- - -To turn this simple scene into a proper game, the sky is the limit. -Below, there are a few more topics specific to this plugin. -However, the following resources are a good place to continue: - -- :ref:`VR starter tutorial ` - in the official documentation may focus on OpenVR, but almost everything - there applies to OpenXR as well. -- `Godot XR tools `__ is a plugin - that contains a set of handy sub scene to quickly - implement locomotion, object interaction and UI elements in your XR experience. - -Please check the `Godot Engine community page `__ to find help from other Godot developers. -The ``#xr`` channel on the Godot Discord has become a vibrant Godot XR community. - -Plugin features ---------------- - -.. toctree:: - :maxdepth: 1 - :name: toc-tutorials-vr-openxr - - enable_plugin - runtime_selection - passthrough - deploy_on_quest - handtracking diff --git a/tutorials/vr/openxr/passthrough.rst b/tutorials/vr/openxr/passthrough.rst deleted file mode 100644 index b9468f2..0000000 --- a/tutorials/vr/openxr/passthrough.rst +++ /dev/null @@ -1,61 +0,0 @@ -.. _doc_passthrough: - -Passthrough -=========== - -.. note:: - - Only available in versions **1.1.1 and later** of the OpenXR plugin. - -Passthrough is a new feature introduced on the Meta Quest and added to the OpenXR spec as a Meta extension. 
It is likely this extension will be implemented by other vendors whose hardware supports a passthrough mode, and promoted to core at some point.

Keep in mind that this feature is not guaranteed to be available.
Passthrough allows the camera input on the headset to be used within the headset so the user can see the real world.
This allows us to implement an AR-like experience in a VR headset.

If you are using the ``first person controller`` scene, you can simply
enable passthrough by checking the **Start Passthrough** option
on the controller node:

.. image:: img/start_passthrough.png

If you would rather do this through code, you will first need to create an instance
of the ``OpenXRConfig`` object.
You can do this the same way the ``first person controller`` does
and assign the ``OpenXRConfig.gdns`` as the script to a node,
or you can instance it in code as shown below:

.. code::

    var openxr_config = null


    func _ready():
        # Load the OpenXRConfig class provided by the plugin.
        var config_gdns = load("res://addons/godot_openxr/config/OpenXRConfig.gdns")
        if config_gdns:
            openxr_config = config_gdns.new()


    func start_passthrough():
        if openxr_config:
            return openxr_config.start_passthrough()
        else:
            return false


    func stop_passthrough():
        if openxr_config:
            openxr_config.stop_passthrough()

.. note::

   The Viewport's **Transparent Bg** property must be enabled prior to starting passthrough.
   The plugin will log a warning message if it detects an incorrect configuration.

.. seealso::

   A fix for a bug related to turning transparent background on/off
   is scheduled for the Godot 3.4.3 release. If you wish to toggle passthrough
   in your game, you will need to be on that version or newer.

diff --git a/tutorials/vr/openxr/runtime_selection.rst b/tutorials/vr/openxr/runtime_selection.rst
deleted file mode 100644
index 0abc726..0000000
--- a/tutorials/vr/openxr/runtime_selection.rst
+++ /dev/null
@@ -1,38 +0,0 @@
.. _doc_runtime_selection:

Switching runtimes
==================

In OpenXR, it is standard for each runtime to implement a mechanism to make it
the current runtime. In Steam, the Oculus application or the Windows MR portal,
there will be an option to switch to their runtime as the current OpenXR runtime.

Generally speaking, end users will have a preferred runtime and no reason
to switch runtimes when playing games that support OpenXR. However, developers may wish to
test multiple runtimes to check that their game behaves correctly on each.

To make this easy, Godot provides a dropdown in the top-right corner which can
switch the runtime Godot will use when testing:

.. image:: img/switch_runtime.png

The OpenXR plugin will **not** work with the Microsoft MR runtime.
That runtime only supports OpenXR applications that use DirectX,
but Godot uses OpenGL ES 3.0 or 2.0.

.. note::

   Selecting a runtime in this dropdown only applies to running the game
   from the editor. It does **not** change the runtime used by other
   applications. Exported projects will use the computer's current runtime.
   Also, if you are deploying to an external device, this setting has no effect.

As OpenXR doesn't have a mechanism for registering runtimes that we can query,
Godot will check common locations for runtime configuration files.
The locations that are checked are stored in the ``addons/godot_openxr/runtimes.json`` file.
If you've installed a runtime in a nonstandard location or a runtime not currently present in this file, you can add it manually using a text editor.

..
seealso:: - - If the dropdown isn't shown in your editor, make sure the plugin is enabled. - See :ref:`doc_enable_plugin`. diff --git a/tutorials/vr/xr_primer.rst b/tutorials/vr/xr_primer.rst deleted file mode 100644 index f811133..0000000 --- a/tutorials/vr/xr_primer.rst +++ /dev/null @@ -1,192 +0,0 @@ -.. _doc_xr_primer: - -AR/VR primer -============ - -This tutorial gives you a springboard into the world of AR and VR in the Godot game engine. - -A new architecture was introduced in Godot 3 called the AR/VR Server. On top of this -architecture, specific implementations are available as interfaces, most of which are plugins -based on GDNative. This tutorial focuses purely on the core elements abstracted by the core -architecture. This architecture has enough features for you to create an entire VR experience -that can then be deployed for various interfaces. However, each platform often has some unique -features that are impossible to abstract. Such features will be documented on the relevant -interfaces and fall outside of the scope of this primer. - -AR/VR server ------------- - -When Godot starts, each available interface will make itself known to the AR/VR server. -GDNative interfaces are setup as singletons; as long as they are added to the list of -GDNative singletons in your project, they will make themselves known to the server. - -You can use the function :ref:`get_interfaces() ` -to return a list of available interfaces, but for this tutorial, we're going to use the -:ref:`native mobile VR interface ` in our examples. This interface -is a straightforward implementation that uses the 3DOF sensors on your phone for orientation -and outputs a stereoscopic image to the screen. It is also available in the Godot core and -outputs to screen on desktop, which makes it ideal for prototyping or a tutorial such as -this one. - -To enable an interface, you execute the following code: - -.. tabs:: - .. code-tab:: gdscript GDScript - - var arvr_interface = ARVRServer.find_interface("Native mobile") - if arvr_interface and arvr_interface.initialize(): - get_viewport().arvr = true - - .. code-tab:: csharp - - var arvrInterface = ARVRServer.FindInterface("Native mobile"); - if (arvrInterface != null && arvrInterface.Initialize()) - { - GetViewport().Arvr = true; - } - -This code finds the interface we wish to use, initializes it and, if that is successful, binds -the main viewport to the interface. This last step gives some control over the viewport to the -interface, which automatically enables things like stereoscopic rendering on the viewport. - -For our mobile VR interface, and any interface where the main input is directly displayed on -screen, the main viewport needs to be the viewport where :ref:`arvr` -is set to ``true``. But for interfaces that render on an externally attached device, you can use -a secondary viewport. In the latter case, a viewport that shows its output on screen will show an -undistorted version of the left eye, while showing the fully processed stereoscopic output on the -device. - -Finally, you should only initialize an interface once; switching scenes and reinitializing interfaces -will just introduce a lot of overhead. If you want to turn the headset off temporarily, just disable -the viewport or set :ref:`arvr` to ``false`` on the viewport. In most -scenarios though, you wouldn't disable the headset once you're in VR, this can be disconcerting to -the gamer. 
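If you want to see which interfaces are available at runtime, or temporarily drop out of VR, a minimal sketch using only the calls mentioned above could look like the following; the helper function names are arbitrary:

.. tabs::
 .. code-tab:: gdscript GDScript

    func _list_interfaces():
        # Each entry describes one registered interface (for example the native mobile one).
        for interface_info in ARVRServer.get_interfaces():
            print(interface_info)


    func _pause_vr(paused):
        # Temporarily stop presenting to the headset without reinitializing the interface.
        get_viewport().arvr = not paused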
- -New AR/VR nodes ---------------- - -Three new node types have been added for supporting AR and VR in Godot and one additional -node type especially for AR. These are: - -* :ref:`ARVROrigin ` - our origin point in the world -* :ref:`ARVRCamera ` - a special subclass of the camera, which is positionally tracked -* :ref:`ARVRController ` - a new spatial class, which tracks the location of a controller -* :ref:`ARVRAnchor ` - an anchor point for an AR implementation mapping a real world location into your virtual world - -The first two must exist in your scene for AR/VR to work and this tutorial focuses purely -on them. - -:ref:`ARVROrigin ` is an important node, you must have one and only one -of these somewhere in your scene. This node maps the center of your real world tracking -space to a location in your virtual world. Everything else is positionally tracked in -relation to this point. Where this point lies exactly differs from one implementation to -another, but the best example to understand how this node works is to take a look at a room -scale location. While we have functions to adjust the point to center it on the player by -default, the origin point will be the center location of the room you are in. As you -physically walk around the room, the location of the HMD is tracked in relation to this -center position and the tracking is mirror in the virtual world. - -To keep things simple, when you physically move around your room, the ARVR Origin point stays -where it is, the position of the camera and controllers will be adjusted according to your -movements. When you move through the virtual world, either through controller input or when -you implement a teleport system, it is the position of the origin point which you will -have to adjust. - -:ref:`ARVRCamera ` is the second node that must always be a part of your -scene and it must always be a child node of your origin node. It is a subclass of Godot's -normal camera. However, its position is automatically updated each frame based on the physical -orientation and position of the HMD. Also due to the precision required for rendering to an -HMD or rendering an AR overlay over a real world camera, most of the standard camera properties -are ignored. The only properties of the camera that are used are the near and far plane -settings. The FOV, aspect ratio and projection mode are all ignored. - -Note that, for our native mobile VR implementation, there is no positional tracking, only -the orientation of the phone and by extension, the HMD is tracked. This implementation -artificially places the camera at a height (Y) of 1.85. - -Conclusion: your minimum setup in your scene to make AR or VR work should look like this: - -.. image:: img/minimum_setup.png - -And that's all you need to get started with the native mobile interface. Obviously, you need -to add something more into your scene, so there is something to see, but after that, you can -export the game to your phone of choice, pop it into a viewer and away you go. - -Official plugins and resources ------------------------------- - -As mentioned earlier, Godot does not support the various VR and AR SDKs out of the box, you -need a plugin for the specific SDK you want to use. There are several official plugins available -in the `GodotVR Repository `__. - -* `Godot OpenXR `_: This is the **official XR plugin** - starting with Godot **3.4**. It supports OpenXR, an open standard for designing and building - cross-platform VR and AR software. 
Tested with SteamVR, Monado and Oculus OpenXR (desktop and mobile) runtimes.

  * See :ref:`doc_openxr_introduction`.

* `Godot Oculus Mobile `_ provides :ref:`support for the Meta Quest `.

  * **Note**: This plugin has been deprecated starting with Godot 3.4.
    We recommend migrating to the `Godot OpenXR `_ plugin instead.

* `Godot OpenVR `_ (not to be confused with OpenXR) supports the OpenVR SDK used by Steam.
* `Godot Oculus `__ supports the Oculus SDK (desktop headsets only).

  * **Note**: This plugin has been deprecated starting with Godot 3.4.
    We recommend migrating to the `Godot OpenXR `_ plugin instead.

* `Godot OpenHMD `_ supports OpenHMD, an open source API and drivers for headsets.

These plugins can be downloaded from GitHub or the Godot Asset Library.

In addition to the plugins, there are several official demos.

* `Godot Oculus Demo `__.
* `Godot OpenVR FPS `__ (the tutorial for this project is :ref:`doc_vr_starter_tutorial_part_one`).
* `Godot XR tools `__, which shows implementations for VR features such as movement and picking up objects.

Other things to consider
------------------------

There are a few other subjects that we need to briefly touch upon in this primer that are important to know.

The first is units. In normal 3D games, you don't have to think a lot about units. As long as
everything is at the same scale, a box sized 1 unit by 1 unit by 1 unit can be any size from a cube
you can hold in your hand to something the size of a building. In AR and VR, this changes because
things in your virtual world are mapped to things in the real world. If you step 1 meter forward in
the real world, but you only move 1 cm forward in your virtual world, you have a problem. The same
goes for the position of your controllers; if they don't appear in the right relative space, it breaks
the immersion for the player. Most VR platforms, including our AR/VR Server, assume that 1 unit = 1
meter. The AR/VR server, however, has a property that, for convenience, is also exposed on the
ARVROrigin node called world scale. For instance, setting this to a value of 10 changes our coordinate
system so 10 units = 1 meter.

Performance is another thing that needs to be carefully considered. VR especially taxes your game
a lot more than most people realize. For mobile VR, you have to be extra careful here, but even for
desktop games, there are three factors that make life extra difficult:

* You are rendering stereoscopically: two images for the price of one. While this does not exactly double the
  workload, and with things in the pipeline such as support for the MultiView OpenGL extension in mind, there
  is still an extra workload in rendering images for both eyes.
* A normal game will run acceptably at 30fps and ideally manages 60fps. That gives you a big range to
  play with between lower end and higher end hardware. For any HMD application of AR or VR, however,
  60fps is the absolute minimum and you should target your games to run at a stable 90fps to ensure your
  users don't get motion sickness right off the bat.
* The high FOV and related lens distortion effect require many VR experiences to render at double the
  resolution. A VIVE may only have a resolution of 1080x1200 per eye, but we render each eye at
  2160x2400 as a result. This is less of an issue for most AR applications.

All in all, the workload your GPU has in comparison with a normal 3D game is a fair amount
higher. While things are in the pipeline to improve this, such as MultiView and foveated rendering,
these aren't supported on all devices. This is why you see many VR games use a simpler, more optimized art
style, and if you pay close attention to those VR games that go for realism, you'll probably notice they're
a bit more conservative on the effects or use some good old optical trickery. A quick way to keep an eye
on your frame rate while testing is sketched below.
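The following snippet is generic Godot rather than anything AR/VR specific; it simply prints the engine's reported frame rate once per second so you can check that you are holding the targets discussed above (the one-second interval is an arbitrary choice):

.. tabs::
 .. code-tab:: gdscript GDScript

    var _fps_report_timer = 0.0


    func _process(delta):
        _fps_report_timer += delta
        if _fps_report_timer >= 1.0:
            _fps_report_timer = 0.0
            # In VR you want this to stay at or above the HMD's refresh rate (typically 90).
            print("FPS: ", Engine.get_frames_per_second())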