Removed vr.

This commit is contained in:
Relintai 2022-09-10 13:01:00 +02:00
parent 12bfca3b70
commit 035ea896b4
36 changed files with 0 additions and 2868 deletions

View File

@ -108,7 +108,6 @@ The main documentation for the site is organized into the following sections:
tutorials/scripting/index
tutorials/shaders/index
tutorials/ui/index
tutorials/vr/index
.. toctree::

Binary file not shown (3.7 KiB).

View File

@ -1,11 +0,0 @@
XR (AR/VR)
==========
.. toctree::
   :maxdepth: 1
   :name: toc-tutorials-vr

   xr_primer
   openxr/index
   oculus_mobile/index
   openvr/index

View File

@ -1,115 +0,0 @@
.. _doc_developing_for_oculus_quest:
Developing for Oculus Quest
===========================
Introduction
------------
This tutorial goes over how to get started developing for the
*Meta Quest* with the Godot Oculus Mobile plugin.
Before starting, there are two things you need to do:
First you need to go through the steps on the :ref:`doc_exporting_for_android`
page. This leads you through installing the toolset that Godot
needs to export to Android devices.
Next you need the Quest plugin. You can get it from the Asset
Library or manually download it from `here <https://github.com/GodotVR/godot-oculus-mobile-asset>`__.
Setting Up Godot
----------------
To get started, open Godot and create a new project.
.. image:: img/quest_new_project.png
Make sure to choose the ``GLES2`` renderer. Due to the
Quest's GPU, this backend is far better suited to the hardware.
Copy the addons folder from the Oculus Mobile asset into your Godot
project. Your project tree should look similar to this:
.. image:: img/quest_project_tree.png
Now you can start building the main scene:
- Add an :ref:`ARVROrigin <class_ARVROrigin>` node first.
- Then add three child nodes to the origin node, one :ref:`ARVRCamera <class_ARVRCamera>` and two :ref:`ARVRController <class_ARVRController>` nodes.
- Assign controller ID 1 to the first :ref:`ARVRController <class_ARVRController>` and rename that to ``LeftHand``.
- Assign controller ID 2 to the second :ref:`ARVRController <class_ARVRController>` and rename that to ``RightHand``.
- Finally add a :ref:`MeshInstance <class_MeshInstance>` as a child node to our first :ref:`ARVRController <class_ARVRController>` and create a box shape; resize the box so each side is 0.1. Now duplicate the :ref:`MeshInstance <class_MeshInstance>` and move it to the second :ref:`ARVRController <class_ARVRController>` node. These will stand in for our controllers. (If you would rather build this hierarchy from a script, see the sketch after the screenshot below.)
.. image:: img/quest_scene_tree.png
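If you would rather assemble the same hierarchy from a script than in the editor, a minimal sketch could look like the following (the class names are standard Godot 3.x; the script itself is illustrative, not part of the plugin):

.. code::

    extends Spatial

    func _ready():
        var origin = ARVROrigin.new()
        add_child(origin)

        var camera = ARVRCamera.new()
        origin.add_child(camera)

        var left_hand = ARVRController.new()
        left_hand.controller_id = 1
        left_hand.name = "LeftHand"
        origin.add_child(left_hand)

        var right_hand = ARVRController.new()
        right_hand.controller_id = 2
        right_hand.name = "RightHand"
        origin.add_child(right_hand)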
Now add a script to the main node and add the following code:
.. tabs::
 .. code-tab:: gdscript GDScript

    extends Spatial

    var perform_runtime_config = false

    onready var ovr_init_config = preload("res://addons/godot_ovrmobile/OvrInitConfig.gdns").new()
    onready var ovr_performance = preload("res://addons/godot_ovrmobile/OvrPerformance.gdns").new()

    func _ready():
        var interface = ARVRServer.find_interface("OVRMobile")
        if interface:
            ovr_init_config.set_render_target_size_multiplier(1)

            if interface.initialize():
                get_viewport().arvr = true

    func _process(_delta):
        if not perform_runtime_config:
            ovr_performance.set_clock_levels(1, 1)
            ovr_performance.set_extra_latency_mode(1)
            perform_runtime_config = true
Before you can export this project to the Quest you need to do three
more things.
First, go into the project settings and make sure that the main scene
is set as the scene to run; Godot does not ask you to set this on export.
.. image:: img/quest_project_settings.png
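As an aside, if you ever want to set the main scene from a tool script instead of the UI, the relevant setting key in Godot 3 is ``application/run/main_scene``; the scene path below is a placeholder:

.. code::

    # "res://Main.tscn" is a placeholder; point this at your own main scene.
    ProjectSettings.set_setting("application/run/main_scene", "res://Main.tscn")
    ProjectSettings.save()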
Then go into the export menu and configure a new Android export. If
you still haven't gone through the :ref:`doc_exporting_for_android`
page, do it now; until you do, you'll see red error messages on this
screen.
Once everything is set up, you can forge ahead and make a few small
changes to the export settings. First, change the XR Mode to ``Oculus Mobile VR``.
Then change the Degrees of Freedom mode to ``6DOF``.
.. image:: img/quest_export_settings.png
Now save and close the export window.
Setting Up Your Quest
---------------------
Follow `these instructions <https://developer.oculus.com/documentation/native/android/mobile-device-setup/>`__ to
set up your device for development.
Once your device is set up and connected, click the **Android logo** that should be visible in the top-right corner of the Godot editor.
When clicked, it exports your project and runs it on the connected device.
If you do not see this Android logo, make sure you have created an Android export preset
and that the preset is marked as **Runnable** in the Export dialog.
The above does the bare minimum to get your project running on the Quest;
it's not very exciting yet. Holger Dammertz has made a great toolkit for the
Quest that contains a lot of scenes to help you on your way, including
really nice controller meshes.
You can find the toolkit `here <https://github.com/NeoSpark314/godot_oculus_quest_toolkit>`__.
If you want to help out with improving the plugin please join us `here <https://github.com/GodotVR/godot_oculus_mobile>`__.

Binary file not shown (12 KiB).

Binary file not shown (30 KiB).

Binary file not shown (37 KiB).

Binary file not shown (3.3 KiB).

Binary file not shown (11 KiB).

View File

@ -1,8 +0,0 @@
Oculus mobile plugin (deprecated)
=================================
.. toctree::
   :maxdepth: 1
   :name: toc-tutorials-vr-oculus_mobile

   developing_for_oculus_quest

View File

@ -1,8 +0,0 @@
OpenVR plugin
=============
.. toctree::
   :maxdepth: 1
   :name: toc-tutorials-vr-openvr

   vr_starter_tutorial/index

Binary file not shown (201 KiB).

Binary file not shown (273 KiB).

Binary file not shown (208 KiB).

View File

@ -1,9 +0,0 @@
VR starter tutorial
===================
.. toctree::
   :maxdepth: 1
   :name: doc_vr_starter_tutorial

   vr_starter_tutorial_part_one
   vr_starter_tutorial_part_two

View File

@ -1,38 +0,0 @@
.. _doc_deploy_on_quest:
Deploying on Quest
==================
The OpenXR plugin makes developing for Android as seamless as developing desktop XR experiences.
Note that many Android based devices are very constrained performance-wise. Therefore,
**we highly recommend using the GLES2 renderer.**
.. note::
Currently, the only Android-based device supported is the Meta Quest.
As Khronos is finalizing the official Android support for OpenXR, we will be able to offer further support soon.
.. seealso::
As with any other Android device, please follow
`the instructions in the official Godot documentation for deploying to Android <https://docs.godotengine.org/en/stable/getting_started/workflow/export/exporting_for_android.html#doc-exporting-for-android>`__.
Enable developer mode on the Quest
----------------------------------
You can only deploy games to the Meta Quest if developer mode is enabled.
You can do this from the Oculus support application installed on your phone.
Please `follow the instructions on the Oculus developer site <https://developer.oculus.com/documentation/native/android/mobile-device-setup/>`__.
Setting up the export template
------------------------------
The instructions in the official Godot documentation already have you configure an export template in Godot. However, a few extra settings are needed for XR deployment.
Open the export settings again by opening the **Project > Export...** menu and select the Android export template you created.
If you haven't created it yet, do so now by pressing **Add...** and selecting **Android**.
Scroll down to the **Xr Features** section. Here, the important setting is **Xr Mode**, which should be set to **OpenXR**.
Note that the other options shown here should be set according to your project's needs.
.. image:: img/android_xr_features.png

View File

@ -1,13 +0,0 @@
.. _doc_enable_plugin:
Enabling the OpenXR plugin
==========================
Due to the design of Godot's XR system, the plugin will always automatically load the OpenXR interface.
However, additional editor features will not be available unless the OpenXR plugin is enabled.
For this, go to **Project > Project Settings** and select the **Plugins** tab:
.. image:: img/enable_plugin.png
Make sure the **Enable** checkbox is checked.

View File

@ -1,121 +0,0 @@
.. _doc_handtracking:
Hand tracking
=============
.. note::
Only available in versions **1.1.0 and later** of the OpenXR plugin.
The hand tracking API was originally added to OpenXR by Microsoft to make the tracking information for the user's hands and fingers available to the XR client. The API provides pose data for all the bones in the player's hands but leaves some room for interpretation in how the API is implemented by the XR runtime.
In SteamVR, support was added based on Valve's existing hand tracking system, which also provides fully rigged bone data extrapolated from controller inputs and proximity sensors if hand tracking is not supported natively on the system used.
Meta added support for this API to their mobile OpenXR runtime, tied into their existing hand tracking functionality on the Quest. Note that you do need to enable hand tracking in the export settings for this to be active. The hand tracking API is only used for pure hand tracking; no pose data is presented when controllers are used.
.. note::
When using the hand tracking API it is thus important that the capabilities of the target platform are taken into account.
This may improve in time as feedback is provided to the OpenXR working group.
The hand tracking API defines the bone structure that all XR runtimes must abide by; however, it doesn't dictate the orientation of the bones at rest or any size requirements.
.. image:: img/hand_tracking_bones.png
Image courtesy of Khronos OpenXR specification.
The hand tracking API is independent of the action system and doesn't make use of its poses. Hand tracking data is provided internally in global space (relative to the tracking volume's origin point), and the hand tracking nodes should thus have the :ref:`ARVROrigin <class_ARVROrigin>` node as their parent, not an :ref:`ARVRController <class_ARVRController>` node.
The plugin exposes the hand tracking API as two separate systems: one that updates the positions of a tree of nodes, and one that updates the bones of a :ref:`Skeleton <class_Skeleton>` so mesh deformation can be used.
Node based hand tracking
------------------------
This implementation is the most versatile as it doesn't require any knowledge of the hand model in rest pose. Note that the plugin comes with two example scenes called `left_hand_nodes.tscn` and `right_hand_nodes.tscn` that you can instance as child nodes of the `ARVROrigin` node. These scenes contain logic to automatically resize the used meshes to fit the size of the provided bone data.
.. image:: img/arvr_nodes_example.png
At the root of this scene is a :ref:`Spatial <class_Spatial>` node that has `config/OpenXRHand.gdns` assigned as its script. This class from the plugin will position the spatial node at the base of the hand (see Palm in our image up above) and will update the position and orientation of its children.
.. image:: img/arvr_openxr_hand.png
There are two properties here:
* `Hand` identifies whether we are tracking the position of the left or right hand.
* `Motion Range` is only available on SteamVR and limits how far the hand can close; it is only used in conjunction with inferred hand tracking based on controller input.
Our spatial node needs a number of child nodes with hardcoded names that will be updated by our hand tracking system. The type of nodes is not important, our example script uses :ref:`MeshInstance <class_MeshInstance>` nodes to also visualise the bones.
.. image:: img/hand_tracking_nodes.png
First we find the child node `Wrist`; underneath it there are nodes for each finger. Each node's name starts with the name of the finger followed by the name of the bone. The finger names are `Thumb`, `Index`, `Middle`, `Ring` and `Little`. The bone names are `Metacarpal`, `Proximal`, `Intermediate`, `Distal` and `Tip`. Ergo, `IndexDistal` is the distal bone of the index finger.
.. note::
The thumb is the only finger that does not have an intermediate bone!
The parent-child relationships of these nodes are important, and the hand will only look correct if this structure is followed exactly. Note that adding extra nodes isn't a problem; the example scenes add a number of extra bone meshes to complete the look of the hand. Note also that the example scenes have scripts attached to the wrist node that update the size and positions of these extra nodes.
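To illustrate the naming convention, here is a sketch that generates the expected chain of nodes from code. It assumes, as in the example scenes, that each bone node is parented to the previous one; the script is illustrative and not part of the plugin:

.. code::

    extends Spatial  # in the real scene, config/OpenXRHand.gdns is assigned here

    func _ready():
        var wrist = Spatial.new()
        wrist.name = "Wrist"
        add_child(wrist)

        for finger in ["Thumb", "Index", "Middle", "Ring", "Little"]:
            var bones = ["Metacarpal", "Proximal", "Intermediate", "Distal", "Tip"]
            if finger == "Thumb":
                bones.erase("Intermediate")  # the thumb has no intermediate bone

            # Chain each bone under the previous one:
            # Wrist/IndexMetacarpal/IndexProximal/IndexIntermediate/...
            var parent = wrist
            for bone in bones:
                var node = Spatial.new()
                node.name = finger + bone
                parent.add_child(node)
                parent = node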
Skeleton based hand tracking
----------------------------
The second method supported by the OpenXR plugin is exposing the bone data as a :ref:`Skeleton <class_Skeleton>` node. In this case the solution is divided in two classes, one for placing the hand in space and the second to animate the aforementioned skeleton by updating the bone poses of the skeleton.
This approach allows deforming a mesh, which is a visually more pleasing solution; however, differences in implementation between the platforms do pose some problems.
.. note::
Microsoft has added another API to OpenXR that allows for retrieving a properly skinned hand mesh; however, as they are currently the only platform supporting this API, it has not yet been added to the plugin.
At this point in time, the plugin only exposes the data as it is provided by the OpenXR runtime. The plugin has an example implementation based on meshes that Valve has made publicly available; however, these work most reliably when used in conjunction with SteamVR.
These scenes are `scenes/left_hand_mesh.tscn` and `scenes/right_hand_mesh.tscn` and can be added as children of the :ref:`ARVROrigin <class_ARVROrigin>` node.
.. image:: img/arvr_mesh_example.png
Below is an overview of the steps needed to implement your own version.
.. note::
The best way to implement this logic is to ask an artist to model a hand in 3D software using real hand dimensions and to create an armature for the hand that follows the bone structure exactly as the OpenXR specification dictates in the image at the top of this article. When skinning, special care needs to be taken, keeping in mind that if full hand tracking is available, the distance between joints will be determined by the actual size of the player's hand and may thus differ from the 3D model. After importing the model into Godot, you can add the required scripts to make everything work.
To place the hand mesh in space, a node needs to be added as a child of the :ref:`ARVROrigin <class_ARVROrigin>` node; this node needs to have the `config/OpenXRPose.gdns` script attached. When importing a 3D file, you can add this script to the root node of the imported model.
The `OpenXRPose` script isn't just used by the hand logic but also exposes other pose locations configured in the action map.
.. image:: img/arvr_openxr_pose.png
The following properties can be set on this node:
* `Invisible If Inactive` enables logic that will automatically make this node invisible if the hand is not being tracked.
* `Action` specifies which action in the action map is being tracked; this needs to be set to the special type `SkeletonBase`.
* `Path` specifies the OpenXR input path; this is `/user/hand/left` for the left hand and `/user/hand/right` for the right hand.
The next step is adding the script `config/OpenXRSkeleton.gdns` to the skeleton node of the 3D model. This script has the same two properties as the `OpenXRHand` script, namely `Hand` and `Motion Range`, and they have the same use.
Note that the bone names are standardised. The list of bone names is presented below; each needs to be suffixed with either `_L` or `_R`, depending on whether the bone is for the left or the right hand:
* Palm
* Wrist
* Thumb_Metacarpal
* Thumb_Proximal
* Thumb_Distal
* Thumb_Tip
* Index_Metacarpal
* Index_Proximal
* Index_Intermediate
* Index_Distal
* Index_Tip
* Middle_Metacarpal
* Middle_Proximal
* Middle_Intermediate
* Middle_Distal
* Middle_Tip
* Ring_Metacarpal
* Ring_Proximal
* Ring_Intermediate
* Ring_Distal
* Ring_Tip
* Little_Metacarpal
* Little_Proximal
* Little_Intermediate
* Little_Distal
* Little_Tip
Finally, and this is standard Godot functionality, a common addition to hand tracking is to track the location of the tip of a finger for physics interaction. This can be accomplished with the :ref:`BoneAttachment <class_BoneAttachment>` node. Simply add this as a child node to the :ref:`Skeleton <class_Skeleton>` node and select the bone you want to track. Now you can add the desired physics object as a child to this node.
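A minimal sketch of that approach, assuming a right hand and the standardised bone names listed above (the ``Area`` and sphere radius are arbitrary example choices):

.. code::

    extends BoneAttachment  # child of the hand's Skeleton node

    func _ready():
        bone_name = "Index_Tip_R"  # _R suffix because this example tracks the right hand

        # Attach a small sphere-shaped area so the finger tip can take part
        # in physics interactions.
        var tip_area = Area.new()
        var shape = CollisionShape.new()
        var sphere = SphereShape.new()
        sphere.radius = 0.01
        shape.shape = sphere
        tip_area.add_child(shape)
        add_child(tip_area)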

Binary file not shown (10 KiB).

Binary file not shown (362 KiB).

Binary file not shown (328 KiB).

Binary file not shown (23 KiB).

Binary file not shown (8.1 KiB).

Binary file not shown (16 KiB).

Binary file not shown (85 KiB).

Binary file not shown (19 KiB).

Binary file not shown (465 KiB).

Binary file not shown (26 KiB).

Binary file not shown (29 KiB).

Binary file not shown (17 KiB).

View File

@ -1,72 +0,0 @@
.. _doc_openxr_introduction:
OpenXR plugin
=============
Welcome to the Godot OpenXR documentation!
Introduction
------------
This is the documentation for the `Godot OpenXR plugin <https://github.com/GodotVR/godot_openxr>`__.
The plugin is supported on Godot 3.4 and later. However, it does **not** support the upcoming Godot 4.0 release.
Getting started
---------------
To start a new project that supports OpenXR, start by opening up the Godot editor and creating a new project.
Copy the plugin into this new project in the subfolder ``addons/godot_openxr/`` using your operating system's file manager.
It is important that the plugin is placed in this **exact** location in your project folder.
Back in Godot, create a new 3D scene and press the **Instance Child Scene** button
(represented by a chain link icon) in the scene tree dock.
Select the ``addons/godot_openxr/scenes/first_person_controller_vr.tscn`` subscene
and add it to your scene.
Right-click the added node and select **Editable Children** to gain access
to some of the nodes in this subscene:
.. image:: img/editable_children.png
This is the bare minimum you need. However, for good measure, we suggest adding
a DirectionalLight node and a few MeshInstance nodes so you have something to see.
If you add those to the hand nodes, you can visualize where your controllers
are tracking.
Your scene should now look something like this:
.. image:: img/default_scene.png
Now you can press the **Run** button in the top-right corner of the editor
to start your project and you should be able to look around.
Next steps
----------
To turn this simple scene into a proper game, the sky is the limit.
Below, there are a few more topics specific to this plugin.
However, the following resources are a good place to continue:
- :ref:`VR starter tutorial <doc_vr_starter_tutorial_part_one>`
in the official documentation may focus on OpenVR, but almost everything
there applies to OpenXR as well.
- `Godot XR tools <https://github.com/GodotVR/godot-xr-tools>`__ is a plugin
that contains a set of handy sub-scenes to quickly
implement locomotion, object interaction and UI elements in your XR experience.
Please check the `Godot Engine community page <https://godotengine.org/community>`__ to find help from other Godot developers.
The ``#xr`` channel on the Godot Discord has become a vibrant Godot XR community.
Plugin features
---------------
.. toctree::
   :maxdepth: 1
   :name: toc-tutorials-vr-openxr

   enable_plugin
   runtime_selection
   passthrough
   deploy_on_quest
   handtracking

View File

@ -1,61 +0,0 @@
.. _doc_passthrough:
Passthrough
===========
.. note::
Only available in versions **1.1.1 and later** of the OpenXR plugin.
Passthrough is a new feature introduced on the Meta Quest and added to the OpenXR spec as a Meta extension.
It is likely this extension will be implemented by other vendors whose hardware supports a passthrough mode
and promoted to core at some point.
Keep in mind that this feature is not guaranteed to be available.
Passthrough allows for the camera input on the headset to be used within the headset so the user can see the real world.
This allows us to implement an AR-like experience in a VR headset.
If you are using the ``first person controller`` scene, you can simply
enable passthrough by checking the **Start Passthrough** option
on the controller node:
.. image:: img/start_passthrough.png
If you'd rather do this through code, you will first need to create an instance
of the ``OpenXRConfig`` object.
You can do this the same way the ``first person controller`` does
and assign the ``OpenXRConfig.gdns`` as the script to a node,
or you can instance it in code as shown below:
.. code::

    var openxr_config = null

    func _ready():
        var config_gdns = load("res://addons/godot_openxr/config/OpenXRConfig.gdns")
        if config_gdns:
            openxr_config = config_gdns.new()

    func start_passthrough():
        if openxr_config:
            return openxr_config.start_passthrough()
        else:
            return false

    func stop_passthrough():
        if openxr_config:
            openxr_config.stop_passthrough()
.. note::
The Viewport's **Transparent Bg** property must be enabled prior to starting passthrough.
The plugin will log a warning message if it detects an incorrect configuration.
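Tying these pieces together, a sketch of an input handler could look like this. The ``toggle_passthrough`` action is an assumption; you would define it yourself in the Input Map:

.. code::

    func _input(event):
        # "toggle_passthrough" is a hypothetical action, not part of the plugin.
        if event.is_action_pressed("toggle_passthrough"):
            get_viewport().transparent_bg = true  # must be on before starting
            if not start_passthrough():
                print("Passthrough is not available on this runtime")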
.. seealso::
A fix for a bug related to turning transparent background on/off
is scheduled for the Godot 3.4.3 release. If you wish to toggle passthrough
in your game, you will need to be on that version or newer.

View File

@ -1,38 +0,0 @@
.. _doc_runtime_selection:
Switching runtimes
==================
In OpenXR, it is standard for each runtime to implement a mechanism to make it
the current runtime. In Steam, the Oculus application or Windows MR portal,
there will be an option to switch to their runtime as the current OpenXR runtime.
Generally speaking, end users will have a preferred runtime and little reason
to switch when playing games that support OpenXR. However, developers may wish to
test against multiple runtimes to see whether their game behaves correctly on each.
To make this easy, Godot provides a dropdown in the top-right corner which can
switch the runtime Godot will use when testing:
.. image:: img/switch_runtime.png
The OpenXR plugin will **not** work with the Microsoft MR runtime.
That runtime only supports OpenXR applications that use DirectX,
but Godot uses OpenGL ES 3.0 or 2.0.
.. note::
Selecting a runtime in this dropdown only applies to running the game
from the editor. It does **not** change the runtime used by other
applications. Exported projects will use the computer's current runtime.
Also, if you are deploying to an external device, this setting has no effect.
As OpenXR doesn't have a mechanism for registering runtimes that we can query,
Godot will check common locations for runtime configuration files.
The locations that are checked are stored in the ``addons/godot_openxr/runtimes.json`` file.
If you've installed a runtime in a nonstandard location or a runtime not currently present in this file, you can add it manually using a text editor.
.. seealso::
If the dropdown isn't shown in your editor, make sure the plugin is enabled.
See :ref:`doc_enable_plugin`.

View File

@ -1,192 +0,0 @@
.. _doc_xr_primer:
AR/VR primer
============
This tutorial gives you a springboard into the world of AR and VR in the Godot game engine.
A new architecture was introduced in Godot 3 called the AR/VR Server. On top of this
architecture, specific implementations are available as interfaces, most of which are plugins
based on GDNative. This tutorial focuses purely on the core elements abstracted by the core
architecture. This architecture has enough features for you to create an entire VR experience
that can then be deployed for various interfaces. However, each platform often has some unique
features that are impossible to abstract. Such features will be documented on the relevant
interfaces and fall outside of the scope of this primer.
AR/VR server
------------
When Godot starts, each available interface will make itself known to the AR/VR server.
GDNative interfaces are set up as singletons; as long as they are added to the list of
GDNative singletons in your project, they will make themselves known to the server.
You can use the function :ref:`get_interfaces() <class_ARVRServer_method_get_interfaces>`
to return a list of available interfaces, but for this tutorial, we're going to use the
:ref:`native mobile VR interface <class_MobileVRInterface>` in our examples. This interface
is a straightforward implementation that uses the 3DOF sensors on your phone for orientation
and outputs a stereoscopic image to the screen. It is also available in the Godot core and
outputs to screen on desktop, which makes it ideal for prototyping or a tutorial such as
this one.
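For example, a quick sketch that prints what is available on your setup (each entry returned by ``get_interfaces()`` is a dictionary describing one interface):

.. code::

    func _ready():
        for interface in ARVRServer.get_interfaces():
            print(interface)  # prints the id and name of each interface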
To enable an interface, you execute the following code:
.. tabs::
 .. code-tab:: gdscript GDScript

    var arvr_interface = ARVRServer.find_interface("Native mobile")
    if arvr_interface and arvr_interface.initialize():
        get_viewport().arvr = true

 .. code-tab:: csharp

    var arvrInterface = ARVRServer.FindInterface("Native mobile");
    if (arvrInterface != null && arvrInterface.Initialize())
    {
        GetViewport().Arvr = true;
    }
This code finds the interface we wish to use, initializes it and, if that is successful, binds
the main viewport to the interface. This last step gives some control over the viewport to the
interface, which automatically enables things like stereoscopic rendering on the viewport.
For our mobile VR interface, and any interface where the main input is directly displayed on
screen, the main viewport needs to be the viewport where :ref:`arvr<class_Viewport_property_arvr>`
is set to ``true``. But for interfaces that render on an externally attached device, you can use
a secondary viewport. In the latter case, a viewport that shows its output on screen will show an
undistorted version of the left eye, while showing the fully processed stereoscopic output on the
device.
Finally, you should only initialize an interface once; switching scenes and reinitializing interfaces
will just introduce a lot of overhead. If you want to turn the headset off temporarily, just disable
the viewport or set :ref:`arvr<class_Viewport_property_arvr>` to ``false`` on the viewport. In most
scenarios, though, you wouldn't disable the headset once you're in VR, as this can be disconcerting to
the gamer.
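For example, a minimal sketch for suspending and resuming HMD output without touching the interface itself:

.. code::

    func pause_vr():
        # Stop driving the headset; the interface stays initialized.
        get_viewport().arvr = false

    func resume_vr():
        get_viewport().arvr = true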
New AR/VR nodes
---------------
Three new node types have been added for supporting AR and VR in Godot and one additional
node type especially for AR. These are:
* :ref:`ARVROrigin <class_ARVROrigin>` - our origin point in the world
* :ref:`ARVRCamera <class_ARVRCamera>` - a special subclass of the camera, which is positionally tracked
* :ref:`ARVRController <class_ARVRController>` - a new spatial class, which tracks the location of a controller
* :ref:`ARVRAnchor <class_ARVRAnchor>` - an anchor point for an AR implementation mapping a real world location into your virtual world
The first two must exist in your scene for AR/VR to work and this tutorial focuses purely
on them.
:ref:`ARVROrigin <class_ARVROrigin>` is an important node: you must have one and only one
of these somewhere in your scene. This node maps the center of your real world tracking
space to a location in your virtual world. Everything else is positionally tracked in
relation to this point. Where this point lies exactly differs from one implementation to
another, but the best example to understand how this node works is to take a look at a room
scale location. While we have functions to adjust the point to center it on the player,
by default the origin point will be the center location of the room you are in. As you
physically walk around the room, the location of the HMD is tracked in relation to this
center position and the tracking is mirrored in the virtual world.
To keep things simple, when you physically move around your room, the ARVR Origin point stays
where it is; the positions of the camera and controllers are adjusted according to your
movements. When you move through the virtual world, either through controller input or when
you implement a teleport system, it is the position of the origin point that you will
have to adjust.
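For example, a minimal teleport sketch; compensating for the player's position inside the tracking space is a common pattern, not an API of the server:

.. code::

    extends ARVROrigin

    func teleport_to(target):
        # The camera's transform is relative to the origin, so subtract the
        # player's in-room offset to land the camera on the target point.
        var offset = $ARVRCamera.transform.origin
        offset.y = 0  # preserve the player's physical height
        translation = target - offset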
:ref:`ARVRCamera <class_ARVRCamera>` is the second node that must always be a part of your
scene and it must always be a child node of your origin node. It is a subclass of Godot's
normal camera. However, its position is automatically updated each frame based on the physical
orientation and position of the HMD. Also due to the precision required for rendering to an
HMD or rendering an AR overlay over a real world camera, most of the standard camera properties
are ignored. The only properties of the camera that are used are the near and far plane
settings. The FOV, aspect ratio and projection mode are all ignored.
Note that, for our native mobile VR implementation, there is no positional tracking; only
the orientation of the phone, and by extension the HMD, is tracked. This implementation
artificially places the camera at a height (Y) of 1.85.
Conclusion: your minimum setup in your scene to make AR or VR work should look like this:
.. image:: img/minimum_setup.png
And that's all you need to get started with the native mobile interface. Obviously, you need
to add something more into your scene, so there is something to see, but after that, you can
export the game to your phone of choice, pop it into a viewer and away you go.
Official plugins and resources
------------------------------
As mentioned earlier, Godot does not support the various VR and AR SDKs out of the box; you
need a plugin for the specific SDK you want to use. There are several official plugins available
in the `GodotVR Repository <https://github.com/GodotVR>`__.
* `Godot OpenXR <https://github.com/GodotVR/godot_openxr>`_: This is the **official XR plugin**
starting with Godot **3.4**. It supports OpenXR, an open standard for designing and building
cross-platform VR and AR software.
Tested with the SteamVR, Monado, and Oculus OpenXR (desktop and mobile) runtimes.
* See :ref:`doc_openxr_introduction`.
* `Godot Oculus Mobile <https://github.com/GodotVR/godot_oculus_mobile>`_ provides :ref:`support for
the Meta Quest <doc_developing_for_oculus_quest>`.
* **Note**: This plugin has been deprecated starting with Godot 3.4.
We recommend migrating to the `Godot OpenXR <https://github.com/GodotVR/godot_openxr>`_ plugin instead.
* `Godot OpenVR <https://github.com/GodotVR/godot_openvr>`_ (not to be confused with OpenXR)
supports the OpenVR SDK used by Steam.
* `Godot Oculus <https://github.com/GodotVR/godot_oculus>`__ supports the Oculus SDK
(desktop headsets only).
* **Note**: This plugin has been deprecated starting with Godot 3.4.
We recommend migrating to the `Godot OpenXR <https://github.com/GodotVR/godot_openxr>`_ plugin instead.
* `Godot OpenHMD <https://github.com/GodotVR/godot_openhmd>`_ supports OpenHMD, an open source
API and drivers for headsets.
These plugins can be downloaded from GitHub or the Godot Asset Library.
In addition to the plugins, there are several official demos.
* `Godot Oculus Demo <https://github.com/GodotVR/godot-oculus-demo>`__.
* `Godot OpenVR FPS <https://github.com/GodotVR/godot_openvr_fps>`__ (the tutorial for this project
is :ref:`doc_vr_starter_tutorial_part_one`).
* `Godot XR tools <https://github.com/GodotVR/godot-xr-tools>`__, which shows implementations for VR
features such as movement and picking up objects.
Other things to consider
------------------------
There are a few other subjects that are important to know and that we need to briefly touch
upon in this primer.
The first is units. In normal 3D games, you don't have to think a lot about units. As long as
everything is at the same scale, a box sized 1 unit by 1 unit by 1 unit can be any size, from a cube
you can hold in your hand to something the size of a building. In AR and VR, this changes because
things in your virtual world are mapped to things in the real world. If you step 1 meter forward in
the real world, but you only move 1 cm forward in your virtual world, you have a problem. The same
with the position of your controllers; if they don't appear in the right relative space, it breaks
the immersion for the player. Most VR platforms, including our AR/VR Server, assume that 1 unit = 1
meter. The AR/VR server, however, has a property that, for convenience, is also exposed on the
ARVROrigin node, called world scale. For instance, setting this to a value of 10 changes our coordinate
system so that 10 units = 1 meter.
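For example, a one-line sketch (the node path is illustrative):

.. code::

    func _ready():
        # With a world scale of 10, 10 units correspond to 1 meter.
        $ARVROrigin.world_scale = 10.0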
Performance is another thing that needs to be carefully considered. VR especially taxes your game
a lot more than most people realize. For mobile VR, you have to be extra careful here, but even for
desktop games, there are three factors that make life extra difficult:
* You are rendering stereoscopically: two images for the price of one. While this doesn't exactly
double the workload, and with things in the pipeline such as support for the new MultiView OpenGL
extension, there still is an extra workload in rendering images for both eyes.
* A normal game will run acceptably at 30fps and ideally manages 60fps. That gives you a big range to
play with between lower end and higher end hardware. For any HMD application of AR or VR, however,
60fps is the absolute minimum and you should target your games to run at a stable 90fps to ensure your
users don't get motion sickness right off the bat.
* The high FOV and related lens distortion effect require many VR experiences to render at double the
resolution. While a Vive may only have a resolution of 1080x1200 per eye, we're rendering each eye at
2160x2400 as a result. This is less of an issue for most AR applications.
All in all, the workload your GPU has in comparison with a normal 3D game is a fair amount
higher. While things are in the pipeline to improve this, such as MultiView and foveated rendering,
these aren't supported on all devices. This is why you see many VR games use a more simplistic art
style, and if you pay close attention to those VR games that go for realism, you'll probably notice they're
a bit more conservative on the effects or use some good old optical trickery.