AR - Enhanced projects#

Summary#

The Enhanced projects repo is meant to provide AR augmentations for targets linked to SPL’s projects.
It can be used on either a HoloLens 2 or an Android phone. Recognition is based on the Vuforia library.

AR Scene#

In red are the objects that are static in the AR world (anchored by the device sensors at the position where the app is first launched). Here they are distance Markers (“TicTacs” marking steps of 1 meter) and a simple cube that rotates continuously. These objects are simple 3D shapes.

In yellow are the objects recognized and enhanced (explained further below).

In green are the managers, dispatching events like speech recognition and scene data.

Note

An object in the hierarchy can be seen as a container with a position and a size, gathering models and scripts. A Marker consists solely of a 3D body (Mesh Renderer), while the PivotingCube has both a 3D model and a script, and the SceneManager contains only a script, without any model.

In the relation tree, child objects are positioned relative to their parents.

../../_images/03_simpleobject.png
../../_images/02_autorotation.png
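As an illustration of such a script, a continuous rotation like the PivotingCube’s can be written in a few lines. This is a minimal sketch, not the repo’s actual script; the class name and speed are hypothetical:

```csharp
using UnityEngine;

// Hypothetical sketch of a PivotingCube-style behaviour:
// rotate the object continuously around its local Y axis.
public class AutoRotate : MonoBehaviour
{
    [SerializeField] private float degreesPerSecond = 45f;

    void Update()
    {
        // Scaling by Time.deltaTime makes the rotation
        // frame-rate independent.
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}
```

Attaching such a script to any object in the hierarchy is enough to make it pivot non-stop, as the cube does.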

Launching#

The project is made for, and intended to be used with, Unity 2020.3.19f1.

When opening the project, go to Assets -> Scenes and double-click Scene.

To enable the Vuforia library, go to Vuforia’s developer website and register for a developer license. Then, in Unity, go to MixedRealityPlayspace -> select ARCamera -> under Inspector, click Open Vuforia Configuration. Enter your license under App License Key. Other settings can be modified here as well.

To get better stabilization on Android, the ARCore library is included and required (it is managed by Unity).

Enhanced targets#

Vuforia appears to look for sharp angles and high contrast in targets; text in particular defines many interest points. The targets were therefore modified to provide enough interest points.

SPL#

The SPL target is the following:

../../_images/spltarget.png ../../_images/spltarget_ip.png

The full augmentation looks like this:

../../_images/04_spl1.png

On detection, a sound is played and the logo animation is launched: the logo rotates and expands, then fixes itself in front of the target:

../../_images/05_spl2.png
../../_images/splanim.gif
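The rotate-and-expand-then-fix sequence above can be sketched with a coroutine. This is a hypothetical illustration, not the repo’s animation; the duration, scale range, and final position are invented:

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical sketch of the logo intro: rotate and expand over a
// short duration, then fix the logo in front of the target.
public class LogoIntro : MonoBehaviour
{
    [SerializeField] private float duration = 2f;
    [SerializeField] private Vector3 finalLocalPos = new Vector3(0f, 0f, -0.2f);

    IEnumerator Start()
    {
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            float k = t / duration;
            // Grow from 10% to full size while spinning one full turn.
            transform.localScale = Vector3.one * Mathf.Lerp(0.1f, 1f, k);
            transform.Rotate(0f, 360f * Time.deltaTime / duration, 0f);
            yield return null;
        }
        transform.localScale = Vector3.one;
        transform.localPosition = finalLocalPos; // fixed in front of the target
    }
}
```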

The text itself is based on a billboard, i.e. it reorients itself to always face the user:

../../_images/06_spl3.png
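The billboard behaviour amounts to recomputing the text’s rotation every frame so it faces the camera. MRTK ships its own Billboard component, which the project may be using; the sketch below only illustrates the idea:

```csharp
using UnityEngine;

// Illustrative billboard sketch (the project may use MRTK's own
// Billboard component instead): face the main camera every frame.
public class SimpleBillboard : MonoBehaviour
{
    void LateUpdate()
    {
        var cam = Camera.main;
        if (cam == null) return;
        // Look away from the camera so the text front faces the user,
        // keeping the up vector vertical so the text stays upright.
        transform.rotation = Quaternion.LookRotation(
            transform.position - cam.transform.position, Vector3.up);
    }
}
```

Running it in LateUpdate ensures the reorientation happens after the camera has moved for the frame.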

DigitAlu#

The DigitAlu target is the following:

../../_images/digalu.png ../../_images/digalu_ip.png

The full augmentation looks like this:

../../_images/10_da1.png

On detection, a sound is played and a “force field” is projected around the part. The effect is produced by a HollowBox to which the Plasma shader, from the Ultimate 10+ Shaders package, is applied:

../../_images/11_da2.png
../../_images/daanim.gif

The text itself is based on a billboard, i.e. it reorients itself to always face the user:

../../_images/12_da3.png

DigitalTwin#

The DigitalTwin target is the following:

../../_images/dt.png ../../_images/dt_ip.png

The full augmentation looks like this:

../../_images/20_dt1.png

On detection, a sound is played, the poster is shown, and a scan animation is launched.

Note

The poster is the transparent version of the PDF. Drawing it is point-based, and thus very slow with a white background. Drawing only the required points and then projecting a white Quad (a plane rotated by 90°) behind them is much faster.

../../_images/21_dt2.png
../../_images/dtanim.gif

The light “laser grid” effect comes from a simple Spot light with a cookie applied.

Note

A cookie is a grayscale mask: the darker a pixel, the less light passes through it.

../../_images/22_dt3.png
../../_images/25_dt_cookie.png

Finally, the light is animated by rotating it and varying the spot intensity following a sine pattern. The cubes are moved by a second animation, which simply modifies their position over time:

../../_images/23_dt4.png
../../_images/24_dt5.png
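The rotating, pulsing spot light can be sketched as follows. This is a hypothetical illustration, not the repo’s animation; the field names and constants are invented:

```csharp
using UnityEngine;

// Hypothetical sketch of the scan light: rotate the spot and
// modulate its intensity with a sine pattern over time.
public class ScanLight : MonoBehaviour
{
    [SerializeField] private Light spot;
    [SerializeField] private float degreesPerSecond = 30f;
    [SerializeField] private float baseIntensity = 1f;
    [SerializeField] private float amplitude = 0.5f;
    [SerializeField] private float frequency = 1f; // pulse cycles per second

    void Update()
    {
        // Sweep the cookie's laser-grid pattern across the scene.
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
        // Pulse the intensity: base +/- amplitude, following a sine wave.
        spot.intensity = baseIntensity
            + amplitude * Mathf.Sin(2f * Mathf.PI * frequency * Time.time);
    }
}
```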

Unique Stability Plates#

The USP target is the following:

../../_images/usp.png ../../_images/usp_ip.png

The full augmentation looks like this:

../../_images/30_usp1.png

On detection, a sound is played, the poster is shown, and two .csv files are read and displayed as a moving graph. A custom USP Data Provider script takes a CSV file and displays more or less data, at a faster or slower pace. The moving tick can be synchronized with another dataset to keep both in sync.

../../_images/31_usp2.png
../../_images/uspanim.gif
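The core idea behind such a data provider is to step through pre-parsed CSV rows at a configurable rate. The sketch below is hypothetical, not the actual USP Data Provider; class, field, and column choices are invented:

```csharp
using UnityEngine;

// Hypothetical sketch of a CSV playback provider: advance a cursor
// through parsed rows at a configurable speed and expose the
// current value for a graph to display.
public class CsvPlayback : MonoBehaviour
{
    [SerializeField] private float rowsPerSecond = 10f; // playback speed
    private float[] values;  // one column parsed from the CSV
    private float cursor;    // fractional row index

    public void Load(string csvText)
    {
        var lines = csvText.Trim().Split('\n');
        values = new float[lines.Length];
        for (int i = 0; i < lines.Length; i++)
            values[i] = float.Parse(lines[i].Split(',')[1]); // 2nd column
    }

    public float CurrentValue => values[(int)cursor % values.Length];

    void Update()
    {
        // Advancing by wall-clock time means two providers with the
        // same rowsPerSecond stay synchronized, as described above.
        cursor += rowsPerSecond * Time.deltaTime;
    }
}
```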

VLSPM#

The VLSPM target is the following:

../../_images/vlspm.png ../../_images/vlspm_ip.png

The full augmentation looks like this:

../../_images/40_vlspm1.png

On detection, a sound is played and a summary is shown, along with a robot animation and a video button. Clicking the button displays a video from VLSPM++, which can be hidden again (either by pressing the button again, or automatically when the video finishes):

../../_images/41_vlspm2.png
../../_images/vlspmanim.gif

The button comes from the MRTK toolkit. On click, it shows or hides the video screen. The screen itself simulates a click on the button when the video finishes:

../../_images/42_vlspm3.png
../../_images/43_vlspm4.png

The video screen automatically plays when the object is activated:

../../_images/44_vlspm5.png
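The toggle-and-auto-hide behaviour can be sketched with Unity’s VideoPlayer, whose loopPointReached event fires when a non-looping clip ends. This is a hypothetical illustration, not the repo’s script:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Hypothetical sketch: an MRTK button toggles the video screen, and
// the screen hides itself again when the clip finishes.
public class VideoToggle : MonoBehaviour
{
    [SerializeField] private GameObject screen;   // holds the VideoPlayer
    [SerializeField] private VideoPlayer player;

    void Awake()
    {
        // When the video reaches its end, simulate a button press,
        // which hides the screen again.
        player.loopPointReached += _ => OnButtonClick();
    }

    public void OnButtonClick()   // wired to the button's OnClick event
    {
        screen.SetActive(!screen.activeSelf);
        if (screen.activeSelf)
            player.Play();        // plays automatically when activated
    }
}
```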

MRTK - The Mixed Reality ToolKit#

MRTK is a toolkit for building AR and VR experiences: it captures the device inputs and renders the scene on the device.
It is primarily made for the HoloLens, but also supports Android through OpenXR.

The configuration is available under the MixedRealityToolkit object.

../../_images/50_mrtk.png

To modify a setting, first Clone the corresponding profile.

It manages:

  • The Experience scaling (whether the device will be used room-scale, seated, …)

  • The Camera, to manage hologram reprojection

  • The Input from the user (hand gestures, eye tracking, controllers, speech)

  • The Boundary, so that the real world is mapped and interacts with the Unity scene (collisions)

  • The Teleport system, to move through the scene (mostly for VR, to avoid disorientation and motion sickness)

  • The Spatial Awareness, which shows the world boundaries as a mesh

  • Some Diagnostics, to help find slow-downs while developing

  • The Scene System, to switch between multiple scenes

Vuforia#

The Vuforia library detects targets (images, 3D models, QR-code-like markers) in AR experiences in an easy way.
A license is required to deploy the app, but a developer license can be used.

The configuration is reached via MixedRealityPlayspace -> ARCamera -> Open Vuforia Configuration.

../../_images/51_vuforia.png

One can select:

  • The trade-off between quality and speed

  • How many targets are tracked simultaneously

  • The objects database

  • Whether Android ARCore is used (better AR tracking)

  • The camera used for tests in the player