MRTK vNext RC1 – Issue – Empty Scene After Deployment

Problem

I was experiencing the same issue described below:
Slack – #mixed-reality-toolkit – Emiliano Ciavatta [7:38 AM] (not me)
https://holodevelopers.slack.com/archives/C2H4HT858/p1555943891174100
[…snip…]
I have a problem with MRTK. When I deploy an app that uses MRTKv2 on Hololens v1 or emulator v1 the app does not start (black screen, the Unity logo does not even appear).
[…snip…]

Solution

The solution was found by another user on the Slack channel and it worked for me, so I wanted to pass it along in case someone else is having the same issue.

The solution was to uncheck the WSA Holographic Remoting Supported setting under Player Settings/UWP/XR Settings.
Figure 01 – Uncheck this setting

MRTK vNext RC1 Tutorial 04 – Grid Object Collections

Overview

In this tutorial, we will be creating an object collection. We will not be writing any custom scripts (we will only use built-in MRTK scripts). We will create a new material, a new prefab for the objects arranged by the object collection script, and an object hierarchy to contain the object collection.

Figure 01 – Tutorial in action.

Prerequisites

The prerequisites can be completed in about 10 minutes, and only need to be done the first time you set up the MRTK project for this tutorial series. It is important that you are using the MRTK vNext RC1 release. If you use an earlier version of the MRTK you will have issues.

Tutorial 04 Begins

Create a new Scene

  • Duplicate Base_Demo_Scene from the prerequisite steps and rename the duplicate scene as Tutorial_04.

Create New Materials

  • Add a new Material named DefaultFocusableMaterial and add it to the Assets/App/Content/Materials folder
    • Shader: Mixed Reality Toolkit/Standard
    • Albedo: set color to a Charcoalish color.
  • Add a new Material named CollectionControlPointMaterial and add it to the Assets/App/Content/Materials folder
    • Shader: Mixed Reality Toolkit/Standard
    • Albedo: set color to a Dark Blueish color.

Object Collections

The Grid Object Collection and Scatter Object Collection (scripts) allow us to easily lay out many child game objects at once.

Create FocusableCube Prefab

  • Add a 3D Cube game object named FocusableCube as a child of SceneContent (our root scene object).
    • Position: 0, 0, 0
    • Scale: 1, 1, 1
    • Material: DefaultFocusableMaterial
    • NOTE: Do not add a Rigidbody (we do not want gravity to affect our object collection)
    • Add Component: Interactable Highlight (script)
      • Focus: Enabled
      • Highlight: Disabled (you can change this later to see the effect)
      • Highlight Color: Assign a Green color
      • Outline Color: Assign a pastel greenish color
      • Highlight Material: MRTK_Standard_TransparentLime
      • Overlay Material: MRTK_Standard_TransparentYellow
      • Target Style: Both
Save FocusableCube as a prefab by dragging it into the Assets/App/Content/Prefabs folder and delete it from the scene.
Save the project.

Create the Object Collection Hierarchy

  • Add a 3D Sphere game object named SphericalCollectionRoot as a child of SceneContent (our root scene object).
    • Position 0, 1, 1.3
    • Scale: 0.05, 0.05, 0.05
    • Material: CollectionControlPointMaterial
    • Add Component: Interactable Highlight (script)
      • Focus: Enabled
      • Highlight Color: Assign a Goldish color
      • Outline Color: Assign an Orangish color
      • Highlight Material: MRTK_Standard_Orange
      • Overlay Material: MRTK_Standard_TransparentOrange
      • Target Style: Both
    • Add Component: Manipulation Handler (script)
      • Manipulation Type: One And Two Handed
      • Two Handed Manipulation Type: Move Rotate
      • Allow Far Manipulation: Checked
      • One Hand Rotation Mode Near: Rotate About Grab Point
      • One Hand Rotation Mode Far: Rotate About Object Center
Save the project.
  • Add an empty game object named SphericalCollectionFocus as a child of SphericalCollectionRoot
    • Position 0, 4, 0
    • Scale: 1, 1, 1
    • Add Component: Grid Object Collection (script)
      • Sort Type: Child Order
      • Surface Type: Sphere
      • Orient Type: Face Origin
      • Layout: Column Then Row
      • Radius: 8
      • Radial Range: 90
      • Rows: 4
      • Cell Width: 1.25
      • Cell Height: 1.25
Save the project.
  • Drag your FocusableCube prefab on top of the SphericalCollectionFocus game object in the scene to make it a child of SphericalCollectionFocus.
  • Select the FocusableCube prefab in the scene and duplicate the game object 27 times.
  • Select the SphericalCollectionFocus game object in the scene and, in the Inspector, click the Update Collection button. This will rearrange the child objects into the spherical layout we selected.

Save the project.
Run the project in the Unity Editor.
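
If you prefer to populate the collection from code instead of duplicating the cube by hand, the sketch below instantiates the prefab 28 times under SphericalCollectionFocus and then refreshes the layout. It assumes the FocusableCube prefab is assigned in the inspector and that GridObjectCollection lives in the Microsoft.MixedReality.Toolkit.Utilities namespace; adjust the namespace if your MRTK version places it elsewhere.

using Microsoft.MixedReality.Toolkit.Utilities; // assumed namespace for GridObjectCollection in RC1
using UnityEngine;

public class SphericalCollectionBuilder : MonoBehaviour
{
    [SerializeField]
    [Tooltip("The FocusableCube prefab created earlier in this tutorial.")]
    private GameObject focusableCubePrefab = null;

    [SerializeField]
    [Tooltip("How many cubes to place in the collection.")]
    private int cubeCount = 28;

    private void Start()
    {
        // The Grid Object Collection configured in the inspector (Sphere surface, 4 rows, etc.).
        var collection = GetComponent<GridObjectCollection>();

        // Instantiate the cubes as children of this object (SphericalCollectionFocus).
        for (int i = 0; i < cubeCount; i++)
        {
            Instantiate(focusableCubePrefab, transform);
        }

        // Equivalent to pressing the Update Collection button in the inspector.
        collection.UpdateCollection();
    }
}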

Review of Tasks Completed:

  • We created two new materials.
  • We created a new prefab.
  • We created an object hierarchy that contains the objects arranged by the Grid Object Collection script.
  • Using the Interactable Highlight (script) in multiple locations in our object collection hierarchy allowed us to display different highlights and outlines based on which level of the hierarchy is selected.

Video of Final Result

MRTK vNext RC1 Tutorial 03 – Left and Right WMR Controller Context Menus

Overview

We will be creating menus that can be toggled on and off for each WMR controller. The menus are specific to each controller (the left and right menus do not share the same menu instance) and are fairly unobtrusive. To bring a menu into a good position to interact with, you rotate your controller as if you are looking at a wrist watch. The menu attached to the right controller is meant to be interacted with by the controller in the opposite hand: the right controller interacts with the left controller menu, and vice versa. When your controllers are in a more natural position, pointing forward for instance, the menus are rotated out of the way so they do not obstruct your view of the scene.
Figure 01 – Tutorial 03 in action

Prerequisites

The prerequisites can be completed in about 10 minutes, and only need to be done the first time you set up the MRTK project for this tutorial series. It is important that you are using the MRTK vNext RC1 release. If you use an earlier version of the MRTK you will have issues.

Tutorial 03 Begins

Create a new Scene

  • Duplicate the Base_Demo_Scene you created during the prerequisite steps and rename the duplicate scene as Tutorial_03.

Create a New Material

  • Add a new Material named DefaultGrabbableMaterial and add it to the Assets/App/Content/Materials folder
    • Shader: Mixed Reality Toolkit/Standard
    • Albedo: set color to a pastel bluish color.

Menus

It is important to note that if you want different menus and menu layouts for the left and right controllers, you must not share the internal menu objects as nested prefabs. Build each menu separately and add each holographic toggle button separately before you turn the menus into prefabs. If you follow the instructions exactly as shown, each menu has its own state and the button actions can all be wired to different commands.

Create the Left Controller Menu Prefab

  • Add an empty game object named LeftMenuRoot as a child of SceneContent (our root scene object)
    • Position 0, 0, 0
    • Add Component: Radial View (this automatically adds the Solver Handler script for you)
      • Reference Direction: Object Oriented
      • Min Distance: 0.1
      • Max Distance: 0.1
      • Max View Degrees: 0
    • Modify Solver Handler properties
      • Tracked Object To Reference: Motion Controller Left
      • Additional Offset: -0.2, -0.5, 0.5
      • Additional Rotation: 45, 0, 90
  • Add 3D Cube as a child of LeftMenuRoot and rename it as MenuBackPanel
    • Position 0, 0, 0
    • Scale: 0.3, 0.3, 0.01
    • Set Material to: MRTK_Standard_TransparentPink (so it stands out)
  • Add a HolographicButtonToggle prefab as a child of MenuBackPanel and rename it as HolographicButtonToggle_01
    • Position: -0.25, 0.25, -0.5
    • Scale: 3, 3, 3
  • Add a HolographicButtonToggle prefab as a child of MenuBackPanel and rename it as HolographicButtonToggle_02
    • Position: 0.25, 0.25, -0.5
    • Scale: 3, 3, 3
  • Add a HolographicButtonToggle prefab as a child of MenuBackPanel and rename it as HolographicButtonToggle_03
    • Position: -0.25, -0.25, -0.5
    • Scale: 3, 3, 3
  • Add a HolographicButtonToggle prefab as a child of MenuBackPanel and rename it as HolographicButtonToggle_04 
    • Position: 0.25, -0.25, -0.5
    • Scale: 3, 3, 3
Save LeftMenuRoot as a prefab by dragging it into the Assets/App/Content/Prefabs folder and delete it from the scene.
Save your scene.
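
If you would rather configure the menu attachment from code, here is a minimal sketch of the same Radial View / Solver Handler setup, attached to the menu root. The property and enum names follow the RC1 inspector labels (TrackedObjectToReference, MotionControllerLeft) and the Microsoft.MixedReality.Toolkit.Utilities.Solvers namespace; these were renamed in later MRTK releases, so treat this as a guide rather than a drop-in script.

using Microsoft.MixedReality.Toolkit.Utilities;          // TrackedObjectType (assumed RC1 location)
using Microsoft.MixedReality.Toolkit.Utilities.Solvers;  // RadialView, SolverHandler (assumed RC1 location)
using UnityEngine;

public class LeftMenuAttachment : MonoBehaviour
{
    private void Awake()
    {
        // Adding the Radial View solver also adds the required Solver Handler component.
        var radialView = gameObject.AddComponent<RadialView>();
        radialView.MinDistance = 0.1f;
        radialView.MaxDistance = 0.1f;
        radialView.MaxViewDegrees = 0f;
        // Set Reference Direction to Object Oriented in the inspector
        // (the enum name is omitted here to avoid version differences).

        var solverHandler = GetComponent<SolverHandler>();
        solverHandler.TrackedObjectToReference = TrackedObjectType.MotionControllerLeft;
        solverHandler.AdditionalOffset = new Vector3(-0.2f, -0.5f, 0.5f);
        solverHandler.AdditionalRotation = new Vector3(45f, 0f, 90f);
    }
}

The right controller menu uses the same setup, swapping in MotionControllerRight, an Additional Offset of 0.2, -0.5, 0.5, and an Additional Rotation of 45, 0, 270.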

Create the Right Controller Menu Prefab

  • Add an empty game object named RightMenuRoot as a child of SceneContent (our root scene object)
    • Position 0, 0, 0
    • Add Component: Radial View (the Solver Handler script is automatically added for you)
      • Reference Direction: Object Oriented
      • Min Distance: 0.1
      • Max Distance: 0.1
      • Max View Degrees: 0
    • Modify Solver Handler properties
      • Tracked Object To Reference: Motion Controller Right
      • Additional Offset: 0.2, -0.5, 0.5
      • Additional Rotation: 45, 0, 270
  • Add 3D Cube as a child of RightMenuRoot and rename it as MenuBackPanel
    • Position 0, 0, 0
    • Scale: 0.3, 0.3, 0.01
    • Set Material to: MRTK_Standard_TransparentLime (so it stands out)
  • Add a HolographicButtonToggle prefab as a child of MenuBackPanel and rename it as HolographicButtonToggle_01
    • Position: -0.25, 0.25, -0.5
    • Scale: 3, 3, 3
  • Add a HolographicButtonToggle prefab as a child of MenuBackPanel and rename it as HolographicButtonToggle_02
    • Position: 0.25, 0.25, -0.5
    • Scale: 3, 3, 3
  • Add a HolographicButtonToggle prefab as a child of MenuBackPanel and rename it as HolographicButtonToggle_03
    • Position: -0.25, -0.25, -0.5
    • Scale: 3, 3, 3
  • Add a HolographicButtonToggle prefab as a child of MenuBackPanel and rename it as HolographicButtonToggle_04
    • Position: 0.25, -0.25, -0.5
    • Scale: 3, 3, 3
Save RightMenuRoot as a prefab by dragging it into the Assets/App/Content/Prefabs folder and delete it from the scene.
Save your scene.

Menu Swap Volumes (Menu Launchers)

The swap volumes are what toggle the controller menus on and off. Any menu can be launched by either controller, but each menu is specific to either the left or the right controller, and which menu is toggled on/off depends on which swap volume you select. The swap volumes appear as blue spheres in the scene, with text labels floating above them describing what action will occur (Spawn Left Menu | Spawn Right Menu).
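
The Swap Volume script ships with the MRTK examples package, so you only need to configure it in the inspector. Purely for reference, below is a hypothetical, stripped-down alternative that toggles an existing menu root on select instead of spawning a prefab; it reuses the IMixedRealityPointerHandler interface that also appears in the DragAndDropHandler listing later in this post. The field names are illustrative and are not part of the MRTK.

using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Hypothetical minimal stand-in for the examples-package Swap Volume:
// attach to a sphere (which already has a collider) and assign the menu root to toggle.
public class MenuToggleVolume : MonoBehaviour, IMixedRealityPointerHandler
{
    [SerializeField]
    [Tooltip("The menu root (e.g. LeftMenuRoot) to toggle on and off.")]
    private GameObject menuRoot = null;

    void IMixedRealityPointerHandler.OnPointerDown(MixedRealityPointerEventData eventData)
    {
        // Toggle the menu each time this volume is selected.
        menuRoot.SetActive(!menuRoot.activeSelf);
        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.
    }

    void IMixedRealityPointerHandler.OnPointerUp(MixedRealityPointerEventData eventData) { }

    void IMixedRealityPointerHandler.OnPointerClicked(MixedRealityPointerEventData eventData) { }
}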

Create Left Menu Swap Volume (Instantiates Left Controller Menu)

  • Add a 3D Sphere object named LeftMenuSwapVolume to the scene as a child of the SceneContent game object.
    • Position -2, 1, 6
    • Scale: 0.5, 0.5, 0.5
    • Add Component: Swap Volume (from the MRTK examples unity package)
      • Select Action: Select
      • Hide This Object: LeftMenuRoot 
      • Spawn This Prefab: LeftMenuRoot
    • Add a child TextMeshPro object (it may ask to import the TextMeshPro package – import it)
      • Rect Transform – Position: 0, 1, 0
      • Rect Transform – Width: 40  Height: 5
      • Rect Transform – Scale: 0.2, 0.2, 0.2
      • Text: Spawn Left Menu – Toggles ON/OFF
      • Font Style: Bold
      • Horizontal Alignment: Center
Save your scene.

Create Right Menu Swap Volume (Instantiates Right Controller Menu)

  • Add a 3D Sphere object named RightMenuSwapVolume to the scene as a child of the SceneContent game object.
    • Position 2, 1, 6
    • Scale: 0.5, 0.5, 0.5
    • Add Component: Swap Volume (from the MRTK examples unity package)
      • Select Action: Select
      • Hide This Object: RightMenuRoot
      • Spawn This Prefab: RightMenuRoot
    • Add a child TextMeshPro object (it may ask to import the TextMeshPro package – import it)
      • Rect Transform – Position: 0, 1, 0
      • Rect Transform – Width: 40  Height: 5
      • Rect Transform – Scale: 0.2, 0.2, 0.2
      • Text: Spawn Right Menu – Toggles ON/OFF
      • Font Style: Bold
      • Horizontal Alignment: Center

Save your scene.

Review of Tasks Completed

  • We created a new material.
  • We created a menu specific to the left WMR controller.
  • We created a menu specific to the right WMR controller.
  • We created a menu swap volume (menu launcher) to toggle the left controller menu on/off. It can be toggled on/off by either controller.
  • We created a menu swap volume (menu launcher) to toggle the right controller menu on/off. It can be toggled on/off by either controller.

Video of Final Result

Final Thoughts

This pattern will probably be used in quite a few applications. There are drawbacks, though: the menus have colliders and could interfere with scenes that have a large number of interactables (the menus could knock things out of place, etc.). This use case, ‘Left and Right WMR Controller Context Menus’, could be greatly improved by creating a gesture for both the HoloLens 2 and the WMR controllers. I would probably name it the “Look At Watch Gesture” or something similar. If such a gesture were created, it could trigger the toggle on/off based on the rotation angle (or angle range) and the duration of the controller/wrist pose. As far as I know, gestures have only been created for the HoloLens/HoloLens 2. This would be a great first gesture for the WMR controllers.
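
To make the idea concrete, here is a hypothetical sketch of such a toggle: a component placed on an object that already follows the controller (for example a menu root driven by the Solver Handler) that measures how directly that object faces the user's head and, after a short dwell time, toggles a menu panel. The angle threshold and dwell time are illustrative values, not tested defaults.

using Microsoft.MixedReality.Toolkit.Utilities; // CameraCache, also used in the DragAndDropHandler listing later in this post
using UnityEngine;

// Hypothetical "look at watch" toggle.
public class LookAtWatchToggle : MonoBehaviour
{
    [SerializeField]
    [Tooltip("The menu panel to show or hide (e.g. MenuBackPanel).")]
    private GameObject menuPanel = null;

    [SerializeField]
    [Tooltip("How directly this object must face the head before the pose counts, in degrees.")]
    private float angleThreshold = 30f;

    [SerializeField]
    [Tooltip("How long the pose must be held before toggling, in seconds.")]
    private float dwellSeconds = 0.5f;

    private float heldTime;
    private bool toggledThisPose;

    private void Update()
    {
        // Angle between this object's forward vector and the direction to the user's head.
        Vector3 toHead = CameraCache.Main.transform.position - transform.position;
        float angle = Vector3.Angle(transform.forward, toHead);

        if (angle < angleThreshold)
        {
            heldTime += Time.deltaTime;
            if (!toggledThisPose && heldTime >= dwellSeconds)
            {
                menuPanel.SetActive(!menuPanel.activeSelf);
                toggledThisPose = true; // toggle only once per "look at watch" pose
            }
        }
        else
        {
            heldTime = 0f;
            toggledThisPose = false;
        }
    }
}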

MRTK vNext RC1 Tutorial 02 – Create An Interactive Cube with Highlighting

Prerequisites

Before we begin with this tutorial, follow all of the prerequisite steps outlined in this blog post: MRTK RC1 v2.0.0 Tutorial – Common Steps – Table of Contents

Even though this tutorial is similar to the first tutorial published on this blog, I suggest you start from scratch and follow the prerequisites faithfully. If you do this now, future tutorials will be easier to add onto the base project we are creating.

The Goal of this Tutorial

We will be creating a new scene, and within that scene we will create a cube that the user can interact with; when the user focuses on the cube, it is highlighted. We will then make the cube a prefab so we can reuse this interactable item in future tutorials.

Create the Scene

If you have previously followed the prerequisite steps above, you can simply create the new scene by opening Base_Demo_Scene and using “Save As” to give it the name Tutorial_02. Ensure you save this scene in the Assets/App/Content/Scenes folder.

Creating Our Assets

  • Create a Material named DefaultGrabbableMaterial and add it to the Assets/App/Content/Materials folder
    • Set the shader to Mixed Reality Toolkit/Standard
    • Set the Albedo color to a pastel reddish ochre.
Figure 01 – DefaultGrabbableMaterial Properties
  • Add a 3D Cube named InteractableCube to the scene as a child of the SceneContent game object.
    • Position: 0, -0.5, 4
    • Scale: 1,1,1
    • Add Component: Rigidbody (because we want gravity to affect it).
    • Add Component: Interactable Highlight script (this is an MRTK asset). Set the inspector properties as shown in figure 02.
    • Add the DefaultGrabbableMaterial to the InteractableCube game object.
    • Add a C# script named DragAndDropHandler to the Assets/App/Content/Scripts folder and paste in the implementation shown in code listing 01 at the end of this post. Set the inspector properties as shown in figure 03.
  •  Save your project

Figure 02
Figure 03

At this point, if you have successfully followed this tutorial, clicking Play in the Unity Editor should show something similar to the YouTube video shown here: MRTK vNext RC1 Tutorial 02 – Video 01

Final Steps

  • Drag the InteractableCube game object into the Assets/App/Content/Prefabs folder to create a prefab, and keep the instance in the scene.
  • Save your project.

Review of Tasks Completed:

  • Performed prerequisite steps to prepare our project for this tutorial.
  • Created a material utilizing the Mixed Reality Toolkit/Standard shader.
  • Created a new MonoBehaviour.
  • Created an interactable cube with highlighting, utilizing both our new MonoBehaviour and MRTK-provided components.
  • Tested our application to confirm expected behavior.
  • Created a prefab out of the interactable cube.

Code Listing 01 – Start – DragAndDropHandler code 

using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Physics;
using Microsoft.MixedReality.Toolkit.Utilities;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

/// <summary>
/// Component that allows dragging a <see cref="GameObject"/>.
/// Dragging is done by calculating the angular delta and z-delta between the current and previous hand positions,
/// and then repositioning the object based on that.
/// </summary>
public class DragAndDropHandler : BaseFocusHandler,
    IMixedRealityInputHandler<MixedRealityPose>,
    IMixedRealityPointerHandler,
    IMixedRealitySourceStateHandler
{
    private enum RotationModeEnum
    {
        Default,
        LockObjectRotation,
        OrientTowardUser,
        OrientTowardUserAndKeepUpright
    }

    [SerializeField]
    [Tooltip("The action that will start/stop the dragging.")]
    private MixedRealityInputAction dragAction = MixedRealityInputAction.None;

    [SerializeField]
    [Tooltip("The action that will provide the drag position.")]
    private MixedRealityInputAction dragPositionAction = MixedRealityInputAction.None;

    [SerializeField]
    [Tooltip("Transform that will be dragged. Defaults to the object of the component.")]
    private Transform hostTransform;

    [SerializeField]
    [Tooltip("Scale by which hand movement in Z is multiplied to move the dragged object.")]
    private float distanceScale = 2f;

    [SerializeField]
    [Tooltip("How should the GameObject be rotated while being dragged?")]
    private RotationModeEnum rotationMode = RotationModeEnum.Default;

    [SerializeField]
    [Range(0.01f, 1.0f)]
    [Tooltip("Controls the speed at which the object will interpolate toward the desired position")]
    private float positionLerpSpeed = 0.2f;

    [SerializeField]
    [Range(0.01f, 1.0f)]
    [Tooltip("Controls the speed at which the object will interpolate toward the desired rotation")]
    private float rotationLerpSpeed = 0.2f;

    /// <summary>
    /// Gets the pivot position for the hand, which is approximated to the base of the neck.
    /// </summary>
    /// <returns>Pivot position for the hand.</returns>
    private Vector3 HandPivotPosition => CameraCache.Main.transform.position + new Vector3(0, -0.2f, 0) - CameraCache.Main.transform.forward * 0.2f; // a bit lower and behind

    private bool isDragging;
    private bool isDraggingEnabled = true;
    private bool isDraggingWithSourcePose;

    // Used for moving with a pointer ray
    private float stickLength;
    private Vector3 previousPointerPositionHeadSpace;

    // Used for moving with a source position
    private float handRefDistance = -1;
    private float objectReferenceDistance;
    private Vector3 objectReferenceDirection;
    private Quaternion gazeAngularOffset;

    private Vector3 objectReferenceUp;
    private Vector3 objectReferenceForward;
    private Vector3 objectReferenceGrabPoint;

    private Vector3 draggingPosition;
    private Quaternion draggingRotation;

    private Rigidbody hostRigidbody;
    private bool hostRigidbodyWasKinematic;

    private IMixedRealityPointer currentPointer;
    private IMixedRealityInputSource currentInputSource;

    // If the dot product between hand movement and head forward is less than this amount,
    // don't exponentially increase the length of the stick
    private readonly float zPushTolerance = 0.1f;

    #region MonoBehaviour Implementation

    private void Start()
    {
        if (hostTransform == null)
        {
            hostTransform = transform;
        }

        hostRigidbody = hostTransform.GetComponent<Rigidbody>();
    }

    private void OnDestroy()
    {
        if (isDragging)
        {
            StopDragging();
        }
    }

    #endregion MonoBehaviour Implementation

    #region IMixedRealityPointerHandler Implementation

    void IMixedRealityPointerHandler.OnPointerUp(MixedRealityPointerEventData eventData)
    {
        if (!isDraggingEnabled || !isDragging || eventData.MixedRealityInputAction != dragAction || eventData.SourceId != currentInputSource?.SourceId)
        {
            // If we're not handling drag input or we're not releasing the right action, don't try to end a drag operation.
            return;
        }

        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.

        StopDragging();
    }

    void IMixedRealityPointerHandler.OnPointerDown(MixedRealityPointerEventData eventData)
    {
        if (!isDraggingEnabled || isDragging || eventData.MixedRealityInputAction != dragAction)
        {
            // If we're already handling drag input or we're not grabbing, don't start a new drag operation.
            return;
        }

        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.

        currentInputSource = eventData.InputSource;
        currentPointer = eventData.Pointer;

        FocusDetails focusDetails;
        Vector3 initialDraggingPosition = MixedRealityToolkit.InputSystem.FocusProvider.TryGetFocusDetails(currentPointer, out focusDetails)
                ? focusDetails.Point
                : hostTransform.position;

        isDraggingWithSourcePose = currentPointer == MixedRealityToolkit.InputSystem.GazeProvider.GazePointer;

        StartDragging(initialDraggingPosition);
    }

    void IMixedRealityPointerHandler.OnPointerClicked(MixedRealityPointerEventData eventData) { }

    #endregion IMixedRealityPointerHandler Implementation

    #region IMixedRealitySourceStateHandler Implementation

    void IMixedRealitySourceStateHandler.OnSourceDetected(SourceStateEventData eventData) { }

    void IMixedRealitySourceStateHandler.OnSourceLost(SourceStateEventData eventData)
    {
        if (eventData.SourceId == currentInputSource?.SourceId)
        {
            StopDragging();
        }
    }

    #endregion IMixedRealitySourceStateHandler Implementation

    #region BaseFocusHandler Overrides

    /// <inheritdoc />
    public override void OnFocusExit(FocusEventData eventData)
    {
        if (isDragging)
        {
            StopDragging();
        }
    }

    #endregion BaseFocusHandler Overrides

    /// <summary>
    /// Enables or disables dragging.
    /// </summary>
    /// <param name="isEnabled">Indicates whether dragging should be enabled or disabled.</param>
    public void SetDragging(bool isEnabled)
    {
        if (isDraggingEnabled == isEnabled)
        {
            return;
        }

        isDraggingEnabled = isEnabled;

        if (isDragging)
        {
            StopDragging();
        }
    }

    /// <summary>
    /// Starts dragging the object.
    /// </summary>
    private void StartDragging(Vector3 initialDraggingPosition)
    {
        if (!isDraggingEnabled || isDragging)
        {
            return;
        }

        Transform cameraTransform = CameraCache.Main.transform;

        currentPointer.IsFocusLocked = true;
        isDragging = true;

        if (hostRigidbody != null)
        {
            hostRigidbodyWasKinematic = hostRigidbody.isKinematic;
            hostRigidbody.isKinematic = true;
        }

        if (isDraggingWithSourcePose)
        {
            Vector3 pivotPosition = HandPivotPosition;
            objectReferenceDistance = Vector3.Magnitude(initialDraggingPosition - pivotPosition);
            objectReferenceDirection = cameraTransform.InverseTransformDirection(Vector3.Normalize(initialDraggingPosition - pivotPosition));
        }
        else
        {
            Vector3 inputPosition = currentPointer.Position;
            //currentPointer.TryGetPointerPosition(out inputPosition);

            previousPointerPositionHeadSpace = cameraTransform.InverseTransformPoint(inputPosition);
            stickLength = Vector3.Distance(initialDraggingPosition, inputPosition);
        }

        // Store where the object was grabbed from
        objectReferenceGrabPoint = cameraTransform.transform.InverseTransformDirection(hostTransform.position - initialDraggingPosition);

        // in camera space
        objectReferenceForward = cameraTransform.InverseTransformDirection(hostTransform.forward);
        objectReferenceUp = cameraTransform.InverseTransformDirection(hostTransform.up);

        draggingPosition = initialDraggingPosition;
    }

    /// <summary>
    /// Stops dragging the object.
    /// </summary>
    private void StopDragging()
    {
        if (!isDragging)
        {
            return;
        }

        currentPointer.IsFocusLocked = false;
        isDragging = false;
        handRefDistance = -1;

        if (hostRigidbody != null)
        {
            hostRigidbody.isKinematic = hostRigidbodyWasKinematic;
        }
    }

    #region IMixedRealityInputHandler<MixedRealityPose> Implementation

    void IMixedRealityInputHandler<MixedRealityPose>.OnInputChanged(InputEventData<MixedRealityPose> eventData)
    {
        if (eventData.MixedRealityInputAction != dragPositionAction || !isDraggingEnabled || !isDragging || eventData.SourceId != currentInputSource?.SourceId)
        {
            return;
        }

        Transform cameraTransform = CameraCache.Main.transform;
        Vector3 pivotPosition = Vector3.zero;

        if (isDraggingWithSourcePose)
        {
            Vector3 inputPosition = eventData.InputData.Position;
            pivotPosition = HandPivotPosition;
            Vector3 newHandDirection = Vector3.Normalize(inputPosition - pivotPosition);

            if (handRefDistance < 0)
            {
                handRefDistance = Vector3.Magnitude(inputPosition - pivotPosition);

                Vector3 handDirection = cameraTransform.InverseTransformDirection(Vector3.Normalize(inputPosition - pivotPosition));

                // Store the initial offset between the hand and the object, so that we can consider it when dragging
                gazeAngularOffset = Quaternion.FromToRotation(handDirection, objectReferenceDirection);
            }

            // in camera space
            newHandDirection = cameraTransform.InverseTransformDirection(newHandDirection);
            Vector3 targetDirection = Vector3.Normalize(gazeAngularOffset * newHandDirection);
            // back to world space
            targetDirection = cameraTransform.TransformDirection(targetDirection);

            float currentHandDistance = Vector3.Magnitude(inputPosition - pivotPosition);
            float distanceRatio = currentHandDistance / handRefDistance;
            float distanceOffset = distanceRatio > 0 ? (distanceRatio - 1f) * distanceScale : 0;
            float targetDistance = objectReferenceDistance + distanceOffset;

            draggingPosition = pivotPosition + (targetDirection * targetDistance);
        }
        else
        {
            pivotPosition = cameraTransform.position;

            Vector3 pointerPosition = currentPointer.Position;
            //currentPointer.TryGetPointerPosition(out pointerPosition);

            Ray pointingRay = currentPointer.Rays[0];
            //currentPointer.TryGetPointingRay(out pointingRay);

            Vector3 currentPosition = pointerPosition;
            Vector3 currentPositionHeadSpace = cameraTransform.InverseTransformPoint(currentPosition);
            Vector3 positionDeltaHeadSpace = currentPositionHeadSpace - previousPointerPositionHeadSpace;

            float pushDistance = Vector3.Dot(positionDeltaHeadSpace,
                cameraTransform.InverseTransformDirection(pointingRay.direction.normalized));
            if (Mathf.Abs(Vector3.Dot(positionDeltaHeadSpace.normalized, Vector3.forward)) > zPushTolerance)
            {
                stickLength = DistanceRamp(stickLength, pushDistance);
            }

            draggingPosition = pointingRay.GetPoint(stickLength);

            previousPointerPositionHeadSpace = currentPositionHeadSpace;
        }

        switch (rotationMode)
        {
            case RotationModeEnum.OrientTowardUser:
            case RotationModeEnum.OrientTowardUserAndKeepUpright:
                draggingRotation = Quaternion.LookRotation(hostTransform.position - pivotPosition);
                break;
            case RotationModeEnum.LockObjectRotation:
                draggingRotation = hostTransform.rotation;
                break;
            default:
                // in world space
                Vector3 objForward = cameraTransform.TransformDirection(objectReferenceForward);
                // in world space
                Vector3 objUp = cameraTransform.TransformDirection(objectReferenceUp);
                draggingRotation = Quaternion.LookRotation(objForward, objUp);
                break;
        }

        Vector3 newPosition = Vector3.Lerp(hostTransform.position, draggingPosition + cameraTransform.TransformDirection(objectReferenceGrabPoint), positionLerpSpeed);

        // Apply Final Position
        if (hostRigidbody == null)
        {
            hostTransform.position = newPosition;
        }
        else
        {
            hostRigidbody.MovePosition(newPosition);
        }

        // Apply Final Rotation
        Quaternion newRotation = Quaternion.Lerp(hostTransform.rotation, draggingRotation, rotationLerpSpeed);
        if (hostRigidbody == null)
        {
            hostTransform.rotation = newRotation;
        }
        else
        {
            hostRigidbody.MoveRotation(newRotation);
        }

        if (rotationMode == RotationModeEnum.OrientTowardUserAndKeepUpright)
        {
            Quaternion upRotation = Quaternion.FromToRotation(hostTransform.up, Vector3.up);
            hostTransform.rotation = upRotation * hostTransform.rotation;
        }
    }

    #endregion IMixedRealityInputHandler<MixedRealityPose> Implementation

    #region Private Helpers

    /// <summary>
    /// Exponentially ramps a distance along a linear input (used to grow the pointer stick length).
    /// </summary>
    /// <remarks>
    /// An exponential distance ramping where distance is determined by:
    /// f(t) = (e^At - 1)/B
    /// where:
    /// A is a scaling factor: how fast the function ramps to infinity
    /// B is a second scaling factor: a denominator that shallows out the ramp near the origin
    /// t is a linear input
    /// f(t) is the distance exponentially ramped along variable t
    /// 
    /// Here's a quick derivation for the expression below.
    /// A = constant
    /// B = constant
    /// d = ramp(t) = (e^At - 1)/B
    /// t = ramp_inverse(d) =  ln(B*d+1)/A
    /// In general, if y=f(x), then f(currentY, deltaX) = f( f_inverse(currentY) + deltaX )
    /// So,
    /// ramp(currentD, deltaT) = (e^(A*(ln(B*currentD + 1)/A + deltaT)) - 1)/B
    /// simplified:
    /// ramp(currentD, deltaT) = (e^(A*deltaT) * (B*currentD + 1) - 1) / B
    /// </remarks>
    private static float DistanceRamp(float currentDistance, float deltaT, float A = 4.0f, float B = 75.0f)
    {
        return (Mathf.Exp(A * deltaT) * (B * currentDistance + 1) - 1) / B;
    }

    #endregion Private Helpers
}

Code Listing 01 – End – DragAndDropHandler code

MRTK vNext RC1 Tutorial – Common Steps – Creating an Environment for Teleportation

The Goals of this Tutorial

The goal of this tutorial is to create a scene with an environment consisting of a large cube (the floor) textured with the Sand material, which allows the user to move around within the environment. We will also add some ambient audio to the scene.

Add Components to the Scene

Create a scene named Base_Demo_Scene and save it to the Assets/App/Content/Scenes folder.

Select the Mixed Reality Toolkit menu and select Add to Scene and Configure.

  • Select the DefaultMixedRealityToolkitConfigurationProfile when the prompt appears (see figure 01).

Figure 01 – Selecting the default configuration profile.

  • Create an empty game object, named SceneContent, in the root of the scene (see figure 01).
    • Position: 0, 0, 0

  • Add a child empty game object named Environment to the SceneContent game object.
    • Position: 0, -1, 0
    • Add an Audio Source Component
      • AudioClip: “Western Outside Loop” (located in the “Western Demo Audio Assets” asset folder structure)
      • Play On Awake: checked
      • Loop: checked
      • Volume: 0.06

  • Add a child empty game object named Terrain to the Environment game object.
    • Position: 0, 0, 0

  • Add a child 3D Cube named FloorPanel to the Terrain game object.
    • Position:   0, -0.25,   0
    • Scale:    100,   0.5, 100
    • Add Sand material (located under the “Lowpoly Wasteland Props” assets)
      • Set Tiling: X: 50 Y: 50
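
If you prefer to see the same hierarchy expressed in code, the sketch below builds the Environment, its audio source, the Terrain holder, and the FloorPanel with the values listed above. It uses only standard Unity APIs; the audio clip and Sand material references are placeholders you would assign in the inspector, and the script would be attached to the SceneContent game object.

using UnityEngine;

public class EnvironmentBuilder : MonoBehaviour
{
    [SerializeField]
    [Tooltip("The Western Outside Loop clip from the Western Demo Audio Assets.")]
    private AudioClip ambientLoop = null;

    [SerializeField]
    [Tooltip("The Sand material from the Lowpoly Wasteland Props package.")]
    private Material sandMaterial = null;

    private void Start()
    {
        // Environment sits 1 meter below the origin, under SceneContent (this object).
        var environment = new GameObject("Environment");
        environment.transform.SetParent(transform, false);
        environment.transform.localPosition = new Vector3(0f, -1f, 0f);

        // Ambient audio: plays on awake, loops, low volume.
        var ambientSource = environment.AddComponent<AudioSource>();
        ambientSource.clip = ambientLoop;
        ambientSource.playOnAwake = true;
        ambientSource.loop = true;
        ambientSource.volume = 0.06f;
        ambientSource.Play(); // start immediately, since the clip was assigned after the component was created

        // Terrain holder.
        var terrain = new GameObject("Terrain");
        terrain.transform.SetParent(environment.transform, false);

        // FloorPanel: a 100 x 0.5 x 100 cube with the tiled Sand material.
        var floor = GameObject.CreatePrimitive(PrimitiveType.Cube);
        floor.name = "FloorPanel";
        floor.transform.SetParent(terrain.transform, false);
        floor.transform.localPosition = new Vector3(0f, -0.25f, 0f);
        floor.transform.localScale = new Vector3(100f, 0.5f, 100f);

        var floorRenderer = floor.GetComponent<Renderer>();
        floorRenderer.material = sandMaterial;
        floorRenderer.material.mainTextureScale = new Vector2(50f, 50f);
    }
}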

Tasks Completed So Far

  • Configured the scene for the MRTK.
  • Created a root game object that should contain all scene game objects not created by the MRTK configuration.
  • Created a game object hierarchy for the environment.

Confirm Our Changes

Run the scene from the Unity Editor. If you followed all of the steps correctly, you should hear music playing, and you will appear in the center of the scene with sand under your feet. You should be able to teleport around in the scene using the WMR controllers.

Create Prefabs

After confirming the behavior above, first drag the Terrain game object from the scene into the Assets/App/Content/Prefabs folder. Then do the same thing for the Environment game object. This creates prefabs for you and they can be used in other scenes with minimal effort. NOTE: The order in which you create the prefabs is important.

Your scene hierarchy window should look like the one shown in figure 02 below. If it does not, review this tutorial to ensure you followed all of the steps accurately.

Figure 02 – Scene hierarchy when completed

Summary

What we just did is set up an area for the VR user to move around in. We added an audio clip to the Environment game object that begins playing as soon as the VR scene is initialized and loops continuously until we exit the application. This provides audio feedback to the user that the scene is loaded, active, and running. We also added a simple, flat, 100 meter by 100 meter floor so the user can teleport around in the environment. We added the Sand material to the FloorPanel game object to give the user better visual feedback on movement. If we had just assigned a solid color material to the floor, it would be difficult for the user to visually detect subtle movement within the environment. Additionally, we dropped the Environment game object 1 meter below the normal level (this is because my system, for some reason, routinely spawns me into the floor, at about chest height, when I don't do this).

Additionally, we created an empty game object named SceneContent that will act as the root for all scene game objects other than those added by the MRTK configuration (items shown in Figure 05). We do this when using the MRTK (and I believe this applies to all VR applications) because, unlike most Unity 3D scenes, you don't move the player and their attached camera to move around in the scene. In a VR application context, you instead move the scene around the player/camera. This root object simplifies moving all game objects in the scene in one go. To make this even more explicit, place all interactive visuals, spawn points, etc. in the scene as children of the SceneContent root game object.
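
As a small illustration of "move the scene, not the player", the hypothetical helper below shifts the entire SceneContent root so that a chosen anchor point ends up at the user's feet, leaving the camera rig untouched. The field names are illustrative.

using UnityEngine;

// Hypothetical helper: recenter the scene around the user by moving the SceneContent root
// rather than moving the player/camera.
public class SceneContentRecenter : MonoBehaviour
{
    [SerializeField]
    [Tooltip("The SceneContent root that holds every non-MRTK game object in the scene.")]
    private Transform sceneContent = null;

    [SerializeField]
    [Tooltip("A point in the scene (e.g. a spawn marker) that should end up under the user.")]
    private Transform spawnAnchor = null;

    public void Recenter()
    {
        // Where the user currently stands, projected onto the scene content's floor height.
        Vector3 headPosition = Camera.main.transform.position;
        Vector3 userOnFloor = new Vector3(headPosition.x, sceneContent.position.y, headPosition.z);

        // Shift the whole scene so the anchor lands at the user's feet.
        Vector3 anchorOnFloor = new Vector3(spawnAnchor.position.x, sceneContent.position.y, spawnAnchor.position.z);
        sceneContent.position += userOnFloor - anchorOnFloor;
    }
}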

Update (2019-04-18): It turns out that the reason the user is spawned into the floor is that the boundary data was missing or corrupted. If a message pops up asking you to look side to side and then look down, your system has lost its boundary data. If you find yourself being spawned into the floor, go to the Windows Mixed Reality Portal and reset the boundary for your device.
Feel free to leave comments or suggestions.

MRTK vNext RC1 Tutorial – Common Steps – Creating a Home for Application Specific Assets

The Goals of this Tutorial

The goal of this tutorial is to create a clear separation between your own Unity assets and Unity assets imported from other sources.

Create the asset folder structure shown in figure 01.

Figure 01 – The Project Assets after adding custom folders.

The purpose of the folder hierarchy you are creating is to keep our custom game objects and custom scripts separate from Unity assets imported via the Unity Asset Store and other sources such as GitHub. This creates a clear demarcation between our code and code from foreign sources.

MRTK vNext RC1 Tutorial – Common Steps – Importing Assets From The Unity Asset Store

The Goals of this Tutorial

The goal of this tutorial is to import free Unity assets that can be used in the application we will be creating. If you do not know how to import assets from the Unity Asset Store, follow the tutorial “Using the Asset Store”:
https://unity3d.com/learn/tutorials/topics/asset-store/using-asset-store?playlist=17132

Import these free assets from the Unity Asset Store:

  • “Western Audio Music” – see figure 01 for the specific assets imported.
  • “Lowpoly Wasteland Props” – import all assets.

And then save your project.

Figure 01 – Selective assets imported from the “Western Audio Music” asset package.

Figure 02 – Assets imported from the Unity Asset Store

At least some of the assets we just imported will be used in all future tutorials. We will use the “Sand” material from the “Lowpoly Wasteland Props” asset package. We will apply that material to the walking surface that the user will move around on. The material has a subtle texture that gives the user better visual feedback on head tracking and movement within a sparsely populated environment. We will use an ambient sound loop (background sound) to indicate to the user that the scene is loaded, running, and ready. In later tutorials we will incorporate more assets from the “Lowpoly Wasteland Props” asset package.

MRTK vNext RC1 Tutorial – Common Steps – Setting Up the MRTK

The Goals of this Tutorial

We will create a Unity 3D application and set up the MRTK v2.0.0 RC1 release packages in it.

Preparing for the Mixed Reality Toolkit (MRTK) Tutorial

Go to GitHub and download the Unity packages (*.unitypackage) for the Mixed Reality Toolkit (MRTK) v2.0.0 RC1 release, located here: https://github.com/Microsoft/MixedRealityToolkit-Unity/releases

Create a Unity project named MRTK-RC1-Demo-02. I have found that issues loading the MRTK assets are minimized if you change the build settings to output a Universal Windows Platform (UWP) application, and change the API Compatibility Level (under Player Settings/Other Settings) to .NET Standard 2.0, before importing the MRTK assets. Figures 01 and 02 below show the settings I have modified.

Figure 01 – Custom Build settings
Figure 02 – Player Settings
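
If you set up projects like this often, the same two changes can be applied with a small editor script placed in an Editor folder. This is just a convenience sketch using standard Unity editor APIs; the menu path is arbitrary.

using UnityEditor;

// Editor-only helper: switch the build target to UWP and set API compatibility to .NET Standard 2.0.
public static class MrtkProjectSetup
{
    [MenuItem("Tools/MRTK Demo/Apply UWP Build Settings")]
    public static void ApplyUwpBuildSettings()
    {
        // Equivalent to File > Build Settings > Universal Windows Platform > Switch Platform.
        EditorUserBuildSettings.SwitchActiveBuildTarget(BuildTargetGroup.WSA, BuildTarget.WSAPlayer);

        // Equivalent to Player Settings > Other Settings > Api Compatibility Level > .NET Standard 2.0.
        PlayerSettings.SetApiCompatibilityLevel(BuildTargetGroup.WSA, ApiCompatibilityLevel.NET_Standard_2_0);
    }
}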

Select menu item: Assets/Import Package/Custom Package… to import the MRTK packages you already downloaded:

  • Microsoft.MixedReality.Toolkit.Unity.Foundation-v2.0.0-RC1.unitypackage
  • Microsoft.MixedReality.Toolkit.Unity.Examples-v2.0.0-RC1.unitypackage
Follow the MRTK getting started guide (here: https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/GettingStartedWithTheMRTK.html ) to get the MRTK installed and set up. Then load a scene from the MRTK examples and run the Unity application by clicking the Play button in the Unity Editor. If the scene loads and runs without error, and your Windows Mixed Reality (WMR) headset or HoloLens displays the example scene you loaded, we can move on to the next step in this tutorial series.

After successfully importing the MRTK packages you should have the assets shown in figure 03. At this point save your project.

Figure 03 – Assets imported via the MRTK unity packages.

Review of Tasks Completed

  • Created the project
  • Modified the build settings to output a Universal Windows Platform (UWP) application.
  • Modified the player settings API compatibility level to .NET Standard 2.0.
  • Imported the MRTK unity packages.
  • Verified that the MRTK packages are installed correctly by running an MRTK example scene.

MRTK vNext RC1 Tutorial – Common Steps – Table of Contents

Common Setup Steps for All Mixed Reality Toolkit (MRTK) Tutorials

  1. Setting Up the MRTK
  2. Importing Assets From The Unity Asset Store
  3. Creating a Home for Application Specific Assets
  4. Creating an Environment for Teleportation

The motivation for breaking the tutorial into separate mini-tutorials is to allow me to start new tutorials with these prerequisites. If you have already completed these steps for a previous tutorial, you can skip these steps if you wish and just create a new scene in the previously created project.

Universal Windows Platform (UWP) Build Issues With MRTK v2.0.0 RC1 Release

The Unity packages (*.unitypackage) for the MRTK v2.0.0 RC1 release, located here: https://github.com/Microsoft/MixedRealityToolkit-Unity/releases, have an issue building Universal Windows Platform (UWP) output for users not enrolled in the Windows Insider Program. The issue revolves around the operating system updates for the HoloLens 2 coming out in late May 2019. Running the application within the Unity Editor works (minus some new functionality around the HoloLens 2 hand manipulations), but compiling and deploying to Windows 10 will fail.

You can get more information here: https://stackoverflow.com/questions/55601074/since-the-new-version-of-mrtk-i-cant-build-a-scene/55602238#55602238