In this tutorial, we will be creating an object collection. We will not be writing any custom scripts (we will only use built-in MRTK scripts). We will create two new materials, a new prefab for the objects arranged by the object collection script, and an object hierarchy to contain the object collection.
The prerequisites can be completed in about 10 minutes, and only need to be done the first time you set up the MRTK project for this tutorial series. It is important that you are using the MRTK vNext RC1 release. If you use an earlier version of the MRTK you will have issues.
Tutorial 03 Begins
Create a new Scene
Duplicate Base_Demo_Scene from the prerequisite steps and rename the duplicate scene as Tutorial_04.
Create New Materials
Add a new Material named DefaultFocusableMaterial and add it to the Assets/App/Content/Materials folder
Shader: Mixed Reality Toolkit/Standard
Albedo: set color to a Charcoalish color.
Add a new Material named CollectionControlPointMaterial and add it to the Assets/App/Content/Materials folder
Shader: Mixed Reality Toolkit/Standard
Albedo: set color to a Dark Blueish color.
Object Collections
The Grid Object Collection and Scatter Object Collection (scripts) allow us to easily layout many child game objects at once.
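The same layout can also be driven from code. Here is a rough sketch, assuming the MRTK v2 `GridObjectCollection` API as it existed around RC1 (component, enum, and property names may differ in later releases), where `UpdateCollection()` recalculates the layout of the child objects:

```csharp
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Hedged sketch: arranges all children of this game object on a sphere at
// runtime, mirroring the Inspector settings used later in this tutorial.
public class CollectionLayoutExample : MonoBehaviour
{
    private void Start()
    {
        var collection = gameObject.AddComponent<GridObjectCollection>();
        collection.SurfaceType = ObjectOrientationSurfaceType.Sphere;
        collection.Radius = 8f;
        collection.Rows = 4;
        collection.CellWidth = 1.25f;
        collection.CellHeight = 1.25f;

        // Equivalent to clicking the Update Collection button in the Inspector.
        collection.UpdateCollection();
    }
}
```

Calling `UpdateCollection()` is also how you would rearrange the collection after spawning or removing children at runtime; the Inspector button only covers edit time.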
Create FocusableCube Prefab
Add a 3D Cube game object named FocusableCube as a child of SceneContent (our root scene object).
Position: 0, 0, 0
Scale: 1, 1, 1
Material: DefaultFocusableMaterial
NOTE: Do not add a Rigidbody (we do not want gravity to affect our object collection)
Add Component: Interactable HighLight (script)
Focus: Enabled
Highlight: Disabled (you can change this later to see the effect)
Highlight Color: Assign a Green color
Outline Color: Assign a pastel greenish color
Highlight Material: MRTK_Standard_TransparentLime
Overlay Material: MRTK_Standard_TransparentYellow
Target Style: Both
Save FocusableCube as a prefab by dragging it into the Assets/App/Content/Prefabs folder and delete it from the scene.
Save the project.
Create the Object Collection Hierarchy
Add a 3D Sphere game object named SphericalCollectionRoot as a child of SceneContent (our root scene object).
Position 0, 1, 1.3
Scale: 0.05, 0.05, 0.05
Material: CollectionControlPointMaterial
Add Component: Interactable HighLight (script)
Focus: Enabled
Highlight Color: Assign a Goldish color
Outline Color: Assign an Orangish color
Highlight Material: MRTK_Standard_Orange
Overlay Material: MRTK_Standard_TransparentOrange
Target Style: Both
Add Component: Manipulation Handler (script)
Manipulation Type: One And Two Handed
Two Handed Manipulation Type: Move Rotate
Allow Far Manipulation: Checked
One Hand Rotation Mode Near: Rotate About Grab Point
One Hand Rotation Mode Far: Rotate About Object Center
Save the project.
Add an empty game object named SphericalCollectionFocus as a child of SphericalCollectionRoot
Position 0, 4, 0
Scale: 1, 1, 1
Add Component: Grid Object Collection (script)
Sort Type: Child Order
Surface Type: Sphere
Orient Type: Face Origin
Layout: Column Then Row
Radius: 8
Radial Range: 90
Rows: 4
Cell Width: 1.25
Cell Height: 1.25
Save the project.
Drag your FocusableCube prefab on top of the SphericalCollectionFocus game object in the scene to make it a child of SphericalCollectionFocus.
Select the FocusableCube prefab in the scene and duplicate the game object 27 times.
Select The SphericalCollectionFocus game object in the scene and in the property inspector, click the Update Collection button. This will rearrange the child objects into the sphere layout we selected.
Save the project.
Run the project in the Unity Editor.
Review of Tasks Completed:
We created two new materials.
We created a new prefab.
We created an object hierarchy that contains the objects arranged by the Grid Object Collection script.
Using the Interactable Highlight (script) in multiple locations in our object collection hierarchy allowed us to display different highlights and outlines based on which level of the hierarchy is selected.
The goal of this tutorial is to create a scene with an environment consisting of a large cube (the floor) textured with the Sand material, that allows the user movement within the environment. We will also add some ambient audio to the scene.
Add Components to the Scene
Create a scene named Base_Demo_Scene and save it to the Assets/App/Content/Scenes folder.
Select the Mixed Reality Toolkit menu and select Add to Scene and Configure.
Select the DefaultMixedRealityToolkitConfigurationProfile when the prompt appears (see figure 01).
Create an empty game object, named SceneContent, in the root of the scene (see figure 01).
Position: 0, 0, 0
Add a child empty game object named Environment to the SceneContent game object.
Position: 0, -1, 0
Add an Audio Source Component
AudioClip:”Western Outside Loop” (located in the “Western Demo Audio Assets” asset folder structure)
Play On Awake: checked
Loop: checked
Volume: 0.06
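If you prefer to wire this up from a script instead of the Inspector, the same Audio Source settings can be applied in code. This is a sketch only; the tutorial itself uses the Inspector, and the clip reference would still be assigned there:

```csharp
using UnityEngine;

// Sketch: configures an ambient audio loop equivalent to the Inspector steps above.
public class AmbientAudioSetup : MonoBehaviour
{
    [SerializeField]
    private AudioClip ambientClip; // assign "Western Outside Loop" in the Inspector

    private void Awake()
    {
        var source = gameObject.AddComponent<AudioSource>();
        source.clip = ambientClip;
        source.playOnAwake = true;
        source.loop = true;
        source.volume = 0.06f;
        source.Play(); // a component added at runtime misses the Play On Awake moment
    }
}
```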
Add a child empty game object named Terrain to the Environment game object.
Position: 0, 0, 0
Add a child 3D Cube named FloorPanel to the Terrain game object.
Position: 0, -0.25, 0
Scale: 100, 0.5, 100
Add Sand material (located under the “Lowpoly Wasteland Props” assets)
Set Tiling: X: 50 Y: 50
Task Completed So Far
Configured the scene for the MRTK.
Created a root game object that should contain all scene game objects not created by the MRTK configuration.
Created a game object hierarchy for the environment.
Confirm Our Changes
Run the scene from the Unity Editor. If you followed all of the steps correctly, you should hear music playing, and you will appear in the center of the scene with sand under your feet. You should be able to teleport around in the scene using the WMR controllers.
Create Prefabs
After confirming the behavior above, first drag the Terrain game object from the scene into the Assets/App/Content/Prefabs folder. Then do the same thing for the Environment game object. This creates prefabs for you, and they can be used in other scenes with minimal effort. NOTE: The order in which you create the prefabs is important.
Your scene in the hierarchy window should look like that shown in figure 01 below. If it does not, review this tutorial to ensure you followed all of the steps accurately.
Summary
What we just did was set up an area for the VR user to move around in. We added an audio clip to the Environment game object that begins playing as soon as the VR scene is initialized and loops continuously until we exit the application. This provides audio feedback to the user that the scene is loaded, active, and running. We also added a simple, flat, 100 meter by 100 meter floor so the user can teleport around in the environment. We added the Sand material to the FloorPanel game object to give the user better visual feedback on movement. If we had just assigned a solid color material to the floor, it would be difficult for the user to visually detect subtle movement within the environment. Additionally, we dropped the Environment game object to 1 meter below the normal level (this is because my system, for some reason, routinely spawns me into the floor, at about chest height, when I don’t do this).
Additionally, we created an empty game object named SceneContent that will act as the root for all scene game objects other than those added by the MRTK configuration (items shown in Figure 05). We do this when using the MRTK (and I believe this applies to all VR applications) because unlike most Unity 3D scenes, you don’t move the player and their attached camera to move around in the scene. In a VR application context, instead of moving the player/camera, you move the scene around the player/camera. This root object simplifies moving all of the game objects in the scene in one go. To make this even more explicit, place all interactive visuals, spawn points, etc. in the scene as children of the SceneContent root game object.
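To make the “move the scene, not the camera” idea concrete, here is a minimal sketch. The helper below is hypothetical (it is not part of this tutorial); it only works because every scene object lives under the single SceneContent root:

```csharp
using UnityEngine;

// Hypothetical helper: because all scene objects live under SceneContent,
// moving that one root transform moves the whole world around the player.
public class SceneMover : MonoBehaviour
{
    [SerializeField]
    private Transform sceneContent; // assign the SceneContent root in the Inspector

    // Moving the world by -offset is equivalent to moving the player by +offset.
    public void MovePlayerBy(Vector3 offset)
    {
        sceneContent.position -= offset;
    }
}
```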
Update (2019-04-18): It turns out that the reason the user is spawned into the floor is because the boundary data was missing or corrupted. If a message pops up to look side to side and then to look down, your system has lost its boundary data. If you find yourself being spawned into the floor, go to the Windows Mixed Reality Portal and reset the boundary for your device.
The goal of this tutorial is to create a clear separation between your own Unity assets and Unity assets imported from other sources.
Create the asset folder structure shown in figure 01.
The purpose of the hierarchy of folders you are creating is to allow us to keep our custom game objects and custom scripts separate from unity assets imported via the Unity Asset Store, and other sources such as GitHub. This creates a clear demarcation between our code and code from a foreign source.
Import these free assets from the Unity Asset Store:
“Western Audio Music” see Figure 01 for specific assets imported.
“Lowpoly Wasteland Props” import all assets.
And then save your project.
At least some of the assets we just imported will be used in all future tutorials. We will use the “Sand” material from the “Lowpoly Wasteland Props” asset package. We will apply that material to the walking surface that the user will move around on. The material has a subtle texture that allows the user to experience better visual feedback on head tracking and movement within a sparsely populated environment. We will use an ambient sound loop (background sound) to indicate to the user that the scene is loaded, running, and ready. In later tutorials we will incorporate more assets from the “Lowpoly Wasteland Props” asset package.
Create a Unity Project named MRTK-RC1-Demo-02. I have found that issues in loading the MRTK assets are minimized if you change the Build Settings to output a Universal Windows Platform (UWP) application, and change the Player Settings > Other Settings > API Compatibility Level to .NET Standard 2.0, before importing the MRTK assets. Figures 01 and 02 below indicate the settings I have modified.
Select menu item: Assets/Import Package/Custom Package… to import the MRTK packages you already downloaded:
Follow the MRTK Getting Started Guide (here: https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/GettingStartedWithTheMRTK.html ) to get the MRTK installed and set up. Then load a scene from the MRTK examples and run the Unity application by clicking the Play button in the Unity Editor. If the scene loads and runs without error, and your Windows Mixed Reality (WMR) headset or HoloLens displays the example scene you loaded, then we can move on to the next step in this tutorial series.
After successfully importing the MRTK packages you should have the assets shown in figure 03. At this point save your project.
Figure 03 – Assets imported via the MRTK unity packages.
Review of Tasks Completed
Created the project
Modified the build settings to output a Universal Windows Platform (UWP) application.
Modified the player settings API compatibility level to .NET Standard 2.0.
Imported the MRTK unity packages.
Verified that the MRTK packages are installed correctly by running an MRTK example scene.
The motivation for breaking the tutorial into separate mini-tutorials is to allow me to start new tutorials with these prerequisites. If you have already completed these steps for a previous tutorial, you can skip these steps if you wish and just create a new scene in the previously created project.
There is an issue building Universal Windows Platform (UWP) output for users not enrolled in the Windows Insider Program. The issue revolves around the operating system updates for the HoloLens 2 coming out in late May 2019. Running the application within the Unity Editor works (minus some new functionality around the HoloLens 2 hand manipulations), but compiling and deploying to Windows 10 will fail.
We will create a Unity3D application. The goals of this tutorial is to learn how to:
Setup the MRTK v2.0.0 RC1 Release package in your Unity application.
Create an environment that allows user movement.
Add custom free Unity Assets to our project.
The user has the ability to spawn a wall of cubes via the Windows Mixed Reality (WMR) Controller menu button. Multiple spawns are allowed.
The user can interact with the cubes that were spawned.
We will accomplish this by creating a floor with a material that allows the user to have better visual feedback on subtle movement, adding ambient audio for better immersion, and then adding the ability to spawn a wall of cubes that the user can interact with.
Create a Unity Project named MRTK-RC1-Demo-01. I have found that issues in loading the MRTK assets are minimized if you change the Build Settings to output a Universal Windows Platform (UWP) application, and change the Player Settings > Other Settings > API Compatibility Level to .NET Standard 2.0, before importing the MRTK assets. The images below indicate the settings I have modified.
Select menu item: Assets/Import Package/Custom Package…
to import the MRTK packages you already downloaded:
Follow the MRTK Getting Started Guide (here: https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/GettingStartedWithTheMRTK.html ) to get the MRTK installed and set up. Then load a scene from the MRTK Examples and run the Unity application by clicking the Play button in the Unity Editor. If the scene loads and runs without error, and your Windows Mixed Reality (WMR) headset or HoloLens displays the example scene you loaded, then we can begin writing our custom code.
Review of tasks so far:
Created the Project
Modified the Build settings to output a UWP application.
Modified The Player Settings API Compatibility Level to .NET Standard 2.0.
Imported the MRTK unity packages
Verified that the MRTK packages are installed correctly by running an MRTK example scene.
Custom Application Groundwork
Now import these free assets from the Unity Asset Store (These assets will be used in the application we are building in this – and future – tutorial(s)):
“Western Audio Music” see Figure 03 for specific assets imported.
“Lowpoly Wasteland Props” all assets imported.
Now create a folder layout in the Project Assets to support our custom application. Modify the Project Assets to add the folder structure shown in Figure 04.
Select the /Scenes folder you created above and add a scene named: Demo_01 to that folder.
With this new scene loaded into the Unity Editor, select the Mixed Reality Toolkit menu and select Add to Scene and Configure.
Figure 05 – The Hierarchy window after MRTK configuration.
The purpose of the hierarchy of folders we created (as displayed in Figure 04) is to allow us to keep our custom game objects and custom scripts separate from unity assets imported via the Unity Asset Store, and other sources such as GitHub. This creates a clear demarcation between your code and code from a foreign source.
Adding the Tutorial Custom Code and Assets
Step One – Environment Setup
Now we will create game objects in the Demo_01 scene that will use the MRTK.
Create an empty game object, named SceneContent, in the root of the Hierarchy Window.
Add a child empty game object named Environment to the SceneContent game object.
Position: 0, -1, 0
Add an Audio Source Component
AudioClip:”Western Outside Loop” (located in the “Western Demo Audio Assets” asset folder)
Play On Awake: checked
Loop: checked
Add a child empty game object named GlobalListener to the Environment game object.
Add a new MonoBehavior named SpawnAction to the Assets/App/Content/Scripts folder (implementation shown in Code Listing 01)
Add Component: Interactable
Enabled: checked
Input Actions: Menu
Is Global: checked
OnClick() Event (shown in Figure 08)
Runtime Only: GlobalListener
SpawnAction.SpawnPrefab
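The Inspector wiring above can also be expressed in code. This is a hedged sketch, assuming the MRTK v2 `Interactable` component exposes its `OnClick` UnityEvent as it does in RC1; the tutorial itself does this wiring in the Inspector:

```csharp
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Sketch: connects the Interactable's OnClick event to SpawnAction.SpawnPrefab,
// mirroring the Runtime Only binding configured in the Inspector.
public class SpawnWiring : MonoBehaviour
{
    private void Start()
    {
        var interactable = GetComponent<Interactable>();
        var spawnAction = GetComponent<SpawnAction>();
        interactable.OnClick.AddListener(spawnAction.SpawnPrefab);
    }
}
```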
Add a child empty game object named Terrain to the Environment game object.
Add a child 3D Cube named FloorPanel to the Terrain game object.
Position: 0, -0.25, 0
Scale: 100, 0.5, 100
Add Sand Material (located under the “Lowpoly Wasteland Props” assets)
What we just did was set up an area for the VR user to move around in. We added an audio clip that begins playing as soon as the VR scene is initialized and loops continuously until we exit the application. This provides audio feedback to the user that the scene is active and ready. We also added a simple, flat, 100 meter by 100 meter floor so the user can teleport around in the environment. We added the Sand material to the FloorPanel game object to give the user better visual feedback on movement. If we had just assigned a solid color material to the floor, it would be difficult for the user to visually detect movement within the environment. Additionally, we dropped the Environment game object to 1 meter below the normal level (this is because my system, for some reason, routinely spawns me into the floor, at about chest height, when I don’t do this).
Additionally, we created an empty game object named SceneContent that will act as the root for all scene game objects other than those added by the MRTK configuration (items shown in Figure 05). We do this when using the MRTK (and I believe this applies to all VR applications) because unlike most Unity 3D scenes, you don’t move the player and/or their attached camera to move around in the scene. In a VR application context, instead of moving the player/camera to move around, you move the scene around the player/camera. This root object is used to simplify the moving all game objects in the scene at one go. To make this even more explicit, place all interactive visuals, spawn points, etc. in the scene as children of the SceneContent root game object.
Step Two – Wiring Up the VR Controllers to Spawn Game Objects
Ground Work:
Create a Material named DefaultGrabbableMaterial to the Assets/App/Content/Materials folder
Set the Shader to Mixed Reality Toolkit/Standard
Set the Albedo color to a pastel reddish ochre.
Add a 3D Cube named WallCube to the scene
Position: 0, 0, 0
Add Component: Rigidbody (because we want gravity to affect it)
Add the DefaultGrabbableMaterial to the WallCube.
Add an empty MonoBehavior named DragAndDropHandler to the Assets/App/Content/Scripts folder (implementation shown in Code Listing 02).
Drag game object WallCube into the Assets/App/Content/Prefabs folder and delete it from the scene.
Add an empty game object named WallOfCubes to the scene.
Add the Prefab WallCube we created earlier as a child game object.
Duplicate the prefab 16 times (so you have a total of 17 WallCube prefab children)
Add Component: Grid Object Collection (from the MRTK SDK)
Sort Type: Child Order
Surface Type: Cylinder
Orient Type: Face Center Axis
Layout: Column Then Row
Radius: 8
Rows: 3
Cell Width: 1.32
Cell Height: 1
Click the Update Collection button on the component in the inspector window.
Drag game object WallOfCubes into the Assets/App/Content/Prefabs folder and delete it from the scene.
Drag the WallOfCubes Prefab into the SpawnAction Prefab To Spawn slot as shown in Figure 07.
At this point, you should be able to accomplish the following in the scene:
Teleport around the scene using the WMR controllers.
Click the WMR controller menu button (the small button below the thumb stick). When the menu button is clicked, a wall of 17 cubes is spawned 8 meters in front of the user’s original position and orientation.
Code Listing 01 – Start – SpawnAction code

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class SpawnAction : MonoBehaviour
{
    public GameObject PrefabToSpawn;
    public Vector3 SpawnPosition;

    public void SpawnPrefab()
    {
        var spawnedPrefab = Instantiate(PrefabToSpawn);
        spawnedPrefab.transform.position += SpawnPosition;
    }
}

Code Listing 01 – End – SpawnAction code
Code Listing 02 – Start – DragAndDropHandler code
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Physics;
using Microsoft.MixedReality.Toolkit.Utilities;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

/// <summary>
/// Component that allows dragging a <see cref="GameObject"/>.
/// Dragging is done by calculating the angular delta and z-delta between the current and previous hand positions,
/// and then repositioning the object based on that.
/// </summary>
public class DragAndDropHandler : BaseFocusHandler,
    IMixedRealityInputHandler<MixedRealityPose>,
    IMixedRealityPointerHandler,
    IMixedRealitySourceStateHandler
{
    private enum RotationModeEnum
    {
        Default,
        LockObjectRotation,
        OrientTowardUser,
        OrientTowardUserAndKeepUpright
    }

    [SerializeField]
    [Tooltip("The action that will start/stop the dragging.")]
    private MixedRealityInputAction dragAction = MixedRealityInputAction.None;

    [SerializeField]
    [Tooltip("The action that will provide the drag position.")]
    private MixedRealityInputAction dragPositionAction = MixedRealityInputAction.None;

    [SerializeField]
    [Tooltip("Transform that will be dragged. Defaults to the object of the component.")]
    private Transform hostTransform;

    [SerializeField]
    [Tooltip("Scale by which hand movement in Z is multiplied to move the dragged object.")]
    private float distanceScale = 2f;

    [SerializeField]
    [Tooltip("How should the GameObject be rotated while being dragged?")]
    private RotationModeEnum rotationMode = RotationModeEnum.Default;

    [SerializeField]
    [Range(0.01f, 1.0f)]
    [Tooltip("Controls the speed at which the object will interpolate toward the desired position")]
    private float positionLerpSpeed = 0.2f;

    [SerializeField]
    [Range(0.01f, 1.0f)]
    [Tooltip("Controls the speed at which the object will interpolate toward the desired rotation")]
    private float rotationLerpSpeed = 0.2f;

    /// <summary>
    /// Gets the pivot position for the hand, which is approximated to the base of the neck.
    /// </summary>
    /// <returns>Pivot position for the hand.</returns>
    private Vector3 HandPivotPosition => CameraCache.Main.transform.position + new Vector3(0, -0.2f, 0) - CameraCache.Main.transform.forward * 0.2f; // a bit lower and behind

    private bool isDragging;
    private bool isDraggingEnabled = true;
    private bool isDraggingWithSourcePose;

    // Used for moving with a pointer ray
    private float stickLength;
    private Vector3 previousPointerPositionHeadSpace;

    // Used for moving with a source position
    private float handRefDistance = -1;
    private float objectReferenceDistance;
    private Vector3 objectReferenceDirection;
    private Quaternion gazeAngularOffset;
    private Vector3 objectReferenceUp;
    private Vector3 objectReferenceForward;
    private Vector3 objectReferenceGrabPoint;
    private Vector3 draggingPosition;
    private Quaternion draggingRotation;
    private Rigidbody hostRigidbody;
    private bool hostRigidbodyWasKinematic;
    private IMixedRealityPointer currentPointer;
    private IMixedRealityInputSource currentInputSource;

    // If the dot product between hand movement and head forward is less than this amount,
    // don't exponentially increase the length of the stick
    private readonly float zPushTolerance = 0.1f;

    #region MonoBehaviour Implementation

    private void Start()
    {
        if (hostTransform == null)
        {
            hostTransform = transform;
        }

        hostRigidbody = hostTransform.GetComponent<Rigidbody>();
    }

    private void OnDestroy()
    {
        if (isDragging)
        {
            StopDragging();
        }
    }

    #endregion MonoBehaviour Implementation

    #region IMixedRealityPointerHandler Implementation

    void IMixedRealityPointerHandler.OnPointerUp(MixedRealityPointerEventData eventData)
    {
        if (!isDraggingEnabled || !isDragging || eventData.MixedRealityInputAction != dragAction || eventData.SourceId != currentInputSource?.SourceId)
        {
            // If we're not handling drag input or we're not releasing the right action, don't try to end a drag operation.
            return;
        }

        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.
        StopDragging();
    }

    void IMixedRealityPointerHandler.OnPointerDown(MixedRealityPointerEventData eventData)
    {
        if (!isDraggingEnabled || isDragging || eventData.MixedRealityInputAction != dragAction)
        {
            // If we're already handling drag input or we're not grabbing, don't start a new drag operation.
            return;
        }

        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.

        currentInputSource = eventData.InputSource;
        currentPointer = eventData.Pointer;

        FocusDetails focusDetails;
        Vector3 initialDraggingPosition = MixedRealityToolkit.InputSystem.FocusProvider.TryGetFocusDetails(currentPointer, out focusDetails)
            ? focusDetails.Point
            : hostTransform.position;

        isDraggingWithSourcePose = currentPointer == MixedRealityToolkit.InputSystem.GazeProvider.GazePointer;
        StartDragging(initialDraggingPosition);
    }

    void IMixedRealityPointerHandler.OnPointerClicked(MixedRealityPointerEventData eventData) { }

    #endregion IMixedRealityPointerHandler Implementation

    #region IMixedRealitySourceStateHandler Implementation

    void IMixedRealitySourceStateHandler.OnSourceDetected(SourceStateEventData eventData) { }

    void IMixedRealitySourceStateHandler.OnSourceLost(SourceStateEventData eventData)
    {
        if (eventData.SourceId == currentInputSource?.SourceId)
        {
            StopDragging();
        }
    }

    #endregion IMixedRealitySourceStateHandler Implementation

    #region BaseFocusHandler Overrides

    /// <inheritdoc />
    public override void OnFocusExit(FocusEventData eventData)
    {
        if (isDragging)
        {
            StopDragging();
        }
    }

    #endregion BaseFocusHandler Overrides

    /// <summary>
    /// Enables or disables dragging.
    /// </summary>
    /// <param name="isEnabled">Indicates whether dragging should be enabled or disabled.</param>
    public void SetDragging(bool isEnabled)
    {
        if (isDraggingEnabled == isEnabled)
        {
            return;
        }

        isDraggingEnabled = isEnabled;

        if (isDragging)
        {
            StopDragging();
        }
    }

    /// <summary>
    /// Starts dragging the object.
    /// </summary>
    private void StartDragging(Vector3 initialDraggingPosition)
    {
        if (!isDraggingEnabled || isDragging)
        {
            return;
        }

        Transform cameraTransform = CameraCache.Main.transform;

        currentPointer.IsFocusLocked = true;
        isDragging = true;

        if (hostRigidbody != null)
        {
            hostRigidbodyWasKinematic = hostRigidbody.isKinematic;
            hostRigidbody.isKinematic = true;
        }

        if (isDraggingWithSourcePose)
        {
            Vector3 pivotPosition = HandPivotPosition;
            objectReferenceDistance = Vector3.Magnitude(initialDraggingPosition - pivotPosition);
            objectReferenceDirection = cameraTransform.InverseTransformDirection(Vector3.Normalize(initialDraggingPosition - pivotPosition));
        }
        else
        {
            Vector3 inputPosition = currentPointer.Position;
            //currentPointer.TryGetPointerPosition(out inputPosition);
            previousPointerPositionHeadSpace = cameraTransform.InverseTransformPoint(inputPosition);
            stickLength = Vector3.Distance(initialDraggingPosition, inputPosition);
        }

        // Store where the object was grabbed from, in camera space
        objectReferenceGrabPoint = cameraTransform.transform.InverseTransformDirection(hostTransform.position - initialDraggingPosition);

        objectReferenceForward = cameraTransform.InverseTransformDirection(hostTransform.forward);
        objectReferenceUp = cameraTransform.InverseTransformDirection(hostTransform.up);

        draggingPosition = initialDraggingPosition;
    }

    /// <summary>
    /// Stops dragging the object.
    /// </summary>
    private void StopDragging()
    {
        if (!isDragging)
        {
            return;
        }

        currentPointer.IsFocusLocked = false;
        isDragging = false;
        handRefDistance = -1;

        if (hostRigidbody != null)
        {
            hostRigidbody.isKinematic = hostRigidbodyWasKinematic;
        }
    }

    #region IMixedRealityInputHandler<MixedRealityPose> Implementation

    void IMixedRealityInputHandler<MixedRealityPose>.OnInputChanged(InputEventData<MixedRealityPose> eventData)
    {
        if (eventData.MixedRealityInputAction != dragPositionAction || !isDraggingEnabled || !isDragging || eventData.SourceId != currentInputSource?.SourceId)
        {
            return;
        }

        Transform cameraTransform = CameraCache.Main.transform;
        Vector3 pivotPosition = Vector3.zero;

        if (isDraggingWithSourcePose)
        {
            Vector3 inputPosition = eventData.InputData.Position;
            pivotPosition = HandPivotPosition;
            Vector3 newHandDirection = Vector3.Normalize(inputPosition - pivotPosition);

            if (handRefDistance < 0)
            {
                handRefDistance = Vector3.Magnitude(inputPosition - pivotPosition);
                Vector3 handDirection = cameraTransform.InverseTransformDirection(Vector3.Normalize(inputPosition - pivotPosition));

                // Store the initial offset between the hand and the object, so that we can consider it when dragging
                gazeAngularOffset = Quaternion.FromToRotation(handDirection, objectReferenceDirection);
            }

            // in camera space
            newHandDirection = cameraTransform.InverseTransformDirection(newHandDirection);
            Vector3 targetDirection = Vector3.Normalize(gazeAngularOffset * newHandDirection);
            // back to world space
            targetDirection = cameraTransform.TransformDirection(targetDirection);

            float currentHandDistance = Vector3.Magnitude(inputPosition - pivotPosition);
            float distanceRatio = currentHandDistance / handRefDistance;
            float distanceOffset = distanceRatio > 0 ? (distanceRatio - 1f) * distanceScale : 0;
            float targetDistance = objectReferenceDistance + distanceOffset;

            draggingPosition = pivotPosition + (targetDirection * targetDistance);
        }
        else
        {
            pivotPosition = cameraTransform.position;
            Vector3 pointerPosition = currentPointer.Position;
            //currentPointer.TryGetPointerPosition(out pointerPosition);
            Ray pointingRay = currentPointer.Rays[0];
            //currentPointer.TryGetPointingRay(out pointingRay);
            Vector3 currentPosition = pointerPosition;
            Vector3 currentPositionHeadSpace = cameraTransform.InverseTransformPoint(currentPosition);
            Vector3 positionDeltaHeadSpace = currentPositionHeadSpace - previousPointerPositionHeadSpace;

            float pushDistance = Vector3.Dot(positionDeltaHeadSpace,
                cameraTransform.InverseTransformDirection(pointingRay.direction.normalized));

            if (Mathf.Abs(Vector3.Dot(positionDeltaHeadSpace.normalized, Vector3.forward)) > zPushTolerance)
            {
                stickLength = DistanceRamp(stickLength, pushDistance);
            }

            draggingPosition = pointingRay.GetPoint(stickLength);
            previousPointerPositionHeadSpace = currentPositionHeadSpace;
        }

        switch (rotationMode)
        {
            case RotationModeEnum.OrientTowardUser:
            case RotationModeEnum.OrientTowardUserAndKeepUpright:
                draggingRotation = Quaternion.LookRotation(hostTransform.position - pivotPosition);
                break;
            case RotationModeEnum.LockObjectRotation:
                draggingRotation = hostTransform.rotation;
                break;
            default:
                // in world space
                Vector3 objForward = cameraTransform.TransformDirection(objectReferenceForward);
                // in world space
                Vector3 objUp = cameraTransform.TransformDirection(objectReferenceUp);
                draggingRotation = Quaternion.LookRotation(objForward, objUp);
                break;
        }

        Vector3 newPosition = Vector3.Lerp(hostTransform.position, draggingPosition + cameraTransform.TransformDirection(objectReferenceGrabPoint), positionLerpSpeed);

        // Apply Final Position
        if (hostRigidbody == null)
        {
            hostTransform.position = newPosition;
        }
        else
        {
            hostRigidbody.MovePosition(newPosition);
        }

        // Apply Final Rotation
        Quaternion newRotation = Quaternion.Lerp(hostTransform.rotation, draggingRotation, rotationLerpSpeed);
        if (hostRigidbody == null)
        {
            hostTransform.rotation = newRotation;
        }
        else
        {
            hostRigidbody.MoveRotation(newRotation);
        }

        if (rotationMode == RotationModeEnum.OrientTowardUserAndKeepUpright)
        {
            Quaternion upRotation = Quaternion.FromToRotation(hostTransform.up, Vector3.up);
            hostTransform.rotation = upRotation * hostTransform.rotation;
        }
    }

    #endregion IMixedRealityInputHandler<MixedRealityPose> Implementation

    #region Private Helpers

    /// <summary>
    /// Computes an exponentially ramped distance along a linear input.
    /// </summary>
    /// <remarks>
    /// An exponential distance ramp where distance is determined by:
    ///     f(t) = (e^At - 1)/B
    /// where:
    ///     A is a scaling factor: how fast the function ramps to infinity
    ///     B is a second scaling factor: a denominator that shallows out the ramp near the origin
    ///     t is a linear input
    ///     f(t) is the distance exponentially ramped along variable t
    ///
    /// Here's a quick derivation for the expression below.
    ///     A = constant
    ///     B = constant
    ///     d = ramp(t) = (e^At - 1)/B
    ///     t = ramp_inverse(d) = ln(B*d + 1)/A
    /// In general, if y = f(x), then f(currentY, deltaX) = f( f_inverse(currentY) + deltaX )
    /// So,
    ///     ramp(currentD, deltaT) = (e^(A*(ln(B*currentD + 1)/A + deltaT)) - 1)/B
    /// simplified:
    ///     ramp(currentD, deltaT) = (e^(A*deltaT) * (B*currentD + 1) - 1) / B
    /// </remarks>
    private static float DistanceRamp(float currentDistance, float deltaT, float A = 4.0f, float B = 75.0f)
    {
        return (Mathf.Exp(A * deltaT) * (B * currentDistance + 1) - 1) / B;
    }

    #endregion Private Helpers
}

Code Listing 02 – End – DragAndDropHandler code
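The `DistanceRamp` helper is the interesting math in this listing: it takes the current (already-ramped) distance, implicitly inverts the ramp to recover the linear input, adds the new delta, and re-applies the ramp. The standalone sketch below (plain C#, no Unity dependency; the class and method names are illustrative, not part of the MRTK) makes that round trip explicit:

```csharp
using System;

public static class DistanceRampDemo
{
    // d = ramp(t) = (e^(A*t) - 1) / B
    public static double Ramp(double t, double A = 4.0, double B = 75.0) =>
        (Math.Exp(A * t) - 1.0) / B;

    // t = rampInverse(d) = ln(B*d + 1) / A
    public static double RampInverse(double d, double A = 4.0, double B = 75.0) =>
        Math.Log(B * d + 1.0) / A;

    // Incremental form used by DistanceRamp: advance the linear input by deltaT
    // while only ever storing the ramped distance.
    public static double Step(double currentD, double deltaT, double A = 4.0, double B = 75.0) =>
        (Math.Exp(A * deltaT) * (B * currentD + 1.0) - 1.0) / B;

    public static void Main()
    {
        double d = Ramp(0.5);                        // ramp a linear input
        double stepped = Step(d, 0.1);               // incremental update by 0.1
        double direct = Ramp(RampInverse(d) + 0.1);  // same value computed directly
        Console.WriteLine(Math.Abs(stepped - direct) < 1e-9);
    }
}
```

The incremental form matters because the handler never stores the linear input `t`, only `stickLength` (the ramped distance), so each frame's push delta has to be folded into the ramp directly.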