Unity 3D

About

Notes

Application

Installation

Unity Editor default install location:

C:\Program Files\Unity\Hub\Editor

GPU Instancing

If applicable and available, enable GPU Instancing to help reduce draw calls. This is particularly effective when many instances of the same mesh are instantiated within a scene, and it is set per material definition.

image
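
Enable GPU Instancing is a per-material checkbox, but it can also be switched on from a script, for example when materials are generated at runtime. The following is a minimal sketch using the standard Material.enableInstancing property; the component name is hypothetical, and it assumes the shaders involved actually support instancing.

```csharp
using UnityEngine;

// Hypothetical helper: turns on GPU instancing for every shared material
// found under this GameObject. Assumes the shaders support instancing.
public class EnableGpuInstancing : MonoBehaviour
{
    void Awake()
    {
        foreach (Renderer childRenderer in GetComponentsInChildren<Renderer>())
        {
            foreach (Material material in childRenderer.sharedMaterials)
            {
                if (material != null)
                {
                    // Equivalent to ticking "Enable GPU Instancing" on the material.
                    material.enableInstancing = true;
                }
            }
        }
    }
}
```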

Preview Unity Packages

Some packages are in preview mode, so access them by using the Advanced drop-down menu and enabling Show preview packages.

image

Experiments

HDA to Unity for AR

image

This is my exploration of a workflow from an HDA in Houdini to Unity and augmented reality through Houdini Engine for testing HDA controls and visualizing updates. Augmented reality filmed with an iPad Pro and video/sound editing processed in After Effects.

The intent is to understand the accessibility of HDA parameters through Houdini Engine for user manipulation at runtime in Unity, and particularly through augmented reality. I envision this process as a way of testing HDAs before baking them into assets and scripting potential interactivity ahead of actual deployment to an independent augmented reality application.

Versions used in test:

  • Houdini 18.0.460
  • Houdini Engine 3.3.5
  • Unity 2019.3.13f1

Test & Analysis

The first bit is a quick overview of the test followed by a narrated breakdown of the workflow. This is primarily documentation of my academic research and not intended to be a tutorial, but the workflow does detail applicable HDA scripts and libraries.

Initial HDA setup in Houdini with exposed parameter controls.

image

Houdini Engine control panel in Unity with parameters accessed via C# script.

image

UI elements and scripts in AR. The asset rotation is managed with user touch controls on the mobile device and world space UI, while the material color variation is defined through a color picker on the screen space UI.
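
As a rough sketch of those two interactions, the script below uses only standard Unity touch and material APIs: a one-finger horizontal drag rotates the asset, and a method meant to be wired to the color picker's UI event applies the chosen color. In the actual test the color is routed to an HDA parameter through Houdini Engine; here it is applied directly to a material so no Houdini Engine calls are assumed. Names and values are hypothetical.

```csharp
using UnityEngine;

// Minimal sketch of the two runtime interactions described above:
// a one-finger horizontal drag rotates the target asset, and a color
// chosen from a UI color picker is pushed onto the asset's material.
public class ArAssetInteraction : MonoBehaviour
{
    public Transform targetAsset;      // asset instantiated in the scene
    public Renderer targetRenderer;    // renderer whose material receives the color
    public float rotationSpeed = 0.2f; // degrees per pixel of horizontal drag

    void Update()
    {
        if (Input.touchCount == 1)
        {
            Touch touch = Input.GetTouch(0);
            if (touch.phase == TouchPhase.Moved)
            {
                // Rotate around the world up axis based on horizontal drag distance.
                targetAsset.Rotate(Vector3.up, -touch.deltaPosition.x * rotationSpeed, Space.World);
            }
        }
    }

    // Hook this up to the color picker's "color changed" UI event.
    public void SetColor(Color color)
    {
        targetRenderer.material.color = color;
    }
}
```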

Texture Sheets

Exploration of a workflow from generating a pyro sim and texture sheet composition in Houdini to a Unity particle system for use in augmented reality. My aim is to explore how sims from Houdini might capitalize on these workflows and to evaluate their performance in AR compared to other simulation solutions. Augmented reality filmed with an iPad Pro and video/sound editing processed in After Effects.
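
On the Unity side, the rendered sheet is played back through the particle system's Texture Sheet Animation module. A minimal sketch, assuming a hypothetical 8x8 flipbook exported from Houdini:

```csharp
using UnityEngine;

// Minimal sketch: configures a ParticleSystem to play back a flipbook
// texture sheet (e.g. an 8x8 sheet rendered from a Houdini pyro sim).
// The tile counts are assumptions; match them to the sheet layout
// exported from Houdini.
public class FlipbookSetup : MonoBehaviour
{
    public ParticleSystem targetSystem;

    void Start()
    {
        var sheet = targetSystem.textureSheetAnimation;
        sheet.enabled = true;
        sheet.numTilesX = 8; // columns in the texture sheet
        sheet.numTilesY = 8; // rows in the texture sheet
    }
}
```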

Resources

Vertex Animation Textures

Exploration of a workflow from generating visual effects in Houdini to Unity for use in augmented reality. The intent is to identify optimal strategies for showcasing complex effects in augmented reality. For instance, Houdini's game development toolset includes a ROP for generating texture sheets by rendering the effect and then compositing the layers into a base and a normal texture, and Houdini's Vertex Animation Texture (VAT) Rendering Operator can output to Unity's Universal Render Pipeline (URP). My aim is to explore how sims from Houdini might capitalize on these workflows and to evaluate their performance in AR compared to other simulation solutions. Augmented reality filmed with an iPad Pro and video/sound editing processed in After Effects.

Houdini's Vertex Animation Texture (VAT) Rendering Operator outputting to Unity's Universal Render Pipeline (URP) for visualizing in augmented reality. The intent was to understand the workflow within Houdini and to implement its output in Unity's URP.

Resource

AR Virtual Buttons & Interactivity

Test for coordinating multiple targets and linking their corresponding game objects using ray casts and attraction forces with Unity's Visual Effect Graph.
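
Stripped of the Visual Effect Graph layer, the core of that linkage can be sketched with standard Unity physics calls: cast a ray from the camera through the touch or mouse position and pull whatever rigidbody it hits toward an attractor. The names and force values below are assumptions.

```csharp
using UnityEngine;

// Minimal sketch of the ray-cast / attraction idea: a ray is cast from
// the camera through the pointer position, and any rigidbody that is hit
// gets pulled toward a designated attractor transform.
public class RayAttraction : MonoBehaviour
{
    public Camera arCamera;
    public Transform attractor;
    public float attractionStrength = 5f;

    void Update()
    {
        if (Input.GetMouseButton(0))
        {
            Ray ray = arCamera.ScreenPointToRay(Input.mousePosition);
            if (Physics.Raycast(ray, out RaycastHit hit))
            {
                Rigidbody body = hit.rigidbody;
                if (body != null)
                {
                    // Pull the hit object toward the attractor.
                    Vector3 direction = (attractor.position - body.position).normalized;
                    body.AddForce(direction * attractionStrength);
                }
            }
        }
    }
}
```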

Virtual Rower - VR Application

This short video documentation depicts my exploration of using the HTC Vive Tracker with the Vive system to enhance indoor rowing with a custom, interactive virtual environment. The Vive Tracker is attached to the handle of the Concept2 Indoor Rower and linked with scripts in Unity to the virtual oars for propelling the boat in the virtual environment.

HTC Vive Tracker

As an example of a great application that can be enhanced by the Vive Tracker, I am developing a virtual rower that attaches the HTC Vive Tracker to the handle of a Concept2 Indoor Rower. It uses the HTC Vive headset in conjunction with the custom virtual environment created in Unity Pro. Check out the video documentation to see it in action:

Pairing the Vive Tracker

If the Vive Tracker doesn’t immediately pair and track when SteamVR starts up and identifies the headset, lighthouse sensors, and hand controllers, pairing it afterwards through the SteamVR pairing process appears to get it pairing and tracking correctly. The Steam VR plugin for Unity expects the first 2 tracked devices to be the hand controllers, so both have to be active for Unity to tag the Vive Tracker as the 3rd or nth tracked device after the hand controllers. This is elaborated upon in the following paragraph.

image

Tracked Object Assignment

image
image

Note that the Steam VR Controller Manager defines the first 2 tracked devices: the left and right hand controllers. This means that when the Unity scene is run in game mode, the Steam VR Controller Manager looks for the hand controllers before it identifies other tracked devices, like the Vive Tracker. In an initial test where neither of the hand controllers was visible or paired, the Unity scene in game mode assigned the Vive Tracker as one of the hand controllers, because that was the first tracked device it identified and the first slot it was trying to fill. Again, the key is to verify that both hand controllers are activated and available for tracking so that the Vive Tracker is assigned to the 3rd or nth tracked device slot in the Steam VR Controller Manager.

Vive Tracker 3D Model

Source: https://www.vive.com/us/vive-tracker-for-developer/

This asset was downloaded and integrated into the Unity scene as a placeholder and reference for positioning and orientation. In the Virtual Rower, I set it to be transparent so that it remained visible for reference without being intrusive to the virtual environment. The Vive Tracker 3D model in the Unity scene was then assigned the Steam VR Tracked Object script so that the Steam VR Controller Manager could identify it as a tracked object.
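
For reference, this is a minimal sketch of that assignment, assuming the legacy SteamVR plugin's SteamVR_TrackedObject component and its EIndex device index. The exact index depends on the order in which SteamVR enumerates devices, which is why both hand controllers need to be active first.

```csharp
using UnityEngine;

// Sketch of how the Vive Tracker placeholder model was wired up, assuming
// the legacy SteamVR plugin: the model gets a SteamVR_TrackedObject
// component, and its device index is assigned after the two hand
// controllers, so the tracker ends up as the 3rd (or nth) tracked device.
public class TrackerSetup : MonoBehaviour
{
    void Start()
    {
        // Assumes the SteamVR_TrackedObject script from the SteamVR plugin
        // is attached to this GameObject (the Vive Tracker placeholder model).
        var trackedObject = GetComponent<SteamVR_TrackedObject>();

        // Device1/Device2 are typically the hand controllers once both are
        // active, leaving Device3 (or later) for the Vive Tracker.
        trackedObject.index = SteamVR_TrackedObject.EIndex.Device3;
    }
}
```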

Post Processing

Enable post-processing via Package Manager and deploy to AR using URP.

image
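
Post-processing in URP can also be toggled per camera from a script rather than through the Inspector checkbox. A minimal sketch, assuming URP's UniversalAdditionalCameraData; the component name is hypothetical.

```csharp
using UnityEngine;
using UnityEngine.Rendering.Universal;

// Minimal sketch: enables URP post-processing on this camera from script,
// e.g. to switch effects on only once the AR session is running.
public class EnablePostProcessing : MonoBehaviour
{
    void Start()
    {
        var cameraData = GetComponent<Camera>().GetUniversalAdditionalCameraData();
        cameraData.renderPostProcessing = true;
    }
}
```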

Resource

Packages

Unity XR Interactions

Some packages are in preview mode, so access them by using the Advanced drop-down menu and enabling Show preview packages.

image

AR Foundation

XR Plugin Management

Version 3.2.12

This manages which XR plug-in provider is loaded and defines switching between target platforms.

Option included to download Samples: Example XR Management implementation

ARKit XR Plugin

Version 4.0.1

Verify that this is the same version as was selected for the AR Foundation package.

XR Interaction Toolkit

Version 0.9.4

Vuforia

Process

Workflow - Vuforia Developer Portal

  • Access the Vuforia Developer Portal to set up the database and image targets.
  • Under Target Manager, create a Database.
  • Within the new database, Add Target for each target intended to be used in the Unity scene.
  • Verify that each target shows a Rating of at least 4 stars to confirm effectiveness as a target.
  • When finished adding target images, select Download Database (All).
  • Of the options that are available in the Download Database window, select Unity Editor as the development platform, and Download.
  • Import the downloaded database by dragging into the Unity workspace.

Workflow - Unity

  • Verify that the Vuforia database is loaded into the Unity project.
  • Under Inspector, check the following for the ARCamera that is in the Scene:
    • Vuforia Behaviour (Script)
      • Verify that the App License Key is populated.
      • Set both of the Max Simultaneous parameters to 2 or more, depending on the number of targets.
      • Verify that World Center Mode is set to DEVICE_TRACKING.
    • Database Load Behavior (Script)
      • Uncheck any database that is not in use in the current Scene.
      • Check the database (Load databaseName Database) that is to be used in the Scene.
      • Check Activate.
  • From the Vuforia Prefabs, drag a separate ImageTarget asset for each target to be used in the Scene.
  • There is no need to keep the MainCamera in the Scene.
  • Verify that each ImageTarget asset is renamed for clarity and set to be a Child under the ARCamera in the Scene.
  • Under Inspector, check the following for each separate ImageTarget in the Scene:
    • Image Target Behaviour (Script)
      • Type is set to Predefined by default.
      • Set the Database to the corresponding database that has the image trackers intended for the Scene.
      • Select the Image Target based on which ImageTarget asset is selected in the Scene.
  • Add models from Assets to the Scene and set each to be a Child of the corresponding Image Target asset in the Scene.
  • Run the game to see the different trackers activate the corresponding models.

Throughout the process I encountered 2 minor challenges, both of which involved the use of Vuforia. The first was the quality of the tracker image being used for AR recognition by the mobile device camera. Vuforia’s developer portal includes a database structure for uploading these images. Part of the upload process is the evaluation of the image’s quality as a suitable tracker: the more contrast and defined points an image gives the tracker to grasp, the higher the quality rating. After several tests, I found that a rating of 3 stars or less made the tracker very difficult to recognize, if at all. As a result, the selected images were modified until a rating of 4 or, preferably, 5 stars was achieved. A few Magic: The Gathering cards happened to turn out as really good trackers for my first test with Vuforia.

image

The other challenge emerged during the build process from Unity. The builds were processed using the Unity Cloud Build service by linking the Unity project files through a BitBucket repository that was, in turn, linked to the Unity Cloud Build platform. The following lines from the first attempt’s Unity Cloud Build log file noted the issue that needed to be resolved before the build could be completed. The Vuforia package within the Unity project has .dll files located in its plugins folder.

image

When these VuforiaWrapper assets are selected in Unity, the Inspector panel displays the following Import Settings. By default, the first box (Any Platform) is checked, which appears to have limited some of the other options. After unchecking it and selecting Any OS from the OS drop-down menu under Platform Settings, the next build was processed correctly. This was with Vuforia version 6.2.6; I have not tested a more recent Vuforia package to see whether the default settings have been corrected. Regardless, it was a simple resolution, thanks to the Unity Cloud Build log that clearly noted the file and location.

image
image

Tips

Verify that Vuforia is updated to the latest release via the Package Manager.

Resources

AR Deployment to iOS

XCode setup

Initial setup

Buildtime Issues

  • Validate Project Settings: Before doing anything else, follow the guide and Update to recommended settings.
  • Enable Base Internationalization: Enable this and follow the guide to implement it.
  • Migrate Localization: Migrate localizations that are deprecated.

Project

Unity-iPhone

  • Set iOS Deployment Target to match what was specified in Unity project.

Targets

Unity-iPhone

  • [TAB] General
    • Most of this should be defined in the Unity project, but should be confirmed prior to build.
    • Verify Bundle Identifier matches what was defined in the Apple Developer account.
  • [TAB] Signing & Capabilities
    • Disable Automatically manage signing. This is enabled by default but needs to be disabled to expose the options for defining the other signing settings.
    • Adjust the Bundle Identifier as necessary to match the settings defined in the Apple Developer account.
    • Choose the Provisioning Profile that matches the one in the Apple Developer account, importing profiles from the account as needed.
    • Any errors that appear will clarify which setting needs to be addressed. When no errors appear, signing is fully configured.

Devices (iPhone/iPad)

  • Verify that Developer settings are enabled and set to trust linked computer.

Build

Runtime Issues

  • The issue appears to be with Alembic files.

Resources

ECS/DOTS Setup

Setup

The following documents my walkthrough of the tutorial guide noted in the source link above, along with additional notes from my experience working with this setup.

Packages

From the Package Manager, access Show preview packages and install the following, which are specific to ECS.

  • Entities
  • Hybrid Renderer
    • Dependencies include Entities

.NET Standard 2.0

By default, the Unity project is created with the API Compatibility Level set to .NET 4.x, but the following warning notes that it should be set to .NET Standard 2.0 for ECS/DOTS to run correctly. In my initial tests running some operations with entities, Unity locked up in game mode under 4.x but ran without issue as soon as the project was set to 2.0. Though this does not confirm that the .NET Standard setting was the direct cause of the lockup, changing it did appear to resolve the problem.

image

To make the change to .NET Standard in the Project Settings, navigate to the Player section and use the drop-down menu next to API Compatibility Level, as shown below.

image

Entity Component System (ECS)

While the traditional MonoBehaviour approach allows data, actions, and methods to be mixed together, ECS compartmentalizes them as follows (see the component example after the list):

  • Entities: things
  • Components: data
  • Systems: logic
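
For example, the Components part of this breakdown is just plain data: a struct implementing IComponentData with fields and no behaviour. The name and field below are hypothetical.

```csharp
using Unity.Entities;

// Example component: pure data, no methods. Systems read and write this;
// entities merely reference it.
public struct MoveSpeed : IComponentData
{
    public float Value;
}
```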

Organization

  • World: Contains all of the game systems
  • Entity Manager: Manages all of the entities, or things
    • One Entity Manager per World
    • All of the spawning, modifying, and destroying of entities occurs with the Entity Manager

Entity Debugger

This window needs to be accessed and docked to track the entities as they are created and deployed in the scene during runtime. Without it, the Hierarchy does not reflect the actions being undertaken by the entity manager.

image

Create Entity

image

At runtime, the Entity Debugger shows the entities currently deployed to the scene. Note that WorldTime is a default entity, created automatically by Unity, that stores the world time data. The other entity shown is empty: simply a 'thing' waiting to reference 'data'. The entity is just an identifier with an ID, while the data is held elsewhere; the entity references the data rather than actually holding it.

Traditional game objects come with data already assigned to them, like Transform and others, but ECS requires that each of these pieces of data be created and referenced individually. This compartmentalization is where the efficiencies of ECS over the traditional game object method lie, particularly as many entities and data are added to the scene.
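
As a reference, creating an empty entity like the one shown in the Entity Debugger takes only a few lines. This is a minimal sketch based on the Entities 0.x preview API in use at the time of these notes (earlier drops reached the default world through World.Active instead); the component name is hypothetical.

```csharp
using Unity.Entities;
using UnityEngine;

// Minimal sketch: grab the default world's EntityManager and create one
// empty entity. It has no component data yet and only shows up in the
// Entity Debugger while the game is running.
public class CreateEmptyEntity : MonoBehaviour
{
    void Start()
    {
        EntityManager entityManager = World.DefaultGameObjectInjectionWorld.EntityManager;
        Entity entity = entityManager.CreateEntity();
        Debug.Log("Created entity " + entity.Index);
    }
}
```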

Create Entity using Archetype

An entity created using a defined archetype shows additional data per the archetype. In this case, the archetype included Transform, Rotation, RenderMesh, RenderBounds, and LocalToWorld data, and Unity created a few others automatically.

image

Add Data to Entity

Once an entity has acquired a format with an archetype, data can be applied and linked for use by the entity. Methods include the following:

AddComponentData()
AddSharedComponentData()

With data applied and the game in run mode, the values applied via the data are evident in the entity's properties, listed in the Inspector.

Note that in this setup, the entity does not show up in the Hierarchy unless the game is in run mode.

image

Setup Summary

  1. Set up a variable reference to the World's Entity Manager
  2. Define an archetype for each type of entity
  3. Declare and initialize the entity or entities with the relevant archetypes
  4. Assign data to the entity or entities (a code sketch of these steps follows)
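
A minimal sketch of the four steps above, again based on the Entities and Hybrid Renderer preview APIs (0.x) from the time of these notes; the mesh, material, and component values are placeholders. SetComponentData is used for components already present on the archetype, while AddComponentData / AddSharedComponentData would add them to an entity that lacks them.

```csharp
using Unity.Entities;
using Unity.Mathematics;
using Unity.Rendering;
using Unity.Transforms;
using UnityEngine;

// Minimal sketch of the four setup steps listed above.
public class EntitySpawner : MonoBehaviour
{
    public Mesh mesh;
    public Material material; // must be SRP-compatible for the Hybrid Renderer

    void Start()
    {
        // 1. Reference to the default world's Entity Manager.
        EntityManager entityManager = World.DefaultGameObjectInjectionWorld.EntityManager;

        // 2. Define an archetype for this type of entity.
        EntityArchetype archetype = entityManager.CreateArchetype(
            typeof(Translation),
            typeof(Rotation),
            typeof(RenderMesh),
            typeof(RenderBounds),
            typeof(LocalToWorld));

        // 3. Create the entity from the archetype.
        Entity entity = entityManager.CreateEntity(archetype);

        // 4. Assign data to the entity. SetComponentData works here because
        //    the components already exist on the archetype.
        entityManager.SetComponentData(entity, new Translation { Value = new float3(0f, 1f, 0f) });
        entityManager.SetSharedComponentData(entity, new RenderMesh { mesh = mesh, material = material });
    }
}
```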

DOTS Visual Scripting Experimental Drop

Access Package

These experimental drops are not a Package Manager-supported package, so they will not show up in the Package Manager even with Show Preview Packages enabled. Regardless, the drops can be downloaded and installed into the recommended Unity version for testing. As of drop 10, the relevant line is provided for adding to the project manifest.json file, located in the Packages folder.

A quick way to get the latest version is to go to the forum and search for DOTS Visual Scripting Experimental Drop with Search titles only enabled and Data Oriented Technology Stack selected in the Search in Forums field. This will return only the threads relevant to DOTS Visual Scripting and the latest drop.

image

Documentation

Support Tools

ECS/DOTS Resources

Resources