Unity Editor default install location:
C:\Program Files\Unity\Hub\Editor
If applicable and available, enable GPU Instancing to help reduce draw calls. This is particularly effective when instantiating many copies of a mesh within a scene, and it is set per material definition.
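As a minimal sketch of the idea, the same checkbox can be set from script via `Material.enableInstancing` before spawning many instances (the prefab, count, and spawn radius here are hypothetical):

```csharp
using UnityEngine;

// Hypothetical spawner: enables GPU instancing on the prefab's shared material
// (equivalent to ticking "Enable GPU Instancing" on the material), then spawns
// many copies that the engine can batch into fewer draw calls.
public class InstancedSpawner : MonoBehaviour
{
    public GameObject prefab;   // assumed to use an instancing-capable shader
    public int count = 100;

    void Start()
    {
        var meshRenderer = prefab.GetComponent<MeshRenderer>();
        meshRenderer.sharedMaterial.enableInstancing = true;

        for (int i = 0; i < count; i++)
        {
            Instantiate(prefab, Random.insideUnitSphere * 10f, Random.rotation);
        }
    }
}
```

Instancing only batches renderers that share the same mesh and material, so per-instance variation should go through a MaterialPropertyBlock rather than separate material copies.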
Preview Unity Packages
Some packages are in preview mode, so access them by using the Advanced drop-down menu and enabling Show preview packages.
HDA to Unity for AR
This is my exploration of a workflow from an HDA in Houdini to Unity and augmented reality through Houdini Engine for testing HDA controls and visualizing updates. Augmented reality filmed with an iPad Pro and video/sound editing processed in After Effects.
The intent is to understand the accessibility of HDA parameters in Houdini Engine for user manipulation during Unity runtime, particularly through augmented reality. I envision this process as a way of testing HDAs before baking them into assets and scripting potential interactivity ahead of actual deployment to an independent augmented reality application.
Versions used in test:
- Houdini 18.0.460
- Houdini Engine 3.3.5
- Unity 2019.3.13f1
Test & Analysis
The first bit is a quick overview of the test followed by a narrated breakdown of the workflow. This is primarily documentation of my academic research and not intended to be a tutorial, but the workflow does detail applicable HDA scripts and libraries.
Initial HDA setup in Houdini with exposed parameter controls.
Houdini Engine control panel in Unity with parameters accessed via C# script.
UI elements and scripts in AR. The asset rotation is managed with user touch controls on the mobile device and world space UI, while the material color variation is defined through a color picker on the screen space UI.
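The touch-driven rotation described above can be sketched as a small component (an assumption of my own setup, not the exact script used: attached to the AR asset, with a horizontal drag spinning it around the Y axis):

```csharp
using UnityEngine;

// Minimal sketch of touch-based asset rotation: a one-finger horizontal drag
// rotates the object around the world Y axis.
public class TouchRotate : MonoBehaviour
{
    public float speed = 0.2f;  // degrees per pixel of drag (tuning value)

    void Update()
    {
        if (Input.touchCount == 1)
        {
            Touch touch = Input.GetTouch(0);
            if (touch.phase == TouchPhase.Moved)
            {
                // Rotate opposite the drag direction for a natural feel.
                transform.Rotate(0f, -touch.deltaPosition.x * speed, 0f, Space.World);
            }
        }
    }
}
```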
Exploration of workflow from generating a pyro sim and texture sheet composition in Houdini to a Unity particle system for use in augmented reality. My aim is to explore how sims from Houdini might capitalize on these workflows and to evaluate their performance in AR compared to other simulation solutions. Augmented reality filmed with an iPad Pro and video/sound editing processed in After Effects.
Game Tools | Pyro FX Texture Sheets | SideFX
This video shows one of the ways you could pack multiple pieces of information into a single texture. The idea is to have more control over the look of an explosion in UE4 by separating out normals, emission and opacity.
Vertex Animation Textures
Exploration of workflow from generating visual effects in Houdini to Unity for use in augmented reality. The intent is to identify optimal strategies for showcasing complex effects in augmented reality. For instance, Houdini's game development toolset includes a ROP for generating texture sheets by rendering the effect and then compositing layers within a base and normal texture, or Houdini's Vertex Animation Texture (VAT) Rendering Operator outputting to Unity's Universal Render Pipeline (URP). My aim is to explore how sims from Houdini might capitalize on these workflows and to evaluate their performance in AR compared to other simulation solutions. Augmented reality filmed with an iPad Pro and video/sound editing processed in After Effects.
Houdini's Vertex Animation Texture (VAT) Rendering Operator outputting to Unity's Universal Render Pipeline (URP) for visualizing in augmented reality. The intent was to understand the workflow within Houdini and implement its output in Unity's URP.
Game Tools | Vertex Animation Textures | SideFX
Export simulation data to texture files. This allows for very effective vertex shader playback of your high resolution simulation.
AR Virtual Buttons & Interactivity
Test for coordinating multiple targets and linking their corresponding game objects using ray casts and attraction forces with Unity's Visual Effect Graph.
Virtual Rower - VR Application
This short video documentation depicts my exploration of using the HTC Vive Tracker with the Vive system to enhance indoor rowing with a custom, interactive virtual environment. The Vive Tracker is attached to the handle of the Concept2 Indoor Rower and linked with scripts in Unity to the virtual oars for propelling the boat in the virtual environment.
HTC Vive Tracker
As an example of a great application that can be enhanced by the Vive Tracker, I am developing a virtual rower that attaches the HTC Vive Tracker to the handle of a Concept2 Indoor Rower. It uses the HTC Vive headset in conjunction with the custom virtual environment created in Unity Pro. Check out the video documentation to see it in action:
Pairing the Vive Tracker
If the Vive Tracker doesn’t immediately pair and track when SteamVR initiates and identifies the headset, lighthouse sensors, and hand controllers, then pairing it afterwards through the SteamVR pairing process appears to correctly pair and track. The Steam VR plugin for Unity expects the first 2 tracked devices to be the hand controllers, so they have to be activated for Unity to tag the Vive Tracker as the 3rd or nth tracked device after the hand controllers. This is elaborated upon in the following paragraph.
Tracked Object Assignment
Note that the Steam VR Controller Manager assigns the first 2 tracked devices as the left and right hand controllers. This means that when the Unity scene is run in game mode, the Steam VR Controller Manager first looks for the hand controllers before it identifies other tracked devices, like the Vive Tracker. In an initial test where neither of the hand controllers was visible or paired, the Unity scene in game mode assigned the Vive Tracker as one of the hand controllers, because the first tracked device it identified was interpreted as a hand controller. Again, the key is to verify that both hand controllers are activated and available for tracking so that the Vive Tracker is assigned to the 3rd or nth tracked device in the Steam VR Controller Manager.
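For reference, a sketch of pinning a tracked object to a specific device index with the SteamVR 1.x plugin's `SteamVR_TrackedObject` component (the index value is an assumption: headset at 0, controllers at 1 and 2, leaving the Tracker at 3 when both controllers are active):

```csharp
using UnityEngine;

// Sketch: attach a SteamVR_TrackedObject to this GameObject and point it at
// the device index where the Vive Tracker lands once both hand controllers
// are tracked. SteamVR assigns indices at runtime, so 3 is an assumption.
public class TrackerSetup : MonoBehaviour
{
    void Start()
    {
        var tracked = gameObject.AddComponent<SteamVR_TrackedObject>();
        tracked.SetDeviceIndex(3);  // 3rd tracked device after headset + controllers
    }
}
```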
Vive Tracker 3D Model
This asset was downloaded and integrated into the Unity scene as a placeholder and reference for positioning and orientation. In the Virtual Rower, I set it to be transparent so that it was still visible for reference, but not intrusive to the virtual environment. With the representation of the Vive Tracker 3D model in the Unity scene, it was attributed with the Steam VR Tracked Object script so that the Steam VR Controller Manager could identify it as a tracked object.
Enable post-processing via Package Manager and deploy to AR using URP.
How To Use new Post Process Layer/Volume in 2018.1 ?
I am trying to use the new Post-Processing Stack in Unity 2018.1, but the effects do NOT show in the Editor, nor do they show in the Game preview or in a Build. All of this worked fine on the same hardware in version 2017.2 using a "Post-Processing Behavior".
Unity XR Interactions
Some packages are in preview mode, so access them by using the Advanced drop-down menu and enabling Show preview packages.
Samples available: https://github.com/Unity-Technologies/arfoundation-samples
XR Plugin Management
This defines switching between platforms. An option is included to download Samples: Example XR Management implementation.
ARKit XR Plugin
Verify that this is the same version as was selected for the AR Foundation package.
XR Interaction Toolkit
Workflow - Vuforia Developer Portal
- Access the Vuforia Developer Portal to set up the database and image targets.
- Under Target Manager, create a Database.
- Within the new database, use Add Target for each target intended to be in the Unity scene.
- Verify that each target shows a Rating of at least 4 stars to confirm effectiveness as a target.
- When finished adding target images, select Download Database (All).
- Of the options that are available in the Download Database window, select Unity Editor as the development platform, and Download.
- Import the downloaded database by dragging into the Unity workspace.
Workflow - Unity
- Verify that the Vuforia database is loaded into the Unity project.
- Under Inspector, check the following for the ARCamera that is in the Scene:
- Vuforia Behaviour (Script)
- Verify that the App License Key is populated.
- Set both of the Max Simultaneous parameters to 2 or more, depending on the number of targets.
- Verify that World Center Mode is set to DEVICE_TRACKING
- Database Load Behavior (Script)
- Uncheck any database that is not in use in the current Scene.
- Check the database (Load databaseName Database) that is to be used in the Scene.
- Check Activate.
- From the Vuforia Prefabs, drag a separate ImageTarget asset for each target to be used in the Scene.
- There is no need to keep the MainCamera in the Scene.
- Verify that each ImageTarget asset is renamed for clarity and set as a Child under the ARCamera in the Scene.
- Under Inspector, check the following for each separate ImageTarget in the Scene:
- Image Target Behaviour (Script)
- Type is set to Predefined by default.
- Set the Database to the corresponding database that has the image trackers intended for the Scene.
- Select the Image Target based on which ImageTarget asset is selected in the Scene.
- Add models from Assets to the Scene and set each to be a Child to the corresponding Image Target asset in the Scene.
- Run the game to see the different trackers activate the corresponding models.
Throughout the process I encountered 2 minor challenges, both of which involved the use of Vuforia. The first was the quality of the tracker image being used for AR recognition by the mobile device camera. Vuforia’s developer portal includes a database structure for uploading these images. Part of the upload process is the evaluation of the image’s quality as a suitable tracker: the more contrasting the graphics and the more defined points there are to grasp, the higher the quality rating. After several tests, I found that a rating of 3 stars or less made the tracker very difficult to recognize, if at all. As a result, the selected images were modified until a rating of 4 or, preferably, 5 was achieved. A few Magic: The Gathering cards happened to turn out as really good trackers for my first test with Vuforia.
The other challenge emerged during the build process from Unity. The builds were processed using the Unity Cloud Build service by linking the Unity project files through a BitBucket repository that was, in turn, linked to the Unity Cloud Build platform. The following lines from the first attempt’s Unity Cloud Build log file noted the issue that needed to be resolved before the build could be completed. The Vuforia package within the Unity project has .dll files located in its plugins folder.
When these VuforiaWrapper assets were selected in Unity, the Inspector panel displayed the following Import Settings. By default, the first box (Any Platform) is checked, which appears to have limited some of the other options. After unchecking it and selecting Any OS from the OS drop-down menu under Platform Settings, the next build was processed correctly. This was with Vuforia version 6.2.6; I have not tested a more recent Vuforia package to see if the default settings have been resolved. Regardless, it was a simple resolution, thanks to the Unity Cloud Build log that clearly noted the file and location.
Verify that Vuforia is updated to the latest release via the Package Manager.
Vuforia Developer Portal |
Vuforia Engine 9.2 drops today! This release includes many quality of life improvements based on community feedback. Please be sure to check out the release notes for the full list. The Vuforia team is committed to making our developers successful in developing augmented reality applications.
Getting Started with Vuforia Engine in Unity
Add content as a child of the target. Tip: Delete the default Main Camera after adding an ARCamera. The ARCamera contains its own scene Camera. You won't need the Main Camera unless you are using it to render a specific camera view.
Build an AR application with vuforia sdk for Unity3D.
With AR technology, you can overlay digital information on the real world. So AR is a great technology for enriching media. You can overlay digital information on a real book, or overlay digital information like a video on a real newspaper, etc.
AR Deployment to iOS
- Validate Project Settings: before doing anything else, follow the guide and Update to recommended settings.
- Enable Base Internationalization: enable this and follow the guide to implement.
- Migrate Localization: migrate localizations that are deprecated.
- Set iOS Deployment Target to match what was specified in the Unity project.
- [TAB] General
- Most of this should be defined in the Unity project, but should be confirmed prior to build.
- Verify that Bundle Identifier matches what was defined in the Apple Developer account.
- [TAB] Signing & Capabilities
- Uncheck Automatically manage signing. This is enabled by default but needs to be disabled for the options to define other signing settings.
- Update Bundle Identifier as necessary to match the settings defined in the Apple Developer account.
- Choose from available Provisioning Profiles to match the one in the Apple Developer account. Import profiles from the Apple Developer account.
- Any errors that appear will clarify which setting needs to be addressed. When no errors appear, then signing is fully addressed.
- Verify that Developer settings are enabled and set to trust the linked computer.
- Issue appears to be with Alembic files.
Getting started with iOS development
Building games for devices like the iPhone and iPad requires a different approach than you would use for desktop PC games. Unlike the PC market, your target hardware is standardized and not as fast or powerful as a computer with a dedicated video card.
Building for iOS
Unity Cloud Build, a continuous integration service that automates the process of creating builds on Unity's servers, helps you automate the process of building your Unity Project for devices. This article describes the prerequisites necessary to build your Project for iOS and how to create the supporting components to configure Cloud Build.
The following is my documentation of following the tutorial guide noted in the source link above, along with additional notes from my experience in working in this setup.
In the Package Manager, enable Show preview packages and install the following, which are specific to ECS.
- Dependencies include Entities
.NET Standard 2.0
By default, the Unity project is initiated with .NET 4.x, but a warning notes that it should be set to 2.0 for ECS/DOTS to run correctly. In my initial tests attempting to run some operations with entities, Unity locked up in game mode under 4.x, but then ran without issue as soon as the project was set to 2.0. Though this does not confirm that the .NET Standard setting was the direct cause of the lockup, it did appear to resolve it.
To make the change to .NET Standard, open Project Settings, navigate to the Player section, and use the drop-down menu next to API Compatibility Level.
Entity Component System (ECS)
While the traditional MonoBehaviour allows for mixing data, actions, and methods, ECS is compartmentalized as such:
- Entities: things
- Components: data
- Systems: logic
- World: Contains all of the game systems
- Entity Manager: Manages all of the entities, or things
- One Entity Manager per World
- All of the spawning, modifying, and destroying of entities occurs with the Entity Manager
Entity Debugger
This window needs to be accessed and docked to track the entities as they are created and deployed in the scene during runtime. Without it, the Hierarchy does not reflect the actions being undertaken by the Entity Manager.
In runtime, the Entity Debugger shows the entities currently deployed to the scene. Note that WorldTime is a default entity that stores the world time data; it is created automatically by Unity. The other entity currently shown is empty, simply a 'thing' awaiting 'data' to reference. An entity is just an identifier with an ID, while the data is held elsewhere; the entity references the data rather than actually holding it.
Traditional game objects have data already assigned to them like Transform and others, but ECS requires that each of these datum be created and referenced individually. This compartmentalization is where the efficiencies lie in ECS over the traditional game object method, particularly as many entities and data are added to the scene.
Create Entity using Archetype
An entity created using a defined archetype shows additional data per the archetype. In this case, the archetype included data of Transform, Rotation, RenderMesh, RenderBounds, and LocalToWorld, and Unity created a few others automatically.
Add Data to Entity
Once an entity has acquired a format with an archetype, data can be applied and linked for use by the entity. Methods include the following:
With data applied and the game in run mode, the values applied via the data are evident in the entity's properties, listed under the Inspector.
Note that in this setup, the entity does not show up in the Hierarchy unless the game is in run mode.
- Setup variable reference to world entity manager
- Define archetype/s for each type of entity
- Declare and initialize entity/entities with relevant archetypes
- Assign data to entity/entities
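The setup summary above can be sketched as follows. This is an assumption-laden sketch against the preview Entities and Hybrid Renderer packages of that era; exact component and API names (e.g. `Translation` vs. `Transform`, `World.DefaultGameObjectInjectionWorld` vs. the older `World.Active`) vary between package versions:

```csharp
using Unity.Entities;
using Unity.Mathematics;
using Unity.Transforms;

// Sketch of the four setup steps: manager reference, archetype, entity, data.
public class EntitySetup : UnityEngine.MonoBehaviour
{
    void Start()
    {
        // 1. Set up a variable reference to the world's Entity Manager.
        EntityManager entityManager = World.DefaultGameObjectInjectionWorld.EntityManager;

        // 2. Define an archetype for this type of entity.
        EntityArchetype archetype = entityManager.CreateArchetype(
            typeof(Translation),
            typeof(Rotation),
            typeof(LocalToWorld));

        // 3. Declare and initialize an entity with the archetype.
        Entity entity = entityManager.CreateEntity(archetype);

        // 4. Assign data to the entity; the entity only references this data.
        entityManager.SetComponentData(entity, new Translation
        {
            Value = new float3(0f, 1f, 0f)
        });
    }
}
```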
DOTS Visual Scripting Experimental Drop
These experimental drops are not a Package Manager supported package, so they will not show up in the Package Manager even with Show Preview Packages enabled. Regardless, the drops can be downloaded and installed to the recommended Unity version for testing. As of drop 10, the relevant line is provided for adding to the project's manifest.json file.
A quick way to get the latest version is to go to the forum and search for DOTS Visual Scripting Experimental Drop with Search titles only enabled and Data Oriented Technology Stack selected in the Search in Forums field. This will output only those threads relevant to DOTS Visual Scripting and the latest drop.
Unity Scripting Graph Package
The Scripting Graph Package provides a node-based editor to author gameplay and logic visually in a DOTS project.
On This Page
- GPU Instancing
- Preview Unity Packages
- HDA to Unity for AR
- Test & Analysis
- Texture Sheets
- Vertex Animation Textures
- AR Virtual Buttons & Interactivity
- Virtual Rower - VR Application
- HTC Vive Tracker
- Post Processing
- Unity XR Interactions
- AR Deployment to iOS
- Xcode setup
- Initial setup
- Devices (iPhone/iPad)
- ECS/DOTS Setup
- .NET Standard 2.0
- Entity Component System (ECS)
- Entity Debugger
- Create Entity
- Create Entity using Archetype
- Add Data to Entity
- Setup Summary
- DOTS Visual Scripting Experimental Drop
- Access Package
- Support Tools
- ECS/DOTS Resources