Easy introduction to AR using ARFoundation
Thank you for your hard work.
This is Matsuyama from the Yokohama Office Development Office.
I decided to give AR a try because I thought it might lead to some work, and I'd like to leave my notes here on the blog.
Advance preparation
■ Tools and Plugins
First, prepare the environment.
I thought it would be quick to test it with Unity.
・Unity 2019.3.0b11
It's a beta version, but the AR camera was not displayed properly in the latest fixed version (2019.2.13f1), so I am using 2019.3.
Unity Hub is really useful.
*While I was writing this post, the fixed version of 2019.3 (2019.3.0f1) was released.
The AR camera works fine in that version.
・AR Plugin
・AR Foundation (2.1.4)
・ARKit XR Plugin (2.1.2) *For iOS
After creating a new project, import them from the Package Manager.
By the way, the UnityARKit Plugin that existed previously is no longer available, and it appears that AR development will now be done with AR Foundation.
This time we are only going to check the operation on iOS, so we will only import the ARKit plugin. If you also want to support Android, import the ARCore XR Plugin as well.
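For reference, after the import, the dependencies block of Packages/manifest.json should contain entries along these lines (a minimal sketch showing only the two AR packages; a real project lists many other default packages as well):

{
  "dependencies": {
    "com.unity.xr.arfoundation": "2.1.4",
    "com.unity.xr.arkit": "2.1.2"
  }
}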
Package Manager is also really useful.
■ Project Settings (iOS)
- Company Name, Product Name, and Bundle Identifier can be set however you like
- Automatically Sign is convenient; it can be configured in Preferences
- Since the camera is used, enter an appropriate description in Camera Usage Description
- Check Require ARKit support
- ARKit requires iOS 11 or later, so set Target minimum iOS Version to 11.0
- Set Architecture to ARM64
■ Build Settings
Since we are running on iOS:
- Select iOS for Platform
- Set Run in Xcode as to "Debug"
- Check Development Build
and run Switch Platform.
*What is AR Foundation?
・ARKit is for iOS
・ARCore is for Android
・AR Foundation is a multi-platform AR framework that works with both OSes
Add AR functionality to your scene
Create an appropriate scene (the default scene is also fine).
- AR Camera will be used instead of MainCamera, so delete MainCamera
- Add the following objects to the Hierarchy to provide AR functionality:
- AR Session Origin
- AR Session
It's that easy.
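Incidentally, if you want to confirm on the device that the session actually starts, a minimal sketch like the following (my own addition, not part of the steps above) logs AR Foundation's session state transitions. The class name SessionStateLogger is mine; ARSession.stateChanged is the AR Foundation event.

using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Attach to any GameObject; logs AR session state transitions for debugging.
public class SessionStateLogger : MonoBehaviour
{
    void OnEnable()
    {
        ARSession.stateChanged += OnStateChanged;
    }

    void OnDisable()
    {
        ARSession.stateChanged -= OnStateChanged;
    }

    void OnStateChanged(ARSessionStateChangedEventArgs args)
    {
        // e.g. CheckingAvailability -> SessionInitializing -> SessionTracking
        Debug.Log("AR session state: " + args.state);
    }
}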
Display the plane graphically
Set things up so that the detected plane is displayed with a black frame and a semi-transparent board, like this.
1. Generate a Plane to display as a plane (XR → AR Default Plane)
2. Convert the generated Plane into a Prefab and delete it from Hierarchy
3. Add the "AR Plane Manager" Component to the AR Session Origin so that detected planes are displayed.
4. Set the Prefab from step 2 in the "Plane Prefab" field of the AR Plane Manager Component.
5. If you only need horizontal planes, set "Detection Mode" to Horizontal.
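If you prefer, a setting like step 5 can also be applied from a script, and subscribing to the plane manager's planesChanged event is a handy way to confirm that detection is working. A hedged sketch (the class name PlaneLogger is mine; the detectionMode property is the AR Foundation 2.x name, renamed requestedDetectionMode in later versions):

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Attach alongside the AR Plane Manager on AR Session Origin.
// Logs newly detected planes so you can verify detection on the device.
[RequireComponent(typeof(ARPlaneManager))]
public class PlaneLogger : MonoBehaviour
{
    private ARPlaneManager planeManager;

    void Awake()
    {
        planeManager = GetComponent<ARPlaneManager>();
        // Equivalent to setting Detection Mode to Horizontal in the Inspector (step 5)
        planeManager.detectionMode = PlaneDetectionMode.Horizontal;
    }

    void OnEnable()
    {
        planeManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable()
    {
        planeManager.planesChanged -= OnPlanesChanged;
    }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
        {
            Debug.Log("Plane detected: " + plane.trackableId);
        }
    }
}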
Place objects in AR space
Touch the detected plane to display the object there.
・Add "AR Raycast Manager" to AR Session Origin to detect touch coordinates
・Create code to generate an object at touch coordinates
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class ARManager : MonoBehaviour
{
    // Prefab to spawn at the touched position
    [SerializeField] private GameObject objPrefab;

    private ARRaycastManager raycastMan;
    private List<ARRaycastHit> hitResults = new List<ARRaycastHit>();

    void Awake()
    {
        raycastMan = GetComponent<ARRaycastManager>();
    }

    void Update()
    {
        if (Input.touchCount > 0)
        {
            Touch touch = Input.GetTouch(0);
            // Only react when the finger is lifted
            if (touch.phase != TouchPhase.Ended)
            {
                return;
            }
            // Raycast against detected trackables; results are sorted closest-first
            if (raycastMan.Raycast(touch.position, hitResults, TrackableType.All))
            {
                Instantiate(objPrefab, hitResults[0].pose.position, hitResults[0].pose.rotation);
            }
        }
    }
}
・Attach the created class as a Component to the AR Session Origin
・Register the object you want to display in objPrefab
The object to display
A Cube or a Sphere would be fine, but I went to the trouble of finding a suitable model on the Asset Store.
I filtered by 3D and Free Assets and decided to use "Space Robot Kyle".
I imported it, created an Avatar, and set it up so that it can play "UnityChan" motions.
Since this is a hobby level, I will omit the details.
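For completeness, here is a minimal sketch of what playing a motion on the spawned model might look like once it has an Animator. This is hypothetical: it assumes the prefab carries an Animator with a humanoid Avatar and a controller containing a state named "Wave" (a made-up name).

using UnityEngine;

// Hypothetical sketch: plays an animation state on the spawned model.
public class MotionStarter : MonoBehaviour
{
    void Start()
    {
        Animator animator = GetComponent<Animator>();
        if (animator != null)
        {
            // "Wave" is a made-up state name; use whatever your controller defines
            animator.Play("Wave");
        }
    }
}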
Checking on a real device
Let's build it and check it on a real device.
・Open Build Settings and run Build
・Start the generated Xcode project
・Connect the device you want to install to and select it in Xcode
・Press ▶︎ to build & run
When executed, an object will be generated at the touched position like this.
*I put a mosaic over the face to protect privacy, but it ended up looking a bit suspicious...
A sort of summary
Setting it up was really easy.
With just these steps, you can implement something like "Monster AR" of "DQ Walk".
At first I got stuck on the Unity version (the AR camera was not displayed on the 2019.2 series), but other than that I don't think there were any major problems.
If I had to name one pain point, it's that you can't check the behavior in the Unity Editor, so having to build to the device every time was a hassle.
Next time, I'll try verifying the link between GPS and Google Maps!
Lastly
We have opened "SEKARAKU Lab", the system development service site run by the department I belong to.
Beyond offers one-stop service for everything from server design and construction to operation, so if you have any trouble with server-side development, please feel free to contact us.
SEKARAKU Lab: [https://sekarakulab.beyondjapan.com/](https://sekarakulab.beyondjapan.com/)
Well, that's all for now.