Lee Englestone has recently written a book about developing augmented reality apps using ARKit on Xamarin.iOS. It piqued my interest so I bought it. His book appears to be the best source of information for learning ARKit on Xamarin.iOS, and he has a companion website that’s pretty damn good.
After experimenting with ARKit, I’ve already forgotten some of the early lessons I learnt. So, as a reminder to my future self, I’ve decided to blog about getting to grips with ARKit.
The sample app this code comes from can be found on GitHub.
Platform setup
The first thing to note is that you’ll need a physical iPhone to run an ARKit app. ARKit requires the use of the camera, and you won’t have much joy with that in the iOS Simulator.
Deploying an ARKit app to your iPhone requires an Apple ID, and optionally an Apple Developer account. However, I created a new Apple ID that I didn’t link to an Apple Developer account. This made it possible to use free provisioning in VSMac to deploy the app to my device, without having to register for the Apple Developer Program.
The most convenient approach to learning ARKit is via a Single View app. Once you’ve created the app, you’ll need to give it permission to use the device camera by adding the following to Info.plist:
<key>NSCameraUsageDescription</key>
<string>Use camera?</string>
To check that you can deploy an app to your phone, it’s useful to create an ARKit app that simply displays the camera output. This can be accomplished by modifying the ViewController class in your Single View app:
using System;
using ARKit;
using UIKit;

namespace ARKitFun
{
    public partial class ViewController : UIViewController
    {
        readonly ARSCNView sceneView;

        public ViewController(IntPtr handle) : base(handle)
        {
            // Create the scene view that renders the camera feed, with default
            // lighting and rendering statistics enabled
            sceneView = new ARSCNView
            {
                AutoenablesDefaultLighting = true,
                ShowsStatistics = true
            };
            View.AddSubview(sceneView);
        }

        public override void ViewDidLoad()
        {
            base.ViewDidLoad();

            // Size the scene view to fill the view
            sceneView.Frame = View.Frame;
        }

        public override void ViewDidAppear(bool animated)
        {
            base.ViewDidAppear(animated);

            // Start the AR session with world tracking, discarding any state
            // from a previous session
            sceneView.Session.Run(new ARWorldTrackingConfiguration
            {
                AutoFocusEnabled = true,
                LightEstimationEnabled = true,
                WorldAlignment = ARWorldAlignment.Gravity
            }, ARSessionRunOptions.ResetTracking | ARSessionRunOptions.RemoveExistingAnchors);
        }

        public override void ViewDidDisappear(bool animated)
        {
            base.ViewDidDisappear(animated);

            // Pause the AR session while the view isn't visible
            sceneView.Session.Pause();
        }

        public override void DidReceiveMemoryWarning()
        {
            base.DidReceiveMemoryWarning();
        }
    }
}
The core type being used here is ARSCNView, an augmented reality scene view that renders SceneKit content over the live camera feed. When the session for the ARSCNView runs, it automatically sets the camera output to be the background of the view.
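Note that the code above sets the ARSCNView’s Frame once, in ViewDidLoad. That’s enough for this sample, but if you find the scene view doesn’t fill the screen after rotation or other layout changes, one option (not shown in the sample) is to update the frame in ViewDidLayoutSubviews instead:

public override void ViewDidLayoutSubviews()
{
    base.ViewDidLayoutSubviews();

    // Keep the scene view sized to the view after any layout change
    sceneView.Frame = View.Bounds;
}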
When you first attempt to run your app you’ll have to instruct your device to trust apps from you, the “untrusted” developer. You can do this in the device settings at General > Device Management > Trust developer.
When the app starts, the initial device location is registered as the world origin (X=0, Y=0, Z=0). Any objects you place in the scene will be relative to the world origin.
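To illustrate this, the following sketch (it’s not part of the sample above, just a quick illustration) adds a 10cm cube half a metre in front of the world origin. It could be called at the end of ViewDidAppear, after the session has started, and needs a using SceneKit; directive:

// Create a 10cm cube
SCNBox boxGeometry = SCNBox.Create(0.1f, 0.1f, 0.1f, 0f);

SCNMaterial material = new SCNMaterial();
material.Diffuse.Contents = UIColor.Blue;
boxGeometry.Materials = new[] { material };

// Position the cube 0.5m in front of the world origin (negative Z)
SCNNode boxNode = new SCNNode
{
    Geometry = boxGeometry,
    Position = new SCNVector3(0, 0, -0.5f)
};

// Nodes added to the scene's root node are positioned relative to the world origin
sceneView.Scene.RootNode.AddChildNode(boxNode);

With WorldAlignment set to Gravity, the negative Z axis points roughly in the direction the device was facing when the session started, so the cube should appear about half a metre in front of you.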
In my next blog post I’ll discuss overlaying an image onto the scene.