This article covers building an AR world with ARKit and implementing image detection, plane detection, and face tracking, placing virtual scenes into the real world. Blending the virtual with the real is the essence of AR.
I. How an AR scene is built from scratch
1. ARKit captures the real-world image through the device camera.
2. Apple's game engines (SceneKit for 3D, SpriteKit for 2D) load and render the virtual models. AR presentation cannot be separated from a rendering engine: without one, ARKit is no different from an ordinary camera.
3. The model is placed into the AR scene. For a virtual object to blend convincingly with the real world it needs size, distance, orientation, and so on, which rely mainly on sensor tracking plus coordinate recognition and conversion.
Sensor tracking: tracks the six degrees of freedom of real-world motion, translation along and rotation about the X, Y, and Z axes (a right-handed coordinate system, which makes the directions easy to remember).
The three translation axes determine the object's position, and hence its apparent distance and size.
The three rotation axes determine the object's orientation and the region it presents to the viewer.
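As a quick illustration of those six degrees of freedom, here is a minimal sketch using SceneKit types (the node and the values are made up for the example): translation maps onto an SCNNode's position, rotation onto its eulerAngles.

```objc
// Translation: place the node 0.2 m right, 0.1 m up, and 0.5 m in front of the origin.
// SceneKit/ARKit use a right-handed coordinate system with -Z pointing away from the camera.
SCNNode *node = [SCNNode node];
node.position = SCNVector3Make(0.2, 0.1, -0.5);
// Rotation: Euler angles in radians around X (pitch), Y (yaw), Z (roll).
node.eulerAngles = SCNVector3Make(0, M_PI_4, 0); // turn 45 degrees about the Y axis
```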

II. Building the AR world
The AR world has three essential pieces: the world tracker ARWorldTrackingConfiguration, the AR scene view ARSCNView, and the virtual scene SCNScene.
Code:
@property (nonatomic, strong) ARSCNView *sceneView;
@property (nonatomic, strong) ARWorldTrackingConfiguration *configuration; // AR world tracking
@property (nonatomic, strong) SCNScene *scene;
/**
 * Player object
 */
@property (nonatomic, strong) AVPlayer *player;
@property (nonatomic, strong) SCNNode *playerParanNode; // node that hosts the video player

#pragma mark - lazy load
- (ARSCNView *)sceneView {
    if (!_sceneView) {
        _sceneView = [[ARSCNView alloc] initWithFrame:[UIScreen mainScreen].bounds];
        _sceneView.delegate = self;
    }
    return _sceneView;
}
- (ARWorldTrackingConfiguration *)configuration {
    if (!_configuration) {
        _configuration = [[ARWorldTrackingConfiguration alloc] init];
    }
    return _configuration;
}
- (SCNScene *)scene {
    if (!_scene) {
        _scene = [[SCNScene alloc] init];
    }
    return _scene;
}
- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    [self initARSceneView];
    [self startARWorldTrackingConfiguration];
}
- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    [self.sceneView.session pause];
}
// Set up the AR scene
- (void)initARSceneView {
    self.sceneView.scene = self.scene;
    [self.view addSubview:self.sceneView];
}
// Start AR world tracking
- (void)startARWorldTrackingConfiguration {
    switch (self.arType) {
        case ARWorldTrackingConfigurationType_detectionImage: { // image detection
            if (@available(iOS 11.3, *)) {
                // Point detectionImages at the reference-image resource group
                // Note: every reference image in the group must have its physical size set, or detection will not work
                self.configuration.detectionImages = [ARReferenceImage referenceImagesInGroupNamed:@"ARDetectionImageResource" bundle:nil];
                // Start AR tracking
                [self.sceneView.session runWithConfiguration:self.configuration options:ARSessionRunOptionResetTracking | ARSessionRunOptionRemoveExistingAnchors];
            }
            break;
        }
        case ARWorldTrackingConfigurationType_planeDetection: { // plane detection
            if (@available(iOS 11.3, *)) {
                self.configuration.planeDetection = ARPlaneDetectionHorizontal; // | ARPlaneDetectionVertical;
                // Start AR tracking
                [self.sceneView.session runWithConfiguration:self.configuration options:ARSessionRunOptionResetTracking | ARSessionRunOptionRemoveExistingAnchors];
            }
            break;
        }
        case ARWorldTrackingConfigurationType_faceTracking:            // face tracking
        case ARWorldTrackingConfigurationType_faceTrackingBlendShapes: { // face tracking - blend shapes
            if (@available(iOS 11.3, *)) {
                [self.sceneView.session runWithConfiguration:self.faceConfiguration];
            }
            break;
        }
        default:
            break;
    }
}
At this point, running the project creates the AR tracker. ARWorldTrackingConfiguration brings up the camera internally, so there is no need to open the camera manually.
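One prerequisite: because ARKit drives the camera, the app's Info.plist must declare camera usage, or the session cannot start. The description string below is only an example.

```xml
<key>NSCameraUsageDescription</key>
<string>The camera is used to render the AR scene.</string>
```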
The demo project covers four tracking scenarios; ARWorldTrackingConfiguration is configured with a tracking type per scenario (see the cases above):
// AR world-tracking scenarios
typedef enum : NSUInteger {
    ARWorldTrackingConfigurationType_detectionImage,          // image detection
    ARWorldTrackingConfigurationType_planeDetection,          // plane detection
    ARWorldTrackingConfigurationType_faceTracking,            // face tracking
    ARWorldTrackingConfigurationType_faceTrackingBlendShapes, // blend-shape (expression) detection
} ARWorldTrackingConfigurationType;
Once something in the real world is being tracked, we adopt ARSCNView's ARSCNViewDelegate protocol and ARKit calls back with the tracked node information. The handling for each kind of tracking follows.
1. Image detection + video playback
After the image is detected, the callback below adds a video-player node onto the target image and starts playback (imageHC01 and 0003 are the names of the target images and video resources bundled in my project; change them as needed).
// AR world-tracking callback - fires repeatedly as tracking updates
- (void)renderer:(id<SCNSceneRenderer>)renderer didUpdateNode:(SCNNode *)node forAnchor:(ARAnchor *)anchor {
    if (self.arType == ARWorldTrackingConfigurationType_detectionImage) { // image detection
        if (![anchor isKindOfClass:[ARImageAnchor class]]) return; // only handle image anchors
        ARImageAnchor *imageAnchor = (ARImageAnchor *)anchor;
        // The reference image that was recognized
        ARReferenceImage *referenceImage = imageAnchor.referenceImage;
        if ([referenceImage.name isEqualToString:@"imageHC01"] || [referenceImage.name isEqualToString:@"0003"]) { // one of our target images
            // Pause and remove the previously added node
            [self.player pause];
            [self.playerParanNode removeFromParentNode];
            // Load the matching video resource
            [self setPlayerVideoItemWithDetectionImageName:referenceImage.name];
            // Create a node on top of the reference image
            self.playerParanNode = [SCNNode new];
            SCNBox *box = [SCNBox boxWithWidth:referenceImage.physicalSize.width height:referenceImage.physicalSize.height length:0.001 chamferRadius:0];
            self.playerParanNode.geometry = box; // a thin box attached to the node
            // Rotate the child node so it lies flat on the image (right-handed coordinates)
            self.playerParanNode.eulerAngles = SCNVector3Make(-M_PI/2, 0, 0);
            // Use the player as the box's material contents
            SCNMaterial *material = [[SCNMaterial alloc] init];
            material.diffuse.contents = self.player;
            self.playerParanNode.geometry.materials = @[material];
            // Start playback
            [self.player play];
            [node addChildNode:self.playerParanNode];
        }
    }
}
Creating the AVPlayer and loading its resources:
/**
 Player object
 @return AVPlayer
 */
- (AVPlayer *)player {
    if (!_player) {
        _player = [[AVPlayer alloc] init];
    }
    return _player;
}
// Get the video URL that matches the recognized image
- (NSURL *)getPlayVideoUrl:(NSString *)videoName {
    NSString *urlStr = [[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%@.mp4", videoName] ofType:nil];
    return urlStr ? [NSURL fileURLWithPath:urlStr] : nil;
}
// Queue the video that matches the detected image
- (void)setPlayerVideoItemWithDetectionImageName:(NSString *)imageName {
    NSURL *videoUrl = [self getPlayVideoUrl:imageName];
    if (videoUrl) {
        // Swap the item instead of recreating the player, so the lazy player instance stays valid
        [self.player replaceCurrentItemWithPlayerItem:[AVPlayerItem playerItemWithURL:videoUrl]];
    }
}
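The player above plays the video once. A common extension, sketched here under the assumption that seamless looping is wanted, is to observe AVPlayerItemDidPlayToEndTimeNotification and seek back to the start (in production code, capture self weakly inside the block to avoid a retain cycle):

```objc
// Loop the current item: when it finishes, seek back to the start and resume playback
[[NSNotificationCenter defaultCenter] addObserverForName:AVPlayerItemDidPlayToEndTimeNotification
                                                  object:self.player.currentItem
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    [self.player seekToTime:kCMTimeZero completionHandler:^(BOOL finished) {
        if (finished) { [self.player play]; }
    }];
}];
```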
Image detection in action:

2. Plane detection + model placement
To enable ARKit plane detection, set the planeDetection property on ARWorldTrackingConfiguration; both horizontal and vertical planes can be detected:
// Plane detection
self.configuration.planeDetection = ARPlaneDetectionHorizontal; // | ARPlaneDetectionVertical;
// Start AR tracking
[self.sceneView.session runWithConfiguration:self.configuration options:ARSessionRunOptionResetTracking | ARSessionRunOptionRemoveExistingAnchors];
Here a .scn model is simply placed on the detected plane; later the model can be given richer interaction, including animation and tap handling.
When a plane is detected, the callback below fires and we place the model:
// AR world-tracking callback - a node was added for a new anchor
- (void)renderer:(id<SCNSceneRenderer>)renderer didAddNode:(SCNNode *)node forAnchor:(ARAnchor *)anchor {
    if (self.arType == ARWorldTrackingConfigurationType_planeDetection) { // plane detection
        if ([anchor isMemberOfClass:[ARPlaneAnchor class]]) { // a plane was detected
            // ARKit only detects; the anchor is just a position in space - the 3D model is ours to add
            ARPlaneAnchor *planeAnchor = (ARPlaneAnchor *)anchor;
            // // Create a box as a placeholder 3D model (the detected plane is irregular, so scale it down)
            // SCNBox *planBox = [SCNBox boxWithWidth:planeAnchor.extent.x * 0.5 height:0 length:planeAnchor.extent.z * 0.5 chamferRadius:0];
            // // Render the model with a material
            // planBox.firstMaterial.diffuse.contents = [UIColor redColor];
            // // Wrap the geometry in a node
            // SCNNode *planeNode = [SCNNode nodeWithGeometry:planBox];
            // // Center the node on the detected plane's center
            // planeNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z);
            // // Attach it to the detected node (with a colored material you would see the box)
            // [node addChildNode:planeNode];

            // Load the 3D model scene (to show a custom model instead)
            SCNScene *scene = [SCNScene sceneNamed:@"art.scnassets/vase/vase.scn"];
            // Get the model node - a scene can hold many nodes but has exactly one root node
            SCNNode *modelNode = scene.rootNode.childNodes.firstObject;
            // Position the model at the detected plane's center (it defaults to the camera position)
            modelNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z);
            // Attach the custom model to the detected node
            [node addChildNode:modelNode];
        }
    }
}
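As a first step toward the tap interaction mentioned above, here is a hedged sketch (the gesture wiring and the handleTap: method name are my own additions) that converts a tap into a position on a detected plane using ARSCNView's hit testing:

```objc
// Assumed to be registered elsewhere, e.g. in initARSceneView:
// [self.sceneView addGestureRecognizer:
//     [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)]];
- (void)handleTap:(UITapGestureRecognizer *)gesture {
    CGPoint point = [gesture locationInView:self.sceneView];
    // Hit-test against detected planes, using their estimated extent
    NSArray<ARHitTestResult *> *results = [self.sceneView hitTest:point types:ARHitTestResultTypeExistingPlaneUsingExtent];
    ARHitTestResult *result = results.firstObject;
    if (!result) { return; }
    // Column 3 of worldTransform holds the hit position in world coordinates
    simd_float4 column = result.worldTransform.columns[3];
    SCNNode *modelNode = [SCNScene sceneNamed:@"art.scnassets/vase/vase.scn"].rootNode.childNodes.firstObject;
    modelNode.position = SCNVector3Make(column.x, column.y, column.z);
    [self.sceneView.scene.rootNode addChildNode:modelNode];
}
```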
Plane detection in action:

3. Face tracking + face textures
Face tracking requires a device with a TrueDepth front camera (iPhone X and later). It also differs from the tracker used for image and plane detection: it needs its own configuration, ARFaceTrackingConfiguration.
Creating the ARFaceTrackingConfiguration:
@property (nonatomic, strong) ARFaceTrackingConfiguration *faceConfiguration; // face-tracking configuration
- (ARFaceTrackingConfiguration *)faceConfiguration {
    if (!_faceConfiguration) {
        _faceConfiguration = [[ARFaceTrackingConfiguration alloc] init];
        _faceConfiguration.lightEstimationEnabled = YES;
    }
    return _faceConfiguration;
}
// Run the AR face-tracking session
if (@available(iOS 11.3, *)) {
    [self.sceneView.session runWithConfiguration:self.faceConfiguration];
}
Once a face is tracked, we attach a texture to it. As the expression changes (frowning, opening the mouth, and so on), the texture's state inside the face node must be updated continuously so it stays in sync with the face.
// Face texture node
@property (nonatomic, strong) SCNNode *faceTextureMaskNode;
// Glasses texture node (created lazily, in the same pattern as faceTextureMaskNode)
@property (nonatomic, strong) SCNNode *glassTextureNode;
- (void)renderer:(id<SCNSceneRenderer>)renderer willUpdateNode:(SCNNode *)node forAnchor:(ARAnchor *)anchor {
    if (self.arType == ARWorldTrackingConfigurationType_faceTracking) {
        if ([anchor isKindOfClass:[ARFaceAnchor class]]) { // a face was tracked
            ARFaceAnchor *faceAnchor = (ARFaceAnchor *)anchor;
            if (!_faceTextureMaskNode) { // first time: the lazy getter creates the node
                [node addChildNode:self.faceTextureMaskNode];
            }
            // Update the texture geometry in real time
            ARSCNFaceGeometry *faceGeometry = (ARSCNFaceGeometry *)self.faceTextureMaskNode.geometry;
            if ([faceGeometry isKindOfClass:[ARSCNFaceGeometry class]]) {
                [faceGeometry updateFromFaceGeometry:faceAnchor.geometry];
            }
        }
    } else if (self.arType == ARWorldTrackingConfigurationType_faceTrackingBlendShapes) { // blend-shape detection
        if ([anchor isKindOfClass:[ARFaceAnchor class]]) { // a face anchor was tracked
            ARFaceAnchor *faceAnchor = (ARFaceAnchor *)anchor;
            NSDictionary *blendShapes = faceAnchor.blendShapes;
            NSNumber *browInnerUp = blendShapes[ARBlendShapeLocationBrowInnerUp]; // how far the inner brows are raised
            if ([browInnerUp floatValue] > 0.5) {
                NSLog(@"brows raised............");
                if (!_glassTextureNode) {
                    [node addChildNode:self.glassTextureNode];
                }
                ARSCNFaceGeometry *faceGeometry = (ARSCNFaceGeometry *)self.glassTextureNode.geometry;
                if ([faceGeometry isKindOfClass:[ARSCNFaceGeometry class]]) {
                    [faceGeometry updateFromFaceGeometry:faceAnchor.geometry];
                }
            }
        }
    }
}
/**
 Face texture node
 @return SCNNode
 */
- (SCNNode *)faceTextureMaskNode {
    if (!_faceTextureMaskNode) {
        id<MTLDevice> device = self.sceneView.device;
        ARSCNFaceGeometry *geometry = [ARSCNFaceGeometry faceGeometryWithDevice:device fillMesh:NO];
        SCNMaterial *material = geometry.firstMaterial;
        material.fillMode = SCNFillModeFill;
        material.diffuse.contents = [UIImage imageNamed:@"faceTexture.jpg"];
        _faceTextureMaskNode = [SCNNode nodeWithGeometry:geometry];
        _faceTextureMaskNode.name = @"textureMask";
    }
    return _faceTextureMaskNode;
}
Face tracking with a face texture in action:

That concludes this article; interaction with 3D models in AR scenes will be explored and implemented in a follow-up. Thanks for reading!