Discover supporting concepts, features, and best practices for building great AR experiences.
Overview
Figure 1

The basic requirement for any AR experience—and the defining feature of ARKit—is the ability to create and track a correspondence between the real-world space the user inhabits and a virtual space where you can model visual content.
When your app displays that content together with a live camera image, the user experiences augmented reality: the illusion that your virtual content is part of the real world.
In all AR experiences, ARKit uses world and camera coordinate systems following a right-handed convention: the y-axis points upward, and (when relevant) the z-axis points toward the viewer and the x-axis points toward the viewer's right.
Session configurations can change the origin and orientation of the coordinate system with respect to the real world (see worldAlignment).
Each anchor in an AR session defines its own local coordinate system, also following the right-handed, z-towards-viewer convention; for example, the ARFaceAnchor class defines a system for locating facial features.
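As a minimal sketch of how these conventions come together, the snippet below configures a session whose world coordinate system is gravity-aligned, with the x-axis pointed toward compass east. The `sceneView` property is an assumption here, standing in for an ARSCNView set up elsewhere in the app.

```swift
import ARKit

// Assumes an ARSCNView named sceneView already exists in the view hierarchy.
let configuration = ARWorldTrackingConfiguration()

// worldAlignment controls the world coordinate system's origin and orientation:
// .gravity aligns the y-axis with gravity; .gravityAndHeading additionally
// points the x-axis east and the z-axis south.
configuration.worldAlignment = .gravityAndHeading

sceneView.session.run(configuration)
```

With `.gravityAndHeading`, content placed at a fixed world coordinate keeps a consistent real-world compass direction across sessions, which can matter for navigation-style experiences.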
- How World Tracking Works
To create a correspondence between real and virtual spaces, ARKit uses a technique called visual-inertial odometry.
This process combines information from the iOS device’s motion sensing hardware with computer vision analysis of the scene visible to the device’s camera.
ARKit recognizes notable features in the scene image, tracks differences in the positions of those features across video frames, and compares that information with motion sensing data.
The result is a high-precision model of the device’s position and motion.
World tracking also analyzes and understands the contents of a scene.
Use hit-testing methods (see the ARHitTestResult class) to find real-world surfaces corresponding to a point in the camera image.
If you enable the planeDetection setting in your session configuration, ARKit detects flat surfaces in the camera image and reports their position and sizes.
You can use hit-test results or detected planes to place or interact with virtual content in your scene.
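A sketch of this workflow: enable plane detection in the configuration, then hit-test a screen point against detected surfaces to find a real-world position for virtual content. The `sceneView` and `screenCenter` names are assumptions for illustration, not part of the ARKit API.

```swift
import ARKit

// Assumes an ARSCNView named sceneView and a CGPoint named screenCenter.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal   // report flat horizontal surfaces
sceneView.session.run(configuration)

// Map a 2D point in the camera image to real-world surfaces.
// existingPlaneUsingExtent restricts hits to detected planes (within their extent);
// estimatedHorizontalPlane falls back to feature-point-based estimates.
let results = sceneView.hitTest(screenCenter,
                                types: [.existingPlaneUsingExtent,
                                        .estimatedHorizontalPlane])
if let nearest = results.first {
    // Column 3 of the transform is the hit position in world coordinates.
    let position = nearest.worldTransform.columns.3
    print("Surface found at \(position)")
}
```

Results are sorted nearest-first, so taking the first hit usually gives the surface the user is pointing at.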
- Best Practices and Limitations
World tracking is an inexact science.
This process can often produce impressive accuracy, leading to realistic AR experiences.
However, it relies on details of the device’s physical environment that are not always consistent or are difficult to measure in real time without some degree of error.
To build high-quality AR experiences, be aware of these caveats and tips.
Design AR experiences for predictable lighting conditions.
World tracking involves image analysis, which requires a clear image.
Tracking quality is reduced when the camera can’t see details, such as when the camera is pointed at a blank wall or the scene is too dark.
Use tracking quality information to provide user feedback.
World tracking correlates image analysis with device motion.
ARKit develops a better understanding of the scene if the device is moving, even if the device moves only subtly.
Excessive motion—too far, too fast, or shaking too vigorously—results in a blurred image or too much distance for tracking features between video frames, reducing tracking quality.
The ARCamera class provides tracking state reason information, which you can use to develop UI that tells a user how to resolve low-quality tracking situations.
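One way to surface this feedback, sketched below with an assumed `ViewController` and `statusLabel`: observe tracking state changes through the ARSessionObserver callback and map each limited-tracking reason to a user-facing hint.

```swift
import ARKit

// Assumes a ViewController with a UILabel named statusLabel.
extension ViewController: ARSessionObserver {
    func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
        switch camera.trackingState {
        case .normal:
            statusLabel.text = ""
        case .notAvailable:
            statusLabel.text = "Tracking unavailable."
        case .limited(.excessiveMotion):
            statusLabel.text = "Try moving the device more slowly."
        case .limited(.insufficientFeatures):
            statusLabel.text = "Point the device at an area with more visible detail."
        case .limited(let reason):
            // Covers .initializing and any reasons added in later SDK versions.
            statusLabel.text = "Limited tracking: \(reason)"
        }
    }
}
```

Handling the specific `.limited` reasons before the catch-all case lets you give targeted advice (slow down, find more texture) rather than a generic warning.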
Allow time for plane detection to produce clear results, and disable plane detection when you have the results you need.
Plane detection results vary over time—when a plane is first detected, its position and extent may be inaccurate.
As the plane remains in the scene over time, ARKit refines its estimate of position and extent.
When a large flat surface is in the scene, ARKit may continue changing the plane anchor’s position, extent, and transform after you’ve already used the plane to place content.
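Once your content is placed, you can stop ARKit from further refining plane anchors by re-running the session with plane detection disabled. A minimal sketch, assuming `session` refers to the running ARSession:

```swift
import ARKit

// Assumes the running ARSession is available as `session`.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = []   // empty option set: stop detecting planes

// Running without reset options keeps world tracking state and
// existing anchors, so previously placed content stays where it is.
session.run(configuration)
```

Because no reset options are passed to `run(_:)`, previously detected plane anchors remain in the session but are no longer updated, so content anchored to them stops shifting.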