A Deep Dive into AudioToolbox.framework

What Is Core Audio?

Core Audio is the digital audio infrastructure of iOS and OS X. It includes a set of software frameworks designed to handle the audio needs in your applications. Read this chapter to learn what you can do with Core Audio.

Core Audio is tightly integrated into iOS and OS X for high performance and low latency.

In OS X, the majority of Core Audio services are layered on top of the Hardware Abstraction Layer (HAL) as shown in Figure 1-1. Audio signals pass to and from hardware through the HAL. You can access the HAL using Audio Hardware Services in the Core Audio framework when you require real-time audio. The Core MIDI (Musical Instrument Digital Interface) framework provides similar interfaces for working with MIDI data and devices.

You find Core Audio application-level services in the Audio Toolbox and Audio Unit frameworks.

Use Audio Queue Services to record, play back, pause, loop, and synchronize audio.

Use Audio File, Converter, and Codec Services to read and write from disk and to perform audio data format transformations. In OS X you can also create custom codecs.

Use Audio Unit Services and Audio Processing Graph Services (represented in the figure as “Audio units”) to host audio units (audio plug-ins) in your application. In OS X you can also create custom audio units to use in your application or to provide for use in other applications.

Use Music Sequencing Services to play MIDI-based control and music data.

Use Core Audio Clock Services for audio and MIDI synchronization and time format management.

Use System Sound Services (represented in the figure as “System sounds”) to play system sounds and user-interface sound effects.

Core Audio in iOS is optimized for the computing resources available in a battery-powered mobile platform. There is no API for services that must be managed very tightly by the operating system—specifically, the HAL and the I/O Kit. However, there are additional services in iOS not present in OS X. For example, Audio Session Services lets you manage the audio behavior of your application in the context of a device that functions as a mobile telephone and an iPod. Figure 1-2 provides a high-level view of the audio architecture in iOS.

Frameworks Available in iOS and OS X

The frameworks listed in this section are available in iOS 2.0 and OS X v10.5.

AudioToolbox.framework

The Audio Toolbox framework contains the APIs that provide application-level services. The Audio Toolbox framework includes these header files:

AudioConverter.h: Audio Converter API. Defines the interface used to create and use audio converters.

Audio File Services

Audio File Services lets you read or write audio data to and from a file or buffer. You use it in conjunction with Audio Queue Services to record or play audio. In iOS and OS X, Audio File Services consists of the functions, data types, and constants declared in the AudioFile.h header file in AudioToolbox.framework.

AudioFile.h: Defines an interface for reading and writing audio data in files.
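As a rough sketch of how Audio File Services is typically used (assuming the iOS/OS X SDK; the file path is hypothetical and most error checking is omitted), you open a file and ask for its data format:

```c
#include <AudioToolbox/AudioToolbox.h>

// Open an audio file and read its data format (a sketch; real code
// should check every OSStatus result).
static void PrintFileFormat(void) {
    CFURLRef url = CFURLCreateWithFileSystemPath(kCFAllocatorDefault,
            CFSTR("/tmp/example.aiff"),   // hypothetical path
            kCFURLPOSIXPathStyle, false);

    AudioFileID audioFile;
    OSStatus status = AudioFileOpenURL(url, kAudioFileReadPermission,
            0 /* no file-type hint */, &audioFile);

    if (status == noErr) {
        AudioStreamBasicDescription format;
        UInt32 size = sizeof(format);
        AudioFileGetProperty(audioFile, kAudioFilePropertyDataFormat,
                &size, &format);
        // format.mSampleRate, format.mChannelsPerFrame, etc.
        // now describe the file's audio data
        AudioFileClose(audioFile);
    }
    CFRelease(url);
}
```

The same AudioFileID is what you hand to Audio Queue Services when recording to or playing from the file.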

Audio File Stream Services

Audio File Stream Services lets you parse audio file streams—that is, audio data for which you don’t necessarily have access to the entire file. You can also use it to parse file data from disk, although Audio File Services is designed for that purpose.

Audio File Stream Services returns audio data and metadata to your application via callbacks, which you typically then play back using Audio Queue Services. In iOS and OS X, Audio File Stream Services consists of the functions, data types, and constants declared in the AudioFileStream.h header file in AudioToolbox.framework.

AudioFileStream.h: Defines an interface for parsing audio file streams.
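The callback-driven flow described above might be sketched like this (assuming the iOS/OS X SDK; the callback names and ParseChunk helper are hypothetical, and the callback bodies are left as comments):

```c
#include <AudioToolbox/AudioToolbox.h>

// Property listener: called when the parser discovers metadata
// such as the data format.
static void MyPropertyProc(void *clientData, AudioFileStreamID stream,
        AudioFileStreamPropertyID propertyID,
        AudioFileStreamPropertyFlags *flags) {
    // e.g. when propertyID == kAudioFileStreamProperty_DataFormat,
    // fetch the AudioStreamBasicDescription with AudioFileStreamGetProperty
}

// Packets callback: called with parsed audio packets.
static void MyPacketsProc(void *clientData, UInt32 numBytes,
        UInt32 numPackets, const void *inputData,
        AudioStreamPacketDescription *packetDescs) {
    // typically: enqueue these packets on an audio queue for playback
}

// Feed one chunk of raw bytes (e.g. from a network stream) to the parser.
static void ParseChunk(const void *bytes, UInt32 length) {
    AudioFileStreamID stream;
    AudioFileStreamOpen(NULL, MyPropertyProc, MyPacketsProc,
            kAudioFileMP3Type /* hint; 0 if unknown */, &stream);
    AudioFileStreamParseBytes(stream, length, bytes, 0);
    AudioFileStreamClose(stream);
}
```

In real streaming code the AudioFileStreamID is opened once and fed repeatedly as chunks arrive, rather than opened per chunk as in this simplified sketch.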

Audio Format Services

Audio Format Services lets you work with audio data format information. Other services, such as Audio File Services, have functions for this use as well. You use Audio Format Services when all you want to do is obtain audio data format information. In OS X, you can also use this service to get system characteristics such as the available sample rates for encoding. Audio Format Services consists of the functions, data types, and constants declared in the AudioFormat.h header file in AudioToolbox.framework.

AudioFormat.h: Defines the interface used to assign and read audio format metadata in audio files.
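A minimal sketch of querying format information: fill in an AudioStreamBasicDescription by hand (16-bit stereo linear PCM is assumed here), then ask Audio Format Services for a human-readable name of that format (error handling omitted):

```c
#include <AudioToolbox/AudioToolbox.h>

// Build a linear-PCM description, then ask Audio Format Services
// for its human-readable name.
static CFStringRef CopyFormatName(void) {
    AudioStreamBasicDescription asbd = {0};
    asbd.mFormatID         = kAudioFormatLinearPCM;
    asbd.mSampleRate       = 44100.0;
    asbd.mChannelsPerFrame = 2;
    asbd.mBitsPerChannel   = 16;
    asbd.mFormatFlags      = kAudioFormatFlagIsSignedInteger
                           | kAudioFormatFlagIsPacked;
    asbd.mFramesPerPacket  = 1;  // always 1 for linear PCM
    asbd.mBytesPerFrame    = asbd.mChannelsPerFrame
                           * (asbd.mBitsPerChannel / 8);
    asbd.mBytesPerPacket   = asbd.mBytesPerFrame;

    CFStringRef name = NULL;
    UInt32 size = sizeof(name);
    AudioFormatGetProperty(kAudioFormatProperty_FormatName,
            sizeof(asbd), &asbd, &size, &name);
    return name;  // caller releases
}
```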

Audio Queue Services

Audio Queue Services lets you play or record audio. It also lets you pause and resume playback, perform looping, and synchronize multiple channels of audio. In iOS and OS X, Audio Queue Services consists of the functions, data types, and constants declared in the AudioQueue.h header file in AudioToolbox.framework.

AudioQueue.h: Defines an interface for playing and recording audio.
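The playback model is pull-based: the queue hands your callback an empty buffer, you fill it, and you enqueue it back. A compressed sketch (assuming the iOS/OS X SDK; the callback and StartPlayback names are hypothetical, buffer filling is elided):

```c
#include <AudioToolbox/AudioToolbox.h>

// Playback callback: the queue hands back a spent buffer; refill it
// with the next audio data and re-enqueue.
static void MyOutputCallback(void *userData, AudioQueueRef queue,
        AudioQueueBufferRef buffer) {
    // Real code fills buffer->mAudioData and sets
    // buffer->mAudioDataByteSize here.
    AudioQueueEnqueueBuffer(queue, buffer, 0, NULL);
}

static void StartPlayback(const AudioStreamBasicDescription *format) {
    AudioQueueRef queue;
    AudioQueueNewOutput(format, MyOutputCallback, NULL,
            NULL, NULL, 0, &queue);  // NULL run loop: internal queue thread

    AudioQueueBufferRef buffer;
    AudioQueueAllocateBuffer(queue, 32 * 1024, &buffer);
    MyOutputCallback(NULL, queue, buffer);  // prime the first buffer

    AudioQueueStart(queue, NULL);  // NULL start time: begin immediately
}
```

Production code typically primes two or three buffers so the callback always has audio ready, which is what keeps latency and dropouts down.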


AudioServices.h: Defines three interfaces. System Sound Services lets you play short sounds and alerts. Audio Hardware Services provides a lightweight interface for interacting with audio hardware. Audio Session Services lets iPhone and iPod touch applications manage audio sessions.
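Of the three, System Sound Services is the simplest to use. A sketch of playing a short sound (the sound file path is an assumption; System Sound Services is intended for brief alert-style sounds):

```c
#include <AudioToolbox/AudioToolbox.h>

// Play a short alert-style sound via System Sound Services.
static void PlayShortSound(void) {
    CFURLRef url = CFURLCreateWithFileSystemPath(kCFAllocatorDefault,
            CFSTR("/System/Library/Sounds/Ping.aiff"),  // assumed path
            kCFURLPOSIXPathStyle, false);

    SystemSoundID soundID;
    if (AudioServicesCreateSystemSoundID(url, &soundID) == noErr) {
        AudioServicesPlaySystemSound(soundID);
        // AudioServicesDisposeSystemSoundID(soundID); // when finished
    }
    CFRelease(url);
}
```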

Audio Processing Graph Services

Audio Processing Graph Services lets you create and manipulate audio processing graphs in your application. In iOS and in OS X, it consists of the functions, data types, and constants declared in the AUGraph.h header file in AudioToolbox.framework.

AudioToolbox.h: Top-level include file for the Audio Toolbox framework.

AUGraph.h: Defines the interface used to create and use audio processing graphs.
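A minimal graph contains just an output unit; the sketch below builds one on OS X (on iOS you would use the RemoteIO subtype instead of DefaultOutput; all error checking is omitted):

```c
#include <AudioToolbox/AudioToolbox.h>

// Minimal audio processing graph: a single default-output unit.
static AUGraph MakeOutputGraph(void) {
    AUGraph graph;
    NewAUGraph(&graph);

    AudioComponentDescription desc = {0};
    desc.componentType         = kAudioUnitType_Output;
    desc.componentSubType      = kAudioUnitSubType_DefaultOutput; // OS X
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AUNode outputNode;
    AUGraphAddNode(graph, &desc, &outputNode);

    AUGraphOpen(graph);        // instantiate the audio units
    AUGraphInitialize(graph);  // prepare them for rendering
    AUGraphStart(graph);       // begin pulling audio through the graph
    return graph;
}
```

Larger graphs add effect or mixer nodes with further AUGraphAddNode calls and wire them together with AUGraphConnectNodeInput before initializing.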

ExtendedAudioFile.h: Defines the interface used to translate audio data from files directly into linear PCM, and vice versa.
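The key idea in Extended Audio File Services is the client data format: you set the linear-PCM format you want, and each read decodes into it transparently. A sketch (hypothetical ReadAsPCM helper; error checks omitted; assumes a single interleaved buffer):

```c
#include <AudioToolbox/AudioToolbox.h>

// Open a (possibly compressed) file and read it back as linear PCM.
static void ReadAsPCM(CFURLRef url, AudioStreamBasicDescription pcm) {
    ExtAudioFileRef file;
    ExtAudioFileOpenURL(url, &file);

    // Decoding to the pcm format now happens on every read.
    ExtAudioFileSetProperty(file, kExtAudioFileProperty_ClientDataFormat,
            sizeof(pcm), &pcm);

    char data[4096];
    AudioBufferList buffers;
    buffers.mNumberBuffers = 1;
    buffers.mBuffers[0].mNumberChannels = pcm.mChannelsPerFrame;
    buffers.mBuffers[0].mDataByteSize   = sizeof(data);
    buffers.mBuffers[0].mData           = data;

    UInt32 frames = sizeof(data) / pcm.mBytesPerFrame;
    ExtAudioFileRead(file, &frames, &buffers);
    // frames now holds the number of frames actually decoded

    ExtAudioFileDispose(file);
}
```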

AVFoundation.framework

The AV Foundation framework provides an Objective-C interface for playing back audio with the control needed by most applications. The AV Foundation framework in iOS includes one header file:

AVAudioPlayer.h: Defines an interface for playing audio from a file or from memory.

OpenAL.framework

The OpenAL framework provides an implementation of the OpenAL specification. This framework includes these two header files: al.h and alc.h.

In iOS you have these additional header files:

oalMacOSX_OALExtensions.h and oalStaticBufferExtension.h
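OpenAL's model is buffers of PCM data attached to positioned sources. A sketch of playing a mono sound to the listener's right (pcmData and pcmBytes are hypothetical inputs; 16-bit mono at 44.1 kHz is assumed; cleanup is omitted):

```c
#include <OpenAL/al.h>
#include <OpenAL/alc.h>

// Position a mono PCM sound to the listener's right and play it.
static void PlayPositioned(const void *pcmData, ALsizei pcmBytes) {
    ALCdevice  *device  = alcOpenDevice(NULL);           // default device
    ALCcontext *context = alcCreateContext(device, NULL);
    alcMakeContextCurrent(context);

    ALuint buffer, source;
    alGenBuffers(1, &buffer);
    alBufferData(buffer, AL_FORMAT_MONO16, pcmData, pcmBytes, 44100);

    alGenSources(1, &source);
    alSourcei(source, AL_BUFFER, buffer);
    alSource3f(source, AL_POSITION, 1.0f, 0.0f, 0.0f);   // right of listener
    alSourcePlay(source);
}
```

Only mono buffers are spatialized; stereo buffers play as-is, which is why positional game audio is typically authored in mono.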

Core Audio Frameworks

You get another view of Core Audio by considering its API frameworks, located in /System/Library/Frameworks/. This section quickly lists them to give you a sense of where to find the pieces that make up the Core Audio layers.

Take note that the Core Audio framework is not an umbrella to the other frameworks here, but rather a peer.

The Audio Toolbox framework (AudioToolbox.framework) provides interfaces for the mid- and high-level services in Core Audio. In iOS, this framework includes Audio Session Services, the interface for managing your application’s audio behavior in the context of a device that functions as a mobile phone and iPod.

The Audio Unit framework (AudioUnit.framework) lets applications work with audio plug-ins, including audio units and codecs.

The AV Foundation framework (AVFoundation.framework), available in iOS, provides the AVAudioPlayer class, a streamlined and simple Objective-C interface for audio playback.

The Core Audio framework (CoreAudio.framework) supplies data types used across Core Audio as well as interfaces for the low-level services.

The Core Audio Kit framework (CoreAudioKit.framework) provides a small API for creating user interfaces for audio units. This framework is not available in iOS.

The Core MIDI framework (CoreMIDI.framework) lets applications work with MIDI data and configure MIDI networks. This framework is not available in iOS.

The Core MIDI Server framework (CoreMIDIServer.framework) lets MIDI drivers communicate with the OS X MIDI server. This framework is not available in iOS.

The OpenAL framework (OpenAL.framework) provides the interfaces to work with OpenAL, an open source, positional audio technology.

The appendix Core Audio Frameworks describes all these frameworks, as well as each of their included header files.

最后編輯于
?著作權(quán)歸作者所有,轉(zhuǎn)載或內(nèi)容合作請聯(lián)系作者
【社區(qū)內(nèi)容提示】社區(qū)部分內(nèi)容疑似由AI輔助生成,瀏覽時請結(jié)合常識與多方信息審慎甄別。
平臺聲明:文章內(nèi)容(如有圖片或視頻亦包括在內(nèi))由作者上傳并發(fā)布,文章內(nèi)容僅代表作者本人觀點,簡書系信息發(fā)布平臺,僅提供信息存儲服務。

相關(guān)閱讀更多精彩內(nèi)容

友情鏈接更多精彩內(nèi)容