GPUImage Framework: Documentation Translation 01

Yes, I'm finally starting to write about the GPUImage framework! Let's begin with the introduction. Even with translation tools I'll have to chew through it! Looks like I'll be getting up early to study English again...

I know most of you probably won't have the patience to read the introduction! Still, to understand a framework, you really should. So let CC do it for you~~~~

If any of the translation is wrong, feel free to call it out~~~

To improve my English reading skills, I'll write up every English document I read on Jianshu in this way!

GPUImage download link

First, let me introduce GPUImage!

The GPUImage framework is a BSD-licensed iOS library that lets you apply GPU-accelerated filters and other effects to images, live camera video, and movies. In comparison to Core Image (part of iOS 5.0), GPUImage allows you to write your own custom filters, supports deployment to iOS 4.0, and has a simpler interface. However, it currently lacks some of the more advanced features of Core Image, such as facial detection.



For massively parallel operations like processing images or live video frames, GPUs have some significant performance advantages over CPUs. On an iPhone 4, a simple image filter can be over 100 times faster to perform on the GPU than an equivalent CPU-based filter.



However, running custom filters on the GPU requires a lot of code to set up and maintain an OpenGL ES 2.0 rendering target for these filters. I created a sample project to do this:


Sample project link

and found that there was a lot of boilerplate code I had to write in its creation. Therefore, I put together this framework that encapsulates a lot of the common tasks you'll encounter when processing images and video and made it so that you don't need to care about the OpenGL ES 2.0 underpinnings.



This framework compares favorably to Core Image when handling video, taking only 2.5 ms on an iPhone 4 to upload a frame from the camera, apply a gamma filter, and display, versus 106 ms for the same operation using Core Image. CPU-based processing takes 460 ms, making GPUImage 40X faster than Core Image for this operation on this hardware, and 184X faster than CPU-bound processing. On an iPhone 4S, GPUImage is only 4X faster than Core Image for this case, and 102X faster than CPU-bound processing. However, for more complex operations like Gaussian blurs at larger radii, Core Image currently outpaces GPUImage.



Technical requirements


OpenGL ES 2.0: Applications using this will not run on the original iPhone, iPhone 3G, and 1st and 2nd generation iPod touches



iOS 4.1 as a deployment target (4.0 didn't have some extensions needed for movie reading). iOS 4.3 is needed as a deployment target if you wish to show live video previews when taking a still photo.



iOS 5.0 SDK to build

Devices must have a camera to use camera-related functionality (obviously)



The framework uses automatic reference counting (ARC), but should support projects using both ARC and manual reference counting if added as a subproject as explained below. For manual reference counting applications targeting iOS 4.x, you'll need to add -fobjc-arc to the Other Linker Flags for your application project.



General architecture


GPUImage uses OpenGL ES 2.0 shaders to perform image and video manipulation much faster than could be done in CPU-bound routines. However, it hides the complexity of interacting with the OpenGL ES API in a simplified Objective-C interface. This interface lets you define input sources for images and video, attach filters in a chain, and send the resulting processed image or video to the screen, to a UIImage, or to a movie on disk.



Images or frames of video are uploaded from source objects, which are subclasses of GPUImageOutput. These include GPUImageVideoCamera (for live video from an iOS camera), GPUImageStillCamera (for taking photos with the camera), GPUImagePicture (for still images), and GPUImageMovie (for movies). Source objects upload still image frames to OpenGL ES as textures, then hand those textures off to the next objects in the processing chain.

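As a sketch of how a source object feeds the chain, the snippet below filters a still image through a sepia filter. It follows the framework's documented Objective-C API; the image asset name is a hypothetical placeholder, so treat this as illustrative rather than definitive:

```objectivec
#import "GPUImage.h"

// Create a source from a UIImage; GPUImagePicture is a GPUImageOutput subclass,
// so it uploads the image to OpenGL ES as a texture.
UIImage *inputImage = [UIImage imageNamed:@"sample.jpg"]; // hypothetical asset name
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];

// A filter acts as the next link in the chain.
GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
[stillImageSource addTarget:sepiaFilter];

// Push the texture through the chain and read the result back as a UIImage.
[stillImageSource processImage];
UIImage *filteredImage = [sepiaFilter imageFromCurrentlyProcessedOutput];
```

The same addTarget: pattern applies to every source type listed above; only the source class changes.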


Filters and other subsequent elements in the chain conform to the GPUImageInput protocol, which lets them take in the supplied or processed texture from the previous link in the chain and do something with it. Objects one step further down the chain are considered targets, and processing can be branched by adding multiple targets to a single output or filter.

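Branching, as described above, is just a matter of calling -addTarget: more than once on the same output. A minimal sketch (videoCamera is assumed to be an already-created GPUImageVideoCamera; both filter classes ship with the framework):

```objectivec
// One camera output feeding two independent filters in parallel:
GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
GPUImagePixellateFilter *pixellateFilter = [[GPUImagePixellateFilter alloc] init];

[videoCamera addTarget:sepiaFilter];     // branch 1
[videoCamera addTarget:pixellateFilter]; // branch 2
// Each branch now receives the same camera texture and processes it separately.
```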


For example, an application that takes in live video from the camera, converts that video to a sepia tone, then displays the video onscreen would set up a chain looking something like the following:


GPUImageVideoCamera -> GPUImageSepiaFilter -> GPUImageView
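In code, that chain might be wired up roughly as follows. The session preset and the cast of self.view are assumptions about the host app; the calls themselves mirror the framework's documented API:

```objectivec
GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc]
    initWithSessionPreset:AVCaptureSessionPreset640x480
           cameraPosition:AVCaptureDevicePositionBack];

GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
GPUImageView *filteredVideoView = (GPUImageView *)self.view; // assumes the controller's view is a GPUImageView

// Wire the chain: camera -> sepia filter -> onscreen view
[videoCamera addTarget:sepiaFilter];
[sepiaFilter addTarget:filteredVideoView];

[videoCamera startCameraCapture];
```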

If you enjoyed this, please tap the like button after reading. That way you'll be notified when the article is updated~~~~

