On November 9, 2015, Google Research published the post TensorFlow - Google's latest machine learning system, open sourced for everyone, officially announcing that its next-generation machine learning system had been open sourced.
As for why Google chose to open-source TensorFlow, the official explanation is:
If TensorFlow is so great, why open source it rather than keep it proprietary? The answer is simpler than you might think: We believe that machine learning is a key ingredient to the innovative products and technologies of the future. Research in this area is global and growing fast, but lacks standard tools. By sharing what we believe to be one of the best machine learning toolboxes in the world, we hope to create an open standard for exchanging research ideas and putting machine learning in products. Google engineers really do use TensorFlow in user-facing products and services, and our research group intends to share TensorFlow implementations alongside many of our research publications.
The article Here's Why Google Is Open-Sourcing Some Of Its Most Important Technology quotes one of TensorFlow's developers:
The decision to open-source was the brainchild of Jeff Dean, who felt that the company’s innovation efforts were being hampered by the slow pace of normal science. Google researchers would write a paper, which would then be discussed at a conference some months later. Months after that somebody else would write another paper building on their work.
Dean saw that open-sourcing TensorFlow could significantly accelerate the process. Rather than having to wait for the next paper or conference, Google’s researchers could actively collaborate with the scientific community in real-time. Smart people outside of Google could also improve the source code and, by sharing machine learning techniques more broadly, it would help populate the field with more technical talent.
“Having this system open sourced we’re able to collaborate with many other researchers at universities and startups, which gives us new ideas about how we can advance our technology. Since we made the decision to open-source, the code runs faster, it can do more things and it’s more flexible and convenient,” says Rajat Monga, who leads the TensorFlow team.
Unsurprisingly, the TensorFlow repo on GitHub collected a large number of stars and forks in a very short time. Both academia and industry showed enormous interest, joining the TensorFlow community to refine and improve the framework together with Google.
However, Soumith, a programmer who was running benchmarks on GitHub and now works in Facebook's AI group, published Benchmark TensorFlow (Chinese commentary), comparing TensorFlow's performance against the other mainstream deep learning frameworks; the results were underwhelming. Google's team said it would keep optimizing, and that later versions would add distributed support.
On April 13, 2016, with the post Announcing TensorFlow 0.8 – now with distributed computing support!, Google officially released TensorFlow 0.8 with distributed support. Combined with the existing CPU and GPU support, TensorFlow could finally be used in real large-scale production environments.
On April 29, 2016, DeepMind, the Google subsidiary behind the strongest Go AI to date, announced DeepMind moves to TensorFlow. This was widely regarded as a milestone for TensorFlow's adoption in industry, and it greatly boosted the enthusiasm of TensorFlow's users and researchers.
The Good, Bad & Ugly of TensorFlow (Chinese translation) gives a detailed analysis of TensorFlow's current strengths and weaknesses.
TensorFlow Learning Resources
TensorFlow uses Python as its primary interface language, so a working knowledge of Python for data science is a prerequisite for learning TensorFlow. A Complete Tutorial to Learn Data Science with Python from Scratch is a very good resource for this.
- TensorFlow official documentation, which has been very thorough since the first release.
- LearningTensorFlow: A beginners guide to a powerful framework., with detailed API definitions, a variety of learning resources, and examples.
- Hello, TensorFlow! Building and training your first TensorFlow graph from the ground up.
- A noob’s guide to implementing RNN-LSTM using Tensorflow
- Updated with Google’s TensorFlow: Artificial Intelligence, Neural Networks, and Deep Learning. Highly recommended: a detailed account of the history of AI, neural networks, and deep learning, and of the key contributions of the field's key figures.
- DeepDreaming with TensorFlow This notebook demonstrates a number of Convolutional Neural Network image generation techniques implemented with TensorFlow for fun and science
- TensorFlow Examples TensorFlow Tutorial with popular machine learning algorithms implementation. This tutorial was designed for easily diving into TensorFlow, through examples.
- TensorFlow-Tutorials Introduction to deep learning based on Google's TensorFlow framework. These tutorials are direct ports of Newmu's Theano Tutorials.
- Dive Into TensorFlow, Part II: Basic Concepts, an explanation of the basic concepts in TensorFlow.
- TensorFlow學(xué)習(xí)筆記1:入門, a series of study notes (in Chinese).
- TensorFlow人工智能引擎入門教程所有目錄, a large collection of articles on the author's experience learning and using TensorFlow.
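The basic-concepts material above centers on TensorFlow's core abstraction: you first build a dataflow graph of operations, and only then execute it. As a rough, framework-free illustration of that deferred-execution idea (plain Python, not the actual TensorFlow API; all names here are invented for the sketch):

```python
# A toy illustration of TensorFlow's "build a graph, then run it" model.
# Plain Python, NOT the real TensorFlow API.

class Node:
    """A node in a tiny dataflow graph; evaluation is deferred until run()."""
    def __init__(self, op, inputs=(), value=None):
        self.op = op          # 'const', 'add', or 'mul'
        self.inputs = inputs  # upstream Node objects
        self.value = value    # payload for constants

def const(v):
    return Node('const', value=v)

def add(a, b):
    return Node('add', (a, b))

def mul(a, b):
    return Node('mul', (a, b))

def run(node):
    """Recursively evaluate a node, much as a TensorFlow session
    evaluates the subgraph a requested tensor depends on."""
    if node.op == 'const':
        return node.value
    args = [run(i) for i in node.inputs]
    if node.op == 'add':
        return args[0] + args[1]
    if node.op == 'mul':
        return args[0] * args[1]
    raise ValueError(node.op)

# Building the graph performs no arithmetic...
y = add(mul(const(2.0), const(3.0)), const(1.0))
# ...the computation happens only when the graph is run.
print(run(y))  # 2*3 + 1 = 7.0
```

Separating graph construction from execution is what lets a system like TensorFlow optimize the graph and place operations on CPUs, GPUs, or remote machines before any computation runs.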
Deep learning is not a concept that appeared out of nowhere; it evolved from neural networks. When learning TensorFlow, a basic grasp of the deep learning field's own history helps in understanding how the technology developed. There are many excellent articles on this topic:
- [Machine Learning & Algorithm] 神經(jīng)網(wǎng)絡(luò)基礎(chǔ) (neural network fundamentals, in Chinese)
- Deep Learning in Neural Networks: An Overview
- Deep learning by Yann LeCun, Yoshua Bengio & Geoffrey Hinton
- A 'Brief' History of Neural Nets and Deep Learning 一個系列,圖文并茂,非常詳細(xì)。
- A Gentle Guide to Machine Learning 條理非常清晰。
- A Neural Network in 11 lines of Python 非常好的從頭開始實現(xiàn)一個神經(jīng)網(wǎng)絡(luò)的文章,對學(xué)習(xí)和理解神經(jīng)網(wǎng)絡(luò)中所用到的技術(shù)很有用。
- Machine Learning is Fun! Part 3: Deep Learning and Convolutional Neural Networks 系列文章,非常詳細(xì)。
- Neural Networks and Deep Learning
- Welcome to the Deep Learning Tutorial!, Stanford's deep learning tutorial.
- Learning How To Code Neural Networks
- Machine Learning in a Week, a study plan for deep learning.
- Conv-Nets-And-Gen, an article recommended by the official TensorFlow team.
- Convolutional Neural Networks backpropagation: from intuition to derivation, a detailed explanation of the backpropagation algorithm for neural networks.
- RECURRENT NEURAL NETWORKS TUTORIAL, PART 1 – INTRODUCTION TO RNNS, a tutorial series on RNNs.
- A Deep Dive into Recurrent Neural Nets, an in-depth article on RNNs.
- How to implement a neural network, a tutorial series on neural networks.
- An Interactive Node-Link Visualization of Convolutional Neural Networks, an excellent blog visualizing how neural networks work.
- How to Code and Understand DeepMind's Neural Stack Machine
- Neural Networks Demystified, a video series demystifying neural networks.
- Fundamentals of Deep Learning – Starting with Artificial Neural Network, very detailed.
- 神經(jīng)網(wǎng)絡(luò)淺講:從神經(jīng)元到深度學(xué)習(xí), one of the rare truly thorough resources available in Chinese.
- 有趣的機(jī)器學(xué)習(xí)概念縱覽:從多元擬合,神經(jīng)網(wǎng)絡(luò)到深度學(xué)習(xí),給每個感興趣的人, an entertaining Chinese overview of the history from regression fitting and neural networks to deep learning.
- 卷積神經(jīng)網(wǎng)絡(luò)CNN經(jīng)典模型整理Lenet,Alexnet,Googlenet,VGG,Deep Residual Learning, a detailed comparison of the classic CNN models.
- 反向傳播神經(jīng)網(wǎng)絡(luò)極簡入門, a "minimal" introduction to backpropagation networks (if this is minimal, how complex can backpropagation get?).
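To make the ideas in the backpropagation articles above concrete, here is a minimal sketch in plain Python, in the spirit of "A Neural Network in 11 lines of Python": a single sigmoid neuron trained by gradient descent to learn logical AND. The names and hyperparameters are illustrative, not taken from any of the listed articles.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Training data for logical AND: (inputs, target).
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

# One neuron: two weights and a bias, trained by per-example gradient descent.
w1, w2, b = 0.0, 0.0, 0.0
lr = 5.0  # learning rate (illustrative choice)

for epoch in range(2000):
    for (x1, x2), t in data:
        y = sigmoid(w1 * x1 + w2 * x2 + b)
        # Backpropagation for squared error E = (y - t)^2 / 2:
        # dE/dz = (y - t) * y * (1 - y), where z is the pre-activation sum,
        # and dE/dw_i = dE/dz * x_i by the chain rule.
        grad_z = (y - t) * y * (1.0 - y)
        w1 -= lr * grad_z * x1
        w2 -= lr * grad_z * x2
        b  -= lr * grad_z

# After training, thresholding the neuron's output reproduces AND.
preds = [round(sigmoid(w1 * x1 + w2 * x2 + b)) for (x1, x2), _ in data]
print(preds)  # [0, 0, 0, 1]
```

A single neuron suffices here because AND is linearly separable; the multi-layer networks covered by the backpropagation articles extend exactly this chain-rule computation through hidden layers.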