Google Brain Residency Program – A Review After One Year

“I came from a background in statistics, physics, and chemistry, and the Google Brain Residency Program was my first exposure to deep learning and serious programming. What made me happy was being able to choose my own research topics from a wide range of areas: computer vision, language, deep reinforcement learning, and theory. I had originally planned to do a PhD in statistics, but my experience here inspired me to enroll in the Stanford CS program starting this fall!”

In the past year, residents in the Google Brain Residency Program have published more than 30 papers at venues including ICLR, ICML, CVPR, EMNLP, RSS, GECCO, ISMIR, ISMB, Cosyne, NIPS, ICCV, BMVC, and Nature Methods, along with Distill articles on how deconvolution produces checkerboard artifacts (illustrated in the code sketch below) and on a visual handwriting generation model.
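The checkerboard effect mentioned above can be seen in a few lines of code. The following is a minimal sketch (not taken from the Distill article; the helper `transposed_conv1d` is hypothetical) showing that a strided transposed convolution covers output positions unevenly, which is the source of the artifact.

```python
import numpy as np

def transposed_conv1d(x, kernel, stride):
    """Naive 1-D transposed convolution: scatter-add each input value times the kernel."""
    out = np.zeros(len(x) * stride + len(kernel) - stride)
    for i, v in enumerate(x):
        out[i * stride : i * stride + len(kernel)] += v * kernel
    return out

x = np.ones(8)        # constant input
kernel = np.ones(3)   # kernel size 3 with stride 2: overlap is uneven
y = transposed_conv1d(x, kernel, stride=2)
print(y)              # interior alternates 1 and 2 -- the checkerboard pattern
```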


One Distill article published by the residents discusses a model for generating handwriting with a neural network.
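As a rough illustration of the idea behind such handwriting models (a sketch under assumptions, not the article's code), the network at each step emits the parameters of a 2-D Gaussian mixture over the next pen offset plus a pen-lift probability, and generation proceeds by sampling from that distribution:

```python
import numpy as np

def sample_pen_step(pi, mu, sigma, pen_up_prob, rng):
    """pi: (K,) mixture weights; mu, sigma: (K, 2) per-component mean/std of the pen offset."""
    k = rng.choice(len(pi), p=pi)          # pick a mixture component
    dx, dy = rng.normal(mu[k], sigma[k])   # sample the next pen offset
    pen_up = rng.random() < pen_up_prob    # sample the end-of-stroke event
    return dx, dy, pen_up

# Made-up parameters standing in for one step of the network's output.
rng = np.random.default_rng(0)
print(sample_pen_step(pi=np.array([0.7, 0.3]),
                      mu=np.array([[1.0, 0.0], [0.0, 1.0]]),
                      sigma=np.array([[0.1, 0.1], [0.2, 0.2]]),
                      pen_up_prob=0.05,
                      rng=rng))
```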

This system explores how robots can learn to imitate human movement through observation. For more details, see “Time-Contrastive Networks: Self-Supervised Learning from Multi-View Observation” (C. Lynch, P. Sermanet, J. Hsu, and S. Levine), accepted to a CVPR 2017 workshop.
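A minimal sketch of the time-contrastive idea, under the assumption of a triplet-style objective: embeddings of the same moment seen from two viewpoints should end up closer together than embeddings of temporally distant frames from the same viewpoint. The embeddings below are toy stand-ins for the output of a learned encoder.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    d_pos = np.sum((anchor - positive) ** 2)   # same moment, different camera view
    d_neg = np.sum((anchor - negative) ** 2)   # same view, temporally distant frame
    return max(0.0, d_pos - d_neg + margin)

# Toy embeddings standing in for the output of a learned encoder network.
anchor   = np.array([0.10, 0.90])
positive = np.array([0.12, 0.88])
negative = np.array([0.80, 0.20])
print(triplet_loss(anchor, positive, negative))   # 0.0 once the margin is satisfied
```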
This model uses reinforcement learning to optimize how computational operations are assigned to hardware devices, speeding up large-scale training of distributed deep learning networks. For more details, see “Device Placement Optimization with Reinforcement Learning” (A. Mirhoseini, H. Pham, Q. Le, B. Steiner, R. Larsen, Y. Zhou, N. Kumar, M. Norouzi, S. Bengio, and J. Dean), submitted to ICML 2017.
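The sketch below illustrates the placement-as-RL idea under toy assumptions (random operation costs, a simulated runtime, and plain REINFORCE standing in for the paper's controller): a per-operation policy samples a device assignment, the negative runtime of that placement serves as the reward, and the policy is nudged toward faster placements.

```python
import numpy as np

rng = np.random.default_rng(0)
NUM_OPS, NUM_DEVICES = 6, 2
logits = np.zeros((NUM_OPS, NUM_DEVICES))       # one softmax policy per operation
op_cost = rng.uniform(1.0, 3.0, size=NUM_OPS)   # hypothetical per-operation compute cost

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def simulated_runtime(placement):
    # Toy stand-in for measuring a real placement: the busiest device dominates.
    return max(op_cost[placement == d].sum() for d in range(NUM_DEVICES))

baseline = 0.0
for step in range(500):
    probs = np.array([softmax(row) for row in logits])
    placement = np.array([rng.choice(NUM_DEVICES, p=p) for p in probs])
    reward = -simulated_runtime(placement)      # faster placement => higher reward
    baseline = reward if step == 0 else 0.9 * baseline + 0.1 * reward
    for i, d in enumerate(placement):           # REINFORCE update, one operation at a time
        grad = -probs[i]
        grad[d] += 1.0
        logits[i] += 0.1 * (reward - baseline) * grad

best = np.argmax(logits, axis=1)
print(best, simulated_runtime(best))            # learned placement and its runtime
```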
This work automates the exploration of optimization methods (update rules) used to train deep learning architectures. The final version of “Neural Optimizer Search with Reinforcement Learning” (I. Bello, B. Zoph, V. Vasudevan, and Q. Le), submitted to ICML 2017, is coming soon.
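The following sketch conveys the flavor of such a search, with exhaustive scoring of a few hand-written primitives standing in for the paper's RL controller and its much richer search space: each candidate update rule is judged by how well it trains a toy model, and the best-scoring rule is kept.

```python
import numpy as np

# Primitive operands an update rule may use (g: gradient, m: running average of g).
PRIMITIVES = {
    "g":       lambda g, m: g,
    "sign_g":  lambda g, m: np.sign(g),
    "m":       lambda g, m: m,
    "sign_gm": lambda g, m: np.sign(g) * np.sign(m) * g,
}

def evaluate(rule_name, steps=200, lr=0.1):
    """Score a candidate rule by the final loss it reaches on a toy quadratic problem."""
    rule = PRIMITIVES[rule_name]
    w = np.array([3.0, -2.0])
    m = np.zeros_like(w)
    for _ in range(steps):
        g = 2 * w                   # gradient of the toy loss ||w||^2
        m = 0.9 * m + 0.1 * g       # running average of gradients
        w = w - lr * rule(g, m)     # apply the candidate update rule
    return float(np.sum(w ** 2))    # lower final loss => better update rule

scores = {name: evaluate(name) for name in PRIMITIVES}
print(min(scores, key=scores.get), scores)
```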

Other resident projects include:

A generic sequence-to-sequence model
Music synthesis
Human sketch drawings
Sequences sampled twice for model training
An efficient “attention” mechanism (see the sketch after this list)
Time-series analysis
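As referenced in the list above, here is a minimal sketch of the basic scaled dot-product attention computation that efficiency-oriented variants build on; it is an illustration, not the code of any specific resident project.

```python
import numpy as np

def attention(Q, K, V):
    """Q: (n_q, d), K: (n_k, d), V: (n_k, d_v) -> (n_q, d_v)."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])               # scaled dot products
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)        # row-wise softmax
    return weights @ V                                    # weighted sum of values

rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(2, 4)), rng.normal(size=(3, 4)), rng.normal(size=(3, 4))
print(attention(Q, K, V).shape)   # (2, 4)
```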