Notes from The Science of Deep Learning Colloquia

We attended The Science of Deep Learning Colloquium, which is part of the Arthur M. Sackler Colloquia of the National Academy of Sciences, US. We listened to mostly high-level but motivating and fun talks by leading researchers working on deep learning. We drafted some notes; they are very terse and might be inaccurate, but they should convey the intuition of the talks. The videos are expected to be released within two weeks on the NAS YouTube channel.

Day-1 Speakers: Amnon Shashua, Jitendra Malik, Chris Manning, Regina Barzilay, Tomaso Poggio, Oriol Vinyals, Terrence Sejnowski, Olga Troyanskaya, Kyle Cranmer, Eero Simoncelli, Bruno Olshausen, Antonio Torralba, Rodney Brooks

  • Favorites: Amnon Shashua, Kyle Cranmer, Eero Simoncelli, Olga Troyanskaya, Antonio Torralba, Rodney Brooks, Regina Barzilay

Day-2 Speakers: Tomaso Poggio, Nati Srebro, Peter Bartlett, Anders Hover, P. Jonathon Phillips, Doina Precup, Haim Sompolinsky, Ronald Coifman, Konrad Kording, Tara Sainath, Jitendra Malik, Antonio Torralba, Jon Kleinberg, Terrence Sejnowski, Isabelle Guyon, Leon Bottou

  • Favorites: Nati Srebro, Peter Bartlett

Continue reading

Complex Derivatives, Wirtinger View and the Chain Rule

Two days ago in the Julia Lab, Jarrett, Spencer, Alan and I discussed the best ways of expressing derivatives for automatic differentiation in complex-valued programs. Inspired by that discussion, I want to share my understanding of the subject and eventually present a chain rule for complex derivatives.


\mathbb{R}ealistic view: the derivative is a real number that tells you how fast a value changes with respect to a variable.

D = \frac{dy}{dx}

Continue reading
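
As a brief sketch of where the post is headed (standard Wirtinger-calculus notation, not the post's own derivation): writing z = x + iy, the Wirtinger derivatives treat z and its conjugate \bar{z} as formally independent variables,

\frac{\partial}{\partial z} = \frac{1}{2}\left(\frac{\partial}{\partial x} - i\frac{\partial}{\partial y}\right), \qquad \frac{\partial}{\partial \bar{z}} = \frac{1}{2}\left(\frac{\partial}{\partial x} + i\frac{\partial}{\partial y}\right)

and the chain rule for a composition h(z) = g(f(z)) then picks up a conjugate term:

\frac{\partial h}{\partial z} = \frac{\partial g}{\partial w}\frac{\partial f}{\partial z} + \frac{\partial g}{\partial \bar{w}}\frac{\partial \bar{f}}{\partial z}

For holomorphic f the term \partial \bar{f}/\partial z vanishes and this reduces to the ordinary chain rule.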

fMRI Data Visualization with Julia in Minutes


Recently, I needed to investigate fMRI research[1] whose data was collected from people reading a chapter of Harry Potter. The words were shown to the subjects one at a time, with the surrounding words blurred. The work demonstrated that, given hand-coded language features and visual features of two candidate words, one can distinguish which word the reader is reading at a given moment by looking at the fMRI data.

Continue reading

Gaussian Integral and A Pi Approximator

I remember how happy I was when I found the integral of the Gaussian function in my high-school years. It was an instantaneous spark that made me realize that increasing the dimension of the integration may yield a solution. Moreover, one can find a series expansion for the number π after evaluating the integral.

Gaussian Integral

What is the Gaussian integral? It is the area under the Gaussian function: I = \int_{-\infty}^{+\infty}\, e^{-\alpha x^2}\,dx
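
As a sketch of the dimension-raising trick mentioned above (the standard derivation, spelled out here for context): square the integral, switch to polar coordinates, and the closed form falls out.

I^2 = \int_{-\infty}^{+\infty}\!\int_{-\infty}^{+\infty} e^{-\alpha(x^2+y^2)}\,dx\,dy = \int_{0}^{2\pi}\!\int_{0}^{\infty} e^{-\alpha r^2}\, r\,dr\,d\theta = 2\pi \cdot \frac{1}{2\alpha} = \frac{\pi}{\alpha}

so I = \sqrt{\pi/\alpha}. With \alpha = 1 this gives I^2 = \pi, which is what makes a π approximator possible.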

Continue reading