Smart speakers are everywhere. But could we design a smart speaker to talk to us with our own voice, or that of a person we know?
A Roadmap to Intelligence
This post looks at some key ideas for artificial intelligence systems, acting as a guide to the landmarks on our path to improved computing.
Intelligence, Representations & Generative Functions
When you work for a long time with artificial neural networks, you begin to see patterns emerge. One of these touches on fundamental aspects of intelligence. As I haven't seen it described very often, I'll set it out here.
Swift Taylor Approximations
This post continues my explorations of simple intelligence. We'll consider some elementary sensing cells, then look at whether we can apply local function approximators. Sensors: Consider a set of sensing cells, which we'll call "sensors". We have N sensors, each measuring a value over time. This value could be …
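The idea of a local function approximator over a sensor value can be sketched briefly. Below is a minimal illustration (my own, not from the post) of a first-order Taylor approximation: given a sensor reading at time t0 and a numerically estimated derivative, we predict the value a small step h ahead. The `sensor` function here is a hypothetical stand-in for a real measurement.

```python
import numpy as np

def sensor(t):
    """Hypothetical smooth sensor signal as a function of time."""
    return np.sin(t)

def taylor_first_order(f, t0, h, eps=1e-5):
    """Approximate f(t0 + h) with a first-order Taylor expansion
    around t0; the derivative is estimated by central differences."""
    df = (f(t0 + eps) - f(t0 - eps)) / (2 * eps)
    return f(t0) + h * df

t0, h = 1.0, 0.1
approx = taylor_first_order(sensor, t0, h)
exact = sensor(t0 + h)
print(abs(approx - exact))  # error shrinks as h gets smaller
```

The approximation is only local: it is accurate for small h and degrades as we try to predict further ahead, which is the trade-off the post explores.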
This post looks at composition. It starts with Lego, then looks at a theory of why deep neural networks work and how they could be trained. It ends with how brains may embody these theories.
Predicting the Future
The rise of machine learning and developments in neuroscience hint that prediction is key to how brains navigate the world. But how could this work in practice?
The one where I perform some autoencoder origami to make sense of the brain.
Natural Language Processing & The Brain: What Can They Teach Each Other?
This post looks at what recent advances in natural language processing can teach us about the brain and cognition.
Sampling vs Prediction
Some things have recently been bugging me when applying deep learning models to natural language generation. This post contains my thoughts on two of these: sampling and prediction. By writing this post, I hope to tease these apart in my head and so improve my natural language models. Sampling: Sampling is the …
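The distinction the excerpt gestures at can be made concrete. Below is a small sketch of my own (not from the post): a model outputs logits over a toy vocabulary; "prediction" deterministically picks the most probable token with argmax, while "sampling" draws a token according to the softmax distribution. The vocabulary and logits are made up for illustration.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over a 1-D array of logits."""
    z = np.exp(logits - logits.max())
    return z / z.sum()

vocab = ["the", "cat", "sat"]          # toy vocabulary
logits = np.array([2.0, 1.0, 0.1])     # hypothetical model outputs
probs = softmax(logits)

# Prediction: always take the single most likely token.
predicted = vocab[int(np.argmax(probs))]

# Sampling: draw a token in proportion to its probability,
# so less likely tokens still appear some of the time.
rng = np.random.default_rng(0)
sampled = vocab[rng.choice(len(vocab), p=probs)]
print(predicted, sampled)
```

Greedy prediction gives the same output every time; sampling introduces variety at the cost of occasionally picking low-probability tokens.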
Understanding Convolution in Tensorflow
This is a quick post intended to help those trying to understand convolution as applied in Tensorflow. There are many good blog posts on the Internet explaining convolution as applied in convolutional neural networks (CNNs), e.g. see this one by Denny Britz. However, understanding the theory is one thing; knowing how to implement it is …
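As a quick illustration of what a CNN "convolution" actually computes, here is a plain-NumPy sketch of my own (not from the post). A worthwhile detail: deep learning libraries, TensorFlow included, compute cross-correlation (no kernel flip) under the name convolution; the loop below makes the sliding-window arithmetic explicit for a single-channel "valid" case.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """'Valid' 2-D cross-correlation for a single channel: slide the
    kernel over the image and take the elementwise-product sum at
    each position. (A true convolution would flip the kernel first.)"""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(16, dtype=float).reshape(4, 4)
kernel = np.array([[1.0, 0.0],
                   [0.0, -1.0]])
result = conv2d_valid(image, kernel)
print(result.shape)  # (3, 3): output shrinks by kernel_size - 1
```

With "valid" padding the output is smaller than the input by `kernel_size - 1` in each dimension, which is one of the first things that trips people up when matching shapes in a framework.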