Recently I got to thinking about statistics and determinism. By statistics I mean the study of sets of data, typically obtained by measuring a sample or a population of "things". By determinism I mean processes that always produce the same output for the same input. Often in engineering they are presented as … Continue reading Statistics and Determinism
I've often been struck by the inflexibility of neural networks. So much effort goes into training huge billion-parameter models, but these all have fixed-size inputs and outputs. What happens if you need to add another input or output? I was thinking about this in the context of the simple matrix algebra of neural … Continue reading Extending Neural Networks
Mammalian brains are horrendously complex. But there are some general patterns of organisation. If we consider a systems-level abstraction, can we learn anything about how we think?
In this post I'll sketch out some vague outlines of a better life for the new year. I'll try to do this without hypocrisy. I'll likely fail.
In this post, I go back to basics with reinforcement learning and consider the stupidest form of intelligence.
This post looks at composition. It starts with Lego. It then looks at a theory of why deep neural networks work and how they could be trained. It ends on how brains may embody these theories.
What can we learn about the brain from the common patterns in human-generated audio?
Locality. It constrains with ubiquity. But why are we unable to see it?
Ever since "On Intelligence", I've been a Jeff Hawkins fanboy. It takes a lot of guts to invest all your wealth into understanding the brain. But what are his recent theories?
It's difficult to obtain audio/video data in Python. You just want a numpy array, but how do you get one? This post presents a number of Python classes to address this issue.
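The post's classes aren't reproduced in this excerpt, but as a minimal sketch of the underlying task, here is one way to get audio into a numpy array using only the standard library's `wave` module plus numpy. The helper name `wav_to_array` is my own for illustration, and this covers only file-based PCM WAV audio, not live capture:

```python
import wave
import numpy as np

def wav_to_array(path):
    """Read a 16-bit PCM WAV file into a (frames, channels) int16 array."""
    with wave.open(path, "rb") as w:
        raw = w.readframes(w.getnframes())
        data = np.frombuffer(raw, dtype=np.int16)
        return data.reshape(-1, w.getnchannels()), w.getframerate()

# Self-contained demo: synthesise a one-second 440 Hz tone, write it out,
# then read it back as a numpy array.
rate = 8000
t = np.arange(rate) / rate
tone = (0.5 * np.sin(2 * np.pi * 440 * t) * 32767).astype(np.int16)

with wave.open("tone.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)  # 2 bytes per sample = 16-bit audio
    w.setframerate(rate)
    w.writeframes(tone.tobytes())

samples, sr = wav_to_array("tone.wav")
```

For live microphone or camera input you would need a third-party library, which is presumably where dedicated classes like those in the post earn their keep.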