Twitter rambles about explanations and AI. Maybe something to spruce up at a later date.
Category: language
Effortless NLP with spaCy
I've loved spaCy for a long time, but I've only just got my head around how to structure a text processing pipeline to take full advantage of its power.
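To make the idea concrete, here's a minimal sketch of the kind of pipeline structure the post is about, assuming spaCy 3.x with the en_core_web_sm model installed; the component and extension names ("sentence_stats", "n_sentences") are purely illustrative, not taken from the post.

```python
import spacy
from spacy.language import Language
from spacy.tokens import Doc

# Register a custom document-level attribute (name is illustrative).
Doc.set_extension("n_sentences", default=0)

@Language.component("sentence_stats")
def sentence_stats(doc):
    # A tiny custom pipeline component: count sentences and stash the result.
    doc._.n_sentences = sum(1 for _ in doc.sents)
    return doc

# Assumes the small English model is available:
#   python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")
nlp.add_pipe("sentence_stats", last=True)

doc = nlp("spaCy pipelines are just functions applied in order. Each one receives a Doc.")
print(doc._.n_sentences)  # -> 2
```

The point of structuring things this way is that each step is a plain function over a Doc, so components can be added, removed, or reordered without touching the rest of the pipeline.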
Natural Language Processing & The Brain: What Can They Teach Each Other?
This post looks at what recent advances in natural language processing can teach us about the brain and cognition.
Rambles about Language Models
"But it must be recognised that the notion of 'probability of a sentence' is an entirely useless one, under any known interpretation of this term." Chomsky (1969) Are language models a waste of time? I recently found this post in my drafts, having written it over the Christmas period in 2017. Having talked with several … Continue reading Rambles about Language Models
How do we remember characters?
I have a question for cognitive science: how do we hold in our minds the combination of characteristics that make up a particular object or character? How do we then keep that specific combination in mind, and consistent, over the span of a narrative? Consistency is a hard one. Characters may be separated by several … Continue reading How do we remember characters?
Reflections on “Meaning”
Existential “meaning” is partly the telling of a story featuring ourselves that is available and consistent with our higher-level representations of the world. It is not, generally, rational; it is more a narrative correlated with a feeling of “selfness” and “correctness”.
Getting All the Books
This is a short post explaining how to obtain over 50,000 books as plain text for your natural language processing projects. The source of these books is the excellent Project Gutenberg, which offers the ability to sync its collection of books. To obtain the collection you can set up a private mirror as explained here. … Continue reading Getting All the Books
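For reference, a sketch of the mirroring step in Python, shelling out to rsync; the host/module string and the text-only filters follow Project Gutenberg's mirroring how-to as I remember it, so check their current instructions before running (and expect the full sync to take a long time and a lot of disk).

```python
import subprocess
from pathlib import Path

# Where the local mirror will live.
mirror_dir = Path("gutenberg-mirror")
mirror_dir.mkdir(exist_ok=True)

# Pull only the plain-text files to keep the mirror smaller.
# NOTE: "aleph.gutenberg.org::gutenberg" is the rsync endpoint given in
# Project Gutenberg's mirroring guide at the time of writing; verify it
# against their current documentation.
subprocess.run(
    [
        "rsync", "-av", "--del",
        "--include", "*/",
        "--include", "*.txt",
        "--exclude", "*",
        "aleph.gutenberg.org::gutenberg",
        str(mirror_dir),
    ],
    check=True,
)
```

Re-running the same command later updates the mirror incrementally, which is the main advantage of rsync over scraping the site.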