Why the mind should be every hacker's next project


1. The mind is a computer

Most coders suspect this; most don't dig deeper. We anthropomorphise our code constantly ("It thinks the input is a string, so it got confused when it tried to use it to...") but rarely stop to wonder why the words we use to describe people's mental states, desires and intentions are so breathtakingly appropriate for describing the aggregate behaviour of so many electrons coursing through silicon. In fact, if pressed, how else would you describe what your code does? It's no coincidence - we use the same words because we are talking about the same physical activity.

We are living through the early years of a revolution. It is a revolution that was ignited by philosophers and fuelled by psychologists. But now, as this revolution reaches escape velocity, it is us, the computer scientists and hackers, who find ourselves best placed to realise its potential. That revolution is the Computational Theory of Mind and its prize offspring: Cognitive Science.

The great cognitive scientist Steven Pinker, a major influence here at Mental Technologies, describes the Computational Theory of Mind in terms any hacker should find irresistible:

Beliefs are a kind of information, thinking a kind of computation, and emotions, motives and desires are a kind of feedback mechanism in which an agent senses a difference between a current state and goal state and executes operations designed to reduce the difference.
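Pinker's feedback mechanism can be sketched in a few lines of code. This is a toy illustration of the idea only - the function name and the use of a single number for "state" are my own assumptions, not anything from the cognitive-science literature:

```python
# Toy sketch of a feedback mechanism: an agent senses the difference
# between its current state and a goal state, then executes operations
# designed to reduce that difference until none remains.

def drive_toward(current, goal, step=1):
    """Repeatedly apply whichever operation shrinks the current/goal gap."""
    history = [current]
    while current != goal:
        current += step if goal > current else -step
        history.append(current)
    return history

print(drive_toward(2, 5))  # [2, 3, 4, 5]
```

The loop is the whole point: nothing in it "knows" the answer in advance; it simply keeps acting to close the gap, which is all Pinker's definition of a motive requires.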

In the blockbuster Jurassic Park, Lex Murphy, a tween hacker tasked with fending off some raptors by activating the computerised door locks, taps keys frantically before exclaiming: "It's a Unix system. I know this!". The same euphoric familiarity should greet any hacker confronted with the Computational Theory of Mind. We relish the serendipity: the intuition that guides our mastery of computer systems can now be used to reverse-engineer the mind itself.

2. It has a data structure

The Computational Theory of Mind posits that the mind represents the world somehow, paring the over-generous flood of sensory input down to a wireframe essence - a model which is then exploited through computation. Computations on this model, scaled and aggregated, produce the cherished galaxy of beliefs, inferences, goals, techniques, plans and behaviours that make us who we are.

Models and computation? This is manifestly hacker turf. Any coder worth their salt begins a software project by modelling the domain and filling the model with data, even test data, before doing computations on that model. They strip out all properties but those needed for computation. When they model a bank customer, their name and address make it onto the whiteboard, but any dev who suggested including the customer's favourite movies would be laughed out of the room. When they model that same person for a dating website, the opposite is true. This goal-driven representation is second nature to people who code and fundamental to understanding the mind.
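The whiteboard scene above translates directly into code. A hedged sketch, with illustrative field names of my own choosing - the point is that the same person yields two different models depending on the system's goals:

```python
# Two goal-driven models of the same person. Each keeps only the
# properties its computations need and strips everything else.

from dataclasses import dataclass, field

@dataclass
class BankCustomer:
    name: str
    address: str
    balance: float  # needed for interest, overdrafts, statements

@dataclass
class DatingProfile:
    name: str
    favourite_movies: list = field(default_factory=list)  # useless to the bank, central here

alice_at_bank = BankCustomer("Alice", "1 Main St", 1200.0)
alice_dating = DatingProfile("Alice", ["Jurassic Park"])
```

Same Alice, two representations - and neither is "wrong", because each is shaped by what its owner needs to compute.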

Just as goals drive the model in software projects, so evolution's goals drive the mind's representation of the world. Psychologists identify two types of factual memory - episodic memory for remembering specific events and experiences, and semantic memory for remembering generalities (like bananas are yellow or whales eat plankton). These types of memory are an outward manifestation of the mind's representation of the world and offer good clues to evolution's goals: to distil, from the mess of sensory information, facts we can use to enhance our safety, nourishment, sexual attractiveness and status.

Now that we have an inkling of the system's goals, we can start to sketch what the representation driven by these goals looks like. First, we need to focus our attention on the right level of computation. It's easy to get distracted by neurons - they seem so important - but we hackers know that systems are sometimes best understood by getting out of the jungle and up into the fresh rare air of abstraction. 

Here's the level of computation we like. It turns out that thought itself can be broken down into a suite of combinatorial building blocks - objects, events, states, places - which, like the twelve notes on the chromatic scale, can be cobbled together endlessly to produce a space of possible thoughts far more expansive than the space of possible melodies.  How are these building blocks represented by neurons? Let that be someone else's dissertation. All we need to do is figure out how to represent them on transistors and we certainly have the tools to do that.
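Here is what representing those building blocks on transistors might start to look like. A minimal sketch, assuming a tiny hand-rolled ontology - the type names are my own illustration, not an established inventory from the literature:

```python
# A handful of primitive building blocks - objects, places, events -
# that compose combinatorially, like notes into melodies.

from dataclasses import dataclass

@dataclass(frozen=True)
class Obj:
    name: str

@dataclass(frozen=True)
class Place:
    name: str

@dataclass(frozen=True)
class Event:
    actor: Obj
    action: str
    location: Place

# A few blocks, endlessly recombinable into distinct thoughts.
man = Obj("man")
berlin = Place("Berlin")
thought = Event(man, "arrived", berlin)
another = Event(Obj("raptor"), "arrived", Place("kitchen"))
```

With even this impoverished vocabulary, the space of constructible `Event` values already outruns anything you could enumerate - which is the combinatorial point.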

When emulated on a computer, these assemblies of building blocks form something truly remarkable: a universal data structure capable of storing any thought expressible in language. This data structure is a superset of all others - every XML schema, database, JSON representation and spreadsheet out there - and can be filled with data using a tool specially designed by evolution for this very purpose: natural language.

3. It's yours to discover

The philosopher Dan Dennett, in a characteristically instructive piece of wordplay, calls our brains Necktop Computers - a coinage worth remembering. You are carrying the most sophisticated computer in the known universe with you, always. You don't need to travel. You don't need special clearance. You don't even need a password. You have constant access: you can observe, prod, play and learn any time you like - you just need to know how.

So how do we figure out this universal data structure inside each of our heads? The answer is literally in front of you: language. Pioneering work done by Leonard Talmy, Ray Jackendoff and Steven Pinker shows that the words we use are a teeming trove of hints, clues and outright no-brainers about how our minds are put together.

Again, the contents of a programmer's mental toolkit offer an advantage here. A worthwhile way for coders to think of language is as a RESTful API. I have a data structure in my head; you have one in yours. For the most part, these data structures align. We both believe, for example, that there is a company in Cupertino, California that invented the iPhone. But sometimes our data structures are out of alignment: when I tell you my name is Des, a kind of representational state transfer has happened that is strikingly similar to PUT /interlocutor/name/Des.
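The analogy can be made runnable. A naive sketch, assuming the listener's model is just a nested dict and the route format is the simple path style above - both are illustrative choices, not a claim about how minds actually store anything:

```python
# Language as a RESTful API: an utterance acts like a PUT request
# that updates the listener's model of the world.

def put(mind, path, value):
    """Apply a PUT /a/b-style update to a nested dict model."""
    *keys, last = path.strip("/").split("/")
    node = mind
    for key in keys:
        node = node.setdefault(key, {})
    node[last] = value

listener = {}
put(listener, "/interlocutor/name", "Des")  # "My name is Des"
print(listener)  # {'interlocutor': {'name': 'Des'}}
```

Before the call, the listener's model had no entry for my name; afterwards our two data structures are back in alignment - which is all the utterance was for.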

Just as you could make an accurate guess about a web API's underlying data model by studying successive API calls, these pioneering cognitive scientists have done much to elucidate the mind's data structure by studying language. For example, take a look at the following sentences:

  • The man went from Berlin to Frankfurt

  • The light went from red to green

  • The meeting went from three to four

  • The house went from the mother to the daughter

The man moves in the first but the light doesn't move in the second. The third describes a span of time whereas the fourth describes a change of possession. On the surface, these situations are very different. So why do we use the same verb to describe each one? Dig deeper and you'll see that each of these statements is almost identical: they all describe a movement through some type of space.  The first is physical space; the second is state space; the third is temporal space and the fourth is possession space.

Like a good programmer, evolution has followed the maxim of Don't Repeat Yourself and implemented a version of code re-use. Here is an example of a phenomenon seen time and time again in studying language: machinery that first evolved for one purpose (describing motion through physical space) has been repurposed to describe more abstract events (motion through state, temporal and possession space). And so we have discovered one of the building blocks of thought: a representation of motion along a path through varying types of space. The best part? Just as evolution has repurposed the same mental machinery, so can we implement that machinery using the same code to reap the same benefits - parsimonious representation of superficially different pieces of knowledge.
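Here is what that shared machinery might look like when we implement it ourselves. A sketch under stated assumptions - the `Go` structure and the space labels are my own illustration of the idea, not notation from Talmy, Jackendoff or Pinker:

```python
# One representation of motion along a path, reused across spaces -
# the same code reuse the essay attributes to evolution.

from dataclasses import dataclass

@dataclass(frozen=True)
class Go:
    theme: str   # the thing that "moves"
    source: str
    goal: str
    space: str   # physical, state, temporal or possession

sentences = [
    Go("the man", "Berlin", "Frankfurt", "physical"),
    Go("the light", "red", "green", "state"),
    Go("the meeting", "three", "four", "temporal"),
    Go("the house", "the mother", "the daughter", "possession"),
]

# Four superficially different facts, one parsimonious structure.
for s in sentences:
    print(f"{s.theme} went from {s.source} to {s.goal} ({s.space} space)")
```

Any computation written once against `Go` - querying sources, inverting paths, chaining journeys - works unchanged in all four spaces, which is exactly the payoff the verb "went" suggests evolution already collected.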

Try it for yourself. Maybe you'll notice that the prepositions on and at can be used to describe places in both space and time. Or that the verb force can be used to describe physical force (They forced the door open) and peer pressure (They forced the minister to resign). Once you start thinking about language like this, it's kind of hard to stop.

Interested? Get in contact, leave a comment or just stay tuned for more.


Des Kelleher