Language & Literacy

The Octopus

“Over cultural evolution, the human species was so pressured for increased information capacity that they invented writing, a revolutionary leap forward in the development of our species that enables information capacity to be externalized, frees up internal processing and affords the development of more complex concepts. In other words, writing enabled humans to think more abstractly and logically by increasing information capacity. Today, humans have gone to even greater lengths: the Internet, computers and smartphones are testaments to the substantial pressure humans currently face — and probably faced in the past — to increase information capacity.”

–Jessica Cantlon & Steven Piantadosi, “Uniquely human intelligence arose from expanded information capacity”

According to the authors of the paper quoted above, the capacity to process and manage vast quantities of information is a defining characteristic of human intelligence. We have extended that capacity over time by externalizing information through language, writing, and digital technology, and those externalizations have, in turn, enabled increasingly abstract and complex thought and technology.

Cantlon & Piantadosi further propose that the power of scaling is what lies behind human intelligence, and that this same power of scaling lies behind the remarkable results achieved by artificial neural networks in areas such as speech recognition, LLMs, and computer vision. These accomplishments, they argue, have come not from specialized representations and domain-specific development, but from simpler techniques combined with increased computational power and data capacity.

I think the authors may be overselling scaling as the main factor behind intelligence, but scale certainly plays a leading role alongside brain and neural network architecture and specialized data, and it also shapes how human language is used and developed.

The Potential of Scale

“LLMs give us a very effective way of accessing information from other humans.”

–Alison Gopnik in an interview with Julien Crockett in the Los Angeles Review of Books

In our previous explorations of language, cognition, and Large Language Models (LLMs), the power of scale has emerged as a recurring theme.

We've delved into the statistical nature of language, where the vast interconnectedness of word combinations and their contextual relationships drive LLMs' generative abilities. We've pondered the inherent imprecision of human language and the journey towards computational precision in LLMs. And throughout, the concept of scale has remained central – the scale of data, the scale of computation, and the scale of language itself.
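
To make that statistical picture a bit more concrete, here is a minimal sketch of the underlying idea. This is not how production LLMs actually work (they use neural networks over subword tokens, not word counts), and the tiny corpus is invented purely for illustration: count which words follow which, turn the counts into conditional probabilities, and sample from them.

```python
from collections import Counter, defaultdict
import random

# A toy corpus; real LLMs train on trillions of subword tokens, not a few sentences.
corpus = (
    "the cat sat on the mat . "
    "the dog sat on the rug . "
    "the cat chased the dog ."
).split()

# Count bigrams: how often each word follows each preceding word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word_distribution(word):
    """Turn raw counts into conditional probabilities P(next | word)."""
    counts = follows[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def generate(start="the", length=8, seed=0):
    """Generate a short continuation by repeatedly sampling the next word."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        dist = next_word_distribution(out[-1])
        if not dist:  # dead end: the last word never appears mid-corpus
            break
        words, probs = zip(*dist.items())
        out.append(rng.choices(words, weights=probs, k=1)[0])
    return " ".join(out)

print(next_word_distribution("the"))  # 'cat' and 'dog' are the most likely followers here
print(generate())
```

Scaled up from a toy bigram table to billions of parameters trained on a large slice of the written web, that same move – predict the next token from its context – is what gives LLMs their generative abilities.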

It's intriguing to consider the possibility, as this paper suggests, that the capacity to process increasing amounts of information may have been a key factor in the development of human intelligence. This idea extends to how, as a species, we have continually sought ways to expand our ability to store and access information, from the invention of writing to the development of computers, the internet, and smartphones.

This suggests that the most exciting potential of artificial neural networks such as LLMs may lie not only in their ability to respond to and generate human language, but also in their ability to help us process and manage vast quantities of information, and thus further extend our cognitive capabilities. Framed this way, the debate shifts from whether LLMs already demonstrate human intelligence, or will soon achieve superhuman intelligence, to whether LLMs will equip us with superhuman abilities. And – as always with advancements in powerful technologies – the question is who among us will gain the most from those abilities, and whether the new tools will further increase or diminish disparities between groups (i.e. “the future is already here — it's just not very evenly distributed”).

So far, then, we’ve explored a few implications of LLMs for language and literacy development: 1) LLMs gain the base for their uncanny powers from the statistical nature of language itself; 2) LLMs present us with an opportunity for further convergence between human and machine language; and 3) LLMs present us with an opportunity to further extend our cognitive abilities by allowing us to process far more information.

The Dark Side of Scale

All of this said, there is a dark side to scale, as Geoffrey West elucidates in his book, Scale (more on this on my other blog, Schools & Ecosystems): as we continue to scale our technologies, we consume far more energy and create far more waste beyond our biological needs and functions than any other creature on earth. As West describes it, we humans are energy-guzzling behemoths, using thirty times more energy than nature intended for creatures our size. Our outsized energy footprint makes our 7.3 billion population act as if it were in excess of 200 billion people. And we are hitting up against the earth’s ecological limits as we do so.
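
West’s comparison is easy to sanity-check with back-of-the-envelope arithmetic. The 90-watt metabolic figure below is a rough approximation I’m assuming for illustration, not West’s exact number, but the multiplication is the whole point:

```python
# Back-of-the-envelope check of West's comparison (rough, illustrative figures).
metabolic_watts = 90      # approximate resting metabolic power of a human body (assumed)
multiplier = 30           # rough factor by which our total energy use exceeds that baseline
population = 7.3e9        # world population figure cited above

societal_watts = metabolic_watts * multiplier    # ~2,700 W per person, technology included
effective_population = population * multiplier   # how many "merely biological" humans that equals

print(f"Energy use per person: ~{societal_watts:,} W")
print(f"Effective population: {effective_population:.1e}")  # ~2.2e11, i.e. over 200 billion
```

That is the sense in which 7.3 billion of us behave, energetically, like more than 200 billion.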

Similarly, as LLMs extend our capabilities, they consume ever more power while ingesting and producing ever more data. So at the very moment our earth is rapidly accelerating towards critical thresholds of environmental change that wreak havoc on insect, animal, soil, and plant life, we are also rapidly accelerating our consumption of energy and production of waste.

It’s hard to see a clear end in sight to this. It’s possible that the greedy demands of continuing to scale AI model training and use end up spurring the rapid development of greener technologies and accelerated efficiency in digital computation and compression. It’s just as possible that our short-sighted endeavors put a half-life on human civilization via no longer containable war, famine, disaster, and disease.

Not to end this post on such a sour note, but it is important to maintain a healthy skepticism about a new technology and its attendant powers, even as we seek to gain from it. And from what I see in the discourse, there has been a pretty healthy mix of boosterism and critique, excitement and paranoia, so I’m enjoying the ride nonetheless.

#cognition #language #AI #LLMs #technology #brains #scale

Natural digital

Regularity and irregularity. Decodable and tricky words. Learnability and surprisal. Predictability and randomness. Low entropy and high entropy.

Why do such tensions exist in human language? And in AI tools built both to write code and to use natural language, how can the precision required for computation coexist with the necessary complexity and messiness of human language?
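
For a concrete handle on those last two pairings: the surprisal of a word is −log₂ of its probability in context, and entropy is the average surprisal across a whole distribution. A minimal sketch, with made-up probabilities purely for illustration:

```python
import math

def surprisal(p):
    """Surprisal in bits: the rarer an outcome, the more surprising it is."""
    return -math.log2(p)

def entropy(dist):
    """Average surprisal of a probability distribution, in bits."""
    return sum(p * surprisal(p) for p in dist.values() if p > 0)

# Made-up next-word distributions, purely for illustration.
predictable = {"dog": 0.9, "cat": 0.05, "rug": 0.05}              # sharply peaked
uncertain = {"dog": 0.25, "cat": 0.25, "rug": 0.25, "mat": 0.25}  # flat

print(round(entropy(predictable), 2))  # ~0.57 bits: low entropy, easy to predict
print(round(entropy(uncertain), 2))    # 2.0 bits: high entropy, hard to predict
```

A sharply peaked next-word distribution is low entropy and easy to predict; a flatter one is high entropy and full of surprisal.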

Read more...

A statistical tapestry

“. . . the fact, as suggested by these findings, that semantic properties can be extracted from the formal manipulation of pure syntactic properties – that meaning can emerge from pure form – is undoubtedly one of the most stimulating ideas of our time.”

The Structure of Meaning in Language: Parallel Narratives in Linear Algebra and Category Theory

In our last post, we began exploring what Large Language Models (LLMs) and their uncanny abilities might tell us about language itself. I posited that the power of LLMs stems from the statistical nature of language.

But what is that statistical nature of language?

Read more...

In a previous post, Thinking Inside and Outside of Language, we channelled Cormac McCarthy and explored the tension between language and cognition. We dug in even further in Speaking Ourselves into Being and Others into Silence: The Power of Language, considering Plato's long-ago fears of the deceptive and distancing power of written language, and how bringing a critical consciousness to our use of language could temper unconscious biases and power dynamics.

If you find any of that interesting, I recommend How to Quiet Your Mind Chatter, a short interview in Nautilus Magazine with Ethan Kross, an experimental psychologist and neuroscientist at the University of Michigan.

Two relevant quotes:

“What we’ve learned is that language provides us with a tool for coaching ourselves through our problems like we were talking to another person. It involves using your name and other non-first person pronouns, like “you” or “he” or “she.” That’s distanced self-talk.”

“The message behind mindfulness is sometimes taken too far in the sense of 'you should always be in the moment.' The human mind didn’t evolve to always be in the moment, and we can derive enormous benefit from traveling in time, thinking about the past and future.”

Check out the full interview here.

#language #research #unconscious #cognition

Researchers with gifts

This has been a great year for education research. I thought it could be fun to review some of what has come across my own limited radar over the course of 2023.

The method I used to create this wrap-up was to go back through my Twitter timeline starting in January and pull all research-related tweets into a doc. I then sorted those by theme and ended up with several high-level buckets, with further sub-themes within and across them. Note that I didn’t also go through my Mastodon or Bluesky feeds, as this was time-consuming enough!

The rough, big-ticket research buckets I ended up with were:

  • Multilinguals and multilingualism
  • Reading
  • Morphology
  • The influence of physical or cultural environment
  • The content of teaching and learning
  • The precedence of academic skills over soft skills
  • Brain research and Artificial Neural Networks
Read more...

Innate vs developed

We have spent some time picking away at the tension in the generalizations and assumptions made around whether reading and writing development is natural or unnatural.

We continue this exploration, except now we dig into an even more fundamental aspect of human development: language. Language development is a seemingly magical evolutionary feat to which humans have uniquely adapted – or which has uniquely adapted to us – in service of the survival and thriving of our species.

Are we born with innate capacities for language baked into our brains – a 'universal grammar'? Or do we develop and hone these capacities – albeit rapidly – through exposure and use? Is it both? If so, how much is innate, and how much is developed? And in what way do these continued advancements of language and literacy across the generations enable our cognitive, cultural, and technological achievements? And in what way might they at the same time magnify the biases and base motivations of those most able to leverage power to manipulate others? In other words, how much do language and literacy bring us into a more generative engagement with ourselves and our world, and how much do they create a distance that may lead to destruction?

This journey continues in this series of posts:

#language #literacy #innate #natural #unnatural #development #cognition

Talking is just recording what you're thinking. It's not the thing itself. When I'm talking to you some separate part of my mind is composing what I'm about to say. But it's not yet in the form of words. So what is it in the form of? There's certainly no sense of some homunculus whispering to us the words we're about to say. Aside from raising the spectre of an infinite regress—as in who is whispering to the whisperer—it raises the question of a language of thought. Part of the general puzzle of how we get from the mind to the world. A hundred billion synaptic events clicking away in the dark like blind ladies at their knitting.

Stella Maris by Cormac McCarthy

OK, so let’s take stock of where we’ve been thus far in our explorations of the development of language and literacy.

We’ve spent some time poking at the notion of whether learning to read is unnatural or not, and landed on the conviction that terming it unnatural–though useful as a rhetorical device–may be less precise than recognizing that learning to read and write is more formal, abstract, and distal from the immediate context of human interaction – and thus requires more effort, instruction, and practice to master.

We then turned to the development of language and discovered that even here–despite the ubiquity and swiftness with which native languages develop anew in every child across our species–language may not be as innate and inborn as it may appear.

Both language and literacy have endowed humanity with sacred powers for the transmission and accumulation of cultural knowledge – powers that seem, as of yet, to have no ceiling beyond that of our own destruction. Whether this is natural or innate may be beside the point. What does seem clear is that we have something inherited within us that is unfurled and reified by the networks woven across our brains through storytelling, interactive dialogue, and shared book reading that connects spoken to written language – and further strengthened by the hard-won fluency we manage to achieve on our own across modalities, texts, and languages.

Read more...

the inner scaffold

The Acculturation of the Mind

There is a fertile topsoil we are born with in our brains, imprinted by the interplay of sights and sounds and movement of those who interact with us. This immersive communicative theater, felt first in the womb, roots itself within the immediacy of each moment, even while gesturing at distant realms yet unknown. Climbing towards this mystery with our tongues and thoughts and technology bends the world toward our needs, and allows us to project our inner selves into the past and future. We ride rivers and build highways across our brains. This is our cultural inheritance, our storied legacy of language and literacy.

Read more...

connection to the world

There was a fascinating summary thread I came across recently that I want to dig into, as there are some really interesting and rich areas of tension to unpack. Here’s the thread.

What especially caught my eye and made me ponder for days afterwards was this:

The language network does not overlap or build on nonlinguistic cognitive abilities . . . fMRI evidence from 32 experiments, with 64 conditions, and 761 participants across 1,007 scanning sessions suggests language is separate from thought – when processing non-linguistic stimuli other areas are activated compared to when processing linguistic stimuli . . . The language system does not share resources with other cognitive abilities.

Language is separate from thought. I really struggled to understand this . . . isn’t language how we think, whether conscious or not?

Fedorenko argues that there are properties of language that suggest it is not suitable for complex thought, but is well-suited for communication . . . For example, language processing is fundamentally predictive, something that wouldn’t be useful if language were primarily used for thought and not communication. Although the language network and other cognitive abilities seem to be distinct systems, they need to integrate in some way. Shedding light on this integration is a key direction for future research.

Where language does intersect with other cognitive systems, however, according to this presentation, is in “some exciting new research emerging that language is intimately linked with the system that supports social cognition, such as Theory of Mind.”

Another tantalizing tidbit in this thread relates to syntax and word meaning:

language does not rely on abstract syntax. Syntactic processing is distributed across the language network and “every syntax-responsive cell population or brain area is robustly sensitive to word meaning” . . . . In every region, even at the most fine-grained level of analysis shows that there are no selective responses to abstract syntactic structure – everything that responds to structure building also responds to word meaning.

Well now, I want to unpack that one a bit more! It seems to suggest that word meaning (i.e. semantics, i.e. vocabulary/morphology) is higher leverage than syntactic structure.

All of this really got me thinking about thought and cognition, about language . . . and especially about how adding in literacy — a writing system — complicates all of this . . . I mean, writing is a form of thought, right? I sometimes don’t think things, or know what I think about things, until I force myself to write them down. Do reading and writing connect cognition and language in a way that language itself does not?

In pondering this thread further, I threw out the following question on Twitter:

Is working memory a component of the executive function construct? Or an inter-related but separate domain?

I got some great food for thought in response to this query — Corey Peltier, Courtney Ostaff, and Andrew Watson confirmed that working memory is typically understood as a component of executive function — the cognitive system of thought that would appear to be distinct from language.

Lisa Archibald then went in deep on the relation between working memory and language, and it’s worth digging into her specific points, as they pose challenges to some of the claims made in the earlier thread above.

Key points she makes that I found very helpful:

  • What is activated and therefore measured depends on the nature of the task
  • Whether the brains scanned are children or adults matters, as adult brains are more specialized
  • Just as with emerging reading/writing skills, language development requires more cognitive attention until we are fluent
  • And similar to struggling readers and writers, students struggling with language (i.e. DLD / SLI) have to apply more cognitive energy to using language accurately, which makes meaning/content/thinking harder to get to

She also referred me to another thread from DLD and Me that gives a neat way of framing this as unity but diversity — i.e. there is a single pool of executive function resources (unity), but a diversity of different types of tasks to which we’re trying to apply that pool of resources (diversity).

Whew! This is heady stuff. Share your thoughts!

#language #cognition #DLD #workingmemory #executivefunction #literacy #thought #brain