There's nothing harder than making things simple
What kind of mind does it take to make language simpler?
The instinct seems backwards. Languages grow by accretion, by borrowing, bending, forgetting, and the drift is almost always toward complexity.
To reverse that trend requires something extraordinary: a theory of language, a philosophy of mind, and a near-surgical confidence in what can be discarded.
Basic English was the invention of the Cambridge linguist and philosopher C.K. Ogden, who published it in 1930. His proposition was radical: that 850 words, carefully chosen, could express everything a person needed to say in international communication. Not 850 random words, but a system: 600 nouns, 150 adjectives, and 100 “operations,” among them just 18 verbs, with “make,” “get,” and “put” doing the heavy lifting, so that “enter” becomes “go in” and “insert” becomes “put in.” Churchill endorsed it during the Second World War. H.G. Wells championed it as a tool of world peace. At its peak it had genuine institutional momentum.
Yet Basic English sits in an interesting and somewhat uncomfortable position. It is arguably the most ambitious attempt to construct an artificial vernacular, itself a near-contradiction in terms: an auxiliary language for ordinary human exchange, not for lab notation or code. Esperanto, invented by Zamenhof in the 1880s, has perhaps 2 million speakers today; Basic English has never achieved that kind of adoption. Does that make it a failure, or simply a more honest experiment?
Designing an artificial language requires psycho-technologies, learned tools of thought, that are far from obvious.
You need metalinguistic awareness, the capacity to stand outside your own tongue and observe it as a system. You need the philosophical tradition that runs from Leibniz’s characteristica universalis through Locke’s theory of signs to the logical positivists who surrounded Ogden at Cambridge. Without that inheritance, the idea of stripping language to its skeleton is literally unthinkable.
Simplification is, in this sense, harder than complexity. To add is easy; to remove without losing meaning demands a model of what meaning actually is. Ogden drew on his own theory of signs, developed with I.A. Richards in the 1923 book The Meaning of Meaning, which argued that words are not intrinsically bound to things, but are tools pointing at mental representations. If that is true, perhaps fewer tools can do the same work. But how would you know which tools to keep? And who decides?
The deeper question may be whether any artificial vernacular can truly succeed, or whether success itself is the wrong measure. Languages live in mouths, not manifestos. Basic English reached classrooms in China, India, and parts of Latin America during the 1940s. It shaped early BBC World Service broadcasts. Ogden’s word list quietly influenced Plain English movements that continue today. Is that a kind of success — distributed, diffuse, absorbed without credit?
What cultural moment makes artificial language imaginable in the first place? The 17th century gave us universal grammar theories; the 19th century gave us nationalist language revivals and colonial linguistics; the early 20th century gave us logical atomism and the dream of frictionless communication. Each era produced its own fantasy of the perfectly transparent idiom. Are we in another such moment now, watching large language models compress and translate across tongues at scale?
Perhaps the real achievement of Basic English is the question it keeps asking: what is the minimum a language needs to remain human?


