Friday, October 15, 2021

Why I haven't been posting

I have been preoccupied lately with developing a podcast which will be a part of the Electric Agora network. It is called Culture and Value and it is still a work in progress. For the present, at least, it will consist of brief monologues, scripted and spoken by me. It is meant for a general audience, one probably less well-educated in academic philosophy, linguistics or related disciplines than the typical reader of this blog would be.

There may or may not be a focus on language-related or philosophy-related questions. General social and cultural questions will be dealt with, as well as politics and geopolitics. Obviously I want to keep the tone restrained and reasonable and (as far as possible) non-partisan.

The artwork (Chinese woman in traditional dress) is meant to allude to global economic and cultural shifts and to the fact that East Asian cultures have not cut themselves off from their cultural traditions to the same extent that Western European cultures have. The neon lettering, which is part of the Electric Agora house style, adds some unexpected semiotic complexity.

The show is available now via the following audio streaming services: Stitcher, iHeartRadio, Amazon Music, Audible, Pocket Casts and Spotify. We have been having problems getting it running on Apple Podcasts (and also Google Podcasts, it seems) but we expect these issues will be sorted out. 

You can subscribe (free) to Culture and Value here. Or click on the subscribe button in the insert below.

Saturday, May 29, 2021

Conceptualizing language


[This piece was published at The Electric Agora earlier this year. Chomsky's ideas on linguistics are very polarizing and even my qualified endorsement of some of his central ideas prompted some animated exchanges.]

Complex language is a precondition not only for the kinds of interaction which characterize human societies but also for many kinds of thinking. It is both social and biological. A language only develops in a context of social continuity over an extended period of time, though it is typically learned very quickly by infants who are exposed to it. There is still controversy about the extent to which natural language is shaped and constrained by the structure and physiology of the human brain, but it is clear that the advent of complex human language was associated with genetic changes which affected various aspects of human physiology (including brain function).

What, then, is language? How should we conceptualize it? The approach I am outlining (and recommending) here is strongly idiolectal.

The term “idiolect” can be understood in different ways and taken more or less seriously in the study of language. Taken in a strong sense, it inclines us to see the individual rather than the language or linguistic community as the primary focus of study. As I see it, language only exists insofar as it is used (or instantiated) by individuals. A social context is a given. But speaking, writing, listening and reading, and the thinking (or cognitive processing) which supports these activities or impinges in some other way on linguistic forms or structures, are all things which are done by individuals or which occur within individual brains.

I don’t deny having been greatly influenced by Noam Chomsky’s ideas in my thinking about language. I was first introduced to linguistics by a former student of Chomsky’s who followed a broadly Chomskyan (though by no means doctrinaire) approach. This general approach appealed to me. Chomsky put the focus firmly on what he originally called (linguistic) “competence” (the individual speaker’s internal intuitions about grammar etc.) rather than on “performance” (as a behaviorist would). This distinction was developed over time into one between I-language and E-language. For Chomsky the focus was on the former and consequently on idiolects rather than languages.

The entry for “idiolects” in the Stanford Encyclopedia of Philosophy (credited to Alex Barber and Eduardo Garcia Ramirez) highlights the philosophical implications of the concept and some of the confusions which surround it. Those who claim that idiolects (in any strong sense of the term) do not exist or that such a notion is useless or incoherent “are nonetheless happy to use the word ‘idiolect’ to describe a person’s partial grasp of, or their pattern of deviance from, a language that is irreducibly social in nature.”

But nobody is denying the social dimension of language. Of course a language is a social product, but “partial grasp”? Of what exactly? And (as I see it) any attempt to define idiolects in terms of patterns of deviance from a norm is likely to be arbitrary or trivial unless the norm itself is defined in terms of idiolects.

In what sense does a language exist as distinct from particular instances of language use? Spoken words and written texts are generally assignable to this language or that, but precise boundaries are impossible to draw. Grammars and dictionaries try to do this but they can never reflect the constantly shifting contours of actual linguistic practice which always depend on the knowledge and behavior of individual speakers. In the final analysis, then, what we have is a set of unique and (to a greater or lesser extent) overlapping idiolects. We find it convenient, however, to group sets of idiolects into what we call dialects or languages.

“The substantial debate,” Barber and Ramirez explain, “is not over how to define [the word ‘idiolect’]. It turns, rather, on whether an idiolectal perspective on language is to be preferred to a non-idiolectal one. Someone taking an idiolectal perspective on language treats idiolects […] as having ontological or investigative priority and [sees languages as] nothing but more-or-less overlapping idiolects. […] At issue, then, is what we should take languages to be.”

They go on to explain that Chomsky does not deny that language is at least in part a social product. But he is skeptical of E-language-based approaches. The term “E-language” is used by Chomsky to refer to those things (whatever they might be) that are the target of study for those who take languages and their properties to be external to the mind.

“Chomsky’s case for introducing and using the notion of an I-language is, in the end, indistinguishable from his case for a cognitivist approach to the study of language as a natural phenomenon. And his case against E-languages is that there is no scientifically coherent project to which they belong as posits.”

Chomsky does not deny the existence of some linguistic arbitrariness (emphasized by Ferdinand de Saussure and David Lewis, for example). But he sees the core aspects of language as being constrained by the specifics of our biological nature and the (undoubted) arbitrary and contingent aspects of language as operating within these constraints.

The facts of first language acquisition arguably demonstrate this. It is clear that language learning in infants represents a special kind of learning. Infants are not like little scientists observing and inferring the linguistic conventions prevailing among adult users. And even if they were, even if they were masterminds, they would still be unable, on the basis of the fragmentary, flawed and often inconsistent evidence which the typical linguistic environment provides, to zero in on an appropriate grammar. Logically speaking, there would be countless possible languages which would be compatible with the data. (This is the “poverty of stimulus” argument.) What we see in fact is very rapid, and apparently effortless, linguistic progress. And it calls for an explanation.
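The logical point here, that a finite sample of sentences is compatible with indefinitely many grammars, can be made concrete with a small sketch. This is my own toy illustration, not an example from Chomsky or the SEP; the sample strings and candidate "grammars" are arbitrary inventions:

```python
import re

# All the "data" this toy learner ever sees:
sample = ["ab", "aabb", "aaabbb"]

def anbn(s):
    """Strings of the form a^n b^n, n >= 1 (a context-free language)."""
    n = len(s) // 2
    return len(s) % 2 == 0 and n >= 1 and s == "a" * n + "b" * n

def a_plus_b_plus(s):
    """One or more a's followed by one or more b's, counts independent (regular)."""
    return re.fullmatch(r"a+b+", s) is not None

def even_length(s):
    """Any even-length string over {a, b}."""
    return len(s) % 2 == 0 and set(s) <= {"a", "b"}

grammars = {"a^n b^n": anbn, "a+b+": a_plus_b_plus, "even length": even_length}

# Every candidate grammar is consistent with the entire sample...
for name, accepts in grammars.items():
    assert all(accepts(s) for s in sample), name

# ...yet they disagree about unseen strings such as "aab",
# so the data alone cannot single out the grammar that generated it.
print({name: accepts("aab") for name, accepts in grammars.items()})
```

Three mutually incompatible languages fit the evidence perfectly; only something beyond the data (on the nativist view, innate constraints) could explain why all learners converge on the same one.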

According to Chomsky, language acquisition can be thought of as a series of brain states, developing from an initial state, S0 [S zero], through intermediate states to a relatively stable mature state, SM.

From the SEP: "S0 is the initial state common to all humans, idealizing away from individual linguistic impairments and the like. Subsequent states arise through exposure to a particular linguistic environment. Nothing said so far requires that these states be thought of as representational states we could call “knowing a language”. […] [L]anguage acquisition can be described—usefully—as a matter of children evolving through various stages of knowledge en route to acquiring adult competence. This description is useful because the empiricist/nativist debate can now be couched as a debate over what linguistic information must already be known by someone in S0 if information supplied by the linguistic environment is to culminate in knowledge of the mature language M. Empiricists claim that nothing much is needed, that S0 is a “blank slate” to be filled in using environmental data. Nativists claim that plenty of information must already be provided, in the form of innate knowledge of a language dubbed Universal Grammar (UG) by Chomsky. We each come predisposed to acquire only certain languages, the humanly possible ones that can grow out of UG."

Despite the controversies surrounding the notion of Universal Grammar, I tend to agree with Barber and Ramirez that nothing much is added to this account, as an account of language learning, by describing it as development towards the learning of an externalistically specified social language (as opposed to some specific mature linguistic state, SM, of an individual). On this view, the primary target of investigation is the human language faculty, its nature and limits. Of course, other approaches to language are possible but, to the extent that they have scientific pretensions, they will probably be in tension with an idiolectal approach.

"Because there is considerable variety […] in the underlying conceptions of languages, Chomsky’s criticisms can seem sweeping, but the underlying thought is that, because E-languages are less “real” than I-languages, the concept [of an E-language] appears to play no role in the theory of language. […] Linguistic behaviour is the product of both the language faculty on the one hand and external influences—performance systems in the mind/brain of the individual and social factors—on the other. At issue is not whether anything at all can ever be said, usefully, about these “downstream” effects, but whether the notion of an E-language has any pivotal explanatory role to play in saying it (save as a useful shorthand)."

This is well put.

It is worth noting also that Barber and Ramirez explicitly recognize the challenges that idiolectal (or I-language-focused) approaches pose to traditional approaches in the philosophy of language.

"One apparent corollary of [Chomsky’s view of language] is significant for those many philosophers of language who have agonized over how to construct a theory of meaning for English. A common thought is that such a theory ought to take the form of a statement of the referential properties of the expressions of English—a link between words and objects in the world—from which the truth conditions of all English sentences can be derived (e.g., [Donald] Davidson, [Richard] Montague). Echoing P. F. Strawson, Chomsky suggests that referring is something people do. They use words in doing so, it is true, but referring is not something that words somehow do by themselves, through some fantastical medium, English. If referential properties of expressions amount to anything, rather than being relational properties between expressions and external objects (or “word-world” relations) they should be thought of as embodying instructions to the individual’s conceptual system, one of the performance systems with which the language faculty interfaces. If Chomsky is right, a great deal of the philosophy of language is either radically off beam or needs considerable re-interpretation."

I have barely scratched the surface here and don’t have a fully worked out position. But I am convinced that an idiolectal perspective has been and will continue to be extremely useful in the quest to develop a truer and more parsimonious account of human language.

Sunday, February 14, 2021

Epistemic relativism in a digital world

Extracts (slightly revised) from an essay of mine which appeared earlier this year at The Electric Agora under the title "Thought control and cultural decline".


[...] One consequence of the cultural and technological changes we are seeing is that the line between political activism and research and knowledge sharing has been erased – at least in many areas. Journalistic and publishing standards have plummeted, obviously. But it is the failure of the universities and other institutions of science and scholarship which is particularly galling for me. The institutions which I most respected – and to which I devoted many years of professional life – are being compromised and debased. Whatever could be politicized has been politicized and in very boring and predictable ways. The trends have been obvious for decades, but I had no idea just how fragile the commitment to science, scholarship and truth-seeking was, both within academia and in the broader community.

At the heart of the problem, as I see it, are postmodern and pragmatic views on truth and history. For whatever reason or reasons, such views have been energetically promoted by academics and school teachers, and they now pervade the broader society, contributing to the failure of political discourse. Everything is being reduced to rhetoric and the here and now.

I am not saying that there are not serious flaws in traditional ways of seeing the world. There are. But, for me, it’s a baby and bathwater thing. If we abandon the path of consilience and convergence in the realm of knowledge, we are ultimately condemning ourselves to intellectual impotence and irrelevance.

This is just rhetoric, you may say, and so it may be. I have not proved anything here. But my claim is a substantive one. It is a claim about the past and a prediction concerning the fate of cultures which abandon traditional epistemic values. The basic idea or intuition behind it is that epistemic relativism facilitates ideological fantasies which in turn lead to a disconnect between cultural and economic realities.

With respect to our future, time will tell. The intellectual fashions of which I speak could conceivably pass as the generation that has promoted and popularized them slowly dies off. But the signs are not good. Great damage has been done.

Maybe these tendencies of thought (mutating now in grotesque and seemingly crazy ways) are best seen as epiphenomena driven by economic forces or by the physics of complex systems. Maybe it doesn’t even make sense to claim that certain ways of thinking caused or contributed to this or that. Nonetheless, we can always observe and describe. Clarity and perspicacity are possible even when we cannot see into the heart of things or identify the root causes of the changes which we observe.

[...]

The advent of digital technologies has compounded the problem and given added traction to relativistic and ahistorical modes of thought. Traditional scholarship was developed within the context of linguistically sophisticated cultures and centered around paper documents which persisted over time. A sense of history was built in. You held firm evidence of the past in your hand, and many of the most important and influential fields of scholarly research dealt with historical (and historico-linguistic) questions. Objective evidence was painstakingly marshalled and deployed. And, overall and over time, epistemic progress occurred. There was a convergence of views on central questions.

The savagery with which scholars were treated during Mao’s Cultural Revolution or by the Khmer Rouge can be seen in this context. It showed how seriously scholars were still taken in the Far East less than fifty years ago. They were perceived as a real threat to forcibly imposed radical ideas.

We are now in an utterly different world, of course. Today’s all-enveloping digital environment makes it relatively easy for history to be rewritten, for minds to be molded at will and at scale. What could be more insidious – and more destructive of individual autonomy – than opaque, monopolistic systems skewing searches for information in order to promote particular views and agendas, and using AI to monitor personal communications and manipulate what people will and won’t see on their screens?

I would like to resist, but don’t see how I could do so effectively. Institutions of learning have been hijacked by ideologues and self-serving bureaucrats. More generally, the common ground which makes effective discourse possible has all but disappeared.

What, then, can one do? Join a rhetorical battle which by its very nature will never result in a clear or decisive outcome? Or withdraw, watch and wait – while cultivating one’s garden?


Thursday, August 6, 2020

Literature, cinema and truth



Is there any point in trying to set out one's personal criteria for judging fiction, plays and films? I think there is. For me, good writing (in fiction, drama or film) represents human realities without undue simplification, sentimentality or ideological distortion.

Let me explain what led me to address this question in the first place. Prompted by what I see as an ongoing crisis in the education sector and beyond, I have been trying to articulate (on this site and elsewhere) a knowledge-based approach to education and culture. Formal education, I argued, needs to be firmly knowledge-based if it is to resist the tendency to become a vehicle for various kinds of propaganda. But my concerns are with the broader culture, not just with education.

Knowledge and truth are concepts which can be applied to the propositions and theories of the various sciences, to claims made in ordinary life, and also – in a sense – to artistic representations.

Most forms of knowledge are practical. But knowledge doesn’t necessarily have to have a use to be worthwhile. Science, scholarship and general knowledge add to our understanding of the world and ourselves and so have intrinsic value even when there are no practical applications.

Those activities generally seen as constituting “the arts” have an unusual status. They are practices but (apart from traditional crafts) they are detached from mundane reality and are not “useful” in the normal sense of the word. In a general sense, the arts could be seen to involve an extension into the adult world of aspects of the childhood practice of play.

There are definitional problems with the concept of art: the term has a positive connotation but is intrinsically vague. But we can still talk sensibly about particular art forms and make reasoned assertions about particular kinds of object or product (novels or films, say) or individual works.

Serious imaginative literature and cinema can be seen as making claims about the world and human experience along the lines of: life and human experience are like this. These claims are necessarily implicit and indirect and typically engage the emotions as much as or more than the intellect.

Even apparently explicit claims within artistic works are not direct claims. For example, explicit statements in the dialogue or narration of a novel or film are embedded within an imaginative construction and cannot be taken solely at face value. The statements are actions within a world which is not our world – at least not in a literal sense.

Literary and cinematic works seek to engage us in imagined worlds. These worlds may or may not be plausible representations of recognizable aspects of the world we know, or of our own subjective personal and social lives.

Truth in literature or cinema, then, means something like being true to some aspect of life as we experience it. We respond: Yes, that rings true; that’s how it works, that’s how it feels.

Or not, as the case may be. All too often the representation in question is false in one way or another. It may be simplistic or sentimentalized or ideologized. In such cases we are being presented with a distorted or impoverished version of reality.

Some distortions and simplifications are worse than others. Simple escapism is harmless enough. To the extent that distortions and simplifications are understood and accepted as such, no harm is done. But we should, I think, be particularly wary of sentimentality and ideological bias.

In a sentimental novel or film, cheap tricks are used to manipulate the emotions. Sentimentality is the antithesis of art and the antithesis of truth. Does it matter if people enjoy this sort of thing? Perhaps not. But sentimental thinking is very insidious and can distort the way we think and see the world in serious ways.

To the extent that a writer exploits sentimentality, he or she is less of a writer. Charles Dickens, for example, had great rhetorical energy and inventive powers. But his personal and social understanding (as it is reflected in his novels) was marked by a strong tendency to sentimentality.

I won’t talk here about ideology except to say that art is not about propagandizing, which is, of course, a form of manipulation. Though propaganda often exploits artistic forms, it does so to the detriment of the artistic integrity of the work in question.

Saturday, May 23, 2020

When is a discipline not a discipline?



Disruptions to business as usual, such as we have been experiencing in the wake of the coronavirus pandemic, inevitably raise questions regarding which activities or institutions are essential or important for a good or fulfilling life, and which may be happily dispensed with. Answers to such questions are often very personal, of course.

My focus here is on activities associated with education and research. A strong case can be made that – especially within the arts and humanities, but also within the social sciences – skepticism about the possibility of objective knowledge has been taken to extremes and, in fact, weaponized to protect entrenched interests. In view of this, I thought it useful to articulate a firmly knowledge-based perspective on education and research.

I am always a bit uneasy talking about academic disciplines and discipline boundaries. For one thing, it feels a bit redundant. Disciplines are what they are, and practitioners and observers make their own judgments about where to draw boundaries and about the worth or value of particular fields. Nonetheless, judgments must be made. And their significance is all the greater in times of change, in times of crisis: in times like these, in fact, when the future is in the balance and business as usual is just no longer an option.

I see our educational and cultural infrastructure as having lost its legitimacy and being in desperate need of reshaping and radical reform. The early years of education are particularly crucial but universities find themselves having to do remedial work and teach basic skills. I won't go into detail. Most people know the situation and everyone has their own ideas about possible solutions.

What is clear is that much more needs to be done in the earlier years, both in terms of imparting practical skills and knowledge, and in terms of broader goals associated with education’s socializing – or civilizing – function. It is beyond dispute that the K-12 system in America and many equivalent systems elsewhere have been failing badly for years.

Universities are also struggling and the value of higher education is increasingly being called into question. College enrollments in America have declined by more than 10 percent over the last eight years. Last month NPR reported that the current crisis may be an existential one for many colleges. But what is being taught in many of these colleges may be part of the problem.

All intellectual disciplines – be they scientific or scholarly – can be seen as adding to a shared knowledge base and having knowledge as their reason for being. Many other possible raisons d'être for academic and intellectual disciplines could be given, of course. And are. My point is just that I don't find other justifications for classing activities as serious intellectual disciplines particularly convincing. The fields in question may well be intellectual, but where is the theoretical rigor, where is the discipline, if anything goes on the knowledge front? What is the point of theory if it is not a means of building or articulating or facilitating the acquisition of knowledge?

Of course, high levels of rigor and discipline are often in evidence in activities which involve various kinds of practical knowledge. Such activities may or may not be associated with a body of theory. To the extent that they are, they will depend on formal educational structures.

Explicit claims about the world always need to be assessed regarding their plausibility. This need not be – and normally isn't – done in a rigorous or systematic way. In day-to-day life and politics, all kinds of claims are made and assessed on the run within dynamic social contexts. I am not complaining about this.

What's more, in ordinary life the truth of a claim is often less important than its social function, its role in modifying behavior for example. Or think of politeness phenomena like white lies which are primarily designed to spare the feelings of others. Courtesy and truth don't go together well!

Within the strict confines of intellectual and technical disciplines, however, the truth or otherwise of the claims being made or assessed is (or should be) quite central. Unfortunately many academic disciplines – especially within the humanities – have lost sight of this simple and obvious fact and have become, wholly or in part, self-perpetuating talking shops, jargon-ridden and superfluous extensions of the jousting and jostling of ordinary social and professional life.


[This is a slightly edited and abridged version of a piece which first appeared at The Electric Agora.]

Wednesday, April 15, 2020

Time and physics



Einstein's rejection of the notion of time as we know and experience it was squarely based in classical physics and classical mathematics. One problem with such a view is that it assumes the existence of infinite information (e.g. infinite decimal expansions).

Nicolas Gisin, a physicist at the University of Geneva, wants to reformulate standard physics in terms of intuitionistic mathematics. This approach holds the promise of resolving some of the paradoxes and confusions which have bedeviled theoretical physics for over a century.

Information is physical. We now know that there are strict limits on how much information can exist within any specific volume of space.

Nathalie Wolchover writes: "The universe’s initial conditions would, Gisin realized, require far too much information crammed into too little space. “A real number with infinite digits can’t be physically relevant,” he said. The block universe, which implicitly assumes the existence of infinite information, must fall apart."

Wolchover's non-technical article on Gisin's ideas and reactions to them by fellow physicists is well worth reading. This is how it begins:

Strangely, although we feel as if we sweep through time on the knife-edge between the fixed past and the open future, that edge — the present — appears nowhere in the existing laws of physics.

In Albert Einstein’s theory of relativity, for example, time is woven together with the three dimensions of space, forming a bendy, four-dimensional space-time continuum — a “block universe” encompassing the entire past, present and future. Einstein’s equations portray everything in the block universe as decided from the beginning; the initial conditions of the cosmos determine what comes later, and surprises do not occur — they only seem to. “For us believing physicists,” Einstein wrote in 1955, weeks before his death, “the distinction between past, present and future is only a stubbornly persistent illusion.”

The timeless, pre-determined view of reality held by Einstein remains popular today. “The majority of physicists believe in the block-universe view, because it is predicted by general relativity,” said Marina Cortês, a cosmologist at the University of Lisbon.

However, she said, “if somebody is called on to reflect a bit more deeply about what the block universe means, they start to question and waver on the implications.”

Physicists who think carefully about time point to troubles posed by quantum mechanics, the laws describing the probabilistic behavior of particles. At the quantum scale, irreversible changes occur that distinguish the past from the future: A particle maintains simultaneous quantum states until you measure it, at which point the particle adopts one of the states. Mysteriously, individual measurement outcomes are random and unpredictable, even as particle behavior collectively follows statistical patterns. This apparent inconsistency between the nature of time in quantum mechanics and the way it functions in relativity has created uncertainty and confusion.

Over the past year [...] Nicolas Gisin, has published four papers that attempt to dispel the fog surrounding time in physics. As Gisin sees it, the problem all along has been mathematical. Gisin argues that time in general and the time we call the present are easily expressed in a century-old mathematical language called intuitionist mathematics, which rejects the existence of numbers with infinitely many digits. When intuitionist math is used to describe the evolution of physical systems, it makes clear, according to Gisin, that “time really passes and new information is created.” Moreover, with this formalism, the strict determinism implied by Einstein’s equations gives way to a quantum-like unpredictability. If numbers are finite and limited in their precision, then nature itself is inherently imprecise, and thus unpredictable. [...]

On this view the future is open (rather than closed or predetermined), and time is closer to how we experience it – and so intuitively envisage it to be – than most physicists have supposed.
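The claim that finite precision entails unpredictability can be illustrated with a toy numerical sketch. To be clear, this is my own illustration, not Gisin's formalism: the logistic map and the six-digit cutoff are arbitrary choices, made only to show how truncating numbers in a chaotic system soon produces a history that bears no relation to the infinite-precision idealization:

```python
def step(x):
    """One iteration of the logistic map x -> 4x(1-x), a standard chaotic system."""
    return 4.0 * x * (1.0 - x)

x_ideal = x_finite = 0.1   # identical starting points
divergence_step = None

for n in range(1, 101):
    x_ideal = step(x_ideal)                  # "infinite-precision" idealization
    x_finite = round(step(x_finite), 6)      # nature keeps only six digits
    if divergence_step is None and abs(x_ideal - x_finite) > 0.1:
        divergence_step = n                  # the two histories have parted company

print(divergence_step)
```

The tiny rounding "error" injected at each step is amplified by the chaotic dynamics, so the truncated trajectory decorrelates from the idealized one within a few dozen iterations: determinism in the equations does not survive finiteness in the numbers.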

Wednesday, April 8, 2020

Is art a useful concept?



Daniel Kaufman recently talked about art and art criticism. His concerns were basically with what certain philosophers have said about these things. He explains that he used to accept Arthur Danto’s views but was persuaded (in part by reading Susan Sontag's essay, 'Against interpretation') that they were mistaken. He notes other philosophers' views which he also thinks are mistaken. And he suggests the outlines of an alternative view of art and criticism: “The idea, then, is that our critical engagements with works of art bring into existence extended, collaborative works, of which the initial artwork is only a first move; something for subsequent artists, audiences, and critics to “riff” off of…”

This, of course, leaves open the question of what counts as ‘art’ and who gets to say so. Does this matter?

Obviously, the word ‘art’ and its cognates are used in different ways. But basically it is a term which conveniently (or inconveniently) groups together and implicitly assigns a “special” status to a wide range of disparate activities/objects, contemporary and historical. That is, the term is extremely vague but generally carries a positive connotation.

I am not saying the word 'art' is meaningless or should not be used. But it cannot bear the weight that intellectuals often put on it. For one thing, it carries a lot of implicit (and dubious) metaphysical baggage and strong links to various kinds of philosophical idealism as well as to certain Romantic ideas. If (like me) you are uncomfortable with many of these historical associations and assumptions, this poses problems.

So also do institutional changes which have created a bureaucratized and self-perpetuating arts or culture "industry", significant parts of which are integrated into, or are directly or indirectly dependent upon, various levels and branches of government.

For me at least, the interesting questions are not about identifying "art" or about its promotion or support but rather about how we respond to and assess the aesthetic qualities of various specific products of human activity. The perceived value and worth of the activities involved will naturally depend, in many cases, upon such assessments.