Tuesday, January 28, 2014

Nouny nouns

Most of us come up with ideas which we think are good but which we don't develop or exploit. Ideas for making money or doing good, or – as in the case I am about to describe – ideas which have absolutely no possible commercial or practical applications.

Typically, we discuss these bright ideas with trusted friends or family members and get discouraged when our interlocutors are less than overwhelmed.

So let me recycle here (to the extent that I can reconstruct it from memory) one such idea which was effectively discouraged by an old academic friend and colleague whose views on the matter I may have taken a shade too seriously. Or not, as the case may be.

It relates to the topic of animism, which I raised in my previous post on this site.

There I discussed the so-called 'mind projection fallacy' described by Edwin Thompson Jaynes. Jaynes pointed to evidence of it in ancient literature and noted that the fallacy in question would have long pre-dated written records.

We have anthropological evidence for something like Jaynes's mind projection fallacy from studies of various non-literate cultures, but my idea was to look for evidence in the structure of language.

For our natural tendency to project human-like intelligence into non-living and non-human nature is obviously reflected in various ways in the grammar and morphology of the languages we speak or know about, and these languages would not only have reflected but also facilitated animistic modes of thinking.

You find traces of animism even in modern English idioms such as 'the wind blows', but grammatical analysis of both verbal and nominal forms takes us much further back in time.

My intention was to focus on nouns. Willard Van Orman Quine speculated (in his Word and Object as I recall) that the most basic form of noun was the mass noun – like 'sand' – rather than the count noun – like 'hill'. The former doesn't need an article ('the' or 'a'), the latter does.

But, counter to Quine's speculations, it can in fact be demonstrated – by looking at the potential for inflection (grammatical suffixes and so on) of various kinds of noun in a range of languages within the Indo-European family – that the prototypical noun, the 'nounier' noun if you like, is the count noun rather than the mass noun. And, among count nouns, animate nouns are nounier than inanimate nouns, and nouns relating to humans or human-like agents are the nouniest of all.
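To make the idea concrete, here is a toy sketch (in Python, with entirely made-up feature data – hypothetical placeholders, not the comparative evidence itself) of how 'nouniness' might be operationalized as inflectional potential:

```python
# A toy illustration only: 'nouniness' operationalized as the number of
# inflectional categories a noun can enter into. The feature sets below
# are hypothetical placeholders, not real linguistic data.

INFLECTIONAL_CATEGORIES = {"plural", "dual", "case_marking", "diminutive", "vocative"}

nouns = {
    "sand (mass)": {"case_marking"},
    "hill (inanimate count)": {"plural", "case_marking"},
    "horse (animate count)": {"plural", "dual", "case_marking", "diminutive"},
    "father (personal agent)": {"plural", "dual", "case_marking", "diminutive", "vocative"},
}

def nouniness(features):
    """Score a noun by how many inflectional categories it admits."""
    return len(features & INFLECTIONAL_CATEGORIES)

# Ranked from least to most 'nouny': mass noun lowest, personal agent highest.
for noun, features in sorted(nouns.items(), key=lambda kv: nouniness(kv[1])):
    print(noun, nouniness(features))
```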

My intention, then, was to elaborate on, refine, and draw out the implications of this fact: that for many languages – including some of the oldest linguistic forms of which we have any knowledge – the nouniest nouns are personal agents.

Perhaps this idea had already been developed by others at the time I first thought of it. Perhaps it has been discussed and developed more recently. Perhaps it is just not an interesting enough idea to bother with. Or perhaps none of the above applies.

Wishing, then, to maintain – at least for a little while – a state of blissful ignorance on the matter, I am deliberately postponing any scholarly delving.

I have also refrained from mentioning the name of the linguist (now in his eighties) whose work was my jumping-off point. If his name comes up in my (or anyone else's) searching it will suggest that the territory is still relatively virgin.

Sunday, January 12, 2014

Randomness in nature

I have talked before about randomness. Somehow it seems important to know whether the world we live in is driven in part by fundamentally random processes.

Some recent findings seem to confirm (though 'confirm' is probably too strong a word) what quantum theory has suggested all along: that there are basic physical processes which are truly random.

I might also mention in this context that, in doing a bit of reading on probability and related matters, I happened to come across some references to, and a paper by, the physicist Edwin Thompson Jaynes (1922-1998). Jaynes promoted the view that probability theory is an extension of logic.

This is intuitively plausible. The concept of truth (and truth tables) lies at the heart of propositional logic, and T is, of course, equivalent to a probability of 1, F to a probability of 0. Probability theory just fills in the bits in between in a quantitative way!*
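To illustrate the thought, here is a minimal sketch (assuming, purely for simplicity, independent propositions and made-up numbers) of how the rules of probability reproduce the classical truth tables at the extremes:

```python
# A minimal sketch: when every probability is 0 or 1, the rules of
# probability reproduce the classical truth tables; intermediate values
# 'fill in the bits in between'. Independence is assumed for simplicity.

def p_and(p_a, p_b):
    """P(A and B) = P(A) * P(B) for independent A, B."""
    return p_a * p_b

def p_or(p_a, p_b):
    """P(A or B) = P(A) + P(B) - P(A and B) for independent A, B."""
    return p_a + p_b - p_a * p_b

def p_not(p_a):
    """P(not A) = 1 - P(A)."""
    return 1.0 - p_a

# At the extremes (T = 1, F = 0) we recover the familiar truth tables:
for p_a in (0.0, 1.0):
    for p_b in (0.0, 1.0):
        print(p_a, p_b, "AND:", p_and(p_a, p_b), "OR:", p_or(p_a, p_b))

# In between, we get graded degrees of plausibility:
print(p_and(0.7, 0.4))  # 0.28
print(p_not(0.7))       # 0.3 (approximately, given floating point)
```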

Of particular interest to me is Jaynes's notion of a 'mind projection fallacy' which he sees as a root cause of much false thinking, including what he sees as the mistaken ascription of randomness to (certain) natural events or processes.

But his case seems to suffer from an overdependence on personal intuition as well as from a lack of historical perspective. For example, he develops** his concept of a mind projection fallacy without (to my knowledge) relating it to other clearly similar or related concepts – from animism to teleological reasoning – which have been widely discussed over the last century-and-a-half.

Jaynes argues that this fallacy is evident not only in the thinking of primitive cultures and amongst uneducated people but also in scientific contexts. He uses his mind projection idea to argue against certain interpretations of probability theory and statistics as well as against certain interpretations of quantum mechanics.

The basic thought seems to be that theoreticians are all too inclined to project their perspectives (their particular states of knowledge or ignorance) on to reality. He rejects, for example, the ascription by probability theorists – and physicists, it seems – of 'randomness' or 'stochastic processes' to nature. He rejects the Copenhagen interpretation of quantum theory as a mere projection of our ignorance.

But, as I say, I find it a bit off-putting that (in the cited paper, at any rate) he not only fails to acknowledge that others have developed and discussed notions very similar to his own but also – ironically – seems to sensationalize and exaggerate the significance of his own insights and intuitions.

More on the substance of his claims later, perhaps.


Let me take this opportunity to thank past readers for their interest and commenters for their comments and to wish everyone a pleasant 2014.



* Like other objective Bayesians, Jaynes sees probability theory as a formal, axiomatic system, and the calculus of propositions as a special case of the calculus of probabilities.

** Here, for example (PDF).

Monday, December 23, 2013

The phantom self

Set out below is the core section of Gordon Cornwall's analysis of the 'phantom self' (taken from the post to which I provided a link in my previous post on this site).

But first, my brief critique.

I do go along with Cornwall (and with Derek Parfit) to the extent that they deny the existence of any substantive self. What exists are bodies which are, at a basic level, conscious of their existence as (mortal) bodies and, at a more complex (and problematic) level, subject to the illusion of a (potentially independent) immaterial self.

Planning and thinking about the future need not involve these problematic beliefs in any essential way, it seems to me. And imagining possible threats to one's well-being (and the well-being of loved ones) – which of course lies at the basis of intelligent behavior and planning – needn't lead to neurosis or anything like it.

It is true, however, that our awareness of our own mortality does, at a fundamental level, cast a long shadow, putting a dampener on joy and placing real constraints on human happiness.

Parfit's statement (cited by Cornwall) that "ordinary survival is about as bad as being destroyed and having a replica" may be playful. But it seems to me to make sense only if you deny the existence not only of a substantive self but also of the sense of a specific self which a body generates as it 'survives' from minute to minute and from day to day.

This specific-body-generated first-person point of view is what we are, and we would prefer (under most circumstances) that it continue. I don't see how having a surviving 'copy' would allow that to happen.

Finally, Cornwall seems to misunderstand the distinction between the public, objective stance of science and the first-person perspective – which encompasses all of what he calls 'practice' as well as our subjective understanding (even when the latter is informed by science).

I just don't see any serious problems with a straightforward physicalism, at least as it pertains to the scientific understanding of the relationship between the body and the sense of self.


Cornwall writes:

"Belief in the special, separate unity of the self comes naturally to humans. It is the result of a trick of natural selection. Having a self-model is an adaptive feature of complex animals that are capable of moving around. The self-models of such animals are tightly coupled to their motivational systems, which include their emotional systems. The appearance of an immediate threat to self triggers a strong emotional response in most animals, activating the amygdala and launching a flood of psychosomatic and behavioural responses which tend to help them survive the crisis.

Humans are unlike most other animals in that, with our highly developed prefrontal cortices, we are capable of imagining and making detailed plans for the future. As part of imagining the future, we imagine ourselves in the future. Visualizing a threat to oneself in the future triggers an emotional, motivational response similar to that which would occur if the threat were actually happening on the present scene. The response is enabled by strong projections from the prefrontal cortex to the amygdala and associated limbic regions of the brain. The ability to label an imagined entity as ‘self,’ and have it trigger this kind of emotional response, is an adaptation that, perhaps more than any other, propelled our species into our present position of earthly dominance. Unfortunately, this adaptation [...] came at a considerable cost in unnecessary suffering. It is an effective design, but not a very good one. It is far from optimal, and certainly not elegant.

One way to view this idea is as another outgrowth of the scientific physicalism that has illuminated so much else. Looking at what we have learned in the past few hundred years, it is hard not to be impressed by scientific physicalism as the source of our most far-reaching and productive changes in outlook. Out of it came the demise of geocentrism. When the direction 'down' was displaced as a fundamental orientation of the universe, so our parochial planet was displaced as its centre. Ceding centre stage is always salutary; it resulted in a widening of horizons, a deeper engagement with extraterrestrial reality.

Scientific physicalism was also Darwin’s mindset. We no longer see ourselves as the pinnacle of creation, but as blood relatives of all other species on this planet, an extended family of creative solutions to the problem of life. They reflect us in countless ways, and we will learn from them for a long time to come. Understanding natural selection, we come to know that we are not the product of a perfect design process. We are beginning to see opportunities to improve on our own nature.

The productivity of scientific physicalism stems from its ontological parsimony. Science does not assume the existence of entities that are not needed to explain observations. Physicalists saw the opportunity to dispense with a fixed framework of space-time in which all objects had a position and velocity. There is no such framework; hence the insights of relativity. Physicalists do not need to assume the existence of God, either. What most people don’t quite realize yet is that the selves they imagine themselves to be can also be dropped from the scientific ontology, with a resulting gain, not loss, in its explanatory power. If you simply look at what is, then Parfit’s famous statement that "[o]rdinary survival is about as bad as being destroyed and having a replica" gains the presumption of truth, for there is no evidence for the existence of anything so mysterious as its negation implies. I should point out that Parfit’s characterization of ordinary survival as ‘bad’ is playful; this insight into what survival amounts to is all to the good. To embrace it is to escape the glass tunnel and engage with life on a broader scale and a longer time dimension, one that extends long after one’s biological death.

One more thing. My approach to this subject has been, and remains, one of intellectual discovery. I’ve always been more interested in learning the truth than in changing myself. Advocates of ‘spiritual practice’ sometimes tell me I’m doomed to failure; the truth cannot be grasped intellectually. Respectfully, I think the jury is out on that. Western philosophers in the analytical tradition have justly been criticized for mistaking their own failures of imagination for metaphysical necessity. So, too, past failures to intellectually grasp religious insights into ‘no-self’ should not be taken as proof that all such attempts in future will also fail. Scientific progress has achieved much, and will achieve much more. I don’t know of any convincing argument that science cannot leap this hurdle."

Thursday, December 12, 2013

The glass tunnel

Adrian McKinty is to blame. He started a discussion on Derek Parfit's perennially frustrating ideas on personal identity and death. You will see that I reiterated my previously-stated views* (which are similar to Adrian's own) in the course of an exchange on the comment thread.

And now I have stumbled across Gordon Cornwall's sophisticated analysis which defends Parfit's view and so implicitly challenges mine.

My intention, then, is to revisit the very important questions that lie behind these discussions, initially by reading and thinking about what Gordon Cornwall has to say. I can't reject it just because it has a mystical or religious feel which I don't like and which makes me suspicious (just as Parfit's approach does).

But first let me make a few general comments on my attitude to Derek Parfit as well as trying to set out the emotional context of my thinking on these matters.

When I first encountered Parfit's 1984 book, Reasons and Persons, I remember concluding that his view seemed inconsistent with planning and caring about one's future – with prudence, basically. But Parfit himself seems to have made it into his eighth decade without any trouble – and (if his claims are to be believed) with less stress than he would have encountered had he retained his earlier, more conventional view of human identity.

My main concern, however, is not to decide which view is more conducive to longevity or quality of life, but rather to figure out which view gives the truer picture of our individual selves.

Parfit experienced as liberating his change of viewpoint on personal identity – from a conventional view to one which did not privilege the future over the past, and which downplayed the centrality and perhaps even the reality of his very existence as a self.

Previously, he had, as he put it,

"... seemed imprisoned in myself. My life seemed like a glass tunnel, through which I was moving faster every year, and at the end of which there was darkness. When I changed my view, the walls of the glass tunnel disappeared. I now live in the open air. There is still a difference between my life and the lives of other people. But the difference is less. Other people are closer. I am less concerned about the rest of my own life, and more concerned about the lives of others." [Reasons and Persons, p. 281]

This talk about caring for others (especially from a son of medical missionaries) makes me wary. Is Parfit merely adopting (the broad outlines of) an essentially religious outlook and rationalizing it in philosophical terms?

But before turning (in a subsequent post) to examine alternative views more closely, let me set out briefly the broad outlines and emotional drivers of my current position.

My position could be seen as narrower than Parfit's, and it aspires to an almost animal-like simplicity. ('Almost' because animals don't worry about the future – or foresee their own inevitable deaths.)

Though I doubt that my self has any substantive reality (and to this extent I may have more in common with Parfit than I am assuming here), I know that whatever reality it has is entirely dependent on the continuing existence and proper functioning of this body. Oversimplifying: I am my body.

The tragedy is, of course, that this body, like all bodies, will fail in the end. This is just how things are. Life is tragic (and comic and pathetic), and not at all bathed in sweetness and light as some religiously-minded people are inclined to see it. From my perspective, at any rate, it seems more honorable – and more honest – to interpret life in pessimistic and uncompromising terms.

This need not entail an entirely non-religious outlook (think of Miguel de Unamuno, for example), though my approach is non-religious.

An anecdote might help explain some of my values and attitudes. Some years ago my mother had very bad pneumonia and spent a number of truly terrible weeks in an intensive care unit: close to death, hooked up to a daunting array of machines and unable to speak (because of a tracheotomy). The family was called in for a meeting with the senior doctors and nurses: they were clearly expecting her to die.

In the ICU there was a 1:1 nurse-to-patient ratio, each nurse on duty assigned to a single patient, and we visiting family members got to know some of the nurses quite well. I don't remember much of what was talked about, but I clearly remember one of them commenting that she preferred dealing with (and liked) patients who fought against death. And my mother decidedly was (and still is) such a fighter.

On more than one occasion, when I came to sit by her bed while she was at her lowest ebb and hooked up to all those tubes and machines, she turned and appeared to try to climb over the bed rails towards me. When I first witnessed this, it took a few moments to realize what she was trying to do. It was at once grotesque and sublime – and extremely moving.

I don't want to make too much of this and suggest that those who "rage against the dying of the light" are right and those who opt for more dignified options are wrong. And I fully realize that of course a nurse – especially one specializing in critical care – is going to prefer patients who don't die on her.

But speaking personally, though I admire those who decide to end their own lives when the signs are that those lives have reached a certain level of completeness, I am rather less keen on going (when the time approaches) with dignity and rather more keen on hanging around for as long as possible.


Now, having aired my general thoughts and feelings on the matter, I will try to put them out of my mind and examine what Gordon Cornwall has to say (see link above) with an open mind.



* See, for example, this post.

Thursday, November 21, 2013

Science and self-effacement

Famously – or perhaps notoriously – Stephen Jay Gould proposed that science and religion constitute non-overlapping magisteria. In my opinion, his claim was not plausible; but a similar claim regarding the sciences and the arts does stand up.

I want to focus here on the issues of self-expression and collaboration.

Individual and creative thinking plays an important role in science, but it involves a form of creativity which is far removed from the sort that applies in the arts. The latter is always associated with self-expression, whereas self-expression has no role to play in science.

So self-expression can be seen not only as a key demarcation criterion between the arts and the sciences but also as an indicator that these pursuits are opposites, incompatible, non-overlapping. It is a crucial part of the one, and plays no part in the other.

Collaboration, on the other hand, occurs in both the arts and the sciences. But it is an essential – and defining – feature only of the latter.

The vast majority of the greatest works of literature, music and the visual arts are attributable essentially to one man or woman. The artist draws, of course, on his or her teachers and the broader culture but in a real sense owns – as author or creator – the finished product.

Similar notions can apply even to necessarily collaborative arts like the cinema. Think of the director, Alfred Hitchcock. The best of the early films he made in England have the same winning combination of suspense, latent eroticism and humor as his American masterpieces even though he was working with entirely different people in a very different cultural context.

The arts are by their nature self-expressive, even if the expression is often, as in theatre, cinema, etc., group-based or, as in much medieval art for example, anonymous. But even in these cases, I would argue, the greater works will be more likely to bear the stamp of an individual genius or personality.

Science is just not like that. It is the antithesis of self-expression, and is all about building a common body of knowledge. To the extent that the individual's ideas are deemed to be important, to that extent the science is undeveloped and uncertain. As a science matures all traces of pioneering individual contributions are erased or at least merged into a greater, more complex and more subtle body of knowledge than any single mind could even begin to comprehend.

There was an interesting exchange a while ago on a comment thread at Rationally Speaking about the nature and the scope of science which has a bearing on this point. A German botanist working in Australia was arguing that science is concerned with everything empirical and is defined primarily in terms of its communal nature.

"... [I]t is not science if I personally figure out whether Craspedia species are apomictic. I have to share this information in a way that allows other humans to test it, reproduce it, and build on it, because science is a community effort. But then it would be science no matter how trivial the fact."

Though not everyone will see the collaborative side of science as a key defining feature – another commenter calls it "unusual" as a demarcation criterion – science has, in my opinion, an essentially communal nature which erases the individual. (It imposes self-effacement, as it were.)

This criterion also fits mathematics. You get untutored geniuses (like Ramanujan) but it's only when they are integrated into the mathematical community (as Ramanujan was, thanks to G.H. Hardy) that they become real mathematicians.

Thursday, October 24, 2013

Myths with pretensions

I commented recently – in the context of a post about myths relating to race and (Jewish) identity – that one of the things I like about science is its myth-destroying power.

And science (broadly construed to take in the historical sciences) certainly does have that power. But it is – I readily admit – a strangely disturbing power. It goes against the grain of human psychology and culture, which are irredeemably myth-ridden.

So when I said I 'like' that aspect of science, I was oversimplifying – leaving out the sense of ambivalence.

Let me give an illustration based on the final couple of years of my religious phase which relates not just to myths but to metaphysicalized myths – myths, if you like, with pretensions.

Two kinds of thinker appealed to me, but each in a different way.

On the one hand were those who distilled the essence (as they saw it) of the Christian myth and offered a way of relating to the challenges of life which was deeply satisfying (for those who could accept it) and which incorporated a deep, intuitive but historically-validated understanding of human psychology. For me, these thinkers were largely those in the Protestant (and particularly the Reformed) tradition who embraced Paul's emphasis on the absolute power of God. My favourite was Karl Barth.

But I was also attracted to a completely different kind of scholar – more scientifically- and historically-oriented – who offered none of that psychological comfort, but who offered another kind of liberation entirely. Rudolf Bultmann spoke of demythologizing the faith, but what he was doing was simply reinterpreting the old myths. More convincing were those who didn't talk about faith at all but who sought merely to elucidate the historical background of the New Testament. And the more I understood that background, the less plausible the Christian interpretations (and myths) came to seem.

But when one gets rid of one myth another will often arise to take its place. Social and political myths, for example, often take the place of religious ones.

Our brains have a special affinity for simple narratives, which is no doubt explicable – at least in part – in terms of the need to generate the sense of a coherent, continuing self. We also have a strong tendency (which manifests itself in the grammars of natural languages) towards animism – seeing even inanimate nature as exhibiting human-like intentions and purposes. And though we have come to accept science's non-teleological explanations, many still shy away from these as ultimate explanations. For example, there is resistance to the view that randomness is, as modern physics suggests, a fundamental characteristic of reality.

In a sense, science is – and always will be – an unnatural activity, and the scientific worldview is a peculiarly unsatisfying one. Those of us who are committed to a scientific view of the world will always be, I fear, to some extent at war with our own natures.

I have also been thinking about mathematical Platonism again recently. Though such a view is essentially a timeless one (and so renounces narratives in the normal sense), it may still be seen to incorporate elements of myth (and teleology) as well as metaphysics. How else would it manage to exert such a strong emotional attraction (as it clearly does for many)?

The question of the plausibility of mathematical Platonism (or realism) is so important because it impinges on broader questions, such as the viability of an empiricist worldview. In fact, mathematical realism can be seen – and is seen by many – as posing major challenges not only for empiricism but also for physicalism.

And, as my instincts are (for want of a less abstract way of characterizing them) deeply empiricist and physicalist, I need to settle on a particular view of mathematics and see whether, or to what extent, I will be forced to modify the basic way I see the world.

I am quite resigned to the fact that I will never entirely escape mythical thinking, but my goal is – if possible – to rid myself completely of the grand, intellectualized and metaphysicalized kind of myth and settle instead for the humble and commonplace variety.

Like the perennially-appealing prospect, mooted in a famous section of Homer's Odyssey and revived in the 18th and 19th centuries, of retiring to an exotic island paradise and drifting extremely slowly into a peaceful and uncomplicated old age.

Monday, September 23, 2013

Anti-metaphysical musings

I have been looking recently at some material relating to "the metaphysics wars", and thought it worthwhile to jot down a few notes.

No doubt, my general position would be characterized by those with other views as scientistic. It is also anti-metaphysical in that I don't see the traditional philosophical discipline of metaphysics as having much point these days.

I don't deny that there are very interesting questions in the philosophy of physics, the philosophy of mathematics and the philosophy of logic which may be characterized as metaphysical. The meta-thinking that goes on at the margins of physics, other sciences and mathematics, etc. is necessary and valuable.

But somehow, when such thinking moves away from the discipline in question and becomes more generally philosophical, problems arise.

Timothy Williamson is perhaps the most powerful and impressive advocate for this broader kind of metaphysics (and analytic philosophy generally). As an avowedly non-religious person, he can't be dismissed as having ulterior motives of a religious nature; and, being at home with formal – and specifically modal – logic, he can't be dismissed as natural language-bound or as being daunted in any way by technical rigor.

Some of the points he makes in this interview are good ones – such as noting the light that modal logic can undoubtedly throw on the workings and nature of natural language (via Montague grammar, for example), and perhaps also on the foundations of set theory – but I have to say that I am strongly inclined to reject the basic thrust of his argument in defense of metaphysics, and, by extension, philosophy (as he understands it).

Essentially, the questions he seems most interested in are reminiscent of medieval scholasticism. I too have great respect for thinkers such as Avicenna (to whom he refers approvingly), and for more recent – and more mathematically sophisticated – exponents of that general tradition of thought (such as Bolzano, to whom he also refers). But it seems to me that it is now incumbent upon any thinker who aspires to deal with questions of what there is in a fundamental sense to base his or her account – at least in large part – on contemporary physics, or on mathematics if the focus is restricted to mathematical realities.

Williamson seeks to defend the relative independence of his core preoccupations from science by invoking the old shibboleths of scientism and reductionism, and by rejecting naturalism as a confused and inadequate concept.

I grant that mathematics does pose problems for advocates of strong forms of naturalism and empiricism, and there are real unresolved issues in the philosophy of mathematics. But my preference is to address these issues in a broadly scientific and mathematical context rather than in a purely logical or philosophical one, or – worse – not to address them at all and instead merely to use them as a kind of justification or license for logical excess and metaphysical self-indulgence.

Williamson cites Quine as an example of scientistic naturalism.

"Quine privileged natural science, and in particular physics, over all other forms of inquiry, to the point of not taking very seriously any theory that couldn't be reduced to part of natural science."

Williamson's view, by contrast, more or less allows the analytic metaphysician carte blanche, and Williamson's own approach to analytic metaphysics is clearly – in my view at any rate – insufficiently constrained and guided by science.

Here, for example, is an extract from an old interview in which he explains his developing views:

"My work on vagueness and ontology doesn’t really concern ontology. Probably my most distinctive ontological commitment comes from my defence of a controversial principle in logic known as the Barcan formula, named after the American logician Ruth Barcan Marcus, who first stated it. An application of this principle is that since Marilyn Monroe and John F. Kennedy could have had a child (although they actually didn’t), there is something that could have been a child of Marilyn Monroe and John F. Kennedy. On my view, it is neither a child nor a collection of atoms, but rather something that merely could have been a child, made out of atoms, but actually has no location in space and time. The argument can be multiplied, so there actually are infinitely many things that could have been located in space and time but aren’t. It takes quite a bit of work to show that the Barcan formula is better than the alternatives! That’s what my next book will be on. The working title is Ontological Rigidity."

The book was actually called Modal Logic as Metaphysics, and this is how he recently stated its main point:

"I am ... saying that it is necessary what there is. Necessarily everything is necessarily something. There could not have been more or fewer things than there actually are, and which particular things there are could not have been different. What is contingent is only what properties those things have, and what relations they have to each other. I call that view necessitism. Its denial is contingentism. Who knows how far back necessitism goes? Maybe Parmenides was some sort of necessitist..."

On the face of it, talking about (apparently countable) things (minus their properties and relations!) as given strikes me as breathtakingly naïve in the context of a physics-based understanding of reality. I can only imagine that Williamson is – like the medieval scholastics – implicitly asserting a privileged role for logic.

Quine's assertion of a privileged role for physics makes a lot more sense to me.

Admittedly I haven't looked at Williamson's ideas in any depth, but what I have seen so far – and what he says in this latest interview – really makes me question whether it would be worth the effort. I am intrigued, however, by what is driving such thinkers.

Strangely, Williamson appears not to be quite sure whether his latest work is meaningful or not – or at least seems unwilling to commit himself on the matter. There is (don't you think?) just a touch of arrogance in this passage from Chapter One of Modal Logic as Metaphysics.

"This book compares necessitism and contingentism. Which is true? Of course the question has a false presupposition if the definitions of 'necessitism' and 'contingentism' lack meaning or content. But if every enquiry must first establish its own meaningfulness we are on an infinite regress, since the enquiry into the meaningfulness of the previous enquiry must first enquire into its own meaningfulness, and so on. Better to act on the assumption of intelligibility: readers can decide for themselves whether they understand the book as they go along, and recycle it if they don't."

This passage is a combination of facile reasoning and rhetorical sleight of hand. By using the word 'understand' in the final sentence, he subtly shifts the focus to the reader's possible inadequacy and away from the original question concerning the work's meaningfulness.

In fact, I am tempted to see Williamson's work as emblematic of a broader trend. On the basis of my (admittedly limited) knowledge of the history of the relevant intellectual cultures, I discern, since the middle years of the 20th century, a disturbing falling off in intellectual seriousness in secular circles accompanied by an equally disturbing rise in anti-scientific name-calling and credulity amongst those thinkers who remain favorably disposed towards religion.

I'll finish here with a few comments about Paul Horwich, Williamson's great philosophical antagonist, whose deflationary views on truth I have referred to favorably in the past.

Horwich is opposed to the sort of traditional theoretical philosophy ('T-philosophy') which Williamson defends. I have made the point that, though I broadly accepted Horwich's account of truth, I doubted that his Wittgensteinian view of philosophy was compatible with a continuation of philosophy as an academic discipline. And, interestingly, Williamson makes a similar point in the recent interview.

"...Horwich didn’t explicitly call for T-philosophy not to be funded. I pointed out that if the picture of philosophy in his book were accurate, philosophy should be abolished. The reader encounters just two sorts of philosophy: irrational T-philosophy, and level-headed Wittgensteinian debunkers of T-philosophers. Philosophy is presented as an activity in which some people make a mess and others clear it up. Why on earth should taxpayers fund that? It looks as though we’d be better off simply abolishing the activity altogether."

Finally, I was surprised (and a bit disappointed) to learn recently that Horwich rejects naturalism, and even more unequivocally than Williamson does. He cites not only mathematical but also moral claims as a basis for his view.

Horwich is more thoroughly Wittgensteinian than I had previously thought.