Showing posts with label morality. Show all posts

Monday, September 12, 2022

Against ideology



As we watch economies fail and societies move into the more advanced stages of dysfunction and dissolution, there is a lot of political finger-pointing going on. Blame is typically assigned in such a way as not to upset one’s preferred political or economic narratives.

Targeting ideological enemies necessarily entails a labeling process. The terms used are normally vague and abstract but loaded with emotional content – positive for terms designating “us”, strongly negative for terms designating “them”. Though the abstractness of the terms in question may confer a veneer of intellectual seriousness, the communicational dynamic remains purely rhetorical. Meaning is reduced to connotation, the various “isms” and so on merely providing convenient ways of encapsulating ill-defined sets of attachments on the one hand and aversions on the other.

Political ideologies are real in the sense that they affect the way people interpret history and current events and motivate action but, incorporating as they inevitably do political myths and simplifying abstractions, they are quite useless as analytical tools. This is not to say, of course, that terms like fascism, corporatism, socialism, capitalism, etc. – qualified to distinguish different forms where necessary – cannot be a useful shorthand when they are used descriptively and in historically informed ways.

The trouble is, such terms are rarely used like this. More often than not they are used rhetorically: as tribal markers, as weapons of ideological combat.

I do not have a particular ideological position to which I am committed or which I am promoting. This is not for want of trying to discover or build one. After much study and thought, I have come to the conclusion that this desire to choose or construct a preferred ideology is ill-conceived.

It is quite unnecessary to have some kind of explicit social blueprint in mind. Better not to, in fact (for all sorts of reasons, most of them relating to the contingent and context-dependent nature of social and cultural interactions).

Part of my Ph.D. thesis was focused on the revival in the 1930s of the principles of economic liberalism and their development and application during the post-WW2 era. The broad aim of the self-styled “neoliberals” – mainly European thinkers – whose work I was writing about was to offer an alternative to totalitarianisms of the left and the right. This is a goal with which I was (and still am) sympathetic. But, as I say, I have come to believe that no abstract system or ideology is adequate to deal either with questions of ends (which involve crucial moral choices) or means. How is an abstract system supposed to mesh with the complexities of an historically evolved and evolving social structure? The old joke has the Irishman telling the stranger who asked for directions, “Well, if I were you, I wouldn’t start from here.” It matters – a lot – where you happen to be.

Liberal institutions developed within – and were dependent on for their proper functioning – cultures which had certain common features. A certain kind of culture and a certain level of trust and moral attainment are prerequisites for liberal values and institutions to thrive. Those conditions no longer apply in the societies with which I am most familiar.

Monday, January 28, 2019

Culture and language: some personal reflections


Much is written about shared narratives and their role in creating a common culture. But what is a culture?

The idea of a common culture – whether that culture is defined in regional, national or supranational terms – is an idealization and necessarily vague and imprecise. This fact needs to be recognized. But it does not entail that clear and definitive claims about culture cannot be made. One way of making claims more precise is to focus on specific cultural elements. My background in comparative literature and linguistics leads me to focus on language.

Language can be seen from a broadly literary perspective on the one hand or from a scientific perspective on the other. The former perspective motivates my views to a large extent and provides a partial conceptual framework (based on certain intellectual and literary-historical traditions). But linguistics extends and strengthens the conceptual framework and provides a bridge to cognitive and evolutionary science.

Language is central because, without language, distinctively human forms of social practice would not have arisen. Take early ritual burials, often seen as markers of emerging human consciousness. Clearly, some kind of shared narrative is at work here: a shared notion of an afterlife for which the deceased is being prepared. Such a sophisticated narrative could not have existed before our ancestors developed a capacity for language.

There is a huge gulf between the linguistic or semiotic capacities of humans and other animals. All known human languages (apart from pidgins) share an equivalent degree of structural and grammatical complexity. Presumably language did not come into existence all at once and fully-formed, but evidence for hypothetical, intermediate forms is unavailable and we can only speculate regarding the communicational powers of pre-modern humans and other hominins.

There is an important distinction between a (natural) language and the more general and abstract concept of human language which parallels the distinction between a culture and human culture in general. All actual linguistic phenomena occur within a specific linguistic context, of course. But languages share common elements and/or structural features with other languages, so the idea of a language is not a simple one and is not without its problems.

In what sense does a language exist as distinct from particular instances of language use? Spoken words and written texts are generally assignable to this language or that, but precise boundaries are impossible to draw. Grammars and dictionaries try to do this but they can never reflect the constantly shifting contours of actual linguistic practice which always depend on the knowledge and behavior of individual speakers. In the final analysis, then, what we have is a set of unique and (to a greater or lesser extent) overlapping idiolects. We find it convenient, however, to group sets of idiolects into what we call dialects or languages. (Noam Chomsky and many other linguists have explicitly endorsed this idea.)

If a language is difficult (or impossible) to define, the notion of “a culture”, being more general, is even more problematic. But something similar to an idiolect-based approach can help us out. Each of us can be seen to represent a unique cultural mix. What we call “a culture” is represented by a set of (potentially communicating) individuals whose cultural knowledge and practices are similar in certain respects.

Though only limited precision is possible when talking about particular cultures, it helps if the primacy of the individual (in the sense explained above) is borne in mind. Consequently, a bit of personal history may help to flesh things out.

I went to a high school which had a strong classical focus. Latin was considered an important subject. We read Book IV of Virgil’s Aeneid, the letters of the Younger Pliny (not recommended) and extracts from Julius Caesar’s Commentarii de Bello Gallico (an account of his military campaigns in Gaul).

The older boys had studied classical Greek as well as Latin, but Greek was phased out. We were aware that Latin also was being marginalized in the broader educational culture. Fewer and fewer students were taking up Latin and, of those who did, fewer and fewer were taking it through to their final high school years.

Language lies at the heart of culture and knowing Latin gave us a sense of being part of a long tradition of Western cultural life. Being exposed to the actual words of cultural forebears who lived in a world untouched by Christian philosophy and yet which did not seem completely alien challenged us in subtle ways. This is an aspect of classical learning which is not always appreciated. Classical values (despite attempts by later thinkers to Christianize them) are opposed in quite fundamental ways to the moral spirit of the New Testament and, by extension, to the underlying values of the many social and political movements that were founded upon and driven by secularized versions of Biblical ethics and eschatology.

Elements of classical culture permeated ordinary life in ways that are difficult to conceive today. The details, taken individually, seem trivial: Latin words and phrases were used in English more than they are now; likewise classical references in English idioms were once more common (like “crossing the Rubicon”). And historical figures were routinely alluded to. I don’t know if “Great Caesar’s ghost!” was ever actually a common exclamation, but it certainly was a successful 20th-century popular culture meme. Significantly, in the 1990s television series, Lois and Clark: The New Adventures of Superman, the Perry White character says, “Great shades of Elvis!” instead of “Great Caesar’s ghost!”.

One of the things that characterized European cultures in previous centuries was a fairly widespread knowledge of Greek and Roman myths and legends. You can’t read much literature in English or other modern European languages or appreciate the visual arts without at least a cursory knowledge of these stories.

Strangely, even in the 20th century, the names and images of Greek and Roman (and Scandinavian) gods and heroes were very popular and effective marketing tools for selling consumer products. Or even football teams (e.g. Ajax Amsterdam).

As a love goddess, Venus was always popular. Some years ago, a local firm, Venus Packaging, got rid of their old, sexist logo which incorporated a shapely silhouette with the tagline: “That’s packaging!” On a more sober note, STDs used to be called venereal diseases.

Fables (going back to Aesop and beyond) and fairy tales were generally better adapted for children than Greek myths and were woven deep into the fabric of life. I recall getting off a train at London’s North Wembley station (which was near where I then lived). I was in the back carriage and had a long way to walk down the platform to the exit gate. As I passed through, an elderly white woman was telling the story of “The Tortoise and the Hare” to the young, black ticket collector (an immigrant from the Caribbean). Obviously, she had made an allusion to the fable (she being the tortoise), which he had not understood. It was a poignant scene, especially given the race-based social and political frictions which were beginning to manifest themselves in parts of London and other English cities.

A sense of regret for the loss of shared stories and traditions has nothing to do with racism. It applies within all ethnic or racial groups and across them. But it also reflects a particular view of culture which my literary education happened to reinforce.

“The term culture,” wrote T.S. Eliot in his Notes Towards the Definition of Culture, “includes all the characteristic activities and interests of a people; Derby Day, Henley Regatta, Cowes, the twelfth of August, a cup final, the dog races, the pin table, the dart board, Wensleydale cheese, boiled cabbage cut into sections, beetroot in vinegar, 19th-century Gothic churches and the music of Elgar. The reader can make his own list …”

As I see it, without a rich, common culture, not only does a society become less interesting, it becomes less resilient. It fractures. And this is precisely what we are seeing in the United States and many Western countries today.

No doubt there are many causes of and explanations for the social and political problems we are currently witnessing, but the marginalization of shared, traditional stories is undeniably a significant factor. Given the nature of our brains – given that they are narrative-consuming and narrative-generating engines and that our sense of self, meaning and purpose is narrative-dependent – the loss of one set of stories will make space for another. How one characterizes and interprets the current changes will depend on one’s ideology which in turn depends on the stories which one has internalized over a lifetime.

A part of me (my non-scientific, emotional side) sees a toxic mix of manufactured slogans coupled with ad hoc narratives rushing to fill the vacuum left by the loss of traditional and organic modes of thought and practice.

This judgment is tempered, however, by an awareness of the essential transience of languages and cultures, and a belief that what is truly valuable in what has been lost, culturally speaking, will – for as long as humans continue to exist and thrive – always manage to find new forms of expression.

[This is a revised version of an essay first published at The Electric Agora.]

Wednesday, October 31, 2018

Thoughts on morality, politics and history

I. Past and present

If we ignore the past, not only do we forgo the opportunity to understand our own social and cultural situation in more than a superficial way, we disrespect ourselves. We are to future observers what past generations are to us, and, if we have no interest in the lives or achievements of our forebears, we are implicitly condoning a similarly dismissive attitude to our lives and achievements on the part of future generations.

Linguists analyse language both in terms of its structure and in terms of its history. Not only language but any cultural element needs to be seen in this double perspective if it is to be properly understood; that is, not just as it is at any given point in time, but also in terms of how it developed.

But increasingly, both in general and educational contexts (each reflecting the other in a mutually reinforcing loop), the historical dimension is being neglected or replaced by crude historical myths and fictions. For most people today, the past is a great unknown expanse like the distant oceans on old maps, a blank canvas on which political actors are free to paint whatever they see fit. Inevitably the historical record is distorted beyond recognition as ideologues flesh out their chosen political myths with suitably aligned heroes and moral monsters.


II. Morality and Politics

Political differences are generally based on moral priorities being ranked differently. Moreover, issues which are high on a particular group’s or party’s agenda are issues upon which divergent opinions are not allowed: the party line must be adhered to on pain of excommunication.

Such attitudes are profoundly illiberal. They characterized Christian and Jewish and Muslim orthodoxies in the past and (in varying degrees) still do; and they characterize political orthodoxies, especially at the extremes of the ideological spectrum.

The fact that individual moral intuitions and priorities differ is acknowledged (how could it not be?) but in polarized political contexts these intuitions tend to be seen as indicators of the moral and political status of the individuals in question. They are also seen, by extension, as markers of group membership.

This is, as I suggested, essentially a religious attitude. “He that is not with me,” says Matthew’s Jesus, “is against me.” Or, pluralizing to make it more relevant to politics: those who are not with us are against us.

This attitude often goes slightly further and does away with even arm’s-length tolerance of alternative views. In such cases, even those who are not against those who are not with us are against us.

Against this backdrop of religionized politics, it’s no great surprise that moral realism is flourishing in humanist and academic contexts today, with many self-described atheists (especially those with strong political commitments) coming to believe in an objective “moral law” or set of rights that just happens to correspond to (and thus validate) the moral imperatives central to their own favoured political ideology.


III. The Anti-Metaphysical Stance

My default position on matters metaphysical is strongly “anti” in the sense that given my assumptions about the natural world, I do not see either traditional metaphysics or theology as viable academic subjects. That a similarly robust anti-metaphysical stance has been promoted by many significant thinkers whose starting point was very different from my own (e.g. Christian fideists, Ludwig Wittgenstein, Richard Rorty) serves only to strengthen my convictions on the matter.

There are, I accept, questions of a metaphysical kind that can be asked and meaningfully discussed. But, in terms of scholarship and academic research, such discussion is only likely to be fruitful to the extent that it is informed by the sciences (broadly interpreted to include mathematics and historical disciplines).

Take the topic of time, for example. There seems little point in writing about it as metaphysically-oriented philosophers have done in the past, attempting to explicate its deep nature in an a priori fashion. Depending on the aspects of time in which you are interested, you obviously need to draw on various sciences or other disciplines (e.g. physics, psychology, linguistics, literature).

Epistemological questions, likewise, need (as I see it) to be seen in the context of particular research programs.


IV. Liberalism versus Dogmatism

Morality, which used to be seen in metaphysical terms, is a practical business, not a theoretical one. You can theorize about it, of course, just as you can theorize about other aspects of human value systems and human behaviour. Such theorizing is useful to the extent that it provides a descriptive account of these matters and a framework for discussion of normative questions. A large degree of agreement can be expected on the descriptive front. On the normative side, there is some scope for convergence and agreement but on personal priorities and many controversial questions definitive answers are just not possible.

The view I am putting stands opposed to sophisticated (e.g. Kantian) as well as less sophisticated forms of moral realism (such as those that underlie certain religious and secular orthodoxies). Mainstream Christian churches and Jewish communities have over the centuries come to accept the kind of tolerance for differing opinions and perspectives that has traditionally characterized the European liberal tradition. But the mainstream churches are in decline as, on the one hand, fundamentalist religious groups seem to thrive and, on the other, politics becomes more polarized and extreme.

All in all, the tradition of European liberalism seems to have played itself out. It worked (to the extent that it did) only because a unique set of social and cultural conditions in European and other Western countries allowed it to work.

These conditions no longer prevail. They included a sense of continuity with the past, an enlightened and science-friendly perspective, relatively homogeneous regional and national cultures, and an organic and historically significant network of interrelated communities and practices transcending regional and national boundaries.

Typically, today’s communities and networks lack the deep historical dimension which only cultural continuity provides. As such, they are free-floating and fragile and extremely vulnerable to demagoguery and dogmatism.

Thursday, October 27, 2016

Brutal reality?

[Recently posted to my Google+ collection Language, Logic, Life.]

Dan Kaufman recently got a bit of flak (even from his wife apparently) for his energetic critique of the new – or continuing – 'cult of the self'.

"... [T]he old Cult of the Self [the reference here is to the 1970s and '80s and such movements as Werner Erhard's 'est' program] actually may have been slightly less loathsome than its newer, smarmier versions, insofar as it was at least honest, albeit in a brutal, tone-deaf sort of way."

He is saying that the older movements did not really disguise their egoistic nature whereas more recent iterations – while still basically egoistic – present themselves as being driven by humane motives.

"... [T]oday’s Cult of the Self represents itself as being socially oriented, and with social media having trained us to accept the thinnest, most indirect, heavily mediated interactions as constituting real relationships, it’s easy to convince ourselves that seeing others entirely through the lens of our own well-being and virtue constitutes genuine connection and concern, rather than self-absorption masquerading as such.  Gone is the idea that our deepest relationships with and obligations to others are properly self-effacing, and in its place is the notion that the main thing to think about, with respect to other people and what they deserve, is how the way I treat them reflects upon me."

[...]

I commented (in part) as follows:

"My default position is that something like that "brutal" position is probably 'true' in the sense that it correlates well with reality. But this could be seen as a dangerous idea. It seems to me there is a key divide here on how people see the world (and themselves). I don't know, however, that I would want to push this idea too much: social consequences may not be good. There is no reason to think that just because something is true, it is something one should talk about. I've never liked the 'noble lie' idea, but reticence is slightly different from this. Reticence – like lying, actually – is ... something I am not particularly good at, however."

I also suggested in the comment that Max Stirner's radical egoism – which Leszek Kolakowski saw as prefiguring fascism – was an important precursor to the movements Dan Kaufman was attacking.

In due course I will try to expand on these somewhat cryptic remarks. It could form the basis for a new Electric Agora article (or articles). But let me here and now try to put the core idea more directly.

I am suggesting that the standard way of seeing things involves a lot of self-deception and (to use a loaded term which may or may not be appropriate here) hypocrisy.

Fundamentally the social world works just like the natural world described by biologists. Evolutionary processes are not pretty. Having language and culture adds complexity and richness and gives us freedoms and possibilities which other animals do not have. But it does not allow us to escape this world of deception, manipulation and struggle. A basic kind of ethics and very basic notions of rights and responsibilities make sense: as individuals we survive longer and prosper when we cooperate. But a Christian or socialist-style ethic – based on a kind of generalized altruism (or generosity) mixed with self-denial and deemed to be in some sense obligatory – is problematic, both in terms of its consequences for those individuals (very few, it must be said) who sincerely and seriously try to implement such an ethic in their lives, and in terms of rational motivation.

Still a bit cryptic perhaps. But it is an attempt at least to clarify (in my own mind as well as in a more public sense) the supposedly "dangerous" idea I was talking about not talking about!

Tuesday, May 21, 2013

The view from Nagel-land

I said I would follow up on Thomas Nagel's views. The first twenty minutes or so* of the video inserted below is a talk in which Nagel summarizes and critiques his friend Ronald Dworkin's view of morality. Nagel speaks (as he writes) with great clarity and seriousness. (I realize that many will find the content a bit dry, but there is interest also in the style of delivery, in the very manner of the man. Ivy League and very 20th century!)

Dworkin wants ethics to be objective, and has a clever argument which appears to demonstrate that moral claims can indeed be seen as objectively true or false even within the context of a naturalistic world view.

Nagel – correctly in my view – sees our current naturalistic world view (he refers specifically to evolutionary theory) as being "difficult to square with" the objectivity of moral claims. But, as he is not willing – for moral reasons, apparently – to give up on the objectivity of right and wrong, he rejects the current naturalistic world view.

This last move is a grievous mistake, in my opinion. He is saying, in effect, that it would be just too awful if right and wrong did not have an objective basis – and so they do have an objective basis, and the scientists must have got things seriously wrong.

I respect Nagel's honesty and directness. He goes with his moral intuitions, but I would say that they take him out of the secular mainstream.

Nagel's move in this talk, by the way, needs to be seen in the context of his long-standing insistence that science, which aspires to an objective 'view from nowhere', is incomplete, for it cannot encompass or explain the reality and the realities of the first-person point of view.

This idea is associated with (because it can be used to justify) what I see as the main problem with Nagel's thinking: that he lacks, and shows little interest in, scientific knowledge.

This may not matter for certain kinds of intellectual enquiry, but scientific issues (especially relating to evolutionary biology and physics) are crucial to many of the questions Nagel addresses.

In fact, his obvious (and self-confessed) lack of knowledge in these areas makes it difficult to take his reflections on human psychology or human evolutionary or cosmic history (most recently expressed in his book Mind and Cosmos) seriously.

I don't want to posit a simplistic contrast between scientifically-trained thinkers and those with little or no scientific training, however, and suggest that only the former are worth listening to. The scientifically trained can be just as stupid and irrational as anyone else.

But it does seem reasonable to expect anyone dealing in a serious way with questions pertaining to a particular area of science to have a thorough grounding in that area, or at least in a related area of science.



* The most interesting bit, in my opinion, starts at the 14:20 mark.

Sunday, April 14, 2013

Descriptive and normative approaches to ethics

[This piece is a substantially-revised version of an earlier post, 'Ethics in a nutshell'.]

Meta-ethical questions are a bit like questions in the philosophy of mathematics, where various forms of platonism or realism do battle with more mundane interpretations. The key difference, I would suggest, is that the philosophy of mathematics has very little impact on the way mathematics is done, whereas meta-ethical disputes do impinge on normative (or prescriptive) ethics as practised by philosophers (though not, it must be said, on most ordinary ethical decision-making).

Unfortunately, meta-ethical disputes (which are often driven by deeply-felt convictions about the nature of human life and reality) are not readily resolvable, posing problems not only for meta-ethics but also for normative ethics.

Moral reasoning is complex and difficult enough, even if one is working – like many religious thinkers – within a generally accepted broader framework. But if there is no agreed-upon framework then conclusions are going to be – to say the least – very contestable.

And what of science? Science can, I believe, change the way we see the world in a way that philosophy can't. Though there is an important distinction to be maintained between the descriptive and the normative, between scientific and value-based judgements, science can undoubtedly offer new insights into value-based questions.

Our evolving understanding of the natural world and our place in it has a profound impact not only on how we see particular moral issues but also on how we frame and respond to general questions about human values, responsibility and freedom.

For example, 'ought' implies 'can', and the findings of science have a lot to say on what is realistically possible in terms of human behavior and what is not.

More generally, as our knowledge of human psychology has advanced, there have been – and there will continue to be – changes, both subtle and profound, in the way we think about right and wrong and conscience and guilt – and also changes to institutional mechanisms for dealing with anti-social and criminal behavior.

One approach to descriptive ethics which is not strictly scientific but which complements more rigorous approaches involves looking at how adjectives like 'ethical' and 'moral', auxiliaries like 'should' and nouns like 'obligation' or 'duty' are actually used in ordinary day-to-day contexts, and attempting to discern the implicit social rules and expectations which underlie the use of such expressions.

Every society, every social group, incorporates implicit rules of behavior. These rules (some relating to etiquette or manners, others to morality) can be studied and described like any other aspect of social life, though such descriptions will of necessity be incomplete and somewhat interpretive.

These systems of implicit moral rules coexist, of course, with explicit rules, as exemplified in systems of law and regulation. Though my focus here is on the former, it's important to be aware of the subtle, complex and often contentious relations between the two.

Just as the law is a system of enforceable explicit rules, so morality can be seen as a system of implicit rules. And just as the law outlines legal responsibilities and confers certain legal rights, so moral systems can be seen to assign responsibilities and confer certain moral rights. If you break society's explicit laws and are discovered, formal mechanisms of enforcement and justice are set in train. Similarly, if you break implicit moral rules, informal mechanisms (like disapproval and ostracism) will likely be triggered. The basic principle (hard-wired into our brains, perhaps) is that if you flout the rules you forfeit your right to the benefits and protections those rules might potentially provide.

Normative, as distinct from descriptive, approaches to ethics involve the individual actually becoming ethically engaged (rather than just describing what is). This will involve making, accepting or rejecting particular moral judgements, and affirming, endorsing or arguing for particular judgements or values. It inevitably involves interpreting social rules, sometimes criticizing them, and sometimes rejecting them.

Deontic logic traditionally divides behaviors into three broad classes: obligatory, impermissible and optional. ('Optional', by far the most appealing, is also, plausibly, by far the largest of the three classes.)
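To sketch how this tripartite division is usually made precise (this is the standard formalization, though not necessarily the only one): in standard deontic logic all three classes can be defined from a single obligation operator $O$, where $O\,A$ reads "it is obligatory that $A$":

```latex
\begin{align*}
\text{obligatory:}    \quad & O\,A \\
\text{impermissible:} \quad & O\,\neg A \\
\text{optional:}      \quad & \neg O\,A \,\land\, \neg O\,\neg A
\end{align*}
```

Permissibility is then the derived notion $P\,A \equiv \neg O\,\neg A$, which covers both the obligatory and the optional; the three classes above are mutually exclusive and jointly exhaustive.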

It's a complex branch of logic, but the real complications and challenges of practical moral thinking are not so much logical as contextual. Because, obviously, the general situation and the specific position(s) of the individual(s) involved need to be taken fully into account.

Times have changed since F.H. Bradley wrote his famous essay, 'My station and its duties' [a chapter, actually, of his book Ethical Studies (1876)], but the basic principle of the contextuality of ethics still applies. A person's duties or obligations derive in large part from (or at least cannot be assessed without taking into account) his or her positions in complex societal, professional and familial structures.

The key question in ethics is a situated-first-person question: what should I – in a particular situation at a particular time – do (or refrain from doing)? I say this is the key question in ethics, but such a question (and this is reflected in the ambiguity of the word 'should') often goes beyond ethics or morality, and merges with questions of prudence or etiquette or other areas or dimensions of life.

Unacceptable behavior causing serious harm to others, however, is clearly an issue in which ethical (and probably also legal) considerations will dominate.

What of ethical subjectivism? Is it a threat or a problem? My view is that, if normative ethics is seen as something theoretical, as an area of study to be compared and contrasted with descriptive (psychological or sociological) approaches, the former will inevitably suffer from the comparison, especially as regards its claims to an objective basis.

But if, on the other hand, normative ethics is seen in a more practical light, seen as an integral part of actually living and choosing rather than in purely academic or epistemic terms, then questions of objectivity versus subjectivity may not even arise.

The fact is, we are all forced to make ethical and other value-based decisions all the time. And, while empirical knowledge, reason and rational discourse can play an important part in these decisions, other more obscure elements are also inevitably in play.

Friday, January 18, 2013

Ethics in a nutshell

[Note: I am no longer happy with this, and intend to post a revised version soon. April 4]

Ethics and morality are important topics, but much ethical discussion and debate is unenlightening and unproductive.

I have serious reservations about philosophical ethics. Whilst a knowledge of some of the rudiments of ethical theory may be useful for articulating issues and problems, there is no clear way of solving problems or deciding between alternative approaches. The academic study of ethics soon becomes (in my experience) an area of rapidly diminishing returns.

Different people have very different ideas about the scope and nature of ethics, often talking at cross purposes or seeking to promote a cherished agenda by any means, including personal abuse.*

Rather than elaborating ambitious theories or contributing to the revival of Aristotelian or other classical approaches, I am drawn simply to look at how adjectives like 'ethical' and 'moral', auxiliaries like 'should' and nouns like 'obligation' or 'duty' are actually used in ordinary day-to-day contexts, and the implicit social rules with which such expressions are associated.

Every society, every social group incorporates implicit rules of behavior. These rules (some relating to etiquette or manners, others to morality) can be studied and described like any other aspect of social life.

Prescriptive (as distinct from descriptive) approaches involve the individual actually making or accepting or rejecting moral judgements or using or applying moral language or concepts.

Deontic logic traditionally divides behaviors into three broad classes: obligatory, impermissible and optional. It's a complex branch of logic, but the real complications and challenges of moral thinking are not so much logical as contextual. Because, obviously, the general situation and the specific position(s) of the individual(s) involved need to be taken into account.

Times have changed since F.H. Bradley wrote his famous essay, 'My station and its duties' [included in his Ethical Studies (1876)], but the basic principle of the contextuality of ethics still applies. A person's duties or obligations derive in large part from (or at least cannot be assessed without taking into account) his or her positions in complex societal, professional and familial structures.

Kant talked about a categorical imperative, but I don't think we can get beyond hypothetical imperatives. In other words, if you (in such and such a situation) want such and such an outcome, do this or that. With respect to social relations, this way of thinking is never straightforward or foolproof, and requires judgement and insight to be applied successfully.

The kind of (implicit) rule-based approach to ethical thinking and manners which I am advocating is consistent with a very modest view of rights. If you break society's implicit rules whenever it suits you, you forfeit your right to the benefits and protections those rules might potentially provide.

The key question in ethics is a first-person question: what should I do (or refrain from doing)? I say this is the key question in ethics, but such a question (and this is reflected in the ambiguity of the word 'should') transcends ethics or morality.

Ethical or moral questions often merge into questions of etiquette, aesthetics and prudence as well as other areas or dimensions of life. There are no clear-cut divisions between ethical and other considerations, in other words, and a certain type of (marginally unacceptable) behavior may be condemned by some as immoral, while others might prefer to call it ugly, unwise or just bad form. Others may see it in a positive light.

Even very serious moral transgressions (like the indiscriminate killing of civilians) are sometimes seen by people in the grip of certain ideologies or belief-systems as praiseworthy.

Most of us, of course, will condemn such ideologies as noxious and depraved. I certainly do. It is not really a problem that we can't prove our view correct and its converse incorrect in some objective, theoretical sense (though many think it is). Ethics is just not like that.

Quite simply, there is no absolute or objective ethical authority, and nor is there any objective method of determining 'moral truths'.



* Here is a summary of a recent controversy involving some very silly and intemperate assertions on the part of one of the protagonists.