Stuart W. Mirsky is the principal author of this blog.
Can Value Be "Transcendental"?

Wittgenstein famously claimed in his early work that ethics (a form of valuation applied to human actions) is not in the world but outside it and thus "transcendental." He asserted this but did not explain it, likening ethics to aesthetics and, indeed, to religious belief. Just as these seemed to the early Wittgenstein to lie outside the realm of factual assertion (our artistic and religious inclinations are not reducible to facts about the world or things in it), so, too, did he hold that our ethical claims express a kind of inclination. But that did not mean that, for him, they are to be ignored or discarded. Rather, as he told others at the time and later, he had the utmost respect for all three modes of human behavior (religion, the appreciation of art, and ethics). Yet when it comes to ethics itself, that is, to determining what is right or wrong to do when we deal with the world, he held out no hope for reasoning as a means of furthering our understanding, either in making ethical choices or in explicating that activity.

Even though moral discourse, the lingua franca of ethics, seems to be about offering reasons to do one thing rather than another in certain situations (typically where others' interests are in play), on Wittgenstein's transcendentalist approach there would seem to be nothing to be said. How, then, do moral judgments make themselves felt in the world? . . .



Wittgenstein's Later Ethics: A Reply to Duncan Richter

Writing in an omnibus compilation of Wittgensteinian scholars on the subject of Wittgenstein's approach to ethics, Wittgenstein's Moral Thought, edited by Reshef Agam-Segal and Edmund Dain, Duncan Richter, who teaches philosophy and ethics at the Virginia Military Institute, reminds us that, like G. E. Moore and Henry Sidgwick before him, Wittgenstein thought the terminology of ethics (words like "good," "right," etc.) could not be reduced to any naturally occurring element in the world. Utilitarians hold that goodness is definable as whatever makes the greatest number of us happiest (or some variant of that condition), while deontologists, writing in a Kantian tradition, take the good, or whatever we deem right, to mean fulfilling one's duty, however defined (Kant offers one way; others may offer another). For self-styled ethical intuitionists like Moore and Sidgwick, by contrast, the good is indefinable because it is a simple feature we discern in things. It cannot be reduced to anything else. But if it can be discerned in things, it still cannot be pinned down in any natural way, as some particular feature of the thing or activity we count as "good" or "right."

In his article, Sketches of Blurred Landscapes, Richter reminds us that, while Wittgenstein shared the view that goodness is indefinable, he held this to be so for a different reason . . . .



Too Much Philosophy?

Since retiring in 2002, I have spent nearly two decades returning to philosophy, the intellectual passion of my youth. I left philosophy when I left the university and headed out into the world to make a life, start and raise a family and, ultimately, build a bit of a career. I never forgot my interest in things philosophical, of course, but I let it go as I put my energies into other things. Eventually I picked up another favored pastime of my youth and wrote an historical novel in my later years on the job (during a hiatus when our upper management was in transition), later publishing it myself (no luck finding representation or an interested publisher -- it was a Norse saga pastiche, written in an archaic voice, so I wasn't surprised). But after that book, and a couple of others, I got bogged down. Why? I had found my old interest in philosophy again, and for a decade I lost myself in philosophical discussions and readings on the Internet (sometimes contentious -- philosopher wannabes can be pretty arrogant -- and always detailed and wide-ranging) until I finally felt I had something of my own to say. There followed two books. The first, Choice and Action, turned out to be a compilation of essays, many expanded from my online jeremiads on those philosophy discussion groups, along with some written especially for the new book.

Because the focus of those essays leaned toward ethics and meta-ethics, and because back in college, as a philosophy major, I had always thought that if I ever wrote a book of philosophy it would be focused on ethics and bear that name, that's the title I settled on. But Choice and Action was too disjointed a book, and even though I believed its latter part made a cohesive case for the ethical answer I was trying to give, I came to think it had failed. And so I wrote another, one I hoped would provide a more satisfactory answer to the questions I'd tried to deal with in the first. . .



Heidegger's Place?

In light of Martin Heidegger's enrollment in Nazism during the early days of Hitler's rise to power in Germany, his failure to recant or explain himself (though he did resign his position under the Nazis and go into semi-retirement after a year), and the evidence of his complicity with Nazi anti-Semitism in those first days when he was a spokesman for National Socialism in the German academy . . . and, given the opacity of his thinking itself . . . why does he continue to be an object of philosophical interest for so many in philosophy today? Hasn't his engagement with Nazism fatally undermined his credibility, even if one can make sense of his thinking?

Heidegger took philosophy away from the subject-object distinction that has engaged it in the West since Descartes, arguing that there is no real separation between us and the things of our world. The world is not constituted by subjects set over against objects (dualism, which conceives subjects and objects as distinctly separate modes of being or substances); nor is it just subjects holding ideas of objects in their minds (idealism); nor is it merely physical stuff which has the power, in some configurations, to masquerade as the mental (materialism). Instead he rejects these categories in favor of refocusing our attention on what it means to stand and operate in a world as aspects of it characterized by awareness.

For Heidegger the world is a kind of continuum and human beings, as subjects, are inextricably in it . . .



Waiting for Wednesday - Values and Facts: The "Truth" Connection

. . . All valuation begins with truth discernment because that is the first order of valuation, the first sorting we must do as living creatures. But human beings, because we have a cultural dimension to our lives, made possible by the cognitive capabilities that enable us to conceive our disparate sensory inputs as a world, move beyond this level, beyond the recognition of the true and the false, to the valuation of things in terms of their effects on us. The idea that value questions are not amenable to truth determinations is simply wrong. Truth is just another form of valuation and, as the most basic form there is, the underlying ground for all the other claims of value we can make.

The idea that moral questions are cut off from claims of truth is misleading because, insofar as moral valuation is valuation at all, it comes from the very same place our truth claims come from. . . .



More On Chomsky and Language: Its Nature and Acquisition 

I've been critical of Chomsky's theory of language here, based on having viewed several of his talks and interviews on YouTube spanning the last 40 or 50 years. Seeing little change in his explanations, examples and claims over that period, I've concluded that he hasn't made all that much progress since his earliest theories about the innateness of language. But perhaps I haven't been totally fair to him, because in at least some of the later talks he offers a more concrete thesis about what he means when he refers to the sudden occurrence of language in humans (which he places somewhere in the past 70,000 years or so). He argues that since language requires a computational capacity, and since there is no evidence of language-capable thinking in human artifacts prior to that time (though there is indirect evidence of it from at least around that period, in the presence of symbols, art and decorative imagery in the archaeological record), this capability must have appeared in one human (because it involved a mutation) at some point back then. And it must have occurred full blown. . . .



Chomsky on Language: Its Use, Acquisition and Value

Frege and Russell made language central to philosophy in the twentieth century, and Ludwig Wittgenstein made ordinary language the core of our interest: how it shapes our thoughts and deeds, how it structures our picture of the world. In the 1950s, Noam Chomsky came on the scene with a radical new take on language, though one that partook of old ideas. Picking up from 17th and 18th century thinkers, particularly the rationalist tradition but also the early empiricists, Chomsky argued that language is so complex that it could not possibly be merely learned by us as children. Rather, he posited, there must be a deep, inherent set of rules encoded in our brains which enables language to grow in us the same way the human embryo grows arms and legs, the infant matures, the child passes through puberty, and so on. Language, that is, on his view had to be inherent in creatures like us or it could not occur at all.

The old empiricist tradition, which had challenged rationalists like Descartes and rationalist reformers like Kant and others writing in his wake, must have gotten it wrong, Chomsky argued. Kant and his supporters had it right: there must be a structure to experience which arises in the brain itself and is built in, not learned through the trial and error of the organism. The old empiricist idea of the tabula rasa had to be mistaken. . . .
