In the summer of 1986, I gave a public talk at a Zen center in Ann Arbor, Michigan. After the talk, a woman who had been in the audience handed me a sheet of paper printed on both sides and assured me I would find it edifying. I read the front page of the sheet and learned the document in my hand contained the teachings of an entity named Lazaris, who lived in one of the dimensions that somehow had remained undiscovered by science and who sent messages to the beings in the dimensions that human beings occupy through a channeler. The dissemination of the teachings of Lazaris has progressed since 1986 from the crude and unhygienic medium of the printed page to the aseptic medium of the Internet. Lazaris now has a website, on which we learn this:
Since 1974, Lazaris has channeled through Jach Pursel, his only channel, offering his friendship and love and generating a remarkable body of tools, techniques, processes, and pathways for our Spiritual Journey.
Learning more than some basic information about Jach Pursel and his assistants and getting more than a few short quotations from the teachings of Lazaris requires going to the shopping page and buying access to audio recordings, which range in price from $7.95 to $74.95 for the basic teachings, although some basic teachings are free. Learning what Lazaris has to say about being prosperous and successful can cost up to $250. Prosperity is rarely inexpensive.
Another website gives more information about what Lazaris is:
Lazaris is a nonphysical entity who first began channelling through Jach Pursel, his only channel, on October 3rd, 1974. He is a spark of Light, a Spark of Love, who has helped tens of thousands of people to expand personally, metaphysically and spiritually on their Spiritual Journey Home.
Another page on that same website offers a taste of the wisdom of Lazaris through a few quotations:
Your love has a fierce majesty that cannot be matched; your love has a tenacious magnificence that cannot be contained or measured. You of the Human Race stand alone among the many Races within the dimensional universes in your capacity to love and in your ability to care – in the way you care for each other. Others love, but none like you do… Others care, but none like you do.
Back in 1986, a friend who was with me as I perused the double-sided sheet of printed paper commented, “I wonder why it is that some people feel they need a nonphysical entity to tell them the sorts of things they could probably figure out for themselves by paying attention to their own experiences in life. If people feel a need for help from others, there are plenty of ordinary physical entities who have given us more than enough good advice.” I wondered the same thing.
Thirty years later I am still wondering, although to be honest I have not given the matter much thought. It is not that we human beings need better advice, I am inclined to believe, but that we would do well to follow the good advice we have been given. The only reason I am giving the matter of the appeal of Lazaris and various other allegedly disembodied entities some thought right now is that the United States seems about to embark on an era of authoritarianism. The man elected to be the 45th President of the United States has said that he does not need extensive intelligence briefings, because he can be told about a situation for twenty seconds, and he gets a gut feeling about what to do. It’s as if he “just knows” what to do and how to do it. His way of just knowing does not require the careful gathering of evidence, the consideration of all the different conclusions that any body of evidence could support, the assessment of the limitations of the available evidence, and the painstaking weighing of possibilities and probabilities. No phronesis is required. All that is needed is to listen for twenty seconds and then let intuition and instincts lead the way. If others question the decision, they simply need to be told, “I am very smart. I know things that no one else knows.” There is no admission that there could be any legitimacy to questioning the conclusions that such a method yields. It is absolute. In that respect, the pronouncements of the 45th President are like the statements of an oracle, or a nonphysical entity that channels wisdom through just one person.
In his book The War on Science: Who’s Waging It, Why It Matters, What We Can Do About It, Shawn Lawrence Otto chronicles several of the human tragedies that have unfolded when authoritarians have suppressed both questioning and evidence that casts doubt on firm convictions. When, for example, unimpeachable leaders have insisted that particular agrarian policies would produce more crops, then have fired or imprisoned or even killed observers who dared to bring forth conclusive evidence that crop yields were in fact poor, masses of people have starved to death.
Authoritarianism manifests itself as an inability to accept that one has been mistaken. An antidote to authoritarianism is scientific method, which has at its very core not only the realization that one may very well be mistaken but the practice of trying to show that the currently accepted conclusions are mistaken, or at least incomplete and oversimplified and liable to be modified as new evidence comes to light. Failure to falsify a tentative claim reinforces the claim for now, but of course the claim can always be overturned in the future as better and more complete observations are made. It is falsifiability that distinguishes scientific claims from claims that are placed out of the reach of questioning or criticism. Authoritarianism and scientific method are fundamentally incompatible, which is why politicians with an authoritarian streak tend to be wary of science, scientists and evidence-based reasoning.
People, either individually or collectively, who have various kinds of vested interest have often used the greatest strength of those who practice science—their ability to replace a tentative conclusion with a more accurate one—against science. The tobacco industry, for example, sought to undermine public confidence in the conclusion that tobacco use entails numerous health risks by pointing out that what scientists say in one decade is shown to be false in later decades. If researchers are saying today that tobacco use entails health risks, suggested the practitioners of denial, ten years from now they may be saying something completely different. Exactly the same strategy was used by the pesticide industry to undermine public confidence in the finding that some pesticides and herbicides do damage to the environment and pose health hazards to human beings. The same strategy has been used by the petroleum industry to manufacture doubt that the combustion of fossil fuels is a cause of changes in the climate that result in the melting of ice caps and glaciers, rising ocean levels, lethal acidification of waters, more turbulent storms and generally more unpredictable meteorological events. The transition team of the 45th president-elect tried to create doubt about CIA and FBI findings that Russia was involved in hacking into the email servers of both political parties by pointing out that those entities were mistaken in 2003 in saying that Saddam Hussein probably had, or would soon have, nuclear weapons. The doubt-creating strategy consists of making the fallacious argument that if someone was mistaken about something, then they cannot be believed about anything. That being the case, no one is to be believed but an infallible authority. But ordinary human beings are notoriously fallible, so the safest bet is either an extraordinary human being or a non-human entity.
If scientific method and authoritarianism are incompatible, then science is a way to counter authoritarianism, but authoritarianism is also a way to counter science. The authoritarian method consists in making claims, repeating them until people believe them, undermining the credibility of those who make contrary claims, deliberately silencing those who disagree with or question one’s claims, and casting aspersions on the character of those who do not readily endorse one’s claims. A good many politicians practice all of those techniques. Probably every human being uses those techniques at one time or another.
This squib began with a reference to the teachings of Jach Pursel, which he claims are really the teachings of a nonphysical entity called Lazaris. Since I have no idea what these teachings are, because I am not inclined to pay money to find out, I am not at all in a position to suggest that there is anything pernicious in those teachings. In fact I suspect, but do not know, that the teachings are innocuous enough and unlikely to do anyone direct harm. They are probably not at all in the same league as the claim that nothing need be done by human beings to reduce or eliminate carbon emissions. There is, however, a potential unintended consequence of disseminating advice by making the untestable claim that the advice comes from a nonphysical entity rather than taking the more straightforward route of saying “Here are some ideas I have that I would like to share with you (for free).” Claiming that the advice is not just the outcome of the thinking of another ordinary human being, but is the communication of a nonphysical entity who speaks through only one human being, makes the advice seem extraordinary and therefore (in the minds of some) more credible, less prone to the errors made by minds encased in meat, fat and bones. Presenting the advice in this way is an attempt to make an end run around critical thinking. In a nation about to embark on an autocratic and authoritarian presidency, critical thinking is not something to try to run around. It is something to embrace and to use as well as one is able. There is no area of life that I can think of that is not enhanced by critical thinking.
Many a political and economic commentator has expressed the view that the inauguration of the 45th President of the United States is likely to be the beginning of a dark and dangerous period of undemocratic authoritarianism in American history. For what it’s worth, Jach Pursel, blogging on behalf of Lazaris, is inclined to disagree. His cheerful advice is remarkably similar to that of the President so many people are dreading:
Be a champion of change, a champion of the new future—a future no one has yet imagined.
Some, I think, may have imagined our future. Names such as William Golding and Eric Arthur Blair (alias George Orwell) spring to mind.
Alice: But I don’t want to go among mad people.
The Cat: Oh, you can’t help that. We’re all mad here. I’m mad. You’re mad.
Alice: How do you know I’m mad?
The Cat: You must be. Or you wouldn’t have come here.
A common feature of the kind of madness that modern psychologists call psychosis is delusion, that is, a perception of events that does not conform to the experiences of the majority of people. A person with a psychosis may be subject to auditory or visual hallucinations, that is, experiences they have that other people are not experiencing. It is not uncommon for a person with a psychosis to have a sense of self-importance or extraordinary ability, called a delusion of grandeur; this sense is sometimes accompanied by a feeling that one is so important that others are conspiring to thwart his efforts or bring him harm, which is called a paranoid delusion. People living with those who have been diagnosed with a psychosis sometimes report that the psychotic is convinced that he alone is sane and the rest of the world is crazy. Whatever its content may be, delusional thinking involves a narrative, a story that the thinker is weaving to make some sense of his or her experiences.
The very idea of delusion presupposes a correct narrative, deviation from which constitutes fantasy. What is considered correct can vary considerably from one time to another—one need only recall that there was a time when the narrative that the earth was fixed in space and that the sun, planets and stars all rotated around it was so firmly established that alternative accounts of the relative positions of heavenly bodies were considered preposterous. Even at the same time, there can be significant differences among narratives. The Qur’ān, for example, claims to correct the mistaken narrative of the Christian gospels that Jesus died on the cross—Jesus was not crucified, says Qur’ān 4:157; it merely appeared to some that he had been. From the perspective of one who accepts the narrative of the Qur’ān as the true standard, the appearance of the crucifixion of Jesus may have been a hallucination, and the gospel narrative is an example of delusional thinking.
Consensus is not necessarily a reliable criterion of what is actually the case. Indeed, logicians regard the appeal to popular consensus as an informal fallacy, called argumentum ad populum. It is absurd to believe that something must be true simply because most people believe it, and equally absurd to hold the contrarian belief that something must be false simply because most people believe it.
Friedrich Wilhelm Nietzsche (15 October 1844 – 25 August 1900), who by popular consensus was, or at least was becoming, insane during the years when his most often-cited works were written, famously wrote “There are no facts, only interpretations.” If he was sincere in writing those words, of course, he cannot have believed that to have been a factual claim; it was merely a statement of how he interpreted things. It was his narrative of the moment, one that presumably helped him make some sense of what he was experiencing.
Narrative, the telling of stories, is most often done with language, although it is also possible through images such as wordless cartoons or mime. Language has been regarded with particular suspicion both by some individuals and some traditions. Consider the Daoist saying, “Those who know do not speak, and those who speak do not know.” Or consider the character Hugo in Iris Murdoch’s novel Under the Net, who says in chapter four to the first-person narrator:
“All the time when I speak to you, even now, I’m saying not precisely what I think, but what will impress you and make you respond. That’s so even between us—and how much more it’s so where there are stronger motives for deception. In fact, one’s so used to this one hardly sees it. The whole language is a machine for making falsehoods.”
There were (and perhaps still are) some Buddhists who seemed to agree with Hugo’s claim that “the whole language is a machine for making falsehoods.” The Mādhyamika philosopher Candrakīrti, for example, can be interpreted as having held the position that propositions and propositional thinking have a place in the world of commerce (vyavahāra) and other practical goal-driven enterprises, but they have at the very best an asymptotic relationship with the greatest good, nirvāṇa, the eradication of the causes of personal and social turmoil (duḥkha). A philosopher on one of whose principal works Candrakīrti wrote a commentary was Nāgārjuna, who praised the Buddha for having shown that liberation (śiva) consists in the silencing of narratives (prapañcopaśama).
I have written about prapañca as narrative before in a squib suggesting that the Buddhist notion of prapañca is that it is “pointless narrative.” Candrakīrti’s praise of silence (tūṣṇīm-bhāva) as the route to liberation suggests that he may have regarded all narrative as pointless and troublesome. If that was indeed his view, of course, he courted the same dilemma as Nietzsche would have courted if he thought it was a fact that there are no facts, or that the Daoist Laozi courted when he said in chapter 56 of the Daodejing that those who know do not say and those who say do not know (知者不言、言者不知。).
Creating narrative is what people do. Every culture is a culture of story-tellers. It could even be said that what we call culture is little more than story-telling. There may well be no way of avoiding narrative so long as the brain is alive; this may be the case for spiders who build webs as well as for human beings who write books and then build cathedrals in which to asseverate what has been written. Nāgārjuna and Candrakīrti were creating narrative when they said that the way to peace is to find a way to stop creating narratives.
It could be the case that narrative becomes a problem only when people believe narratives that create in their minds hopes that cannot be fulfilled or expectations that cannot be met. The Buddhist, for example, who uncritically accepts the narrative that the root causes of turmoil can be eradicated through mindfulness may be setting up an expectation that can lead only to frustration when the goal remains elusive. The ethicist who places an emphasis on the questionable premise that agents have freedom of will may be transmitting a narrative that leads to the avoidable condemnation of those whose essentially involuntary actions are unwelcome in mainstream society. Nations and would-be nations that take collective actions on the basis of the narrative that there are inalienable rights to which everyone is entitled may be promoting conditions in which citizens are constantly invited to be indignant about their rights (which are, after all, entirely fictitious) having been abridged.
A good deal of religion, philosophy and politics consists of pernicious narrative. To conclude, however, that because some narrative is pernicious, all narrative must be pernicious is to fall prey to the inductive fallacy, a form of thinking that, according to the narrative of mainstream logicians, may lead to conclusions that prove disappointing.
And so I urge you, go after experience rather than knowledge. On account of pride, knowledge may often deceive you, but this gentle, loving affection will not deceive you. Knowledge tends to breed conceit, but love builds. Knowledge is full of labor, but love, full of rest.—(The Book of Privy Counseling, Chapter 23)
About thirty years ago, in 1986 or so, I attended a day-long workshop on Buddhist and Christian contemplative practices. During the day various Buddhists led meditations based on vipassanā exercises, Theravādin mettābhāvanā and Tibetan gtong-len practices, and an Anglican contemplative nun led a meditation based on the fourteenth-century guide to contemplative prayer called The Cloud of Unknowing. The author of The Cloud is unknown, but it is commonly believed that the same anonymous author wrote The Book of Privy Counseling that is quoted above. The session based on The Cloud of Unknowing turned out to have a profound and lasting influence on my own approach to meditation. In the present writing my aim is to reflect on one particular Cloud theme and how I have found it useful as a Buddhist practitioner.
First, for those who may not be familiar with The Cloud of Unknowing, the principal notion is that all the knowledge we have acquired in various ways eventually presents an obstacle to the only reliable way of truly knowing God, which is not through the intellect but through the experience of love. That experience of love takes place in what the author calls The Cloud of Unknowing. Access to that “cloud” is gained by first passing through what the author calls The Cloud of Forgetting. In practice, passing through this first cloud consists in making a deliberate effort to set aside all the beliefs and convictions one has acquired through indoctrination, teaching, catechism and personal study. All such intellectual knowing is to be put out of one’s mind so that the meditator can sit with a completely open heart to whatever may arise in the cloud of unknowing. The cloud of unknowing itself is simply (but not necessarily easily) sitting in complete silence with a mind free of thoughts, expectations, anticipation or personal concerns but with a loving readiness to receive whatever experiences may arise as if they were gifts lovingly bestowed. A Christian doing this practice will naturally speak of it in terms of loving and being loved by God, while a Buddhist may be more inclined to speak of it in terms of experiencing Suchness (tathatā) or the love of Amitābha Buddha, but of course to speak in such terms is possible only outside the clouds of forgetting and unknowing.
Since setting aside all dogmas and indoctrination permanently could prove socially awkward, or even dangerous to one’s health, within the context of a religious community that expects adherence to those dogmas, the author of the Cloud of Unknowing recommends again picking up the intellectual knowledge that one had set aside in the cloud of forgetting. After being in the cloud of unknowing, however, one is likely to hold all those views more lightly and perhaps even somewhat ironically. The contemplative who regularly practices this form of contemplative prayer may, for example, continue to say what he or she knows a Christian or Buddhist is supposed to say but is likely to have a profound sense of acceptance of the fact that others were given other lines to recite and are saying what they are expected to say. Believing in the sense of assenting to propositions, however, yields to wordless loving, and as practice deepens, loving becomes increasingly unconditional.
It has been my experience over the decades that there are more and more doctrines that I am prepared to leave at the threshold of the cloud of forgetting and to be disinclined to pick up again at the exit. Except in the most abstract and general way, I now find myself disinclined to recite the lines that as a Buddhist I was taught to say. Yes, I am still willing to say that attachment is a condition for eventual disappointment, and that is indeed a Buddhist teaching, but it is also a commonplace observation on which no tradition owns the copyright. Beyond voicing such commonly articulated observations as that, however, I am no longer led to speak as a Buddhist (or anything else that attempts to organize experiences into doctrinal structures).
Beyond a general disinclination to recite Buddhist dogmas, I feel a particularly strong resistance to repeating a few specific doctrines associated with Buddhism. There is one in particular that I have questioned so often that I have come to feel it is almost entirely useless—at times even counterproductive—in contemporary society, namely, the doctrine of non-self (anātmavāda).
It is clear from looking at the canonical and scholastic literature of Buddhism in India that the original doctrine was a critique of one specific doctrine held by rival schools, namely, the doctrine that the self (ātman) is a simple, unchanging substance that has no cause, has no agency, is unaffected by anything else and produces no effects. A fairly typical Buddhist critique of that notion of self is that if there is such an entity, we cannot know about it, since it has no effects, including the effect of making an impression on our faculties of sensing and understanding. Moreover, even if such an entity exists, it cannot play any role at all in the task of primary interest to a Buddhist, which is the task of changing one’s mentality from one that sets up the conditions for frustration and disappointment to one that painlessly deals with whatever experiences may present themselves. It takes only a moment’s reflection to see that the Buddhist doctrine of non-self is a critique of a view that hardly anyone in modern times holds. It is a razor in search of a beard. In the context of current beliefs about how the human mentality is constituted, arguing that there is not a simple, permanent, unchanging, uncaused, actionless and inconsequential self is approximately like arguing that there is no such thing as the fire-element phlogiston. Anyone standing on a soapbox and making such a proclamation is unlikely to meet any opposition. Such a safe proclamation is unnecessary and ultimately useless.
In the absence of an actually held negandum for the doctrine of non-self, modern Buddhists have tended either to absolutize the doctrine to mean that there is no self of any kind anywhere or to interpret it to be a warning against a particular notion of self called Ego.
The former of those options, saying that there is no self at all of any kind, is too obviously false to be worth more than a moment’s consideration. There clearly is a complex physical and psychological self that every healthy person experiences nearly every waking moment of every day, a self that is inaccessible to other selves and to which other selves are largely unknowable. The self of daily experience is so multifaceted that it does not admit of easy definition, but being difficult to define does not disqualify it from being something that most people devote most of their energy to making more or less successful attempts at protecting, nurturing, ameliorating and controlling. It is important to realize that being a self is not in any way contrary to the letter or to the spirit of Buddhist teachings. As one of the most treasured of all Buddhist texts, the Dhammapada, says:
157. If one holds oneself dear, one should diligently watch oneself. Let the wise man keep vigil during any of the three watches of the night.
159. One should do what one teaches others to do; if one would train others, one should be well controlled oneself. Difficult, indeed, is self-control.
160. One truly is the protector of oneself; who else could the protector be? With oneself fully controlled, one gains a mastery that is hard to gain.
163. Easy to do are things that are bad and harmful to oneself. But exceedingly difficult to do are things that are good and beneficial.
165. By oneself is evil done; by oneself is one defiled. By oneself is evil left undone; by oneself is one made pure. Purity and impurity depend on oneself; no one can purify another.
166. Let one not neglect one’s own welfare for the sake of another, however great. Clearly understanding one’s own welfare, let one be intent upon the good.
The second of the options, saying that denying self is really about denying Ego, is potentially more confusing than it would be to say nothing at all. That is because both in modern psychology and in ordinary language, the term ego has numerous meanings, so one must specify exactly which sense of the term one is taking pains to deny. In some discussions of abnormal psychology, for example, having a weak ego is said to be a characteristic of some types of serious mental illness. Given the polysemy of the term ego in modern usage, it is probably better not to present Buddhism as a set of antidotes against ego itself.
Buddhism may be presented as an antidote to egocentrism, that is, the inability to distinguish between self and other that manifests as an inability to grasp or appreciate any perspective or belief other than one’s own. Such an antidote, however, can be presented in a more straightforward way than by expounding the somewhat arcane Buddhist doctrine of anātmavāda. Rather than denying self (whatever that might mean) or problematizing the distinction between self and other in the mysterious language of non-dualism, it is probably more helpful simply to teach positive contemplative exercises such as the cultivation of friendship (mettā-bhāvanā), which begins with the recognition that one naturally strives for well-being for oneself, progresses to the realization that all conscious beings seek well-being for themselves and that there is no compelling reason why one should favor one’s own self over anyone else’s self, and finally extends the care that one has for oneself to an increasingly wide circle of other selves. While the cultivation of unconditional love for all beings is easier to say than to achieve, it is a task that is not in any way made easier by introducing the classical Buddhist doctrine of non-self.
Religious and philosophical teachings are better seen as invitations to discovery than as accurate descriptions of what one will discover. Teachings that prove useful to some people at some times may not be at all useful to other people, or to the same person at different times of life. In the culture of ancient India, there was a doctrine that all the changes of life are not to be taken too seriously, because they are not really the self, the true self being outside the realm of everyday experience. While some people no doubt found that way of thinking a useful way not to be overwhelmed by the world of change, others found it difficult to make sense of such a doctrine. It is said that the Buddha was among those who did not find the doctrine of a static true self (ātman) useful and sought to provide an alternative strategy, the dogma of non-self or even no self (anātman), to avoid being overwhelmed by the experience of constant change. That alternative is historically interesting, but that there is no simple, unchanging substance to be called the self now goes without saying. That which goes without saying is probably better left unsaid. Or, in the language of The Cloud of Unknowing, it is better left inside the cloud of forgetting.
From both of my parents and all four of my grandparents, I inherited a distaste for self-promotion—even the indirect forms such as being patriotic or proud of one’s school or place of residence or of other members of one’s own family. Early childhood conditioning tends to be persistent, so to this day I inwardly cringe upon witnessing displays of self-referential praise.
Years ago, I was on an academic committee considering a faculty member for promotion. I knew and admired the candidate, and there was little doubt in my mind about his being worthy of promotion. That notwithstanding, I found myself put off by his supporting documentation. Rather than simply submitting the required teaching evaluations, he supplied an accompanying document quoting selected phrases from comments that students had made; these selected words of praise were isolated from the surrounding narrative by being placed in text boxes and formatted in a large and bold font so that there would be no missing how highly his students thought of him. Offprints of his publications were accompanied by a similar document, featuring laudatory remarks that reviewers had made of his work, also placed in text boxes and set in boldface type in a larger font size. The presentation felt like an advertising brochure, as though the candidate somehow believed that the only way to be granted an academic promotion was to display a capacity for self-promotion.
I was not the only member of that committee to be put off by the presentation. An older colleague, nearing retirement age, commented that universities nowadays are almost forcing their employees to expunge all traces of modesty and humility from their behavior, if not from their mentality. Department chairs are expected to write annual reports assuring the university administration that their department is filled with world-class scholars and universally admired instructors. By the early 1990s, candidates for promotion in the academic world could no longer submit a simple letter and a typed resumé. They had to write 10-page descriptions of their goals as teachers and scholars, accompanied by ample evidence that they were accomplishing those goals and that their accomplishments were being recognized by others living near and far. When I sought my first promotion, it was still possible to submit a brief letter and a typed resumé. By the time I was at the stage of my career to seek a promotion to the next level, all that had changed dramatically. I found the new process so unpleasant to contemplate that I never sought another promotion after that first one—and I was amply rewarded by never getting another one. Putting together the expected sort of dossier was not worth the time and effort, but more to the point it was not worth the violence to my sense of dignity. Ironically, my sense of self-worth would have been undermined by having to present myself as worthy.
It is not only the academic world that has steadily gravitated toward a culture of vainglory. Far from being an unpleasant feature of a bloated ego, fulsome self-congratulation now seems to be expected. Just as no commercial product can afford to be presented simply as adequate to the task but must be portrayed as better than all its competitors and indispensable to the discriminating consumer, no person can afford to be seen as merely competent. Pretty good is just not considered good enough anymore.
During an election year in the United States of America, voters are treated to a parade of candidates who not only toot their own horns sans cesse but also boast about their country as the greatest country in the world, even as the greatest country in the history of the world. Some of the candidates go so far as to disparage political leaders who do not participate in their jingoistic frenzy; those not caught up in nationalistic fervor are characterized as actually hating their country and wanting to drag it down to the same level as ordinary countries. An ordinary country is one that has affordable health insurance and reasonably priced medical services and pharmaceutical products for everyone; reasonable tuition fees and generous food, housing and transportation subsidies for students pursuing a higher education; a modest-sized military of men and women trained mostly to help citizens cope with natural disasters such as floods, hurricanes and earthquakes; and a prison system designed to reform and educate miscreants rather than punish them. An ordinary country does not have a bloated military budget that is used to send personnel and materiel to countries all around the world and to build permanent military bases in more than a hundred other nations. Americans these days who long to live in an ordinary nation are advised to go live in Canada or Northern Europe, for the United States is a nation for those who wish to participate in excelsior.
In the 1950s, the psychologist Carl G. Jung said in an interview broadcast in English that the United States as a nation is “extraverted like hell.” The quiet reflection of the introvert is deprecated to such a degree that the system of public education is skewed in favor of gregarious doers whose energy is dedicated to making changes in the world rather than in one’s own attitudes and expectations. The thriving industry dedicated to selling products designed to help people realize their dreams of “self-improvement” tends to focus on how to be more self-confident, more assertive, more aggressive, more successful by external standards of assessment, more admired by the crowd. Jung chose his words carefully; an overly extraverted country truly is like hell.
Although the United States could be described as “extraverted like hell” in the 1950s, it appears not to have always been that way. Neither the New England where some of my ancestors were born and lived, nor the Midwest in which others of my ancestors made their way from the cradle to the grave, according to what I heard from family elders, had much room for the braggadocio and narcissism that have become so prevalent in today’s culture.
It is possible that my elders’ memories of the prevalent culture of their early days were faulty. Perhaps they were just getting old and slowing down, as I have managed to do a few decades after they passed on. Perhaps the world always seems too fast-paced, too forceful and too brash from the perspective of a rocking chair on the porch with a commanding view of an array of bird feeders. Or perhaps a culture of modesty and moderation really has been mostly replaced by a culture of excess and hubris.
Where the crowd is, therefore, or where a decisive importance is attached to the fact that there is a crowd, there no one is working, living, and striving for the highest end, but only for this or that earthly end; since the eternal, the decisive, can only be worked for where there is one; and to become this by oneself, which all can do, is to will to allow God to help you—“the crowd” is untruth. (Søren Kierkegaard, On the Dedication to “That Single Individual”)
One feature of being a primate that I enjoy the least is the way we primates tend to organize our social groups hierarchically. Our penchant for hierarchy is perhaps most obvious in institutions such as the military and the Catholic Church, but it manifests itself in some way every time more than one primate is present. All one need do is go to a public place such as a coffee shop and watch the interactions within a group of people. This observation is most effective either when the group being watched is far enough away that one cannot hear what they are saying, or if they are speaking a language one cannot understand. Then one has nothing to focus on but body language, which is quite revealing of social hierarchy. If a couple is carrying on a conversation, chances are very good that one of the pair will be doing most of the talking; the other may or may not be listening. In a crowd of three or more people, most likely one person will be a de facto leader, a maker of suggestions and decisions. (Some people made fun of George W. Bush when he said “I’m the decider,” but in fact when more people than one are present, it will soon be evident that one of them is the decider.) This is a tendency one can see even in very young children. There are a few leaders, and the rest, whether they like it or not, are followers. As is the case with chickens, so it is with us taller bipeds: we have a pecking order, and whosoever gets out of order will soon be pecked back into the proper position. This is a process we call socialization.
Anyone familiar with the academic world will know about the administrative hierarchy of president, vice presidents, provost, vice provosts and a battery of deans, and all the faculty rankings from professor down to lecturer. What some students may not realize is that if the salaries of the instructors were divided by the number of classroom hours, some of their most effective instructors would turn out to be paid considerably less than others, to have no vote at faculty meetings (and perhaps not even an invitation to attend them), to be rarely consulted on matters of policy and to be sharing an office with several other instructors at the bottom of the totem pole. There is very little justification for this setup other than that this is how universities were organized in the fourteenth century, and by the time someone has risen to a position of privilege, there is little incentive to make the system more equitable. People at the top of totem poles see no virtue in horizontal poles. I recall one senior professor commenting on a petition for better working conditions that came from seriously underpaid graduate student lecturers, “They want to be where we are, but they don’t want to be where we have all been.” In other words, he had to suffer substandard wages for several years, so why shouldn’t they? After all, being at the bottom of the dog pile builds character, no? How else will one learn how to behave when one gets to the top of the pile if one does not spend time at the bottom?
It could perhaps be argued that there are situations where a hierarchical structure serves a purpose. When confronted with a raging fire, for example, it is no doubt to everyone’s advantage to have a captain who assesses the situation and assigns specific tasks to others who then follow orders without question or complaint. An emergency is no time to have everyone sit in a circle and wait until the talking stick is passed to him so that he can venture a suggestion that will be carefully and respectfully weighed along with other suggestions and eventually decided by consensus. Fire brigades, police departments and battalions probably work better when there is a hierarchy and everyone in that hierarchy knows exactly where his or her place is. But not every situation is an emergency. In most of the ordinary situations in life, there is no need for a hierarchy. And in some, a hierarchy can be a real obstacle.
The one enterprise in life that least needs hierarchy is the very one from which the word “hierarchy” comes, namely, religion. The word comes from two Greek words, hieros (sacred) and arkhein (to lead, to rule), and it originally meant a system of government in which the ruling was done by priests or holy people. Although few countries these days are hierarchies in that original sense of the word, most religious organizations evolve into hierarchies in which those deemed most spiritually advanced are the deciders. This fact, I would argue, helps account for why most religious organizations end up being a grotesque caricature of the very doctrines and values they were founded to propagate.
Many years ago, I was on the board of directors of a Zen Buddhist temple in North America. As a registered charitable organization with tax-exempt status, the temple was required by law to have a board of directors and a constitution. Our constitution specified that the Zen master was president for life of the board and that the president had sole authority to decide all spiritual matters, while the board had the authority to decide secular matters. At one meeting of the board, the order of business was to renew the constitution—another procedure that was required by tax laws to be done periodically. I was unprepared to vote for approval of our constitution until I could be helped to understand what exactly differentiated “spiritual” from “secular” matters. What eventually became clear to me was that whatever decision the Zen master wanted to make, even down to the color of napkins at a potluck dinner, was automatically spiritual. Anything he did not want to be bothered with was secular. It became clear that the entire structure of the organization was designed to preserve the absolute power and authority of one man and that the principal task for everyone else was to learn to be subservient and deferential. Once that was clear to me, it was also clear to me that I must resign from the board of directors and leave that entire organization. As much as I enjoyed, and perhaps even benefited from, the practices of Zen, I did not undertake those practices for the purpose of learning to accept the absolute and often arbitrary power of a fellow human being.
Over the decades I have given a good deal of thought to the question of how best to organize a spiritual community. The more thought I have given to the matter, the more clear it has become to me that the best interests of a spiritual community are served by having no organization at all. Jesus of Nazareth was reported by Matthew (18:20) to have said “For where two or three gather in my name, I am there in their midst.” Now, I have never been a Biblical literalist, but my understanding of this passage is that when a fourth person shows up, Jesus finds somewhere else to go. Four is a good number for a barbershop quartet or a game of bridge, but it is one too many for a spiritual community. When numbers grow, so do perceived collective needs, and before one knows it there is a building and grounds committee, a fund-raising committee and a hospitality committee—not to mention a spirit of rivalry among the committees and hard feelings on the part of those unfortunate congregants who are overlooked when members are chosen to serve on them. In astonishingly short order, all vestiges of spiritual practice have vanished in the ensuing chaos of primates jockeying for position in a social hierarchy.
Institutions have a way of providing a constant supply of distractions. They tend to promote what Indian Buddhists called habitual distraction (abhyasta-vikṣepa), which in turn promotes delusional thinking, a condition that obstructs peace of mind. Distraction (vikṣepa) is a name given to having one’s thoughts scattered (vikṣipta-citta). Each time one allows oneself to be distracted, the tendency to be distracted again is reinforced (abhyasta), and eventually distraction becomes the usual state of one’s mind. Distraction makes it more difficult to be aware of the constant flow of changing perceptions, internal dialogues, judgments and motivations, which in turn hampers the process of stopping unproductive thinking before it leads to troublesome behavior.
Our life always expresses the result of our dominant thoughts.
The Buddha said in a number of places that the social condition most conducive to having a focused mind (samāhita-citta, also known as samādhi) is isolation from other people. Being around others, especially others who talk much and scurry about getting things done, makes mental focus difficult and makes distraction easy. Given that a good deal of what people collectively set out to accomplish is simply not necessary, and given that this is no less true of spiritual communities than of mahjong clubs, the best way for most people to keep their minds safe and sound is to avoid congregating, even into spiritual communities and organized institutions.
My conclusion, then, is that not only is the best organizational structure of a religious community no organization at all, but the best spiritual community for an individual serious about spiritual practice is no community at all.
What I have said here has been based on my experience. Others may not be similarly constituted, so their experiences may be different; I cannot know for sure, since the only mentality available to me to observe directly is my own. I offer these reflections in the spirit articulated well by Śāntideva:
atha matsamadhātur eva paśyed aparo ’py enam ato ’pi sārthako ’yam
If another whose constitution is like mine should see this, then this person may benefit from it. (Śāntideva, Bodhicaryāvatāra 1.3)
clinicians have long known that there are plenty of people who experience anxiety in the absence of any danger or stress and haven’t a clue why they feel distressed. Despite years of psychotherapy, many experience little or no relief. It’s as if they suffer from a mental state that has no psychological origin or meaning, a notion that would seem heretical to many therapists, particularly psychoanalysts.
An article in the New York Times reports that neural scientists have discovered that a genetic mutation that occurs in approximately 20% of the population results in an abnormally high production of a molecule called anandamide in the brain. Anandamide, named after the Sanskrit word ānanda, which means bliss, results in lower-than-normal levels of anxiety and higher-than-normal feelings of well-being. The psychoactive molecules in cannabis act on the same receptors as anandamide, which could account for some of that plant’s popularity. Interestingly enough, people with the genetic mutation that produces abnormally high levels of anandamide typically have little interest in marijuana; they don’t feel a need for it, and many find that cannabis actually decreases their pleasure and feelings of happiness and well-being.
If the clinicians are correct that a naturally occurring chemical is a significant condition in subjective feelings of well-being—and in a culture with very high levels of legal and illegal mood-altering drug consumption, who would doubt it?—then it is not only psychoanalysts who might be challenged by these findings. Also challenged should be some religious traditions, such as Buddhism, that claim that people can change the quality of their experiences of the world simply by learning new patterns of thinking and by taking up certain contemplative practices.
Buddhist teachings tend to place an emphasis on the importance of studying causal relationships, and especially learning what kinds of thinking result in unhappiness so that one can eliminate those kinds of thinking and replace them with patterns of thinking that result in more happiness. That sounds much easier to achieve than it in fact is. First of all, given that (as Buddhists universally acknowledge) every event and state is the consequence of innumerable conditions, it is in practice at best very difficult and at worst impossible to isolate which internal and external conditions are producing the frame of mind that one is currently experiencing. Without being able to identify the most significant conditions, one has no idea which conditions to eliminate and replace with others. And even if one could identify the offending conditions, replacing them may not be possible. (What if, for example, it should turn out that genetics plays a significant role in how happy one is capable of being? Does one then just replace one’s grandparents with better ones?)
I have written before about how difficult it is to know whether a contemplative practice is a factor in one’s overall psychological health. For my entire life I have wondered whether I have a sanguine temperament and have a tendency to be alarmingly cheerful because I meditate regularly and practice Buddhism. It has always seemed a real possibility that in fact I meditate and feel an affinity with Buddhism because I was born with a sanguine temperament.
When I look at my own temperament, I see a great deal of my father’s mentality. He was rarely discouraged, almost never depressed, remarkably resilient, hardly ever sad, almost never exhilarated, rarely excited and yet prone to moments of unpredictable angry impatience. That also describes me (as I see myself at least, but also how at least a few others have reported that they see me). In trying to account for the similarity of my temperament to my father’s (which, incidentally, I see in a good many of my blood relatives), I note that among the conditions we did not have in common were our diets and our religious beliefs and practices. Put perhaps a little too simply, my father had very little interest in religion. He did not read religious texts, never (that I knew of) meditated or prayed, and he showed no inclination to study the biographies of saints in translation or in their original languages. In short, he never did, even in a casual way, the things I spent my entire adult life doing. Very few days went by in which he did not consume a moderate amount of alcohol, whereas I often go months at a time between one glass of wine or beer and the next. He had very little interest in paying attention to diet, whereas I have been almost obsessively interested in getting a balanced diet of organic foods sold by fair-trade merchants. He lived to the age of 89, as did most of his close relatives, who collectively held a remarkable variety of religious beliefs and followed very diverse lifestyles. All of this evidence predisposes me to think that I am as I am largely because of genetic factors that I could not change even if I wanted to. Fortunately, there are few genetic factors I would be even momentarily tempted to change, aside perhaps from wishing for better eyesight.
Talking about all this recently with other Buddhists, I was asked whether I think that my religious practice has been a waste of time. I gave the somewhat feeble answer that calling anything a waste of time presupposes that one can think of other ways that one wishes time had been used; I do not wish I had done something with my time rather than meditate, read Buddhist texts and think about them and study Sanskrit, and therefore I do not consider any of that a waste of time. But that dodges the real question that was being asked, which was probably this: Do I give meditation and Buddhist practice any credit for bringing about the fortunate mentality I enjoy today? I think my answer to that would have to be negative. I had pretty much the same mentality that I have now even when I was a child, long before it ever occurred to me to meditate or think about the Buddha and his teachings, let alone go to them for refuge. If I had the courage to speak frankly of my experience, I might even say that going for refuge to the Buddha and the dharma has hardly had any effect at all on me, other than perhaps to allow me to remain the rather sanguine, calm, even-keeled, uninspired and uninspiring, rather plodding and occasionally irritable person I have always been.
Having said all this, I am inclined to say that doing contemplative practice for the sake of bringing about positive changes in one’s mentality may be the wrong way to go about it, if only because one is bound to be disappointed. Rather, I am inclined to see my own contemplative practice as an expression of gratitude for the fact that there is not much about my mentality that I feel inclined to change. (I am mindful that this may sound smug, but like it or not, it happens to be true.)
In Buddhist technical terms, I suppose this places me rather squarely in the camp of those who believe in what the Japanese Buddhists called tariki (他力), that is, the conviction that whatever blessings one has have all been caused not by one’s own efforts but rather have been brought as gifts from others over whom and over which one has no control or influence. One could also call it blind luck. Or one could call it by the Sanskrit term śūnyatā, usually translated as emptiness, a term that succinctly expresses the conviction that if one were to subtract from one’s “self”—that is, from one’s body and mind—every single element that was produced by something outside oneself, there would be absolutely nothing left that one could rightly claim to be one’s own.
I have that conviction of the correctness of the doctrine of “emptiness” myself, but I have no idea whether I came to it through careful thinking (yoniśo manaskāra) and study or through the accidental mutation of a gene that has resulted in a generous helping of the insentient molecule anandamide.