Out of a living silence

A contemplative shares thoughts that emerge in moments of quiet reflection

Archive for the ‘Social analysis’ Category

The barren landscape of originalism

with one comment

“The Constitution that I interpret and apply is not living but dead, or as I prefer to call it, enduring. It means today not what current society, much less the court, thinks it ought to mean, but what it meant when it was adopted.”—Justice Antonin Scalia (March 11, 1936–February 12/13, 2016)

Justice Scalia was one of the leading proponents of a method of interpreting the Constitution called originalism, a form of textual exegesis that uses historical and linguistic scholarship to determine what the authors of a text meant by their words or what the first readers most probably understood the words of the text to mean. The original meaning, once determined as well as scholarship allows, is then regarded as the only meaning of the text, unaffected by what later generations of readers of the text may believe. According to Scalia, while the patterns of thinking of society as a whole may change from one generation to the next, the meaning of the Constitution endures without change.

The method of textual interpretation called originalism is familiar to and widely practiced by scholars of ancient and medieval texts, even if that name is not commonly used by textual scholars. While I was being trained in the field of Buddhist studies, for example, students were advised to try to discover what the expressions found in the Pali Canon (the scriptures of the Theravāda school of Buddhism) probably meant to speakers of Indian languages at the time of the Buddha and to ignore what those same expressions came to mean to commentators in later centuries. The greater the temporal and geographical distance of a commentator from the time and location of the Buddha, the more that particular commentator was to be regarded with suspicion. What this meant in practice was that my fellow students of the Pali Canon and I were unlikely to turn even to Buddhaghosa (who probably lived in the same part of India in which the Buddha lived but nearly one thousand years later), let alone to Bhikkhu Buddhadasa (a Thai monk who lived from May 27, 1906 until May 25, 1993). Despite the fact that Buddhaghosa’s commentaries on the Pali Canon came to be the interpretation that prevailed from the twelfth century C.E. onward, and that Buddhadasa was regarded as the most authoritative interpreter since Buddhaghosa, an academic scholar of the Pali Canon trained in Toronto in the 1970s would studiously avoid being influenced by them. Similarly, when reading a Sanskrit text written by Nāgārjuna in the second century C.E., a student in Toronto in the 1970s would carefully steer clear of writings by modern Tibetan scholars, such as the Dalai Lama, from schools of Buddhism based on Nāgārjuna’s teaching. In other words, in my academic training in Buddhist studies, I was taught that academic respectability diminished to the extent that one’s method of interpretation of a Buddhist text deviated from textual originalism.

If I had never approached Buddhist texts in any way other than as a textual historian, I might hold originalism in high esteem. My interest in Buddhist texts, however, was never motivated principally by historical curiosity. What motivated me, probably to my detriment as a serious academic scholar, was a search for inspiration, a perhaps vain hope to find advice on how to lead a more useful life. As a seeker of inspiration, my practice was to read and reflect on whatever came into my hands and to let it have its way with me. Looking back now on my thinking of several decades ago, I realize how divided my mind was against itself. Without consciously setting out to compartmentalize my thinking, I unconsciously developed two distinct modes of reading. While in academic mode, I would read a classical Buddhist text one way; while in spiritual seeker mode, I would read that same text differently—sometimes only somewhat differently and sometimes radically differently. It was rarely possible to be in both modes at once, and it wasn’t always easy to discern which mode I was in at any given time. Flitting back and forth between the two modes became a way of life and probably caused almost as much confusion in the minds of my students and fellow Buddhists as it did in me. The confusion both for me and for others was much less when I was in the midst of Quakers, for my Quaker faith and practice was almost entirely uncontaminated by a scholarly approach to either the Bible or the writings of George Fox and other Quaker authors. Looking back on it all now, I think it might have been easier for all concerned had I been just a practicing Quaker with an intellectual curiosity about the history of Buddhist thought. That, however, is not what I was.

My bewildered and bewildering life as a scholar-practitioner has no doubt had an influence on how I think about the Constitution of the United States. My interest in that document has never been that of a historical scholar. If my interest had been purely scholarly, I would probably have been inclined to be sympathetic to some form of originalism. My interest in the Constitution, however, is much more like my interest in the writings of George Fox. I read and reflect on Fox that I might be a better Quaker, and I read and reflect on the Constitution that I might be a better citizen of the particular constitutional democracy into which I happened to be born. Given that orientation to reading the Constitution, I have relatively little interest in how the people who wrote the text, and those who voted to ratify it in 1789, saw the world. Since they lived and wrote, the world has been exposed to and enriched by the thinking of Charles Darwin and the tens of thousands of scientists who take his work as a point of departure; and to astronomical research that has resulted in a view of the universe that the Founding Fathers could not even imagine; and to quantum mechanics, which has resulted in a view of the universe that no one can imagine; and to neurophysiology, which has resulted in transformations of how we view personality, agency, responsibility, and what people used to call the self, the mind or the soul; and to Hegel and Nietzsche and Kierkegaard, who have made naivety in most matters impossible; and to Emerson, Thoreau, the Transcendentalists, the Pragmatists and numerous kinds of religious and philosophical pluralists; and to depth psychology; and to generations of brilliant litterateurs, social commentators, political thinkers and essayists.
Since the Constitution was written, the United States has expanded across a continent, been blessed with waves of immigrants from all parts of Europe, Asia and Africa and survived several devastating wars from which surely numerous important lessons have been (or should have been) learned. There is hardly any aspect of modern life that would be recognizable to the authors of the Constitution. Why should the way we think and act in the world be recognizable to them? To expect people today to ignore all that has happened in the past two and a quarter centuries and to eschew all the wisdom gained from those happenings and to hew to the world view of the Founding Fathers is as unreasonable and impractical as it would be to expect great grandparents to continue thinking and acting as they did as toddlers.

There is no doubt in my mind that Justice Scalia was a deeply learned and highly intelligent scholar of the text of the Constitution. There is equally little doubt in my mind that an enduring or dead Constitution is no more than a historical artifact, as impractical in today’s world as a horse and buggy. What is needed is a method of interpretation of the Constitution that allows for changes in human thinking resulting from scientific discovery, developments in the humanities and social sciences and trends in the arts. The results of such interpretation would no doubt sometimes be wild and unpredictable, and occasionally discomfiting, exactly as life itself is. To be sure, not all change is for the better, but all change must be acknowledged to have taken place, for better or for worse. To ignore change is delusional. To resist it is futile. To embrace it is alone conducive to flourishing.

Written by Richard P. Hayes (Dayāmati Dharmacārin)

Wednesday, February 17, 2016 at 17:32

Posted in Social analysis

The culture of self-promotion

with 2 comments

From both of my parents and all four of my grandparents, I inherited a distaste for self-promotion—even the indirect forms such as being patriotic or proud of one’s school or place of residence or of other members of one’s own family. Early childhood conditioning tends to be persistent, so to this day I inwardly cringe upon witnessing displays of self-referential praise.

Years ago, I was on an academic committee considering a faculty member for promotion. I knew and admired the candidate, and there was little doubt in my mind about his being worthy of promotion. That notwithstanding, I found myself put off by his supporting documentation. Rather than simply submitting the required teaching evaluations, he supplied an accompanying document quoting selected phrases from comments that students had made; these selected words of praise were isolated from the surrounding narrative by being placed in text boxes and formatted in a large and bold font so that there would be no missing how highly his students thought of him. Offprints of his publications were accompanied by a similar document, featuring laudatory remarks that reviewers had made of his work, also placed in text boxes and set in boldface type in a larger font size. The presentation felt like an advertising brochure, as though the candidate somehow believed that the only way to earn an academic promotion was to display a capacity for self-promotion.

I was not the only member of that committee to be put off by the presentation. An older colleague, nearing retirement age, commented that universities nowadays are almost forcing their employees to expunge all traces of modesty and humility from their behavior, if not from their mentality. Department chairs are expected to write annual reports assuring the university administration that their department is filled with world-class scholars and universally admired instructors. By the early 1990s, candidates for promotion in the academic world could no longer submit a simple letter and a typed résumé. They had to write 10-page descriptions of their goals as teachers and scholars, accompanied by ample evidence that they were accomplishing those goals and that their accomplishments were being recognized by others living near and far. When I sought my first promotion, it was still possible to submit a brief letter and a typed résumé. By the time I reached the stage of my career at which I might seek a promotion to the next level, all that had changed dramatically. I found the new process so unpleasant to contemplate that I never sought another promotion after that first one—and I was amply rewarded by never getting another one. Putting together the expected sort of dossier was not worth the time and effort, but more to the point it was not worth the violence to my sense of dignity. Ironically, my sense of self-worth would have been undermined by having to present myself as worthy.

It is not only the academic world that has steadily gravitated toward a culture of vainglory. Far from being an unpleasant feature of a bloated ego, fulsome self-congratulation now seems to be expected. Just as no commercial product can afford to be presented simply as adequate to the task but must be portrayed as better than all its competitors and indispensable to the discriminating consumer, no person can afford to be seen as merely competent. Pretty good is just not considered good enough anymore.

During an election year in the United States of America, voters are treated to a parade of candidates who not only toot their own horns incessantly but also boast about their country as the greatest country in the world, even as the greatest country in the history of the world. Some of the candidates go so far as to disparage political leaders who do not participate in their jingoistic frenzy; those not caught up in nationalistic fervor are characterized as actually hating their country and wanting to drag it down to the same level as ordinary countries. An ordinary country is one that has affordable health insurance and reasonably priced medical services and pharmaceutical products for everyone; reasonable tuition fees and generous food, housing and transportation subsidies for students pursuing a higher education; a modest-sized military of men and women trained mostly to help citizens cope with natural disasters such as floods, hurricanes and earthquakes; and a prison system designed to reform and educate miscreants rather than punish them. An ordinary country does not have a bloated military budget that is used to send personnel and materiel to countries all around the world and to build permanent military bases in more than a hundred other nations. Americans these days who long to live in an ordinary nation are advised to go live in Canada or Northern Europe, for the United States is a nation for those who wish to participate in excelsior.

In the 1950s, the psychologist Carl G. Jung said in an interview broadcast in English that the United States as a nation is “extraverted like hell.” The quiet reflection of the introvert is deprecated to such a degree that the system of public education is skewed in favor of gregarious doers whose energy is dedicated to making changes in the world rather than in one’s own attitudes and expectations. The thriving industry dedicated to selling products designed to help people realize their dreams of “self-improvement” tends to focus on how to be more self-confident, more assertive, more aggressive, more successful by external standards of assessment, more admired by the crowd. Jung chose his words carefully; an overly extraverted country truly is like hell.

Although the United States could be described as “extraverted like hell” in the 1950s, it appears not to have always been that way. Neither the New England where some of my ancestors were born and lived, nor the Midwest in which others of my ancestors made their way from the cradle to the grave, according to what I heard from family elders, had much room for the braggadocio and narcissism that have become so prevalent in today’s culture.

It is possible that my elders’ memories of the prevalent culture of their early days were faulty. Perhaps they were just getting old and slowing down, as I have managed to do a few decades after they passed on. Perhaps the world always seems too fast-paced, too forceful and too brash from the perspective of a rocking chair on the porch with a commanding view of an array of bird feeders. Or perhaps a culture of modesty and moderation really has been mostly replaced by a culture of excess and hubris.

Written by Richard P. Hayes (Dayāmati Dharmacārin)

Friday, February 5, 2016 at 18:09

Posted in Social analysis

Just deserts

leave a comment »

“Some people are born on third base and go through life thinking they hit a triple.” — Barry Switzer, quoted in The Chicago Tribune, 1986.

One of the observations I remember from the only sociology course I ever took (some forty-five years ago) was that people who are wealthy tend to believe that they earned their wealth and therefore deserve it, while people who are poor tend to believe that wealth is mostly a matter of luck and has little to do with just deserts. Whether a person justly deserves what he or she gets is probably one of those questions that cannot be answered, because there is no clear criterion for what makes one’s fortune just or unjust. Insofar as there is any truth to the matter, it is probably simply that what happens happens, and what one gets is what one gets. Justice does not enter into the picture most of the time, but that hardly prevents human beings from reading justice or injustice into almost everything that occurs in life.

I have no intention of trying to convince others that what has happened to them is just or unjust. What I intend to do instead is to reflect on my own life in a way that extends an invitation to others to reflect on their own lives. Whether they will reach the same conclusions I have reached, I neither know nor care.

There is almost nothing concerning the basic circumstances of my life for which I can take any credit at all. With very few minor exceptions, I have enjoyed good health. When I look at the illnesses and injuries and infirmities that many of my friends and acquaintances have had, I realize that I have had remarkably good fortune, none of which I can claim to deserve. A good deal of one’s health is a consequence of genetic inheritance, a matter over which no one has any control. Other factors in health have to do with the circumstances of one’s life and the conditions of one’s environment. As a child I was fortunate to live in mostly healthy environments, a fact made possible because the family I was born into could afford to choose where we lived. From the choices my parents made I derived a good deal of benefit, but I played no significant role in making the decisions from which I derived benefit.

One of my earliest memories is being taken along with my parents to an office in which they conducted what seemed to me an interminable and crushingly boring business transaction of some sort. What they were doing, I later learned, was buying a life insurance policy in my name into which they paid a modest amount every month until I was eighteen years old. When that policy was cashed out, it provided enough money to pay for my college tuition and room and board. Through no effort of my own, I was in a position to get a good education. I did not get as good an education as my opportunity allowed for, because for the first two years I made hardly any effort to learn anything except what I found interesting and stimulating. As luck would have it, I had acquired a good curiosity from the adults in my life, so I was interested in just enough to keep going from one year to the next, but it could hardly be said that I was disciplined. I was far more hedonistic than disciplined, and whatever work I did was a result of happening to enjoy work rather than a result of doing what anyone else expected me to do.

My parents, as I mentioned above, had the means to make good decisions that were conducive to my wellbeing. To some extent that was because my father had a job that he loved to do and that paid him reasonably well. I benefited from all that, but I contributed nothing of my own to either my good fortune or my parents’. The comfortable circumstances my family was in were due only in part to my father’s earning a steady living wage in his profession. Not an insignificant part of our good fortune came from the fact that some of our ancestors had become wealthy in industry and had passed their wealth down through several generations of people who had done nothing at all to contribute to the business that generated the wealth they had inherited. No one who inherits prosperity can be said to be deserving of that prosperity. Having it is blind luck.

It could perhaps be said that I have played some minor role in having had a good life. But even my ability to play those minor roles was inherited, either genetically or culturally. Without making any real effort of my own to do so, I managed to acquire productive attitudes from my parents and their friends. The adults in my life were, with very few exceptions, good role models, and I imitated their examples, because imitation is what children do best. That I was surrounded by good examples to imitate was entirely a matter of luck, not something I deserved to have through my own hard work or good sense.

Perhaps because I am so aware of how much good luck I have had, it has always been difficult for me to understand how easily people come to believe that they deserve what has come their way, that what they have received has been earned rather than given to them by others, often quite gratuitously. That individuals can believe that they have somehow earned their fortunes, whether good or bad, is not entirely a matter within their individual control. We are all influenced by the society in which we live, and it turns out that most societies have devised a mythology according to which there is some justice to what happens to people.

Some societies, for example, have a mythology of karma, a belief that happiness is a natural consequence of doing what is right and good and that misery is a natural consequence of doing what is wrong and evil. The notion of karma often accompanies a belief in rebirth or reincarnation, so that what happiness one has in this life can be seen as a natural consequence of altruistic deeds done in a previous life, and what ills one experiences in this life is but the ripening of selfishness in a previous life. The greatest virtue of this belief is that it is completely impossible to test. It cannot be verified, nor can it be refuted, and there is therefore no great risk involved in holding the belief. There may even be some benefit, both to the fortunate and to the miserable, in believing that there is some sort of cosmic justice behind how fortune is dispensed. The fortunate can enjoy their good fortune without having their enjoyment spoiled by awareness of the less fortunate. And the miserable can console themselves in the belief that they are learning a lesson of some kind and that by making a few good decisions in this life they may have better fortune in the next life.

Other societies have other mythologies that smooth the rough edges of misfortune. The philosopher Leibniz summed up the convictions of his Christian worldview by articulating the doctrine that God cannot possibly be anything but good, and that God is omnipotent and omniscient. What follows from this, according to Leibniz, is that God can only have created the best of all possible worlds and that whatever happens in this world is therefore good. That a set of circumstances seems not to be good is only because it is being viewed from a limited perspective that blinds one to the larger picture. The mouse that is being gobbled up by the cat, for example, sees this event as a misfortune only because it cannot see that it is participating in the goodness of the cat’s being provided its nutrition. The person dying of cancer sees the condition as a disease because she cannot see that she is participating in the goodness of making room for others to have their turn in leading a good life. Like the doctrine of karma, this conviction has the virtue of being beyond the reach of tests that could either confirm it or refute it. Those who accept the doctrine as true have only to have faith that God would never do anything truly harmful to them and that everything that happens to them, no matter how it may seem when viewed superficially, is in fact to their overall benefit.

A substantial part of American society subscribes to some version of the myth that those who have good fortune have it because a benevolent God is rewarding them for their virtue and that the unfortunate are miserable because they are being punished for their vices. This way of thinking made it possible for European Americans to feel justified in owning slaves and conducting genocidal campaigns against the occupants of lands that they wanted for their own purposes. Throughout much of American history, preachers have been available to support the essentially plutocratic and anti-democratic dogma that the wealthy and powerful deserve all their comforts while the poor are simply reaping the consequences of their lack of ambition, their laziness and their poor attitudes. A good deal of the resistance to social welfare programs can be traced to the effectiveness of preaching such doctrines, and to preaching the doctrine that everything good is a gift from God rather than a gift from good human beings striving to make good fortune more a matter of good planning than of blind luck.

If the truth is that we all get what we get, not because we deserve it, but because of an essentially amoral universe dispensing blind luck willy-nilly, it is not a particularly pleasant truth. Finding anything satisfactory in it is probably at best an acquired taste. The unpleasantness of the view, however, hardly disqualifies it from being true. There is no reason to claim that truth must be palatable. If one observes life with a degree of impartiality, it does seem that this view of amoral blind luck is a candidate for being considered true. There are, after all, plenty of scoundrels who seem to get away with their selfish domineering actions with impunity, and there is no shortage of people who are hard-working and generous and loving and cooperative but who just barely make it through life. There are plenty of people who never receive the appreciation and recognition and credit for their virtues, and plenty who take credit and get recognition for what others have done.

What happens is what happens. Seeing any rhyme or reason to it, seeing justice or injustice in it, is subscribing to a story that adds a gratuitous layer of comforting fiction to the small gritty core of fact. Do people who do not separate fiction from fact get what they deserve? Who can ever know?

Written by Richard P. Hayes (Dayāmati Dharmacārin)

Monday, January 20, 2014 at 17:15

Posted in Social analysis

Whose money?

leave a comment »

Many claims that sound sensible on first hearing evaporate into nonsense if one takes the trouble to think about them a little more carefully.

I know it’s going to be the private sector that leads this country out of the current economic times we’re in. You can spend your money better than the government can spend your money. (George W. Bush)

George W. Bush’s folksy encomium of the private sector and derogation of government echoes similar sentiments repeatedly voiced by Ronald Reagan, who said:

Entrepreneurs and their small enterprises are responsible for almost all the economic growth in the United States.

Government always finds a need for whatever money it gets.

Government’s view of the economy could be summed up in a few short phrases: If it moves, tax it. If it keeps moving, regulate it. And if it stops moving, subsidize it.

Such shallow one-liners managed to convince quite a few voters that they would all be much better off if private entrepreneurs were allowed to keep their money rather than being taxed. The assumptions behind this claim were 1) that wealthy people would invest their money in ventures that would employ people, and 2) that whatever money anyone has rightfully belongs solely to the person who has it. The first of these assumptions led to such incessantly repeated slogans as “job-killing taxation.” The second assumption led to the conviction that taxation is a kind of theft from honest people by essentially dishonest governmental policy makers.

Eight years of Reagan’s systematic dismemberment of the body politic showed how bankrupt the first of those two assumptions is. As commercial enterprises were deregulated and corporate taxation was reduced, the gulf between the wealthy and the poor increased dramatically. As predicted, wealthy entrepreneurs invested their money in ways that employed people. They showed a strong preference for employing people who lived in countries in which environmental regulations were feeble and where there were low standards for protecting the health and safety of low-wage workers. Workers in countries with laws protecting them from unsafe working conditions and assuring them a wage that did not keep them in perpetual poverty tended to be deemed greedy, and their skills were considered to be overpriced. The factories that used to employ them were moved to third-world countries. The majority of voters who swept Reagan into office, and then kept him there for a second term despite an unemployment crisis, were arguably not among those who benefited most from his faulty assumption. The minority of voters whose net worth placed them in the wealthiest 2% of the nation, however, fared very well under Reagan. They fared well again under Bush père and Bush fils. Even during the time of Clinton they did not fare badly. The extremely wealthy have been faring quite well in the United States since 1980.

The second assumption is that if money is in your bank account, or invested in your stock portfolio, then it is yours pure and simple. For a government to take it away from you is therefore a kind of theft. What this seductive assumption fails to take into account is that none of us would have any money at all if it were not for the social contract that forms governments and establishes currencies and regulates banks and investment institutions. All money is essentially social in nature, which means that none of us would have anything if it weren’t for the social fabric in which each of us is but a minor thread. Without the social conventions that underlie the monetary system, a $1000 bill is just a piece of paper with Grover Cleveland’s picture on it, and one’s balance in a bank account is nothing but a meaningless number in an electronic database. Even if one’s life savings is in gold bullion or real estate, those things have only as much value as the rest of society agrees they have. Supposing one invests one’s money in the stock market and it increases in value, that increase would never have been possible without hundreds or thousands of other investors and the labor of a multitude of employees. That increase in the value of an investment no more belongs rightfully to the investor than to the host of people whose work and cooperation made it possible for the investment to increase in value. One of the most important factors that makes an investment possible is a functional government. So surely a substantial portion of what any investor considers his (or her) money in fact belongs to society at large, and to government in particular.

When one looks at the wide variety of ways that people choose to spend their money, some of them wise and many of them foolish, it is not at all obvious that George W. Bush was speaking accurately when he said “You can spend your money better than the government can spend your money.” Private citizens seem every bit as capable of squandering fortunes as governments. It is no less true of individuals than of governments that they find a need for all the money they get. Few people seem to feel they need less than they have. Even the very wealthy often seem to feel they need every bit of what they have; otherwise, they would not spend as much time and energy as they do to avoid paying taxes.

People who increase their wealth through investment, people who inherit wealth from their relatives, people who increase their wealth by selling goods and services for more than they pay for them—none of them can make the legitimate claim that their wealth is rightfully theirs alone. The wealth of all such people is conditioned by social mechanisms of which they are not fully in control and for which they therefore cannot claim full credit. Even those whose modest income comes to them as a result of selling their labor have what they have as a result of a social network that makes the selling of labor possible. The very idea that your wealth is really yours and that it need not be shared with the rest of society that made it possible is, when examined more carefully, a vacuous idea. Like all vacuous ideas, it makes a very poor foundation for a life worth living.

The shibboleths of the political and economic right in the United States are, with hardly any exception, the war whoops of plutocrats who are waging—and winning—a war against the middle class and the poor. (Plutocracy is government of the people, for the wealthy, by the politicians purchased by the wealthy; it is the form of government now found in the United States, Saudi Arabia and Libya, and the form of government that used to be found in Tunisia and Egypt.) It is difficult to see the Wisconsin government’s newly passed law that strips public workers of their rights to collective bargaining in any other way than as a war on the general public by the plutocrats. It is difficult to see the proposed cuts in the public funding for health care, education, scientific research and public broadcasting in any other way. People who can think for themselves, and people who are well informed, and people who care about justice, and people who wish to have a voice in making decisions about policies that will affect their lives are not very good for a thriving plutocracy. Keeping people ignorant and complacent and so worried about their livelihoods that they will never speak up is what a plutocracy needs. And that is what an increasing number of elected representatives are delivering—policies that favor the wealthiest 2% of the population at the expense of the remaining 98%.

As long as we are remembering quotations by Ronald Reagan, it may be worth remembering that he said this:

We might come closer to balancing the Budget if all of us lived closer to the Commandments and the Golden Rule.

As far as I can tell, few of the Ten Commandments would have much impact on the budget. The only one I can think of is “Thou shalt not kill.” If that were followed, the military budget would probably be about one-tenth its current size, and there might then be sufficient resources left over to enable us collectively to love our neighbors as we love ourselves and to do unto others as we would have others do unto us—such as take care of us when we are ill or when we meet misfortune or when we are struggling to make an honest livelihood.

An interesting chart on class warfare presents statistics that make its point well. (Also interesting are a few of the comments supplied by admirers of the Reagan-Bush plea for minimal government.) An excellent analysis of the 2011 budget is available on the website of the Friends Committee on National Legislation.

Written by Richard P. Hayes (Dayāmati Dharmacārin)

Saturday, March 12, 2011 at 17:02

Posted in Social analysis