
The Meaning of It All
Thoughts of a Citizen Scientist

Book Review

March 2000

Richard P. Feynman, along with two others, won the Nobel Prize in Physics in 1965 “for their fundamental work in quantum electrodynamics, with profound consequences for the physics of elementary particles.” Quantum electrodynamics is also the basis for Michael Crichton’s new novel Timeline, in which history is studied by direct observation rather than only through surviving artifacts.

Humanists claim to be guided by science and the scientific method, but how many of us really know what this means? Unfortunately, many people are intimidated by the dense technical jargon that scientists use. Carl Sagan, another great 20th-century scientist, was very concerned that science is viewed as a kind of elite club, perceived to be out of reach for most people.

The Meaning of It All: Thoughts of a Citizen-Scientist (Perseus Books, 1998) is a transcript of three lectures that Feynman presented in April 1963 at the University of Washington in Seattle as part of the John Danz Lecture Series. The titles of the lectures are “The Uncertainty of Science,” “The Uncertainty of Values,” and “This Unscientific Age.” The entire book is only 122 small pages of fairly large print, and yet it describes very eloquently, in layman’s terms, the nature of science, the relationship between science and religion, and what science can and cannot do.

The book is obviously a verbatim transcription from an audio recording of the lectures. In places the punctuation is poor, some sentences are incomplete, and some of the constructions are awkward. Nevertheless, this is an important book that I recommend to all humanists. It will help you understand what science is and what it is not.

–Wayne Wilson

Discussion Group Report

What Freedom is Found in the Local Culture?

January 2000

By Richard Layton

In a lecture in the University of Utah Great Issues Forum in the early 1960s, Professor Waldemer P. Read of the University’s Philosophy Department addressed the question posed in the title of this article.

In preparing for the talk, he said, he, who had been born and raised a Mormon and had left the church, asked himself, “Why should I have the effrontery to talk to my own people about their bondage?” Then on one occasion he heard the sound of the Nauvoo (Illinois) bell and heard the announcer declare that this bell had special significance: it rings for freedom. But his own reflections had led him to this conclusion: In Utah we enjoy the political and civil liberties that are characteristic of America as a whole. “In themselves,” he states, “they scarcely justify the distinctive claim made for the Nauvoo Bell. Such justification would seem to require that this culture and its people have a greater than usual appreciation of these freedoms, and a greater than usual zeal for their protection, preservation, and enhancement. It has been my impression that such has not been the case.”

During the rise of Nazism, Utahns were neither distinctively clairvoyant about nor concerned with the nature and seriousness of its threat to freedom. Almost boastful reports came from missionaries in Germany and their mission president that, though the Catholic and Protestant clergies were having difficulties with Hitler, the Nazis saw nothing in the activities of the Mormon missionaries to alarm them. Perhaps the claim that the Nauvoo Bell tolled for freedom had an eschatological (i.e., otherworldly) reference and had nothing to do with the political freedoms and civil liberties of the here and now, Read suggested.

In his opinion, McCarthyism had been the most serious internal threat to freedom to which Americans had been exposed, at least during the previous half-century. Local leadership, in both church and press, had been woefully silent on this subject.

Read put forth a definition of human freedom as freedom of the mind. The ability to pursue one’s desires is a condition of freedom; an increase in the ability to do increases freedom. Therefore, the literate man is freer than the illiterate. All increase in mental powers is an increase in freedom. Other conditions being equal, the individual who can think new thoughts–thoughts that no one before has thought–is freer than those who cannot; and the society whose membership includes individuals who can think new thoughts is free–to a degree which varies directly with the proportion of its membership having this capacity. Excessive channelization–stabilization of the patterns of imagination, of conception, and of judgment and belief–is the foe of creativity and “the friend of the status quo, of sameness, monotony, and death,” Read said. He quoted A. P. Ushenko: “Perpetual endurance of the actual status quo degenerates into stagnation.”

William F. Albright observes, “A group may be so completely integrated that it exhibits little internal friction, a high degree of efficiency in accomplishing its purposes, together with self-sufficiency and smugness–but it will accomplish little of value for the world.” And Bertrand Russell adds, “…those who believe that the voice of the people is the voice of God may infer that an unusual opinion or peculiar taste is almost a form of impiety, and is to be viewed as culpable rebellion against the legitimate authority of the herd. This will be avoided if liberty is as much valued as democracy, and it is realized that a society in which each is a slave of all is only a little better than one in which each is the slave of a despot.” John Stuart Mill made an eloquent appeal for freedom of thought and speech, freedom of action, taste and pursuit as essential conditions for freshness, vigor, vitality, and the continued enrichment of the life of the human spirit. Von Humboldt supported the idea of individuality “as one of the elements of well-being.”

Also vital to our well-being, said Read, is independence of judgment and belief. We can discern truth from falsehood only if we have an adequate sense of evidence, i.e., a sense for what sorts of consideration should guide the attempt to identify the true. It is not clearly recognized that belief is not in itself an indication of truth, that subjective certainty is of no evidential significance. Faith is no substitute for evidence. Nor is the comfort that an idea gives a mark of its truth. Only two sorts of considerations are legitimate for the identification of true propositions: considerations of empirical fact and of logical relation.

Human beings can be controlled through control of their minds–thought control. The more sophisticated of us have known that since the beginning of human society men and women have been committed to beliefs, policies, and practices without knowing why they were committed. Logic texts have pointed out a group of fallacies that often lead people off track in the search for truth. These fallacies are generated when, by the arousal of the emotions, the critical faculties are thrown off guard, the attention is diverted, and the idea being advanced gets past the censor without being examined for its credentials–and once accepted by the mind it will be defended by the mind. A process of “conditioned response” has occurred, which is logically invalid though psychologically effective. It is what lies behind tenacious beliefs that cannot be intellectually justified, and it is often used as a means of manipulation, an instrument for the control of people. Individuals become members of society not through reasoning but by conditioning. Through conditioning, every family and church group recruits and controls its members. This is not necessarily bad. It is good up to a point, for we are institutional animals; but beyond that point it is deadening. Institutional control is good if the institution is open at the top, so that the individual may transcend the very forms that lifted him. But if the institution is closed, then the control is bad: it shields him but limits him, using him as one of the elements in the truss which holds him up. Institutions of the first sort liberate the human spirit; those of the latter kind imprison it.

Read made two points about the local culture: 1) that the controls in this culture are excessive; and 2) that they are unfortunately so. There is a stifling uniformity of belief. Imagination is not stimulated and judgments are not challenged by conflicting opinions; rather, the belief of each reinforces and sustains the belief of the others. A condition requisite for the cultivation of freedom is diversity of opinion, making possible habituation in the search for and examination of possible alternatives. In Mormonism the beliefs tend to reinforce the uniformity. They tend to insure that no discussion will get out of hand, that no heretic will run away with the argument, and that The Truth will always prevail. Three such beliefs are: 1) belief in the absolute certainty of the doctrine (the dogmatic attitude); 2) belief in the wickedness of doubt; and 3) belief in the authoritative hierarchy–all three conditioned responses. Dogmatism is inimical to freedom of thought; it denies the need for inquiry and further research. On the adoration of faith and the distrust of doubt, Read says, “The free mind recognizes that the question of truth…is prior to the obligation to believe. The insistence upon faith begs the question of truth. The local culture penalizes the reluctant believer by holding him suspect as to character.” The virtue of deference to authority is thought to be one of the strongest assurances of salvation, but it is an abnegation of individual responsibility in thought. Another factor of control is the highly articulated ideology: one begins with acceptance of the scriptures as authoritatively interpreted, and from there on all is clear sailing. Not many members are fully aware of the extent to which their conclusions rest ultimately upon psychological rather than logical grounds. Finally, a feature of the culture that makes for excessive control is the monopolistic nature of the program. The home is a conditioning agency for the church. Meetings, suppers, socials, lessons, dances, celebrations, testimonials, fellowship, fireside meetings, seminaries and church institutes, and the church basketball league are conditioning agents dedicated to the psychological sale of the central beliefs. There is a persistent attempt to keep every individual involved in church activity for as many of the waking hours of his life as possible, often even at the expense of other legitimate individual interests. As in all cultures, the cords that bind the minds of people do not chafe or gall like the chains of the ancient dungeon; rather, they warm and comfort. The sweetness of the bondage is its greatest strength. As Rousseau said, “They love their servitude.”

If the people, then, love this control, why is it unfortunate? For one thing, there is the monotony resulting from a successful perpetuation of the status quo. We would seem to be headed for Russell’s “new prison, just, perhaps, since none will be outside it, but dreary and joyless and spiritually dead.” However, whether we like it or not, tomorrow things will be different. “There never was a time when the world, and, particularly, the United States, had greater need for new ideas,” says Read. What is to be regretted is “…that the local culture is so geared to preserve its theology that it is incapacitated to contribute or support needed new insights and conceptions bearing upon national policy and action.” The people under the local culture are saddled with the following ideological hindrances, which make it unlikely they will contribute anything of significance to the solution of the problems that confront this nation and the world: 1) an antiquated, doctrinaire economic conservatism, with business-corporation mindedness, which incapacitates people for solving the problems of human well-being; 2) a built-in isolationism which prevents enthusiastic participation in efforts to establish world peace; 3) an “exclusivism”–the “we are right and you are wrong” attitude requiring that the world be made over in their own image, instead of a vision of peaceful coexistence preserving and protecting the distinctive values of each culture; and 4) a built-in racial prejudice. This last item has been somewhat ameliorated since the speech was given, but I would suggest adding to the list of hindrances a built-in sexism.

One might wish that Utah’s contribution to solving the great problems could be more than foot-dragging, Read says; “…but such would require a quality of inner freedom that we do not have, and that we are not about to develop.”

Ethics vs. Making a Living

August 2000

INTRODUCTION

Tonight I am going to talk about the choices open to an employee who finds herself in an ethical dilemma in the workplace, and the possible consequences of exercising those options.

I am also going to present a framework for THINKING about what to do if you are asked to do something unethical or illegal.

I am NOT going to tell you how to decide if you are being faced with an ethical dilemma–that is different for each person, as you will see when you hear about the framework for thinking about this problem.

Gandhi included in his SEVEN BLUNDERS OF THE WORLD THAT LEAD TO VIOLENCE one about “Commerce Without Morality.” His words were:

“Commerce Without Morality: As in wealth without work we indulge in commerce without morality to make more money by any means possible. Price gouging, palming off inferior products, cheating and making false claims are a few of the obvious ways in which we indulge in commerce without morality. There are also thousands of other ways in which we do immoral or unethical business. When profit making becomes the most important aspect of business, morals and ethics usually go overboard. We cut benefits and even salaries of employees. If possible we employ “slave” labor, like the sweat shops and migrant farm workers in New York and California where workers are thoroughly exploited. Profit supersedes the needs of people. When business is unable to deal with labor it begins to mechanize. Mechanization, it is claimed, increases efficiency, but in reality it is instituted simply to make more money. Alternate jobs may be created for a few. Others will fall by the wayside and languish. Who cares? People don’t matter, profits do. In more sophisticated language what we are really saying is that those who cannot keep up with the technological changes and exigencies of the times do not deserve to live–a concept on which Hitler built the Nazi Party. If society does not care for such people, can we blame them if they become criminals?”

Gandhi was speaking at an earlier time, and about the point of view of the greedy entrepreneur. Little did he know that matters were going to get a whole lot worse–that many ordinary employees, with no stake in ill-gotten profits, would be expected to carry out unethical or illegal activities in the routine course of their duties if they wanted to go on feeding their families. Unethical acts that endanger public health and safety, or that defraud, are widespread. They range from falsifying time cards for projects, or fudging test results for buildings or equipment that might subsequently fail and kill people, to bypassing safety rules in nuclear power plants, with the possible result of wiping out life in large parts of the planet.

The trend is for more and more Americans to be employees. Professions that formerly offered the opportunity for private practice and greater freedoms are increasingly being brought into large organizations. Eighty percent of all engineers are employees, and more and more physicians and architects are being pressured to become employees. As employees, we pretty much have to do what we are told. That’s part of the deal. But what if we are told (or expected) to do something that is illegal or that may cause serious harm to others? Consider the plight of a physician in a managed-care setting. It is one thing to be told to “keep costs down” and quite another to be forbidden, upon pain of dismissal, to even discuss with patients treatment options that the employer deems too costly.

DEFINITIONS AND DISTINCTIONS

In this talk, I draw a distinction between ethical issues, which are based on broadly accepted standards of right and wrong as laid out in codes of ethics, and issues of personal morality pertaining to individual conduct. Aspects of one’s job may offend one’s PERSONAL moral sense without being unethical or illegal. For example, a Muslim might find it offensive that female coworkers do not cover their heads. Or, an employee may think that the goods and services provided by his organization are offensive. In such cases, the employee can either put up with it or leave.

Instead, I am going to talk about cases in which an employee is asked or expected to perform illegal or unethical acts, or in which an employee becomes aware that illegal or unethical activities are being carried out, where “unethical activities” are those that might cause damage to the public, the environment, or fellow employees.

If the determination that an act is unethical derives from established codes of ethics, there is more societal support than if it is simply derived from a personal ethical sense. There is little enough societal support in any event, but if an employee is damaged and seeks recourse through the courts, reliance on an established code of ethics makes a much stronger case than reliance on a personal sense of ethics.

I am also not going to talk about specific codes of ethics, except to say that they exist. There are THOUSANDS of ethics codes for various professions and employment categories. Teaching ethics is also a big business–engineers, business students, and others are trained to split ethical hairs to a precise degree. However, enforcing these codes is like belling the cat: enforcement is VERY difficult. Courts, though, do sometimes look at whether ethical objections are based on established codes of ethics for a profession or field. If a dismissed employee’s ethical objections are NOT based on an established code, the courts often throw the case out as being simply a personal matter. In other words, you may know that it is wrong to kill people, but you had better find a code of ethics that specifically mentions not taking actions that harm human beings!

Standards of ethics should also be distinguished from laws. The fact that an action is legally permissible does not establish that it is morally and ethically permissible. Similarly, illegality does not imply immorality. We can all construct examples, based on our own experience and on common sense.

Moral statements are statements that something is right or wrong. There can, of course, be disagreement over moral statements. For example, employees may disagree about the morality of killing civilians, so that one engineer could work on a particular defense contract in good conscience and another could not. A code of ethics might simply mention not doing harm to human beings–leaving the interpretation up to the individual employee.

CHOICES AND OPTIONS FOR HANDLING ETHICAL PROBLEMS AS AN EMPLOYEE

Instead of discussing the ins and outs of developing and debating ethics standards, what I AM going to talk about is the various choices an employee has when faced with an ethical dilemma and the possible outcomes of those choices. In other words, I cannot tell other people what constitutes an ethical workplace problem–this is different for each person–but when you get into such a situation, you will KNOW it, and what I am providing is a taxonomy of the options and the possible outcomes of exercising them.

I am also going to present a framework for THINKING about ethical problems in the workplace–hopefully BEFORE one is faced with a serious ethical problem. “Ethics” is often a gray area, so I am going to restrict my comments to cases where there is an established code of ethics for a profession or field and one is being asked to violate it.

The best course of action is to AVOID being put into a situation where one would have to perform illegal or unethical acts, but what about KNOWLEDGE of such acts if you think the consequences are dire? Can one in good conscience remain silent?

Being expected to perform actions that are illegal or unethical is usually a no-win situation. If we refuse, we get fired and possibly “black-listed” as “troublemakers” or–the ultimate sin these days–“not team players.” If we go along and do what we are told to do, we risk not only the bad conscience of doing harm but the very real consequence of getting blamed if the illegal acts are discovered. That usually results in loss of career or employment, and in some cases personal legal liability. The Nuremberg Defense (“I was just following orders”) doesn’t work very well even in the aftermath of real wars, during which ethics and morals have been abandoned in the interest of winning.

Consider the unfortunate engineer who had to sign off on the flight of the Challenger shuttle in 1986. He had not designed the O-rings, nor been responsible for the range of tests performed, but he was in a position where the shuttle could not take off without his approval, and his best engineering judgment was that the flight would be unsafe at the very cold temperatures that day. He hesitated, and had a code of engineering ethics to back him up, but both his own management and government managers brought a lot of pressure to bear. There were serious political considerations, both for his company and for the U.S. Government. This was toward the end of the Cold War, and President Reagan had made a major point of the shuttle flight with the schoolteacher aboard. The engineer was told to “Take off your engineer’s hat and put on your manager’s hat.” He did so, with results that we all know about. Who can ever forget the television coverage of that vapor trail dividing into two separate arms?

What would have happened to this man had he stood his ground as an engineer and refused to sign? He would no doubt have been fired. As it was, his career was essentially over after the shuttle exploded. That is what I mean by a “no-win” situation. This is a famous case, taught in many university courses on ethics. The question for the student–what should he have done?–is usually qualified by the caution to remember the man’s age and to think about his chances of finding another job. Students are also urged to consider the latitude involved in making a technical prediction that the O-rings WOULD fail.

So, if the random workings of the business world place you in such a position, what are your choices?

  • Doing nothing or delaying action is always a choice. Sometimes things blow over.
  • Attempting to negotiate with one’s superiors is an honorable and forthright course, but one that is frequently not rewarded with a gold star.
  • Going higher in the organization, skipping levels, is also a choice. This sometimes produces a result that is good for society but not so good for the employee, against whom the skipped superiors have many forms of retaliation.
  • “Blowing the whistle” by airing one’s concerns OUTSIDE of the organization is also a choice. Whether this is the first or the last choice depends on the characteristics of the organization.

The Option of Doing Nothing or Delaying

There are lots of ways to carry out this option, depending on the circumstances. An example will be provided later.

The Option of Negotiation Through Internal Channels

If we go through internal channels, the result is very often dismissal. While we dislike beginning with the assumption that everybody is crooked, the truth is that nobody in authority wants to HEAR about such problems. The rules of war–when customary moral and ethical considerations are suspended in the name of winning the war and thus preserving one’s society–seem to have drifted downward into the organizational world. Increasingly, the competitive world of capitalism is viewed as war. The attitude is that preserving the organization is so important that customary moral and ethical considerations are suspended. Firms are merging and buying each other and getting larger and larger. The faceless corporation–which must be protected with the same fervor as are nations at war–prevails.

The Option of Voicing One’s Concerns Higher in the Organization

Sometimes this works and higher management agrees with the employee. However, that does not protect the employee’s job from retaliation by the people in between.

For example, in one instance, an engineer successfully convinced a vice-president, after having been brushed off by his immediate superiors, that one of the procedures being carried out was illegal and harmful. The procedure was then modified. This man had been with the company for 14 years. Immediately after his complaints were listened to and acted on, he was fired for insubordination. His subsequent lawsuit was dismissed on the grounds that the allegation of insubordination ruled and the ethical complaint had not been material. (Essay #6, “Loyalty and Professional Responsibility,” part of work done by Dr. Mike Rabins, Dr. Charles Harris, Dr. Michael Pritchard, and others on an NSF grant at Texas A&M University.)

The Option of Blowing the Whistle Outside The Organization

There is always the option of reporting wrongdoing through external channels–to government agencies or to the press. However, one had better resign first, since firing is sure to follow, and one may find oneself blacklisted and unable to get another job.

Anonymous whistle-blowing is an option, but this lacks credibility and often the identity of the whistle-blower can be determined anyway.

The contemporary culture is so complex that it is often difficult to foresee or even understand the eventual possible consequences of actions taken as an employee. One frequently has to accept on faith that health and safety rules have a reason–for example that a “cold pour” of concrete (freshly-mixed concrete poured on top of concrete that has already begun to set) may cause a structure to collapse at some future time–or that mixing outdated hamburger with fresh hamburger in a fast-food restaurant may make customers sick.

Taking actions that are harmful to society is not limited to architects, engineers, physicians, and others with professional credentials whose seal or signature has legal impact. Ordinary workers in food stores or restaurants or repair shops can also be placed in the position of going along with illegal activities or being unemployed. For example, a mechanic may be told to use refurbished parts and certify them as new, or to skip maintenance steps and certify that they have been performed. The head of maintenance for Alaska Airlines, in recent testimony shown on television, said that “You have to make mistakes if you are going to make a business grow.” He apparently believed that “making the business grow” justified risking the lives of airline crews and passengers.

To make the point even more vividly, consider that any of you might have been a passenger on the aircraft that crashed and burned in Florida because spent oxygen containers had been illegally shipped as cargo. SOMEBODY had to sign off on the falsified paperwork–“pencil-whipping,” as it is called–and whether the person who did so understood the possible consequences of this action is murky. However, had the person refused, he or she would probably have been fired.

Worrying about the consequences of illegal or unethical actions is what drives some people to make the hard choice to be a whistleblower. I say this is a “hard choice” because what usually happens to whistleblowers in our society is not pleasant.

WHY IS THE PROBLEM OF ETHICAL DILEMMAS FOR EMPLOYEES GETTING WORSE?

What I have said so far does not paint a pretty picture of life in the land of the free and the home of the brave.

The personal problem of what to do when asked to perform illegal acts in the course of one’s employment is becoming more severe–not just because more people ARE employees these days–but because here in the richest country in the world there is so little in the way of a social safety net that many essentials–such as health insurance–come from one’s employment. What to do in this situation if one is an employee with family members who have pre-existing medical conditions that preclude private health insurance? Where is the balance between one’s personal and immediate responsibilities and one’s responsibility to the society at large, to one’s profession, or to one’s own career?

In the 1980s, the fashion in ethics circles was that organizations should be restructured to provide alternate channels for reporting wrongdoing so that “whistleblowing” was not necessary. Ethicist Michael Davis summarized this viewpoint in a wonderful paper called “Avoiding the Tragedy of Whistleblowing” in the Business and Professional Ethics Journal (Vol. 8, No. 4, pp. 3–19). He made the point that even when the whistleblower is protected by law or policy, that employee is essentially lost as an organizational resource, because trust has been destroyed and the employee can never again be a contributing part of the organization.

During the 1990s, as globalization or just plain greed caused organizations to downsize, to restructure, and to try in every way to squeeze every last penny out of every resource, we stopped hearing about arranging alternate channels to prevent whistleblowing. Tight labor market or not, employees are now much more expendable, and it is easier to just get rid of the troublesome ones than to spend resources on organizational structures that make whistleblowing unnecessary.

A hard fact to accept is that some problems have no solution. This is one of them. What I am going to talk about is not a solution, but rather a framework for THINKING about this problem and formulating one’s own personal viewpoint.

A FRAMEWORK FOR DECIDING WHAT TO DO IF YOU ARE FACED WITH AN ETHICAL DILEMMA

Several years ago, I attended a national seminar on professional ethics put on by a reputable and respected national science organization. One talk has stayed in my mind. Unfortunately, the notes from that seminar have disappeared during the course of many moves, and I cannot give proper credit to the presenter. He was a professor at a school in one of the Carolinas, I think, and I regret that I cannot give him the credit that is his due. At any rate, I want to make it clear that the framework I am going to present here is not original–I got it from that wonderful lecture.

The presenter began by emphasizing that ours is NOT a society that likes whistleblowers, no matter how right their actions or how serious the abuses they report. So, he said, the first thing to do is to attempt to negotiate a solution based on the interests of the organization, the assumption being that illegal and harmful activities are not in the long-range interests of any organization. As I said, this was a viewpoint common in the 1980s, but by the 1990s it had begun to suffer from a much shorter planning horizon and from the increasing knowledge that in business, crime frequently DOES pay.

Failing that, he said, one should evaluate a matrix of one’s responsibilities. That is different for every person. The factors to be considered are:

  • Personal and family responsibilities to other people (This can include co-workers.)
  • Duty to the society at large (This includes the physical environment of the earth.)
  • Duty to one’s profession
  • Effect on one’s career

For each factor, one should try to estimate the worst case and the probability of its occurring. Then, the worst case results are prioritized. The talk was for engineers and scientists, so the presenter used a decision tree with probability branches and outcomes at the end of each branch. If statistics are not your cup of tea, simply list the worst-case outcomes and prioritize them in their order of significance in your life and conscience.

For example, the parent of a child with a chronic illness that would preclude obtaining private health insurance would weigh the loss of a job or career more heavily than would an employee with only him or herself to consider. A duty to the society at large might outweigh considerations of personal responsibility to family members if the consequences are grave–a nuclear meltdown, for example, in which many people could be expected to be killed.

Negotiation should be attempted, but carefully. The superiors with whom one is attempting to negotiate may know perfectly well that what they are asking the employee to do is illegal, and may not want to discuss this or continue the employment of anyone who has mentioned it.

Another option is to quietly circumvent the illegal directive, if at all possible. As an example, I offer one of my own experiences. When I was a quite junior consultant at one of the Big Ten firms, I collected, for a government contract, proprietary data from private firms with the clear understanding that these data were to be used for no other purpose. Not too long afterwards, one of the officers of the firm decided to use the data for another contract, one in the private sector. I knew there was no point in discussing this ethical–and possibly legal–violation. Instead, I went into the office in the middle of the night, removed the data from the files, and put it into the burn bag for classified waste. When the data turned up missing, nothing much was thought about it. The place was ALWAYS disorganized, and the assumption was that the data had simply been lost or misfiled.

If one has evaluated one’s personal matrix and decided that one cannot live with the ethical violation, and options within the organization have been exhausted or are believed to be ineffective, the next step is to “blow the whistle” by reporting the violations outside the organization. Whether or not this is combined with a resignation depends on the degree of statutory protection one enjoys. This is MUCH greater in the public than in the private sector. For private sector employees, the best choice is to find another job FIRST, and then blow the whistle on the violations at the previous job.

The framework discussed above is a simplified application of Utilitarian Analysis, but a powerful one. Issues of ethics are usually not black-and-white; they come in many shades of gray. Even questions about the legality or illegality of various actions are usually not simple. A structured analysis of one’s duties and responsibilities, with probabilistic estimates of the likelihood of outcomes and of their consequences, is helpful.

OURS IS NOT A SOCIETY THAT LIKES WHISTLEBLOWERS

Let me get back to the assertion that ours is not a society that likes whistleblowers. Most employment is “at-will,” meaning that an employee can be dismissed at any time, for any reason or for no reason at all.

Three major “exceptions” to the at-will doctrine are:

  • Breach of an express or implied promise, including representations made in employee handbooks.
  • Wrongful discharge in violation of public policy, and
  • Breach of implied covenant of good faith and fair dealing.

You will notice that one of these is “wrongful discharge in violation of public policy.” However, employees who are dismissed because they brought illegal practices to light do not have an easy time of it in court. As has often been said, “The business of America is business,” and courts have been known to issue judgments NOT favorable to whistleblowers, even those who only blew the whistle INSIDE their organization. For example, a woman who reported wrongdoing through channels in her firm and was then fired was not protected because she did not report the wrongdoing OUTSIDE the firm; a court found that her dismissal fell under the category of internal company reasons rather than under the violation-of-public-policy rubric.

Even when there is a whistleblower statute, enforcing it is long, expensive, and difficult. I was president of the board of the ACLU in Washington, D.C. when we took on the famous whistleblower case of Ernie Fitzgerald. Fitzgerald, a cost analyst for the Air Force, had told a Senate hearing about the true, and astronomical, cost of the C-5A airplane. For this action, he was trashed all over the government and fired. A Washington law firm with a partner on our board was representing him pro bono for the ACLU, and as I recall, the pro bono time totaled a couple of million dollars before the case was successfully concluded. It took four years for Fitzgerald to get his job back. He was then demoted, denied a pay raise, and banished to some Pentagon attic. When he was proposed for a special award from the Institute of Industrial Engineers for outstanding service, the prize was canceled. Corporate members on the board of directors included officers from Lockheed, manufacturer of the C-5A. “They’ll be after me as long as I live,” says Fitzgerald, as reported in the WASHINGTON POST.

Following is a taxonomy of things that organizations do to discredit whistleblowers and even those who try to express their ethical concerns through internal channels. This is from a summary of the Government Accountability Project’s handbook for whistleblowers, The Whistleblower’s Survival Guide: Courage without Martyrdom.

Spotlight the Whistleblowers

This common retaliatory strategy seeks to make the whistleblower, instead of his or her message, the issue: employers will try to create smokescreens by attacking the source’s motives, credibility, professional competence, or virtually anything else that will work to cloud the issues he or she raised.

Manufacture a Poor Record

Employers occasionally spend months or years building a record to brand a whistleblower as a chronic problem employee. To lay the groundwork for termination, employers may begin to compile memoranda about any incident, real or contrived, that conveys inadequate or problematic performance; whistleblowers who formerly received sterling performance evaluations may begin to receive poor ratings from supervisors.

Threaten Them into Silence

This tactic is commonly reflected in statements such as, “You’ll never work again in this town/industry/agency.” Threats can also be indirect: employers may issue gag orders, for example, forbidding the whistleblower from speaking out under the threat of termination.

Isolate or Humiliate Them

Another retaliation technique is to make an example of the whistleblower by separating him or her from colleagues. This may remove him or her from access to information necessary to effectively blow the whistle.

Employers may also exercise the bureaucratic equivalent of placing a whistleblower in the public stocks: a top manager may be reassigned to tasks such as sweeping the floors or counting the rolls of toilet paper in the bathroom. Often this tactic is combined with measures to strip the whistleblower of his or her duties, sometimes to facilitate subsequent termination.

Set Them Up for Failure

Perhaps as common as the retaliatory tactic of isolating or humiliating whistleblowers by stripping them of their duties is its converse: overloading them with unmanageable work. This involves assigning a whistleblower responsibilities and then making it impossible to fulfill them.

Prosecute Them

The longstanding threat to attack whistleblowers for “stealing” the evidence used to expose the misconduct is becoming more serious, particularly when that evidence is private property documenting illegality. Requiring employees, as a condition of employment, to sign waivers covering the confidentiality of ALL workplace information is becoming commonplace.

Eliminate Their Jobs or Paralyze Their Careers

A common tactic is to lay off whistleblowers even as the company or agency is hiring new staff. Employers may “reorganize” whistleblowers out of jobs or into marginalized positions. Another retaliation technique is to deep-freeze the careers of those who manage to thwart termination and hold onto their jobs: employers may simply deny all requests for promotion or transfer. Sometimes it is not enough merely to fire the whistleblowers or make them rot in their jobs. The goal is to make sure they “will never work again” in their field by blacklisting them: bad references for future job prospects are common.

GUIDE FOR WHISTLEBLOWERS

As for whistleblowing in general, there is an excellent guide on the web for how to go about this and what to consider (BLOWING THE WHISTLE WISELY – 12 SURVIVAL STRATEGIES, http://www.whistleblower.org/www/Tips.htm). Key issues covered are:

  • Before taking any irreversible steps, talk to your family or close friends about your decision to blow the whistle.
  • Be alert and discreetly attempt to learn of any other witnesses who are upset about the wrongdoing.
  • Before formally breaking ranks consider whether there is any reasonable way to work within the system by going to the first level of authority. If you do decide to break ranks, think carefully about whether you want to “go public” with your concerns or remain an anonymous source. Each strategy has implications: the decision depends on the quantity and quality of your evidence, your ability to camouflage your knowledge of key facts, the risks you are willing to assume and your willingness to endure intense public scrutiny.
  • Develop a plan–such as a strategically timed release of information to government agencies–so that your employer is reacting to you, instead of vice versa.
  • Maintain good relations with administration and support staff.
  • Before and after you blow the whistle, keep a careful record of events as they unfold. Try to construct a straightforward, factual log of the relevant activities and events on the job, keeping in mind that your employer will have access to your diary if there is a lawsuit.
  • Identify and copy all necessary supporting records before drawing any suspicion to your concerns.
  • Break the cycle of isolation and identify and seek a support network of potential allies, such as elected officials, journalists and activists. The solidarity of key constituencies can be more powerful than the bureaucracy you are challenging.
  • Invest the funds to obtain a legal opinion from a competent lawyer.
  • Always be on guard not to embellish your charges.
  • Engage in whistleblowing initiatives on your own time and with your own resources, not your employer’s.
  • Don’t wear your cynicism on your sleeve when working with the authorities.

Many states have whistleblowing laws, and these can be effective if one is fully prepared beforehand and learns from the errors of those who have gone before and had their cases thrown out (see URL above). Utah’s whistleblowing law covers only public employees, not those in the private sector (see Section 67-21-1 et seq.). From time to time there are murmurings that Congress may pass federal whistleblowing legislation to protect employees in the private sector (public employees already have such a law). However, the trend is not to disturb the mighty economic engine, so that is not likely to happen any time soon.

Employees in the private sector are pretty much on their own. What CAN they do? Here are some suggestions, things that I have seen used successfully.

  • Always know where you could get another job FAST, so that you can change jobs before matters about your reporting of illegal activities or refusing to perform them get too nasty. Form a trust group of others in your field who will take you in on short notice, and whom you would take in on short notice. This may or may not involve people who will falsify references for you if need be.
  • Accumulate assets and stay out of debt. Money means freedom–freedom to resign without another job if necessary, freedom to support your family if you are fired and it takes a long time to find another job, freedom to move to another community if necessary, freedom to pay legal fees if you have to. How do you go about this? Discipline yourself to routinely set aside a certain percentage of all income–not for your next car or next house or your children’s education–but for your freedom and survival. Many routinely tithe to a church. I have not heard of churches coming to the aid of their members when they run afoul of illegal or unethical activities in their workplace. They are more likely to point a finger and say that the person’s troubles are due to unrighteousness! So tithe to yourself instead and hope that you never have to use the money to get out of an impossible ethical situation in the workplace. If you are lucky, it will be a nice nest egg for your golden years. If you are not lucky, and get into a jam, the money may make it possible for you to keep both your livelihood and your self-respect.

CONCLUSION

I have painted a dismal picture of the American workplace, and I think that matters are going to get a lot worse before they get any better. The pendulum always swings, but it may take a while for THIS one to turn around. In the long waves of Capital vs. Labor, labor is not currently on the winning side. And we are ALL labor–no matter how highly compensated–unless we control capital of our own. Some of us may even be slaves–if our indebtedness is high enough.

So how can we live with ourselves, and more importantly, what do we tell our children about the work world? Do the right thing and you will be rewarded and prosper? They probably know better. I should mention at this point that I am an Atheist speaking to a Humanist gathering, and rule out divine intervention as a solution. To those who say that if they are put in an impossible work situation–break the law and harm others or see their families on the street–God or Jesus or a guardian angel or whatever will rescue them–my reply is “How did God or Jesus or your guardian angel let you get into the situation in the first place?” Better to look to human society, human structures, and one’s own intelligent planning for help.

Marilyn T. Welles

Ten Thousand Villages Store

September 2000

Salt Lake area humanists will want to know about and undoubtedly patronize a new store in the city, Ten Thousand Villages, at 2186 South Highland Drive. The Grand Opening will be Saturday, September 9, 2000. What is it that makes this store unique and worthy of attention in this space? Ten Thousand Villages is a nonprofit network of over 200 stores in North America featuring quality handicrafts from more than 30 Third World countries and benefiting over 60,000 craftspeople annually. Artisans are paid a fair price for their work rather than being exploited by large commercial ventures which pay a craftsperson a pittance and then charge exorbitant prices at retail outlets. Thanks to Ten Thousand Villages sales, thousands of unemployed or underemployed artisans are able to support their families and participate in health and educational programs in their communities. Such sales go a long way in Third World countries. For instance, on average, $1,200 in Ten Thousand Villages retail sales provides the equivalent of full-time work for an artisan for a year! Furthermore, in addition to financial support, other humanistic life values are fostered by the Ten Thousand Villages network. To quote from one of their brochures:

For craftspeople in the Third World, “village” is where one’s heart is: where family and tradition and culture reside. In our mass-production world, villages are still a setting for the individualized creation of authentic handicrafts. Making handicrafts is a way to pass one’s culture and skills to the next generation. But as the outside world pushes at the village, taking its natural resources and often its children, it becomes more and more difficult to live the village way of life. By selling their handicrafts, Ten Thousand Villages helps craftspeople provide food and education for their families and helps these threatened villages survive.
Each village represents a unique, distinctive group of people. Multiply the village idea by ten thousand and it represents the world that our program is working to build. We invite you to join us in making our vision a reality.

In addition to your purchases, the local store would welcome financial donations to offset startup costs as well as your volunteer time on an ongoing basis to handle sales, restock shelves, tidy up, etc. Here is a venture we can all heartily support. Do stop by the store soon to see the amazing array of beautiful and fair-traded items from around the world.

–Hugh Gillilan

Discussion Group Report

Some Observations On Manifesto 2000

October 2000

By Richard Layton

This month the Discussion Group discussed Paul Kurtz’ Humanist Manifesto 2000. This document is 15 pages long, and all of it is important. It is impossible to summarize it in the short space of this article and still do it justice. However, a one-page summary can be found at the Council for Secular Humanism (CSH) web site at www.secularhumanism.org. In this present article, I am going to deviate from the usual format and present a summary of the discussion group comments made in the meeting:

There are many desirable ideas in Manifesto 2000, but it may be too large, detailed and abstract. A shorter document expressing the humanist viewpoint more succinctly might attract the interest of more people. Perhaps a Manifesto should address the question of how first and third world countries could get together to work on environmental problems.

The publication of Manifesto 2000 came as a surprise to the leaders of the American Humanist Association (AHA). Four major Humanist Manifestos and Declarations had already been published in the twentieth century: Humanist Manifesto I, Humanist Manifesto II, A Secular Humanist Declaration, and a Declaration of Interdependence. The AHA had announced it was planning to organize leading thinkers representing all the humanist organizations to write a new, updated Manifesto. Kurtz’ document appeared possibly to be an attempt to preempt the AHA effort, which had intended to involve the CSH. Concern was expressed over what appear to be attempts by Kurtz in recent decades to set himself up as the spokesman for humanism. While it is recognized that he has been a most eminent humanist, it is being asked: Is there a power play going on here? The CSH has an authoritarian power structure and has been secretive about releasing membership data, while the AHA has been more open and democratic. The AHA has a committee developing a Humanist Manifesto III with instructions to make it short and declarative.

It is unfortunate that there is so much divisiveness among humanists. The various humanist organizations agree on the most substantive issues and ought to be working together more than they are to promote humanism, although there have been some attempts at cooperative efforts. The differences among the groups have perhaps been given too much emphasis. There is a need for an umbrella organization that could embrace the various orientations within humanism, such as religious humanism, secular humanism, etc. Yet humanists are very individualistic and may sometimes over-emphasize the importance of differences. Perhaps too much ego is involved. “Personality” or even power-seeking seem to be important factors in the divisiveness.

A similar schism has also existed among American Atheists in recent decades. There have been strong differences between the followers of Madalyn Murray O’Hair and others, with O’Hair taking a quite authoritarian posture. Unfortunately, however, she and two other family members have now been murdered by criminal extortionists. That tragedy had nothing to do with any in-fighting within the organization.

Membership in humanist organizations has been declining in recent years. Currently there are about 65 chapters in the AHA. Humanists of Utah is one of the most active and has one of the largest memberships. The AHA headquarters will soon be moved to Washington, D.C.

On a more positive note, under the stimulus of AHA Board member Herb Silverman of South Carolina, a cooperative project involving the American Humanist Association, the Council for Secular Humanism, and the Atheist Alliance International was recently launched to explore new avenues of cooperation. Other organizations representing free thinkers will be invited to join and to support a joint publicity program to “Promote, Attract, and defend the Community of Reason.”

Humanism has a great deal of difficulty getting its views publicized. Although Manifesto 2000 received some media attention, it was not much. Perhaps a point of attack for humanists could be to get more attention called to Thomas Paine-a great champion of the American revolution. He deserves a monument, or perhaps his face could be put on a stamp. Some effort is already being made by the AHA Board to create more public awareness of his accomplishments.

Historically, some revolutions, such as the American and French Revolutions, have served to bring attention to humanist ideas. The Declaration of Independence is humanistic and the U.S. Constitution is totally secular. However, after a revolution, once a government takes over, it tends to forget the original values that sparked it. We need to articulate more clearly the ideas of the Enlightenment.

Discussion Group Report

Socrates Changes the Lives of Present-Day Prison Inmates

August 2000

By Richard Layton

“…the overwhelming majority of prison inmates in this country, both state and federal, are not incorrigibly mean or evil, and a correct understanding of the `public interest’ dictates that they should be given the opportunity to participate in state and federally funded higher-education programs designed to change their thinking and conduct.” This startling statement by Lawrence T. Jablecki appears in his article, “Prison Inmates Meet Socrates,” in the May-June, 2000, issue of The Humanist. He is not assuming the role of the liberal weenie who doesn’t believe in punishment. He acknowledges that “criminal offenders are in conflict with the norms of society; and that they are not suffering from psychological disorders that both explain and excuse their conduct. They have consciously and deliberately chosen to commit a crime; or, in numerous cases they have consciously and deliberately set themselves up for committing a crime by altering their normal mental and physical capacities. They were free to do otherwise and should be held responsible. Violent predators and many career criminals deserve to be incarcerated for many years, and some should be sentenced to life without the possibility of parole. I have no philosophical objection to capital punishment, but I am opposed to it because innocent persons are convicted and executed.”

Jablecki believes an introduction to the gadfly of Athens is a highly potent crime-prevention initiative that should be made available to a multitude of prisoners. As an undergraduate who had not allowed any serious reflection or study to engage his mind or interfere with fun, and who was thinking seriously of dropping out of college, Jablecki one afternoon encountered a campus intellectual who, when greeted with the words, “Hello, what do you know?” stopped in front of him and said, “Mr. Jablecki, I do not know anything. I am simply attempting to understand.” Later on Jablecki asked a senior philosophy major to explain the difference between knowing and understanding. The latter encouraged him to enroll in philosophy. Jablecki did, and learned the answer to his question. He was introduced to the life and teachings of Socrates. In a very brief period a Socratic “conversion” changed the entire course of his life. This autobiographical snapshot evidences the view that it is impossible to exaggerate the power of ideas and concepts–for example, justice, truth, goodness, virtue, and beauty–to grab a human mind and redirect a person’s life in the manner advocated by Socrates.

In 1986-87 Jablecki introduced Socrates to 30 prison inmates in two classes at Brazosport Junior College in Lake Jackson, Texas. They had been convicted of a range of serious felonies and incarcerated for a number of years. He told them he had decided to teach this class because of his firm commitment to the views of the German philosopher Immanuel Kant concerning “respect” for all persons as moral agents capable of choices and because of his own firm belief that the class members could change the direction of their lives if they chose to do so. This experience of teaching philosophy to prison inmates has convinced him that, if the prisoners perceive that he really means what he says, the way is opened up for some existentially meaningful discussions and insights.

Perhaps surprisingly, except for a mere few, these prisoners do not blame society or others for their criminal behavior. Many vented their resentment about how they believed they were treated unfairly at one or more steps in our system of criminal justice, and any seasoned practitioner in the system is obliged to acknowledge the truth of some of their claims. Yet they did accept the facts that they made real choices to commit crimes and that society has a right to protect itself by incarcerating malefactors. They recognized that none of them were compelled or forced to commit their crimes and they were free to do otherwise. None claimed, or even implied, that he did not deserve to be punished. They said they knew exactly what they were doing when they committed a murder, robbed a store at gunpoint, etc. They also spent several hours discussing the meaning of concepts such as knowledge, wisdom, ignorance, self-interest, mistake, voluntary, involuntary, happiness, and virtue.

In 1988 Jablecki began teaching philosophy to various graduate and undergraduate student prisoners in Rosharon, Texas, in the already established prison program of the University of Houston at Clear Lake. The profound relevance of Socrates’ teaching that the “unexamined life is not worth living” is evidenced in comments made by the researchers in the program after a five-year review:

“These students find that courses in history, literature, and philosophy profoundly deepen their sensitivities and expand their horizons. TDCJ students may come from pockets of economic poverty from which they have never escaped–they have literally no knowledge of other ways of living. Humanities courses open new realities to them, wholly changing their perspectives about who they are and what the world is about…Such courses are truly revelations, showing ways of living and thinking that they have not encountered before.”

The university’s most current report in 1995 showed that between 1990 and 1995, of the 39 inmates who earned a bachelor’s degree, 17 were released on parole and two were returned to prison–a recidivism rate of 11%. Of the 45 who earned a master’s degree during the same period, 19 were released on parole and one was returned to prison–a recidivism rate of 5%. Studies conducted recently in Indiana, Maryland, Massachusetts, New York, and other states have all reported significantly low recidivism rates for inmates in correctional higher education programs, ranging from 1% to 15.5%. In Texas between 45% and 50% of parolees from the general prison population are re-incarcerated within three years of the date of their release. They then are convicted of new felony offenses, many of which involve victims who suffer the loss of property, physical injuries, and death. The author’s own contact with students in the program, including some now on parole, confirms a determination to change and make contributions to society totally unmatched by the majority of inmates who spend their idle time playing dominoes, watching television, and reflecting on their perceptions that they are the oppressed victims of society.
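The recidivism percentages quoted above follow from simple arithmetic; the short check below assumes, as the rounding suggests, that the denominator is parolees released plus those returned, which appears to be the basis of the article’s figures.

```python
# Checking the recidivism rates quoted from the 1995 report.
# Assumed denominator: inmates released on parole plus those returned.

def recidivism_rate(returned, released):
    """Fraction of former inmates returned to prison."""
    return returned / (released + returned)

bachelors = recidivism_rate(returned=2, released=17)  # 2/19, about 10.5%
masters = recidivism_rate(returned=1, released=19)    # 1/20, exactly 5%

print(f"Bachelor's: {bachelors:.0%}")  # rounds to 11%
print(f"Master's: {masters:.0%}")      # 5%
```

Either way the numbers are computed, both rates are a small fraction of the 45 to 50 percent re-incarceration figure cited for the general Texas prison population.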

Some of these former students are now paying taxes. Some short-sighted politicians in Washington have in recent years made prison inmates ineligible for Pell Grant tuition assistance for higher education, a move that saved only a tiny amount of money: $35 million of the $6 billion awarded to all recipients. “The policy these politicians approved,” says Jablecki, “places them in the category of unmerciful retributivists who sincerely believe in the moral imperative of severe punishment for all criminal offenders–that is, they have no mercy for the wicked…they really believe that the construction of new prisons is not a necessary evil but a necessary good.”

“As was the case when I was introduced to Socrates, he can shake unexamined beliefs and faiths. However, unlike any of their other academic classes, it is important that most of my courses contain opportunities for prison inmates to reflect on the most important and enduring questions of human existence. And I can confidently claim that many of them are surprised by the joy of facing the unfathomed depth of Socrates’ message to live an examined life.”

Successful Summer Social

September 2000

The annual Summer Social, held August 10, 2000 was a tremendous success. Rolf Kay definitely knows how to organize a party!

I joined Humanists of Utah in early 1992. That year some members of the group decided to try a summer social so that we could meet less formally than our regular meetings. We had a pot-luck picnic at a park somewhere in the avenues. As I recall there were eight of us who attended. The other seven brought salads of some sort. I brought a cake. You see it was my birthday, so I felt obliged to bring dessert.

Kurt Vonnegut had recently been named the honorary president of the AHA and I was then, as I am now, much enamored with his writing, so I brought a special cake. It said, “Happy Birthday Wanda June” in frosting. My first exposure to Mr. Vonnegut’s work was a performance of the play Happy Birthday Wanda June at Arrow Press Square. After the show I stopped by a local pub that I then frequented and waxed enthusiastic about the quality of the work. The bartender, a friend of mine, told me that Kurt Vonnegut had published numerous stories and books. I was hooked.

This past week’s celebration also fell on my birthday, my fiftieth. I must admit that the Prime Rib and Salmon were huge leaps forward from the salads of ’92. The company of 52 thinking people was tremendous, and the music was incomparable.

We have come a long way from our beginnings. I am proud of our accomplishments in promoting humanistic ideals and confident that in another 8-10 years we will have progressed even more!

–Wayne Wilson

Discussion Group Report

Should Science and Religion Stay Out of Each Other’s Domain?

November 2000

By Richard Layton

“Creationism does not pit science against religion, for no such conflict exists,” declares Stephen Jay Gould in “Non-Overlapping Magisteria,” in the Skeptical Inquirer, July/August 1999. In the same issue of this magazine, two other scientists, quoted below, give alternative viewpoints to Gould’s on the relationship between science and religion.

Gould goes on: “Creationism does not raise any unsettled intellectual issues about the nature of biology or the history of life. Creationism is a local and parochial movement, powerful only in the United States among Western nations, and prevalent only among the few sectors of American Protestantism that choose to read the Bible as an inerrant document, literally true in every jot and tittle.” Creationism based on biblical literalism makes little sense to either Catholics or Jews, he says, because neither religion maintains any extensive tradition for reading the Bible as literal truth. It is illuminating literature based partly on metaphor and allegory, and demanding interpretation for proper understanding. Most Protestant groups other than the fundamentalists take the same position.

Pope Pius XII, in his 1950 encyclical Humani Generis, said that Catholics could believe whatever science determined about the evolution of the human body, as long as they accepted that at some time of his choosing God had infused the soul into such a creature. But Pius regarded evolution as only tentatively supported and potentially untrue. Pope John Paul II, considering the data in support of evolution that accumulated over the following half-century, placed its factuality beyond reasonable doubt: sincere Christians, he said, must now accept it as effectively proven fact.

“The lack of conflict between science and religion arises from a lack of overlap between their respective domains of professional expertise–science in the empirical constitution of the universe, and religion in the search for proper ethical values and the spiritual meanings of our lives,” Gould continues. This principle he calls “non-overlapping magisteria” (NOMA), and he says Pius accepted it. “Science and religion are not in conflict, for their teachings occupy distinctively different domains…I believe, with all my heart, in a respectful, even loving concordat.”

Richard Dawkins firmly disagrees. In "You Can't Have It Both Ways," he says, "There is something dishonestly self-serving in the tactic of claiming that all religious beliefs are outside the domain of science. On the one hand miracle stories and the promise of life after death are used to impress simple people, win converts, and swell congregations. It is precisely their scientific power that gives these stories their popular appeal. But at the same time it is considered below the belt to subject the same stories to the ordinary rigors of scientific criticism: These are religious matters and therefore outside the domain of science. But you cannot have it both ways. At least, religious theorists and apologists should not be allowed to get away with having it both ways. Unfortunately all too many of us…are unaccountably ready to let them get away with it…Given a choice between honest-to-goodness fundamentalism on the one hand, and the obscurantist, disingenuous doublethink of the Roman Catholic church on the other, I know which I prefer."

Ernst Mayr in "The Concerns of Science" demarcates science from religion as follows: Scientists do not invoke the supernatural to explain how the natural world works. Nor do they rely on divine revelation to understand it. Science shows an openness to new facts and hypotheses. Religions are characterized by their relative inviolability; in revealed religion a difference in the interpretation of even a single word in the revealed founding document may lead to the origin of a new religion. In contrast, in science one finds different versions of almost any theory. Scientists bring a set of "first principles" to the study of the natural world: 1) that there is a real world independent of human perceptions, 2) that this world is not chaotic but is structured in some way, and that most, if not all, aspects of this structure will yield to the tools of scientific investigation, and 3) that there is historical and causal continuity among all phenomena in the material universe, and that everything known to exist or to happen in this universe is included within the domain of legitimate scientific study. But scientists do not go beyond the material world to a metaphysical or supernatural realm inhabited by souls, spirits, angels, or gods–a heaven or nirvana often believed to be the future resting place of all believers after death. Such constructions are beyond the realm of science.

Discussion Group Report

The Politics of Sanctimony

April 2000

By Richard Layton

George W. Bush and God Himself are on notice: “The Democratic Party is going to take back God this time,” Gore operative Elaine Kamarck announced a few months ago as the vice president made his play for the Almighty. He declared his disdain for “hollow secularism,” his support for state funding of sectarian social service programs, and his conviction that “the purpose of life is to glorify God.” Gore said of his religious faith, “I don’t wear it on my sleeve, but faith is the center of my life.”

The above paragraph opens Wendy Kaminer’s article in The American Prospect, November 23, 1999, with the same title as the present article. She further observes that a lack of faith in the intelligence of the American people inspires educated candidates like Gore, Bush, Steve Forbes, and Elizabeth Dole to waffle on evolution. All of them responded sympathetically to recent efforts by the Kansas Board of Education to purge the science curriculum of evolution. A perceived lack of faith in the morality of the American people has inspired a crusade in Congress against popular culture. Congressional moralists leave us no choice but virtue.

What do they mean by virtue? “Godliness in the form of allegiance to an established, mainstream religion (New Age will not do)…we cannot be good without God–a Judeo-Christian God, or maybe an Islamic one,” Kaminer says. That virtue supposedly attends respectable religions is shown by the conviction that America is in a state of moral decline, grounded in the 1960’s and evidenced largely by sexual permissiveness in real life and the media. Only lately has violence in the media become a focus for conservatives.

It could be argued that America made significant moral progress in the ’60’s. The Civil Rights Movement, feminism, and the Supreme Court’s imposition of constitutional restrictions on the prosecutorial power of the state challenged us to turn ideals of freedom and equality into realities for all Americans. The emphasis on the losses associated with the 1960’s, such as chastity and traditional religiosity, instead of the gains, dominates the anti-vice campaigns today. The drive to sanctify life by imposing new restrictions on speech and lifting old restrictions on state sponsoring of religions has been evident throughout the 1990’s and would have dominated the 2000 campaign even if it hadn’t gained political momentum from recent mass shootings. These shootings have provided social-issue conservatives with unexpected opportunities for culture control, which Clinton Democrats seem afraid to oppose.

The juvenile justice bill pending in Congress includes amendments aimed at introducing sectarianism into the public schools. It mandates posting the Ten Commandments in the schools and denies attorneys’ fees to people who successfully sue a school that has violated rules against establishing religion by conducting sectarian services or erecting sectarian memorials. A majority of House members also voted for a resolution exhorting all Americans to engage in “prayer, fasting, and humiliation before God.” It failed on a vote of 275 for and 140 against (less than the two-thirds needed for passage).

Kansas Republican Senator Sam Brownback introduced a resolution to create a Special Committee on American Culture “to study the causes and reasons for social and cultural regression,” to determine the impact of unspecified “negative cultural trends” on “the broader society” and on “child well-being,” and to “explore means of cultural renewal.” This agenda “represents one of the periodic campaigns against popular entertainments and the people who enjoy them…Congress has been tenacious in its efforts to censor images of sex and violence it doesn’t like,” says Kaminer.

She points out that in 1996 Congress also passed the Communications Decency Act, prohibiting “indecency” on the Internet. When the CDA was invalidated by the Supreme Court, Congress passed the Child Online Protection Act, prohibiting speech that a federal prosecutor might consider “harmful to minors.” In 1996 Congress also passed a law requiring cable operators either to scramble fully or consign to limited late-night hours sexually explicit programs, in order to prevent the “signal bleed” that accompanies partial scrambling and exposes fleeting images and sounds of sex. A challenge to the signal-bleed prohibition, brought by the Playboy Entertainment Group, will be argued before the Supreme Court.
Another proposal currently before the Senate would classify violent “audio and visual media products” with cigarettes and subject them to federal labeling requirements.

All these laws take it as an article of faith that children are harmed by any exposure to virtual sex. Like God’s love, it needn’t be proved empirically. In an evidentiary hearing, Playboy Entertainment Group’s experts testified that there is no empirical evidence that sexually explicit videos harm minors psychologically, a point the government’s witness did not dispute. Kaminer asks, “If this law is enacted, will film adaptations of Shakespearean tragedies or movies like The Thin Red Line or Bonnie and Clyde be treated like toxic wastes, which must be labeled to the satisfaction of federal bureaucrats?

“Liberals repelled and frightened by hate speech or anxious to restore ill-defined spiritual values to society, as well as centrists and conservatives, need to be reminded of the moral illegitimacy of censorship,” says Kaminer. “Liberals troubled by congressional visions of culture control need to address its political implications unapologetically.”

Restore Pledge

May 2000

Back in the good old days, pre-1950, we were pleased and proud to pledge allegiance to our flag and our nation. When I was a lad at the old Lowell School, we stood at attention and placed our right hands over our hearts when the flag was raised every morning, and once in the classroom we performed the same ceremony to recite the pledge. But in those days the pledge was to “one nation, indivisible, with liberty and justice for all.” It was a daily reminder that we lived in a great democracy that recognized the worth and value of every citizen. Now, we are no longer “one nation, indivisible,” because inserting the phrase “under God” divides this nation into those who believe they live “under God” and those who don’t. The growing political influence of the religious fanatics will soon press to have “Christians” added to the end of the pledge so that it will read “one nation, under God, with liberty and justice for all Christians.” Then the transformation of our democracy into a theocracy will be complete.

Until that time I encourage reasonable people to resist forcing school children to recite the pledge of allegiance and to urge Congress to delete the phrase “under God” and restore the pledge to its secular role recognizing the United States as “one nation, indivisible, with liberty and justice for all.”

–Flo Wineriter

Discussion Group Report

Is Science Just A Synonym For Rationality?

September 2000

By Richard Layton

There is a “tendency common to most humans to create abstract concepts such as justice, freedom, love, spirituality, and now, science and animate them all, appearing as antique gods with arrows, swords, or balances in hands,” says Andreas Rosenberg in his article, “The Nature of Scientific Inquiry,” in the Occasional Newsletter of the Friends of Religious Humanism. “This completes the move of such concepts into the realm of mythology. This move changes how we look at science.” It elevates science into an unrealistically inclusive position. It gives the impression that, if you are a rational being, you must use the scientific method in every possible situation. At this point it becomes unclear whether science is the preferred universal tool for a rational being or whether science is just a synonym for rationality. If it becomes just a tool, it becomes very difficult to define when it is appropriate to use it as a label. Is astrology a product of science as a tool? It is logical in its arguments and based on observations. Yet, if we agree that science is not a universal tool but only another name for rationality, then the most primitive aborigines practice science because their behavior within their surroundings is quite rational. It is clear, then, that it is not useful either to consider scientific inquiry as a universal tool or science as a synonym for rationality.

Is scientific inquiry in a nearly mythological context the golden path to truth? Look in the newspaper, and you will see that science can be used to define as true many incompatible statements. One day red wine is good for your health. The next day alcohol use may lead to liver disease. No wonder some lost school board in Kansas has declared creation to be a true scientific theory. Perhaps Winnie the Pooh could be put forth as a theory of small bears. Uncertain and vague statements about science seem to be due to an unnecessary broadening of the definition of scientific inquiry.

In defining scientific inquiry, we first have to identify its goals. If we read texts in physics, astronomy, chemistry, biology, and psychology, we find that the goal is always the same: 1) A precise description of the external world, with us as a part of it, and 2) A description in terms of the observations made.

Scientific vocabulary does not contain statements like “good for you.” Thus stories about the usefulness of red wine or the dangers of alcohol have nothing to do with science.

Scientific inquiry is based on two premises, says Rosenberg: 1) There is an external world common to us all–a world existing independent of our observing it. Thus if the human race were obliterated by an intergalactic construction company, the record players booming out Beethoven’s Fifth Symphony would continue to play although there are no humans to hear them. Observers from other planets would find ruins of lost cities and all our toys as real as they once were to us. The world exists even without us. We call this the common reality premise. 2) Events in the external world are related to each other by causal connections. Our observations of them are logically related. We call this the causality premise. Science as an art of describing the world around us cannot function if either of these two premises is violated. Are they ever challenged? Yes, the Christian dogma of God’s omnipotence and the possibility of his intervention in our time negate the causality premise. The French postmodernists and deconstructionists challenge the common reality premise. “Anytime you hear somebody describing science as a white male power structure and touting the advantages of female science,” says the author, “you know the common reality premise has been violated. The structure of the world we observe as scientists is observer-neutral and common to us all. You cannot deconstruct it to different pieces depending on who you are.”

Rosenberg summarizes: “Provided common reality exists independent of us and the events show reproducible causality, we can proceed to paint a picture of the world. The process of doing it is scientific inquiry. This inquiry is based solely on past or present observations.”

The inquiry itself is practical and takes place in seven consecutive steps: 1) Record observations; 2) Compare observations and convert them to quantitative measurements (as height or length on a scale: high, higher, highest); 3) Introduce a common reference (a standard for weight, or sea level as a base for height measurements); 4) Identify and separate variables (a variable is some measurable property conforming to some reasonable scale on which its variation can be defined); 5) Formulate a hypothesis; 6) Convert the hypothesis to a theory; 7) Elevate the theory to the status of a law.

How well is the external world described by the laws derived by the seven-point method? If we assume that all our theories have become laws, we get dangerously close to describing a totally determined universe that no one believes in any more. What about the presence of randomness, supported by quantum mechanical theory? The author posits, “…what we call randomness and probability are factors introduced to account for the inadequacy of the human senses to deal with a wide variety of observations. There are too many variables for us to observe with necessary precision; due to the limited nature of our senses, we inevitably influence events in measuring them.” Many scientists, including Einstein, refused to believe in inherent randomness and preferred to look at apparent randomness as a product of hidden variables. The problem of uncertainty is closely linked to the effect of measurements. We have to visualize events so our senses will allow us to make an observation. This is the major limitation of science.

Scientific inquiry will lead to a true picture of the world surrounding us, but the picture is never complete and has to be continuously amended when and if new observations are made. Can something be true if it has to be amended? Yes, all scientific laws are approximations, and what we mean by amendment is that we have isolated new variables and can work with higher precision than before so that the picture of the world becomes clearer and shows more details. This does not mean that the previous picture was wrong, only that it was true at that level of detail and precision.

The behavioral sciences–sociology, psychology, economics, etc.–are currently at a low level of development. There has been plenty of hard work done in them and brilliant insights gained, but their level is simply a function of the complexity of the systems at issue. “The extension of science,” Rosenberg states, “to art, poetry, and religion has not contributed much to science or to the arena of human emotions and feelings. Finally, the verbal extension of science into the realm of spirituality…may contribute to literature and poetry, but not to science.”

Prison Inmates Meet Socrates

2000

Written by Lawrence T. Jablecki, who is the director of the Brazoria County Community Supervision and Corrections Department in Angleton, Texas, and has a Ph.D. in political philosophy from Manchester University in Manchester, England. Printed in the May/June 2000 issue of The Humanist.

Since 1986, as an adjunct professor on the faculty of a college and a university in the state of Texas, I have had direct contact with hundreds of prison inmates enrolled in academic programs for the purpose of completing the associate’s, bachelor’s, and master’s degrees. I am persuaded that this experience permits the following assertions: the overwhelming majority of prison inmates in this country, both state and federal, are not incorrigibly mean or evil, and a correct understanding of the “public interest” dictates that they should be given the opportunity to participate in state and federally funded higher-education programs designed to change their thinking and conduct.

If any reader is tempted to brand me with the pejorative label of a liberal weenie who doesn’t believe in the hard coinage of punishment, the following brief comments should suffice to assuage that suspicion. Criminal offenders are in conflict with the norms of society; they are not suffering from psychological disorders that both explain and excuse their conduct. They have consciously and deliberately chosen to commit a crime or, in numerous cases, they consciously and deliberately set themselves up for committing a crime by altering their normal mental and physical capacities. They were free to do otherwise and should be held responsible. Violent predators and many career criminals deserve to be incarcerated for many years, and some should be sentenced to life without the possibility of parole. I have no philosophical objection to capital punishment, but I am opposed to it because innocent persons are convicted and executed.

Now that I have exposed most of the philosophical guts of my position on crime and punishment, the specific purpose of this essay is to elucidate the reasons why I believe that an introduction to the gadfly of Athens is a highly potent crime-prevention initiative that should be made available to a multitude of prisoners.

I graduated from high school in 1958 and the thought of pursuing higher education was almost totally foreign to my mind. Primarily to maintain my association with buddies in my graduating class, I enrolled in a local junior college and unceremoniously flunked out after less than a full semester due to a total lack of interest. I went to work full time and made some very foolish choices that brought me dangerously close to becoming a felonious hoodlum. When not working, I was in the neighborhood bowling alley, where I achieved some local notoriety as the kid with a 200-plus average. In the fall of 1959, motivated mainly by the desire for an adventure away from parental oversight, I enrolled in the four-year college in Oklahoma where my mother had been a student.

Although I was not failing any of my classes during my first semester, I refused to allow any serious reflection and study to engage my mind or interfere with fun, so by January 1960 I was determined to drop out and pursue the career of a professional bowler. The passage of very close to forty years has not significantly dimmed the memory of an event during the same month that marks the beginning of a radical transformation in my thinking and conduct.

Walking to class one afternoon I encountered one of the recognized campus intellectuals. In response to my greeting of “Hello, what do you know?” he made an abrupt stop in front of me and said, “Mr. Jablecki, I do not know anything. I am simply attempting to understand.” He then marched past me. Not having a clue as to the meaning of his curt remark, I articulated a response in very unscholarly language. Several days later I asked a senior who was majoring in something called philosophy to explain to me the distinction between knowing and understanding. After his learned discourse, most of which I failed to comprehend, he urged me to remain in school and suggested that in the spring semester I sign up for “Introduction to Philosophy.”

Inspired by his apparent wisdom I remained in college and enrolled in “Introduction to Philosophy.” In that class the instructor explained the perennial problems of philosophy: I was able to grasp the difference between knowledge and understanding, and I was introduced to the life and teachings of Socrates. During the semester my ambitions, my thinking, and even my behavior changed. I sold my prized black-beauty bowling ball and purchased some philosophical works, which are still in my library. In a very brief period of time a Socratic “conversion” changed the entire course of my life. To the teacher, Dr. Mel-Thomas Rothwell (deceased), I owe an immeasurable debt of gratitude for his patient mentoring until my graduation in 1964.

The relevance of this autobiographical snapshot is that it evidences the view that it is impossible to exaggerate the power of ideas and concepts–for example, justice, truth, goodness, virtue, and beauty–to grab a human mind and redirect a person’s life in the manner advocated by Socrates. And, at the risk of making a generalization to which I acknowledge numerous exceptions, a Socratic conversion usually requires the inspired communication of a teacher or mentor who has experienced the transformative power of ideas and concepts.

In the 1986-1987 academic year I was given my first opportunity to introduce Socrates to prison inmates under the auspices of what was at the time Brazosport Junior College in Lake Jackson, Texas. This institution, now known as Brazosport College, continues to provide a two-year course of instruction leading to an associate of arts degree. I taught two courses of “Introduction to Philosophy” to approximately thirty male inmates at the Clemens Unit of the Texas Department of Corrections. I possess no knowledge of the success or failure of any of these men, but I do have some vivid recollections of some of the classes, including our lively discussions of Socrates.

The first session of the first class has left a permanent mark in my bank of memories. Standing in front of a group of men convicted of a range of serious felonies and incarcerated for a substantial number of years can be terrifying, to say the least. I told them that I had agreed to teach this class because of my firm commitment to the views of the German philosopher Immanuel Kant concerning “respect” for all persons as moral agents capable of choices and my equally firm belief that they can change the direction of the remainder of their lives if they choose to do so. This is essentially how I introduce myself to all new classes of prison inmates. And if they perceive that I really mean what I say, the path is clear for some existentially meaningful discussions and insights.

Perhaps the most important fact I can report about these men–inclusive of the inmates I have taught to date–is that, except for a mere few, they do not blame society or others for their criminal behavior. This acceptance of guilt and responsibility is probably at odds with the belief of most people about the supposed rationalizations of criminals. Not unexpectedly, many of the inmates vented their resentment about how they believe they were unfairly treated at one or more steps in our system of criminal justice, and any seasoned practitioner in the system is obliged to acknowledge the truth of some of their claims. The pertinent and critical point, however, is their acceptance of the facts that they made real choices to commit crimes and that society has a right to protect itself by incarcerating malefactors.

These intuitive or pre-philosophical beliefs are fertile ground for introducing the free-will-versus-determinism debate and the arguments employed to justify the institution of punishment. And these issues lead straight to what is usually a hotly contested debate of the Socratic view that persons do not voluntarily or knowingly commit evil or unlawful acts because knowledge and wisdom are the most powerful elements in human life.

When the above issues are examined in philosophy classes in what the inmates refer to as the “free world,” they do not convey the same sense of urgency and importance as they do for students confined behind steel bars. One version of determinism is that all so-called free choices are illusory because no human actions or decisions are exempt from an unbroken chain of “causes.” Realizing that, if true, this theory could exonerate him from blame and punishment, a convicted murderer eagerly stated, “I would like to think that it was determinism” rather than a choice, and the room was filled with soft laughter. Another student, convicted of aggravated robbery, attempted to articulate the centuries-old view that all persons are born with an innate knowledge of right and wrong–that is, a moral compass called the conscience. Confessing much confusion about how it works, he said, “Now, I done something and I know it was wrong.” Following a Socratic unpacking of the words cause and compel, the unanimous decision was that none of them had been compelled or forced to commit their crimes and that they were free to do otherwise.

It should come as no surprise that a discussion of the purpose and justification of punishment with prison inmates, many of whom have been incarcerated for a major portion of their lives, reaches a high level of emotional intensity. No student, in either class, claimed or even implied that he did not deserve to be punished. A chorus of voices, however, condemned the enormous disparity in sentences characteristic of an indeterminate sentencing system and the wide range in which judicial discretion is free to roam.

With no hesitation, one of the men expressed the belief that if he stole a car and Dr. Jablecki stole a car the latter would undoubtedly be gently treated with probation and the former would be sentenced to prison. This, he exclaimed, is not justice or equality, as both committed the same crime and deserved the same punishment. Heads nodded in agreement and several voiced the caustic remark that the lovely lady of justice wearing the blindfold of impartiality and equality is never blind to the influences of money and status in the community. Anyone, therefore, who plays the role of a Socratic midwife in a similar situation needs to be prepared to maneuver through an emotional minefield in which they will be made aware of all the ugly warts and blemishes in our system of criminal justice.

Now, as implied earlier, I can still almost hear the initial outbursts of disbelief expressed in response to Socrates’ belief that no person voluntarily or knowingly commits an evil or wrong act. Socrates, according to the first consensus, had been drinking too much wine or he was an insane old man. The inmates said they knew exactly what they were doing when they committed a murder, robbed a store at gunpoint, sexually assaulted a woman, or cut a drug deal. Assuming the role of Socrates, I called them a collection of ignorant fools incapable of recognizing their best and permanent interests as human beings.

Needless to say, this enlivened the tone of the discussion and set the stage to unpack the meaning of a cluster of relevant words: knowledge, wisdom, ignorance, self-interest, mistake, voluntary, involuntary, happiness, and virtue. After several hours of defining and analyzing them, the new consensus was a defense of Socrates’ sobriety and the belief that he was a very smart old man. Although I don’t have current information on any of the inmates, I believe that most of them made some progress in the ascent from the cave of ignorance and have not forgotten their meeting with Socrates.

In 1988 a fortuitous meeting with George Trabing, the director of the prison program for the University of Houston at Clear Lake, resulted in an invitation for me to join the adjunct faculty of the university. My assignment was to teach a variety of undergraduate and graduate courses in philosophy to prison inmates housed in the Ramsey I prison unit in Rosharon, Texas. During the past ten years, missing only one or two semesters, I have taught a number of classes–including “Metaphysics,” “Epistemology,” “Philosophy and the Law,” “Philosophy and Religion,” “Political Philosophy,” “Ethics,” and “Human Rights and the Justification of Punishment”–in which I inject the life and teachings of Socrates.

The university’s bachelor’s program was established in 1974; the master’s program began in 1988. Four degrees are currently offered to inmates: a B.A. in behavioral sciences, a B.A. in the humanities, an M.A. in literature, and an M.A. in the humanities. As Trabing, Jerry Fryre, and Craig White describe in their 1995 report Five Year Review: Texas Department of Criminal Justice Outreach Component Human Sciences and Humanities, the degree in behavioral science contributes to the development of the

undergraduate student’s skills in analytical thinking, written communication, and research; to provide understanding of the customs, languages, values and behaviors of culturally diverse populations, and to educate students to participate as informed, critical citizens of society…. The primary mission of the undergraduate and graduate plans in Humanities and literature is to promote cultural literacy and interdisciplinary skills through the study of the liberal arts.

The most important dimension of the mission of all of these educational programs, however, is to promote positive changes in the thinking and conduct of inmates and to reduce the recidivism rate of those who are released on parole. The profound relevance of Socrates’ teaching that the “unexamined life is not worth living” and his identification of knowledge and virtue are captured in the five-year review’s comments regarding the men who earned their degree in the humanities:

“These students find that courses in history, literature, and philosophy profoundly deepen their sensitivities and expand their horizons. TDCJ students may come from pockets of economic and intellectual poverty from which they have never escaped–they have literally no knowledge of other ways of living. Humanities courses open new realities to them, wholly changing their perspectives about who they are and what the world is about…. Such courses are truly revelations, showing ways of living and thinking that they have not encountered before.”

Now, as every practitioner in the field of criminal justice should know, the verification of an indisputable causal connection between offenders’ completion of any crime-prevention strategy and their subsequent conduct is a tricky enterprise. At the outset, the creators of these academic programs for prison inmates were cognizant of the paramount importance of documenting a bank of data from which they could quantify the apparent successes and failures.
The university’s most current report was released in January 1995 as a twenty-year history of the program. The report found that more than 200 inmates earned a bachelor’s degree, while forty-five earned a master’s degree. From 1990 to 1995, of the thirty-nine inmates who earned a bachelor’s degree, seventeen were released on parole and two were returned to prison–a recidivism rate of 11 percent. During the same period, of the forty-five who earned a master’s degree, nineteen were released on parole and one was returned to prison–a recidivism rate of 5 percent.

To argue that their academic accomplishment is the only factor capable of explaining their successful reintegration into society would be a mistake. The only near definitive answer to this issue is to track a control group of parolees in the same age range and duration of incarceration who have not completed a similar academic program. Although the U.S. Department of Justice did not fund a recent grant proposal from the university to conduct such research, studies conducted in Indiana, Maryland, Massachusetts, New York, and other states have all reported significantly low recidivism rates for inmates in correctional higher-education programs, ranging from 1 percent to 15.5 percent. In addition, my contact with the students in the Texas program–some of whom are now on parole–confirms a determination to change and make contributions to society totally unmatched by the majority of inmates who spend their idle time playing dominoes, watching television, and reflecting on their perceptions that they are the oppressed victims of society.

Fortunately, I experienced my Socratic “conversion” when I was twenty years old and would not entertain benevolent thoughts toward any person casting doubts on the reality and meaning of that experience. Similarly, five of the former inmates who achieved academic success deserve to be heard. Their comments include:

  • “My new degrees, new self-image, and newfound confidence in society led me to try something I’d never tried before: a straight lifestyle…. Without the formal education which was available through the college program I would still be trying to perfect my technique for a life of crime. Instead, I am giving something back.”
  • “I cannot begin to tell you how much my life has changed as a result of the ‘awakening’ I received from each … of my instructors. The accomplishments I have made since my release would not have been possible without an education.”
  • “Because of my educational pursuits started while incarcerated, I find myself with a master’s degree, an L.C.D.C. (licensed chemical dependency counselor), and a position as the manager of client services with a large nonprofit organization. I am forever thankful…for the opportunity to change my life.”
  • “For me, the college experience…has changed my life. It has allowed me to believe in myself. It has forced me to reevaluate my life without the self-pity or excuse making.”
  • “I firmly believe that education is the key to staying out of prison…. My parents are proud of me; I am respected and consulted by my colleagues; I pay taxes. … I hope that I do make a difference in other peoples’ lives as a result of my experiences and achievements.”

The latter reference to the payment of taxes by a former inmate exposes the shortsighted and factually incorrect arguments of the politicians in Washington, D.C., who have seen to it that prison inmates are ineligible for federal Pell Grant tuition assistance for higher education. In his July 10, 1995, New Yorker article “Teaching Prisoners a Lesson,” James S. Kunen draws attention to the critical factual misrepresentations involved in the demise of inmates’ eligibility for Pell Grants:

When Bart Gordon, a Democratic representative from Tennessee, sponsored the 1994 crime-bill amendment that barred prisoners from receiving Pell Grants, his aim was to trim the fat in federal education spending. He was under the impression that prisoners were using up something like seventy million dollars a year in Pell Grants that could have gone to more deserving students–those on the outside. Senator Kay Bailey Hutchison of Texas, a Republican who led the fight in the Senate against Pell Grants for prisoners, argued that inmates siphoned off two hundred million dollars and displaced a hundred thousand law-abiding students. In fact, all applicants who meet the grants’ need-based eligibility requirements receive Pell Grants, regardless of how many qualifying recipients there are. As a General Accounting Office report explains, “If incarcerated students received no Pell Grants, no student currently denied a Pell award would have received one and no award amount would have been increased.” And the amount of money saved by cutting off grants to prisoners is tiny: according to the General Accounting Office, of approximately four million Pell Grant recipients in the 1993-94 academic year, twenty-three thousand were in prison, and they received thirty-five million dollars of the six billion dollars awarded, or about six cents of every ten program dollars.
It would probably be incorrect to suggest that Hutchison and the other members of Congress who helped her destroy hope for thousands of inmates in this country are in the philosophical camp of the ancient Cynics, who were contemptuous of bodily pleasures, sneering fault-finders, and incredulous of human goodness and the capacity to change from vice to virtue. I am persuaded, however, that the policy these politicians approved places them in the category of unmerciful retributivists who sincerely believe in the moral imperative of severe punishment for all criminal offenders–that is, they have no mercy for the wicked. They are not hypocrites, because they really believe that the construction of new prisons is not a necessary evil but a necessary good. Some of the extremists in this camp probably believe that it would be good policy to literally brand the scarlet letter C (for convict) on the forehead of every prison inmate.

Contrary to the philosophy of unmerciful retributivism, Pell Grants for inmates had the long-range potential of saving billions of tax dollars that will now be spent on the construction and maintenance of prisons and the annual costs of warehousing multitudes of federal and state inmates in what can best be described as toxic waste dumps inhabited by persons with little or no hope for a future that can make life worth living. And equally, if not more important, the advocates of unmerciful retributivism have crafted a policy that unintentionally results in a multitude of new victims of crime perpetrated by parolees who have changed from bad to worse.

Recognizing the existence of an unknown number of contingencies–all of which can influence the success or failure of a parolee armed with a university degree–the university’s statistics stand in sharp contrast to the fact that, in Texas, between 45 percent and 50 percent of parolees are reincarcerated within three years of the date of their release. Most of them are convicted of new felony offenses, many of which involve victims who suffer (among other things) the loss of property, physical injuries, and death. Although it stretches the normal usage of the word, this is an obscenity that, in addition to all of the accompanying human suffering, is costing taxpayers many millions of dollars every year. In Texas, the annual cost for one prison inmate is close to $20,000–very close to the amount my wife and I pay for our son to attend the prestigious Rice University in Houston–and this cost does not include the maintenance of existing prisons and the construction of new ones.

After ten years of almost weekly contact with students in the University of Houston prison program, it has become abundantly clear to me that if I did not believe in the inmates’ capacity to change their totally selfish habits of thought and conduct I would not waste my time on an academic exercise destined to fail. Inmates do not have a “right” to a free university education, nor do they “deserve” it. However, there is an urgent and compelling public interest at stake, justifying the use of tax dollars to create and sustain academic programs for them. Once they grasp the Socratic definition of knowledge and its vast distance from opinions and beliefs, most of my current students observe in hindsight that, had they met Socrates at the age of twenty or earlier, they might not be meeting him now clothed in prison garb. While not willing to fully embrace the contention that during their life of crime they were totally ignorant and really did not “know” what they were doing, most of my students “see,” for the first time, the profound truth of Socrates’ doctrine that the possession of knowledge and wisdom can lead to a radical and positive change in both thinking and behavior.

Despite the occasional bitterness aimed at the alleged disparities in the system of criminal justice, during these discussions many of the inmates feel at ease to lay bare their souls and express genuine remorse about the impact of their conduct on parents, spouses, children, and victims. It would be foolhardy to claim or even imply that an encounter with Socrates is a necessary prerequisite to bring the majority of them to a profound existential consciousness of the negative consequences of their crimes. In fact, many of them have previously read several books of Plato’s Republic, and some have read his Apology and Crito. But none of them have participated in a methodical unpacking of the content, the profound truth, and the errors in Socratic doctrine and instead have had their emotions shaped by traumatic events in their lives–the death of one or both parents, a divorce decree from a former spouse, children who commit crimes, and a denial of parole. The important claim can be made, however, that the Socratic method of philosophical reflection provides a coherent conceptual framework in which many of these men, for the first time, are “awakened” to a totally new perspective on life.

Prior to my career in criminal justice, when I discovered Great Visions of Philosophy by W. P. Montague, I wrote “good” beside the following passage:

There is a great deal of wrong conduct by individuals and by groups that owes its wrongness to want of wisdom rather than to want of will…. We all know that boys brought up in a slum district may get the notion that gang loyalty is really better than loyalty to society; that stealing, kidnapping, and even murder are justifiable and thrilling adventures; and that pity for the weak is stupid or unmanly. In these groups the only vices recognized as such will be the vices of cowardice and of treachery or “squealing” on one’s “pals.” To be a “tough guy” and perhaps the leader of a gang is an activating and in a sense a genuinely moral ideal of many a high-spirited lad, whose courage and energy if directed into other channels might make him not merely a useful citizen but even a hero. It is obvious enough that here the kind of moral reform that is called for is educational in the broadest sense, involving destruction of hideous economic conditions and of the cultural squalor and ignorance that go with them. Not all criminals indeed but probably the majority could be reformed or cured by being given a Socratic wisdom or knowledge of the things in life that are really worthwhile and an environment that would make it possible to achieve them. Moreover the whole philosophy of punishment would be revolutionized. Prevention rather than cure would be emphasized, and when preventive measures had failed the necessary restraint of the criminal would be accompanied by education rather than by social revenge.

My Socratic conversion justified the use of the word good in response to the above claims. Today, however, I can confidently proclaim the truth of Montague’s call for a Socratic revolution in the philosophy of punishment.

According to the most recent estimates released by the U.S. Department of Justice, at the close of 1998 there were 1,232,900 federal and state prison inmates. To advocate the belief that the majority of them could be reformed by a strong dose of Socrates may appear to be a form of idealism completely out of touch with reality. Yet given that the public expects prison inmates to be “better” people when released on parole, and that the high-school equivalency classes and vocational training programs provided to the majority of them are not designed to foster moral reform, the suggestion that a multitude of inmates should be introduced to Socrates is not the fantasy of an unearthly idealism.

More specifically, I am absolutely convinced that the recidivism rate of former prison inmates can be reduced significantly if, while incarcerated, they are skillfully guided through a systematic discussion of the life and teachings of Socrates as presented by Plato in the Apology, Crito, Phaedo, Protagoras, and the analysis of the concept of justice in the Republic. This is the largely uncultivated and fertile soil in which federal and state authorities should plant the seeds of carefully designed and well-funded programs capable of tracking the lives of the participants (male and female) and those in control groups for three to five years in order to establish some incontrovertible data regarding the power of education to change the thinking and conduct of former criminal offenders.

So I tell all of my students that the only way to silence the voices of the cynics committed to the view that providing a university or college education to prison inmates is flushing clean dollars down a dirty toilet is to remain crime-free following release on parole. I tell them that the continuation of the program is contingent upon years of cumulative success stories and that their moral obligation to succeed is grounded in the lives of the students who remain behind bars. They are encouraged to contact me after their release, as I may be able to assist them in their search for employment. However, if they call me for help after committing another felony offense, I will volunteer to testify against them. As I said on May 13, 1998, in the conclusion of the commencement address I gave to a group of inmates who had earned either an associate’s degree from Alvin Community College in Alvin, Texas, or a bachelor’s or master’s degree from the University of Houston at Clear Lake:

The profound sense in which Socrates was correct is precisely why we are here this evening. Collectively, your teachers have guided you on the ascent from the cave of ignorance as articulated by Plato in his Republic. You have been led out of the abyss of intellectual and moral darkness and our hope is that you have experienced a genuine Socratic “conversion”–that is, that you have accepted total responsibility for the rottenness of your past conduct and are morally prepared to fulfill your obligations as a member of the human community…. However I am obliged to tell you that, if you have not or do not experience a Socratic conversion prior to your release, you will be nothing more than a hypocritical, educated crook.

Socrates does not hold all the answers. For example, I readily admit to my students that, although he was committed to the view that humankind is essentially good, Socrates failed to recognize what philosopher David Hume called the incurable weakness in human nature. In his essay Of the Origin of Government, Hume comments on the nature of humanity and why it was necessary to invent a system of rules to protect lives and property:

It is impossible to keep men faithfully and unerringly in the paths of justice. Some extraordinary circumstances may happen, in which a man finds his interests to be more promoted by fraud or rapine than hurt by the breach which his injustice makes in the social union. But much more frequently he is seduced from this great and important but distant interest by the allurement of present, though often very frivolous, temptations. This great weakness is incurable in human nature.

Men must, therefore, endeavor to palliate what they cannot cure. They must institute some persons under the appellation of magistrates, whose peculiar office it is to point out the decrees of equity, to punish transgressors, to correct fraud and violence, and to oblige men, however reluctant, to consult their own real and permanent interests. In a word, obedience is a new duty which must be invented to support that of justice, and the ties of equity must be corroborated by those of allegiance.

Hume’s view of humanity is consistent with Montague’s claim that whether we call it “sin” or “selfishness,” wrong conduct is due “not to lack of wisdom, but to lack of will…. Insight into the nature of the good … may be termed a ‘necessary,’ but not a ‘sufficient,’ cause of virtue. Wisdom by itself is not enough and great Socrates was wrong in thinking that it was.” Also, almost invariably during our discussions one or more students realize that Socrates’ doctrines of humankind and knowledge and virtue are diametrically opposed to the orthodox Christian belief that humans are sinners whose salvation from evil inclinations requires a supernatural infusion of divine grace. The majority of my students, in widely diverse environments, were nurtured in the tradition of Christian theism, and, not surprisingly, a significant number of them are unwilling to concede that Socratic doctrines inflict any serious damage on their religious commitments.

As was the case when I was introduced to Socrates, he can shake unexamined beliefs and faiths. Unlike any of their other academic classes, however, most of my courses contain opportunities for prison inmates to reflect on the most important and enduring questions of human existence. And I can confidently claim that many of them are surprised by the joy of facing the unfathomed depth of Socrates’ message to live an examined life.

Discussion Group Report

Humanism Against Itself: The Religious Debate

March 2000

By Richard Layton

“In 1933 the humanists who joined in Manifesto I set out to reconstruct faith in the modern world,” says Howard Radest in a chapter with the same title as this article in The Devil and Secular Humanism. “Without apology they described their enterprise as ‘religious humanism.'”

In 1980 some humanists led by Paul Kurtz issued A Secular Humanist Declaration, which explicitly rejected the idea of a “religious humanism.” They accused those who retained the adjective of intellectual confusion, sentimentality, and even opportunism. The Declaration identified religion with: “The reappearance of dogmatic authoritarian religions; fundamentalist, literalist, and doctrinaire Christianity; a rapidly growing and uncompromising Moslem clericalism in the Middle East and Asia; the re-assertion of orthodox authority by the Roman Catholic papal hierarchy; nationalistic, religious Judaism; and the reversion to obscurantist religions in Asia.”

“Religion was the enemy and humanist flirtation with it ensured confusion at best and surrender at worst,” laments Radest. “Clearly the climate of the humanist neighborhood had changed…The polemic and the anger…were addressed to the enemy within. Humanism seemed intent on destroying itself.”

He says that the 1980s found humanists as antagonistic toward their fellow humanists as toward Fundamentalists and right-wing Christians. Since then another manifestation of fragmentation in the humanist movement has been the attempts by other groups to distinguish themselves from the American Humanist Association. These have included Ethical Culture, the Fellowship of Religious Humanists, the Society for Humanistic Judaism, and the Committee for Democratic and Secular Humanism (organized by Kurtz). Rationalism, free thought, and atheism went their separate ways. Countervailing attempts to bring humanists together, such as the Conference on Science and Democracy and the North American Committee for Humanism, had only minor success. Manifesto II, published in 1973, Radest argues, was a long and puzzling essay lacking the clarity, directness, and assurance of the 1933 document, and it was symptomatic of the unresolved issues.

Meanwhile America was pushing toward secularization. Religion on the left had developed a moralistic tone and center. The pulpit addressed itself to social criticism as much as it did to salvation. Its efforts were often in the secular world and its energies devoted to social reform. Biblical scholarship, the “higher criticism” and archaeology revealed the worldly sources of cult and text; and science held sway in the academy and the marketplace. There was a widely felt need to bring religion into the modern world.

This cultural pattern was an appropriate home for the appearance of humanism. Edwin Wilson, an important leader in organizing the humanist movement, recalled that it first came to self-awareness as a movement among Unitarians. In a meeting of the Western Unitarian Conference in Des Moines in 1917, the Reverends John Dietrich and Curtis W. Reese found that they both had been presenting “a revolution from theocracy to humanism, from autocracy to democracy.” “The humanist movement was born at that moment,” said Wilson.

Radest opines that the reason humanists are polarized is that “we avoid working on the question, ‘What is humanism up to,’ and instead play a game of ‘either/or…our thinking is distorted by the fact that we like to choose sides. Humanists, more than most, are given to an argumentative game by temperament and by history…we lose ourselves in the joys of argument and forget that it is only argument. In the heat of argument it is easy to turn ‘faith’ into a caricature of itself and then identify all faith with superstition. When such a mood seizes us, we embrace its complement, a simple-minded secularism that denies any value to a move beyond the immediate…it is all too human to invest ourselves in our arguments and then to be unable to retreat. Losing the argument comes to feel like a loss of self…we are given to the game of ‘either-or’ precisely because the ambiguities of experience have become nearly intolerable. The authors of Manifesto I could speak with confidence about the world to come. They had not yet seen science perverted into holocaust and nuclear destruction. They had not yet seen democracy turned into populist conformism…In the midst of chaos, it is much more satisfying to separate into sheep and goat, saved and damned.”

Like everyone else, humanists, he continues, tend to revert to a mythic past where matters were simpler, clearer, and more assured. So it is that when humanism meets Fundamentalism, it responds in Fundamentalist style with a “raucous humanism.” The angers of Fundamentalism and the confusion of sects confess to a widely shared anxiety of spirit…both Fundamentalism and raucous humanism are only symptomatic, and the game of either-or attends only to the symptoms. When we are lost…we seek out a villain…within the debates is hidden the question: How shall human life be purposeful and joyful in a universe where human life seems only a chemical and biological incident? Humanism is not yet. This arises from the fact that the game of either-or and not the accidents of history blocks the reconstruction the signers of Manifesto I proposed.

Radest suggests that, although humanism is worldly and secular, the qualities of experience to which humanism must address itself are those that have legitimately been called religious. He says humanism is “where the action is, all of the action, including that which has historically been religious action.” For the humanist the “sacred,” the name given to that which is untouchably precious, departs from its separate universe to inform this one, the only one we have. Thus both sacred and secular are transformed under the aegis of a humanist naturalism.

Whether the reader agrees with Radest’s analysis or not, he broaches an important question for humanism.

Thoughts from Eric Hoffer

December 2000

Nature attains perfection, but man never does. There is a perfect ant, a perfect bee, but man is perpetually unfinished. He is both an unfinished animal and an unfinished man. It is this incurable unfinishedness which sets man apart from other living things. For, in the attempt to finish himself, man becomes a creator. Moreover, the incurable unfinishedness keeps man perpetually immature, perpetually capable of learning and growing.

There is a powerful craving in most of us to see ourselves as instruments in the hands of others and thus free ourselves from the responsibility for acts which are prompted by our own questionable inclinations and impulses. Both the strong and the weak grasp at this alibi. The latter hide their malevolence under the virtue of obedience: they acted dishonorably because they had to obey orders. The strong, too, claim absolution by proclaiming themselves the chosen instrument of a higher power–God, history, fate, nation or humanity.

Retiring the Gods From Politics

Centennial Speech by Robert Ingersoll

November 2000

One hundred years ago, our fathers retired the gods from politics.

The Declaration of Independence is the grandest, the bravest, and the profoundest political document that was ever signed by the representatives of a people. It is the embodiment of physical and moral courage and of political wisdom.

I say of physical courage, because it was a declaration of war against the most powerful nation then on the globe; a declaration of war by thirteen weak, unorganized colonies; a declaration of war by a few people, without military stores, without wealth, without strength, against the most powerful kingdom on the earth; a declaration of war made when the British navy, at that day the mistress of every sea, was hovering along the coast of America, looking after defenseless towns and villages to ravage and destroy. It was made when thousands of English soldiers were upon our soil, and when the principal cities of America were in the substantial possession of the enemy. And so, I say, all things considered, it was the bravest political document ever signed by man. And if it was physically brave, the moral courage of the document is almost infinitely beyond the physical. They had the courage not only, but they had the almost infinite wisdom, to declare that all men are created equal.

Such things had occasionally been said by some political enthusiast in the olden time, but, for the first time in the history of the world, the representatives of a nation, the representatives of a real, living, breathing, hoping people, declared that all men are created equal. With one blow, with one stroke of the pen, they struck down all the cruel, heartless barriers that aristocracy, that priestcraft, that king-craft had raised between man and man. They struck down with one immortal blow that infamous spirit of caste that makes a God almost a beast, and a beast almost a god. With one word, with one blow, they wiped away and utterly destroyed, all that had been done by centuries of war–centuries of hypocrisy–centuries of injustice.

What more did they do? They then declared that each man has a right to live. And what does that mean? It means that he has the right to make his living. It means that he has the right to breathe the air, to work the land, that he stands the equal of every other human being beneath the shining stars; entitled to the product of his labor–the labor of his hand and of his brain.

What more? That every man has the right to pursue his own happiness in his own way. Grander words than these have never been spoken by man.

And what more did these men say? They laid down the doctrine that governments were instituted among men for the purpose of preserving the rights of the people. The old idea was that people existed solely for the benefit of the state–that is to say, for kings and nobles.

The old idea was that the people were the wards of king and priest–that their bodies belonged to one and their souls to the other.

And what more? That the people are the source of political power. That was not only a revelation, but it was a revolution. It changed the ideas of people with regard to the source of political power. For the first time it made human beings men. What was the old idea? The old idea was that no political power came from, or in any manner belonged to, the people. The old idea was that the political power came from the clouds; that the political power came in some miraculous way from heaven; that it came down to kings, and queens, and robbers. That was the old idea. The nobles lived upon the labor of the people; the people had no rights; the nobles stole what they had and divided with the kings, and the kings pretended to divide what they stole with God Almighty. The source, then, of political power was from above. The people were responsible to the nobles, the nobles to the king, and the people had no political rights whatever, no more than the wild beasts of the forest. The kings were responsible to God; not to the people. The kings were responsible to the clouds; not to the toiling millions they robbed and plundered.

And our forefathers, in this Declaration of Independence, reversed this thing, and said: No; the people, they are the source of political power, and their rulers, these presidents, these kings are but the agents and servants of the great sublime people. For the first time, really, in the history of the world, the king was made to get off the throne and the people were royally seated thereon. The people became the sovereigns, and the old sovereigns became the servants and the agents of the people. It is hard for you and me now to even imagine the immense results of that change. It is hard for you and for me, at this day, to understand how thoroughly it had been ingrained in the brain of almost every man that the king had some wonderful right over him that in some strange way the king owned him; that in some miraculous manner he belonged, body and soul, to somebody who rode on a horse–to somebody with epaulets on his shoulders and a tinsel crown upon his brainless head.

Our forefathers had been educated in that idea, and when they first landed on American shores they believed it. They thought they belonged to somebody, and that they must be loyal to some thief who could trace his pedigree back to antiquity’s most successful robber.

It took a long time for them to get that idea out of their heads and hearts. They were three thousand miles away from the despotisms of the old world, and every wave of the sea was an assistant to them. The distance helped to disenchant their minds of that infamous belief, and every mile between them and the pomp and glory of monarchy helped to put republican ideas and thoughts into their minds. Besides that, when they came to this country, when the savage was in the forest and three thousand miles of waves on the other side, menaced by barbarians on the one hand and famine on the other, they learned that a man who had courage, a man who had thought, was as good as any other man in the world, and they built up, as it were, in spite of themselves, little republics. And the man that had the most nerve and heart was the best man, whether he had any noble blood in his veins or not.

It has been a favorite idea with me that our forefathers were educated by Nature, that they grew grand as the continent upon which they landed; that the great rivers–the wide plains–the splendid lakes–the lonely forests–the sublime mountains–that all these things stole into and became a part of their being, and they grew great as the country in which they lived. They began to hate the narrow, contracted views of Europe. They were educated by their surroundings, and every little colony had to be to a certain extent a republic. The kings of the old world endeavored to parcel out this land to their favorites. But there were too many Indians. There was too much courage required for them to take and keep it, and so men had to come here who were dissatisfied with the old country–who were dissatisfied with England, dissatisfied with France, with Germany, with Ireland and Holland. The kings’ favorites stayed at home. Men came here for liberty, and on account of certain principles they entertained and held dearer than life. And they were willing to work, willing to fell the forests, to fight the savages, willing to go through all the hardships, perils and dangers of a new country, of a new land; and the consequence was that our country was settled by brave and adventurous spirits, by men who had opinions of their own and were willing to live in the wild forests for the sake of expressing those opinions, even if they expressed them only to trees, rocks, and savage men. The best blood of the old world came to the new.

When they first came over they did not have a great deal of political philosophy, nor the best ideas of liberty. We might as well tell the truth. When the Puritans first came, they were narrow. They did not understand what liberty meant–what religious liberty, what political liberty, was; but they found out in a few years. There was one feeling among them that rises to their eternal honor like a white shaft to the clouds–they were in favor of universal education. Wherever they went they built schoolhouses, introduced books and ideas of literature. They believed that every man should know how to read and how to write, and should find out all that his capacity allowed him to comprehend. That is the glory of the Puritan fathers.

They forgot in a little while what they had suffered, and they forgot to apply the principle of universal liberty–of toleration. Some of the colonies did not forget it, and I want to give credit where credit should be given. The Catholics of Maryland were the first people on the new continent to declare universal religious toleration. Let this be remembered to their eternal honor. Let it be remembered to the disgrace of the Protestant government of England, that it caused this grand law to be repealed. And to the honor and credit of the Catholics of Maryland let it be remembered that the moment they got back into power they re-enacted the old law. The Baptists of Rhode Island also, led by Roger Williams, were in favor of universal religious liberty.

No American should fail to honor Roger Williams. He was the first grand advocate of the liberty of the soul. He was in favor of the eternal divorce of church and state. So far as I know, he was the only man at that time in this country who was in favor of real religious liberty. While the Catholics of Maryland declared in favor of religious toleration, they had no idea of religious liberty. They would not allow anyone to call in question the doctrine of the Trinity, or the inspiration of the Scriptures. They stood ready with branding-iron and gallows to burn and choke out of man the idea that he had a right to think and to express his thoughts.

So many religions met in our country–so many theories and dogmas came in contact–so many follies, mistakes, and stupidities became acquainted with each other, that religion began to fall somewhat into disrepute. Besides this, the question of a new nation began to take precedence of all others.

The people were too much interested in this world to quarrel about the next. The preacher was lost in the patriot. The Bible was read to find passages against kings.

Everybody was discussing the rights of man. Farmers and mechanics suddenly became statesmen, and in every shop and cabin nearly every question was asked and answered.

During these years of political excitement the interest in religion abated to that degree that a common purpose animated men of all sects and creeds.

At last our fathers became tired of being colonists–tired of writing and reading and signing petitions, and presenting them on their bended knees to an idiot king. They began to have an aspiration to form a new nation, to be citizens of a new republic instead of subjects of an old monarchy. They had the idea–the Puritans, the Catholics, the Episcopalians, the Baptists, the Quakers, and a few Freethinkers, all had the idea–that they would like to form a new nation.

Now, do not understand that all of our fathers were in favor of independence. Do not understand that they were all like Jefferson; that they were all like Adams or Lee; that they were all like Thomas Paine or John Hancock. There were thousands and thousands of them who were opposed to American independence. There were thousands and thousands who said: “When you say men are created equal, it is a lie; when you say the political power resides in the great body of the people, it is false.” Thousands and thousands of them said: “We prefer Great Britain.” But the men who were in favor of independence, the men who knew that a new nation must be born, went on full of hope and courage, and nothing could daunt or stop or stay the heroic, fearless few.

They met in Philadelphia; and the resolution was moved by Lee of Virginia, that the colonies ought to be independent states, and ought to dissolve their political connection with Great Britain.

They made up their minds that a new nation must be formed. All nations had been, so to speak, the wards of some church. The religious idea as to the source of power had been at the foundation of all governments, and had been the bane and curse of man.

Happily for us, there was no church strong enough to dictate to the rest. Fortunately for us, the colonists not only, but the colonies differed widely in their religious views. There were the Puritans who hated the Episcopalians, and Episcopalians who hated the Catholics, and the Catholics who hated both, while the Quakers held them all in contempt. There they were, of every sort, and color and kind, and how was it that they came together? They had a common aspiration. They wanted to form a new nation. More than that, most of them cordially hated Great Britain; and they pledged each other to forget these religious prejudices, for a time at least, and agreed that there should be only one religion until they got through, and that was the religion of patriotism. They solemnly agreed that the new nation should not belong to any particular church, but that it should secure the rights of all.

Our fathers founded the first secular government that was ever founded in this world. Recollect that. The first secular government; the first government that said every church has exactly the same rights and no more; every religion has the same rights, and no more. In other words, our fathers were the first men who had the sense, had the genius, to know that no church should be allowed to have a sword; that it should be allowed only to exert its moral influence.

You might as well have a government united by force with Art, or with Poetry, or with Oratory, as with Religion. Religion should have the influence upon mankind that its goodness, that its morality, its justice, its charity, its reason, and its argument give it, and no more. Religion should have the effect upon mankind that it necessarily has, and no more. The religion that has to be supported by law is without value, not only, but a fraud and curse. The religious argument that has to be supported by a musket is hardly worth making. A prayer that must have a cannon behind it better never be uttered. Forgiveness ought not to go in partnership with shot and shell. Love need not carry knives and revolvers.

So our fathers said: “We will form a secular government, and under the flag with which we are going to enrich the air, we will allow every man to worship God as he thinks best.” They said: “Religion is an individual thing between each man and his creator, and he can worship as he pleases and as he desires.” And why did they do this? The history of the world warned them that the liberty of man was not safe in the clutch and grasp of any church. They had read of and seen the thumb-screws, the racks, and the dungeons of the Inquisition. They knew all about the hypocrisy of the olden time. They knew that the church had stood side by side with the throne; that the high priests were hypocrites, and that the kings were robbers. They also knew that if they gave power to any church, it would corrupt the best church in the world. And so they said that power must not reside in a church, or in a sect, but power must be wherever humanity is–in the great body of the people. And the officers and servants of the people must be responsible to them. And so I say again, as I said in the commencement, this is the wisest, the profoundest, the bravest political document that ever was written and signed by man.

They turned, as I tell you, everything squarely about. They derived all their authority from the people. They did away forever with the theological idea of government.

And what more did they say? They said that whenever the rulers abused this authority, this power, incapable of destruction, returned to the people. How did they come to say this? I will tell you. They were pushed into it. How? They felt that they were oppressed; and whenever a man feels that he is the subject of injustice, his perception of right and wrong is wonderfully quickened.

Nobody was ever in prison wrongfully who did not believe in the writ of habeas corpus. Nobody ever suffered wrongfully without instantly having ideas of justice.

And they began to inquire what rights the king of Great Britain had. They began to search for the charter of his authority. They began to investigate and dig down to the bed-rock upon which society must be founded, and when they got down there, forced there, too, by their oppressors, forced against their own prejudices and education, they found at the bottom of things, not lords, not nobles, not pulpits, not thrones, but humanity and the rights of men.

And so they said, We are men; we are men. They found out they were men. And the next thing they said, was, “We will be free men; we are weary of being colonists; we are tired of being subjects; we are men; and these colonies ought to be states; and these states ought to be a nation and that nation ought to drive the last British soldier into the sea.” And so they signed that brave Declaration of Independence.

I thank every one of them from the bottom of my heart for signing that sublime declaration. I thank them for their courage–for their patriotism–for their wisdom–for the splendid confidence in themselves and in the human race. I thank them for what they were, and for what we are–for what they did, and for what we have received–for what they suffered, and for what we enjoy.

What would we have been if we had remained colonists and subjects? What would we have been to-day? Nobodies–ready to get down on our knees and crawl in the very dust at the sight of somebody that was supposed to have in him some drop of blood that flowed in the veins of that mailed marauder–that royal robber, William the Conqueror.

They signed that Declaration of Independence, although they knew that it would produce a long, terrible, and bloody war. They looked forward and saw poverty, deprivation, gloom, and death. But they also saw, on the wrecked clouds of war, the beautiful bow of freedom.

These grand men were enthusiasts; and the world has been raised only by enthusiasts. In every country there have been a few who have given a national aspiration to the people. The enthusiasts of 1776 were the builders and framers of this great and splendid Government; and they were the men who saw, although others did not, the golden fringe of the mantle of glory that will finally cover this world. They knew, they felt, they believed that they would give a new constellation to the political heavens–that they would make the Americans a grand people–grand as the continent upon which they lived.

The war commenced. There was little money, and less credit. The new nation had but few friends. To a great extent each soldier of freedom had to clothe and feed himself. He was poor and pure, brave and good, and so he went to the fields of death to fight for the rights of man.

What did the soldier leave when he went?

He left his wife and children.

Did he leave them in a beautiful home, surrounded by civilization, in the repose of law, in the security of a great and powerful republic?

No. He left his wife and children on the edge, on the fringe of the boundless forest, in which crouched and crept the red savage, who was at that time the ally of the still more savage Briton. He left his wife to defend herself, and he left the prattling babes to be defended by their mother and by nature. The mother made the living; she planted the corn and the potatoes, and hoed them in the sun, raised the children, and, in the darkness of night, told them about their brave father and the “sacred cause.” She told them that in a little while the war would be over and father would come back covered with honor and glory.

Think of the women, of the sweet children who listened for the footsteps of the dead–who waited through the sad and desolate years for the dear ones who never came.

The soldiers of 1776 did not march away with music and banners. They went in silence, looked at and gazed after by eyes filled with tears. They went to meet, not an equal, but a superior–to fight five times their number–to make a desperate stand to stop the advance of the enemy, and then, when their ammunition gave out, seek the protection of rocks, of rivers, and of hills.

Let me say here: The greatest test of courage on the earth is to bear defeat without losing heart. That army is the bravest that can be whipped the greatest number of times and fight again.

Over the entire territory, so to speak, then settled by our forefathers, they were driven again and again. Now and then they would meet the English with something like equal numbers, and then the eagle of victory would proudly perch upon the stripes and stars. And so they went on as best they could, hoping and fighting until they came to the dark and somber gloom of Valley Forge.

There were very few hearts then beneath that flag that did not begin to think that the struggle was useless; that all the blood and treasure had been shed and spent in vain. But there were some men gifted with that wonderful prophecy that fulfills itself, and with that wonderful magnetic power that makes heroes of everybody they come in contact with.

And so our fathers went through the gloom of that terrible time, and still fought on. Brave men wrote grand words, cheering the despondent; brave men did brave deeds, the rich man gave his wealth, the poor man gave his life, until at last, by the victory of Yorktown, the old banner won its place in the air, and became glorious forever.

Seven long years of war–fighting for what? For the principle that all men are created equal–a truth that nobody ever disputed except a scoundrel; nobody, nobody in the entire history of this world. No man ever denied that truth who was not a rascal, and at heart a thief; never, never, and never will. What else were they fighting for? Simply that in America every man should have a right to life, liberty, and the pursuit of happiness. Nobody ever denied that except a villain; never, never. It has been denied by kings–they were thieves. It has been denied by statesmen–they were liars. It has been denied by priests, by clergymen, by cardinals, by bishops, and by popes–they were hypocrites.

What else were they fighting for? For the idea that all political power is vested in the great body of the people. The great body of the people make all the money; do all the work. They plow the land, cut down the forests; they produce everything that is produced. Then who shall say what shall be done with what is produced except the producer?

Is it the non-producing thief, sitting on a throne, surrounded by vermin?

Those were the things they were fighting for; and that is all they were fighting for. They fought to build up a new, a great nation, to establish an asylum for the oppressed of the world everywhere. They knew the history of this world. They knew the history of human slavery.

The history of civilization is the history of the slow and painful enfranchisement of the human race. In the olden times the family was a monarchy, the father being the monarch. The mother and children were the veriest slaves. The will of the father was the supreme law. He had the power of life and death. It took thousands of years to civilize this father, thousands of years to make the condition of wife and mother and child even tolerable. A few families constituted a tribe; the tribe had a chief; the chief was a tyrant; a few tribes formed a nation; the nation was governed by a king, who was also a tyrant. A strong nation robbed, plundered, and took captive the weaker ones. This was the commencement of human slavery.

It is not possible for the human imagination to conceive of the horrors of slavery. It has left no possible crime uncommitted, no possible cruelty unperpetrated. It has been practiced and defended by all nations in some form. It has been upheld by all religions. It has been defended by nearly every pulpit. From the profits derived from the slave trade churches have been built, cathedrals reared and priests paid. Slavery has been blessed by bishop, by cardinal, and by pope. It has received the sanction of statesmen, of kings, and of queens. It has been defended by the throne, the pulpit and the bench. Monarchs have shared in the profits. Clergymen have taken their part of the spoils, reciting passages of Scripture in its defence at the same time, and judges have taken their portion in the name of equity and law.

Only a few years ago our ancestors were slaves. Only a few years ago they passed with and belonged to the soil, like the coal under it and rocks on it.

Only a few years ago they were treated like beasts of burden, worse far than we treat our animals at the present day. Only a few years ago it was a crime in England for a man to have a Bible in his house, a crime for which men were hanged, and their bodies afterward burned. Only a few years ago fathers could and did sell their children. Only a few years ago our ancestors were not allowed to write their thoughts–that being a crime. Only a few years ago to be honest, at least in the expression of your ideas, was a felony. To do right was a capital offence; and in those days chains and whips were the incentives to labor, and the preventives of thought. Honesty was a vagrant, justice a fugitive, and liberty in chains. Only a few years ago men were denounced because they doubted the inspiration of the Bible–because they denied miracles, and laughed at the wonders recounted by the ancient Jews.

Only a few years ago a man had to believe in the total depravity of the human heart in order to be respectable. Only a few years ago, people who thought God too good to punish in eternal flames an unbaptized child were considered infamous.

As soon as our ancestors began to get free they began to enslave others. With an inconsistency that defies explanation, they practiced upon others the same outrages that had been perpetrated upon them. As soon as white slavery began to be abolished, black slavery commenced. In this infamous traffic nearly every nation of Europe embarked. Fortunes were quickly realized; the avarice and cupidity of Europe were excited; all ideas of justice were discarded; pity fled from the human breast; a few good, brave men recited the horrors of the trade; avarice was deaf; religion refused to hear; the trade went on; the governments of Europe upheld it in the name of commerce–in the name of civilization and religion.

Our fathers knew the history of caste. They knew that in the despotisms of the Old World it was a disgrace to be useful. They knew that a mechanic was esteemed as hardly the equal of a hound, and far below a blooded horse. They knew that a nobleman held a son of labor in contempt–that he had no rights the royal loafers were bound to respect.

The world has changed.

The other day there came shoemakers, potters, workers in wood and iron, from Europe, and they were received in the city of New York as though they had been princes. They had been sent by the great republic of France to examine into the arts and manufactures of the great republic of America. They looked a thousand times better to me than the Edward Alberts and Albert Edwards–the royal vermin, that live on the body politic. And I would think much more of our Government if it would fete and feast them, instead of wining and dining the imbeciles of a royal line.

Our fathers devoted their lives and fortunes to the grand work of founding a government for the protection of the rights of man. The theological idea as to the source of political power had poisoned the web and woof of every government in the world, and our fathers banished it from this continent forever.

What we want to-day is what our fathers wrote down. They did not attain to their ideal; we approach it nearer, but have not reached it yet. We want, not only the independence of a State, not only the independence of a nation, but something far more glorious–the absolute independence of the individual. That is what we want. I want it so that I, one of the children of Nature, can stand on an equality with the rest; that I can say this is MY air, MY sunshine, MY earth, and I have a right to live, and hope and aspire, and labor, and enjoy the fruit of that labor, as much as any individual or any nation on the face of the globe.

We want every American to make to-day, on this hundredth anniversary, a declaration of individual independence. Let each man enjoy his liberty to the utmost–enjoy all he can; but be sure it is not at the expense of another. The French Convention gave the best definition of liberty I have ever read: “The liberty of one citizen ceases only where the liberty of another citizen commences.” I know of no better definition. I ask you to-day to make a declaration of individual independence. And if you are independent, be just. Allow everybody else to make his declaration of individual independence. Allow your wife, allow your husband, allow your children to make theirs. Let everybody be absolutely free and independent, knowing only the sacred obligations of honesty and affection. Let us be independent of party, independent of everybody and everything except our own consciences and our own brains. Do not belong to any clique. Have clear title-deeds in fee simple to yourselves, without any mortgages on the premises to anybody in the world.

It is a grand thing to be the owner of yourself. It is a grand thing to protect the rights of others. It is a sublime thing to be free and just.

Only a few days ago I stood in Independence Hall–in that little room where was signed the immortal paper. A little room, like any other; and it did not seem possible that from that room went forth ideas, like cherubim and seraphim, spreading their wings over a continent, and touching, as with holy fire, the hearts of men.

In a few moments I was in the park, where are gathered the accomplishments of a century. Our fathers never dreamed of the things I saw. There were hundreds of locomotives, with their nerves of steel and breath of flame–every kind of machine, with whirling wheels and curious cogs and cranks, and the myriad thoughts of men that have been wrought in iron, brass and steel. And going out from one little building were wires in the air, stretching to every civilized nation, and they could send a shining messenger in a moment to any part of the world, and it would go sweeping under the waves of the sea with thoughts and words within its glowing heart. I saw all that had been achieved by this nation, and I wished that the signers of the Declaration–the soldiers of the Revolution–could see what a century of freedom has produced. I wished they could see the fields we cultivate–the rivers we navigate–the railroads running over the Alleghanies, far into what was then the unknown forest–on over the broad prairies–on over the vast plains–away over the mountains of the West, to the Golden Gate of the Pacific. All this is the result of a hundred years of freedom.

Are you not more than glad that in 1776 was announced the sublime principle that political power resides with the people? That our fathers then made up their minds nevermore to be colonists and subjects, but that they would be free and independent citizens of America?

I will not name any of the grand men who fought for liberty. All should be named, or none. I feel that the unknown soldier who was shot down without even his name being remembered–who was included only in a report of “a hundred killed,” or “a hundred missing,” nobody knowing even the number that attached to his august corpse–is entitled to as deep and heartfelt thanks as the titled leader who fell at the head of the host.

Standing here amid the sacred memories of the first, on the golden threshold of the second, I ask, Will the second century be as grand as the first? I believe it will, because we are growing more and more humane. I believe there is more human kindness, more real, sweet human sympathy, a greater desire to help one another, in the United States, than in all the world besides.

We must progress. We are just at the commencement of invention. The steam engine–the telegraph–these are but the toys with which science has been amused. Wait; there will be grander things, there will be wider and higher culture–a grander standard of character, of literature and art. We have now half as many millions of people as we have years, and many of us will live until a hundred millions stand beneath the flag. We are getting more real solid sense. The schoolhouse is the finest building in the village. We are writing and reading more books; we are painting and buying more pictures; we are struggling more and more to get at the philosophy of life, of things–trying more and more to answer the questions of the eternal Sphinx. We are looking in every direction–investigating; in short, we are thinking and working. Besides all this, I believe the people are nearer honest than ever before. A few years ago we were willing to live upon the labor of four million slaves. Was that honest? At last, we have a national conscience. At last, we have carried out the Declaration of Independence. Our fathers wrote it–we have accomplished it. The black man was a slave–we made him a citizen. We found four million human beings in manacles, and now the hands of a race are held up in the free air without a chain.

I have had the supreme pleasure of seeing a man–once a slave–sitting in the seat of his former master in the Congress of the United States. I have had that pleasure, and when I saw it my eyes were filled with tears. I felt that we had carried out the Declaration of Independence–that we had given reality to it, and breathed the breath of life into its every word. I felt that our flag would float over and protect the colored man and his little children, standing straight in the sun, just the same as though he were white and worth a million. I would protect him more, because the rich white man could protect himself.

All who stand beneath our banner are free. Ours is the only flag that has in reality written upon it: Liberty, Fraternity, Equality–the three grandest words in all the languages of men.

Liberty: Give to every man the fruit of his own labor–the labor of his hands and of his brain.

Fraternity: Every man in the right is my brother.

Equality: The rights of all are equal: justice, poised and balanced in eternal calm, will shake from the golden scales in which are weighed the acts of men, the very dust of prejudice and caste: No race, no color, no previous condition, can change the rights of men.

The Declaration of Independence has at last been carried out in letter and in spirit.

The second century will be grander than the first.

Fifty millions of people are celebrating this day. To-day, the black man looks upon his child and says: The avenues to distinction are open to you–upon your brow may fall the civic wreath–this day belongs to you.

We are celebrating the courage and wisdom of our fathers, and the glad shout of a free people, the anthem of a grand nation, commencing at the Atlantic, is following the sun to the Pacific, across a continent of happy homes.

We are a great people. Three millions have increased to fifty–thirteen States to thirty-eight. We have better homes, better clothes, better food and more of it, and more of the conveniences of life, than any other people upon the globe.

The farmers of our country live better than did the kings and princes two hundred years ago–and they have twice as much sense and heart. Liberty and labor have given us all. I want every person here to believe in the dignity of labor–to know that the respectable man is the useful man–the man who produces or helps others to produce something of value, whether thought of the brain or work of the hand.

I want you to go away with an eternal hatred in your breast of injustice, of aristocracy, of caste, of the idea that one man has more rights than another because he has better clothes, more land, more money, because he owns a railroad, or is famous and in high position. Remember that all men have equal rights. Remember that the man who acts best his part–who loves his friends the best–is most willing to help others–truest to the discharge of obligation–who has the best heart–the most feeling–the deepest sympathies–and who freely gives to others the rights that he claims for himself is the best man. I am willing to swear to this.

What has made this country? I say again, liberty and labor. What would we be without labor? I want every farmer when plowing the rustling corn of June–while mowing in the perfumed fields–to feel that he is adding to the wealth and glory of the United States. I want every mechanic–every man of toil, to know and feel that he is keeping the cars running, the telegraph wires in the air; that he is making the statues and painting the pictures; that he is writing and printing the books; that he is helping to fill the world with honor, with happiness, with love and law.

Our country is founded upon the dignity of labor–upon the equality of man. Ours is the first real Republic in the history of the world. Beneath our flag the people are free. We have retired the gods from politics. We have found that man is the only source of political power, and that the governed should govern. We have disfranchised the aristocrats of the air and have given one country to mankind.

Living and Dying
Humanism and US History

October 2000

This past summer (2000), Sen. Joseph Lieberman introduced a congressional resolution calling for more emphasis on the teaching of American history. In making his presentation Sen. Lieberman said: “When we lose the memory of our past, when we lose our understanding of the remarkable individuals, events and values that have shaped this nation, we are losing much of what it means to be an American.” Gordon Wood, professor of history at Brown University, followed up by saying: “Without some such sense of history, the citizens of the United States can scarcely long exist as a united people.” And Theodore Rabb, chairman of the National Council for History Education, said, “Unlike many people of other nations, Americans are not bound together by a common religion or a common ethnicity. Instead, our binding heritage is a democratic vision of liberty, equality and justice. If Americans are to preserve that vision and bring it to daily practice, it is imperative that all citizens understand how it was shaped in the past.”

It is my belief that this nation was founded and developed by political leaders who were students of the western European Enlightenment movement. Today our nation is the world’s leading example of the social society envisioned by those realistic and practical philosophers of the 16th and 17th centuries. Humanists today are leading the movement to restore human understanding of the principles of The Enlightenment: “life, liberty, equality and justice for all.”

Howard Radest, Humanist Leader of the American Ethical Union, says Humanism has failed to communicate with a large number of people because we haven’t developed interesting stories. In the 1999 annual issue of Humanism Today, Radest wrote: “The clue to an alternative approach to human relationship is the notion of ‘stories.’ Far from being mere fictions, stories help human beings put their experience together, suggest directions for finding meanings in our lives, reflect the experience of particular times, places, and peoples. So, stories enable the rest of us to gain access to strangers and make them somewhat less than strange.” He adds, “Humanists have failed to create, communicate, and celebrate their own stories.”

Radest suggests the story of Humanism is Human Dignity. We need to create or discover the stories that exemplify the development of Human Dignity, then tell and retell those stories.*

I propose that one place we may begin to develop the story of Humanism is the 15th century in Western Europe, when humans began the process of realizing they were in bondage to popes, kings and other authoritarians. That awakening became known in history as The Enlightenment.

In his latest book, “From Dawn to Decadence,” historian Jacques Barzun refers to this as the beginning of “Modernism,” the beginning of human emancipation from the authority of governments controlled by religion. It had its roots in the Reformation, the Renaissance and the Enlightenment. The branches of Modernism are reason, individualism, and rationality. From those roots and branches humans have learned to live with relativism, complexity and uncertainty. They have acquired freedom, dignity, and confidence.

In the Sept/Oct 2000 issue of The Humanist magazine, Edward L. Ericson, Humanist of the Year in 1990, writes about the need for Humanism to reclaim the high ground. In the article he defines a humanist as one who holds that the source of our values, including our moral and inspirational values, is to be found within human nature and experience. He goes on to say, “The core of the humanist philosophy is naturalism–the proposition that the natural world proceeds according to its own internal dynamics, without divine or supernatural control or guidance, and that we human beings are creations of that process.”

With those introductory remarks I will now move into the body of my presentation, “Humanism, a Rational Approach to Life and Death.”

The earliest records relating to the Humanist philosophy are found in Greek manuscripts written around 600 BC, when a few Greek scholars questioned the popular belief that supernatural forces influenced human life. The Greek philosopher Protagoras, around 450 BC, wrote: “Man is the measure of all things. As for the gods, I do not know whether they exist or not. Life is too short for such difficult inquiries.” That statement is an expression of justified pride in human potential and expresses confidence that the human mind can be the most reliable source for solving the problems of human existence and discovering the means of leading a worthwhile, fulfilling and valuable life. Then, as now, the majority of thinkers believed that various ‘gods’ had an interest in and an influence on the affairs of the human race and the workings of nature. Then, as now, a few thinkers questioned that majority concept and proposed that perhaps individuals should accept responsibility for what happens in their lives. Some of them taught that death is neither a reward nor a punishment, but simply a natural event. The Greek philosopher Epicurus (342-270 BC) summarized this attitude, writing in the third century BC: “Become accustomed to the belief that death is nothing to us. For all good and evil consist in sensation, but death is deprivation of sensation. And therefore a right understanding that death is nothing makes life enjoyable.”

In southern Europe and the Middle East a stimulating atmosphere of free and open discussion about science, religion and the meaning of life continued for about 800 years. During that period a magnificent library was constructed in Alexandria, Egypt. It contained over 700,000 volumes, dedicated to the collection, preservation and study of ancient Greek culture. It was a beacon of learning, illuminating the intellectual life of the classical world. Unfortunately the clash of cultures eventually resulted in its destruction. It was partially burned by the troops of Caesar, and later totally destroyed by Muslim forces in 641 AD.

Thus began the era we now refer to as the Dark Ages, a period when the only people permitted to read and write were the students in religious monasteries. For the next thousand years the authoritarian Christian religion controlled the major sources of knowledge; consequently it also dominated the cultural, social and political climate. Discussions of serious philosophical questions were limited, and questioning the authority of religious leaders was strongly discouraged.

Dark Ages

To help us understand emotionally the impact and the significance of the humanism we enjoy today, let me take a few minutes to summarize the depressing human conditions that existed during that period.

The academic term for the Dark Ages is the Middle Ages, the period in Europe dating from the collapse of the Roman Empire, around the 5th century, to the 15th century. The fixing of exact dates for the beginning and end of the Middle Ages is arbitrary. The term implies a suspension of time and, especially, a suspension of progress, a period of cultural stagnation. During this period western Europe essentially declined to a primitive culture. People lived in a state of perpetual crisis and ignorance.

The loose confederation of tribes coalesced into kingdoms, but virtually no effective machinery of government existed, and political and economic development came to a standstill. Regular commerce had ceased almost entirely. Peasants became bound to the land and dependent on landlords for protection and the rudimentary administration of justice. Feudalism emerged.

The only universal European institution was the Catholic church. The church saw itself as the spiritual community of Christian believers, in exile from God’s kingdom, waiting in a hostile world for the day of deliverance. At the center of the very limited educational activity stood the Bible, and all secular learning became regarded as mere preparation for understanding the holy text. Not only did the papacy exercise direct political control over the domain lands of central and northern Italy, but through diplomacy and the administration of justice in the extensive system of church courts it also exercised a directive, authoritative power throughout Europe.

With new migrations and invasions (the coming of the Vikings from the north and the Magyars from the Asian steppes), violence and dislocation caused lands to be withdrawn from cultivation and populations to decline; the monasteries became outposts of civilization.

This was also the era of the Crusades. These wars, begun in the late 11th century, were called by the popes to free Christian holy places from the control of the Muslims.

The catastrophic appearance of the Black Death in the 1340s killed about a fourth of Europe’s population.

Because most people could not read or write, they lacked the information to think seriously and discuss openly questions about the deeper meanings of life and death. Consequently I’m certain they were inclined to accept, without question, the decisions of religious and political leaders. This made it rather easy for the masses to be convinced that their leaders were given inspiration and knowledge by magic conversations with supernatural powers. It is important to remember that the contents of “The Bible” were transmitted orally for hundreds of years. When the stories were eventually written down by hand, the general population remained illiterate, so very few people could read them. Consequently, for centuries religious authorities indoctrinated their subjects with authoritarian dogma.

The dogma and myths included such ideas as these: the earth is flat, the sun and stars rotate around the earth, and some people are inferior, born to be slaves to their superiors. Such religious myths dominated Europe until the Renaissance.

The ‘Age of the Renaissance’ has no clear beginning, but historians usually recognize the mid-1400s as the Renaissance period. The essence of the Renaissance was the questioning and testing of the authority of the church in secular affairs. It was the first stage of the cultural evolution which led to the scientific revolution of the Enlightenment. The prime quality of the Renaissance has been defined as “independence of mind.” Its ideal was a person who, by mastering all branches of art and thought, need not depend on any outside authority for the formation of knowledge, tastes, and beliefs. Such a person was considered “the complete man.”

The principal product of the Renaissance was the reestablishment of humanism, the ancient Greek conviction that humanity is capable of mastering the world in which it lives. It was a decisive break with the Middle Ages, when men and women were considered to be helpless pawns of supernatural Providence and universal sin.

Renaissance humanism was marked by a fundamental shift from the theocratic or god-centered world view of the middle ages to the anthropocentric or man-centered view. Its original manifesto may have been Pico’s treatise “On the Dignity of Man”. That essay is connected with the stirrings of the scientific attitude, the principle that nothing should be taken as true unless it can be tried and demonstrated.

The Renaissance established the grounding for the eventual recognition of individualism.

The later stages of the Renaissance witnessed Martin Luther posting his famous 95 Theses questioning the Catholic church’s ethical practice of selling indulgences. This encouraged other religious leaders to challenge the dogma that “whoever has the right to rule also has the right to determine religion.”

As the human brain was slowly freed from the centuries of oppressive ecclesiastical bondage, people began to ask questions and the thirst for knowledge dramatically increased. The Humanist philosophy of individual dignity once again enjoyed increasing recognition. When Gutenberg developed a movable-type process that made it possible to print books faster, the thoughts of those rebelling against authoritarian controls spread rapidly.

Many people now see Gutenberg’s invention as comparable in its day to the development of the Internet today. In fact, the Discovery Television channel listed Gutenberg as the most influential person of the second millennium. In the period between 1450 and 1500, more than 6,000 separate works were printed. Information became public property, and increasing numbers of people began to read, to think and to discuss serious subjects: science, politics, religion, the purpose of life and the meaning of death. In 1532 Machiavelli’s book “The Prince” was published; it is thought to have been the first completely secular book, the first printed book which did not mention a deity.

Free thinkers exercised increasing influence and, led by Martin Luther, openly challenged the right of religious leaders to control human thought. A French philosopher, Pierre Charron (1541-1603), summarized the dominant theme of the new age when he wrote in his Book of Wisdom, “The proper science and subject for man’s contemplation is man himself.”

Freed from the domination of religious authorities, intellectuals expanded the scope of their inquiry and began to challenge as well the secular authority of emperors, kings, feudal lords and military leaders. This began the Age of Enlightenment.

The Age of Enlightenment is a term used to describe the trends in thought and letters in Europe and the American colonies during the 18th century, prior to the French Revolution. The phrase was frequently employed by writers of the period itself, who were convinced that they were emerging from centuries of darkness and ignorance into a new age enlightened by reason, science, and a respect for humanity.

This is the historical period when we learned that the earth is not flat, that the sun does not rotate around the earth, that the earth is not the center of creation, and that no person should be a slave to another.

Of the basic assumptions and beliefs common to philosophers and intellectuals of this period, perhaps the most important was an abiding faith in the power of human reason. The age was enormously impressed by Isaac Newton’s discovery of universal gravitation. Other brilliant minds reasoned that if humanity could so unlock the laws of the universe, why could it not also discover the laws underlying all of nature and society? People came to assume that through a judicious use of reason, an unending progress would be possible: progress in knowledge, in technical achievement, and even in moral values. Eighteenth-century writers taught that knowledge is not innate, but comes only from experience and observation guided by reason. Through proper education, humanity itself could be altered, its nature changed for the better. A great premium was placed on the discovery of truth through the observation of nature, rather than through the study of authoritative sources, such as Aristotle and the Bible.

Although they saw the church, especially the Roman Catholic church, as the principal force that had enslaved the human mind in the past, most Enlightenment thinkers did not renounce religion altogether. They opted rather for a form of Deism, accepting the existence of God and of a hereafter, but rejecting the basic Christian theology of creation, sin and divine damnation. Human aspirations, they believed, should not be centered on the next life, but rather on the means of improving this life. Worldly happiness was placed before religious salvation. Nothing was attacked with more intensity and ferocity than the authority of the church, with all its wealth, political power, and suppression of the free exercise of reason.

More than a set of fixed ideas, the Enlightenment implied an attitude, a method of thought. According to the German philosopher Immanuel Kant, the motto of the age should be “Dare to know.” A desire arose to reexamine and question all accepted ideas and values, to explore new ideas in many different directions.

One of the major events of this period of history was the execution of King Charles the First of England. He was put on public trial for claiming the divine right to rule. He refused to enter a plea, saying the court had no authority over him. The court found otherwise. He was declared guilty and beheaded January 30, 1649. That was a significant event in the downfall of the concept of ‘the divine right to rule’ and a major step toward establishing the revolutionary concept of separation of church and state.

This was the period during which the Scottish philosopher David Hume wrote his “Treatise of Human Nature” and his “Enquiry Concerning the Principles of Morals,” in which he says human ethics are not rules dictated by a ‘god’ but rather are the result of human experience. The English poet and philosopher Alexander Pope (1688-1744), in his “Essay on Man,” wrote: “Know then thyself, presume not God to scan; the proper study of mankind is man.”

Probably the most notable figure of The Enlightenment is the English philosopher John Locke (1632-1704). He wrote essays on the nobility of human nature in which he proclaimed basic human rights, such as the right to think freely and the right to express one’s views without public censorship or fear of repression.

His friendships with prominent government officers and scholars made him one of the most influential men of the 17th century. His “Essay Concerning Human Understanding,” written in 1690, is considered one of the classical documents of empirical philosophy. He concluded that the principal subject of philosophy is the extent of the mind’s ability to know. Locke is perhaps best known for his contributions to political thought. He wrote two major treatises of government that have had lasting influence on the political structures of England, France and the United States. In those works he set forth the principle that the state exists to preserve the natural rights of its citizens.

In his “Letter Concerning Toleration” Locke expresses the view that no one should dictate the form of another person’s religion. That was another major event in the movement to separate the powers of church and state.

John Locke’s writings were a major influence on Thomas Jefferson, who put those Humanist principles into the famous American revolutionary document, “The Declaration of Independence.” A few years later Jefferson relied on Locke’s philosophy in helping to draft one of the finest secular documents in world history, a document that would establish a humanistic form of secular government: the “Constitution of the United States.” Finally, a form of government that did not equate disbelief with treason.

Let me cite one more of the masterful minds of The Enlightenment who is recognized for his contributions to the basic principles of Humanism: Paul Henri d’Holbach. In 1772 he published one of his major works, “Natural Ideas Opposed to the Supernatural.” In it he writes: “In vain should we attempt to cure men of their vices, unless we begin by curing them of their prejudices. It is only by showing them the truth, that they will know their dearest interest, and the motive that ought to incline them to do good. Instructors have long enough fixed men’s eyes upon heaven; let them now turn them upon earth…Let the human mind apply itself to the study of nature, to intelligible objects, sensible truths and useful knowledge….To learn true principles of morality, men have no need of theology, of revelations, or gods: they have need only of reason.”

That statement by Holbach is foundational to the ethics of contemporary Humanism.

The basic morality of Humanism is based on ‘Situational Ethics’ rather than ‘Traditional Ethics,’ often referred to as ‘Family Values.’ Traditional Ethics presupposes that there are certain basic rules, ordained by god, that govern all human conduct. Situational Ethics, on the other hand, is flexible, determined by the particular situation as well as concern for the welfare of the persons involved. Situational Ethics also considers the likely outcome, the ‘consequences,’ of an action.

Traditional Ethics, or “Family Values,” rests on four assumptions:

  1. That there is a real distinction between right and wrong.
  2. That no consideration of consequences can override that distinction.
  3. That right and wrong are based on strict rules. And
  4. That such principles are clear and unambiguous.

In contrast, Situational Ethics maintains that:

  1. That right and wrong are not always clearly defined.
  2. That the consequences of actions must be considered.
  3. That there will frequently be situations in which established rules can be discarded. And
  4. That many moral principles are ambiguous and uncertain.

The author of the major study of Situation Ethics, Joseph Fletcher, received the Humanist of the Year Award in 1974. In his book describing Situation Ethics he gives the following example to justify consideration of consequences. He looks at a scene from the stage play and movie “The Rainmaker,” in which the morally outraged brother of a lonely girl threatens to shoot the Rainmaker because he made love to her in the barn during the night. The Rainmaker’s intention was to restore her sense of womanliness and her hopes for marriage and children. Her father, a wise old rancher, grabs the gun away from his son, saying, “Noah, you’re so full of what’s right you can’t see what’s good.” This episode illustrates the Humanist belief that we can choose between allegiance to established norms, based on traditional ethics, and human well-being, based on situational ethics.

Humanist Ethics seeks to bypass intense dogmatic differences and to negotiate disagreements, appealing to the civil virtues of rational dialogue and tolerance. It is our belief that most problems can be solved by negotiated compromises that respect individual rights, encourage personal responsibility and recognize societal needs.

In accepting his “Humanist of the Year” plaque Dr. Fletcher said: “We should drop the sanctity-of-life ethic and embrace a quality-of-life ethic.” His recommendation has since become closely identified with the Hospice movement and the Pro-Choice movement.

That is a brief outline of historical Humanism. Here is a brief summary of today’s Humanist philosophy.

Humanism is a philosophy that puts the emphasis on humans solving the problems of life without the dogmatic authority of secular or religious institutions.

Humanism is committed to rational thought and responsible behavior that will enhance the quality of life on this earth.

Humanists believe that human beings are part of the natural world with all other forms of life and that nature is indifferent to our individual existence.

Humanists are convinced that the meaning and purpose of life must be found in living, not in dying.

Humanists believe that moral values are neither divinely revealed nor the special property of any religious tradition, that they must be found by humans through the use of their natural reason, and that our beliefs about what is right or wrong in human behavior must be constantly subjected to the deepest reflection in light of our evolving understanding of our nature and the world in which we live.

Humanists have faith in the human capacity to choose good over evil without the expectation of reward in another life.

Humanists encourage moral excellence, positive relationships and human dignity; compassion, cooperation and community.

I am a Humanist because it offers a positive, intelligent, rational approach to solving the many problems of the human condition without resorting to character assassination or brutality, and without condemning anyone’s lifestyle. I am a Humanist because it encourages a ‘zest’ for living.

Bertrand Russell, in his book The Conquest of Happiness, referred to “zest” as “the most universal and distinctive mark” of the happy individual. People with this quality, Russell argued, are those who come at life with a sound appetite, are glad to have what is before them, partake of things until they have enough, and know when to stop.

Omar Khayyam described a ‘zest’ for life when he wrote:

A Book of Verses underneath the Bough,
A Jug of Wine, a Loaf of Bread–and Thou
Beside me singing in the Wilderness

Ah, make the most of what we yet may spend,
Before we too into the Dust descend;

The 20th century mythologist Joseph Campbell said one can find a zest for life by ‘following your bliss,’ which he described as acting according to the dictates of your own heart rather than the expectations of society.

The leading psychologist of Humanism, the late Dr. Abraham Maslow, popularized a formula for individual fulfillment with his “Hierarchy of Needs”: physiological, safety, belonging, esteem, and self-fulfillment. Dr. Maslow was honored as Humanist of the Year in 1967.

Having fairly well covered the Humanist attitude about living, I would like now to turn to Humanist thoughts about death.

One of the questions I am most frequently asked is: “If you don’t believe in God and life after death, what’s your incentive for leading a moral life?” My answer is: “My respect for others and respect for myself.” One of the basic teachings of Humanism is recognizing the dignity of every human being and taking responsibility for how we treat every person we encounter. The daily acts of road rage, the gang shootings and schoolyard fights, the political character assassinations, the abuse of family members and the brawls in professional sports are not caused by a lack of belief in God but by a lack of belief in the rights of people. When people in positions of power and influence demand sexual favors from associates, it’s not because they don’t believe in a supernatural power; it’s because they lack the sense of responsibility that goes with leadership. The ethical teachings of the world’s leading religions use the fear of a supernatural power as the enforcer of moral values. Humanism suggests that moral values should be based on respect for human values, values that have been outlined by such documents as the Hammurabi Code, the Magna Carta, the U.S. Declaration of Independence, the French Declaration of the Rights of Man, and the U.S. Bill of Rights. Humanists may not believe there is life after death, but we do believe in honoring this life. We conclude that the moral problems of this world are not the result of people having lost their religion; they are the result of people having lost their humanism.

A few years before I became active in organized Humanism I got involved in the Hospice movement. I enrolled in the training course to be a Hospice Volunteer and after completing the course I was invited to be a member of the group organizing a Utah Hospice program. The original Utah hospice organization included several nurses, a doctor, a dentist, a couple of advertising executives and myself. At that time I was a broadcast journalist and public affairs representative for KSL.

Our primary goal was to introduce the hospice philosophy to Utah and to train both medical professionals and non-medical lay people in the art of volunteer hospice care. At that time Hospice training was basically teaching volunteers the principles of ‘rational compassion,’ that is, recognizing human pain and suffering, then helping patients and their families to deal realistically with it. Today it’s called Palliative Care. That’s much different from simply feeling sorry for people.

The Hospice philosophy recognizes that death and dying are difficult situations for everyone. We are a death-denying culture; we avoid talking about death and tend to think about it only in vague terms. Consequently we are confronted with overpowering decisions when we or a loved one faces the reality of dying.

Hospice helps people to realize that death is a natural process and that the end of life deserves thoughtful consideration and care. The goal of Hospice care is the best quality of life possible during a person’s final weeks, days and hours. Hospice believes people have the right to spend their final days in a familiar, friendly environment, their own home if possible. The Hospice program teaches the value of meaningful communication between the person in the process of dying and the family members. Hospice believes that people should not have to suffer severe pain during the final days of their life, and it encourages the medical profession to provide adequate pain-control medication. Hospice encourages taking care of the whole person, the body, mind and emotions, not simply the disease.

During the 25 years of my involvement with Hospice I’ve been a lay volunteer, a trainer, a workshop leader and a pastoral counselor. But I must say that I have received much more than I have given. My hospice service has stimulated me to be more compassionate, helped me learn more about the art of listening, given me tools to deal more realistically with death and dying, and taught me to more fully appreciate the daily experiences of life.

For example, about two years after my wife and I took the hospice training program, our doctor discovered my wife had an incurable cancer. Our hospice training was really put to the test as we learned to think about how to live each day with the knowledge of a very limited future together…We got first-hand experience in the art of honestly expressing the full range of human feelings…the art of caring about each other…the benefits of doing things we really wanted to do now rather than postponing them. Her death was a great loss for me, but our experiences with Hospice helped her to deal with her pending death and helped me to accept her death realistically and to deal honestly with the frequent feelings of sorrow and loneliness that would erupt unexpectedly for many years.

A few years after my wife’s death, my 86-year-old mother was faced with a serious health situation that made death or incapacitation her only options. With the benefit of discussions we had had about living wills, advance medical directives and special powers of attorney, she chose to die. She was released from the hospital and spent her final days of life in the comfort of my home, surrounded with the love of her children, grandchildren and great-grandchildren…

A couple of years after my mother’s death, my two-year-old great-grandson was seriously injured in a home accident. Doctors at the Primary Children’s Hospital tried every possible way to restore his consciousness but finally said it was futile, that his brain had been deprived of oxygen so long after the accident that if he survived he would live in a vegetative state. My granddaughter talked with me and decided to remove her son from the life-support system. She held him for a while, then asked if I would like to hold him. I had the privilege of holding his small body in my arms, with his face resting on mine, as his body exhaled its final breath.

I relate these personal experiences as examples of the value of the Hospice Philosophy and Hospice training. I have no way of knowing how I would have handled these family deaths if I had not been involved with the Hospice Program. But I do know that the `rational compassion’ I developed as a result of being a Hospice volunteer has been inspiring and a source of emotional strength for me.

CONCLUSION

In summary then, how do my Humanist philosophy, my Hospice training and my experience with family deaths affect my feelings about death and dying? I grieve and sorrow and cry. I remember with regret the times when those relationships were marred by misunderstanding and anger, but I also remember with joy the happy times: the moments we shared beautiful experiences, the quiet times of thoughtful tenderness and the times of boisterous laughter. I try to remind myself that life is not an orderly process of moving from point A to point B; rather, life is chaotic, uncertain and ambiguous. Is the possibility that life may continue after death appealing? You bet. But is it probable? I don’t think so. As a Humanist I celebrate life, and I recognize that death ends a life but not a relationship.

In conclusion I want to return to the thoughts of Howard Radest with which I began this presentation: he suggests that Humanism needs to create, communicate and celebrate stories about human dignity, then tell and retell those stories.

A vital element of creating such stories is clarifying our goals, understanding what it is Humanism hopes to accomplish.

Much of the success of religion can be attributed to its effort to provide answers to the basic mysteries of life: where did we come from, why are we here, and where are we going? Religions tell stories that provide answers to those questions, then tell those stories over and over and over. Religions have been repeating the same answers to those same questions for at least five thousand years.

Can we create meaningful stories about the dignity of being human that will have emotional impact? Can we create stories that will endure, that will appeal to generation after generation, for five thousand years?

The Enlightenment was the beginning of human emancipation from mythology and authoritarianism. The enlightenment leaders proclaimed that every human being should have an equal opportunity to life, liberty and the pursuit of happiness and that the primary role of government is to assure social and political conditions that protect that equality. I think that is the primary story Humanism should glorify and tell over and over.

Abraham Maslow clarified the stages of how humans find meaning in life. I believe that’s another story Humanism should tell over and over.

Every person has worth and dignity and we should say so over and over.

Every person has the right to a pain-free, dignified death. That message deserves frequent repetition.

I believe Humanism does have meaningful stories to tell about living and dying, and we should tell those stories over, and over and over.

–Flo Wineriter


Politics 2000: Who Cares?

November 2000

Professor J.D. Williams presented his perspective on the 2000 elections to our group. He made it very clear there are real choices in this year’s elections. This has not always been the case. Consider in 1944 when the slogan was, “Hold your nose and vote for Roosevelt or shut your eyes and vote for Dewey.” In local politics in 1928, the refrain, “We want a Dern good Governor and we don’t mean Mabey,” was common.

This is not to say that either of our major choices at the national level is pristine or clear-cut. Al Gore’s reputation is badly stained by the illegal fund-raising he did with the Chinese for the DNC. George W. Bush, who was born with an oil stick in his mouth, presides over a state with a terrible record in maternal and child care and with more executions carried out than any other in the country.

Key differences between the two candidates:

  • Arctic Refuge oil exploration: Bush favors; Gore opposes.
  • Education: Bush favors vouchers; Gore favors a tax deduction for college expenses.
  • Social Security: Bush would allow individual investment of part of SSA contributions; Gore opposes.
  • Campaign finance: Bush would protect soft-money political contributions; Gore endorses the McCain-Feingold ban (but collected record amounts in ’96).
  • Taxes: Bush favors tax cuts across the board, including for the wealthy; Gore favors tax cuts for the lower and middle classes.

Some generalizations about these positions:

  • Both candidates have interesting ideas about improving education.
  • Bush will be a boon to the wealthy class.
  • Gore will be a boon to the middle and lower classes and, in Professor Williams’s opinion, a better custodian of Social Security and the environment.

Does anyone really care about the elections this year? Should anyone care? Who Cares?

  • If the mal-distribution of income and poverty remains unchanged?
  • If the minimum wage produces an annual income less than two-thirds of a family poverty income?
  • If Social Security isn’t properly funded?
  • If global warming isn’t addressed?
  • If 80,000 children in Utah continue to live in poverty? (They would fill the University stadium, the Delta Center, the Tabernacle, and more.)
  • If consumers are deprived by HB 320 of their protection against the monopoly power of the utilities in Utah?

There should be enough CARING to carry over to November 7th!

Professor Williams encouraged all of us to get involved with the election. Support the candidates and causes of your individual choice by volunteering, putting up lawn signs, and most importantly: voting!

 

In Appreciation: Erich Fromm

May 2000

When I made a commitment to give this presentation some months ago, I had no idea how timely it would be, given the current activities of Fromm devotees around the world. March 23, 2000, marked the centenary of Erich Fromm’s birth on March 23, 1900. As it turns out, the centenary is being observed by the publication of numerous books and articles in Fromm’s honor, and various lectures and conferences are being held as well.

In an audience such as this one, I would expect a number of humanistically oriented authors to be favorites: Isaac Asimov, E.O. Wilson, Robert Ingersoll, Corliss Lamont, Paul Kurtz, Bertrand Russell, Carl Sagan, and, I would hope, Erich Fromm. I certainly enjoy all of these authors, but Fromm holds a special place in my life, for he, more than the others, was very much a mentor for me as I was making my philosophical transition from traditional Christianity to humanism and my career evolution from minister to psychologist. I never conversed with Fromm in person, although I did have the pleasure of hearing him speak once at the University of Utah years ago. What I did have the opportunity to do was to read his books avidly once I discovered them, especially from the late 1950s to the time of his death in 1980.

My assumption is that Fromm may be fading into obscurity, particularly in this country and with younger persons because the American attention span is so short. I think that’s unfortunate given his status in the evolution of humanism over the last 60 years. Gerhard Knapp, for instance, has described Fromm as “one of the most influential humanists of this century.” But I express my appreciation for Fromm tonight not just for his personal contribution to me or for his historical contribution but also because I heartily believe his writings are still very relevant as we move into the challenges of the 21st century.

Before dipping into just a few of his books, let me quickly sketch a bit of the Fromm biography. He was born, as indicated earlier, on March 23, 1900, in Frankfurt, Germany, the only child of Orthodox Jewish parents. Fromm later described his mother as overprotective, his father as distant, and himself as an “unbearable, neurotic child.” And further: “being the only child of two overly anxious parents did not, of course, have an altogether positive effect on my development, but over the years I’ve done what I could to repair that damage.” (It has been said that those of us in the mental health professions often choose that line of work to cure our parents, or ourselves!)

The Fromm family was steeped in Jewish tradition, and the young Fromm was an avid scholar of the Talmud and the Old Testament, particularly the prophets Isaiah, Amos, and Hosea, with their emphasis on justice, righteousness, and universal peace, motifs which would echo through all of Fromm’s later writings. In 1926, however, at the age of 26, he officially abandoned his Jewish faith. I was interested to note that was about the same age I officially abandoned my Methodist affiliations.

Fromm’s formal education focused on psychology, philosophy, sociology, and later, psychoanalysis. The major intellectual influences for him were Sigmund Freud and Karl Marx although Fromm was eventually to be a revisionist of both of these men.

In 1926 Fromm married a woman ten years his senior who had been his psychoanalyst, Frieda Reichmann, but the marriage lasted only four years. (There are many good reasons not to marry your therapist!) Nonetheless, Fromm and Frieda Fromm-Reichmann continued to be friends and professional collaborators, and she had her own distinguished career as an author and psychotherapist.

In 1933 Fromm left Germany because of the rising tide of Nazism, just one of millions who fled from or perished at the hands of Hitler’s legions. Beyond the horrific and incomprehensible genocide of those days, how can one really imagine the incalculable loss to Germany and the occupied countries of the intellectuals, professionals, artisans, and myriad other talented persons who either died or fled to other countries, much to the enrichment of their adopted countries?

Here in America Fromm became one of the founders of the William Alanson White Institute of Psychiatry, Psychoanalysis and Psychology. At different times he taught at Yale, Columbia, Bennington College, New York University, the University of Michigan, and Michigan State, as well as the National Autonomous University in Mexico City. He also maintained a psychoanalytic practice for more than forty-five years.

Fromm married his second wife in 1944 and moved to Mexico City seeking a more favorable climate for her health. Unfortunately, she died an untimely death in 1952. Fromm was later to marry for a third time, obviously a firm believer in the institution.

In the middle fifties Fromm joined the American Socialist Party and tried to formulate a progressive program for that party, without a great deal of success. However, he continued to be a firm believer in democratic socialism as the most humane and humanistic of political systems. Another prime political interest was the international peace movement, and he was a co-founder of SANE, an organization opposing both the atomic arms race and the war in Vietnam. He also was a vigorous supporter of Senator Eugene McCarthy during the 1968 presidential campaign. After Nixon’s election, however, Fromm withdrew from political activism. Nixon was surely the cause of many folks questioning their hope for mankind!

During his lifetime Fromm suffered two major bouts of tuberculosis and three heart attacks before finally succumbing to a fourth attack on March 18, 1980, in the Swiss village of Muralto, just five days shy of his 80th birthday.

Gerhard Knapp has said of Fromm that he “consistently devoted himself and his work to one single goal: the propagation of a great visionary hope for a better and more dignified life for all of humanity. [He] clung tenaciously to his unflagging faith in humanity’s potential for self-regeneration. This unbroken hope is the spiritual center of his life and his works.” Daniel Burston, author of The Legacy of Erich Fromm, has written: “[Fromm] was a man who cherished an abiding love for the values of humanistic religion and the Jewish tradition in which he was raised. [He] was nonetheless a committed atheist who regarded belief in a personal creator God as an historical anachronism.” Fromm described himself as “an atheistic mystic, a Socialist who is in opposition to most Socialist and Communist parties, a psychoanalyst who is a very unorthodox Freudian.”

Fromm was a very prolific writer, with hundreds of articles and almost two dozen books in English to his credit. The range of his subject matter was broad, including psychology and psychoanalysis, sociology, humanism, religion, ethics, Buddhism, Marxism, socialism, and foreign policy. The International Erich Fromm Society is currently completing the publication of all of his collected works in twelve volumes totaling 6,000 pages! How then to deal adequately tonight with that mass of material in our time remaining? Obviously we can’t, but let me just dip lightly into a few of his works to illustrate some of his concerns which I think still have decided relevance for the present.

Fromm’s first book in English was Escape From Freedom published in 1941, almost 60 years ago in the midst of World War II. The book opens with three provocative questions from the Talmud that I have found useful with numerous clients and classes:

  • If I am not for myself, who will be for me?
  • If I am for myself only, what am I?
  • If not now, when?

The first question, “If I am not for myself, who will be for me?” must surely be answered, “no one.” The second question, “If I am for myself only, what am I?” provides the balance between self interest and concern for others and suggests to me the answer, “lonely”, for persons completely self-preoccupied are not very enjoyable folks to be around. The third question provides the kicker, “If not now, when?” If we are not fully living now when do we plan to get around to it? Perhaps never!

In Escape From Freedom Fromm describes the growth of human freedom and self-awareness from the Middle Ages to modern times, but with a problematic result. Modern man, freed from pre-individualistic bonds of servitude and the old securities of stifling and outworn cosmologies, can find himself isolated, anxious, and alone. To escape that unpleasant condition one can easily enter into new dependencies and turn to authoritarian states and institutions for meaning and identity. In 1941 Fromm clearly put Nazism in that role, with hideous results in World War II and its aftermath. How distressing it is today to see a resurgence of Nazi motifs, whether in Europe or in northern Idaho or elsewhere! The alternative to abject dependency and compliance to authority, Fromm wrote, was to advance toward a positive freedom based upon the uniqueness and individuality of persons working in concert for the greater good of humankind. Enjoying and capitalizing upon diversity among persons and lifestyles is an ever-present challenge. (We can cite the current diversity deficit at the University of Utah as a prime example.)

Fromm’s second book, Man For Himself, published in 1947, is my personal favorite. My copy is dog-eared, heavily underlined throughout, and the source of many useful quotations. For instance, in discussing the existential realities of human existence, Fromm wrote what I deem to be a classic statement of the humanist stance:

There is only one solution to [the human condition]: for one to face the truth, to acknowledge his fundamental aloneness and solitude in a universe indifferent to his fate, to recognize that there is no power transcending him which can solve his problem for him. Man must accept the responsibility for himself and the fact that only by using his powers can he give meaning to his life. If he faces the truth without panic he will recognize that: there is no meaning to life except the meaning man gives his life by the unfolding of his powers, by living productively; and that only constant vigilance, activity, and effort can keep us from failing in the one task that matters: the full development of our powers within the limitations set by the laws of our existence. Only if he recognizes the human situation, the dichotomies inherent in his existence and his capacity to unfold his powers, will he be able to succeed in his task: to be himself and for himself and to achieve happiness by the full realization of those faculties which are peculiarly his: of reason, love, and productive work.

The key words here are “reason,” “love,” and “productive work” that Fromm elaborates upon throughout much of his writings; “reason,” “love”, and “productive work” as the basic ingredients for a fulfilling human life.

In describing humanistic ethics, Fromm wrote (and I’ve collected several quotations here):

Humanistic ethics is based on the principle that only man himself can determine the criterion for virtue and sin, and not an authority transcending him: “good” is what is good for man and “evil” what is detrimental to man; the sole criterion of ethical value being man’s welfare. Man indeed is the “measure of all things.” The humanistic position is that there is nothing higher and nothing more dignified than human existence.

…it is one of the characteristics of human nature that man finds his fulfillment and happiness only in relatedness to and solidarity with his fellow men.

Love is not a higher power which descends upon man nor a duty which is imposed upon him; it is his own power by which he relates himself to the world and makes it truly his.

Undoubtedly Fromm’s most popular book was a little volume entitled The Art of Loving. It was translated into 28 languages and had sold more than one and a half million copies in English alone by 1970. Reportedly, upon publication some librarians and booksellers thought they would have to keep the book behind the counter, a clear indication they hadn’t read it. The Art of Loving is a far cry from Alex Comfort’s The Joy of Sex, for instance, or many a tome currently available in libraries and bookstores. The Art of Loving quickly makes the point that loving is a very demanding human activity. The very first two sentences in Chapter I read: “Is love an art? Then it requires knowledge and effort.” Further, the mastery of an art requires that it be a matter of ultimate concern; “there must be nothing else in the world more important than the art.” What proportion of humankind do you imagine has loving as its ultimate concern? “…in spite of the deep-seated craving for love, almost everything else is considered to be more important than love: success, prestige, money, power; almost all our energy is used for the learning of how to achieve these aims, and almost none to the art of loving.” A substantive love, Fromm wrote, is not just a strong feeling: “It is a decision, it is a judgment, it is a promise. If love were only a feeling, there would be no basis for the promise to love each other forever. A feeling comes and it may go. How can I judge that it will stay forever, when my act does not involve judgment and decision?”

In an age of throw-away relationships with passing fancies those words sound rather quaint, don’t they? Somewhere in the back of my head I hear the lament of a popular song, “doesn’t anyone stay together anymore?” But not just judgment and decision are called for. Fromm cites other basic elements common to all forms of love: care, responsibility, respect and knowledge. These quotes:

  • Love is the active concern for the life and growth of that which we love. Where this active concern is lacking, there is no love.
  • Respect means the concern that the other person should grow and unfold as he is. Respect, thus, implies the absence of exploitation. I want the loved person to grow and unfold for his own sake, and in his own ways, and not for the purpose of serving me.
  • To respect a person is not possible without knowing him; care and responsibility would be blind if they were not guided by knowledge.

In a contrary mode, how often do we hear about couples who have a frenzied courtship and marry after only a few days or weeks? Or how often do we read about persons who kill the person they supposedly love but feel alienated from and are quoted as saying, “If I can’t have her, no one will!” Love, Fromm said, requires care, responsibility, respect and knowledge.

In a little volume entitled Psychoanalysis and Religion, Fromm spells out the differences between authoritarian and humanistic religion:

The essential element in authoritarian religion and in the authoritarian religious experience is the surrender to a power transcending man. The main virtue of this type of religion is obedience, its cardinal sin is disobedience. Just as the deity is conceived as omnipotent or omniscient, man is conceived as being powerless and insignificant. Only as he can gain grace or help from the deity can he feel strength.

Humanistic religion, on the other hand,

“is centered around man and his strength. Man must develop his power of reason in order to understand himself, his relationship to his fellow men and his position in the universe. He must recognize the truth, both with regard to his limitations and potentialities. He must develop his powers of love for others as well as for himself and experience the solidarity of all living beings. Man’s aim in humanistic religion is to achieve the greatest strength, not the greatest powerlessness; virtue is self-realization, not obedience. Faith is certainty of conviction based on one’s own experience of thought and feeling, not assent to propositions on credit of the proposer. The prevailing mood is that of joy, while the prevailing mood in authoritarian religion is that of sorrow and guilt.”

The last book that I want to mention, and one of the last that Fromm wrote, was To Have or to Be, published in 1976. It is an admirable book for anyone currently interested in simplicity movements, in de-escalating frantic lifestyles, and in curbing the perpetual accumulation of material possessions. (However, looking around the benches of this valley, it doesn’t look like many folks in our part of the world are much into simplicity!) It is interesting to note that To Have or to Be has consistently been more popular in Europe than here in the U.S.

Fromm was severely critical of the consumerism that drives our economy: depleting natural resources, increasing the gap between the rich and the poor, exploiting the resources and people of developing countries, and promoting a radical hedonism that breeds indifference to pervasive social needs. To quote Fromm: “…the selfishness the system generates makes leaders value personal success more highly than social responsibility… at the same time, the general public is also so selfishly concerned with their private affairs that they pay little attention to all that transcends the personal realm.” (We can think of the abysmally low voter turnout for elections in this country as just one of many examples.) The nagging question for us still today is: are we really happy, for all of our expansive homes, accumulating toys, and endless consumption? Have things really changed much from Fromm’s description of life twenty-five years ago? “…the observable data show most clearly that our kind of ‘pursuit of happiness’ does not produce well-being. We are a society of notoriously unhappy people; lonely, anxious, depressed, destructive, dependent; people who are glad when we have killed the time we were trying so hard to save.” And further, “…the need for speed and newness, which can only be satisfied by consumerism, reflects restlessness, the inner flight from oneself… looking for the next thing to do or the newest gadget to use is only a means for protecting oneself from being close to oneself or another person.” (Psychologists and psychiatrists are always messing with our heads!)

“Being,” in Fromm’s terms, is living simply with modest wants, with depth and vitality, deeply involved with caring communities, sensitive to the natural world around us, and mindful of the rightful place of all of earth’s people. The “having mode” in contemporary life might well be typified by a Wall Street Journal cartoon I saw recently, which pictured a man walking determinedly down the street, briefcase in hand, with a long stick arching from his back forward over his head and dangling a dollar bill in front of him. (The Wall Street Journal is an interesting place for such a cartoon!)

Well, there is no way I can do justice to the depth of Fromm’s writings in this piecemeal fashion, and there is so much more of his work that I would enjoy discussing, but time is limited. I would invite you to consider his writings, either again or perhaps for the first time. There are significant books that I have not even mentioned and topics that I imagine you would find both provocative and enlightening. Fortunately, virtually all of Fromm’s books are still in print, and I have a sheet available listing all of his published works in English. I commend them to you for a consciousness-raising experience. The sheet also cites the web address of the International Erich Fromm Society for those of you into cyber exploration.

Let me add this one postscript (and speaking of consciousness raising). Fromm wrote in an era when it was the norm to use the generic term “man” to refer to all humans and “he” as the accompanying personal pronoun. You heard that usage in the quotations, and you may well have winced a bit when you heard them, especially if you are a woman. Time has moved on since Fromm last wrote, and feminists have appropriately helped us to be more sensitive in our language usage. Our language is still cumbersome on the point, but gender equity demands that we speak and write without disenfranchising either gender. On the other hand, perhaps fair play would now suggest we typically use “woman” in a generic sense, and, of course, that includes “man”!

–Hugh Gillilan

Annual Membership Meeting and Banquet

March 2000

“And a grand time was had by all,” summarizes the attitudes of those who attended the annual Business Meeting and Banquet on February 10th at Distinctive Catering. Following a fine dinner, reports from the Board of Directors showed that our chapter continues to grow, survived the Y2K issues, and has a viable presence on the Internet and an increasing public presence.

Our membership grew to nearly 150 this past year. Our average meeting attendance is around 50 and we have had a few meetings with more than 100 attendees. The Discussion Group attracts 15-20 people every month.

Our website now contains all of the issues of The Utah Humanist dating back to 1994. Former journal editor/publisher Bob Green donated diskettes containing his work in establishing our newsletter as one of the better humanist publications.

Our chapter is co-sponsoring a junior and senior high school science fair being held at Weber State University. The Board approved $300 from our conference fund specifically for the Social and Behavioral Sciences category at the fair. This puts our proverbial money where our mouth is in supporting science. It also gives us positive publicity in the program for the fair.

The quorum elected all of the nominees for the Board. Tonya Evans was nominated from the floor and also elected to a position on the Board. Thanks were expressed by the group to outgoing Board members Brenda Wright and Earl Wunderli for their service.

The finale to a memorable evening was an interactive drum concert by George Grant. Everyone should express thanks to Rolf Kay for again organizing a great event!

 

Failed United States’ Leadership On Human Rights

June 2000

The half-century of United States’ dominance of international politics has often been characterized as “benevolent” hegemony, or hegemonial “leadership” (Nye, Bound to Lead; Lundestad, East, West, North, South). The underlying assumption is that, contrary to the typical hegemonial power, the United States has used its predominance in the system to promote “public goods,” such as democracy, human rights and free trade (Brilmayer, American Hegemony). The selective pattern of participation and advocacy demonstrated by the United States on human rights, however, reveals why it is widely perceived as the primary obstacle to the progress and success of human rights on the cusp of the new millennium.

Human rights emerged as a theoretical concept and political tool in the aftermath of World War II, as numerous international treaties established the legal responsibility of sovereign states to honor their citizens’ inherent human rights. The Universal Declaration of Human Rights (1948) and the Convention on the Prevention and Punishment of Genocide demarcated the beginning of this new era. Fifty years of “promoting” human rights has produced a plethora of human rights treaties across the political landscape.

The Cold War, with its heavy ideological underpinnings, obscured an interesting pattern: while the United States was invariably the first to advocate human rights treaties, it was one of the last to ratify them. Although presidents have signed many of the “core” treaties, the U.S. Senate has, more often than not, refused to ratify them. The Genocide Convention, for example, was ratified only during the Reagan Administration, after a forty-year delay. The United States has joined only one of the treaties based on the Universal Declaration (the Covenant on Civil and Political Rights), and then only during the Bush Administration. The Convention on the Elimination of All Forms of Discrimination Against Women (CEDAW), hailed as the “international bill of rights” for women, has been stuck in the Senate Foreign Relations Committee for years. The only two countries choosing not to ratify the Convention on the Rights of the Child are Somalia and the United States. The United States conspicuously refused to join in the recent and highly publicized Landmines Treaty.

The turn of the century has witnessed the international community’s dramatic transition from the “promotion” of human rights to their “protection,” creating the institutions and authority necessary to ensure that citizens’ rights are respected by sovereign states. Not only have significant and successful courts emerged on the regional level (in Western Europe and Latin America), but the Security Council also created international tribunals for addressing atrocities in the former Yugoslavia and Rwanda. These courts stand as major achievements in international enforcement. In fact, the United States strongly supported the realization of all these courts and firmly advocated the need for an international criminal court throughout the Cold War, until it became a plausible reality.

International Criminal Court

With the end of the Cold War, the United Nations General Assembly requested a draft of an International Criminal Court (ICC). A designated Preparatory Committee reworked its original form (1995-98), and the resulting Rome Conference (15 June to 17 July of 1998) met to consider the draft document, which contained 116 articles in 13 sections, including nearly 1,400 brackets (indicating unresolved content or language). The “Committee of the Whole” divided itself into 13 subgroups to address the relevant problems in each section.

The most dynamic political and legal presence at the Rome Conference was the “Like-Minded Group.” This drew upon middling and small states (primarily from Europe and Latin America) as well as the “Coalition for the Establishment of an International Criminal Court” (CICC), an energetic and effective alliance of Non-Governmental Organizations. (Note: Plugging in INTERNATIONAL CRIMINAL COURT for a web search will bring up both the “Campaign for the Establishment of the ICC” and the “Coalition for the ICC” sites which provide general background and the latest ICC information as well as relevant commentary from Human Rights Watch and Amnesty International). Using the Coalition to Ban Landmines as its model, the CICC proved to be a prime example of the impact that can be made upon international politics by “civil society” (defined in contemporary scholarship as globally-organized citizen groups actively supporting United Nations’ goals). The “Like-Minded Group” envisioned an independent court with an independent prosecutor and an ICC with universal jurisdiction over the four most serious international crimes.

Four “Core” Crimes

The first of these international crimes was readily agreed upon. The crime of genocide is defined in its treaty as the deliberate attempt to destroy “in whole or in part” members of a religious, ethnic or racial group. Neither “reasons of state” nor the orders of superiors provide a defense. The ICC’s automatic and universal jurisdiction over genocide gained swift approval without significant dissent.

At Nuremberg, “crimes against humanity” were prosecuted only in connection with other crimes. The International Tribunal for the Former Yugoslavia broke new ground in advancing their legal status. Under the ICC’s mandate, “widespread or systematic” attacks against civilians, including murder, torture, “disappearances,” rape, and forcible transfers were denoted as “crimes against humanity.” Careful attention was given at the Rome Conference to establish that deliberate policies of rape and sexual slavery were included in the definition. Moreover, this crime was not confined (as it had been at Nuremberg) to the context of international war, but also covered such acts when committed within a civil conflict or even during peacetime. This last factor alone marks a significant inroad on state sovereignty on behalf of human rights.

Undoubtedly the oddest outcome of the Rome Conference lies in the status of the crime of aggression (also identified and punished at Nuremberg). The United Nations had repeatedly failed in its historical quest to find a consensus definition for aggression. The “Like-Minded Group” strongly advocated for its inclusion in the ICC’s mandate. The permanent members of the Security Council, however, fought vigorously for the preeminence of Article 39 of the UN Charter, which clarifies that the Security Council has sole authority to identify aggression and authorize a response on behalf of the international community. (Hence, aggression is whatever the Security Council says it is under the UN Charter.) The Rome Conference crafted a curious compromise: aggression became one of the four “core” crimes under the ICC’s authority, although the ICC could never exercise this authority until seven-eighths of its adherents approved of a working definition.

War crimes constitute virtually the oldest area of customary international law, protecting the treatment of prisoners-of-war and civilians, and prohibiting certain weapons, among other features. (The proposal from several Non-Aligned states to add nuclear weapons to the list of illegal weapons alarmed the United States, which successfully defeated the effort.) The Clinton Administration, which had signaled its support for the ICC as recently as four months before the Rome Conference, dug in its heels on war crimes. Could U.S. soldiers be dragged before a politically motivated prosecutor for actions undertaken even in fulfillment of United Nations-sponsored humanitarian efforts? What the Clinton Administration sought but failed to get was essentially an iron-clad guarantee that no U.S. soldier would end up on trial by the ICC for alleged war crimes. Although deliberate intent is required of war crimes acts (so that an accidental bombing of civilians, as happened in Kosovo, would not create legal vulnerability), this issue sparked such fierce opposition by the Pentagon that it swayed the Clinton Administration into absolute opposition. The United States then floated the idea that it might join the ICC if it could join with an exemption on war crimes. This notion drew vehement denunciation by the “Like-Minded Group” and others. In one of the several attempts to satisfy the United States, a seven-year “opt-out” provision for war crimes was inserted, permitting a state to join the ICC while remaining immune from its war crimes authority for seven years.

Jurisdiction

The Security Council may request that the ICC prosecutor take up a case, provided that 9 of its 15 members, including all of the permanent members, agree. The United States argued that this should be the only process by which the prosecutor could consider legal action. Of course, this would in effect permit the permanent members of the Security Council to veto any potential actions against themselves and their allies.

The Rome Conference, despite U.S. opposition, added two more routes to activate the prosecutor. Member states, either the territorial state where the alleged crime was committed or the nationality state of the accused violator, could request the prosecutor’s investigation, but only under restrictive and complicated conditions. At U.S. insistence, territorial or nationality states could invoke the prosecutor’s attention only by following the format of “complementarity.”

“Complementarity” means the prosecutor would request the relevant state to do its own investigation and evaluation of the merits of the charge and, if appropriate, try the individual in its own domestic court system. The ICC would take up jurisdiction only if the state proved either “unable” or “unwilling” to carry out a legitimate investigation and prosecution, as judged by a three-person panel of the ICC (serving as a pre-trial chamber). When and if the ICC takes a case following the “complementarity” procedure, the prosecutor is limited to “requesting” (not compelling) witnesses and “seeking” (not demanding) state cooperation. Moreover, the Security Council may require the ICC to suspend any of its investigations or trials for 12 months, an option that can be renewed indefinitely.

United States Rejection of ICC

France and the United Kingdom had been noncommittal about the ICC until the series of compromises on “complementarity” had been worked out. At that point, they joined virtually the rest of the European Union in supporting the ICC. David Scheffer, the U.S. Ambassador to the Rome Conference, squeezed out “complementarity” and other compromises by implying that the United States would join if these points were satisfied. On July 17th, 120 nation-states voted to accept what now became the Rome Treaty creating the ICC, while the United States joined Algeria, Iran, Iraq, Israel, Libya, North Korea, the PRC, and Sudan in opposing the ICC. (There were also 21 abstentions.)

Ambassador Scheffer explained that the Rome Conference had been hectic, pointing out the 1,400 unresolved provisions and that the final draft was received at 2 a.m. on July 17th. This foreclosed a careful, line-by-line reading of its provisions and left the United States with an unacceptable take-it-or-leave-it proposition. Furthermore, the notion of “complementarity” left open the possibility that two members of the ICC pre-trial chamber could overrule (by 2-1) the United States judicial system on such a vital matter. Noting that the United States has the largest deployment of military personnel in the history of the world, often to the most troubled parts of the world at the very request of the United Nations, he argued that it would not be in the U.S. national interest to endorse the Rome Treaty.

The Preparatory Committee continued to meet in the following months to work out the treaty’s details, with active U.S. participation. In August of 1999, the United States made it known to the Preparatory Committee that there was still time for both sides to reconsider: would the Committee entertain a provision that would secure U.S. support for the treaty? It would. The United States then proposed an exemption from the war crimes mandate for those military actions committed as “official acts.” In other words, military actions undertaken at the express command of the appropriate military hierarchy could not constitute war crimes. Voicing the opinion of virtually all the participants, one disgruntled observer noted that this would have reversed the protection of human rights to its pre-Nuremberg status.

The United States presently has little credibility with the CICC or with states supportive of the ICC. Senator Jesse Helms, the powerful chair of the Senate Foreign Relations Committee, has forcefully and repeatedly declared the Rome Treaty “dead in the water.” It may be dead on arrival in the United States, but the treaty presently has over 90 signatures, and many anticipate that by December of 2001 it will achieve the 60 ratifications necessary to bring it into force.

Ironically, many liberal supporters have not only lost their enthusiasm for the ICC but have moved, in some cases, to actively oppose the Rome Treaty, based on two arguments. First, they allege that the ICC was unconscionably weakened to entice the United States to join. Not only did the United States spurn the ICC, but it managed to mangle the court’s intended judicial capacity through “complementarity” and other compromises that in effect crippled it. Second, many leftist critics fear that the ICC, by being seen as a visible and viable means of punishing the worst international criminals, may provide a convenient excuse for the international community to avoid humanitarian intervention during a crisis. In other words, the ICC may end up functioning as an excuse or substitute for actually taking action to stop horrific bloodshed in difficult cases like those in the former Yugoslavia and Rwanda.

The United States’ Failed Leadership on Human Rights

To many of its critics, the United States’ posture opposing the ICC signifies that the “emperor has no clothes” (Amnesty International, Annual Report; Human Rights Watch, World Report). Once the unquestioned international champion of human rights, the United States regressed into the most prominent Western recalcitrant on human rights through its refusal to join human rights treaties, and then retreated further from its original role to become the chief obstacle to the supreme achievement of human rights: an International Criminal Court signifying the genuine “protection” of human rights.

–Nancy Haanstad, PhD
Weber State University

 

In Appreciation: Erich Fromm

May 2000

When I made a commitment to give this presentation some months ago I had no idea how timely it would be given the current activities of Fromm devotees around the world. March 23, 2000, marked the centenary of Erich Fromm’s birth, March 23, 1900. As it turns out the centenary is being observed by the publication of numerous books and articles in Fromm’s honor, and various lectures and conferences are being held as well.

In an audience such as this one I would expect that there would be a number of favorite humanistically oriented authors, such as Isaac Asimov, E.O. Wilson, Robert Ingersoll, Corliss Lamont, Paul Kurtz, Bertrand Russell, Carl Sagan, and, I would hope, Erich Fromm. I certainly enjoy all of these authors, but Fromm holds a special place in my life, for he, more than the others, was very much a mentor for me as I was making my philosophical transition from traditional Christianity to humanism and my career evolution from minister to psychologist. I never conversed with Fromm in person, although I did have the pleasure of hearing him speak once at the University of Utah years ago. What I did have the opportunity to do was to avidly read his books once I discovered them, especially from the late 1950s to the time of his death in 1980.

My assumption is that Fromm may be fading into obscurity, particularly in this country and with younger persons because the American attention span is so short. I think that’s unfortunate given his status in the evolution of humanism over the last 60 years. Gerhard Knapp, for instance, has described Fromm as “one of the most influential humanists of this century.” But I express my appreciation for Fromm tonight not just for his personal contribution to me or for his historical contribution but also because I heartily believe his writings are still very relevant as we move into the challenges of the 21st century.

Before dipping into just a few of his books, let me quickly sketch in a bit of the Fromm biography. He was born, as indicated earlier, on March 23, 1900, in Frankfurt, Germany, the only child of Orthodox Jewish parents. Fromm later described his mother as overprotective, his father as distant, and himself as an “unbearable, neurotic child.” And further: “being the only child of two overly anxious parents did not, of course, have an altogether positive effect on my development, but over the years I’ve done what I could to repair that damage.” (It has been said that those of us in the mental health profession often choose that line of work to cure our parents, or ourselves!)

The Fromm family was steeped in Jewish tradition, and the young Fromm was an avid scholar of the Talmud and the Old Testament, particularly the prophets Isaiah, Amos, and Hosea, with their emphasis on justice, righteousness, and universal peace, motifs which would echo through all of Fromm’s later writings. In 1926, however, at the age of 26, he officially abandoned his Jewish faith. I was interested to note that was about the same age at which I officially abandoned my Methodist affiliations.

Fromm’s formal education focused on psychology, philosophy, sociology, and later, psychoanalysis. The major intellectual influences for him were Sigmund Freud and Karl Marx although Fromm was eventually to be a revisionist of both of these men.

In 1926 Fromm married a woman ten years his senior who had been his psychoanalyst, Frieda Fromm-Reichmann, but the marriage lasted only four years. (There are many good reasons not to marry your therapist!) Nonetheless, Fromm and Frieda Fromm-Reichmann continued to be friends and professional collaborators, and she had her own distinguished career as an author and psychotherapist.

In 1933 Fromm left Germany because of the rising tide of Nazism, just one of millions who fled from or perished at the hands of Hitler’s legions. In addition to the horrific and incomprehensible genocide of those days, how can one really imagine the incalculable loss to Germany and the occupied countries of the intellectuals, professionals, artisans, and myriad other talented persons who either died or fled to other countries, much to the enrichment of their adopted countries?

Here in America Fromm became one of the founders of the William Alanson White Institute of Psychiatry, Psychoanalysis and Psychology. At different times he taught at Yale, Columbia, Bennington College, New York University, the University of Michigan, and Michigan State, as well as the National Autonomous University in Mexico City. He also maintained a psychoanalytic practice for more than forty-five years.

Fromm married his second wife in 1944 and moved to Mexico City seeking a more favorable climate for her health. Unfortunately, she died an untimely death in 1952. Fromm was later to marry for a third time, obviously a firm believer in the institution.

In the middle fifties Fromm joined the American Socialist Party and tried to formulate a progressive program for that party, without a great deal of success. However, he continued to be a firm believer in democratic socialism as the most humane and humanistic of political systems. Another prime political interest was the international peace movement, and he was a co-founder of SANE, an organization opposing both the atomic arms race and the war in Vietnam. He also was a vigorous supporter of Senator Eugene McCarthy during the 1968 presidential campaign. After Nixon’s election, however, Fromm withdrew from political activism. Nixon was surely the cause of many folks questioning their hope for mankind!

During his lifetime Fromm suffered two major bouts of tuberculosis and three heart attacks before finally succumbing to a fourth attack on March 18, 1980, in the Swiss village of Muralto, just five days shy of his 80th birthday.

Gerhard Knapp has said of Fromm that he “consistently devoted himself and work to one single goal: the propagation of a great visionary hope for a better and more dignified life for all of humanity. [He] clung tenaciously to his unflagging faith in humanity’s potential for self-regeneration. This unbroken hope is the spiritual center of his life and his works.” Daniel Burston, author of The Legacy of Erich Fromm, has written: “[Fromm] was a man who cherished an abiding love for the values of humanistic religion and the Jewish tradition in which he was raised. [He] was nonetheless a committed atheist who regarded belief in a personal creator God as an historical anachronism.” Fromm described himself as “an atheistic mystic, a Socialist who is in opposition to most Socialist and Communist parties, a psychoanalyst who is a very unorthodox Freudian.”

Fromm was a very prolific writer, with hundreds of articles and almost two dozen books in English to his credit. The range of his subject matter was broad, including psychology and psychoanalysis, sociology, humanism, religion, ethics, Buddhism, Marxism, socialism, and foreign policy. The International Erich Fromm Society is currently completing the publication of all of his collected works, twelve volumes and 6,000 pages in length! How then to deal adequately tonight with that mass of material in our time remaining? Obviously we can’t, but let me just dip lightly into a few of his works to illustrate some of his concerns which I think still have decided relevance for the present.

Fromm’s first book in English was Escape From Freedom published in 1941, almost 60 years ago in the midst of World War II. The book opens with three provocative questions from the Talmud that I have found useful with numerous clients and classes:

  • If I am not for myself, who will be for me?
  • If I am for myself only, what am I?
  • If not now, when?

The first question, “If I am not for myself, who will be for me?” must surely be answered “no one.” The second question, “If I am for myself only, what am I?” provides the balance between self-interest and concern for others and suggests to me the answer “lonely,” for persons completely self-preoccupied are not very enjoyable folks to be around. The third question provides the kicker: “If not now, when?” If we are not fully living now, when do we plan to get around to it? Perhaps never!

In Escape From Freedom Fromm describes the growth of human freedom and self-awareness from the Middle Ages to modern times, but with a problematic result. Modern man, freed from pre-individualistic bonds of servitude and the old securities of stifling and outworn cosmologies, can find himself isolated, anxious, and alone. To escape that unpleasant condition one can easily enter into new dependencies and turn to authoritarian states and institutions for meaning and identity. In 1941 Fromm clearly put Nazism in that role, with hideous results in World War II and its aftermath. How distressing it is today to see a resurgence of Nazi motifs, whether in Europe or in northern Idaho or elsewhere! The alternative to abject dependency and compliance to authority, Fromm wrote, was to advance toward a positive freedom based upon the uniqueness and individuality of persons working in concert for the greater good of humankind. Enjoying and capitalizing upon diversity among persons and lifestyles remains an ever-present challenge. (We can cite the current diversity deficit at the University of Utah as a prime example.)

Fromm’s second book, Man For Himself, published in 1947, is my personal favorite. My copy is dog-eared, heavily underlined throughout, and the source of many useful quotations. For instance, in discussing the existential realities of human existence, Fromm wrote what I deem to be a classic statement of the humanist stance:

There is only one solution to [the human condition]: for one to face the truth, to acknowledge his fundamental aloneness and solitude in a universe indifferent to his fate, to recognize that there is no power transcending him which can solve his problem for him. Man must accept the responsibility for himself and the fact that only by using his powers can he give meaning to his life. If he faces the truth without panic he will recognize that there is no meaning to life except the meaning man gives his life by the unfolding of his powers, by living productively; and that only constant vigilance, activity, and effort can keep us from failing in the one task that matters: the full development of our powers within the limitations set by the laws of our existence. Only if he recognizes the human situation, the dichotomies inherent in his existence and his capacity to unfold his powers, will he be able to succeed in his task: to be himself and for himself and to achieve happiness by the full realization of those faculties which are peculiarly his, of reason, love, and productive work.

The key words here are “reason,” “love,” and “productive work,” which Fromm elaborates upon throughout much of his writing: “reason,” “love,” and “productive work” as the basic ingredients for a fulfilling human life.

In describing humanistic ethics, Fromm wrote (and I’ve collected several quotations here):

Humanistic ethics is based on the principle that only man himself can determine the criterion for virtue and sin, and not an authority transcending him: “good” is what is good for man and “evil” what is detrimental to man; the sole criterion of ethical value being man’s welfare. Man indeed is the “measure of all things.” The humanistic position is that there is nothing higher and nothing more dignified than human existence.

…it is one of the characteristics of human nature that man finds his fulfillment and happiness only in relatedness to and solidarity with his fellow men.

Love is not a higher power which descends upon man nor a duty which is imposed upon him; it is his own power by which he relates himself to the world and makes it truly his.

Undoubtedly Fromm’s most popular book was a little volume entitled The Art of Loving. It was translated into 28 languages and had sold more than one and a half million copies in English alone by 1970. Reportedly, upon publication some librarians and booksellers thought they would have to keep the book behind the counter, a clear indication they hadn’t read it. The Art of Loving is a far cry from Alex Comfort’s The Joy of Sex, for instance, or many a tome currently available in libraries and bookstores. The Art of Loving quickly makes the point that loving is a very demanding human activity. The very first two sentences of Chapter I read: “Is love an art? Then it requires knowledge and effort.” Further, the mastery of an art requires that it be a matter of ultimate concern; “there must be nothing else in the world more important than the art.” What proportion of humankind do you imagine has loving as its ultimate concern? “…in spite of the deep-seated craving for love, almost everything else is considered to be more important than love: success, prestige, money, power; almost all our energy is used for the learning of how to achieve these aims, and almost none to the art of loving.” A substantive love, Fromm wrote, is not just a strong feeling: “It is a decision, it is a judgment, it is a promise. If love were only a feeling, there would be no basis for the promise to love each other forever. A feeling comes and it may go. How can I judge that it will stay forever, when my act does not involve judgment and decision?”

In an age of throwaway relationships and passing fancies those words sound rather quaint, don’t they? Somewhere in the back of my head I hear the lament of a popular song, “doesn’t anyone stay together anymore?” But not just judgment and decision are called for. Fromm cites other basic elements common to all forms of love: care, responsibility, respect, and knowledge. A few quotations:

  • Love is the active concern for the life and growth of that which we love. Where this active concern is lacking, there is no love.
  • Respect means the concern that the other person should grow and unfold as he is. Respect, thus, implies the absence of exploitation. I want the loved person to grow and unfold for his own sake, and in his own ways, and not for the purpose of serving me.
  • To respect a person is not possible without knowing him; care and responsibility would be blind if they were not guided by knowledge.

In a contrary mode, how often do we hear about couples who have a frenzied courtship and marry after only a few days or weeks? Or how often do we read about persons who kill the person they supposedly love but feel alienated from and are quoted as saying, “If I can’t have her, no one will!” Love, Fromm said, requires care, responsibility, respect and knowledge.

In a little volume entitled Psychoanalysis and Religion, Fromm spells out the differences between authoritarian and humanistic religion:

The essential element in authoritarian religion and in the authoritarian religious experience is the surrender to a power transcending man. The main virtue of this type of religion is obedience, its cardinal sin is disobedience. Just as the deity is conceived as omnipotent or omniscient, man is conceived as being powerless and insignificant. Only as he can gain grace or help from the deity can he feel strength.

Humanistic religion, on the other hand,

is centered around man and his strength. Man must develop his power of reason in order to understand himself, his relationship to his fellow men and his position in the universe. He must recognize the truth, both with regard to his limitations and potentialities. He must develop his powers of love for others as well as for himself and experience the solidarity of all living beings. Man’s aim in humanistic religion is to achieve the greatest strength, not the greatest powerlessness; virtue is self-realization, not obedience. Faith is certainty of conviction based on one’s own experience of thought and feeling, not assent to propositions on credit of the proposer. The prevailing mood is that of joy, while the prevailing mood in authoritarian religion is that of sorrow and guilt.

The last book that I want to mention, and one of the last that Fromm wrote, was To Have or to Be, published in 1976. It is an admirable book for anyone currently interested in simplicity movements, in de-escalating frantic lifestyles, and in resisting the perpetual accumulation of material possessions. (However, looking around the benches of this valley, it doesn’t look like many folks in our part of the world are much into simplicity!) It is interesting to note that To Have or to Be has consistently been more popular in Europe than here in the U.S.

Fromm was severely critical of the consumerism that drives our economy, depleting natural resources, increasing the gap between the rich and the poor, exploiting the resources and people of developing countries, and promoting a radical hedonism that breeds indifference to pervasive social needs. To quote Fromm: “…the selfishness the system generates makes leaders value personal success more highly than social responsibility… at the same time, the general public is also so selfishly concerned with their private affairs that they pay little attention to all that transcends the personal realm.” (We can think of the abysmally low voter turnout for elections in this country as just one of many examples.) The nagging question for us still today is: are we really happy, for all of our expansive homes, accumulating toys, and endless consumption? Have things really changed much from Fromm’s description of life twenty-five years ago? “…the observable data show most clearly that our kind of ‘pursuit of happiness’ does not produce well-being. We are a society of notoriously unhappy people; lonely, anxious, depressed, destructive, dependent; people who are glad when we have killed the time we were trying so hard to save.” And further: “…the need for speed and newness, which can only be satisfied by consumerism, reflects restlessness, the inner flight from oneself… looking for the next thing to do or the newest gadget to use is only a means for protecting oneself from being close to oneself or another person.” (Psychologists and psychiatrists are always messing with our heads!)

“Being,” in Fromm’s terms, is living simply with modest wants, with depth and vitality, deeply involved with caring communities, sensitive to the natural world around us, and mindful of the rightful place of all of earth’s people. The “having mode” in contemporary life might well be typified by a Wall Street Journal cartoon I saw recently, which pictured a man walking determinedly down the street, briefcase in hand, with a long stick arching from his back forward over his head and dangling a dollar bill in front of him. (The Wall Street Journal is an interesting place for such a cartoon!)

Well, there is no way I can do justice to the depth of Fromm’s writings in this piecemeal fashion, and there is so much more of his work that I would enjoy discussing, but time is limited. I would invite you to consider his writings, either again or perhaps for the first time. There are significant books that I have not even mentioned, and topics that I imagine you would find both provocative and enlightening. Fortunately, virtually all of Fromm’s books are still in print, and I have a sheet available listing all of his published works in English. I commend them to you for a consciousness-raising experience. The sheet also cites the web address of the International Erich Fromm Society for those of you into cyber exploration.

Let me add this one postscript (speaking of consciousness raising). Fromm wrote in an era when it was the norm to use the generic term “man” to refer to all humans, with “he” as the accompanying personal pronoun. You heard that usage in the quotations, and you may well have winced a bit, especially if you are a woman. Time has moved on since Fromm last wrote, and feminists have appropriately helped us to be more sensitive in our language usage. Our language is still cumbersome on the point, but gender equity demands that we speak and write without disenfranchising either gender. On the other hand, perhaps fair play would now suggest we typically use “woman” in the generic sense, and, of course, that includes “man”!

–Hugh Gillilan

 

Freedom Of and From Religion in Utah

July 2000

Brian Barnard, JD, spoke about religious freedom and freedom from religion in the state of Utah. Mr. Barnard has litigated many suits in Utah dealing with separation of church and state issues. His presentation rested on two main premises. First, separation of church and state is not a goal in and of itself; rather, the separation is necessary to protect a more important right found in the First Amendment: the free exercise of religion. People must be free to practice their own religions (or lack thereof) without government interference. When government supports one religion to the exclusion of others, or when government supports religion to the exclusion of non-religion, the people’s right to free exercise of their beliefs is in jeopardy.

Second, people and officials often falsely claim to be in favor of government involvement in religion and in support of religious ideals generally. In fact, they support only their own religious principles.

To illustrate the first premise, Mr. Barnard recalled past litigation with the city of St. George concerning the lighting of the LDS temple. The city provided free electricity to the facility, claiming that it was a tourist attraction that benefited the whole city. The case never came to a conclusion in the court system because the LDS church decided that the publicity being generated was undesirable and had the power company install a meter and begin charging for the electricity used.

The second premise was illustrated by court proceedings involving Ogden City and a stone monolith in front of the Municipal Building inscribed with the biblical Ten Commandments. Mr. Barnard represented a religious organization known as the Summum, who have “Seven Aphorisms” as a statement of their core beliefs. They were not allowed to place a stone monument with these principles inscribed in the same park. While the Summum were denied in court, other municipalities have since been reluctant to post Moses’ injunctions. A second example of this point was a case involving prayer at the beginning of Murray City Council meetings. While the council claimed to be open to all prayers, they refused to allow a prayer that included, “Our Mother in Heaven, please give these government leaders enough wisdom to see the need for separation of church and state so that they will stop having prayers before government meetings.”

Unfortunately for us (but perhaps fortunately for Mr. Barnard and his family) there is plenty of unfinished business in Utah and the rest of the country in making sure that the religious freedom portion of the First Amendment is enforced.

–Wayne Wilson

 

From The Halls Of Congress To The Halls Of Ivy

January 2000

Some 46 years ago, shortly after graduating from the University of Utah with a degree in journalism, I went to work for the National Wool Growers Association, as assistant editor of their trade publication. My starting salary was $250 a month. I knew little about sheep, although I had spent summers during the war on a family farm in Lava Hot Springs, Idaho. There I learned to sleep while riding Shorty, who also slept, as together we operated the derrick during haying season.

Some five years later, hoping (but failing) to increase my salary to $450 a month, I learned of an opening as assistant farm program director at KSL Radio. I applied, got the job, and received a salary increase to $550 a month. I had taken speech classes at the U from Louise Hill Howe and had always been interested in broadcasting. About a year later, Von Orme, who was the farm director, left, and I became Director of Agricultural and Economic Programming, a fancy title. As Farm Director I traveled the state getting interviews and covering various agricultural events for our radio programs, which ran as part of John Barlowe’s morning show and also in our noontime programming.

Six years later Arch Madsen selected me to take over both news operations, at KSL radio and television. Until then they had operated as separate divisions of the corporation. Not too long after that, I was greatly pleased that Florien Wineriter became part of our news team. I can’t recall the exact year. When you get to my stage in life, you begin to forget a few things. You know you’re growing older when everything hurts, and what doesn’t hurt doesn’t work; when you get winded playing chess; when your children begin to look middle-aged; when you know all the answers but nobody asks the questions; when your favorite part of the newspaper is “25 years ago today”; when you sit in a rocking chair and can’t get it going; when your knees buckle but your belt won’t; when dialing long distance wears you out; when your back goes out more than you do; when you burn the midnight oil after 9 PM; when you sink your teeth into a steak and they stay there; or when the best part of the day is over when the alarm goes off.

Television was still rather young in 1964. Videotape was not yet available. We shot everything on film, black and white film, and most of what we shot was with hand-held cameras and not sound. That soon changed. Color became available. But video cameras and satellite transmission were still a few years off.

Although I knew little of television, I was greatly supported by Arch Madsen and had the safety net of Nourse, Welti, and James. For Arch had lured Paul James and Bob Welti from Channel Four to Channel Five at the same time he appointed me as their boss.

Working for the LDS Church, I was often asked if I was constantly told what stories we could and could not cover and how we should cover various stories. The editor of the New York Sun at the turn of this ending century said, “We are tools and vassals of the rich men behind the scenes. We are the jumping jacks; they pull the strings and we dance. Our talents, our possibilities and our lives are all the property of other men. We are intellectual prostitutes.” That was the 1900 opinion of editor John Swinton. But I found no such “controls,” despite what some might have forecast. I recall, for instance, our ongoing coverage by Louise Degn of the planned (and then the actual) demolition of the Coalville Tabernacle. I am certain that some of the “powers-that-be” were not pleased with our hard-hitting coverage of that story. To add to my concerns about that story: I had to decide whether or not to run the film we had that included my mother and my wife demonstrating in front of the LDS Church offices against destruction of the Coalville landmark. We did run the film.

On another occasion, a well-known Salt Lake advertising executive came barging into my office (receiving no satisfaction), then went to Arch Madsen's office (again, no satisfaction), wanting us to kill a story we were ready to air about contamination of milk in Delta, Utah. He represented Utah dairy producers and was fearful we would scare everyone so they wouldn't drink milk. We ran the story; milk sales dropped in certain parts of the state but quickly recovered when we ran subsequent stories about the problem being resolved.

This was then, and still is, I think, a relatively good television news market. That is, competition is keen. The former owner of Channel Two, George Hatch, was never willing to roll over and play dead against the heavyweight ownership of KSL. That was terrific for those of us who worked at both stations. We usually got what we wanted in equipment and in good people. Channel Four, with outside ownership, was not then, but is now, I think, willing to compete.

Thinking that one should change jobs about every seven years or so, when the opportunity came to go to Washington, D.C., in 1972, I "jumped" at the chance, with the urging of, and strong support from, my wife. Wes Vernon, who was earlier our political specialist and primary radio news anchor at KSL and had become Bonneville International's first bureau chief in D.C., decided to go to work for CBS Radio's owned and operated stations. So Arch asked me if I wanted to go to Washington. Thinking that a journalist should have tombstone recognition for having worked at the seat of world power, I did, as mentioned, go.

I arrived in Washington two weeks before that "second-rate burglary" at the Watergate Hotel. That was how the event was first described to me by Kem Gardner, who was then Senator Frank Moss's administrative assistant. Of course the break-in turned out to be much more than that, and did, in my opinion, have a significant impact on journalists and journalism in the decades since.

My first visit to the galleries on the day I arrived in Washington allowed me to hear the great (but about to depart the Congress) Senator Mike Mansfield of Montana, who was saying something about crash landings and not being involved if you are not in on the takeoff. It all made sense. To be sure, "sense" was not always the outcome of a gallery stay. One does, I think, have to agree with a Russian visitor to the spectator gallery in the House: "Congress is so strange," he said. "A man gets up and speaks and says nothing. Nobody listens, and then everybody disagrees." That, I fear, is often the case.

My job in Washington was to cover the congressional delegations from the states, and surrounding states, where Bonneville owned 14 radio and television stations. It was a daunting task. Quickly I learned that hearing the "Gentleman from California" called ill-informed and totally out of touch, or much stronger words, meant little, as the combatants would leave the hall arm in arm for the nearest watering hole. Coming from Utah in the early '70s, I was used to a high degree of civility and had to learn that although words were important in the halls of Congress, often certain ones had no meaning.

Early on, while visiting with Congressman Richard Bolling, a Democrat from Kansas City, I was schooled in important definitions. "Politics," he said, "is the art of the possible. Compromise," he went on, "is not a dirty word." Bolling, a protégé of Sam Rayburn's, was very astute and very helpful, as were many others. I wonder how many of our Utah politicians, locally and nationally, believe in the "art" of compromise.

Many reporters and perhaps a majority of voters make it difficult nowadays for politicians to indicate there may be two sides to issues, and that sometimes a middle ground is the only one possible, or even the best one. Extremism seems to have won the day in much of the media, especially talk radio. One wonders if many who call in have it all together. Many such callers (and often the hosts) remind one of the heckler who confronted Theodore Roosevelt while he was making a political speech during one of his campaigns. "I am a Democrat," said the heckler with a repeated and somewhat inebriated cry. Roosevelt was a dangerous man to heckle. Pausing in his speech and smiling with oriental unction, he leaned forward and said, "May I ask the gentleman why he is a Democrat?" The voice replied, "My grandfather was a Democrat, my father was a Democrat, and I am a Democrat." Roosevelt said, "My friend, suppose your grandfather had been a jackass, and your father had been a jackass, what would you be?" Instantly the reply came back: "A Republican." Not to be quoted, I think this may have been one case where the heckler was right.

In all truth, I learned in Washington that many (perhaps even most) of those in Congress were well motivated. At that time, the Senate was not exclusively the millionaires' club it is today. It did not then take selling of the soul to get enough money to be elected to Congress. When Gunn McKay first ran for the House around 1970, he told me he spent a total of $14,000 to be elected. Joel Pritchard, a moderate Republican from the University District in Seattle, told me he spent about the same amount to be elected. And Joel said he would not seek reelection after five terms, and at that point he did quit.

Senator Henry ("Scoop") Jackson of Washington State, who had served in Congress for over three decades, told me that he wasn't there primarily to pass laws, but instead to watch over the bureaucrats, the young BMW-driving lawyers whose job was to implement the will of the Congress. For that, and other reasons, I am not a strong proponent of term limits. It takes several years to learn the ropes in Washington, whether you are a member of Congress or a newsperson.

Astute politicians not only properly oversee the bureaucracy, but they know how to deal effectively with the media. I recall, as an example, Senator Howard Baker of Tennessee being interviewed by CBS's great Roger Mudd. It was in the Senate radio and TV gallery during the Watergate hearings. Roger wanted to know what exactly was on those White House tapes. This was before the committee had gotten to that portion of its hearings. Baker effectively sidestepped the question and went on to something that was known and was, to him, significant. Roger then asked the same question in a slightly different way, and tried again after that. But Baker, with no rancor, won the day.

Earlier I said that Watergate changed journalism and journalists. Too often since, young, inexperienced reporters, hoping for air time or front-page placement, instantly see blood when there is a slight scratch or no scratch at all. Too often sources are not adequately checked and rechecked, statements are taken out of context, and cubbyholes are found for ideas and actions that require giant cupboards. I'm not sure I entirely agree with Oscar Wilde, who said, "Instead of monopolizing the seat of judgment, journalism should be apologizing in the dock." But he may have had a point.

The technological revolution began in earnest while I was in Washington. Videotape replaced film. Satellite transmission replaced couriers running our film to Dulles, hoping to make the early, or even the late, news in Seattle or Salt Lake. "Live" shots became de rigueur and, as silly as they usually are, are still the order of the day. But television, challenged for viewers' time by the World Wide Web, by videotape movies, and by the increasing demands on human time and energy, has changed the world, perhaps in ways not completely for the better.

You have all heard them: the startling, some would say frightening–even deplorable–statistics regarding the use or misuse of television. The TV set is on for 7 hours and 50 minutes each day in the average US home. In families with children, it’s on even more–an average of 63 hours each week.

It is not only young people who watch television, but old people as well. I'm reminded of a story of two elderly people who were sitting watching television together one evening. She was knitting, sitting close to the television set because she was a bit hard of hearing. He, seated in his recliner a few feet away from her, watched her carefully as they viewed a television drama. The elderly gentleman looked lovingly at his wife and thought back over the years they had spent together and how much he loved her. He said to her in a kindly tone, "I'm proud of you, darling." She put her knitting in her lap, turned her head to face him squarely and said, "I'm tired of you, too."

Well, television can bring people together or, I suppose, drive them apart. Television has done, is doing and will do many marvelous, wonderful, even magical things. It has also been, is and will be, for many of us, a menace. Television binds us together as a people, a nation, even humanity as a whole in ways that were unimagined only a few years back. Television spelled the beginning of the end to widespread segregation. Television provided national support for the historic space program that allowed men to walk on the moon. Television has had much to do with bringing about the possibilities of long hoped for freedom in Eastern Europe, Asia, and other parts of the world where people are exposed to free and open debate, to the exchange of ideas. There is great hope for freedom. Television can provide that exposure.

So, yes, television can, indeed, work magic. Its impact is enormous. It shapes the way we see ourselves, our neighbors, and our institutions. Our perceptions of the world are filtered through the prism of television.

Yet, despite its great potential for change, for educating, for allowing us to know and understand each other, and thus creating an environment for world peace, the highest-rated programs on television are those like Wheel of Fortune, Jeopardy, The Newlywed Game, and so forth.

Edward R. Murrow said of television, "It is a sword rusting in the scabbard during a battle for survival." Not everyone, of course, would agree with Murrow's opinion, but those of us who do are attempting to do something about it. We are attempting to take the sword out of the scabbard.

Rather than curse the darkness of mediocrity and lowest-common-denominator programming, the University of Utah has lit a candle, as have many other universities and public television stations nationwide. We operate a regional telecommunications center serving six states.

All of these marvelous "tools of information and education" will soon be housed in one building, the Dolores Doré Eccles Broadcast Center, a marvelous and much-needed new facility that will allow us to provide even greater benefits to all of the people of our state and our region. A building that will allow students to receive more adequate "hands-on" training in broadcast media. A building that will allow cooperation among all our communication efforts, something that has long been needed to take KUED out of the basement of the old union building, now Gardner Hall, and KUER out of its totally inadequate facilities in the basement of Kingsbury Hall. We are now launching a capital campaign to raise the remaining amount that will allow us to realize this long-sought-after dream.

And so for us, as with the rest of the industry, the great television revolution continues. Involved are cable, VCRs, direct broadcast satellite, low-power television stations, and who knows what else might yet lie ahead. The three major commercial networks' prime-time share continues to decline. That, of course, has been brought about by the extreme competition provided by the technologies that I referred to a minute ago, especially cable television. About 55 percent of all US households now have cable. Salt Lake has the highest VCR ownership in the nation, at 60 percent of the market.

So with all of these great tools capable of delivering magic, what is, or what should be, the role for PBS, the Public Broadcasting Service? Some are saying that PBS, and most all of public television as we now know it, is no longer needed. As you might expect, I certainly do not agree. In fact, soon after having been elected chairman of the PBS Board, I determined, after many years of rejecting the idea, that we should have cable television in our home. We now do. I spent some time flipping through the plethora of channels, fearful at first that I would come down on the side of those who say that, because of all that is available in our television offerings, we no longer need public television. I have come to quite the contrary conclusion. The more I watch what is on cable television, the more convinced I am that public television is essential to the well-being of our nation.

Public television is reexamining its charge, and I think we must. We cannot compete with the American Movie Channel in presenting old movies every night or with Ted Turner's many offerings. We cannot and should not attempt to compete with the Disney Channel and all that it does for young people, although there are some public television stations that think their niche is in this field. We cannot, in my opinion, compete with MTV, nor should we. There are areas of overlap with such channels as Discovery, Arts and Entertainment, and Bravo. But even there, the realities of the commercial world require those channels not only to carry commercials but also to try to broaden their audiences with a common denominator that may not be of the quality that PBS should be, and I think usually is.

–Ted Capener

 

Civility

February 2000

Professor John Kesler has spent a number of years developing a description of developmental stages of social thought and action. The model lists seven stages in the civility spectrum and the ages at which each plateau can or should be attained.

Level 1 is that of “power” and is usually achieved by the age of four. It involves following rules for the sake of the rules themselves. Much of society exists on this level. Examples include big money politics, predatory business practices, and professional sports. Brutality is a hallmark of this social level.

Level 2 recognizes interests of others. It can be achieved by age eight. The empiricist social sciences are based on this level. Common questions are: “Who gets what?” “How much?” This is the first stage of reciprocity.

By the age of 12, people achieve the Level 3 cultural stage. Here conflicts of values are dealt with. The sciences of anthropology and sociology are highly concerned with this level.

Level 4 looks at universality. It involves principled actions that result in three-dimensional win/win/win situations. Historically, the Enlightenment was largely based on social interaction at this level. Most adults should attain this step by the age of 16. It is the last age-associated level. Piaget, Kohlberg, and Gilligan are important theorists.

Level 5 introduces three-dimensional thinking and is labeled the integrative stage. Positive examples of this level include the seminal thinkers of the great American experiment. Negative examples are communism and post-modern thinking. Some of the leading thinkers were Jean Gebser, Jürgen Habermas, and Charles Taylor.

The Ecological Level 6 considers the physical as well as the social environment. Thoreau was a prime example.

Level 7, Transcendence, is best exemplified in the writings of Ken Wilber. Traditional humanist principles of the value of all life and of the environment reflect an understanding of society at this level. Thinking includes transcending the self, and inspiration. Religion, in its purest form, is also socially at level 7.

Kesler noted that civility is the foundation of civilization. The fact that both words share the same root is just a first-order indication that civilization will not thrive if we do not treat each other and our environment respectfully with long-term consideration.

–Wayne Wilson

 

Bill of Rights

April 2000

Utah State University history professor Dan McInerney presented a historical perspective on the Bill of Rights. His first point was that the context of the lecture was the history of the subject, not its legal ramifications. McInerney painted an interesting, and to most a surprising, picture of the social climate when the first 10 amendments to the Constitution were adopted in 1791.

Professor McInerney delineated 10 "surprises" of which most people today are unaware:

  1. The Bill of Rights was the product of suspicion and distrust, not confidence and security. Advocates were NOT confident in the continued blessings of liberty. They feared the impending collapse of liberty — not because of anything the people might do, but because of what the government might do. They were doubtful, fearful, even paranoid about the new federal government.
  2. The original proponents were more interested in states’ rights than individual rights. Some historians take the argument further, claiming that proponents were not trying to restrict all government; they wanted to restrict a particular level of government. Rather than protecting individual liberties against government power in general, advocates of a Bill of Rights wanted to protect states’ rights from federal encroachment.
  3. There was strong opposition to a bill of rights. Critics argued the Bill of Rights was REDUNDANT. There already was a “bill of rights”: the Constitution! (Hamilton wrote, “The Constitution is itself, in every rational sense, and to every useful purpose, a bill of rights.”) It already protected the most important privileges and rights in a republican system: the right of habeas corpus (to prevent wrongful detention); the prohibition of ex post facto laws (laws that define an act as a crime after it has happened); and the prohibition of titles of nobility.
  4. The Bill of Rights was not, originally, ten amendments added to the constitution. Originally, states and political leaders offered over 200 proposals! Madison whittled that number down to 17 and submitted them to Congress. The House approved all 17 amendments; the Senate, by rejecting some of the House proposals, and combining others, dropped the number to 12. Recall, also, that the Senate dropped the one amendment Madison thought most important: the one concerning state government infringement on conscience and press.
  5. The 1st amendment was not “first.” The original first amendment to the Constitution had nothing to do with the stirring guarantees of free speech, press, assembly and the free exercise of religion that we know today. Instead, it concerned a clunky mathematical formula for proportional representation in the House. The original second amendment dealt with Congressional pay: it would have prevented Congressmen who voted themselves a raise from collecting until after the next election. The states rejected both ideas.
  6. The amendments do not declare rights clearly and comprehensively. They are filled with ambiguities. In trying to decipher the Bill of Rights, we ought to turn to experience rather than language for the meaning of the text. The amendments, seemingly specific injunctions, are not self-defining and do not necessarily exclude exceptions. There were probably no constitutional absolutes in 1791 and no guarantees that were clear and precise in meaning. It is tough to construe ambiguities strictly.
  7. For most of American history, most Americans have not paid much attention to the Bill of Rights. The Bill of Rights has been around for 200 years. But when historians consider the effect of the first ten amendments on America’s political culture, the two centuries do not seem to have made all that great an impression. For most of American history, most Americans have either slighted, disregarded, or ignored the Bill of Rights.
  8. Some of the key “founding fathers” did not think much of the Bill of Rights. In June 1789, Madison rose in the House to present formal proposals for Constitutional amendments. Other speakers followed. Every one either opposed the Bill of Rights or tried to postpone discussion; they felt “there were several matters before them of more importance. The discussion would take up more time than the House could now spare.”
  9. In the 1790s, some states did not approve the Bill of Rights. Connecticut, Georgia, and Massachusetts never got around to ratifying it until the sesquicentennial of the Constitution in 1939!
  10. The actual document has been treated carelessly. There were copies of the Bill of Rights made for each state, plus one for Congress. Each copy was signed by Vice-President John Adams (the President of the Senate) and Speaker of the House Frederick Augustus Muhlenberg. Congress did not handle its copy in a particularly reverential way as historian Michael Kammen notes. In 1789, the Secretary of State was named custodian for the Declaration, the Constitution, and (later) the Bill of Rights. The documents were bundled with scores of other government papers, shuttled around from New York City to Philadelphia to D.C., and housed where few people could see them. From 1875 to 1921, the Constitution was kept in a cellar. Finally, in 1921, the Declaration and Constitution were transferred to the care of the Library of Congress. But the Bill of Rights stayed in a basement in a plain green cabinet. And the states? Five of them, Pennsylvania, Georgia, Maryland, New York, and North Carolina, can’t find their copies of the Bill of Rights!

Those are the ten surprises about the Bill of Rights. I mean no disrespect toward the amendments by noting these historical curiosities. I’m very grateful that we have a Bill of Rights, because of four key achievements of the amendments:

  1. Although it took many weeks for the first Congress to take up the business of the Bill of Rights, it is important to recognize that one of the first acts of the new national government was to limit its authority. That’s a remarkable story in any historical period.
  2. The first ten amendments stated popular rights in a dramatically new way. Most states expressed their bills of rights in the subjunctive voice: “Liberty of the press ought not be restrained.” How do you like the sound of that? Does it fill you with confidence? I don’t think I’d pay my magazine subscriptions too far in advance. The Bill of Rights makes its points in a far different manner, not in the subjunctive voice but in the declarative or imperative voice: Congress shall or shall not. The old expression holds true: it’s not just what you say, it’s how you say it.
  3. Passage of the Bill of Rights helped ease years of debate over America’s form of national government. It made the new, enlarged, unprecedented experiment in republicanism more acceptable. It provided a sense of legitimacy that the new government badly needed. In practical terms, when Congress passed measures in Sept. 1789, it helped encourage “hold-out” states (NC and RI) to finally join the Union.
  4. Over time, the amendments have hindered the government’s power from extending over the thoughts and consciences of citizens.

Professor McInerney’s presentation was sponsored by the Utah Humanities Council as part of its Front Porch Series.

 

Religion and Ethics: A Secular Response

December 2000

A couple of months ago I was asked by the organizers of a conference at UVSC to provide a secular response to the contemporary relevance of great religious teachers. The choice (that is, asking me) was both curious and fitting. Curious, in that I am not a thoroughgoing secular humanist; nor has the three-centuries-old process of what students of religion call "secularization" been the focus of my studies, teaching, and writing. However, there was something fitting about the request, because I am currently serving as a minister at South Valley UU Society. South Valley is a religious congregation, many of whose members are secular humanists. That is, though regularly attending religious services, these men and women believe that "this world," the immediate environment available to ordinary experience and scientific investigation, is all there is to reality. They either reject outright the existence of God (or an ultimate transcendent reality), or they are content to reserve judgment on the issue due to the lack of sufficiently compelling evidence. What such people are doing openly in "church" is a question worth raising and one to which I shall return. In the remarks that follow, I have a number of tasks to accomplish. First, to briefly sketch out why moral codes arise within both religious and secular communities. Second, to describe the conditions that gave rise to a secularizing world-view and the essential characteristics of that outlook. And finally, to answer the question: "What's a secular humanist doing in a church?" By looking at these topics in order, I hope that it will become clear why those in the secular or humanist camp believe that religious communities today bear an extraordinary burden when it comes to the relevance of their ethics. I also hope that I can make a case for the benefit obtained when both the religious and the secular strive to create a common ground.
That is, a place to which they can bring both the experience of failure that hounds the human condition and their hopes and vision for a more just, compassionate, and equitable society.

From Whence Moral Codes?

Religious men and women express their distinctive world-views by means of sacred stories, scripture, ritual, and art. The virtuous life, or morality, has also been a central feature of religion. Indeed, wherever religion is found, in individuals and groups, one component will be moral codes. These rules provide norms for right attitudes and actions. They are principles that express what we ought to think and do in this world. To live morally is to embody these principles.

It is an interesting fact that the presence and persistence of "ought" statements and moral codes show that the human family is clearly aware that there is a disconnection between how life is and how it really should be. Aren't we all aware of failures in the human condition? When we think about it, don't notions of sin, immorality, suffering, and evil remind us of the ways in which the human family has been all too frequently derailed from achieving the good?

Religions have functioned powerfully through most of our history to provide an explanation of the meaning or purpose of life that is articulated by reference to our place within an ultimate, transcendent or divine framework. That framework is sketched out in sacred stories, ritual, moral codes, and art. These artifacts of religion are the handiwork of spiritual teachers or prophets through whom the transcendent framework for human existence has been revealed or disclosed. Religious frameworks attempt to describe the ultimate origins, reasons and ends of nature, humanity, and cosmos. They also strive to explain the causes and remedies for that which is broken or incomplete or obstructive in the human condition. As long as these descriptions and cures are convincing, they provide a sense of cohesion within communities and confer a powerfully resonant sense of identity both on individuals and the communities in which they live. Once such frameworks have been clearly established, any moral principle or rule can be assessed in terms of the contribution it makes, or fails to make, towards the realization of the ends revealed through the religious framework of the community. And when those rules are accepted as divine commands, what results is a system of social control which induces religiously sanctioned behavior in individuals and establishes a regulated pattern of social order. (Bryan Wilson)

However, you don’t have to be religious to suffer from a gnawing lack of connection between what is and what ought to be. There are non-religious people who are moral, and who are examples of virtue and the good. They do not need the explanatory and exhortative systems of religion in order to feel keenly the failures in the human condition, to be aware of the reality of evil and suffering, or to be upright in their conduct. That this could be and is in fact the case leads many moral philosophers to question and reject the commonly held belief that religion is essential for morality. If both the religious and the secular feel and experience the absence and the need for the good, then this moral sensibility is an inalienable attribute of human existence. At most, religions might exemplify morality, but they cannot claim to be its exclusive source. Jesus or the Buddha may be exemplars of the moral sense, but each of us, according to the secular view, possesses a moral impulse. We can look upon these two great figures as archetypes of how we should respond to the moral law written in our hearts by virtue of our birth, but the immediate source of the suffering and obligation we feel toward humanity exists within each person apart from any religious identity.

Because 18th-century moral sense philosophers like Hutcheson, Reid, Smith and Kant were Christians, they believed that the moral sentiments within us were inscribed there by a benevolent Creator and that Jesus was the unique example of moral virtue. But many of their heirs rejected the premise and focused instead on the project of making morality truly autonomous, and thus independent of something so culturally various, historically changeable, and experientially unpredictable as religious belief and custom.

Have religious moralists truly and adequately accounted for and responded to the tough, dry-eyed contentions of Feuerbach, Marx, Nietzsche, and the philosophical heirs of British empiricism? When critics have pointed out that it is religion that interferes with the cultivation of moral values, and that it diverts people’s attention to the life to come when they should be focusing on improving things in the here and now, have we really listened? I am not so sure. That is unfortunate both for religion and for humanists. The former continue to offer codes rooted in premises that humanists find implausible at best, while the latter lose hope of engaging the members of religious communities in critically self-aware dialogue.

Indeed, much of what passes for public discourse in this country today returns again and again to the proposition that morality is baseless unless it has its origin in the decrees of God’s eternal will. Could it not be the case that the opposite proposition is more true logically and more fitting to current needs in a radically multi-ethnic, multi-cultural society? Could we not say instead that a belief in God, or in this or that religion, is not necessary to give authority to moral decrees? And that, instead, it is the nobility of our moral autonomy that gives rise to the idea of God or a transcendent reality as the source of our notions of good and evil? This is the argument of the contemporary philosopher Kai Nielsen. Let me read a brief quote to illustrate this view:

Morality cannot be based in religion. If anything, the opposite is partly true, for nothing can be God unless he or it is an object worthy of worship, and it is our own moral insight that must tell us if anything at all could be worthy of worship.

A religious belief depends for its viability on our sense of good and bad–our own sense of worth–and not vice versa…. A moral understanding must be logically prior to any religious assent.

This assertion of the priority, content, and dignity of our moral intuitions and reasoning as human beings provides, I believe, a more certain and inclusive environment for our moral endeavors as citizens in a pluralistic nation. Such intuitions are the necessary preconditions for harmony in a modern, religiously plural state. Let us agree first to assert the self-evident truths of our Declaration of Independence: life, liberty, and the pursuit of happiness. If we agree that they are self-evident, then they cannot be infringed upon or set aside by political regime or religious creed. Who among us would abrogate these universal truths, given by reason, intuition, and experience, for the price of a historically and culturally contingent truth, though we call them the fruits of our religions?

I want to move from what gives rise to moral codes to the issue of where this secular world-view came from and what its key characteristics look like.

Secular World-view: Conditions and Characteristics

I want to briefly sketch both the conditions that gave rise to a secular world-view and its chief characteristics. The point here is to enable us to better understand what impels a secularist to respond critically to religiously based ethics and to explore what conditions obtain for fruitfully bridging the divide between religious and secular views. Secularization is the word used to name the process of change by which religious beliefs, practices, and institutions have been divested of a great deal of their influence and power in both individual and social spheres. It refers loosely to a transition, variously observed and explained, from a religious to a less or non-religious world. The process began in the sixteenth century, when church property and prestige passed from religious to secular control. And it continued apace in the last three centuries as individuals and public institutions turned less and less to religious authority and systems in order to express group identity, to maintain social order, to study and explain natural phenomena, and to provide emotional support to afflicted members of society.

Once it was the case that religious cultures like the medieval Muslim and Christian worlds offered a complete intellectual, spiritual, and social program. They explained and justified not only the supernatural and the moral, but the nature and the purpose of the cosmos as well. Islamic and Christian doctrine and practice legitimized status quo political authority and social policy. They justified feudal wars and provided theological rationale for social inequalities of wealth, power and status. And their religious sciences mapped out the order of nature within and beyond the human person in a hierarchy of ascent, purpose, control and deference.

It is a world in which we only partially live or that has become completely strange to us. And would any of us wish avidly for its return in full? Look at the situation today. We employ the scientific method to understand the physical order of our world, where once it was an article of faith to accept an Aristotelian map of nature and cosmos. The separation of church and state has been carefully attempted in our own political constitution and civic space, where once such a division of labor was unthinkable or treasonous, or both. Increasingly we turn to analysts and physical activity instead of to rabbis and priests for our therapeutic needs, and to prescription drugs and surgery for remedies to our physical ailments rather than to consecrated oil, holy water, or pilgrimages to venerated relics.

In these fields of human activity and more, the traditional significance of religious consciousness and institutions has diminished. And what we see as a result is leaders and members of religious communities either adapting to changing circumstances (i.e. evolution in the science curriculum in religious schools, support for church/state separation, etc.) or rejecting secularization by means of cognitive and physical separation from society into religious ghettoes.

In short, secularization is the consequence of the struggle to maximize the autonomy of individuals, professions, politics, and markets. It has had the consequence of encouraging us increasingly to employ this-worldly thought, institutions, and procedures for making sense of the world in which we live. The consequence of secularization is that religious belief and practice is an option, a matter of choice, where once it was our destiny and fate.

A social scientist employs the thesis of secularization as a neutral, not a normative, term indicating a factual process of social change, a change of thinking, habit, and process centuries in the making. It does not necessarily postulate the disappearance of religion in secularizing societies. Religion, and the insights of its great teachers, manifestly persist and continue to persuade and inspire the faithful.

However, there are those within religious communities who have embraced rather than begrudged this transformation of consciousness and practice; that is, there are and have been religious leaders, thinkers, and lay men and women for whom the process of secularization is seen normatively as beneficial for a more authentic faith. Their vision includes embedding religious ethics and morality in soil more conducive to the growth and well being of human autonomy and less susceptible to the vagaries of parochial customs and practices. For these religious humanists, the value of such a move is readily apparent.

I will cite just one example. Traditional Lutheran ethics postulated a two kingdoms worldview (that is, sacred and profane) and a convention of morality that it entailed. Citing the divine authority of scripture, Luther taught as a matter of faith and doctrine that the Lord placed human rulers to preside as his vice regents over the profane kingdoms of this earth. They rule in his place; that’s what a vice regent is. All subjects within worldly domains are, therefore, obliged to obey and submit in all things to those in secular authority above them. This ethic, derived from a divine command model and formed during the waning era of European feudalism, was invoked by Luther to justify the savage repression of the peasants’ revolt within Protestant domains in the 1520s. The persistence of this two kingdoms ethic contributed much later to incapacitating the German people from being able to resist both the rise to power and the rule of a violent, neo-pagan Nazi regime. The Christian theologian Dietrich Bonhoeffer, executed for his part in a German resistance movement against that regime, belatedly recognized the damage done by Luther’s ethics. For his part, he grew to welcome the advent of a more secular society whose morality would be grounded in the physical dignity and the moral autonomy of every person, and not on the supremacy of the sacred over the profane. For there is in reality only one world, no matter what we may believe about its source and destiny.

What’s a Humanist Doing in Church?

Human moral understanding is logically prior to religious assent to the idea of a transcendent source of the good. The historical record of religions promoting and securing our physical security and moral autonomy is dismal. Is it any wonder, then, that a more secular worldview would be critical of religiously based morality? The burden of proof resides with those who promote the contemporary relevance of the great religious thinkers to issues of morality and ethics. Have they made a sufficiently compelling case? And have they taken into account the logical and historical arguments of secular humanists? If they have not, I would suggest that they have, as we say, begged the question.

In closing, I mentioned at the beginning of these remarks that I minister to a religious congregation, many of whose members are secular humanists.

That is, people who do well enough, thank you, without a belief in god or in moral codes authoritatively grounded in scripture or prophetic warrant. What are they doing in a church on Sunday mornings? Isn’t their presence in a religious sanctuary a contradiction? An oxymoron?

Or perhaps they just haven’t been able to kick an old habit?

What I have encountered is this: the humanists I know and meet in church want to be a part of a living community and tradition with rich, complex and life enhancing beliefs. The sources of those beliefs about good and evil arise first of all from the humanist insistence upon the nonnegotiable worth and dignity of every person. No appeal to the gods, no appeal to the state, no invocation of custom and convention trumps our inherent rights of physical dignity and moral autonomy. Religious and secular humanists say “fly” from any religious tradition or community that would say otherwise.

The humanists I know and meet in church, who seek rich, complex, and life enhancing beliefs, look to the guidance of reason and the results of science, especially as they warn us against idolatries of the mind and spirit. But they will also turn to the wisdom of the world’s religions when they inspire us in our ethical and spiritual life. The hallmarks of such a life are justice and equity in human relations, the free and responsible search for meaning and truth, the right of conscience and the use of the democratic process within our communities and in the society at large, and respect for the interdependent web of existence of which we are all a part.

The humanists I know and meet in church, who seek rich, complex and life enhancing beliefs, respond enthusiastically to the words and deeds of prophetic men and women when those men and women challenge us to confront powers and structures of evil with justice, compassion, and the transforming power of love.

The honest critique of religious claims and the resolute affirmation that humanity is the measure of things, the maker of history and moral values, are the special contributions that humanists–religious and secular–have made to western culture and to the Unitarian Universalist community to which I minister.

Humanists are needed and should be welcomed within our religious traditions. They remind us that an unexamined faith leads to credulity and tyranny. They will remind us, even if we don’t want to hear it, that historically, religious people and their institutions have been unusually susceptible to the temptations of power, wealth, and irrationalism. Humanists have a special task, or calling, toward those who say that they are religious. And that is to remind those within faith traditions that the adventure of their faith, if it is to have meaning and credibility, is contingent upon its capacity to create and secure values that dignify and enhance life here and now.

–Dr. Steven Epperson, minister of South Valley Unitarian Universalist Society
November 9, 2000.

 

Florien Wineriter Receives Utah Humanities Council Award

December 2000

The Utah Humanities Council recognized our chapter president, Florien Wineriter, with a Friend of the Humanities Award at their annual Governor’s Awards ceremony, Saturday, October 14th.

In presenting the certificate for outstanding support of the humanities at the Memorial House, in Memorial Grove, the UHC Executive Director, Cynthia Buckingham, mentioned that during his years in the broadcasting industry, Flo had been the producer of “Vital Issues” at KALL, and “Public Pulse” at KSL, discussion programs that addressed important social and political issues pertaining to the humanities. He was also praised for inviting several Utah Humanities Council representatives to speak at the Humanists of Utah monthly meetings. Director Buckingham remarked that the Humanities represent the bridge of balance between religion and science.

Receiving certificates of recognition along with Flo were Wally Cooper and Allen Roberts, architects, Anne and Sandy Dolowitiz, leaders of the Utah Jewish community, Michael Zimmerman, former Chief Justice of the Utah Supreme Court, and Bonnie Stephens, Director of the Utah Arts Council.

Congratulations, Flo!
 
 

Humanism Beliefs and Practices

Book Review

April 2000

HUMANISM: Beliefs & Practices, written by Jeaneane Fowler and published by Sussex Press, 1999, is the most definitive book on humanism since Corliss Lamont published the Philosophy of Humanism 50 years ago. The author is an Honorary Research Fellow at the University of Wales College. She says the life stance of humanism must take its place alongside the attitudes of the Christian, Jew, Hindu, Muslim, Buddhist, and Sikh.

In the foreword Paul Kurtz, chairman of the Council for Secular Humanism, says, “The meaning of the term humanism has often puzzled friends and critics alike: Is humanism a religion? Is it simply equivalent to humanitarianism? Is it so inclusive that it applies to everyone?

“Dr. Jeaneane Fowler has attempted to answer these questions, and she has done so with objectivity and sensitivity, skill, and virtuosity. Indeed, in my judgment, she has written the best source book on humanism that is currently available.”

I found this outstanding book while browsing the new books display at the University of Utah Marriott Library, read it, then ordered a copy for my own library.

–by Flo Wineriter

 

Firestone of Religious Rhetoric

October 2000

If talking about UFOs or constipation would get votes, then presidential candidates Bush, Gore, Cheney, and Lieberman would be exploiting those topics. Instead, the 2000 campaign is firing away with in-your-face religious/God rhetoric to obtain votes. So omnipresent is this rhetoric that the Anti-Defamation League, whose focus is fighting anti-Semitism, has written Joseph Lieberman to urge him to curb expression of religiosity in his campaigning. Examples of his rhetoric are: “I stand before you today as a witness to the goodness of God,” and “As a people, we need to reaffirm our faith and renew the dedication of our nation and ourselves to God and God’s purposes.”

Al Gore’s rhetoric is that he supports “faith-based partnerships,” which includes involving sectarian groups. Referring to gang violence and deteriorating social conditions among inner-city youth, he said, “Those who are quick to feel disrespected often have a spiritual vacuum in their lives, because they feel disconnected to the love of their Father in Heaven.”

Said George Bush, “We should promote these private and faith-based efforts because they work,” promising to dedicate $8 billion to such groups in the first year of his presidency through a program of tax rebates and direct grants.

Of course, religion, like any other institution such as education, marriage, or profession, has potential to benefit our growth and development. However, to imply or explicitly state that without religious boundaries, we are an ethically and morally corrupt nation, is an erroneous argument.

The bogus logic of this religious campaigning can be summarized in this syllogism: religions are good, people are religious, therefore, religious people are good. However, reality is: good people are not all religious, and religious people are not all good. “Good,” for the purpose of this piece, is defined as practicing ethical and moral values.

For instance, Presidents Bill Clinton’s and John Kennedy’s sexual infidelities poke a hole in the syllogism. Religion did not save them. Newt Gingrich, Jim Bakker, and Jimmy Swaggart spouted the same religious rhetoric while committing public hypocrisy. Prisons house the largest percentage of God believers, children cheat more often in parochial schools than in public schools, and religious crusades and ethnic cleansings are products of both dogmatic believers and dogmatic nonbelievers (Gordon Gamm, Colorado Humanist, Jan./Feb. 2000).

Naturally, it goes without saying the converse also applies: some religious people are good, and some good people are religious.

Whatever the source, people practice ethical and moral values most genuinely when these are internalized, which means we act “good” no matter where we are, who is watching us, and what tempts us to act otherwise. Anything less is less, less meaning we “act for the wrong reason,” less meaning we act from fear of after-death hell and damnation, or fear of imprisonment, or fear of what others think about us. At the same time, acting for the wrong reason is not necessarily all negative if these acts benefit self, loved ones, co-workers, and community at large.

That said, some people require rules, policies, laws, and religions to keep them honest and responsible. If “artificial” societal, political, and religious barriers or boundaries did not exist, some of us would more easily succumb to deceiving, manipulating, lying, cheating, stealing, abusing, and so on. Without established, organized, penalizing courses of conduct, “higher principles” or “laws of the spirit” are mere terms belonging to fantasy novels and Hallmark cards.

Religion often seems to make no difference in certain everyday decisions. Driving above the speed limit is probably one of the most frequently broken laws, along with driving through red lights and changing lanes in an intersection. Without “no-noise” ordinances, more of us would be hiking music up to Nine-Inch Nails concert volume all night long or allowing a dog to bark at 3 a.m. We would more likely jaywalk, park in front of people’s driveways, park in spaces allotted for the handicapped, and litter like children.

When faced with these commonplace, but less clear-cut situations, the religious and non-religious may act no differently. 1) A store clerk gives you too much change. 2) A credit card company makes a mistake in your favor. 3) A dinner was not charged on your restaurant bill. 4) You can take full credit for a project although others did the work. 5) A co-worker discloses a terrific idea that you can pirate. 6) You did not follow through on an assignment or promise, but can blame it on someone else. 7) Someone calls with whom you do not wish to talk, and you can ask your child to say you’re not home.

Whether religious or non-religious, people often feel worse about getting caught and getting penalized than committing the immoral or unethical deed.

Despite Utah being one of the most church-going states in the U.S., recent United Way (UW) data casts stones upon our perceived, squeaky clean state (August 29, 2000: KSL-TV). For example, as many as 4,000 are homeless in the greater Salt Lake area, and 28,051 children live in poverty (defined in 1998 as a family of four earning $16,530 or less). At least 50,000 Salt Lake County residents over age 12 are considered substance abusers. One in eight Utah women experiences physical abuse, one in three endures emotional abuse in a relationship, and one in three suffers sexual abuse before age eighteen. The crowning figure that religious Utah is not all good is that child sexual abuse here is double the national average.

These damning statistics of our perceived “happy valley” Utah refute, at least in part, the campaign rhetoric implying that religious people have a monopoly on ethical and moral behavior. Hopefully, a new rhetoric will emerge in which a colleague or neighbor can say he is non-religious or atheist, and you won’t hear, “Oh, he couldn’t be. He’s too good a person.” With 26 million non-religious Americans, a lot is riding on the do-the-right-thing internalization trend, where temptation does not split apart conscience and character.

–Sarah Smith
This article was also published by the
Salt Lake Tribune on September 24, 2000.

 

Discussion Group Report

The Evidence for Evolution

February 2000

By Richard Layton

“Scientists, like many others, are touched with awe at the order and complexity of nature. Indeed many scientists are deeply religious. But science and religion occupy two separate realms of human experience. Demanding that they be combined detracts from the glory of each,” says Bruce Alberts, the president of the National Academy of Sciences.

In spite of the compelling evidence for evolution, the teaching of evolution in our schools remains controversial. How can the two views of science and religion about origins be so different?

The publication, Science and Creationism: A View from the National Academy of Sciences, gives a 30-page synopsis of the scientific evidence on this question. In the preface Alberts indicates, “The tremendous success of science in explaining natural phenomena and fostering technical innovation arises from its focus on explanations that can be inferred from confirmable data. Scientists seek to relate one natural phenomenon to another and to recognize the causes and effects of phenomena. In this way they have developed explanations for the changing of the seasons, the movements of the sun and stars, the structure of matter, the shaping of mountains and valleys, the changes in the positions of continents over time, the history of life on Earth, and many other natural occurrences. By the same means, scientists have also deciphered which substances in our environment are harmful to humans and which are not, developed cures for diseases, and generated the knowledge needed to produce innumerable labor-saving devices.”

He goes on to say, “The concept of biological evolution is one of the most important ideas ever generated by the application of scientific methods to the natural world. The evolution of all the organisms that live on Earth today from ancestors that lived in the past is at the core of genetics, biochemistry, neurobiology, physiology, ecology, and other biological disciplines. It helps to explain the emergence of new infectious diseases, the development of antibiotic resistance in bacteria, the agricultural relationships among wild and domestic plants and animals, the composition of Earth’s atmosphere, the molecular machinery of the cell, the similarities between human beings and other primates, and countless other features of the biological and physical world.” He quotes Theodosius Dobzhansky (1973), “Nothing in biology makes sense except in the light of evolution.”

Alberts continues, “Scientists have considered the hypotheses proposed by creation science and have rejected them because of a lack of evidence. Furthermore the claims of creation science do not refer to natural causes and cannot be subject to meaningful tests, so they do not qualify as scientific hypotheses. In 1987 the U.S. Supreme Court ruled that creationism is religion, not science, and cannot be advocated in public school classrooms. And most major religious groups have concluded that the concept of evolution is not at odds with their descriptions of creation and human origins.”

Evolution, Science and Creationism states, helps explain the origin of the universe, the Earth and life. In the late 1920s astronomer Edwin Hubble made observations that he interpreted as showing that distant stars and galaxies are receding from the Earth in every direction. The velocities of recession increase in proportion to distance, a discovery that has been confirmed by numerous and repeated measurements since Hubble’s time. The implication is that the universe is expanding. One hypothesis is that the universe was more condensed at an earlier time. This deduction suggests that all the currently observed matter and energy in the universe were initially condensed in a very small and infinitely hot mass. A huge explosion then sent matter and energy expanding in all directions. This Big Bang hypothesis led to more testable deductions. One was that the temperature in deep space today should be several degrees above absolute zero. Observations have shown this deduction to be correct. The COBE satellite, launched in 1989, confirmed that the background radiation field has exactly the spectrum predicted by a Big Bang origin for the universe.

Charles Darwin’s original hypothesis on biological evolution has undergone extensive modification and expansion, but the central concepts stand firm. Studies in genetics and molecular biology have explained the occurrence of the hereditary variations that are essential to natural selection. Genetic variations result from changes, or mutations, in the nucleotide sequence of DNA, the molecule that genes are made from. Such changes in DNA now can be detected or described with great precision. In 1799 an engineer named William Smith reported that, in undisrupted layers of rock, fossils occurred in a definite sequential order, with more modern-appearing ones closer to the top. Today many thousands of ancient rock deposits have been identified that show a corresponding succession of fossil organisms. Multi-cellular organisms–fungi, plants and animals–have been found in only younger geological strata. The fossil record provides consistent evidence of systematic change through time–of descent with modification. Inferences derived about common descent are reinforced by comparative anatomy. For example, the skeletons of humans, mice and bats are strikingly similar despite their radically different ways of life and the diversity of their environments. This suggests a common ancestry for them.

Evolutionary theory explains that biological diversity results from the descendants of local or migrant predecessors becoming adapted to their diverse environments. This explanation can be tested by examining present species and local fossils to see whether they have similar structures, which would indicate how one is derived from the other. Also, there should be evidence that species without an established local ancestry had migrated into the locality. Wherever such tests have been carried out, these conditions have been confirmed.

Embryology is another source of independent evidence for common descent. A wide variety of organisms from fruit flies to worms to mice to humans have very similar sequences of genes that are active early in development. Evidence for evolution also is provided by the discoveries of modern biochemistry and molecular biology. The code used to translate nucleotide sequences into amino acids is essentially the same in all organisms. Moreover, proteins in all organisms are invariably composed of the same set of 20 amino acids. The unity of composition and function is a powerful argument in favor of the common descent of the most diverse organisms. The evidence for evolution from molecular biology is overwhelming and is growing quickly.

Many of the most important advances in paleontology over the past century relate to the evolutionary history of humans. Not one but many connecting links have been found as fossils. They document the time and rate at which primate and human evolution occurred. Scientists have unearthed thousands of fossil specimens representing members of the human family. Most of these specimens have been well dated, often by means of radiometric techniques. They reveal a well-branched tree, parts of which trace a general evolutionary sequence leading from ape-like forms to modern humans.

Science and Creationism closes by saying, “The claim that equity demands balanced treatment of evolutionary theory and special creation in science classrooms reflects a misunderstanding of what science is and how it is conducted. Scientific investigators seek to understand natural phenomena by observation and experimentation. Scientific interpretations of facts and the explanations that account for them therefore must be testable by observation and experimentation.

“Creationism, intelligent design, and other claims of supernatural intervention in the origin of life or of species are not science because they are not testable by the methods of science…publications [of the advocates of these claims] typically do not offer hypotheses subject to change in the light of new data, new interpretations or demonstration of error. This contrasts with science, where any hypothesis or theory always remains subject to the possibility of rejection or modification in the light of new knowledge. The growing role that science plays in modern life requires that science, and not religion, be taught in science classes.”

 

The Social Contract and Human Rights

June 2000

The true civilization is where every man gives to every other every right he claims for himself.
–Robert G. Ingersoll
Over the years there has been an off-and-on discussion in the pages of the Humanist about the meaning of natural rights and human rights. For example, in an article entitled “Demythologizing Natural Human Rights” in the May/June 1989 issue, Delos B. McKown advanced the view that human rights possess no independent existence; they are mere creatures of law that “are neither immutable nor permanent.”

In a direct response to McKown, Tibor R. Machan wrote the article “Are Human Rights Real?” published in the November/December 1989 issue. Denying McKown’s proposition, Machan insisted that human rights are unalienable and inherent in human nature, concluding that, “without the ‘borders’ that basic human rights define between individuals, people would be able to harm others or rob them of their achievements all too easily.”

Responding to Machan, Anselm Atkins weighed in with his March/April 1990 article “Human Rights Are Cultural Artifacts,” in which he rejected the notion of inherent human rights from the standpoint of evolutionary biology. Atkins argued that “a right is … something furnished, granted to, or bestowed upon someone. It comes from outside–something ‘extra’ to the being.” He then concluded: “Philosophically, the only way to found or establish such a thing as a ‘natural right’ is to presuppose a god who bestows and secures such rights. Absent a god, there can be no natural rights.”

More recently, Fred Edwords, chronicling “The Advance of Human Rights” in the November/December 1998 Humanist, presented evidence that the whole concept of human rights as we know it is an extremely late development in human history–scarcely older than the seventeenth century–and that, even within this context, the idea was “applied in but a few small parts of the globe to a chosen few” until around the middle of the twentieth century.

My own position is that human rights are not legal fictions conferred by governments but are inherent features of our nature as human beings. And while it is clear that our knowledge and understanding of human rights are relatively modern, human rights themselves are as old as humanity.

All societies have rules or laws and require their members to obey them for the peace and good order of that society. In this regard, philosopher John Rawls assumes in his book A Theory of Justice that a society is defined by its rules. He writes that a society is “a more or less self-sufficient association of persons who in their relations to one another recognize certain rules of conduct as binding and who for the most part act in accordance with them.” But why should any free and independent person consciously and willingly choose to obey any king or chieftain or the laws of a society? To answer this question, we need to understand that there are essentially two sources for the duty to obey such laws. The first is authority; the second is mutual consent.

In Europe, the authoritarian doctrine of the divine right of kings, evolving out of the Middle Ages and continuing into the eighteenth century, asserted that kingly authority was derived from the higher authority of God and therefore could not be called into question by either parliament or people. In many other cultures rulers were seen as gods themselves or as direct descendants of gods. Thus obedience to such figures of authority, usually through obedience to their duly ordained subordinates, was seen as a basic duty.

Mere obedience, however, is not necessarily an ethical act. When obedience is either enforced through conquest or slavery, or is simply the result of blind and unthinking compliance with the law, there is no free, intelligent, and conscious choice involved; there is no consent. To yield to the strong is an act of prudence, not an act of respect for the law. Only when submission to the authority of a society is learned and accepted as a thoughtful, deliberate choice does acceptance of this duty become an ethical act.

That is where the second source of the duty to obey the laws comes from: negotiated consent to be so obligated–a consent mutually given and accepted by all members in the society. As Samuel Johnson observed in his 1766 Letter to Boswell: “Life cannot subsist in society but by reciprocal concessions.”

This concept of the mutual consent of the governed became the basis for the denunciation of the divine right of kings at the dawn of the European Enlightenment. It was first enunciated philosophically by John Locke in 1690 in his Two Treatises on Civil Government, in which he also developed his theory of self-government and the social contract. He wrote:

Men being, as has been said, by nature all free, equal and independent, no one can be put out of his estate and subjected to the political power of another without his own consent, which is done by agreeing with other men, to join and unite into a community for their comfortable, safe, and peaceable living, one amongst another, in a secure enjoyment of their properties, and a greater security against any that are not of it…. When any number of men have so consented to make one community or government, they are thereby presently incorporated, and make one body politic.

After that, majority rule prevails.

Locke’s theory of self-government and the social contract became the philosophical basis that moved Western civilization from authority to agreement as the basis of the civic duty to obey society’s rules. It constituted one of the greatest paradigm shifts in history.

In our time, John Rawls has transformed the classic conception of the social contract from the great myth of Western political thought into a parable–a story or thought experiment used to analyze an abstract concept or explain a moral or ethical process. In Rawls’ scenario, we imagine a gathering of human beings who have been stripped of their accidental characteristics: sex, age, race, nationality or tribe, social status, wealth or poverty, good health or disability. They are left with only the essential characteristics of their human nature.

Thus each is an animal that is a predator with needs and appetites for food, clothing, and shelter. Each needs and wants a mate and territory. And each has basic instincts to protect oneself and one’s offspring. Furthermore, each is rational, able to think at a very abstract and symbolic level, and each has the power to remember that action or inaction has consequences and that planning is possible. These humans can make free choices about what is in their own self-interest–and they understand their enlightened self-interest sometimes values long-term goals over short-term satisfactions. Above all, they are social animals that know how to cooperate with each other.

What has been described here is what the law refers to as “the reasonable person.” Each of these human beings becomes that hypothetical or abstract person who will act reasonably under any circumstance.

We then add to this condition the situation that these reasonable people will now come together and make rules for the commonweal, and they will do so behind “the veil of ignorance”–that is, they are without knowledge of who or what they will become when they return to society. They are ignorant of what sex, age, or race they will be; what nationality or tribe they will belong to; what social status or wealth they will possess; what state of health or disability they will find themselves in. They are thus unaware of how the rules they make will affect each of them.

Under these circumstances, Rawls argues, these reasonable people, completely equal in bargaining power and absolutely impartial, will make rules that are both reasonable and just–that is, they will make rules that burden and benefit each person equally. This becomes the ideal social contract.

But how does this relate to real life, where no such gathering of equal and impartial reasonable people has ever taken place and where no such social contract has actually been drafted? It is here that we must turn to the concept of an implied or tacit consent to such a social contract.

Of course, such an implied or tacit consent to a social contract by most citizens or subjects would be difficult to establish. How then can people generally be obligated to obey laws based on social contract theory? Our consent is an assumed consent since it is taken as granted or true that every reasonable person in a state of perfect equality and absolute impartiality, if asked, would give such consent. (In real life, where consent is actually withheld from time to time, it is assumed that the conditions of rationality, equality, or impartiality are imperfect.) Therefore every member of a given society is automatically bound by the social contract, since every member’s consent is assumed and required. It is this universally assumed consent to the social contract that constitutes the general basis for political duty.

However, assumed consent is not actual consent; it is consent that is imputed to each person as a member of society. It is, therefore, not an ethical act. Only when people explicitly acknowledge and accept the duties imposed by the social contract, with knowledge and forethought, do they perform an ethical act. And it is such explicit consent to the duties of the social contract that internalizes a person’s obligation to obey the law.

Basic duties are natural duties since they arise from our nature as human beings. However, these natural duties are not perfected until we form ourselves into social groups, since duties are relationships. For example, the duty not to kill each other becomes a duty only with the formation of the social contract. Before that, it is an inchoate duty. Basic or natural duties are the substantive and necessary provisions of the social contract, but not all the duties in our society are basic. How, then, do we discover which among our many duties are basic? We use the reasonable person test: a duty is a basic duty and a substantive provision of the social contract if reasonable people with equal bargaining power and no knowledge of how the duty will affect them will unanimously agree to it everywhere and at all times.

Let’s take an example. Our process of rational analysis concludes that, when reasonable people gather in a state of perfect equality and absolute impartiality to negotiate the basic rules for a peaceful and just society, the first subject must be war or peace. By definition, there must be a mutual agreement (or law) not to kill (or injure) each other. The agreement not to kill each other is the condition precedent to a peaceful society. Since we begin with the presumption that a reasonable person is motivated by rational self-interest and the basic instinct to survive and therefore desires a peaceful society, every reasonable person will mutually agree not to kill any other member of that society. Moreover, this agreement must be unanimous, since it is essentially the outcome of a disarmament negotiation. Consider the possibility that there is one holdout to the accord. No other party to the proposed compact would surrender weapons unless and until all others in the group have laid their weapons on the table. So the consent must be unanimous and the duty imposed by the agreement universal. Using the reasonable person test, the same analysis can be made of every basic duty that we are obligated to respect. These will be very few.

So at last we can define what we mean by the social contract. It is the compilation of all our basic or natural duties. The social contract is that fundamental compact that consists of the rules imposing basic duties, assigning rights, and distributing the benefits of political, social, and economic cooperation, unanimously agreed to by reasonable people in a state of perfect equality and absolute impartiality. This contract is not the result of a historical event; it is the result of rational and legal analysis and hypothesis. The reasonable person test asks: would reasonable people agree to this or that duty? Would their agreement be unanimous–cross-cultural, cross-generational? The answers are usually given by lawyers, judges, politicians, philosophers, professors, and sometimes by popular vote. While the assembly of reasonable people is hypothetical and their deliberations behind “the veil of ignorance” a parable, the social contract that results from this rational analysis is real. It is the fundamental compact that is assumed to exist in every society.

When governments are formed and laws are made, the social contract becomes positive law–the laws of a particular society. It is similar to an oral contract becoming a written agreement. However, positive law must conform to the agreements of the social contract if it is to be just. Basic natural duties necessarily imposed by the social contract must continue under the laws of every society and government. Organic documents or constitutions must respect basic duties of the social contract because, as we shall see, it is these basic natural duties that give rise to natural or human rights.

What is a right? A right is one side of a relationship; your right is the duty of another. What is a human right? A human right is a relationship arising from our nature as human beings that entitles an individual to certain conduct from all others. It is a contractual right flowing from the social contract that imposes upon all others the necessary and universal duty to act or refrain from acting in a certain way. A human right, however, should not be confused with a possession, like an apple or a house. Nor should it be equated with a human power, like the power to think or see or live. Rather, a human right is a relationship between an individual and all others that entitles a person to certain conduct from every other person and from society. You have the power of life, but the right to your life is created when all others promise not to kill you.

Human rights, or natural rights, are the flip side of the natural duties of the social contract. They are the quid pro quo of the social contract. Human rights are the benefits negotiated by our theoretical reasonable persons and received by each of them as a result of their agreement to accept the natural duties imposed by the social contract. Human rights are the consideration for the obligations assumed under that fundamental agreement.

Recall that when parties enter into a contract each becomes obligated to the other and each reciprocally acquires a right to what is promised by the other. So when we say that you have a right to life, we mean that there is a corresponding duty imposed upon all other persons in our society, and upon the society itself, not to kill you. Therefore each person within that society is entitled to the enforcement of these rights by the government against offending members of that society and against an abusive government itself, not only on behalf of the society as a whole but on behalf of each victimized individual member.

When the authority of a lawgiver, such as God or a king, is made the basis of an obligation to obey good laws, the benefits to society as a whole can be seen but individual rights are not clearly defined. However, when mutual consent emerges as the basis for such an obligation, and when self-government of an adult society becomes a reality, then the existence of individual human rights is revealed quite clearly.

Human or natural rights are only those that arise from the acceptance of natural duties–no more and no less. So to discover a new human right we must first discover a new natural duty, necessary to a peaceful and just society, general in its application, and accepted by consensus. There can be no human right without the acceptance of a corresponding natural duty.

It is equally true that where human rights are abridged or the benefits of social cooperation are denied, the willingness to observe the basic duties of the society is diminished. In fact, the denial or abridgement of human rights constitutes a breach of the social contract. It is no accident that those in a society who perceive themselves as underprivileged rebel and commit crimes against those perceived as privileged.

Human rights are universal since the reciprocal basic natural duties established by the social contract are general in their application to all people and at all times. However, the manifestation of these rights and duties will vary from civilization to civilization, since the degree of knowledge and understanding of these duties and rights will vary and the expression of these duties and rights will be exhibited according to each society’s history and culture and the sense of justice of different people. Nonetheless, the underlying principles are the same everywhere and at all times. Human rights are foreign to no culture and native to all nations, and it is the universality of human rights that gives them their strength.

We assert that these rights are unalienable–that is, they cannot be taken away or even abridged. Therefore no ethical government can deny these basic rights to its citizens since the people don’t receive them from government. Basic rights precede the formation of government, and it is the duty of government to preserve, protect, and defend these rights equally for all its citizens. Moreover, government has the obligation to protect the rights of visitors, travelers, and resident aliens within its jurisdiction and to respect the human rights of the entire human family. Offensive war is immoral and unethical; defensive war is tolerated only when reasonably necessary for the self-defense of one’s own country or another innocent country victimized by aggressive war.

Human rights are indivisible and interdependent. Everything obviously depends upon the right to one’s life; however, the right to one’s life is inadequate if a person is enslaved or falsely imprisoned. And to be free is a cruel sham if one lives on the edge of starvation. Human rights need to be enjoyed in their entirety, as an indivisible and interdependent whole, in order that people may truly live the good life as human beings in a peaceful and just world.

In the United States, constitutional rights are those rights found in the federal Constitution. But not all constitutional rights are human rights and not all human rights are spelled out in the Constitution. Specifically, First Amendment rights and the prohibition against slavery are both human rights and constitutional rights. However, most of the other constitutional rights are procedural devices designed to enhance equality and the franchise and to protect life, liberty, and property. For example, habeas corpus and the right to a trial by a jury of one’s peers are not human rights; they are procedural safeguards for human rights.

Clearly, then, there are very few human rights–life, liberty, and the pursuit of happiness–but they are basic to our ability to live as human beings. Basic to all is the right to one’s life. The right to liberty includes liberty of the body–that is, freedom from slavery and false imprisonment–and liberty of the mind–that is, freedom of conscience, in spoken and written communications, and in association with others. The concept of ordered liberty adds the right to marry and to raise a family and educate one’s children according to one’s best lights. The pursuit of happiness includes the right to acquire and own property and the right to a minimum standard of living. In recent years, the right to privacy has been added to the short list of human rights.

How do we resolve the apparent conflict between unalienable rights and government by the consent of the governed? Self-government ultimately boils down to government by the majority of those voting. The issue is whether human rights can be abolished or abridged by a majority in Congress, a majority vote in a public referendum, or a supermajority through the process of amending the Constitution.

The philosophy, history, and Supreme Court decisions of the United States have consistently held that human rights–including First Amendment rights–are not subject to a majority vote. Unalienable means unalienable. As Ralph Ketcham states in The Anti-Federalist Papers and the Constitutional Convention Debates: American political thought and experience after 1776 in fact highlighted a tension built into the Declaration of Independence which proclaimed in one clause that certain rights were “unalienable,” and in another that “Governments … derive their just powers from the consent of the governed.” Rights to life, liberty, and the pursuit of happiness were not to be submitted to a vote or to depend on the outcome of elections; that is, not even the consent of the governed could legitimately abridge them. But it was nonetheless possible that the people, through their elected representatives, might sanction laws violating “unalienable” rights. Suppose legislatures, state or national, passed laws abridging freedom of the press, or violating liberty of conscience, or permitting default on contracts, as happened in the 1780s. Which principle had priority, that of “consent” or that of “unalienable rights”? Unless it could be assured that all, or at least a majority of the people would always protect “unalienable rights,” which few thought likely, the American Revolutionists seemed committed to propositions not always compatible. The Federal Constitution of 1787 was one effort to contain the tension, and the debate over its ratification often revolved around whether the framers had properly adjusted the balance of the two principles. Virtually all the members of the Federal Convention, and both sides in the ratification struggle, sought to fulfill the purposes of the Declaration of Independence to both protect rights and insure government by consent.
The key differences arose over which purpose to emphasize and what mechanisms of government best assured some fulfillment of each.

There is no power in our government–be it Congress acting through a majority of both houses and with the consent of the president, the Supreme Court acting through a majority of its justices, a plebiscite of the whole people, or even a supermajority acting to amend the Constitution–that can abolish or abridge human rights; they remain even if denied. The natural duties of the social contract are ethically binding upon our federal and state governments, and the human rights flowing therefrom cannot be taken away in whole or in part.

However, Congress and the courts do have the power to define those rights–as in the right to life and capital punishment–and to describe the outer boundaries of those rights–as with the exercise of free speech and the prohibition against defamation–and to balance one right against another–as in the case of freedom of public assembly and the needs of public safety. And as our knowledge and understanding of human rights develop, we can improve or expand the scope of existing rights–as with the application of free speech rights to the Internet–and identify or define new human rights–as with the right to privacy.

So we see that universal human rights are real. They are derived from our biological nature as social animals and the logical principle of reciprocity as applied by reasonable people through a theoretical social contract. Rights, of course, imply duties, and those duties fall as much upon governments as individuals. So rights cannot be abolished by governments or even by democratic majorities; they can only be recognized.

Governments therefore become just when they enforce the basic natural duties and protect the human rights flowing therefrom that constitute the social contract. And individuals become ethical when they freely acknowledge and affirm obedience to these basic duties as a personal obligation and give their informed consent to respect and honor the human rights of all other human beings.

–Robert Grant
is a practicing attorney and former judge living in New Rochelle, NY.

 

From The Devil and Secular Humanism:
The Children of the Enlightenment

by Howard Radest, 1990

February 2000

 

(Over a page of notes and footnotes was intentionally omitted from this version of this article.)

’Tis but thy name that is my enemy;
Thou art thyself, though not a Montague.
What’s Montague? It is nor hand, nor foot,
Nor arm, nor face, nor any other part
Belonging to a man. O, be some other name!
What’s in a name? That which we call a rose
By any other word would smell as sweet;
So Romeo would, were he not Romeo call’d,
Retain that dear perfection which he owes
Without that title. Romeo, doff thy name,
And for thy name, which is no part of thee,
Take all myself.

In 1933, the Humanists who joined in Manifesto I set out to reconstruct faith in a modern world. Without apology, they described their enterprise as “religious humanism.” In 1980, a number of Humanists led by Paul Kurtz issued A Secular Humanist Declaration and explicitly rejected the idea of a “religious” Humanism. They accused those who retained the adjective of intellectual confusion, sentimentality, and even opportunism. The 1980 Declaration identified religion with:

The reappearance of dogmatic authoritarian religions; fundamentalist, literalist, and doctrinaire Christianity; a rapidly growing and uncompromising Moslem clericalism in the Middle East and Asia; the re-assertion of orthodox authority by the Roman Catholic papal hierarchy; nationalistic religious Judaism, and the reversion to obscurantist religions in Asia.

Religion was the enemy and Humanist flirtation with it ensured confusion at best and surrender at worst. Clearly, the climate of the Humanist neighborhood had changed. The style of attack was reminiscent of the pamphleteering spirit that had animated the Enlightenment. The secularist broadsides had a familiar ring. Echoes of the “philosophe” could be heard and nineteenth-century battles over atheism and agnosticism were again replayed. Sadly, however, the views that had animated the attacks of earlier centuries now seemed only trite. The polemic and the anger were, however, addressed to the enemy within. Humanism seemed intent on destroying itself.

The 1980s found Humanists–or at least many of them–as antagonistic toward their fellow Humanists as to Fundamentalists and right-wing Christians. The terms of the internal quarrel were not new but the tone of disdain was. I recall that in the 1950s, the question, “are we religious,” would also evoke debate in Humanist circles. I recall, too, that efforts to distinguish Ethical Culture from the American Humanist Association on one side and from Unitarianism on the other circled around the “religious” issue and the “God” issue. For Ethical Culture, the Humanists were just too “secular,” while the Unitarians were just too “pious.” In turn, Unitarians and Humanists found Ethical Culture too straitlaced in its ethicism and just out-of-date in its neo-Kantianism. But these were arguments with a certain friendliness of spirit; by 1980 that seemed to be gone.

The assurance with which the authors and signers of Manifesto I had taken to the task of religious reconstruction was unsurprising. In the late nineteenth century, religion on the Left in America had developed a moralistic tone and center. The pulpit addressed itself to social criticism as much as it did to salvation. Its efforts were often to be found in the secular world, and its energies were devoted to social reform. As biblical scholarship, the “higher criticism,” and archaeology revealed the mundane sources of cult and text, and as science held sway not just in the academy but in the marketplace, the need to bring religion into the modern world was felt by many in church and synagogue and not just by secular critics. At the same time, ordinary life came to be focused on this world and its demands. To be sure, the sacred was given its due with typical American piety in the patriotic rhetoric of “God and Country.” In the twentieth century, religion was assigned to a Sunday “ghetto,” to the occasional “revival” meeting, or to the rhetoric of a political campaign. By contrast, the new immigrants and ethnic minorities still held on to their religion as a defense against the assaults of the new world. But, they too were pushed and were pushing toward Americanization, toward assimilation and toward secularization. All of this invited the reconstruction of faith from the left and reformulation from the Right. The “old time religion” really would not do.

This cultural pattern of secularization was an appropriate home for the appearance of a self-conscious and organized Humanism. Much of the stimulus for its emergence came from the Western Unitarian Conference, informal successor to the Free Religious Association of the nineteenth century. As Edwin Wilson recalled:

Religious Humanism as a movement had no one source, but it first came to self-awareness as a movement among Unitarians. In 1917 at a meeting of the Western Unitarian Conference at Des Moines, Iowa, the Reverend John Dietrich and the Reverend Curtis W. Reese compared notes. They decided that what Reese had been presenting as a “revolution in religion: from theocracy to humanism, from autocracy to democracy” was precisely what Dietrich was preaching at Minneapolis. In a sense, the Humanist movement, as such, was born at that moment.

Of course, a Humanist point of view did not go unchallenged in Unitarian circles, and two other ministers, Drs. George R. Dodson and William Lawrence Sullivan, argued that the issue for the denomination was between “the God-men and the No-God men.”

Another stimulus to organized Humanism, albeit not without controversy either, came from within Ethical Culture:

Felix Adler… was himself scornful of naturalism as a basis for ethics and religion. Though he invited humanists into membership…he made it clear that they did not yet share the full “religious” vision which he identified with the transcendental or “supersensible” to distinguish it from crude supernaturalism. He did not knowingly admit the humanist or non-religious members into positions of leadership. …The news that two of the professional leaders, V. T. Thayer, Director of the Ethical Culture Schools, and Frank Swift, a young Associate in Philadelphia, had signed the Humanist Manifesto of 1933 was kept from Dr. Adler in his final illness.

In the academy, the third source of modern Humanism, the argument appeared on philosophic grounds, the issue of naturalism, and on institutional grounds, the proper role of scholarship. Led by John Dewey, the academy was challenged to put its ideas to work, to avoid mere academicism. We might even think of it as a controversy between an older Humanism and a new one. The former held itself aloof from the world of action, harking back to an aestheticism and a putative notion of scholarly purity, of art for art’s sake, of truth for truth’s sake. For this Humanism, the humanities and humanistic study were sufficient. The latter took its cue from the Baconian notion that “knowledge is power.” Interpreting modern science as “organized inquiry” and “inquiry” caught in the realities of activity, it insisted on the political and social basis of ideas as well as on the utility of ideas for politics and society. In schooling, this controversy showed itself as the argument between the “old education” and the “new”–as Dewey called it; and the “new” flew the banner of “learning by doing.” In politics, it was to appear in the mobilization of scholars as policy advisors as in Franklin Roosevelt’s “brain trust.”

Humanism continued to be the object of attack from “neo-orthodox” and traditional religious points of view. But it was also shaped by the fact that modern Humanism itself became a matter of controversy within its own neighborhood. Among the symptoms was the appearance–after the end of World War II and repeatedly since–of Humanist departures and Humanist fragments. The American Humanist Association was organized in 1941 to bring together Unitarian ministers who could not turn to their own denomination, Ethical Culture leaders who could not overcome the neo-Kantian idealism of their founder, and academics who sought a place to locate their philosophic commitments. Efforts were made to arrive at common projects with other Humanists but these were few and, with two exceptions–joint activity on behalf of the separation of church and state, and the Conference on Science and Democracy (1944-1945)–relatively minor. Two decades later, the American Humanist Association was caught up in an internal leadership struggle. The Fellowship of Religious Humanists was organized in 1963 “by a group of liberal religious leaders, mainly Unitarians and Ethical Culturalists, and is principally concerned with the practice and philosophy of Humanism as a religion.” In 1968, a Society for Humanistic Judaism was established in Birmingham, Michigan by Rabbi Sherwin Wine. He moved toward an explicit Humanism while not departing from a secular Jewish point of view. In 1981, following the publication of A Secular Humanist Declaration, the Committee for Democratic and Secular Humanism was organized by Paul Kurtz. Meanwhile, within Unitarian Universalism and within Ethical Culture, the Humanist strain grew or faltered depending on the leadership and the climate of the moment. Despairing of ever uniting these disparate organizations that seemed to appear with increasing frequency, a North American Committee for Humanism was established in 1980 to bring individual Humanists together. 
This was met with suspicion as yet another fragment, another competition. Meanwhile, rationalism, free thought, and atheism went their separate ways. Implicitly or explicitly, each of these fragments claimed to represent the best, or the most adequate, or the most comprehensive of Humanism.

I confess that the vicissitudes of these organizational ventures are not really of any great interest in themselves. There was little originality in each “new” platform and each “new” effort only revealed a familiar pattern and told a familiar story. But the fragmented and even sectarian development of Humanist organizations since 1933 can be used to trace the struggle of Humanism with its own ideas. The organizations, while often the result of the temperamental and idiosyncratic Humanism of individuals or reflective of particular histories, also serve as markers of Humanist efforts at self-definition. They offer clues to the evolving meanings attaching to modern Humanism. Whereas the nineteenth century witnessed the struggle of Humanism to appear, the twentieth century witnessed the struggle of Humanism to know itself.

The message of these organizational ventures is that modern Humanism does not exist yet. The checkered career of Humanist efforts to state and restate themselves in organization, program, and language since the 1933 Manifesto are symptoms of that fact. Indeed, the arguments between “god men and no-god men,” between philosophic naturalists and philosophic idealists, between activists and contemplatives, between socialists and libertarians and above all between religionists and secularists remain points of polarization within the Humanist neighborhood. At a distance, many of these points seem increasingly less worth the divisions they encourage. Yet they are real enough to the protagonists. Symptomatic of these unresolved issues was Manifesto II published in 1973. It was signed by Humanists from nearly all points of the Humanist compass. At the same time, gone was the clarity, directness, and assuredness of the 1933 document. Manifesto II is a long and puzzling essay, giving with one hand and taking away with the other. In its discussion of religion, for example, it says:

FIRST: In the best sense, religion may inspire dedication to the highest ethical ideals. The cultivation of moral devotion and creative imagination is an expression of genuine “spiritual” experience and aspiration.

We believe, however, that traditional dogmatic or authoritarian religions that place revelation, God, ritual, or creed above human needs and experience do a disservice to the human species….

Some Humanists believe we should reinterpret traditional religions and reinvest them with meanings appropriate to the current situation. Such redefinitions, however, often perpetuate old dependencies and escapisms; they easily become obscurantist, impeding the free use of intellect.

In speaking of science it notes:

The controlled use of scientific methods, which have transformed the natural and social sciences since the Renaissance, must be extended further in the solution of human problems. But reason must be tempered by humility…Nor is there any guarantee that all problems can be solved or all questions answered.

Perhaps most revealing of all is the following from the Manifesto’s introduction:

Many kinds of humanism exist in the contemporary world. The varieties and emphases of naturalistic humanism include “scientific,” “ethical,” “democratic,” “religious,” and “Marxist” humanism. Free thought, atheism, agnosticism, skepticism, deism, rationalism, ethical culture, and liberal religion all claim to be heir to the humanist tradition.

I do not want to overstate the differences, although within the Humanist neighborhood an exaggerated importance attaches to them. On all sides, there is agreement on the values of rationality, on the moral responsibility of human beings, and on the importance of living socially and in the present. On all sides, there is agreement on freedom of conscience and the urgency of free inquiry. On all sides, there is agreement on the moral and political priority of democracy. On all sides, there is a commitment to nurture human capabilities for good and an essential hopefulness about human beings.

At the same time, these agreements often mask deeply felt disagreements. The proponents of democracy separate into libertarians and social democrats, and the confidence in human potentiality founders on issues of practical policy, of how to give political and social reality to that potentiality. The commitment to human responsibility divides in the argument about the appropriate role for Humanists and for Humanist organizations in social action. Indeed, Humanists seem to rehearse in their own terms the same kinds of quarrels that have divided churches and fragmented political parties. I might be tempted to leave it that Humanists have turned out to be human, after all. But that does not solve the problem: what, really, is Humanism up to?

It seems to me that the fragmentation, even sectarianism, that has emerged in Humanism since 1933 is only partly explained by the differing sources from which Humanists and Humanist organizations came. Instead, fragmentation is a consequence of the quarrel over faith, and in particular, of the way in which that quarrel has been framed in the argument with Fundamentalist Christians. Given to dogmatic assertion, they invite equally dogmatic contradiction. And given that they, often for their own opportunistic reasons, insist that Secular Humanism itself is a religion, it is understandable that they evoke denials that it is a “religion.” But Fundamentalism is only the current occasion for Humanist argument. Were the quarrel over the nature of faith resolved, a quarrel that is really about the nature and function of Humanism itself, the other differences would vanish in a constructive diversity.

At one level, the religious argument is really only over words. For example, the leading proponent of “secularism,” Paul Kurtz, has struggled painfully with the issue and has even gone so far as to coin the term eupraxophy to describe Humanism:

If humanism is not a religion, what is it? Unfortunately, there is no word in the English language adequate to describe it fully…. Accordingly, I think we will have to coin a new term in order to distinguish nontheistic beliefs and practices from other systems of beliefs and practices, a term that could be used in many languages. The best approach is to combine Greek roots. I have come up with the term eupraxophy which means “good practical wisdom.”

I sympathize with Kurtz’s impatience and I understand his concern over the confusions of religious language and the political uses to which those confusions are put. In our world, a religious temperament prevails that, in its current anger, is often viciously anti-intellectual and anti-democratic. The identification of religion with Fundamentalist dogmatism and anger tends to monopolize public consciousness and compromises all others who would use the term. Indeed, liberal and centrist religions like mainstream Protestantism and Reform Judaism have moved to the right in response to this Fundamentalist climate. Like Paul Tillich, who once called for a moratorium on “God language,” it might be worthwhile to call for a moratorium on “religion language.” At least the dust might settle and we could all get on to more substantive matters.

At the same time, there is a historical and intellectual truthfulness in the effort at religious reconstruction that was evident in Manifesto I. It recognized that “religious” values were among the persistent features of human experience everywhere. In seeking to capture the point, Dewey remarked:

It is pertinent to note that the unification of the self through the ceaseless flux of what it does, suffers, and achieves cannot be attained in terms of itself. The self is always directed toward something beyond and so its own unification depends upon the idea of the integration of the shifting scenes of the world into that imaginative totality we call the Universe.

Paul Kurtz, himself a naturalist, is not unaware of the needs of human experience. When he shared the draft of his text on “eupraxophy” with me, I wrote in reply:

I think the matter (of religion) is a “non-issue” on the evidence of your own text. Thus, when you describe what humanism should be up to, ie: a method of inquiry, a cosmic world view, a life stance, and a set of social values (p. 13ff), you’re talking about what others call “religion.” Furthermore, when you talk about “humanist centers” or other institutional forms, you’re really describing Ethical Culture Societies, Unitarian Fellowships, etc. I don’t think you’ve developed a new form but only have given a new name to an existing one.

But the argument is not simply over words and the quarrels are not merely semantic. Although we might be successful at inventing new vocabularies as Kurtz and others have tried to do, we would still face the question of modern Humanism’s lack of coherence, and its deterioration into polar positions since 1933. This lack of coherence might find a more hopeful resolution were polarization over religion settled. These days, sadly, we avoid working on the question, “what is Humanism up to,” and instead play a game of “either/or.” All of us are infected by the features of right-wing politicized religion, here and abroad.

Our thinking is distorted by the fact that we love to choose sides. Humanists, more than most, are given to an argumentative game by temperament and by history. Often, however, what begins as an intellectual exercise takes on a life of its own and drives us toward separations that were unimagined when the argument began. I have seen this happen repeatedly, and never more than in the past decade. We lose ourselves in the joys of argument and forget that it is only argument. The game of either/or itself becomes our reality. So it is with many of the polarities that afflict Humanism. In the heat of argument it is easy to turn “faith” into a caricature of itself and then to identify all faith with superstition. When such a mood seizes us, we embrace its complement, a simple-minded secularism that denies any value to a move beyond the immediate. In saner moments, we know that experience is too rich with possibilities to be reduced to abusive labeling and that we are ill served by the mentality of the arena. Yet it is all too human to invest ourselves in our arguments and then to be unable to retreat. Losing the argument comes to feel like a loss of self.

I do not mean to trivialize what occurs as a result of debate although its origins and issues are often trivial. The consequences of argument appear in realities of relationship. We come to like, associate with, and support some, whereas we reject others. These separations are then often reflected in our conduct so that the next debate is not only about words but about these realities too. Our arguments become weapons of internal warfare rather than tools of understanding and social criticism. We lose sight of the problem. Fundamentalist religious movements here and abroad have succeeded in constricting freedom, influencing public policy, and corrupting education. These political realities cannot be ignored and there is a legitimate need for a responsive Humanist politics. This would seem to require unity rather than fragmentation, but that is the opposite of what happens.

At the same time, we are stubborn and contrary. We convince ourselves of the correctness of our own views and proceed to act on that conviction. So I ponder the Humanist adjectives that have emerged in the less than sixty years that have passed since Manifesto I–scientific, naturalistic, religious, evolutionary, Marxist, existential, secular, rationalist, and ethical. It is, I confess, ironic, perhaps even sadly comic. Humanism set out to be inclusive. Its method, from classic times to ours, has been dialogic, the effort to catch the partial truths on all sides and to erect transcending truths that would move us beyond the present encounter. Yet today, Humanism retreats into secularism and surrenders to its own Fundamentalist temptation. It allows the non-Humanist to set the terms, the style, even the content of the argument. It thereby becomes ineffective and loses itself.

I think that we are given to the game of “either/or” precisely because the ambiguities of experience have become nearly intolerable. The authors of Manifesto I could speak with confidence about the world to come. They had not yet seen science perverted into holocaust and nuclear destruction. They had not yet watched democracy turn into populist conformism. To be sure, they warned of these possibilities, but these warnings seemed merely conceptual. Given the events of the decades since 1933, it is understandable that Humanist confidence should be eroded and that Humanism should lose its way. In the midst of chaos, it is much more satisfying to separate into sheep and goats, saved and damned. To confess the truthfulness and the humanity in the other is never easy; to admit the falsehood and inhumanity in ourselves is even more difficult. But then that is the permanent difficulty of all human relationships. Today, as we lose our moorings, that difficulty nears impossibility.

Humanists, like all other human beings, are caught in the terrors of our age and have difficulty holding onto the genius of their position. Like everyone else, they tend to revert to a mythic past where matters were simpler, clearer, more assured. So it is that when Humanism meets Fundamentalism, it responds in Fundamentalist style with a “raucous Humanism.” The world we live in seems to justify the Humanist in his or her defensive aggression. Surely the twentieth century has taught us through the horrors of genocide and the possibilities of global destruction that we are not God’s special creatures. It is an affront to be told that these horrors are, after all, just punishment for evil or unknowable features of a divine plan. Too much has happened for us to be beguiled any longer by promises of eternal elevation. As we grow increasingly more sensitive to other natural beings, indeed to nature itself, we also learn how arrogant “speciesism” is, whether advanced by the story of creation or by the “religion of humanity.” So the pretensions and pretentiousness of traditions that single us out as “lords of creation” stir us to a noisy Humanism as if we could shout down the enemy. But the noise deafens us too and blocks the effort to reconsider the place of human beings in a reconceived nature.

This game of either/or, of absolute meeting absolute, encourages other instances of Humanism’s loss of itself, like meeting the foolishness of “creation science” by polemicizing evolution theory. We play out, once again, the post-Darwinian battle. Or else, we argue God and No-God, moving as if choreographed for and against the arguments from “design,” “first cause,” “final cause,” and so on. We are quite comfortable with these moves, we know them in advance and know that the outcome is predictable. They are, in fact, an indulgence and an escape. Neither side convinces the other, can convince the other, or expects to convince the other. As Corliss Lamont commented recently, “I’m bored with it.” Wisdom, then, would search for ways to move beyond the battle…but wisdom is surrendered to the joys and protections of battle itself.

Yet something serious is at stake. To be sure, the stories of creation and the promises of providence are poor physics, poor biology, and poor history. Humanism cannot, however, simply dismiss the matter by patronizing diagnosis and thereby betray itself by resigning most of humanity to superstition. If human beings mistakenly “people the darkness beyond the stars with harps and habitations,” as Robinson Jeffers put it, a Humanist must ask why and find a better response to the unspoken question for which spirits, demons, saints, and devils are answers. In other words, as the world grows incomprehensibly large and as we learn that it simply does not pay attention to us, we need all the more to attend to questions of meaning and security in experience.

These questions have not yet been answered. The notions of Enlightenment will not do–they addressed a more manageable world. The angers of Fundamentalism and the confusion of sects confess to a widely shared anxiety of spirit. In that sense, both Fundamentalism and raucous Humanism are only symptomatic, and the game of either/or attends only to the symptoms. When we are lost we shout more and more loudly in panic; we mask our desperation with busyness; we seek out a villain. Thus, within the debates that produce a noisy humanism is hidden the question: How shall human life be purposeful and joyful in a universe where human life seems only a chemical and biological incident?

A Humanist can cite the evidence that shows we are indeed living in a secular culture. The powers of the Gods are invisible to nearly all of us, not just to Humanists. They grow increasingly more invisible as time passes. We conduct ourselves as if the Gods, even if they existed, were indifferent. The believer–with the exception of those few who separate themselves from the world–reveals that he or she does not seriously hold to the notions of judgment and resurrection. Eternity is denied in practice no matter how loudly proclaimed in rhetoric. The game of either/or invites the Humanist to take pleasure in the fact, while the Fundamentalist rages against the rule of Satan. The Humanist proclaims that we are already living in the “humanist century” as the authors of Manifesto II put it, echoing the optimism of their eighteenth-century ancestors. That great numbers of people do not hear or respond to that proclamation should give the Humanist pause…but it does not.

Reacting against the animation of nature with mysterious deities–there is indeed a renewed spiritualism, a confidence in magic, even a so-called “new age” philosophy–a prosaic Humanism joins a raucous Humanism and calls on the “facts” to witness the falsehood of its enemies. The Humanist forgets, however, that “facts” do not convince and that their interpretation, their meaning, is always at issue. “Facts” need their stories and it is stories that have vanished. Certainly it is important to expose the foolishness of astrology or the trickery of religious charlatans. This has merit as a type of social mental health. But, despite repeated exposure, the followers of astrology persist and the charlatans continue to find their victims. Argument does not work because it does not reach to the depths that move us toward gullibility. At the same time, the naive empiricism that sometimes afflicts Humanism–a consequence of playing the debating game–leads it to an aesthetically impoverished and psychologically inadequate outcome. Thus, Humanism fails to address the depths, and the resort to argument becomes a double defeat.

Nowhere is the need for psychological and aesthetic adequacy more evident than in the utterly personal fact of death and dying. Here Humanists are better in their practices than in their theories–responsive in their relationships and ceremonies while still narrowly rationalistic in their arguments. Of course, Humanism does not, cannot, promise immortality, but the issue is not about immortality although the debate pretends that it is. The believer weeps the same tears the Humanist does, feels the same losses, and the same regrets. Humanist and non-Humanist alike know that their lives do not play out fittingly with a beginning, middle, and end. We are interrupted, repeatedly interrupted, and finally interrupted by death. For our experience, the game of denials on all sides is simply inadequate, the debate pointless…and I may add, the promises of tradition not only unbelievable but irrelevant. The hidden question is again a question of meaning and security: how shall we live with the consciousness–and not just the fact–of mortality? As Harold Blackham put it once:

The loved detail of a landscape is annihilated by distance, but one can return and find it. There is no return in time but what was once somewhere had no less reality than what is elsewhere…. By the criterion of eventual oblivion, there are no distinctions nor standards, no virtues nor values nor joys nor sorrows: nothing is. This is the true nihilism, to take oblivion as the measure of all things because oblivion is the destiny of all things.

To accept and respect the temporal condition of all things is the beginning of wisdom….To appeal against the temporal terms of the human condition, the ephemeral character of our life, to aspire to an eternal unconditioned existence is not really to look for salvation, for it is to reject and forfeit life. This earnest refusal of life is the profoundest thoughtlessness, the tragic misunderstanding not merely of the terms of human existence but mainly of its very character, what there is there to love and care for, and how it is as it is.

Not only does the debating game force Humanism to respond raucously, noisily, and prosaically, but it leads Humanism to continue to repeat naive views of reason, science, and progress as if the mere repetition would overwhelm the opposition. To be sure, I understand the need to deny the claims of the supernatural; I too find those claims not merely unbelievable but degrading. After all, we are told by the supernaturalist that another reality is necessary for the intelligibility and worthiness of this one. The world that is my home thus becomes the object of a sneer, as it were, even a cosmic sneer: an extra-natural invasion of nature is needed if my life is to have any meaning. We are told that we must lose the world in order to gain it…and so on.

The rationalist has little difficulty in demonstrating the contradictions of this extra-naturalism. However, in his or her anxiety to win the battle, the rationalist ignores the absurd features of existence, the non-rational, the intuitive, and the responsive features of our experience, the contradictions and false starts, that should prevent us from attributing a rationalist’s structure to nature itself. Science, which is where Humanism embeds rationalism, is not merely reason working itself out in some Hegelian world-historical drama. It is a skeptic’s enterprise, but it is a poet’s enterprise as well, a fascinating scene of intuitions, guesses, and inventions. A reasonable Humanism, even a scientific Humanism, is not merely a rationalist’s Humanism. It understands that we meet the world long before we assemble it in art and science. We meet mysteries, always new mysteries, in a present encounter–just as we think we have dismissed the old one. Of course, that is not the same as elevating the encounter to a meeting with “the mysterious,” as the believer would have it. But it does not permit us to deny mysteries as if sooner or later the world will become entirely transparent.

Just as the game of either/or tempts reason toward rationalism and science toward scientism, so too it tempts hope to foolishness. We all hear the promises of salvation. We know enough of sadness, pain, and disappointment to want to believe that somehow, somewhere, it all fits together and that what was sadness, pain, and disappointment had some meaning and purpose. Thus Humanism once secularized salvation with the notion of progress. Our Promethean energies would make things right later, if not now. Ironically, with this belief in inevitable progress, Humanist freedom surrendered to destiny as presented in Comte’s positivism or the paradoxes of Marxist determinism, although it was a good destiny, for history was on our side. A selective reading of events, a history that even posed as scientific, outfitted this sentiment for salvation with evidence.

But false promises turn out cynics on all sides. Just as salvation demands blind belief, so progress cannot survive an honest reading of events. The game of either/or does not, however, permit the Humanist to confess the inadequacy of the Enlightenment’s idea of progress, nor to reconstruct it. Were it possible to escape the playing field, we might acquire a sense of tragedy, a certain humility, and finally reach a notion of progress as a setting for future action and not as a description of past achievement. Thus, Jean-Paul Sartre from within his Humanism imagines us leaning forward into time not yet:

But there is another meaning to humanism. Fundamentally, it is this: man is constantly outside of himself; in projecting himself, in losing himself outside of himself, he makes for man’s existing; and on the other hand, it is by pursuing transcendent goals that he is able to exist; man being this state of passing-beyond, and seizing upon things only as they bear upon this passing-beyond, is at the heart, at the center of this passing-beyond…. This connection between transcendency as a constituent element of man … and subjectivity in the sense that man is not closed in on himself but is always present in a human universe is what we call existentialist humanism.

There are many instances where Humanism has been betrayed by its compulsion to fight its enemies with inappropriate weapons; I have named but a few of them. The game of either/or, wherever we find it, induces a recurring pattern of simplification confronting simplification, absolute confronting absolute. As in all games, there are winners and losers, points to be scored, and cheers to be heard. Sadly, the game of either/or is played most viciously when faith is the playing field…and this is not surprising. For whatever the point of view, faith, unlike politics, business, or sports, addresses itself to those deepest issues of human experience, issues of life and death and meaning. It is these issues, not the name given to them, that stir the passions and call for attention. For Humanism, which is neither simple nor absolute, the game of either/or forces a loss of integrity, a loss of its own character.

The argument over whether or not Humanism is religious or secular needs to be reconceived. Perhaps there is some wisdom, given today’s environment, to minimizing religious description and language. We might avoid the worst dangers of the game of either/or. Humanism might then illustrate the virtues of dialogue in a world of partisans. But dialogue is not merely toleration–the final temptation of the game of either/or. It is almost inevitable that those Humanists who find the noisiness of their fellows an affront suppress criticism for the sake of peaceableness, confuse courtesy with clarity, and dialogue with the exchange of opinions. Dialogue, however, transcends opinions in the continuous renewal of knowledge and meaning. That is the genius of the sciences that are dialogues between persons mediated by events and that offer reliability through the constructive uses of uncertainty. In place of the game of either/or, Humanism, in its commitment to the sciences, intended to substitute an inquirer’s biography for the “man of faith.” For Humanism, discoveries, reasonings, and arguments were always in the process of acceptance, rejection, and transformation. Moments of organization were indeed found in experience, but they were moments. The universe was not organized once and forever.

By contrast, the religious climate today is indeed sectarian and absolutist. Diversity is taken as a sign of sin. Humanism, if it could avoid the Fundamentalist temptation, might preach an appreciation of diversity from within its own genius for inclusiveness. It would not then simply take its identity from its opposition as seems to be the current fashion. Ironically, it is the secular Humanist who is most likely to enter the lists of religious warfare as a protagonist and often with relish. While always denying it, he or she still fights the religious/anti-religious war–often confusing it with the clerical/anti-clerical war–on the same ideological, even theological, territory as his or her opponent.

Humanism is worldly and secular. The qualities of experience to which Humanism must address itself, however, are those that have legitimately been called religious. The authors of Manifesto I knew this very well and knew the need for a reconstruction of faith. Since then the religious question has not been faced adequately in Humanist terms–in secular terms. Here, the game of either/or blocks the reconstruction of the terms of its faith–progress and hope–by driving Humanism to a mere echo of its past or to inane simplifications. Not the least of these simplifications may be found in the confusions surrounding the notion of the “secular” itself. In one sense, the term only describes a location. For example, “secular priests” in the Roman Catholic Church exercise their vocation in the world. In another sense, the “secular” is contrasted or even opposed to the “sacred.” Thus, St. Augustine’s City of God and City of Man, or Jesus’ advice to “render unto Caesar the things that are Caesar’s and unto God the things that are God’s.” But neither of these meanings conveys the intention of secularity for Humanism. It is where the action is, all of the action, including that which has historically been religious action. For the Humanist, the “sacred,” the name given to that which is untouchably precious, departs from its separate universe to inform this one, the only one we have. Thus both sacred and secular are transformed under the aegis of a Humanist naturalism.

The story of twentieth-century Humanism in the period after Manifesto I is a story of departing from that notion of the interpenetration of sacred and secular in the natural world, and is instead a story of attack and defense. This contrasts starkly with the revolutionary excitement of the Enlightenment which enshrined its secular saints in its own pantheon. It contrasts as well with the intellectual and cultural excitement of the nineteenth century when Emerson could embed the transcendent within experience and nature, and Ingersoll could call for a “secular religion.” It contrasts finally with the philosophic confidence of that religious naturalism that inspired Manifesto I.

Some of the differences within Humanism may be traced to differences of origin, for example, as Unitarian Humanism arose within a Christian framework or as Ethical Culture arose within a Reform Jewish context. Each of these, as we have seen, experienced an internal lack of clarity at the outset. By and large, their legacy of controversy over Humanism has been muted. It is regarded as a legitimate possibility in Unitarian-Universalist circles even by non-Humanists, and naturalism, if not Humanism, has replaced neo-Kantian idealism in Ethical Culture. In other words, Humanist fragmentation can no longer be attributed to its pluralist sources. To be sure, varieties still show themselves in differences of organization, practice, and language. Yet, important as these are, they no longer in themselves require fragmentation. Still, Humanism is not yet. This arises from the fact that the game of either/or and not the accidents of history blocks the reconstruction the signers of Manifesto I proposed.

 

Discussion Group Report

Dealing With Superstition

December 2000

By Richard Layton

Carl Sagan’s book, The Demon-Haunted World, is an outstanding work that describes practically every known form of superstitious belief in today’s world and tells us how to distinguish between authentic science on the one hand and pseudoscience and anti-science on the other. These latter have become a favorite lure used by present-day peddlers of superstition.

Sagan asks the crucial question, “If we teach only the findings and products of science–no matter how useful and even inspiring they may be–without communicating its critical method, how can the average person possibly distinguish science from pseudoscience? Both are then presented as unsupported assertion.”

He points out the importance of democracy, with its attendant freedom of expression and separation of powers, for the advancement of science. Thomas Jefferson, himself a scientist, explained, “In every government on earth is some trace of human weakness, some germ of corruption and degeneracy, which cunning will discover and wickedness insensibly open, cultivate and improve. Every government degenerates when trusted to the rulers of the people alone. The people themselves therefore are its only safe depositories. And to render even them safe, their minds must be improved…”

Sagan says part of the duty of citizenship is not to be intimidated into conformity. He advocates that the oath of citizenship taken by recent immigrants and the pledge that students recite include something like, “I promise to question everything my leaders tell me,” and “I promise to use my critical faculties. I promise to develop my independence of thought. I promise to educate myself so I can make my own judgments.” The pledge, he says, should be directed at the Constitution and the Bill of Rights rather than to the flag and the nation.

The founders of our nation, he adds, were well-educated products of the European Enlightenment and students of history. At that time there were only 2.5 million American citizens. Today there are 100 times more. If there were 10 people of Jefferson’s caliber then, there ought to be 10 x 100 = 1,000 Tom Jeffersons today. Where are they?

Most of us are for freedom of expression when there’s a danger our own views will be suppressed. We’re not all that upset, though, when views we despise encounter a little censorship here and there. The system founded by Jefferson, Madison, and their colleagues offers means of expression to those who do not understand its origins and wish to replace it by something very different. Jefferson proffered, “If a nation expects to be both ignorant and free in a state of civilization, it expects what never was and never will be.” He continued, “A society that will trade a little liberty for a little order will lose both and deserve neither.”

Science, with its delicate mix of openness and skepticism, and its encouragement of diversity and debate, is a prerequisite for continuing the delicate experiment of freedom in an industrial and highly technological society. The Bill of Rights de-coupled religion from the state, in part because so many religions were steeped in an absolutist frame of mind–each convinced that it alone had a monopoly on the truth and therefore eager to impose this truth on others. The Framers of the Bill of Rights had before them the example of England, where the ecclesiastical crime of heresy and the secular crime of treason had become nearly indistinguishable.

Sagan concludes, “Education on the value of free speech and the other freedoms reserved by the Bill of Rights, about what happens when you don’t have them, and about how to exercise and protect them, should be an essential prerequisite for being an American citizen–or indeed a citizen of any nation, the more so to the degree that such rights remain unprotected. If we can’t think for ourselves, if we’re unwilling to question authority, then we’re just putty in the hands of those in power. But if the citizens are educated and form their own opinions, then those in power work for us. In every country we should be teaching our children the scientific method and the reasons for a Bill of Rights. With it comes a certain decency, humility, and community spirit. In the demon-haunted world that we inhabit by virtue of being human, this may be all that stands between us and the enveloping darkness.”

Carl Sagan died 29 December 1996. His role as a voice of reason, a researcher, a defender of the scientific method, a skeptic, a storyteller, and an inspiration is greatly missed.

 

Render Unto Caesar

May 2000

Two distinguished authors have recently published books concerning the deterioration of morality in our nation, and both suggest the need for dramatic changes in individual attitudes to improve our personal and community relations. They express a great deal of agreement as they write about their observations regarding human greed, alienation, dishonesty, and lack of civility, but the solutions they offer exemplify the basic differences between humanism and religion.

Robert Grant, humanist lawyer and former judge, author of American Ethics and the Virtuous Citizen, views the human being as an evolved rational animal who has developed the capacity to think, to reason and to discover truths. Gordon B. Hinckley, LDS Church president, author of Standing for Something, views human beings as creations of God who gave them the capacity to think, to reason and to accept revealed truths.

Grant cites the writings of secular philosophers who say the source of power is people who give governments limited authority to impose rules. Hinckley cites God as the source of power with humans having the obligation to obey divine commandments.

Grant concludes the virtuous citizen will choose the right course of action out of enlightened self-interest and concern for the common good. Hinckley says the virtuous citizen will choose the right course out of reverence for God. Both authors agree that humans must take responsibility for their decisions and accept the consequences of their actions. Both agree that civility is a core value of a virtuous society. After reading both books and meditating on the contrasting solutions to our nation’s moral dilemmas, I have a greater appreciation for the wisdom our nation’s founders displayed in creating a democratic republic rather than a theocratic republic.

–Flo Wineriter

 

Are Souls Real?

Book Review

November 2004

Are Souls Real? by Jerome W. Elbert, Ph.D., publisher Prometheus Books.

There are three excellent reasons for putting this book on your must-read list. First, the author presents a readable survey of the ancient religious myths modified and adopted by the founders of the Christian religion. Second, he is a distinguished scientist who explains basic science understandably for those of us with very little scientific knowledge. Third, he rewards the reader with a good understanding of why individual identity ends with death.

The author was a physics research professor at the University of Utah for 25 years.

–Flo Wineriter

 

The Mormon Hierarchy: Extensions of Power
By D. Michael Quinn

December 2000

Dr. Quinn’s book is a remarkable accomplishment. For a brief time, in the ’70s and ’80s, the historical office of the LDS church allowed some objective, professional examination of its records. Quinn brings us some of the fruits of that time. This is not “faith-promoting” history–Deseret Book and Bookcraft have taken care of that–but shows the Brethren in all their human glory. Some reviewers have indicated that this volume has not threatened their LDS testimonies but only confirmed what they already knew: that church leaders are human and fallible. Other reviewers may be threatened by this realization, although many past presidents have pointed it out. The marketing of the infallibility of church leaders continues, perhaps because it gives comfort to those church members who are intolerant of ambiguity, but also because toadying is often rewarded in organizations.

Extensions of Power is actually several books. It is topically arranged to consider more or less controversial aspects of the church leadership-violence, involvement in politics, etc. It also includes, as the earlier companion volume did, hundreds of pages of notes and a detailed chronology of church activities from 1848 to 1996. We are afforded a glimpse into the complex personalities, power factions, and challenges of maintaining, growing and adapting a religious movement to a constantly changing and evolving U. S. and world culture. I was by turns frustrated with church leadership and empathetic with them in their struggle to understand and accommodate ‘the world’ without losing their unique identity.

I was also able to see how present problems have their roots in the past, and the futile efforts of those leaders–such as Gordon B. Hinckley and Boyd K. Packer–who would like to bury the past. Mormonism is a religion that was established and grew during historical, literate times, and leaders and members must come to terms with the difficulties of their history. Despite Correlation committees, Strengthening the Members Committees, and million-dollar public relations and marketing campaigns, and particularly since the advent of the internet, historical problems will not go away. For the questioning believer or the student of religions and U.S. history, Dr. Quinn’s book is a very useful tool in understanding how the present Mormon church came to be.

–Richard Garrard

 

Three Book Reviews

September 2000

Three new books will add interesting information to any humanist conversation. The first two are fast reads; the details in the third one will require more time and concentration.

Papal Sin by Garry Wills summarizes the blatant historical abuses of power by Catholic popes–nepotism, murders, and wars of conquest–but concentrates on the historical distortions and evasions of the modern papacy. The Pulitzer Prize-winning author is an adjunct professor of history at Northwestern University.

The Battle for God by Karen Armstrong details how and why the fundamentalists of Christianity, Judaism and Islam came into existence and what they yearn to accomplish. She examines the way these movements arose through the common fear of modernity, the dominance of secular values around the world. Fundamentalists have no tolerance for democracy, pluralism, free speech or the separation of church and state. The author took her degree at Oxford and is one of the foremost commentators on religious affairs in both Britain and the United States.

From Dawn to Decadence by Jacques Barzun details the 500 years of Western cultural life since 1500, from the Renaissance and Reformation through the Enlightenment to the present. The hours you spend reading this book will reward you with a clearer understanding of the great division between Modernity and Post-Modernism. The 93-year-old author was Seth Low Professor of History at Columbia, Dean of Faculties and Provost, and twice president of the American Academy of Arts and Letters.

–Flo Wineriter

 

Where Do We Go From Here?

June 2000

I have been feeling a little frustrated recently and I’m appealing to you for some inspiration and guidance. Our membership numbers seem to have reached a plateau and I would like to see the growth renewed. Two areas are important to the health of our chapter:

  • Offering programs and activities that make the chapter meaningful to members.
  • Creating public relations activities that attract the interest and attention of new people.

To give you an example, during the past two years I’ve had the opportunity to speak to several hundred students in classes at the University of Utah. The student evaluation papers regarding my presentation indicate that about 95% of them had never heard of humanism before I spoke to them. The most frequent comment regarding humanism was surprise that people who do not fear godly judgment can have such a positive attitude about life and high moral values. A few of the students have requested more information or attended our meetings, but most of them are now aware of humanism, and many respect our viewpoints as legitimate even though they disagree.

I would appreciate your comments on how we may expand public knowledge of humanism and your thoughts on programs and activities that would enhance your personal involvement in our chapter. No comment or suggestion is too ridiculous to consider.

–Flo Wineriter

 

Wouldn’t An Atheist Or Humanist Cheat If There Was No One Watching To Hold Them Accountable?

March 2000

Michael Medved, a talk show host, posed this question on his show. He assumed that if there were no omnipotent, omnipresent, score-keeping God, we would cheat on our spouse or neighbor if we thought we could get away without penalty. He informed his audience that he went to Yale Law School with Bill Clinton. Consequently, it seemed ironic that he would ask the question, because no matter what one thinks of Bill Clinton as president, his God beliefs did not appear to inhibit his many infidelities. Nor did they stop John Kennedy.

One could construct an experiment to discern whether a God believer, an atheist, or a humanist would be more likely to cheat. A clerk at a grocery store could pretend to not know that she had returned too much change to each of these categories of believers and then ascertain who would be most likely to return the misbegotten change. The results could then be compared.

The evidence we have from history doesn’t suggest that the rewards of heaven or the fear of hell have been very effective social controls. The institution holding the largest percentage of God believers is our prison system. Children in parochial schools cheat more often than children in public schools. Religious crusades and ethnic cleansing have been the handiwork of dogmatic believers as well as dogmatic nonbelievers.

Ethical humanists internalize their values. They do the right thing because it befits their character. People of character honor their commitments, keep their promises, treat others fairly, and do service for their community because it is consistent with their valuing themselves. Being honorable has its own rewards: people in the community respect honorable people, and honorable people serve as good role models for their children. On the other hand, people who make public pronouncements about their self-righteousness are often suspect. Jim Bakker, Jimmy Swaggart, and Newt Gingrich are examples of public hypocrites who use religion as a cover.

We can leave it to others to determine whether people are more likely not to cheat because it is inconsistent with their character or because an invisible superpower may punish them after they die.

–Gordon Gamm
reprinted from the January/February 2000 edition of
The Colorado Humanist.

 

God Bless You, Dr. Kevorkian
by Kurt Vonnegut

Book Review

June 2000

Kurt Vonnegut, honorary president of the AHA and my own favorite author, has done it again. He has published a short book where he visits with 21 dead people. He is able to accomplish this feat with the help of Dr. Jack Kevorkian and the staff at the state-of-the-art lethal injection execution facility at Huntsville, Texas. They transport Kurt down the “blue tunnel to the Pearly Gates” where he meets with the past luminaries for a short chat. Of course he doesn’t actually pass through the Gates, because once through, there can be no return.

Of interest to humanists is the foreword of the book: Vonnegut notes that he is a humanist who believes in neither heaven nor hell. He also offers several short definitions of humanism: “I have tried to behave decently without any expectation of rewards or punishment after I’m dead.” Again, “‘humanist’ is nothing more supernatural than a handy synonym for ‘good citizenship and common decency.'” And “humanists, having received no credible information about any sort of God, are content to serve as well as they can, the only abstraction with which they have some familiarity: their communities.”

The vignettes take the form of a reporter for a New York public radio station who visits with historical people including: Clarence Darrow, John Brown, Adolf Hitler, Isaac Newton, James Earl Ray, William Shakespeare, Isaac Asimov, Kilgore Trout (who isn’t actually dead yet, but then he has lived only in Vonnegut’s pages), and others. The reports were designed to fit 90-second interludes on WNYC.

The 79-page book is a joyous treat: humorous and thought-provoking prose from one of the 20th century’s greatest fiction writers. I encourage you to get a copy and enjoy it!

–Wayne Wilson

 

The Elegant Universe

Book Review

July 2000

One of Albert Einstein’s greatest hopes was to find a unified theory to explain the ultimate constituent particles and forces of the universe. Ironically, much of his work in describing gravity was at odds with what was known about electromagnetism. When I studied physics as part of my university coursework, the lessons were divided into two separate and mutually exclusive sections. For the grandiose macro-universe one used Einstein’s theories. For the micro world of the atom, a whole new set of formulae–quantum mechanics–had to be learned.

Since my days in a formal education setting many new particles beyond electrons, protons, and neutrons have been predicted and discovered. Now there are muons, neutrinos, quarks (with interesting names like up, down, charm, strange, top, and bottom), etc.

Brian Greene, in his book The Elegant Universe: Superstrings, Hidden Dimensions, and the Quest for the Ultimate Theory (Vintage Books, 1999), argues that physicists may be close to a TOE–the Theory of Everything.

Most of this book was written for the lay person with an interest in science but without a strong mathematical background. A few chapters are devoted to those better equipped for the math, but they are clearly marked and can be skipped without losing the important concepts in the book.

Superstring theory not only resolves the differences between relativity and quantum mechanics, it requires both lines of thinking. It is also the first framework to bring gravity into a quantum theory.

If you are like me and are interested in a refresher on physical theory I think that you will find this book fascinating.

–Wayne Wilson

 

Discussion Group Report

Are Mormons Creationists?

June 2000

By Richard Layton

There are many versions of creationists, points out Duane E. Jeffery, BYU professor of zoology, in an April 1985 Sunstone magazine article, “Are Mormons Creationists?” In common American non-LDS usage, the term “creationists” refers to “persons of very ‘fundamental’ Christian persuasion who have banded together to promulgate certain views pertaining to the origin of the universe, earth, man, and so on. These include the tenets that God is omniscient, sovereign, absolute, and omnipotent; that he created all time, space, and matter instantaneously and out of nothing (ex nihilo) roughly 6,000-10,000 years ago. From such matter (dust), he then molded a body for man and created Eve from a rib thereof…over a period of six literal 24-hour days and …spoke, and things came instantaneously into existence, fully developed and functioning.” He is responsible to no power or laws other than his own and works by supernatural processes. Natural laws are seen as ungodly, the results of sin and wickedness. “Such concepts,” says Jeffery, “are demonstrably foreign to the philosophical underpinnings of Mormon theology.” He does not believe that Mormons should jump on the bandwagon with creationists, since there are important theological differences between Mormonism and creationism.

Mormon prophet Brigham Young responded directly to creationist concepts: “When you tell me that father Adam was made as we make adobies [sic] from the earth, you tell me what I deem an idle tale…There is no such thing in all the eternities where the Gods dwell.” Other statements from top Mormon authorities are: John A. Widtsoe, General Authority: “The statement that man was made from the dust of the earth is merely figurative…Likewise the statement that God breathed into man the breath of life is figurative.” Church President Spencer W. Kimball: “The story of the rib, of course, is figurative.” President Joseph F. Smith: “The Church itself has no philosophy about the modus operandi employed by the Lord in His creation of the world, and much of the talk therefore about the philosophy of Mormonism is altogether misleading.” The First Presidency in 1860 and the First Presidency and the Quorum of the Twelve in 1865 denounced Apostle Orson Pratt’s views on this but declined to establish any Church view for exactly what method the “Creator” had employed. In 1909 the First Presidency published a treatise entitled “The Origin of Man,” arguing that man’s spirit derives from divine parentage but paying little attention to the origin of man’s body. Curious LDS readers were answered in the church magazine, The Era, that the Lord had not revealed his methods. Readers were given three possibilities to consider: divinely directed evolution, transplantation from another sphere, or “born here in mortality, as other mortals have been.” None of these agrees with the creationists.

Although various views have been expressed on this subject by apostles and presidents of the Church, and some of these views have conflicted with each other, through it all the First Presidency, which includes the president of the church, who is considered by Mormons to be the only one who can receive divine revelation for the whole church, has made it clear that the church as yet possesses no precise revealed information on how man’s body was produced by God. In 1931 they ruled against continued discussion of the topic, silencing a running debate on the matter: “Our mission is to bear the message of the restored gospel to the people of the world. Leave Geology, Biology, Archaeology and Anthropology, no one of which has to do with the salvation of the souls of mankind, to scientific research, while we magnify our calling in the realm of the Church.” More recently President Kimball echoed such sentiments: “We don’t know exactly how their [man’s and woman’s] coming into this world happened, and when we’re able to understand it the Lord will tell us.”

Finally, although Jeffery could not have known this when he wrote his article, just a few months ago the Salt Lake Tribune reported that the LDS Church public relations department had told it that the church supports the teaching of evolution in the public schools.

However, LDS spokesmen have overwhelmingly agreed that Adam and Eve were historical people, but that their bodies were produced by some sort of biological procreation. Jeffery states, “This latter idea is thoroughly repugnant to modern creationists and serves to underscore my final point: that beyond generalities, Mormonism and modern creationism are completely incompatible on issues relating to the origin of man. For Mormons it seems clear: believing in creation does not make one a creationist. Indeed Mormons would have to reject their entire philosophical framework to become such. This conclusion becomes even more vivid when one examines concepts of the nature of God, of physical law, and of ex nihilo creation.”

 

Discussion Group Report

Are Human Rights Inherent in Our Nature?

July 2000

By Richard Layton

“Human Rights are not legal fictions conferred by governments but are inherent features of our nature as human beings,” argues Robert Grant in his article, “The Social Contract and Human Rights,” in the January-February issue of The Humanist.

He says all societies have rules or laws and require their members to obey them for the peace and good order of that society. He cites John Rawls’ assumption in his book, A Theory of Justice, that a society is defined by its rules. There are two sources for the duty to obey such laws: authority and mutual consent. In Europe, until the eighteenth century, the authoritarian doctrine of the divine right of kings asserted that kingly authority was derived from the higher authority of God and therefore could not be called into question by either parliament or people. In many other cultures rulers were seen as gods themselves or as direct descendants of gods. Obedience to such figures of authority or to their duly ordained subordinates was seen as a basic duty. But mere obedience is not an ethical act. When it is enforced through conquest or slavery, or is simply the result of blind and unthinking compliance with the law, there is no free, intelligent, and conscious choice involved; there is no consent. “To yield to the strong is an act of prudence, not of respect for the law,” asserts Grant. “Only when submission to the authority of a society is learned and accepted as a thoughtful, deliberate choice does acceptance of this duty become an ethical act.”

John Locke’s concept of the mutual consent of the governed as the basis of the social contract, enunciated in 1690, moved Western civilization from authority to agreement as the basis of civic duty to obey society’s rules. It was one of the greatest paradigm shifts in human history.

In our time John Rawls has transformed the conception of the social contract into a parable. In his scenario we imagine a gathering of human beings who have been stripped of their accidental characteristics: sex, age, race, nationality or tribe, social status, wealth or poverty, good health or disability. They are left with only the essential characteristics of their human nature. These humans can make free choices about what is in their own self-interest–and they understand that their enlightened self-interest values long-term goals over short-term satisfactions. They are social animals that know how to cooperate with each other. Each of these human beings becomes what the law refers to as “the reasonable person,” a hypothetical or abstract person who will act reasonably under any circumstance. These people come together and make rules for the commonwealth behind the “veil of ignorance”–that is, without knowledge of who or what they will become when they return to society. Being completely equal in bargaining power and absolutely impartial, they will make rules that are both reasonable and just–that is, that burden and benefit each person equally. This becomes the ideal social contract.

In real life no such ideal gathering of people has ever taken place and no such ideal social contract has been drafted. We must then turn to the concept of an assumed consent, which takes it as granted or true that every reasonable person in a state of perfect equality and impartiality, if asked, would give such consent to the contract. Therefore, every member of a society is automatically bound by the social contract. Only when people explicitly acknowledge and accept the duties imposed by the social contract, with knowledge and forethought, do they perform an ethical act. Such explicit consent internalizes a person’s obligation to obey the law.

Basic duties are natural duties, since they arise from our nature as human beings. However, these natural duties are not perfected until we form ourselves into social groups, since duties are relationships. For example, the duty not to kill each other becomes a duty only with the formation of a social contract. In a disarmament negotiation, no party to the proposed compact would surrender weapons unless and until all others in the group had laid their arms on the table. So the consent must be unanimous and the duty imposed universal. Using the reasonable person test, the same analysis can be made of every basic duty. These will be very few.

“The social compact,” says Grant, “is the fundamental compact that consists of the rules imposing basic duties, assigning rights, and distributing the benefits of political, social, and economic cooperation, unanimously agreed to by reasonable people in a state of perfect equality and absolute impartiality.” It is the fundamental compact that is assumed to exist in every society.

A right is one side of a relationship; your right is the duty of another. A human right is a relationship arising from our nature as human beings that entitles an individual to certain conduct from another. It is a contractual right flowing from the social contract. A human right is not to be confused with a possession, like an apple or a house. Nor should it be equated with a human power, like the power to think or see or live. It is a relationship between an individual and all others that entitles a person to certain conduct from every other person and from society. Human or natural rights are only those that arise from the acceptance of natural duties. The denial or abridgement of human rights constitutes a breach of the social contract. These rights are universal, unalienable-they cannot be taken away or even abridged-indivisible and interdependent. There are very few human rights-life, liberty, and the pursuit of happiness-but they are basic to our ability to live as human beings.

Self-government ultimately boils down to government by the majority of those voting, except that human rights are not subject to majority vote. Unalienable means unalienable. “Governments become just,” says Grant, “when they enforce the basic natural duties and protect the human rights flowing therefrom that constitute the social contract. And individuals become ethical when they freely acknowledge and affirm obedience to these basic duties as a personal obligation and give their informed consent to respect and honor the human rights of all other human beings.”

 

The Nature of Scientific Inquiry

August 2000

In the November-December 1999 issue of The World magazine, UUA President John Buehrens wrote eloquently about spirituality and science. His treatment of the topic fits nicely with my theme. Dr. Buehrens’ conclusions are quite wide-reaching. Citing physicist Freeman Dyson, President Buehrens states: “…a God who is not self involved or fearful but creative and therefore always giving away being and power. A God who is not static but growing and changing, who is hurt or given joy by what we do or leave undone in our relations with others. Dyson speaks of a God inherent in the Universe and growing in power and knowledge as the universe unfolds.” In this citation, Buehrens presents a number of concepts: 1) Creative God, 2) Generous God, 3) Growing God, 4) Unfolding of universe, 5) God is the Universe: weighty theological innovations arising from science and spirituality. How has a simple discipline of measuring triangles and dissecting frogs reached such lofty mythological levels?

We see in President Buehrens’ philosophy a tendency common to most humans: to create abstract concepts such as justice, freedom, love, spirituality, and now science, and to animate them, like antique gods with arrows, swords, or balances in hand. This completes the move of such concepts into the realm of mythology. This move changes how we look at science. Read the old mission statement of the Minneapolis Society and you can see the lofty position to which we have elevated science and the scientific method.

Science is in this way often tied to rational thought, giving the impression that if you are a rational being, you must use the scientific method in every possible situation. At that point, it becomes quite unclear whether science is the preferred universal tool for a rational being or whether science is just a synonym for rationality. If scientific inquiry–the operational definition of science–is just a tool, it becomes very difficult to define when it’s appropriate to use it as a label. If a Swedish scientist claims to have constructed the best possible mattress, has he used our tool? Has science, as a tool, been used to produce a BMW 500? Is astrology a product of science as a tool? It certainly is logical in its arguments and based on observations.

On the other hand, is science not a universal tool but only another name for rationality? If we agree that this is the case, then the most primitive aborigines practice science, because their behavior within their surroundings is quite rational.

It is clear from the above reasoning that it is not useful either to consider scientific inquiry as a universal tool or science as a synonym for rationality.

Looking at the problem, maybe we could justify the use of science in a nearly mythological context by considering scientific inquiry as the golden path to truth. The role of scientific inquiry is primarily the establishment of truth, but we need only read the scientific sections in our newspapers to realize that science can be used to define as true many incompatible statements. One day red wine is good for your health. True! The next day, alcohol use may lead to liver disease. True! No wonder that some lost school board in Kansas has declared creation to be a true scientific theory. Following the mythological interpretation of science in President Buehrens’ article, I see no problem in declaring the story of Winnie the Pooh a theory of small bears.

Uncertain and vague statements about science thus seem to be due to an unnecessary broadening of the definition of scientific inquiry. Let us try to examine its functioning and try to redefine the meaning of the words scientific inquiry.

We have to start from the beginning and try to find the goals of scientific inquiry and the foundational assumptions on which such inquiry is based. You can find the goals quite easily. Read a number of textbooks in physics, astronomy, chemistry, biology, and psychology, and it becomes immediately clear that the goal is always the same:

A precise description of the external world, with us as a part of it–a description in terms of observations made using our senses.

Scientific vocabulary does not contain statements like, “good for you.” Thus, stories about the usefulness of red wine or the dangers of alcohol have nothing to do with science. Nor is the mattress design by Swedes classifiable as scientific inquiry. If the goal of scientific inquiry is to paint a picture, make a model of the physical world, then the two premises that such inquiry is based on become immediately evident.

  1. There is an external world common to us all–a world existing independent of our observing it. Thus, if the human race were obliterated by an intergalactic construction company, the record players booming out Beethoven’s Fifth Symphony would continue to play although there are no more humans to hear it. Observers from other planets would find ruins of lost cities and all our toys as real as they once were to us. The world exists even without us. We will call it the common reality premise.
  2. Events in the external world are related to each other by causal connections. Our observations of them are logically related. We call this the causality premise.

Science, as an art of describing the world around us, cannot function if either of these premises is violated.

We can ask if these two premises are ever challenged. Of course they are. The causality premise is challenged by the Christian dogma of God’s omnipotence and the possibility of his intervention in our time. The beautiful stories about the compatibility of Christianity and science are just stories. As long as it accepts miracles and answered prayers, Christian dogma cannot be compatible with science, because as dogma it negates science’s causality premise.

Let me illustrate with a story: I work for Jimmy the Greek in Las Vegas, and my quite well-paid role is to determine the handicaps for college football teams so that the public can bet on them. I work hard on developing the odds through researching statistics, health reports, previous records, etc. All is going pretty well. We don’t worry that most teams with their coaches pray before the game. Let us now assume that one day these prayers are answered and miracles happen here and there. The college games become unpredictable. Very soon, Jimmy the Greek and I decide to throw in the towel. Miracles do the same thing for scientists. Who can predict weather if in the wink of an eye the Red Sea parts at the order of a local prophet?

The common reality premise has been challenged by many schools of thought, the latest being the French postmodernists and deconstructionists (Foucault, Lyotard). Anytime you hear somebody describing science as a white male power structure and touting the advantages of a female science, you know the common reality premise has been violated. The structure of the world we observe as scientists is observer-neutral and common to us all. You cannot deconstruct it into different pieces depending on who you are. If you could, science would not be possible. Dogmatic religious people at least are honest to the extent that they often bet their fate on the outcome of their prayers. Postmodern thinkers, however, negate the common reality premise only in a verbal sense, in the academic and intellectual world. In their private lives, they act as if this premise were true; they do not doubt that reality for the 10,000 air traffic controllers is precisely the same anywhere in the world, so they can collect their frequent flier miles in safety.

Let us summarize:

Provided common reality exists independently of us and the events observed show reproducible causality, we can proceed to paint a picture of the world. The process of doing so is what we define as scientific inquiry. This inquiry is based solely on past or present observations.

Scientific inquiry first developed after humans accepted the common reality and causality premises and discovered the reproducibility and common basis of observations, maybe 10,000 years ago. We believe that they started with observations of the positions of heavenly bodies and of such natural phenomena as the rise and fall of the level of the Nile.

The premises presented so far are the more abstract part of the process of inquiry. The inquiry itself is an eminently practical process that takes place in seven consecutive steps. We come now to the first three steps, common to all branches of science. The steps follow each other both historically and logically; thus, every new branch of science goes through them sequentially.

  1. Record observations. (The earliest documents were astronomical charts, that is, bones with markings related to the phases of the moon.)
  2. Compare observations. (Develop some kind of quantitative measure, such as length or height, and then develop the concept of a scale: high, higher, highest.) Convert observations to measurements.
  3. Introduce a common reference (a standard for weight, or sea level as a base for height measurements).

At this point, words such as theory, hypothesis, and law, all referring to the later steps of the procedure, have not yet made their appearance.

Do not think that these first three steps are somehow associated only with the past, namely the beginnings of science. Some of the most modern scientific fields have not yet advanced beyond this point. Take, for example, the study of human intelligence. The measure we are all acquainted with is I.Q. We carry out a number of measurements in the form of questions and answers. We time some responses; we record and study a population; we convert the results to some combined quantitative measure: percent, number of correct answers, etc. We study a population and find an average. After that, we declare that deviations above the average are a sign of high intelligence, and deviations below it, of low intelligence. These conclusions have little to do with science; they are a social commentary that may have some validity for some uses. If somebody rapidly solves a puzzle, that person should, according to our scale, be good at making crossword puzzles for the newspaper, which in real life may or may not be true. The big fights about the significance of such measurements as I.Q. are to be expected. The science of human intelligence measurement, if there is such a thing (definable by observations), is still at a very early stage. How does scientific inquiry proceed from observations and compilations? How do we move beyond I.Q.?

  4. Identify and separate variables. This fourth step is the most difficult and arduous of all.

What is a variable? Some measurable property conforming to some reasonable scale on which its variation can be defined. Temperature, the height of things, and weight are good, clear variables. How many variables do we need to describe fully the properties of a gas such as air or oxygen? Volume, pressure, temperature? All three are easy to measure.

Here we see the problem with I.Q. We cannot yet define decent variables that can be measured on some scale. Can solving crossword puzzles be measured? Think about playing Scrabble. What are the variables with which you can define your prowess? Can you think of a formula converting your eyesight, memory, speed, and the number of books you have read into a prediction of your Scrabble score? Of course not! Galileo’s brilliant insight was that he found decent variables for observing falling bodies: weight, time, height. Once the variables are separated and defined, we can proceed to the next step.

  5. Formulate a hypothesis. Consider the game of billiards, where the events can be described by a change in variables. A billiard ball receives an impetus from the cue ball, thereby changing its position and speed and colliding with a second ball. A follows B follows C. The second ball moves along at some defined speed in some defined direction.

I now state a hypothesis: billiard balls, after being hit by the cue ball, move in a direction determined by the angle and momentum provided during the collision. I propose it to be valid for all billiard balls and cues, wherever and by whomever they are hit.

It sounds plausible but needs verification. We and others, here and in Europe, hit cue balls with different power at different angles, and we see that using this hypothesis we can predict the path of the second ball well. After a lot of testing, we move to step six.

  6. Convert the hypothesis to a theory. It is time to declare the formula a theory of billiard balls. The difference between a hypothesis and a theory is the amount and variety of testing that has been done. The theory is still and always open to revision by new observations of the behavior of the two balls. We find that we had forgotten to account for friction on the surface. O.K., an amended theory is put forward. Then a wise guy points out that the spin of the cue ball plays a big role. We amend the theory again. When at long last all variables are accounted for and, in a very large number of trials, the outcome can be predicted every time, we are ready to proceed to the final, seventh step.

  7. Elevate the theory to the status of law, valid for all balls of every material and surface, every spin, air pressure, and humidity. We find, of course, that in this case we are describing Newton’s laws. The fellow with the apple has scooped us.
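
For the computationally minded, the billiard-ball hypothesis can be put in numerical form. The following sketch is an illustration of mine, not part of the address, and it assumes an idealized case the speaker does not spell out: equal masses, no friction or spin, and a perfectly elastic collision. Under those assumptions the struck ball leaves along the line connecting the two centers at impact, taking the component of the cue ball’s momentum along that line. The function name `collide` and the sample numbers are hypothetical.

```python
import math

def collide(cue_velocity, line_of_centers_angle):
    """Idealized equal-mass, frictionless, perfectly elastic collision:
    the object ball takes the component of the cue ball's velocity
    along the line of centers; the cue ball keeps the remainder."""
    vx, vy = cue_velocity
    ux, uy = math.cos(line_of_centers_angle), math.sin(line_of_centers_angle)
    along = vx * ux + vy * uy                # speed transferred to the object ball
    object_v = (along * ux, along * uy)      # object ball moves along the line of centers
    cue_v = (vx - object_v[0], vy - object_v[1])  # momentum is conserved
    return cue_v, object_v

# Cue ball moving along +x at 2 m/s; line of centers 30 degrees off its path.
cue_v, obj_v = collide((2.0, 0.0), math.radians(30))
print("object ball:", obj_v)   # leaves along the 30-degree line
print("cue ball:   ", cue_v)   # deflects perpendicular to the object ball's path
```

Knowing only the angle and the imparted momentum, the second ball’s path is fully predicted, which is exactly what the hypothesis claims and what repeated trials would test.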

These are the seven practical steps that fully describe the process of scientific inquiry: a practical, simple but often arduous task. We can now recognize at which stage of development different popular problems are to be found.

We find that the study of human intelligence is in the pre-hypothesis stage where we are looking for variables. Darwin’s hypothesis of 1859 has become a theory of evolution still being amended. The inheritance of genes is based on Mendel’s research which has since become law (Mendel’s problems were simpler than Darwin’s as the number of variables for a gene is far less than for an organism).

The last question about science we are going to explore briefly is the question of the value of the product of scientific inquiry. We assume that all our theories have become laws. How well is the external world described by the laws derived by the seven-point method?

An immediate response to this question will doubtless be put forth by some erudite listeners. They will point out that I am getting dangerously close to describing a totally determined Universe, which nobody believes in anymore, and that in my lecture I am neglecting indeterminism. Many well-known scientists can be quoted as claiming there is no absolute certainty in science, that science deals only with probabilities.

Some listeners will stretch it somewhat further and claim that the Universe is characterized by the presence of randomness, pointing to quantum mechanical theory as support. However, I am trying, in a very short time and quite crudely, to point out that what we call randomness and probability are factors introduced to account for the inadequacy of the human senses when dealing with a wide variety of observations. There are too many variables for us to observe with the necessary precision, and, owing to the limited nature of our senses, we inevitably influence events in measuring them.

Let us now tackle the concept of probability and its role in scientific inquiry.

We roll dice and keep book, recording all outcomes. It turns out that after a large number of rolls, each of the faces comes up with equal frequency–a very precise number. However, we can never predict what the outcome of an individual roll will be. Despite the best bookkeeping, you cannot find deviations from the results for large numbers. Thousands of gamblers have learned this bitter truth. Thus, there is a very precise outcome for large numbers; we can certainly consider it the law of large numbers. However, there is randomness in the outcome of a single cast. Is this a mystery or a paradox?
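
The bookkeeping is easy to reproduce on a computer. Here is a small sketch of my own (not from the address), assuming a fair six-sided die simulated with Python’s standard random number generator: after many rolls each face’s relative frequency settles very close to 1/6, even though no single roll can be predicted.

```python
import random
from collections import Counter

random.seed(42)  # fix the seed so the "bookkeeping" is reproducible

# Roll a fair die 60,000 times and tally the outcomes.
rolls = [random.randint(1, 6) for _ in range(60_000)]
counts = Counter(rolls)

for face in range(1, 7):
    freq = counts[face] / len(rolls)
    print(f"face {face}: relative frequency {freq:.4f}")  # each lands near 1/6 ≈ 0.1667
```

The precise regularity appears only in the aggregate; inspecting `rolls` one entry at a time reveals no pattern a gambler could exploit.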

Is randomness inherent in the design of the die? Let us put the die flat on the table with the face showing 1 up and administer a push with a very precise machine–a machine that hits the die in the middle, a little above its center of gravity. The die rolls, maybe three times, with the face showing 2 turning up every time. We can repeat this as many times as we want. If we place the die precisely, and the force and direction of the push are constant, the distribution of faces turning up is quite different from the previous rolls, where we used our hand. The face showing 2 turns up in most cases; a few other faces appear when we are sloppy in positioning the die. There is certainly no randomness in the frequency of faces turning up (in stark contrast to the series where we used the hand). Consequently, there is no randomness in the die (the cube itself).

The randomness is evidently in our roll. If we study the human hand casting the die, we discover it is nearly impossible to isolate all the variables that go into the throw: our muscle coordination has an inherent tremor in it. We cannot put forward a theory for the trajectory of the die. This doesn’t mean it is not possible in principle to calculate it. In principle it is the same problem as a chess-playing robot. We have always said that the game is too complex, with too many variables for even the best computer to calculate. Well, Deep Blue is very near to being the best chess player in the world.

What does science do if the calculations are too overwhelming? It observes a number of test throws, draws up a list of outcomes, and determines the frequency with which each of the faces comes up. The relative frequency of a face gives you the probability of that outcome. There is, therefore, no mystery about the frequency or the apparent randomness.
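
In code, this frequency-to-probability step is nothing more than a tally. A sketch of mine, using invented data: given recorded test throws of a die whose fairness we do not know, the relative frequency of each face is our working estimate of its probability.

```python
from collections import Counter

# Hypothetical recorded test throws of a die of unknown fairness.
observed = [1, 3, 3, 6, 3, 2, 3, 5, 3, 4, 3, 3, 1, 3, 6, 3, 2, 3, 3, 5]

tally = Counter(observed)
n = len(observed)

# Relative frequency of each face = estimated probability of that outcome.
probabilities = {face: tally[face] / n for face in range(1, 7)}

for face, p in probabilities.items():
    print(f"P(face {face}) ≈ {p:.2f}")
# Face 3 dominates the tally: the data suggest a loaded die,
# with no appeal to mystery or inherent randomness required.
```

With more recorded throws the estimates sharpen, which is all the dice-rolling bookkeeping above amounts to.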

The most celebrated tables of probability are those for life expectancy. I am sure that soon, with the completion of the human genome project, we will be able to calculate a biological life expectancy for anybody. Still, the tables will do better, because they include such variables as traffic accidents, which are not mirrored in our genes. Thus, frequencies and probabilities reflect our limitations in dealing with very complex systems. They represent a useful approximation that nonetheless has very high predictive power.

What we have done now is to compensate for the inadequacy of our senses in dealing with large numbers, using statistics, which is the science of dealing with large numbers. Again, probabilities and frequencies represent efforts to deal with the inherent limitations of our human senses.

At this point somebody will assert that what I say may be true in the macrocosmos, but that when we go down to atoms, quantum mechanics enshrines both randomness and probability in the very simplest units of matter.

First of all, the story is not as clear as many would like it to be. Many scientists, Einstein among them, refused to believe in inherent randomness and preferred to look at apparent randomness as a product of hidden variables. The problem of uncertainty is closely linked to the effect of measurements. We have to visualize events so that our senses will allow us to make an observation. This is the major limitation of science.

Yes, scientific inquiry will lead to a true picture of the world surrounding us. The picture, however, is never complete and has to be continuously amended when and if new observations are made. One can ask how something can be true if it has to be amended. The answer is that all scientific laws are approximations, and what we mean by amendment is that we have isolated new variables and can work with higher precision than before, so that the picture of the world becomes clearer and shows more details. This does not mean that the previous picture was wrong, only that it was true at that level of detail and precision. One can compare science to photography that works with the same negative but at increasing magnification.

Scientific inquiry leads to a neutral product. Good, bad, useful, harmful are adjectives outside the realm of scientific discourse. Human beings can look at any picture of nature and exclaim, “This is horrible; I do not like it.” Such valuation of neutral facts has nothing to do with science.

To be sure, the behavior of human beings, their beliefs and the motivations for their acts, is itself a field for scientific inquiry. It is, however, at the first, most primitive level. We observe human behavior and try to identify variables that seem to influence it, although we have not even succeeded in finding suitable measures. Sociology, psychology, economics, etc., are sciences of a sort, but sciences at their lowest level of development. The level is not determined by a dearth of hard work or brilliant insight but is simply a function of the complexity of the systems at issue.

The main point remains: we have constructed our present picture of our world entirely on knowledge gathered by use of the seven-point method of scientific inquiry. The extension of science to art, poetry, and religion has not contributed much either to science or to the arena of human emotions and feelings. Finally, the verbal extension of science into the realm of spirituality, as in Buehren’s article, may contribute to literature and poetry, but not to science.

–Andreas Rosenberg

With a Ph.D. and a Doctor of Science from the University of Uppsala, Sweden, Andreas Rosenberg was professor of Laboratory Medicine and Pathology, and of Biochemistry and Biophysics, at the University of Minnesota from 1964 until his retirement in 1999. He is still a consultant at the Laboratory for Diagnostic Allergy in the Department of Laboratory Medicine and Pathology, and he is on the adjunct faculty of the Humanist Institute in New York. This address was delivered last December at the Forum of the First Unitarian Society of Minneapolis.

Published in the 2000 Issue 1 of Occasional Newsletter of the Friends of Religious Humanism

Discussion Group Report

America’s Declining Social Capital

May 2000

By Richard Layton

According to a Roper Report study, the number of Americans who report that “in the past year” they have “attended a public meeting on town or school affairs” fell by more than a third between 1973 and 1993. Similar (or even greater) relative declines are evident in responses to questions about attending a political rally or speech, serving on a committee of some local organization, and working for a political party. By almost every measure, Americans’ direct engagement in politics and government has fallen steadily and sharply over the last generation, despite the fact that average levels of education, the best individual-level predictor of political participation, have risen sharply throughout this period. “Every year over the last decade or two, millions more have withdrawn from the affairs of their communities,” says Robert D. Putnam in his article “Bowling Alone: America’s Declining Social Capital,” published in 1995 for The National Endowment for Democracy by The Johns Hopkins University Press. He says Americans have also disengaged psychologically from politics and government over this era. The proportion of Americans who reply that they trust the government in Washington only “some of the time” or “almost never” rose steadily from 30% in 1966 to 75% in 1992.

Similar reductions have taken place in the numbers of volunteers for mainline civic organizations such as the Boy Scouts (off 60% since 1970) and the Red Cross (off 61% since 1970). Serious volunteering declined by roughly one-sixth between 1974 and 1989, according to the Labor Department’s Current Population Surveys. Major fraternal and civic organizations saw a substantial drop in membership during the 1980s and 1990s. Although America is an astonishingly “churched” society (the U.S. has more houses of worship per capita than any other nation on Earth), religious sentiment seems to be becoming somewhat less tied to institutions and more self-defined. Net participation by Americans in religious services and in church-related groups has declined modestly (perhaps by a sixth). For many years labor unions provided one of the most common organizational affiliations among workers. Since the mid-1950s, the unionized portion of the nonagricultural work force has dropped by more than half. The solidarity of union halls is now mostly a fading memory of aging men. Participation in parent-teacher organizations has declined drastically, from more than 12 million in 1964 to 7 million now. Membership in traditional women’s groups and civic and fraternal organizations has fallen since the mid-1960s. Whimsical yet discomfiting evidence is the fact that more Americans are bowling than ever before, but bowling in organized leagues has plummeted in the last decade or so. These facts evidence a significant decline in “social capital,” a social science concept which refers to features of social organization, such as networks, norms, and social trust, that facilitate coordination and cooperation for mutual benefit.

But perhaps the traditional forms of civic organization have been replaced by vibrant new organizations. There have been dramatic increases in national environmental organizations like the Sierra Club, feminist groups like the National Organization for Women, and the American Association of Retired Persons (now the largest private organization in the world except the Catholic Church). Although these new mass-membership organizations are of great political importance, for the vast majority of their members the only act of membership consists of writing a check for dues or perhaps occasionally reading a newsletter. Few ever attend the meetings of such organizations, and most are unlikely ever knowingly to encounter another member. There is also a growing prominence of nonprofit organizations, especially service agencies like Oxfam, the Metropolitan Museum of Art, the Ford Foundation, and the Mayo Clinic. But it would be a mistake to assume that these necessarily promote social connectedness. There has been a rapid expansion of “support groups,” in which fully 40% of Americans claim to be currently involved on a regular basis. Although such groups unquestionably represent an important form of social capital, they do not typically play the same role as traditional civic associations. Robert Wuthnow opines that small groups may not be fostering community very effectively. Some “merely provide occasions for individuals to focus on themselves in the presence of others. The social contract binding members together asserts only the weakest of obligations. Come if you have time. Talk if you feel like it. Respect everybody’s opinion. Never criticize. Leave quietly if you become dissatisfied.”

These potential countertrends need to be weighed against the erosion of civic organizations. The General Social Survey shows that the average number of associational memberships has fallen by about a fourth over the last quarter-century. Putnam observes, “More Americans than ever before are in social circumstances that foster associational involvement, but nevertheless aggregate associational membership appears to be stagnant or declining.”

He offers these possible explanations for the situation: 1) The movement of women into the labor force; 2) Increased mobility (it takes time for an uprooted individual to put down new roots); 3) fewer marriages, more divorces, fewer children, lower real wages; 4) changes in scale (replacement of the corner grocery store by the supermarket and electronic shopping at home); 5) the replacement of community-based enterprises by outposts of distant multinational firms; and 6) the technological transformation of leisure, which “privatizes” or “individualizes” our use of leisure time (television, movies, VCRs, “virtual reality” helmets) and thus disrupts opportunities for social-capital formation.

Putnam suggests attacking the problem of declining social capital through research 1) to determine what types of organizations and networks most effectively embody or generate social capital, in the sense of mutual reciprocity, the resolution of dilemmas of collective action, and the broadening of social identities; 2) to identify the macro-sociological crosscurrents that might intersect with the trends described here (What will be the impact of electronic networks on social capital? What about the development of social capital in the workplace?); 3) to count the costs as well as the benefits of community engagement (with declining social capital has also come a substantial decline in intolerance and discrimination); and 4) to explore creatively how public policy impinges on social-capital formation.

Putnam concludes, “In America…there is reason to suspect that this democratic disarray may be linked to a broad and continuing erosion of civic engagement that began a quarter-century ago…High on America’s agenda should be the question of how to reverse these adverse trends in social connectedness, thus restoring civic engagement and civic trust.”

 

AHA Conference Report

July 2000

The Executive Director of AHA, Tony Hileman, told the board of directors that one of his primary projects for the coming year is to improve the association’s relations with local chapters and individual members. Hileman said the transfer of membership records and chapter records from the Amherst, NY office to the new Washington DC office will soon be completed. A program to make members and chapters more aware of the value of belonging to the American Humanist Association will be evident in the near future. The success of this project is vital to fulfilling the new AHA Mission Statement approved by the board and members attending the June conference in Hasbrouck Heights, New Jersey. The adopted mission statement defines AHA as a democratic voice for humanism whose aim is to increase public awareness and understanding of humanism and to serve the needs of its members in their pursuit of living meaningful lives.

As Hileman begins his second year as Executive Director his predecessor, Fred Edwords, will devote his time and energy as Executive Editor to revising and modernizing The Humanist magazine with the goal of making it more relevant to humanism.

The board also adopted a resolution calling for the United States of America to ratify the UN Convention on the Rights of the Child; a resolution opposing capital punishment; a resolution opposing the use of corporal punishment; and a resolution calling upon the U.S. Congress and all political leaders to make protecting children from gun violence a top priority.

The following is the mission statement adopted by the board of directors:

The MISSION of the American Humanist Association is to be a clear, democratic voice for humanism in the United States, to increase public awareness and acceptance of humanism, to establish, protect and promote the position of humanists in our society, and to develop and advance humanist thought and action. Guided by reason and our rapidly growing knowledge of the world, by ethics and by compassion, the American Humanist Association is dedicated to serving the needs of its members in their pursuit of fuller, more meaningful lives that add to the greater good of our society and all humanity.

–Flo Wineriter
President, Humanists of Utah
Member AHA Board of Directors