Monday, May 4, 2009

Self-Abnegation

The notion that one could achieve a noble result by suppressing one's sense of self is an idea that generally inspired our conduct until relatively recently.

However, our notion of the self has changed over time. In fact, the very idea of the self is a fairly new one, viewed over the span of history. The consciousness of self that we take for granted would be unfamiliar to our earlier ancestors. And now that we know we have a self, we must treasure it. To deny the self, or to allow it otherwise to be diminished, would be viewed as a violation.



I am convinced that at one time, people occupied less aesthetic and psychic space than is the case today. In fact, it is undeniable that many of our citizens could not care less how they are viewed, and if it is a distasteful chore to look at them, it is none of their concern.

I don't believe it was always so. Our dress, manners, and other modes of expression were much more restrained at one time. This societally-imposed restraint on our way of being in the world, it has been said, made it difficult to achieve happiness. But was happiness formerly the goal of living? I think it was not.

Instead, I gather that if any one principle guided our existence in former times, it was a desire to find our place in the moral order of the society (whatever that may have been). We believed that there was an abstract notion of the good and the not-good that held true for all members of society. Not everyone could live up to it, but all could be measured against it. I do not wish to claim that virtue used to be more prevalent. But, for example, there is plenty of anecdotal evidence that, successfully or not, an attempt used to be made to impart "goodness" in the young, as an aspect of one's overall education. Such an emphasis would be regarded nowadays as quaint.

Formerly in society, because the notion of good and not-good held so much sway, I think we walked a humbler path. We ourselves were not so important; the moral order of society was. We accepted our lot, by and large. Whoever sought to stand out did so because he or she felt in possession of something genuinely worth exposing to the world, and not so much in order to bring attention to his or her own person. And if our sense of injustice was aroused at all, it tended to be over larger concerns. I think one could prove easily that the public today is somewhat indifferent to the larger concerns of society and the world, but is acutely sensitive to anything that would impinge on what it sees as its prerogatives.

Since the self was not a sacrosanct entity, as it is now, it used to be more within the bounds of imagination to take on suffering, and use it for creative ends. The artist, for example, used to exist more or less in opposition to society. Now, partly out of guilt, I believe, for our past treatment of artists, but more so out of distaste for the violence to the self that art clearly could demand, the artist has been invited into society. We have institutions that "nurture" artists that did not exist formerly: creative writing programs, fellowships, teaching positions and the like. We have publicity, lest the artist's work should go unnoticed, as used to be very often the case in earlier eras. But however much our solicitude for artists has aided in assuring their personal well-being, it is questionable whether this solicitude has done much to make the art itself any better.

Our over-protection of the self has not just been harmful to artistic endeavors; it has affected virtually everyone's daily experience. Without denying that there are still lonely people around us, nowadays it would take a super-human effort to experience the kind of solitude that used to be an accepted part of life. To live in a rural community as so many of us did, with transportation slow or unavailable altogether, without the means to connect to the outside world that we take for granted, with books and musical instruments one's only diversion -- this is a way of living that we would find intolerable. The self-denial of such a life would be disorienting. The 'affirmation' we all expect nowadays would have been brutally absent.

We are offered wonderful variety and diverse means of fulfillment. But we have also, as a practical matter, made it impossible truly to abnegate the self, and thereby attain true solitude. Such extreme solitude would be unlikely to provide happiness or even long maintain health and sanity. But it used to be more taken for granted that this kind of solitude could either occur naturally or be attained through effort, and out of such solitude we have gotten the benefit of some of our finest actions, as well as our most memorable thought.

Friday, January 2, 2009

Changing Relations Between Labor and Society

It would be ideal if nations viewed the contributions of their labor forces with close to the same respect, if not quite as much regard, as those of their corporations and financial institutions.

In our society, however, the two can probably never have equal value. At best, labor can be the sentimental favorite over capital; it can be lionized in some quarters, but it lacks the hold over the American imagination that the corporation has. Over the last 30 years, moreover, labor in general has been devalued as manufacturing has gone into decline, while the managerial and, especially, the financial class has been elevated in the popular mind to the point that no amount of malfeasance has been able to reduce its status significantly.

As has been plentifully noted, during the period 1945-75, the working classes of Europe and North America rose into the middle class as their wages and benefits increased. Obviously, the improvement in the standard of living for many ordinary workers was the result of a generations-long struggle. But in the post-war era, labor finally took its 'place at the table' in the new consumer society that was coming into being. It was true that in some ways labor was feared more than respected, and the growing power (and oftentimes, corruption) of labor unions was resented. But at the end of the day, a person without much education could, through a union labor job, provide middle-class comforts for himself and his family. The larger society, in general, believed in providing this kind of opportunity to the workforce.

I can't exactly relate the series of events that has led to the precarious state of labor in the present day, but a couple of trends do stand out. The world is no longer as parochial, economically or culturally, as it used to be. There is now a willingness to have things manufactured wherever it may be cheapest to do so; at the same time, many more countries are able to produce consumer goods than before. In such a climate, a high-wage traditional manufacturing sector is difficult to sustain. The strongest argument for maintaining manufacturing jobs is, sadly, largely the sentimental one.

But for all the pain and dislocation caused by the loss of jobs making durable goods in this country, I have consistently been surprised at how weak the protest has been. The economy has probably adjusted well enough: other kinds of jobs, some high-paying (though many have not been so), have come into being; but the fabric of our society has been damaged severely by the loss of manufacturing jobs. For over 30 years, the regions of the country that formerly relied on factories, mills, mines, and so forth have sustained blow after blow, never really fully taking part in whatever prosperity we have enjoyed during this time period.

This is not a class-conscious society we inhabit -- that is probably one reason for the quiescence of the working class during these last decades of economic transformation. No matter our background, most of us don't like to be thought of as anything but middle class -- it would be distasteful, in many cases, to see oneself as part of a victimized stratum of society, and to engage in protest on one's own behalf as a member of such a stratum. That social class is so difficult to define in a society such as our own has had the odd effect of making it difficult to defend the interests of those who are economically and socially most vulnerable. For all the evidence that birth often determines our station, we Americans refuse to believe that could be possible: at bottom, we believe that our place in society is always earned. So while we generally sympathize with the farmers, factory workers, and others whose ability to earn a living has dwindled in the past decades, the truth, I believe, is that we feel that if those whose skills have become obsolete or unneeded were just cleverer they could find the solution to their difficulty. Regrettably, this attitude is not particular to any one political affiliation.

How else to explain the lack of engagement on the part of most of the influential classes regarding the diminishing rewards not just of all manner of blue-collar work, but of other kinds of jobs that are essential to maintaining society, such as teaching, nursing, and so forth? It might be romanticizing the past to say that work was ever held to be as intrinsically important as the product of that work. But the evidence that we at one time, as a society, believed far more in ensuring the long-term well-being of workers than we do now is incontrovertible.

The rightward turn of American politics since 1968 is often blamed for many of the ills that have beset the American workforce. However, our government policy, as it impacts labor issues, is as much a symptom of our society's changed (or, more to the point, diminished) view of work as it has been the cause of labor's ills.

When our society was more hierarchical, it was also more paternalistic. This paternalism could be heavy-handed, as inevitably would be the case when institutions act in loco parentis. Like any solicitous parent, big business and government were inclined to make glaring mistakes: giving the wrong kind of support to their citizens (old-style welfare is, I believe, a fair example), or withholding support altogether. But they also created programs that they thought would be in the best interest of citizens: Social Security, the Works Progress Administration, and Medicare, in the case of government; fully covered healthcare and other generous benefits for autoworkers and some other blue-collar workers.

Even when management and labor were in firm opposition (during the birth of the labor movement, for example), there was a sense that however much one side may have been anathema to the other, both were part of the same 'family' (forgive the continuation of this metaphor, for I think it is apt); there was no possibility of unilateral exit. Management treated labor at first as a disobedient child, but eventually accorded its respect to unions. Nowadays, companies must overcome a great deal of their own spite before even agreeing to the formation of a union, much less agreeing to union demands. And most manufacturers simply close factories and set up shop overseas to escape the burden of union contracts.

For reasons that are obscure, many Westerners, particularly Britons and Americans, grew weary of protecting and providing for workers. The costs of social supports for workers were considered high (and still are), and strikes were disruptive. But something else was at work. We began to believe that the individual, unguided will was more efficacious than collective action. We no longer could see any advantage to large government, large management structures, and powerful unions: they stood in the way of individual decision-making and innovation.

This last observation is at least partly true. Even if we were able to do so, we should not wish to replicate the old-style war of attrition between unions and business. Nonetheless, while our business environment is more open to innovation than practically any other in the world, and there are numerous ways to acquire wealth, we find that our wealth, once acquired, is terribly vulnerable. Our individual persons are also vulnerable, as we have less and less health care, retirement savings, ability to afford tuition, and so forth than formerly. Our new-found faith that work is only a private matter that large, general-interest entities such as government and unions should stay out of has undermined society. Our freedom to fend for ourselves has left our economic and social landscape barren.

Wednesday, November 5, 2008

The Three Possible Environmental Futures

There are three scenarios that are, I believe, the most plausible ones regarding the future of the natural environment.

One might be termed "the environmental cataclysm". In this view, the depredations of humanity will cause portions -- perhaps even large portions -- of the planet to become harsher, or even uninhabitable. Temperate zones with adequate precipitation will become drought-stricken and overheated. Rising sea levels will cause flooding in populated coastal areas -- forced migrations will result. Disease and famine will affect many more countries than they do currently, and not just in the developing world. Natural areas will disappear, with extinctions on a massive scale. At best, civilization would exist in a form very different from what we know today. However, a few predict that the environment could deteriorate even to the point where it could no longer sustain life in a meaningful way.

For the second scenario, we can take all these developments, and imagine, if you will, a measured, rational response to them. In other words, the human race will simply adapt to the changes in the environment that it has brought about. Our population will still grow, as predicted; we will simply accept greater crowding as natural and inevitable (there is still plenty of space to house people -- we will simply have less compunction about using it). We will use technology to adapt our agriculture to the new climatic conditions with which we are presented. Medical science will find responses to the new disease strains that develop, etc.

The first scenario is possible, but the second seems more likely. We will adjust so gradually and skillfully to the new state of the world that we will scarcely notice the extent to which we have transformed the environment. Our relationship to the natural world, already so different from that of 200, 100, or even just 50 years ago, will have changed even further, 50 years hence -- probably this new view of the environment will just serve to rationalize whatever further environmental changes we are responsible for.*

*For much of history, we have thought of the world as divided into two regions: cities and farms (for human habitation), and wilderness (unsuitable for civilized living, and therefore to be avoided or admired, but not necessarily exploited). When technology permitted us to use almost any natural environment to whatever end we liked, our thinking about wilderness changed radically. Even today, with all of our understanding of the importance of preserving natural ecosystems, the survival of wilderness remains only an intellectual ideal.


The relationship with the environment we have maintained for most of history (this relationship also being so much part of our cultural legacy) has changed in the last 50 years or so, but is now being altered even further, to the detriment of nature. Our cities, which have been, in their own way, harmonious, rational conceptions, are sprawling outward in an improvised response to rural-to-urban in-migration and other trends (in the few instances where the growth of cities is actually being planned, as in China, the construction is not on a human, but a gigantic and impersonal scale). Our farms, which once had a cultural significance that was almost as central to them as their practical function, are becoming factories in all but name; in this country they have become expendable, as we have become convinced that any type of uninhabited land could be better put to other human uses. Freedom, convenience, and the potential to benefit ourselves materially have become the measures we use to decide the purpose to which we put our land. The survival of natural or even just pastoral areas is not just a noble but a vital cause to fight for -- I think that if these areas succumb to human pressure, as they may well, our conception of ourselves will have to change fundamentally.**


**There is still at least a sentimental attachment to the idea of natural systems that exist independently of humans. But in very few cases do we deny ourselves the chance to encroach upon these same natural areas. However, if they disappear fully, as they are seemingly on their way to doing, it would be the first worldwide calamity for which almost no one could be held blameless. The notion that there is, somewhere, always some enclave of moral goodness would be contradicted with unprecedented force. The damage to our self-image as ultimately wise beings would be permanent.

The conversion of the world from a more or less natural environment to one that is merely adequate to sustain human life would be a soul-destroying turn of events. The dedication of all of the planet's resources towards our material well-being would also permanently weaken whatever is left of our culture. This may seem a bit alarmist, but though I always hope for the best, the cultural undermining that goes hand in hand with the aesthetic deterioration of the environment is already underway. In this country, this is obvious if one has been alive for at least 40 years. One proof that America is culturally much weaker than it was even just a few decades ago may be seen in how we use our land. For as many wonderful places as are still left, much of the country is an eyesore, a place where, the people notwithstanding, one would not wish to be for any longer than necessary. The inhospitality and inhumanity of our new buildings, roads, towns, shopping areas, etc. weaken our sensibilities; they undermine the notion that life should be exalted.

However, there is one more way things could be. One can envision a cultural change that would cause material betterment to be viewed only as a means of achieving a certain level of comfort, not of achieving happiness. We would ensure our survival not through sheer population numbers, but through an apportionment of resources; neither would we equate our primacy in the world with our dominance of it. We could give voice to our personalities through achievements, work, family life, and other pursuits with meaningful but less tangible goals, and less so through acquisition.*** It might seem precious to suggest that a view to the future might take more precedence in our thought, and that the rational self might be permitted to come to the fore, when we are presented with the awful environmental and cultural consequences of submitting, more or less without question, to a desire (which is not a need) to be gratified. But it is worth making an argument for an environmental future based on such thinking, if one feels attached to living life as we conceive of it now.

***Do not confuse this with the argument that possessions are corrupting. In a rational conception of the environmental future, we could all enjoy possessions -- we would just have to plan (though only a humane course of action in this regard would be at all acceptable) for there to be fewer of us than we might at first have thought; we would also have to be more selective about what those possessions were.

Friday, September 19, 2008

The Two Competing World-Views

From the time of the French Revolution until very recently, two opposing ideas about the possession and distribution of wealth have vied with one another. One has held that seeking and holding individual wealth was basic to human nature, and the pursuit of wealth should be only minimally regulated, if at all. As importantly, those who amassed wealth would be entitled to use it as they saw fit -- private property is a basic right that should always be protected, in this view. Any program to redistribute wealth would have been seen as taking it from its rightful owners.

The opposing school of thought maintained that individual wealth-seeking was not to be trusted and would lead to unfairness, or worse -- wealth could only benefit society if it was held and applied more or less collectively. Capitalism could be allowed to create wealth, but through taxation and other instruments, its benefits would be spread throughout the society. But as we know, it was often the case that the capitalist system itself was seen as pernicious, and such concepts as private property, profit, and so forth were anathema. The economy would have to be run by the state, not private interests, in order to ensure that all citizens might share wealth equally.

As we have also seen, economic systems have evolved that have successfully drawn from both capitalism and socialism. However, in the main, the economic philosophy that most rewards individual wealth-seeking (free-market capitalism) has been seen to triumph over the variants of socialism (represented by a range of governmental and economic models -- from capitalism with some elements of socialism, as in Western Europe, to the 'pure' communism of the early Soviet Union and the first 40 years or so of communist China). State-run socialist economies have shown themselves to be failures. In the West, socialist policy survives only in Western Europe and Canada, in the form of pensions, health-care plans, supports for maternity and child-rearing, etc. (though many do argue that these supports come at a high economic expense to the nations that try to maintain them; these social welfare systems may not even be sustainable for much longer in their present form)*.

*There is the example of Venezuela. However, in Venezuela, wealth redistribution is not formal and institutional, as in Europe. When Hugo Chavez leaves power, it is possible that his social programs for the poor will be altered or discontinued altogether. For the purposes of this essay, I am also excluding the examples of North Korea, Myanmar, and other such countries.

For 300 years at least, the question of which economic and social model could most benefit society has preoccupied the world. During that time, religion has mainly been in retreat. And though the long decline of religion obviously has had many causes other than economic ones, it is clear that material well-being has taken up ever more of our attention, and religious faith, while still very strong in some quarters, has lost its hold; religious life has long since ceased to be central to Western society, whereas economic life has become ever more so. One could add that atheism was the official policy of many, if not all, the communist states. Secularism was one of the thrusts of the Arab nationalist movements and the modernization programs of such countries as Turkey. Some major religions -- Catholicism, for example -- have been forced to modernize or face obsolescence.

Religion has not been the only traditional institution that has been threatened by the indomitability of capitalism and its social corollaries. Traditional cultures and their values have also been in retreat. Whether it is a question of the society and values of the rural United States, or the indigenous peoples of the Amazon Basin, the machinery of commerce and communication has been steadily eroding local mores and cultural practices. Indigenous cultures throughout the world are every bit as endangered as the natural environments in which these cultures are frequently found.

However, over time, a resistance movement has arisen against the trend towards materialism in the world, and has become especially potent in recent decades. It is largely religious in nature, but it would be a mistake to see the movement as solely religion-based. The principal embodiment of this anti-materialist force has been so-called 'radical' Islam (but there are other groups that are in many ways aligned with Islamism in this movement, in some cases while being mortally opposed to it -- Islamism simply having the greatest number of adherents).

The Islamic world has many resentments against the Western, capitalist world (note: this essay should not be taken as support for or judgment against these resentments). Some of these are neither economic nor cultural: for example, the Palestinian question. But one of the main ones (and one that is less often talked about) is of the social decay that the bourgeois capitalist model seems inevitably to carry with it. We value the freedom of Western society, but much of the world is fearful of the cost of that freedom: crime, the decline of family life, the unclear social hierarchy, the emphasis on material well-being at the expense of spiritual integrity. We are fearful of the absolutism we see in Islamism (which is why some of us are given to calling it 'Islamo-fascism'); in some of the non-Western world, both within and outside of Islam, many are fearful of losing their religious and cultural foundations in a flood of commercial homogenization.

We now have a world-wide free market not just in commerce, but in culture. In both cases the large have an advantage over the small. But while the rain-forest tribes of Ecuador face long odds against the global culture of homogeneity, the Islamic world, with its broad territorial span and ample population, intends to put up a fight. The conflict between Islam and the West is most often characterized as a religious one, and in some ways it may well be. But it is just as much a conflict between two opposing attitudes about culture and the role it should play in everyday life. In the West, culture is seen merely as an embellishment to life. In fact the word itself is much more commonly used to refer to the arts or to entertainment than it is to talk about behavior and customs; in some of the rest of the world, including, I believe, much of the Islamic world, culture means the various traditions of a place, a tribe, or a nation.

I think it is more than coincidence that during the same period (the last 40 years) that the Islamic world has become more assertive, evangelical Protestantism has entered the cultural mainstream, not just in this country, but in other countries that formerly were dominated by the Catholic church. And if one looks a little past the well-known stances the evangelicals take on so-called 'social' issues, and tries to see the motivation behind them, one detects that at the root they are just trying to impose order on a world that they see as bewildering, bereft of moral order (I feel obliged to say, as I did regarding Islamism, that I am in no way trying to be an apologist for any views you may associate with evangelicalism -- nor am I critiquing those views). It may also be significant that the trend in Judaism, especially in America, was once assimilation, but that nowadays, the Orthodox and Hasidic Jews -- living in separate, self-sustaining communities -- are becoming more numerous and ever more influential both in this country (within Judaism) and in Israel.

I am not in a position to speculate as to whether religions can be reconciled with one another. But I do think that if any one religion is hardening its stance towards the rest of the world, the true reason may be found in cultural changes now taking place globally -- every bit as much as one may find the reason within the theology of that religion. We would do well to evaluate the role of culture in our own society, and see just how much that role has been altered -- diminished -- over the years.




Saturday, September 6, 2008

Population Control

Any discussion of controlling the global human population brings up a dilemma. On the one hand, in much of the world, the population is too great to allow for much more than subsistence living for the majority of citizens. And in those countries that have been able both to sustain large populations and to develop their economies relatively successfully, the environmental impact has been severe, and is being felt both within those countries' borders and outside of them.

On the other hand, the value of individual human life is not ours to measure; it should never be assessed solely on a numerical basis. While limiting population growth might reduce strains on the environment and the economy, however great these strains may be, we realize that it also would prevent individual people from being born, with all their potential for productiveness and happiness. We cannot easily deny the unborn the same right to existence we ourselves enjoy.

We know that these latter arguments are more than just ethical: they are also moral and religious ones for many of us. We would be right to consider the moral and religious facets of the problem. But if we may, let us first look at the two strains of thought that have dominated the population control debate.

Making predictions about the impact of unchecked population growth, controlling human populations for rational ends, and other 'scientific' approaches to population control were once fully part of mainstream thought. With Thomas Malthus as their intellectual father, many writers in the 19th and 20th centuries speculated imaginatively on what would happen if human beings grew in number indefinitely. They usually made dark predictions: mass starvation, irreversible environmental damage, and so forth. Most saw birth control as the most humane solution to the problem of overpopulation (though some also viewed famine and disease as a natural and ultimately beneficial control on population).

In more recent times, population growth has been seen as inevitable, and the goal has been not so much to prevent it as to adapt to it, even benefit from it (meanwhile, population control, once the topic of best-sellers, has been relegated almost to the intellectual fringe). We are trying to devise ways of feeding the billions more people who are due to arrive, and are forecasting (if not planning adequately for) their energy and housing needs. The human-life-as-statistic of the Malthusians is out of date; now we have made a fetish of the individual will to reproduce. For example, we have become very respectful of individual cultural and religious dictates that foster larger families: it would be in intellectual bad taste to question the multiple-child family in a country with 35% unemployment and over-taxed natural resources. We acknowledge 'demographic changes' (often a euphemism for population growth), but rarely discuss our potential to alter those changes (almost as if they were weather phenomena). And just as it was once acceptable to view large human populations as burdensome, it is now the fashion to tout the 'potential' of the billions more people expected to be added to the planet in the coming decades.

Our new view of population growth is more humane than that of the cold Malthusians. But it contains disturbing contradictions. We pay lip-service to protecting the wild natural areas that still exist, but do not squarely acknowledge that adding to the current world population on the order of one quarter to one third (as I believe is forecast) will all but ensure the deterioration (if not destruction) of these areas (as is well known, large and still-growing populations are already putting unbearable pressure on rain forests, oceans, and other critical ecosystems). Global warming may be a proven threat to the livability of the planet, but every forecast of energy needs I have seen shows fossil fuel use growing to levels that common sense tells us are unsustainable. It is uncommon to see a discussion of how population growth increases the demand for energy; instead, the rise in energy use is attributed (again, euphemistically) to 'growing economies', or some other impersonal force.

However, this does not remove the basic dilemma of population science that I acknowledged at the beginning. We Americans use up natural resources out of proportion to our numbers, but we would be offended if we were viewed as 'excess' population whose disappearance would aid in the planet's survival. But if our lives, our ideas, and our potential are indispensable, then so are everyone else's.

The only solution to the problem of human overpopulation, it seems to me, is to acknowledge the sacredness of human life while also allowing that an over-abundance of it will eventually degrade the quality of that life irreversibly. At present, these two notions are seen almost as mutually exclusive, and, from what I can see, irreconcilable.

The contemporary trend is to concede that human life will inexorably multiply as much as the environment can possibly support. It is seen somehow as eccentric to predict that in time our great numbers will infringe on the very sanctity of human life that we would like to uphold. We hold simple human existence to be sacred; we might also define what it means to protect the sanctity of being.

Thursday, August 28, 2008

The Master of Arts Degree In Teaching

Note: the author has taught education courses both at the undergraduate and graduate level.

The MA in Teaching, required for permanent teacher certification in grades K-12 in most, if not all, American states, is a misguided requirement, at least as it is now generally designed. The degree is based on the premise that the craft of teaching can be transmitted through formal study, when in truth it can probably best be acquired through experience and guided practice. The Master's is meant to confer professional as well as a kind of intellectual status, but it only succeeds in conferring the former -- partly because diplomas do, after all, foster respect, but mainly because the degree itself is a prerequisite for permanent teacher licensure. If you don't have a Master's in Education, you simply can't have a career as a teacher.

The body of knowledge acquired in the Master's in Education course of study is oftentimes lightweight. Graduate schools of education expose prospective teachers mainly to theories about how children think and learn; these theories tend to reflect educational fashion and are often pseudo-scientific in that they cannot be proven (leaving aside that many of them have been shown to be ineffective or even deleterious over time). Other coursework tends to center around 'issues' in education: addressing the different 'learning styles' of children, 'teacher-centered' (old-fashioned teacher-led) instruction versus 'student-centered' instruction (in which the teacher acts more as an enlightened guide), and the like.

Oftentimes, graduate education course material can verge on the silly and irresponsible. For example, students may be asked to read a set of articles representing different points of view on the topic of giving homework (some will be for it, though many will likely be against it). All will cite research that supports their positions. However, when teachers arrive at their first teaching job, it is almost certain that they will have to follow school and district mandates regarding homework. Teachers may rightly question the usefulness of an intellectual debate in an area that really isn't debatable in actual practice.

I am not suggesting the study of education theory should be done away with altogether. Any seriously-intended theory of education is worthy of study. Teachers should be exposed to all ideas current in educational thought, especially given the complex teaching environment we find today. But entire courses and units should not be dedicated to many of the topics that are considered the most important in education theory; they should be condensed, and approached from a more common-sense point of view*. Additionally, not only knowledge of one's core subject area but general knowledge as well should be valued more highly in teaching schools than it now is. I don't want to be lectured ad nauseam on the different 'learning styles' I will encounter in my classroom. Must I constantly distinguish between 'visual' and 'kinesthetic' learners**, and be obliged to come up with ways of engaging them at all times? Or might I just be trusted to come up with a common sense approach on my own or in collaboration with fellow teachers? As a graduate student, I was appalled at the lack of emphasis on learning how to instruct properly in the actual subject area you would be teaching. I was forced instead to learn about the aforementioned 'learning styles', 'democratic classrooms', the umpteen different ways of arranging one's classroom, the pros and cons of having students contribute to making classroom rules, and other soul-deadening material -- never once was I taught how to, say, teach a substantive American history lesson to an elementary school class.

*The writing quality of the contemporary education theorist also needs to be addressed. I wince when recalling all the polemical literature I had to endure both as a graduate student and later as a professor who actually had to teach it. Education writers seem not to believe that their audience is capable of making an inference. I should mention that a good deal of the education-related reading material of recent authorship is much in need of editing for style and form.

**visual learners learn best through pictures, graphs, films, etc.; kinesthetic learners apparently learn best in ways that allow them to move about freely (!); 'tactile' learners must handle objects that represent geometric shapes (if they are learning geometry), etc.

Education theory is seemingly willfully separated from classroom experience, particularly from the experience of teachers in low-income schools. For example, the aspect of teaching that most bedevils teachers, particularly new teachers -- classroom discipline -- is rarely treated with the proper urgency in teacher-education courses. Though the topic is covered over several different courses in the typical graduate education program, it is rare to see a course solely devoted to classroom management.

Let's take this matter of teaching discipline in graduate education programs separately for a moment. Classroom discipline problems are the main reason new teachers become frustrated with the teaching profession. And the problem of classroom discipline is seldom approached honestly. In the public schools, the regulations governing discipline are fairly weak -- they are usually an inadequate deterrent to the student who misbehaves chronically. In all teaching settings, the teacher is and must be the primary disciplinarian. But teachers must also receive the unequivocal, firm backing of their school administration and their community to give the proper weight to their own disciplinary actions.

Regrettably, teachers do not always receive proper support in matters of discipline. To digress for a moment, this is ultimately a reflection of contemporary society's ambivalence about controlling the will of the individual; it also reflects our collective view that the rights of the few should be considered on an equal basis with the welfare of the many. At any rate, many teachers, particularly (but not exclusively) in disadvantaged areas, are left to get along as best they can with the most challenging students. Many teachers with disadvantaged student populations find themselves overwhelmed, and leave the profession within a few years.

Teacher education programs idealize the classroom setting, and would have us believe that there is no classroom in which a program of behavior modification would not be sufficient to control an unruly student. I have taught graduate students who are working in challenging urban schools, and the difference between the largely middle-class school settings profiled in the course readings and those where my grad students have to work is truly galling. In their coursework, education students are taught how to deal with students who possess some sense of remorse, who have 'internalized' the norms and expectations of society, at least to a degree; they are not taught what they really need to know: how to deal with students with overwhelming behavioral and social problems.

II.

So, far too often, graduate education courses are intellectually stultifying and, worse, do not do a good job of preparing teachers to lead classrooms. The teachers who do succeed are the ones who somehow manage to survive their first years in the profession, finding the toughness and possessing the dedication to weather challenges that very often have to be experienced to be truly believed. Their master's degrees are decidedly not what give them the resources to be successful.

Yet dispensing with formal training altogether is not the solution. Instead, we ought to restructure the training to make it both more intellectually invigorating and more practically oriented. We should keep what I believe was the original intent of the master's in education (training people to teach), but revise and shorten the degree course, combining it with a more rigorous apprenticeship than we have generally had prospective teachers undergo up to now.

Teaching is not a science or an 'art' but a craft. Master's degree courses are taught as if education were a kind of applied social science -- through readings, lectures, and discussions, though without the enjoyment of getting to do experiments. It would be better instead to have the students watch videos of actual classes, for example -- not as a special feature but as a regular element of class. Instructors could select readings that support (or refute) the practices that the graduate students witness in these videos. Any papers the students write could be in response to the teacher practice that is observed, using the theory to comment on the practice. Other creative teaching methods should be freely employed: student dramatization of discipline problems, instructional films made by the graduate students that illustrate educational theory, debates, mock trials, and so forth.

Aside from one or two courses on classroom discipline and behavior management, I wish I'd had one on managing paperwork, record-keeping, and other administrative tasks. Like the class discipline course, it would be practical in nature. Class time could be spent grading sample homework assignments, learning to enter grades, do report cards, etc. Videos could again be used to show how teachers can best manage classroom routines.

I'll say again that it would be unwise to dispense with educational theory classes altogether. Students should be able to choose from among courses that emphasize different educational philosophies. A useful (and interesting) course would be on the problems and history of urban education since World War II (ideally, the course should be as free as possible of an ideological slant), with an emphasis on the recent innovations that are showing so much promise in inner-city settings.

In order to allow these or any other suggestions for change in the teaching master's program to work, the teaching quality at schools of education simply must improve. Teachers are incessantly reminded that they must be 'engaging' in their own classrooms by education professors who are anything but engaging themselves. While there is a minority of professors who strive to engage their students, to connect the (at times) disconnected or abstruse theory to classroom reality, many (intentionally or not) treat their graduate students with condescension, believing that Gardner's theory of multiple intelligences or Piaget's stages of development are sufficiently interesting on their own to stimulate graduate student interest. No educational theories have any real interest save when they are thoughtfully connected with real teaching and learning experiences. We need more professors of education who can teach educational theory creatively and with a more practical aim.

I am not as familiar with student teaching internships as I am with graduate education courses. I'm sure some student teaching experiences are invaluable to prospective teachers. However, if we could follow the principle that greater rigor in teacher training should always be the goal, we might create apprenticeship experiences that demand more of participants. I'm not sure if serving as a teacher assistant for a semester is adequate preparation for taking on the responsibilities of a classroom single-handed. Graduate teaching programs should be leading the way in devising internships that give a realistic experience of working with children, especially the kind of challenging students that teachers are ever more likely to encounter, and that allow one to handle, in some small way, the many demands of teaching (paperwork, lesson planning, parent contacts, etc.).

Some graduate programs probably are not in need of any alteration. Institutions such as the Bank Street College of Education are philosophically perfectly attuned to the kinds of learning environments to which they send so many of their graduates. The preparation they offer is probably more than adequate for the teacher at the 'progressive' school. However, many schools are (sadly) too overwhelmed to be progressive, and the only way for teachers to do well in such schools is to be as well prepared as they possibly can be before entering the classroom. If they are teaching and getting their Master's at the same time, they should not be made to suffer through coursework that has little connection to their everyday teaching experience -- that will make them cynical, and ultimately disheartened. The Master's in Education should be looked at just as closely as all the other elements of public education that are currently being subjected to revision and reform.

Sunday, July 13, 2008

Cultural Poverty of the United States

In recent times, some of our political candidates have spoken up against 'poverty'. I wonder if this is the right term to use. If what is being referred to is poverty and the destitution that goes with it, then the priority should be to eliminate it. Without denying that poverty still can be found in the United States, I think what people are more often referring to is not true poverty, but income inequality, lack of access to health care and good schools, lack of well-paying jobs, and other impediments to getting and maintaining secure wealth: in other words, economic insecurity. My observation tells me that economic insecurity is much more often the problem than true poverty, as I understand the term. It threatens our national way of life.

There is no question that we are vulnerable economically at this time in our history, and that some kind of decisive action needs to be taken to forestall long-term decline. Unfortunately, the kinds of decisive actions we would need would go against the grain of not just our political, but our social culture. And when I mention social culture, I feel I should also add that we do suffer, as a nation, a kind of poverty that would drag down any effort towards national improvement in any arena of life; I am referring to cultural poverty.

America used to possess a vital 'vernacular' or folk culture (another term would be popular culture, but the common meaning of that term has changed so much as to make it misleading in this instance). In our country, folk culture could be found in many locales: the immigrant enclaves of our cities, our rural areas, particularly in the South; threads of a distinct American everyday culture could actually be found anywhere that had a distinct regional character (which was, at one time, most areas of the country). Our folk culture had many wonderful manifestations: regional folk music styles, certain sports, ways of life, crafts, customs, foods.

Nowadays, most of these products of regional American culture require conscious preservation for them to survive in any form. It is obvious to everyone that we have become homogenized; we have lost our regional folkways (the ones which have produced some unique cultural products, from blues music to stickball*). Consumerism, mass communications and entertainment are some of the powerful agents of conformity that have caused this. The only cultural sub-groups that exist authentically outside of the mainstream are the ones primarily guided by their religion.

*though the effort is self-conscious, some vernacular arts, such as bluegrass music, are actually thriving as the result of a massive effort of cultivation and preservation.

Independent creative folk culture is seemingly on the wane. The hope is that through thoughtful preservation, the folk arts can somehow remain vital. They may not be integrated into our daily lives as intimately as they once were, but at least they won't die out altogether, and could even continue developing.

However, the decline of traditional folkways is not the primary cause of the cultural poverty I was referring to earlier; it is a symptom of it. In discussing this, let us switch over to a second definition of 'culture': the local and national norms, customs, rituals, conventions etc. that shape our behavior, sometimes inhibiting and regimenting it, at all times giving our lives some kind of order, if not value and meaning.

This aspect of culture has never been as strong in the United States as in the older nations. We may have social classes, but no class system. We may practice politeness, but decorum strikes us as more than is necessary. There is no institution we do not alter in order to give ourselves more freedom, when possible and practical: law, religion, marriage, family, school. Our personal ideal is to be open and free with one another; social reserve makes us suspicious. The hauteur we perceive in some foreigners is repellent to Americans.

Many people have observed these things about us. And I am only repeating what we have all heard already when I say that another tendency of ours is to make convenience and practicality the measure of a thing's value. This is especially true of food, to give just one example. I'm pretty sure that breakfast cereal, fast food, pre-packaged snacks, and other ways of making eating less troublesome were American innovations.

I know all these are banal observations. But while our love of personal freedom, convenience, and choice has been an enduring part of our character, it has never been as unfettered as during the last two to three generations. We have become more convinced of our entitlement to absolute freedom of action, while doing away with any central guiding principles for our personal and civic lives; the effect on our society has been confusing, and even damaging in some cases.

What 'guiding principles' could I be referring to? Not 'decency' or 'morals', or anything like them. As it is with our folk culture, any decline in the sense of right and wrong that one may perceive is not a cause, but a symptom of something else that is larger but less visible.

Ours has never been as hierarchical a society as some others, nor has it been especially authoritarian. But there was a time when we shared a greater faith that authority would be used for a benevolent purpose. We once believed, more so than now, that authority properly lay with government, with family, with school, and other institutions of society; we did not question it because it would not have occurred to us to do so. We believed in the accumulated wisdom of these institutions. We also had a passive belief, but a belief all the same, that society's natural order was indeed hierarchical. Parents, school administrations, police, the military, government representatives, etc. were above the rest of society, and it was our natural place to follow and obey them.

One can't go any further without acknowledging that in a hierarchical society, certain groups are held in an inferior position unfairly and against their will. We know this has been true in our society no less than in any other; we have also learned that it is unwise to follow authority unquestioningly. Those who have fought, often heroically, against hierarchies of race, class, and sex have bettered our society immeasurably; the same may be said about those who have fought against the unwise or unfair use of the official instruments of authority.

Perhaps in order to ensure that we would suffer the least possible injustice at the hands of societal, governmental, or familial authorities, we have weakened them, and given them an excess of oversight. The school, the policeman, the parent, the government, and so forth all hold less power over us than they formerly did. This is not entirely unwelcome -- individuals should not be totally powerless in the face of an arbitrary instrument of authority. And we need to have continuing vigilance to make sure that the power of authority is wielded fairly.

I would offer that the dilution of authority in our society has been a double-edged sword. It has made every action, every decision that would have an impact on others fraught with ramifications; as a result, we are often paralyzed. One can easily find examples of large-scale projects that would have been undertaken decisively by earlier generations lying half-finished, or languishing in the beginning stages. Specifically, I would argue that the stalled effort to rebuild New Orleans, the decision about how to build on the World Trade Center site, and the search for alternative fuels and modes of transportation are but three examples of how our ambivalence towards authority has led not to more enlightened decisions, but simple inaction.

Our thoughtless rejection of authority on principle has done as much harm in the private as it has in the public sphere. While we may agree, if asked, that parental authority, for example, should be final, the broad culture undermines parental authority. In the popular culture, parents are often shown as well-meaning dolts, self-absorbed, or simply incompetent. And if many parents heave off the burden of having to wield authority over their children, it is because the 'culture' (or, what is 'in the air'), gives them tacit permission to do so. Certainly, parents who try to be strict go well against the grain of contemporary thinking. I can't say for certain whether kids are any less well-mannered than in earlier generations -- every generation complains about its children's manners -- but I'm pretty sure that children are no longer expected to present themselves with any amount of formality, in private or in public.

The authority of the teacher and the school is also greatly diminished by our rejection of authority as an abstract principle. For as much lip-service as we pay to the importance of education, the teaching profession itself is not respected; the decisions handed down by the teacher or administration are seen as provisional, if they go against the wishes of dissatisfied parents or of other parties. The multifarious cures being offered for our underachieving schools would be unnecessary if the incivility, slatternly personal self-presentation, and watered-down thought and intellectual content that are tolerated in the broader society could somehow be kept away from the classroom.

In the domain of the arts, it is seen as stodgy to maintain that there is a body of visual art, literature, history, and so forth that is deserving of preservation and study more than the rest. With the abandonment of a hierarchy, as it were, of culture, we have allowed almost every cultural product to have its day. This is not an entirely bad development. Leaving aside that any set of criteria for judging the worth of art is biased and myopic, obviously some of the art that once would have been considered 'low' deserves attention on its own terms. But in (rightly) recognizing that standards of 'quality' are arbitrary, we have then allowed ourselves to think it is better to have no standard at all. This attitude, combined with our faith that technological progress should always be embraced, is causing society to be altered in unpredictable ways.

I would love to have it demonstrated that our everyday culture is as strong as it once was. But by almost every measure I can think of, American culture has become progressively weaker, across the social spectrum. Families in which there is a strong, intact structure have become the exception; with the decline of family life there has also been the gradual disappearance of handed-down cultural traditions: foods, crafts, rituals. People increasingly live in communities that, either by accident or by design, are culture-free, in that they have no ties to any kind of shared past. In most households, what culture there is to be found is of the instantaneous, electronic kind. Though some would have us believe that the new electronic media are allowing the mind to develop in new, possibly beneficial ways, time may show that they have a spiritual cost we cannot measure.

I won't try to convince anyone that our country is a coarser place than it was before. But it is a less interesting place than it used to be. The negative social effects that stem from a lack of a strong, agreed-upon culture apparently can be withstood; it is almost harder, somehow, to live where there is little sense of place.