Thursday, June 16, 2005

POPULAR CULTURE AND BRAIN FUNCTION

There are those who will see this title as faintly oxymoronic, but it is presented here in all seriousness. Sailing through current cultural waters is not always plain or clear; to many of us who have lived long enough to know other times (and other places) it is at least apparent that "The times they are a-changing…" as they always appear to do. It is also obvious that many things considered to be new and innovative are the old things garbed in the latest fashions. This should come as a surprise to no one, but consider for the moment the charge that in the old days people were more repressive and hypocritical than at present, especially about sexual practices, sexual openness, public cursing and racial issues, to name a few sensitive spots. (Leaving aside age differences for the moment, some members of my generation may claim, perhaps as their prerogative, to see clearly "how fast everything is going straight to Hell".) But the same considerations about underlying feelings apply to the social management of aggression. The differences are shown primarily in terms of which values are suppressed and which values are in plain view.

A tendency has been noted for people at large to maintain their old ways at all costs: "The two enemies of reform are the knaves who oppose it and the fools who favor it" (Anon). It has also been said that most people prefer old problems to new solutions. Taken together, these tendencies, along with the viability of the ideas presented above, may work to maintain an anachronistic, inner neural and emotional stability within an apparently changeful external world.
Behind one series of shifts of cultural focus, however, there appears a more puzzling level of complexity, i.e., the infusion, engulfment and wholesale take-over of almost every phase of what passes for popular culture by the field of entertainment. While not essentially peculiar to the U.S. or to this century, in terms of public adulation of well-known figures we seem to have gone whole-hog for being mesmerized out of thinking about much other than circus-circus. In the early days of silent movies, Valentino, Chaplin, Fairbanks and Pickford, to name a few of the then, and still, beloved "artists", were literally mobbed at any public appearance, here or abroad, and they probably helped give rise to comments such as: "Celebrity-worship and hero-worship should not be confused. Yet we confuse them every day, and by doing so we come dangerously close to depriving ourselves of all real models." (Pulitzer prize-winner Daniel Boorstin, 1974). If nothing else, this may warn us that celebrity-worship should be approached with caution -- more of them might actually be elected to public office.

Before WWII many "celebrity" publications consisted of cheaply made, frequently rather poorly printed exposés of a small number of popular film stars and their movies, together with "gossip" purporting to reveal vaguely shady and presumably sensational life styles. These ran about neck-and-neck in readership with True Story magazine and True Romances. (The word "true" was taken seriously only by hard-core users.) Shades of William Randolph Hearst's love-life! Now grown to mega-media proportions, their progeny have burgeoned into the predominant and overflowing content of every form of public presentation -- internet included -- with detailed accounts of dating practices, marriages, divorces, pregnancies, narcissistic love or hate spats, and drugged, drunken or neurotic escapades, displayed together with glittering photos of popular entertainers in expensive clothing with partially bared body parts. These "come-ons" are not confined to checkout counters; they often consist of frantically beckoning, attractively gotten-up images from drama, stage and movies, comedic arts, and popular music -- and the musicians, now inhabiting most TV offerings up to and including the "these messages" slots (advertising has been described as the science of arresting human intelligence long enough to get money from it; what better setting?). And there is a peculiar scavenging of material for "news" programs which, because the big networks are also heavily invested in the manufacture of most current entertainment, amounts to a kind of cannibalism that then engages in a process of regurgitation.

We thus see previews, along with other news, bits of the features themselves (together with current box-office sales), and then the résumés, over and over again. In time they will all be repeated (not to mention "Oscar" and "Grammy" award nights that are strung out for weeks), in case any one of us suffers unmet voyeuristic needs. We are talking "popular" here, as in People Magazine; if it has sold before it will sell again -- and vive the sexual/moralistic revolution -- lending an unfounded touch of intellectual class to the whole mind-numbing process. Popular is in, private is out. Artist and artistry are in; fine art, now a mere whisper, seems to be mostly out. We are being neither elitist here (nor anti-sexuality, heaven forbid), nor are we alone in these views. A New Yorker Magazine writer, Ken Auletta, in an interview with U.S. News and World Report (Mar. 15, 2004, p. 20), appears to concur: "Why does it seem as if there aren't slow news days anymore? Answer: One reason is the manufacturing of non-news into news. We're preoccupied with ratings. Editors and owners are worried that there are so many news sources, and they want to get people's attention. So we cover Martha Stewart like she's World War III. …They (publishers) know this is a society where if your name is well known, you benefit, even if it is known for the wrong reasons. …It's Joey Buttafuoco all over again. …It's a freak show."

As they say on Madison Avenue, "There is no such thing as 'bad' publicity." For a look at what may be served up as news: in Time of March 1, 2004, staff writer Poniewozik, under the heading WELCOME BACK, CAPOS, wrote about the HBO show "The Sopranos", which was supposed to be ending that season. He reported that the show's creator (David Chase) "says he doesn't want the show to repeat itself. And the gracious thing would be to…admire his artistic integrity and thank him for the memories. But on behalf of those (viewers) who are greedy and not gracious, let me remind Mr. Chase that he is making a freaking TV show. TV repeats itself—that's what it's for. Bad shows do it badly, and great shows like The Sopranos do it so well you hardly notice. Every season, New Jersey mob boss Tony Soprano (James Gandolfini) outwits his rivals and deceives his family, friends and therapist, all while remaining oblivious to his failings. His marriage to Carmela (Edie Falco) unravels as he chases anything with legs and hair spray and she pursues sad, unconsummated flirtations." The writer, apparently confident in his predictions, asks, "Anybody got a problem with that?" Not if you are into popular stuff like this show, apparently, but the problem may be that it is indeed so popular (and then too, Italian-American "profiling" could still be OK in New Jersey). We are left to muse over why the term "freak" should pop up in both reviews.

But having gotten some of this "off our chest" for the moment, and if not thereby alienating all potential readers, now is the time to confess where ideas for this writing were generated -- the boob-tube News, of course. Two items of general interest caught my eye and ear in a moment of lowered chagrin. The first had to do with what seemed to constitute the blackballing from certain airwaves of the allegedly infamous "smut peddler" Howard Stern; the action was apparently taken by media executives who presented it as being in the interests of the public weal. From what I could gather this announcement resulted in fitful, and expected, arousal of passions surrounding the issue of freedom of speech, or at least freedom of smut, and echoed right up to and through the halls of Congress. The second, following immediately, had to do with taking pet dogs along with the rest of the family on fairly extended vacations to the great open spaces. Clearly proponents of this practice, the presenters assured us that our city dogs especially, though less accustomed during the remainder of the year to unleashed exercise, would return from such hearty running and romping over the fields in better health and spirits.

Now I don't want to be a partisan here. Not belonging to either in-group absolves me of such a role; I have not read or listened to Stern, and while entitled still to be a dog lover, I am not a dog owner (too lazy). But there are commonalities; for one thing my own area of interest tells me Stern and the dog and I are possessed of similar mammalian brains. Harking back to some elementary studies, specifically the venerable Papez-MacLean theory of emotions, it is understood that while intellectual functions are carried on in the newer (sic), highly developed part of the brain, our emotional behavior is sometimes dominated by a "crude, primitive system" -- older structures that were assumed to have undergone very little change "in the whole course of evolution from mouse to man" (MacLean, 1973). (Briefly, this theory involves the limbic system with its connections from the reticular activating system and from the brain stem, if one wants to get a little more specific. As a whole, the limbic system has been associated with four primary functions: memory, sense of smell, autonomic visceral functions, and emotional behavior. Bear with me on this.)

As is usually the case, the evolutionary implications here are gratuitous; no hard evidence is furnished demonstrating metamorphosis from one life form to the other. While research into microevolution is, especially in the study of under-water microorganisms, fairly clear in showing the changes in functional life forms as endowed by their creator, claims of macroevolution, i.e., monkey uncles, have never been shown to have occurred within the annals of scientific comparative research. In earlier days cultural anthropologists such as Franz Boas [1891] had a tougher row to hoe; they were up against the lavishly and imaginatively drawn figures of creatures from mice, "up" through apes, to variously endowed "racial types" -- humanoid figures of a less and less bent-over biped stance -- from which white men of our fathers' and grandfathers' day could easily find where their place and social destiny belonged (see Harper's, March, 2004). Happily this skullduggery (based often on spurious data from skull measurements) is being stored away in the mustier closets of natural history, though it still hangs about in the shadows of prejudice and superstition. Mammals do have generally similar brain structure, which might just as easily suggest that when a workable model was provided during the process of creation it was not discarded. Together with the history of The Fall, however, just how workable all this has been is still the subject of sometimes bitter debate.

It is precisely because of the human brain and its organization that we could become lost beyond recall as a species, according to Arthur Koestler, the Budapest/Viennese cum British novelist who has written very cogently on scientific topics (Janus: A Summing Up, 1978). Because of a superimposed modern neocortex over a more "primitive", largely unchanged "old" brain, humankind is unable to refrain from going to war with fellow human beings -- and finally from becoming self-destructive in the use of our own (left-hemispheric), sophisticated weaponry. The human brain with which we are endowed may conquer external nature, but may be conquered in turn by the ancient and destructive foe within. Koestler sees an "ontogenetic" principle showing humans to be victims of some subordinate part of this mental hierarchy, which in turn exerts a tyrannical rule over the whole. He posits a situation where "aberrations of the human mind", due to some obsessional pursuit of a part-truth masquerading as a whole truth, lead to sub-level emotional overload: "In rage and panic the sympathico-adrenal apparatus takes over from the cortical centers which normally control behavior. When sex is aroused the gonads seem to take over from the brain." (ibid, 1978).
Since Koestler is clear on the implications of his "ontogenetic principle", as in "ontogeny recapitulates phylogeny", he too tends to see things in Darwinian terms. It is certainly true that people can behave in self-destructive ways (as did Koestler himself, who had led a life much like his novels, such as Darkness at Noon, with stints in the Spanish underground and the French Foreign Legion; he died by his own hand in a suicide pact with his wife in 1983). However, his kind of orientation, far from providing clarification, at most implies the possibility of a sort of emergent process within a given species, producing alterations that are not inconsistent genetically and are usually clearly linked to observable cultural or environmental changes. While exhibiting synthesis, they do not constitute some upward-spiraling, future-pointing form. "There is nothing in emergent evolution that purports to be strictly naturalistic, (or) that precludes an acknowledgment of God." (C. Lloyd Morgan, Emergence, London, Williams and Norgate, 1923). And as in zoology 101, when August Weismann put the Chevalier de Lamarck's (1816) notion of acquired traits to the test, no matter how many mouse tails he cut off, no short-tailed types occurred.

Yet from another standpoint relative to intra-cerebral clashes it has been surmised that "the conflict between the need to belong to a group and the need to be seen as unique and individual is the chief struggle of adolescence". Pertinent thereto are recent findings in fMRI (functional Magnetic Resonance Imaging) studies showing an unexpected increase in growth of cortical, prefrontal brain cells, the most lavish since infancy, during early adolescence (Giedd, J., NATURE, March 9, 2000). The prefrontal cortex "acts as the CEO of the brain, controlling planning, working memory, organization, and modulating mood…. (It has been) dubbed the area of sober second thought". (Perhaps the prefrontal cortex can be seen as a culturally developing foil for the "primitive" brain.) Researchers conclude from these findings that "If a teen is doing music, sports or academics, those are the connections that will be hard wired. If they are lying on the couch or playing video games or watching MTV, those are the cells and connections that are going to survive." These are seen as the areas of main involvement for the rest of the young person's life. (One may speculate that those kids who want to be particularly "cool", shocking, rude or just attention-getting might find characters like Howard Stern a convenient vehicle. Unfortunately, if they become too deeply involved they may be the carriers of our next cultural trappings -- bringing what they have studied so closely into the rest of their lives and the lives of others, to say nothing of a possible trail of gang violence from the "hood".)

Considerations such as these refer us to other studies of brain-function effects done by sociologists, and also to split-brain research by central nervous system experts in the study of brain behavior. In the literature of bilateral hemispheric affective functioning there are implications for complex human behavior, as in the work of Warren TenHouten (1985) at UCLA, who applied the theory of cerebral lateralization to the sociology of knowledge. His work takes off from the ground-breaking studies of Bogen and Bogen (1969, ff.), who conducted early surgeries on "split-brain" patients. TenHouten quotes Emile Durkheim on the "constitutional duality of human nature" (roughly Koestler's theme, above): "The old formula homo duplex is therefore verified by the facts. Far from being simple, our inner life has something like a double center of gravity. On the one hand is our individuality -- and, more particularly, our body in which it is based; on the other is everything in us that expresses something other than ourselves."

TenHouten applied these ideas to a study of economic organization in modern society and its cognitive styles, but for our purposes he states: "The two modes of thought Durkheim saw as characterizing the human mind have parallels in the relationship between self and society. At one pole we find the society within the consciousness of the individual; at the other pole, the individual's consciousness within the society. This distinction … parallels Bogen's speculation … that each hemisphere represents its own other and the world in complementary mappings, such that the left hemisphere maps the self as a subset of the world, and the right hemisphere maps the world as a subset of the self." If this reasoning seems complex, one can imagine the struggle that each teen-ager must go through in order to achieve the needed feeling of belonging to a group, and yet feel individual and unique -- all this at the same time that the prefrontal cortex of each differentiated hemisphere is expanding rapidly, puberty is being established and barely settled in, and when they are often urged off to war.

To take a closer look at the process of acculturation in the context of brain behavior, elementary theory and practice from social-psychological studies are instructive. Personal experience in using an old classroom "trick", in both the United States and England, demonstrates that certain behavioral responses are predictable. People pulled in off the street would probably work as well, but for convenience's sake one may "round up the usual suspects" -- a university-level class of perhaps 40 people or so -- and divide them into two groups, which are then usually nearly equivalent in terms of age, sex, socio-economic grouping, intelligence and educational level. Next, send group A and group B to separate rooms with no communication between the two. Each person (or subject) is provided with the same list of 10 or 12 community figures such as lawyers, doctors, technical workers, nurses, etc.; each list also contains the term politicians as one of those community figures.

Group A is told that they are to rank these personages in terms of their value to their community. The instructor notes, as an example, that students across the country at the same grade level as group A tended to rank politicians quite highly. Exactly the same instructions are given to group B, except that one word only is changed: the word "low" is substituted for the word "highly". With rare exception the resulting differences in ranking of politicians between the two groups are not only in the expected direction but often reach statistical significance.

This is an elementary (and old) propaganda device called the Bandwagon effect, as in, "get on board, everybody is doing it". A similar device was used in the Milgram study noted below, the Voice of Authority: when General so-and-so does something, "it is OK for me to do it too". Leaving aside the scariness of how easily opinion polls or other behavior may be manipulated, deeper implications appear when the subjects are asked why they ranked politicians as they had. Invariably those subjects who had ranked pols highly said their thoughts had turned to the more statesman-like personalities in the news, while subjects in the "low-value" group B said they had thought of the likes of Tammany Hall ward heelers. In other words, you do not have to get people to change their minds about anything in order to alter their behavioral responses; they will comply simply by selecting a compatible response from the total range of culturally defined meanings. They will not only rationalize their answer, but will feel quite righteous in the process.
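For readers curious about the arithmetic behind the earlier claim of "statistical significance", here is a minimal sketch, in Python, of how the two-group rankings might be scored and compared. Everything in it is hypothetical: the group sizes, the strength of the suggestion effect, and the spread of individual rankings are illustrative assumptions, not data from the actual classes described above.

```python
# Minimal sketch (hypothetical numbers) of scoring the classroom "bandwagon"
# exercise: two groups rank the same list of community figures, differing
# only in whether peers supposedly ranked politicians "highly" or "low".
import random
from scipy import stats

random.seed(42)  # reproducible illustration

def simulate_politician_ranks(n_subjects, bias):
    """Return each subject's rank (1 = most valued, 12 = least) for 'politicians'.

    `bias` nudges ranks toward the suggested direction; the standard
    deviation of 2 models ordinary individual variation. Both values
    are assumptions for illustration only.
    """
    ranks = []
    for _ in range(n_subjects):
        r = round(random.gauss(6 + bias, 2))   # neutral midpoint shifted by suggestion
        ranks.append(min(max(r, 1), 12))       # clip to the 12-item list
    return ranks

group_a = simulate_politician_ranks(20, bias=-2)  # told peers ranked politicians highly
group_b = simulate_politician_ranks(20, bias=+2)  # told peers ranked politicians low

# A two-sample t-test asks whether the two groups' mean ranks differ
# by more than chance variation alone would explain.
t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"mean rank, group A: {sum(group_a) / len(group_a):.1f}")
print(f"mean rank, group B: {sum(group_b) / len(group_b):.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

In this toy simulation, a suggestion strong enough to shift the average rank by a couple of places will usually yield, even with classroom-sized groups of 20, a p-value well below the conventional .05 threshold -- which is all the claim of "statistical significance" above amounts to.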

The point here is that we can in this way see how acculturation works; having been socialized to the extent of identifying with (or against) certain social groups, people often find that their ideas of right and wrong, or good and bad, come with the territory. Furthermore, they are usually convinced that most of it is their own idea. All the while our brains store these concepts and ideas in special ways in order to, among other things, keep us feeling sane. (That is what "rationalizing" means.) In this process the intellectual functioning of the upper reaches of our brain is constantly influenced by the "lower" realms, said to belong to the crocodile and the horse as well as to us humans. This brings us back to the dog if not to Howard Stern.

In my neighborhood dogs are mostly well behaved, in dog-owner terms, and usually leashed anyway. Through puppydom and even beyond, some can be foolishly loving and friendly to a fault toward strangers as well as family and friends. It is difficult at times not to notice, though they maintain an air of confident, companionable abandon, their forthright and frequent public production of excrement. There are also the uninhibited sexual liaisons, or at least earnest attempts, nothing daunted by the discomfiture often expressed by human onlookers. Our beloved pets can also sometimes aggressively snap or bite. All this may be traced to the effects of a paucity of cortical (inhibitory) brain structure and a failure to automatically acquire our socially acceptable, "nicer" traits; if the noble animal assumes that we humans do these same things too, it is of course being more reasonable than perhaps we deserve, but in the absence of specific intensive training dogs still fail to acquire our cultural inhibitions (or what seems to be left of them). Dutiful owners, however, follow along with plastic bags and scoops, even if their pets vaguely wonder about their masters' sense of values; they protect their own floors and at least harvest gratitude from neighbors, while giving interceptive tugs on the leash. The owner thus becomes the cerebral cortex, conscience, or superego, if you will, of a creature already possessed of an excellent, standard-model, no-frills, "more primitive" mammalian brain.

Our own model has more bells and whistles; we feel and do the same as our canine soul mates and more, only with a weather eye out for what other people, or our consciences, will say about it. Wolves and dogs are social animals and, like humans, tend to congregate in groups. They are, however, relatively unburdened by our culture, popular or otherwise. Perhaps I belabor the obvious here, but it should be clear that our cultural features are acquired and learned; other creatures seem severely limited in the ability to acquire extended social characteristics, Clever Hans the Wonder Horse not excepted.

When we wish to consider the "higher" intellectual functions of the human brain, what comes to mind (no pun intended) is the capacity for cognition, including synthesis, fantasy, and symbolism, usually associated with "creative" expressions. Special areas of the brain have been carefully mapped out and isolated where these functions are at least mediated, and they play starring roles in our notions of culture (Arieti, 1976). Lest we forget, there are also areas of reasoning, judgment and memory for past learning (and past errors). After an emotional storm, including wrong behavior, we can usually say, at least inwardly, "I knew all the time it was not right". But are these brain areas separate and exclusive from the phylogenetically more primitive brain of fight, flight and lust shared by our four-footed cohabitants? This question involves the aforementioned limbic system which, as it turns out, is pervasive throughout the brain -- so much so that some writers appeared to give up on the idea of any specificity for its functions. Researchers (Weil, et al., 1974) stated, "…before we become discouraged with the concept of the limbic system we should examine closely the question of whether or not affect itself pervades all aspects of behavior. We obviously believe that it does on both the sensory and motor sides of the coin."

There is no reason, therefore, to think of some parts of the brain as becoming non-functional and disconnected while other parts function all alone. In the living and intact organism all parts of the brain and its neural centers hum right along, no matter what; some parts may just be "sleepier" than others. As with the adolescents above, we can entertain two (or more) cerebral crusades at the same time.

It should be obvious, however, that even though it resides there, our cultural identity does not come from the brain itself: it comes to us from our human environment. There is ample evidence, for example from studies of feral children (Candland, D.K., 1993), a subject perhaps not getting much press lately, demonstrating stunted growth and development when a surrounding social group is absent. Children adopted into cultures different from that of their parents amply reflect how little is carried away with them in terms of language, values and beliefs. (There are also those of us who are convinced that we were stolen at birth, or shortly thereafter, from rich and famous homes by Gypsies. The conviction becomes more pronounced during the teen-age years, when it is realized that we were, in fact, given into the hands of crude and unfeeling peasants who are notably mean and miserly in matters of the family car.)
Emotions, and the temperament that goes with them, do appear early in life and thus are most likely brain-related in special ways. They may profoundly influence our cultural attitudes and beliefs; efforts at tapping these features in various populations often have taken the form of scales, or questionnaires, ideally administered by experienced social scientists, in an attempt to tease out the underlying nature or origin of our belief systems. Following the interest in the "trait" of authoritarianism with the widely used "F scale" of T. W. Adorno, et al. (measuring fascist tendencies during the 1940s), Milton Rokeach (1960, ff.) developed a scale to measure social preferences entitled the Dogmatism, or "Dog", scale. While probably not revealing many hard-wired character traits, the scale did show our strong tendency to prefer to socialize, live or join with people or groups seen as most like the way we perceive ourselves to be (among the extreme end-groups in those days were Arabs and Islamic fundamentalists). This scale had the added usefulness of measuring political attitudes and racial or ethnic bias.
But let's get back to Howard Stern and, hopefully, some cerebral relatedness. For one thing, I cannot imagine Stern doing his "schtick" without carrying in his head, at one and the same time, a profound appreciation for what he may regard as the prudes and strait-laced "hypocrites" out there in radioland. He, along with his aficionados, could not enjoy the genital and anal, in-your-face preoccupation nearly as much without picturing, somewhere in their own heads, a large segment of the audience exhibiting shock or disapproval. In that sense, the "Sterns" of this world (there should be a play on the words for back-sides and nether regions here) vitally need their cultural opposites (or perhaps we could say "fore-parts") in order to exist at all.
This thinking takes us along a path suggesting the need we have for others (the existential "Other") in order to develop an identity of our own; contrary to Koestler and Durkheim, this path most likely represents dialectic relationships rather than dualistic ones (concepts which have already been dealt with exhaustively by certain Nordic and German philosophers, antedating, but possibly anticipating, this present writing). For our purposes, however, Stern's activation of the higher brain centers, right along with expressing the "primitive" body functions, demonstrates how you can't do one without the other. A dialectic relationship presumes that each polar opposite needs the other to exist, and will eventually be joined together with the other in some future resolution. Again, like the teen-agers, apparently opposite "projects" in cerebral centers wend their way to some consensual conclusion, so be supportive of those kids as they try to find that way. It is maintained here that throughout the ebb and flow of popular cultural change over the ages, the brain, together with its various component parts, performs much the same duties no matter what social context is presented to it.
In a particular era, say the late Victorian of most of the western world, certain sexual behaviors (together with their symbolism) may be stringently held in suppressed, pre-conscious or even repressed, "unconscious" check within the cognitive mazes of the mind -- a word signifying brain + culture (which may be much more than that, but not less). Proceeding through Edwardian times and the industrial revolution to the 1920s, a reversal in attitudes begins to take place as many younger people suppress their learned cultural inhibitions in order to join in -- and more importantly fit in -- with their most "popular" peers (as they did with their lusty superiors in Edward's court). And so it goes, back and forth; during seasons of change, and between dissimilar or warring groups, the "upper" cerebral centers play a reciprocal hide-and-seek with the "lower" centers (sometimes in the form of outright denial), and with the current group-culture of one's peers, be they gang, neighborhood, political party or nation.

So this is of course why it may be said that people these days are as repressed and hypocritical as ever they were. Those tendencies are obdurate and durable because to change them one must change one's entire identity to a greater or lesser degree. That is a little like asking a leopard to change its spots, especially since the leopard may see no compelling reason to do so. In the case of human socio-cultural identity such a change often requires us to join the "enemy", or at least the opposition. When it occurs we may often then realize that God can do amazing things; He can even induce us to change into his loving children!

ADDENDUM (and stray thoughts)

Of the social elements that can apparently redirect attitudes, beliefs and cultural orientation, one's religion, or lack of it, is most prominent throughout world history. Especially in times of crisis, personal or world-wide, people go through changes in spiritual interests and concerns; either at the urging of authorities, or voluntarily, there is frequently a change in faith. Ancient Jews, for example, became much more Messianic after the fall of their temple in AD 70 (they were particularly hopeful for the advent of a Messiah with military skills). As "fate or fortune" would have it, today, in the midst of increased interest in the Bible and an unprecedented growth in church attendance -- and our concern here with culture and the field of entertainment -- comes the Passion film by Mel Gibson.

I have not seen it nor do I plan to do so, but from the many reviews both pro and con, Gibson has done something outstanding, not only for Hollywood, but in the annals of entertainment history; he deserves much credit for that. Language is a powerful cultural medium, and in order to understand its meaning it is useful to go to historical roots -- this approach might be helpful in parsing the film. For example there is the Anglo-American penchant for showmanship; "ballyhoo" is an inspirational art, as P. T. Barnum proved. He knew the value of passing out samples and free passes to the right people. And there is the attraction of a well-known, dedicated actor who has specialized in violent action pictures. His previous work is said to have been lavish in blood-letting scenes traditionally surrounding good-guys versus evil villains (but he cannot be held accountable for doing what Hollywood always insists upon as rites of passage). Gibson implies that he was directed by heavenly influences in order to carry out this "Biblical" production. Apparently not a newcomer to religion, he has a Catholic background and has evidently done some Bible reading -- he remarked how very pleased he was to see how "Marian" his film turned out.

Evangelical Christians, who have not always been sympathetic to Catholic iconography, are reported to have attended in "droves". In spite of side-line complaints about its free-wheeling interpretations and narrow focus on a torturous, drawn-out death unrelieved by the rest of the Gospel, it is accepted as a really important, brand-new "entertainment" product with a message. In fact, it appears to be rather an astounding achievement. Cultural binding was not mentioned here before, but the powerful limitations it imposes keep most of us within the close confines of the ideological space we occupy throughout life -- new paradigms and totally new ideas are rare; while clearly emerging from it, most new ideas bear the stamp of all the rest of our popular culture. Besides, as a famous economist (Galbraith) once said, "Most new ideas are bad ones" -- Gibson's is apparently not a bad one. He is dealing in a medium that produces something truly new only once every thirty years or so ("talking pictures" being one of them), and the public reaction to his work, mostly word-of-mouth, is rarer still; then too, there is the precedent-setting box office!

It remains to be seen whether the Lord is turning all of it into something also essentially good for the long run, but a further word about brain effects: Gibson himself noted, as did others, that the most immediate reaction to seeing the Passion is a profound silence. That seems to be the effect of witnessing two hours devoted to the writhing under extremely painful, deadly wounds inflicted by arrogant and intentionally cruel executioners upon the body of a beloved figure of surpassing kindness -- and bearer of hope for our immortal souls. Perhaps the very notoriety may bring in pre-believers, who might thereby get a taste of how His followers, "who did not yet call themselves Christians", may have reacted; it is not likely that many viewers will leave the theater as from the regular cinema fare, or even a powerful Sunday sermon, gushing, "O, I just adored it!" There is surely a moment or two of cognitive dissonance, requiring time to erect neural defenses against images of the raw behavior of our more "primitive brain", as the "higher" cortical centers begin to permit us to gulp down the terrible effrontery of our sins, seen there weighing heavily upon the cross. These are the centers that also become glazed over during the showing itself.

Over periods of more prolonged exposure to severe scenes or experiences of painful and repeated wounds, adaptation occurs, so that bigger and sharper pains are required to achieve the same degree of shock. That is why professional torturers are trained in confining "camps" with long periods of pain, insult and degradation to self and others, and little time-out from the task. (Also a strong reason not to raise young children with insult and injury as behavior-modifiers.) Executions, such as hangings, were carried out in public long before news became a mass medium, and sensationalism always drew crowds of onlookers. Even the more "high-minded" of us have trouble pulling away from a spectacle of disaster in the street. This reaction is probably linked with self-preservation; if there is danger out there we want to see it coming, if only for "the next time". In terms of entertainment, circuses in the Roman Colosseum catered to a similar mixture of thrill/repugnance and serious, study-oriented curiosity, with a covert concern by spectators about how they might themselves go through the same fates. Another category drawing interested onlookers is that of the flagellants who whip themselves till blood flows in the streets, or the "pilgrims" who crawl on raw, bleeding hands and knees, intent on demonstrating contrition by experiencing some of the misery of Jesus.

Here it is maintained that human cultural trends always seem to demand an ebb and flow from higher to lower brain centers, spanning from one era to another. It was reported that Johann Sebastian Bach, in composing his St. Matthew Passion (circa 1727), was so aware of possible anti-Semitic overtones that he made sure his entire chorus -- Jew, Gentile, assorted rabble and Romans -- cried out for crucifixion. The prestigious journal of the American Medical Association (JAMA 255, No. 11, 1986) published an article, The Crucifixion of Jesus, in which the details of his death, as it might have occurred from a medical perspective -- including prolonged flagellation -- were described in terms very similar to Gibson's Passion. Did he read it before the filming? In his place I might not have proceeded -- I don't have the stomach for it. It is said that the film brings home more profoundly what the Son of Man went through in absolving us of our sins. Personally I sometimes find it difficult to read through parts of the crucifixion accounts in the four Gospels, even though, thanks be to God, the Good News is included therein.

Sunday, June 12, 2005

POPULAR CULTURE AND THE BRAIN – REPRISE

Men never do evil so fully and so happily as when they do it for conscience's sake. Blaise Pascal (Pensées, 1660).

An essay on relationships between brain function and social-cultural behavior was begun in March of 2004, before the news of failures by U.S. military personnel to abide by the Geneva Convention in our military prisons, or the televised murder of Nicholas Berg by beheading, became common currency. Also noted was the well-known quote above from Blaise Pascal, which might have served to blunt some of the surprise following the news reports without doing much to cushion their shock or repugnance. In fact, all the pros and cons about who is responsible for such breaches of conduct were investigated at least 30 years before. That statement by Pascal is true of nearly all people at nearly all times, in varying degrees; psychologist Stanley Milgram demonstrated it in his landmark study "Perils of Obedience", and Robert Jay Lifton wrote of it when describing the warm, affectionate home life of Nazi doctors who concurrently were conducting brutal experiments on prisoners of all ages in German concentration camps. He referred to this effect as "doubling", the tendency to be two different people in disparate roles. An overview of such studies suggests that the primary variables involved in the taking on of sadistic behaviors by most ordinary people are: (1) the general development of one's moral sense, or conscience; (2) how close to active consciousness, and how deeply impacted by a socialized conscience, is our emotional quotient for sadism; and (3) how culturally and socially susceptible we are to authoritarian or to peer influences.

Philip Zimbardo, past president of the American Psychological Association, is remembered for, among other works, a study he conducted in 1971 entitled the Stanford Prison Experiment, in which a dozen bright, likeable students perpetrated Iraqi/U.S. prison-type torture and abuse on their "prisoners". The latter were known by the perpetrators to be fellow students who were "imprisoned" for experimental purposes; one week into the study it had to be halted due to violent behavior, mostly on the part of the "guards". The observers voiced outspoken fears of irreversible effects, and of their own susceptibility to long-standing emotional repercussions.

In that day and age, in the context of the death of prisoner George Jackson, killed in San Quentin, and other news similar to that of the Attica prison riots of 1971, the Stanford Prison Experiment was indeed "newsworthy". It told the world how "ordinary people, middle-class college students, can do things they would have never believed they were capable of doing. It seemed to say, as Hannah Arendt said of Adolf Eichmann, that normal people can take ghastly action". Answering an advertisement in the Palo Alto Times and following interviews and a battery of tests, the two dozen applicants judged to be the most normal, average, and healthy were randomly assigned to be either prisoners or guards. Zimbardo's reasoning was given as an interest in focusing "on the power of roles, rules, symbols, group identity and situational validation of behavior that would repulse ordinary individuals…. I had been conducting research for some years on deindividuation, vandalism and dehumanization that illustrated the ease with which ordinary people could be led to engage in anti-social acts by putting them in situations where they felt anonymous, or they could perceive of others in ways that made them less than human, as enemies or objects" (Toronto, 1996). Zimbardo had wondered, in the course of experiment planning, "…what would happen if we aggregated all of these processes, making some subjects feel deindividuated, others dehumanized within an anonymous environment in the same experimental setting, and where we could carefully document the process over time." The study showed the development of danger to individuals early on; even though the guards had been instructed not to use violence but to maintain control of the prison, the "worst instances of abuse occurred in the middle of the night when the 'guards' thought the staff was not watching … (and) resulted in extreme stress reactions that forced us to release five prisoners, one a day, prematurely." Zimbardo told the Toronto symposium in 1996 that his prison experiment "was both ethical and unethical". It was unethical, he said, "because people suffered and others were allowed to inflict pain and humiliation on their fellows over an extended period of time. And yes, although we ended the study a week earlier than planned, we did not end it soon enough."

There is a current trend to account for such behaviors as "hard-wired" or "genetically" ordained for some people (see Whose Life Would You Save?, Discover magazine, April, 2004), but the evidence remains poorly substantiated and heavily influenced by a certain popular infatuation with electronic gadgetry. Developmental fantasies put forward by researchers about what human brain-behavior was like thousands of years ago still have no more credence today than does anybody else's educated guess. In the otherwise impressive efforts to establish maps of localized brain functions, observing a particular neuronal response to experimental stimuli seems currently to concentrate all attention on the MRI scanner. Mapping of the more painstaking, perhaps pedestrian, effects of socialization and acculturation, to say nothing of concomitant personality variables, tends to be almost ignored. Also often given short shrift are efforts to show developmental patterns of brain-behavior within various age groups. The work of J. Giedd (NATURE, March 9, 2000), showing a marked increase in growth of prefrontal brain cells in early teen years using fMRI (functional Magnetic Resonance Imaging), is a redeeming case in point.

In previously relating popular culture to brain function it was noted that while making such a connection might seem somewhat oxymoronic to some, it was and is presented in all seriousness. The perils of bias and cultural, time-binding effects are rampant in such ventures, however; efforts at off-setting the major and more inevitable consequences were attempted in part by stating some of the biases and personal, probably idiosyncratic, opinions at the outset. For one thing, it should come as no surprise to anyone that what are often considered to be new and innovative attitudes and practices may be old stuff garbed in the latest fashions. Consider in particular the assumption that in "the old days" people were more repressive and hypocritical about their "true" feelings, especially about sexual practices, sexual openness in general, public cursing and ethnic preferences, to name a few sensitive spots. It was maintained here that in numerous ways it can be shown that people today are generally just as hypocritical and repressive about their underlying feelings and practices as ever they were; they just seem to be selecting different elements to repress, or veil. What is openly presented is in line with their current cultural values.

The same considerations apply to our social-cultural management of aggression; differences over time are shown primarily in terms of which values are suppressed and which values are in plain view. Not only Pascal, but Cesar Chavez, that nonviolent warrior, saw the underlying and overt relationship when he wrote in 1968, "In some cases, nonviolence requires more militancy than violence." Their observations shed some glimmer of light on the famed study conducted by Stanley Milgram in the late 1960s, sometimes titled "Perils of Obedience". Milgram referred to the writings of Hannah Arendt (1950), who asserted that evil, such as was seen in Germany during the Holocaust, was perpetrated by very ordinary, run-of-the-mill citizens under the various propagandist devices of German authority. The Milgram study showed this observation to be more than hypothetically true in terms of the resultant unbridled willingness of naïve U.S. persons, selected at random, under the direction of white-coated, authoritarian, clip-board-bearing experimenters, to administer torture in the form of apparently painfully disabling electric shock to fellow human beings.

So who should bear the onus of guilt and criminal responsibility for lapses in humane treatment of prisoners? The guards are the perpetrators and should face some punishment, but as Philip Zimbardo and others clearly demonstrated, it is a failure of proper, ordinary management of any detention center and its program that matters most. Whether it is a government such as one imbued with Nazi propaganda issues, or an entity overseeing prisoner detention in the most benign setting, it is always the fault of those responsible for guarding the guards from their own worst impulses.

Wednesday, June 08, 2005

OUT ON THE OCEAN -- OR A SEA OF TROUBLES

From my bedroom window, looking southwest through low clouds, the wintry sun seemed to be splashing its pale beams on an immense, cold Pacific, with little or no warming effect. A break in the in-rushing series of storms had only created a clearer scene of desolation; perhaps in these lulls between attacks we often see more clearly the power of an enemy. But after all, an important man in New Testament times, called Cephas by Paul, and Simon or Peter by Jesus, seemed to have gathered much-needed strength during just these times; between the spiritual and worldly storms in those days, he may have had to look back several times to reassess his purposes.

New storms -- a sea of troubles -- can wreak terrible havoc. Is it time to take arms against a sea of troubles, and by opposing, end them? Peter proved early in life to be quick to take arms. He had been devoted from his early Galilean years to awaiting the advent of a Messiah with power to overcome persecution and suffering, the two biggest enemies in the only life with which he was familiar. Or should we rather bear those ills we have, than fly to others we know not of? Much of the time Peter had apparently expected a messiah who would strike down the cruel and aggressive Roman legions. The examples set by Jesus, to bear the persecution, to undergo the suffering and agonies brought by others without protest, put Peter in a maelstrom of conflict and doubt. Through new storms of opposition and persecution, however, he became one who did in fact learn how to bear those ills.

Peter developed a powerful hope in the face of an imperfect world. It is clear that he did not regard it as a kind or friendly place; he must have often felt as if he were "a stranger in an alien land", and his hope clearly could not come from there. As we see the hardship wrought in our own time by nature and by man in all parts of this globe, it is just possible to see his point of view. Through violent insurgency, or reports of prisoner mistreatment by all factions, or the tsunami-ravaged towns, villages and people who lived there, we also know this present planet and its ways all too well; from where did hope spring?

How does one find new ways in place of old ones? In The New Yorker of January 17, 2005, Dan Baum writes a piece entitled Battle Lessons, What the Generals Didn't Know. Mainly this is a military-oriented article describing ways our fighting force can be deadlier to the enemy and more protective of our own -- features which are invaluable in conducting a war. Baum notes that learning is taking place in the field and soldiers teach one another as they go. The first example he gives is, to this writer, awe-inspiring and clearly unlearned behavior. "Watching TV," he recalls, "on the morning of April 3rd, as the Army and the Marines were closing in on Baghdad, I happened to look up at what appeared to be a disaster in the making. A small unit of American soldiers was walking along a street in Najaf when hundreds of Iraqis poured out of the buildings on either side. Fists waving, throats taut, they pressed in on the Americans, who glanced at one another in terror. I reached for the remote and turned up the sound. The Iraqis were shrieking, frantic with rage. From the way the lens was lurching, the cameraman seemed as frightened as the soldiers. This is it, I thought. A shot will come from somewhere, the Americans will open fire, and the world will witness the My Lai massacre of the Iraq war. At that moment, an American officer stepped through the crowd holding his rifle high over his head with the barrel pointed to the ground. Against the backdrop of the seething crowd, it was a striking gesture -- almost Biblical. 'Take a knee,' the officer said, impassive behind surfer sunglasses. The soldiers looked at him as if he were crazy. Then, one after another, swaying in their bulky body armor and gear, they knelt before the boiling crowd and pointed their guns at the ground. The Iraqis fell silent, and their anger subsided. The officer ordered his men to withdraw."

The officer "was trying that day to get in touch with Grand Ayatollah Ali al-Sistani, a delicate task that the Army considered politically crucial. American gunfire would have made it impossible. The Iraqis already felt that the Americans were disrespecting their mosque. The obvious solution to Hughes (the officer) was a gesture of respect." (p. 42).

Lieutenant Colonel Chris Hughes is at this writing rotated home and attending the Army War College in Pennsylvania. On the day in question he did something unexpected; "shortly before the invasion the Army had (despairingly) concluded that its officers lacked the ability to do precisely what he did, innovate and think creatively". He had responded with insight -- and courage.

But this, I am sure, must be similar to the way Peter had to learn patience and hope. He had a great leader who was unhesitating in his submission to the cross, which was God's plan. Peter, who had such great expectations for a strong, aggressive arm of the Lord, had to learn that it was not only the despotic Romans, but he himself, who had to submit to God in the person of His Son. Throughout the rest of his life after Calvary, Peter followed in the steps of Jesus and finally, we are told, died the same death as his Lord and Master; all that takes great patience, faith, determination and courage. Cesar Chavez, a great nonviolent leader in our time, said it well: "Sometimes nonviolence takes more militancy than violence."

Monday, June 06, 2005

ON A BLEAK SATURDAY

On a bleak Saturday afternoon in January the rare but clearly predictable southern California storms were raging, though fairly subdued in my own neighborhood; so much so that out of my kitchen window there appeared the uplifting sight of flocks of white sails, each of 12 or 15 small boats, out on the rainy bay. Momentarily I perceived them as flocks of white doves afloat against the dark and lowering sky and water; the two-person crews were testing their seamanship aweather rather than alee as usual, and while one or two small vessels were yawing wildly along the course, all regained port safely. Alone and under the weather physically and spiritually, it occurred to me that for 3 days I had been keeping my household lights burning during the day and most of the night, as if on the alert for encroaching dangers -- two if by sea(?). In the face of news of tsunamis in Asia, floods in Scandinavia and the northern UK, and southern California weather turning deadly, together with news of unremitting warfare in Iraq, I had been wandering in prayer for a meaning of sorts.

Hendrik Hertzberg (The New Yorker, Jan. 17, 2005) noted that "The terrible arbitrariness of the (tsunami) disaster has troubled clergymen of many persuasions. The Archbishop of Canterbury is among those newly struggling with the old question of how a just and loving God could permit, let alone will, such an undeserved horror." (p. 35). I had thought of the white sails as signs, perhaps a kind of omen of oceanic hope, but the whole question apparently goes back to one of the oldest, if not the oldest, book in the Jewish bible. In a brief review of Job's experiences, William Safire, a New York Times columnist, concludes in Where Was God? (Jan. 2005): "(1) Victims of this cataclysm in no way 'deserved' a fate inflicted by the Leviathanic force of nature. (2) Questioning God's inscrutable ways has its exemplar in the Bible and need not undermine faith. (3) Humanity's obligation to ameliorate injustice on earth is being expressed in a surge of generosity that refutes … cynicism." At least I know it is not only me; these are the times when many people, in many different places, are asking the big questions: "Where were you, God? How could you let these things happen?" Safire's comments are germane, but fall short of answering the present-day bewilderment of hordes of victims and onlookers. Hertzberg says it more directly as he goes on to amplify our human plight: "Nearly four million men, women, and children have died as a consequence of the Congo civil war. Seventy thousand have perished in the Darfur region of Sudan. In the year just ended, scores of thousands died in wars and massacres elsewhere in Africa, in Asia, in the archipelagoes of the Pacific, and, of course, in Iraq. Less dramatically, but just as lethally, two million people died of malaria around the world, and another million and a half of diarrhea. Five million children died of hunger. Three million people died of AIDS, mostly in Africa. The suffering of these untimely deaths---whether inflicted by deliberate violence, the result of human agency, or by avoidable or treatable malady, the result of human neglect---is multiplied by heartbroken parents and spouses, numbed and abandoned children, and, often, survivors vulnerable to disease and starvation, and dependent, if they are lucky, on the spotty kindness of strangers." Hertzberg adds, "The giant wave that radiated from western Sumatra on the day after Christmas destroyed the lives of at least a hundred and fifty thousand people and the livelihoods of millions more. A hundred and fifty thousand; fifty times the toll of 9/11, but 'only' a few percent of that of the year's slower, more diffuse horrors. The routine disasters of war and pestilence do, of course, call forth a measure of relief from public and private agencies (and to note that this relief is almost always inadequate is merely to highlight the dedication of those who deliver it). But the great tsunami has struck a deeper chord of sympathy."

Why do we not consider a different question: why do we still not stop to think or wonder whether our Lord has not been asking similar questions of us over the years? "O man, you have been here many times before; you know what 'natural' events and your own propensity for violence can do. Where are you when it is time to build the ark, to gather the flock, to prepare for future safety while peaceful resolutions are still possible? Where are the early warning signals that only now, in India at least, you think might be put in place? When did you work and plan and spend your military and defense budgets for flood and famine relief worldwide ahead of time, instead of creating more death and destruction? When did you plan for the loving and secure communities that you are capable of building, the kind of world I made you for and reminded you of in the Garden?" Why do we play to the crowd and so rarely think first of others rather than rush to profit-taking -- and then sue each other for unrequited love after the fact?

During this musing one of the newer TV programs came on, hosted by an attractive young lady named Maria, recently employed in the trading pits of the NY stock exchange (where long-term problem solving is usually scarce). A good presenter, she had the task of asking one of our US senators how future plans for one famous pre-planned safety cushion, the Social Security Program, are faring in government circles. To her obvious (mock?) surprise, since this issue had been kicked around a lot during these last election days, the senator seemed to recall only the most recent words on the subject and stated that there were no plans in place at all at present. Whether, or if ever, our Social Security system is revamped to meet participant needs, currently it is clear that this problem is high on many legislative agendas, often for purposes of political leverage. It is slow to emerge in enactment form because it involves the future, and many a legislator shies away from the role of prophet until the vote counts are in -- and also such measures are heartily wished dead by some of our lawmakers. Like world peace itself, these issues are usually designed for aftermaths, for picking up the pieces after disaster has struck. Left to welfare agencies and "the spotty kindness of strangers", we will be asked to contribute to the results of catastrophes that happen regularly -- in some regions on a daily basis. We may seek and even find alternatives to fuel oil after it is gone, or at least after pouring a lot more of our money into Middle Eastern pockets.

Even the well-honed techniques of management, if applied to arcane government processes, could improve matters; this idea is usually rejected, however (dismissed by staffers as useful only for "profit-making" agencies), but many believe everyday business practices could well institute change as opposed to the status quo. (See recent reports on snarled government procedures.) Pending solutions are often left to sketchy "relief" organizations that tend to be self-perpetuating and refer frequently to do-gooder phraseology having to do with the importance of giving "in time of need" -- and that will spring up again to report the next need, and so on.

Are not they who say "yes, we will help repair the damage that could perhaps have been prevented" very much like the "whited sepulchers" of old? They only confess the duplicity, and their own complicity, in self-oriented planning. In any event, I cling to the notion that those "white doves" out on the stormy bay signify peace and the loving nature of the one who said that those who seek wisdom must first really seek the Kingdom -- presumably before they make up their minds about what constitutes wisdom. Some day we will learn to give the right answer: the UN General Assembly met on 1/24/05 to commemorate the 60th anniversary of the liberation of the Nazi death camps. Elie Wiesel spoke as follows: "The Jewish witness that I am speaks of my people's suffering as a warning. He sounds the alarm to prevent these tragedies being done to others. And yes, I am convinced if the world had listened to those of us who tried to speak we may have prevented Darfur, Cambodia, Bosnia, and naturally Rwanda." The Arab countries were notable by their absence, and the UN itself struggles to maintain a leadership presence -- but such a body is sorely needed as we move steadily toward increased armed conflict around the globe.

We may still, however, be too busy following Alexander Pope's directive, "Presume not God to scan; the proper study of mankind is man." We can only learn worldly solutions to old problems that way -- and find that they usually do not work. Although our Father in heaven has given us the promise of future happiness in the world to come, even now we struggle to make our way to its gates.