There are those who will see this title as faintly oxymoronic, but it is presented here in all seriousness. Sailing through current cultural waters is not always plain or clear; to many of us who have lived long enough to know other times (and other places) it is at least apparent that “The times they are a-changin’…” as they always appear to do. It is also obvious that many things considered new and innovative are the old things garbed in the latest fashions. This should come as a surprise to no one, but consider for the moment the charge that in the old days people were more repressive and hypocritical than at present, especially about sexual practices, sexual openness, public cursing and racial issues, to name a few sensitive spots. (Leaving aside age differences for the moment, perhaps their prerogative, some members of my generation may claim to see clearly “how fast everything is going straight to Hell.”) But the same considerations about underlying feelings apply to the social management of aggression. The differences are shown primarily in terms of which values are suppressed and which values are in plain view.
A tendency has been noted for people at large to maintain their old ways at all costs: “The two enemies of reform are the knaves who oppose it and the fools who favor it” (Anon). It has also been said that most people prefer old problems to new solutions. Taken together these tendencies, along with the viability of the ideas presented above, may work to maintain an anachronistic, inner neural and emotional stability, within an apparently changeful external world.
Behind one series of shifts of cultural focus, however, there appears a more puzzling level of complexity, i.e., the infusion, engulfment and wholesale take-over of almost every phase of what passes for popular culture by the field of entertainment. While not essentially peculiar to the U.S. or to this century, in terms of public adulation of well-known figures, we seem to have gone whole-hog for being mesmerized out of thinking about much other than circus-circus. In the early days of silent movies, Valentino, Chaplin, Fairbanks and Pickford, to name a few of the then, and still, beloved “artists,” were literally mobbed at any public appearance, here or abroad, and they probably helped give rise to comments such as: “Celebrity-worship and hero-worship should not be confused. Yet we confuse them every day, and by doing so we come dangerously close to depriving ourselves of all real models.” (Pulitzer prize-winner Daniel Boorstin, 1974). If nothing else, this may warn us that celebrity-worship should be approached with caution -- more of them might actually be elected to public office.
Before WWII many “celebrity” publications consisted of cheaply made, frequently rather poorly printed exposés of a small number of popular film stars and their movies, together with “gossip” purporting to reveal vaguely shady and presumably sensational life styles. These ran about neck-and-neck in readership with True Story magazine and True Romances. (The word “true” was taken seriously only by hard-core users.) Shades of William Randolph Hearst’s love life! Now grown to mega-media proportions, this force may lie behind the hordes of their progeny, which have burgeoned into the predominant and overflowing content of every form of public presentation – internet included – with detailed accounts of dating practices, marriages, divorces, pregnancies, narcissistic love or hate spats, and drugged, drunken or neurotic escapades, displayed together with glittering photos of popular entertainers in expensive clothing with partially bared body parts. These “come-ons” are not confined to checkout counters; they often consist of frantically beckoning, attractively gotten-up images from drama, stage and movies, the comedic arts, and popular music and musicians, now inhabiting most TV offerings up to and including the “these messages” slots. (Advertising has been described as the science of arresting human intelligence long enough to get money from it; what better setting?) There is also a peculiar scavenging of material for “news” programs which, because the big networks are also heavily invested in the manufacture of most current entertainment, amounts to a kind of cannibalism followed by a process of regurgitation.
We thus see previews, along with other news, bits of the features themselves (together with current box-office sales), and then the résumés, over and over again. In time they will all be repeated (not to mention “Oscar” and “Grammy” award nights that are strung out for weeks), in case any one of us suffers unmet voyeuristic needs. We are talking “popular” here, as in People magazine; if it has sold before it will sell again – and vive the sexual/moralistic revolution – lending an unfounded touch of intellectual class to the whole mind-numbing process. Popular is in, private is out. Artist and artistry are in; fine art, now a mere whisper, seems to be mostly out. We are being neither elitist here (nor anti-sexuality, heaven forbid), nor are we alone in these views. A New Yorker writer, Ken Auletta, in an interview with U.S. News and World Report (Mar. 15, 2004, p. 20), appears to concur: “Why does it seem as if there aren’t slow news days anymore? Answer: One reason is the manufacturing of non-news into news. We’re preoccupied with ratings. Editors and owners are worried that there are so many news sources, and they want to get people’s attention. So we cover Martha Stewart like she’s World War III. …They (publishers) know this is a society where if your name is well known, you benefit, even if it is known for the wrong reasons, …It’s Joey Buttafuoco all over again. …It’s a freak show.”
As they say on Madison Avenue, “There is no such thing as bad publicity.” For a look at what may be served up as news: in Time of March 1, 2004, staff writer Poniewozik, under the heading WELCOME BACK, CAPOS, wrote about the HBO show “The Sopranos,” which was supposed to be ending that season. He reported that the show’s creator (David Chase) “says he doesn’t want the show to repeat itself. And the gracious thing would be to…admire his artistic integrity and thank him for the memories. But on behalf of those (viewers) who are greedy and not gracious, let me remind Mr. Chase that he is making a freaking TV show. TV repeats itself—that’s what it’s for. Bad shows do it badly, and great shows like The Sopranos do it so well you hardly notice. Every season, New Jersey mob boss Tony Soprano (James Gandolfini) outwits his rivals and deceives his family, friends and therapist, all while remaining oblivious to his failings. His marriage to Carmela (Edie Falco) unravels as he chases anything with legs and hair spray and she pursues sad, unconsummated flirtations.” The writer, apparently confident in his predictions, asks, “Anybody got a problem with that?” Not if you are into popular stuff like this show, apparently, but the problem may be that it is indeed so popular (and then too, Italian-American “profiling” could still be OK in New Jersey). We are left to muse over why the term “freak” should pop up in both reviews.
But having gotten some of this “off our chest” for the moment, and if not thereby alienating all potential readers, now is the time to confess where ideas for this writing were generated – the boob-tube News, of course. Two items of general interest caught my eye and ear in a moment of lowered chagrin. The first had to do with what seemed to constitute the blackballing from certain air waves of the allegedly infamous “smut peddler” Howard Stern; the action was apparently taken by media executives who presented it as being in the interests of the public weal. From what I could gather, this announcement resulted in fitful, and expected, arousal of passions surrounding the issue of freedom of speech, or at least freedom of smut, and echoed right up to and through the halls of Congress. The second, following immediately, had to do with taking pet dogs along with the rest of the family on fairly extended vacations to the great open spaces. The presenters, clearly proponents of this practice, assured us that our city dogs especially, though less accustomed during the remainder of the year to unleashed exercise, would return from such hearty running and romping over the fields in better health and spirits.
Now I don’t want to be a protagonist here. Not belonging to either in-group absolves me of such a role; I have not read or listened to Stern, and while entitled still to be a dog lover, I am not a dog owner (too lazy). But there are commonalities; for one thing my own area of interest tells me Stern and the dog and I are possessed of similar mammalian brains. Harking back to some elementary studies, specifically the venerable Papez-MacLean theory of emotions, it is understood that while intellectual functions are carried on in the newer, highly developed part of the brain, our emotional behavior is sometimes dominated by a “crude, primitive system”—older structures that were assumed to have undergone very little change “in the whole course of evolution from mouse to man” (MacLean, 1973). (Briefly, this theory involves the limbic system, with its connections from the reticular activating formation and from the brain stem, if one wants to get a little more specific. As a whole, the limbic system has been associated with four primary functions: memory, sense of smell, autonomic visceral functions, and emotional behavior. Bear with me on this.)
As is usually the case, the evolutionary implications here are gratuitous; no hard evidence is furnished demonstrating metamorphosis from one life form to the other. While research into microevolution is, especially in the study of under-water microorganisms, fairly clear in showing the changes in functional life forms as endowed by their creator, claims of macroevolution, i.e., monkey uncles, have never been shown to have occurred within the annals of scientific comparative research. In earlier days cultural anthropologists such as Franz Boas had a tougher row to hoe; they were up against the lavishly and imaginatively drawn figures of creatures from mice, “up” through apes, to variously endowed “racial types,” humanoid figures of a less and less bent-over biped stance, from which white men of our fathers’ and grandfathers’ day could easily find where their place and social destiny belonged (see Harper’s, March, 2004). Happily, this skullduggery (often based on spurious data from skull measurements) is being stored away in the mustier closets of natural history, though it still hangs about in the shadows of prejudice and superstition. Mammals do have generally similar brain structure, which might just as easily suggest that when a workable model was provided during the process of creation it was not discarded. Together with the history of The Fall, however, just how workable all this has been is still the subject of sometimes bitter debate.
It is precisely because of the human brain and its organization that we could become lost beyond recall as a species, according to Arthur Koestler, the Budapest-born, Viennese-cum-American novelist who has written very cogently on scientific topics (Janus: A Summing Up, 1978). Because of a modern neocortex superimposed over a more “primitive,” largely unchanged “old” brain, humankind is unable to refrain from going to war with fellow human beings – and finally from becoming self-destructive in the use of our own (left-hemispheric), sophisticated weaponry. The human brain with which we are endowed may conquer external nature, but may be conquered in turn by the ancient and destructive foe within. Koestler sees an “ontogenetic” principle showing humans to be victims of some subordinate part of this mental hierarchy, which in turn exerts a tyrannical rule over the whole. He posits a situation where “aberrations of the human mind,” due to some obsessional pursuit of a part-truth masquerading as a whole truth, lead to sub-level emotional overload: “In rage and panic the sympathico-adrenal apparatus takes over from the cortical centers which normally control behavior. When sex is aroused the gonads seem to take over from the brain.” (ibid, 1978).
Since Koestler is clear on the implications of his “ontogenetic principle,” as in “ontogeny recapitulates phylogeny,” he too tends to see things in Darwinian terms. It is certainly true that people can behave in self-destructive ways (as did Koestler himself, who had led a life much like his novels, such as Darkness at Noon, with stints in the Spanish underground and the French Foreign Legion). Koestler died by his own hand in a suicide pact with his wife in 1983. However, his kind of orientation, far from providing clarification, at most implies the possibility of a sort of emergent process within a given species, producing alterations that are not inconsistent genetically and are usually clearly linked to observable cultural or environmental changes. While exhibiting synthesis, they do not constitute some upward-spiraling, future-pointing form. “There is nothing in emergent evolution that purports to be strictly naturalistic, (or) that precludes an acknowledgment of God.” (C. Lloyd Morgan, Emergent Evolution, London: Williams and Norgate, 1923). As in Zoology 101, August Weismann’s long effort to demonstrate (or refute) the inheritance of acquired traits, contra the Chevalier de Lamarck, showed that no matter how many mouse tails he cut off, no short-tailed types occurred.
Yet from another standpoint relative to intra-cerebral clashes, it has been surmised that “the conflict between the need to belong to a group and the need to be seen as unique and individual is the chief struggle of adolescence.” Pertinent thereto are recent findings in fMRI (functional Magnetic Resonance Imaging) studies showing an unexpected increase in growth of cortical, prefrontal brain cells, the most lavish since infancy, during early adolescence (Giedd, J., Nature, March 9, 2000). The prefrontal cortex “acts as the CEO of the brain, controlling planning, working memory, organization, and modulating mood…. (it has been) dubbed the area of sober second thought.” Perhaps the prefrontal cortex can be seen as a culturally developing foil for the “primitive” brain. Researchers conclude from these findings that “If a teen is doing music, sports or academics, those are the connections that will be hard wired. If they are lying on the couch or playing video games or MTV, those are the cells and connections that are going to survive.” These are seen as the areas of main involvement for the rest of the young person’s life. (One may speculate that those kids who want to be particularly “cool,” shocking, rude or just attention-getting might find characters like Howard Stern a convenient vehicle. Unfortunately, if they become too deeply involved they may be the carriers of our next cultural trappings – bringing what they have studied so closely into the rest of their lives and the lives of others, to say nothing of a possible trail of gang violence from the “hood.”)
Considerations such as these refer us to other studies in brain function effects done by sociologists, and also to split-brain research by central nervous system experts in the study of brain behavior. In the literature of bilateral hemispheric affective functioning there are implications for complex human behavior, as in the work of Warren TenHouten (1985) at UCLA. He applied the theory of cerebral lateralization to the sociology of knowledge. His work takes off from the ground-breaking studies of Bogen and Bogen (1969, ff.), who conducted early surgeries on “split-brain” patients. TenHouten quotes Émile Durkheim on the “constitutional duality of human nature” (roughly Koestler’s theme, above): “The old formula homo duplex is therefore verified by the facts. Far from being simple, our inner life has something like a double center of gravity. On the one hand is our individuality – and, more particularly, our body in which it is based; on the other is everything in us that expresses something other than ourselves.”
TenHouten applied these ideas to a study of economic organization in modern society and its cognitive styles, but for our purposes he states: “The two modes of thought Durkheim saw as characterizing the human mind have parallels in the relationship between self and society. At one pole we find the society within the consciousness of the individual; at the other pole, the individual’s consciousness within the society. This distinction … parallels Bogen’s speculation … that each hemisphere represents its own other and the world in complementary mappings, such as that the left hemisphere maps the self as a subset of the world, and the right hemisphere maps the world as a subset of the self.” If this reasoning seems complex, one can imagine the struggle each teen-ager must go through in order to achieve the needed feeling of belonging to a group, and yet feel individual and unique. All this at the same time that the prefrontal cortex of each differentiated hemisphere is expanding rapidly, puberty is being established and barely settled in – and when they are often urged off to war.
To take a closer look at the process of acculturation in the context of brain behavior, elementary theory and practice from social-psychological studies are instructive. Personal experience in using an old classroom “trick,” in both the United States and England, demonstrates that certain behavioral responses are predictable. People pulled in off the street would probably work as well, but for convenience’s sake one may “round up the usual suspects” -- a university-level class of perhaps 40 people or so, divided into two groups which are then usually nearly equivalent in terms of age, sex, socio-economic grouping, intelligence and educational level. Next, send group A and group B to separate rooms with no communication between the two. Each person (or subject) is provided with the same list of 10 or 12 community figures such as lawyers, doctors, technical workers, nurses, etc.; each list also contains the term politicians as one of those community figures.
Group A is told that they are to rank these personages in terms of their value to their community. It is noted by the instructor, as an example, that students across the country at the same grade level as group A tended to rank politicians quite highly. Exactly the same instructions are given to group B, except one word only is changed: the word “low” is substituted for the word “highly”. With rare exception the resulting differences in ranking of politicians between the two groups are not only in the expected direction, the differences often reach statistical significance.
This is an elementary (and old) propaganda device called the Bandwagon effect, as in, “get on board, everybody is doing it.” A similar device was used in the Milgram study noted below: the Voice of Authority, as when General so-and-so does something, “it is OK for me to do it too.” Leaving aside the scariness of how easily opinion polls or other behavior may be manipulated, deeper implications appear when the subjects are asked why they ranked politicians as they had. Invariably those subjects who had ranked pols highly said their thoughts had turned to the more statesman-like personalities in the news, while subjects in the “low-value” group B said they had thought of the likes of Tammany Hall ward heelers. In other words, you do not have to get people to change their minds about anything in order to alter their behavioral responses; they will comply simply by selecting a compatible response from the total range of culturally defined meanings. They will not only rationalize their answer, but will feel quite righteous in the process.
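The classroom demonstration above can be sketched as a small simulation. This is only an illustration under stated assumptions: the numbers, the size of the cue's effect, and the `simulate_group` helper are all hypothetical inventions for this sketch, not data from the original exercise; the point shown is merely that a one-word difference in the instructions shifts the average rank of "politicians" in the expected direction.

```python
import random
import statistics

# Hypothetical simulation of the two-group ranking demonstration.
# All numbers here are illustrative assumptions, not observed data.
random.seed(0)

def simulate_group(cue_high: bool, n: int = 20) -> list[int]:
    """Return simulated ranks (1 = most valued, 12 = least valued) that n
    subjects assign to 'politicians' on a list of 12 community figures.
    The instructor's one-word cue ('highly' vs. 'low') is assumed to shift
    the center of each subject's response distribution."""
    center = 4 if cue_high else 9          # assumed effect of the cue
    ranks = []
    for _ in range(n):
        r = round(random.gauss(center, 2))  # noisy individual judgment
        ranks.append(min(max(r, 1), 12))    # clamp to the valid rank range
    return ranks

group_a = simulate_group(cue_high=True)    # told peers rank politicians "highly"
group_b = simulate_group(cue_high=False)   # told peers rank politicians "low"

mean_a = statistics.mean(group_a)
mean_b = statistics.mean(group_b)
print(f"Group A mean rank: {mean_a:.1f}")  # lower number = higher esteem
print(f"Group B mean rank: {mean_b:.1f}")
```

In a real replication one would compare the two groups with a two-sample significance test rather than eyeballing the means, but even this toy version shows the direction of the effect the text describes.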
The point here is that we can in this way see how acculturation works; having been socialized to the extent of identifying with (or against) certain social groups, people often find that their ideas of right and wrong, or good and bad, come with the territory. Furthermore, they are usually convinced that most of it is their own idea. All the while our brains store these concepts and ideas in special ways in order to, among other things, keep us feeling sane. (That is what “rationalizing” means.) In this process the intellectual functioning of the upper reaches of our brain is constantly influenced by the “lower” realms, said to belong to the crocodile and the horse as well as to us humans. This brings us back to the dog, if not to Howard Stern.
In my neighborhood dogs are mostly well behaved, in dog-owner terms, and usually leashed anyway. Through puppydom and even beyond, some can be foolishly loving and friendly to a fault toward strangers as well as family and friends. It is difficult at times not to notice, though they maintain an air of confident, companionable abandon, their forthright and frequent public production of excrement. There are also the uninhibited sexual liaisons, or at least earnest attempts, nothing daunted by the discomfiture often expressed by human onlookers. Our beloved pets can also sometimes aggressively snap or bite. All this may be traced to a paucity of cortical (inhibitory) brain structure and a failure to automatically acquire our socially acceptable, “nicer” traits. If the noble animal assumes that we humans do these same things too, it is of course being more reasonable than perhaps we deserve; but in the absence of specific intensive training, dogs still fail to acquire our cultural inhibitions (or what seems to be left of them). Dutiful owners, however, follow along with plastic bags and scoops, even if their pets vaguely wonder about their masters’ sense of values; they protect their own floors and at least harvest gratitude from neighbors, while giving interceptive tugs on the leash. The owner thus becomes the cerebral cortex, conscience, or superego, if you will, of a creature already possessed of an excellent, standard-model, no-frills, “more primitive” mammalian brain.
Our own model has more bells and whistles; we feel and do the same as our canine soul mates and more, only with a weather eye out for what other people, or our consciences, will say about it. Wolves and dogs are social animals and, like humans, tend to congregate in groups. They are, however, relatively unburdened by our culture, popular or otherwise. Perhaps I belabor the obvious here, but it should be clear that our cultural features are acquired and learned; other creatures seem severely limited in the ability to acquire extended social characteristics, Clever Hans the Wonder Horse not excepted.
When we wish to consider the “higher” intellectual functions of the human brain, what comes to mind (no pun intended) is the capacity for cognition, including synthesis, fantasy, and symbolism, usually associated with “creative” expressions. Special areas of the brain have been carefully mapped out and isolated where these functions are at least mediated, and they play starring roles in our notions of culture (Arieti, 1976). Lest we forget, there are also areas of reasoning, judgment and memory for past learning (and past errors). After an emotional storm, including wrong behavior, we can usually say, at least inwardly, “I knew all the time it was not right.” But are these brain areas separate and exclusive from the phylogenetically more primitive brain of fight, flight and lust shared by our four-footed cohabitants? This question involves the aforementioned limbic system which, as it turns out, is pervasive throughout the brain -- so much so that some writers appeared to give up on the idea of any specificity for its functions. Researchers (Weil, et al., 1974) stated, “…before we become discouraged with the concept of the limbic system we should examine closely the question of whether or not affect itself pervades all aspects of behavior. We obviously believe that it does on both the sensory and motor sides of the coin.”
There is no reason, therefore, to think of some parts of the brain as becoming non-functional and disconnected, while other parts function all alone. In the living and intact organism all parts of the brain and its neural centers hum right along, no matter what; some parts may just be “sleepier” than others. As with the adolescents above, we can entertain two (or more) cerebral crusades at the same time.
It should be obvious, however, that even though it resides there, our cultural identity does not come from the brain itself: it comes to us from our human environment. There is ample evidence, for example, from studies of feral children (Candland, D. K., 1993), a subject perhaps not getting much press lately, demonstrating stunted growth and development when a surrounding social group is absent. Children adopted into cultures different from that of their parents amply reflect how little is carried away with them in terms of language, values and beliefs. (There are also those of us who are convinced that we were stolen at birth, or shortly thereafter, from rich and famous homes by Gypsies. The conviction becomes more pronounced during teen-age years when it is realized that we were, in fact, given into the hands of crude and unfeeling peasants who are notably mean and miserly in matters of the family car.)
Emotions, and the temperament that goes with them, do appear early on in life and thus are most likely brain-related in special ways. They may profoundly influence our cultural attitudes and beliefs; efforts at tapping these features in various populations often have taken the form of scales, or questionnaires, ideally administered by experienced social scientists, in an attempt to tease out the underlying nature or origin of our belief systems. Following the interest in the “trait” of authoritarianism with the widely used “F scale” of T. W. Adorno, et al. (measuring fascist tendencies during the 1940s), Milton Rokeach (1960, ff.) developed a scale to measure social preferences entitled the Dogmatism, or “Dog,” scale. While probably not revealing many hard-wired character traits, the scale did show our strong tendency to prefer to socialize, and live or join with people or groups seen as most like the way we perceive ourselves to be (among our extreme end groups in those days were Arabs and Islamic fundamentalists). This scale had the added usefulness of measuring political attitudes and racial or ethnic bias.
But let’s get back to Howard Stern, and hopefully, some cerebral relatedness. For one thing, I cannot imagine Stern doing his “schtick” without carrying in his head, at one and the same time, a profound appreciation for what he may regard as the prudes and strait-laced “hypocrites” out there in radioland. He, along with his aficionados, could not enjoy the genital and anal, in-your-face preoccupation nearly as much without picturing, somewhere in their own heads, a large segment of the audience exhibiting shock or disapproval. In that sense, the “Sterns” of this world (there should be a play on the words for back-sides and nether regions here) vitally need their cultural opposites (or perhaps we could say “fore-parts”) in order to exist at all.
This thinking takes us along a path suggesting the need we have for others (the existential “Other”) in order to develop an identity of our own; contrary to Koestler and Durkheim, this path most likely represents dialectical relationships rather than dualistic ones (concepts which have already been dealt with exhaustively by certain Nordic and German philosophers, antedating, but possibly anticipating, this present writing). For our purposes, however, Stern’s activation of the higher brain centers, right along with expressing the “primitive” body functions, demonstrates how you can’t do one without the other. A dialectical relationship presumes that each polar opposite needs the other to exist, and will eventually be joined together with the other in some future resolution. Again, like the teen-agers, apparently opposite “projects” in cerebral centers wend their way to some consensual conclusion, so be supportive of those kids as they try to find that way. It is maintained here that throughout the ebb and flow of popular cultural change over the ages, the brain, together with its various component parts, performs much the same duties no matter what social context is presented to it.
In a particular era, say the late Victorian of most of the western world, certain sexual behaviors (together with their symbolism) may be stringently held in suppressed, or pre-conscious, or even repressed, “unconscious” check, within cognitive mazes of the mind -- a word signifying brain + culture (which may be much more than that, but not less). Proceeding through Edwardian times, through the industrial revolution to the 1920’s, a reversal in attitudes begins to take place as many younger people suppress their learned cultural inhibitions in order to join in -- and more importantly fit in -- with their most “popular” peers (as they did with their lusty superiors in Edward’s court). And so it goes back and forth; during seasons of change, and between dissimilar or warring groups, the “upper” cerebral centers play a reciprocal hide-and-seek with the “lower” centers (sometimes in the form of outright denial), and with the current group-culture of one’s peers, be they gang, neighborhood, political party or nation.
So this is of course why it may be said that people these days are as repressed and hypocritical as ever they were. Those tendencies are obdurate and durable because to change them one must change one’s entire identity to a greater or lesser degree. That is a little like asking a leopard to change its spots, especially since the leopard may see no compelling reason to do so. In the case of human socio-cultural identity such a change often requires us to join the “enemy,” or at least the opposition. When it occurs we may often then realize that God can do amazing things; He can even induce us to change into His loving children!
ADDENDUM (and stray thoughts)
Of the social elements that can apparently redirect attitudes, beliefs and cultural orientation, one’s religion, or lack of it, is most prominent throughout world history. Especially in times of crisis, personal or world-wide, people go through changes in spiritual interests and concerns; either at the urging of authorities, or voluntarily, there is frequently a change in faith. Ancient Jews, for example, became much more Messianic after the fall of their temple in A.D. 70 (they were particularly hopeful for the advent of a Messiah with military skills). As “fate or fortune” would have it, today in the midst of increased interest in the Bible and an unprecedented growth in church attendance – and our concern here with culture and the field of entertainment – comes the passion film by Mel Gibson.
I have not seen it nor do I plan to do so, but from the many reviews both pro and con, Gibson has done something outstanding, not only for Hollywood, but in the annals of entertainment history; he deserves much credit for that. Language is a powerful cultural medium, and in order to understand its meaning it is useful to go to historical roots -- this approach might be helpful in parsing the film. For example, there is the Anglo-American penchant for showmanship; “ballyhoo” is an inspirational art, as P. T. Barnum proved. He knew the value of passing out samples and free passes to the right people. And there is the attraction of a well-known, dedicated actor who has specialized in violent action pictures. His previous work is said to have been lavish in blood-letting scenes traditionally surrounding good guys versus evil villains (but he cannot be held accountable for doing what Hollywood always insists upon as rites of passage). Gibson implies that he was directed by heavenly influences in order to carry out this “Biblical” production. Apparently not a newcomer to religion, he has a Catholic background and has evidently done some Bible reading – he remarked how very pleased he was to see how “Marian” his film turned out.
Evangelical Christians, who have not always been sympathetic to Catholic iconography, are reported to have attended in “droves”. In spite of side-line complaints about its free-wheeling interpretations and its narrow focus on a torturous, drawn-out death unrelieved by the rest of the Gospel, it is accepted as a really important, brand-new “entertainment” product with a message. In fact, it appears to be rather an astounding achievement. Cultural binding was not mentioned here before, but the powerful limitations it imposes keep most of us within the close confines of the ideological space we occupy throughout life -- new paradigms and totally new ideas are rare, and while clearly emerging from our popular culture, most new ideas bear its stamp. Besides, as the famous economist Galbraith once said, “Most new ideas are bad ones” -- Gibson’s is apparently not a bad one. He is dealing in a medium that produces something truly new only once every thirty years or so (“talking pictures” being one such), and the public reaction to his work, mostly word-of-mouth, is rarer still; then too, there is the precedent-setting box office!
It remains to be seen whether the Lord is turning all of it into something also essentially good for the long run, but a further word about brain effects: Gibson himself noted, as did others, that the most immediate reaction to seeing the Passion is a profound silence. That seems to be the effect of witnessing two hours devoted to a beloved figure of surpassing kindness -- and bearer of hope for our immortal souls -- writhing under extremely painful, deadly wounds inflicted by arrogant and intentionally cruel executioners. Perhaps the very notoriety may bring in pre-believers, who might thereby get a taste of how His followers, “who did not yet call themselves Christians”, may have reacted; it is not likely that many viewers will leave the theater as from regular cinema fare, or even a powerful Sunday sermon, gushing, “Oh, I just adored it!” There is surely a moment or two of cognitive dissonance, requiring time to erect neural defenses against images of the raw behavior of our more “primitive brain”, as the “higher” cortical centers begin to permit us to gulp down the terrible effrontery of our sins, seen there weighing heavily upon the cross. These are the same centers that become glazed over during the showing itself.
Over periods of more prolonged exposure to severe scenes or experiences of painful and repeated wounds, adaptation occurs, so that bigger and sharper pains are required to achieve the same degree of shock. That is why professional torturers are trained in confining “camps” with long periods of pain, insult and degradation to self and others, and little time out from the task. (It is also a strong reason not to raise young children with insult and injury as behavior-modifiers.) Executions, such as hangings, were carried out in public long before news became a mass medium, and sensationalism always drew crowds of onlookers. Even the more “high-minded” of us have trouble pulling away from a spectacle of disaster in the street. This reaction is probably linked with self-preservation; if there is danger out there we want to see it coming, if only for “the next time”. In terms of entertainment, circuses in the Roman coliseum catered to a similar mixture of thrill/repugnance and serious, study-oriented curiosity, with a covert concern by spectators about how they might themselves go through the same fates. Another category drawing interested onlookers is that of the flagellants who whip themselves till blood flows in the streets -- or the “pilgrims” who crawl on raw, bleeding hands and knees, intent on demonstrating contrition by experiencing some of the misery of Jesus.
Here it is maintained that human cultural trends always seem to demand an ebb and flow between higher and lower brain centers, spanning from one era to another. It is reported that Johann Sebastian Bach, in composing his St. Matthew Passion (circa 1727), was so aware of possible anti-Semitic overtones that he made sure his entire chorus -- Jew, Gentile, assorted rabble and Romans -- cried out for crucifixion. The prestigious Journal of the American Medical Association (JAMA 255, No. 11, 1986) published an article, “The Crucifixion of Jesus,” in which the details of his death, as it might have occurred from a medical perspective -- including prolonged flagellation -- were described in terms very similar to Gibson’s Passion. Did he read it before the filming? In his place I might not have proceeded -- I don’t have the stomach for it. It is said that the film brings home more profoundly what the Son of Man went through in absolving us of our sins. Personally I sometimes find it difficult to read through parts of the crucifixion accounts in the four Gospels, even though, thanks be to God, the Good News is included therein.