One of the characteristics of punk rock was rebellion against authority - whether directed at the then-mainstream music style or mocking the British monarchy. Surely this has also been the appeal of a few Monty Python films we could name. Sometimes rebellion shocks our sensibilities; sometimes we're completely unaware of the need to have our sensibilities shocked.
It must be both right and healthy to question authority, or perceived authority, and also peer pressure. Many of us are taught as young kids to question peer pressure. "Would you put your head in the oven just because Pete told you to?" However, when most parents deliver this line or one of its variants, I'm pretty confident they're not intending to teach their kids to question parental authority (otherwise it would be a self-refuting lesson).
Today there is a substantial amount of challenging of old authorities, and often this has got to be a good thing - provided that we're not unthinkingly substituting one unchallenged authority for another. How much is our morality really ours, rather than the product of external influences? How much of what we think is 'thinking for ourselves' is actually not that at all? Whatever the source of our moral code, even if we do manage to think for ourselves, how often do we abdicate our responsibility when it comes to action?
I am not schooled in psychology but I found Milgram's Obedience Experiment intriguing. Having been actively involved in Research Ethics, I think it should be required study material for all those conducting scientific experiments on live subjects, and all those in the agencies which regulate such studies. This might seem ironic as it has been suggested that the original experiment was unethical, and would not be approved in more recent times.
Watch coverage of the 1963 experiment (provided of course that you don't find that too authoritarian a suggestion!):
As with most scientific experiments, some potential methodological flaws have been proposed - such as, in this case, uncertainty over whether some participants continued because they suspected that the experiment was a hoax. Even if this were the case for some who continued, I think it is significant that they continued without knowing it for certain. For some or many subjects, perhaps being designated 'teacher' was a positive reinforcement. Maybe those of us who have received knowledge in a variety of settings can be lured to behave in a particular way by this temptation too - a false hierarchy built on 'owning' information which we are not in a position to verify for ourselves.
Once in that position, it seems a widespread human trait to adopt the mantle of authority - whether as a preacher promoting the 20th Century heresy of creationism, or an advocate of scientism (rather than science), or judgmentalism regarding religious dogma, or using military force to advance political models which we think work in our setting but may not elsewhere. We can feel that we're in a privileged group, with a duty to convince others of the authority we (think we) possess. We would do better, I believe, to adopt the mantle of humility - of one beggar pointing another beggar to where he/she found bread; and having found the bread, to choose to serve rather than self-elevate.
I try to avoid using the same music in different posts, but some of the pride theme songs are relevant, and one from my last post is equally pertinent in this:
"Well, the Book of Leviticus and Deuteronomy
The law of the jungle and the sea are your only teachers..."
"...Well, the rifleman's stalking the sick and the lame
Preacherman seeks the same, who'll get there first is uncertain
Nightsticks and water cannons, tear gas..."
In Northern Ireland, we are historically familiar with riots, tear gas and water cannons. With the recent failure of the Haass talks to deliver agreement on controversial political issues, where are the authorities? Are our political "leaders" looking over their shoulders at what a (small) proportion of their constituency thinks, and having their behaviour constrained by it? You may say that this is the nature of politics - to be aware of and represent such a constituency - but as has been said, "you can't please all the people all the time" (otherwise there would be no viable political opposition party), so perhaps more political courage and moral fibre is required to lead and 'do the right thing.' Of course Milgram's experiment doesn't allow us to blame our political leaders - we mustn't offload responsibility - we put them there, even if we say we're "a bit uncomfortable" with some of the stuff they say/do.
We also have ongoing individual and collective responsibility for the other authorities to which we hand over responsibility. We need some kind of independent Research Ethics process because of the recognition that scientists, eager to advance knowledge, will sometimes not be best placed to assess whether the price research participants are expected to pay is justified by the answers the research is hoped to deliver.
One interesting finding of Milgram's related work came when he varied the physical proximity of the teacher and learner: the more remote the person being 'punished,' the less reluctant the teachers were to deliver the electric shocks. Could this not have something to say in the context of drone weapons?
"They said what’s up is down, they said what isn’t is
They put ideas in his head he thought were his
He was a clean-cut kid
But they made a killer out of him
That’s what they did...
Well, everybody’s asking why he couldn’t adjust
All he ever wanted was somebody to trust
They took his head and turned it inside out
He never did know what it was all about."
Well of course we're so much more enlightened now than in the 1960s, aren't we? We're much better informed about the Holocaust, so perhaps more of us would have bailed out if we were the teachers in Milgram's experiment - and of course it can't be repeated because of our presumptions around research ethics. Except that it was. Several times, albeit with some modifications. The ethical concerns tend to centre on the "inflicted insight" of the experiment - participants (teachers) were given insight into their own potential to cause extreme suffering, and this can be highly distressing to them afterwards. This is compounded by the fact that their "consent" to participate was compromised by deception about the purpose of the study. However, today Ethics Committees may still deem it appropriate in some circumstances for potential participants to be unaware of the detail and/or nature of a study when they consent, even regarding its main purposes - it is a matter of weighing the perceived benefits against the perceived risk/harm to participants. Some research questions are bigger than others...
So, what of Milgram revisited? In fairness, #2 wasn't really a repeat of the original experiment: it replaced the human 'learner' with a computer avatar. This wasn't to deal with concerns about applying shocks - of course none was actually delivered in the original experiment. It looked at whether the 'teachers' responded in a similar fashion to those in the original, despite knowing that neither the shocks nor the recipient were real. This 2006 version found that teachers who saw and heard the avatar behaved and responded physiologically as if it were real, i.e. like those in the original experiment. The authors claimed that this provides an alternative, ethically more acceptable method for further studies in this territory of obedience. Also in 2006, TV illusionist Derren Brown did a partial repeat/re-enactment. Using subject selection, he found that a gradual change in participants' perceptions was more effective at getting them to do things they would previously have resisted than 'jarring' their initial values. In 2009, using a modified protocol which satisfied ethical review requirements, Jerry Burger (real name!) published a paper reporting the same obedience rates as in the original experiment, despite adding a neat twist (see 'Contrary to expectation...' in the abstract).
[c/o American Psychologist, Jan 2009]
In 2009 BBC popular science programme Horizon also repeated the experiment, with consistent findings. She seems such a nice girl...
A variety of other TV shows have made the same observations too (although some are likely to have been done in less robust experimental settings). One example is the Discovery Channel's 'apply the shock.' In the Horizon documentary, psychologist Clifford Stott noted that idealism regarding scientific enquiry had a significant effect on the teachers: "The influence is ideological. It's about what they believe science to be, that science is a positive product, it produces beneficial findings and knowledge to society that are helpful for society. So there's that sense of science is providing some kind of system for good." Science has provided and continues to provide huge potential for good, but we have heard that language before in other contexts as a 'justification' for immoral action, on a huge scale. Science, when functioning properly, provides answers about mechanisms and equips us to improve things technologically, but its researchers and participants (just like those who fail to give it due credit) are not immune to the problems of "power, greed and corruptible seed."
Tongue-in-cheek kitsch, like most things Ramones. Still, let's keep questioning who we're investing authority in - politically, morally, in terms of the sources of our information, what we might be blind to, and what road it's taking us down. We really should work that out "in fear and trembling."