Social proof and getting things wrong
Thoughts on belief and the importance of accepting error
A friend asked me, “Hey Katy, why do parties keep fighting doomed legal cases, even though they have lost several such cases before and, if they were rational, really should settle?”
Having worked in the court system for about three and a half years, and as a litigator for about two years, I had thoughts on this. People do not always behave in rational ways, particularly when the issues in a case go to the core of who they are. People who fight doomed legal cases remind me of some psychologists who infiltrated a cult. Bear with me, there is method in my madness.
In When Prophecy Fails, social psychologists Leon Festinger, Henry W. Riecken and Stanley Schachter open with the following observation:
A man with a conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.
We have all experienced the futility of trying to change a strong conviction, especially if the convinced person has some investment in his belief. We are familiar with the variety of ingenious defenses with which people protect their convictions, managing to keep them unscathed through the most devastating attacks.
But man’s resourcefulness goes beyond simply protecting belief. Suppose an individual believes something with his whole heart; suppose further that he has a commitment to this belief, that he has taken irrevocable actions because of it; finally, suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before. Indeed, he may even show a new fervor about convincing and converting other people to his view.1
Festinger et al infiltrated a cult which had a prophecy about the end of the world, said to occur on a specific date, namely 21 December 1954. The leaders of the cult, Mrs Keech and Dr and Mrs Armstrong (all pseudonyms) believed that Mrs Keech was in communication with divine astral beings, who told her that the Earth was going to be catastrophically flooded, but that her followers would be rescued by aliens in UFOs.
Mr Keech, Mrs Keech’s husband, did not believe the prophecy in the least, but he was an extraordinarily patient and gentle man:
He simply went about his ordinary duties in the distributing company where he was a traffic manager, and did not allow the unusual events in his home to disturb in the slightest his daily routine.2
Before the prophesied date of disaster arrived, the believers were secretive. However, once the prophecy was falsified, the behaviour of the believers changed markedly. While they were despondent at first, they then increased in fervour, and rather than being secretive, they tried to convert others to follow their beliefs.
This tended to support the hypothesis of Festinger et al, who had looked at historical instances in which doomsday prophecies had been disproven and formulated the following five conditions under which believers would increase their fervour, rather than giving up their beliefs:
1. There must be conviction.
2. There must be commitment to this conviction.
3. The conviction must be amenable to unequivocal disconfirmation.
4. Such unequivocal disconfirmation must occur.
5. Social support must be available subsequent to the disconfirmation.3
So, the belief must be very deep, and it must require certain actions or behaviours to be undertaken. The more drastic the commitment to the belief, the more wedded the believer becomes to it. Therefore, for example, if a believer has given up her job and cut herself off from family, the believer will be more wedded to the belief. If the disconfirmation occurs, this causes cognitive dissonance: “I have changed my behaviour, given up my job and cut off my family for nothing.” This is painful for people to face. It’s also related to the sunk cost fallacy: if I have already sunk costs into this activity, I must continue it.
However, if a believer is surrounded by others who continue the belief, it is less painful to presume that the beliefs still have validity; it is just that there was a mistake, or that they misunderstood something about the prophecy. Believers then reassure each other and go out to try to proselytise after the disconfirming event, because of the importance of social proof.
Humans are social beings. If other people also believe that something is the case, then we become increasingly convinced that it is true. Therefore, when our beliefs are rocked to the core, we may run around, desperately seeking social proof from others, in order to bolster our conviction that we have not made a mistake and taken drastic action for no good reason. Our desire for social proof is particularly ardent when our beliefs have been rocked or where we are uncertain.
Reading this book, I was reminded of a legal trial I once saw. The judge told the plaintiff in the first half day that he should consider speaking with the defendants. This is “judge-speak” for, “Plaintiff, you have absolutely no case at all, and you should go settle this matter before you rack up your legal costs.” The plaintiff did not take the judge’s advice. Instead, he doubled down, and argued his case more vociferously. At several points during the evidence, the defendants’ barrister presented him with information (including contemporaneous documents written by him) which explicitly disconfirmed his account of what had happened. In each instance, he had an explanation as to why what he had written did not really mean what it seemed to mean.
The weird thing, from my point of view, was that the plaintiff did not seem to be lying. Lying requires a consciousness that one is not telling the truth. On the contrary, it seemed to me as an observer that he believed absolutely in his own account of what had occurred. It was just that none of the documents or the witnesses confirmed it, and hence his evidence was unconvincing. If he was lying, he was lying to himself, to save himself from cognitive dissonance: “I could not have done such a foolish thing.”
Of course, he lost the trial. I saw later that he appealed, and lost the appeal. Leave to appeal to the High Court of Australia was refused. I still wonder if he accepted those decisions.
I’m not a proselytiser by nature. I write for myself, and to make people think, but I’m not seeking to make you agree with me. I do not like being preached at, and I tend to recoil from it, so I try to avoid doing it to others. I also hate having my emotional buttons pressed in an effort to persuade me into something I feel uneasy about.
Whether others believe what I say or not is up to them. As a result of reading When Prophecy Fails, I am also now careful when I feel I must go out and convince other people of the truth of something. Why do I feel I must convince others? Is it because I have sound evidence, or is it because (counterintuitively) I lack certainty and am seeking social proof from others, to avoid cognitive dissonance? Many of the most dogmatic statements on social media seem to me to be in search of social proof where the situation is uncertain.
In any case, in short compass, that is my explanation for why people fight hopeless legal cases. It may also explain why sovereign citizens and some litigants-in-person (who tend not to accept the jurisdiction of the court) flock together and surround themselves with others who will confirm their beliefs. This provides the necessary social support, so that even when a judge does not behave in the way they predicted or apply the law as they anticipated, they can still gain social proof from those around them.
Something I have also been thinking a lot about lately is that it is painful to admit you are wrong. This is particularly the case, perhaps, if you are generally clever, and you are used to being right. Some people will continue to swear that red is blue, or that up is down, to save themselves the pain of cognitive dissonance. On the other hand, I am prepared to admit that I am fallible. I think humility is important. I don’t like being wrong! But I am human. I will sometimes be wrong.
I had a “moment” the other week. My PhD thesis and my first monograph were on accounting for profit for breach of contract, and how this could be justified. In a recent United Kingdom Supreme Court case, Rukhadze v Recovery Partners Ltd,4 Lord Leggatt said that the case I relied upon, Attorney-General v Blake,5 was “anomalous”6 and Lord Burrows said it was “exceptional.”7 This was just another step in the sidelining of Attorney-General v Blake, which had already been thrown into doubt by One Step (Support) Ltd v Morris-Garner.8
Despite this, I continue to believe that there are ways in which Attorney-General v Blake could be appropriately rationalised. The criteria set out in the case were too vague, so courts did not know how to apply it, and that caused a mess. I think I came up with workable criteria.
However, in my writings describing the law of England and Wales, am I going to say that Attorney-General v Blake is emphatically correct? No, I am not. In my view, it has now been confined to its facts (involving the betrayal by the notorious traitor and double agent George Blake, who defected to the Soviets and then wrote the most depressing memoirs when the Iron Curtain fell). I don’t think those facts will often arise.9
I don’t want to become one of those academics who continues to push her theory when it’s been rejected. For me, it is less important to convince others of my correctness than it is to reflect what the law currently is and to be a trustworthy source. I’ve accepted now that if the courts don’t read my theory, or if they reject it, them’s the breaks. The nature of being a private law scholar is that one’s theories are reality-tested by the courts. I must have the ability to admit that my theories will not always be right, even if there is a certain painful cognitive dissonance involved.
So, I think we have to be ready in this very polarised world to accept cognitive dissonance, and to accept the prospect that sometimes our deeply held beliefs will be challenged, or even proven wrong. There is a difference between evidence, reasons for our decisions and the way we rationalise those decisions. Hopefully I have made you think about why you believe what you believe, and why you seek to persuade others.
1. Leon Festinger, Henry W. Riecken and Stanley Schachter, When Prophecy Fails: A Social and Psychological Study of a Modern Group that Predicted the Destruction of the World (Minnesota, University of Minnesota Press, 1956) pg 1.
2. Ibid, pg 49.
3. Ibid, pgs 4 and 216.
4. [2025] UKSC 10 (Rukhadze).
5. [2001] 1 AC 268.
6. Rukhadze [2025] UKSC 10 [148].
7. Rukhadze [2025] UKSC 10 [278].
8. [2018] UKSC 20; [2019] AC 649 [72]–[82] (Lord Reed).
9. Although, actually, I know of at least three cases involving government agents and profits.