Conflict is part of life. How do you solve it?
April 17, 2023

Conflict during crisis and how to handle it Guest: David Faro

My guest in this episode is David Faro. He and I talked about crisis management and especially what happens in a crisis that may lead to conflict, and how to improve communication and learning from a crisis. He explained how time pressure and high stakes can cause the 'curse of knowledge', especially when experts communicate with the public during a crisis. He also talked about the hindsight bias and outcome bias that make crises even more difficult to handle, leading to conflicts when the crisis is over. He suggested what kind of policies can be developed to handle these behavioural tendencies.

Transcript

I.E: Hello, and welcome back to ‘We Can Find a Way’, a podcast about conflict resolution. My name is Idil Elveris. This is a podcast that pioneers a culture change in handling conflict, because conflict is everywhere. It is also the only bilingual podcast that addresses conflict on an international scale. ‘We Can Find a Way’ is sponsored by Koc Attorneys at Law, the Istanbul and Antalya-based boutique law firm. The founding partners of Koc Attorneys at Law are staunch believers in using dialogue and finding common ground to resolve conflicts. They're very happy to be supporting this podcast in the hope that it will help advance the much needed discussion on de-escalation and reduction of polarization in conflict situations, within the legal practice as well as in the public discourse.

‘We Can Find a Way’ is helped by my Marketing Manager, Julia Nelson, who has helped me improve it so much, so please check the website of ‘We Can Find a Way’. There are guests, their life stories, the transcripts of the episodes, and all sorts of information. We are also working on adopting the braille alphabet for the podcast, so we will try to be as inclusive as possible. In this episode, I would also like to announce a *giveaway*: an environmentally friendly soap for the first five persons that sign up on the website of ‘We Can Find a Way’.

Now, having said all of this, let me turn to my guest in this episode, who is David Faro. David received his PhD from the University of Chicago and works on behavioural economics and decision-making at London Business School. His research is about consumer and managerial decision making. Recently, he started to teach crisis management, so he and I talked about crisis situations and especially what happens in a crisis that may lead to conflict: how to improve communication and decision making in a crisis, and learning from a crisis. Let's now move to the interview, which took place on the twenty-ninth of March. Okay, David, please tell us what happens during a crisis that may lead to conflict.

D.F: Thank you, Idil, for having me on the podcast. So I think, you know, to your question, first it would be good maybe to have a quick definition of what a crisis is. There are many definitions, but a recurring theme in these definitions is that it's a high-stakes situation: there are potentially very negative outcomes looming that can threaten an organization or a society, or even individuals sometimes. It can be health-related. It can be financial. It can be reputation-related. So, it doesn't have to kill you necessarily. But there's a possible threat that can affect the long-term viability of the organization, for example.

Other features that are common to crises are a great sense of uncertainty or novelty. Like, we don't know what's going to happen, we haven't seen this kind of situation before, it's a novel problem, which might require different kinds of solutions than the ones we have had until now. Time pressure is another common feature: there is often a sense that something needs to be done very fast. A crisis can also sometimes take a very long time; the whole Covid crisis, you know, lasted years. But often in crisis moments, there is a sense that something needs to be done very fast. Of course, we want to do the right thing, because not doing the right thing has these, you know, severe consequences. So, what do we have: time pressure, high stakes, uncertainty, novelty, a sense of threat. Lastly, there are a lot of stakeholders that are affected by a crisis.

And maybe that's actually linked to your topic of conflict. Crises often pit the incentives of the several stakeholders against each other. And you might ask why that might happen. One reason is that, often in a crisis, our resources are stretched: our human resources, financial resources, health-related resources. So there is maybe more to fight about, or the different perspectives or needs of the stakeholders might be in conflict. And I think from that, we can then begin to think about how it might affect our decisions, the way we communicate with each other, and also what we learn from the crisis.

I.E: You have defined for us what a crisis is, but now can you go deeper into how these circumstances lead to conflict?

D.F: Sure. Let's think of one example. As I told you before, maybe before we started the podcast, what I focus on in my teaching and my research are human-related, human behaviour issues in crisis.

So, a very important one is something called the ‘curse of knowledge’. It's the idea that when somebody knows something well, that person may have a hard time anticipating that the other side may not have that kind of knowledge, may not share that knowledge to the same extent, or may not understand what he or she or they mean. This might cause communication problems. I'll give you an example shortly, but that's called the ‘curse of knowledge’.

I.E: But what is the curse here that the person is an expert? It can't be that?

D.F: Well, actually, it is interesting that you ask that. It is, in some sense, that the person is cursed by their expertise or by the extra knowledge that they have. But of course, knowledge is not a curse per se; the curse is the fact that we are unable to anticipate that others do not share this knowledge.

I.E: And you're unable to explain it to the full extent. It's almost like what's his name, this American Dr. Fa...

D.F: Fauci.

I.E: Exactly. So was he cursed by knowledge then?

D.F: It's interesting that you give that example. I don't want to refer to Fauci necessarily, but yes, experts, medical experts for example, are given the task of communicating very complex information, potentially to politicians and to the public. They do know a lot about the issue, the virus, the treatments, the vaccine, etcetera, and they also have a lot of background knowledge about the circumstances that affect the situation. However, they might not understand, or they might not feel, that the public needs all this information, or they may not have the time to communicate this information. And that's why the curse of knowledge is actually especially relevant in crisis moments: the expert or the politician or a leader may feel that they don't have the luxury, the time, or the patience to communicate in a way that doesn't suffer from the curse of knowledge.

I.E: I think lawyers have that often too. Because they have to explain a complex issue in very little time, or make it more understandable, they have to omit things, because of time pressure, etcetera. And then, of course, they're unable to communicate fully, I guess.

D.F: Yes. Exactly. And any use of jargon, legal jargon for example, is a typical feature. But I should also add that it is not that those experts necessarily feel or know the discrepancy and then decide, ‘You know what, I understand that there is this gap, but I don't have time to explain it to you, sorry.’ No, the curse of knowledge actually means that they might overestimate the extent of understanding. Hence, the curse. Okay?

A very beautiful illustration of this was a study by a researcher called Newton, which looked at the extent to which people who have knowledge, in this case musical knowledge, can communicate with the other side that doesn't share that knowledge. It was a beautiful study that basically asked people to communicate a song they have in mind to the other side. Okay, so suppose I have a song in mind, and I'm going to tap it on the wall now. And the other side, the other participants, had to guess what song the tapper is tapping.

But critically, the tappers first estimate what share of the songs will be correctly guessed.

I.E: Are you gonna tap now?

D.F: I can tap now.

I.E: Yes. Go ahead.

(David taps his chosen song.)

I.E: Oh my god. No. What song is this?

D.F: The song is ‘We Will Rock You’.

I.E: Oh my god.

D.F: So that's the issue. For me it's easy, because in my mind it's playing very vividly, you know, Queen, the lyrics, etcetera. But you just heard these, like, very abstract kinds of noises. So here, you're unable to guess the song, and it might also be difficult to tap the song properly. But the issue is not about the guessing or the tapping; the issue is about me, the tapper, thinking that the song will be guessed much more often than it actually is.

I.E: You really overestimated my understanding.

D.F: Exactly. In this case, I was cursed by knowledge, and similarly, experts are cursed by knowledge. They might overestimate the extent to which their abstract messages around the virus or around some financial problems might be understood by the audience.

I.E: So the conflict is between society, the public and the expert then.

D.F: Exactly. You can very easily see how it might lead to conflict, because if I think that you should understand and you don't understand, I might think that you are not paying attention, or are not very smart, not very careful, not very caring, and this might lead to conflict. This might also, by the way, lead to issues such as conspiracy theories or lack of trust: if the leader doesn't take enough time to explain what they have in mind in terms of the problem or the solutions, and they assume that the public should understand and the public doesn't, you can easily see how this might lead to reduced trust, rumours, and conspiracy theories about what's actually going on.

I.E: What do you recommend that experts do in order to prevent this problem from even occurring?

D.F: Empathy is the big thing. Perspective taking and empathy are key things that we try to encourage and build in leaders and…

I.E: Or in experts as well then?

D.F: Right. First of all, awareness is good: just the fact that we know what the ‘curse of knowledge’ is, and when and why it might be more likely to occur. So, as we said, especially in a crisis, there is time pressure and there are high stakes.

So I might think I don't have time for very patient communication because I have other things that I'm trying to address and put out. The mere fact that we are aware that this might arise, especially in those kinds of situations, gives us a better chance to address it. Because I might think, okay, I know that I'm now under time pressure, or stressed, or trying to find a solution, but at least I know that I probably did not communicate very clearly to you, and maybe I should do that at some other time when I'm less under time pressure.

I.E: Yes. But when will you find that time during a conflict? I think that's the question.

D.F: That's a fair question, and, you know, these are not easy answers. But clearly, some crises, COVID in particular, take many, many months, so hopefully our leaders can find the time to communicate. And it isn't only about the public: with the people we have to work with, we also have to make sure they understand our messages, in order to have greater, you know, performance with them.

I.E: For instance, that's why they wanted to communicate some things through the Prime Minister and some things through the Queen. Not expert knowledge, but some messages had to go through “laypeople”, not lay in a dismissive sense, but because people understand better when the Prime Minister says ‘you MUST stay at home’.

D.F: Yeah. Exactly. So I think changing the person that communicates, choosing the right person. Some people might suffer from this to a lesser extent because they just use clearer language, more visual language, lay terms that, you know, translate the science into things that we can understand.

Another thing that I think is related is something that we call ‘hindsight bias’. It's a little bit different and it often happens after a crisis kind of unfolds. We look back at the crisis and try to understand what happened, what went wrong, what could have been done better.

I.E: And we probably attack what was done or who has done it?

D.F: Exactly. We tend to harshly criticize the people that were involved in the crisis and took some decisions, and those might be very valid criticisms. But what hindsight bias does is cause unnecessarily harsh criticism and very negative communication as a result. So let's first define what hindsight bias is: our tendency to think that what happened in the past was predictable, was something that we, or they, could have or should have known.

I.E: Foreseen.

D.F: Foreseen exactly. And critically, we come to believe that only after we ourselves know what happened and therefore the word hindsight bias.

So now we are in the midst of a crisis in Ukraine and Russia. Okay? And you can think of different things that could happen. It could be that Ukraine wins the war, whatever that might look like. It could be that Russia wins the war, whatever that might look like. It could be that there's a stalemate and there is some kind of peace agreement, or there is some kind of stalemate and a quiet conflict, maybe not so quiet, that continues for a very long time. So these are four different outcomes that could emerge from the current crisis.

Suppose now I ask you: Idil, how likely is outcome number one, Ukraine winning; outcome number two, Russia winning; outcome number three; outcome number four, etcetera. I think given where we are, it's fair to say that we are pretty uncertain about each of these outcomes. Anything could happen. I don't know. We can ask experts. We don't have to come to a conclusion about this now. We might feel a great sense of uncertainty. You might put fifty-fifty, or thirty-seventy, or forty-sixty, whatever, on each of these outcomes. Two years from now, when one of these outcomes has occurred, it's going to look like we would have said it had a higher likelihood. We might even remember ourselves feeling much more confident than we feel now.

And there is beautiful research by Baruch Fischhoff from the 1970s that does exactly that, about, not Ukraine and Russia, but other kinds of military conflicts. He basically asked people about potential outcomes like those four, and showed that people who know what happened tend to give much higher probability estimates to the outcome that did happen, compared to people who did not know what happened. He illustrated hindsight bias, this thing that we intuitively share, in a very scientifically clear, clean way: our tendency to think that what we know we…

I.E: What we know today could have been known before, and we don't, actually.

D.F: Exactly. We don't.

I.E: How does this lead to conflict then afterwards? Because we know what happened now and we are now looking back and criticising, well, you haven't done this. You haven't done this.

D.F: Exactly. You know, if we think that this was obvious, we might come to criticize and harshly judge the people who faced difficult, uncertain times in real time. It can lead to organizational conflict, it can lead to replacing people unnecessarily, and it can lead to solutions that we don't undertake, because we don't understand and appreciate the uncertainty faced by those people in real time.

I.E: We are actually losing know-how by replacing people too quickly. We are losing the know-how that was generated during the crisis.

D.F: Exactly. You might lose very precious knowledge. Knowledge about how to deal with uncertainty and going through a crisis prepares people in various ways, and by harshly judging them, we might discourage them. Even if we don't replace them, we might demotivate them. And instead of empowering them to better deal with the next crisis, we might create fear of…

I.E: Fear of decision making.

D.F: Yeah. Exactly.

I.E: So what do you suggest that needs to be done that can help in those situations?

D.F: That's a great question. And again, hindsight bias is a very difficult one to address. One thing that would be very helpful is to try to recreate, using various methods including simulations, the decision moments: how much information we had, the time pressure, lack of sleep, fatigue, and in various other ways, the mental load, the cognitive load. So try to recreate, as much as we can, the situation as it was faced by the decision makers at the time. If you want to approximate and try to understand what people could have done at the time given the circumstances, you should approximate those situational factors, which I think should include fatigue. But the general idea is also to recreate the amount of information, maybe information overload, and the number of choices that they had, the things that they could do. Because in hindsight, once you know what happened, the solution sometimes seems very obvious, but at the time the solution is not at all that clear. There are many things that could be done, so it's also important to recreate the set of options. It could be medical options, political options, military options, financial options. Having those many options in front of you would make you realize that it's not nearly as simple and as clear as it seems in hindsight.

I.E: So I guess we are coming to the third issue now, what happens in a crisis that leads to conflict, which is another bias that you mentioned before we spoke.

D.F: That one is called outcome bias. The two previous ones that we mentioned are actually related: the one with the song that I tapped, or the one with the experts, and the one about hindsight. They are all about what you know, and not empathizing with or taking the perspective of other people who don't know: they don't know what happened, they don't have the information, they don't have the expertise. That's about knowledge. Outcome bias is about whether the outcome is good or bad.

I.E: If it's good, then there is no conflict.

D.F: It's a good point. Often, we don't ponder enough, we don't talk enough, about what could have gone wrong, or about some of the things that maybe we should have conflicts about, if things turn out to be fine. It could be that conflict is actually warranted, is necessary, because maybe something in the process was working really poorly, but because of luck, or because the competition did something really badly, or just due to situational factors, things didn't turn out to be bad at all, or turned out to be good. And so we might brush over some issues that we should talk about and argue about. In fact, what happens is that people very rarely spend much time discussing issues when things turn out to be fine. But when things turn out badly, with negative outcomes, we tend to over-speak, and in fact we tend to judge others harshly, especially because of the negative outcome. Sometimes we don't take into account the process that they might have followed, the information that they considered, the choices that they had considered; we just focus too much on whether the outcome was good or bad. And this too can lead to unfortunate circumstances, including conflicts, because again, like you said, we might ignore luck. Maybe the person was just unlucky, but in fact they used very good judgment, knowledge, and experience, and were simply unlucky to end up with a negative outcome. Again, this can create conflict and harsh judgment towards that person, not taking into account the process that they might have used to arrive at a decision. So, what is outcome bias? It is to judge the quality of a decision based on how things turned out. And like you said, that's also subject to luck, without putting enough weight on how they got there.

I.E: It just reminds me now of the Credit Suisse issue being “resolved” by takeover. If we think this is a success that has saved the bank, then we're not really discussing the issues that led the bank there. But you tell us.

D.F: Yes, I think you are right. I think the fact that it seems to have been currently “resolved” may inhibit extensive analysis of how they, or we, got there, and of what other kinds of precautions might be necessary to avoid those kinds of situations in the first place.

I.E: So how can we avoid this conflict then? What is the recipe here?

D.F: I like your word, recipe. Unfortunately, there are no obvious recipes with these kinds of inherent, very strong human tendencies. As we discussed outcome bias, as you saw, the focus was on overweighting the outcome and underweighting the process. How do we avoid that? First, we have to look at our incentives. Why is it that we tend to incentivize by results as opposed to process? I think it's very natural: results are easier to judge, they seem more objective, more concrete. But we can minimize that difference by having clear processes: a process for decision making during crisis moments, or for any decision. What is the process when we hire someone, for example? When we have a clear process for decision making, we can also document it. For example, when we have a meeting or a decision, there are usually meeting notes. So we can go back to those notes and see: how did we decide to hire this person? What were our criteria? What were our trade-offs? What were our constraints? Were we under some kind of pressure, etcetera?

We can also critique the process. We can say, well, this process wasn't good and it also didn't lead to a good outcome, but maybe there were some aspects of the process which we do like and want to keep. And maybe the outcome was to some extent a matter of luck: this person ended up being a little bit crazy, which we couldn't have known, or maybe we could have known, maybe we could have had some kind of assessment of mental health or whatever. So, we can focus on the process. And, of course, importantly, good processes should lead to good outcomes in the long run. If you always have bad luck, maybe it means that the process is bad. So to your question, what can we do: have a process, document the process, reward good processes. For example, if people put in a lot of effort or study a lot of information, reward them, or incentivize that, as opposed to just how things end up. These are not easy things to do, but there are organizations that do pay more attention to process as opposed to results.

I.E: It looks like there are crises that didn't exist before, from natural disasters to pandemics, etcetera. Are we learning from these incidents? Because the research that you mentioned existed before, but are these successive crisis situations enhancing our knowledge so that we can handle conflict situations and crises better?

D.F: Good question. Do we face more crises than before? I don't know.

I.E: Or more intense crisis maybe?

D.F: These are very good questions.

As for whether we learn or not: similar to what we discussed today, we talked about some of the barriers to learning, but also about some of the opportunities to learn better. I think we were kind of on the edge of some financial crisis in recent weeks, and we can ask whether we performed better or not compared to what happened in 2008. We can talk to experts on financial crises. I would think that there are some things that are working better. That means we clearly did learn some things, and we can think about what those might be, about the system changes that were implemented.

I.E: Maybe intervention coming rather than…

D.F: Intervention coming faster, central banks acting faster, anticipating panic kinds of situations. And I don't want to conclude on a very pessimistic or optimistic note, because these are very broad questions. But we can maybe take them to more local examples or circumstances and think: within our families, within our organizations, and within societies, can these barriers be better understood? And can we do something about it?

I.E: Thank you very much. Is there anything you would like to add?

D.F: No. Thank you Idil.

I.E: In this program my guest was David Faro. He explained how time pressure and high stakes can cause the ‘curse of knowledge’, especially when experts communicate with the public during a crisis. He also talked about ‘hindsight bias’ and ‘outcome bias’, which might make crises more difficult to handle, leading to conflicts when the crisis is over. He suggested what kind of policies can be developed to handle these behavioural tendencies.

So I hope you enjoyed this episode. If you did, please follow the podcast and its website, and like and share it. You can also write a review, or like the excerpts I share on my YouTube channel or on the Instagram account of ‘We Can Find a Way’. And do not forget that you will get environmentally friendly soaps if you're among the first five persons that sign up on the website of ‘We Can Find a Way’. I would like to close by thanking my sponsor, Koc Attorneys at Law, my Marketing Manager, Julia Nelson, and the musician Imre Hadi and artist Zeren Goktan, who allowed me to use their music and photograph in the podcast. Thank you and see you next month.


David Faro, UK

David Faro received his Ph.D. from the University of Chicago and works on behavioral economics and decision-making at London Business School. His research is about consumer and managerial decision-making. Recently he started to teach crisis management.