The four horsemen in R&D (and how to defeat them)



Recently I've been reading the book The Art of Thinking Clearly. It examines the most common faults in reasoning, and I've been reflecting on how they apply to our R&D work. From that list, I picked the 4 pitfalls I have observed most often over the years, and I gave them a catchy name: the 4 horsemen in R&D (CAOS). Yes, they do cause chaos!

What are they?

Confirmation bias: We cherry-pick evidence to support our existing beliefs and ignore the rest.

Action bias: We feel compelled to do something, particularly in new or shaky circumstances, even when acting too quickly or too often makes things worse.

Omission bias: We tend to prefer inaction even when inaction leads to a guaranteed bad outcome, because the outcome of taking action is uncertain.

Sunk cost fallacy: We consider the costs incurred to date instead of the future costs in our decision-making.


How to spot them?

Confirmation bias

is the mother of all sins, and the most dire of the four. I've commonly seen it in two scenarios:

  • People cherry-pick evidence to prove the impact of a project when there is no impact, or when it is too early to talk about impact. A very common example is platform-like projects, where the vision is to build something general that supports many use cases. To prove the vision has been achieved, people pick out examples (usually very few) of users who use the platform, while ignoring the fact that the majority of the target audience does not.
  • People cherry-pick experiment results to prove that something works, instead of testing it against cases where it fails. We often find this in research papers where most of the experiments demonstrate how the proposed method works, without critically examining where it breaks down. Such work usually ends up being neither reproducible nor impactful in the long term.


Action bias

is usually disguised as "Bias for Action" (the Amazon Leadership Principle), but there is a subtle difference. A bias for action is meant to avoid analysis paralysis and risk aversion, while Action Bias is "do something, keep busy, so we don't look bad". The first principle for determining whether we are falling into Action Bias is to examine whether the action contributes to the goal. From that, a few razors can help identify this bias:

  • When the goal is not clear, and the team is taking action on things that don't help clarify the goal. Example: the research goal is not even defined, yet the team is busy building a software ecosystem to support it.
  • Jumping to a later component when its prerequisites are not done. Example: in data-driven ML, tinkering with the model and the algorithm before having high-quality annotated data at reasonable scale, which leads to shit in, shit out.
  • Keeping on producing non-bottleneck components while inventory piles up. Example: research needs data, but if data collection scales much faster than the research can consume it, the ops team is busy, data management gets harder, and costs go up, yet research is not accelerated.


Omission bias

is when we know a project is not going well, but we don't know how to turn it around, and we don't know whether taking action will make things better or worse. So we choose to do nothing and let the project die slowly. To identify this bias, a simple razor is enough:

If we take no action, is it going to get better?

If the answer is no, then we know we have to take action. This might seem like the opposite of Action Bias; the subtle difference is that "stop doing this immediately" is itself a valid, and very often the best, action to take.


Sunk cost fallacy

is usually more obvious in hindsight. When a project is killed, people ask, "why wasn't it killed half a year, a year, or several years ago?". But hindsight is not very useful, since the costs have already been incurred. Instead, there are common "triggers" that indicate we are falling into this fallacy:

  • "We've put so much effort into building it so far; it will all be for nothing if we stop now."
  • "The future cost is tiny in comparison with what we have already invested."
  • "People who worked on it so hard will feel very disappointed if we don't continue."

Whenever past cost weighs heavily in a decision about the future, we should raise the alarm.


How to defeat them?

Let's use an example to see how to defeat the four horsemen, in the sequence C-S-O-A.

A project seems not to be going very well; our spidey sense is telling us so.

  • The first thing we should do is seek evidence to disprove, rather than prove, the hypothesis "this project is doing well". This overcomes the Confirmation Bias, so we can really understand the state of the project.
  • Now that we have evidence the project is not going well, we need to look forward at what it would cost to make the project right. This cost should be independent of how much we have already invested; it's all about the future. If the cost doesn't make sense, or we simply can't afford it, then we need to kill the project. This overcomes the Sunk Cost Fallacy.
  • Now we know the project has to end, but we don't know what's next for the team, or whether killing this project and starting another will lead to success. Even amid that future ambiguity, one thing is certain: the current project needs to end, so we cannot omit the action to stop. This overcomes the Omission Bias.
  • Last but not least, after we stop the project, a whole group of people may be sitting idle with nothing to work on. Resist the temptation to put people to work without clarity just for the sake of keeping them busy. Instead, reflect on what went wrong in the previous project, work out the clarity for the next one, and only then take on the new action. This overcomes the Action Bias.
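For the programmers among us, the C-S-O-A sequence above can be sketched as a tiny decision function. This is purely illustrative; the function name and its inputs are my own invention, not from any real framework.

```python
def csoa_triage(disconfirming_evidence: bool,
                future_cost_affordable: bool,
                improves_without_action: bool) -> str:
    """Hypothetical sketch of the C-S-O-A triage for a shaky project."""
    # C - Confirmation Bias: look for evidence AGAINST "the project is fine".
    if not disconfirming_evidence:
        return "continue: no evidence the project is off track"
    # S - Sunk Cost Fallacy: judge only the FUTURE cost of making it right.
    if future_cost_affordable:
        return "fix: the future cost to make it right is acceptable"
    # O - Omission Bias: if it won't improve on its own, stopping IS an action.
    if not improves_without_action:
        return "stop: kill the project now, despite uncertainty about what's next"
    return "wait: the project may recover without intervention"

# A - Action Bias lives outside the function: after a "stop", resist busywork
# and first work out the clarity for the next project.
```

The point of the ordering is that each check only makes sense once the previous one is honest: you cannot weigh future cost (S) until you have sought disconfirming evidence (C), and you cannot justify stopping (O) until the future cost has been judged on its own.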
