I recently read a piece that painted a doomsday scenario for the near future. Afterward, I found I couldn't trust my cognitive evaluation of it ("believable, accurate, likely to happen") because my existing root-level emotional biases about "the way the world works" were influencing that evaluation, making the scenario seem more likely than it should.
If you've seen ROBOCOP 2, I can explain this more simply. In ROBOCOP 2, Murphy awoke to find that he had 200+ new directives, each now as important as his original three. From moment to moment, his brain had to treat all of those instructions as cognitively 'true'.
Similarly, my brain seems to treat some of these emotional biases as automatically, root-level 'true'. Murphy threw himself against a power grid to wipe the extra directives; I'm not looking to get electroshocked anytime soon.
I'm interested in mental habits, routines, or steps - whether drawn from your own anecdotes or from references to classic or modern approaches - that help you discern whether your root-level biases are causing you to misperceive or incorrectly evaluate a situation.
If the bias is rooted in fear, it can sometimes help to ask yourself a series of "what if X" questions, where X is anything related to the idea. While doing this, pay close attention to any subtle feelings of fear, anxiety, excitement, or any other emotion. If they are present, that's a hint that your emotional biases may be in play.
For example, I believe climate change is real. So I ask myself, "What if climate change isn't real?" I get a little flare of fear, so I know that I am at least partially motivated by an emotional bias there.
Once you detect an emotional bias, bring your Rationality skills into play and apply the scientific method: test your beliefs. Ask yourself, "What's the easiest way to disprove my own belief?" and then run that test or do that research. If you give it an honest effort and your belief stands up to repeated tests, you can be more confident in it.
It can be tricky to be mindful enough to detect those subtle traces of fear, but with practice it gets easier.
If you want to get even better at working with your biases, check out the broader Rationality topic [0], and perhaps read some rationality fiction [1]; it's amazing stuff.
[0] https://www.lesswrong.com/
[1] https://www.hpmor.com/