For at least part of my education, starting at the turn of the century, we were taught that these things happened, but that once they were over it was a solved problem, never to happen again.
For me there was a narrative that post-1950 the US was the pinnacle of humanity, the best place on earth. The Cold War, Vietnam, Korea: all things on the other side of the world from our walled garden. Civil rights was just a few people in the South having disagreements, and 9/11 was either swept under the rug or passed off as some dumb dirty Arab who was irrationally angry and lashed out.
It took me moving to the big bad city for college, where I was supposedly going to be shot every five minutes and robbed of everything including the clothes on my back, for that worldview to crack enough that I began questioning what I'd been told. When I did, I was instantly ostracized from my rural upper Midwest hometown and became barely tolerated by my family.
The blinders are very real, and it's too easy to ignore uncomfortable truths.
Aerospace industry engineer here:
We try to identify failure modes and use tools like Failure Mode and Effects Analysis (FMEA) and fishbone analysis to track down failures and how they cascade, so we can understand system behavior. However, the more complex the system, the harder it is to fully think through all the possible ways things can go wrong, and it's not unheard of for things to slip through review.
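For what it's worth, the prioritization step of an FMEA is often reduced to a Risk Priority Number (RPN): severity × occurrence × detection, each scored on a 1–10 scale. A minimal sketch of that bookkeeping, with made-up illustrative entries (nothing here is from any real Starliner analysis):

```python
# Minimal FMEA-style sketch: rank failure modes by Risk Priority Number.
# RPN = severity * occurrence * detection, each conventionally scored 1-10
# (a higher detection score means the failure is HARDER to detect).
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (rare) .. 10 (frequent)
    detection: int   # 1 (easily caught in test/review) .. 10 (nearly undetectable)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detection

# Hypothetical entries for illustration only.
modes = [
    FailureMode("thruster valve sticks closed", severity=9, occurrence=4, detection=7),
    FailureMode("telemetry dropout", severity=5, occurrence=6, detection=3),
    FailureMode("seal degradation", severity=7, occurrence=3, detection=8),
]

# Review and mitigate the highest-RPN items first.
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.name}: RPN={m.rpn}")
```

The weakness the comment hints at shows up right in the `detection` column: if you *believe* a failure mode is easy to catch, you score it low, the RPN drops, and the item falls down the review priority list whether or not that belief is correct.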
Starliner has consistently been plagued by program management issues where they think they've caught the failure modes and implemented appropriate mitigations: they do an analysis, run some tests to prove their assumptions correct, and fly it. In this case there was a design flaw in the thrusters that they had seen on a previous test flight; they thought they fixed it and flew again, not knowing the fix hadn't actually worked.
A false sense of security is a dangerous place to be when it comes to fault scenarios, but the alternative is extreme paranoia where you trust nothing. In fairness to Boeing, taking some level of risk is necessary in the space industry, but I think it's pretty obvious they were not paranoid enough and were too trusting that they had done their job right.