Making Change Sticky – The Intersection of FinOps & Behavioural Science Vol. 6: Psychological Safety
Over on LinkedIn, I’ve been working my way through a new blog series called ‘Making Change Sticky – The Intersection of FinOps & Behavioural Science’.
In my previous posts (Vol. 1: Choice Architecture, Vol. 2: Cognitive Bias, Vol. 3: Bounded Rationality, Vol. 4: Mental Accounting, Vol. 5: Social Preferences), I’ve been exploring how behavioural economics shapes decision-making. Those concepts help us understand why people act the way they do.
Now that we’re at the beginning of a new year, it feels like the perfect time to pivot: to discuss how to diagnose behaviour and complexity, and how to design solutions that will actually stick. But before we do that, we need to pause. Because none of the upcoming blog topics matter if people don’t feel safe enough to engage with change in the first place.
What is psychological safety?
Amy Edmondson defines psychological safety as ‘a shared belief that the team is safe for interpersonal risk-taking’. In practice, it means people feel able to:
- Speak up with ideas.
- Admit mistakes.
- Challenge assumptions.
- Ask for help.
All without fear of blame, ridicule, or career damage.
In change contexts, psychological safety is the difference between a culture that learns and adapts, and one that hides problems until it’s too late.
Why it matters in FinOps
I’ll keep saying it until I’m blue in the face, but FinOps is not just about tools and dashboards; it’s about culture. We all know engineers, finance, and product teams must collaborate across boundaries, but that requires them to trust each other.
Here are some examples:
Without psychological safety:
- Engineers may hide underutilised resources rather than admit overprovisioning.
- Finance may avoid challenging forecasts to prevent conflict.
- Leaders may receive silence instead of candid feedback about what isn’t working.
With psychological safety:
- Teams can raise anomalies early, without fear of being labelled wasteful.
- Mistakes become opportunities for learning, not punishment.
- Cost becomes a shared problem to solve, not a source of blame.
The link with behavioural economics
All the biases we’ve already covered become that much harder to counter in environments that don’t feel safe:
- Loss aversion: fear of being blamed for savings that go wrong.
- Status quo bias: sticking to familiar practices to avoid risk.
- Cognitive dissonance: dismissing evidence that contradicts the team’s self-image.
Psychological safety won’t eliminate these biases, but it will create the space to surface and work through them in a constructive way.
Building psychological safety in FinOps
- Frame cost optimisation as learning, not judgment – Position FinOps as a way to improve how we work, not as a way to catch mistakes.
- Lead with vulnerability – Leaders admitting their own blind spots creates permission for others to do the same.
- Create blameless rituals – Post-incident reviews and cost retrospectives should focus on systems and processes, not individuals.
- Foster cross-functional relatedness – Bringing finance, tech, and product into shared forums builds trust and reduces ‘us vs. them.’
- Reward candour – Recognise those who surface issues early, even if they’re uncomfortable truths.
Why should you care?
Without psychological safety, every framework I’ll discuss in the following posts will be undermined. People won’t admit capability gaps, won’t test new behaviours, and won’t share what they’ve learned.
With psychological safety, those same frameworks become powerful tools for growth.