Psychological safety can be beneficial (to a point) — here’s where it gets dangerous
We are designed to relentlessly find ways to improve our comfort and safety. That desire has been a motivating force for innovation from the time we began using tools. We are meant to pursue these things, but never to achieve them. We are not designed for total and continuous comfort.
In The Fearless Organization, Amy Edmondson described teams with the shared belief that it is okay to take risks, admit mistakes and ask questions without fear of reprisals. No organization would challenge these ideas today — they have become self-evident. In Fail Fast, Fail Often, Babineaux and Krumboltz described organizations full of action-oriented people who were free to experiment and who saw failure as a valuable opportunity for learning. No organization would challenge this either, although some may add caveats.
In practice, however, our idealized corporate culture expresses these values quite differently. Over time, “psychological safety” and “fail fast” have become managerial bromides, and employees hesitate to take risks or speak up. We have redefined psychological safety to mean freedom from stress, responsibility and risk; we are safe when we are comfortable. Having seen our peers face the consequences of their fast failures, we have transferred that risk to product owners and stakeholders.
Divorcing tech workers from decision-making
There is nothing more psychologically soothing than simply executing somebody else’s instructions. Receiving a list of activities for a two-week sprint cycle, orchestrated externally, without the danger of change or personal responsibility, is the ultimate safety net. If an activity is not completed when expected, we can say that it was underestimated. If an approach does not work, we can point to the person who instructed us. A person who only does what they are told can extricate themselves entirely when anything goes wrong.
For most organizations, this toxic view of psychological safety is the implied goal. Processes and structures are designed to divorce technology workers as much as possible from decision-making. Once practitioners are onboarded, they are viewed as largely interchangeable. For our part, we comply, enticed by the opportunity to surrender the more stressful parts of our profession.
This mindset has led practitioners and technology functions to become degraded order-takers, replacing the stress of responsibility for a solution with only execution. In seeking comfort and safety, we have also surrendered any sense of ownership in our work.
For a high-performing practice, and for high-performing individuals, it is critical that we have a different type of psychological safety. We need to create environments where safety does not come through the transfer of risk, but where we can encounter risk in a supportive way. To have a psychologically safe team, it is critical to give its members the ability to speak their minds.
According to Edmondson, “psychological safety in the workplace is the belief that the environment is safe for interpersonal risk-taking. It is a belief that one will not be punished or humiliated for speaking up with ideas, questions, concerns or mistakes.”
In other words, safety needs to be about the ability to take risks, not to avoid stress.
Creating a culture where mistakes are tolerated
As leaders, we need to ensure that there is a tolerance for mistakes in our organizations, and to take the time to uncover the lessons in those failures. We should encourage risk-taking in our reports and, especially, encourage them to speak up and share their ideas. However, we cannot elevate the practice merely by increasing the number of failures; we must evaluate ourselves by the value we create.
We have much to offer. Collaborative scoping and design leads to products and services that are vastly superior to those designed in isolation by business stakeholders. Factories were originally designed around water wheels, transmitting power through a central shaft. When electricity was introduced, rather than directly powering devices, it was initially used to power the vestigial shaft. This enhanced productivity, but it was a modest improvement.
The true value became apparent only when factories were redesigned around electricity. In the same way, when we use technology to energize outdated approaches, we will see only marginal improvements. It is crucial that we be willing to insert our perspectives. Henry Ford captured this perfectly by saying, “If I had asked what they wanted, they would have said a faster horse.” To be at our best, we need a challenger mindset and a team of supportive leaders. We need to be comfortable being uncomfortable and insert ourselves into the decision-making process.
Encouraging discomfort does not preclude an organization from being psychologically safe; it simply reinforces the need for careful planning and a focus on human factors. Technology work comes with challenges — we simply need to be transparent about risks and to orient our teams continually toward value. Embracing the responsibility and healthy stress of owning the technology in our organizations leads not just to greater professional success, but to greater personal fulfillment.
Jeremy Adamson is an independent data and analytics consultant, an instructor in corporate strategy at the University of New Brunswick, and the author of “Geeks with Empathy” and “Minding the Machines.”