How Boeing’s organisational failures contributed to 737 MAX crashes

Originally published by the Australian Institute of Health & Safety

Key findings from a recent US report that examined the critical organisational failures behind the two Boeing 737 MAX crashes apply equally to many other major accidents and industries, according to Australian National University Emeritus Professor Andrew Hopkins.

The US Federal Aviation Administration (FAA) recently released its expert panel review report, which found there was a disconnect between Boeing’s senior management and other members of the organisation on safety culture.

Specifically, the report said there was an “inadequate and confusing” implementation of the five components of a positive safety culture (namely, reporting culture, just culture, flexible culture, learning culture, and informed culture).

The report, based on more than 4000 pages of Boeing documents, seven surveys, over 250 interviews, and meetings with Boeing employees across six company locations, also said Boeing’s safety management system (SMS) procedures were not structured in a way that ensured all employees understood their role in the company’s SMS. 

“The procedures and training are complex and in a constant state of change, creating employee confusion, especially among different work sites and employee groups,” said the report. It also highlighted a lack of awareness of safety-related metrics at all levels of the organisation: employees had difficulty distinguishing among the various measurement methods, their purposes and their outcomes.

In analysing the report’s findings, Professor Hopkins said the root causes of Boeing’s 737 MAX aircraft crashes can be observed in almost every major accident.

“I know from my own work that a similar set of causes led to the gas plant explosion at Longford, outside Melbourne, in 1998,” said Professor Hopkins, who pointed to a number of parallels between the findings in the FAA report and his book, Lessons from Longford: The Esso Gas Plant Explosion, which examined the findings of the Royal Commission into the incident.

“Boeing claimed that safety was not just a priority; it was their number one priority. But its incentive arrangements tell a different story,” he said.

“Financial incentives paid to senior executives gave and still give almost no consideration to aircraft safety, focusing primarily on share market performance,” said Professor Hopkins, who suggested boards need to find ways to effectively incentivise senior executives to focus on major accident risk, such as the risk of an aircraft crash.

He also observed that engineers who had safety concerns were disempowered by their positions in the organisational structure, because they reported to relatively low-level business managers.

“They need reporting lines that run to a chief engineer, or similar executive, answerable to the CEO and the Board. Such a person should not have any commercial responsibilities,” he said.

Professor Hopkins also said that employees of Boeing were fearful of using the company channels provided to report safety issues and preferred to report these things to their immediate managers.

“This meant that such reports often failed to get the necessary attention. Companies need to have reporting channels for bad news,” he said.

“They need to reward people who report bad news. For example, one company I know has a system for rewarding ‘the best catch of the week’,” he said.

Lastly, Professor Hopkins observed that the safety regulator, the FAA, had delegated much of its inspection role to the company itself.

As such, the Boeing engineers who filled this regulatory role for the FAA faced an inevitable conflict of interest. “Companies cannot be relied on to self-regulate in this way,” he affirmed.

For board directors who might be working with executive teams at risk of making the same mistakes, Professor Hopkins said it is essential to ensure that there is a senior executive whose sole responsibility is safety. 

“Ensure that this person has the staff to exercise this responsibility effectively. Ensure that you are hearing from this senior executive regularly and that they are reporting to the Board about safety issues being encountered,” he said.

“In other words, make sure you are hearing bad news as well as the good news. Make sure you are rewarding senior business managers, not just for the absence of major accidents, which in any case are rare, but for the effective management of major accident risk.”

For OHS professionals and leaders faced with similar challenges – particularly in companies where organisational failures could cause safety-related problems – Professor Hopkins highlighted the fact that Boeing’s safety management system was complex and poorly understood.

“Rather than relying totally on such systems, your primary job should be to ensure that people are reporting the safety issues they confront and that they are being rewarded and acknowledged for these reports,” said Professor Hopkins, who explained that this is the best way to create a risk-aware workforce. 

“Here’s why this is important. Prior to every major accident there are always warning signs that, had they been reported and acted on, would have averted the accident. If you can develop an effective system of accessing this bad news, you will go a long way to preventing these accidents,” he said.
