National Academy of Sciences lab safety culture committee meeting in Boston tomorrow

The National Academy of Sciences committee on “Safety Culture in Academic Laboratories” meets starting tomorrow in Boston. The committee has previously met in Washington, D.C., and Berkeley, Calif.

Speakers for the open part of the meeting include Massachusetts Institute of Technology chemistry professor and safety committee chair Rick L. Danheiser. I spoke with Danheiser about MIT’s safety program for “Learning from UCLA.” Also on the agenda is William B. Tolman, chair of the chemistry department at the University of Minnesota and one of the people involved in Dow’s academic lab safety partnerships. And then there’s Susan S. Silbey, who is head of anthropology at MIT and studies “the creation of management systems for containing risks, including ethical lapses, as well as environment, health and safety hazards.”

I can’t attend the meeting, but if anyone else who does would like to recap it for the Safety Zone, please let me know!

Safety compliance as a route to better safety culture

Members of the NAS academic safety culture committee tour labs at UC Berkeley

One of the things that came up at the National Academy of Sciences’ “Safety Culture in Academic Laboratories” committee meeting a couple of weeks ago was the idea that safety compliance leads to a better safety culture.

Many safety professionals say that a culture of compliance is definitely not the best safety culture. Compliance is about box-ticking on things like standardized training and lab inspections. A good safety culture means that people are thinking through, talking about, and paying attention to what they’re doing so they’re actually working more safely. Compliance will come from a good safety culture, but a good safety culture will not necessarily arise from compliance.

Others argue, however, that safety culture can be improved through compliance. “It’s worked well for us to develop our safety culture through ensuring compliance,” because the compliance component promoted interactions between researchers and safety professionals, said Robert Eaton, director of environmental health and safety at the University of California, San Francisco.

That only works if those interactions on compliance are positive, I suspect. In an organization in which researchers do not respect or understand the role of safety staff, compliance is unlikely to do much for the overall safety culture.

But perhaps compliance is an essential step en route to a better safety culture? Maybe organizations need some sort of base-level safety compliance to be able to move people to the next level–maybe people can’t be brought to think critically about what they’re doing when they’re not even bothering with the basics of eye protection and closed-toe shoes. Representatives from Sandia and Lawrence Berkeley national laboratories presented what they’re doing to push their organizations beyond what sounded like more of a compliance culture to more of a critical thinking culture. To the academics in the room, “You’re at a state we were at 20 years ago,” said J. Charles Barbour, director of the Physical, Chemical, & Nano Sciences Center at Sandia. Even if compliance culture is a necessary phase, though, perhaps academia can take advantage of the knowledge in industry and government labs to move people faster to critical thinking and safer work practices.

One more meeting tidbit: Stanford University chemistry professor Robert Waymouth’s suggestion for how to get recalcitrant faculty on board with lab safety programs was to appeal to their egos–in his words, their “desire for excellence”–with the explicit goal of being better than and informing industry rather than the other way around. (Along with, I hope, a desire not to have their lab members get hurt.)

A final note: At the start of the open session, committee chair Holden Thorp noted that topics discussed during information-gathering do not necessarily indicate what will wind up in the final report.

National Academy of Sciences lab safety culture committee meeting in Berkeley this week

Last month, the National Academy of Sciences kicked off a yearlong study of “Safety Culture in Academic Laboratories.” The project is supposed to focus not so much on what should be done to improve safety in academic labs, but on how to get people to actually do it. C&EN’s Jeff Johnson attended and reported on the first meeting of the committee, which is chaired by H. Holden Thorp. Thorp transitions at the end of this month from chancellor of the University of North Carolina to provost at Washington University in St. Louis.

The second Safety Culture committee meeting is this week, Wednesday and Thursday (June 26 and 27) at the University of California, Berkeley. The agenda is here. Since it’s local to me, I plan to attend, and I’m sure at least one blog post will result.

Improving graduate education in chemistry

Earlier this week, the American Chemical Society released a report on “Advancing Graduate Education in the Chemical Sciences.” ACS president and University of Wisconsin, Madison, chemistry professor Bassam Z. Shakhashiri commissioned the report, charging the commission with defining the purposes of graduate education in the chemical sciences and what steps should be taken to ensure that programs “address important societal issues as well as the needs and aspirations of graduate students.”

One of the five report conclusions was:

Academic chemical laboratories must adopt best safety practices. Such practices have led to a remarkably good record of safety in the chemical industry and should be leveraged.

The commission could easily have folded safety under another conclusion: “Current educational opportunities for graduate students…do not provide sufficient preparation for their careers after graduate school.” Clearly the commission members felt strongly that laboratory safety needed to be called out as a separate point.

The report notes that “students’ lack of familiarity with best practices in laboratory safety … represents a significant gap, regardless of the type of employment the student ultimately pursues,” whether students are looking at academic, industrial, or government positions. The report emphasizes that institutions should develop a culture of working safely rather than just following rules and regulations. In that respect, it jumps off from and references the ACS Safety Culture Task Force report Creating Safety Cultures in Academic Institutions released earlier this year. And the report recommends that ACS develop a comprehensive safety curriculum based on best practices.

The report addresses the finances of safety, too:

The costs of safety practices for research should be built into the indirect costs charged by universities; they should be adequate to provide what is needed (including supplies, equipment, skilled personnel, training, and more). The direct-cost budgets of research grants do not seem to provide the appropriate mechanism for funding safety measures. The top down approach to handling the costs of safety is imperative to make certain there is uniform implementation of safety practices and hardware across all chemical laboratories of a university and to eliminate conflicts of interests among individual PIs making financial decisions regarding safety implementation in their own laboratories.

The costs of safety practices outside research laboratories, most notably in teaching facilities, are inevitably an institutional responsibility. Suitable standards should govern them, and appropriate mechanisms should fund them.

Based on the University of California’s definition of indirect costs–”those that are better calculated on an institutional basis rather than costed-out by project (e.g. research administration and accounting, purchasing, library, space, maintenance)”–safety definitely should be part of overhead. But who pays for what in academic departments can be the subject of intense debate, so it’s nice to see the ACS commission take a clear stand. The commission included two chancellors and one dean, along with many professors and some industry representatives.

Last but not least, a few quotes from the report on the importance of lab safety in graduate education:

Progress would afford better protection to students and other workers at all academic levels and would better prepare students to meet the natural expectations of their future colleagues and employers.

and

[T]oday’s companies demand safety performance from their employees that far exceeds what students are accustomed to in academic settings. There are many safety skills that are easily taught, such as doing hazard analyses, but the core issue is that students must be “grown” to value safety in a manner that is “bone deep” and can drive the highest level of performance, known as interdependent behavior. This culture of safety is often a surprise to newly hired students. It should not be.

and

[T]here is a demonstrated, strong correlation between occupational safety and operating performance of factories.30 A great many industrial organizations have found safety to be powerfully coupled in a general way to productivity. They are not committed just because a safety culture reduces their exposure to liability, but in much greater degree because a bone-deep safety culture protects their people and because workers who consistently think carefully about what they are doing perform better.

30. Veltri, A.; Pagell, M.; Behm, M.; Das, A. A Data-Based Evaluation of the Relationship between Occupational Safety and Operating Performance. J. SH&E Res. 2007, 4, feature 2.

Webinars: Safety culture and chemical safety committees

The University of California posted video of last week’s webinar on “Creating Safety Cultures in Academic Institutions” on YouTube, and I’ve embedded the video below. Still haven’t had time to watch it myself!

Also, there’s another webinar coming up tomorrow on “Enhancing a Culture of Safety Through the Development of a Chemical Safety Committee.” The presenter will be Robert Emery, the University of Texas Health Science Center at Houston’s vice president for safety, health, environment, and risk management. The webinar is scheduled for 11 a.m. Pacific/2 p.m. Eastern.

Chemjobber and Janet Stemwedel discuss #SheriSangji case and academic lab safety culture

This is old, but I didn’t flag it at the time and I think readers might find it useful: Back in August, Chemjobber and Janet Stemwedel of San Jose State University and Doing Good Science had a (long!) conversation about lab safety, which Chemjobber recorded and posted as a podcast. Stemwedel, who got her PhD in chemistry before transitioning to philosophy, followed up by posting transcripts from parts of the discussion. Here are the links:

Podcast: Chemjobber and Prof. Janet Stemwedel talk #SheriSangji

Safety in academic chemistry labs (with some thoughts on incentives)

[on incorporating safety into tenure decisions] … if it became a matter of “Show us the steps you’re taking to incorporate an awareness and a seriousness about safety into how you train these graduate students to be grown-up chemists,” that’s a different kind of thing from, “Oh, and did you have any accidents or not?” Because sometimes the accidents are because you haven’t paid attention at all to safety, but sometimes the accidents are really just bad luck.

Why does lab safety look different to chemists in academia and chemists in industry?

It really does seem that the commenters who are coming from industry are saying, “These conditions that we’re hearing about in the Harran lab (and maybe in academic labs in general) are not good conditions for producing knowledge as safely as we can.” And the academic commenters are saying, “Oh come on, it’s like this everywhere! Why are you going to hold this one guy responsible for something that could have happened to any of us?” It shines a light on something interesting about how academic labs building knowledge function really differently from industrial labs building knowledge.

Community responsibility for a safety culture in academic chemistry

Something bad happened, and the reason something bad happened, I think, is because of a culture in academic chemistry where it was acceptable for a PI not to pay attention to safety considerations until something bad happened. And that’s got to change.

The principles of “inherently safer” processes or experiments

The U.S. Chemical Safety & Hazard Investigation Board released a video a couple of weeks ago on “Inherently Safer: The Future of Risk Reduction.” Although the video stems from CSB and National Research Council investigations into the Bayer CropScience explosion in 2008, the principles of inherently safer processes can also be applied to research-scale experiments.

As outlined in the video, those principles are:

  • Minimize – reduce the amount of hazardous material in the process
  • Substitute – replace one material with another that is less hazardous
  • Moderate – use less hazardous process conditions, such as lower pressure or temperature
  • Simplify – design processes to be less complicated and therefore less prone to failure
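
The same four principles can be applied at the bench before an experiment begins. Below is a purely illustrative sketch, not drawn from the CSB video or any specific program, of how a research group might turn the principles into a simple pre-experiment checklist in Python; all prompts, function names, and example answers are hypothetical.

```python
# Illustrative pre-experiment checklist built on the four "inherently safer"
# principles. All prompts and example answers here are hypothetical.

PRINCIPLES = {
    "Minimize": "Can the experiment be run at a smaller scale, with less hazardous material on hand?",
    "Substitute": "Is there a less hazardous reagent or solvent that would do the job?",
    "Moderate": "Could milder conditions (lower temperature, pressure, or concentration) work?",
    "Simplify": "Can the setup be simplified so there are fewer steps that could fail?",
}

def unanswered(responses):
    """Return the principles that still lack a written justification."""
    return [p for p in PRINCIPLES if not responses.get(p, "").strip()]

if __name__ == "__main__":
    # A partially completed review for a hypothetical reaction.
    responses = {
        "Minimize": "Running at 1 mmol instead of 10 mmol.",
        "Moderate": "Reaction run at 0 °C instead of at reflux.",
    }
    for principle in unanswered(responses):
        print(f"Not yet addressed ({principle}): {PRINCIPLES[principle]}")
```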

“It’s not a specific technology or a set of tools and activities, but it’s really an approach to design and it’s a way of thinking,” said Dennis Hendershot, a consultant with the American Institute of Chemical Engineers Center for Chemical Process Safety, at a 2009 CSB meeting. “The safety features are built right into the process, not added on. Hazards are eliminated or significantly reduced rather than controlled or managed.”

The video goes on to say that the goal of inherently safer process design is not only to prevent an accident but to reduce the consequences of an accident should one occur. A research lab experiment gone wrong, of course, is unlikely to affect the surrounding community in the way that a manufacturing incident might. But research lab incidents have cost millions of dollars, caused injuries ranging from burns to lost eyes, hands, and fingers, and killed several researchers (for more, see the Laboratory Safety Institute’s Memorial Wall).

Dow and Minnesota team up on safety

Dow Chemical and the University of Minnesota (UMN) announced on Monday a pilot program to improve laboratory safety in the university’s chemistry and chemical engineering laboratories.

UMN is one of the universities benefiting from a program Dow announced last year in which the company is investing $25 million per year for 10 years in research programs at 11 academic institutions. The new safety program is independent of that effort but germinated in the relationship established between Dow and the university, says Frank S. Bates, head of UMN’s chemical engineering and materials science department.

The safety program also extends beyond research programs sponsored by Dow. Central to the effort is a Joint Safety Team (JST) made up of the safety officers from every chemistry and chemical engineering research group. “All of those safety officers will be interacting with Dow and working together to learn best safety practices” from the company, says William B. Tolman, chair of the chemistry department.

At a kick-off meeting a few weeks ago, representatives from Dow and the university agreed that their focus would be on building and sustaining a good safety culture. UMN already seems to have some good procedures and protocols in place, says Pankaj Gupta, senior strategy leader for research and development at Dow. The task is to raise awareness of those and to share Dow’s best practices, adapting them to a university setting.

To that end, in the next couple of weeks, Dow and UMN plan to survey chemistry and chemical engineering faculty, postdocs, and students to get their feedback on the current state of laboratory safety and what needs to be improved. Then the program will try to address those concerns by having Dow representatives visit the campus to work with members of the JST. Some or all JST members will also visit Dow, where they will be exposed to things like Dow’s training program, its laboratory audits, and how scientists approach experiments, Gupta says. Repeat surveys will help determine how the program progresses.

Gupta has already surveyed recently hired Dow employees to get their input on the differences between academic and Dow safety culture. “The number one theme that came up again and again was awareness,” Gupta says, adding that other concerns included specifications for protective equipment, protocols, and pre-task analysis. “When our new employees come in, they spend about 30 hours in mandatory training before they can set foot in the lab to do an experiment,” providing an immediate lesson that safety comes first, Gupta says. Monthly safety meetings and pre-task analysis, in which peer groups discuss the hazards of new procedures and what to do if something goes wrong, also reinforce that safety is an integral part of laboratory experiments.

One of the things the pilot program will work on is creating an environment in which it is both expected and comfortable for people to raise questions and work with each other around hazard assessment, says Lori Seiler, associate director for environmental health and safety in research and development at Dow.

The pilot program will run through the summer. Then Dow and UMN will take stock of the effort and figure out how to proceed. Two UMN alumni now employed at Dow—one chemist and one chemical engineer—are on the core team working with the university.

Neither Dow nor UMN comes to the program with the expectation that the university will duplicate Dow’s safety program, Bates says. “But there’s a lot of room between what we’ve done in the past and what they do at Dow,” he says. “Our intention is to make things better in a university setting.”

Key to the effort is the JST, Tolman adds. “We decided early on that it would be actual students and postdocs who would lead the effort, since they’re the ones in the labs,” he says. And the interdepartmental nature of the team should strengthen it, by providing both a common goal and a wider range of experience.

The team should also help address the problem of high turnover in academic labs, Tolman says. Even as some JST members leave every year, their replacements will learn from and be supported by veteran members. And if the safety officers are trained well, they in turn will do a better job of training new research group members, Tolman says.

“My own safety officer from my group came in my office two days ago and she told me flat-out, ‘This is going to make my job easier,’” Bates adds. He hopes that the JST will add some professionalism to the safety officers and promote their authority in the research groups they serve. “And to have a partner at Dow who they can consult with and make contact with occasionally as a resource? That’s just fantastic,” Bates says.

Bates and Tolman say that their faculty members are enthusiastic about the program, even though it means a big time commitment for the safety officers. “We agree it takes time, but it needs to take time. This is important and a high priority for us,” Tolman says.

And although the safety officers may have some busy weeks ahead, in six months or a year from now, “it’s not going to take any more time. I think it will take less time and less concern on the part of the safety officers,” Bates says.