Yesterday I talked about the “Safety-Critical Mindset” as one of the key requirements of being a successful developer in the medical device industry. I see three components of the Safety-Critical Mindset:
- a deep and constant appreciation for the responsibility you carry, which drives…
- a healthy paranoia about all the ways your device can fail and hurt people, tempered by…
- engineering judgment that you use to objectively evaluate risks and the mitigations for those risks
Let’s dig into each of these a bit more.
Remember the stakes
When you’re developing safety-critical devices such as medical devices, you can’t ever forget what’s at stake. Real people can be injured or killed by your work, if you don’t do your job correctly.
It’s easy to say now. It’s hard to remember when the going gets tough.
It’s hard to remember when you’re asked to estimate how long the development will take, and you know management doesn’t want to hear the real answer.
It’s hard to remember when you’re deep in the details of the development process, and so focused on just getting the damn thing done.
It’s hard to remember when investors are threatening to pull funding because the project is late.
It’s hard to remember when you discover an issue during development that could cause the materials cost to increase dramatically, meaning that the planned price point for the product won’t be profitable.
Using your engineering skills to make things that people’s lives depend on… it’s an enormous responsibility. You can’t forget it. Even when it’s hard.
Cultivate a healthy paranoia
Now that you are acutely aware of what’s at stake, you should rightfully be nervous. How do you productively channel this anxiety? Let it fuel you as you examine all the ways things can go wrong. In order to be sure that you’ve made your device safe, you must first determine all the ways in which it is NOT safe.
There are a couple of popular methods to do this:
- FMEA (Failure Modes and Effects Analysis) is a “bottom-up” method where you examine individual components in your design and think through what happens if they fail in different ways, and if patient harm could result.
- FTA (Fault Tree Analysis) is a “top-down” method where you start with the ways a patient can be harmed by your device, and dig down into the possible causes.
It’s often appropriate to do both types of analysis for a medical device, as the two processes differ in the types of issues they tend to uncover.
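To make the bottom-up flavor of FMEA concrete, here is a minimal sketch of scoring failure modes with a Risk Priority Number (RPN = Severity × Occurrence × Detection), a common FMEA ranking technique. The components, scores, and 1-10 scales below are hypothetical examples, not values from any particular standard or device:

```python
# Minimal FMEA sketch. RPN = Severity x Occurrence x Detection, each
# scored 1-10. All components and scores here are made-up illustrations.
from dataclasses import dataclass

@dataclass
class FailureMode:
    component: str
    mode: str
    severity: int    # 1 (negligible) .. 10 (catastrophic harm)
    occurrence: int  # 1 (remote) .. 10 (frequent)
    detection: int   # 1 (certain to detect) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("battery", "cell short circuit", severity=9, occurrence=2, detection=4),
    FailureMode("pump motor", "bearing wear, reduced flow", severity=6, occurrence=4, detection=3),
    FailureMode("alarm speaker", "open solder joint, no alarm", severity=8, occurrence=3, detection=7),
]

# Rank by RPN so the riskiest failure modes get mitigation attention first.
for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{fm.component}: {fm.mode} -> RPN {fm.rpn}")
```

Note how the silent alarm ranks highest even though a battery short is more severe: a failure that is hard to detect compounds risk, which is exactly the kind of insight the bottom-up walk through components tends to surface.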
Regardless of how you do it, you must be diligent in your risk analysis. This is where the healthy paranoia will be your ally. Constantly ask yourself:
- “Have I thought of everything?”
- “What else could go wrong?”
- “Even if this doesn’t cause immediate harm, does it make it possible for harm to occur if something else goes wrong?”
The process of performing a rigorous risk analysis of a medical device is tedious. There’s just no getting around it. After doing this for weeks (and it typically takes weeks, if not months), you’ll be sick of it. You’ll want to get back to doing the fun work of designing and building.
Don’t ignore the little voice in your mind telling you that you’re missing something. Review your work multiple times with fresh eyes, and solicit others to poke holes in the design.
Be objective and quantitative
It’s possible to swing too far in the paranoia direction and want to bury your device development in layers of “safety” that are not necessary. Overstatement of risks doesn’t help anyone. If you do a rigorous analysis and that analysis tells you that the risk is acceptable, then you’ve done your job.
Be objective, and as quantitative as possible, when evaluating risks. Instead of using hand-wavy statements like “it could fail early”, come up with numerical statements like “This part will last at least ten thousand mechanical cycles”. If there is uncertainty, try to bracket the problem. What’s the best case and the worst case? If you substitute the worst-case reliability numbers into your analysis, what does that tell you?
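The bracketing idea can be shown with a few lines of arithmetic. All numbers here are hypothetical: suppose testing bounds a part’s fatigue life between 10,000 cycles (worst case) and 50,000 cycles (best case), and the device is expected to see about 20 cycles per day over a 5-year service life:

```python
# Hypothetical bracketing exercise: does the part's cycle life cover
# the device's expected lifetime demand? All figures are illustrative.
cycles_per_day = 20
service_years = 5
lifetime_cycles = cycles_per_day * 365 * service_years  # demand over service life

best_case_life = 50_000   # cycles, optimistic test result
worst_case_life = 10_000  # cycles, pessimistic test result

print(f"Lifetime demand:   {lifetime_cycles} cycles")
print(f"Best-case margin:  {best_case_life / lifetime_cycles:.2f}x")
print(f"Worst-case margin: {worst_case_life / lifetime_cycles:.2f}x")
# A worst-case margin below 1.0 means the part may not survive the
# service life: the analysis flags a concrete gap, not a vague worry.
```

Substituting the worst case turns “it could fail early” into “under pessimistic assumptions the part covers only about a quarter of the required cycles”, which is a statement you can act on.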
You’ve identified a hazard in your medical device, and now you need to mitigate it.
- Sometimes you can modify the design to eliminate or greatly reduce the chance of occurrence.
- Sometimes you have to change the indications for use of your device – explaining that it can’t be used under certain conditions.
- Sometimes, all it takes is an addition to the user manual explaining how to deal with a certain situation.
You must objectively evaluate the different mitigation options and decide which ones are sufficient.
There’s a spectrum between ignoring or trivializing real risks and acting like Chicken Little. A key to successfully developing medical devices is finding the right balance.
- Remember what is at stake, and feel the fear
- Use that fear to motivate you to be rigorous and thorough
- Objectively evaluate risks and mitigations