You need privacy to get good data: The Lubitz lesson
Andreas Lubitz intentionally crashed Germanwings flight 9525 into the French Alps, killing all 150 people on board. Lubitz had a history of mental illness and should not have been flying a plane that day. Many news reports were quick to blame German privacy protection laws. (For example, the Times headline read “German obsession with privacy let killer pilot fly”; Time magazine said “German privacy laws let pilot ‘hide’ his illness from employers”; and the Washington Post reported “Crash challenges German identity, notions of privacy.”)
But before we rush to blame privacy protection laws, let us begin with the real question: what systems could we have had in place to stop him from flying, even if we did not care about privacy at all?
Of course, this question applies not just to pilots but to members of many professions we trust not to hurt us. The police come to mind immediately. How about a barber who holds a razor blade at your neck? Or an industrial worker with access to chemicals that could cause an explosion or a catastrophic fire?
The question also applies to matters other than mental illness. Anyone with an infectious disease, even a mild cold, is a potential threat in a hospital ward with many immune-suppressed patients.
If the airline had known the extent of Lubitz’s illness, we can suppose that it would have grounded him[1]. German privacy laws did play at least some role in keeping details of his illness from his employer. But consider how this would have played out if there were no such laws. Germanwings would likely have fired Lubitz: however the monetary aspects may have been handled, there is no question that Lubitz would not have remained a pilot (and possibly would never have become one in the first place). But Lubitz apparently loved to fly: becoming a pilot was his dream. If he had known in advance that his medical record would be shared with his employer, he may never have sought treatment in the first place, thereby ensuring a pristine medical record and remaining qualified to fly.
The crucial point is that decisions based on data are only as good as the data themselves. In real life, data are often dirty, for a variety of reasons, and cleaning them is difficult but necessary before any analysis is attempted. By creating incentives to falsify data, we virtually guarantee that the data will be dirty. Think of how many times you have intentionally provided incorrect or incomplete data, such as your phone number, on a web form. If you feel the website has no legitimate need to know your telephone number, you give it junk that satisfies the requirement that the field not be left blank. And the stakes for you in lying about your phone number are far lower than the stakes for someone like Lubitz in lying about his health: with his career on the line, a prospective pilot will do his best to hide any disqualifying condition.
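The junk-phone-number problem can be made concrete with a small data-cleaning sketch. The function and patterns below are illustrative assumptions, not a real validation rule set; they show the kind of heuristic filtering an analyst must apply before trusting form data:

```python
import re

# Hypothetical patterns for obviously fabricated phone-number entries,
# the kind typed into a web form just to satisfy a "required field" check.
JUNK_PATTERNS = [
    re.compile(r"^(\d)\1+$"),     # one digit repeated: 0000000, 1111111
    re.compile(r"^1234567\d*$"),  # an ascending keyboard run
    re.compile(r"^555\d{4}$"),    # the fictional "555" exchange
]

def looks_like_junk(raw: str) -> bool:
    """Return True if a phone-number entry is probably fabricated."""
    digits = re.sub(r"\D", "", raw)   # strip spaces, dashes, parentheses
    if len(digits) < 7:               # too short to be a real number
        return True
    return any(p.match(digits) for p in JUNK_PATTERNS)

entries = ["555-0100", "(212) 867-5309", "0000000000", "123"]
flags = [looks_like_junk(e) for e in entries]
```

Heuristics like these catch only the lazy liar; a motivated one, with a career at stake, will supply plausible-looking falsehoods that no cleaning step can detect, which is the essay's point.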
So, what is the solution? Separate testing. Employees in critical positions of trust should be tested randomly, periodically, and whenever there is any doubt. This is how many other requirements are managed. Consider alcohol or drug use. Pilots are expected to be sober when they show up to fly. About a dozen pilots are caught every year with a blood alcohol level above the limit, and if someone showed up to work visibly drunk, presumably they would be tested and disqualified. This testing has nothing to do with medical records: pilots can tell their doctors about their alcohol use, and pilots with alcohol abuse problems can seek treatment and receive help without jeopardizing their careers, as long as they manage to show up to work sober. We could test similarly for other conditions. If someone is going through an acute mental crisis, a trained mental health practitioner will likely be able to see that right away. In fact, co-workers may be able to guess something is wrong, just as they would if the employee showed up drunk. Are such tests expensive? Yes. Are they fallible? Yes. But so are judgments made even from perfect and complete medical records. Judgments made on incorrect and intentionally misleading records will certainly be worse, not to mention the health costs to employees of seeking care only outside the system and the needless loss of privacy for professionals in positions of trust.
[1] There is a complicating wrinkle in the case of Andreas Lubitz. He had just seen a psychiatrist and had been given a medical note that he could have used to excuse himself from work that day. Lubitz chose to tear up the note and fly. Should the psychiatrist have alerted the airline directly? Possibly. Extreme actions are justified when there is danger of significant imminent harm. To me, this is the difference between the doctor advising you to stay home when you are sick, so as not to infect others, and the doctor contacting the relevant health authorities to recommend involuntary quarantine. The latter course of action is sometimes justified, but the threshold has to be kept high.