26 June, 2013
I've still to hear back from @ee.
Sadly, I can't say I'm surprised. I understand all too well the circumstances that make it unlikely we'll see any action until it's too late.
Let's indulge in some role-play. Imagine you're part of a business with a legacy database. You're a decision maker, but held accountable to a large group of shareholders. Your job, fundamentally, is to protect their profits. That's what you're employed to do.
It comes to your attention that the legacy database contains passwords in a format that is easily readable by anyone with access to the data.
Let's assume good intentions here. Let's assume your first reaction is shock and horror. You've read the stories, and you know this ends badly. Something needs to be done.
You really have two choices here: fix the problem fast, before it becomes a bigger problem; or wait and deal with the inevitable fallout.
Fixing the problem up front is the "right" thing to do here, but it's going to cost money. It's likely also going to increase support costs by removing the ability for support staff to quickly help customers who have forgotten their passwords.
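To make the trade-off concrete: the "fix" here is to stop storing passwords in a readable form and store only a salted one-way hash instead. Here's a minimal sketch using Python's standard-library PBKDF2; the iteration count and record shape are illustrative assumptions, not anything from EE's actual system.

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative work factor; tune for your hardware


def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a one-way hash from a password using a random salt.

    The stored (salt, digest) pair cannot be reversed to recover the
    password -- which is precisely why support staff lose the ability
    to read a forgotten password back to a customer.
    """
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest


def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the hash from the attempt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)


salt, digest = hash_password("hunter2")
```

Any internal tool that previously read the password column now gets a yes/no answer from `verify_password` at best, which is exactly the loss of "business value" the economic argument is weighing.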
It's worth looking a little deeper at what "it's going to cost money" really means. This is a legacy system, dating back over a decade. Think about all the internal systems that have been built with the assumption that these passwords were readable. Imagine all the tiny utilities and hacks and scaffolding that are built upon this foundation. All of these add business value. All of these have economic impact. All of these will have to be changed.
Consider as well that a large number of customer-facing systems will also be making these assumptions. This "fix" will require your customers (your oldest, most loyal customers, mind you) to make changes to their systems. You have to communicate these changes, and deal with the support burden of handling those that didn't receive those communications. Or ignored them.
That's before we've even looked at data migrations (which present a huge risk to existing data), security checking and so on and so forth.
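One common pattern for that migration, for what it's worth, is to upgrade each row opportunistically on the user's next successful login rather than in one risky bulk pass. A sketch of the idea, assuming a hypothetical in-memory record shape (real systems would do this against the database, inside a transaction):

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative work factor


def _hash(password: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)


def login(record: dict, password: str) -> bool:
    """Check a login attempt; opportunistically migrate legacy rows.

    Legacy rows hold a readable "plaintext" field; migrated rows hold
    only (salt, digest). The first successful login after the rollout
    replaces one with the other.
    """
    if "plaintext" in record:  # legacy, readable row
        if not hmac.compare_digest(record["plaintext"].encode(),
                                   password.encode()):
            return False
        salt = os.urandom(16)  # migrate on the spot
        record["salt"] = salt
        record["digest"] = _hash(password, salt)
        del record["plaintext"]
        return True
    # already-migrated row: compare derived hashes in constant time
    candidate = _hash(password, record["salt"])
    return hmac.compare_digest(candidate, record["digest"])


user = {"plaintext": "hunter2"}
login(user, "hunter2")  # first login after rollout migrates the row
```

This avoids a big-bang rewrite of the password column, but note it never finishes for dormant accounts, which is part of why the real-world cost and risk remain substantial.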
This is not a small job. The "right" thing here is likely going to cost a huge amount of money, and for no appreciable business benefit. If anything, it will reduce the business's leverage, and curtail its ability to operate and grow.
Now consider the alternative: wait for things to go wrong. Horrible as it feels, this will cost far less money, even in the long term. When the database is leaked, you may get sued. You're insured, and since lawyers are often far more thorough than engineers, you're likely to be covered even in the event that the data was poorly protected. You'll be able to point to policy, to safeguards, and to an economic argument not dissimilar to the above.
In short, you end up in a place where it's suicide to try to fix the problem. More than that, it may even be in breach of contract: your job is to defend your shareholders' profits.
You guys all listen to Roman Mars's 99% Invisible, right? The amazingly talented Philip Roberts put me on to this podcast a few weeks ago, and I've been devouring episode after episode with reckless abandon.
You should subscribe. Seriously, I'll wait.
I mention 99% Invisible for a couple of reasons. First, it's awesome. A group of talented, smart and—dammit—good looking readers such as your fine selves will no doubt love it. Second, though, I want to give credit where credit is due. The following thoughts would not have occurred to me so clearly were it not for An Architect's Code, a recent episode of 99% Invisible.
I won't spoil the episode by trying to capture it here: you should go and listen. What struck me as I was listening, though, was a huge disconnect between other "professions" and the job of software engineering.
Let's imagine the above economic arguments applied to, say, a proposal to construct buildings in a town without proper planning permission. Or, better, with cheaper—but unsafe—building materials.
At this point, the economic argument vanishes. An ethical, professional requirement kicks in that supersedes shareholder profit. Builders have a professional obligation to ensure building safety.
Obviously, some builders will cut corners, but think of the language I'm using here: This is no longer about some builders who "just get the job done" while others "go the extra mile". This is about "bare minimum" versus "cutting corners". These are worlds apart.
You see, the real problem here is not that EE is storing passwords insecurely. That's a symptom. The underlying issue is that successive engineers have been handed that database, and each one has faced an uphill battle to fix the problem because they've had to argue the case of "right" versus "practical". If they refuse to work on the database without it being fixed, there's a whole room full of engineers who are happier to turn a blind eye.
Imagine a building firm that required that its contractors follow hazardous practices. Imagine an economy that encouraged and rewarded that behaviour.
Imagine a hospital whose executives were required by contract to fire surgeons for putting patient safety above profits.
I'm a software engineer. I like to think I have a "professional" attitude, but right now, an attitude is all it really is. My code of ethics is personal, and is not backed by any law or any trade body. If I were the brave engineer at EE who turned whistleblower, or who tried to effect change, I would have no power, no clout. Just my code, my pride, and my P45.
The bar for software engineers, in general, is still "good enough" versus "extra mile", not "professional" versus "cowboy".
This has to change.
As soon as a skilled occupation evolves to the point where malpractice can actively harm a human being's life (remember, if "they" get your email, it's game over), it needs to get itself a code of ethics. It needs to assert what it means to be a "professional" in that occupation. It needs to be able to educate its employers in what it means to screw things up, and it needs to be able to refuse to screw things up in such a way that the employers' jobs get harder, not easier.
In short, it needs a guild.
The status quo should be doing things right, and cutting corners should be the cheap, unprofessional last resort.
I think this, ultimately, is why the situation last week angered me so. It angered me because this screw up wasn't just EE's. As I said in a tweet, this isn't just a policy problem, it's a software problem. It's a software problem created by members of my profession who didn't have the power to say "no" to a demand that wasn't fully understood.
It's my problem. It's our problem. And it has to stop.