The Department of Homeland inSecurity has unveiled its shiny new terrorist screening system, which will scan people walking through the airport and identify those who appear to be suspicious. That’s right: it will analyze pulse rate, breathing, skin temperature, and even “fleeting facial gestures”. People who are flagged as suspicious will be pulled aside for “enhanced screening”. Hoo boy.

DHS is bragging about this system because in tests with 140 people, some of whom were told to “act suspicious”, it correctly identified 78% of them. 78%! And they act like it’s some kind of triumph that people who were intentionally acting suspicious were flagged as such. A trained police officer, or possibly even an untrained clown off the street, could probably have identified every one of them.
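Here’s a quick back-of-the-envelope to show why a hit rate by itself is meaningless. DHS hasn’t published a false-alarm rate, so the 5% figure and the daily passenger count below are purely illustrative assumptions on my part, nothing more:

```python
# Back-of-the-envelope: a hit rate alone tells you nothing without the
# false-alarm rate. These numbers are illustrative assumptions, not DHS data.

hit_rate = 0.78              # fraction of staged "suspicious" actors flagged (the DHS test)
false_alarm_rate = 0.05      # ASSUMED fraction of ordinary travelers flagged
passengers_per_day = 50_000  # ASSUMED daily traffic at one large airport

false_alarms = false_alarm_rate * passengers_per_day
print(f"Innocent travelers pulled aside per day: {false_alarms:,.0f}")
# -> 2,500 people getting "enhanced screening", on days with zero terrorists
```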

Leaving aside the horrendous implications of a system designed, basically, to analyze our thoughts and intentions, the mind reels to consider how colossally ineffective it will actually be. I think Cory Doctorow’s analysis of such a system (written prior to this announcement) sums it up quite nicely.

If you ever decide to do something as stupid as build an automatic terrorism detector, here’s a math lesson you need to learn first. It’s called “the paradox of the false positive,” and it’s a doozy.

Say you have a new disease, called Super-AIDS. Only one in a million people gets Super-AIDS. You develop a test for Super-AIDS that’s 99 percent accurate. I mean, 99 percent of the time, it gives the correct result: true if the subject is infected, and false if the subject is healthy. You give the test to a million people.

One in a million people has Super-AIDS. One in a hundred people that you test will generate a “false positive”: the test will say he has Super-AIDS even though he doesn’t. That’s what “99 percent accurate” means: one percent wrong.

What’s one percent of one million?

1,000,000/100 = 10,000

One in a million people has Super-AIDS. If you test a million random people, you’ll probably only find one case of real Super-AIDS. But your test won’t identify one person as having Super-AIDS. It will identify 10,000 people as having it.

Your 99 percent accurate test will perform with 99.99 percent inaccuracy.
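To spell out the step between “10,000 flagged” and “99.99 percent inaccuracy”: the test flags about 10,001 people in total, and only one of them actually has the disease. Here’s the same arithmetic as a small Python sketch (the numbers are all from the quote; the code is just a calculator):

```python
# Doctorow's Super-AIDS numbers: one sick person per million, and a test
# that's right 99% of the time on both the sick and the healthy.

population = 1_000_000
sick = 1                      # one in a million
healthy = population - sick
accuracy = 0.99

true_positives = sick * accuracy            # the one real case, usually caught
false_positives = healthy * (1 - accuracy)  # ~10,000 healthy people flagged

flagged = true_positives + false_positives
print(f"People flagged: {flagged:,.0f}")                        # ~10,001
print(f"Odds a flagged person is sick: {true_positives / flagged:.4%}")
# -> roughly 0.0099%: the test is wrong about 99.99% of the people it flags
```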

That’s the paradox of the false positive. When you try to find something really rare, your test’s accuracy has to match the rarity of the thing you’re looking for. If you’re trying to point at a single pixel on your screen, a sharp pencil is a good pointer: the pencil-tip is a lot smaller (more accurate) than the pixels. But a pencil-tip is no good at pointing at a single atom in your screen. For that, you need a pointer (a test) that’s one atom wide or less at the tip.

This is the paradox of the false positive, and here’s how it applies to terrorism:

Terrorists are really rare. In a city of twenty million like New York, there might be one or two terrorists. Maybe ten of them at the outside. 10/20,000,000 = 0.00005 percent. One twenty-thousandth of a percent.

That’s pretty rare all right. Now, say you’ve got some software that can sift through all the bank-records, or toll-pass records, or public transit records, or phone-call records in the city and catch terrorists 99 percent of the time.

In a pool of twenty million people, a 99 percent accurate test will identify two hundred thousand people as being terrorists. But only ten of them are terrorists. To catch ten bad guys, you have to haul in and investigate two hundred thousand innocent people.

Guess what? Terrorism tests aren’t anywhere close to 99 percent accurate. More like 60 percent accurate. Even 40 percent accurate, sometimes.
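Running the New York example at the accuracies the quote mentions makes the scale obvious. These are the quote’s own numbers; “accurate” here means the test gives the right answer that fraction of the time, for terrorists and innocents alike:

```python
# Doctorow's New York example at a few accuracy levels. "Accurate" means the
# test answers correctly that fraction of the time, for guilty and innocent
# alike, matching the quote's usage.

population = 20_000_000
terrorists = 10

def screen(accuracy: float) -> tuple[float, float]:
    """Return (total people flagged, real terrorists among them)."""
    innocents = population - terrorists
    true_pos = terrorists * accuracy
    false_pos = innocents * (1 - accuracy)
    return true_pos + false_pos, true_pos

for acc in (0.99, 0.60, 0.40):
    total, real = screen(acc)
    print(f"{acc:.0%} accurate: {total:>12,.0f} flagged, ~{real:.0f} of them real")
# 99% -> ~200,000 hauled in; 60% -> ~8,000,000; 40% -> ~12,000,000
```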

What this all means is that the Department of Homeland Security has set itself up to fail badly. They are trying to spot incredibly rare events (a person is a terrorist) with inaccurate systems.

I can hardly wait to go to the airport this Friday to fly to St. Louis.