According to the AP:
Security at American airports is no better under federal
control than it was before the Sept. 11 attacks, a congressman says two
government reports will conclude.
The Government Accountability Office, the investigative arm of
Congress, and the Homeland Security Department's inspector general are
expected to release their findings soon on the performance of
Transportation Security Administration screeners.
This finding will not surprise anyone who has flown recently. How
does anyone expect competent security from screeners who don't know the
difference between books and books of matches? Only two books of matches are now allowed on flights; you can take as many reading books as you can carry.
The solution isn't to privatize the screeners, just as the solution
in 2001 wasn't to make them federal employees. It's a much more complex
problem than that. I wrote about it in Beyond Fear (pages 153-4):
No matter how much training they get, airport screeners
routinely miss guns and knives packed in carry-on luggage. In part,
that's the result of human beings having developed the evolutionary
survival skill of pattern matching: the ability to pick out patterns
from masses of random visual data. Is that a ripe fruit on that tree?
Is that a lion stalking quietly through the grass? We are so good at
this that we see patterns in anything, even if they're not really
there: faces in inkblots, images in clouds, and trends in graphs of
random data. Generating false positives helped us stay alive; maybe
that wasn't a lion that your ancestor saw, but it was better to be safe
than sorry. Unfortunately, that survival skill also has a failure mode.
As talented as we are at detecting patterns in random data, we are
equally terrible at detecting exceptions in uniform data. The
quality-control inspector at Spacely Sprockets, staring at a production
line filled with identical sprockets looking for the one that is
different, can't do it. The brain quickly concludes that all the
sprockets are the same, so there's no point paying attention. Each new
sprocket confirms the pattern. By the time an anomalous sprocket rolls
off the assembly line, the brain simply doesn't notice it. This
psychological problem has been identified in inspectors of all kinds;
people can't remain alert to rare events, so they slip by.
The tendency for humans to view similar items as identical makes it
clear why airport X-ray screening is so difficult. Weapons in baggage
are rare, and the people studying the X-rays simply lose the ability to
see the gun or knife. (And, at least before 9/11, there was enormous
pressure to keep the lines moving rather than double-check bags.) Steps
have been put in place to try to deal with this problem: requiring the
X-ray screeners to take frequent breaks, artificially imposing the
image of a weapon onto a normal bag in the screening system as a test,
slipping a bag with a weapon into the system so that screeners learn it
can happen and must expect it. Unfortunately, the results have not been
very good.
This is an area where the eventual solution will be a combination of
machine and human intelligence. Machines excel at detecting exceptions
in uniform data, so it makes sense to have them do the boring
repetitive tasks, eliminating many, many bags while having a human sort
out the final details. Think about the sprocket quality-control
inspector: If he sees 10,000 negatives, he's going to stop seeing the
positives. But if an automatic system shows him only 100 negatives for
every positive, there's a greater chance he'll see them.
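The arithmetic behind that ratio can be sketched in a few lines of Python. The numbers and the function below are illustrative assumptions, not data from the reports; in particular, the machine is assumed to flag every true positive, which is optimistic.

```python
# Illustrative sketch of the two-stage screening arithmetic above.
# All numbers are made up; the point is the ratio of negatives to
# positives that the human inspector actually has to look at.

def human_workload(total_bags, positives, machine_false_positive_rate):
    """Bags a human must review when a machine pre-filters the stream.

    Assumes the machine flags every true positive (perfect recall, an
    optimistic simplification) plus some fraction of the harmless bags.
    """
    negatives = total_bags - positives
    flagged_negatives = negatives * machine_false_positive_rate
    return positives + flagged_negatives

# Without a machine, the inspector sees all 10,000 negatives per positive:
print(human_workload(10_001, 1, 1.0))

# A machine that clears 99% of harmless bags leaves roughly
# 100 negatives per positive for the human to sort out:
print(human_workload(10_001, 1, 0.01))
```

The design point is the same one the excerpt makes: the machine does the boring, uniform elimination it is good at, and the human's attention is spent where positives are no longer vanishingly rare.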
Paying the screeners more will attract a smarter class of worker, but it won't solve the problem.