Ratha (papertygre) wrote,

Secrecy and security; Societies that are sufficiently open

The risks of overclassification:
The lesson of 9/11 is that we are losing protection by too much secrecy. The risk is that by keeping information secret, we make ourselves vulnerable. The risk is that when we keep our vulnerabilities secret, we avoid fixing them. In an open society, it is only by exposure that problems get fixed. In a distributed information networked world, secrecy creates risk -- risk of inefficiency, ignorance, inaction, as in 9/11. As the saying goes in the computer security world, when the bug is secret, then only the vendor and the hacker know -- and the larger community can neither protect itself nor offer fixes.
From testimony by the Director of the National Security Archive via Bruce Schneier via sui66iy.

I find this pretty convincing.

I am also reminded of a recent conversation with zunger and aaangyl at Sushi-O-Sushi wherein the Kryptonite vulnerability was discussed:
The lock's flaw was apparently first publicized in 1992 in the United Kingdom, according to BikeBiz.com. The BBC even covered it, but the news apparently didn't resurface until a dozen years later.
So, what I want to know is, why didn't the Kryptonite news "take" on the first go-round? If the Bic pen vulnerability was publicized that long ago, why did it only gain widespread attention, and embarrass Kryptonite into fixing the problem, last year? Is society more "open" now because of the Internet?

Does the "open society" approach to security only work if a certain critical threshold of communication speed, ease, and volume is first crossed?
Tags: culture, security