Friday, August 31, 2007

Introducing Dr. Henry Jenkins

I just read a fascinating interview with Dr. Henry Jenkins of MIT in the most recent 'Game Informer' magazine.  I'll try to pass along a copy once I find it online.  In the meantime, check out the good doctor's own blog:
 
Go to his latest entries:
 
And an article on PBS where he dispels some commonly held myths about gaming and psychology:
And his page at MIT:
 

Wednesday, August 29, 2007

If you are thinking about buying an iPod this week for school

PLEASE WAIT until Wednesday afternoon.

Tips on being a savvier web shopper

Good info for folks who are new to buying online:
 
 
I also recommend http://www.pricegrabber.com, but you should always check the vendors listed on any of these price tools at Reseller Ratings.

Defend Fair Use website launch, backed by the CCIA

 
 
I've been waiting for the consumer electronics industry to start beating the drums on this issue.  It comes down to this: the content industries know they are doomed if they don't adapt to the digital world.  They have put off change for the last 30 years with these so-called DRM experiments, which anyone with half a brain told them couldn't and wouldn't work, and the computer and consumer electronics companies let them get away with it.  If the CCIA is genuine about this, it's a good start.  But many of these CCIA members have ties to the content industries, and the battle is far from over.

Monday, August 27, 2007

Hackers at Microsoft

Microsoft has a new blog for the 'hackers' they employ.  More importantly for us, they discuss the term 'hacker' and why it's not a pejorative, and they also talk about the differences between white-hat and black-hat hackers.

Tuesday, August 21, 2007

DRM shutdown leaves Google video users stranded

http://arstechnica.com/news.ars/post/20070821-google-video-store-gets-stay-of-execution-full-refunds-coming.html

Not so bad if you are one of the few thousand Google Video buyers, or were unfortunate enough to buy into Circuit City's failed DIVX format, but what happens if Apple decides to shut down the iTunes Store, or if one of the studios pulls its catalogue?  Think it can't happen?  If it can happen to Google, Apple is no different.


Monday, August 20, 2007

FW: [IP] Voting excerpts from CRYPTO-GRAM [RISKS] Risks Digest 24.79

From Professor Farber's IP list and Bruce Schneier's Crypto-Gram.

Bruce posits that the vast majority of 'secure' system vendors rely on a trick: getting their potential customers to think about security backward.

Think about how this ties into the 'Freakonomics' theories.

Sam

-----Original Message-----

Begin forwarded message:

Date: Wed, 15 Aug 2007 03:34:56 -0500
From: Bruce Schneier <schneier@SCHNEIER.COM>
Subject: Voting excerpts from CRYPTO-GRAM

    [Note: This item has been PGN-excerpted with Bruce's permission.  
PGN]

                   CRYPTO-GRAM
                 August 15, 2007
                by Bruce Schneier
                 Founder and CTO
                  BT Counterpane
               schneier@schneier.com
              http://www.schneier.com
             http://www.counterpane.com

A free monthly newsletter providing summaries, analyses, insights, and
commentaries on security: computer and otherwise.
For back issues, or to subscribe, visit
<http://www.schneier.com/crypto-gram.html>.

You can read this issue on the web at
<http://www.schneier.com/crypto-gram-0708.html>.  These same essays
appear in the "Schneier on Security" blog:
<http://www.schneier.com/blog>.  An RSS feed is available.

       Assurance

Over the past several months, the state of California conducted the most
comprehensive security review yet of electronic voting machines. People
I consider to be security experts analyzed machines from three different
manufacturers, performing both a red-team attack analysis and a detailed
source code review. Serious flaws were discovered in all machines and,
as a result, the machines were all decertified for use in California
elections.

The reports are worth reading, as is much of the commentary on the
topic. The reviewers were given an unrealistic timetable and had trouble
getting needed documentation. The fact that major security
vulnerabilities were found in all machines is a testament to how poorly
they were designed, not to the thoroughness of the analysis. Yet
California Secretary of State Debra Bowen has conditionally recertified
the machines for use, as long as the makers fix the discovered
vulnerabilities and adhere to a lengthy list of security requirements
designed to limit future security breaches and failures.

While this is a good effort, it has security completely backward. It
begins with a presumption of security: If there are no known
vulnerabilities, the system must be secure. If there is a vulnerability,
then once it's fixed, the system is again secure. How anyone comes to
this presumption is a mystery to me. Is there any version of any
operating system anywhere where the last security bug was found and
fixed? Is there a major piece of software anywhere that has been, and
continues to be, vulnerability-free?

Yet again and again we react with surprise when a system has a
vulnerability. Last weekend at the hacker convention DefCon, I saw new
attacks against supervisory control and data acquisition (SCADA) systems
-- those are embedded control systems found in infrastructure systems
like fuel pipelines and power transmission facilities -- electronic
badge-entry systems, MySpace, and the high-security locks used in places
like the White House. I will guarantee you that the manufacturers of
these systems all claimed they were secure, and that their customers
believed them.

Earlier this month, the government disclosed that the computer system of
the US-Visit border control system is full of security holes. Weaknesses
existed in all control areas and computing device types reviewed, the
report said. How exactly is this different from any large government
database? I'm not surprised that the system is so insecure; I'm
surprised that anyone is surprised.

We've been assured again and again that RFID passports are secure. When
researcher Lukas Grunwald successfully cloned one last year at DefCon,
industry experts told us there was little risk. This year, Grunwald
revealed that he could use a cloned passport chip to sabotage passport
readers. Government officials are again downplaying the significance of
this result, although Grunwald speculates that this or another similar
vulnerability could be used to take over passport readers and force them
to accept fraudulent passports. Anyone care to guess who's more likely
to be right?

It's all backward. Insecurity is the norm. If any system -- whether a
voting machine, operating system, database, badge-entry system, RFID
passport system, etc. -- is ever built completely vulnerability-free,
it'll be the first time in the history of mankind. It's not a good bet.

Once you stop thinking about security backward, you immediately
understand why the current software security paradigm of patching
doesn't make us any more secure. If vulnerabilities are so common,
finding a few doesn't materially reduce the quantity remaining. A system
with 100 patched vulnerabilities isn't more secure than a system with
10, nor is it less secure. A patched buffer overflow doesn't mean that
there's one less way attackers can get into your system; it means that
your design process was so lousy that it permitted buffer overflows, and
there are probably thousands more lurking in your code.
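
[An aside, not part of the newsletter: a minimal C sketch of the kind of patched buffer overflow Schneier is describing. The function and variable names are illustrative only. The bug is a fixed-size buffer filled with no length check; the "patch" is a bounded copy at that one call site, which says nothing about the rest of the code base.]

    #include <stdio.h>
    #include <string.h>

    /* The classic bug: a fixed-size stack buffer filled with no length
       check.  Any input longer than 15 characters writes past the end
       of name[] and corrupts adjacent memory. */
    void greet_unsafe(const char *input) {
        char name[16];
        strcpy(name, input);                  /* no bounds check */
        printf("Hello, %s\n", name);
    }

    /* The "patched" version: a bounded copy fixes this one call site,
       but it does nothing about the thousands of other unchecked copies
       the same design process may have allowed elsewhere. */
    void greet_patched(const char *input) {
        char name[16];
        strncpy(name, input, sizeof(name) - 1);
        name[sizeof(name) - 1] = '\0';
        printf("Hello, %s\n", name);
    }

    int main(void) {
        greet_unsafe("Sam");   /* safe only because this input happens to fit */
        greet_patched("a string much longer than sixteen characters");
        return 0;
    }

The patch makes that one function safe; it does not change the design process that produced it, which is exactly the complaint above.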

Diebold Election Systems has patched a certain vulnerability in its
voting-machine software twice, and each patch contained another
vulnerability. Don't tell me it's my job to find another vulnerability
in the third patch; it's Diebold's job to convince me it has finally
learned how to patch vulnerabilities properly.

Several years ago, former National Security Agency technical director
Brian Snow began talking about the concept of "assurance" in security.
Snow, who spent 35 years at the NSA building systems at security levels
far higher than anything the commercial world deals with, told audiences
that the agency couldn't use modern commercial systems with their
backward security thinking. Assurance was his antidote:

"Assurances are confidence-building activities demonstrating that:
"1. The system's security policy is internally consistent and reflects
     the requirements of the organization,
"2. There are sufficient security functions to support the security 
policy,
"3. The system functions to meet a desired set of properties and *only*
     those properties,
"4. The functions are implemented correctly, and
"5. The assurances *hold up* through the manufacturing, delivery and
     life cycle of the system."

Basically, demonstrate that your system is secure, because I'm just not
going to believe you otherwise.

Assurance is less about developing new security techniques than about
using the ones we have. It's all the things described in books like
"Building Secure Software," "Software Security," and "Writing Secure
Code."  It's some of what Microsoft is trying to do with its Security
Development Lifecycle (SDL). It's the Department of Homeland Security's
Build Security In program. It's what every aircraft manufacturer goes
through before it puts a piece of software in a critical role on an
aircraft. It's what the NSA demands before it purchases a piece of
security equipment. As an industry, we know how to provide security
assurance in software and systems; we just tend not to bother.

And most of the time, we don't care. Commercial software, as insecure as
it is, is good enough for most purposes. And while backward security is
more expensive over the life cycle of the software, it's cheaper where
it counts: at the beginning. Most software companies are short-term
smart to ignore the cost of never-ending patching, even though it's
long-term dumb.

Assurance is expensive, in terms of money and time for both the process
and the documentation. But the NSA needs assurance for critical military
systems; Boeing needs it for its avionics. And the government needs it
more and more: for voting machines, for databases entrusted with our
personal information, for electronic passports, for communications
systems, for the computers and systems controlling our critical
infrastructure. Assurance requirements should be common in IT contracts,
not rare. It's time we stopped thinking backward and pretending that
computers are secure until proven otherwise.


Wednesday, August 08, 2007

Tough questions for and weasel answers from the TSA

http://www.schneier.com/interview-hawley.html

"You could perhaps feel better by setting up employee checkpoints at entry points, but you'd hassle a lot of people at great cost with minimal additional benefit, and a smart, patient terrorist could find a way to beat you. Today's random, unpredictable screenings that can and do occur everywhere, all the time (including delivery vehicles, etc.) are harder to defeat. With the latter, you make it impossible to engineer an attack; with the former, you give the blueprint for exactly that. "

He gives the same argument for not screening workers that he DISMISSED for passengers.  What a load of bull.


Meet the Hackers

 
A good introduction to why we differentiate between the term 'hacker' and other terms like 'cracker'.

More great font advice

 
It's amazing how something as simple as font selection can influence how you feel about what is written.
 
Sam