I don't understand, Nathan, why you are arguing this. Businesses can't use encryption so proprietary that you can't even get hardware it will run on, let alone the software that does the encryption, if in fact there is software involved (that would be military-grade encryption). Businesses need to communicate with outside entities to conduct their business, and that requires standardized encryption.

Nothing that runs everywhere is obscure enough that every hacker on the planet can't take a crack at it. So either they give it a go publicly and the algorithm is improved, or they take a crack at it privately and the algorithm is broken. We all have a vested interest in standardization, even companies that think they have a vested interest in keeping things proprietary.

Exhibit #1: Microsoft IE. They owned the world with their non-standard extensions, but it wasn't enough, and now IE is dead because it was too entrenched in those proprietary technologies.

Exhibit #2: Apple. While they have a decent business, their products will never be more than niche products long term. Even though Apple sells fad products, iOS has less than a 20% market share in both phones and tablets, and they have never even reached 20% with other computing devices. Why? Standardization. It is good for business because businesses need to talk to each other, and it is good for consumers because we don't want to worry whether an app is going to work on our device.

So yes, this is all about people, entities, and organizations that have a vested interest in pushing standards. We all have that vested interest. Some, like Microsoft, may be starting to see the light; some, like Apple, never will, and will remain perpetually relegated to their niche.

Mark Murphy
STAR BASE Consulting, Inc.
mmurphy@xxxxxxxxxxxxxxx


-----Nathan Andelin <nandelin@xxxxxxxxx> wrote: -----
To: Midrange Systems Technical Discussion <midrange-l@xxxxxxxxxxxx>
From: Nathan Andelin <nandelin@xxxxxxxxx>
Date: 04/04/2016 06:05PM
Subject: Re: Encryption algorithm used for the IBMi OS passwords.



Based on my exposure to his writing, I do not personally
believe that Schneier means what you assert he does.


He stated that the passwords were stored in "plain text", and then labeled
that an example of "security by obscurity". I couldn't help noting the
irony.

'Basically, whenever an IT system
is designed and used in secret – either actual secret or simply away
from public scrutiny – the results are pretty awful.'


I accept the generalization. I hope you're willing to accept exceptions.

The proper context is that the airport x-ray scanner was designed in
secret, developed in secret, and deployed in secret, with the thought
that keeping the details secret meant that hackers would not be able to
break it.


I doubt that the developers really believed that "hackers would not be able
to break it". They knew that storing passwords in plain text would make it
vulnerable.
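
For contrast, a minimal sketch of the widely accepted alternative to plain
text: a salted, deliberately slow one-way hash. This is Python using only
the standard library; the function names and iteration count are
illustrative, not anything from the scanner in question:

    import hashlib
    import hmac
    import secrets

    ITERATIONS = 600_000  # illustrative; tune to your hardware budget

    def hash_password(password: str) -> tuple[bytes, bytes]:
        # A random per-user salt defeats precomputed (rainbow-table) attacks.
        salt = secrets.token_bytes(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
        return salt, digest

    def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
        # Constant-time comparison avoids leaking matches via timing.
        return hmac.compare_digest(candidate, digest)

Only the salt and digest are stored; the password itself is never written
anywhere.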

I believe that in most cases it's more a matter of shops weighing the risks
associated with bad practices against the cost of best practices; a
question of resource constraints vs. risk.

If the vendor had released their design for public review, they never
would have used Win98, never would have stored passwords in clear text,
and in short, never would have deployed such a thing. The context is
not Windows 98, but the secret / proprietary / closed / unvetted
'security process.'


I acknowledge benefits associated with public or peer review. Are you
suggesting that public review of your internal systems and 'security
process' should be your highest priority?

Schneier has spent a lifetime decrying proprietary algorithms. Again,
from the same post: 'Smart security engineers open their systems to
public scrutiny, because that's how they improve.'


Schneier fits the profile I mentioned earlier: vendors who make a
living pontificating about "standards".

I think the context, the framework that Schneier is working with when he
says that 'obscurity is insecurity', is Kerckhoffs's principle, which can
be paraphrased as 'the system should remain secure even if the enemy has
a copy of the algorithm.'


I agree with that.
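
To make Kerckhoffs's principle concrete, here is a rough sketch in Python
using the third-party 'cryptography' package (chosen purely for
illustration): the Fernet scheme is fully published, its source is open,
and the only secret in the whole exchange is the key.

    from cryptography.fernet import Fernet  # published AES-based scheme

    key = Fernet.generate_key()  # the only secret; everything else is public
    f = Fernet(key)
    token = f.encrypt(b"payroll batch")
    # An attacker may read the Fernet spec and this source in full;
    # without the key, the token remains computationally infeasible to decrypt.
    assert f.decrypt(token) == b"payroll batch"

Publishing the algorithm costs nothing here, because the design never
depended on its secrecy.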

Sure, it's fine if you keep the exact
algorithm you choose to use a secret as long as that algorithm has been
tested and vetted in the open by experts.


How does your algorithm remain secret if you allow it to be vetted "in the
open"?

But Schneier himself would not do such a
thing: he would publish his algorithm and have the entire security
community work on it, crackers and all. I'm not speculating here, he
has actually done exactly that with Blowfish, Twofish, Threefish, and more.


I'd suggest that his motive might have less to do with strong encryption,
and more to do with wanting his algorithms to be widely used.

Kerckhoffs's principle argues that the secret lies in the key, not in the
algorithm. The top minds in the cryptography field agree that
published, vetted algorithms are superior to obscure, unpublished
algorithms. At least, I don't know of any who disagree.


Again, you're talking about people who have a vested financial interest in
pushing "standards".

Based on conversations with my dad, I believe that the protocols,
practices, and algorithms used by the U.S. military are not vetted "in the
open".
