Bad SSL: security awareness in interesting times (1)

Secure websites, browsers, and how they differ from plain-text websites

The details behind a (web) connection may not interest a general audience, and are in any case not immediately meaningful enough to bother with.
The intention of this post is to make a rather delicate issue understandable to as many non-technical people as possible: an effort is made to clarify the substance of the issue and its implications, while avoiding jargon and technicalities.

TL;DR
  • Securing communication with a website takes more than using https.
  • The technology and cryptographic science behind secure internet communication are highly complex; nevertheless, at any given time, publicly and freely available best practices exist to keep the actual level of security consistent with the purpose of a given resource.
  • This matters today, given the indiscriminate surveillance system operating on a global scale, which stores - in an unbelievably detailed timeline - the profiles, actions and preferences of suspects and free citizens alike.
  • The badssl project, using data collected from the worst-ranking secure websites as tested by a third-party service, shows that several sensitive websites do not offer proper protection of their data.


Despite inevitable differences among the various browsers used at home, at work or on the move, all of this software provides an indication when a website is reached through a secure connection (as opposed to a plain-text one), and some generate a warning when the secure connection shows an anomaly (e.g. a security certificate still active past its expiry date, or one not matching the actual domain name reached by the user). The most important difference between an https and an http site should essentially be that even an attacker in a so-called privileged position should not be able to tap or inject data into a secure communication, whereas both possibilities are open to the same attacker when dealing with plain text.
Most of the time, the browser only tells whether or not a connection is secure ...in a certain sense: namely, whether it makes use of the so-called secure version of the http protocol, known as https.

Generally, through the eyes of a browser, a website uses https or not
and that's all we immediately know about that secure communication.

This alone has never represented sufficient information to determine the actual degree of security a certain website provides. Sometimes, important information such as "Your connection to credit.ford.com is encrypted with obsolete cryptography" is just a couple of clicks away, under the details behind the padlock icon, while the icon itself shows no difference from an A+ https resource.

Green padlock icon in Google Chrome (Apple OS X, May 2015)
Same connection/session - just some more details displayed: should the padlock really be green?
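For readers comfortable with a little code, those details behind the padlock can also be retrieved programmatically. Below is a minimal sketch using Python's standard ssl module (the host name is simply the one from the example above; any https site works): it prints the negotiated protocol version and cipher suite, i.e. the information the padlock icon summarises away.

    import socket
    import ssl

    # A minimal sketch: connect to an https site and print the negotiated
    # protocol version and cipher suite - the details hidden behind the
    # browser's padlock icon.
    def connection_details(host: str, port: int = 443) -> None:
        context = ssl.create_default_context()  # system trust store, sane defaults
        with socket.create_connection((host, port)) as sock:
            with context.wrap_socket(sock, server_hostname=host) as tls:
                name, protocol, bits = tls.cipher()
                print(f"{host}: {protocol}, {name} ({bits} bit)")

    connection_details("credit.ford.com")  # host taken from the example above

An obsolete protocol version or cipher suite printed here corresponds exactly to the kind of warning quoted above, even while the padlock stays green.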

What makes a secure website and what it takes to maintain one

A secure website implies - important to stress: at the time of writing - taking care of at least 7 aspects (and more than 20 so-called protocol details) of the configuration, which makes evaluating the security of any implementation a rather complex task, to say the least. These 7 aspects cover the selection, implementation and testing of different engines, variables and ciphers, and are a valuable example of the living nature of security applied to continuously evolving knowledge and technology: minor and major elements of the whole picture continuously become deprecated, then obsolete. This does not always happen with the same publicity, and the parties involved do not always follow the developments affecting their own products and services by releasing up-to-date versions of their software.
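To make one of those aspects concrete - a hedged sketch, not a recipe - here is how a program using Python's standard ssl module can pin a minimum protocol version instead of trusting whatever default its installed release happens to ship with:

    import ssl

    # One configuration aspect in isolation: explicitly refuse deprecated
    # protocol versions rather than rely on library defaults, which vary
    # (or fail to vary) from release to release.
    context = ssl.create_default_context()
    context.minimum_version = ssl.TLSVersion.TLSv1_2  # SSLv3, TLS 1.0 and 1.1 rejected

Multiply this single line by the other protocol details - cipher selection, key sizes, certificate chains and so on - and the complexity of the evaluation task becomes apparent.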

Just an anecdote about a high-severity vulnerability (CVSS scores 10, 8.1, 8.1) I personally reported 3 years ago through the US DHS CERT: unauthenticated access to the configuration of a switch, which the vendor downplayed, arguing the product was, back then already, End Of Life. Well, three years later, the switch is still present on the manufacturer's corporate website and absent from the official EOL page, while there has been no update on the CERT page since the report.
[Update, May 2017: the switch is now listed on the EOL page - though not until January 2016, three and a half years after the report, by the way.]

People working in information security have probably heard this mantra more often than necessary: "Security is not a product but a process", as in "Information will eventually be compromised, either via a vulnerability discovered after the release of a product or by means of a - direct or indirect - technique not known at the time the software was published. Keeping software (and the underlying OS) consistently up to date with patches and configuration updates is key to reducing the so-called attack surface as part of the global security posture".

As with most things, in fact, even in information security research there is often a certain margin for deeper knowledge that will eventually be reached and either used to make a product more secure or abused for criminal purposes.

By definition, a secure website should at any given time be so:
what is the point of a secure website that is no longer secure after a while?
Continuous commitment is simply unavoidable.
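What that continuous commitment can look like in practice: a recurring check that, for example, a certificate is not quietly approaching its expiry date. A minimal sketch, again with Python's standard library (the host name is purely illustrative):

    import socket
    import ssl
    import time

    # A sketch of a recurring maintenance check: how many days until
    # the site's certificate expires?
    def days_until_expiry(host: str, port: int = 443) -> int:
        context = ssl.create_default_context()
        with socket.create_connection((host, port)) as sock:
            with context.wrap_socket(sock, server_hostname=host) as tls:
                cert = tls.getpeercert()  # the validated certificate, as a dict
        expires = ssl.cert_time_to_seconds(cert["notAfter"])
        return int((expires - time.time()) // 86400)

    print(days_until_expiry("example.org"))  # illustrative host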

It is simple to verify that such an effort is not consistently made, even for sensitive resources where we would take an added degree of security for granted.

Reasons to care about proper secure server configuration (yes, as a user as well)

There are still practical MiTM (man-in-the-middle) techniques an average user may fall victim to, from rogue free wifi at fast-food restaurants, airports, shopping malls and hotels, to home-router hacks (seasonally occurring on a bulk scale). Deliberate or incidental target, cybercriminal or adversary state: it does not really matter - one's most private information could, more often than we would wish, be compromised.
The revelations about the nature of the operations conducted by intelligence agencies all over the world - including those of countries considered democratic - and the extent of the surveillance, already operational, indiscriminate and extremely aggressive, could turn even an indifferent person into a paranoid individual who starts using strong encryption even to text a grocery list. This is, however, generating some technological, social and political debate.

While in the last millennium (suspected) criminals and terrorists were traced and monitored based on their specific set of residential or mobile phones, it seems that nowadays the possibilities offered by technology, combined with enormous financial resources, have made bulk collection a more convenient method to combat these threats: a permanent collection of information on a world scale, described by some as necessary or inevitable. These good fellas (several intelligence agencies around the globe, implying public finances) have gone as far as:

...and what not. The cache of documents Edward Snowden entrusted to Laura Poitras and Glenn Greenwald is probably still large and deep enough for some years to come.

Being a (suspected) criminal or terrorist is no longer necessary for one's life
to be captured and stored, in unimaginable detail ...for future use.

Even a minor change to one of the several aspects that make up a secure connection may compromise the integrity of the communication, and the NSA, for one, has gone as far as promoting the general adoption of a specific cryptographic algorithm, which has led to a certain controversy.

When an improvement takes place at a certain depth, we may not notice or hear about it, but we will most probably be affected by the change; some advancements in technology are announced with emphasis (think of plug-in hybrid cars or 4G/LTE mobile connectivity), whilst others go nearly untold (think of facts related to genetically modified foods).

Assessing secure websites

Given that there is no definitive or universally applicable secure configuration, and that assessing a secure host requires highly specialised knowledge, it may seem inevitable that we will never determine to what extent our communication is protected.

Without the effort of select members of the community, such a sensitive matter as cryptography applied to contemporary communication would probably remain an obscure discipline for geeks; despite observations that they may not have been consistently vigilant, multiple open-source or otherwise non-profit initiatives have been essential in validating existing secure protocols and algorithms and in evaluating proposals for improvement or innovation in cryptology. A community-funded project has recently concluded the source-code review of a controversial piece of open-source cryptographic software - TrueCrypt - which at various moments in time had been suspected of containing a backdoor.

Since the publication of the NSA files, the security of certain communication has become more relevant, and several organisations have implemented secure http for their websites and TLS-based encryption for email and file-transfer protocols.
It might not always be essential - e.g. when checking the weather, I guess - and while knowing a communication is secure would still reassure someone shopping online, an indication that the secure http protocol is used is just one of the many elements required for a secure communication; the configuration used at the client side may also be more or less secure (that alone is a topic in itself).
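A small sketch of what "more or less secure at the client side" can mean, again using Python's standard ssl module: the very same connection code is safe or unsafe depending on the context it is handed.

    import ssl

    # Client-side configuration matters too: a verifying context versus
    # one that silently accepts any certificate.
    strict = ssl.create_default_context()  # verifies the certificate chain
    assert strict.check_hostname           # ...and that the name matches

    lax = ssl.create_default_context()
    lax.check_hostname = False
    lax.verify_mode = ssl.CERT_NONE        # accepts any certificate: MiTM-friendly

Client code configured like the second example silently defeats everything the server side may have got right.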

Qualys has for several years been offering, as a free service, the SSL Labs Test for servers and browsers, the most comprehensive security assessment for secure websites to date (for the record, I am just a regular user of the service and not affiliated with Qualys in any way).
Even before the extensive details returned with every test, the service home page publishes three lists, containing respectively:
  • the most recent 10 best scores
  • the most recent 10 sites tested
  • the most recent 10 worst scores
The result is a grade from F to A+ (worst to best), along with some special tags, such as T for trust issues or M for a certificate name Mismatch.

According to my - not definitive - knowledge of the web application, F grades, together with these and other special cases, end up in the most recent 10 worst scores list.
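For completeness: besides the web front end, SSL Labs also exposes its assessments through a public API, so the same grades can be fetched from a script. A hedged sketch (the v2 analyze endpoint and its parameters are taken from the API documentation current at the time of writing; the tested host is just the earlier example):

    import json
    import time
    import urllib.request

    API = "https://api.ssllabs.com/api/v2/analyze"

    # Poll the SSL Labs assessment API until the report is ready, then
    # print the grade of every endpoint of the tested host.
    def grades(host: str) -> str:
        url = f"{API}?host={host}&fromCache=on&all=done"
        while True:
            with urllib.request.urlopen(url) as response:
                report = json.load(response)
            if report.get("status") in ("READY", "ERROR"):
                break
            time.sleep(10)  # a fresh assessment can take a few minutes
        return ", ".join(e.get("grade", "-") for e in report.get("endpoints", []))

    print(grades("credit.ford.com"))  # host from the earlier example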

part 2
