Pinkerton’s Paradox

 By CSC’s Eric Pinkerton

Benjamin Franklin proclaimed that “You can do anything you set your mind to.”  After watching Return of the Jedi when I was nine, I set my mind to telekinesis. It took me about five years of focusing my mind and staring intently at an orange, but eventually I proved Franklin wrong.

We all make mistakes; I don’t know how many I have made during my career, but I’ll bet it’s up there in the thousands — a mistyped IP address or a wrongly calculated subnet mask here, five years lost to the mastery of a fictional ability there.

I’d guess that my obsession with telekinesis was probably rooted in my propensity towards laziness. We are all lazy to some degree, but if I can automate something or make it more efficient to save myself some time, I’m all over it, all too often spending more time on the process than I would have spent just finishing the task in the first place.

I’m comfortable confessing all of this to you here because I can infer with some certainty that if you are reading this, you are probably human and, as such, just as prone to error and indolence as I am. To err is human, after all.

So how does this relate to information security you ask? Allow me to present what I like to call ‘Pinkerton’s paradox.’

We have the classic triangle we are all familiar with in security, but this time, instead of confidentiality, integrity and availability, we have PAE, PAL and PFU (pronounced pay-pal-foo).

PAE = People are Evil
It’s no secret that the PAEs are nasty people, from disgruntled employees and script kiddies to cyber criminals and state-sponsored attackers. This is the part of cybersecurity we all know about; it’s been drilled into us by a million vendors. However, what is often overlooked is that all of these adversaries have one thing in common: they would all be completely benign without help from the PFUs.

PFU = People Make Mistakes
As I have already stated, everyone makes mistakes, and here is where it gets interesting. Think of the supply chain that results in IT services being delivered from base elements through hardware manufacturing and software coding to operational solutions. Thousands of humans are involved at every step and for every component. Manufacturers, engineers, developers, architects, implementers, testers, integrators, operational staff, etc., etc., and any error or oversight at any stage has the potential to become a security vulnerability.

It is simply impossible to write perfect code, and imperfect code is compiled by imperfect code, which is interpreted by imperfect code on imperfect hardware. Put plainly, it’s a miracle that anything works at all.

PFUs are also compounded by their PALs.

PAL = People are Lazy
Now it’s both unfair and uncharitable to simply assert that everyone is lazy, so I will bundle those folks in with a whole team of equally unfortunate souls: the under-resourced, underfunded, poorly motivated, underpaid, disempowered, misfocused teams that abound in today’s organizations. Regardless of which label we choose, the outcome is the same: known issues are left unaddressed.

As a security consultant I often hear comments that imply such issues, and they fall into three categories:

That’s not my job — “The project’s role was to do x, but it’s operations’ responsibility to do y.”

That is my job, but I’m unable to do it — “We know about that, but it’s just too hard to fix.”

That is my job, but I am unwilling to do it — “We tried to fix that, but it broke something else and we were told never to…”

In such cases the problems are known to people working inside the company, and the solutions are often embarrassingly simple. Yet companies endlessly exhaust most, if not all, of their resources on technologies to spot and stop the evil people over whom they have no control, influence or understanding, when it might be more productive to address the internal issues that are rarely framed as security issues at all. I see this as akin to treating the symptoms rather than the cause.

By exploring and better understanding how these things are intertwined, we can be better informed and make better decisions about how we approach security.

Remember you can do almost anything you set your mind to.


Eric Pinkerton, a CSC Cybersecurity principal security consultant, has worked on numerous cloud assurance engagements, including complex control audits, detailed threat risk assessments and technical configuration reviews. Pinkerton is also proud to have contributed to both the forthcoming NESAF Cloud Security Framework and the current CSA Cloud Controls Matrix.


