A few months ago I wrote about The Black Swan of Security – how major data loss events have 3 common characteristics –
1) A major data loss event comes as a complete surprise to the company.
2) Data loss has a major impact, to the point of maiming or destroying the institution (note the case of CardSystems).
3) Data loss is ‘explained’ after the fact by human hindsight (Hannaford Supermarkets, Bank of America… hackers, viruses, drive-by Wi-Fi attacks…).
A colleague of mine, a mathematician by training and a banking executive by vocation, saw one of my presentations on Black Swan data security and told me I must read Imperfect Knowledge Economics by Professor Roman Frydman of NYU. I’ll take it out of the library as soon as I can get over to the Hebrew U campus on Mount Scopus. Everything Roman Frydman and Michael D. Goldberg write about economic models surely holds true for information security today.
Why do our security threat models fail to account for what actually happens in the real world and in cyberspace? What drives the aggregate outcome of a multi-billion dollar security and compliance industry (1 percent of US GDP) that failed to prevent the global financial crisis and the leakage of over 250 million credit cards? Is “self-interest” really sufficient to explain security rationality? What is the role of history, social context and common values in protecting digital assets and systems? How should threat models be used by policymakers and professional investors?
To paraphrase John Kay, writing about the book in the Financial Times: “the quest for advanced security technology gets in the way of useful security countermeasures.”
