NoticeBored

NBlog November 22 - A to Z of social engineering controls

NoticeBored - Tue, 11/21/2017 - 10:30pm
I didn't quite finish the A-to-Z on social engineering methods yesterday as planned but that's OK, it's coming along nicely and we're still on track. 
I found myself dipping back into the A-to-Z on scams, con-tricks and frauds for inspiration or to make little changes, and moving forward to sketch rough notes on the third and final part of our hot new security awareness trilogy: an A-to-Z on the controls and countermeasures against social engineering. Writing that is my main task for today, and all three pieces are now progressing in parallel as a coherent suite.
It's no blockbuster but I have a good feeling about this, and encouraging feedback from readers who took me up on my offer of a free copy of the first part.
Along the way, a distinctive new style and format has evolved for the A-to-Zs, using big red drop caps to emphasize the first item under each letter of the alphabet. I've created and saved a Word template to make it easier and quicker to write A-to-Zs in future - a handy tip, that, for those of you who are singing along at home, writing your own awareness and training content.
I'd like to include some graphics and examples to illustrate them and lighten them up a bit, but with the deadline fast approaching that may have to wait until they are next updated. Getting the entire awareness module across the line by December 1st comes first, which limits the amount of tweaking time I can afford - arguably a good thing as I find this topic fascinating, and I could easily prepare much more than is strictly necessary for awareness purposes. 
Aside from that, the release of an updated OWASP Top Ten list of web application security risks prompted me to update our information security glossary with a couple of new definitions, and a Radio NZ program about a book fair in Edinburgh (!) prompted me to explain improv sessions as a creative suggestion for the train-the-trainer guide for the social engineering module.
Breaking news about Uber losing millions of personal records to hackers has the potential to become a case study at some point. Initial rather vague news reports speak of hacking user credentials from GitHub and using them to access and steal info from cloud storage services, and raise concerns about the way the privacy noncompliance incident was handled and concealed, which in turn hints at a governance issue - in other words, this looks like becoming yet another multi-faceted incident, relevant to several infosec topics. Possibly, as with the Sony Pictures Entertainment incident, there may be enough meat on the bones to merit creating a special awareness module all by itself: it depends how the story evolves from here, and how much pertinent information is published.
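The reported attack path - credentials left in a code repository, later reused against cloud storage - is one that basic hygiene checks can catch before the code is ever pushed. As a minimal illustrative sketch (the patterns and the snippet scanned below are invented for the example, not taken from the Uber reports), a couple of regular expressions are enough to flag the most obvious hard-coded credentials:

```python
import re

# Illustrative patterns only; real secret scanners use many more,
# plus entropy checks, to reduce false negatives.
SECRET_PATTERNS = {
    "aws_access_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "generic_password": re.compile(r"password\s*=\s*['\"][^'\"]{8,}['\"]", re.IGNORECASE),
}

def scan_for_secrets(text):
    """Return (pattern_name, matched_text) pairs for any suspected secrets."""
    findings = []
    for name, pattern in SECRET_PATTERNS.items():
        for match in pattern.findall(text):
            findings.append((name, match))
    return findings

snippet = 'aws_key = "AKIAABCDEFGHIJKLMNOP"\npassword = "hunter2hunter2"'
print(scan_for_secrets(snippet))
```

Even this crude filter, run as a pre-commit check, would catch the classic blunder of committing a live access key alongside the source code.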
Categories: NoticeBored

NBlog November 21 - A to Z of social engineering techniques

NoticeBored - Tue, 11/21/2017 - 2:39am
On a roll from yesterday's A-to-Z catalog of scams, con-tricks and frauds, I'm writing another A-Z today, this time focusing on social engineering techniques and methods.  
Yesterday's piece was about what they do.  Today's is about how they do it.
Given my background and the research we've done, it's surprisingly easy to find appropriate entries for most letters of the alphabet, albeit with a bit of creativity and lateral thinking needed for some (e.g. "Xtreme social engineering"!).  That's part of the challenge of writing any A to Z listing ... and part of the allure for the reader. 
What will the Z entry be?  As of this moment, I don't actually know but I will come up with zomething!
Both awareness pieces impress upon the reader the sheer variety of social engineering, while at the same time the alphabetical sequence provides a logical order to what would otherwise be a confusing jumble of stuff. Making people aware of the breadth and diversity of social engineering is one of the key learning objectives for December's NoticeBored module. Providing structured, useful, innovative awareness content is what we do.
We hope to leave a lasting impression that almost any social interaction or communication could be social engineering - any email or text message, any phone call or conversation, any glance or frown, any blog item (am I manipulating your thoughts? Am I persuading you to subscribe to NoticeBored? Look deeply into my eyes. Concentrate on the eyes. You are starting to feel drowsy ...)
Yes, hypnosis will make an appearance in today's A-Z.  It's not entirely serious!
Tomorrow, once the second piece is done, I'd like to complete the set with a third concerning the controls against social engineering. Can we come up with a reasonable list of 26? Come back tomorrow to find out how that turns out.
Categories: NoticeBored

NBlog November 20 - an A to Z catalog of social engineering

NoticeBored - Mon, 11/20/2017 - 12:14am

A productive couple of days' graft has seen what was envisaged to be a fairly short and high-level general staff awareness briefing on social engineering morph gradually into an A-to-Z list of scams, con-tricks and frauds.
It has grown to about 9 pages in the process. That may sound like a tome, over-the-top for awareness purposes ... and maybe it is, but the scams are described in an informal style in just a few lines each, making it readable and easily digestible. The A-to-Z format leads the reader naturally through a logical sequence, perhaps skim-reading in places and hopefully stopping to think in others.
For slow/struggling readers, there are visual cues and images to catch their eyes but let's be honest: this briefing is not for them. They would benefit more from seminars, case studies, chatting with their colleagues and getting involved in other interactive activities (which we also support through our other awareness content). The NoticeBored mind maps and posters, for instance, express things visually with few words.
Taking a step back from the A-Z list, the sheer variety and creativity of scams is fascinating, and I'm not just saying that because I wrote it! That's a key security awareness lesson in itself. Social engineering is hard to pin down to a few simple characteristics, in a way that workers can be expected to recognize easily. Some social engineering methods, such as ordinary phishing, are readily explained and fairly obvious but even then there are more obscure variants (such as whaling and spear phishing) that take the technique and threat level up a gear. 
It's not feasible for an awareness program to explain all forms of social engineering in depth - impossible, in fact. It's something that an intensive work or college course might attempt, perhaps, for fraud specialists who will be fully immersed in the topic, but that's fraud training, not security awareness. We can't bank on workers taking time out from their day-jobs to sit in a room, paying full attention to their lecturers and scribbling notes for hour after hour. There probably aren't 'lecturers' in practice: most of this stuff is delivered online today, pushed out impersonally through the corporate intranet and learning management systems.
Our aim is to grab workers' attention, fleetingly, impart useful information and guidance, and motivate them to take even more care in future: yes, that's a benign form of social engineering. Maybe we should include it in the A-to-Z?
[Email me for a FREE copy of this briefing]
Categories: NoticeBored

NBlog November 19 - IoD advises members to develop "cyber security strategy"

NoticeBored - Sat, 11/18/2017 - 8:57pm

A report for the UK Institute of Directors by Professor Richard Benham encourages IoD members to develop “a formal cyber security strategy”.
As is so often the way, 'cyber' is not explicitly defined by the authors although it is strongly implied that the report concerns the commercial use of IT, the Internet, digital systems and computer data (as opposed to cyberwar perpetrated by well-resourced nation states - a markedly different interpretation of 'cyber' involving substantially greater threats).

A 'formal cyber security strategy' would be context dependent, reflecting the organization's business situation. That broader perspective introduces other aspects of information risk, security, governance and compliance. All relevant aspects need to be considered at the strategic level, including but not just 'cyber security'. 
Counteracting or balancing the desire to lock down information systems and hence data so tightly that its value to the business is squeezed out, 'cyber security strategy' should be closely aligned with, if not an integral part of, information management. For instance, it should elaborate on proactively exploiting and maximizing the value of information the organization already holds or can obtain or generate, working the asset harder for more productive business purposes. In some circumstances, that means deliberately relaxing the security, consciously accepting the risks in order to gain the rewards. 
I find it ironic that the professor is quoted: “This issue must stop being treated as the domain of the IT department and be the subject of boardroom policy. Businesses need to develop a cyber security policy, educate their staff, review supplier contracts and think about cyber insurance.” Does he not appreciate that, in common parlance and understanding of the term, cyber is the geeks' domain, their home turf? Over-use of both 'cyber' and 'security' biases the entire report and perpetuates the issue, unfortunately.
'Information risk management' would be a more appropriate term since it concerns: 
  • 'Information' not just 'data': there's a huge amount of valuable information outside the computer systems and networks, not least in workers' heads. That, too, is a valuable asset which deserves to be nurtured, exploited and protected. No amount of 'cyber security' is going to stop an experienced employee resigning to work for a competitor, taking loads of proprietary information with them, or blabbing about trade secrets on social media, over coffee or down the pub.
  • 'Risk' not just 'security'. Security is not inherently valuable unless it addresses risk ... and security controls are not the only way to address risks. In referring to 'cyber insurance' for instance, the report yet again over-emphasizes IT, whereas insurance plus incident management, business continuity management and other aspects would provide a more rounded, sensible, strategic approach, fundamental to which is an appreciation of the risks.
  • 'Management', as in systematically planning, directing, monitoring and controlling things to achieve business objectives. Fire-and-forget does not apply here: management needs to keep a close eye on developments, especially as the risks are changing rapidly around us. There are governance aspects to it too, including that point about not leaving it to IT!
An 'information risk management strategy', then, has legs. We're getting somewhere!
To be clear, my beef is not just with the semantics. Frequent and widespread reference to 'cyber security' and related neologisms doesn't make it right. It is too specific, too narrow to address the real issues, bordering on being a dangerous diversion. It's a bit like the distinction between 'global warming' and 'climate change'. They are strongly related concepts, of course, but need to be handled differently in practice. There's more to climate change than the Earth warming up a bit.
On a positive note, I’m pleased to see the report state: "Ensure all your staff have regular cyber awareness training, building it into induction processes and ensure your people are a robust and secure first line of defence." Personally, I’d have preferred the term “continuous information risk and security awareness” to counteract the obsessive focus on both 'cyber' and 'security', and to draw the distinction between awareness and training. They are complementary approaches with different objectives and methods. If that's unclear, take a good look at NIST SP800-50 "Building an Information Technology Security Awareness and Training Program" or Rebecca Herold's "Managing an Information Security and Privacy Awareness and Training Program".
Categories: NoticeBored

NBlog November 16 - color-coding awareness

NoticeBored - Wed, 11/15/2017 - 8:51pm
Looking back, I see that I've blogged quite a few times in different contexts about color.
For example, most of the security metrics I discuss are colored, and color is one of several important factors when communicating metrics, drawing the viewer's eye towards certain aspects for emphasis. 
We talk of white hats and black hats, red teams and so on.
Traffic light RAG coloring (Red-Amber-Green) is more or less universally understood to represent a logical sequence of speed, intensity, threat level, concern or whatever - perhaps an over-used metaphor but effective nonetheless. Bright primary colors are commonly used on warning signs and indications, sometimes glinting or flashing for extra eye-catchiness.
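The RAG metaphor translates directly into metrics reporting. Here's a minimal sketch of banding a security metric into traffic-light colors - the thresholds and the phishing click-rate example are purely illustrative, not prescribed values:

```python
def rag_status(value, amber_threshold, red_threshold):
    """Map a metric value onto the Red-Amber-Green scale (higher = worse here)."""
    if value >= red_threshold:
        return "Red"
    if value >= amber_threshold:
        return "Amber"
    return "Green"

# e.g. percentage of staff who clicked a simulated phishing link
print(rag_status(4, amber_threshold=10, red_threshold=25))   # Green
print(rag_status(17, amber_threshold=10, red_threshold=25))  # Amber
print(rag_status(30, amber_threshold=10, red_threshold=25))  # Red
```

The point of the banding is the eye-catch: a sea of green with one red cell draws the viewer straight to the exception, which is exactly what a metrics dashboard should do.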
Red alert is a pleonasm!
Jeff Cooper, father of the "modern technique" of handgun shooting, raised the concept of Condition White, the state of mind of someone who is totally oblivious to a serious threat to their personal safety. Cooper's Color Code is readily adapted to the information risk and security context, for example in relation to a worker's state of alertness and readiness for an impending hack, malware infection or social engineering attack. We're currently exploring and expanding on that idea as part of December's awareness briefing for professionals on social engineering.
Categories: NoticeBored

NBlog November 15 - ethical social engineering for awareness

NoticeBored - Tue, 11/14/2017 - 1:10pm
Security awareness involves persuading, influencing and you could say manipulating people to behave differently ... and so does social engineering. So could social engineering techniques be used for security awareness purposes?
The answer is a resounding yes - in fact we already do, in all sorts of ways.  Take the security policies and procedures, for instance: they inform and direct people to do our bidding. We even include process controls and compliance checks to make sure things go to plan. This is manipulative.

Obviously the motivations, objectives and outcomes differ, but social engineering methods can be used ethically, beneficially and productively to achieve awareness. Exploring that idea even reveals some novel approaches that might just work, and some that are probably best avoided or reversed.

Social engineering methods, techniques and approaches, each followed (→) by its security awareness & training equivalent:
  • Pretexting: fabricating plausible situations → Case studies, rôle-plays, scenarios, simulations, tests and exercises
  • Plausible cover stories, escape routes, scorched earth, covering tracks → ‘What-if’ scenarios, worst-case risk analysis, continuity and contingency planning
  • Persuading, manipulating, using subconscious, visual, auditory and/or behavioral cues such as body language, verbal phrasing and emphatic timing → Apply the methods and techniques used in education, marketing and advertising (e.g. branding disparate awareness materials consistently to link them together)
  • Deceiving/telling lies, making false promises, masquerading/mimicry, fitting-in, going undercover, building the picture, putting on a persona or mask (figuratively speaking), acting and generally getting-in-character → Emphasize the personal and organizational benefits of being secure; “self-phishing” and various other vulnerability/penetration tests
  • Distracting, exploiting confusion/doubt to slip through, doing the unexpected → Develop subtle underlying themes and approaches (such as ethics, a form of self-control) while ostensibly promoting more obvious aspects (such as compliance)
  • Appealing to greed/vanity, charming, flirting → Emphasize the positives, identify and reward secure behaviors
  • Playing dumb, appealing for assistance → Audience-led awareness activities e.g. a workshop on “What can we do to improve our record on malware incidents?”
  • Exploiting relationships, trust and reliance → Collaborating with other corporate functions such as risk, HR, compliance, health & safety etc. on joint or complementary awareness activities
  • Empathizing, befriending, establishing trust, investing time, effort and resources → Being realistic about timescales, setting suitable expectations, and anticipating and planning for long-term ‘cultural’ changes taking months and years rather than days and weeks to occur
  • Exploiting reputation and referrals from third parties (transitive trust) → Gather and exploit metrics/evidence of the success of awareness activities
  • Claiming or presenting false or exaggerated credentials, using weak credentials to obtain stronger ones → Do the opposite i.e. study for qualifications in information security and/or adult education
  • Assertiveness, aggression, 'front', cojones, brazen confidence, putting the victim on the back foot or catching them off-guard → Be more creative, adopting or developing unusual, surprising, challenging and perhaps counter-cultural awareness activities
  • Creating and using urgency and compulsion to justify bypassing controls → (Over?) Emphasizing ‘clear and present dangers’ (within reason!)
  • Bypassing, sidestepping or undermining controls → Addressing individuals and teams directly, regardless of hierarchies and norms
  • Exploiting management/support overrides → Using managers, auditors and other authority figures as communications vehicles
  • Puppetry, persuading others to do our bidding (possibly several layers deep) → ‘Train-the-trainer’! Develop and support a cadre of security friends/ambassadors. Gain their trust and favor. Involve them proactively.
  • Fast/full-frontal/noisy or slow/gradual attrition/blind-side/silent attacks, or both! → Focus on a series of discrete topics, issues or events, while also consistently promoting longer-term themes
  • Mutuality, paying a debt forward (e.g. if I give you a gift, you feel indebted to me) → Give rewards and gifts, “be nice” to your audience, respect their other business/personal interests and priorities
  • Targeting the vulnerable, profiling, building a coherent picture of individual targets, researching possible vulnerabilities and developing novel exploits → Working on specific topics for specific audiences e.g. following up after security incidents, systematically identifying and addressing root causes
  • Shotgunning (i.e. blasting out attacks indiscriminately to hook the few who are vulnerable) and snipering (e.g. spear phishing) → Combining general-purpose awareness materials plus targeted/custom materials aimed at more specific audiences
  • Pre-planned & engineered, or opportunistic attacks (carpe diem), or both! → Planned awareness program but with ‘interrupts’ (see below)
  • Dynamic, reactive/responsive attacks, turning the victim on himself, not entirely pre-scripted/pre-determined, being alert and quick-witted enough to grasp opportunities that arise unexpectedly → Spotting and incorporating recent/current security incidents, news etc., including business situations and changes, into the awareness program
  • Con-man, con-artist, fraudster, sleight-of-hand, underhand, unethical, selfish, goal-oriented, covertly focused → Do the opposite i.e. be very open and honest, sharing the ultimate goals of the awareness program
  • Using/replaying insider information and terminology obtained previously → Referring back to issues covered before, and ‘leaving the door open’ to come back to present issues later on; re-phrasing old stuff and incorporating new information
  • Systematically gathering, combining, analyzing and exploiting information → Systematically gather, analyze and use metrics (measures and statistics) on awareness levels and various other aspects of information security
  • Exploiting technical, procedural and humanistic vulnerabilities → Work on policies, procedures, practices and attitudes, including those within IT
  • Multi-mode, blended or contingent attacks e.g. combining malware with social engineering, plus hacking if that is appropriate to get the flag → True multimedia e.g. written/self-study materials, facilitated presentations/seminars, case studies, exercises, team/town-hall/brown-bag meetings, videos, blogs, system messages, corridor conversations, posters, quizzes, games, classes, security clubs, Learning Management Systems, outreach programs …
Categories: NoticeBored

NBlog November 14 - 50 best infosec blogs

NoticeBored - Mon, 11/13/2017 - 7:22pm
I'm delighted that this blog has been featured among the 50 Best Information Security Blogs. Fantastic! Thank you, top10vpn.com ... and congrats to the other top blogs on the list, many of which I read and enjoy too. It's humbling to be among such august company.
We update this blog frequently in connection with the security awareness materials we're preparing, on security awareness techniques in general, or on hot infosec topics of the day. Blogging helps get our thoughts in order and expand on the thinking and research that goes into the NoticeBored modules. More than just an account of what's going on, updating the blog (including this very item) is an integral part of the production process.
A perennial theme is that it's harder than it appears to do security awareness properly. Anyone can scrabble together and push out a crude mishmash of awareness content (typically stealing or plagiarizing other people's intellectual property - tut tut) but if they don't really appreciate what it all means, nor how to apply the principles of awareness, training and adult education, they are unlikely to achieve much. It's all too easy to add to the clutter and noise of modern life, more junk than mail.
Simply understanding what awareness is intended to achieve is a challenge for some! As I blogged the other day, being aware is not the ultimate goal, just another step on the journey - a crucial distinction. 


It could be said that this lack of understanding, rather than the usual lame excuse - lack of funds - is the main reason that security awareness programs falter or fail. I'm sure there are many other reasons too:
  • Lack of creativity: people gradually tune-out of dull, uninspiring approaches and come to ignore the same old same old (they get Bored of the Notices). If all the awareness program ever blabbers on about is compliance, privacy and phishing, over and over like a cracked record, don't be surprised if the audience nods off or slips quietly away for something more stimulating;
  • Poor quality communications: a lot of this stuff is technical and complex, so there's an art to explaining it in terms that resonate with the audience. Simply writing and drawing things professionally takes skill, effort and practice, and time (perhaps our most valuable resource). A perfectionist by nature, I cringe when I look back at some of the awareness content we first delivered when we launched this service, or for that matter when I see a simple typo in this blog or an error in something we delivered just last month. I hope I never stop learning and improving;
  • Lack of skills and competencies: I hinted at this just a moment ago. Awareness is an interpersonal/human activity, while information security is mostly about the technology. Spot the difference! Few cybersecurity professionals, in particular, are comfortable, let alone competent at relating to ordinary non-tech people. Disparagingly and dismissively referring to them as "users" is a massive clue about a lack of respect. Even presidents need to appreciate the importance of earning and retaining the trust and support of the people. I've blogged about innovative approaches such as operant conditioning and treating security awareness as a (beneficial!) form of social engineering;
  • Limited or waning support, particularly from influential managers and other individuals. Awareness is a cultural issue, hence the tone at the top can underpin or undermine it;
  • Naive, superficial approaches with a preponderance of childish cartoons, games and trivia. Having fun is appropriate in moderation but some of this stuff is deadly serious and should not be taken too lightly;
  • Weak or absent awareness metrics: if it's uncertain whether the awareness program is or is not having a positive effect on the organization, creating more value than it expends, then don't be surprised at lackluster support from management and limited funding (as I said, a lame excuse: rather than just bemoaning the fact, ask why the budget is inadequate, then work hard to address the reasons);
  • Lack of focus and purpose: in the corporate context, security awareness has to support the achievement of the organization's business objectives, otherwise it's irrelevant, unhelpful and doomed. Awareness is best designed-in as an integral part of the information risk and security machinery, greasing the cogs and oiling the bearings as it were;
  • Conversely, there's myopia: intense focus on too narrow a field of view, ignoring or failing to address the wider issues, not least how information risk and security concerns the organization, its business and its people. It's really not hard to think up dozens of potential topic areas, turning a creative awareness program into something much richer and more vibrant than the norm. Just lose the blinkers;
  • Irrelevance: a tricky one, this, given the diversity of the intended audiences and the topics. People are unlikely to be equally interested in every awareness item, yet others may benefit, hence the need for a spectrum, a mixture of ingredients that, together, bake a tasty cake;
  • Lack of direction: where are we going with this? Good question! This blog meanders from side to side, even glancing off at tangents sometimes, but generally it tends back towards the middle ground: awareness is an essential and valuable means of mitigating information risks. Thinking about your awareness program, do you have a crystal clear vision of what it is intended to achieve, and how it is going to do that? What's your cunning plan?
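On the metrics point above: even a very simple, honestly measured trend gives management something concrete to judge the program by. A sketch using fabricated phishing-simulation figures (the campaign names and counts are made up for illustration):

```python
def click_rate(clicked, targeted):
    """Percentage of targeted staff who clicked in a simulated phishing exercise."""
    return 100.0 * clicked / targeted

# Fabricated quarterly results: (label, staff who clicked, staff targeted)
campaigns = [("Q1", 62, 400), ("Q2", 48, 410), ("Q3", 33, 395)]
rates = [(label, round(click_rate(c, t), 1)) for label, c, t in campaigns]
print(rates)

# A steadily falling click rate suggests the awareness program is biting
improving = all(earlier[1] > later[1] for earlier, later in zip(rates, rates[1:]))
print("Improving trend:", improving)
```

A falling line on a chart, quarter on quarter, answers the "is it working?" question far more persuasively than anecdote, and makes the budget conversation an evidence-based one.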
Anyway, I encourage you to browse all 50 of the best infosec blogs and track the ones that appeal to your imagination. Part of the fun of securing information is that it is a complex and dynamic enterprise. We need all the help and inspiration we can get!
Categories: NoticeBored

NBlog November 13 - a rich seam

NoticeBored - Mon, 11/13/2017 - 3:24pm
So much of human interaction involves techniques that could legitimately be called social engineering that we're spoilt for choice on the awareness front for December.  
December's topic exemplifies the limitations of "cybersecurity" with its myopic focus on IT and the Internet. Social engineers bypass, undermine or totally ignore the IT route with all its tech controls, and that's partly what makes them such a formidable threat. 
IT may be a convenient mechanism for identifying, researching and communicating with potential victims, for putting on the appearance of legitimate, trustworthy individuals and organizations, and for administering the scams, but it's incidental to the main action: fooling the people.
Maybe it's true that you can't fool all of the people all of the time, depending on precisely what is meant by 'all'. I think it's fair to say that we are all (virtually without exception) prone, predisposed or vulnerable to social engineering of one form or another. We can't help it: social interaction is genetically programmed into us and reinforced throughout our lives from the moment we're born, or even before. Some expectant mothers report their babies respond to the music and other sounds around them. A newborn baby probably recognizes its mother's and other familiar voices and sounds immediately. To what extent it trusts or could be fooled by them is a separate issue though!
The idea that we are inherently vulnerable, while powerful, is only part of the story. We're also inherently capable of social engineering. We have the capacity, the tools and capabilities to influence and manipulate others to varying extents. Again, that newborn baby is sending out an avalanche of signals to humans in the area, from the moment of its first gasp and cry. The communications may be non-verbal but they are loud and clear!
Categories: NoticeBored

NBlog November 10 - one step at a time

NoticeBored - Thu, 11/09/2017 - 10:37pm

This colorful image popped onto my screen as I searched our stash of security awareness content for social engineering-related graphics. It's a simple but striking visual expression of the concept that security awareness is not the ultimate goal, but an important step on the way towards achieving a positive outcome for the organization. 
A major part of the art of raising awareness in any area is actively engaging with people in such a way that they think and behave differently as a result of the awareness activities. For some people, providing cold, hard, factual information may be all it takes, which even the most basic awareness programs aim to do. That's not enough for the majority though: most of us need things to be explained to us in terms that resonate and motivate us to respond in some fashion. In physical terms, we need to overcome inertia. In biology, we need to break bad habits to form better ones.
Social engineering is a particular challenge for awareness since scammers, fraudsters and other social engineers actively exploit our lack of awareness or (if that fails) subvert the very security mechanisms we put in place. "Your password has expired: pick a new one now to avoid losing access to your account!" is a classic example used by many a phisher. It hinges on tricking victims into accepting the premise (password expired) at face value and taking the easy option, clicking a link that leads them to the phisher's lair while thinking they are going to a legitimate password-change function. Our raising awareness of the need to choose strong passwords may be counterproductive if employees unwittingly associate phishing messages with user authentication and security!
Part of our awareness approach in December's NoticeBored materials on social engineering will be to hook-in to our natural tendency to notice something amiss, something strange and different. Humans are strong at spotting patterns at a subconscious level. For instance, did you even notice the gradation from red to green on the ladder image? That was a deliberate choice in designing the image, a fairly crude and obvious example ... once it has been pointed out anyway! See if you can spot the other, more subtle visual cues (and by all means email me to see what you missed!). 
Those occasional flukes we call "coincidences" hold an extra-special significance for us, popping into our conscious thoughts in a remarkable way. As we are routinely bombarded with information through our five senses, pattern recognition is an efficient way to interpret the information flow in relation to our prior experience and expectations (in 'normal' situations), and to identify new or different patterns (something 'abnormal' and perhaps threatening). In the jungle, such a difference might alert us to a well-camouflaged lion lurking among the grasses, a potentially harmful item of food that smells rotten, or the howl of a pack of hyenas closing in. Especially when there's precious little time to react, and failing to respond may be life-threatening, reflexes can literally save our skins. 
There are some reflexive aspects to security awareness concerning information security incidents or crises that threaten our personal safety. Mostly, though, we must supplement reflexes with learned behaviors. Awareness starts by pointing out dangers and encouraging/promoting particular responses in a deliberate, conscious way ... but through repetition, rehearsal and reinforcement we aim to make even learned responses subconscious - quick and automatic, similar to true reflexes.
I'm currently working up a suite of 'scam busters' - leaflets that describe different scams, frauds and social engineering attacks (providing information), and explain how to bust or avoid them (motivational guidance and advice, a 'call to action' you could say). Each scam buster fits on a single page, including a distinctive image that, we hope, will catch the eye and pop into the person's memory if they find themselves facing the situations described, or rather variants thereof. I'm in two minds about providing an example of each scam on the other side of the page: sometimes less is more, but briefly describing actual social engineering incidents might help bring home the point that these are genuine, real-world threats, not just theoretical concerns. Some readers will barely skim the front page, others may enjoy reading and thinking on. Either way, it's a win for security awareness.   
Categories: NoticeBored

NBlog November 7 - pipes and bikes

NoticeBored - Tue, 11/07/2017 - 12:38am
The past few days have been very successful.  
Yesterday, at last, I fixed the pipe feeding water to the stock tanks in the nick of time before the animals went thirsty, a mammoth job for this long-time office worker (!). 
The pipe is an old galvanized steel pipe, laid when this was a working farm, well before it became a pine forest. An ancient Lister diesel engine and piston pump sends water in two directions, either to the house tanks or to the stock tanks. 
The house line was fine, luckily, but the stock line wasn't, and evidently hadn't been maintained in a long time. Just getting to the start of the line across the stream was a mission, with a 60-degree muddy incline going up about 8m, then a strip of native bush, then the pines ... which had been toppled by a cyclone back in April. 
What would once have been just forest is now a forest clearing with a few hundred near full-sized trees laying on the ground, toppled like the matchsticks some of them were destined to become. 
Spurred on by the falling firs, the vicious NZ bramble seized the opportunity to flourish in the Spring sunshine, forming man-eating bramble patches a few metres high and several metres across the hillside.  Here's the easy bit at the bottom of the hill after a day or two's clambering, de-brambling and chainsawing ...


It has taken several days spread over several weeks to cut back the brambles to locate the pipeline as it climbs out of the gulley where the stream flows, chainsaw the fallen firs off the line, then replace the munted (broken) bits of pipe with modern high-density polythene pressure pipe and fittings. Last evening, I was elated to hear the sound of water flowing into the stock tanks above the paddocks where the now-thirsty sheep and cattle live. 
Don't tell anyone, but I did a little dance. 
On Saturday I took a motorbike training course, for which I received a pin badge and certificate that will hopefully reduce the cost of my bike insurance. I guess I can put "GCR" after my name too!
So now it's back to the office, replacing physical with mental effort as we crack on with the next awareness module, covering social engineering.  More on that tomorrow.
Categories: NoticeBored

NBlog November 3 - audit sampling (LONG)

NoticeBored - Thu, 11/02/2017 - 4:35pm
[This piece was stimulated by a question on the ISO27k Forum about ISO27k certification auditors checking information security controls, and a response about compliance audit requirements. It's a backgrounder, an essay or a rant if you like. Feel free to skip it, or wait until you have a spare 10 mins, a strong coffee and the urge to read and think on!]
“Sampling” is an important concept in both auditing and science. Sampling (i.e. selecting a sample of a set or population for review) is necessary because under most circumstances it is practically impossible to assess every single member  – in fact it is often uncertain how many items belong to the set, where they are, what state they are in etc. There is often lots of uncertainty.
For example, imagine an auditor needs to check an organization’s “information security policies” in connection with an internal audit or certification/compliance audit. 

Some organizations make that quite easy by having a policy library or manual or database, typically a single place on the intranet where all the official corporate policies exist and are maintained and controlled as a suite. In a large/diverse organization there may be hundreds of policies, thousands if you include procedures and guidelines and work instructions and forms and so forth. Some of them may be tagged or organized under an “information security” heading, so the auditor can simply work down that list … but almost straight away he/she will run into the issue that information security is part of information risk is part of risk, and information security management is part of risk management is part of management, hence there should be lots of cross-references to other kinds of policy. A “privacy policy”, for instance, may well refer to policies on identification and authentication, access control, encryption etc. (within the information security domain) plus other policies in areas such as accountability, compliance, awareness and training, incident management etc. which may or may not fall outside the information security domain depending on how it is defined, plus applicable privacy-related laws and regulations, plus contracts and agreements (e.g. non-disclosure agreements) … hence the auditor could potentially end up attempting to audit the entire corporate policy suite and beyond! In practice, that’s not going to happen.
In many organizations, the job would be harder still because the policies etc. are not maintained as a coherent suite in one place, but are managed by various parts of the business for various purposes in various formats and styles. On top of that, ‘policy lifecycle management’ is an alien concept to some organizations, hence even the basics such as having a defined owner, an ‘issued’ or ‘effective from’ date, a clear status (e.g. draft, exposure draft, issued and current, withdrawn) etc. may not be there. Simply getting hands on copies of current policies is sometimes tricky, making it hard to determine how many policies there are, where they are, who owns them, whether they are current, whether they have been formally sanctioned or mandated or whatever.
Note: there could be several ‘audit findings’ in these circumstances, particularly the latter, before the auditor has even started reviewing a single policy in detail!
Scope concerns are emerging already: are ‘compliance policies’ part of the ‘information security policies’ that were to be checked? What about ‘business continuity policies’ or ‘health and safety policies’? What about the ‘employee rulebook’, oh and that nice little booklet used by the on-boarding team in the depths of HR in a business unit in Mongolia? What about a key supplier’s information security policies …? Information is a vital part of the entire business, the entire supply chain or network in fact, making information risk and security a very broad issue. An audit can’t realistically cover “everything” unless it is deliberately pitched at a very high level – in which case there would be no intent to delve deeply into each and every policy.
The next issue to consider is the time and resources available for the audit. Audits are inevitably constrained in practice: usually there is an audit plan or schedule or diary for each audit within the period (often several years), and auditors are in short supply, especially in specialist areas where deep technical knowledge is needed (e.g. tax, information security, risk, health and safety, engineering …).
Another issue is the depth and detail of the audit checks or tests or assessments or reviews or whatever you call them. I could spend hours poring over and painstakingly picking apart a relatively simple website privacy policy in great detail, digging out and checking all the external references (plus looking for any that are missing), exploring all the concerns (and the plus points too: I strive to be balanced and fair!), writing up my findings and perhaps elaborating on a set of recommended improvements. Add on the time needed to initiate and plan the audit, contact the business people responsible, schedule interviews and meetings, complete the internal quality assurance, discuss the draft findings and report, and close the audit, and the whole thing could easily consume a week or three – auditing a single, simple policy in depth. It would need to be an unusually valuable audit to justify the expense, since I could have spent my time on other, more worthwhile audit work instead (an opportunity cost). 
Yet another relevant matter is how the auditors go about sampling, the sampling rationale or technique or method. Again, there are lots of possibilities e.g. random sampling, stratified sampling, sampling by exception, pragmatic sampling, dependent sampling etc. The auditors might pick out a couple of items at each level in the policy pyramid, or all the information security policies released within the past six months, or every one produced by the Information Risk and Security Management function at HQ, or every one with a “C” or a “D” in the title, or all those on a pre-compiled shortlist of ‘dubious quality, worth a look’, or all those that explicitly reference GDPR, or whatever. Rather than all, they might pick ‘the top 10%’ by some criterion, or ‘the bottom 10%’ or whatever. They might simply start with whatever policies are most readily available, or whichever ones happen to catch their eye first, and then ‘go from there’, following a trail or a contingent sequence that arises naturally in the course of the initial reviews. The auditors' nose often leads the way.
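To make those sampling rationales concrete, here is a minimal sketch of three of them - simple random, stratified and by-exception sampling - applied to a hypothetical policy register. The policy names, domains and review dates are invented purely for illustration; a real register would of course be richer and messier.

```python
import random

# Hypothetical policy register: (name, domain, year last reviewed)
policies = [
    ("Privacy policy", "infosec", 2016),
    ("Access control policy", "infosec", 2017),
    ("Encryption policy", "infosec", 2015),
    ("Business continuity policy", "continuity", 2014),
    ("Health & safety policy", "safety", 2017),
    ("Acceptable use policy", "infosec", 2013),
]

def random_sample(items, n, seed=42):
    """Simple random sampling: every item has an equal chance of selection."""
    rng = random.Random(seed)  # fixed seed so the audit sample is reproducible
    return rng.sample(items, n)

def stratified_sample(items, key, n_per_stratum, seed=42):
    """Stratified sampling: pick n items from each stratum (here, each domain)."""
    rng = random.Random(seed)
    strata = {}
    for item in items:
        strata.setdefault(key(item), []).append(item)
    picked = []
    for group in strata.values():
        picked.extend(rng.sample(group, min(n_per_stratum, len(group))))
    return picked

def exception_sample(items, predicate):
    """Sampling by exception: take everything matching a risk criterion."""
    return [item for item in items if predicate(item)]

print(random_sample(policies, 2))
print(stratified_sample(policies, key=lambda p: p[1], n_per_stratum=1))
print(exception_sample(policies, lambda p: p[2] < 2016))  # stale policies
```

The by-exception rationale is the one closest to 'the auditors' nose': the predicate encodes whatever criterion (staleness, a "C" or "D" in the title, a GDPR reference) the auditor suspects will surface the dubious items.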
In my experience, surprisingly few audits are conducted on a truly scientific basis, using sound statistical techniques for sampling and data analysis. It’s fairly unusual for the sampling rationale even to be formally considered and documented, except perhaps as a line or two of boilerplate text in the audit scoping and planning documentation. Usually, the auditors and/or their managers and audit clients come to an informal arrangement, or simply ‘get on with it and see how it goes’, relying on the auditors’ experience and preference. For sausage-machine audits that are repeated often (e.g. certification audits), the sampling rationale may be established by convention or habit, perhaps modified according to the particular circumstances (e.g. an initial infosec policy audit at a new client might seek first to assess the entire policy suite at a high level, with more in-depth audits in specific areas of concern in later audits; an audit at a small local firm might sample just 1 or 2 key policies, while auditing a global conglomerate might involve sampling 10 or more).
Finally, there’s a sting in the tail. All sampling entails risk. The auditors are trying to determine the characteristics of a population by sampling a part of it and generalizing or extrapolating the results to the whole. If the sample is not truly representative, the conclusions may be invalid and misleading, possibly quite wrong. More likely, they will be related in some fashion to the truth … but just how closely related we don’t normally know. There are statistical techniques to help us determine that, if we have taken the statistical approach, but even they have assumptions and uncertainties, which means risk. Furthermore, the evidence made available to the auditors varies in terms of its representativeness. Sensible auditors are quite careful to point out that they can only draw conclusions based on the evidence provided. So not only are they practically unable to conduct 100% sampling, the sample itself might not be truly representative, hence they may miss material facts, hence an audit “pass” does not necessarily mean everything is OK!  Most formal audit reports include some boilerplate text to that effect. That is not just a ‘get out of jail free’ card, an excuse or an attempt to gloss over audit limitations: there is a genuine issue underneath to do with the audit process. It’s reminiscent of the issue that we can identify, assess and quantify various kinds of information risk, but we can’t prove the absence of risk. We can say things are probably safe and secure, but we can never be totally certain of that (except in theoretical situations with specific assumptions and constraints). Same thing with audits.
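One of those statistical techniques can be sketched in a few lines. Suppose an auditor finds 3 non-compliant policies in a sample of 20: the standard normal-approximation confidence interval for a proportion (an assumption - it is rough for small samples) quantifies just how wide the plausible range for the true population rate is. The figures below are purely illustrative.

```python
import math

def proportion_confidence_interval(failures, sample_size, z=1.96):
    """Approximate 95% confidence interval (normal approximation) for the
    true population non-compliance rate, given failures seen in a sample."""
    p = failures / sample_size
    margin = z * math.sqrt(p * (1 - p) / sample_size)
    # Clamp to [0, 1]: a proportion cannot be negative or exceed 100%
    return max(0.0, p - margin), min(1.0, p + margin)

low, high = proportion_confidence_interval(3, 20)
print(f"Observed 15% non-compliance; the true rate plausibly lies "
      f"between {low:.0%} and {high:.0%}")
```

With a sample of only 20, the interval spans roughly 0% to 31% - a vivid reminder that the observed 15% is an estimate, not a fact about the whole population, which is exactly the sampling risk the boilerplate caveats are pointing at.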
Categories: NoticeBored

NBlog November 1 - privacy & GDPR update

NoticeBored - Wed, 11/01/2017 - 5:01pm

We have revised and re-issued the privacy awareness module with a particular focus on the General Data Protection Regulation this time around.
GDPR is a major shake-up in European privacy laws with global implications. Does your organization know what’s coming? Do you understand the implications? Will you (your employees, IT systems, policies, procedures and websites) plus your suppliers and business partners, be ready by May 2018?
One of six new high-res poster images provided in November's NoticeBored module

Bringing workers up to speed on privacy through awareness and training is an essential part of business for all organizations. Persuading everyone to take care of the personal information they handle means more than just informing them about their compliance obligations: they need to be sufficiently motivated to change their ways.
The break-glass poster is meant to catch the eye and make people think. It's not literal, of course, but every organization should have a suitable process in place to handle reporting of privacy breaches plus other incidents and near-misses. Encouraging people to report issues is one of the objectives of the awareness materials. The 72-hour breach reporting deadline under GDPR will be challenging even for organizations that have a strong approach to privacy. For those with low awareness, it may prove impossible.
Taking that idea a step further, in addition to poster graphics, NoticeBored subscribers have the benefit of seminar slide-decks and briefings, FAQs, policy and procedure templates and the usual range of goodies designed to make it easy to raise awareness in this important area. It's basically a privacy and GDPR awareness kit.
So what about you? What are you doing in the way of awareness on privacy, GDPR and compliance? If awareness is just another thing on your lengthy to-do list, get in touch, preferably well before May 25th 2018 - urgently if your organization is blissfully unaware of what's coming. Management-level awareness is the key to making stuff happen. Let us help you with that.
Categories: NoticeBored

NBlog October 31 - spooky happenings in NZ

NoticeBored - Mon, 10/30/2017 - 8:48pm

Last night as darkness draped itself across the IsecT office, an eerie silence descended. No more tippy tappy on the keyboards, the writing finished, our job almost done for another month - the end of another chapter. 
A fantastic horror/thriller on the movie channel delivered the perfect stress antidote, a different kind of tension entirely. More poppycock than Hitchcock, but fun nevertheless.
Today we've packaged up November's privacy awareness materials, just under 100 megs of it, ready to deliver to our subscribers, and updated the website with details of the new module. My energy sapped, even strong coffee has lost its potency. It's time for a break! I'll have a bit more to say about the module tomorrow, if I evade the demons and survive the night that is.
Categories: NoticeBored

NBlog October 30 - polish til it gleams

NoticeBored - Sun, 10/29/2017 - 10:59pm
Today we're busy finalizing the privacy awareness materials for delivery to subscribers imminently. It is always a bit fraught at this time of the month as the deadline looms but things are going well this time around - no IT hardware failures or other crises at least. 
The new materials are proofread and gleaming, ready to package up and upload as soon as the poster graphics come in. I even managed a few hours off yesterday to visit friends at the radio club. Luxury!
We'll have a bit of a break before starting the next awareness module on social engineering, long enough hopefully to repair a broken pipe supplying water for the animals. I've been patiently chainsawing fallen pine trees out of the way for some while now, finding three breaks in the pipe so far. The stock water tanks have nearly run dry so it's a priority to fix the breaks, pump the water and finish the job. Our contingency plans involve carting water around in portable containers or getting a tanker delivery direct to the tanks, not exactly ideal with temperatures starting to climb towards summer, and a pregnant 'house cow' due to give birth any day now. 
We'll update the NoticeBored website soon too with details of the privacy module, taking the opportunity to make a few other changes while we're at it. I need to update ISO27001security.com as well, incorporating some additional materials kindly donated for the ISO27k Toolkit. It's all systems go here!
Categories: NoticeBored

NBlog October 29 - peddling personal data

NoticeBored - Sat, 10/28/2017 - 6:53pm
Earlier this month, I blogged about personal data being valuable and hence worth protecting like any asset. But what about commercial exploitation such as selling it to third parties? Is that OK too?
Some companies find it perfectly acceptable to hoover up all the personal information they can to use or sell to third parties, whereas others take a more conservative and (to my mind) ethical position, limiting personal data collection, using it for necessary internal business activities and refusing to sell or disclose it further (not even to the authorities, in the case of Apple). 
The EU position on this is clear: personal information belongs to the people, not the corporations. Since privacy is a fundamental human right, people must retain control over their personal information, including the ability to limit its collection, accuracy, use and disclosure. 
The US position is ambiguous, at best. Efforts to tighten-up US laws around privacy and surveillance have been lackluster so far, often being stalled or knocked back by those same tech companies that are busy profiting from personal information, or by the spooks.
With the battle lines drawn up, once GDPR comes into effect next May the charge is on. Privacy and unrestricted commercial exploitation of personal information are essentially incompatible, so something has to give. We've already witnessed the failure of a half-baked attempt at self-regulation (Safe Harbor) and it seems Privacy Shield is also faltering. What next?
One possibility is a commercial response, where organizations increasingly decline doing business with US corporations that openly exploit and fail to protect personal information. That, coupled with the massive fines under GDPR, might finally drive home the message where it hurts them most: the bottom line. 
As Rana Foroohar from the Financial Times puts it: "Privacy is a competitive advantage. Technology companies may have to say whether they are data peddlers or data stewards." Personally, I don't see it as quite such a black-and-white issue, with plenty of room between those extremes.
A key issue, for me, is that matter of personal choice: we deliberately choose to give up some elements of our privacy under some circumstances, and that's fine provided we are fully informed and voluntarily accept the implications - two of the requirements under GDPR. What's unacceptable, to me anyway, is when my personal information is obtained sneakily and/or exploited or disclosed to third parties, without my knowledge and consent. I resent that. How about you? Perhaps it's another one of those cultural things.
Categories: NoticeBored

NBlog October 27 - Equifax cultural issues

NoticeBored - Thu, 10/26/2017 - 10:57pm

Motherboard reveals a catalog of issues and failings within Equifax that seem likely to have contributed to, or patently failed to prevent, May's breach of sensitive personal information on over 145 million Americans, almost half the population.

Although we'll be using the Equifax breach to illustrate November's awareness materials on privacy, we could equally have used it in this month's module on security culture since, according to BoingBoing:

"Motherboard's Lorenzo Franceschi-Bicchierai spoke to several Equifax sources who described a culture of IT negligence and neglect, in which security audits and warnings were routinely disregarded, and where IT staff were unable to believe that their employers were so cavalier with the sensitive data the company had amassed."

A 'culture of IT negligence and neglect' is almost the opposite of a security culture - more of a toxic culture, you could say. Workers who simply don't give a stuff about information security or privacy are hardly likely to lift a finger if someone reports issues to them, especially if (as seems likely) senior managers are complicit, perhaps even the source of the toxin. Their lack of support, leadership, prioritization and resourcing for the activities necessary to identify and address information risks makes it hard for professionals, staff members and even management colleagues who do give a stuff ... and that's why we are determined to help organizations educate and motivate management through security awareness and training materials written for that specific audience, not just staff.
In case it's not crystal clear already, consider this. Which of these do you think would have a better grip on its information risks, privacy and other compliance imperatives:
  • An organization whose security-aware managers understand the issues, proactively supporting, encouraging and leading the associated information risk management activities, or 
  • One whose managers pay this no attention whatsoever, perhaps even actively undermining any attempts to deal with the issues? 
To establish and nurture your security culture, subscribe to NoticeBored for a fresh, all-inclusive, creative approach to security awareness. Don't be the next Equifax. Or Sony. Or Target. Or NSA. Or ... just another depressing statistic.
Categories: NoticeBored