NBlog September 19 - what is 'security culture'?

NoticeBored - Mon, 09/18/2017 - 3:01pm
For some while now, I've been contemplating what security culture actually means, in practice. 
Thinking back to the organizations in which I have worked, they have all had it to some extent (otherwise they probably wouldn't have employed someone like me!) but there were differences in the cultures. What were they?
Weaknesses in corporate security cultures are also evident in organizations that end up on the 6 o'clock news as a result of security and privacy incidents. In the extreme, the marked absence of a security culture implies more than just casual risk-taking. There's a reckless air to them with people (including management - in fact managers in particular) deliberately doing things they know they shouldn't, not just bending the rules and pushing the boundaries of acceptable behavior but, in some cases, breaking laws and regulations. That's an insecurity culture!
The strength of the security culture is a relative rather than absolute measure: it's a matter of degree. So, with my metrics hat on, what are the measurable characteristics? How would we go about measuring them? What are the scales? What's important to the organization in this domain?
A notable feature of organizations with relatively strong security cultures is that information security is an endemic part of the business - neither ignored nor treated as something special, an optional extra tacked-on the side (suggesting that 'information risk and security integration' might be one of those measurable characteristics). When IT systems and business processes are changed, for instance, the information risk, security and related aspects are naturally taken into account almost without being pushed by management. On a broader front, there's a general expectation that things will be done properly. By default, workers generally act in the organization's best interests, doing the right thing normally without even being asked. Information security is integral to the organization's approach, alongside other considerations and approaches such as quality, efficiency, ethics, compliance and ... well ... maturity.  
Maturity hints at a journey, a sequence of stages that organizations go through as their security culture emerges and grows stronger. That's what October's NoticeBored security awareness content will be addressing, promoting good practices in this area. Today I'll be exploring and expanding on the maturity approach, drawing conceptual diagrams and thinking about the governance elements. What would it take to assemble a framework facilitating, supporting and strengthening the corporate security culture? What are the building blocks, the foundations underpinning it? What does the blueprint look like? Who is the architect?
Where does one even start? 
I've raised lots of rhetorical questions today. Come back tomorrow to find out if we're making progress towards answering any of them! 
Categories: NoticeBored

NBlog September 15 - symbolic security

NoticeBored - Thu, 09/14/2017 - 10:13pm

An article bemoaning the lack of an iconic image for the field of “risk management” (e.g. the insurance industry) applies to information risk and security as well. We don’t really have one either. 
Well maybe we do: there are padlocks, chains and keys, hackers in hoodies and those Anonymous facemasks a-plenty (a minute's image-Googling easily demonstrates that). Trouble is that the common images tend to emphasize threats and controls, constraints and costs. All very negative. A big downer.
Information risk and security may never be soft and cuddly ... but I'm sure we can do more to distance ourselves from the usual negative imagery and perceptions. I really like the idea of information security being an enabler, allowing the organization to do stuff (business!) that would otherwise be too risky. So I'll be spending idle moments at the weekend thinking how to sum that concept up in an iconic image. Preferably something pink and fluffy, with no threatening overtones.


NBlog September 13 - surveying the corporate security culture

NoticeBored - Wed, 09/13/2017 - 4:07am
Inspired perhaps by yesterday's blog about the Security Culture Framework, today we have been busy on a security culture survey, metrics being the first stage of the SCF. We've designed a disarmingly straightforward single-sided form posing just a few simple but carefully-crafted questions around the corporate security culture. 
Despite its apparent simplicity, the survey form is quite complex with several distinct but related purposes or objectives:
  • Although the form is being prepared as an MS Word document with the intention of being self-completed on paper by respondents (primarily general staff), the form could just as easily be used for an online survey on the corporate intranet, a survey app, or a facilitated survey (like shoppers being stopped in the shopping mall by friendly people with clipboards ... and free product samples to give away).
  • The survey form is of course part of our security awareness product, linking-in with and supporting the other awareness content in October's module on 'security culture', and more broadly with the ongoing awareness program.  The style and format of the form should be instantly familiar to anyone who has seen our awareness materials. 
  • A short introduction on the form succinctly explains what 'security culture' means and why it is of concern and value to the organization, hence why the survey is being carried out. I'm intrigued by the idea of positioning the entire organization as a ‘safe pair of hands’ that protects and looks after information: a reasonable objective given the effort involved in influencing the corporate security culture. Even the survey form is intended to raise awareness, in this case making the subtle point that management cares enough about the topic to survey workers' security-related perceptions and behaviors including their attitudes towards management. 
  • Conducting the survey naturally implies that management will consider and act appropriately on the results. We take that implied obligation seriously, and will have more to say about it in the module's train-the-trainer guide. The survey is more than just a paper exercise or an awareness item: respondents will have perfectly reasonable expectations merely as a result of participating.
  • The survey questions themselves are designed to gather measurable responses i.e. data on a few key criteria or aspects of 'security culture'.  We have more work to do on the questions, and even when we're done we hope our customers will adapt them to suit their specific needs (e.g. if there is an organization-wide issue around compliance, it might be worth exploring attitudes and perceptions in that area to tease out possible reasons for that).  For starters, though, the questions are extremely simple -  at face value, very quick and easy to read and answer - and yet given sufficient responses, the survey is a powerful, statistically valid and meaningful metric measuring a complex, multi-faceted and dynamic social construct. No mean feat that!
  • It would be feasible to develop further forms to survey populations other than 'general employees'. I'm thinking particularly of management and perhaps third parties: how does the corporate security culture appear from their perspectives? What concerns them? Are there issues that deserve concerted action? We may not have the time to prepare forms for October's NoticeBored module ... but we might pose that suggestion to our subscribers, again in the train-the-trainer guide.
  • Beneath each of the questions are spaces for respondents to comment, plus we encourage respondents to make their views known either on the reverse or (to maintain their anonymity) on a separate sheet, web page or email. We take the interactive approach quite deliberately and routinely because there's a lot of value to be gained by getting workers to open up a little and mention things that concern or interest them, from their perspectives and in their terms. In the particular context of the survey, we want to give respondents the opportunity to explain, expand or elaborate on the numeric responses if they feel the need. It's surprising just how powerful and insightful quotes direct from the horse's mouth can be. Pithy quotations make excellent content to illustrate and pep-up management reports and further awareness materials.
  • Mentioning 'free product samples' and 'sufficient responses' suggests the possibility of offering some sort of inducement for people to complete the survey, other than the opportunity to express their opinions and hopefully influence management. I have previously mentioned the gold-silver-bronze 'award menu' included in the Information Security 101 module: bronze level rewards would be ideal for this purpose. [Provided the anonymity aspect is addressed, a more attractive silver or gold award could be offered in, say, a prize draw: given the potential business value of the information generated by a well-designed survey, that's not a bad investment.]
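Aggregating the numeric responses is straightforward. Here's a minimal Python sketch, assuming hypothetical questions scored on a 1-5 agreement scale - the question wording, data and 'agree' threshold below are illustrative, not our actual survey content:

```python
from statistics import mean

# Hypothetical 1-5 agreement responses, keyed by survey question.
# Questions and data are made up for illustration only.
responses = {
    "Management takes information security seriously": [4, 5, 3, 4, 4],
    "I know how to report a security incident":        [2, 3, 2, 4, 3],
    "Security rules help rather than hinder my work":  [3, 3, 4, 2, 3],
}

def summarize(responses):
    """Return mean score and percent agreement (scores of 4 or 5) per question."""
    summary = {}
    for question, scores in responses.items():
        summary[question] = {
            "mean": round(mean(scores), 2),
            "percent_agree": round(100 * sum(s >= 4 for s in scores) / len(scores)),
        }
    return summary

for question, stats in summarize(responses).items():
    print(f"{stats['mean']:.2f}  {stats['percent_agree']:3d}%  {question}")
```

Given a reasonable sample size, even two numbers per question (mean score plus percent agreement) give management something trendable to act on, survey after survey.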
So there we go. All we have to show for a whole day's work is a single page survey form (oh, and this blog piece!), illustrating once again the key point I made in relation to the elevator pitch for InfoSec 101: the shortest, pithiest awareness pieces are often the hardest to prepare. Less really is more!

NBlog September 12 - Security Culture Framework

NoticeBored - Tue, 09/12/2017 - 12:58am

In preparing for our forthcoming awareness module on security culture, I've been re-reading and contemplating Kai Roer's Security Culture Framework (SCF) - a structured management approach with 4 phases.
1. Metrics: set goals and measure
Speaking as an advocate of security metrics, this sounds a good place to start - or at least it would be if SCF explored the goals in some depth first, rather than leaping directly into SMART metrics: there's not much point evaluating or designing possible metrics until you know what needs to be measured. In this context, understanding the organization's strategic objectives would be a useful setting-off point. SCF talks about 'result goals' (are there any other kind?) and 'learning outcomes' (which implies that learning is a goal - but why? What is the value or purpose of learning?): what about business objectives for safely exploiting and protecting valuable information?
SCF seems to have sidestepped more fundamental issues. What is the organization trying to achieve? How would what we are thinking of doing support or enable achievement of those organizational objectives? Security awareness, and information security as a whole, is not in itself a goal but a means to an end. I would start there: what is or are the ends? What is information security awareness meant to achieve? 
Having discussed that issue many times before, I'm not going to elaborate further on it today, except to say that if the Goals are clear, the Questions arising are fairly obvious, which in turn makes it straightforward to come up with a whole bunch of possible Metrics (the GQM method). From there, SMART is not such a smart way to filter out the few metrics with a positive value to the organization, whereas the PRAGMATIC metametrics method was expressly designed for the purpose.
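To make that filtering step concrete, here's a minimal Python sketch of PRAGMATIC-style scoring: each candidate metric is rated 0-100 against the nine PRAGMATIC criteria (Predictiveness, Relevance, Actionability, Genuineness, Meaningfulness, Accuracy, Timeliness, Independence and Cost-effectiveness) and ranked by its overall score. The candidate metrics and ratings shown are made up purely for illustration:

```python
from statistics import mean

# Candidate metrics with illustrative 0-100 ratings, one per PRAGMATIC
# criterion in the order P, R, A, G, M, A, T, I, C. All figures invented.
candidates = {
    "Security culture survey score":  [80, 90, 70, 75, 85, 60, 65, 70, 80],
    "Number of policy violations":    [40, 60, 55, 50, 45, 70, 80, 60, 90],
    "Count of firewall rule changes": [20, 30, 25, 40, 20, 85, 90, 50, 95],
}

def rank_metrics(candidates):
    """Rank candidate metrics by their mean PRAGMATIC score, best first."""
    scored = {name: round(mean(ratings), 1) for name, ratings in candidates.items()}
    return sorted(scored.items(), key=lambda item: item[1], reverse=True)

for name, score in rank_metrics(candidates):
    print(f"{score:5.1f}  {name}")
```

The point of the exercise is the ranking, not the absolute numbers: a handful of high-scoring metrics beats a dashboard full of cheap-but-meaningless ones.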
SCF further muddies the waters by mentioning a conventional Lewin-style approach to change management (figure out where you are, identify where you want to be, then systematically close the gap) plus Deming's Plan-Do-Check-Act approach to quality assurance. I'm not entirely convinced these are helpful in setting goals and identifying measures. I would have preferred to elaborate on the process of analyzing the organization's core business, teasing out the 'hooks' in the business strategies on which to hang information security and hence security awareness. Those are powerful drivers, not least because only a fool would seriously resist or interfere with something that explicitly supports or enables strategic business objectives - a career-limiting move, to be sure!

2. Organization: involve the right people
Involving the right people makes sense for any activity including the previous step in SCF - in other words, the right people need to be involved in defining and clarifying the organization's objectives, which means these two activities overlap. Despite the numbering, they are not entirely sequential. The right people must be actively engaged in setting goals initially, and in deciding who else needs to be involved.
Sequencing issues aside, the second module of SCF discusses ways to identify 'the right people' for two distinct purposes: (1) those who will run the 'security culture program' (whatever that is! It is undefined at this stage); and (2) the target audience for security awareness (again, part of the vague 'security culture program').  

I fully support the idea of identifying awareness audiences, which is why NoticeBored delivers three parallel streams of content aimed at workers in general, managers and professionals. While we don't subdivide those audiences, we recommend that the security awareness professionals to whom the materials are delivered do so - it's standard advice in the train-the-trainer guide in virtually every awareness module to identify who has an interest in the monthly topic, and work with them to customize, communicate, inform and persuade. In many cases that comes down to business departments or functions, and sometimes individual people (e.g. the Privacy Officer clearly needs to be actively engaged in privacy awareness, along with the Legal/Compliance function - or their equivalents since their titles, responsibilities and interests may vary). 

SCF picks out executives, HR and Marketing as obvious examples of groups you would probably want to involve, and fair enough ... although I can think of many more (such as the two mentioned above). In fact it's hard to think of any part of the organization that could safely be excluded, given that information flows throughout the entire organization like a nervous system.

SCF mentions the idea of nominating ambassadors or champions, hinting at the process we call 'socializing information risk and security'. It also mentions the need for regular communications of tailored messages - good stuff.

3. Topics: choose activities
The advice here is to "Build culture that works by choosing relevant topics and activities". I'm confused by 'culture that works' but in practice determining the security awareness and training topics is the focus of this module, and that's quite straightforward. There's sound advice here:
"One thing to note about topics is that it is highly unlikely, and usually not something you would want, to cover all topics in one year. Long-term results are created by carefully crafting a plan to build the security culture you want over the course of several years."
True, for two reasons: (1) given a broad perspective on information risk and security, there are lots of topics to cover, hence a lot of information to impart; and (2) cultural changes are inevitably slow. People need time to receive and internalize information, and change their ways. They need gentle encouragement and support, motivation and, in some cases, enforcement of the security rules.
"Some topics are relevant at different stages of an employee lifecycle. One example is introducing new employees to policies and regulations when they begin working. Another is during relocation, when it may make sense to train the employee in local security routines."
The need to include information risk and security in induction or orientation training is obvious, no problem there. Relocation, though, is not a strong example: in 'employee lifecycle' terms, what about internal moves and promotions, and eventually leaving the organization? Those are almost universal activities that do indeed have information risk and security implications that the awareness program might usefully cover. Hmmm, perhaps we should put that idea into practice with NoticeBored awareness materials. We already cover some aspects (such as periodically reviewing and adjusting workers' information access rights).
Some of the advice in SCF has become lost in translation e.g.:
"To map down topics that builds up under goal and matches an organizational map is one method to get a good overview. The easiest one is those who targets the whole organization and builds up under the overall goals in the goal hierarchy. Those who only target segments of the organization demands mostly more work."
Que?

SCF mentions a few forms or styles of awareness and training - mostly training in fact, with an emphasis on computer methods. 

4. Planner: plan and execute
SCF's advice in this area is straightforward and conventional, quite basic though helpful for someone just getting into security awareness for the first time, or at least the first time in a structured, planned way.

Aside from defining goals, audiences and topics, and establishing metrics, there's little discussion of project or program management as a whole, including (1) risk management (what are the risks to your awareness program? What could go wrong? What should you be doing to mitigate the risks? And what about opportunities? Can you seize the opportunity and take advantage of business/organizational situations, or for that matter novel information risk and security situations such as the recent ransomware outbreaks, and forthcoming changes in privacy as a result of GDPR?); (2) resource management (e.g. recruiting, training and developing the awareness team, plus the extended team taking in those awareness ambassadors mentioned earlier); and (3) change management (it's ironic that change is noted earlier in SCF but not in the sense of managing changes to the awareness program itself - aspects such as changes to management support and perceptions, personnel changes, changes of focus and approach as old ways lose their impact and new ideas emerge, maturity, and changes prompted by the security metrics).

Conclusion
SCF has some good points, not least focusing attention on this important topic. The advice is fairly basic and not bad overall, although the sequencing and reference to other approaches is a bit muddled and confusing.

Of more concern are the omissions, important considerations conspicuously absent from the website's overview of SCF e.g. business value, psychology, adult education, compliance, motivation and maturity. I'm disappointed to find so little discussion of security culture per se, given the name of the framework: it mostly concerns the mechanics of planning and organizing security awareness and training activities, barely touching on the before and after stages. Perhaps Kai's training courses go further.

That said, both the Security Culture Framework website and Kai's book "Build a Security Culture" are succinct, and patently I have been sufficiently stimulated to write this critique. I prefer Rebecca Herold's "Managing an Information Security and Privacy Awareness and Training Program" but you may feel differently. There's something to be said for getting to know both of them, plus other approaches too such as David Lacey's "Managing the Human Factor in Information Security" - another excellent book.

NBlog September 11 - Security culture

NoticeBored - Mon, 09/11/2017 - 2:11am

Last night we watched a documentary on the History Channel about 9-11 - a mix of amateur and professional footage that took me back to a Belgian hotel room in 2001, watching incredulously as the nightmare unfolded on TV. Tonight there are more 9-11 documentaries, one of which concerns The War On Terror. As with The War On Drugs and The War On Poverty, we're never going to celebrate victory as such: as fast as we approach the target, it morphs and recedes from view. It's an endless journey.
The idea of waging war on something is a rallying cry, meant to sound inspirational and positive. In some (but not all) cultures it is ... and yet, in a literal sense, it's hard to imagine any sane, level-headed person truly relishing the thought of going to war. According to Margaret Atwood, "War is what happens when language fails", in other words when negotiations fail to the point that violent action is perceived as the best, or last remaining, option.
In truth, The War On Whatever involves more than just violent action: the negotiations don't stop, they just change. In public, they evolve into rhetoric and propaganda, fake news and extremism intended to elicit deeply emotional responses. In private, there's the whole issue of reaching agreement, defining the bottom line, stopping the untenable costs, saving face and redefining the boundaries.
National cultures and attitudes towards war and safety go way beyond the remit of our awareness service, and yet the corporate security culture has its roots in human perceptions, beliefs, ethics and moral values. We're unlikely to make much headway in changing those, although that alone needn't stop us trying! Hopefully we can influence some attitudes and hence some behaviors, perhaps drawing on cultural cues as part of the process.
There's plenty more to say on security culture as we work our way through the month: I promise future episodes will be less jingoistic and more upbeat. 

NBlog September 8 - security certification

NoticeBored - Fri, 09/08/2017 - 2:35am
Aside from the elevator pitch, another short awareness item in our newly-revised Information Security 101 module is a course completion certificate, simply acknowledging that someone has been through the induction or orientation course.
I say 'simply' but as usual with NoticeBored, there's more to it.
For a start, some of us (especially those who consider ourselves 'professionals') just love our certificates: our qualifications and the letters before/after our names mean something to us and hopefully other people. This is a personal thing with cultural relevance, and it's context-dependent (my 30-year-old PhD in microbial genetics has next to nothing to do with my present role!). My even older cycling proficiency certificate is meaningless now, barely a memory, but at the time I was proud of my achievement. Receiving it boosted my self-esteem, as valuable a benefit as being able to demonstrate my prowess on two wheels. I'm tempted to use Cprof on my business cards just to see if anyone reads them!
On the other hand, a certificate indicating a pass mark in some assessment or test can be misleading. The driving test, for example, is a fairly low hurdle in terms of all the situations that a driver may have to deal with over the remainder of their driving career. There is clearly a risk that a newly-certified and licensed driver might be over-confident as a result of passing the test and going solo, a time when accidents are more likely hence some countries encourage a subsequent period of driving with special P-plates (meaning probationary, or passed or potential or ...) in the hope that others will give new drivers more space. In risk terms, there are risk-reduction benefits in letting new drivers continue to hone their new-found skills, offsetting the increased risk of incidents.
In the same way with the InfoSec 101 course completion certificate, we're glad to acknowledge the personal achievement and boost people's self-esteem (yay - something positive associated with information risk and security!), although there is a risk they might believe themselves more competent in this area than they truly are. On balance, we'd rather deal with that issue, in part through the ongoing security awareness activities that delve deeper into areas covered quite superficially in the 101 module, across a broader range of topics, and partly through the corporate support structure and processes - the security culture that will be covered in next month's NoticeBored materials.
Perhaps at some later point well after induction, it might be appropriate to test workers again then issue the equivalent of those advanced driver certificates, accompanied with benefits analogous to lower insurance premiums? We include awareness tests in every module, so it's certainly feasible to track their scores and reward the star performers. There's even a rewards menu in the 101 module, complete with bronze, silver and gold-level certificate ideas, among many others.
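Tracking test scores and mapping them onto a bronze-silver-gold award menu could be as simple as this sketch - the thresholds and names below are hypothetical, not the actual rewards menu in the 101 module:

```python
def award_tier(score):
    """Map an awareness test score (0-100) to a hypothetical award tier."""
    if score >= 90:
        return "gold"
    if score >= 75:
        return "silver"
    if score >= 60:
        return "bronze"
    return "none"  # candidate for remedial one-on-one training

# Illustrative test scores for a handful of fictional workers.
scores = {"Alice": 92, "Bob": 78, "Carol": 61, "Dave": 40}
tiers = {name: award_tier(s) for name, s in scores.items()}
print(tiers)
```

Even something this crude makes the positive framing tangible: the output names star performers to reward, rather than failures to punish.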
Notice the emphasis on positivity and reward. We'd much rather focus on those who pass and succeed than those who fail. Let's be frank here: failing something as basic as an InfoSec 101 awareness test (or driving test!) is really bad news, perhaps even justifying dismissal of new workers at the end of a probationary period. Such a hard line is something organizations might consider appropriate or necessary, especially in industries where information risks are substantial (e.g. defense, critical infrastructure, finance, government, health and IT), but it's not part of our remit. Personally, I would find such an approach unacceptable: instead I'd rather settle for remedial one-on-one training and limiting access to information until a passing grade is attained. To be honest, I'm more comfortable passing the buck to local management and HR in such delicate areas, especially given the employment law compliance implications.
There's another aspect to the 101 course completion certificate, concerning the award issue process itself: we provide a form letter to be sent along with the certificate by or on behalf of the CISO, ISM or some other appropriate manager. Most of all, it's an opportunity to re-emphasize that newcomers are integral, valuable parts of an organization that proactively protects and exploits information. Encouraging further contact between workers and the Information Security function bolsters the social network, directly supporting the oft-espoused but generally vacuous line that "We are all responsible for information security".  Yes we are, but there's more to it than trotting out some trite line on a poster or policy.
By the way, that's NOT our certificate imaged above. Ours is more classy, more refined, more attractive, more valuable. At least we think so. Aside from the execution, the concept is invaluable. And now it's yours. 

NBlog September 6 - passwords are dead

NoticeBored - Wed, 09/06/2017 - 12:33am

I've blogged about passwords several times. It's a zombie topic, one that refuses to go away or just lie down and die quietly.
On CISSPforum, we've been idly chatting about user authentication for a week or so. The consensus is that passwords are a lousy way to authenticate, for several reasons.
First the obvious.  Passwords are:
  • Hard to remember, at least good ones are, especially if we are forced to think up new ones periodically for no particular reason;
  • Generally weak and easily guessed, due to the previous point;
  • Sometimes generated and issued rather than chosen or changeable by the user;
  • Readily shared or disclosed (e.g. by watching us type), or written down;
  • Readily obtained by force, coercion, deception and other forms of social engineering such as phishing or password reset tricks, or interception, or hacking, or brute force attacks, or spyware or ... well, clearly there are lots of attacks;
  • Often re-used (for different sites/apps etc., and over time).

Next comes some less obvious, more pernicious lousiness:
  • Badly-designed sites/systems sometimes prevent us using strong passwords (e.g. they must be less than 20 characters with no spaces or special characters ...; must be typed or clicked manually - no automation allowed);
  • Poor guidance on choosing passwords encourages poor choices;
  • Passwords are sometimes weakened covertly by even lousier sites/systems (e.g. we can enter complex 50-character passwords but they only actually use 6, or store them in plaintext, or use a pathetically weak or broken hashing algorithm, often without a salt ...).
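That 'no salt' complaint deserves emphasis. A unique random salt per password plus a deliberately slow key-derivation function is the standard remedy. Here's a minimal sketch using Python's standard library - the iteration count is illustrative only, and current guidance favors far higher values or a memory-hard algorithm such as Argon2:

```python
import hashlib, hmac, os

def hash_password(password, iterations=200_000):
    """Return (salt, derived key) using salted PBKDF2-HMAC-SHA256."""
    salt = os.urandom(16)  # unique random salt per password
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, key

def verify_password(password, salt, key, iterations=200_000):
    """Re-derive the key from the stored salt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, key)

salt, key = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, key))  # True
print(verify_password("Tr0ub4dor&3", salt, key))                   # False
```

The salt means two users with the same password get different stored hashes, defeating precomputed rainbow tables; the high iteration count makes brute force attacks expensive.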

In short, passwords are not a reliable way to authenticate people. As a security control, they are weak to mediocre at best, not strong ... which is obviously a concern when authentication really matters. Some sites and apps have moved to multi-factor authentication, generally passwords or PIN codes plus some other factor, such as a cryptographic token, 'bingo card' or some other piece of hardware, or software, or biometrics, or locational information (e.g. GPS coordinates) or system characteristics (operating system + IP address or IMEI).
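One common 'other factor' is the time-based one-time password (TOTP, RFC 6238) - the rolling six-digit code generated by authenticator apps and hardware tokens. The algorithm is simple enough to sketch with Python's standard library:

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32, at=None, digits=6, period=30):
    """Generate a TOTP code (RFC 6238) from a base32-encoded shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // period)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

# RFC 6238 test vector: the ASCII secret "12345678901234567890" at time 59
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", at=59, digits=8))  # 94287082
```

The shared secret, not the code, is what must be protected: anyone holding the secret can generate valid codes, which is why it lives inside the token or app rather than being typed by the user.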
Passwords are dead
Long live passwords
Martin from Sweden has been telling us about an interesting federated authentication system there called BankID, based around a mobile app. The app serves credentials to various Swedish organizations enrolled in the scheme, not just the bank that originally authenticated the user (using a hardware token). It allows the user to check the details at authentication time (e.g. the transactions you are authorizing). It is multi-factor: you need PINs or passwords to access your mobile and the app, plus the app itself, the device, and the keys. Presumably it has mechanisms to handle lost/stolen mobiles, and new mobiles.
It's a successful, working system, not just a model or theory.  Cool!
I'm still interested in the idea of continuous authentication, supplementing the conventional one-time login process at the start of a session with user activity monitoring during the session to confirm that the logged-in user is behaving normally, and has not suddenly started typing differently, accessing different apps and sites, gambling, making large payments to Swiss bank accounts or whatever.
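As a toy illustration of the idea, a behavioral monitor might compare a session's keystroke timing against the user's established baseline and flag large deviations. The figures and threshold below are entirely made up; a real system would combine many behavioral signals, not just one:

```python
from statistics import mean, stdev

def anomaly_score(baseline_ms, session_ms):
    """How many standard deviations the session's mean inter-keystroke
    interval sits from the user's established baseline."""
    mu, sigma = mean(baseline_ms), stdev(baseline_ms)
    return abs(mean(session_ms) - mu) / sigma

baseline = [100, 110, 90, 105, 95]  # the user's normal inter-keystroke gaps (ms)
normal   = [98, 102, 101]           # a session consistent with the baseline
suspect  = [180, 190, 170]          # suddenly much slower typing

THRESHOLD = 3.0  # arbitrary cut-off for this sketch
for session in (normal, suspect):
    print(anomaly_score(baseline, session) > THRESHOLD)  # False, then True
```

A flagged session needn't lock the user out immediately; it might instead trigger re-authentication, exactly the kind of graduated response continuous authentication enables.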

NBlog September 4 - InfoSec 101 elevator pitch, final part

NoticeBored - Sun, 09/03/2017 - 7:17pm
Moving on from our discussion of the first two paragraphs of this month's elevator pitch paper in part 1 and part 2, here's the closing paragraph:
As a manager, you play a vital governance, leadership and oversight rôle.  Please make the effort to engage with and support the security awareness program, discuss information risk and security with your colleagues, and help us strengthen the corporate security culture.In classical marketing terms, it's the call-to-action for people who have been lured and hooked. Having presented our case, what do we actually want them to do?  
Compared to the preceding two, the third paragraph is quite long. 
While we could easily have dropped the first sentence, it serves a purpose. It shows deference to the management audience, acknowledging their influential and powerful status, gently reminding them that they are expected to direct and oversee things. Essentially (in not so many words), it says "Pay attention! This is an obligation, one of your duties as a manager."
The final sentence, including those three words in bold, was especially tricky to write for the InfoSec 101 module. What is it, exactly, that we expect senior managers to do in relation to this very broad introductory-level topic? Think about that question for a moment. There are many possible answers e.g.:
  • Show leadership
  • Demonstrate commitment
  • Support the Information Security Management System (in an ISO27k organization)
  • Get actively involved in information risk and security management activities, such as risk assessment and risk treatment decisions
  • Raise the profile and priority of information risk and security matters
  • Provide adequate resources to do this stuff properly for once (!)
  • Encourage or enforce compliance
In the end, we settled on asking managers to demonstrate their 'support' in a non-specific way. In practice, that would vary between individual managers in various business units or departments. The call-to-action is context-dependent and hence very difficult to specify without an understanding of the audience and their situation, which we don't possess at the point of writing the awareness materials. It should be clearer when the messages are being delivered, and obviously we hope they make enough sense to resonate and influence the audience's decisions and behaviors, otherwise awareness is a pointless exercise.

In other awareness modules, the closing message for the elevator pitch is usually more obvious in that we focus the spotlight on distinct areas of information risk and security each month. For instance, in August's awareness module on cyberinsurance, the elevator pitch ended with a thought-provoking question: "Without cyberinsurance, serious cyber incidents could prove devastating if they occurred: we would save the insurance premium but is that a gamble worth taking?". The call-to-action was implicit rather than explicit. Our words deliberately raise a doubt. We couldn't simply say "Buy cyberinsurance!" as that may be inappropriate and unnecessary for some customers, not least those who already have it. Although more explicit, something along the lines of "Consider taking out cyberinsurance" would have been bland, lame and pathetic. "Is that a gamble worth taking?" is more of an intellectual challenge. In fishing terms, we're trying to get a rise out of the audience.

This month, we've deliberately sown the seed for next month's awareness module on 'security culture'. There will be much more to say, expanding those three bold words into an entire awareness topic. Linking the awareness topics together like this is yet another way to form a series of discrete awareness items into a coherent program, in turn supporting the security culture. 
So, there you go. Over three blog pieces, it has taken me about a thousand words to explain a hundred. Has it prompted you to think differently about management-level security awareness? 

I think it's obvious why short awareness items can take a disproportionate amount of effort to compose. The end result has very few words, but they are very carefully selected for maximum impact and value.

Cue Blaise Pascal:
"Je n'ai fait celle-ci plus longue que parce que je n'ai pas eu le loisir de la faire plus courte."
which Google translates as:

"I have made it longer only because I have not had the leisure [time] to make it shorter."

If you don't have the time and energy to prepare security awareness content, leave it to us! It's what we do - more than just a job, it's our passion.
Categories: NoticeBored

NBlog September 3 - InfoSec 101 elevator pitch, part 2 of 3

NoticeBored - Sat, 09/02/2017 - 8:28pm
Yesterday, I started telling you about one of the smallest deliverables in our awareness portfolio, the elevator pitch aimed at senior executive management. Despite its diminutive size, a lot of effort goes into selecting and fine-tuning those 100-odd words.
[Sorry if this detailed deconstruction of the pitch one paragraph at a time is tedious but I think it's useful to understand the design, the purpose of the page and the thinking that goes into it. As far as I know, we are the only security awareness provider specifically targeting senior management in this way. I've made disparaging comments in the past about awareness programs aimed at "end-users": neglecting other employees - especially managers and professionals - seems incredibly short-sighted to me, a bit like trying to teach the passengers how to drive a car, ignoring the driver and the mechanics.] 
OK, pressing swiftly ahead, the elevator pitch can be interrupted at any point. If someone is presenting or talking it through with an exec, they may well need to break off to answer questions or respond to comments. If a busy exec is quickly skimming the piece online or on paper, they might get distracted by a phone call or email. We may only have their attention fleetingly, if at all. 
If we're lucky, the exec will swallow the bait and be hooked ... so the second paragraph has the essential barb:

"Cybersecurity is important but there’s more to it than IT. Information security enables the business to exploit information in ways that would otherwise be too risky."

'Cybersecurity' is all the rage, of course. It's a term we see frequently in the media. Although it's rarely defined, it is generally interpreted as IT and network security, specifically around Internet-related tech incidents such as hacking and malware. That's all very well, but what about all the rest of information risk and security? What about social engineering scams and frauds, piracy, industrial espionage and so forth? What about the whole insider-threat thing: where does that fit in relation to 'cyber'? 
Oh, hang on a moment: explaining the first 10-word sentence of the second paragraph took me about 100 words. Admittedly my explanation rambles on a bit, but on the other hand it's still just the tip of the iceberg.
The sentence that ends the second paragraph again mentions "business" - quite deliberately so but did you even notice? Here in New Zealand, we are suffering a spate of intensely annoying radio advertisements that inanely repeat some key word that, I presume, the client asked the ad agency to promote. "Wallpaper" is one that springs to mind, repeated about a dozen times in a typical 30 second ad. Maybe they think they are being clever because here I am talking about their wallpaper advertisement, but the repetition is so distracting that I can't remember the rest of the ad, including the company or product names. I reflexively hit the off button whenever I catch the first few seconds!
Rant aside, our second paragraph throws down a challenge before the reader. It's deliberately open-ended and thought-provoking. If cybersecurity is more than just IT, what else is it?  How does information security enable the business, and what's all this about risk anyway?

Tomorrow I'll conclude this little series by blogging about the final paragraph. Are you on the hook?
Categories: NoticeBored

NBlog September 2 - InfoSec 101 elevator pitch, part 1 of 3

NoticeBored - Fri, 09/01/2017 - 10:17pm
The elevator pitch is an awareness format we developed specifically for busy senior executives and other senior managers. 

Its main aim is to tell them just enough so they know what the awareness topic concerns. We'd like to intrigue them, prompting them to ask questions and seek more information, ideally influencing their decisions and actions as they go about their business in a more secure fashion.
The 'elevator pitch' name and button panel image allude to the idea of condensing a complex subject down to a short statement that could literally be expressed during a short elevator ride. We don't actually envisage someone standing there in the elevator car reading out a prepared script to the captive audience, so much as being primed and ready to respond off-the-cuff to an informal opener from an exec along the lines of "So, how are things with you?".
We limit ourselves to about 100 words per topic. As you'll see from the example above, that works out to just 3 paragraphs or so, of 2 or 3 sentences each. It takes a surprising amount of effort to put things across so succinctly: the real art is in figuring out what's appropriate to leave out, and how to express the essentials in a way likely to resonate with senior managers.
Imagine being a fisherman selecting some juicy morsel to bait the hook. Ideally the pitch needs to catch the target's eye, intriguing them and sparking their imagination so they gulp it down ... but being realistic, very few execs are going to have the time and interest to drop everything and focus on information security, at least not on the strength of a snatched conversation in the elevator or a casual corridor chat.  
Let's look at the InfoSec 101 elevator pitch in more detail, breaking it down a paragraph at a time:

"Information is a valuable but vulnerable business asset that requires protection against risks. Responding to the risks through suitable controls involves all those who create, use and handle information. Yes, that’s everyone."

Those first few words are crucial, explicitly positioning information risk and security as a business issue. The whole pitch, in fact, is prioritized such that, if our audience is distracted or busy and cuts it short, they have already had the most important information. If they get nothing else, "Information is a valuable but vulnerable business asset" is a fundamental awareness message. Whereas we'd like them to swallow the bait, we'd settle for a nibble.

The next two sentences emphasize that information risk is a concern for everybody in the organization (meaning the execs, managers, staff and others such as contractors and consultants, temps and interns - and in fact various outsiders such as our ISPs and CSPs, business partners, accountants, legal and tax advisers, authorities and more: as I said, writing an elevator pitch is largely about what to leave out without materially affecting the message).  

Given that Information Security 101 is an introductory module, the first paragraph cuts directly to the chase. I'll pick this up again tomorrow, exploring the second paragraph.
Categories: NoticeBored

NBlog September 1 - back to basics: InfoSec 101

NoticeBored - Thu, 08/31/2017 - 6:00pm
When someone initially joins an organization, they immediately start absorbing the corporate culture – ‘the way we do things here’ – gradually becoming a part of it. Most organizations run security orientation or induction sessions to welcome newcomers and kick-start the cultural integration process, with individual sessions lasting between a few minutes and a few hours depending on the topics to be covered, local practice, and of course the audience (e.g. there may be a quick-start process for managers, and more in-depth training for technical specialists).
Let's be honest: orientation tends to be as dull as a lecture on the dangers of teenage pregnancy. It's trial-by-fire, something to be endured rather than enjoyed. 
The new NoticeBored Information Security 101 module covers common information risks (e.g. malware) and controls that are more-or-less universal (e.g. antivirus). The awareness materials are deliberately succinct and quite superficial: they outline key things without delving into the details.  
Given the context of a continuous NoticeBored-style security awareness program delivering a stream of fresh materials, there's no need to cover everything about information risk and security in one hit. The pressure's off. Relax! All we really need to do in the induction session is help newcomers set off on the right foot, engaging them as integral and valuable parts of the organization’s Information Security Management System. 
That leaves room to focus on an even more important objective, one that we will expand upon in next month’s module. Building relationships between Information Security professionals and business people in general makes a huge difference to the corporate security culture. Think about it: would you rather pick up the phone to the friendly professional who took time to meet you when you joined the organization, or a total stranger?
First impressions count, so the module is designed to help Information Security deliver engaging and interesting induction sessions accompanied by impressive supporting materials.  
As well as orientation, Information Security 101 also facilitates the initial launch or relaunch of an awareness program (perhaps in support of an ISO/IEC 27001 Information Security Management System, for PCI-DSS, or for other compliance reasons). It introduces the new program, quickly bringing everybody up to the same foundation level of awareness and understanding.  We're literally getting them on the same page in the sense of introducing and explaining the corporate information security policy.

The InfoSec 101 module costs just US$645 (plus GST for Kiwis) ... or free as part of a regular NoticeBored subscription.  Email me!
Categories: NoticeBored

NBlog August 31 - strengthening Information Security’s social network

NoticeBored - Wed, 08/30/2017 - 11:27pm
Some security awareness programs simply broadcast messages at the organization. Messages flow from the Information Security function to the audience - specifically an audience dubbed "end users" in many cases, a disparaging term implying low-level staff who use computers (neglecting all others). A more effective approach, however, is to emphasize social networking and socialization of security as a primary driver of cultural change, with bidirectional communications increasing the chances that the awareness program reflects and responds to the business.
Establishing a strong social network of friends and supporters of information security throughout the organization takes commitment and sustained effort on the part of the entire Information Security function. The payback over the medium to long-term, however, makes it an approach well worth considering. An actively engaged and supportive social network will keep the awareness program, and in fact the information security program as a whole, business-aligned and relevant to current security issues in the organization, broadening and deepening the department’s influence. On top of that, you can achieve far more through a distributed network of supportive contacts than you can possibly manage alone.
Support from senior management is great but, in our experience, many of the most well-connected and influential workers are low-ranking individuals. They are ‘people people’ with the common touch, a natural flair for social interaction. 
This is why we're providing a template rôle description for the Information Security Awareness Contact in September's Information Security 101 module to get you started if you decide to structure and formalize the rôle to this extent. That may not be appropriate or necessary, depending on how your organization handles such issues. Speak to your management and HR about the concept before going too far down that line, including aspects such as recruiting, guiding/coordinating, motivating and rewarding people who accept the rôle. 
Colleagues in HR, Security Administration, IT/PC Support, Business Continuity, Risk Management, Compliance and Health & Safety may have similar social networks already in place (e.g. departmental reps, fire marshals and first responders). Invest some time in meeting both those colleagues and their best contacts to find out how the arrangements work on both sides, pick up useful tips ... and hopefully make a few solid-gold contacts of your own.
Categories: NoticeBored

NBlog August 30 - information risk assessment (reprise)

NoticeBored - Tue, 08/29/2017 - 5:19pm

On ISO27k Forum this morning, an FAQ made yet another appearance. SR asked:

"I am planning to do risk assessment based on Process/Business based. Kindly share if you have any templates and also suggest me how it can be done."

Bhushan Kaluvakolan responded first by proposing a risk assessment method based on threats and vulnerabilities (and impacts, I guess), a classical information-security-centric approach that I've used many times. Fair enough.
I followed up by proposing an alternative (and perhaps complementary) business-centric approach that I've brought up previously both on the Forum and here on NBlog:
  1. Consider the kinds of incidents and scenarios that might affect the process, both directly and indirectly. Especially if the process is already operating, check for any incident reports, review/audit comments, known issues, management concerns, expert opinions etc., and/or run a risk workshop with a range of business people and specialists to come up with a bunch of things – I call them ‘information risks’. This is a creative, lateral thinking process – brainstorming. Focus on the information, as much as possible, especially information that is plainly valuable/essential for the business. If necessary, remind the experts that this is a business situation, a genuine organizational concern that needs pragmatic answers, not some academic exercise in precision.
  2. Review each of those information risks in turn and try to relate/group them where applicable. Some of them will be more or less severe variants on a common theme (e.g. an upstream supply chain incident can range from mild e.g. minor delays and quality issues on non-critical supplies, to severe e.g. sudden/unanticipated total failure of one or more key suppliers due to some catastrophe, such as the Japanese tsunami). Others will be quite different in nature (e.g. various problems with individual employees, IT systems etc.). A neat way to do this is to write each risk on a separate sticky note, then stick them on a white board and briefly explain them, then move them into related/different groups of various sizes and shapes.
  3. Discuss and evaluate each (grouped) risk according to its probability (or possibility or chance or likelihood or frequency … or whatever) of occurrence, and the organizational impact (or severity or criticality or trouble or nastiness or scale or cost or size or drama … or whatever) if it ever does occur. Plot them out on a PIG (probability-impact graph). There are several examples on NBlog e.g. here and here, plus tips on running risk workshops etc. Instead of the ‘whiteboard’ noted above, those ‘sticky notes’ could be text overlaid on a colourful blank PIG graphic, pre-drawn on a computer screen in, say, Powerpoint or Visio.
  4. Once all/most of your identified risks are on the PIG, and you have had a good chance to discuss their wording and positioning and relationships, set aside some time to focus on any in the red zone i.e. severe + high probability risks: these are clearly priorities for the business. What can/should be done to treat them? What needs to be put in place to enable the risks to be treated? Who needs to drive that work (the ‘risk owner’)? How will the resources be found and allocated? When does it need to happen, and how? Continue with the orange zone risks, and the greens too if you are obsessive and have the time and energy (are there existing risk treatments/controls for the greens that might safely be relaxed or retired?). This generates a draft action or risk treatment plan, prioritized according to information risk. Look for opportunities to schedule and align activities where it makes sense, including other parallel activities where applicable (e.g. linking process changes with IT system or supplier changes, business reorganization etc.).
  5. Check for any outliers, anomalies, and open issues. Looking at the whole PIG, is there anything that seems odd, or wrong, or worrying? Take an even broader business or strategic perspective: how does this PIG and this set of information risks fit in with other PIGs and other risks facing the organization? Are there issues and constraints in this area that often crop up in other areas too, hinting at a common cause that maybe ought to be tackled too? Again, are there opportunities to hook-on to other business initiatives, projects and activities? And how does all this align with and support business objectives?
  6. As actions/risk treatments are completed, several information risks on the PIG should move as they become less likely and/or severe – so review, update and reconsider the PIG periodically. Look especially hard for changes such as new or emerging information risks that aren’t yet represented on the PIG. Relevant incidents and near-misses that aren’t adequately reflected in identified risks indicate omissions in your risk identification and assessment process … so look for others too, and make improvements. If necessary, run focus group sessions to address information risks that remain stubbornly stuck in the orange or red zones. If risk treatments aren’t working, what needs to be done to fix them? Are there alternative approaches worth trying? Are there competing priorities or constraints that management needs to address … or are the risks acceptable, in fact (if so, get that in writing! Hold someone senior personally accountable for the risk acceptance decision)? 
  7. Keep notes on the risk management process, the workshops, techniques, issues etc. and refine the approach every time it runs. [That’s how I got here, and my journey to enlightenment continues!]
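The triage part of the workshop process above can be sketched in code. This is a minimal illustrative sketch, not a formal method: the `Risk` fields, the probability-times-impact score and the zone thresholds are all assumptions you would tune to your own organization and workshops.

```python
# Minimal sketch of PIG-style risk triage: score each risk by
# probability x impact, then bucket it into a red/orange/green zone.
# The thresholds below are arbitrary illustrations - tune them.
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    probability: float  # 0.0 (very unlikely) .. 1.0 (almost certain)
    impact: float       # 0.0 (trivial) .. 1.0 (devastating)

def zone(risk: Risk) -> str:
    """Classify a risk into a PIG zone by its combined score."""
    score = risk.probability * risk.impact
    if score >= 0.5:
        return "red"     # severe and likely: priority for treatment
    if score >= 0.2:
        return "orange"  # needs a treatment plan
    return "green"       # review existing controls; maybe relax some

def triage(risks: list[Risk]) -> dict[str, list[str]]:
    """Group risks by zone, worst first within each zone."""
    groups: dict[str, list[str]] = {"red": [], "orange": [], "green": []}
    for r in sorted(risks, key=lambda r: r.probability * r.impact,
                    reverse=True):
        groups[zone(r)].append(r.name)
    return groups

risks = [
    Risk("key supplier failure", 0.3, 0.9),
    Risk("malware outbreak", 0.8, 0.7),
    Risk("minor supply delay", 0.6, 0.2),
]
print(triage(risks))
```

In practice the sticky notes and workshop discussion do the real work; a listing like this merely records the outcome and makes the periodic reviews in step 6 easy to diff.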
The PIG part of this approach is especially controversial, I know. There are other forms of risk analysis, including truly quantitative approaches (based on actual data and mathematically sound models) and other qualitative methods … but I find this good enough for my purposes, and simple enough that, once they get the hang of it, workshop attendees focus on discussing and understanding and tackling the risks rather than obsessing about the analytical method. YMMV.
For truly business- or safety-critical situations, or if you are uncertain about whether any given approach is OK, you might try several different methods, comparing and contrasting the results for additional insight. Chris Hall has previously suggested involving different groups of people in separate sessions to emphasize their different perspectives, expertise and interests (cool tip - thanks Chris!). It’s hard, though, to bottom-out the reasons for the differences, not least because this is all based around predicting an inherently uncertain future. It’s all crystal ball gazing. This is closer to witchcraft and alchemy than science. There comes a point where it’s better to just get on with it and see how things go, than to continue endlessly refining the analysis or obsessing about the methods. You can always come back later for another gaze, another go at mixing your magic potions. Meanwhile, those risks need treating, the red ones urgently.
Categories: NoticeBored

NBlog August 27 - thanks a million

NoticeBored - Sun, 08/27/2017 - 4:34pm

According to Google's Blogger stats, over the weekend this blog topped 1 million page views so I guess we must be doing something right!
It would be hard to come up with something new to say every day, if it weren't for the fact that we are all bombarded by stuff from other blogs and groups, from advisories and committees, and from several billion Websites. There's lots of stuff going on in the world of infosec which keeps me interested and hopefully you too.
My main concern is the human as opposed to technological aspects, hence my overriding interest in promoting good practices in information risk and security governance and management (especially ISO27k and security metrics), security awareness, policies, procedures etc. to keep a lid on social engineering scams, frauds, hacks and malware attacks, ineptitude, thievery, spying, piracy and so forth. Having said that, managing technology requires understanding it (IT especially) so I try my best to keep an eye on that too. And the physical side. And compliance. And risk management.  And business ...
I interpret and react to the news rather than simply passing things on, an approach I hope rubs off on you. I'm expressing personal opinions here, hopefully adding value based on my experience and knowledge. I encourage you all to think about what you read, reinterpret it in your context, be critical and by all means disagree with me. I don't hold all the answers. I know I am outspoken, cranky and off-base sometimes. I'm human too. This blog is my CAPTCHA!
OK, must press on. We have sick animals to tend plus an awareness module to complete. Back soon.
Categories: NoticeBored

NBlog August 25 - awareness boosters

NoticeBored - Thu, 08/24/2017 - 10:13pm

The Information Security 101 awareness module update is going well. We might even finish slightly ahead of the deadline, provided I can resist the temptation to keep polishing and adding to the content!
One of the deliverables is a 'menu' of rewards for workers who uphold the information risk and security practices, controls and behaviors we wish to encourage. The rewards are divided into bronze, silver and gold categories.
Bronze rewards are generally free or cheap, and yet welcome - a nice way to thank workers for simply participating in awareness seminars, case study/workshop sessions or quizzes, maybe. Here are just a few examples:
  • A phone call, personal thank-you note and/or email
  • Letter of participation or commendation to be placed in the employee’s personnel file (whatever that means!)
  • Relaxed dress code for the recipient – for a defined period such as a day or a week 
  • Generic certificate acknowledging a level of competence (e.g. on completion of security induction training - there's a template in the module)
  • Note and/or photo on hall-of-fame, newsletter and/or the Security Zone (Information Security's intranet website - again there's a generic website design specification in the module)
  • Plain (dull bronze) pin badge or sticker with awareness program logo
  • Plain (dull bronze) staff pass lanyard with awareness program logo and stock message (such as how to contact the Help Desk or Site Security)

Moving up a level, silver awards are more valuable and attractive, requiring a little more money and effort:
  • Polo/tee-shirt printed with corporate and/or awareness program logo and a relevant quotation or catch-phrase
  • Fancy pin badge with awareness program logo and catch-phrase (e.g. “I’m security aware!”)
  • Informal party and presentation for the recipient and team (refreshments provided)
  • Phone call, personal thank you note and/or email to the award winner plus one to their manager copied to HR, commending them and explaining why they deserve the award
  • Business cards with awareness program logo and message, showing the recipient’s name as a 'security ambassador'
  • Shiny silver staff pass lanyard with awareness program logo, recipient's name and personalized message

Gold-level awards are of course fancier still, some quite distinctive, special and valuable:
  • Fleece or coat embroidered with security awareness logo, quotation and the recipient’s name
  • Programmable LED/LCD message badge pre-loaded with suitable rotating messages
  • Personalized business card holder containing special business cards showing the awareness program logo and maybe an appropriate awareness message or personal endorsement on the reverse side
  • Special name plate, cubicle sign or pin badge engraved with the awareness program logo and the recipient’s name and date
  • Smart, high quality, collectable trinkets (e.g. desk clock, watch, laptop bag/carry-all/in-flight luggage bag etc.) engraved/printed with the security awareness logo and ideally the recipient’s name
  • Gold staff pass lanyard or carrier, identifying the recipient as a security guru (“Ask me about information security”)

There are more than 50 suggestions along those lines in the 101 module, some quite innovative, for instance the chance for a one-on-one chat with a senior/executive manager, over coffee, lunch or dinner. Some are designed to reward entire teams such as the leaders in a corporate league table based on departmental or business unit performance, measured using specific security metrics. An awards ceremony or gala dinner might work for some organizations, perhaps as part of an annual security awareness event.
As with all the NoticeBored materials, customers are free to adapt the menu to suit their situation, requirements and constraints (including budget!). The concept is at least as valuable as the menu itself. I must say it's an awareness approach I've personally found very successful in the past, although it may not suit every organization. 
What a contrast to conventional compliance enforcement through penalizing those who don't comply. That may still be needed but hopefully not nearly as often.
Categories: NoticeBored

NBlog August 24 - hot potato or mash?

NoticeBored - Wed, 08/23/2017 - 11:40pm
I'm currently working on a couple of interrelated matters concerning ISO/IEC JTC 1/SC 27 business. One is the possibility of renaming and perhaps re-scoping the committee's work. The other is a study period exploring cybersecurity.
They are related because cyber is a hot potato - a bandwagon no less. Some on the committee are raring to disable the brakes and jump aboard.
When asked to describe what cybersecurity is, one expert replied "Budget". That's more than just a cynical retort. Cyber risk, cyber security, cyber threats, cyber attacks, cyber incidents and cyber insurance are all over the headlines. Several countries have invested in cyber strategies and units. There is money in cyber, so that's a good thing, right?
As I've said before, the focus on cyber is problematic for several reasons, not least distinctly different interpretations of the very term, a gaping chasm separating two distinct domains of understanding:
  1. In informal use (including most journalists and commentators in the blogosphere), cyber means almost anything to do with IT, the Internet in particular. The primary concerns here are everyday hackers and malware (or rather "viruses").

  2. In (some?) government and defense circles, cyber alludes to cyberwar, meaning state-sponsored extreme threats exploiting all means possible to compromise an enemy's critical infrastructures, IT systems, comms, economy and society. Compared to the other interpretation, this off-the-scale nastiness requires a fundamentally different approach. Firewalls and antivirus just won't cut it, not by a long chalk. If anything, those everyday hackers and malware are a source of chaff, handy to conceal much more insidious compromises such as APT (Advanced Persistent Threats) and malicious processor hardware/firmware. Authorities stockpiling rather than disclosing vulnerabilities, and building red teams like there's no tomorrow, hint at what's going on right now.
As if that's not enough, every man and his dog is either coming up with his own unique definition or ducking the issue by remaining (deliberately?) vague and imprecise. There's little consensus, hence lots of confusion and talking at cross purposes.
It is entirely possible that SC 27 might find itself lumbered with the cyber moniker because it's sexy, in which case those different interpretations will have to be addressed at some point. Unfortunately a precedent has been set by ISO/IEC 27032 which unhelpfully refers to "the Cyberspace" - in practice a curious mashup of the Internet and virtual worlds. Quite bizarre.
Worse still, even the cyberwar version of cyber implies it is all about technology: since IT systems, networks and data are the concern, it is implied that technical controls are going to save the day.
My concern is that by going down the cyber alley, the committee, and hence the ISO27k standards, may neglect the rest of information risk and security beyond the technology. Consider these examples:
  • The Bradley/Chelsea Manning and Edward Snowden incidents were information incidents but not cyber attacks (at least not as most people would define and use the term) and yet clearly they caused immense damage.
  • Many common-or-garden frauds and scams either don’t involve IT at all, or the IT aspect is incidental. They are targeting people, not (just) computers. If someone tricks a corporate financier or a little old lady to authorize or make an inappropriate payment, does it matter whether they are coerced into submitting the transaction online or popping down to the bank branch with a cheque? Would cybersecurity stop naïve investors being taken in by fake lotto wins, or pump-n-dump, penny-stock or pyramid schemes? Somewhere here I have a ‘419’ advance fee fraud letter sent to me in the post in the 80’s, before the Internet and email were invented.
  • Piracy and counterfeiting are enormously costly issues globally: again, cyber plays an incidental role in intellectual property theft. Those container loads of fake Nike trainers arriving at the ports are not cyber attacks. Is it a cyber crime when a new employee brings with them a head-full of trade secrets from their previous employers, plus a box of business cards for all their business contacts?
  • Is it a cyber crime when someone uses a fake library card to fool a utility into posting them a bill that they use to set up a credit account and … later … join a government department or apply for a passport? Identity theft existed long before computers were invented. It’s a rare CV that doesn’t at least bend the truth, and I’m sure many claimed courses, qualifications and work experiences are entirely fictitious.
  • The secret services will always use conventional tradecraft such as pickpocketing/theft, infiltration and coercion, as well as cyber means. By the way, is ‘cybertage’ (sabotage targeting IT by any means including physical attacks using, say, bombs or electromagnetic pulses, not just hacks and malware) part of your remit, particularly for highly exposed critical infrastructure such as comms, power and water systems?
  • The recent brouhaha over fake news and Russian involvement in the US presidential elections is, I’m sure, just the tip of the iceberg. Propaganda and control of the media have always been key tools to influence and manipulate the population. Political parties still use leaflets, posters and house-to-house canvassing, plus TV and radio advertisements, to supplement their online campaigns. These are not so much cyber as societal concerns involving information - very topical here with a general election looming.
  • Substantial or total shutdown or failure of GPS and the Internet are credible scenarios in the event of global conflict (cyber war or terrorism or whatever), with horrendous consequences. There are so many vulnerabilities in our IT systems that compromise on a massive scale is not just possible but highly likely, almost certain I’d say, rendering them untrustworthy. What happens if/when, despite all our efforts, the cyber controls plus the IT systems and networks fail – what then? What if, say, ISIS or Anonymous or a superpower holds the entire cyber economy to ransom, instead of just individual organizations? Continuity management has implications at personal, organizational, national and global levels.

    Categories: NoticeBored

    NBlog August 23 - Information Security outreach

    NoticeBored - Tue, 08/22/2017 - 9:14pm

    Further to yesterday's ISO27k Forum thread and blog piece, I've been contemplating the idea of extending the security awareness program into an "outreach" initiative for Information Security, or at least viewing it in that way. I have in mind a planned, systematic, proactive approach not just to spread the information risk and security gospel, but to forge stronger more productive working relationships throughout the organization, perhaps even beyond.  
    Virtually every interaction between anyone from Information Security and The Business is a relationship-enhancing opportunity: a chance to inform, exchange information in both directions, assist, guide, and generally build Information Security's credibility and brand. Doing so has the potential to:
    • Drive or enhance the corporate security culture through Information Security becoming increasingly respected, trusted, approachable, consulted, informed and most of all used, rather than being ignored, feared and shunned (the "No Department");
    • Improve understanding on all sides, such as identifying business initiatives, issues, concerns and demands for Information Security involvement early enough to specify, plan, resource and deliver the work at a sensible pace, rather than at the last possible moment with next to no available resources. It also means knowing when to back off and leave the business to its own devices if there are other more pressing demands, including situations where accepting information risks is necessary or appropriate for various business reasons;
    • Encourage and facilitate collaboration, cooperation and alignment around common goals;
    • Improve the productivity and effectiveness of Information Security by being more customer-oriented - always a concern with ivory-tower expert functions staffed by professionals who think they (OK, we!) know best;
    • Improve the management and treatment of information risks as a whole through better information security, supporting key business objectives such as being able to exploit business opportunities that would otherwise be too risky, while complying with applicable laws and regulations.

    Aside from the opportunity, there's also a relationship-harming risk too, if (when!) we get those interactions wrong - an information risk that can be treated in the conventional manner:
    • We can't totally avoid the risk, short of becoming isolated hermits which would render Information Security pointless and worthless;  
    • However, we could emphasize productive interactions and perhaps cut down on unproductive ones - a form of risk mitigation. We could also be more proactive in this area, for example making sure that Information Security people have the skills and aptitude to form and maintain productive relationships with the rest of the business, and the good sense to recognize and respond when things are not going well. Measuring the strength of its business relationships with various other functions or business units would help Information Security improve them systematically where appropriate, implying the value of relationship metrics;
    • We could share the risk by collaborating with other risk and assurance functions when interacting, especially the ones that have strong relationships throughout the business. We can learn from and support them, and vice versa. We might also share the risk with the general business by persuading general management that strong internal relationships with specialist functions are valuable assets, worth investing in (e.g. if you are thinking about employing security consultants or taking advice from vendors on security matters, come to us first: we may well be able to assist directly, or broker your supplier relationships);
    • We are forced to accept any remaining untreated information risk, like it or not ... but that's not the end of the story. In the event of relationship issues, we could put in place arrangements to deal with them as effectively and efficiently as possible - such as having escalation routes to management, perhaps even incident management or contingency plans in this area. The metrics I mentioned should give us early warning of impending problems, avoiding nasty surprises.
    All in all, I see a lot of upside potential, and the downsides can be managed. This idea looks like a winner to me. What do you think?