Cover
Title Page
You CAN Stop Stupid: Stopping Losses from Accidental and Malicious Actions
Ira Winkler
Dr. Tracy Celaya Brown
Introduction
We believe that the title of a book is perhaps its most critical characteristic. We acknowledge that the title, You Can Stop Stupid, is controversial. We considered other possible titles, such as Stopping Human Attacks, but such a title does not convey the essence of this book. Although we do intend to stop attacks that target your users, the same methodology will also stop attacks by malicious insiders, as well as accidents. The underlying problem is not that users are the targets of attacks or that they accidentally or maliciously create damage, but that users have the ability to make decisions or take actions that inevitably lead to damage. That is the fundamental issue this book addresses, and it makes a critical distinction: the problem lies not necessarily in the user, but in the environment surrounding the people performing operational functions.
What Is Stupid?
Managers, security specialists, IT staff, and other professionals often complain that employees, customers, and users are stupid. But what is "stupid"? The definition of "stupid" is having or showing a great lack of intelligence or common sense.
First, let's examine the attribute of showing a great lack of intelligence. When your organization hires and reviews people, you generally assess whether they have the requisite intelligence to perform the required duties. If you hired or retained an employee knowing that they lacked the necessary intelligence to do the job, who is actually stupid in this scenario: the employee or the employer?
As for a person who shows a great lack of common sense, there is a critical psychological principle at work: you cannot have common sense without common knowledge. Therefore, someone who seems stupid for demonstrating a great lack of common sense is likely suffering from a lack of common knowledge. Who is responsible for ensuring that the person has such common knowledge? That responsibility belongs to the people who place or retain people in positions within the organization.
In general, don't accuse someone in your organization of being stupid. Instead, identify and correct your own failings in employment and training practices, as well as in the processes and technologies that enable the "stupidity."
Do You Create Stupidity?
When people talk about employee, customer, and other user stupidity, they are often thinking of the actions those users take that cause damage to your organization. In this book, we refer to that as user-initiated loss (UIL). The simple fact is that a user can't initiate loss unless an organization creates an environment that puts them in a position to do so.
While organizations do have to empower employees, customers, and other users to perform their tasks, in most environments little thought is given to proactively reducing UIL. It is to be expected that users will make mistakes, fall for tricks, or purposefully intend to cause damage. An organization needs to account for this in how it specifies business practices and technological environments in order to reduce the potential for user-initiated loss.
Even if you reduce the likelihood that people will cause harm, you cannot eliminate every possibility. There is no such thing as perfect security, so it is folly to rely completely on prevention. For that reason, wise organizations also embed controls to detect and reduce damage throughout their business processes.
How Smart Organizations Become Smart
Not All Industries Are as Smart
Deserve More
Reader Support for This Book
Part I: Stopping Stupid Is Your Job
1 Failure: The Most Common Option
    History Is Not on the Users' Side
    Today's Common Approach
    We Propose a Strategy, Not Tactics
2 Users Are Part of the System
    Understanding Users' Role in the System
    Users Aren't Perfect
    "Users" Refers to Anyone in Any Function
    Malice Is an Option
    What You Should Expect from Users
3 What Is User-Initiated Loss?
    Processes
    Culture
    Physical Losses
    Crime
    User Error
    Inadequate Training
    Technology Implementation
    UIL Is Pervasive
Part II: Foundational Concepts
4 Risk Management
    Death by 1,000 Cuts
    The Risk Equation
    Risk Optimization
    Risk and User-Initiated Loss
5 The Problems with Awareness Efforts
    Awareness Programs Can Be Extremely Valuable
    Check-the-Box Mentality
    Training vs. Awareness
    The Compliance Budget
    Shoulds vs. Musts
    When It's Okay to Blame the User
    Awareness Programs Do Not Always Translate into Practice
    Structural Failings of Awareness Programs
    Further Considerations
6 Protection, Detection, and Reaction
    Conceptual Overview
    Protection
    Detection
    Reaction
    Putting It All Together
7 Lessons from Safety Science
    The Limitations of Old-School Safety Science
    Most UIL Prevention Programs Are Old-School
    The New School of Safety Science
    Putting Safety Science to Use
    Safety Culture
    The Need to Not Remove All Errors
    When to Blame Users
    We Need to Learn from Safety Science
8 Applied Behavioral Science
    The ABCs of Behavioral Science
    Engineering Behavior vs. Influencing Behavior
9 Security Culture and Behavior
    ABCs of Culture
    Types of Cultures
    Subcultures
    What Is Your Culture?
    Improving Culture
    Behavioral Change Strategies
    Is Culture Your Ally?
10 User Metrics
    The Importance of Metrics
    The Hidden Cost of Awareness
    Types of Awareness Metrics
    Day 0 Metrics
    Deserve More
11 The Kill Chain
    Kill Chain Principles
    Deconstructing the Cyber Kill Chain
    Other Models and Frameworks
    Applying Kill Chains to UIL
12 Total Quality Management Revisited
    TQM: In Search of Excellence
    Other Frameworks
    COVID-19 Remote Workforce Process Activated
    Applying Quality Principles
Part III: Countermeasures
13 Governance
    Defining the Scope of Governance for Our Purposes
    Traditional Governance
    Security and the Business
    Analyzing Processes
    Grandma's House
14 Technical Countermeasures
    Personnel Countermeasures
    Physical Countermeasures
    Operational Countermeasures
    Cybersecurity Countermeasures
    Nothing Is Perfect
    Putting It All Together
15 Creating Effective Awareness Programs
    What Is Effective Awareness?
    Governance as the Focus
    Where Awareness Strategically Fits in the Organization
    The Goal of Awareness Programs
    Changing Culture
    Defining Subcultures
    Interdepartmental Cooperation
    The Core of All Awareness Efforts
    Metrics
    Gamification
    Getting Management's Support
    Enforcement
    Experiment
Part IV: Applying Boom
16 Start with Boom
    What Are the Actions That Initiate UIL?
    Metrics
    Governance
    Awareness
    Feeding the Cycle
    Stopping Boom
17 Right of Boom
    Repeat as Necessary
    What Does Loss Initiation Look Like?
    What Are the Potential Losses?
    Preventing the Loss
    Detecting the Loss
    Mitigating the Loss
    Determining Where to Mitigate
    Avoiding Analysis Paralysis
    Your Last Line of Defense
18 Preventing Boom
    Why Are We Here?
    Reverse Engineering
    Step-by-Step
19 Determining the Most Effective Countermeasures
    Early Prevention vs. Response
    Start with Governance
    Prioritize Potential Loss
    Define Governance Thoroughly
    Matrix Technical Countermeasures
    Define Awareness
    It's Just a Start
20 Implementation Considerations
    You've Got Issues
    Business Case for a Human Security Officer
    It Won't Be Easy
21 If You Have Stupid Users, You Have a Stupid System
    A User Should Never Surprise You
    Perform Some More Research
    Start Somewhere
    Take Day Zero Metrics
    UIL Mitigation Is a Living Process
    Grow from Success
    The Users Are Your Canary in the Mine
Index
End User License Agreement
List of Tables
Chapter 8, Table 8.1: E-TIP Table Example
Chapter 11, Table 11.1: Kill Chain Comparison
Chapter 15, Table 15.1: Quarterly Plan
List of Illustrations
Chapter 4, Figure 4.1: The risk equation
Chapter 4, Figure 4.2: Cost of countermeasures compared to vulnerabilities
Chapter 4, Figure 4.3: The risk optimization point