In short, any time a user is given more access to information or more capability to perform tasks than their work requires, there is a risk that needs to be contained. Sometimes, expanded information access and enhanced capabilities can empower people to do their jobs more effectively. Empowering users to succeed while reducing loss is always about finding the right balance. For example, there is no reason for employees to have access to other people's PII unless their job specifically requires such access.
Shadow IT is a term for computing equipment, software, and access that is unknown to the IT department. It is typically acquired and introduced outside of the organization's normal process. It may or may not be purchased through the use of organizational funds.
One example is people's choice of laptops. Some people prefer Macs to corporate PC-compatible systems, so they purchase a Mac directly and use it as their primary system. The problem is that these Macs are generally unknown to the IT department and will not be maintained per organizational standards. For example, if the Mac is lost, the IT department will have no ability to remotely erase the company data on it.
Shadow IT also includes software. Users add software to their personal and corporate devices that has not been vetted by the organization. Frequently, the organization has made a conscious choice not to use the software because of a variety of issues, including adherence to regulations, security requirements, and proper maintenance and patching. In one of the most notorious cases, Jared Kushner, advisor to President Trump, installed and used WhatsApp to communicate with foreign leaders. (See “Jared Kushner's Use of WhatsApp Raises Concerns Among Cybersecurity Experts,” CNN, www.cnn.com/2019/03/23/politics/kushner-whatsapp-concerns/index.html.) Using WhatsApp for such communications violated the law, because the app does not adhere to federal record-keeping requirements.
Additionally, while communications may be encrypted, there are a variety of security concerns. Shadow IT systems may not be patched properly or have updated anti-malware software, which puts the whole organization at risk. If the employee leaves the organization, nobody knows to collect the system or at least delete the organizational data on the system.
In one case we are familiar with, which is not uncommon for organizations, an employee was unhappy with the available Internet access bandwidth, as well as the fact that his access was both filtered and monitored, so he had a new Internet connection installed in a corporate office. This rogue connection bypassed the organization's security controls and created a backdoor for outside criminals.
Another example of Shadow IT is the use of online storage services, such as Box, Dropbox, and Google Drive. Users frequently turn to third-party services to perform their jobs and bypass obstacles. Some of these services might not have strong security. Either way, once information is placed on servers the organization is unaware of, the organization loses control of it.
Shadow IT includes aspects of both user enablement and design and maintenance. Because the infrastructure has to allow for rogue devices, no matter the source, it is a network maintenance issue. Because organizations are directing that IT departments allow users to bring their own personal devices to work, it is a form of user enablement. One typical example is that organizations want employees to use their own cell phones, saving the cost of purchasing phones for them. The organizations then create an infrastructure that allows employee cell phones to connect to the network and to access and store organizational information.
You do not need to categorize Shadow IT. You just need to understand it for the risk that it is and incorporate it into your strategy to mitigate UIL.
It is safe to say that everyone has looked at some document, computer screen, electronic or mechanical device, or general situation and found it confusing. Such was the case when even experienced pilots of the Boeing 737 MAX could not figure out that the computer had erroneous readings and was forcing the airplanes down.
Fortunately, there are few interfaces involving such drastic consequences. Even so, a great deal of loss can be attributed to well-meaning users who fail to properly interact with some system. This is often not the user's fault. It is an area where design, maintenance, and user enablement overlap.
Within the fields of psychology and mechanical engineering, there is a discipline called ergonomics, sometimes referred to as human factors. Within the computer field, a similar discipline is referred to as human-computer interaction (HCI). While these fields aim to optimize human interaction with systems, the net result is also to reduce loss, which sometimes even includes loss of life. We recommend that you look into these fields for additional guidance.
As you can see, how you configure work and computers to interact with users can have a substantial impact on loss. Understand that if you see otherwise intelligent and capable people initiating losses, the cause may very well be how they are required to interact with your systems.
To prevent and mitigate UIL, you have to acknowledge it for what it is: pervasive and natural to your organization. Some categories of loss lend themselves to preventing the potential attacks from reaching users. Some attacks are clearly intentional, and the only way to prevent them is to remove the capability of the user to cause the loss.
Sometimes awareness alone can mitigate a particular loss, but in all likelihood the loss will only be mitigated through a layered approach of countermeasures. This unfortunately goes against much of the current hype that users are your first and last line of protection and that awareness is a silver bullet that will stop user-related losses. Again, awareness is a tactic, and solving UIL requires a comprehensive strategy.
Ideally, you now understand that the nature of the problem is not necessarily that users make mistakes but that user actions can initiate loss in some form. This empowers you to know that users do not control the destiny of the organization. Instead, your job is to prevent users from taking potentially harmful actions and then to mitigate any resulting loss.
However, before we detail a holistic strategy, we need to set the foundation for that strategy. We have to ensure there is common knowledge, if for no other reason than to practice what we preach. While many of the disciplines covered in Part II appear unrelated, they all play a part in ensuring a comprehensive strategy.
People often mistakenly assume that “mitigating loss” means preventing all potential loss. That is impossible. There will always be some form of loss in operations. Perhaps one of the best definitions of risk is this one from ISO 27000:
Risk is the effect of uncertainty on objectives.
Similarly, we want to be careful about what we mean when we discuss “optimizing risk.” People generally believe that optimizing risk means minimizing it, spending whatever it takes to avoid as much risk as possible. However, trying to prevent all risk and loss might cost more than the actual loss you hope to mitigate.
What you are actually trying to do is manage the loss. The concept of balancing potential loss with the cost of mitigating it is called risk management.
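This balancing act can be expressed as simple arithmetic. One common illustration (not specific to any single standard) is the annualized loss expectancy (ALE) model: estimate how much one incident costs and how often it occurs per year, then compare the loss a countermeasure prevents against what the countermeasure costs. The dollar figures below are invented for illustration.

```python
# A minimal sketch of the cost/benefit arithmetic behind risk management.
# The ALE model and all dollar figures here are illustrative assumptions.

def annualized_loss_expectancy(single_loss_expectancy: float,
                               annual_rate_of_occurrence: float) -> float:
    """ALE = SLE x ARO: the expected loss per year from one risk."""
    return single_loss_expectancy * annual_rate_of_occurrence

def countermeasure_value(ale_before: float, ale_after: float,
                         annual_cost: float) -> float:
    """Net annual value of a countermeasure.

    A negative result means the countermeasure costs more than the
    loss it mitigates, which is exactly the trap described above.
    """
    return (ale_before - ale_after) - annual_cost

# Hypothetical example: phishing incidents cost about $10,000 each and
# occur about 4 times per year without additional controls.
ale_before = annualized_loss_expectancy(10_000, 4)   # $40,000/year
# A filtering service reduces incidents to about 1 per year
# and costs $25,000/year to operate.
ale_after = annualized_loss_expectancy(10_000, 1)    # $10,000/year

print(countermeasure_value(ale_before, ale_after, 25_000))  # 5000.0
```

If the same service cost $35,000 per year, the net value would be negative, and spending the money would be worse than accepting the loss. In practice the inputs are rough estimates, but the discipline of writing them down keeps mitigation spending proportional to the loss it actually prevents.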