Mike Bursell - Trust in Computer Systems and the Cloud



Learn to analyze and measure risk by exploring the nature of trust and its application to cybersecurity 
Trust in Computer Systems and the Cloud demonstrates the importance of understanding and quantifying risk, and draws on the social and computer sciences to explain hardware and software security, complex systems, and open source communities. It takes a detailed look at the impact of Confidential Computing on security, trust, and risk, and also describes the emerging concept of trust domains, which provide an alternative to standard layered security.
The book covers:

Foundational definitions of trust from sociology and other social sciences, how they evolved, and what modern concepts of trust mean to computer professionals

A comprehensive examination of the importance of systems, from open source communities to HSMs, TPMs, and Confidential Computing with TEEs

A thorough exploration of trust domains, including explorations of communities of practice, the centralization of control and policies, and monitoring

Perfect for security architects at the CISSP level or higher, Trust in Computer Systems and the Cloud is also an indispensable addition to the libraries of system architects, security system engineers, and master's students in software architecture and security.


Maybe because humans have a propensity towards anthropomorphism in order to allow them better to understand the systems with which they interact, though they are not consciously aware that the system is non-human

Because humans are interacting with a system that they are clear is non-human, but they find it easier to interact with it as if it had at least some human characteristics

Because humans have been deceived by intentionally applied techniques into believing that the system is human

By this stage, we have maybe stretched the standard use of the term anthropomorphism beyond its normal boundaries: normal usage would apply to humans ascribing human characteristics to obviously non-human entities. The danger we are addressing here goes beyond that, as we are also concerned with the possibility that humans may form trust relationships to non-human entities exactly because they believe them to be human: they just do not have the ability (easily) to discriminate between the real and the generated.

Identifying the Real Trustee

When security measures are put in place, who puts them there, and for what reason? This might seem like a simple question, but often it is not. In fact, more important than asking for what security measures are put in place is the question of for whom they are put in place. Ross Anderson and Tyler Moore are strong proponents of the study of security economics , 51 arguing that microeconomics and game theory are vital studies for those involved in IT security. 52 They are interested in questions such as the one we have just examined: where security measures—which will lead to what we termed behaviours —are put in place to benefit not the user interacting with the system but somebody else.

One example is Digital Rights Management (DRM). Much downloadable music or video media is “protected” from unauthorised use through the application of security technologies. The outcome of this is that people who download media that are DRM protected cannot copy them or play them on unapproved platforms or systems. This means, for example, that even if I have paid for access to a music track, I am unable to play it on a new laptop unless that laptop has approved software on it. What is more, the supplier from which I obtained the track can stop my previously authorised access to that track at any time (as long as I am online). How does this help me, the person interacting with the music via the application? The answer is that it does not help me at all but rather inconveniences me: the “protection” is for the provider of the music and/or the application. As Richard Harper points out, “trusting” a DRM system means trusting behaviour that enforces properties of the entity that commissioned it. 53 Is this extra protection, which is basically against me in that it reduces my ease of use, at least free to me? Of course not: I, and other users of the service, will end up absorbing this cost through my subscription, a one-off purchase price, or my watching of advertisements as part of the service. This is security economics , where the entity benefiting from the security is not the one paying for it.

When considering a DRM system, it may be fairly clear what actions it is performing. These may include:

Decrypting media ready for playing

Playing the media

Logging your usage

Reporting your usage
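The split between actions that serve the user and actions that serve the provider can be sketched as a simple model. This is purely illustrative: the class, action names, and beneficiary labels below are hypothetical and not drawn from any real DRM implementation, which would not expose its behaviour this transparently.

```python
# Hypothetical model of the four DRM actions listed above, annotated with
# the party each action chiefly benefits. Illustrative only.

from dataclasses import dataclass

@dataclass
class DrmAction:
    name: str
    beneficiary: str  # "user" or "provider" (a simplification)

DRM_ACTIONS = [
    DrmAction("decrypt_media", "user"),    # required before playback
    DrmAction("play_media", "user"),       # the outcome the user wants
    DrmAction("log_usage", "provider"),    # serves the provider's interests
    DrmAction("report_usage", "provider"), # telemetry back to the provider
]

def actions_in_my_interest(actions):
    """Return the names of declared actions that serve the user."""
    return [a.name for a in actions if a.beneficiary == "user"]

print(actions_in_my_interest(DRM_ACTIONS))  # decrypting and playing only
```

The point of the sketch is the asymmetry: only half of the declared actions are in the user's interest, yet, as the text notes, the user still ends up paying for all of them.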

According to our definition, we might still say that we have a trust relationship to the DRM software, and some of the actions it is performing are in my best interests—I do, after all, want to watch or listen to the media. If we think about assurances, then the trust relationship I have can still meet our definition. I have assurances of particular behaviours, and whether they are in my best interests or not, I know (let us say) what they are.

The issue gets murkier when I cannot necessarily discover what behaviour is happening, because if I cannot, then I have no way to know whether it is in my best interests or not. One might even expect that if behaviours are in my best interests, they would be disclosed to me as part of the description of the actions about which I am deciding to accept assurances. When I have significant concerns that there are behaviours explicitly against my interests, matters become more serious. A large-scale example of this is the trust relationship that governments need to have to critical national infrastructure. The exact definition of critical national infrastructure —often capitalised or abbreviated to CNI—varies between experts and countries but covers the collection of core hardware, software, and services that are key to keeping citizens safe and key elements of society functioning. A list might include the following:

Power generation

Water and sewerage

Basic transport networks

Emergency services

Healthcare

Location services (e.g., GPS)

Telecommunications

Internet access

For the purposes of many governments, the final two have become so intertwined that they can hardly be separated. What is noteworthy about telecommunications and core Internet capabilities is the small number of suppliers across the world. One of those is Huawei, which is based in the People's Republic of China. The government of the United States, whose relationship with the Chinese state and government can be characterised as one of rivalry, if not outright enmity, takes the view that given the nature of the ownership of Huawei, and its base in China, the telecommunications equipment that it manufactures and provides cannot be trusted.

This is a strong stance to take, and the concerns that are expressed are well-defined. The US government asserts that there is a real risk that a telecommunications equipment—and associated software—provider who is based within China may be under enough pressure from the Chinese government to include hidden features that could affect the confidentiality, integrity, or availability of services that are part of the United States' critical national infrastructure. If this were the case, it would allow communications that could be critical to the United States to be eavesdropped on or even tampered with by the Chinese government or those acting for it. The suggestion that the Chinese government would ever exert pressure to insert such capabilities—typically known as back doors —is strongly disputed by the Chinese and Huawei itself. However, to frame these concerns within our definition of trust relationships as well as from the point of view of the US government, there is insufficient assurance that the actions to be taken by such pieces of equipment are as expected and, therefore, the US government has taken the view that there should be no trust relationship formed with equipment that might be supplied by Huawei.

This is an extreme example, but when we see relationships of this type, where there are or may be actions that are hidden from us, it must be appropriate to say that we cannot have assurance and, therefore, should not label this as a proper trust relationship. In order to be adequately informed about entities and whether to form relationships to them, we need to have as much information about actions as possible before a trust relationship is formed, along with assurances about those actions. The problem with this is that one of the key sources of information about an entity is the entity itself, but we cannot trust any information that an entity provides about itself because, of course, we have no trust relationship to it to allow us to do so. This issue and how to mitigate it will be key as we move to deeper examinations about trust between computer systems and discussions around the topics of application programming interfaces (APIs) and open source software.
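The test described above—that a proper trust relationship requires every action the entity may take to be declared and covered by an assurance, and that suspected hidden actions preclude assurance—can be expressed as a minimal decision sketch. The function and parameter names are hypothetical, invented here to illustrate the definition; this is not a real trust protocol.

```python
# Minimal sketch of the trust test described above: trust is only
# appropriate when all declared actions carry assurances and there is
# no reason to suspect undeclared (hidden) behaviour. Illustrative only.

def can_form_trust_relationship(declared_actions, assured_actions,
                                may_have_hidden_actions):
    """Decide whether forming a trust relationship is appropriate."""
    if may_have_hidden_actions:
        # No assurance is possible over actions we cannot even see.
        return False
    # Every declared action must be covered by an assurance.
    return set(declared_actions) <= set(assured_actions)

# A DRM client that declares and assures all four of its actions: even
# if some actions are against my interests, the definition is met.
print(can_form_trust_relationship(
    {"decrypt", "play", "log", "report"},
    {"decrypt", "play", "log", "report"},
    may_have_hidden_actions=False,
))  # True

# Equipment suspected of hidden back doors: no assurance, no trust.
print(can_form_trust_relationship(
    {"route_traffic"}, {"route_traffic"},
    may_have_hidden_actions=True,
))  # False
```

The sketch captures the asymmetry in the chapter's two examples: the DRM case fails my interests but can still satisfy the definition of a trust relationship, whereas the CNI case fails the definition itself, because the possibility of hidden actions removes any basis for assurance.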
