The Promise and Peril of Trusted Computing in Governmental Systems
Presentation to the TRUST2008 Educational
Event
Prof. Clark Thomborson
4th March 2008
Lessig’s Taxonomy of Control
Four pairs of control dimensions:
• Legal vs. illegal (law)
• Moral vs. immoral (social norms)
• Inexpensive vs. expensive (the market)
• Easy vs. difficult (architecture)

Difficult to specify for transnational systems!
• Morality, legality, difficulty, and expense are different for each user.
• Undesired behaviour is different for each system owner.
For a well-designed system, undesired behaviour is illegal, immoral, difficult, and expensive.
Design Constraints
The TPM (trusted platform module) has already been designed, and is present in most modern PCs.
• A trusted PC has one or more users, one owner, and access to some remote-attestation services.
• The owner can examine and modify some, but not all, of the state and function of the trusted PC. Users have fewer privileges.
• The user or owner can direct the TPM to send a cryptographically signed message attesting to the boot state of the trusted computer.
• Users (and owners) can use a remote-attestation service to help them decide whether to trust a remote PC which has sent an attestation message.
Note: in this talk I am focussing solely on the remote-attestation feature. I believe this to be the most likely “design win”, although TPMs are also useful for local attestation to a configuration manager.
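As a rough illustration of remote attestation, the flow can be sketched as below. This is not the TCG protocol: a real TPM extends PCRs in hardware and signs quotes with an asymmetric attestation identity key (AIK), whereas this sketch stands in an HMAC with a hypothetical shared key.

```python
import hashlib
import hmac

# Hypothetical shared key standing in for the TPM's attestation identity
# key. A real TPM uses an asymmetric AIK signature, not an HMAC.
AIK_SECRET = b"demo-attestation-key"

def measure_boot_chain(components):
    """Extend a hash chain over the boot components, like a PCR extend."""
    pcr = b"\x00" * 32
    for blob in components:
        pcr = hashlib.sha256(pcr + hashlib.sha256(blob).digest()).digest()
    return pcr

def attest(components, nonce):
    """Produce a signed message attesting to the current boot state."""
    pcr = measure_boot_chain(components)
    tag = hmac.new(AIK_SECRET, pcr + nonce, hashlib.sha256).hexdigest()
    return pcr, tag

def verify(pcr, nonce, tag, expected_pcr):
    """The verifier checks the signature AND that the measurement matches
    a known-good boot state: the measurement identifies a configuration,
    it does not by itself assure the platform's properties."""
    expected_tag = hmac.new(AIK_SECRET, pcr + nonce, hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected_tag) and pcr == expected_pcr

boot = [b"bios-1.2", b"bootloader-3.4", b"kernel-5.6"]
nonce = b"verifier-nonce-001"   # a fresh nonce prevents replay of old quotes
pcr, tag = attest(boot, nonce)
print(verify(pcr, nonce, tag, measure_boot_chain(boot)))  # True
```

Note how changing any component changes the whole measurement: the verifier recognises one exact boot state, which is why friend/foe recognition across a wide variety of boot states is hard.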
What is Trust?
This is a very complex subject! There are many competing definitions.
• My primary inspiration is Niklas Luhmann’s 1973 monograph, Trust: A Mechanism for the Reduction of Social Complexity.
• Luhmann’s theory of trust is based on his analysis of communication in social systems. If you decide to interact with any system which you don’t completely understand, then you are trusting this system.
• Trust is increasingly necessary, as our systems become more complex.
• By Luhmann’s definition, every useful computer is a “trusted” computer!
• Our goal in this talk is to explore the trustworthiness of trusted computers, when they are used in governmental applications.
Who Will Buy Them?
Who would own the trusted computers in a governmental application?
• 192 member states of the UN
• 10^4 agencies of national governments
• 10^6 agencies of municipalities, counties, provinces, and states
• 10^8 employee-users, 10^10 resident- or citizen-users
• Is there any agreement among these potential owners on what behaviour (by their users) should be prevented, and what should be made easy?
• Can we afford to design more than a few trusted systems?
• Could different trusted systems interoperate, if their owners don’t agree on what is “undesirable”?
• When states go to war with each other, should their trusted computers be allies or neutrals? (When a civilian user is drafted, should their TC become militarised?)

This is a daunting problem. No wonder progress has been so slow! I won’t pretend to solve it, but I will suggest a way in which we might approach a solution. From the top...
Functional and Non-Functional Requirements on Trustworthy Systems
• Rules: prohibitions and permissions, defining what a user can and can’t do.
• Assurances:
  • Obligations and exemptions: defining what a user has promised to do.
  • Entitlements and disqualifications: defining what another party (e.g. a governmental agency) has promised to the user.
• Services: requests and provisions, defining how an obligation or entitlement can be fulfilled.
• Interventions: controls and observations, defining how a system can enforce all rules, regulate all assurances, and quality-assure all services.

At this level of generality, all systems are identical. The devil is in the details! Let’s look at a few use and misuse cases.
Observations
A trusted computer can send a message describing its boot state.
• Use: an attestation service can assure a remote owner that their platform has not been modified unexpectedly.
• Misuse: an attestation service might reveal, to an attacker, information about the patching level of a sensitive computer.
• Should attestation messages cross national boundaries, i.e. should every country have its own attestation service? Should attestation services be federated?
• Challenges:
  • Accurate and efficient friend/foe recognition of TCs, despite a wide variety of boot states. (A TPM’s measurement of its platform is a semi-unique identifier, not an assurance of its properties.)
  • Developing legal, ethical, economic, and technical structures for an attestation service which spans national boundaries, or has a diverse cultural, economic, or technical constituency.
  • Developing legal intercepts for trusted computers, or obtaining international agreement that trusted computers should not be subject to search without the consent of the owner.
Obligations
A user might sign a tax form using a trusted computer, thereby undertaking an obligation to pay tax by a certain date.
• Use case: a trusted computer could be an enforcement agent, making it easy for its user to fulfil all obligations, and difficult to avoid fulfilling them. (Governments would still have traditional legal, moral, and economic controls.)
• Misuse cases: the owner of the computer is a “piggy in the middle”, and could be accused of fraud or be the victim of fraud. Any technical enforcement will have false-positive or false-negative errors relative to any legal system, and if these errors are predictable then they will be exploited.
• Challenges:
  • Providing trustworthy platforms to users, not just to owners.
  • Developing a legal structure whereby the owner of a trusted platform has a fiduciary obligation to its users.
  • Developing legal, social, ethical, and technical systems which allow obligations to be communicated (in a trustworthy fashion) across jurisdictional boundaries.
A Path Forward?
• Develop a set of use and misuse cases which are technically feasible for trusted computers in a specific but simple governmental setting.
• Answer the following questions:
  • Do all owners agree that all the use cases are desirable?
  • Are all the misuse cases undesirable to owners?
  • Would the system be easier to use, or less expensive, than a similar system without TPMs?
  • Would some users feel that a similar system without TPMs would be less acceptable, either legally or ethically?
• If all answers are “yes”, then repeat with a more complex system.
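The loop above can be encoded as a simple feasibility check; the question wording and the function name below are illustrative only.

```python
# A hypothetical encoding of the iterative evaluation: start with a simple
# setting, ask the four questions of its owners and users, and move to a
# more complex system only when every answer is "yes".

QUESTIONS = [
    "all owners agree that all the use cases are desirable",
    "all the misuse cases are undesirable to owners",
    "the system is easier to use, or less expensive, than one without TPMs",
    "some users find a similar system without TPMs less acceptable",
]

def evaluate(answers):
    """Return (proceed?, list of failed questions) for one setting."""
    failed = [q for q in QUESTIONS if not answers.get(q, False)]
    return len(failed) == 0, failed

# Illustrative data only: a single-agency pilot passing every check.
pilot = {q: True for q in QUESTIONS}
proceed, failed = evaluate(pilot)
print(proceed)  # True: repeat the analysis with a more complex system
```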
Measuring Complexity
Candidate terms in a product metric:
• Breadth of integration: (1) within a single agency, (2) intragovernmental, (3) e-government, (4) government–commercial, (5) transnational.
• Legal diversity (for transnational systems): number of different legal traditions among the attestation service providers.
• Moral diversity: number of different religions or value systems among the attestation service providers.
• Economic diversity: (1) all users are educated professionals, (2) all users own a Vista or other TPM-enabled PC OS, (3) users must possess (not necessarily own) a TPM device such as a cellphone.

Some low-complexity systems have been developed, e.g. e-government for a single country as an optional mechanism for service delivery (3×1×1×1 = 3). Will any low-complexity system be a “design win” for TPMs? The cost of development is huge, but the “network effect” seems likely to outweigh the development cost for a sufficiently large system. Where is the tipping point? Are there any sufficiently large markets for low-complexity systems?
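The product metric above can be computed directly; `tc_complexity` is a hypothetical name, and the scale values are the ones proposed on this slide.

```python
# A sketch of the candidate product metric:
# complexity = breadth x legal diversity x moral diversity x economic tier.

def tc_complexity(breadth, legal_diversity, moral_diversity, economic_tier):
    """Estimate system complexity as the product of the four terms."""
    return breadth * legal_diversity * moral_diversity * economic_tier

# The slide's example: single-country e-government as an optional
# service-delivery mechanism, one legal tradition, one value system.
print(tc_complexity(3, 1, 1, 1))  # 3

# An assumed transnational system spanning three legal traditions and
# two value systems, with TPM-enabled PC owners as users.
print(tc_complexity(5, 3, 2, 2))  # 60
```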
Steps Toward a High-Complexity Trusted Computer System

Members of my research group are investigating the similarities and differences in governmental identification systems in various countries.

Initial findings: very different expectations for privacy.
• Pakistan (PhD student Zulfiqar Ahmad): strong support for privacy in the Koran. A good person will enter someone else’s house by the “front door”, and not sneak in by a “back door”. This seems to be a personal obligation, not a governmentally enforced one, but Pakistan is moving to harmonise its governmental systems to comply with EU privacy directives.
• New Zealand (ME student Yu-Cheng Tu): has a Privacy Act, and is developing a semi-anonymised government login service. A user who identifies themselves to one governmental agency may obtain and maintain a distinct identity with another governmental agency.
• Thailand (PhD student Pita Jarupunphol): Theravada Buddhism offers little apparent support for privacy of personal data. The privacy debate in Thailand, to date, has been based on European value systems.
Research goal: discover congruences and disparities between digital identities in these countries, which would enable or prevent “exports” and “imports” of digital-identity information.
FRG’s BSI and the TCG
• First round: 2003–4. Second round: BSI presentation to the TCG, September 2007.
• The BSI “appreciates and supports an increased level of IT security provided by the deployment of Trusted Computing solutions on IT platforms of enterprises, public administrations and citizens on the basis of TCG's specifications. The German Federal Government promotes and actively participates in this process.” ... but certain key requirements must be met.
• Source: F. Samson of the BSI, via W. Pincott of NZ’s SSC.
Germany’s Requirements
• Availability of the specifications
• Open standards
• Freedom of research
• Interoperability
• Transparency
• Certification
• National IT industry
• Freedom of choice
• Guarantee of IT security
• Availability of critical infrastructures
• Protection of digital works
• Data protection
• Standardisation
• International cooperation

Most of these are prohibitions or obligations on the suppliers of TC systems; some are TC system specs, but many more are needed for a governmental TC system.
New Zealand’s Principles for DRM and TC in e-Government
In 2006, the New Zealand Parliament adopted a set of principles and policies for the use of trusted computing and DRM in its operations.
http://www.e.govt.nz/policy/tc-and-drm/principles-policies-06/tc-drm-0906.pdf
1. “For as long as it has any business or statutory requirements to do so, government must be able to: use the information it owns/holds; provide access to its information to others, when they are entitled to access it.”
• I think other sovereign governments will require a similar assurance.
• All governmental documents must have high availability and integrity: ECM.
• Some governmental documents are highly confidential, requiring special-purpose DRM systems under single-agency control.
• Key control is a primary vulnerability in ECM (and in DRM). To mitigate this vulnerability, I would advise governmental agencies to separate their key-management systems from their ECM systems.
• System interfaces should conform to an open standard, so that there can be secondary vendors and a feasible transition plan to a secondary vendor.
NZ e-Government Principle #2
2. “Government use of trusted computing and digital rights management technologies must not compromise the privacy rights accorded to individuals who use government systems, or about whom the government holds information.”
• I have some trouble with this principle: it seems to ask too much of the technology. We can’t build a usable system which would prohibit all possible compromises of privacy rights!
• Can any computer system satisfy our diverse requirements for privacy, law enforcement, and national security?
• Governments should avoid purchasing systems which make it easier for users to cause privacy compromises.
• Governments should avoid purchasing systems which make it harder for users to prevent, detect, or respond to privacy compromises.
NZ e-Government Principle #3
3. “The use of trusted computing and digital rights management technologies must not endanger the integrity of government-held information, or the privacy of personal information, by permitting information to enter or leave government systems, or be amended while within them, without prior government awareness and explicit consent.”
• All sovereign governments (and all corporations) would have similar requirements for high integrity, for confidentiality control at some appropriate perimeter, and for interventions (to control and observe).
• Technical analysis: these requirements cannot be achieved with a closed-source DRM system on an unauditable TC platform.
• This is ECM (enterprise content management): documents entering a governmental (or corporate) security boundary must be “owned” by the receiving agency, so they can be fully managed by a local rights server. This is not DRM.
• Strong controls (e.g. a manager’s explicit consent) should be placed on any individual’s importation of non-owned documents and objects. Each importation is a threat to system confidentiality, and imported documents may be subject to amendment.
NZ e-Government Principle #4
4. “The security of government systems and information must not be undermined by use of trusted computing and digital rights management technologies.”
• Each principle is supported by policies. In this case: “Agencies will reject the use of TC/DRM mechanisms, and information encumbered with externally imposed digital restrictions, unless they are able to satisfy themselves that the communications and information are free of harmful content, such as worms and viruses.”
• Is this the “killer app” for the NZ principles? This requirement is surprisingly difficult to achieve with current TC/DRM technology. It is much easier in TC/ECM.
• I believe the NZ e-Government unit has rendered an important service to the international community by identifying and publicising this security issue with DRM. I think other governments will give similar advice to their agencies. Why not incorporate this into an ISO standard?
Why Malware Scans are Problematic in TC/DRM

• An infected document may have been encrypted at a time when its malware payload was not detectable. An infected document may be opened at any time in the future.
• Non-owned documents can only be scanned on computers that are trusted by the licensor. In DRM, document access is controlled by the licensor, even when the licensee possesses a digital copy.
• A difficult requirement: malware scanners need efficient and unrestricted access to the DRM keystores. This will be expensive, and may not be allowed by the license contract.
• License contracts might contain suitable assurances against malware, to mitigate the risk of accessing an unscanned document. I would advise corporations against signing DRM license agreements which indemnify licensors against loss due to malware in the licensed object.
Malware Scans are Easier in TC/ECM
• Owned documents can be scanned for malware on any computer platform that is owned (and trusted) by the recipient. In ECM systems, all documents are controlled by the recipient.
• Recipient computers can have full administrative rights over the document. The donor trusts the recipient to observe the terms of their ECM contractual agreement. The recipient may be under an obligation to maintain a tamper-evident audit of accesses.
• There can be no requirement on the recipient to obtain donor permission to open a document; this would be DRM.
• The donor’s trust allows the recipient to run efficient, offline, and effective malware scans. The trustworthy recipient will not abuse this trust.
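The contrast between the two regimes can be reduced to a simple decision rule; the dictionary keys (`regime`, `licensor_grants_keystore_access`) are invented for this sketch, and real DRM and ECM deployments are far more involved.

```python
# Illustrative decision rule: when can a recipient scan a document?

def can_scan_offline(doc):
    """Can the recipient run an efficient offline malware scan?"""
    if doc["regime"] == "ECM":
        # Owned document: the recipient controls the keys and has full
        # administrative rights, so it can scan on any platform it trusts.
        return True
    # DRM: access is mediated by the licensor, so a scan is possible only
    # if the license grants unrestricted access to the DRM keystores.
    return doc.get("licensor_grants_keystore_access", False)

ecm_doc = {"regime": "ECM", "owner": "receiving agency"}
drm_doc = {"regime": "DRM", "licensor_grants_keystore_access": False}
print(can_scan_offline(ecm_doc))  # True
print(can_scan_offline(drm_doc))  # False
```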
Pulling It All Together...
• Controls: ethical, legal, economic, technical.
• Specifications:
  • Rules: prohibitions and permissions, defining what a user can and can’t do.
  • Assurances:
    • Obligations and exemptions: defining what a user has promised to do.
    • Entitlements and disqualifications: defining what another party (e.g. a governmental agency) has promised to the user.
  • Services: requests and provisions, defining how an obligation or entitlement can be fulfilled.
  • Interventions: controls and observations, defining how a system can enforce all rules, regulate all assurances, and quality-assure all services.
• Multilevel security: users, owners, attesters.
Are you Ready to Go?
• Functional and security analysis: use cases and misuse cases. There are also “confuse cases”, where a user causes inadvertent damage, e.g. by sending email carelessly.
• Governmental requirements and principles can help a technologist decide whether a particular series of actions (or inactions) by a user would be a use or a misuse.
• We should analyse simple, real-world applications before we attempt anything more difficult. We won’t learn much by analysing systems with generic requirements such as “confidentiality”, “privacy”, “integrity”, etc.
• The complexity of a governmental TC system might be estimated by Scope × Legal diversity × Ethical diversity × Economic diversity.
• Let’s develop use and misuse cases for ... a TC system that mediates a healthcare entitlement...