Hacking Exposed
Campbell Murray
Encription Limited
Your Instructor:
Campbell Murray
Tiger Scheme Senior Tester
CISSP / CEH
CHECK Team Leader
The Nature of the Beast
“Your company will become a designated target of Information Warfare. If not
yesterday or today then definitely tomorrow.”
-Winn Schwartau, Information Warfare, Thunder’s Mouth Press, 1996.
“95% of organizations that lose their corporate data fail within 18 months.”
-Sunday Times, Computer Security, 1991.
• Hostile IW Forces
– Could be funded by hostile nations or terrorists.
• The effects of an attack on infrastructure may be limited or widespread. For example:
– An AT&T crew removing an old cable in Newark, New Jersey, accidentally severed a fiber-optic
cable carrying more than 100,000 calls. Starting at 9:30am on January 4, 1991, and continuing for
much of the next day, the effects included
• downtime of the New York Mercantile Exchange and several commodities exchanges;
• disruption of Federal Aviation Administration (FAA) air-control communication in the New York metropolitan
area;
• lengthy flight delays into, and out of, the New York area;
• blockage of 60 percent of the long-distance telephone calls into, and out of, New York City.
The CIA Model of Security
We can define security in terms of three concepts:
• Confidentiality
– Information must not be disclosed to anyone who is not authorized to access it.
• Integrity
– The system must not corrupt the information or allow any unauthorized
malicious or accidental changes to it. There are three aspects to integrity:
authorized actions, separation and protection of resources, and error detection
and correction.
• Availability
– In order for the system to be usable the services provided by the system must be
present in an obtainable form. Aspects of availability include: presence of object
or service in a usable form, capacity to meet service needs, adequate
time/timeliness of service.
The DDPRR Model of Security
We can also define information security in terms of:
• Deter
– To create and implement policies that allow us to generate a feasible and
believable deterrence.
• Detect
– To create and implement policies that allow us to detect when, where and how
an intrusion has taken place.
• Protect
– To create and implement policies and procedures that allow us to manage the
people and the information system in an effective manner so as to protect the
information system from unauthorized usage.
• React
– To create and implement procedures and policies that define how we react to an
intrusion in order to ensure that the penetration does not happen again, and
that the vulnerability used to gain access to the system is eliminated.
• Recover
– Recover all data and programs from a breach in security.
Attack Sophistication vs. Intruder Technical
Knowledge
• Web Exploits
– SQLi / XSS / XST / configuration errors /
file uploads etc
• Denial of Service
[Diagram: the hacker connects via a cloned mobile phone to an Internet Service Provider, then through intermediate links and several previously subverted systems to the target network.]
[Diagram: attack methodology. Motivational factors, open source intelligence and physical intelligence feed target selection; vulnerability assessment, vulnerability identification and intelligence management lead to a target decision; choice criteria and target topology then drive vulnerability deployment and target penetration.]
[Diagram: the threat matrix. The threat model and threat matrix relate the security policy, vulnerability assessment and known exploits to assurance of service: integrity, availability, confidentiality and non-repudiation.]
Risk Analysis
Reasons to perform risk analysis
• Improve Awareness.
– Discussing issues of security can raise the general level of interest and concern among employees.
• Identify Assets, Vulnerabilities and Controls.
– Some companies are unaware of their computing assets and the vulnerabilities associated with those
assets. A systematic analysis produces a comprehensive list of assets and risks.
• Improve Basis for Decisions.
– Controls reduce productivity through increased overheads and inconvenience to users. However some
risks are so great that they warrant the use of strict controls.
• Justify Expenditures for Security.
– Some security mechanisms are very expensive without an obvious benefit. A risk analysis can help to
identify instances that are worth the expense of a major security mechanism.
• Promote Awareness.
– Continually educate users and others on risk and related policies
Steps in Risk Analysis
1. Identify assets.
2. Determine vulnerabilities and threats.
3. Estimate likelihood of exploitation.
4. Compute expected annual loss.
5. Survey applicable controls and their costs.
6. Project annual savings of control
Identify Assets
The first step of a risk analysis is to identify the assets of the information system.
• Hardware:
– These include central processors, boards, keyboards, monitors, terminals, microcomputers,
workstations, tape drives, printers, disk drives, cables, connections, communications controllers, etc.
• Software:
– These include source programs, object programs, purchased programs, operating systems, system
programs, etc.
• Data:
– These include data used during execution, stored data on magnetic media, printed data, audit logs, etc.
• People:
– These include people needed to run the computing systems or specific programs, etc.
• Documentation:
– These include documentation on programs, hardware, systems and the entire system.
• Supplies:
– These include buildings, paper, forms, laser cartridges, magnetic media, etc.
Identify Vulnerabilities of Assets
Asset          Vulnerabilities recorded (secrecy / integrity / availability)
Hardware       Overloaded, failed, destroyed, stolen, tampered with, unavailable
People         Unavailable
Documentation  Unavailable
Supplies       Unavailable
Compute Expected Costs
• The following questions can lead to an analysis of the ramifications of a computer
security failure.
– What legal obligations are there to preserve confidentiality or integrity of data ?
– Could release of this data cause a person or organisation harm ?
– Could unauthorized access to this data cause loss of future business opportunity ?
– What is the psychological effect of the lack of computer service ?
– What is the value of access to data or programs ?
– What problems could arise from loss of security, and what would it cost to fix ?
– How much would a replacement cost ?
For example: How do we compare two events if the first event, EventA, costs $10,000 per incident
and occurs 3 times a year, and the second event, EventB, costs $60,000 per incident and occurs
once every 2 years (0.5 times a year)? Cost(EventA) = $10,000 × 3 = $30,000 per year and
Cost(EventB) = $60,000 × 0.5 = $30,000 per year.
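The comparison above is simply single-loss cost multiplied by the annual rate of occurrence. A minimal Python sketch, using only the hypothetical EventA and EventB figures from the example:

# A minimal sketch of comparing expected annual losses for two events.
# The figures are the hypothetical EventA/EventB values from the example above.

def annual_loss_expectancy(cost_per_incident: float, incidents_per_year: float) -> float:
    """Expected annual loss = single-loss cost x annual rate of occurrence."""
    return cost_per_incident * incidents_per_year

event_a = annual_loss_expectancy(10_000, 3)     # $10,000 per incident, 3 times a year
event_b = annual_loss_expectancy(60_000, 0.5)   # $60,000 per incident, once every 2 years

print(f"EventA: ${event_a:,.0f} per year")      # $30,000 per year
print(f"EventB: ${event_b:,.0f} per year")      # $30,000 per year
print("Equal priority" if event_a == event_b else "Prioritise the larger")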
Vulnerability
Question: What do we mean by vulnerability when talking
about security ?
Answer: A vulnerability to a system can be defined as:
• A point where a system is susceptible to attack.
– (See Michel E. Kabay, Enterprise Security: Protecting Information Assets,
McGraw-Hill, 1996).
• A weakness in the security system that might be exploited to cause harm or loss.
– (See Charles P. Pfleeger, Security in Computing, Addison Wesley, 1997)
• Weakness in an information system, system security procedures, internal controls, or
implementation that could be exploited or triggered by a threat source.
– ISO 27001
Vulnerabilities
• Physical Vulnerabilities
– Intruders can break into your computing facilities. Once in they can sabotage and vandalise your
computers, and they can steal hardware, diskettes, printouts etc.
• Natural Vulnerabilities
– Computers are very vulnerable to natural disasters and to environmental threats. Disasters such as fire,
flood, earthquakes and power loss can wreck your computer and destroy data.
• Hardware/Software Vulnerabilities
– Certain kinds of hardware and software failures can compromise the security of a computer system.
Software failures of any kind may cause your system to fail, and may open up your system to penetration,
or make your system so unreliable that it can’t be trusted.
• Media Vulnerabilities
– Disk packs and tapes can be stolen or damaged by such mundane perils as dust and ballpoint pens.
• Emanation Vulnerabilities
– All electronic equipment emits radiation that can be intercepted.
• Communication Vulnerabilities
– If your computer’s attached to a network then its messages can be intercepted, and possibly modified or
misrouted.
• Human Vulnerabilities
– The people who administer and use your computer facilities represent the greatest vulnerability of all.
Computing Systems Vulnerabilities
[Diagram: threats to the computer system’s hardware, software and data: interruption (denial of service, loss, deletion), interception (theft), modification and fabrication.]
Types of Computer Misuse
Mode / Misuse Type
External Misuse
1. Visual Spying: Observation of keystrokes or screen.
2. Misrepresentation: Deceiving operators and users.
3. Physical Scavenging: Dumpster-diving for printout.
Hardware Misuse
4. Logical Scavenging: Examining discarded/stolen media.
5. Eavesdropping: Intercepting electronic or other data.
6. Interference: Jamming, electronic or otherwise.
7. Physical Attack: Damaging or modifying equipment or power.
8. Physical Removal: Removing equipment and storage media.
Types of Computer Misuse
Mode / Misuse Type
Masquerading
9. Impersonation: Using false identities external to the computer system.
10. Piggybacking attacks: Usurping communication lines, workstations.
11. Spoofing attacks: Using playback, creating bogus nodes and systems.
12. Network weaving: Masking physical whereabouts or routing.
Pest Programs
13. Trojan Horse attacks: Implanting malicious code, sending letter bombs.
14. Logic Bombs: Setting up time or event bombs (a form of Trojan horse).
15. Malevolent worms: Acquiring distributed resources (e.g. rabbits and bacteria).
16. Virus attacks: Attaching to programs and replicating.
Types of Computer Misuse
Mode / Misuse Type
Bypasses
17. Trapdoor attacks: Utilizing existing flaws in the system.
18. Authorization attacks: Password cracking etc.
Active Misuse
19. Basic active attack: Creating, modifying, entering false or misleading data.
20. Incremental attack: Using salami attacks.
21. Denial of Service: Perpetrating saturation attacks.
Passive Misuse
22. Browsing: Making random and selective searches.
23. Inference, aggregation: Exploiting database inferences and traffic analysis.
24. Covert Channels: Exploiting covert channels or other data leakage.
External Misuse
• Generally non-technological and unobserved, external misuse is physically
removed from computer and communications facilities.
• It has no direct observable effects on the systems and is usually
undetectable by the computer security systems.
• Types of external misuse include:
– Visual spying. For example: remote observation of typed key strokes or screen
images.
– Physical scavenging. For example: collection of waste paper or other externally
accessible computer media such as discards - so called Dumpster Diving.
– Deception. Various forms of deception external to computer systems and
telecommunications. For example misrepresentation of oneself or of reality - so
called social engineering.
Hardware Misuse
There are two types of hardware misuse: passive and active.
• Passive Hardware Misuse. This tends to have no immediate side effect on hardware or
software behaviour, and includes
– logical scavenging (such as the examination of discarded computer media),
– electronic or other types of eavesdropping that intercept signals, generally unbeknownst to
the victims, for example picking up emanations (see tempest standard),
– planting a spy-tap device in a terminal, workstation or mainframe, or other hardware sub-
system.
• Active Hardware Misuse. This generally has noticeable effects and includes:
– Theft of computing equipment and physical storage media.
– Hardware modifications, such as internally planted Trojan horse hardware devices.
– Physical attacks on equipment and media, such as interruption of power supplies. This type of
attack can also make use of EMP weapons.
Masquerading
Masquerading attacks include
• Impersonation of the identity of some other individual or computer subject.
For example, using a computer identifier and password to gain access to a
computer system. The computer identifier and password may belong to a
person or a computer daemon.
• Spoofing attacks. For example, using the identity of another machine on a
network to gain unauthorized access. Types of attacks include a) IP spoofing, b)
machine spoofing, and c) daemon spoofing.
• Piggyback attacks. For example, a communication channel to a computer
may be hijacked by an unauthorized user.
• Playback attacks. For example the playback of network traffic in the attempt
to recreate a transaction.
• Network weaving to hide physical whereabouts. This is where a person will
connect through several machines to a target machine.
Pest Programs: Trojan Horse
• A Trojan Horse
– A Trojan Horse is an entity (typically a program, but not always) that contains code or
something interpretable as code, which when executed can have undesirable effects, such as
the clandestine copying of data or the disabling of the computer system.
• A Logic Bomb
– A Logic Bomb is a Trojan horse in which the attack is detonated by the occurrence of some
specified logical event such as the first subsequent login by a particular user.
• A Time Bomb
– A Time Bomb is a Logic bomb in which the attack is detonated by the occurrence of some
specified time-related logic event, e.g., the next time the date is 18th of Dec.
• A Letter Bomb
– A Letter Bomb is a peculiar type of Trojan horse attack whereby the harmful agent is not
contained in a program, but rather is hidden in a piece of mail or data. The harmful agent is
usually special characters that are only meaningful to a particular mail agent. This bomb is
triggered when it is read as a piece of electronic mail.
Trojan Horse Exploitations
• Password-Catching Trojan Horses.
– Beginning in autumn 1993, Trojan horses appeared in the network software of numerous
Internet computers. In particular, telnet, a program that permits people to connect from one
machine to another, was altered so that all user names and user passwords were logged for
later illegal use.
• Emergency System Trojan Horse.
– A former employee maliciously modified the software of the Community Alert Network
installed in New York and San Jose, California. The software did not fail until it was needed in
response to a chemical leak at Chevron’s refinery in Richmond, California. The emergency
system was then down for 10 hours. (See ACM Software Engineering Notes, Vol. 17, No. 4,
1992).
• Beware of Smart Telephones.
– A scam was detected involving third-party pay phones that could capture and record credit-
card numbers for later illegal use (See ACM Software Engineering Notes, Vol. 16, No. 3,
1991). This type of Trojan horse attack is also seen in teller-machine fraud.
Time-Bomb and Logic-Bomb Exploitations
• General Dynamics Logic Bomb
– A programmer, Michael John Lauffenberger, was convicted of logic bombing
General Dynamics’ Atlas Rocket Database. He quit his job and hoped to be rehired
at a premium when the logic bomb went off. However it was discovered by
another programmer.
• Pandair Logic Bomb
– A contractor programmer, James McMahon, was accused of planting logic bombs
in a Pandair Freight system in the United Kingdom. One bomb locked up
terminals, and another bomb was set to wipe out memory. He was cleared of all
charges due to insufficient evidence.
• Logic Bomb Deletes Brokerage Records
– Donald Gene Burleson was prosecuted on felony charges for planting a time
bomb that shortly after he was fired deleted more than 168,000 brokerage
records from the USPA in Fort Worth, Texas. He was convicted and jailed.
Pest Programs: The Virus
• Question: What is a Virus
– The virus is in essence a Trojan horse with the ability to propagate itself and thus
to contaminate other systems.
There are several types of password attacks that attempt to exploit authentication and
authorization vulnerabilities; a short illustration of password guessing follows the list below.
– Exhaustive trial-and-error attacks, this is enumerating all possible passwords and trying each of them in
turn.
– Guessing of passwords is typically based on commonly used strings of characters, such as dictionary words.
– Capture of unencrypted passwords is often possible if the password is stored in an unencrypted form on a
system, or is transmitted over a network in an unencrypted form.
– Derivation of passwords may be possible if a known algorithm is used to generate passwords.
– Existence of universal passwords due to design or implementation flaws, e.g., a master or skeleton key used
to open a wide range of locks.
– Absence of passwords in password files will allow a user to access a system without knowing the valid
password for that user.
– Editing the password file is possible if the file is not properly protected.
– A Trojan horse can be used to subvert the password checking routines.
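To illustrate the guessing and cracking entries above, here is a minimal Python sketch of a word-list search against a fast, unsalted hash; the stored hash and word list are invented for the example.

# A minimal sketch, for illustration only, of why guessable passwords fall to
# dictionary attacks when stored as fast, unsalted hashes.
import hashlib

stored_hash = hashlib.sha256(b"letmein").hexdigest()   # hypothetical stolen hash
wordlist = ["password", "123456", "qwerty", "letmein", "dragon"]

for candidate in wordlist:
    if hashlib.sha256(candidate.encode()).hexdigest() == stored_hash:
        print(f"Password recovered: {candidate}")
        break
else:
    print("No match in word list")

Salted, deliberately slow password hashes (bcrypt, scrypt, Argon2) and account lockout policies are the standard defences against this kind of guessing.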
Active Misuse
There are three types of Active Misuse
– Basic Active Misuse. This is the creation, modification and use of false or
misleading information. Examples of this type of misuse include:
• A British reporter was convicted of altering financial data, and reading Prince Philip’s
electronic mailbox, on Prestel. (See ACM Software Engineering Notes, Vol.12, No. 2,
1988).
• Nicholas Whiteley (AKA the Mad Hacker) was sentenced in the UK on June 7, 1990 to 4
months for malicious damage.
– Incremental Attacks. This type of attack is often called a salami attack. In a salami
attack numerous small pieces (possibly using round-off) are collected.
• Volkswagen lost almost $260 million as the result of an insider using a salami attack on
financial transactions.
– Denial of Service. This is when a machine or network is overloaded to the point
where it can no longer respond to valid requests.
A Question
• What do we mean by
Risk, Threat and
Vulnerability?
Network Security Assessment Methodology
• After identifying public IP network blocks that are related to the target network
space, analysts should carry out bulk TCP, UDP, and ICMP network scanning and
probing to identify active hosts and accessible network services (e.g., HTTP, FTP,
SMTP, POP3, etc.), that can in turn be abused to gain access to trusted network
space.
• Key pieces of information that are gathered through bulk network scanning include
details of accessible hosts and their TCP and UDP network services, along with
peripheral information such as details of ICMP messages to which target hosts
respond, and insight into firewall or host-based filtering policies.
• After gaining insight into accessible hosts and network services, analysts can begin
offline analysis of the bulk results and investigate the latest vulnerabilities in
accessible network services.
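The bulk TCP scanning step described above can be sketched with nothing more than the operating system's connect() call. The host name and port list below are placeholders; real engagements use dedicated scanners such as Nmap, and scanning requires the owner's authorisation.

# A minimal sketch of a TCP connect() scan of a few well-known ports on a host
# you are authorised to test. Target and port list are illustrative placeholders.
import socket

target = "scanme.example.org"          # placeholder: substitute an authorised target
ports = [21, 22, 25, 80, 110, 443]     # FTP, SSH, SMTP, HTTP, POP3, HTTPS

for port in ports:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        try:
            s.connect((target, port))
            print(f"{target}:{port} open")
        except OSError:
            pass   # closed, filtered, unreachable, or name resolution failed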
Investigation of Vulnerabilities
• New vulnerabilities in network services are disclosed daily to the security community
and underground alike, through Internet mailing lists and public forums including
Internet Relay Chat (IRC).
– Proof-of-concept tools are often published for use by security consultants, whereas full-
blown exploits are increasingly retained by hackers and not publicly disclosed in this
fashion.
• Here are five web sites that are extremely useful for investigating potential
vulnerabilities within network services:
– Security Focus (http://www.securityfocus.com)
– Packet Storm (http://www.packetstormsecurity.org)
– CERT vulnerability notes (http://www.kb.cert.org/vuls/)
– MITRE Corporation CVE (http://cve.mitre.org)
– ISS X-Force (http://xforce.iss.net)
Exploitation of Vulnerabilities
• Upon qualifying potential vulnerabilities in accessible network services to a degree that it's
probable that exploit scripts and tools will work correctly, attacking and exploiting the host is
the next step.
• There's not really a lot to say about exploitation at a high level, except that by exploiting a
vulnerability in a network service and gaining unauthorized access to a host, an attacker breaks
computer misuse laws in most countries (including the United Kingdom, United States, and
many others).
• Depending on the goal of the attacker, she can pursue many different routes through internal
networks, although after compromising a host, she usually undertakes the following:
– Gain superuser privileges on the host
– Download and crack encrypted user-password hashes (the SAM database under Windows and the
/etc/shadow file under most Unix-based environments)
– Modify logs and install a suitable backdoor to retain access to the host
– Compromise sensitive data (databases and network-mapped NFS or NetBIOS shares)
– Upload and use tools (network scanners, sniffers, and exploit scripts) to compromise other networked
hosts
The Cyclic Assessment Approach
Common Vulnerability Exposure (CVE)
• What is CVE
– CVE is a list of information security vulnerabilities and exposures that aims to
provide common names for publicly known problems. The goal of CVE is to
make it easier to share data across separate vulnerability capabilities (tools,
repositories, and services) with this "common enumeration."
• What is a "vulnerability"
– An information security vulnerability is a mistake in software that can be directly
used by a hacker to gain access to a system or network.
• What is an "exposure"
– An information security exposure is a mistake in software that allows access to
information or capabilities that can be used by a hacker as a stepping-stone into
a system or network.
• The CVE Website is: www.cve.mitre.org
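CVE entries can also be retrieved programmatically. The sketch below assumes the public NVD REST API (https://services.nvd.nist.gov/rest/json/cves/2.0) and the third-party requests package; the endpoint, rate limits and response layout may change, so it simply prints the start of the returned record rather than assuming its exact structure.

# A minimal sketch of looking up a CVE record. Endpoint and field layout are
# assumptions about the public NVD API and should be checked before relying on them.
import json
import requests

cve_id = "CVE-2001-0537"   # the Cisco IOS HTTP authorization bypass shown below
resp = requests.get(
    "https://services.nvd.nist.gov/rest/json/cves/2.0",
    params={"cveId": cve_id},
    timeout=30,
)
resp.raise_for_status()
print(json.dumps(resp.json(), indent=2)[:2000])   # inspect the returned record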
An Example of a CVE
• Name:
– CVE-2001-0537
• Description:
– HTTP server for Cisco IOS 11.3 to 12.2 allows
attackers to bypass authentication and execute
arbitrary commands, when local authorization is being
used, by specifying a high access level in the URL.
• Reference:
– CISCO:20010627 IOS HTTP authorization vulnerability
– CERT:CA-2001-14
– BUGTRAQ:20010629 Re: Cisco Security Advisory: IOS HTTP
authorization vulnerability
– BUGTRAQ:20010702 Cisco IOS HTTP Configuration Exploit
– BUGTRAQ:20010702 Cisco device HTTP exploit...
– BUGTRAQ:20010702 ios-http-auth.sh
– XF:cisco-ios-admin-access(6749)
– BID:2936
– OSVDB:578
Penetration Testing using CVE
The Open Vulnerability Assessment
Language (OVAL)
----------------------------------------------------
OVAL Definition Interpreter
Version: 5.3 Build: 20
Build date: Jun 28 2007 14:29:43
Copyright (c) 2002-2007 - The MITRE Corporation
----------------------------------------------------
Options:
-h = show options available from command line
-o <string> = path to the oval-definitions xml file DEFAULT="definitions.xml"
-d <string> = save data to the specified XML file DEFAULT="system-characteristics.xml"
-r <string> = save results to the specified XML file DEFAULT="oval-results.xml"
-v <string> = get external variable values from the specified XML file. DEFAULT="external-variables.xml"
-e <string> = evaluate the specified list of definitions. Supply definition ids as a comma seperated list like:
oval:com.example:def:123
-i <string> = use data from input System Characteristics file
-n = perform Schematron validation of the oval-defiitions file.
-c <string> = use the specified xsl for oval-definitions Schematron validation. DEFAULT="oval-definitions-
schematron.xsl"
-m = do not verify the oval.xml file with an MD5 hash
-p = print all information and error messages
-s = do not apply a stylesheet to the results xml.
-t <string> = apply the sepcified xsl to the results xml. DEFAULT="results_to_html.xsl"
-x <string> = output xsl transform results to the specified file. DEFAULT="results.html"
-z = return md5 of current definitions.xml
The Interpreter’s Output
Other Mark-Up Languages
• Common Configuration Enumeration - CCE
– CCE provides unique identifiers to system configuration issues in order
to facilitate fast and accurate correlation of configuration data across
multiple information sources and tools. For example, CCE Identifiers can
be used to associate checks in configuration assessment tools with
statements in configuration best-practice documents.
• Common Malware Enumeration - CME
– CME provides single, common identifiers to new virus threats and to the
most prevalent virus threats in the wild to reduce public confusion
during malware incidents. CME is not an attempt to replace the vendor
names currently used for viruses and other forms of malware, but
instead aims to facilitate the adoption of a shared, neutral indexing
capability for malware.
• Common Weakness Enumeration - CWE
– International in scope and free for public use, CWE provides a unified, measurable set of
software weaknesses that will enable more effective discussion, description, selection, and
use of software security tools and services that can find these weaknesses in source code.
Common Attack Pattern Enumeration and
Classification - CAPEC
• Building software with an adequate level of security assurance for its mission becomes more and
more challenging every day as the size, complexity, and tempo of software creation increases and
the number and the skill level of attackers continues to grow.
• To identify and mitigate relevant vulnerabilities in software, the development community needs
more than just good software engineering and analytical practices, a solid grasp of software
security features, and a powerful set of tools.
– All of these things are necessary but not sufficient.
• To be effective, the community needs to think outside of the box and to have a firm grasp of the
attacker’s perspective and the approaches used to exploit software.
• Attack patterns are a powerful mechanism to capture and communicate the attacker’s
perspective.
– They are descriptions of common methods for exploiting software. They derive from the concept of
design patterns applied in a destructive rather than constructive context and are generated from in-depth
analysis of specific real-world exploit examples.
• To assist in enhancing security throughout the software development lifecycle, and to support the
needs of developers, testers and educators.
• The objective of this effort is to provide a publicly available catalog of attack patterns along with a
comprehensive schema and classification taxonomy.
CPE and XCCDF
• Common Platform Enumeration
– CPE is a structured naming scheme for information technology systems, platforms,
and packages. Based upon the generic syntax for Uniform Resource Identifiers
(URI), CPE includes a formal name format, a language for describing complex
platforms, a method for checking names against a system, and a description
format for binding text and tests to a name.
• The Extensible Configuration Checklist Description Format
– XCCDF is a specification language for writing security checklists, benchmarks, and
related kinds of documents.
– An XCCDF document represents a structured collection of security configuration
rules for some set of target systems. The specification is designed to support
information interchange, document generation, organizational and situational
tailoring, automated compliance testing, and compliance scoring.
– The specification also defines a data model and format for storing results of
benchmark compliance testing.
– The intent of XCCDF is to provide a uniform foundation for expression of security
checklists, benchmarks, and other configuration guidance, and thereby foster more
widespread application of good security practices.
A Question
• What role does CVE play in
a penetration test?
• What role can OVAL play in
a security audit?
Cryptography
Jobs of Cryptography
An Asymmetric cipher is a cipher that does not use the same key to
encrypt and decrypt information, thus KENC ≠ KDEC.
[Diagram: Plain Text P is encrypted to Cipher Text C, which is decrypted back to the original Plain Text P.]
C = E( KENC , P )          P = D( KDEC , C )
A Simple Cipher
The Caesar cipher is named after Julius Caesar, said to be the first to use it.
In the Caesar cipher each character is substituted by another. This technique
is called a monoalphabetic cipher.
Plaintext: A B C D E F G H I J K L M N O P Q R S T U V W X Y Z
Ciphertext: O P Q R S T U V W X Y Z A B C D E F G H I J K L M N
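A minimal Python sketch of the mapping above (a shift of 14, so A becomes O):

# A minimal sketch of the Caesar/monoalphabetic substitution shown above.
import string

SHIFT = 14
PLAIN = string.ascii_uppercase
CIPHER = PLAIN[SHIFT:] + PLAIN[:SHIFT]          # "OPQRSTUVWXYZABCDEFGHIJKLMN"
ENC = str.maketrans(PLAIN, CIPHER)
DEC = str.maketrans(CIPHER, PLAIN)

message = "ATTACK AT DAWN"
ciphertext = message.translate(ENC)
print(ciphertext)                     # OHHOQY OH ROKB
print(ciphertext.translate(DEC))      # ATTACK AT DAWN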
• A Simple Substitution cipher, or Monoalphabetic cipher, is one in which each character in the plain
text is replaced with a corresponding character of cipher-text.
• A Homophonic substitution cipher is like a simple substitution crypto-system, except that a single
character of plaintext can map to one of several characters of ciphertext. For Example, A could
correspond to 5, 14 and 147.
– These types of ciphers were used as early as 1401 by the Duchy of Mantua.
• A Polygram substitution cipher is one in which blocks of characters are encrypted in groups. For
Example, ABA could correspond to RTQ.
– The Playfair cipher is an example of this type of cipher and was used by the British in World War One.
[Diagram: Cipher Block Chaining. Each plaintext block P1 … Pn is combined with the previous ciphertext block, using the initial vector for the first block, before encryption, producing C1 … Cn.]
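A minimal sketch of block chaining with an initial vector, using AES in CBC mode from the third-party cryptography package; the chaining idea in the diagram is cipher-agnostic, so AES here is purely an illustrative choice.

# A minimal sketch of CBC mode with an explicit initial vector.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.primitives import padding

key = os.urandom(32)          # 256-bit key
iv = os.urandom(16)           # the initial vector

padder = padding.PKCS7(128).padder()
plaintext = padder.update(b"attack at dawn") + padder.finalize()

encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
ciphertext = encryptor.update(plaintext) + encryptor.finalize()

decryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
unpadder = padding.PKCS7(128).unpadder()
recovered = unpadder.update(decryptor.update(ciphertext) + decryptor.finalize()) + unpadder.finalize()
print(recovered)              # b'attack at dawn'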
Triple DES
• With Triple DES, each block of plaintext is processed three times
• The two-key version of triple DES is estimated to be 10^13 times stronger
than single DES. The three-key version is even stronger.
• If Key1, Key2 and Key3 are all the same then Triple-DES collapses to single DES.
• The issue with Triple-DES is key management.
[Diagram: each block is processed with Key1, then Key2, then Key3, or Key1 again in the two-key version.]
P = D(KPRIV , E(KPUB , P) )
– That is a user can decode with a private key what someone else has encrypted with
the corresponding public key.
P = D(KPUB , E(KPRIV, P) )
– That is a user can decode with a public key what someone else has encrypted with the
corresponding private key. This is used for Digital Signatures.
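The second property above is what digital signatures rest on. A minimal sketch using RSA signing and verification from the third-party cryptography package; the key size, padding scheme and message are illustrative choices.

# A minimal sketch: what the private key produces, the public key can check.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"pay Sandy 100 pounds"
signature = private_key.sign(message, padding.PKCS1v15(), hashes.SHA256())

try:
    public_key.verify(signature, message, padding.PKCS1v15(), hashes.SHA256())
    print("Signature valid: message really came from the private-key holder")
except InvalidSignature:
    print("Signature invalid: message or signature has been altered")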
Certificates
• As humans we establish and use trust all the time. However trust is based upon the
ability of people to identify and certify who and what they are.
– We use certificates and trust all the time to do business.
• Within modern encryption systems a public key and a user’s identity are bound into
a certificate which is then signed by someone to certify the accuracy of that binding.
• Certificates can be awarded by certification authorities (CA), and certificate
authorities can themselves have certificates
• Encryption keys are used to create and manage certificates, as keys may have a
limited life to them.
– How efficient is the protocol ? A protocol requiring several steps to establish an encryption
key that will be used many times is one thing; it is quite another to go through several time-
consuming steps for a one-time use.
– How easy is the protocol to implement ?
Digital Signatures
• A digital signature is a protocol that produces the same effect as a real signature: it is a
mark that only the sender can make, but other people can easily recognize as belonging
to the sender. Just like real signatures, a digital signature is used to confirm agreement
to a message.
• Digital signatures must meet two primary conditions:
– Unforgeable:
• If person P signs a message M with a signature S(P,M) it is impossible for anyone else to produce
the pair [M, S(P, M)].
– Authentic:
• If a person R receives the pair [M, S(P, M)] purportedly from P, R can check that the signature is
really from P. Only P could have created this signature, and the signature is firmly attached to M.
• Two additional requirements are also desirable:
– Not alterable:
• After being transmitted, M cannot be changed by R or an interceptor.
– Not Reusable:
• A previous message presented will be instantly detected by R.
Symmetric Key Digital Signatures
• With private key encryption, the secrecy of the key guarantees the authenticity of the
message as well as secrecy. If Sandy and the Bank have an encryption key in common,
she can encrypt her request to transfer money. The bank can be sure of its authenticity
because nobody else has Sandy’s key.
• Conventional symmetric key encryption does not prevent forgery.
– Any one who knows the key can create a digital signature.
– Thus the bank has no protection against repudiation (denial of sending a message).
• The repudiation problem can be solved if both the sender and the receiver use an
arbitrator.
– Sender and arbiter share a key Ks
– Recipient and arbiter share a key KR
– Identity of sender is S and identity of recipient is R
– Content of message between sender and recipient is M
– The arbitrator will use a sealing function. A sealing function is a mathematical function
affected by every bit of its input. For example, the bytes of the input can be used as numbers
and the sum of all input computed.
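A minimal sketch of a sealing function. The byte-sum seal from the example above is easy to defeat (any rearrangement of the same bytes seals identically), which is why a keyed MAC such as HMAC is the usual practical choice; the keys and messages below are invented.

# A minimal sketch comparing a naive byte-sum seal with a keyed HMAC seal.
import hmac
import hashlib

def naive_seal(message: bytes) -> int:
    """Sum of all input bytes, as in the example above."""
    return sum(message)

def keyed_seal(key: bytes, message: bytes) -> str:
    """HMAC-SHA256 seal: affected by every bit of the input, unforgeable without the key."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

print(naive_seal(b"pay R 10"), naive_seal(b"01 R yap"))   # same sum, different messages
print(keyed_seal(b"shared key Ks", b"pay R 10"))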
Symmetric Key Digital Signatures
[Diagram: (1) the sender S seals the message M under key KS and sends it to the arbiter; (2) the arbiter retrieves the plaintext M from S, then forwards M together with S’s identity to the recipient, sealed under key KR.]
• In order to stop a message that has been signed using a digital signature
from being reused, we need to make use of timestamps.
• The protocol is the same as the asymmetric protocol for signing documents
except that the message contains a time stamp.
– e.g. M = [ Message , Time-Stamp ]
• The timestamp will define when the message was sent.
– Two messages with the same contents and the same time-stamp will be ignored,
thus a message can not be re-used.
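A minimal sketch of the timestamp rule above, with an HMAC standing in for the signature; the shared key, message and five-minute freshness window are illustrative choices, not part of the protocol text.

# A minimal sketch of rejecting replayed, timestamped messages.
import hmac
import hashlib
import time

KEY = b"shared signing key"
seen = set()                      # (timestamp, tag) pairs already accepted
MAX_AGE = 300                     # seconds

def sign(message: bytes):
    ts = time.time()
    tag = hmac.new(KEY, message + str(ts).encode(), hashlib.sha256).hexdigest()
    return message, ts, tag

def accept(message: bytes, ts: float, tag: str) -> bool:
    expected = hmac.new(KEY, message + str(ts).encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected):
        return False                          # forged or altered
    if time.time() - ts > MAX_AGE:
        return False                          # too old
    if (ts, tag) in seen:
        return False                          # replayed: same contents, same timestamp
    seen.add((ts, tag))
    return True

msg = sign(b"transfer 100 GBP")
print(accept(*msg))   # True  - first delivery
print(accept(*msg))   # False - replay of the identical message is ignored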
• Resource allocator.
– Manages and allocates resources.
• Control program.
– controls the execution of user programs and operations of I/O
devices.
• Kernel.
– The one program running at all times (all else being application
programs).
Diagram of Process State
Preemptive versus Non-Preemptive
• Preemptive process management schemes include the possibility of
removing a process from the CPU even if there is no I/O-request.
– In this case the process is set "ready" and has to queue again for getting
calculation time. It is said to have been preempted.
A process can be removed for instance if its calculation takes too long or if a
process with a higher priority enters the "ready"-state.
• Non-preemptive process management schemes do not allow a process to
be removed from the CPU until it is either finished or issues an I/O-request.
– With non-preemptive schemes it is impossible to set a process directly from
"running" to "ready".
Process Control Block (PCB)
Information associated with
each process.
• Process state
• Program counter
• CPU registers
• CPU scheduling information
• Memory-management
information
• Accounting information
• I/O status information
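The PCB fields listed above can be pictured as a simple record; the field names and types below are illustrative.

# A minimal sketch of a process control block as a plain data structure.
from dataclasses import dataclass, field

@dataclass
class ProcessControlBlock:
    pid: int
    state: str                                           # "new", "ready", "running", "waiting", "terminated"
    program_counter: int                                  # address of the next instruction
    registers: dict = field(default_factory=dict)         # saved CPU registers
    priority: int = 0                                     # CPU-scheduling information
    memory_limits: tuple = (0, 0)                         # memory-management information
    cpu_time_used: float = 0.0                            # accounting information
    open_files: list = field(default_factory=list)        # I/O status information

pcb = ProcessControlBlock(pid=42, state="ready", program_counter=0x4000)
print(pcb)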
A Question
• So is Windows XP a pre-emptive or non-preemptive operating system?
Models of Information Security
Sources of Computer Security Policy
[Diagram: the goals of the organisation and the real-world environment (laws, risks, costs, etc.) shape the real-world security policy; the computing environment (functions, threats, costs) shapes the computer security policy, which is formalised as a computer security model.]
Real-World Security Policy
• Individual accountability.
– Individual persons are held responsible for their actions. The principle implies that people
are uniquely and accurately identified and that records of their actions are kept and
reviewed. In addition, individual accountability also includes:
• The functions performed by management in order to manage the people who use, and
administer, the information system.
• The functions performed in order to administer the information system.
• The functions performed by the users of the information system
• Authorization.
– Explicit rules are needed about who can use the resources, and in what ways. Rules are
also needed about who can authorize the use of resources and how that authorization can
be delegated.
• Least privilege.
– People should be authorized only for the resources that they need to do their jobs.
Real-World Security Policy
• Separation of duty.
– Functions should be divided between people, so that no one person can commit a fraud
undetected. For example, two people in a bank must sign large cashier’s cheques.
• Auditing.
– Work and its results must be monitored both while the work is being performed and after it
has been completed. Information must be checked for internal consistency and for
consistency with other criteria.
• Redundancy.
– The principle of redundancy affects both work and information. Important steps are
performed by two or more people (sometimes by different people - separation of duty), and
the results are checked for consistency.
• Risk Reduction.
– Risk can never be eliminated, therefore the strategy must be to minimize it, while keeping
the cost of enforcement proportional to the risk.
The Chinese Wall Security Policy
• We can define a security policy that reflects certain commercial needs for information access
protection, based on practice in the legal, medical and investment professions. Basically a conflict of
interest exists when one person can obtain sensitive information on competing companies. The policy
starts by building three levels of abstraction:
– Object: at the lowest level are elementary objects, such as files. Each file contains information concerning
one company.
– Company groups: At the next level, all objects concerning each company are grouped together.
– Conflict classes: At the highest level, all groups of objects for competing companies are clustered.
• The access control policy is rather simple. A person can access any information as long as the
person has never accessed information from a different company in the same conflict class.
• That is, access is allowed if either the object requested is in the same company group as the
object that has been previously accessed, or the object requested belongs to a conflict class that
has never been accessed before.
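A minimal sketch of the access rule above; the companies and conflict classes are invented for the example.

# A minimal sketch of the Chinese Wall access decision.
conflict_class = {"BankA": "banking", "BankB": "banking", "OilCo": "oil"}
history = {}     # subject -> set of company groups already accessed

def may_access(subject: str, company: str) -> bool:
    accessed = history.setdefault(subject, set())
    same_company = company in accessed
    clean_class = all(conflict_class[c] != conflict_class[company] for c in accessed)
    if same_company or clean_class:
        accessed.add(company)
        return True
    return False

print(may_access("analyst", "BankA"))   # True  - first access
print(may_access("analyst", "OilCo"))   # True  - different conflict class
print(may_access("analyst", "BankB"))   # False - conflicts with BankA
print(may_access("analyst", "BankA"))   # True  - same company group again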
DAC/MAC
• Discretionary Access Control (DAC)
– Within discretionary access control a certain amount of access control is left to the discretion
of the object’s owner, or anyone else who is authorized to control the object’s access. The
owner can determine who should have access rights to an object and what those rights
should be. Commercial environments typically use DAC to allow anyone in a group to access
a file. Typically DAC access rights can change dynamically.
• Mandatory Access Control (MAC)
– Mandatory access control means that access control policy decisions are made beyond the
control of an individual owner of an object. A central authority determines what information
is to be accessible by whom, and the user cannot change access rights. An example of MAC
occurs in military security, where an individual data owner does not decide who has a top-
secret clearance, nor can the owner change the classification of an object from top-secret to
secret.
Bell-La Padula Model
• *-Property
“A subject s who has read access to an object o may have write access to an object
p only if C(o) ≤ C(p)”
– This says that a person obtaining information at one level may pass that
information along only to people at levels no lower than the information. This
property is to prevent Write-Down of information, which occurs when a subject
with write access to high-level data transfers that data by writing it to a low-level
object.
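A minimal sketch of the two Bell-LaPadula rules (no read up, and the no-write-down *-property above); the classification levels are illustrative.

# A minimal sketch of the Bell-LaPadula read and write checks.
LEVELS = {"unclassified": 0, "secret": 1, "top-secret": 2}

def may_read(subject_level: str, object_level: str) -> bool:
    """Simple security property: read only at or below your own level (no read up)."""
    return LEVELS[subject_level] >= LEVELS[object_level]

def may_write(object_read_level: str, object_write_level: str) -> bool:
    """*-property: write only to objects at or above the level already read, C(o) <= C(p)."""
    return LEVELS[object_read_level] <= LEVELS[object_write_level]

print(may_read("secret", "unclassified"))    # True  - read down allowed
print(may_read("secret", "top-secret"))      # False - no read up
print(may_write("secret", "top-secret"))     # True  - write up allowed
print(may_write("secret", "unclassified"))   # False - no write down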
Implications of the Bell-La Padula Model
[Diagram: objects O1 to O5 ordered by sensitivity and subjects S1, S2 ordered by trust; each subject may read objects at or below its own level and write to objects at or above it.]
Take-Grant System Models
• This model was published in 1978 and in this model there are only four primitive
operations: create, revoke, take and grant.
• Let R be a set of rights and S be a set of subjects and O be a set of objects.
– objects can be either active (subjects) or passive (non-subject objects).
• Each subject or object is denoted by a node on a graph; the rights of a particular
subject to a particular object denoted by a label direct from the subject to the
object.
Notation: a directed edge labelled r from subject s to object o (written s -r-> o) denotes the rights r that s holds on o.
Take-Grant Operators
• Create(o, r)
– a new node with label o is added to the graph. From s to o there is a directed
edge with label r, denoting the rights of s on o.
Creation of an object:   s   becomes   s -r-> o
Grant(o, p, r)
Subject s grants to o access rights r on p. A specific right is grant. Subject
s can grant to o access rights r on p only if s has grant right on o, and s
has r rights on p. Informally, s can grant (share) any of its rights with o, as
long as s has the right to grant privileges to o. An edge from o to p is
added, with label r. O is an active subject.
Granting access rights:   s -grant-> o, s -r-> p   becomes   s -grant-> o, s -r-> p, o -r-> p
Take-Grant Operators
• Revoke(o, r)
– The right r is revoked from s on o. The edge from s to o was labeled q ∪ r; the label is
replaced by q. Informally, s can revoke its rights to do r on o.
Revoking access rights:   s -(q ∪ r)-> o   becomes   s -q-> o
Take(o, p, r)
Subject s takes from o access rights r on p. A specific right is take.
Subject s can take from o access rights r on p only if s has take right on o,
and o has r rights on p. Informally, s can take any rights o has, as long as
s has the right to take privileges from o. An edge from s to p is added with
label r.
Taking access rights:   s -take-> o, o -r-> p   becomes   s -take-> o, o -r-> p, s -r-> p
Example of the Take/Grant Model
Question: Can S3 access O4 ?
[Diagram: a rights graph over subjects S1, S2, S3 and objects O1 to O4, with take edges linking the subjects and r and r/w edges from S1 to the objects; the exercise is to decide whether the take rule lets S3 reach O4.]
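A minimal sketch of the take and grant rewrite rules on a rights graph. The edges below are an invented example of how rights propagate, not a reconstruction of the missing figure, so the answer to the question above still has to come from the original diagram.

# A minimal sketch of the take and grant rules.
from collections import defaultdict

rights = defaultdict(set)                 # (source, destination) -> set of rights
def add(src, dst, *rs): rights[(src, dst)].update(rs)

def take(s, o, p, r):
    """s takes right r on p from o, if s has 'take' on o and o has r on p."""
    if "take" in rights[(s, o)] and r in rights[(o, p)]:
        add(s, p, r)

def grant(s, o, p, r):
    """s grants right r on p to o, if s has 'grant' on o and s has r on p."""
    if "grant" in rights[(s, o)] and r in rights[(s, p)]:
        add(o, p, r)

add("S1", "S2", "take")                   # S1 -take-> S2
add("S2", "O1", "r")                      # S2 -r-> O1
take("S1", "S2", "O1", "r")               # S1 acquires read on O1 via the take rule
print(rights[("S1", "O1")])               # {'r'}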
Security Standards
The Orange Book
• What is the purpose of the orange book? According to the book itself, the evaluation criteria were
developed with three basic objectives:
• Measurement. To provide users with a metric with which to assess the degree of trust that can be
placed in computer systems for the secure processing of classified or other sensitive information.
• Guidance. To provide guidance to manufacturers as to what to build into their trusted commercial
products to satisfy trust requirements for sensitive applications.
• Acquisition. To provide a basis for specifying security requirements in acquisition specifications. The
Orange Book provides a clear way of specifying a coordinated set of security functions. A customer
can be confident that the system he or she acquires has already been checked out for the needed
degree of security.
• The Orange Book defines four broad hierarchical divisions of security protection, and each division
has a defined set of characteristics. The divisions are as follows:
– D Minimal protection
– C Discretionary protection
– B Mandatory protection
– A Verified protection
Evaluation Classes
Class   Name                                Example
D       Minimal security                    None: reserved for systems that are submitted for evaluation but fail. Basic operating systems such as MS-DOS.
C1      Discretionary security protection   IBM: MVS/RACF
C2      Controlled access protection        Hewlett-Packard: MPE V/E; Wang Laboratories: SVS/OS CAP 1.0
B1      Labeled security protection         AT&T: System V/MLS; UNISYS: OS 1100
B2      Structured protection               Honeywell Information Systems: Multics; Trusted Information Systems: Trusted XENIX
B3      Security domains                    Honeywell Federal Systems: XTS-200
A1      Verified design                     Honeywell Information Systems: SCOMP; Boeing Aerospace: SNS
ITSEC
• The Information Technology Security Evaluation Criteria (ITSEC) was published in 1992, and it has
become known as Europe’s White Book. ITSEC defines eight distinct security functions, along with
classes of functionality and assurance levels.
• The eight distinct security functions are as follows:
– Identification and Authentication: The systems security policy specifies the subjects and objects that must be
identified and authenticated.
– Administration of Rights: The security policy must specify the rights that each subject and object possesses
and identify the relationship between them.
– Verification of Rights: The system must verify a subject’s rights when a subject tries to access an object.
– Audit: The system must audit security related events.
– Object Re-use: Objects must be cleared of data before they are re-used.
– Error Recovery: The security policy defines error conditions and how to recover from them.
– Continuity of Service: The system must be able to continue to make certain key services available and to
maintain system security.
– Data Communications Security: Information transmitted between two nodes must be secure, each node must
be authenticated and there must be non-repudiation in the transaction.
ITSEC Classes of Functionality
Class Meaning
F1 Derived from orange book class C1
F2 Derived from orange book class C2
F3 Derived from orange book class B1
F4 Derived from orange book class B2
F5 Derived from orange book class B3/A1
F6 A distinct class of system with high integrity (in contrast to confidentiality) requirements
for data and programs. It is particularly appropriate for database systems.
F7 A distinct class of system with high requirements for either a complete system or a special
function of a system. It is particularly appropriate for process control systems.
F8 A distinct class of system with high requirements for the safeguarding of data integrity
during data communications.
F9 A distinct class of system with high demands on the confidentiality of data during data
communications. It is particularly appropriate for cryptographic systems.
F10 A distinct class for networks with high demands on the confidentiality and integrity of the
information to be communicated. It is particularly appropriate when sensitive information
needs to be communicated over unsecure (e.g. public) networks.
ITSEC Assurance Levels
Assurance Level Meaning
E1 Testing
E2 Configuration control and controlled distribution; roughly equivalent to
orange book class C2 assurance.
E3 Access to detailed design and source code; roughly equivalent to
orange book class B1 assurance.
E4 Rigorous vulnerability analysis; roughly equivalent to orange book
class B2 assurance.
E5 Demonstrates correspondence between detailed design and source
code; roughly equivalent to orange book class B3 assurance.
E6 Formal models and formal descriptions, linked by formal
correspondences; roughly equivalent to orange book class A1
assurance.
The Common Criteria
• The Common Criteria defines standards to be used as the basis for evaluation of security properties of IT Products
and Systems. The standard addresses protection of information from unauthorised disclosure, modification, or
loss of use, in particular:
– User view: A way to define Information Technology (IT) security requirements for some IT products:
– Evaluator/scheme view: A tool to measure the confidence we may place in the security of a product.
– Common structure & language for expressing product/system IT security requirements (Part 1).
– Catalogs of standardized IT security requirement components & packages (Parts 2 & 3).
– Develop Protection Profiles (PP) and Security Targets (ST) -- specific IT security requirements for products & systems --
Consumers then use them for decisions
– Evaluate products & systems against known & understood requirements, which gives confidence in the result.
The Common Criteria: Evaluation Concepts and Relationships
[Diagram: owners require countermeasures to minimise risk to their assets; evaluation of assurance techniques produces evidence of assurance, giving confidence that the countermeasures minimise the risk.]
Key Concepts - The Constructs
Protection Profile (PP):
An implementation-independent set of security objectives and requirements for a
category of IT products or systems that meet similar consumer needs for IT
security.
[Diagram: the security requirements (PP) drive the TOE description and the TOE implementation.]
Protection Profile & Security Target: Common Contents
– Security Environment: threats, security policies, secure usage assumptions.
– Security Objectives: TOE IT security objectives, environmental security objectives.
– IT Security Requirements: TOE IT functional & assurance requirements, requirements for the IT environment.
– PP Claims.
Key Concepts Hierarchy of the Parts
Class Name
FAU Audit
FCO Communications
FCS Cryptographic Support
FDP User Data Protection
FIA Identification & Authentication
FMT Security Management
FPR Privacy
FPT Protection of TOE Security Functions
FRU Resource Utilization
FTA TOE Access
FTP Trusted Path / Channels
Assurance Requirements
Classes of Security Assurance Requirements:
Class Name
ACM Configuration Management
ADO Delivery & Operation
ADV Development
AGD Guidance Documents
ALC Life Cycle Support
ATE Tests
AVA Vulnerability Assessment
APE Protection Profile Evaluation
ASE Security Target Evaluation
AMA Maintenance of Assurance
Example Hierarchy
(Functional or Assurance)
[Diagram: a class contains families of requirements; a PP or ST draws on them, giving flexibility in defining requirements.]
Evaluation Assurance Levels (EALs)
Evaluation Assurance Levels &
(rough) Backward Compatibility Comparison
• ISO 27001 defines a process.
– Information Policy
• This invites you to stand back and think about all of your information assets and their value to your organisation.
You ought then to devise a policy that identifies what information is important and why. From a practical point of
view, it is only that information with some significant value that should be of concern.
– Scope
• Excluding low value information allows you to define the scope of your management concerns. You may discover
that your concerns pervade your organisation as a whole. In this case you will need to regard all of your information
systems and their external interfaces (IT and electronic forms of communication, filing cabinets, telephone
conversations, public relations and so on) as being in scope. Alternatively, your concerns may focus onto a particular
customer-facing system.
Information Security Management System
• Risk assessment
– Now you know what information is in scope and what its value is, your next move should be to determine the risk of losing
that value. Remember to consider everything. At one extreme you need to consider the complexities of technology; at the
other you need to consider business forces in terms of advancing technology and enterprise, as well as the ugly side of
industrial espionage and information warfare.
• Risk management
– You then need to decide how to manage that risk. Your forces certainly include technology, but don't forget people,
administrative procedures and physical things like doors and locks and even CCTV. Don't forget insurance.
– If you can't prevent something from happening, maybe you can discover if it does happen and do something to contain it or
otherwise reduce the danger. In the end, you will of course, need an effective continuity plan.
• Statement of applicability
– You are required to identify all of your chosen security controls and justify why you feel they are appropriate, and show why
those ISO 27001 controls that have not been chosen are not relevant.
An Overview of the Process
Undertake Risk Assessment, considering threats, vulnerabilities and impact.
Information Classification
Objective: To ensure that information receives an appropriate level of protection.
– Classification guidelines.
– Information labeling and handling
ISO 27001 - Human Resource Security
Prior to Employment
Objective: To ensure that employees, contractors and third party users understand their
responsibilities, and are suitable for the roles they are considered for, and to reduce the risk of
theft, fraud or misuse of facilities.
– Roles and Responsibilities
– Screening
– Terms and conditions of employment
During Employment
Objective: To ensure that all employees, contractors and third party users are aware of
information security threats and concerns, their responsibilities and liabilities, and are
equipped to support organizational security policy in the course of their normal work, and to
reduce the risk of human error.
– Management responsibilities
– Information security awareness, education and training
– Disciplinary process
ISO 27001 - Human Resource Security
Objective: To ensure that employees, contractors and third party users exit an organization or
change employment in an orderly manner.
– Termination responsibilities
– Return of assets
Secure Areas
Objective: To prevent unauthorized physical access, damage and interference to the organization’s premises
and information.
– Physical security perimeter
– Physical entry controls
– Securing offices, rooms and facilities
– Protecting against external and environmental threats
– Working in secure areas
– Public access, delivery and loading areas
Equipment Security
Objective: To prevent loss, damage, theft or compromise of assets and interruption to the organization’s
activities.
– Equipment siting and protection
– Supporting utilities
– Cabling security
– Equipment maintenance
– Security of equipment off-premises
– Secure disposal or re-use of equipment
– Removal of property
ISO 27001 - Communications and
Operations Management
Back-Up
Objective: To maintain the integrity and availability of information and information processing
facilities.
– Information Back-Up
ISO 27001 - Communications and
Operations Management
Media Handling
Objective: To prevent unauthorized disclosure, modification, removal or destruction of assets, and
interruption to business activities.
– Management of removable media
– Disposal of media
– Information handling procedures
– Security of system documentation
ISO 27001 - Communications and
Operations Management
Exchange of Information
Objective: To maintain the security of information and software exchanged within an
organization and with any external entity.
– Information exchange policies and procedures
– Exchange Agreement
– Physical media in transit
– Electronic messaging
– Business information systems
Monitoring
Objective: To detect unauthorized information processing activities.
– Audit logging
– Monitoring system use
– Protection of log information
– Administrator and operator logs
– Fault logging
– Clock synchronization
ISO 27001 - Access Control
Business Requirements for System Access
Objective: To control access to information
– Access control policy.
User Responsibilities
Objective: To prevent unauthorized user access, and compromise or theft of information and information
processing facilities.
– Password Use.
– Unattended user equipment.
– Clear desk and clear screen policy
ISO 27001 - Access Control
Network Access Control
Objective: To prevent unauthorized access to networked services
– Policy on use of network services
– User authentication for external connections
– Equipment identification in networks
– Remote diagnostic and configuration port protection
– Segregation in the network
– Network connection control
– Network routing control
Objective: To ensure information security when using mobile computing and teleworking
facilities.
– Teleworking
ISO 27001 - Information Systems Acquisition,
Development and Maintenance
– Key management
Objective: To ensure information security events and weaknesses associated with information systems are
communicated in a manner allowing timely corrective action to be taken.
Objective: To ensure a consistent and effective approach is applied to the management of information
security incidents
– Collection of Evidence
ISO 27001 - Business Continuity
Management
Security Compliance with security policies and standards, and technical compliance
Objectives: To ensure compliance of systems with organizational security policies and standards.
– Compliance with security policies and standards
– Technical Compliance Checking
Under the Computer Misuse Act 1990, a person commits the basic hacking offence if:
– he/she causes a computer to perform any function with the intent to secure
access to any program or data held in any computer,
– the access he/she intends to secure is unauthorised, and
– he/she knows at the time when he/she causes the computer to perform the
function that this is the case.
The Basic Hacking Offence
The terms used in the Act for Section One are as follows:
1(1) A person is guilty of an offence if
a) he causes a computer to perform any function with intent to
secure access to any program or data held in a computer, and
b) the access he intends to secure is unauthorised
and
c) he knows at the time when he causes the computer to perform the
function that this is the case.
1(2) The intent a person has to commit an offence under this section
need not be directed at
a) any particular program or data
b) a program or data of any particular kind
or
c) a program or data held in any particular computer.
1(3) A person guilty of an offence under this section shall be liable
on summary conviction to imprisonment for a term not exceeding six
months or to a fine not exceeding level 5 on the standard scale or
both.
Case Law
• In 1993 Paul Bedworth was acquitted of conspiracy to commit offences under Section 1 and
Section 3 of the Computer Misuse Act 1990. Paul Bedworth’s defence counsel argued that
Bedworth was addicted to computer hacking and, as a result he was not capable of forming the
necessary intent to commit the offences charged. Although addiction is not a defence to a
criminal charge, the jury acquitted him.
• It is certainly possible for the employees of a company to commit the basic hacking offence when
using their own computer terminals at work if they intend to gain access to any program or data
in respect of which they know they do not have authority to access.
– In the case of Denco Ltd v Joinson (1991) it was held that an employee who used an unauthorised
password to gain access to information stored on a computer, which he knew he was not entitled to see,
was guilty of gross misconduct and could be summarily dismissed from his employment.
• In December 1993 a male nurse was convicted of hacking into a hospital's computer system and
modifying entries, including prescriptions. The hacker gained access to the computer system after
learning the password through observing a locum doctor having trouble logging in. The hacker:
– prescribed drugs normally used to treat heart disease to a 9-year-old with meningitis;
– prescribed antibiotics to a patient in a geriatric ward and "scheduled" an unnecessary X-ray
• He was sacked for unprofessional behaviour and jailed for 12 months.
The Ulterior Intent Offence
The Section 2 offence is described in the Act as unauthorized
access with intent to commit or facilitate the commission of
further offences.
– It was held in the Court of Appeal that had the attempt been successful, the theft would
have taken place in New York.
• They did not meet, or even know each other or their real names, until they were
introduced by the arresting officers; all their contact was through discussing hacking and
swapping passwords on various bulletin boards.
• All were arrested at about midnight while engaged in hacking in their own houses.
They were charged with conspiracy to commit offences contrary to Section 3 of the
1990 Act. They were also charged with conspiracy to make dishonest use of services
provided by British Telecom. Karl Strickland and Neil Woods pleaded guilty. Woods
also admitted causing £15,000 of damage to a computer owned by the Polytechnic
of Central London. Strickland's activities included hacking into NASA and ITN's Oracle
network.
Eight Legged Groove Machine
• In Strickland and Woods, the defendants were each sentenced to six months in prison at
Southwark Crown Court on 21 May 1993. Judge Michael Harris stated:
"I have to mark your conduct with prison sentences, both to penalize you for what you have done and for
the losses caused, and to deter others who might be similarly tempted."
• He also said:
"There may be people out there who consider hacking to be harmless, but hacking is not harmless.
Computers now form a central role in our lives, containing personal details, financial details, confidential
matters of companies and government departments and many business organisations. Some of these
services, providing emergency services, depend on their computers to deliver those services. It is essential
that the integrity of those systems should be protected and hacking puts that integrity into jeopardy."
• The judge also remarked that hackers needed to be given a "clear signal" by the courts that their
activities will not and cannot be tolerated. Interestingly, after the case, Detective Sergeant Barry
Donovan, formerly attached to Scotland Yard's computer crimes squad, said that, since the
publicity surrounding the arrest of Woods and Strickland, the amount of hacking in the UK
had decreased dramatically, although it was still an international problem.
Air Force Rome Lab Case
[Diagram: nodes in the Rome Lab intrusion path, including an ISP in Latvia, a UK ISP, Rome Lab, HQ NATO, WPAFB, USBR, JPL and Goddard (NASA), SFC, Air Force contractors, the S. Korean Army and an Atomic Research Institute.]
Under the 1998 Act, "data" means information which:
• (a) is being processed by means of equipment operating automatically in response to instructions given for that purpose,
• (b) is recorded with the intention that it should be processed by means of such equipment,
• (c) is recorded as part (or with the intention that it should form part) of a relevant filing system (i.e. any set of
information relating to individuals to the extent that, although not processed as in (a) above, the set is structured,
either by reference to individuals or by reference to criteria relating to individuals, in such a way that specific
information relating to a particular individual is readily accessible), or
• (d) does not fall within paragraph (a), (b) or (c) but forms part of an accessible record [which is defined in Section 68 of
the Act and which can be summarised here as a health record, educational record (local education authority schools
and special schools only), local authority housing record or local authority social services record - N.B. data forming part
of an accessible record may fall within paragraphs (a), (b), (c) or (d) of the definition of data].
Data and Exceptions
• In addition to automatically processed information, the 1998 Act is concerned with "manual data"
falling within the definition of "relevant filing system" in paragraph (c). Such data may be subject to
transitional relief until 2001 or 2007, for details of which see Transitional Provisions in the Act.
Organizations now have to consider which of their paper-based and other manual information comes
within the Act.
• What manual data are covered by the Act?
– Under section 1(1)(c) of the Act, data includes manual data that is recorded as part of a relevant filing
system. The term relevant filing system means:
• "any set of information relating to individuals to the extent that, although the information is not processed by means of
equipment operating automatically in response to instructions given for that purpose, the set is structured, either by
reference to individuals or by reference to criteria relating to individuals, in such a way that specific information relating
to a particular individual is readily accessible.”
• Exceptions applying to both automatically and manually processed data
– Unstructured data is not covered by the Act. So word-processed text files, email messages, and plain paper
text are less likely to come under the scope of the Act than databases, formatted text files, Rolodex cards,
pre-printed forms and papers filed in an organized way.
Personal Data
• Data Processor
– Data processor in relation to personal data, means any person other than an employee of
the data controller who processes the data on behalf of the data controller. (The data
processor is equivalent to a computer bureau in the 1984 Act.)
– There is a higher duty of care upon data controllers when the processing of personal data is
carried out on their behalf by data processors.
Recipient and Third Parties
• Recipient
– A recipient means any person to whom the data are disclosed. This may include an employee or
agent of the data controller, a data processor or an employee or agent of the data processor.
– The term does not include any person to whom disclosure is or may be made as a result of a
particular inquiry by or on behalf of that person made in the exercise of any power conferred by
law.
• Third party
– is any person other than:
• the data subject;
• the data controller; or
• any data processor or other person authorized to process data for the data controller or processor.
– The expression third party does not include employees or agents of the data controller or data
processor. These people are - for the purpose of this expression - to be interpreted as being part
of the data controller or processor. As such, this expression is distinguishable from "recipient",
which effectively separates employees and agents from the data controller/processor itself.
The Eight DP Principles
1. Personal data shall be processed fairly and lawfully.
2. Personal data shall be obtained only for one or more specified and lawful purposes, and shall not
be further processed in any manner incompatible with that purpose or those purposes.
3. Personal data shall be adequate, relevant and not excessive in relation to the purpose or purposes
for which they are processed.
4. Personal data shall be accurate and, where necessary, kept up to date.
5. Personal data processed for any purpose or purposes shall not be kept for longer than is necessary
for that purpose or those purposes.
6. Personal data shall be processed in accordance with the rights of data subjects under this Act.
7. Appropriate technical and organisational measures shall be taken against unauthorised or
unlawful processing of personal data and against accidental loss or destruction of, or damage to,
personal data.
8. Personal data shall not be transferred to a country or territory outside the European Economic
Area, unless that country or territory ensures an adequate level of protection for the rights and
freedoms of data subjects in relation to the processing of personal data.
Principle 1
• Personal data shall be processed fairly and lawfully and, in particular, shall not be processed unless:
– at least one of the "conditions for processing" is met; and in the case of sensitive personal data, at least one
of the "conditions for processing sensitive data" is met.
• The processing is necessary in order to protect the vital interests of the data subject.
– The Data Protection Commissioner considers that reliance on this condition may only be
claimed where the processing is necessary for matters of life and death, for example, the
disclosure of a data subject's medical history to a hospital Accident & Emergency
Department treating the data subject after a serious road accident.
• The processing is necessary for the administration of justice and various other
legislative purposes;
• The processing is necessary for the purposes of legitimate interests pursued by the
data controller, except where the processing is unwarranted in any particular case
because of prejudice to the rights and freedoms or legitimate interests of the data
subject.
Principle 1
• The Act introduces categories of sensitive personal data, namely, personal data consisting of information as to:
a) racial or ethnic origin; b) political opinions, or religious and other beliefs; c) membership of a trade union; d)
physical or mental health or condition; e) sexual life; f) the (alleged) commission by them of any offence.
• This sensitive data cannot be processed unless it comes under one of a number of special cases, such as these:
– The data subject has given their explicit consent to the processing of the personal data.
– The information contained in the personal data has been made public as a result of steps deliberately taken by the data
subject.
– Processing is necessary to protect the vital interests of the data subject or another person.
– The processing is by any not-for-profit body which exists for political, philosophical, religious or trade-union purposes
(special constraints apply).
– The processing is necessary for various legislative and medical reasons (special constraints apply here too).
– The processing of information as to racial or ethnic origin is necessary for racial and ethnic monitoring (special constraints
apply).
Principle 2
• Personal data shall be obtained only for one or more specified and lawful
purposes, and shall not be further processed in any manner incompatible
with that purpose or those purposes.
• Unlike the 1984 Act, there are two means by which a data controller may
specify the purpose for which the personal data are obtained, namely:
– in a notice given by the data controller to the data subject in accordance with
the fair processing code; or
– in a notification given to the Commissioner under the notification provisions of
the Act (provisions are not yet in place).
• Case-Study
– In 1998 an employee of the National Westminster Bank was found supplying
information about customers to his father, a private investigator. The employee
was found guilty; the fines and court costs amounted to over £9000.
Principle 3
• Personal data shall be adequate, relevant and not excessive in relation to the
purpose or purposes for which they are processed.
• This is similar to the fourth Principle in the 1984 Act, though the definition of
processing in the 1984 Act was narrower.
• Case Study
– When the Poll Tax was introduced several years ago, local authorities had to collect
information on every citizen. A few authorities decided that, while they were at it, the
questionnaires could ask for other interesting items of information. The DP Registrar
forced them to change their questionnaires - at great expense - as this extra data was
excessive for the purposes of Poll Tax collection.
Principle 4
• Data are inaccurate if they are incorrect or misleading as to any matter of fact.
• This Principle is not contravened because of any inaccuracy in personal data which
accurately record information obtained from the data subject in a case where the
data controller has taken reasonable steps to ensure the accuracy of the data.
Principle 5
• If you plan to retain personal data for historical analysis, identifying trends
or data mining, you must register this in the Data Protection Register.
Failure to do this may result in a breach of the fifth Principle.
Principle 7
• The Act gives some further guidance on matters which should be taken into account in deciding whether
security measures are "appropriate". These are:
– taking into account the state of technological development at any time and the cost of implementing any measures, the measures must ensure a
level of security appropriate to the harm that might result from a breach of security and the nature of the data to be protected.
• The Act introduces express obligations upon data controllers when the processing of personal data is carried out
by a data processor on behalf of the data controller. In order to comply with the seventh Principle the data
controller must:
– choose a data processor providing sufficient guarantees in respect of the security measures they take; and
– ensure that the processing by the data processor is carried out under a contract which is made or evidenced in writing, under which the data
processor is to act only on instructions from the data controller. The contract must require the data processor to comply with obligations equivalent
to those imposed on the data controller by the seventh Principle. (A minimal sketch of one possible technical measure follows.)
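As a hypothetical illustration only (the Act does not mandate any particular technique), the Python sketch below shows one candidate "appropriate technical measure": pseudonymising customer identifiers with a keyed hash before records are passed to an external data processor, so the processor receives only what it needs. The field names, record layout and key handling are assumptions for the example; in practice the key would be provisioned once and stored securely by the data controller, not generated at run time.

import hmac
import hashlib
import secrets

# Illustrative only: a real deployment would provision and store this key
# through the data controller's own key-management process.
PSEUDONYM_KEY = secrets.token_bytes(32)

def pseudonymise(identifier: str) -> str:
    # Keyed hash (HMAC-SHA256): stable for the same input, but not reversible
    # or recomputable without the secret key held by the data controller.
    return hmac.new(PSEUDONYM_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"name": "A N Other", "account": "12345678", "balance": 1024.50}  # hypothetical data
outsourced_record = {
    "subject_ref": pseudonymise(record["name"] + ":" + record["account"]),
    "balance": record["balance"],  # pass the processor only the fields it needs
}
print(outsourced_record)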
Principle 8
• The European Economic Area consists of the EU Member States together with Iceland,
Liechtenstein and Norway.
• It might be sufficient to write into the contract with your outsourcing company that they
must comply with the terms of the UK's Data Protection Act 1998.
• This principle also affects organizations wishing to ship/transmit data to foreign countries -
such as India - where it can be processed more cheaply. Companies with head offices
outside the EEA will also have to realize they may no longer be able to satisfy requests to
send personal data to head office.
Rights of the Individual
• The Act gives rights to individuals in respect of personal data held about them by
others. The rights are:
– Right of subject access (Sections 7 to 9).
– Right to prevent processing likely to cause damage or distress (Section 10).
– Right to prevent processing for the purposes of direct marketing (Section 11).
– Rights in relation to automated decision-taking (Section 12).
– Right to take action for compensation if the individual suffers damage by any contravention of
the Act by the data controller (Section 13).
– Right to take action to rectify, block, erase or destroy inaccurate data (Section 14).
– Right to make a request to the Commissioner for an assessment to be made as to whether any
provision of the Act has been contravened (Section 42 of the Act).
Subject Access Rights
• Any individual may make a (written) request to the data controller for details of what information
is held about himself or herself. A standard fee is payable, but an individual is then entitled:
– to be told by the data controller whether they (or someone else on their behalf) are processing that
individual's personal data;
– if so, to be given a description of:
1. the personal data,
2. the purposes for which they are being processed, and
3. those to whom they may be disclosed;
– to be told of:
1. all the information which forms any such personal data. If any of the information in the copy is not intelligible without explanation, the data
subject should be given an explanation of that information, e.g. explanations of codes and abbreviations, and,
2. information as to the source of those data (some exceptions apply); and
– where a decision significantly affecting a data subject is made about them by fully automated
means, they are entitled to be told of the logic involved in that process. The data controller is
not required to do this where the logic in question constitutes a trade secret (phrase not
defined here).
Subject Access Rights
• A data controller must comply with a subject access request within forty days of receipt of the
request. If the controller needs to confirm the identity of the person making the request, a
response must be issued within forty days of receipt of that confirmation (a worked example of the deadline calculation appears below).
• The information given in response to a subject access request should be all that which is
contained in the personal data at the time the request was received.
– However, routine amendments and deletions of the data may continue between the date of the
request and the date of the reply. Hence the information revealed to the data subject may differ from
the data which were held at the time the request was received, even to the extent that data are no
longer held.
– However, the data controller must not tamper with the information just to make it acceptable to the
data subject.
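As a small worked example (an illustration of the arithmetic only, not legal advice), the Python sketch below computes the forty-day response deadline described above, counting from receipt of the request or, where identity confirmation is required, from receipt of that confirmation; straightforward calendar-day counting is assumed.

from datetime import date, timedelta
from typing import Optional

RESPONSE_WINDOW = timedelta(days=40)  # forty days, counted here as calendar days

def response_deadline(request_received: date,
                      identity_confirmed: Optional[date] = None) -> date:
    # The clock runs from receipt of the request, or from receipt of the
    # identity confirmation where the controller has had to ask for one.
    start = identity_confirmed if identity_confirmed is not None else request_received
    return start + RESPONSE_WINDOW

# Hypothetical dates for illustration.
print(response_deadline(date(2015, 3, 2)))                      # request alone
print(response_deadline(date(2015, 3, 2), date(2015, 3, 16)))   # identity confirmed later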
Subject Access Rights
• A particular problem arises for data controllers who may find that in complying with a
subject access request they will disclose information relating to an individual other
than the data subject who can be identified from that information. The Act
recognizes this problem and sets out only two circumstances in which the data
controller is obliged to comply with the subject access request in such circumstances,
namely:
– where the other individual has consented to the disclosure of the information; or
– where it is reasonable in all the circumstances to comply with the request without the
consent of the other individual.
• If a data subject believes that a data controller has failed to comply with a subject
access request in contravention of the Act, they may apply to Court for an order that
the data controller complies with the request. An order will be made if the Court is
satisfied that the data controller has failed to comply with the request in
contravention of the Act.
Injunction and Redress
• The data subject may take out an injunction under Section 10 to prevent the data controller
from processing, or even collecting, personal data. This can be done where the processing is
likely to cause substantial damage or distress which is unwarranted.