Access Control and Protocols


Part II: Access Control

Part 2  Access Control 1


Access Control
 Two parts to access control
 Authentication: Are you who you say you are?
o Determine whether access is allowed
o Authenticate human to machine
o Or authenticate machine to machine
 Authorization: Are you allowed to do that?
o Once you have access, what can you do?
o Enforces limits on actions
 Note: “access control” often used as synonym
for authorization
Part 2  Access Control 2
Are You Who You Say You Are?
 How to authenticate a human to a machine?
 Can be based on…
o Something you know
 For example, a password
o Something you have
 For example, a smartcard
o Something you are
 For example, your fingerprint

Part 2  Access Control 3


Something You Know
 Passwords

 Lots of things act as passwords!


o PIN
o Social security number
o Mother’s maiden name
o Date of birth
o Name of your pet, etc.

Part 2  Access Control 4


Trouble with Passwords
 “Passwords are one of the biggest practical
problems facing security engineers today.”
 “Humans are incapable of securely storing
high-quality cryptographic keys, and they
have unacceptable speed and accuracy when
performing cryptographic operations. (They
are also large, expensive to maintain, difficult
to manage, and they pollute the environment.
It is astonishing that these devices continue
to be manufactured and deployed.)”
Part 2  Access Control 5
Why Passwords?
 Why is “something you know” more
popular than “something you have” and
“something you are”?
 Cost: passwords are free
 Convenience: easier for admin to
reset pwd than to issue a new thumb

Part 2  Access Control 6


Keys vs Passwords

 Crypto keys
o Suppose keys are 64 bits
o Then there are 2^64 possible keys
o Choose a key at random…
o …then attacker must try about 2^63 keys
 Passwords
o Suppose passwords are 8 characters, with 256 different characters
o Then 256^8 = 2^64 possible pwds
o Users do not select passwords at random
o Attacker has far fewer than 2^63 pwds to try (dictionary attack)
Part 2  Access Control 7
Good and Bad Passwords
 Bad passwords
o frank
o Fido
o password
o 4444
o Pikachu
o 102560
o AustinStamp
 Good passwords?
o jfIej,43j-EmmL+y
o 09864376537263
o P0kem0N
o FSa7Yago
o 0nceuP0nAt1m8
o PokeGCTall150
Part 2  Access Control 8
Password Experiment
 Three groups of users  each group
advised to select passwords as follows
o Group A: At least 6 chars, 1 non-letter
o Group B: Password based on passphrase (the winner)
o Group C: 8 random characters
 Results
o Group A: About 30% of pwds easy to crack
o Group B: About 10% cracked
 Passwords easy to remember
o Group C: About 10% cracked
 Passwords hard to remember
Part 2  Access Control 9
Password Experiment
 User compliance hard to achieve
 In each case, 1/3rd did not comply
o And about 1/3rd of those easy to crack!
 Assigned passwords sometimes best
 If passwords not assigned, best advice is…
o Choose passwords based on passphrase
o Use pwd cracking tool to test for weak pwds
 Require periodic password changes?
Part 2  Access Control 10
Attacks on Passwords
 Attacker could…
o Target one particular account
o Target any account on system
o Target any account on any system
o Attempt denial of service (DoS) attack
 Common attack path
o Outsider → normal user → administrator
o May only require one weak password!

Part 2  Access Control 11


Password Retry
 Suppose system locks after 3 bad
passwords. How long should it lock?
o 5 seconds
o 5 minutes
o Until SA restores service
 What are +’s and -’s of each?

Part 2  Access Control 12


Password File?
 Bad idea to store passwords in a file
 But we need to verify passwords
 Cryptographic solution: hash the pwd
o Store y = h(password)
o Can verify entered password by hashing
o If Trudy obtains “password file,” she
does not obtain passwords
 But Trudy can try a forward search
o Guess x and check whether y = h(x)
Part 2  Access Control 13
Dictionary Attack
 Trudy pre-computes h(x) for all x in a
dictionary of common passwords
 Suppose Trudy gets access to password
file containing hashed passwords
o She only needs to compare hashes to her pre-
computed dictionary
o After one-time work, actual attack is trivial
 Can we prevent this attack? Or at least
make attacker’s job more difficult?
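A minimal Python sketch of the pre-computation idea, assuming unsalted SHA-256 hashes and a toy dictionary (function names are illustrative):

```python
import hashlib

def precompute(dictionary):
    """One-time work: map hash -> password for every word in the dictionary."""
    return {hashlib.sha256(pwd.encode()).hexdigest(): pwd for pwd in dictionary}

def crack(stolen_hashes, table):
    """After the one-time work, the attack itself is just a table lookup."""
    return {h: table[h] for h in stolen_hashes if h in table}

table = precompute(["password", "letmein", "pikachu"])
stolen = [hashlib.sha256(b"letmein").hexdigest()]   # hash from the "password file"
print(crack(stolen, table))                         # recovers "letmein"
```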

Part 2  Access Control 14


Salt
 Hash password with salt
 Choose random salt s and compute
y = h(password, s)
and store (s,y) in the password file
 Note: The salt s is not secret
 Easy to verify salted password
 But Trudy must re-compute dictionary
hashes for each user
o Lots more work for Trudy!
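A sketch of salted storage and verification in Python, with SHA-256 standing in for h (a real system would prefer a deliberately slow hash such as bcrypt or PBKDF2):

```python
import os, hashlib

def store(password):
    """Return the (s, y) pair for the password file; the salt s is not secret."""
    s = os.urandom(16)
    y = hashlib.sha256(s + password.encode()).digest()   # y = h(password, s)
    return s, y

def verify(password, s, y):
    """Re-hash the entered password with the stored salt and compare."""
    return hashlib.sha256(s + password.encode()).digest() == y

s, y = store("0nceuP0nAt1m8")
assert verify("0nceuP0nAt1m8", s, y)
assert not verify("password", s, y)
```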

Part 2  Access Control 15


Password Cracking:
Do the Math
 Assumptions:
 Pwds are 8 chars, 128 choices per character
o Then 128^8 = 2^56 possible passwords
 There is a password file with 2^10 pwds
 Attacker has dictionary of 2^20 common pwds
 Probability of 1/4 that a pwd is in dictionary
 Work is measured by number of hashes

Part 2  Access Control 16


Password Cracking: Case I
 Attack 1 password without dictionary
o Must try 2^56/2 = 2^55 on average
o Like exhaustive key search
 Does salt help in this case?

Part 2  Access Control 17


Password Cracking: Case II
 Attack 1 password with dictionary
 With salt
o Expected work: (1/4)2^19 + (3/4)2^55 ≈ 2^54.6
o In practice, try all pwds in dictionary…
o …then work is at most 2^20 and probability of
success is 1/4
 What if no salt is used?
o One-time work to compute dictionary: 2^20
o Expected work still same order as above
o But with precomputed dictionary hashes, the
“in practice” attack is free…
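The expected-work figure can be checked with a few lines of Python, using the numbers from the assumptions slide:

```python
from math import log2

dict_size = 2**20      # attacker's dictionary
pwd_space = 2**56      # 128^8 possible 8-character passwords
p_in_dict = 1/4        # probability the target password is in the dictionary

# With salt: on average, half the dictionary if the password is in it,
# otherwise half the full password space
work = p_in_dict * (dict_size / 2) + (1 - p_in_dict) * (pwd_space / 2)
print(log2(work))      # about 54.6
```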
Part 2  Access Control 18
Other Password Issues
 Too many passwords to remember
o Results in password reuse
o Why is this a problem?
 Who suffers from bad password?
o Login password vs ATM PIN
 Failure to change default passwords
 Social engineering
 Error logs may contain “almost” passwords
 Bugs, keystroke logging, spyware, etc.

Part 2  Access Control 19


Passwords
 The bottom line…
 Password cracking is too easy
o One weak password may break security
o Users choose bad passwords
o Social engineering attacks, etc.
 Trudy has (almost) all of the advantages
 All of the math favors bad guys
 Passwords are a BIG security problem
o And will continue to be a big problem
Part 2  Access Control 20
Password Cracking Tools
 Popular password cracking tools
o Password Crackers
o Password Portal
o L0phtCrack and LC4 (Windows)
o John the Ripper (Unix)
 Admins should use these tools to test for
weak passwords since attackers will
 Good articles on password cracking
o Passwords - Cornerstone of Computer Security
o Passwords revealed by sweet deal
Part 2  Access Control 21
Biometrics

Part 2  Access Control 22


Something You Are
 Biometric
o “You are your key”  Schneier
 Examples
o Fingerprint
o Handwritten signature
o Facial recognition
o Speech recognition
o Gait (walking) recognition
o “Digital doggie” (odor recognition)
o Many more!
Part 2  Access Control 23
Why Biometrics?
 More secure replacement for passwords
 Cheap and reliable biometrics needed
o Today, an active area of research
 Biometrics are used in security today
o Thumbprint mouse
o Palm print for secure entry
o Fingerprint to unlock car door, etc.
 But biometrics not too popular
o Has not lived up to its promise (yet?)

Part 2  Access Control 24


Ideal Biometric
 Universal  applies to (almost) everyone
o In reality, no biometric applies to everyone
 Distinguishing  distinguish with certainty
o In reality, cannot hope for 100% certainty
 Permanent  physical characteristic being
measured never changes
o In reality, OK if it remains valid for a long time
 Collectable  easy to collect required data
o Depends on whether subjects are cooperative
 Also, safe, user-friendly, etc., etc.
Part 2  Access Control 25
Biometric Modes
 Identification  Who goes there?
o Compare one-to-many
o Example: The FBI fingerprint database
 Authentication  Are you who you say you are?
o Compare one-to-one
o Example: Thumbprint mouse
 Identification problem is more difficult
o More “random” matches since more comparisons
 We are interested in authentication
Part 2  Access Control 26
Enrollment vs Recognition
 Enrollment phase
o Subject’s biometric info put into database
o Must carefully measure the required info
o OK if slow and repeated measurement needed
o Must be very precise
o May be the weak point of many biometrics
 Recognition phase
o Biometric detection, when used in practice
o Must be quick and simple
o But must be reasonably accurate
Part 2  Access Control 27
Biometric Errors
 Fraud rate versus insult rate
o Fraud  Trudy mis-authenticated as Alice
o Insult  Alice not authenticated as Alice
 For any biometric, can decrease fraud or
insult, but other one will increase
 For example
o 99% voiceprint match  low fraud, high insult
o 30% voiceprint match  high fraud, low insult
 Equal error rate: rate where fraud == insult
o A way to compare different biometrics
Part 2  Access Control 28
Fingerprint Comparison
 Examples of loops, whorls, and arches
 Minutiae extracted from these features

[Images: loop (double), whorl, and arch fingerprint patterns]

Part 2  Access Control 29


Fingerprint: Enrollment

 Capture image of fingerprint


 Enhance image
 Identify points
Part 2  Access Control 30
Fingerprint: Recognition

 Extracted points are compared with


information stored in a database
 Is it a statistical match?
 Aside: Do identical twins’ fingerprints differ?
Part 2  Access Control 31
Hand Geometry
 A popular biometric
 Measures shape of hand
o Width of hand, fingers
o Length of fingers, etc.
 Human hands not unique
 Hand geometry sufficient
for many situations
 OK for authentication
 Not useful for ID problem

Part 2  Access Control 32


Hand Geometry
 Advantages
o Quick  1 minute for enrollment, 5
seconds for recognition
o Hands are symmetric  so what?
 Disadvantages
o Cannot use on very young or very old
o Relatively high equal error rate

Part 2  Access Control 33


Iris Patterns

 Iris pattern development is “chaotic”


 Little or no genetic influence
 Different even for identical twins
 Pattern is stable through lifetime
Part 2  Access Control 34
Iris Scan
 Scanner locates iris
 Take b/w photo
 Use polar coordinates…
 2-D wavelet transform
 Get 256 byte iris code

Part 2  Access Control 35


Measuring Iris Similarity
 Based on Hamming distance
 Define d(x,y) to be
o # of non match bits / # of bits compared
o d(0010,0101) = 3/4 and d(101111,101001) = 1/3
 Compute d(x,y) on 2048-bit iris code
o Perfect match is d(x,y) = 0
o For same iris, expected distance is 0.08
o At random, expect distance of 0.50
o Accept iris scan as match if distance < 0.32
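A small Python sketch of the distance computation and the match rule, with iris codes shown as bit strings for readability:

```python
def iris_distance(x, y):
    """Fraction of bit positions in which the two iris codes differ."""
    assert len(x) == len(y)
    return sum(a != b for a, b in zip(x, y)) / len(x)

def is_match(x, y, threshold=0.32):
    """Accept the scan as a match if the distance is below the threshold."""
    return iris_distance(x, y) < threshold

print(iris_distance("0010", "0101"))        # 0.75
print(iris_distance("101111", "101001"))    # 0.333...
```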
Part 2  Access Control 36
Iris Scan Error Rate
distance    Fraud rate
0.29        1 in 1.3 × 10^10
0.30        1 in 1.5 × 10^9
0.31        1 in 1.8 × 10^8
0.32        1 in 2.6 × 10^7
0.33        1 in 4.0 × 10^6
0.34        1 in 6.9 × 10^5
0.35        1 in 1.3 × 10^5   ← equal error rate
Part 2  Access Control 37
Attack on Iris Scan
 Good photo of eye can be scanned
o Attacker could use photo of eye
 Afghan woman was authenticated by
iris scan of old photo
o Story is here
 To prevent attack, scanner could use
light to be sure it is a “live” iris

Part 2  Access Control 38


Equal Error Rate Comparison
 Equal error rate (EER): fraud == insult rate
 Fingerprint biometric has EER of about 5%
 Hand geometry has EER of about 10^-3
 In theory, iris scan has EER of about 10^-6
o But in practice, may be hard to achieve
o Enrollment phase must be extremely accurate
 Most biometrics much worse than fingerprint!
 Biometrics useful for authentication…
o …but identification biometrics almost useless today
Part 2  Access Control 39
Biometrics: The Bottom Line
 Biometrics are hard to forge
 But attacker could
o Steal Alice’s thumb
o Photocopy Bob’s fingerprint, eye, etc.
o Subvert software, database, “trusted path” …
 And how to revoke a “broken” biometric?
 Biometrics are not foolproof
 Biometric use is limited today
 That should change in the (near?) future

Part 2  Access Control 40


Something You Have
 Something in your possession
 Examples include following…
o Car key
o Laptop computer (or MAC address)
o Password generator (next)
o ATM card, smartcard, etc.

Part 2  Access Control 41


2-factor Authentication
 Requires any 2 out of 3 of
o Something you know
o Something you have
o Something you are
 Examples
o ATM: Card and PIN
o Credit card: Card and signature
o Password generator: Device and PIN
o Smartcard with password/PIN

Part 2  Access Control 42


Single Sign-on
 A hassle to enter password(s) repeatedly
o Alice wants to authenticate only once
o “Credentials” stay with Alice wherever she goes
o Subsequent authentications transparent to Alice
 Kerberos --- example single sign-on protocol
 Single sign-on for the Internet?
o Microsoft: Passport
o Everybody else: Liberty Alliance
o Security Assertion Markup Language (SAML)

Part 2  Access Control 43


Authorization

Part 2  Access Control 44


Authentication vs
Authorization
 Authentication  Are you who you say you are?
o Restrictions on who (or what) can access system
 Authorization  Are you allowed to do that?
o Restrictions on actions of authenticated users
 Authorization is a form of access control
 Classic authorization enforced by
o Access Control Lists (ACLs)
o Capabilities (C-lists)

Part 2  Access Control 45


Lampson’s Access Control Matrix
 Subjects (users) index the rows
 Objects (resources) index the columns
                     OS   Accounting  Accounting  Insurance  Payroll
                          program     data        data       data
Bob                  rx   rx          r           ---        ---
Alice                rx   rx          r           rw         rw
Sam                  rwx  rwx         r           rw         rw
Accounting program   rx   rx          rw          rw         rw
Part 2  Access Control 46
Are You Allowed to Do That?
 Access control matrix has all relevant info
 Could be 1000’s of users, 1000’s of resources
 Then matrix with 1,000,000’s of entries
 How to manage such a large matrix?
 Need to check this matrix before access to
any resource is allowed
 How to make this efficient?

Part 2  Access Control 47


Access Control Lists (ACLs)
 ACL: store access control matrix by column
 Example: ACL for insurance data is the Insurance data column
                     OS   Accounting  Accounting  Insurance  Payroll
                          program     data        data       data
Bob                  rx   rx          r           ---        ---
Alice                rx   rx          r           rw         rw
Sam                  rwx  rwx         r           rw         rw
Accounting program   rx   rx          rw          rw         rw
Part 2  Access Control 48
Capabilities (or C-Lists)
 Store access control matrix by row
 Example: Capability for Alice is Alice’s row
                     OS   Accounting  Accounting  Insurance  Payroll
                          program     data        data       data
Bob                  rx   rx          r           ---        ---
Alice                rx   rx          r           rw         rw
Sam                  rwx  rwx         r           rw         rw
Accounting program   rx   rx          rw          rw         rw
Part 2  Access Control 49
ACLs vs Capabilities
[Figure: the same rights drawn two ways. In the Access Control List view,
arrows run from each file (file1, file2, file3) to the users (Alice, Bob,
Fred) allowed to access it; in the Capability view, arrows run from each
user to the files that user may access.]

 Note that arrows point in opposite directions…


 With ACLs, still need to associate users to files
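One way to see the column/row distinction is to hold the matrix as a Python dict of dicts; the helper names below are illustrative, not a real API:

```python
# matrix[subject][object] = rights (a slice of Lampson's access control matrix)
matrix = {
    "Bob":   {"OS": "rx", "accounting program": "rx", "accounting data": "r"},
    "Alice": {"OS": "rx", "accounting program": "rx", "accounting data": "r",
              "insurance data": "rw", "payroll data": "rw"},
}

def acl(obj):
    """ACL: the matrix stored by column -- who may do what to this object."""
    return {subj: rights[obj] for subj, rights in matrix.items() if obj in rights}

def capability(subj):
    """Capability (C-list): the matrix stored by row -- what this subject may access."""
    return dict(matrix[subj])

print(acl("insurance data"))   # {'Alice': 'rw'}
print(capability("Bob"))       # Bob's C-list
```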
Part 2  Access Control 50
ACLs vs Capabilities
 ACLs
o Good when users manage their own files
o Protection is data-oriented
o Easy to change rights to a resource
 Capabilities
o Easy to delegate---avoid the confused deputy
o Easy to add/delete users
o More difficult to implement
o The “Zen of information security”
 Capabilities loved by academics
o Capability Myths Demolished

Part 2  Access Control 51


CAPTCHA

Part 2  Access Control 52


Turing Test
 Proposed by Alan Turing in 1950
 Human asks questions to another human
and a computer, without seeing either
 If questioner cannot distinguish human
from computer, computer passes the test
 The gold standard in artificial intelligence
 No computer can pass this today
o But some claim to be close to passing

Part 2  Access Control 53


CAPTCHA
 CAPTCHA
o Completely Automated Public Turing test to tell
Computers and Humans Apart
 Automated  test is generated and scored
by a computer program
 Public  program and data are public
 Turing test to tell…  humans can pass the
test, but machines cannot pass
o Also known as HIP == Human Interactive Proof
 Like an inverse Turing test (well, sort of…)
Part 2  Access Control 54
CAPTCHA Paradox?
 “…CAPTCHA is a program that can
generate and grade tests that it itself
cannot pass…”
o “…much like some professors…”
 Paradox  computer creates and scores
test that it cannot pass!
 CAPTCHA used so that only humans can get
access (i.e., no bots/computers)
 CAPTCHA is for access control
Part 2  Access Control 55
Do CAPTCHAs Exist?
 Test: Find 2 words in the following

 Easy for most humans


 A (difficult?) OCR problem for computer
o OCR == Optical Character Recognition
Part 2  Access Control 56
CAPTCHAs
 Current types of CAPTCHAs
o Visual  like previous example
o Audio  distorted words or music
 No text-based CAPTCHAs
o Maybe this is impossible…

Part 2  Access Control 57


Part III: Protocols

Part 3  Protocols 58
Protocol
 Human protocols  the rules followed in
human interactions
o Example: Asking a question in class
 Networking protocols  rules followed in
networked communication systems
o Examples: HTTP, FTP, etc.
 Security protocol  the (communication)
rules followed in a security application
o Examples: SSL, IPSec, Kerberos, etc.

Part 3  Protocols 59
Protocols
 Protocol flaws can be very subtle
 Several well-known security protocols
have significant flaws
o Including WEP, GSM, and IPSec
 Implementation errors can occur
o Recent IE implementation of SSL
 Not easy to get protocols right…

Part 3  Protocols 60
Ideal Security Protocol
 Must satisfy security requirements
o Requirements need to be precise
 Efficient
o Small computational requirement
o Small bandwidth usage, minimal delays…
 Robust
o Works when attacker tries to break it
o Works even if environment changes
 Easy to use & implement, flexible…
 Difficult to satisfy all of these!
Part 3  Protocols 61
Chapter 9:
Simple Security Protocols
“I quite agree with you,” said the Duchess; “and the moral of that is
‘Be what you would seem to be’ or
if you'd like it put more simply: ‘Never imagine yourself not to be
otherwise than what it might appear to others that what you were
or might have been was not otherwise than what you
had been would have appeared to them to be otherwise.’ ”
 Lewis Carroll, Alice in Wonderland

Seek simplicity, and distrust it.


 Alfred North Whitehead

Part 2  Access Control 62


Authentication Protocols

Part 3  Protocols 63
Authentication
 Alice must prove her identity to Bob
o Alice and Bob can be humans or computers
 May also require Bob to prove he’s Bob
(mutual authentication)
 Probably need to establish a session key
 May have other requirements, such as
o Use public keys
o Use symmetric keys
o Use hash functions
o Anonymity, plausible deniability, etc., etc.

Part 3  Protocols 64
Authentication
 Authentication on a stand-alone computer is
relatively simple
o Hash password with salt
o “Secure path,” attacks on authentication
software, keystroke logging, etc., can be issues
 Authentication over a network is challenging
o Attacker can passively observe messages
o Attacker can replay messages
o Active attacks possible (insert, delete, change)
Part 3  Protocols 65
Simple Authentication
“I’m Alice”

Prove it

My password is “frank”
Alice Bob

 Simple and may be OK for standalone system


 But insecure for networked system
o Subject to a replay attack (next 2 slides)
o Also, Bob must know Alice’s password
Part 3  Protocols 66
Authentication Attack
“I’m Alice”

Prove it

My password is “frank”
Alice Bob

Trudy
Part 3  Protocols 67
Authentication Attack

“I’m Alice”

Prove it

My password is “frank”
Trudy Bob

 This is an example of a replay attack


 How can we prevent a replay?
Part 3  Protocols 68
Simple Authentication

I’m Alice, my password is “frank”

Alice Bob

 More efficient, but…


 … same problem as previous version
Part 3  Protocols 69
Better Authentication
“I’m Alice”

Prove it

h(Alice’s password)
Alice Bob

 Better since it hides Alice’s password


o From both Bob and Trudy
 But still subject to replay
Part 3  Protocols 70
Challenge-Response
 To prevent replay, use challenge-response
o Goal is to ensure “freshness”
 Suppose Bob wants to authenticate Alice
o Challenge sent from Bob to Alice
 Challenge is chosen so that…
o Replay is not possible
o Only Alice can provide the correct response
o Bob can verify the response

Part 3  Protocols 71
Nonce
 To ensure freshness, can employ a nonce
o Nonce == number used once
 What to use for nonces?
o That is, what is the challenge?
 What should Alice do with the nonce?
o That is, how to compute the response?
 How can Bob verify the response?
 Should we rely on passwords or keys?

Part 3  Protocols 72
Challenge-Response
“I’m Alice”

Nonce

h(Alice’s password, Nonce)


Alice Bob

 Nonce is the challenge


 The hash is the response
 Nonce prevents replay, ensures freshness
 Password is something Alice knows
 Note: Bob must know Alice’s pwd to verify
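A minimal sketch of this exchange in Python, with SHA-256 as the hash (in a real system Bob stores the password, or better, a key derived from it):

```python
import os, hashlib

def challenge():
    """Bob picks a fresh nonce, so a recorded response cannot be replayed."""
    return os.urandom(16)

def response(password, nonce):
    """Alice's reply: h(Alice's password, nonce)."""
    return hashlib.sha256(password.encode() + nonce).hexdigest()

nonce = challenge()
resp = response("frank", nonce)
assert resp == response("frank", nonce)          # Bob, knowing the pwd, verifies
assert resp != response("frank", challenge())    # old response fails a new challenge
```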
Part 3  Protocols 73
Generic Challenge-Response
“I’m Alice”

Nonce

Something that could only be from Alice
(and Bob can verify)

Alice                                            Bob

 In practice, how to achieve this?


 Hashed password works, but…
 Encryption is better here (Why?)
Part 3  Protocols 74
Symmetric Key Notation
 Encrypt plaintext P with key K
C = E(P,K)
 Decrypt ciphertext C with key K
P = D(C,K)
 Here, we are concerned with attacks on
protocols, not attacks on crypto
o So, we assume crypto algorithms are secure

Part 3  Protocols 75
Authentication: Symmetric Key
 Alice and Bob share symmetric key K
 Key K known only to Alice and Bob
 Authenticate by proving knowledge of
shared symmetric key
 How to accomplish this?
o Cannot reveal key, must not allow replay
(or other) attack, must be verifiable, …

Part 3  Protocols 76
Authentication with
Symmetric Key
“I’m Alice”
R
E(R,K)
Alice, K Bob, K

 Secure method for Bob to authenticate Alice


 Alice does not authenticate Bob
 So, can we achieve mutual authentication?
Part 3  Protocols 77
Mutual Authentication?

“I’m Alice”, R

E(R,K)

E(R,K)
Alice, K Bob, K

 What’s wrong with this picture?


 “Alice” could be Trudy (or anybody else)!
Part 3  Protocols 78
Mutual Authentication
 Since we have a secure one-way
authentication protocol…
 The obvious thing to do is to use the
protocol twice
o Once for Bob to authenticate Alice
o Once for Alice to authenticate Bob
 This has got to work…

Part 3  Protocols 79
Mutual Authentication

“I’m Alice”, RA

RB, E(RA, K)

E(RB, K)
Alice, K Bob, K

 This provides mutual authentication…


 …or does it? See the next slide
Part 3  Protocols 80
Mutual Authentication Attack
1. “I’m Alice”, RA
2. RB, E(RA, K)

Trudy Bob, K

3. “I’m Alice”, RB

4. RC, E(RB, K)

Trudy Bob, K
Part 3  Protocols 81
Mutual Authentication
 Our one-way authentication protocol is
not secure for mutual authentication
o Protocols are subtle!
o The “obvious” thing may not be secure
 Also, if assumptions or environment
change, protocol may not be secure
o This is a common source of security failure
o For example, Internet protocols

Part 3  Protocols 82
Symmetric Key Mutual
Authentication
“I’m Alice”, RA

RB, E(“Bob”,RA,K)

E(“Alice”,RB,K)
Alice, K Bob, K

 Do these “insignificant” changes help?


 Yes!
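A sketch of the repaired protocol, with each response binding the sender's identity to the other side's nonce. Fernet, from the third-party cryptography package, stands in for E(., K); the message framing is illustrative only.

```python
import os
from cryptography.fernet import Fernet

K = Fernet.generate_key()
shared = Fernet(K)           # E(., K) / D(., K), known to Alice and Bob only

def respond(identity, nonce):
    """E(identity, nonce, K): only a holder of K can produce this."""
    return shared.encrypt(identity.encode() + b"|" + nonce)

def check(blob, expected_identity, expected_nonce):
    ident, _, nonce = shared.decrypt(blob).partition(b"|")
    return ident == expected_identity.encode() and nonce == expected_nonce

R_A, R_B = os.urandom(16), os.urandom(16)
msg2 = respond("Bob", R_A)       # Bob answers Alice's challenge as "Bob"
msg3 = respond("Alice", R_B)     # Alice answers Bob's challenge as "Alice"
assert check(msg2, "Bob", R_A) and check(msg3, "Alice", R_B)
# Trudy can no longer reflect Bob's own reply back at him: it names "Bob",
# not "Alice", so it fails the check on the third message.
```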
Part 3  Protocols 83
Public Key Notation
 Encrypt M with Alice’s public key: {M}Alice
 Sign M with Alice’s private key: [M]Alice
 Then
o [{M}Alice ]Alice = M
o {[M]Alice }Alice = M
 Anybody can use Alice’s public key
 Only Alice can use her private key

Part 3  Protocols 84
Public Key Authentication

“I’m Alice”
{R}Alice

R
Alice Bob
 Is this secure?
 Trudy can get Alice to decrypt anything!
o So, should have two key pairs
Part 3  Protocols 85
Public Key Authentication

“I’m Alice”

[R]Alice
Alice Bob
 Is this secure?
 Trudy can get Alice to sign anything!
o Same as previous: should have two key pairs
Part 3  Protocols 86
Public Keys
 Generally, a bad idea to use the same
key pair for encryption and signing
 Instead, should have…
o …one key pair for encryption/decryption…
o …and a different key pair for
signing/verifying signatures

Part 3  Protocols 87
Session Key
 Usually, a session key is required
o I.e., a symmetric key for a particular session
o Used for confidentiality and/or integrity
 How to authenticate and establish a
session key (i.e., shared symmetric key)?
o When authentication completed, want Alice and
Bob to share a session key
o Trudy cannot break the authentication…
o …and Trudy cannot determine the session key

Part 3  Protocols 88
Authentication & Session Key
“I’m Alice”, R
{R,K}Alice

{R +1,K}Bob
Alice Bob

 Is this secure?
o Alice is authenticated and session key is secure
o Alice’s “nonce”, R, useless to authenticate Bob
o The key K is acting as Bob’s nonce to Alice
 No mutual authentication
Part 3  Protocols 89
Public Key Authentication
and Session Key
“I’m Alice”, R
[R,K]Bob

[R +1,K]Alice
Alice Bob

 Is this secure?
o Mutual authentication (good), but…
o … session key is not secret (very bad)

Part 3  Protocols 90
Public Key Authentication
and Session Key
“I’m Alice”, R
{[R,K]Bob}Alice

{[R +1,K]Alice}Bob
Alice Bob

 Is this secure?
 Seems to be OK
 Mutual authentication and session key!
Part 3  Protocols 91
Public Key Authentication
and Session Key
“I’m Alice”, R
[{R,K}Alice]Bob

[{R +1,K}Bob]Alice
Alice Bob

 Is this secure?
 Seems to be OK
o Anyone can see {R,K}Alice and {R +1,K}Bob
Part 3  Protocols 92
Perfect Forward Secrecy
 Consider this “issue”…
o Alice encrypts message with shared key K and
sends ciphertext to Bob
o Trudy records ciphertext and later attacks
Alice’s (or Bob’s) computer to recover K
o Then Trudy decrypts recorded messages
 Perfect forward secrecy (PFS): Trudy
cannot later decrypt recorded ciphertext
o Even if Trudy gets key K or other secret(s)
 Is PFS possible?
Part 3  Protocols 93
Perfect Forward Secrecy
 Suppose Alice and Bob share key K
 For perfect forward secrecy, Alice and Bob
cannot use K to encrypt
 Instead they must use a session key KS and
forget it after it’s used
 Can Alice and Bob agree on session key KS
in a way that ensures PFS?

Part 3  Protocols 94
Naïve Session Key Protocol

E(KS, K)

E(messages, KS)

Alice, K Bob, K

 Trudy could record E(KS, K)


 If Trudy later gets K then she can get KS
o Then Trudy can decrypt recorded messages
Part 3  Protocols 95
Perfect Forward Secrecy
 We use Diffie-Hellman for PFS
 Recall: public g and p

g^a mod p
g^b mod p

Alice, a Bob, b
 But Diffie-Hellman is subject to MiM
 How to get PFS and prevent MiM?

Part 3  Protocols 96
Perfect Forward Secrecy
E(g^a mod p, K)
E(g^b mod p, K)

Alice: K, a Bob: K, b
 Session key KS = g^ab mod p
 Alice forgets a, Bob forgets b
 So-called Ephemeral Diffie-Hellman
 Neither Alice nor Bob can later recover KS
 Are there other ways to achieve PFS?
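A toy ephemeral Diffie-Hellman exchange in Python; the prime 2^127 - 1 is only for illustration, and real deployments use standardized 2048-bit-plus groups or elliptic curves:

```python
import secrets

p = 2**127 - 1      # toy modulus (a Mersenne prime); far too small for real use
g = 5

a = secrets.randbelow(p - 3) + 2     # Alice's ephemeral secret exponent
b = secrets.randbelow(p - 3) + 2     # Bob's ephemeral secret exponent

A = pow(g, a, p)    # g^a mod p, sent to Bob encrypted with long-term key K
B = pow(g, b, p)    # g^b mod p, sent to Alice encrypted with K

K_S = pow(B, a, p)              # Alice computes g^ab mod p
assert K_S == pow(A, b, p)      # Bob arrives at the same session key

del a, b    # both sides "forget" their exponents, so K_S cannot be recovered later
```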
Part 3  Protocols 97
Mutual Authentication,
Session Key and PFS
“I’m Alice”, RA
RB, [{RA, g^b mod p}Alice]Bob

[{RB, g^a mod p}Bob]Alice


Alice Bob

 Session key is K = g^ab mod p


 Alice forgets a and Bob forgets b
 If Trudy later gets Bob’s and Alice’s secrets,
she cannot recover session key K
Part 3  Protocols 98
Timestamps
 A timestamp T is derived from current time
 Timestamps used in some security protocols
o Kerberos, for example
 Timestamps reduce number of msgs (good)
o Like a nonce that both sides know in advance
 “Time” is a security-critical parameter (bad)
 Clocks never exactly the same, so must allow
for clock skew  creates risk of replay
o How much clock skew is enough?

Part 3  Protocols 99
Public Key Authentication
with Timestamp T
“I’m Alice”, {[T, K]Alice}Bob
{[T +1, K]Bob}Alice

Alice Bob

 Secure mutual authentication?


 Session key?
 Seems to be OK
Part 3  Protocols 100
Public Key Authentication
with Timestamp T
“I’m Alice”, [{T, K}Bob]Alice
[{T +1, K}Alice]Bob

Alice Bob

 Secure authentication and session key?


 Trudy can use Alice’s public key to find
{T, K}Bob and then…
Part 3  Protocols 101
Public Key Authentication
with Timestamp T

“I’m Trudy”, [{T, K}Bob]Trudy


[{T +1, K}Trudy]Bob

Trudy Bob

 Trudy obtains Alice-Bob session key K


 Note: Trudy must act within clock skew

Part 3  Protocols 102


Public Key Authentication
 Sign and encrypt with nonce…
o Secure
 Encrypt and sign with nonce…
o Secure
 Sign and encrypt with timestamp…
o Secure
 Encrypt and sign with timestamp…
o Insecure
 Protocols can be subtle!

Part 3  Protocols 103


Public Key Authentication
with Timestamp T

“I’m Alice”, [{T, K}Bob]Alice


[{T +1}Alice]Bob

Alice Bob

 Is this “encrypt and sign” secure?


o Yes, seems to be OK
 Does “sign and encrypt” also work here?
Part 3  Protocols 104
Chapter 10:
Real-World Protocols
The wire protocol guys don't worry about security because that's really
a network protocol problem. The network protocol guys don't
worry about it because, really, it's an application problem.
The application guys don't worry about it because, after all,
they can just use the IP address and trust the network.
 Marcus J. Ranum

In the real world, nothing happens at the right place at the right time.
It is the job of journalists and historians to correct that.
 Mark Twain

Part 2  Access Control 105


Real-World Protocols
 Next, we look at real protocols
o SSL  practical security on the Web
o Kerberos  symmetric key, single sign-on
o GSM  mobile phone (in)security

Part 3  Protocols 106


Secure Socket Layer

Part 3  Protocols 107


Socket layer
 “Socket layer” lives between application
and transport layers
 SSL usually between HTTP and TCP

[Figure: protocol stack: application (user space), the socket “layer”,
then transport, network, and link (in the OS), and physical (the NIC)]

Part 3  Protocols 108


What is SSL?
 SSL is the protocol used for majority of
secure transactions on the Internet
 For example, if you want to buy a book at
amazon.com…
o You want to be sure you are dealing with Amazon
(authentication)
o Your credit card information must be protected
in transit (confidentiality and/or integrity)
o As long as you have money, Amazon does not
care who you are
o So, no need for mutual authentication

Part 3  Protocols 109


Simplified SSL Protocol
Can we talk?, cipher list, RA
certificate, cipher, RB
{S}Bob, E(h(msgs,CLNT,K),K)
h(msgs,SRVR,K)
Alice Data protected with key K Bob

 S is known as pre-master secret


 K = h(S,RA,RB)
 “msgs” means all previous messages
 CLNT and SRVR are constants

Part 3  Protocols 110


SSL Keys
6 “keys” derived from K = h(S,RA,RB)
o 2 encryption keys: send and receive
o 2 integrity keys: send and receive
o 2 IVs: send and receive
o Why different keys in each direction?
 Q: Why is h(msgs,CLNT,K) encrypted?
 A: Apparently, it adds no security…
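A simplified sketch of the derivation, following the slide's notation K = h(S,RA,RB) with SHA-256 standing in for h; the per-direction labels are made up for illustration (real TLS derives keys with a PRF over the master secret):

```python
import os, hashlib

def h(*parts):
    return hashlib.sha256(b"".join(parts)).digest()

S = os.urandom(48)                        # pre-master secret, chosen by the client
R_A, R_B = os.urandom(28), os.urandom(28)

K = h(S, R_A, R_B)                        # shared secret from the handshake

# Six per-direction quantities derived from K (illustrative labels)
labels = [b"client-enc", b"server-enc", b"client-mac",
          b"server-mac", b"client-iv", b"server-iv"]
keys = {label.decode(): h(K, label) for label in labels}
```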

Part 3  Protocols 111


SSL Authentication
 Alice authenticates Bob, not vice-versa
o How does client authenticate server?
o Why would server not authenticate client?
 Mutual authentication is possible: Bob
sends certificate request in message 2
o Then client must have a valid certificate
o But, if server wants to authenticate client,
server could instead require password

Part 3  Protocols 112


SSL MiM Attack?
RA RA
certificateT, RB certificateB, RB
{S1}Trudy,E(X1,K1) {S2}Bob,E(X2,K2)
h(Y1,K1) h(Y2,K2)
Alice E(data,K1) Trudy E(data,K2) Bob

 Q: What prevents this MiM “attack”?


 A: Bob’s certificate must be signed by a
certificate authority (CA)
 What does browser do if signature not valid?
 What does user do when browser complains?
Part 3  Protocols 113
SSL Sessions vs Connections
 SSL session is established as shown on
previous slides
 SSL designed for use with HTTP 1.0
 HTTP 1.0 often opens multiple simultaneous
(parallel) connections
o Multiple connections per session
 SSL session is costly, public key operations
 SSL has an efficient protocol for opening
new connections given an existing session
Part 3  Protocols 114
SSL Connection
session-ID, cipher list, RA
session-ID, cipher, RB,
h(msgs,SRVR,K)
h(msgs,CLNT,K)

Alice Protected data Bob

 Assuming SSL session exists


 So, S is already known to Alice and Bob
 Both sides must remember session-ID
 Again, K = h(S,RA,RB)
 No public key operations! (relies on known S)
Part 3  Protocols 115
Kerberos

Part 3  Protocols 116


Kerberos
 In Greek mythology, Kerberos is 3-headed
dog that guards entrance to Hades
o “Wouldn’t it make more sense to guard the exit?”
 In security, Kerberos is an authentication
protocol based on symmetric key crypto
o Originated at MIT
o Based on work by Needham and Schroeder
o Relies on a Trusted Third Party (TTP)

Part 3  Protocols 117


Motivation for Kerberos
 Authentication using public keys
o N users  N key pairs
 Authentication using symmetric keys
o N users requires (on the order of) N^2 keys
 Symmetric key case does not scale
 Kerberos based on symmetric keys but only
requires N keys for N users
- Security depends on TTP
+ No PKI is needed

Part 3  Protocols 118


Kerberos KDC
 Kerberos Key Distribution Center or KDC
o KDC acts as the TTP
o TTP is trusted, so it must not be compromised
 KDC shares symmetric key KA with Alice,
key KB with Bob, key KC with Carol, etc.
 And a master key KKDC known only to KDC
 KDC enables authentication, session keys
o Session key for confidentiality and integrity
 In practice, crypto algorithm is DES
Part 3  Protocols 119
Kerberos Tickets
 KDC issues tickets containing info needed to
access network resources
 KDC also issues Ticket-Granting Tickets
or TGTs that are used to obtain tickets
 Each TGT contains
o Session key
o User’s ID
o Expiration time
 Every TGT is encrypted with KKDC
o So, TGT can only be read by the KDC
Part 3  Protocols 120
Kerberized Login
 Alice enters her password
 Then Alice’s computer does following:
o Derives KA from Alice’s password
o Uses KA to get TGT for Alice from KDC
 Alice then uses her TGT (credentials) to
securely access network resources
 Plus: Security is transparent to Alice
 Minus: KDC must be secure  it’s trusted!

Part 3  Protocols 121


Kerberized Login
Alice’s password →        “Alice wants a TGT” →
                          ← E(SA,TGT,KA)

Alice          Computer          KDC


 Key KA = h(Alice’s password)
 KDC creates session key SA
 Alice’s computer decrypts SA and TGT
o Then it forgets KA
 TGT = E(“Alice”, SA, KKDC)
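A sketch of this exchange. Fernet (third-party cryptography package) stands in for E(., key), a Fernet key is derived as h(secret) to mirror KA = h(Alice’s password), and the JSON framing and field names are illustrative; the TGT’s expiration time is omitted.

```python
import base64, hashlib, json, os
from cryptography.fernet import Fernet

def key_from(secret: bytes) -> Fernet:
    """Derive a usable symmetric key from a secret, mirroring K = h(secret)."""
    return Fernet(base64.urlsafe_b64encode(hashlib.sha256(secret).digest()))

K_KDC = key_from(os.urandom(32))         # master key, known only to the KDC
K_A = key_from(b"Alice's password")      # KA = h(Alice's password)

# KDC: pick session key SA, build TGT = E("Alice", SA, KKDC), reply E(SA, TGT, KA)
S_A = os.urandom(32)
TGT = K_KDC.encrypt(json.dumps({"user": "Alice", "SA": S_A.hex()}).encode())
reply = K_A.encrypt(json.dumps({"SA": S_A.hex(), "TGT": TGT.decode()}).encode())

# Alice's computer: re-derive KA from the typed password, recover SA, keep TGT opaque
recovered = json.loads(key_from(b"Alice's password").decrypt(reply))
assert bytes.fromhex(recovered["SA"]) == S_A
```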
Part 3  Protocols 122
Alice Requests “Ticket to Bob”
“I want to talk to Bob” →        REQUEST →
                                 ← REPLY

Alice          Computer          KDC


 REQUEST = (TGT, authenticator)
o authenticator = E(timestamp, SA)
 REPLY = E(“Bob”, KAB, ticket to Bob, SA)
o ticket to Bob = E(“Alice”, KAB, KB)
 KDC gets SA from TGT to verify timestamp
Part 3  Protocols 123
Alice Uses Ticket to Bob

ticket to Bob, authenticator

E(timestamp + 1, KAB)

Alice’s Computer                         Bob

 ticket to Bob = E(“Alice”, KAB, KB)


 authenticator = E(timestamp, KAB)
 Bob decrypts “ticket to Bob” to get KAB which he
then uses to verify timestamp
Part 3  Protocols 124
Accessing a Remote Principal
1. Alice → workstation: rlogin Bob
2. Workstation → KDC: “Alice”, “Bob”, TGT, SA{timestamp}
3. KDC → workstation: SA{“Bob”, KAB, KB{“Alice”, KAB}}
4. Workstation → Bob: KB{“Alice”, KAB}, KAB{timestamp}
5. Bob → workstation: KAB{timestamp+1}
Kerberos
 Key SA used in authentication
o For confidentiality/integrity
 Timestamps for authentication and
replay protection
 Recall, that timestamps…
o Reduce the number of messages, like a
nonce that is known in advance
o But, “time” is a security-critical parameter
Part 3  Protocols 126
Kerberos Questions
 When Alice logs in, KDC sends E(SA, TGT, KA)
where TGT = E(“Alice”, SA, KKDC)
Q: Why is TGT encrypted with KA?
A: Extra work for no added security!
 In Alice’s “Kerberized” login to Bob, why
can Alice remain anonymous?
 Why is “ticket to Bob” sent to Alice?
o Why doesn’t KDC send it directly to Bob?

Part 3  Protocols 127


Kerberos Alternatives
 Could have Alice’s computer remember
password and use that for authentication
o Then no KDC required
o But hard to protect passwords
o Also, does not scale
 Could have KDC remember session key
instead of putting it in a TGT
o Then no need for TGT
o But stateless KDC is major feature of Kerberos

Part 3  Protocols 128


GSM (In)Security

Part 3  Protocols 129


Cell Phones
 First generation cell phones
o Brick-sized, analog, few standards
o Little or no security
o Susceptible to cloning
 Second generation cell phones: GSM
o Began in 1982 as “Groupe Speciale Mobile”
o Now, Global System for Mobile Communications
 Third generation?
o 3rd Generation Partnership Project (3GPP)

Part 3  Protocols 130


GSM System Overview

[Figure: the mobile talks over the air interface to a base station in the
visited network (base station controller, VLR); the visited network connects
over “land lines” (PSTN, Internet, etc.) to the home network (HLR, AuC)]

Part 3  Protocols 131


GSM System Components
 Mobile phone
o Contains SIM (Subscriber
Identity Module)
 SIM is the security module
o IMSI (International Mobile
Subscriber ID)
o User key: Ki (128 bits)
o Tamper resistant (smart card)
o PIN activated (usually not used)

Part 3  Protocols 132


GSM System Components
 Visited network  network where mobile is
currently located
o Base station  one “cell”
o Base station controller  manages many cells
o VLR (Visitor Location Register)  info on all
visiting mobiles currently in the network
 Home network  “home” of the mobile
o HLR (Home Location Register)  keeps track of
most recent location of mobile
o AuC (Authentication Center)  has IMSI and Ki
Part 3  Protocols 133
GSM Security Goals
 Primary design goals
o Make GSM as secure as ordinary telephone
o Prevent phone cloning
 Not designed to resist active attacks
o At the time this seemed infeasible
o Today such attacks are feasible…
 Designers considered biggest threats to be
o Insecure billing
o Corruption
o Other low-tech attacks
Part 3  Protocols 134
GSM Security Features
 Anonymity
o Intercepted traffic does not identify user
o Not so important to phone company
 Authentication
o Necessary for proper billing
o Very, very important to phone company!
 Confidentiality
o Confidentiality of calls over the air interface
o Not important to phone company
o May be important for marketing
Part 3  Protocols 135
GSM: Authentication
 Caller is authenticated to base station
 Authentication is not mutual
 Authentication via challenge-response
o Home network generates RAND and computes
XRES = A3(RAND, Ki) where A3 is a hash
o Then (RAND,XRES) sent to base station
o Base station sends challenge RAND to mobile
o Mobile’s response is SRES = A3(RAND, Ki)
o Base station verifies SRES = XRES
 Note: Ki never leaves home network!
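A sketch of the challenge-response in Python, with a SHA-256-based stand-in for A3 (the real A3 is operator-chosen, often COMP128, and produces a 32-bit SRES):

```python
import os, hashlib

def A3(rand: bytes, ki: bytes) -> bytes:
    """Stand-in for A3: hash RAND with Ki down to a 32-bit response."""
    return hashlib.sha256(b"A3" + rand + ki).digest()[:4]

Ki = os.urandom(16)    # 128-bit subscriber key, held only by the SIM and the AuC

# Home network: choose RAND, precompute XRES, hand (RAND, XRES) to the base station
RAND = os.urandom(16)
XRES = A3(RAND, Ki)

# Mobile: receives RAND as the challenge and answers with SRES
SRES = A3(RAND, Ki)

# Base station: compares the two without ever seeing Ki
assert SRES == XRES
```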
Part 3  Protocols 136
GSM: Confidentiality
 Data encrypted with stream cipher
 Error rate estimated at about 1/1000
o This error rate is too high for a block cipher
 Encryption key Kc
o Home network computes Kc = A8(RAND, Ki)
where A8 is a hash
o Then Kc sent to base station with (RAND,XRES)
o Mobile computes Kc = A8(RAND, Ki)
o Keystream generated from A5(Kc)
 Note: Ki never leaves home network!
Part 3  Protocols 137
GSM Security
1. Mobile → Base Station: IMSI
2. Base Station → Home Network: IMSI
3. Home Network → Base Station: (RAND, XRES, Kc)
4. Base Station → Mobile: RAND
5. Mobile → Base Station: SRES
6. Mobile ↔ Base Station: encrypt with Kc

 SRES and Kc must be uncorrelated


o Even though both are derived from RAND and Ki
 Must not be possible to deduce Ki from known
RAND/SRES pairs (known plaintext attack)
 Must not be possible to deduce Ki from chosen
RAND/SRES pairs (chosen plaintext attack)
o With possession of SIM, attacker can choose RAND’s

Part 3  Protocols 138


GSM Insecurity (1)
 Hash used for A3/A8 is COMP128
o Broken by 160,000 chosen plaintexts
o With SIM, can get Ki in 2 to 10 hours
 Encryption between mobile and base
station, but no encryption from base
station to base station controller
o Often transmitted over microwave link
 Encryption algorithm A5/1
o Broken with 2 seconds of known plaintext

Part 3  Protocols 139


GSM Insecurity (2)
 Attacks on SIM card
o Optical Fault Induction  could attack SIM
with a flashbulb to recover Ki
o Partitioning Attacks  using timing and power
consumption, could recover Ki with only 8
adaptively chosen “plaintexts”
 With possession of SIM, attacker could
recover Ki in seconds

Part 3  Protocols 140


GSM Insecurity (3)
 Fake base station exploits two flaws
o Encryption not automatic
o Base station not authenticated

[Figure: Mobile ↔ fake base station: RAND, SRES, then no encryption;
the fake base station relays the call to a real base station, which
routes it to the destination]

 Note: GSM bill goes to fake base station!

Part 3  Protocols 141


GSM Insecurity (4)
 Denial of service is possible
o Jamming (always an issue in wireless)
 Can replay triple: (RAND,XRES,Kc)
o One compromised triple gives attacker a
key Kc that is valid forever
o No replay protection here

Part 3  Protocols 142


3GPP: 3rd Generation
Partnership Project
 3G security built on GSM (in)security
 3G fixed known GSM security problems
o Mutual authentication
o Integrity-protect signaling (such as “start
encryption” command)
o Keys (encryption/integrity) cannot be reused
o Triples cannot be replayed
o Strong encryption algorithm (KASUMI)
o Encryption extended to base station controller
Part 3  Protocols 143
