
Keywords:

  • biometrics;
  • privacy–security tradeoff;
  • trust model;
  • syndrome coding;
  • template protection;
  • BioPSTM

ABSTRACT

  1. ABSTRACT
  2. INTRODUCTION
  3. BACKGROUND INFORMATION AND RELATED WORKS
  4. SYNDROME-BASED BIOMETRIC AUTHENTICATION
  5. SECURITY AND PRIVACY REQUIREMENTS IN SYNDROME-BASED BIOMETRIC AUTHENTICATION SYSTEMS
  6. PROPOSED FORMAL MODEL
  7. CASE STUDY—COUNTRY PROFILES
  8. CONCLUSION
  9. ACKNOWLEDGEMENTS
  10. REFERENCES

This paper presents a formal model, namely the Biometric Privacy–Security–Trust Model (BioPSTM), aiming to describe the tradeoff between privacy and security and their relationship with trust in biometric authentication systems. The relationship between trust and the privacy–security pair requires a comprehensive approach that considers user acceptance and the pricing between privacy and security. The proposed model is novel in that it combines a formal formulation of the tradeoff between privacy and security with trust over a user acceptance model. The formal model presents a three-dimensional approach to indicate demand-responsive pricing between privacy, security, and trust. The model is interpreted over a general syndrome-based biometric template protection method by discussing possible privacy and security requirements. The proposed model has been applied to countries that are aware of biometric security technologies. The evaluation of country profiles presents an overall description of the user acceptance model and its relationship with biometric technologies. Copyright © 2012 John Wiley & Sons, Ltd.

INTRODUCTION


Increased demand for digital signature schemes has promoted research interest in the interaction between biometrics and cryptography. Used in conjunction with cryptography, biometrics can be instrumental in linking a person with a secure digital signature. The use of biometric information can mitigate the drawbacks of traditional cryptosystems, which require users to protect a secret key by selecting a password or carrying media such as tokens or smart cards. Using personal entropy (e.g., biometrics) instead avoids hazards such as stolen passwords, stolen tokens, or easily guessed passphrases. Biometric-based keys have several advantages: they eliminate the need to memorize long bit streams, and because the biometric features of a person form an intrinsic characteristic, they cannot be easily transferred to a third party. It is also important that keys generated from biometrics be unique and exactly the same on every use. These requirements encourage researchers to devise biometric cryptographic keys that do not alter with varying age or changing environmental/health conditions [1].

One of the foremost debates in biometric identity management is the question of guaranteeing the security of transactions while preserving the privacy of individuals. Privacy and security challenges intuitively address a fundamental tradeoff. Practically, it is very difficult to achieve a fully secure and perfectly privacy-preserving biometric authentication system. Designers should be aware of this tradeoff and develop systems that optimize the Pareto front between privacy and security. The basic idea in optimizing the privacy–security tradeoff is to diminish authentication failures (i.e., false acceptance and false rejection) while generating a reliable syndrome and biohash from the enrolled biometric instead of using the original trait. The syndrome (or sketch) of the biometric input, generated as helper data, can be stored in a smart card or database to assist biometric signature recovery during the release stage. Similarly, unlike traditional cryptographic hashing, biohashing extracts bit strings from biometric traits that can be reliably discriminated among a population [7]. Here, a biohash can be defined as a user-specific compact code produced by noninvertible transform functions through mixing biometric features with pseudo-random numbers, that is, synthetically created keys. Note that the biohash, or biometric hashing in general, aims to conceal the biometric information, and recognition is performed by calculating the difference between the source and template biometric hash strings. In biohashing, there still exists a possibility either of leaking private information about the owner of the biometric trait or of failing to verify the individual. By contrast, traditional cryptographic hashing is sensitive to even a one-bit change in the source binary string, which causes a significant change in the resulting hashed string (avalanche effect).

The so-called privacy–security tradeoff is the constraint that the stored data must not provide any personal information about the enrolled biometric template, to preserve privacy, whereas the security of the users is guaranteed by maximizing the rate at which the biometric template can be recovered successfully even in noisy circumstances. Because the helper data must contain certain information about the biometric measurements to assist the recovery of the biometric template in the release stage, there exists an obvious tradeoff between the security and privacy levels of the system.

Most of the published works on new-generation biometric authentication approaches have focused on the tradeoff between privacy and security. However, there is an accompanying debate on trust, which should accord with both privacy and security, in contrast to the tradeoff between these two aspects. A system becomes more trustworthy if the Pareto front between privacy and security is improved together with trust. When modeling trust, it is important to take into account user responses to the pricing between privacy and security. For people to trust the developed biometric authentication system, the offered services will be accepted only if the price asked is reasonable. Because it is impossible to optimize the tradeoff perfectly, users should be aware of the cost they will have to pay. Therefore, it can be intuitively deduced that trust, or trustworthiness, is directly related to the optimized tradeoff between security and privacy.

Researchers are now working on improving the Pareto front between privacy and security while making people trust the entire solution. This work focuses on this three-legged debate and aims to assist research by presenting a formal model of privacy, security, and trust. The proposed model is a simple and comprehensible alternative to existing privacy and security models. In addition to these two notions, this study expands the existing approaches by taking into account a third notion: trust. The proposed model describes the relationship between privacy, security, and trust, aiming to answer the following fundamental question: Is it possible to increase the acceptance and trustworthiness of a biometric authentication system by obtaining secure syndromes and strong biohashes while simultaneously minimizing the leakage of personal information such as biometric measurements?

Rationale and novelty

In this study, the formal model of privacy, security, and trust, namely the Biometric Privacy–Security–Trust Model (BioPSTM), is evaluated over a syndrome-based template protection method: a biometric security system that produces secure syndromes and biohashes by utilizing secure keys, syndrome encoding and decoding functions, biometric hash generation mechanisms, and other biometric front-end functions. The proposed model concentrates on formulating the tradeoff between privacy and security in terms of normalized equivocation rates. The equivocation between any pair of components in the biometric security system quantifies privacy leakages and security losses. The trust model, built on the tradeoff between privacy and security, is based on demand-responsive pricing [2, 3] and considers user acceptance over a general model. The parameters of the proposed combined model can be adjusted to model privacy levels, security levels, and user acceptance flexibly for various potential uses of biometric technologies. The newly proposed three-dimensional formal model, BioPSTM, represents a comprehensive approach that can be applied to any biometric security system. To give an overview, country profiles in the fields of national security, financial security, Internet security, and personal security are elaborated in terms of privacy concerns and user acceptance regarding the use of biometric technologies in the selected fields. The privacy–security–trust model of some sample countries is given as a case study in this work.

The proposed trust model presents an objective measure of the trustworthiness of any biometric authentication system. This objective measure, formulated in terms of privacy, security, and trust, may help decision makers see the big picture when a biometric authentication system is to be deployed. The proposed model presents the tradeoff between privacy and security by considering public willingness and perceptions of trustworthiness. For instance, such a formal model might be useful if a government is planning to apply biometric authentication in eVisa applications. Social aspects, privacy threats, and security needs should be considered together when deciding to deploy a biometric solution. The proposed BioPSTM presents a formulation that can be used to express people's trust in such new technologies.

The tradeoff between privacy and security has been identified and studied by many researchers (for a literature survey, see Section 2). However, this study is novel in that it builds a formal link between trust and the privacy–security tradeoff. Researchers, market leaders, and decision makers may benefit from this formal model by utilizing measurable privacy and security weights and observing public willingness factors. Such stakeholders may benefit from large-scale evaluations, pilot applications, questionnaires, and research outputs to draw a projection of the use of biometric technologies in various authentication systems. A sample case study is presented in Section 6.

The paper is organized as follows. Section 2 presents a brief overview of existing biometric template protection methods and formal models proposed for privacy, security, and trust. Section 3 scrutinizes the syndrome-based biometric security solution, and Section 4 discusses the requirements of a more privacy-preserving and secure biometric authentication system. Section 5 elaborates the proposed formal model, and Section 6 presents how it can be applied to countries' policies by evaluating their privacy and security profiles. Section 7 concludes the paper.

BACKGROUND INFORMATION AND RELATED WORKS


The common approach to improving the Pareto front between privacy and security is to propose sophisticated biometric template protection methods applicable to biometric authentication systems. In this paper, the state of the art is scrutinized on two fronts: current approaches in (i) biometric template protection and (ii) privacy, security, and trust models.

Biometric template protection

Biometric template protection and biometric cryptosystems have been extensively studied by a significant number of researchers. Many algorithms and systems have been proposed to solve the security and privacy threats in biometric authentication. The recent approaches focus on combining biometrics and cryptography to overcome the tradeoff between privacy and security by applying sophisticated methods. The prominent biometric template protection methods can be divided into four categories: (i) Feature Transformation, (ii) Biometric Hardening, (iii) Biometric Cryptosystems, and (iv) Biometric Key Generation.

In the Feature Transformation approach, secure biometric templates are extracted by using a transformation function. The transformation function utilizes randomly generated synthetic information (traditional private keys) associated with each user. Depending on the characteristics of the transformation function, the template protection mechanism either lets an invertible transformation function recover the original biometric template (salting) [4-7] or applies a one-way noninvertible hash function to conceal the biometric trait [8-10]. Note that the hash functions stated here should not be confused with traditional cryptographic hash functions. Unlike cryptographic hashing (e.g., MD5), biohashing [4, 7] conceals the biometric signal with a synthetically created private key through a noninvertible transform to extract a hash string. The resulting string of each user is then discriminated from the rest of the population by calculating the distance between the hashes obtained from the template and the query trait. By contrast, in traditional cryptography, even a single-bit change in the source signal will cause an avalanche effect, resulting in a certain matching failure. Recent biometric hashing techniques usually apply error correction to reduce the number of changing bits when a test trait is presented. However, because of the uncertainty of biometric signals, it is usually impossible to obtain the same biometric bit string at each presentation. The inconsistent nature of biometric signals obliges researchers to avoid using traditional cryptographic hash functions directly for biohashing.

The transformation approach has the advantage that the biometric signals are protected and strengthened with a private key rather than relying on the biometric trait alone. However, the security of these methods mostly relies on the security level of the key generation system. Moreover, blending biometric features with randomly generated synthetic data may bias authentication toward the artificially created data. The potential risk of using synthetic keys as the input for salting or noninvertible functions arises when an attacker somehow compromises the key. In this situation, the biometric authentication mechanism must still perform no worse than the case where only typical biometric discrimination is applied.

Biometric Hardening can only be used for behavioral biometrics that stream over time, such as voice, gestures, and keystroke dynamics. In biometric hardening, keys improve at each successful login and become stronger as the authentication mechanism learns the user's distinctive behaviors [11, 12]. Such methods require sophisticated learning mechanisms and robust features to guarantee an accurate system. A major drawback of biometric hardening is that the first login attempts are risky because the biometric passwords are weak in the initial stages.

Biometric Cryptosystems concentrate on storing public helper data that is not supposed to reveal any significant information about the original biometric and that is needed during matching to extract a cryptographic key. Biometric cryptosystems generally focus on either key binding or key generation. The related studies in key binding are mostly based on the Fuzzy Commitment method [13]. Researchers have extended this method into the fuzzy vault scheme [14] and applied it to various biometrics [15-17]. However, the fuzzy methods need further improvement because they fail to address robustness to typical variations in the biometric signals. Additionally, biometric cryptosystems are still sensitive to error correction performance and weak against privacy and security threats [18, 19].

The key generation methods concentrate on generating a key directly from biometrics. The milestone study of Dodis et al. presents a generalized mathematical model to extract keys from noisy data that is not uniformly distributed [20]. Biometric key generation methods may benefit from four major building blocks: quantization, error correction, extraction, and randomization. Quantization-based approaches transform continuously distributed biometric data into discretely distributed data. To improve quantization, user-specific information is taken into account to represent individuals better in the feature space [21-24]. The main limitation of quantization-based methods is that it is difficult to generate keys with high stability and entropy. To solve this problem, error correction codes and extractors have been proposed. Error correction codes are widely used in secure (fuzzy) sketches to add redundant information to the input variable for the purpose of reproducing the original value correctly [17, 23, 25-27]. Fuzzy extractors [20, 28, 29] combine error correction codes and randomness extractor functions efficiently to transform the biometric probability density function into a uniform distribution. Recent approaches apply an additional randomization block to create multiple random sequences [8, 25, 30, 31]. Private keys are generally used for encrypting the sketches or syndromes to preserve privacy and introduce cancelable biometrics [10, 32].
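The code-offset idea behind such secure sketches can be illustrated with a toy repetition code. The following is a minimal sketch under that assumption (a real system would use a stronger code, e.g., a graphical code), not the construction of any particular cited paper:

```python
import secrets

def rep_encode(bits, r=3):
    # Repetition code: each message bit becomes r identical code bits.
    return [b for bit in bits for b in [bit] * r]

def rep_decode(code, r=3):
    # Majority vote per r-bit block corrects up to (r - 1) // 2 flips per block.
    return [int(sum(code[i:i + r]) > r // 2) for i in range(0, len(code), r)]

def sketch(x, r=3):
    # Code-offset secure sketch: helper data = biometric XOR random codeword.
    msg = [secrets.randbelow(2) for _ in range(len(x) // r)]
    c = rep_encode(msg, r)
    return [xi ^ ci for xi, ci in zip(x, c)]

def recover(x_probe, helper, r=3):
    # Shift the noisy probe by the helper, decode, re-encode, shift back.
    noisy_c = [xi ^ hi for xi, hi in zip(x_probe, helper)]
    c_hat = rep_encode(rep_decode(noisy_c, r), r)
    return [ci ^ hi for ci, hi in zip(c_hat, helper)]

x = [1, 0, 1, 1, 0, 1, 0, 0, 1]      # enrolled biometric bit string
s = sketch(x)                        # public helper data
x_probe = list(x); x_probe[4] ^= 1   # probe with one bit flipped
assert recover(x_probe, s) == x      # exact recovery despite noise
```

The helper data reveals only the offset from a random codeword, while the error correction absorbs small probe noise; this is the tension between leakage and recoverability discussed above.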

To improve the performance of key generation, statistical analysis of biometric traits has become popular. Methods relying on statistical analysis consider the use of data clusters, feature distributions, or models created from the intrinsic parameters of classes obtained in a supervised or unsupervised way. It is claimed that the performance of statistical methods can be enhanced if reliable biometric identities are extracted. Promising methods in the literature utilize classification [33], statistical analysis of biometric features [34-36], fuzzy clustering [37], and other machine learning methods [38-41].

Formal studies in privacy, security, and trust models

Biometric security solutions are emerging as a combined solution to guarantee security and preserve privacy [16, 42, 43]. Privacy and security concerns have been studied by researchers aiming to deepen research in the fields of mathematical models [44-46], new constructions of biometric key generation [47], privacy and security threats [48-50], and the tradeoff between privacy and security [45, 46, 51].

Biometric template protection has been put into precise mathematical terms by defining the amount of information with an entropy measure. Several papers deal with Shannon entropy [21, 22], average min-entropy [20], and the equivocation rate [8, 26, 30]. A survey of the security and privacy measures for biometric authentication systems can be found in [52].
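In these measures, the amount of information the stored data S reveals about the biometric X is typically expressed as follows (standard definitions, stated here for reference rather than reproduced from any of the cited papers):

```latex
% Equivocation (conditional entropy) of the biometric X given the stored data S:
H(X \mid S) = -\sum_{x,s} P_{XS}(x,s)\, \log_2 P_{X \mid S}(x \mid s)

% Privacy leakage as the mutual information between X and S:
I(X; S) = H(X) - H(X \mid S)

% Average min-entropy of X given S (Dodis et al.):
\tilde{H}_\infty(X \mid S) = -\log_2 \mathbb{E}_{s \leftarrow S}\!\left[\max_x P_{X \mid S}(x \mid s)\right]
```

High equivocation (low leakage) corresponds to strong privacy, while high residual entropy of the extracted key corresponds to strong security.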

Tang et al. [44] proposed a mathematical model to protect users' privacy against malicious use while enabling them to access the service provider whenever they want. They formalized identity privacy and transaction anonymity and realized a system that employs private information retrieval and an ElGamal public-key infrastructure. For the trust relationships, however, they present only assumptions, ignoring possible threats such as fake fingerprints, security link errors, and high authentication failure rates.

Golic et al. [47] approached the biometric key generation problem by applying a randomized wrap-around code-offset construction for the Euclidean metric instead of using traditional quantization methods. They then strengthened the reliability of the system by applying a code-redundancy construction for the Hamming metric. Golic et al. considered entropy measures, statistical dependence, and randomness extraction in secure sketches and analyzed the entropy in the Hamming and Euclidean metric constructions.

Simoens et al. [50] analyzed attacks on biometric sketches and identified the privacy weaknesses in the proposed schemes. They defined notions of security against distinguishability and reversibility attacks on biometric sketches and clarified the privacy and security threats in fuzzy sketches by determining bounds on the information leakage.

Tuyls et al. [49] proposed a performance measure for the security and robustness of a syndrome-based biometric authentication system. They used the equivocation rate to describe possible leakages in system components and the average entropy for the extracted randomness. They computed the tradeoff between security and robustness over a typical syndrome-based authentication system.

The privacy–security tradeoff in biometric security systems is analyzed in [45, 46, 51]. These studies mostly focus on attacks while generating a secret key from common randomness, and hence the largest rate of the biometric key or biohash can be characterized [53]. Lai et al. [51] showed a fundamental tradeoff between security and privacy in any biometric security system and formulated the requirements for perfect security of the generated biometric key and perfect privacy of the biometric measurements. They formulated privacy and security in terms of the equivocation rate and the average entropy. Ignatenko et al. presented a rate–leakage pair in secret generation systems and formulated a zero-leakage biometric system [45, 46]. They concentrated on the privacy leakage defined as the mutual information between the helper data and the biometric information. Vetro et al. [54], and formerly Sutcu et al. [26, 31], provided a formal quantification of the tradeoff between security and robustness as a function of the Slepian–Wolf coding rate. They demonstrated the tradeoff over a syndrome-based coding scheme and computed robustness and security in bits for fingerprint- and iris-based biometric authentication systems.

There exist many trust models in the literature aiming to discover the relationship between trust, demand, and utility among marketplace consumers. The basic motivation behind the proposed models is to clarify how trust mediates agent interactions and how agent demands are likely to increase with increasing trust [55]. Popular models include FIRE [56], Regret [58], Yu and Singh [57], and probabilistic trust models [59]. An overall review of existing methods can be found in [60]. Although the proposed trust models present security, trust, and reputation formally, a deepened approach for template-protecting biometric authentication systems is lacking in the literature. This paper aims to fill this gap by proposing a formal model from an information-theoretic perspective.

SYNDROME-BASED BIOMETRIC AUTHENTICATION


Although there exist many valuable studies in the literature aiming to protect biometric templates while presenting low failure rates, syndrome-based biometric authentication [26, 30] and biohashing [4, 7] are selected as the pilot methods in this paper. These methods were selected because, first, they are widely used and, second, they expose all possible risks of privacy leakage and security loss. Because the proposed model takes into account the equivocation between any bilateral pair of biometric bit strings, syndromes, biohash strings, and private keys, the authors interpret it over the syndrome-based biometric authentication method.

Enrollment and verification

A typical secure biometric authentication system that uses syndrome coding and biohashing is composed of two phases: enrollment and verification. During enrollment (Figure 1), the user provides his or her biometric, from which binary feature vectors are extracted. The extracted biometric bit string is denoted as x, with length n. The feature vector x is a realization of a binary random vector X of fixed preset length n drawn according to some probability distribution PX(x). To secure the biometric trait, a syndrome encoding function, Fs_ec(), is used to map x into a syndrome s. Additionally, the enrolled biometric bit string x is hashed with a biometric hash function, Fhash(), to generate a biohash string, h. Note that the user-specific syndromes and the syndrome-coding scheme can be public and need not be kept secret. Syndromes are used only as helper data to reconstruct the enrolled biometric from the probe biometric. Designers usually prefer graphical coding approaches for syndrome encoding because they closely approach the Shannon bound and provide more powerful encoding [30]. The access control system stores s, h, and the graph of the used code. Because the majority of recent biometric authentication systems utilize smart cards, the enrolled information might be stored in the smart card chip.


Figure 1. Enrollment.


In verification (Figure 2), a user presents his or her probe biometric, and a bit string x′ is extracted. Note that x′ is an error-prone version of x, and it is assumed that x and x′ are very similar to each other. A syndrome decoding function is then used to extract an estimate x̂ of x by using the syndrome s previously stored in the smart card. Consequently, x̂ is used to generate an estimated syndrome ŝ, which is then compared with the original s. Meanwhile, x̂ is concealed with a private key k by using the biometric hash function to generate an estimate ĥ of h. The system verifies the person if ŝ and s are equal and h and ĥ are close enough to each other.


Figure 2. Verification.

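The accept/reject decision in the verification phase can be sketched as follows. The `syndrome_encode`, `syndrome_decode`, and `biohash` parameters are hypothetical stand-ins (toy pairwise-parity and XOR functions below), not the components of a deployed system:

```python
def hamming(a, b):
    # Number of differing positions between two equal-length bit strings.
    return sum(ai != bi for ai, bi in zip(a, b))

def verify(x_probe, s, h, key, syndrome_encode, syndrome_decode, biohash, t=2):
    # Reconstruct an estimate of the enrolled string from probe + helper data.
    x_hat = syndrome_decode(x_probe, s)
    s_hat = syndrome_encode(x_hat)
    h_hat = biohash(x_hat, key)
    # Accept iff the re-encoded syndrome matches exactly and the estimated
    # biohash is within Hamming distance t of the enrolled one.
    return s_hat == s and hamming(h_hat, h) <= t

# Toy stand-in components, for demonstration only:
enc = lambda x: [x[i] ^ x[i + 1] for i in range(len(x) - 1)]  # pairwise parities
dec = lambda xp, s: xp                                        # assumes a noiseless probe
bh = lambda x, k: [xi ^ ki for xi, ki in zip(x, k)]           # XOR with key bits

x = [1, 0, 1, 1, 0, 1]      # enrolled biometric bit string
key = [0, 1, 1, 0, 1, 0]    # user's private key bits
s, h = enc(x), bh(x, key)   # stored helper data and biohash
assert verify(x, s, h, key, enc, dec, bh)                       # genuine probe accepted
assert not verify([0, 0, 1, 1, 0, 1], s, h, key, enc, dec, bh)  # impostor rejected
```

Both checks must pass: the syndrome comparison enforces exact template recovery, while the biohash comparison tolerates a bounded number of bit differences.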

Syndrome coding

A secure syndrome-coding scheme consists of four building blocks: quantizer, codebook, encoder, and decoder. To generate a reliable syndrome and reconstruct the corresponding enrolled template accurately, each component of the biometric features should be discretized efficiently. Although there exist new approaches that create syndromes in the continuous domain [20], the majority of the proposed methods are based on quantizing in discrete domains. In this paper, the features are assumed to be mapped onto a discrete domain M. For each quantized domain, a codebook C ⊆ M is considered whose codewords are spaced more than 2δ apart. Here, δ denotes the maximum tolerable distance by which a value in the quantized domain can be shifted under noise.

In addition to the quantizer and the codebook, the syndrome-coding scheme utilizes an encoder and a decoder. The encoder is the function that outputs the distance d of the enrolled quantized value from its nearest codeword in the codebook C. The decoder, in turn, shifts a probe value by this distance d, maps it to the nearest codeword in C, and shifts it back by the same d. An ideal syndrome-coding scheme for two persons I and J is illustrated in Figure 3.


Figure 3. Syndrome-coding illustration.

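The encoder/decoder behavior just described can be sketched in one dimension. The codebook and spacing below are illustrative assumptions (codewords 10 apart tolerate shifts smaller than δ = 5):

```python
def nearest(value, codebook):
    # Nearest codeword to a quantized value (1-D Euclidean distance).
    return min(codebook, key=lambda cw: abs(cw - value))

def encode(value, codebook):
    # Encoder: offset of the enrolled quantized value from its nearest codeword.
    return value - nearest(value, codebook)

def decode(probe, d, codebook):
    # Decoder: shift the probe by the stored offset d, snap to the nearest
    # codeword, then shift back by the same d.
    return nearest(probe - d, codebook) + d

codebook = [0, 10, 20, 30]           # illustrative codeword lattice
x = 23                               # enrolled quantized feature value
d = encode(x, codebook)              # helper offset: 23 - 20 = 3
assert decode(26, d, codebook) == 23 # probe shifted by +3 still recovers x
assert decode(19, d, codebook) == 23 # probe shifted by -4 still recovers x
```

The stored offset d reveals only the position of x within its quantization cell, not which codeword (and hence which region of the feature space) it belongs to.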

Biohashing

Biohashing might be applied in several ways depending on whether the enrolled biometric feature is supplied in binary or numerical form. Traditionally, the well-known biohashing method proposed by Teoh et al. [4, 7] uses a number of randomly generated arrays, Ri, where i = 1 … n, which should be linearly independent of each other. The biohash h is computed simply as

  • h_i = 0 if ⟨x, R_i⟩ ≤ t, and h_i = 1 otherwise, for i = 1, …, n

Here, t denotes a threshold value. Note that each Ri can be generated by using the user's private key k, which can be set as a seed for the random number generator. If the biometric input is already in binary form, simple XOR operations might be applied to conceal the input. The resulting biohash, h, is then compared with the other biohashes in the population by using the Hamming distance. It is worth mentioning that more sophisticated biohashing mechanisms exist in the literature. However, for the sake of simplicity and proof of concept, the biohashing method depicted in Figure 4 is selected in this study.


Figure 4. Biohashing illustration.

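A minimal sketch of this style of biohashing, with the private key seeding the projection arrays Ri. The orthonormalization step of the original method is omitted for brevity, and all parameter values are illustrative assumptions:

```python
import numpy as np

def biohash(features, key, n_bits=16, t=0.0):
    # Seed the generator with the user's private key so the projection
    # arrays R_i are reproducible for that user (and only that user).
    rng = np.random.default_rng(key)
    # n_bits random projection vectors; Teoh et al. additionally
    # orthonormalize them (e.g., via Gram-Schmidt), omitted here.
    R = rng.standard_normal((n_bits, len(features)))
    # Threshold each inner product <x, R_i> at t to obtain one hash bit.
    return (R @ features > t).astype(int)

key = 42                                 # user's private key, used as the seed
x = np.array([0.9, -0.3, 1.2, 0.5])      # enrolled feature vector
x_noisy = x + np.random.default_rng(7).normal(0, 0.05, x.shape)

h = biohash(x, key)
h_probe = biohash(x_noisy, key)
# Small feature noise flips few hash bits; a different key changes most bits.
print(np.sum(h != h_probe), np.sum(h != biohash(x, key=99)))
```

Because the sign of each inner product is stable under small perturbations of x, genuine probes yield small Hamming distances, while a different key yields an essentially independent bit string.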

SECURITY AND PRIVACY REQUIREMENTS IN SYNDROME-BASED BIOMETRIC AUTHENTICATION SYSTEMS


Although there might be many risks and threats arising from system components such as the network security policy, the storage of personal information, and communication protocols, only the biometric-oriented threats are discussed in this paper. All factors other than the biometric threats that may cause leakage, equivocation, and loss of private information are assumed to be out of scope and are ignored. A secure and privacy-preserving biometric authentication system should meet the following conditions:

  1. Biometric discrimination: the biometric authentication algorithms should present efficient discrimination of genuine and imposter distributions. The selected methods should represent users by reliable features or classes that could be easily distinguished from each other.
  2. Randomly generated private keys: the synthetically created private keys should not be predicted. It is desired that there should not be an information leakage or statistical dependence between the private key and the enrolled biometric bit string. In any case, the designer should take into account the attack scenario where all the private keys in the population are somehow released.
  3. Reliable biohashing: the biohashing function should be based on a noninvertible transformation. The biohashing mechanism should combine the biometric trait with synthetically created private keys so efficiently that no one can extract any information about the key or the biometric data. Moreover, the proposed method should not favor either the biometric information or the randomly created synthetic information, so that neither becomes dominant in discriminating individuals in the population. Biohashing should be applied only if the selected algorithm preserves statistical independence between the biohash output and the input parameters (the key and the biometric string).
  4. Secure syndrome coding: a robust syndrome should allow exact recovery of the original biometric data that might be realized by applying powerful error correcting codes. A reliable syndrome should not reveal too much information about the original biometric template. Otherwise, the syndromes may leak biometric data, which is a critical privacy threat.
  5. Minimized equivocation and leakage: the equivocation or leakage between any pair of components of the template-protecting biometric authentication system should be prevented. Equivocation between the enrolled biometric measurement and the generated biometric hash string or syndrome may cause privacy leakage. Similarly, leakage between the private key and the biohash string or syndrome may trigger security threats. Additionally, equivocation between the syndrome and the generated biometric hash string should be avoided to increase both the privacy and the security level of the system.

All these conditions affect trustworthiness and user acceptance of a biometric authentication system. Trustworthiness among users increases if they are convinced that both privacy and security are mutually guaranteed. The proposed formal model should therefore consider the tradeoff between privacy and security and the requirements that improve the Pareto frontier between these two concepts.

PROPOSED FORMAL MODEL


The formal model is based on two steps: (i) formulating the tradeoff between privacy and security and (ii) formulating the trust considering the privacy–security tradeoff.

Privacy security tradeoff

The security of a biometric authentication system highly depends on recognition performance and on the key management system. From the biometric perspective, the system is secure if interclass variation is high and intraclass variation is low; a secure system must therefore present high biometric discrimination in the population, yielding fewer authentication failures. Well-known noise problems due to environmental conditions, sensor differences, pose variations, and uncertainty in data capture degrade the security of the system and thus increase the false acceptance and false rejection rates. Hence, raw biometric signals cannot be used directly for encryption or authentication.

On the other hand, a privacy-preserving biometric authentication system must ensure that no one can recover the original biometric traits from the stored templates. It is essential that the information leakage about the biometric measurements be as small as possible. Recent techniques that benefit from syndrome coding and biometric hashing should take into account the possible leakages between syndromes, biohashes, and biometric measurements and should ensure that randomly generated private keys do not contain predictable patterns of the biometric data.

In line with the challenges stated earlier, a trustworthy biometric security system should introduce a reliable authentication scheme with low error rates to provide transaction security while implementing advanced biohashing and syndrome-coding methods to protect biometric traits. Solutions that improve privacy and security must address the entropy problems of biometric signals, which tend to vary over time and are sensitive to environmental conditions. The security–privacy tradeoff can be managed by satisfying the security and privacy conditions given below.

Security of transactions over a biometric authentication system is closely related to the biometric discrimination, which can be described as the rate of the biometric bit string. The average entropy of the biometric measurement should be greater than a threshold value such that the interclass variation among the classes in a population is maximized while the intraclass variation is minimized. Security of a system should also be guaranteed by preventing leakage between the used private key and the generated biohash, as well as between the key and the encoded syndrome.

Privacy conditions address the maximization of the rate of the private key while minimizing the information leakage about the biometric measurement. The degree of privacy can be increased by preventing leakage between the syndrome and the biometric bit string. Moreover, privacy should be preserved by minimizing the leakage between the generated biometric hash and the biometric measurements.

Privacy and security of any biometric security system also require minimizing the leakage between the syndrome and the generated biometric hash. This condition should be taken into account while formulating both privacy and security.

Privacy and security in terms of normalized equivocation rate

The template protection requirement can be put into a formal model by defining the amount of information through an appropriate entropy measure and the randomness of the private keys. Some approaches in the literature have used the Shannon entropy to find optimized shielding functions or template protection mechanisms that enhance privacy [21, 22]. However, the majority of published works have adopted the so-called average (min) entropy, defined as the negative logarithm of the average maximal probability of a correct decision [20]. As noted in [47], the min entropy reflects only a one-step decision about the random variable, whereas in biometric hashing and syndrome generation an adversary can make several guesses. An entropy measure tied to the number of guesses required to correctly identify a random variable therefore appears more appropriate, and the average conditional Shannon entropy, described as the equivocation rate, is closer to this measure than the average (min) entropy. Moreover, because the components of a biometric authentication system, such as private keys, biohashes, and syndromes, are not usually uniformly distributed, the leakage between any pair of system components should be minimized.

The proposed formal model, BioPSTM, is built on the mathematical model of privacy–security tradeoff. The relations between privacy and security are expressed by the normalized equivocation rate as proposed in [51]. The formal definition of the normalized equivocation rate between any two random processes is given as follows:

Definition 1. For two random processes X = {X1, X2, …} and Y = {Y1, Y2, …}, the normalized equivocation rate between X and Y is ΨXY = H(X|Y)/H(X), where H(.) denotes the entropy function. If ΨXY is arbitrarily close to 1, this means that Y does not leak any information about X.
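As a minimal illustration of Definition 1, the normalized equivocation rate can be computed from a joint distribution using the identity H(X|Y) = H(X, Y) − H(Y). The sketch below is plain Python with hypothetical toy distributions, not part of the original formulation:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * log2(p) for p in probs if p > 0)

def normalized_equivocation(joint):
    """Psi_XY = H(X|Y) / H(X) for a joint pmf given as {(x, y): p}.
    A value close to 1 means Y leaks almost nothing about X."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    h_xy = entropy(joint.values())      # H(X, Y)
    h_y = entropy(py.values())          # H(Y)
    h_x = entropy(px.values())          # H(X)
    return (h_xy - h_y) / h_x           # H(X|Y) / H(X)

# Independent X and Y: Y leaks nothing about X, so Psi = 1
indep = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
# Fully dependent (Y = X): Y reveals X completely, so Psi = 0
dep = {(0, 0): 0.5, (1, 1): 0.5}
```

The two toy distributions show the extremes of the measure: perfect protection (Ψ = 1) and total leakage (Ψ = 0).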

The following notation is used in the proposed privacy and security formulation. X denotes the distribution of the binarized biometric bit strings with a probability density function PX, S denotes the distribution of the syndromes (sketch or helper data) with a probability density function PS, h denotes the distribution of the generated biometric hash string with a probability density function Ph, and κ denotes the distribution of the synthetically created private key with a probability density function Pκ.

Privacy conditions

In a biometric authentication system, the following conditions should be satisfied to preserve privacy:

  1. The normalized equivocation rate between X and S should be close to 1 such that 1 − ΨXS ≤ εxs, where εxs ≤ 1 denotes an arbitrarily small positive constant.
  2. The normalized equivocation rate between X and h should be close to 1 such that 1 − ΨXh ≤ εxh, where εxh ≤ 1 denotes an arbitrarily small positive constant.
  3. The normalized equivocation rate between S and h should be close to 1 such that 1 − ΨSh ≤ εsh, where εsh ≤ 1 denotes an arbitrarily small positive constant.
  4. The average entropy of the private key should be as high as possible such that (1/n)H(κ) ≥ Rk. This condition addresses the rate of the private key, defined as the average entropy of the randomly generated key; Rk ≤ 1 denotes an arbitrary positive constant, and n is the length of the key.
Security conditions

The conditions that should be met for the security of any biometric authentication system are as follows:

  1. The normalized equivocation rate between κ and S should be close to 1 such that 1 − ΨκS ≤ εκs, where εκs ≤ 1 denotes an arbitrarily small positive constant.
  2. The normalized equivocation rate between κ and h should be close to 1 such that 1 − Ψκh ≤ εκh, where εκh ≤ 1 denotes an arbitrarily small positive constant.
  3. The normalized equivocation rate between S and h should be close to 1 such that 1 − ΨSh ≤ εsh, where εsh ≤ 1 denotes an arbitrarily small positive constant.
  4. The average entropy of the biometric measurement should be as high as possible such that (1/m)H(X) ≥ RX. This condition addresses the biometric discrimination capability of the system: the greater this value, the higher the biometric discrimination and the lower the authentication failure rate. RX ≤ 1 denotes an arbitrary positive constant, and m is the length of the biometric bit string.

Privacy, P, and security, S, should meet the conditions stated earlier jointly. To increase the privacy or security of a biometric security system, every condition, expressed either as a normalized equivocation rate or as an average entropy, should be satisfied together. Here, the conditions are linked with an AND operator, yielding the product of the terms representing the privacy or security requirements. In this respect, P and S can be formulated as follows:

  P = ΨXS · ΨXh · ΨSh · (1/n)H(κ)    (1)
  S = ΨκS · Ψκh · ΨSh · (1/m)H(X)    (2)

Note that ΨSh is the common term both for P and S. By applying some simple algebra, one can write

  ρ = P · S / ΨSh = ΨXS · ΨXh · ΨκS · Ψκh · ΨSh · (1/n)H(κ) · (1/m)H(X)    (3)

Here, ρ denotes the overall tradeoff value. In the ideal case, ρ = 1, which means that no component pair leaks information about another and no security threats exist. Such a system is theoretically accepted as completely secure and privacy preserving.
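Under the product formulation, the tradeoff value ρ follows directly from the individual terms. The sketch below is one illustrative reading of Equations (1)–(3), assuming the shared term ΨSh is counted once in ρ; the variable names are hypothetical:

```python
def privacy(psi_xs, psi_xh, psi_sh, key_rate):
    """P: product of the privacy-related equivocation terms
    and the key rate (1/n)H(kappa)."""
    return psi_xs * psi_xh * psi_sh * key_rate

def security(psi_ks, psi_kh, psi_sh, bio_rate):
    """S: product of the security-related equivocation terms
    and the biometric rate (1/m)H(X)."""
    return psi_ks * psi_kh * psi_sh * bio_rate

def tradeoff(P, S, psi_sh):
    """rho: overall tradeoff value; the term Psi_Sh shared by
    P and S is counted only once (an assumed reading)."""
    return P * S / psi_sh
```

In the ideal case, where every equivocation rate and both normalized entropies equal 1, the functions return P = S = ρ = 1.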

BioPSTM

Biometric authentication systems try to satisfy users by securing transactions and preserving the privacy of individuals. In other words, individuals should trust the entire solution, which is achievable only by improving the Pareto tradeoff between privacy and security.

Equation (3) formally illustrates the tradeoff between privacy and security. A trust function, which may be applied to any biometric authentication system, should preserve this mathematical relationship between privacy and security. The rationale behind the proposed BioPSTM is to utilize the formal representation in Equation (3) to link trust with the privacy–security tradeoff. The resulting privacy–security–trust model can then present a complete picture in which design criteria and measurable success indicators of any biometric authentication system are stated objectively.

BioPSTM formulates the users' trust in the system and their response to the adjusted privacy and security parameters through an acceptance probability T(S, P). This probability reflects a user's willingness to participate in the authentication procedure at the asked privacy and security price, which is expressed as ρ in Equation (3).

The proposed BioPSTM is inspired by a user acceptance model known as demand responsive pricing [2], which is widely used in the telecommunications sector. This model is one of the most comprehensive in the literature and fits the privacy–security tradeoff well. Its rationale is to describe operators' marketing strategies for attracting customers through demand responsive pricing: operators have differing service spectral efficiencies and offer a rate (in bits per second) at a total price (in currency). The customers' response to each offer is modeled through an acceptance probability reflecting their willingness to buy the offered service at the asked price. A renowned study in demand responsive pricing is presented in [3], where the authors propose a framework that can model competition among future operators likely to operate in a mixed commons/property-rights regime under the regulation of a spectrum policy server. They propose a user acceptance model A(u, p), where u is the utility of the user and p is the associated price. A similar acceptance model is given by the Cobb–Douglas curves used in microeconomics to model user demands and offered prices in the market [61].

BioPSTM takes into account users' acceptance of the security level of transactions and of the mechanisms that preserve privacy. For instance, the security level asked in critical e-passport operations requires a strict authentication mechanism in which people are obliged to provide multiple biometrics, although they may not be keen to present their private biological information. In contrast, the security level required to enter a grocery store is not critical, and few people would be willing to use a biometric authentication system just to buy a few bottles of juice. The users' perspective is therefore addressed by introducing an acceptance probability T(S, P), where P is the privacy function (Equation (1)) and S is the security function (Equation (2)).

Trustworthiness of a biometric identity management system should have the following qualitative properties:

  lim(S→0) T(S, P) = lim(P→0) T(S, P) = 0,    lim(S→1, P→1) T(S, P) = 1

The qualitative properties state that as S and P go to 0, the trustworthiness decreases to 0. Similarly, as S and P go to 1, the trustworthiness increases to its theoretical upper bound of 1. Trust, or trustworthiness, is directly proportional to security and privacy.

Although there are several candidates for the acceptance probability in the literature [2, 3], the function shown in Equation (4) has been proposed for BioPSTM:

  T(S, P) = 1 − e^(−ω · S^ζ · P^η)    (4)

where ζ is the security weight, η is the privacy weight, and ω represents the level of public willingness to provide personal information (personal identification number (PIN), password, or biometric) in a biometric authentication system. Note that the trust function can be tuned by adjusting these parameters. For instance, the designer may demand equally weighted privacy and security by selecting ζ = η. If security is more important than user privacy (e.g., military applications, passport, or visa checks), ζ should be increased. Conversely, if privacy is more critical (e.g., web-based authentication in eID applications), η should be increased.
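To make the roles of ζ, η, and ω concrete, the sketch below evaluates an acceptance probability of exponential form. This specific functional form is an assumption: it is one plausible choice that satisfies the qualitative limit properties stated above (T vanishes as S or P vanishes and saturates toward 1 as ω grows), and the exact published Equation (4) may differ:

```python
from math import exp

def trust(S, P, zeta=1.0, eta=1.0, omega=5.0):
    """Acceptance probability T(S, P). The exponential form used here
    is an assumed sketch, chosen only because it satisfies the
    qualitative properties in the text: T -> 0 as S or P -> 0, and
    T -> 1 as omega (public willingness) grows."""
    return 1.0 - exp(-omega * (S ** zeta) * (P ** eta))
```

With the balanced setting ζ = η = 1 and ω = 5, this function is near its upper bound at P = S = 1 and falls toward 0 as either argument vanishes; raising ζ (or η) makes trust more sensitive to security (or privacy).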

In Figure 5(a), T(S, P) is illustrated for ζ = η = 1 and ω = 5, which strikes a balance between privacy and security. As expected, trustworthiness increases with privacy and security; people fully trust the system when P = S = 1 in this plot. The contour plots under the trust surface depict the tradeoff between privacy and security. Similarly, Figure 5(b) depicts the case in which security concerns become more critical: the selected parameters, ζ = 5, η = 1, and ω = 5, give more significance to security than to privacy. Conversely, by selecting ζ = 1, η = 5, ω = 5, the designer pays more attention to the privacy of the system, as depicted in Figure 5(c).

Figure 5. The effects of ζ and η. (a) ζ = η = 1, ω = 5; (b) ζ = 5, η = 1, ω = 5; and (c) ζ = 1, η = 5, ω = 5.

The constant ω affects the level of willingness in accordance with the selected weights ζ and η. Figure 6(a) illustrates the case (ζ = η = ω = 5) in which only a small portion of the population trusts the system, whereas the rest remains skeptical. In contrast, Figure 6(b) demonstrates a more trusting society: in this second case, where ζ = η = 5 and ω = 100, only a minority of the population is skeptical about the procedures, while the rest show little concern about privacy and security.

Figure 6. The effects of ω in the proposed trust model. (a) ζ = η = ω = 5 and (b) ζ = η = 5, ω = 100.

The upper boundaries for the positive parameters ζ, η, and ω can be determined empirically. Because lim(ω→∞) T(S, P) = 1, greater values of ω do not correspond to concerns observed in real-life applications; the proposed upper limit for ω is 10^3, which yields approximately 80% of the population trusting the system when the privacy and security weights are kept low (ζ = η = 1). Similarly, lim(ζ→∞) T(S, P) = lim(η→∞) T(S, P) = 1 represents a fully trusted system. Although the weight parameters can be chosen by the designer, values of ζ and η greater than 10 do not represent real-life conditions; selecting a greater value would imply an unrealistic weighting of privacy or security among people.

It is noteworthy that BioPSTM is presented here to give intuition about the relations between privacy, security, and trust. The three-dimensional formal model can be improved by defining more effective functions and parameters, and the model parameters can be adjusted flexibly provided that the limitations and requirements of the evaluated system are represented properly. Some projections for determining privacy and security weights and willingness parameters based on country profiles are given in Section 6.

CASE STUDY—COUNTRY PROFILES


Biometrics are becoming widespread in national, financial, Internet, and personal domains to increase security of citizens and electronic transactions. Biometric authentication systems can be used in various case scenarios in either public or private domain. The risk of identity theft, for instance, has become one of the major threats in personal security. To shed light on how the proposed model can be used in applications, the appropriate privacy and security criteria that are related to all functions of any template-protecting biometric authentication procedure should be determined. In this case study, privacy criteria are determined by considering the possible uses of biometric technologies. The first column of Table 1 indicates the relevant privacy criteria that might be concerned in biometric technologies. Similarly, the security criteria are determined by considering the security domains such as national, financial, Internet, and personal uses. Note that the security criteria are linked with the selected privacy concerns by considering the potential use case scenarios of biometric technologies.

Table 1. Security and privacy criteria and the calculation of privacy ratings.

Privacy criteria                Symbol   National security   Financial security   Internet security   Personal security
Border passes                   a
Communication data retention    b
Data sharing                    c
Financial transaction           d
Government access to data       e
Identity cards and biometrics   f
Statutory protection            g
Visual surveillance             h
Privacy rating (weight, η)               average of the ranks of the privacy criteria linked to each security column

The case study illustrated in this paper aims to show how a designer can adjust the willingness, privacy weight, and security weight parameters to clarify the trust needs of a biometric solution. A sample study has been implemented to evaluate the proposed trust model on countries' privacy and security profiles. The security weight ζ used in this case study is derived from the Unisys Security Index [62], a global measure of public perception of major security issues in 13 countries (Australia-AU, Belgium-BE, Brazil-BR, France-FR, Germany-DE, Hong Kong-HK, Italy-IT, Malaysia-MY, Netherlands-NL, New Zealand-NZ, Spain-ES, UK, and US). The index provides a statistically robust monitor of concerns in four areas of interest on a scale running from 0 to 300 (0 represents no concern). The security index represents concerns about (i) national security, related to terrorism and health epidemics; (ii) financial security, related to financial fraud and the ability to meet personal financial obligations; (iii) Internet security, related to all online transactions; and (iv) personal security, concerning physical safety and identity theft.

The privacy parameter η, on the other hand, is derived from the 2007 privacy rankings published at www.privacyinternational.org. The selected criteria for the privacy rankings are (i) border passes, (ii) communication data retention, (iii) data sharing, (iv) financial transactions, (v) government access to data, (vi) identity cards and biometrics, (vii) statutory protection, and (viii) visual surveillance. Note that the selected criteria are closely related to biometric application areas. The matches between the selected privacy criteria and the security concerns can be seen in Table 1. The privacy ranking scale, namely the privacy index, runs from 0 to 5, where 5 represents a country that guarantees perfect privacy preservation.

The willingness factor, ω, is computed from the answers to a supplemental question on biometrics asked of 12,139 subjects in the same 13 countries, as stated in [63]. In this study, the acceptance of traditional PINs, fingerprints, and facial scans is analyzed.

The 5-scale security parameter is calculated as μ/60, where μ denotes the Unisys security index. Privacy ratings for four major areas of security index are calculated by averaging the privacy ranks of related criteria. Table 1 shows how the average privacy ratings are calculated for each security criterion.
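The parameter derivation above can be sketched as follows. Note two assumptions: rounding the ratio μ/60 up to the next integer is an inferred rule that reproduces the weights printed in Table 3, and the list of ranks passed to `privacy_weight` would come from the criterion-to-domain mapping of Table 1:

```python
from math import ceil

def security_weight(mu):
    """Security weight zeta on the 5-point scale from a Unisys
    security index mu (0-300). Rounding mu/60 up to the next
    integer is an assumption consistent with the printed table
    values (e.g., index 117 -> 2, index 127 -> 3)."""
    return ceil(mu / 60)

def privacy_weight(ranks):
    """Privacy weight eta: the average of the privacy ranks of the
    criteria linked to a security domain, to one decimal place.
    Countries with no published rank are simply omitted."""
    return round(sum(ranks) / len(ranks), 1)
```

For example, `security_weight(146)` reproduces Germany's financial-security weight of 3, and averaging three ranks of 1, 2, and 2 gives a privacy rating of 1.7.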

The parameters ζ, η, and ω can be selected by considering the average privacy ratings (privacy weight), security weight, and willingness of the people in the selected countries. Table 2 shows how the privacy weight factors are calculated for each security criterion. The upper part of Table 2 lists the privacy rankings of each country; the lower part gives the average privacy ratings, which yield the privacy weight for each security criterion (national, financial, Internet, and personal). For instance, the people of Germany and Italy are more concerned (with a score of 3) than the other nations when national security is considered, whereas the people of Spain, UK, and US are not strongly concerned about personal security.

Table 2. Privacy ratings used as privacy weight factor η.

Privacy criterion ranks (country order: AU, BE, BR, FR, DE, IT, MY, NL, NZ, ES, UK, US; rows a, c, and h contain fewer entries where no rank was published):

  a: 1 2 2 1 2 3 1 2 1 1
  b: 4 2 2 1 1 1 3 1 3 2 1 3
  c: 2 1 2 1 4 1 1 2 1 2
  d: 1 3 1 2 4 2 1 2 2 1 1 1
  e: 2 3 2 1 3 2 1 2 2 2 2 2
  f: 3 1 2 2 2 2 1 1 3 1 1 1
  g: 2 4 2 2 4 4 2 4 2 4 2 1
  h: 2 2 2 3 1 2 2 1 1

Privacy ratings (weight, η) versus security criteria

                    AU    BE    BR    FR    DE    IT    MY    NL    NZ    ES    UK    US
National security   N/A   N/A   2     1.7   3     3     1.3   2.7   N/A   2.7   1.7   1.3
Financial security  1.5   2     1.5   1.5   4     N/A   1     1.5   2     N/A   1     1.5
Internet security   3     1.5   2     1     2.5   N/A   2     1     2.5   N/A   1     2.5
Personal security   2     1.5   2     1.5   2     2.5   1     1.5   N/A   0.5   1     1

Security indices (on a scale from 0 to 300) and security weight values (on a scale from 0 to 5), ζ, are given in Table 3. According to the security indices presented in [62], the countries with serious concerns about national security are Brazil, Malaysia, US, and Spain. Alongside these countries, German people have serious concerns about financial security. Malaysia, Brazil, and Germany have serious concerns about personal security, although biometrics have been widely used in these countries. Moreover, people in Germany, Brazil, US, and Malaysia remain skeptical about Internet security. The Unisys analysis shows that Malaysia, Brazil, Germany, US, and perhaps Spain have serious concerns about security in general, whereas the other countries have moderate concerns.

Table 3. Security indices, security weight (ζ), and willingness values (ω) for each country.

             Unisys security index (security weight, ζ)          Willingness (ω)
Country      National   Financial   Internet   Personal    Fingerprint   Face   PIN
Australia    117 (2)    127 (3)     100 (2)    111 (2)     72            61     78
Belgium      80 (2)     110 (2)     93 (2)     102 (2)     61            40     74
Brazil       205 (4)    186 (4)     148 (3)    187 (4)     71            47     56
France       90 (2)     95 (2)      91 (2)     79 (2)      58            37     63
Germany      146 (3)    162 (3)     164 (3)    167 (3)     62            43     59
Italy        113 (2)    115 (2)     78 (2)     99 (2)      63            35     59
Malaysia     187 (4)    155 (3)     121 (3)    179 (3)     63            22     34
Netherlands  74 (2)     96 (2)      96 (2)     84 (2)      80            59     85
New Zealand  124 (3)    123 (3)     111 (2)    108 (2)     71            45     64
Spain        169 (3)    156 (3)     91 (2)     136 (3)     69            42     66
UK           124 (3)    141 (3)     111 (2)    133 (3)     75            67     78
US           161 (3)    152 (3)     127 (3)    140 (3)     72            52     69

The trust in the applications used to increase security in national, financial, personal, and Internet transactions highly depends on the public willingness (a percentage running from 0 to 100), ω, in the considered countries. The last three columns of Table 3 give the willingness scores of each country when fingerprints, facial scans, and PINs are used for authentication. As seen from the ω values, fingerprint-based authentication is widely accepted in the majority of the countries, whereas PINs are moderately accepted; people are generally skeptical about face biometrics. The Netherlands seems less concerned about biometrics and PINs than the other countries, whereas a large portion of the Malaysian population remains against biometrics. To give intuition, the trust plots for the Netherlands, Germany, Malaysia, and US in the financial domain are depicted in Figure 7.

Figure 7. Country profile examples. (a) Netherlands (η = 1.5, ζ = 2, ω = 80), (b) Germany (η = 4, ζ = 3, ω = 62), (c) Malaysia (η = 1, ζ = 3, ω = 63), and (d) US (η = 1.5, ζ = 3, ω = 72).

The privacy–security tradeoff curves for a selected trust value of 0.8 are illustrated in Figure 8. As depicted, Germany is the most skeptical country, because its tradeoff curve lies farthest from the origin; the Netherlands seems more relaxed, as its privacy–security curve lies nearest to the origin.
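A constant-trust curve such as those in Figure 8 can be traced by fixing T and solving the acceptance probability for P at each S. The sketch below assumes an exponential acceptance form T = 1 − exp(−ω·S^ζ·P^η), which is a hypothetical choice consistent with the qualitative behavior described earlier; the country parameters are taken from Figure 7:

```python
from math import log

def tradeoff_curve(T, zeta, eta, omega, s_values):
    """Points (S, P) achieving a fixed trust level T, assuming the
    acceptance form T = 1 - exp(-omega * S**zeta * P**eta) (an
    assumed sketch, not the published equation). Inverting for P:
    P = (-ln(1 - T) / (omega * S**zeta)) ** (1 / eta)."""
    c = -log(1.0 - T)
    curve = []
    for s in s_values:
        p = (c / (omega * s ** zeta)) ** (1.0 / eta)
        if p <= 1.0:  # keep only feasible privacy levels
            curve.append((s, p))
    return curve

# Netherlands (zeta=2, eta=1.5, omega=80) vs Germany (zeta=3, eta=4, omega=62)
nl = tradeoff_curve(0.8, 2, 1.5, 80, [1.0])
de = tradeoff_curve(0.8, 3, 4, 62, [1.0])
```

Under these assumed parameters, the German curve demands a higher privacy level than the Dutch one at the same security level, mirroring the "farther from the origin" observation in the text.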

Figure 8. Privacy–security tradeoff curves in the financial domain at T = 0.8.

Once the BioPSTM mesh is extracted, suppliers or decision makers can easily estimate the tendencies and approaches in a society while discussing the capabilities of a biometric authentication system. For instance, the parameters of a biometric authentication system can be adjusted to obtain a privacy–security tradeoff curve (as illustrated in Figure 8) at a certain trust value. Decision makers can determine the requirements by considering how much privacy and security a use case needs, tracking the privacy–security tradeoff curve, and anticipating the expected trust value at the selected operating point.

CONCLUSION


A formal privacy–security–trust model, namely BioPSTM, based on the tradeoff between privacy and security in template-protecting biometric authentication systems has been proposed in this work. The tradeoff is formulated in terms of the equivocation rates between pairs of components of a syndrome-based biometric template protection scheme and the average entropies of the biometric measurements and the private keys. The proposed model describes the users' perspective through an acceptance probability function that represents how much people trust the overall system. The parameters of BioPSTM give flexibility to adjust the level of privacy and security of the scheme as well as the public willingness. BioPSTM is demonstrated on 12 countries having experience with biometric solutions in the national, financial, Internet, and personal domains; the case study is given to demonstrate how one can select appropriate parameter values. The proposed BioPSTM may have a positive impact in forecasting users' perception by providing measurable decision criteria reflecting privacy, security, and trust needs, so that decision makers, researchers, investors, and other stakeholders interested in biometric authentication systems may assess candidate solutions or case scenarios quantitatively. For further studies, the selection or calculation of these parameters can be realized in a more objective and formal way, and the proposed model can be improved by defining new user acceptance probability or tradeoff functions for privacy and security.

ACKNOWLEDGEMENTS


This work was funded by TUBITAK BILGEM under the auspices of the project with the code G202-S432, known as Turkey's Citizenship Card Pilot Project. The authors would like to thank Dr. Omer Ileri for his assistance in identifying the potentials of user acceptance probability that has been applied in telecommunications sector. The authors are also grateful to Dr. Umut Uludag, Dr. Oktay Adalier, and Mehmet Ugur Dogan for their valuable comments and reviews.

REFERENCES

  • 1
    Maio D, Maltoni D, Jain AK, Prabhakar S. Handbook of Fingerprint Recognition. Springer, Verlag: New York, USA, 2003.
  • 2
    Badia L, Lindstrom M, Zander J, Zorzi M. Demand and pricing effects on the radio resource allocation of multimedia communication systems. Globecom 2003, 2003.
  • 3
    Ileri O, Samardzija D, Mandayam NB. Demand responsive pricing and competitive spectrum allocation via a spectrum server. IEEE DySpan, Baltimore, MD, 8–11 Nov., 2005; 194202.
  • 4
    Teoh ABJ, Goh A, Ngo DCL. Random multispace quantization as an analytic mechanism for biohashing of biometric and random identity inputs. IEEE Transactions on Pattern Analysis and Machine Intelligence 2006; 28(12):18921901.
  • 5
    Kanak A, Sogukpinar I. Fingerprint Hardening with Randomly Selected Chaff Minutiae. LNCS-4673, Springer Verlag: Heidelberg, Germany, 2007; 384390.
  • 6
    Roberge CSD, Stoianov A, Gilroy R, Kumar BV. Biometric encryption. ICSA Guide to Cryptography, ch. 2. McGraw Hill, New York, 1999.
  • 7
    Jin ATB, Ling DNC, Goh A. Biohashing: two factor authentication featuring fingerprint data and tokenised random number. Pattern Recognition Issue 2004; 11(37):22452255.
  • 8
    Sutcu Y, Sencar HT, Memon N. A secure biometric authentication scheme based on robust hashing. Conference on MM-SEC, New York, USA, 2005.
  • 9
    Savvides M, Vijaya Kumar BVK, Khosla PK. Cancelable biometric filters for face recognition. Proceedings of the International Conference on Pattern Recognition 2004; 922925.
  • 10
    Ratha NK, Chikkerur S, Connell JH, Bolle RM. Generating cancelable fingerprint templates. IEEE Transactions on Pattern Analysis and Machine Intelligence 2007; 29(4):561572.
  • 11
    Monrose F, Reiter MK, Wetsel S. Password hardening based on keystroke dynamics. Proceedings of the ACM Conference Computer and Communications Security, 1999; 7382.
  • 12
    Monrose F, Reiter MK, Li Q, Wetsel S. Using voice to generate cryptographic keys. A Speaker Odyssey, Speaker Recognition Workshop, 2001; 237242.
  • 13
    Juels A, Wattenberg M. A fuzzy commitment scheme. In Proceedings of the ACM Conference on Computer and Communications Security, Tsudik G (ed.) ACM: New York, NY, USA, 1999; 2836.
  • 14
    Juels A, Sudan M. A fuzzy vault scheme. Proceedings of IEEE International Symposium on Information Theory, Lausanne, Switzerland, 2002; p. 408.
  • 15
    Clancy TC, Kiyavash N, Lin DJ. Secure smartcard-based fingerprint authentication. ACM SIGMM Multimedia, Biometrics Methods and Application Workshop, 2003; 4552.
  • 16
    Uludag U, Pankanti S, Prabhakar S, Jain AK. Biometric cryptosystems: issues and challenges. Proceedings of the IEEE 2004; 92(6):948960.
  • 17
    Uludag U, Jain AK. Securing fingerprint template: fuzzy vault with helper data. IEEE Workshop on Privacy Research in Vision, NY, 2006.
  • 18
    Davida GI, Frankel Y, Matt B. Enabling secure applications through off-line biometric identification. Proceedings of the IEEE Symposium on Security and Privacy, 1998; 148–157.
  • 19
    Davida GI, Frankel Y, Matt B, Peralta R. On the relation of error correction and cryptography to an online biometric-based identification scheme. Coding & Cryptography Workshop, 1999; 129–138.
  • 20
    Dodis Y, Reyzin L, Smith A. Fuzzy extractors: how to generate strong keys from biometrics and other noisy data. EUROCRYPT'04, LNCS, Vol. 3027. Springer Verlag: Berlin, Germany, 2004; 523–540.
  • 21
    Linnartz JP, Tuyls P. New shielding functions to enhance privacy and prevent misuse of biometric templates. 4th International Conference on Audio- and Video-Based Biometric Person Authentication, 2003; 393–402.
  • 22
    Tuyls P, Goseling J. Capacity and examples of template-protecting biometric authentication systems. In ECCV Workshop BioAW, LNCS, Vol. 3087. Springer Verlag: Berlin, Germany, 2004; 158–170.
  • 23
    Buhan IR, Doumen JM, Hartel PH, Veldhuis RNJ. Fuzzy extractors for continuous distributions. ACM Symposium on Information, Computer and Communications Security, Singapore, 2007; 353–355.
  • 24
    Buhan I, Doumen J, Hartel P, Veldhuis R. Embedding renewable cryptographic keys into continuous noisy data. In 10th International Conference on Information and Communications Security (ICICS), LNCS, Vol. 5308. Springer Verlag: Birmingham, UK, 2008; 294–310.
  • 25
    Chang E, Li Q. Hiding secret points amidst chaff. EUROCRYPT 2006, LNCS, Vol. 4004, 2006; 59–72.
  • 26
    Sutcu Y, Li Q, Memon N. Protecting biometric templates with sketch: theory and practice. IEEE Transactions on Information Forensics and Security 2007; 2(3):502–512.
  • 27
    Ong TS, Teoh ABJ. Fuzzy key extraction from fingerprint biometrics based on dynamic quantization mechanism. 3rd International Symposium on Information Assurance and Security, 2007; 71–76.
  • 28
    Tong VVT, Sibert H, Lecoeur J, Girault M. Biometric fuzzy extractors made practical: a proposal based on fingercodes. Advances in Biometrics, LNCS 4642, 2007; 604–613.
  • 29
    Alvarez FH, Encinas LH. Security efficiency analysis of a biometric fuzzy extractor for iris templates. Advances in Soft Computing 2009; 63:163–170.
  • 30
    Sutcu Y, Rane S, Yedidia J, Draper S, Vetro A. Feature transformation of biometric templates for secure biometric systems based on error correcting codes. IEEE Computer Vision and Pattern Recognition Workshop, CVPRW2008, 2008; 1–6.
  • 31
    Sutcu Y, Rane S, Yedidia J, Draper S, Vetro A. Feature extraction for a Slepian-Wolf biometric system using LDPC codes. IEEE International Symposium on Information Theory, 2008.
  • 32
    Martinian E, Yekhanin S, Yedidia JS. Secure biometrics via syndromes. Allerton Conference on Communication, Control and Computing, Monticello, 2005.
  • 33
    Kanak A, Sogukpinar I. Classification based revocable biometric identity code generation. BioID-MultiComm 2009, LNCS 5707, 2009; 276–284.
  • 34
    Hao F, Chan CW. Private key generation from on-line handwritten signatures. Information Management and Computer Security 2002; 10(4):159–164.
  • 35
    Vielhauer C, Steinmetz R. Feature correlation analysis for biometric hashes. EURASIP Journal on Applied Signal Processing 2004; 4:542–558.
  • 36
    Zhang W, Chen T. Generalized optimal thresholding for biometric key generation using face images. International Conference on Image Processing 2005; 3:784–787.
  • 37
    Sheng W, Howells G, Fairhurst M, Deravi F. Template-free biometric key generation by means of fuzzy genetic clustering. IEEE Transactions on Information Forensics and Security 2008; 3(2):183–191.
  • 38
    Yamazaki Y, Komatsu N. A secure communication system using biometric identity verification. IEICE Transactions on Information and Systems 2001; E84-D(7):879–884.
  • 39
    Chang YJ, Zhang W, Chen T. Biometric-based cryptographic key generation. International Conference on Multimedia and Expo, 2004.
  • 40
    Fairhurst M, Hoque S, Howells WGJ, Deravi F. Evaluating biometric encryption key generation. 3rd COST 275 Workshop on Biometrics on the Internet, 2005; 93–96.
  • 41
    Lee YJ, Park KR, Lee SJ, Bae K, Kim J. A new method for generating an invariant iris private key based on the fuzzy vault system. IEEE Transactions on Systems, Man and Cybernetics, Part B 2008; 38(5):1302–1313.
  • 42
    Jain AK, Nandakumar K, Nagar A. Biometric template security. EURASIP Journal on Advances in Signal Processing, Special Issue on Advances in Signal Processing and Pattern Recognition Methods for Biometrics, 2008; 1–17.
  • 43
    Ballard L, Kamara S, Monrose F. Towards practical biometric key generation with randomized biometric templates. CCS'08, 15th ACM Conference on Computer and Communications Security, 2008; 235–244.
  • 44
    Tang Q, Bringer J, Chabanne H, Pointcheval D. A formal study of the privacy concerns in biometric-based remote authentication schemes. ISPEC 2008, LNCS 4991, Sydney, Australia, 2008; 56–70.
  • 45
    Ignatenko T, Willems F. Privacy leakage in biometric secrecy systems. 46th Allerton Conference on Communication, Control and Computing, Monticello, 2008.
  • 46
    Ignatenko T, Willems F. Secret rate-privacy leakage in biometric systems. IEEE International Symposium on Information Theory, 2009; 2251–2255.
  • 47
    Golic JD, Baltatu M. Entropy analysis and new constructions of biometric key generation system. IEEE Transactions on Information Theory 2008; 54(5):2026–2040.
  • 48
    Ratha NK, Connell JH, Bolle RM. Enhancing security and privacy in biometrics-based authentication systems. IBM Systems Journal 2001; 40(3):614–634.
  • 49
    Tuyls P, Skoric B, Kevenaar T. Security with noisy data: private biometrics, secure key storage and anti-counterfeiting. Springer Verlag: London, 2007.
  • 50
    Simoens K, Tuyls P, Preneel B. Privacy weaknesses in biometric sketches. 30th IEEE Symposium on Security and Privacy, 2009; 188–203.
  • 51
    Lai L, Ho S-W, Poor HV. Privacy–security tradeoffs in biometric security systems. 46th Allerton Conference on Communication, Control and Computing, Monticello, 2008.
  • 52
    Buhan I, Kelkboom E, Simoens K. A survey of the security and privacy measures for anonymous biometric authentication systems. 6th International Conference on Intelligent Information Hiding and Multimedia Signal Processing, 2010.
  • 53
    Ahlswede R, Csiszar I. Common randomness in information theory and cryptography, Part II: CR capacity. IEEE Transactions on Information Theory 1998; 44:225–240.
  • 54
    Vetro A, Draper SC, Rane S, Yedidia JS. Securing biometric data. Preprint of a chapter in Distributed Source Coding, Dragotti PL, Gastpar M (eds). Academic Press: San Diego, USA, 2009.
  • 55
    Salehi-Abari A, White T. The relationship of trust, demand, and utility: be more trustworthy, then I will buy more. 8th Annual International Conference on Privacy, Security & Trust, 2010; 72–79.
  • 56
    Huynh TD, Jennings NR, Shadbolt NR. An integrated trust and reputation model for open multi-agent systems. Autonomous Agents & Multi-Agent Systems 2006; 13(2):119–154.
  • 57
    Yu B, Singh MP. A social mechanism of reputation management in electronic communities. CIA'00, 2000; 154–165.
  • 58
    Sabater J, Sierra C. Regret: a reputation model for gregarious societies. 4th Workshop on Deception Fraud and Trust in Agent Societies, 2001; 61–70.
  • 59
    Josang A, Ismail R. The beta reputation system. 15th Bled Electronic Commerce Conference, 2002.
  • 60
    Sabater J, Sierra C. Review on computational trust and reputation models. Artificial Intelligence Review 2005; 24(1):33–60.
  • 61
    Varian HR. Intermediate Microeconomics: A Modern Approach. W.W. Norton: New York, 1987.
  • 62
    Lieberman Research Group. Unisys Security Index: Global Summary 1 December 2008 (Wave 3). Retrieved from http://www.unisyssecurityindex.com.
  • 63
    Lieberman Research Group. Unisys Security Index: Supplemental Question 1 December 2008 (Wave 3). Revised version 4/9/2009, Retrieved from http://www.unisyssecurityindex.com.