Analyzing the Privacy of a Vickrey Auction Mechanism
Then, B1 and B2 communicate the local function of X to B3. With this information, bidder B3 can find out the bid of X. An example of an auction is given in Figure 1.
In practice, this kind of collusion is extremely unlikely to occur. Let us suppose that before the auction begins, bidders B1, B2, and B3 make a deal to perform that plan. Let us note that the plan will work only if X sends both the transformation of his or her own bid and the transformation of some other value to bidders in the set {B1, B2, B3}. Recall that after a bidder applies his or her function, he/she freely (e.g., randomly) chooses which bidder will receive his or her transformed value to continue with the transformation. So, bidders B1, B2, and B3 must be really lucky to end up in the necessary positions with respect to some specific bidder X, or even to any other bidder Y that was not fixed in advance. In particular, the probability of this risk is negligible when the number of bidders is high.
The previous problem could be formulated the other way around. In this case, no deal is made between three specific bidders before the auction begins. Instead, after some bidder (say B2) notices that he/she received some transformed value from X, he/she tries to find who are, in this specific case, the bidders who play the roles of B1 and B3 in the previous presentation. After he/she finds them, he/she can propose to them to make a deal to uncover the bid of X. However, this scenario is not possible. Recall that once a bidder has applied his or her local function to some value, he/she sends two data to the next bidder in the transformation: the (partially) transformed value and a set containing all the bidders who have already applied their functions to this value. Hence, bidder B2 receives from X the set of previous bidders, but not an ordered sequence of the bidders who have participated in this transformation. So, B2 is not provided with the necessary information to find out who B1 is. According to that information, B2 can infer who B1 is only if the size of the received set is 2 (i.e., the transformation thread under consideration is that of the transformation of the bid of B1). However, even in this case B2 is unable to know who B3 is.
Figure 1. Example of auction (the winner, Bidder 7, appears in bold face in the original; the second highest bidder, Bidder 3, in italics)

Stage   Bidder 1    Bidder 2    Bidder 3    Bidder 4    Bidder 5    Bidder 6    Bidder 7    Bidder 8    Bidder 9    Bidder 10
Bid     30          18          65          46          47          39          97          58          63          46
1st     425.234     171.33      1683.41     909.87      945.367     678.261     3432.27     1374.45     1592.34     909.87
2nd     6715.23     6461.33     7973.41     7199.87     7235.37     6968.26     9722.27     7664.4      7882.3      7199.87
3rd     3.13306E09  3.0146E09   3.72007E09  3.35917E09  3.37573E09  3.25111E09  4.53602E09  3.57592E09  3.67759E09  3.35917E09
4th     6.64109E17  6.17765E17  9.16682E17  7.569E17    7.63919E17  7.11848E17  1.33002E18  8.5115E17   8.9713E17   7.569E17
5th     4.70011E18  4.65376E18  4.95268E18  4.7929E18   4.79992E18  4.74785E18  5.36602E18  4.88715E18  4.93313E18  4.7929E18
6th     2.2208E25   2.1989E25   2.34014E25  2.26465E25  2.26796E25  2.24336E25  2.53544E25  2.30918E25  2.3309E25   2.26465E25
7th     1.21115E40  1.19232E40  1.31567E40  1.24918E40  1.25207E40  1.23066E40  1.49347E40  1.28824E40  1.30746E40  1.24918E40
8th     5.08121E43  5.08119E43  5.08132E43  5.08125E43  5.08125E43  5.08123E43  5.08149E43  5.08129E43  5.08131E43  5.08125E43
9th     1.96679E49  1.96679E49  1.96684E49  1.96681E49  1.96681E49  1.9668E49   1.9669E49   1.96682E49  1.96683E49  1.96681E49
10th    2.43639E79  2.43638E79  2.43647E79  2.43642E79  2.43643E79  2.43641E79  2.43661E79  2.43645E79  2.43647E79  2.43642E79

Decoding of the 2nd highest value:
1st: 1.96684E49; 2nd: 5.08132E43; 3rd: 1.31567E40; 4th: 2.34014E25; 5th: 4.95268E18; 6th: 9.16682E17; 7th: 3.72007E09; 8th: 7973.41; 9th: 1683.41; 10th: 65

The use of several stages in our scheme yields a different risk of collusion. In the previous section, we introduced some conditions that are necessary to prevent bidders located at intermediate points during the decoding of the second highest bid from collecting enough information to infer the global function. However, we did not consider a scenario where these bidders collude. Contrary to the previous situation, in this case it is relatively easy for a bidder to find another
bidder to collude. Let B2 be a bidder located at the intermediate point between decoding stages 2 and 3. Since he/she receives a set denoting all the bidders that have been located at intermediate points so far, he/she knows who the bidder between stages 1 and 2 is. Let B1 be that bidder. B1 and B2 can together infer the part of the global decoding function that applies from stage 1 to stage 3. If four or fewer stages are used and one of these bidders is the winner, then B1 and B2 can uncover the global function: since the function that governs the transformation from stage 3 on depends on only two parameters, the winner, who owns two examples of its application, can infer it. However, let us note that B1 and B2 cannot find out who the bidders are at the rest of the intermediate points (recall that the next bidder after B2 either is not at an intermediate point or is at the same intermediate point as he/she). So, they do not have any privileged information about the function that governs the transformation from stage 3 on. Note that if this function depends on three parameters, then even the winner will be unable to infer it. Hence, by using five stages instead of four, we easily solve this collusion problem.
Let us consider one last collusion scenario. Let
us suppose that before the auction begins, some
bidders make a deal to collude. If these bidders
are located in enough intermediate points by
chance, then they could exchange their informa-
tion and infer the global function. In spite of the
fact that the probabilities of having some bidders
in enough intermediate points is negligible when
the number of bidders is high, even this problem
can be avoided by using n+1 stages, where n is the
number of bidders. Let us note that in this case,
all bidders are located in an intermediate point.
Then, only the collusion of n-2 nonwinning bid-
ders and the winner (i.e., n-1 bidders) allows the
colluders to uncover the global function. Let us
note that, in this case, the colluders can access the
values in all intermediate points but one. Let this
unknown point be located at stage i. Then, the col-
luders can compare the partial transformation of
the second highest bid at stages i-1 and i+1. Since
the global function from the beginning to i-1 and
from i+1 to the end is uncovered, the winner can
use these parts of the function to know the value
of its own bid at points i-1 and i+1. So, the win-
ner has two examples of the function that applies
from stage i-1 to stage i+1, which depends on 2
parameters. The consequence is that a group of
n-1 colluders that includes the winner can uncover
the global function. Let us note that as stated by
Brandt and Sandholm (2004), it is impossible to
find a completely private mechanism to perform
the Vickrey auction. So, providing a mechanism
where even the collusion of n-1 bidders is unable
to breach the privacy is not possible.
Let us note that the complexity of our protocol is O(n) when a fixed number of stages (e.g., five) is used, because the codification of all bids is performed in parallel. However, if n+1 stages are used, then the complexity is O(n²), which could be
too high if the number of bidders is high. Hence,
the risk of the previous kind of collusion should
be assessed before taking the decision of using
n+1 stages.
IMPLEMENTATION
The distributed algorithm that performs our
method and takes into account all the previous
considerations is depicted in Figure 2. Besides,
we have implemented a simple simulation of our
protocol. It is a monoprocessor C++ simulation,
where the relevant events that would occur in
the real system are represented. In particular, we
simulated a simple environment where 10 bidders
participate in the auction. Exponential, additive,
and multiplicative functions were used, as ex-
plained in previous sections. Since exponential
Figure 2. Actual algorithm

Notation:
P: set of bidders, with n = |P|. Given p_j ∈ P, the bid of p_j is b_j.
ST: number of stages.
S: set of bidders already used in this stage.
G: set of bidders already used in intermediate points (in the inverse transformation).

Initialization:
ST is agreed by all bidders (ST ≥ 5);
The family F_i of each stage i is commonly fixed by all bidders such that:
- for all 1 ≤ i ≤ ST: F_i = F_{i mod 3}, where F_1 = {f | f(x) = x^r}, F_2 = {f | f(x) = x+r}, and F_0 = {f | f(x) = r · x};
For each bidder p_j ∈ P, privately do:
- For each stage i, choose a function f_ji ∈ F_i;
- c := f_j1(b_j);
- p := ChooseRandomly(P \ {p_j});
- Transmit (c, {p_j}, 1) to bidder p; /* where 1 denotes the first stage */

Inductive case (forward way):
When tuple (c, S, i) is transmitted to bidder p_j, do:
c := f_ji(c);
If |S| ≤ n-2 then /* more bidders must apply their functions */
- p := ChooseRandomly(P \ (S ∪ {p_j}));
- Transmit (c, S ∪ {p_j}, i) to bidder p;
else if i < ST then /* there are more stages */
- p := ChooseRandomly(P);
- Transmit (c, Ø, i+1) to bidder p;
else broadcast c;

Comparison:
Once n values are broadcast, we publicly perform:
Obtain the second highest value (namely d);
p := ChooseRandomly(P); G := Ø;
Transmit (d, Ø, ST, G) to bidder p;

Inductive case (backward way):
When tuple (c, S, i, G) is transmitted to bidder p_j, do:
c := f_ji^-1(c);
If |S| ≤ n-2 then /* more bidders must apply their functions */
- p := MaybeChooseRandomly((P \ (S ∪ {p_j})) \ G);
- If p does not exist then p := ChooseRandomly(P \ (S ∪ {p_j}));
/* Depending on whether p finishes this stage: */
- If |S| ≤ n-3 then Transmit (c, S ∪ {p_j}, i, G) to bidder p
- else Transmit (c, S ∪ {p_j}, i, G ∪ {p}) to bidder p;
else if i > 1 then /* there are more stages */
- p := ChooseRandomly(P \ G);
- Transmit (c, Ø, i-1, G ∪ {p}) to bidder p; /* p is the first bidder of the next stage */
else broadcast c;

Resolution:
Once a value c is broadcast, do:
Ask for the bidder whose bid was greater than c. Let k be such a bidder;
Assign the item to bidder k at price c;
functions dramatically modify values, we were concerned about the magnitude of the transformed values and the precision of the overall process. This issue is critical because it concerns whether the transformed bids keep the same relative order as the original bids. Besides, it also concerns whether the decoding of a coded bid matches the original value. Even though five stages are enough to guarantee most privacy properties, more stages were used in order to maximize numerical errors and study the numerical stability of the algorithm. In particular, we used 10 stages where the kinds of functions
cyclically alternated, as we said before. The C++ data type used to store values was the standard long double type, which provides 80-bit precision floating-point numbers.
In our experiment, bidders choose a random
bid between 1 and 100. Then they select multipli-
cative factors between 1 and 10 and exponential
factors between 1.01 and 1.1. Regarding additive
factors, they choose values whose magnitude is
close to that of the values they manipulated in the
previous stage. In spite of the fact that the prob-
ability of each value is uniformly distributed in
our experiment, bidders of a true auction are free
to choose their values according to any private
criterion. The results of a simulated run are depicted in Figure 1.
The winner is depicted in bold face, while
the second highest bidder is shown in italics.
Both the codification and the decoding phases
are depicted. Let us note that the decoding of
the second highest value matched perfectly the
original second highest bid (65). For the sake of
checking precision issues, the rest of the values
were decoded as well. It turns out that not only
the relative order between bids is kept, but also
the final values match perfectly with the original bids. That is, all values at any decoding stage i coincide with those at the codification stage n-i.
The reason for that numerical precision is that
our protocol does not mix different kinds of op-
erations in the same stage. That is, although the order in which each exponential function is
applied is different for each bid, no single addi-
tion or multiplication is performed until all the
bids are transformed according to all exponential
functions in this stage. A similar argument applies
when the kind of function is either multiplicative
or additive. This fact minimizes numerical preci-
sion problems.
Let us note that if the number of bidders in an
auction is high, then the factors used by bidders
in their local functions should be constrained by
some upper and lower bounds. Otherwise, the size
of transformed values could be too large or too small, and they could overflow the maximal precision allowed by the chosen data types. Alternatively,
data types with unlimited precision (and unlimited
memory usage) can be constructed and used.
CONCLUSION AND FUTURE WORK
In this paper, we have studied some properties of the protocol presented in López et al. (2004). In that paper, a mechanism to perform a Vickrey auction was presented in such a way that only the second highest bid and the identity of the first bidder were revealed. Contrary to some previous works, this is achieved without the necessity of any trusted third party, not even the auctioneer. However, some relevant issues concerning risks against privacy, with or without collusion, were not considered. Besides, a discussion about practical issues was missing. In this paper, we have addressed all these topics, and some results about a simple implementation have been reported.
Let us note that this protocol requires that
all bidders behave conforming to the protocol.
That is, they are supposed to follow the behavior
described by the protocol. However, the result
of the auction could be ruined if one or more
bidders cheat. For example, if a bidder applies a
different function when a bid is coded and when
it is decoded, the published value of the second
highest bid could be wrong. Following this line,
we are currently developing a mechanism to
detect liars in the protocol. It consists of applying a kind of checksum to find out whether some bidder modified the results. It is a difficult task because faults in the results must be detected without revealing too much information, which could break the privacy.
ACKNOWLEDGMENT
We would like to thank Felix Brandt for his helpful
comments about our auction protocol. This work
has been supported in part by the MCYT project
MASTER (reference TIC2003-07848-C02-01),
and by the JCCLM project PAC-03-001.
REFERENCES
Brandt, F., & Sandholm, T. (2004). (Im)possibility
of unconditional privacy preserving auctions. In
Proceedings of the 3rd International Joint Con-
ference on Autonomous Agents and Multiagent
Systems (pp. 810–817). ACM Press.
Lipmaa, H., Asokan, N., & Niemi, V. (2002).
Secure Vickrey auctions without threshold trust.
In Proceedings of the Annual Conference on Fi-
nancial Cryptography, LNCS 2357 (pp. 87–101).
Springer.
López, N., Núñez, M., Rodríguez, I., & Rubio, F.
(2004). Improving privacy in Vickrey auctions.
ACM SIGEcom Exchanges, 5(1), 1–12.
Myerson, R. B. (1981). Optimal auction design.
Mathematics of Operations Research, 6, 58–73.
Naor, M., Pinkas, B., & Sumner, R. (1999). Privacy
preserving auctions and mechanism design. In
Proceedings of the ACM Conference on Electronic
Commerce (pp. 129–139). ACM Press.
Sandholm, T., & Lesser, V. (1995). On automated
contracting in multienterprise manufacturing. In
Proceedings of Distributed Enterprise: Advanced
Systems and Tools, ESPRIT Working Group 9245
(pp. 33–42).
Vickrey, W. (1961). Counterspeculation, auctions,
and competitive sealed tenders. Journal of Fi-
nance, 16, 8–37.
ENDNOTES

1. The RET requires, for example, that the lowest bidder expects zero profit, that bidders are risk-neutral, and that bidders have independent and private values for their items. These properties are rarely met in real e-commerce environments.
2. For instance, if a bidder played the role of an auctioneer in a previous auction where the item being sold was similar, then he could find out the bids in the current auction.
3. Alternatively, in order to guarantee that the winner (and not the second bidder) buys the item, the price could be raised by any tiny amount μ.
4. In (López et al., 2004) the possibility that a bidder knew two examples of the function was not explicitly considered. However, the global function depended in general on more than two unknowns, since each stage used a different kind of function and introduced a new unknown in the global function.
This work was previously published in International Journal of E-Business Research, Vol. 2, Issue 3, edited by I Lee, pp. 17-27,
copyright 2006 by IGI Publishing (an imprint of IGI Global).
Copyright © 2009, IGI Global, distributing in print or electronic forms without written permission of IGI Global is prohibited.
Chapter 7.9
E-Services Privacy:
Needs, Approaches, Challenges,
Models, and Dimensions
Osama Shata
Specialized Engineering Office, Egypt
ABSTRACT
This chapter introduces several aspects related to
e-privacy such as needs, approaches, challenges,
and models. It argues that e-privacy protection,
although being of interest to many parties such
as industry, government, and individuals, is very
difficult to achieve since these stakeholders often have conflicting needs and requirements, and may even have conflicting understandings of e-privacy. So, finding one model or one approach to e-privacy
protection that may satisfy these stakeholders
is a challenging task. Furthermore, the author
hopes that this chapter will present an acceptable
definition for e-privacy and use this definition
to discuss various aspects of e-privacy protec-
tion such as principles of developing e-privacy
policies, individuals and organizations needs of
various privacy issues, challenges of adopting and
coping with e-privacy policies, tools and models
to support e-privacy protection in both public and
private networks, related legislations that protect
or constraint e-privacy, and spamming and Inter-
net censorship in the context of e-privacy. The
author hopes that understanding these aspects
will assist researchers in developing policies and
systems that will bring the conflicting e-privacy
protection needs of individuals, industry, and
government into better alignment.
INTRODUCTION
The Internet in general and the World Wide Web
(WWW) in particular, were initially intended
to facilitate sharing of information between
individuals, research centers, organizations,
and so forth. However, they have now become
the fastest growing means to provide a variety
of services such as e-government, e-commerce,
e-communication, e-entertainment, e-education,
e-investment, and so on.
Although "electronic services" (e-services)
is a term that implies the offering of services by
electronic means, it is mostly used now to mean
the offering of services via the Internet/WWW.
E-services are of various types, including those
that enable individuals and organizations to access
information (e.g., surfing the WWW), and those
that facilitate transmitting of data (e.g., banking
application, e-shopping).
Individuals and organizations using and offer-
ing e-services are subject to many potential threats
(e.g., unauthorized intrusion and collection of IP
addresses, session hijacking, copying/stealing
information digitally stored, etc.). This raises the
need for high standards of security measures.
One of the threats that is receiving growing
attention is violating the privacy of users using
e-services. One type of violation may occur by
harmful software that attacks computers to col-
lect sensitive information for purposes such as
identity theft, or to destroy stored information.
This requires the continuous adoption of new and up-to-date protection techniques.
A second type of privacy violation is commit-
ted by organizations offering e-services. Such or-
ganizations tend to collect some of an individual's personally identifiable information (PII), which is
considered critical for the organizations’ interests,
but also is seen private by the individual using the
e-services. This necessitates preventing PII from
being collected without consent and protecting PII
collected with consent. This has raised Internet
privacy protection as one of the top policy issues
for legal institutions and legislators.
In order to resolve this conflict of interests
between individuals and organizations, several
laws and acts have been issued with the aim of
balancing the interests of the two parties. The
purpose of these laws and acts is to organize the
process of collecting, processing, and protecting
PII of individuals using e-services, and hence, to
provide some protection for individuals. This is
what we call in this chapter "e-service privacy protection", or "e-privacy" for short.
E-privacy is a concept that is difficult to define.
It is seen differently by the parties involved. Some
of the organizations that collect PII may view the
Internet as a public environment, and those who
connect to it should expect to be noticed. Other
organizations offer free services, thus those who
use the services should expect some trade off. On
the other hand, individuals believe that their online
activities and all their PII are private and belong
to them. Since these individuals switch between
TV channels and view whatever they prefer in
privacy without being tracked, they expect the
same privacy when surfing the WWW. Legislators
always debate comprehensively, before the issuing
of any related privacy law, on how to balance the
interests of the collecting organizations and indi-
viduals, and what principles and standards may
be used (e.g., the Canadian Personal Information
Protection and Electronic Documentation Act
[Government of Canada-1]). In addition, e-privacy
may be broadened to cover individuals’ rights not
to receive any unsolicited advertisements (spam-
ming) in their e-mail inboxes, as well as their
rights to access Web sites without restrictions
(Internet censorship). Another question is whether
or not the meaning of e-privacy would differ ac-
cording to whether the communication network
used is public (Internet) or private (belongs to an
organization or a workplace).
There are many issues related to e-privacy
such as its criticality, types, scope, standards,
legal requirements, challenges, approaches, and
how to protect it.
This chapter will discuss some of these issues.
In particular, it aims to:
1. Introduce e-privacy, define and consider it
from various perspectives,
2. Highlight standards and principles of e-
privacy policies,
3. Identify various challenges of adopting and
coping with e-privacy policies,
4. Discuss e-privacy in the context of public
networks,
5. Relate e-privacy to electronic security,
6. Identify critical organizational, legal, and
technical issues in managing e-privacy,
7. Introduce example models and approaches
to maintain individuals’ e-privacy,
8. Discuss spamming and Internet censorship
in the context of e-privacy, and
9. Introduce e-privacy considerations in private
networks.
WHAT IS E-PRIVACY?
Privacy is an abstract word that has various
meanings, scopes, and dimensions to individu-
als depending on each individual’s background,
psychology, beliefs, and ethics. However, most
individuals will relate its meaning to their right
to act freely from unauthorized intrusion and to
their right to keep what they believe to be private
from others.
Hence, we can look at e-privacy as an
individual’s right to act freely online (on the In-
ternet/WWW) without being monitored, traced,
restricted, and to keep their PII from being col-
lected or distributed to other parties without their
consent. Unfortunately, this definition may not be approved by some organizations that find it
necessary to monitor individuals while they are
online to collect some necessary PII.
This chapter will use the above definition of
e-privacy as a base when discussing the balancing
of conflicting interests between individuals and
organizations and to highlight some other related
issues. The next section will introduce examples of
violating individuals’ privacy online and discuss
the need for protecting e-privacy.
THE NEED FOR E-SERVICES
PRIVACY PROTECTION
New technology has enabled electronic services
providers and other parties to monitor online us-
ers and to collect and transfer users’ PII. These
technological capabilities have raised concerns
among users that their PII could be used in ways
that they would consider an invasion of their e-
privacy. As an attempt to understand the need for
e-privacy protection, it would be helpful to list
some examples of the misuse of the Internet which
have affected individuals’ e-privacy, whether the
misuse was intentional or unintentional.
In a recent article, Cobb and Cobb (2004)
gave an example to illustrate what happens when
people fail to understand how technology may
affect privacy. Some years ago, legislators in the
State of Florida authorized counties to put all
public records on the Web. As a result, anyone
could view the private data in the records such
as name, social security number, address, and in
some cases signatures. This can undoubtedly be
classified as a privacy violation and can lead to
various crimes such as identity theft. A sample
record that has been used by Cobb and Cobb (2004)
may be examined for illustration at: http://www.
privacyforbusiness.com/example1.htm.
While Cobb and Cobb’s example of privacy
invasion was unintentional and would not be
considered a criminal act, one can find many
examples of intentional privacy invasion for the
sake of electronic fraud. One example of this is the break-in at Western Union's Web site, in which the hackers are thought to have copied credit card information for more than 15,000 customers (ID Theft, Schemes, Scams, Frauds).
Large organizations also suffer from private
information violations. In a presentation by Fred
Holborn, he presented that "[...] of large organizations detected computer security attacks in [...]; [...] acknowledged financial losses due to computer breaches; theft of proprietary information caused the greatest financial loss — [...] million average."
One can argue that these examples of privacy
invasion may occur in non-electronic environ-
ments as well and that the use of the Internet and
electronic services has just made them easier.
2102
E-Services Privacy
However, there are examples of privacy inva-
sions that would occur only because of the elec-
tronic services and the use of the Internet/WWW.
Such invasions are committed by unauthorized
software. This software is of various types,
including spy-ware, ad-ware, viruses, cookies,
online activities trackers, scum-ware, and Web
beacons. However, these various types share the
characteristic of being uninvited. In most cases,
the user is not aware of their existence until his or
her computer starts functioning in an unexpected
way, and they can be difficult to remove. While
some unauthorized software can cause serious
problems such as destroying personal data, and
identity theft, the aim of much other unauthorized
software is to collect information for the sake of
sending ads or even for security purposes. The
first type of unauthorized software enables illegal activities and would require individuals and organizations that keep online data to increase their security measures (e.g., to use firewalls).
Meanwhile, the second type of the unauthorized
software would need to be controlled by some
policy. To understand this better consider the
following examples:
• An online shopping Web site tracks its visi-
tors’ online activities and collects their PII
to e-mail them discounted offers.
• A search engine tracks its users' surfing and
passes their PII to its sponsoring organiza-
tions.
• An e-government Web site collects PII with
individuals’ consent, but shares the PII with
other government agencies.
• An organization is increasing its security
measures and checks outgoing e-mail to
protect its critical data from being leaked, or
is monitoring its employees online activities
during working hours to increase productiv-
ity — is this an invasion of its employees’
e-privacy?
Such examples of what may be considered e-
privacy invasion led many legislators to conclude
that industry needs standards, policies, and laws
to organize the process of monitoring individuals'
online activities and online private data collec-
tion and usage. Some governments (e.g., U.S.,
EU, Australia, and Canada) have already issued
related bills and laws (e.g., the Canadian Personal
Information Protection and Electronic Documen-
tation Act [Government of Canada-1]).
The discussion of the need for e-services
privacy protection usually focuses on e-services
provided within public networks and does not
focus on e-services provided within non-public
networks (although the later type also requires
attention). However, the topic of e-privacy is still
evolving, and there is no unified definition or
scope for online privacy and protection. We will
introduce in Section vi a discussion of the related
standards, principles, and models applicable to
e-privacy.
STANDARDS, PRINCIPLES, AND
MODELS OF E-SERVICES PRIVACY
PROTECTION
The rapid growth of Internet e-services and Web-
based applications that target consumers and
collect their PII led to concerns over e-privacy.
This required the issuing of laws for enforcing
e-privacy protection. Most e-privacy-related laws
enforce sites and organizations that collect PII
from individuals using an e-service by adopting
an e-privacy policy with minimally enforced standards and specifications. In many cases, the laws
and regulations for e-privacy are amendments to
off-line privacy acts that are already in place.
In Canada, there is the Personal Information
Protection and Electronic Documents Act. The
purpose of this act, as stated on the Department
of Justice Web site, is (Department of Justice Canada): "to provide Canadians with a right
of privacy with respect to their personal infor-
mation that is collected, used, or disclosed by
an organization in the private sector in an era
in which technology increasingly facilitates the
collection and free flow of information." The
privacy provisions of the Personal Information
Protection and Electronic Documents Act are
based on the Canadian Standards Association’s
Model Code for the Protection of Personal In-
formation, recognized as a national standard in
1996 (Government of Canada). The code’s 10
principles are: "Accountability", "Identifying Purposes", "Consent", "Limiting Collection", "Limiting Use, Disclosure, and Retention", "Accuracy", "Safeguards", "Openness", "Individual Access", and "Challenging Compliance". This act
is supposed to cover both the government and the
private sectors. There are also other provincial acts
(e.g., Alberta’s Personal Information Protection
Act [Alberta Government] and British Columbia’s
Personal Information Protection Act).
The UK has the Data Protection Act 1998
(UK, 1998). The act "applies to a data controller
in respect of any data only if:
(a) the data controller is established in the United
Kingdom and the data are processed in the
context of that establishment, or
(b) the data controller is established neither in the
United Kingdom nor in any other EEA State
but uses equipment in the United Kingdom
for processing the data otherwise than for
the purposes of transit through the United
Kingdom.”
The act’s main principles emphasize that
personal data: shall be processed fairly
and lawfully; obtained only for one or
more specified and lawful purposes; shall
be adequate, relevant, and accurate; shall
not be kept for longer than is necessary;
shall be protected by appropriate techni-
cal and organizational measures against
unauthorized or unlawful processing and
against accidental loss or destruction; and
shall only be transferred to a country or
territory outside the European Economic
Area (EEA) under conditions that ensure
an adequate level of protection of personal
data and the rights and freedoms of data
subjects (UK, 1998). The reader may realize
the great similarity between the principles
of the UK’s Data Protection Act 1998 and
those of the Canadian Personal Information
Protection and Electronic Documents Act.
Australia has the Federal Privacy Law. It
contains 11 Information Privacy Principles (IPPs), which apply to Commonwealth
and government agencies (Federal Privacy
Commissioner [Australia], 1988). The 11
principles are: Manner and purpose of col-
lection of personal information; solicitation
of personal information from individual
concerned; solicitation of personal infor-
mation generally; storage and security of
personal information; information relating
to records kept by record-keeper; access to
records containing personal information;
alteration of records containing personal
information; record-keeper to check ac-
curacy and so forth of personal informa-
tion before use; personal information to be
used only for relevant purposes; limits on
use of personal information; and limits on
disclosure of personal information. The law
also has 10 National Privacy Principles that
apply to parts of the private sector and all
health service providers.
While the Canadian act and the Australian law
cover both the federal and private sectors, other
countries have laws and acts to govern the federal
sector and leave the private sector to develop its
own privacy policies (e.g., USA).
In the United States, legislators have passed legislation regarding information practices and
e-privacy for the federal government. However,
legislators are still debating whether an e-privacy
act for the private sector is needed or industry