
Computer Networks 79 (2015) 17–29


A privacy-aware framework for targeted advertising

http://dx.doi.org/10.1016/j.comnet.2014.12.017
1389-1286/© 2015 Elsevier B.V. All rights reserved.

* Corresponding author. Tel.: +852 23588766. E-mail address: [email protected] (Q. Zhang).

Wei Wang a, Linlin Yang b, Yanjiao Chen a, Qian Zhang a,*
a Department of Computer Science and Engineering, Hong Kong University of Science and Technology, Hong Kong
b Fok Ying Tung Graduate School, Hong Kong University of Science and Technology, Hong Kong


Article history:
Received 21 June 2014
Received in revised form 6 November 2014
Accepted 27 December 2014
Available online 7 January 2015

Keywords:
Targeted advertising
Privacy
Game theory

Much of today's Internet ecosystem relies on online advertising for financial support. Since the effectiveness of advertising heavily depends on the relevance of the advertisements (ads) to users' interests, many online advertisers turn to targeted advertising through an ad broker, who is responsible for personalized ad delivery that caters to users' preferences and interests. Most existing targeted advertising systems need to access users' profiles to learn their traits, which, however, has raised severe privacy concerns and makes users unwilling to participate in the advertising systems. Spurred by the growing privacy concerns, this paper proposes a privacy-aware framework to promote targeted advertising. In our framework, the ad broker sits between advertisers and users for targeted advertising and provides a certain amount of compensation to incentivize users to click ads that are interesting yet sensitive to them. The users determine their clicking behaviors based on their interests and potential privacy leakage, and the advertisers pay the ad broker for ad clicks. Under this framework, the optimal strategies of the advertisers, the ad broker and the users are analyzed by formulating the problem as a three-stage game, in which a unique Nash Equilibrium is achieved. In particular, we analyze the players' behaviors for the scenarios of independent advertisers and competing advertisers. Extensive simulations have been conducted, and the results validate the effectiveness of the proposed framework by showing that the utilities of all entities are significantly improved compared with traditional systems.

© 2015 Elsevier B.V. All rights reserved.

1. Introduction

Online advertising provides financial support for a large portion of today's Internet ecosystem, and is displayed in a variety of forms embedded in web sites, emails, videos and so on. As the effectiveness of advertising largely depends on the relevance between the delivered advertisements (ads) and users' interests, a popular paradigm for current online advertising systems is targeted advertising, where advertisers hire an ad broker to deliver ads to potentially interested users by analyzing users' online profiles or behaviors [1]. Targeted advertising is beneficial to both advertisers and users: advertisers can gain higher revenue by advertising to users with a strong potential to purchase, and the users in turn receive more pertinent and useful ads that match their preferences and interests. A recent survey [2] revealed that targeted advertising brings 2.68 times the revenue per ad compared with non-targeted advertising. Due to the increased effectiveness and benefits, a number of advertisers around the world have already turned to online targeted advertising systems. Examples of such advertising systems include Google AdWords [3], which delivers customized ads based on search items, and Ink TAD [4], which pushes ads according to location information revealed in users' emails.


Although targeted advertising benefits both advertisers and users, it has raised severe privacy concerns. A recent survey [5] of 2253 participants conducted in 2012 reported that the majority of respondents expressed disapproval of targeted advertising due to privacy disclosure. Such privacy threats come from the fact that ad brokers aggressively track users' online behaviors to obtain their preferences and interests, which can be sensitive to the users. For example, the behavior of searching for a certain kind of medicine implies that the user is likely to have a certain relevant disease, whose disclosure is considered a violation of the user's privacy. Moreover, ad brokers rarely have clear statements about how the obtained behavioral data will be used and with whom the data will be shared. Untrusted ad brokers may sell such personal information to adversaries without the user's permission. Being aware of such privacy risks, users are reluctant to embrace the practice of targeted advertising [6], which hinders the effectiveness of online advertising systems.

To maintain the merits brought by targeted advertising, it is essential to incentivize users to participate in such systems. Existing studies [7–9] have focused on privacy-preserving mechanisms to encourage users to participate in targeted advertising systems. These mechanisms either assume another trusted entity sitting between users and ad brokers [7,8], or require users to send perturbed clicking information to hide users' true data. However, these changes to the framework of existing targeted advertising systems provide privacy protection at the cost of the benefits of ad brokers or advertisers. The ad brokers may not be in favor of introducing an extra entity to share their ad targeting duty, which is the main source of their revenue. Similarly, the advertisers may be dissatisfied with the perturbed clicking information, as perturbation undermines the accuracy of click information, which normally determines their payments [10]. Without guaranteed revenue, the advertisers and ad brokers naturally tend to maintain the traditional targeted advertising systems instead of upgrading the systems to provide privacy protection. This conflict between users and advertisers/ad brokers hinders the adoption of privacy-aware mechanisms in advertising. To promote the adoption of privacy-aware advertising systems, the interests of all entities should be guaranteed, which, unfortunately, has not yet been addressed by existing proposals.

In this paper, we propose a privacy-aware framework to boost the adoption of privacy-preserving targeted advertising systems. Users, the ad broker and advertisers are assumed to be rational and selfish entities, who care only about their own interests. To ensure the interests of all entities, our framework introduces an economic compensation mechanism for privacy leakage. Such economic compensation for privacy loss has already been widely considered in the literature [11,12]. Besides, many companies, including Bynamite, Yahoo, and Google, are also engaging in the purchase of users' private information in exchange for monetary or non-monetary compensation [13,14,12]. Under the proposed framework, the ad broker compensates users economically for their privacy leakage in order to incentivize them to click their interested ads. On the one hand, the users, with the expectation of receiving compensation, are inclined to click ads of interest. On the other hand, as the compensation can improve the click-through rate and bring the ad broker more revenue, the ad broker is willing to provide a certain amount of compensation for users whose ad clicks reveal their private interests. However, to support this framework, there are still several questions that need to be answered. First and foremost, in order to compensate privacy loss, it is essential to quantify privacy information leakage in ad clicks. Second, how much compensation should be provided for each user? The more compensation provided, the more inclined users are to click ads, while the ad broker pays more for the users' privacy loss. Moreover, how should advertisers pay the ad broker for the ad clicks they benefit from? The amount of payment to the ad broker has an impact on the privacy loss compensation allocated to users, which in turn affects the click-through rates and advertisers' revenue. In this paper, we answer all these questions via game theory analysis. In particular, we propose an ad dissemination protocol to protect the users' privacy to a large extent, and formulate the interactions among all entities as a three-stage game, where each entity aims to maximize its own utility.

The main contributions of this paper are summarized as follows.

• We propose a privacy-aware framework for targeted advertising to motivate users, the ad broker, and advertisers to be engaged in the targeted advertising systems. This framework requires no modifications to existing targeted advertising systems, and takes the incentives of all parties into consideration. In our framework, the ad broker provides a certain amount of compensation for the users' privacy leakage from ad clicks in order to encourage users to click their interested ads, which in turn improves the click-through rate and brings in more revenue for the ad broker and advertisers.
• We model the framework as a three-stage Stackelberg game, in which all entities are considered to be selfish, aiming to maximize their own utilities by selecting optimal strategies. We analyze the cooperation and competition relationships among users, the ad broker, and advertisers, and derive the Nash Equilibrium.
• We further analyze the competition among advertisers who share the whole market. We model the market sharing scenario as a non-cooperative game and prove the existence of the Nash Equilibrium.
• We conduct numerical simulations to evaluate the proposed framework. The results verify that the utilities of all entities are notably enhanced, which provides strong motivation for the ad broker and advertisers to implement the compensation policy and for users to embrace targeted advertising.

The rest of the paper is organized as follows. Section 2 describes the system model. Section 3 introduces the compensation framework. In Section 4, we model the framework as a three-stage game and analyze the optimal strategies of advertisers, the ad broker and users; we further discuss the competition among advertisers for market sharing. Numerical results are shown in Section 5, and related works are reviewed in Section 6, followed by the conclusion in Section 7.

2. System model

In this section, we describe the system model, including the targeted advertising system and privacy sensitivity.

2.1. Targeted advertising system

We adopt the most popular advertiser–broker–user advertising architecture on today's Internet. Fig. 1 illustrates such a targeted advertising system, which comprises a set of advertisers, one ad broker and a group of users. Advertisers are interested in displaying ads to potentially interested users, and are willing to pay a certain amount of money for the ads that are viewed by the interested users. Ads are delivered by an advertising platform (e.g., Google's AdWords) run by the ad broker. The advertising platform collects ads from advertisers and disseminates them to the targeted users according to the ad broker's scheduling algorithm.

Ad dissemination is scheduled over a series of time slots. In each time slot, there is a set of active users, that is, the users who are viewing a device that can display an ad in that time slot. A set of ads from different advertisers is selected based on the preferences of the active users in that time slot. We assume that the ad broker selects at most one ad from an advertiser in each time slot. In order to keep the notation simple, we consider the ad dissemination in one time slot. We denote the set of active users and the set of ads in a time slot as $\{S_i : i = 1, \ldots, N\}$ and $\{A_j : j = 1, \ldots, K\}$, respectively.

Fig. 1. Structure of the privacy-aware targeted advertising system: (1) advertisers send ads to the ad broker; (2) the ad broker pushes ads to users; (3) users return click information to the ad broker; (4) click statistics are reported to advertisers.

To preserve users' privacy, we assume that the user profiles are kept on their local devices and cannot be accessed by the ad broker or advertisers [9]. In line with privacy-preserving online advertising systems [8,9,15], we focus on privacy leakage via ad clicks, while we do not consider privacy leakage via web browsing tracking, as it occurs outside the advertising system. To predict the users' preferences, the ad broker collects the ad click information, which, however, may compromise users' privacy. In the following section, we quantify privacy leakage from ad clicks. We also assume that there is no click fraud, as click fraud can be addressed by existing solutions [16,17].

2.2. Privacy sensitivity

On the one hand, the privacy level of ad click behavior is considered to be content-dependent: clicking different kinds of ads leads to different levels of privacy leakage. For example, clicking a medical ad may imply that the user has a certain kind of disease, which is normally very sensitive, whereas clicking an umbrella ad usually leaks little privacy. To quantify the content-dependent privacy level, we introduce a privacy factor $\alpha_j$ to represent the sensitivity of an ad $A_j$. The content of an ad with larger $\alpha_j$ is more sensitive. For example, the privacy factor $\alpha_{j_m}$ of a medical ad $A_{j_m}$ is normally larger than the privacy factor $\alpha_{j_u}$ of an umbrella ad $A_{j_u}$.

On the other hand, the privacy level of ad click behavior is also related to the total number of users who have clicked the ad. According to the notion of k-anonymity [18], when a user's information is identical to that of more users (k is larger), it is harder for adversaries to identify the user by looking at the clicking behavior, meaning that less information is leaked. Thus, we define the privacy leakage of clicking an ad $A_j$ as $\alpha_j/n_j$, where $n_j$ is the total number of users clicking $A_j$.

Fig. 2. The procedure of targeted advertising and ad clicking (targeted advertising game and user click decision subgame). Stage I: advertisers announce click prices. Stage II: the ad broker determines the total amount of compensation, which is allocated among users. Stage III: users decide their clicking behaviors.

3. The compensation framework

In this section, we propose a privacy-aware compensation framework to promote targeted advertising with the consideration of privacy leakage. The target of the framework is to establish a win–win situation among all entities in the targeted advertising system. In the framework, with the promise of privacy leakage compensation, sensitive users are incentivized to view their interested ads, and the ad broker and advertisers earn more revenue from more ad clicks. It is noteworthy that our proposed framework is built on top of the existing advertiser–broker–user advertising architecture as described in Section 2, and introduces no new parties.

The ad broker assists advertisers to target interested users and deliver ads to those users. Advertisers yield higher revenue by delivering ads to desired users via the ad broker [1]. In return, the ad broker charges the advertisers for each ad click. This business model is prevalent in the current Internet ecosystem. Recall that users are reluctant to click their interested ads when the ad contents are sensitive and their private preferences can be revealed to the ad broker. To motivate users to click their interested ads, the proposed framework introduces a compensation mechanism in which the ad broker puts forward a total compensation amount $M_j$, which is distributed to the users who have clicked the sensitive ad $A_j$.

Under the compensation framework, all entities are considered to be rational and aim to maximize their own utilities. Advertisers gain revenue from clicks. In each time slot, each ad corresponds to one advertiser; for simplicity, we use $A_j$ to denote the advertiser who distributes ad $A_j$. For every click on $A_j$, the corresponding advertiser gains an average revenue of $Q_j$. The expense of an advertiser is the money it pays to the ad broker. We denote the amount of payment for each click on $A_j$ as $P_j$. The utility of advertiser $A_j$ is defined as its revenue from clicks minus the fee it pays to the ad broker, which is expressed as

$$U^a_j = n_j Q_j - n_j P_j, \quad (1)$$

where $n_j$ is the total number of clicks on ad $A_j$.

The ad broker receives money from advertisers and decides the total amount of compensation $M_j$ to be paid to all users for clicking ad $A_j$. The compensation $M_j$ is then allocated evenly to each user who has clicked ad $A_j$. The utility of the ad broker is defined as the total revenue from all advertisers minus the total money paid to users:

$$U^b = \sum_j n_j P_j - \sum_j M_j. \quad (2)$$

Users decide whether to click their interested ads that are delivered to their devices based on the amount of compensation and their privacy loss. Let $R_{i,j} = 0$ denote that user $S_i$ decides not to click ad $A_j$ and $R_{i,j} = 1$ denote that user $S_i$ decides to click ad $A_j$. Then, the total number of clicks for ad $A_j$ is $n_j = \sum_q R_{q,j}$. The overall compensation that user $S_i$ receives is

$$C_i = \sum_j \frac{M_j R_{i,j}}{\sum_q R_{q,j}}. \quad (3)$$

Studies have shown that different users have different levels of privacy sensitivity [31,32]. As described earlier, ad clicks leak users' preferences to the ad broker. The privacy leakage of clicking ad $A_j$ is $\alpha_j/n_j = \alpha_j/\sum_q R_{q,j}$. Therefore, the total amount of user $S_i$'s privacy leakage is

$$L_i = \sum_j \frac{\alpha_j R_{i,j}}{\sum_q R_{q,j}}. \quad (4)$$

Then, the utility of user $S_i$ is defined as the amount of money it receives from the ad broker minus the amount of money needed to compensate the user for privacy leakage:

$$U^s_i = C_i - \omega_i L_i = \sum_j \frac{M_j R_{i,j}}{\sum_q R_{q,j}} - \omega_i \sum_j \frac{\alpha_j R_{i,j}}{\sum_q R_{q,j}}, \quad (5)$$

where $\omega_i$ is defined as the equivalent amount of compensation desired by user $S_i$ for every unit of privacy loss.
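For concreteness, the following Python sketch (our own illustration, not code from the paper; all variable names are ours) evaluates the utilities of Eqs. (1)–(5) for a given matrix of click decisions.

```python
import numpy as np

def utilities(R, Q, P, M, alpha, omega):
    """Evaluate Eqs. (1)-(5).

    R: (N, K) 0/1 click decisions; Q, P, M, alpha: length-K arrays (per-click revenue,
    per-click price, total compensation, privacy factor); omega: length-N sensitivities.
    """
    n = R.sum(axis=0)                       # n_j: total clicks on ad j
    n_safe = np.maximum(n, 1)               # guard against division by zero when n_j = 0
    U_adv = n * Q - n * P                   # Eq. (1): advertiser utilities
    U_broker = (n * P).sum() - M.sum()      # Eq. (2): ad broker utility, applied as written
    C = (M * R / n_safe).sum(axis=1)        # Eq. (3): compensation received per user
    L = (alpha * R / n_safe).sum(axis=1)    # Eq. (4): privacy leakage per user
    U_user = C - omega * L                  # Eq. (5): user utilities
    return U_adv, U_broker, U_user
```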

4. Game theory analysis

In this section, we cast the targeted advertising under the compensation framework as a three-stage Stackelberg game, and analyze the best strategies of all entities based on their utilities. We consider two advertising scenarios: independent advertisers who are engaged in independent markets, and market-sharing advertisers who compete against each other.


The targeted advertising game consists of three stages, as illustrated in Fig. 2. In the first stage of the game, advertisers announce the price of each ad click they will pay to the ad broker. In the second stage, observing the price it receives for every click, the ad broker determines the amount of money it compensates all users. In the last stage, users decide whether or not to click the ads. All players act in a non-cooperative way, as they make decisions independently.

The whole game can be viewed as the combination of a user click decision subgame and a targeted advertising subgame. In the user click decision subgame, the ad broker acts as the leader because it determines the total amount of compensation and leads the ad delivery procedure. The ad broker first announces the total amount of compensation and pushes ads to intended users according to its scheduling algorithm. The users play as followers, and decide their clicking behavior based on their evaluation of privacy loss and expected compensation. In the targeted advertising subgame, the advertisers play as leaders, as they initiate the whole advertising procedure and pay the ad broker to distribute their ads. The ad broker acts as the follower and initiates the user click decision subgame.

4.1. Targeted advertising for independent advertisers

We first analyze the targeted advertising game for independent advertisers, and use backward induction to derive the Nash Equilibrium, where the optimal strategies for advertisers, the ad broker and users are obtained accordingly.

4.1.1. User clicking behavior analysis

We first analyze users' decision making process. Note that user $S_i$'s utility depends not only on its own decisions, but also on other users' choices. Let $\{R_{i,j} : j = 1, \ldots, K\}$ denote user $S_i$'s strategy and $\{R_{-i,j} : j = 1, \ldots, K\}$ denote the strategies of all the other users. The utility of user $S_i$ is $U^s_i(R_{i,j}, R_{-i,j})$. Then, the best response of $S_i$ is given by

$$R^*_{i,j} = \arg\max_{R_{i,j}} U^s_i(R_{i,j}, R_{-i,j}). \quad (6)$$

Given the strategies of other users, the best response of $S_i$ is its optimal strategy, and $S_i$ gains nothing by unilaterally deviating from it. If every user employs the best response with regard to the other users' decisions, no user has motivation to deviate from its best response unilaterally. As such, a Nash Equilibrium is reached.

Theorem 1. When the following condition

$$M_j/\alpha_j \geq \min_i \{\omega_i\} \quad (7)$$

is satisfied, there exists a Nash Equilibrium for the game among users, and the optimal strategy of $S_i$ is

$$R^*_{i,j} = \begin{cases} 1, & \omega_i \leq M_j/\alpha_j \\ 0, & \omega_i > M_j/\alpha_j. \end{cases} \quad (8)$$

Proof. From Eq. (5), we have

$$U^s_i = \sum_j \frac{(M_j - \omega_i \alpha_j) R_{i,j}}{\sum_q R_{q,j}}. \quad (9)$$

When $M_j \geq \omega_i \alpha_j$, selecting $R_{i,j} = 1$ can increase $S_i$'s utility, whereas if $M_j < \omega_i \alpha_j$, selecting $R_{i,j} = 0$ avoids decreasing $S_i$'s utility. So $U^s_i$ is maximized when $S_i$ determines the value of $R_{i,j}$ according to Eq. (8), which is $S_i$'s optimal strategy. Note that Eq. (7) guarantees that the denominator in Eq. (9) is non-zero. Every user can obtain its optimal strategy by this mechanism; thus, a Nash Equilibrium is reached when all users adopt their optimal strategies. □
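As a quick check of Theorem 1, the sketch below (ours, with toy numbers; not part of the paper) applies the threshold rule of Eq. (8) to a user population and reports the resulting click counts.

```python
import numpy as np

def best_response_clicks(omega, alpha, M):
    """Eq. (8): user i clicks ad j iff omega_i <= M_j / alpha_j."""
    return (omega[:, None] <= (M / alpha)[None, :]).astype(int)

rng = np.random.default_rng(1)
omega = rng.normal(1.0, 0.01, size=1000)    # privacy sensitivities
alpha = np.array([0.2, 0.8])                # a mild ad and a sensitive ad
M = np.array([0.25, 0.80])                  # hypothetical compensation budgets
# thresholds M/alpha = [1.25, 1.0]: everyone clicks ad 1, roughly half click ad 2
R = best_response_clicks(omega, alpha, M)
print(R.sum(axis=0))                        # n_j = sum_i R_{i,j} for each ad
```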

4.1.2. Optimal compensation strategy of ad broker

When the ad broker decides its strategy $\{M_j : j = 1, \ldots, K\}$, users will make their decisions according to Theorem 1. Accordingly, the ad broker can determine its optimal strategy. As the leader of the user click decision subgame, the ad broker is aware of the impact of the total compensation amount on users' decisions. Taking into account users' possible responses, the ad broker can obtain its optimal strategy to maximize its profit.

Proposition 1. The ad broker has a unique optimal strategy that maximizes its utility.

Proof. Based on Eq. (2), we have

$$U^b = \sum_j (P_j n_j - M_j), \quad (10)$$

where $n_j = \sum_i R_{i,j} = \left|\{i : \omega_i \leq M_j/\alpha_j,\ i = 1, 2, \ldots, N\}\right|$. Assume $\{\omega_1, \omega_2, \ldots, \omega_N\}$ follow a certain distribution whose probability density function is $f(\omega)$. As $N$ is very large (there are many end users in the system), $n_j$ can be calculated as

$$n_j = N \int_0^{M_j/\alpha_j} f(\omega)\, d\omega. \quad (11)$$

So we have

$$U^b = \sum_j \left( P_j N \int_0^{M_j/\alpha_j} f(\omega)\, d\omega - M_j \right), \quad (12)$$

$$\frac{dU^b}{dM_j} = \frac{P_j N}{\alpha_j} f\!\left(\frac{M_j}{\alpha_j}\right) - 1, \quad (13)$$

$$\frac{d^2 U^b}{dM_j^2} = \frac{P_j N}{\alpha_j^2} f'\!\left(\frac{M_j}{\alpha_j}\right). \quad (14)$$

As $N$ is very large, there exist two points where $f(\omega) = \frac{\alpha_j}{P_j N}$. Only the larger of the two has a negative derivative of $f$, so there exists a unique set $\{M_j : j = 1, \ldots, K\}$ that makes $\frac{dU^b}{dM_j} = 0$ and $\frac{d^2 U^b}{dM_j^2} < 0$. To sum up, the ad broker maximizes its utility by the following unique strategy:

$$M^*_j = \alpha_j f^{-1}\!\left(\frac{\alpha_j}{P_j N}\right), \quad (15)$$

where we select the larger of the two values of $f^{-1}\!\left(\frac{\alpha_j}{P_j N}\right)$. □

To obtain a closed form of the ad broker's optimal strategy $M^*_j$, we assume that users' privacy sensitivities follow the Gaussian distribution $\mathcal{N}(\mu, \sigma^2)$, which is commonly used to model real-valued random variables. In accordance with the three-sigma rule, the probability that a variable lies within $[\mu - 3\sigma, \mu + 3\sigma]$ is 99.74%, and thus values outside this interval can be neglected. Hence, we can use the Gaussian distribution $\mathcal{N}(\mu, \sigma^2)$ to model users' privacy sensitivities with the constraint $\mu > 3\sigma$. In this circumstance, the ad broker's optimal strategy is expressed as

$$M^*_j = \alpha_j \left( \mu + \sqrt{2\sigma^2 \ln \frac{P_j N}{\alpha_j \sqrt{2\pi\sigma^2}}} \right). \quad (16)$$
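Eq. (16) can be evaluated directly. The following sketch (our own illustration; the parameter values are hypothetical) computes $M^*_j$ and the induced expected number of clicks from Eq. (11), using the Gaussian CDF.

```python
import math

def optimal_compensation(alpha_j, P_j, N, mu, sigma):
    """Eq. (16): M*_j = alpha_j * (mu + sqrt(2 sigma^2 ln(P_j N / (alpha_j sqrt(2 pi sigma^2)))))."""
    arg = P_j * N / (alpha_j * math.sqrt(2 * math.pi * sigma ** 2))
    return alpha_j * (mu + math.sqrt(2 * sigma ** 2 * math.log(arg)))

def expected_clicks(M_j, alpha_j, N, mu, sigma):
    """Eq. (11): n_j = N * integral_0^{M_j/alpha_j} f(omega) d omega for Gaussian f;
    the contribution below 0 is negligible because mu > 3*sigma."""
    z = (M_j / alpha_j - mu) / sigma
    return N * 0.5 * (1 + math.erf(z / math.sqrt(2)))

# illustrative numbers (ours, not from the paper)
M_star = optimal_compensation(alpha_j=0.5, P_j=1e-4, N=1_000_000, mu=1.0, sigma=0.01)
print(M_star, expected_clicks(M_star, 0.5, 1_000_000, 1.0, 0.01))
```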

4.1.3. Optimal ad pricing strategy of advertiser

In the targeted advertising subgame, advertisers determine their optimal strategies based on the ad broker's compensation strategy.

Proposition 2. There exist optimal strategies for advertisers, and when users' privacy sensitivities follow a Gaussian distribution, the optimal strategy of every advertiser is unique.

Proof. From Eqs. (1) and (11), we have

$$U^a_j = (Q_j - P_j)\, N \int_0^{M^*_j/\alpha_j} f(\omega)\, d\omega. \quad (17)$$

It can be seen that $U^a_j(P_j)$ is a continuous function of $P_j$ and is bounded above. So there exists a $P^*_j$, which is the optimal strategy that maximizes $U^a_j$.

We further analyze the circumstance where $\{\omega_i : i = 1, \ldots, N\}$ follow the Gaussian distribution $\mathcal{N}(\mu, \sigma^2)$. To simplify the analysis, it is equivalent to consider the function $\ln U^a_j$. We have

$$\frac{d(\ln U^a_j)}{dP_j} = -\frac{1}{Q_j - P_j} + \frac{f\!\left(\frac{M^*_j}{\alpha_j}\right)}{\int_0^{M^*_j/\alpha_j} f(\omega)\, d\omega} \cdot \frac{d\!\left(\frac{M^*_j}{\alpha_j}\right)}{dP_j}, \quad (18)$$

where $P_j \in (0, Q_j)$. Together with Eq. (16), we obtain

$$\frac{d(\ln U^a_j)}{dP_j} = -\frac{1}{Q_j - P_j} + \frac{\alpha_j \sigma^2}{N P_j^2 \sqrt{2\sigma^2 \ln \frac{N P_j}{\alpha_j \sqrt{2\pi\sigma^2}}}\ \int_0^{M^*_j/\alpha_j} f(\omega)\, d\omega}. \quad (19)$$

It is obvious that $\frac{d(\ln U^a_j)}{dP_j}$ is a decreasing function of $P_j$, that is, $\frac{d^2(\ln U^a_j)}{dP_j^2} < 0$, and the following two limits

$$\lim_{P_j \to 0^+} \frac{d(\ln U^a_j)}{dP_j} = +\infty, \quad (20)$$

$$\lim_{P_j \to Q_j^-} \frac{d(\ln U^a_j)}{dP_j} = -\infty, \quad (21)$$

hold. So there is one unique $P^*_j$ where $\frac{d(\ln U^a_j)}{dP_j} = 0$, and this $P^*_j$ is the unique optimal strategy for advertiser $A_j$. □
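Since the proof establishes that $\ln U^a_j$ has a single stationary point on $(0, Q_j)$, a simple one-dimensional search recovers $P^*_j$ numerically. The sketch below is our own illustration (parameter values are hypothetical), not the authors' procedure; it maximizes Eq. (17) over a log-spaced price grid.

```python
import math

def advertiser_utility(P_j, Q_j, alpha_j, N, mu, sigma):
    """Eq. (17): U^a_j = (Q_j - P_j) * n_j, with n_j induced by Eqs. (11) and (16)."""
    M_star = alpha_j * (mu + math.sqrt(
        2 * sigma**2 * math.log(P_j * N / (alpha_j * math.sqrt(2 * math.pi * sigma**2)))))
    z = (M_star / alpha_j - mu) / sigma
    n_j = N * 0.5 * (1 + math.erf(z / math.sqrt(2)))   # lower limit 0 neglected (mu > 3*sigma)
    return (Q_j - P_j) * n_j

def best_price(Q_j, alpha_j, N, mu, sigma, points=2000):
    """Log-spaced grid search over (0, Q_j); Proposition 2 says the maximizer is unique."""
    best_P, best_u = None, -math.inf
    for k in range(points):
        P_j = Q_j * 10 ** (-8 + 8 * k / points)        # from 1e-8 * Q_j up to Q_j
        try:
            u = advertiser_utility(P_j, Q_j, alpha_j, N, mu, sigma)
        except ValueError:      # log argument <= 0: price too low to support any compensation
            continue
        if u > best_u:
            best_P, best_u = P_j, u
    return best_P

print(best_price(Q_j=1.0, alpha_j=0.5, N=1_000_000, mu=1.0, sigma=0.01))
```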

4.2. Targeted advertising for market sharing advertisers

In the model analyzed above, we assume that for every click advertiser $A_j$ gains a constant revenue of $Q_j$, which only holds for independent advertisers. For competing advertisers who send ads about similar products or services, the whole market size is limited and shared by those advertisers. Therefore, in this subsection, we discuss a more realistic model, in which we assume the whole market size is $Q$. The market share of advertiser $A_j$ is positively related to the proportion of clicks on its ad to the total clicks on all similar ads, that is, $\frac{n_j}{\sum_q n_q}$. So the utility of $A_j$ is

$$U^a_j = Q \frac{n_j}{\sum_q n_q} - P_j n_j. \quad (22)$$

As advertisers are aware of the ad broker's reaction to their strategies, they can decide their optimal strategies accordingly.

Proposition 3. There exists a unique optimal strategy for an advertiser when users' privacy sensitivities follow a Gaussian distribution and the other advertisers' strategies are fixed.

Proof. When $\{\omega_i : i = 1, \ldots, N\}$ follow the Gaussian distribution $\mathcal{N}(\mu, \sigma^2)$, we have

$$\frac{dU^a_j}{dP_j} = \left( Q \frac{\sum_{q \neq j} n_q}{\left(\sum_q n_q\right)^2} - P_j \right) \frac{N \sigma f\!\left(\frac{M_j}{\alpha_j}\right)}{P_j \sqrt{2 \ln \frac{P_j N}{\alpha_j \sqrt{2\pi\sigma^2}}}} - n_j. \quad (23)$$

It can easily be seen that $\frac{dU^a_j}{dP_j}$ is a decreasing function of $P_j$, that is, $\frac{d^2 U^a_j}{dP_j^2} < 0$, and the following two limits

$$\lim_{P_j \to 0^+} \frac{dU^a_j}{dP_j} = +\infty, \quad (24)$$

$$\lim_{P_j \to +\infty} \frac{dU^a_j}{dP_j} < 0, \quad (25)$$

hold. So there is one unique $P^*_j$ where $\frac{dU^a_j}{dP_j} = 0$, and this $P^*_j$ is the unique optimal strategy for advertiser $A_j$ when other advertisers' strategies are fixed. □

Proposition 3 shows that, given the other players' strategies $P_{-j}$, advertiser $A_j$ has one unique optimal strategy $P^*_j$. As this is a non-cooperative game among advertisers, we next prove that in some situations a Nash Equilibrium exists for this game.

Theorem 2. When the total number of users $N$ is large, there exists a strategy profile $P^* = (P^*_1, P^*_2, \ldots, P^*_K)$ for which the following condition holds:

$$\forall j, P_j : U^a_j(P^*_j, P^*_{-j}) > U^a_j(P_j, P^*_{-j}). \quad (26)$$

Proof. The Gaussian error function $\mathrm{erf}(x)$ can be approximated with elementary functions as

$$\mathrm{erf}(x) \approx 1 - \frac{1}{(1 + a_1 x + a_2 x^2 + \cdots + a_6 x^6)^{16}}, \quad (27)$$

where the $\{a_i\}$ are approximation coefficients [19]. Then, we have

$$n_j = N \int_0^{M_j/\alpha_j} f(\omega)\, d\omega = \frac{N}{2}\left[ \mathrm{erf}\!\left( \sqrt{\ln \frac{P_j N}{\alpha_j \sqrt{2\pi\sigma^2}}} \right) + \mathrm{erf}\!\left( \frac{\mu}{\sqrt{2\sigma^2}} \right) \right] = \frac{N}{2}\left[ 2 - \frac{1}{(1 + a_1 x + \cdots + a_6 x^6)^{16}} - \frac{1}{(1 + a_1 y + \cdots + a_6 y^6)^{16}} \right],$$

where $x = \sqrt{\ln \frac{P_j N}{\alpha_j \sqrt{2\pi\sigma^2}}}$ and $y = \frac{\mu}{\sqrt{2\sigma^2}}$. As the total number of users $N$ is large, $x > 1$. Note that we assume $\mu > 3\sigma$, so that $y > \frac{3}{\sqrt{2}}$. Thus $n_j \to N$. The derivative of $n_j$ with regard to $P_j$ is

$$\frac{dn_j}{dP_j} = 8N\, \frac{a_1 + 2a_2 x + 3a_3 x^2 + \cdots + 6a_6 x^5}{(1 + a_1 x + a_2 x^2 + \cdots + a_6 x^6)^{17}}, \quad (28)$$

and we have

$$\frac{dn_j}{n_j} = \frac{8\,(a_1 + 2a_2 x + 3a_3 x^2 + \cdots + 6a_6 x^5)}{(1 + a_1 x + a_2 x^2 + \cdots + a_6 x^6)^{17}}\, dP_j. \quad (29)$$

As $x > 1$, $\frac{dn_j}{n_j} \to 0$. So when advertiser $A_j$ changes its strategy, the change in the number of users who click the ad can be ignored, which means the optimal strategy of every advertiser does not change when other advertisers change their strategies. When all advertisers adopt their unique optimal strategies, their utilities are optimized, that is,

$$\exists\, P^* = (P^*_1, P^*_2, \ldots, P^*_K) \ \text{s.t.}\ \forall j, P_j : U^a_j(P^*_j, P^*_{-j}) > U^a_j(P_j, P^*_{-j}). \quad \square$$

From the proof of the above theorem, we can easily derive the following proposition.

Proposition 4. The optimal strategy of every advertiser does not change when other advertisers change their strategies.
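Proposition 4 suggests that iterated best responses among competing advertisers should settle almost immediately. The following sketch is our own illustration (the market size Q, privacy factors, and candidate price grid are hypothetical): it runs best-response dynamics on the market-sharing utility of Eq. (22), with the broker and users responding as in Eqs. (16) and (8).

```python
import math

def clicks(P_j, alpha_j, N, mu, sigma):
    """n_j from Eqs. (11) and (16): broker best-responds with M*_j, users follow Theorem 1."""
    M_star = alpha_j * (mu + math.sqrt(
        2 * sigma**2 * math.log(P_j * N / (alpha_j * math.sqrt(2 * math.pi * sigma**2)))))
    z = (M_star / alpha_j - mu) / sigma
    return N * 0.5 * (1 + math.erf(z / math.sqrt(2)))

def share_utility(j, prices, Q, alpha, N, mu, sigma):
    """Eq. (22): U^a_j = Q * n_j / sum_q n_q - P_j * n_j."""
    n = [clicks(p, a, N, mu, sigma) for p, a in zip(prices, alpha)]
    return Q * n[j] / sum(n) - prices[j] * n[j]

def best_response_dynamics(prices, candidates, Q, alpha, N, mu, sigma, rounds=3):
    """Each advertiser in turn picks the candidate price maximizing its own utility,
    holding the others fixed (non-cooperative play)."""
    prices = list(prices)
    for _ in range(rounds):
        for j in range(len(prices)):
            def u(p):
                trial = prices[:j] + [p] + prices[j + 1:]
                try:
                    return share_utility(j, trial, Q, alpha, N, mu, sigma)
                except ValueError:          # price too low for the compensation formula
                    return -math.inf
            prices[j] = max(candidates, key=u)
    return prices

cands = [10 ** (-8 + 6 * k / 500) for k in range(500)]   # log-spaced candidate prices
print(best_response_dynamics([1e-5, 1e-5], cands, Q=1.0,
                             alpha=[1.0, 1.0], N=1_000_000, mu=1.0, sigma=0.01))
```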

4.3. The adoption of the compensation framework

The previous analyses have focused on the compensation framework under the advertiser–broker–user architecture, while our framework can be easily extended to the advertiser–user architecture, where advertisers directly deliver ads to users without the assistance of an ad broker. In this architecture, instead of paying fees to the ad broker, advertisers directly compensate users for their privacy loss. To this end, we can simply add an extra constraint $\sum_j n_j P_j = \sum_j M_j$ to our formulation. As such, all the fees paid by the advertisers, $\sum_j n_j P_j$, become the privacy loss compensation $\sum_j M_j$ paid to users.

The compensation framework motivates the users, advertisers and the ad broker to participate in the targeted advertising system via economic incentives. However, there may exist malicious entities whose goal is not to maximize their utilities but to violate the rules defined by the compensation framework. To cope with such cases, governments have made efforts to regulate the behaviors of entities in online advertising. The White House has proposed a Consumer Privacy Bill of Rights as part of a strategy to improve consumers' privacy protection, and has made an agreement with the Digital Advertising Alliance, including Google, Yahoo!, Microsoft, etc., to regulate their advertising behaviors [20]. The Advertising Regulation Department of the Financial Industry Regulatory Authority (FINRA) reviews broker-dealers' advertisements and other communications with the public to ensure that they are fair, balanced and not misleading [21]. As such, regulation from governments can ensure that all parties follow the rules defined in the proposed compensation framework.

5. Numerical results

In this section, we conduct extensive simulations to demonstrate the performance gain of the proposed compensation framework and the impact of the system parameters on the performance.

5.1. Simulation setup

We consider a system with $10^7$ users, 50 advertisers and an ad broker, where users are randomly allocated to 10 different preference profiles. A user is interested in an ad if the ad matches the user's profile [9]. Ads are delivered over multiple time slots. In each time slot, an activity level, i.e., the fraction of users that are active to view ads, is randomly chosen in $[0, 1]$. Unless explicitly stated otherwise, the privacy factor $\alpha$ is randomly distributed in $(0, 1)$, and the revenue of each click is $Q = 1$. The privacy factor $\alpha$ indicates the relative content sensitivity of an ad: an ad with larger $\alpha$ contains more sensitive content, and a user clicking it leaks more private information; an ad with $\alpha = 1$ contains the most sensitive content, while an ad with $\alpha = 0$ contains no sensitive information. Users' privacy sensitivities $\omega$ follow the Gaussian distribution $\mathcal{N}(\mu, \sigma^2)$ with $\mu = 1$ and $\sigma = 0.01$. Recall that a user's privacy sensitivity $\omega$ is defined as the equivalent amount of compensation desired by the user for every unit of privacy loss. A user with larger $\omega$ values its private information more, and thus needs stronger incentives to click sensitive ads.
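A scaled-down sketch of this setup (our own illustration, not the authors' simulation code; the PTC baseline and the advertisers' pricing game are omitted, and all names and the compensation levels used below are ours) could look as follows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Scaled down from the paper's 10^7 users for a quick run.
N_USERS, N_ADS, N_PROFILES = 100_000, 50, 10
MU, SIGMA, Q = 1.0, 0.01, 1.0

profiles = rng.integers(0, N_PROFILES, size=N_USERS)       # random preference profiles
omega = rng.normal(MU, SIGMA, size=N_USERS)                 # privacy sensitivities
alpha = rng.uniform(0.0, 1.0, size=N_ADS)                   # privacy factors in (0, 1)
ad_profile = rng.integers(0, N_PROFILES, size=N_ADS)        # which profile each ad targets

def simulate_slot(M):
    """One time slot: a random fraction of users is active, and interested active
    users apply the Theorem 1 rule (click iff omega_i <= M_j / alpha_j)."""
    active = rng.random(N_USERS) < rng.random()              # random activity level in [0, 1]
    clicks = np.zeros(N_ADS, dtype=int)
    for j in range(N_ADS):
        interested = active & (profiles == ad_profile[j])
        clicks[j] = np.count_nonzero(omega[interested] <= M[j] / alpha[j])
    return clicks

# Hypothetical compensation levels just to exercise the loop; in the full game
# M comes from the broker's optimal strategy in Eq. (16).
print(simulate_slot(M=alpha * (MU + SIGMA)))
```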

To evaluate the performance gain of privacy-aware compensation, we compare our framework with the traditional paid-to-click (PTC) model, which is a popular online business model that motivates users to view ads. In the traditional PTC model, a certain price is paid to users for every click, while the diversities of ad content and user sensitivity are not considered. For a fair comparison, we enhance the PTC model by implementing it in our privacy-aware setting. In particular, the enhanced PTC model employs a privacy-dependent pricing scheme where the click price is determined optimally according to the user's privacy sensitivity.

5.2. Results

We first evaluate the overall performance of our framework by comparing it with the PTC model in Section 5.2.1. Then, we evaluate our framework under various settings of the privacy-related parameters, i.e., the privacy factor $\alpha$ and users' privacy sensitivity $\omega$, in Section 5.2.2. Finally, we study the impact of competition among advertisers in Section 5.2.3.

Fig. 4. Revenue of advertisers versus average privacy sensitivity.

Fig. 5. Revenue of the ad broker versus average privacy sensitivity.

5.2.1. Comparison of different frameworks

We first show the overall merits of the proposed compensation framework by comparing it with the traditional PTC framework. To show that our framework can incentivize different kinds of users, that is, users with different sensitivities, we vary the average privacy sensitivity $\mu$. In particular, we compare the number of ad clicks, the revenue of advertisers, and the revenue of the ad broker in Figs. 3–5, respectively.

Comparing the number of ad clicks achieved by different frameworks. Fig. 3 compares the number of ad clicks for users with different levels of privacy concern. The number of ad clicks, which determines the click-through rate, is considered an essential indicator of the efficiency of advertising. In all cases demonstrated, the number of ad clicks achieved by the compensation framework is larger than that in the PTC framework, which means that our framework outperforms the PTC in terms of advertising efficiency. The number of ad clicks decreases with average privacy sensitivity under the PTC framework, but remains almost unchanged under the compensation framework. This is because the PTC framework pays users based on a unified pricing model, while the compensation policy considers the privacy loss of different users and is tailored to different ads, which can motivate more users.

Comparing the revenue of advertisers achieved by different frameworks. Fig. 4 shows that when the average privacy sensitivity increases, the revenue of advertisers drops significantly under the PTC framework, while the advertisers under the compensation framework still maintain their revenue at a high level, which implies that the proposed compensation framework successfully incentivizes highly sensitive users and brings more revenue to advertisers. The trends of advertisers' revenue are consistent with the trends of the number of ad clicks depicted in Fig. 3, as the revenue of advertisers is largely determined by the number of ad clicks. It is indicated in Eqs. (11) and (15) that when the shape of the distribution $f(\omega)$ does not change, the relationship between $n_j$ and $P_j$ stays the same. Advertiser $A_j$'s revenue is only affected by $Q_j$, $n_j$ and $P_j$, where $Q_j$ is a constant, so its strategy remains the same as $\mu$ increases, which in turn keeps $n_j$ and the advertiser's profit constant. The results of Fig. 4 further validate that the compensation framework brings more profit to advertisers.

Fig. 3. Number of ad clicks versus average privacy sensitivity.

Comparing the revenue of the ad broker achieved by different frameworks. Fig. 5 illustrates the revenue of the ad broker at different levels of average privacy sensitivity. In all cases demonstrated, the compensation framework brings considerably more revenue to the ad broker compared to the PTC framework, which indicates that the compensation framework can better motivate the ad broker to participate in the advertising system. With higher average privacy sensitivity, the performance gain of the compensation framework is larger. The reason is that in our framework, the amount of compensation is tailored to each click according to the ad content and the user's sensitivity, and leads to more ad clicks, which makes advertisers pay more to the ad broker. An interesting observation is that the revenue of the ad broker decreases under both frameworks with higher privacy sensitivity. This is because the price to compensate users' privacy loss is higher for highly sensitive users. As such, a larger portion of the ad broker's income is used to motivate users.

5.2.2. Impact of privacy parameters

In this subsection, we evaluate the proposed compensation framework under various privacy settings. In particular, we show how the privacy factor of an ad and users' privacy sensitivity affect the revenues of advertisers and the ad broker. In Figs. 6–9, we vary the privacy factor $\alpha$ of each ad from 0.1 to 1, and select different values of the mean and standard deviation of users' privacy sensitivity.

The number of clicks under various privacy settings. Fig. 6 reports the number of clicks achieved by the compensation framework with different values of the privacy factor $\alpha$ and different distributions of privacy sensitivity. The privacy factor of an ad determines its privacy level and the privacy loss of a click, and the compensation amount is adjusted according to each ad's privacy factor to stimulate users. It can be seen that the number of ad clicks stays unchanged across different values of the privacy factor, which demonstrates the effectiveness of the stimulation for different kinds of ads. In Fig. 6(a), we observe that the value of $\mu$ has little impact on the number of clicks, which is consistent with the results shown in Fig. 3. Fig. 6(b) shows that when the standard deviation of users' privacy sensitivity varies from 0.1 to 1, the number of ad clicks declines. When the standard deviation of users' privacy sensitivity is large, it costs more for the advertisers and the ad broker to incentivize users, as analyzed in Section 4. As such, less compensation is allocated to users, resulting in fewer ad clicks. This observation reveals that for users with very diverse privacy sensitivities, it is more effective to categorize users according to their privacy sensitivity and employ a dedicated strategy for each category.

Fig. 6. Number of ad clicks: (a) varying the mean of users' privacy sensitivities; (b) varying the standard deviation of users' privacy sensitivities.

The amount of compensation under various privacy settings. The amount of compensation is an important factor that affects the revenue of the ad broker. We study the amount of compensation in Fig. 7. The amount of compensation grows with the privacy factor, as the ad broker needs to pay more to stimulate users to click ads with a higher privacy factor. For larger $\mu$ and $\sigma$, the average privacy loss for users with privacy sensitivity higher than $\mu$ is larger, which calls for a larger amount of compensation to motivate users.

The revenues under various privacy settings. Fig. 8 depicts the revenue of advertisers against the privacy factor and privacy sensitivity. The results are consistent with Fig. 6, as the number of clicks largely determines the revenue of advertisers. An interesting observation in Fig. 8(b) is that the revenue of advertisers diminishes as the standard deviation of privacy sensitivity increases. With a larger standard deviation of privacy sensitivity, advertisers have the incentive to increase their offers to encourage the ad broker to pay more to users. As analyzed in Section 4, the optimal point is on the right half of the probability distribution function, which means that advertisers and the ad broker are more concerned about the 50% of users whose privacy sensitivities are higher than average, as the other half are always stimulated to click. With the increase in the standard deviation, the privacy sensitivity level of this "concerned group" rises, so both advertisers and the ad broker raise their offers.

Fig. 9 shows the impact of the privacy factor and the privacy sensitivity on the revenue of the ad broker. A counter-intuitive observation is that the revenue of the ad broker grows with the privacy factor. This is because the payment from advertisers dominates the revenue of the ad broker, and for ads with a higher privacy factor, advertisers tend to pay more to motivate the ad broker to allocate more compensation to users. Combining the results in Figs. 7 and 9, we find that although larger $\mu$ and $\sigma$ both lead to more compensation, the revenue of the ad broker diverges in the two cases: it declines with larger $\mu$, but increases with larger $\sigma$. The reason is that as $\mu$ increases, advertisers maintain a constant profit, as shown in Fig. 8(a), and it is the ad broker who pays for the users' higher privacy loss; while as $\sigma$ grows, users with more diverse privacy sensitivities make advertisers pay more to the ad broker to indirectly motivate users, as discussed earlier.

Fig. 7. Amount of compensation: (a) varying the mean of users' privacy sensitivities; (b) varying the standard deviation of users' privacy sensitivities.

Fig. 8. Revenue of advertisers: (a) varying the mean of users' privacy sensitivities; (b) varying the standard deviation of users' privacy sensitivities.

5.2.3. Impact of competition among advertisers

Note that the above evaluations focus on independent advertisers, e.g., advertisers for unrelated content. Next we discuss how the competition among advertisers affects their decisions and revenues. In the competition model, advertisers deliver similar ads and share one market. In this subsection, we present simulation results for the scenario where two similar advertisers share one market, which can be easily visualized.

First, we study the case where both advertisers deliver ads of the same privacy factor, which is set to 1. Fig. 10 shows the impact of both advertisers' strategies, i.e., the price of every click, on advertiser 1's revenue. The results show that advertiser 2's price has little impact on advertiser 1's optimal price, which verifies our theoretical results in Proposition 4.

Then, we study the case where advertisers deliver ads of very different privacy factors. We set the privacy factor of ad 2 to 30, while ad 1's remains at 1. Recall that the privacy factor of an ad implies its content sensitivity; as such, ad 2 is much more sensitive than ad 1. Fig. 11 shows that although the revenue of advertiser 1 no longer stays constant when the price of advertiser 2 changes, as it does in Fig. 10, the optimal price of advertiser 1 remains constant, which demonstrates that the Nash Equilibrium can also be achieved when different ads have different privacy factors. Another observation from Fig. 11 is that the revenue of advertiser 1 decreases as advertiser 2's price increases. This is because the advertisers are competing with each other: advertiser 1 loses users when advertiser 2 offers a higher price.

Fig. 9. Revenue of the ad broker: (a) varying the mean of users' privacy sensitivities; (b) varying the standard deviation of users' privacy sensitivities.

Fig. 10. Revenue of advertisers with the same privacy factor. Privacy factors for both advertisers' ads are 1.

Fig. 11. Revenue of advertisers with very different privacy factors. Privacy factors for advertiser 1's ads and advertiser 2's ads are 1 and 30, respectively.

6. Related work

Existing studies related to privacy-aware targeted advertising can be classified into three categories: targeted advertising mechanisms, privacy preservation for users' online profiles or behaviors, and incentive mechanism designs for advertising and trading privacy.

Targeted Advertising Mechanisms. Targeted advertising mechanisms mainly focus on the strategies of advertisers or the ad broker to deliver ads to matched users. Chakrabarti et al. [22] study the problem of displaying relevant ads on web pages and propose a new class of models to combine relevance and click feedback for contextual advertising. Mechanisms for search engines to match ads with queries are discussed in [23–25]. These studies model the interactions between the ad broker and advertisers as an auction problem, in which advertisers bid for ads displayed on the search results pages. Provost et al. [26] study the problem of targeted advertising in the context of social networks, and introduce a framework to select users based on page visitations. These works have focused on the strategies of advertisers and the ad broker, while our framework takes into account the impact of user behaviors on targeted advertising.

Privacy Preservation. Many recent works have investigated privacy issues in users' online profiles and behaviors, which are involved in targeted advertising. Privad [8] keeps the users' profiles on their local devices, and introduces an anonymizer sitting between users and the ad broker to anonymize users' clicks. Kodialam et al. [9] propose a targeted advertising system in which users are allowed to send perturbed click statistics to preserve privacy. Adnostic [7] offloads the ad broker's task, i.e., behavioral profiling and targeting, to the user's local browser. Privad [15] introduces an extra proxy to preserve users' privacy. These privacy-preserving systems protect users' privacy at the cost of the advertisers' or the ad broker's interests, and ignore their incentives to promote such systems. Our compensation framework is built on their basic privacy-preserving architecture, that is, keeping users' profiles on local devices, and aims to fill the gap between users' privacy protection and the economic incentives of advertisers and the ad broker. There are also many privacy-preserving techniques for users' online behaviors [27–29]. These techniques normally perturb users' actions to hide their precise behaviors, which, however, cannot be directly applied to targeted advertising, as precise clicking information is a prerequisite to determine advertisers' payments.

Incentive Mechanism Design. Several incentive mechanisms have been proposed for advertising and trading users' private data. Ning et al. [30] formulate the ad forwarding problem in delay-tolerant networks as a two-player cooperative game, and devise an incentive scheme to motivate users to forward ads. A privacy-aware mechanism was first proposed in [11] to include the valuation of privacy as a part of a player's utility. Ghosh and Roth [12] consider privacy as a commodity, and derive truthful auctions for private data trading. Riederer et al. [33] propose an auction mechanism to sell users' personal information to multiple information aggregators. Our framework is inspired by these studies. Nevertheless, they differ from the targeted advertising problem studied in this paper, where advertisers, users, and the ad broker are all decision makers whose interactions are modeled and analyzed through different stages [11,12].

7. Conclusion

This paper has studied the privacy-aware targeted advertising problem, and proposed a compensation framework to encourage users to view ads of interest. The compensation framework aims to promote targeted advertising by creating a win–win situation. Under this framework, we analyze the interactions among advertisers, the ad broker, and users through a three-stage game model. Simulation results demonstrate the efficacy of the proposed framework, under which users are motivated to click more ads of interest and both the advertisers and the ad broker achieve considerable gains in their revenues. Therefore, with proper design, the compensation framework can promote the adoption of targeted advertising and bring merits to all entities.

Acknowledgements

The research was supported in part by grants from 973 project 2013CB329006, China NSFC under Grant 61173156, RGC under the contracts CERG 622613, 16212714, HKUST6/CRF/12R, and M-HKUST609/13, as well as the grant from the Huawei-HKUST joint lab.


References

[1] J. Yan, N. Liu, G. Wang, W. Zhang, Y. Jiang, Z. Chen, How much can behavioral targeting help online advertising?, in: Proc. ACM WWW, 2009.
[2] H. Beales, The value of behavioral targeting, in: Network Advertising Initiative, 2010.
[3] Google AdWords. <http://www.google.com/ads/adwords/>.
[4] Ink TAD. <http://www.ink-global.com/what-we-do/tad/>.
[5] K. Purcell, J. Brenner, L. Rainie, Search engine use 2012, in: Pew Internet & American Life Report, 2012.
[6] J. Turow, J. King, C.J. Hoofnatle, A. Bleakley, M. Hennessy, Americans reject tailored advertising and three activities that enable it, in: SSRN, 2009.
[7] V. Toubiana, A. Narayanan, D. Boneh, H. Nissenbaum, S. Barocas, Adnostic: privacy preserving targeted advertising, in: Proc. NDSS, 2010.
[8] S. Guha, B. Cheng, P. Francis, Privad: practical privacy in online advertising, in: Proc. NSDI, 2011.
[9] M. Kodialam, T. Lakshman, S. Mukherjee, Effective ad targeting with concealed profiles, in: Proc. IEEE INFOCOM, 2012.
[10] V. Dave, S. Guha, Y. Zhang, Measuring and fingerprinting click-spam in ad networks, in: Proc. ACM SIGCOMM, 2012.
[11] K. Nissim, C. Orlandi, R. Smorodinsky, Privacy-aware mechanism design, in: Proc. ACM EC, 2012.
[12] A. Ghosh, A. Roth, Selling privacy at auction, Games Econ. Behav. (2013), Elsevier.
[13] S. Clifford, Web startups offer bargains for users data, in: The New York Times, 2010.
[14] S. Lohr, You want my personal data? Reward me for it, in: The New York Times, 2010.
[15] H. Haddadi, S. Guha, P. Francis, Not all adware is badware: towards privacy-aware advertising, in: Springer Software Services for e-Business and e-Society, 2009, pp. 161–172.
[16] S. Majumdar, D. Kulkarni, C.V. Ravishankar, Addressing click fraud in content delivery systems, in: Proc. IEEE INFOCOM, 2007.
[17] H. Haddadi, Fighting online click-fraud using bluff ads, ACM SIGCOMM Comput. Commun. Rev. 40 (2) (2010) 21–25.
[18] L. Sweeney, k-anonymity: a model for protecting privacy, Int. J. Uncertain. Fuzz. Knowl.-Based Syst. 10 (05) (2002) 557–570.
[19] M. Abramowitz, I.A. Stegun, Handbook of Mathematical Functions: With Formulas, Graphs, and Mathematical Tables, Courier Dover Publications, 2012.
[20] D. Weitzner, Obama administration calls for a consumer privacy bill of rights for the digital age.
[21] FINRA's Advertising Regulation. <http://www.finra.org/Industry/Issues/Advertising/>.
[22] D. Chakrabarti, D. Agarwal, V. Josifovski, Contextual advertising by combining relevance with click feedback, in: Proc. ACM WWW, 2008.
[23] A. Mehta, A. Saberi, U. Vazirani, V. Vazirani, Adwords and generalized online matching, J. ACM 54 (5) (2007) 1–22.
[24] N. Devanur, T. Hayes, The adwords problem: online keyword matching with budgeted bidders under random permutations, in: Proc. ACM EC, 2009.
[25] N. Buchbinder, K. Jain, J.S. Naor, Online primal-dual algorithms for maximizing ad-auctions revenue, in: Algorithms–ESA, Springer, 2007.
[26] F. Provost, B. Dalessandro, R. Hook, X. Zhang, A. Murray, Audience selection for on-line brand advertising: privacy-friendly social network targeting, in: Proc. ACM SIGKDD, 2009.
[27] F.G. Mármol, J. Girao, G.M. Pérez, TRIMS, a privacy-aware trust and reputation model for identity management systems, Comput. Netw. 54 (16) (2010) 2899–2912, Elsevier.
[28] C. Rottondi, G. Verticale, A. Capone, Privacy-preserving smart metering with multiple data consumers, Comput. Netw. 57 (7) (2013) 1699–1713, Elsevier.
[29] F. Tschorsch, B. Scheuermann, An algorithm for privacy-preserving distributed user statistics, Comput. Netw. 57 (14) (2013) 2775–2787, Elsevier.
[30] T. Ning, Z. Yang, H. Wu, Z. Han, Self-interest-driven incentives for ad dissemination in autonomous mobile social networks, in: Proc. IEEE INFOCOM, 2013.
[31] D.W. Bedford, Empirical investigation of the acceptance and intended use of mobile commerce: location, personal privacy and trust, Mississippi State University, 2005.
[32] W. Wang, Q. Zhang, Privacy-preserving collaborative spectrum sensing with multiple service providers, IEEE Trans. Wireless Commun. (2014).
[33] C. Riederer, V. Erramilli, A. Chaintreau, B. Krishnamurthy, P. Rodriguez, For sale: your data: by: you, in: Proc. ACM HotNets, 2011.

Wei Wang is currently a Ph.D. candidate at the Hong Kong University of Science and Technology. He received his B.E. degree in Electronics and Information Engineering from Huazhong University of Science and Technology, Wuhan, China, in 2010. His research interests include privacy and fault management in wireless networks.

Linlin Yang is currently an M.Phil. student at the Hong Kong University of Science and Technology. She received her B.E. degree in electronic engineering from Tsinghua University in 2012. Her research interests include privacy protection for advertising and network economics.

Yanjiao Chen received her B.E. degree in electronic engineering from Tsinghua University in 2010. She is currently a Ph.D. candidate at the Hong Kong University of Science and Technology. Her research interests include spectrum management for Femtocell networks and network economics.

Qian Zhang ([email protected]) joined the Hong Kong University of Science and Technology in September 2005, where she is a full Professor in the Department of Computer Science and Engineering. Before that, she was with Microsoft Research Asia, Beijing, from July 1999, where she was the research manager of the Wireless and Networking Group. Zhang has published about 300 refereed papers in international leading journals and key conferences in the areas of wireless/Internet multimedia networking, wireless communications and networking, wireless sensor networks, and overlay networking. She is a Fellow of IEEE for "contribution to the mobility and spectrum management of wireless networks and mobile communications". Zhang has received the MIT TR100 (MIT Technology Review) world's top young innovator award. She also received the Best Asia Pacific (AP) Young Researcher Award elected by the IEEE Communication Society in 2004. Her current research is on cognitive and cooperative networks, dynamic spectrum access and management, as well as wireless sensor networks. Zhang received the B.S., M.S., and Ph.D. degrees from Wuhan University, China, in 1994, 1996, and 1999, respectively, all in computer science.