
  • Facebook User Perception Towards its Privacy Policy And Privacy Setting

    MBA Media & Entertainment & Communication, 2nd Sem

    Submitted By:

    Vijay Gohil, Ankur Goswami, Rajdeepsinh Jadeja, Bhumesh Maheshwary, Priya Pandey, Gaurav Patel

    Submitted To: Prof. Nirav Halvadia

  • ACKNOWLEDGEMENT

    We would like to thank Almighty God for providing us strength from time to time and introducing those important individuals to us who have helped us in accomplishing this task.

    Our deepest and sincere gratitude goes to our esteemed faculty member Prof. Nirav Halvadia, CMS, Ganpat University, Ahmedabad, whose constant support and motivation has immensely helped us in reducing the hurdles and difficulties of this project report.

    And finally we would like to acknowledge our friends and colleagues for their love and support, for standing beside us whenever we have needed them, and for giving us valuable comments, motivation and criticism to help us design this project.

    Vijay Gohil, Ankur Goswami

    Rajdeepsinh Jadeja, Bhumesh Maheshwary

    Priya Pandey, Gaurav Patel

  • INDEX

    Ch. No.   Chapter Name
    1         Introduction
    2         Research objective and literature review
    3         Research methodology
    4         Data analysis
    5         Conclusion
    6         Bibliography
    7         Annexure

  • CHAPTER 1: INTRODUCTION

  • Social networking

    Sites like Facebook are rapidly gaining in popularity. At the same time, they seem to present significant privacy issues for their users. We analyze two of Facebook's more recent features, Applications and News Feed, from the perspective enabled by Helen Nissenbaum's treatment of privacy as contextual integrity. Offline, privacy is mediated by highly granular social contexts. Online contexts, including social networking sites, lack much of this granularity. These contextual gaps are at the root of many of the site's privacy issues.

    Applications, which nearly invisibly shares not just a user's, but a user's friends' information with third parties, clearly violates standard norms of information flow. News Feed is a more complex case, because it involves not just questions of privacy, but also of program interface and of the meaning of friendship online. In both cases, many of the privacy issues on Facebook are primarily design issues, which could be ameliorated by an interface that made the flows of information more transparent to users.

    A social network is a set of people or other social entities such as organizations connected by a set of socially meaningful relationships (Wellman, 1997). Social networking sites (SNS) are a type of online community that has grown tremendously in popularity over the past years. A prominent example is the social networking site Facebook.

    Facebook is now one of the biggest social networking sites. It was founded in 2004 in the USA by a former Harvard student, Mark Zuckerberg. In the beginning, it was created only for students' use, but now it is open for everyone who has a valid email address. The success and growth of Facebook has been incredible: after the first year, it already had one million users, and five years later, in February 2009, Facebook had more than 175 million active users. More than half of Facebook users are outside of college, and the fastest growing demographic is those 30 years old and older. Facebook also made a powerful entry into the Finnish market in the summer and early fall of 2007. In the spring of 2008, the Finnish Facebook network had over 399,000 users (www.facebook.com). Members of a social network connect to others by sending a Friend message or request, which usually must be accepted by the receiving party in order to establish a link. By becoming Friends, the members allow each other to access their profile information, and add each other to their corresponding social networks. However, users can always determine how visible their profile and profile information are.

  • They can restrict the viewing of their profiles from people not part of their network, or they can keep the profile open for everyone. In this study, we have mainly considered information disclosure in terms of profiles, i.e. what users reveal about themselves in their profiles, but disclosing personal information may also occur through participation in discussions, writing messages on other users' pages, walls and so on. Earlier research (see e.g. Boyd & Ellison, 2007; Dwyer et al., 2007; Lehtinen, 2007) suggests that the main motivation to use online social networking sites is to communicate and to maintain relationships. Lehtinen (2007) found that different interaction rituals are performed on an SNS for reconstructing the established social networks. Popular activities include updating personal information and whereabouts (status), sharing photos and archiving events, getting updates on activities by friends, displaying a large social network, presenting an idealized persona, sending messages privately, and posting public testimonials (Dwyer et al., 2007).

    Several studies have attempted to determine the implications of privacy concerns and awareness of privacy for users' online practices and behavior (see e.g. Dinev & Hart, 2006; Dwyer et al., 2007; Goettke & Christiana, 2007; Govani & Pashley, 2005; Gross & Acquisti, 2005). The real privacy risks are believed to arise when users disclose identifiable information about themselves online to people whom they do not know or normally (that is, offline, in real life) would not trust (see e.g. Brooks, 2007). This is assumed to stem from the users' lack of privacy concerns (Gross & Acquisti, 2005). Govani and Pashley (2005) investigated student awareness of the privacy issues and the available privacy protection provided by Facebook.

    They found that the majority of the students are indeed aware of the possible consequences of providing personally identifiable information to an entire university population (such as the risk of identity theft or stalking), but nevertheless feel comfortable enough in providing their personal information.

  • Even though they are aware of ways to limit the visibility of their personal information, they did not take any initiative to protect the information (Govani & Pashley, 2005). In another study, Tow et al. (2008) conclude that users are often simply not aware of the issues or feel that the risk to them personally is very low, and have a naive sense that online communities are safe. Social networking sites have a lot of users who have an open profile with a considerable amount of personal information (e.g. photos, contact information, current whereabouts/status, and so on).

    Do these users feel comfortable with sharing all their personal information with a large number of strangers? Or do they actually know who can access their profile information? Are they concerned about their privacy? In this study, we look at users' awareness of privacy on online social networking sites. Furthermore, we are interested in whether the awareness (or lack of it) influences users' privacy behavior. We highlight two privacy perspectives: protection and disclosure.

    The two viewpoints are analyzed and we attempt to understand what influences users to disclose or protect information on Facebook. This paper is organized as follows: We will next review earlier literature on online social networking, especially issues related to privacy and legal matters. We will then introduce our empirical study.

  • Facebook

    Users share a variety of information about themselves on their Facebook profiles, including photos, contact information, and tastes in movies and books. They list their friends, including friends at other schools. Users can also specify what courses they are taking and join a variety of groups of people with similar interests (Red Sox Nation, Northern California).

    The site is often used to obtain contact information, to match names to faces, and to browse for entertainment. [4] Facebook was founded in 2004 by Mark Zuckerberg, then a Harvard undergraduate. The site is unique among social networking sites in that it is focused around universities: Facebook is actually a collection of sites, each focused on one of 2,000 individual colleges. Users need an @college.edu email address to sign up for a particular college's account, and their privileges on the site are largely limited to browsing the profiles of students of that college.

    Over the last two years, Facebook has become a fixture at campuses nationwide, and Facebook evolved from a hobby to a full-time job for Zuckerberg and his friends. In May 2005, Facebook received $13 million in venture funding. Facebook sells targeted advertising to users of its site, and partners with firms such as Apple and JetBlue to assist in marketing their products to college students.

  • CHAPTER 2: RESEARCH OBJECTIVE AND LITERATURE REVIEW

  • Objective

    To measure people's perception regarding Facebook's privacy settings and privacy policy.

    To identify which social networking website people use more when compared on privacy settings and privacy policy.

    Facebook's privacy model

    At the time we deployed the survey, Facebook allowed users to manage the privacy settings of uploaded content (photos, videos, statuses, links and notes) using five different granularities: Only Me, Specific People, Friends Only, Friends of Friends, and Everyone. Specific People allows users to explicitly choose friends (or pre-created friend lists, discussed below) to share content with.

    The default or recommended privacy setting for all content is Everyone, meaning users share their content with all 750 million Facebook users [7] if they decline to modify their privacy settings. Facebook allows users to re-use Specific People privacy settings via friend lists. Users create a friend list, add a subset of their friends to it, name it, and can then select the list as a basis for privacy control. Friend lists are private to the user who creates them, unless the user explicitly chooses to display them as part of his profile. The granularity of privacy settings varies according to content type. Photos are grouped into albums, and privacy settings are specified at an album granularity (i.e., all photos in an album must have the same privacy setting). For the remaining content types, users can specify different privacy settings for each piece of content.
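
    To make the granularities described above concrete, here is a minimal illustrative sketch in Python of how such settings could be modeled. This is not Facebook's actual implementation; the names (PrivacyLevel, ContentItem, effective_setting) and the example data are hypothetical.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Dict, Optional, Set

# Hypothetical model of the five privacy granularities described above.
class PrivacyLevel(Enum):
    ONLY_ME = "Only Me"
    SPECIFIC_PEOPLE = "Specific People"
    FRIENDS_ONLY = "Friends Only"
    FRIENDS_OF_FRIENDS = "Friends of Friends"
    EVERYONE = "Everyone"          # default / recommended setting

@dataclass
class PrivacySetting:
    level: PrivacyLevel = PrivacyLevel.EVERYONE
    # Used only when level == SPECIFIC_PEOPLE: individual friends or a
    # reusable, privately kept friend list selected as the audience.
    specific_people: Set[str] = field(default_factory=set)

@dataclass
class ContentItem:
    owner: str
    kind: str                      # "status", "link", "note", "video", "photo"
    album: Optional[str] = None    # photos inherit their album's setting
    setting: PrivacySetting = field(default_factory=PrivacySetting)

def effective_setting(item: ContentItem,
                      album_settings: Dict[str, PrivacySetting]) -> PrivacySetting:
    """Photos are controlled at album granularity; other content per item."""
    if item.kind == "photo" and item.album is not None:
        return album_settings[item.album]
    return item.setting

# Example: a status shared with a friend list, and a photo governed by its album.
albums = {"Goa trip": PrivacySetting(PrivacyLevel.FRIENDS_ONLY)}
status = ContentItem("vijay", "status",
                     setting=PrivacySetting(PrivacyLevel.SPECIFIC_PEOPLE, {"close friends"}))
photo = ContentItem("vijay", "photo", album="Goa trip")
print(effective_setting(photo, albums).level.value)   # Friends Only
```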

  • User privacy

    Privacy is an emerging challenge in OSNs, and a number of researchers have examined different aspects of the privacy problem. Researchers have examined the privacy model of existing OSNs, demonstrating that sites often leak numerous types of private information [12, 26, 29]. A number of papers report that users have trouble with existing extensive privacy controls, and are not utilizing them to customize their accessibility [28, 30, 40]. Other work surveys users' awareness, attitudes, and privacy concerns towards profile visibility and shows that only a minority of users change the default privacy preferences on Facebook [11, 22]. However, they do not study to what extent the actual selected settings match users' preferences.

    There is also significant work that explores new approaches that can enhance content-sharing privacy on OSNs [14, 15, 18, 20, 39]. There are several closely related papers which have measured the privacy considerations of different kinds of information, such as News Feed [23], tagged photos [13], and basic profile information [32]. All of these papers demonstrate the importance of the ease of information access in alleviating users' privacy concerns. Madejski et al. [32] show that privacy settings for uploaded content are often incorrect, failing to match users' expectations.

    There are two primary distinctions between their work and ours. First, they rely on text analysis to select content that is potentially privacy-sensitive; doing so locates additional privacy violations but prevents an overall estimate of the fraction of content that has incorrect settings. Second, we directly compare the user survey results to the in-use settings, instead of relying on inferring the existing privacy settings through fake accounts.

  • Privacy policy

    As a response to the online privacy risks and threats, many website privacy policies specifically address the collection of personal information. The above-mentioned data protection laws also limit the distribution and accessibility of personally identifiable information. As discussed earlier, the privacy policy may clarify to which processing the user has given consent when he or she has uploaded personal information into the service. Therefore, the relationship between data protection law and the privacy policy is important. A privacy policy is a notice on a website providing information about the use of users' personally identifying information by the website owner (particularly personal information collected via the website).

    Privacy policies usually contain details of what personal information is collected, how the personal information may be used, to whom the personal information may be disclosed, the security measures taken to protect the personal information, and whether the website uses cookies and/or web bugs. The exact contents of a privacy policy will depend upon the applicable law. For instance, there are significant differences between the European and the US data protection laws. At the moment, there are no well-known and generally used international privacy policy standards yet. However, Google, for example, has brought up discussion about international privacy standards which work to protect everyone's privacy on the Internet.

    Privacy features are the technical implementation of privacy controls on websites. Privacy settings or tools are also generally used terms. A site should maintain standards of privacy and enable user-friendly profile control and set-up to encourage safe participation. Facebook and other SNS have been criticized for the fact that users' profiles are by default visible to as wide an audience as possible. If the users do not change their privacy settings, the information is available not only to their friends, but in the worst case also to everybody on the same networking service. Gross and Acquisti (2005) have also suggested that the service provider's own user interface might be a reason why people adjust settings so little. In any case, privacy features have no meaning if the end user does not use them.

  • The study of Gross and Acquisti (2005) shows that only a small number of Facebook members change the default privacy preferences, which are set to maximize the visibility of the users' profiles. Cranor et al. (2006) noted that despite efforts to develop usable interfaces and features, most users rarely change the default settings on many of the software packages they use. The reason why users do not change the settings can be time consumption, confusion, or users' fear of messing up their settings.

  • ANALYSIS OF PRIVACY SETTINGS

    In this section, we begin by investigating the distribution of user-selected privacy settings. We then use our user survey to compare the desired privacy settings and actual privacy settings, quantifying the frequency of discrepancies. Finally, in the following section, we examine the potential for aiding users in managing privacy by automatically grouping related users.

    Existing privacy settings

    We begin our analysis by examining the distribution of existing privacy settings. For each user who completed our survey, we collected the current privacy settings for all of their uploaded content.
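
    As an illustration of this comparison, the following sketch computes the share of content items whose in-use setting differs from the setting the owner says they want. The record layout and the field names (desired, actual) are hypothetical, not taken from the survey instrument.

```python
from collections import Counter

# Hypothetical survey records: for each piece of uploaded content, the
# privacy setting the user says they want vs. the setting currently in use.
records = [
    {"content_id": 1, "desired": "Friends Only", "actual": "Everyone"},
    {"content_id": 2, "desired": "Everyone",     "actual": "Everyone"},
    {"content_id": 3, "desired": "Only Me",      "actual": "Friends Only"},
]

mismatches = [r for r in records if r["desired"] != r["actual"]]
print(f"{len(mismatches)}/{len(records)} items "
      f"({100 * len(mismatches) / len(records):.0f}%) violate the user's stated preference")

# Distribution of in-use settings, as in the "existing privacy settings" analysis.
print(Counter(r["actual"] for r in records))
```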

    Privacy behavior

    Earlier research has shown that people have little knowledge about the real privacy risks in the online environment, and that they are unaware of the amount of personally identifiable information they have provided to an indefinite number of people (see e.g. Cranor et al., 2006; German Federal and State Data Protection Commissioners, 1997; Goettke & Christiana, 2007).

    Gross and Acquisti (2005) also suggest that users may have a relaxed attitude towards (or lack of interest in) personal privacy and a myopic evaluation of the associated privacy risks. For example, Facebook's privacy policy states that third parties can access and share certain personal information about the user (excluding contact information). Nevertheless, earlier studies have shown that users do not put effort into actually reading the online social services' privacy policies and terms of use (see e.g. Acquisti & Gross, 2006; Gross & Acquisti, 2005; Jones & Soltren, 2005).

    Cranor et al. (2006) noticed that users find learning about privacy and reading website privacy policies to be difficult and time consuming. Quite a few users are aware of privacy features and know how to use them, but they do not take the initiative to protect their information (see e.g. Acquisti & Gross, 2006; Dwyer, 2007; Govani & Pashley, 2005; Gross & Acquisti, 2005; Jones & Soltren, 2005).

    For example, Acquisti and Gross (2006) show in their study that the majority of Facebook members claim to know about ways to control the visibility and searchability of their profiles, but a significant minority (30% of students in their sample) are unaware of those tools and options. Jones and Soltren (2005) put the figures for students in their sample at 74% being familiar with the privacy features, of whom only 62% actually use the features to some degree.

    Gross and Acquisti (2005) used Signaling Theory to analyze the types and amount of information disclosed on Facebook profiles. Signaling theory, which originates from evolutionary biology, has lately been used to explain why a user shares personal information on SNSs. According to a number of studies (see e.g. Donath, 2007; Dwyer, 2007; Gross & Acquisti, 2005; Lampe et al., 2007), users feel the need to present themselves and make a good impression on their peers.

    The study of Gross and Acquisti (2005) showed that the users of Facebook (in this case students of Carnegie Mellon University) provide an astonishing amount of information, for example, their real name, photo(s), date of birth, phone number, current residence, and relationship status. Users may be pragmatically publishing personal information because the benefits they expect from public disclosure surpass its perceived costs (Gross & Acquisti, 2005, p. 80).

  • Privacy as contextual integrity

    Helen Nissenbaum's work offers an analytical framework for understanding the commonplace notion that privacy is contextual. Nissenbaum's account of privacy and information technology is based on what she takes to be two non-controversial facts. First, there are no areas of life not governed by context-specific norms of information flow. For example, it is appropriate to tell one's doctor all about one's mother's medical history, but it is most likely inappropriate to share that information with casual passersby. Second, people move into and out of a plurality of distinct contexts every day.

    Thus, as we travel from family to business to leisure, we will be traveling into and out of different norms for information sharing. As we move between spheres, we have to alter our behaviors to correspond with the norms of those spheres, and though there will always be risks that information appropriately shared in one context becomes inappropriately shared in a context with different norms, people are generally quite good at navigating this highly nuanced territory. On the basis of these facts, Nissenbaum suggests that the various norms are of two fundamental types. The first, which she calls "norms of appropriateness," deal with the type or nature of information about various individuals that, within a given context, is allowable, expected, or even demanded to be revealed (2004, 120). In her examples, it is appropriate to share medical information with doctors, but not appropriate to share religious affiliation with employers, except in very limited circumstances.

    The second set are "norms of distribution," and cover the movement, or transfer, of information from one party to another or others (2004, 122). She grounds these norms partly in work by Michael Walzer, according to which complex equality, the mark of justice, is achieved when social goods are distributed according to different standards of distribution in different spheres and the spheres are relatively autonomous (2004, 123). Information is such a social good. Thus, in friendship, confidentiality is a default rule: it may be appropriate to share the details of one's sex life with a friend, but it is not appropriate for that friend to broadcast the same information on her radio show.

  • How social networking sites change the privacy equation

    See Joinson (2008), Lampe et al. (2008, 2007, 2006) and Ellison et al. (2007). Not all social networking sites will be the same; researchers at IBM noted that individuals tended to use an intra-corporate site primarily for making new contacts, and not for maintaining old ones (DiMicco et al. 2008).

    For a similar discussion of problems of social convergence on Facebook, see boyd (2008, 18-19). For the efforts of teenagers to deal with this problem on MySpace (in particular, the porous boundaries between teen and parent contexts), see boyd (2007). See also Binder et al. (2009) (suggesting that social networking sites tend to flatten out the separateness of offline social spheres, and that privacy concerns can best be understood as one part of this problem of tension between poorly demarcated social spheres). The flattening of contexts discussed here does not extend as far as the vertiginous "context collapse" confronting those posting YouTube vlogs, the audience for which is anyone, everyone, and no one all at once (Wesch forthcoming).

    This is so for two reasons: (a) the problem for vloggers is the inability to imagine a context. Facebook users do imagine a context for their postings, even if they get that wrong. (b) Wesch suggests that YouTube vloggers are not generally supporting offline social networks. As noted above, this is in sharp contrast with the evidence about Facebook.

    The exact numbers are difficult to quantify. One earlier study estimated that half of blogs are by children and teenagers (cited in Solove 2007a, 24). Technorati (2008), on the other hand, reports that 75% of bloggers have college degrees, and 42% have been to at least some graduate school. As Solove puts it, as people chronicle the minutiae of their daily lives from childhood onward in blog entries, online conversations, photographs, and videos, they are forever altering their futures, and those of their friends, relatives, and others (2007a, 24).

    The details of the story are taken from Solove (2007a), 50-54. As Solove puts it later, even if information is already circulating orally as gossip among a few people, putting it online should still be understood as a violation of privacy, even if it is read only by people within one's social circle. Putting the information online dramatically increases the risk of exposure beyond one's social circle. Placing information on the Internet is not just an extension of water cooler gossip; it is a profoundly different kind of exposure, one that transforms gossip into a widespread and permanent stain on people's reputations.

  • CHAPTER 3: RESEARCH METHODOLOGY

  • 1. Research design

    A. Exploratory research

    Exploratory research has been used. This research was done through secondary data collection:

    1. External data
    2. Books, etc.
    3. Various websites
    or qualitative research.

    B. Conclusive research: primary data collection by questionnaire. Descriptive > cross-section > single cross-section.

    2. Sampling Definition

    A. Target population definition

    Population: All the users of Facebook
    Sampling element: User of Facebook (viewer)
    Sampling unit: User of Facebook (viewer)
    Sampling frame: Not available
    Sample size: 100
    Time: February to April, 2015
    Extent: Ahmedabad (Gujarat)

  • Sample size determination:

    B. Sampling method

    1. Probability method
    2. Non-probability method
       - Experience / convenience
       - Time
       - Place

    3. Scaling techniques

    1. Comparative scaling techniques
    2. Non-comparative scaling techniques
       - Semantic differential scale

    4. Question type

    Close-ended
    - Dichotomous questions
    - Scale

    5. Data analysis

    - As presented in the following chapter.

  • CHAPTER 4: DATA ANALYSIS

  • 1. Frequency Chart

    Hypothesis: More than 75% of people are using Facebook and less than 20% of people are not using Facebook.

  • 2. Descriptive Mean

    Descriptive Statistics

    Variable                                                       N    Minimum   Maximum    Mean     Std. Deviation
    P_read_privarcy_policy                                         76     1.00      5.00     1.6447       1.22968
    P_Facebook_Profile_private                                     76     1.00      5.00     2.1447       1.16280
    P_using_facebook_transfered_and_processed                      76     1.00      5.00     2.9474       1.08191
    (variable name missing in source)                              76     1.00      5.00     2.7368       1.18173
    P_Signed_any_applicatiion_which_use_my_facebook_account        76     1.00      5.00     2.9605       1.30067
    P_i_see_setting__see_my_freinds_list_status_posts              76     1.00      5.00     3.0789       1.33430
    P_I_used_settings_see_photos_and_videos_I_am_taged_in          76     1.00      5.00     3.1447       1.55524
    P_think_my_page_shared_by_google                               76     1.00      5.00     2.5132       1.20547
    P_delete_account_wiil_remove_my_data_from_facebook             75     1.00      5.00     3.2533       1.16356
    P_I_verfiy_the_account_fake_or_not                             76     1.00      5.00     3.3026       1.24386
    P_if_user_block_would_able_to_trace_me_through_application     76     1.00      5.00     3.5526       1.47327
    Valid N (listwise)                                             75

    Hypothesis: In this the mean value z = 3.5526 (approx.) for everything, so this hypothesis is in the accepted region. It would be accepted.
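
    For reference, descriptive statistics of this kind (N, minimum, maximum, mean, standard deviation of 5-point Likert items) can be computed with pandas as sketched below. The column names reuse the SPSS variable names from the table; the response values are hypothetical, not the study's data.

```python
import pandas as pd

# Hypothetical 5-point Likert responses (1 = strongly agree ... 5 = strongly disagree);
# the real study had 76 respondents.
responses = pd.DataFrame({
    "P_read_privarcy_policy":           [1, 2, 1, 5, 1],
    "P_Facebook_Profile_private":       [2, 3, 1, 2, 4],
    "P_think_my_page_shared_by_google": [3, 2, 2, 1, 5],
})

# N, minimum, maximum, mean and standard deviation, as in the SPSS output.
summary = responses.agg(["count", "min", "max", "mean", "std"]).T
print(summary)
```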

  • 3. Chi-Square

    Chi-Square Tests

    Test                            Value     df    Asymp. Sig. (2-sided)
    Pearson Chi-Square             52.587      9           .000
    Likelihood Ratio               42.565      9           .000
    Linear-by-Linear Association   24.963      1           .000
    N of Valid Cases                   76

    Hypothesis: a. 11 cells (68.8%) have an expected count of less than 5. The minimum expected count is .63.
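
    A chi-square test of independence like the one reported above can be reproduced with scipy as sketched below. Since the report does not include the underlying cross-tabulation, the observed counts here are hypothetical.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table (the report does not reproduce the original
# cross-tabulation): rows = awareness of privacy settings, columns = usage groups.
observed = np.array([
    [20,  5,  3,  2],
    [10, 12,  6,  4],
    [ 4,  3,  5,  2],
])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"Pearson chi-square = {chi2:.3f}, df = {dof}, p = {p:.3f}")
# SPSS's warning about low expected counts corresponds to this check:
print("cells with expected count < 5:", (expected < 5).sum(), "of", expected.size)
```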

  • 4. One-way ANOVA

    ANOVA
    (columns: Sum of Squares, df, Mean Square, F, Sig.)

    P_read_privarcy_policy
        Between Groups       .002     1     .002     .001   .974
        Within Groups     112.985    73    1.548
        Total             112.987    74
    P_Facebook_Profile_private
        Between Groups       .132     1     .132     .095   .759
        Within Groups     101.255    73    1.387
        Total             101.387    74
    P_using_facebook_transfered_and_processed
        Between Groups       .467     1     .467     .411   .524
        Within Groups      83.053    73    1.138
        Total              83.520    74
    (variable name missing in source)
        Between Groups       .041     1     .041     .030   .864
        Within Groups     101.639    73    1.392
        Total             101.680    74
    P_Signed_any_applicatiion_which_use_my_facebook_account
        Between Groups       .328     1     .328     .189   .665
        Within Groups     126.552    73    1.734
        Total             126.880    74
    P_i_see_setting__see_my_freinds_list_status_posts
        Between Groups       .636     1     .636     .353   .554
        Within Groups     131.711    73    1.804
        Total             132.347    74
    P_I_used_settings_see_photos_and_videos_I_am_taged_in
        Between Groups      2.296     1    2.296     .954   .332
        Within Groups     175.624    73    2.406
        Total             177.920    74
    P_think_my_page_shared_by_google
        Between Groups       .084     1     .084     .057   .811
        Within Groups     106.663    73    1.461
        Total             106.747    74
    P_delete_account_wiil_remove_my_data_from_facebook
        Between Groups      1.487     1    1.487    1.086   .301
        Within Groups      98.635    72    1.370
        Total             100.122    73
    P_I_verfiy_the_account_fake_or_not
        Between Groups       .024     1     .024     .016   .901
        Within Groups     110.643    73    1.516
        Total             110.667    74
    P_if_user_block_would_able_to_trace_me_through_application
        Between Groups      1.778     1    1.778     .807   .372
        Within Groups     160.809    73    2.203
        Total             162.587    74
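
    The report does not state which grouping variable the one-way ANOVA uses, so the sketch below simply illustrates the computation on two hypothetical groups of Likert responses for a single item.

```python
from scipy.stats import f_oneway

# Hypothetical responses to P_read_privarcy_policy (1-5 scale) split into two
# assumed respondent groups (e.g. by gender or by Facebook usage).
group_a = [1, 2, 1, 3, 2, 1, 5, 1]
group_b = [2, 1, 1, 2, 3, 1, 1, 4]

f_stat, p_value = f_oneway(group_a, group_b)
print(f"F = {f_stat:.3f}, Sig. = {p_value:.3f}")
# With only two groups, a one-way ANOVA is equivalent to an
# independent-samples t-test (F = t squared).
```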

  • 5. T-Test

    Independent Samples Test
    (Levene's Test for Equality of Variances: F, Sig.; t-test for Equality of Means: t, df, Sig. (2-tailed),
    Mean Difference, Std. Error Difference, 95% Confidence Interval of the Difference: Lower, Upper)

    Variable / assumption                  F       Sig.       t        df     Sig.(2-t)  Mean Diff.  Std. Err.    CI Lower    CI Upper
    U_Social_Site_Facebook
      Equal variances assumed            1.054    .307     -9.219      86       .000      -.82560     .08955     -1.00362     -.64757
      Equal variances not assumed                          -7.212     9.012     .000      -.82560     .11448     -1.08452     -.56668
    U_social_site_twiter
      Equal variances assumed             .741    .392       .605      86       .547       .10267     .16963      -.23455      .43989
      Equal variances not assumed                            .559     9.556     .589       .10267     .18371      -.30924      .51459
    U_social_linkedin
      Equal variances assumed            6.620    .012       .765      86       .447       .13502     .17660      -.21605      .48609
      Equal variances not assumed                            .767     9.931     .461       .13502     .17598      -.25746      .52751
    U_social_skype
      Equal variances assumed            7.940    .006     -1.078      86       .284      -.16737     .15532      -.47613      .14139
      Equal variances not assumed                          -1.370    11.635     .196      -.16737     .12215      -.43445      .09971
    U_social_instagram
      Equal variances assumed            6.620    .012      -.765      86       .447      -.13502     .17660      -.48609      .21605
      Equal variances not assumed                           -.767     9.931     .461      -.13502     .17598      -.52751      .25746
    U_social_orkut
      Equal variances assumed             .186    .668       .220      86       .826       .02250     .10228      -.18082      .22583
      Equal variances not assumed                            .195     9.391     .850       .02250     .11568      -.23752      .28253

    Hypothesis: In this the mean value z = 9.391 (approx.) for everything, so this hypothesis is in the accepted region. It would be accepted.
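
    The Levene and t-test columns above correspond to the computation sketched below with scipy on hypothetical data; the grouping variable is assumed, as the report does not specify it.

```python
from scipy.stats import levene, ttest_ind

# Hypothetical usage scores for one item (e.g. U_Social_Site_Facebook) split
# into two assumed respondent groups.
group_a = [1, 1, 2, 1, 1, 2, 1, 3]
group_b = [2, 3, 2, 3, 2, 2, 3, 3]

# Levene's test for equality of variances (the "F" and "Sig." columns above).
lev_stat, lev_p = levene(group_a, group_b)

# t-test assuming equal variances, and Welch's t-test when they are not assumed.
t_eq,  p_eq  = ttest_ind(group_a, group_b, equal_var=True)
t_neq, p_neq = ttest_ind(group_a, group_b, equal_var=False)

print(f"Levene: F = {lev_stat:.3f}, Sig. = {lev_p:.3f}")
print(f"Equal variances assumed:     t = {t_eq:.3f}, Sig. (2-tailed) = {p_eq:.3f}")
print(f"Equal variances not assumed: t = {t_neq:.3f}, Sig. (2-tailed) = {p_neq:.3f}")
```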

  • 6. One-sample T-test

    One-Sample Test (Test Value = 0)

    Variable                  t       df    Sig. (2-tailed)   Mean Difference   95% CI Lower   95% CI Upper
    Prefrance_Facebook      13.625    87         .000             1.84091          1.5723         2.1095
    Prefrance_Twiter        28.191    87         .000             3.28409          3.0525         3.5156
    Prefrance_Linkdin       17.516    87         .000             2.35227          2.0853         2.6192
    Prefrance_Skype         34.379    87         .000             3.94318          3.7152         4.1712
    Prefrance_Instagram     27.759    87         .000             3.931818         3.65030        4.21334
    Using_orkut             53.059    87         .000             5.67045          5.4580         5.8829

    Hypothesis: In this the mean value z = 1.84091 (approx.) for everything, so this hypothesis is in the accepted region. It would be accepted.
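
    A one-sample t-test against a test value of 0, as in the table above, can be sketched as follows; the preference scores below are hypothetical, not the study's data.

```python
from scipy.stats import ttest_1samp

# Hypothetical preference ranks for Facebook (1 = most preferred); the study
# had 88 respondents (df = 87). The SPSS output above tests against 0.
preference_facebook = [1, 2, 1, 3, 2, 1, 2, 4, 1, 2]

t_stat, p_value = ttest_1samp(preference_facebook, popmean=0)
print(f"t = {t_stat:.3f}, df = {len(preference_facebook) - 1}, "
      f"Sig. (2-tailed) = {p_value:.3f}")
```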

  • CONCLUSION

    From the research, we find that few people are aware of Facebook's privacy policy and privacy settings. The number of Facebook users is high, but privacy awareness is low.

  • BIBLIOGRAPHY

    [1] Facebook Stirs Privacy Concerns Again. The New York Times, 2010. http://gadgetwise.blogs.nytimes.com/2010/04/27/facebook-stirs-privacy-concerns-again.
    [2] Facebook Executive Answers Reader Questions. The New York Times, 2010. http://bits.blogs.nytimes.com/2010/05/11/facebook-executive-answers-reader-questions/.
    [3] Is There Life After Facebook? The New York Times, 2010. http://bits.blogs.nytimes.com/2010/05/12/is-there-life-after-facebook/
    [4] Adamic, Lada A., Buyukkotken, Orkut, and Adar, Eytan. 2002. A Social Network Caught in the Web. http://www.hpl.hp.com/research/idl/papers/social/social.pdf
    [5] Acquisti, A. and Gross, R., Imagined communities: awareness, information sharing, and privacy on the Facebook. PET 2006, Cambridge, June 28-30, 2006.
    [6] Boyd, D., Friendster and publicly articulated social networking, in Proceedings of the Conference on Human Factors and Computing Systems (CHI 2004), Vienna, Austria, April 24-29, 2004.
    [7] Boyd, D., and Ellison, N., Social network sites: Definition, history, and scholarship, Journal of Computer-Mediated Communication, 13(1), 2007.
    [8] Brooks, G. 2007. Secret society, New Media Age, 13 December, p. 10.
    [9] Clark, H.H. Using Language, Cambridge University Press, Cambridge, UK, 1996.
    [10] Cranor, L., Gudruru, P. and Arjula, M., User Interfaces for Privacy Agents, ACM Transactions on Computer-Human Interaction, Vol. 13, No. 2, June 2006, pp. 135-178.
    [11] Dinev, T. & Hart, P. 2006. Internet Privacy Concerns and Social Awareness as Determinants of Intention to Transact, International Journal of Electronic Commerce, Vol. 10, No. 2, pp. 7-29.
    [12] Donath, J., Signals in social supernets, Journal of Computer-Mediated Communication, 13(1), 2007.
    [13] Dwyer, C., Digital Relationships in the 'MySpace' Generation: Results From a Qualitative Study, in Proceedings of the 40th Hawaii International Conference on System Sciences (HICSS), Hawaii, 2007.
    [14] Dwyer, C., Hiltz, R., and Passerini, K., Trust and Privacy concern within social networking sites: A comparison of Facebook and MySpace, in Proceedings of AMCIS 2007, Keystone, CO, 2007.
    [15] German Federal and State Data Protection Commissioners, Privacy-enhancing technologies, Working Group on "privacy enhancing technologies" of the Committee on "Technical and organisational aspects of data protection" of the German Federal and State Data Protection Commissioners, 1997.
    [16] Goettke, R. and Christiana, J., Privacy and Online Social Networking Websites, http://www.eecs.harvard.edu/cs199r/fp/RichJoe.pdf, 5 Nov 2007.
    [17] Govani, T., and Pashley, H., Student Awareness of the Privacy Implications while Using Facebook. Unpublished manuscript retrieved 1 Nov 2007 from http://lorrie.cranor.org/courses/fa05/tubzhlp.pdf, 2005.
    [18] Alexa. (2010). Facebook.com site info from Alexa. March, at: http://www.alexa.com/siteinfo/facebook.com
    [19] Bailey, J. (2009). Life in the fishbowl: Feminist interrogations of webcamming. In Lessons from the identity trail: Anonymity, privacy and identity in a networked society. Oxford: OUP, pp. 283-301.
    [20] Benkler, Y. (2003). Through the looking glass: Alice and the constitutional foundations of the public domain. Law and Contemporary Problems, 66, 173-224.
    [21] Good, N. S., Grossklags, J., Mulligan, D. K., & Konstan, J. A. (2007). Noticing notice: A large-scale experiment on the timing of software license agreements. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. San Jose: ACM Press, pp. 607-616.
    [22] Grimmelman, J. (2009). Saving Facebook. Iowa Law Review, 94, 1137-1206.
    [23] Gross, R., & Acquisti, A. (2005). Information revelation and privacy in online social networks. In Proceedings of WPES'05. Alexandria, VA: ACM Press, pp. 71-80.
    [24] Hashemi, Y. (2009). Facebook's privacy policy and its third-party partnerships: Lucrativity and liability. Boston University Journal of Science and Technology Law, 15, 140-161.