
Seniors Privacy Concerns in Health

Technology Wearables

Regulations vs. User Concerns

Sofie Siggelin

Systems Sciences, bachelor's level

2017

Luleå University of Technology

Department of Computer Science, Electrical and Space Engineering


ABSTRACT

Technology is rapidly advancing and more sophisticated wearables capable of monitoring health

concerns and potential diseases are entering the market. Meanwhile, regulations are just catching up

and the new EU-wide General Data Protection Regulation (GDPR) will be implemented in May of 2018.

This thesis reviews the concerns voiced by users of wearables that collect their sensitive health data and compares them with the upcoming regulatory changes, to see whether these address the many worries of users.

The main goal of the GDPR is to bring ownership of data back to the individual and to harmonize the market in the EU, but the question is whether its focus lies on the things users actually value and whether their concerns will be eased by the new regulation.

A high-level review of the current and upcoming regulations on data collection was made, together with a review of previously identified user concerns. The study used qualitative methodology and face-to-face interviews with users affected by medical conditions, in order to identify their perception of trust in wearable technology monitoring their health status. The results were analyzed using a thematic analysis, in which three main areas of concern were discovered. These were then compared to the literature review.

The three areas of concern that were discovered are: a lack of control, where users express a clear need for ownership of their personal data; the concern that companies will abuse individuals' data for commercial purposes; and doubt about the level of trust users can place in the information they receive.

The GDPR does address several of these concerns by bringing ownership of data back to the users. By strengthening the requirement for explicit consent, demanding more transparent policies from companies and requiring security measures that protect data integrity, the GDPR introduces several steps that could safeguard users' privacy, such as restrictions on the distribution of data and "the right to be forgotten".

Future research can go deeper into the GDPR, and time will tell whether it succeeds in its aim to empower the user; it may look excellent on paper but faces several challenges in reaching its goal.

Keywords: Health Technology, Wearables, GDPR, Integrity, Privacy


TABLE OF CONTENTS

Abstract
1 Introduction
1.1 Background
1.2 Problem Discussion
1.3 Research Question
1.4 Purpose
1.5 Expected Contributions
1.6 Delimitations
2 Literature Review
2.1 Previous Research of Data Integrity & Privacy in Health Technology
2.2 The Regulations of Health Data
3 Methodology
3.1 Data Collection
3.2 Data Analysis
3.3 Evaluation of Method
4 Result & Analysis
4.1 The Respondents
4.2 Empirical Findings
4.3 Identified Concerns Compared to Previous Research
4.4 New Findings on the Concerns of Users
4.5 Identified Concerns Compared to the Upcoming GDPR
5 Discussion
5.1 Summary and Discussion of Results
5.2 Limitations and Method Evaluation
5.3 Recommendations on Further Research
6 Conclusions
7 References
8 Appendix
8.1 Interview Structure


1 INTRODUCTION

1.1 BACKGROUND

Today's technology is advancing at a rapid pace, and personal data about individuals is being used by companies and app developers to provide more tailored and attractive services for users on their chosen devices, such as smartphones. One of the most popular expansions has been advancements in health technology that gathers information about the individual in order to provide services such as recommendations and conclusions regarding the individual's health. The World Health Organization defines the term as follows: "Health technology refers to the application of organized knowledge and skills in the form of devices, medicines, vaccines, procedures and systems developed to solve a health problem and improve quality of lives" (WHO, 2017). The term therefore covers both professional devices used in hospital environments and consumer products bought and used out of the user's own personal interest.

Health Technology and the Digital Health industry are growing exponentially and attracted over $5.8B in start-up funding in 2015 (CB Insights, 2016), which strengthens the belief that a more technology-driven health care is in the market's interest. It is also a welcome advancement for physicians: an overwhelming majority of 97.5% see themselves using digital technologies for clinical trials in the near future and see great value in wearables as data collection devices (Validic, 2016). This type of data could benefit researchers and scientists in drug development and provide further insight into patients by utilizing the sensors and tracking in wearables.

While advancements in health technology for hospitals and professionals are being made, consumer applications focusing on fitness or general activity tracking, better known as Wellness Applications, have a steady following of users. A survey conducted in the US concluded that a little over half of all smartphone users use such applications (Krebs & Duncan, 2015), which illustrates the growing interest in using technology to gain a sense of control over one's health. However, with these increased capabilities of monitoring different health aspects, the two worlds of Medical Technology and consumer Wellness Applications are bound to collide into the more general term Digital Health. In short, this means that knowledge that could previously only be gained through scientific tests monitored by health professionals can now be built into consumer products marketed towards concerned or simply curious individuals.


Wearable technology is described as devices that can be incorporated into clothing or accessories, such as a smartwatch or bracelet. Most commonly they include different types of sensors that collect data and can easily connect to the internet (Sung, 2015). The possibilities of utilizing wearable devices in healthcare are endless, and today a significant amount of innovation in the wearables industry is focused on just that (Roots Analysis Private Ltd., 2016). There are great opportunities in using these devices in the healthcare industry, where proactive care can be delivered more efficiently and accurately, as wearables together with intelligent systems can track various conditions: using electrocardiogram sensors to detect arrhythmias (Heart Rhythm Society, 2015), monitoring the recovery of patients through sleep and exercise patterns (Kelly, Strecker, & Bianchi, 2012), and discovering early signs of type 2 diabetes (Robbins, 2016). An important difference from laboratory tests is that wearables can gather significantly more data, thereby tracking patterns and trends in real time. In the long run, the gathering of this data will be crucial for researchers to understand how life-style choices impact our health (Forni, 2016) and could specifically benefit the older generation of the population, as they have a higher risk of being diagnosed with these sorts of conditions (NIH, 2015; Cherney, 2016).

These advancements could, however, lead to more concerns about the privacy and integrity of the type of information that can be collected from these sorts of tracking devices. When consumers take their own initiative to use wearables that track medical data to gain knowledge about their health status, integrity needs to be accounted for. Today people most commonly worry about information such as personal bank details or about virus attacks (The Travelers Indemnity Company, 2015); it is far less common to discuss the risk of health tracking data leaking to parties working against your best interest, such as insurance companies tailoring your rates based on your health tracking, or the risk of information leakage due to insufficient security measures (Datainspektionen, 2016).

While technology is rapidly advancing, regulations are still catching up. The Personal Data Act (PUL) in Sweden was first introduced in 1998 and was meant to prevent violations of personal integrity through the processing of personal data (Datainspektionen, 1998). This was at a time when the internet was still new and very few households had access. Today, however, 96% of Swedish households have internet access and 85% of individuals use it every day (Eurostat, 2017), which has created the need for a change in regulation. To address this change across Europe, the EU has adopted a new regulation called the General Data Protection Regulation (GDPR), planned to come into effect in May of 2018, which serves the purpose of the initial data act adapted to modern standards and harmonizes market regulations across Europe (REGULATION (EU) 2016/679). The new regulation is significantly stricter, puts more pressure on companies collecting data, and strengthens users' right to integrity and ownership of their data.


1.2 PROBLEM DISCUSSION

As gathered medical data and patterns will play a big role in the future of health care (Murdoch & Detsky, 2013), it is safe to assume that knowledge only available to individuals through medical services today could move into self-powered consumer electronics such as wearables. This means that large databases with extensive amounts of sensitive medical data could soon exist in the consumer market. For example, one individual's blood sugar levels, or another's blood pressure, could be stored every hour for several years, which is unheard of today.

With the collection of this type of data, several issues with privacy and consumers' sense of trust could arise, and there are many doubts about whether the upcoming regulations will serve the users' best interest. When the Swedish integrity committee recently mapped the state of personal integrity, several red flags for potential risks were found. The most prominent were security risks related to the expanding use of the cloud, but the committee also raised concerns about data being treated as a commercial good, about private information leaving the country and therefore no longer being subject to the same regulations, and about insufficient processing and storage of sensitive data (Statens Offentliga Utredningar, 2016).

Today's popular wearable devices, such as Fitbit and Jawbone, can easily be penetrated, which leaves highly private and sensitive information vulnerable to malicious or ill-intended use (Goyal, Dragoni, & Spognardi, 2016). It is therefore crucial for companies not only to provide technical solutions to these issues, but also to build long-term trust with users. How to build trust is, however, a tricky subject, and having clear privacy policies is not a definitive answer. A survey of U.S. users showed that 74% of those asked actively chose not to read the privacy policy of a service they opted to use, but it also showed that a massive 98% missed an important hidden clause in the policy, as they only spent about a minute going through it (Obar & Oeldorf-Hirsch, 2016). This is a very troublesome statistic and raises the question of how a company can be transparent with its users in a way that makes them fully aware of their rights and information, if that is possible at all.

Swedish users are primarily concerned about a lack of safety, fraud, integrity breaches and surveillance online (IIS, 2016). As data integrity becomes more important for users, companies and regulatory bodies, such as the EU, try to keep up and put user rights into law to address the growing concern. The question, however, is whether the upcoming regulations and the subsequent changes in company policy actually address the worries of consumers, and whether their general trust in sharing their own sensitive data will be eased once the new EU-harmonized regulations are in place.


1.3 RESEARCH QUESTION

To review the concerns of users we first have to identify them. The first research question therefore focuses on just that and aims to dig deeper into individuals' perception of privacy when using advanced wearables that monitor their health.

RQ1: “What are the concerns for users when sharing personal health data gathered

from consumer wearables?”

As the thesis aims to review if the upcoming regulations will address the concerns of users, the second

research question is:

RQ2: “Could the upcoming GDPR address the concerns of users?”

1.4 PURPOSE

The purpose of this thesis is to investigate whether the upcoming regulations could address the concerns that the main user group, i.e. seniors, has about sharing the personal health and medical data that can be gathered by wearables monitoring sensitive health conditions such as, but not limited to, heart disease or diabetes.

The study aims to gain deeper knowledge of the field rather than to prove a certain statement. The study then compares the gathered insights with the most prominent focus areas of the GDPR and considers whether it could address these concerns.

As the literature review will suggest, there are plenty of studies focusing on either the user perspective on trust in technology or general reviews of the GDPR, but few focus on both and on whether they share a mutual focus. This study aims to put these factors together and narrow the gap by investigating whether users' concerns are actually addressed in the upcoming regulations, hence the subtitle: Regulations vs. User Concerns.

1.5 EXPECTED CONTRIBUTIONS

The study expects to strengthen previously discovered user concerns as well as identify new, or less prominent, ones. With a high-level review of the GDPR and the identified areas of concern, the thesis can give a general understanding of how these concerns are addressed in the regulations and whether that could make a difference in users' perception of trust. The contribution is mainly expected to be an outline of interesting focus areas for future research.


1.6 DELIMITATIONS

The research is limited to Health Technology and the health and/or medical data that can be gathered by wearables bought as consumer products, excluding research or monitoring devices distributed as part of professional research studies.

Because of the nature of the devices, the main target group of users is people of a more senior age. This will therefore also be the target group for this thesis, and older people will be interviewed. Consequently, the results of the study only reflect the perspective of seniors.

The participants are Swedish, and even though the GDPR is an EU-wide regulation, the study's conclusions could be swayed by the nationality of the subjects. The reason this could have a large effect is that trust in government is substantially higher in Sweden, where 58% tend to trust the national government, compared to the EU average of only 29% (ERCAS, 2015).

The focus of the study is to evaluate the user perspective. Company interests and legal possibilities will therefore be touched upon but not evaluated or investigated in depth. Any technical requirements or system architecture affected by the GDPR will not be taken into account, and the implications for companies violating the regulation will also be disregarded.

All reviews of regulations are limited to health data collection as regulated under the GDPR. The insights could therefore be affected by other parts of the regulations that have been disregarded or overlooked.

It is also important to note that even though the thesis touches on the legalities of the GDPR, it is not to be viewed as legal advice for companies or individuals, but merely as thoughts and perceptions from a user perspective.


2 LITERATURE REVIEW

2.1 PREVIOUS RESEARCH OF DATA INTEGRITY & PRIVACY IN HEALTH TECHNOLOGY

The advancements in wearable technology have been significant, as the previous chapter suggests, and several studies are positive towards bringing in more devices and technological help in order to improve seniors' health and autonomy. In the study "Older adults' attitudes towards and perceptions of 'smart home' technologies: a pilot study" (Demiris et al., 2004), all 15 interviewees aged over 65 were positive towards the use of sensors and other devices to enhance their lives. Some concerns that they raised were the user-friendliness of the devices and the possible need for training for older learners; these concerns are seconded by the researchers in "Aging Well with Smart Technology" (Cheek, Nikpour, & Nowlin, 2005). The fact that integrity and privacy concerns are central factors in medical technology is clear, but previous research does not suggest any differentiation in opinion based on age or gender (Ziefle, Röcker, & Holzinger, 2011). The same study found that the most prominent concerns of users were the fear of technical disturbances, the fear of data being delivered without consent, and the fear of illegal access (Ziefle, Röcker, & Holzinger, 2011).

Several studies have reviewed how to secure and provide integrity for health data in different devices and settings. One study, "Privacy and Security in Mobile Health Apps: A Review and Recommendations" (Martínez-Pérez, de la Torre-Díez, & López-Coronado, 2015), does just what its title says and reviews the current regulations in place and how to secure these types of applications. The study concludes that a change in regulations on this topic was necessary and welcomes the GDPR, but with certain concerns about the new regulation's clarity regarding health data, as it could still be interpreted as too general. The authors did, however, suggest several best practices for companies to secure health data, such as authentication, account limitations and transparent information policies, stating that "The policy should be easy to understand, concise and clear, since users are not fond of reading large legal documents in an app" (Martínez-Pérez, de la Torre-Díez, & López-Coronado, 2015, pg. 6), as well as other more technical implementations to prevent data breaches and loss.

The topic of transparency comes up in several studies as an important factor for users. It is, for example, touched on in a study by McKinsey, where one of the findings was the importance of senior leaders promoting information transparency as a cultural norm (Groves, Kayyali, Knott, & Van Kuiken, 2013). This could prevent companies from facing difficulties with users and regulators.

According to many studies, involving the users is key to giving them a sense of integrity. It is widely recommended that companies provide consumers with at least some control or input over the dissemination of their personal information (Phelps, Nowak, & Ferrell, 2000), clearly inform users of what they agree to and where they fit in (Hänsel, Wilde, Haddadi, & Alomainy, 2015), and provide easy-to-use services that are well thought out for the target audience, which increases trust as well as consumers' propensity to use the service (Song & Zahedi, 2007). All this research agrees that being transparent and providing information that users can clearly comprehend increases their willingness to share sensitive data, as it gives them a sense of control and trust in how their data is managed. Giving users back their power provides a win-win: when privacy policies are better designed and interactive, users gain trust and a sense of being respected (Aïmeur, Lawani, & Dalkir, 2016).

In the article "Towards Privacy-Aware Research and Development in Wearable Health" (De Mooy & Yuen, 2017), it is stated that the primary concerns around wearable tech relate to who could access, share and control the information gathered by the device, e.g. who could get access to the physical device and what protocols companies collecting these kinds of data follow (De Mooy & Yuen, 2017). In their research they study the users of Fitbit, a smart bracelet-type wearable capable of monitoring sleep, exercise, heart rate and more (Fitbit, n.d.), and arrive at several recommendations for companies in the field. One of these recommendations is to "Use individual expectations to guide consents" (De Mooy & Yuen, 2017, pg. 3664), which states that individuals should have the option to opt in to specific research where identifiable data is used and to research that could be outside the scope of a user's normal expectations. They also recommend honoring this participation with rewards, such as gift cards or other small benefits, which is interesting for this thesis' particular field, as it shows favor towards incentives unrelated to the specific condition.

The recommendations also include the aspects of respect and transparency, focusing both on policy, such as the importance of being able to withdraw consent and of always having innovations serve the best interest of the individual, and on technical implementations, such as applying appropriate protections for pseudonymous and anonymous data. The conclusions of this particular article are heavily aimed at transparent communication with the user and at the importance of companies applying good data practices by building an internal culture of privacy and data integrity. The authors stress the importance of committing to the ethics of being increasingly connected to individuals' personal lives in order to offer the public a trusted voice.

De Mooy and Yuen raise interesting points about the importance of communicating with the users of a product, about ethical commitment and about building trust. Even though the research presents several recommendations on company policy and valid points on integrity as well as technical implementations, it fails to provide valuable insight into how to actually communicate this effectively to users, which could easily undermine any effort that has been put in. As previously mentioned, very few users read and understand the policies of the services they agree to use (Obar & Oeldorf-Hirsch, 2016), which raises the question of which communications count and how to interest users in policy matters.

The perception of trust is discussed in several studies. In "Sharing of Big Data in Healthcare", the question "Would you be happy to share your healthcare data?" was answered positively by the majority; however, when asked "Do you trust private companies to use your medical data for research purposes?", most were undecided or disagreed (Moss, Shaw, Piper, Hawthorne, & Kinsella, 2017). The study also raises the need for dynamic consent, providing the user with an interface that gives them control over how their health data may be used. This approach is seconded by the researchers in "Moving Beyond Consent for Citizen Science in Big Data Health Research", who also raise the issue of consent being an "illusion of control in the big data age", as no ordinary individual can be aware of the possible redistribution of data, and suggest that consent should be complemented with data-driven accountability (Cheung, 2017).

2.2 THE REGULATIONS OF HEALTH DATA

In Sweden, this type of data collection is regulated under the Personal Data Act, or PUL (SFS 1998:204). The main obligations for companies are not to process any sensitive data about individuals without their consent and to take the necessary steps, both technical and ethical, to ensure the privacy of the collected data. Health data falls under the category of sensitive data, making it prohibited to collect such information unless there is explicit consent, the processing takes place within health and hospital care, or it occurs within non-profit organizations (Article 16, SFS 1998:204).

In 2018, a new EU-wide regulation called the General Data Protection Regulation (GDPR) is planned to come into effect. It aims to harmonize regulations across the EU and to provide new oversight fit for the digital era (REGULATION (EU) 2016/679). The regulation will affect all organizations that operate in or provide any services to the member states, regardless of where the company is established, and covers several aspects of how data should be managed in the member states: definitions, documentation and processing of data, as well as how compliance is to be managed and supervised in organizations.

Health data continues to be defined broadly as sensitive data; however, the new definitions specifically mention genetic and biometric data as well. The most prominent clause continues to be that this type of data may only be collected and processed with explicit consent from the user or consumer, and the conditions for consent have been strengthened. Consent now needs to be given by a "clear affirmative action" (Article 4(11), REGULATION (EU) 2016/679), meaning it must be obtained in an unambiguous and informed way that leaves no doubt about what the user has agreed to, it must be easy to withdraw, and it includes giving the user choices about what types of research and studies they agree to participate in (Hordern, 2016).
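To make the consent conditions above concrete, the sketch below models them as a small data structure: consent is tied to one specific purpose, recorded as an explicit affirmative action, and as easy to withdraw as it was to give. This is a minimal illustration with hypothetical field and function names, not a construct prescribed by the GDPR or used in this thesis.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import List, Optional

    @dataclass
    class ConsentRecord:
        # One explicit, purpose-specific consent (field names are hypothetical).
        user_id: str
        purpose: str                      # e.g. "share heart-rate data for a named study"
        given_at: datetime                # timestamp of the affirmative action
        withdrawn_at: Optional[datetime] = None

        def withdraw(self) -> None:
            # Withdrawing should be as simple as giving consent.
            self.withdrawn_at = datetime.utcnow()

        def is_active(self) -> bool:
            return self.withdrawn_at is None

    def may_process(consents: List[ConsentRecord], user_id: str, purpose: str) -> bool:
        # Health data may only be processed for purposes the user has actively
        # opted in to and has not withdrawn.
        return any(c.user_id == user_id and c.purpose == purpose and c.is_active()
                   for c in consents)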

There are a few exceptions to the rule of consent, making it acceptable to collect health data for medical purposes under a contract with a medical professional, when the data is in the public interest, or for explicit scientific research (Article 9(2), REGULATION (EU) 2016/679). The term "public interest" has not been elaborated on further, but it is stressed that a level of human rights and urgency needs to be involved (Rumbold & Pierscionek, 2017). The latter exception is a broad term, and research for purely commercial use could be argued to fall under this category; for this reason the GDPR requires such organizations to implement an extensive framework of technical security measures and to process minimal amounts of personal data, for example by pseudonymizing it, a technique that processes information so that it can no longer be connected to a specific individual without additional information (Article 4(5), REGULATION (EU) 2016/679). Pseudonymized data is, however, still considered personal data and is therefore treated as such (Recital 26, REGULATION (EU) 2016/679).

Another way of treating data, however, is anonymization. Data is defined as anonymous when it cannot be identified by any means (Recital 26, REGULATION (EU) 2016/679). This part of the regulation is especially important because anonymous data is exempted from the effects of the GDPR, as it is from the current regulations. This can still be considered problematic, as further research argues that anonymized data can always become personal data once again, depending on the ever-evolving data environment, and that a zero-risk state is impossible to achieve (Stalla-Bourdillon & Knight, 2017). It has also been shown that only three characteristics, zip code, gender and birth date, can be sufficient to re-identify anonymized data (Sweeney, 2000).
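The difference between pseudonymization and anonymization, and why Sweeney's result matters, can be illustrated with a minimal sketch (illustrative only; the key name and record fields are hypothetical). A keyed hash replaces the direct identifier, so the record counts as pseudonymized rather than anonymous, while the quasi-identifiers zip code, gender and birth date remain in the clear and may still allow re-identification.

    import hashlib
    import hmac

    SECRET_KEY = b"held-separately-by-the-data-controller"  # hypothetical key

    def pseudonymize(direct_identifier: str) -> str:
        # Replace a direct identifier with a keyed hash. Without the key the value
        # cannot be linked back to the person, but under Recital 26 the record is
        # still personal data, not anonymous data.
        return hmac.new(SECRET_KEY, direct_identifier.encode(), hashlib.sha256).hexdigest()

    record = {
        "subject": pseudonymize("user-1234"),
        # Quasi-identifiers left in the clear: Sweeney (2000) showed that zip code,
        # gender and birth date alone can be enough to re-identify a person.
        "zip_code": "97187",
        "gender": "F",
        "birth_date": "1950-03-14",
        "blood_pressure": "135/85",
    }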

The rights of individuals are, however, strengthened in the GDPR by other complementary steps. One is the power of knowledge: the regulation emphasizes transparency and adds several requirements for companies to inform users of what will happen to their data. Users also have the "right to be forgotten" (Article 17, REGULATION (EU) 2016/679), which broadly means that organizations need to remove any traceable data about a specific individual at their request. It is considered to be derived from the personal belief in integrity, in the ownership of one's own information, and in the terms on which to share it (Youm & Park, 2016).

There has been a lot of criticism of the GDPR, one of the most frequent points being that it is too broad. This is, for example, discussed in "Data protection legislation: A very hungry caterpillar" (van Loenen, Kulk, & Ploeger, 2016), whose authors believe the regulation is overly ambitious and ambiguous when defining and mapping data. Another criticism is that the GDPR should mention more possible technical implementations in order to ensure secure platforms, as it is currently too general about the processing of data, which could make it difficult to live up to the standard set by the regulation (Martínez-Pérez, de la Torre-Díez, & López-Coronado, 2015).

There are other regulations covering data-collecting wearables as well. Many of these devices would be considered medical products, that is, devices or applications used with the intent of medically benefiting individuals (European Commission, 2017). In the EU, these types of products require a CE mark in order to be distributed on the EU market, which poses several requirements such as clinically proven results (European Commission, 2017). The implementation of this has, however, been criticized, as many developers do not seem to follow the regulations; in 2015 the Medical Products Agency in Sweden (Läkemedelsverket) issued a warning of misconduct and banned applications claiming to give results without the proper classification (Lindström, 2015). The article pointed out that several app developers did not follow the regulations and lack experience of the demands placed on medical technology products.

The regulation of these devices is also under reform in a new act called the Medical Devices Regulation, or MDR (European Commission, 2017), planned to be finalized in 2017. This regulation mainly covers the quality and labeling of devices deemed to be medical devices. As this thesis mainly covers data collection, no further investigation of the MDR will be made, but it should be kept in mind when considering the results of the study.


3 METHODOLOGY

3.1 DATA COLLECTION

The thesis uses qualitative methods to answer the research questions. The goal is to dig deeper into consumers' perception of privacy and data integrity when using wearables that gather sensitive data, and to gain deeper knowledge of the subject. Previous research has reviewed criticisms of the GDPR as well as users' general thoughts on sharing data, but because this study wants to look at the "why", a qualitative method is used to gather possible themes.

The research consists of interview sessions to gather these insights, which after analysis are discussed in comparison with the main upcoming regulations and whether the concerns could be addressed by the changes. The interviews were conducted in a semi-structured fashion, with clear interview material but with room for follow-up questions and conversation, in order to create a relaxed and conversational environment and to pick up on important themes not covered by the predetermined interview material. According to the literature, this is the most effective way of getting subjects' own private opinions without swaying them in the interviewer's direction (Holme & Solvang, 1996). In-depth interviewing is also a successful tool for discovering "culturally embedded normative explanations of events and behaviours, because they represent ways in which people organize views of themselves, of others, and of their social worlds" (Silverman, 2016, pg. 56). Because of the lack of conclusions from previous research, it was important to keep prejudiced judgments or any prior assumptions out of the interviews, as the conversation was to be led by the subject being interviewed.

The interviews were conducted face to face with the subjects in an environment of their choosing, in order to maximize each individual's comfort. The interviews were recorded and notes were taken during their course. The transcripts, together with the notes, were the groundwork on which the analysis was conducted.

Establishing a small group of interviewees with a broad range was important, but because qualitative research lacks statistical grounds, the focus is essentially on understanding consumer views on health data. It was therefore suitable for the study to only include subjects suffering from a prior condition, as respondents¹, or who have a spouse or close relative with an illness that could benefit from being monitored, as informants². The reasoning behind this was to try to minimize the impact of hypothetical answers from unaffected subjects and hopefully add authenticity to the data collected. The ideal subject would of course already be using one of the devices discussed in the introduction, but identifying such participants would be very hard, as most of the devices have not yet been released on the Swedish market and are therefore not widely used. The second-best approach is therefore to interview people who are comfortable with technology and affected by a medical condition. In addition, adding a subject already monitored by a portable device prescribed by a medical professional was valuable for gathering deeper insight as well.

¹ Subjects directly affected by a specific problem (Holme & Solvang, 1996)
² Subjects not directly affected by the problem, but who have valuable insights into the issue (Holme & Solvang, 1996)

Because it was important to speak with the most probable users of these devices, the ages of the participants ranged from 54 to 78, as this group has a higher risk of conditions such as type 2 diabetes (Cherney, 2016) and high blood pressure (NIH, 2015), which makes them more likely consumers of the types of wearables discussed in this thesis. Even though little to no research suggests a difference in the adoption of new technology or in privacy concerns depending on age group, this is an important factor to consider when analyzing the results, as they only reflect responses from a more senior perspective.

The choice of interview subjects focused on people close to the interviewer, as an already established relationship can benefit the outcome and depth of the interview (Holme & Solvang, 1996). It is also an important logistical factor for conducting the interviews in person and for following up with complementary questions. Because the research focuses on sensitive health information, it was also beneficial for the interviewer to be aware of any conditions that the subject may have and that could affect their responses. For the comfort of the participants and to protect their privacy, all data collection for this study was anonymized, and transcripts of the interviews will not be shared, as they could provide traceable information about the participants.

However, it is also important to raise concerns about bias when interviewing subjects with a personal relationship to the interviewer. For instance, in this specific research the interviewer was an IT student with experience in the field and could therefore project a sense of confidence in statements made in favour of sharing sensitive information. The benefits, however, outweigh the negatives, as no generalizations or definitive conclusions can really be drawn from qualitative studies (Holme & Solvang, 1996); they merely surface perceptions and open room for further exploration.

Because the scope of this thesis includes a review of the GDPR as well as interviews, only five interviews were conducted. This is a low number, but with a qualitative method the study can still present interesting findings and promote further, possibly more quantitative, research.

3.1.1 Interview Material

The interview questions were structured around the following categories based on the theory:


1) Health Status and Technology in General

a) The Individual’s Health Status and Supervision

b) Health Technology for Supervision of Condition

2) Data Integrity and Regulations

a) Devices and Behavior Online

b) Data Collection in General

c) Data Collection in Health Devices

d) Laws and Regulations

e) Company and Institutional Trust

3) Additional Comments and Discussion

The interview subjects were not given the material beforehand and were only informed about the overall theme of the thesis before starting the interview. Because of the sensitivity of the information, all data collected is anonymized, as is the participants' identity; only characteristics relevant to the study are disclosed.

After the first category of questions, on the individual's health status and supervision, the subject was given a scenario in which they now monitor their condition with a wearable bought in good faith from a consumer electronics shop. This scenario is crucial for narrowing the scope from medical research studies to consumer goods and is referred to several times in the questionnaire.

Because the field is new and such wearables are largely absent from the consumer market, references to other areas of data integrity and sensitivity, such as internet banking and social media, were made so that the participants could make hypothetical but relevant statements from experience and have a point of reference.

3.2 DATA ANALYSIS

The data analysis followed the structure of a thematic analysis, which is a way of encoding qualitative information in the form of themes. Boyatzis explains that "a theme is a pattern found in the information that at the minimum describes and organizes possible observations or at the maximum interprets aspects of the phenomenon" (Boyatzis, 1998).

Page 17: Seniors Privacy Concerns in Health Technology Wearablesltu.diva-portal.org/smash/get/diva2:1104482/FULLTEXT03.pdf · Technology Wearables Regulations vs. User Concerns Sofie Siggelin

15

The process of thematic analysis is summarized into five stages in “Qualitative Research” (Silverman,

2016). The steps are shown in the table below (Table 1).

Table 1 – Thematic analysis (Silverman, 2016, pg. 333)

Thematic Analysis
1. Familiarize yourself with the dataset (note initial comments and ideas)
2. Generate initial codes (systematically code the whole dataset)
3. Search for themes (collate similar codes into potential themes, gather all data for each potential theme)
4. Review themes (check if themes work in relation to the dataset, check for examples that do not fit, generate a thematic map/diagram)
5. Refine themes (refine the specifics of each theme and the linkages between them, generate propositions, look for complexity and associations)

Silverman goes on to recommend using thematic charts, where all initial themes, subthemes and quotes are gathered into a table. The initial codes consist of repeating words, patterns, concepts and ideas. Using an iterative process, the codes were then reviewed and re-coded until the themes were refined and more abstract. He recommends keeping a hard focus on conceptual development and abstraction, as no clear trajectory or etiquette for thematic analysis has yet been established (Silverman, 2016).
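As an illustration of what such a thematic chart can look like in practice, the sketch below collates coded interview excerpts into a theme/subtheme/quote structure. It is only a toy example: the theme names and quotes are borrowed from the findings presented later in chapter 4, while the code labels and the structure itself are hypothetical and not the actual tool used in this study.

    from collections import defaultdict

    # Output of stages 2-3: each excerpt carries an initial code, a candidate
    # theme, a subtheme and the supporting quote.
    coded_excerpts = [
        ("consent", "Lack of Control", "Consent", "So do I really have a choice?"),
        ("ownership", "Lack of Control", "Data Ownership",
         "The principle that the user owns their information."),
        ("marketing", "Commercial Misconduct", "Direct Marketing",
         "In the end they will harass me with promotional emails."),
    ]

    # Stages 3-5: collate similar codes into a thematic chart (theme -> subtheme -> quotes),
    # which can then be reviewed and refined iteratively.
    chart = defaultdict(lambda: defaultdict(list))
    for code, theme, subtheme, quote in coded_excerpts:
        chart[theme][subtheme].append(quote)

    for theme, subthemes in chart.items():
        print(theme)
        for subtheme, quotes in subthemes.items():
            print(f"  {subtheme}: {len(quotes)} quote(s)")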

3.3 EVALUATION OF METHOD

The choice of a qualitative method is based on the goal of the research – to gain deeper knowledge rather than to prove a certain statement, for which a quantitative study would be more appropriate. In the chosen research field, a quantitative study could be seen as very beneficial, offering higher statistical value and structure, which makes it easier to draw non-biased conclusions on the subject (Holme & Solvang, 1996). Being non-biased is an important aspect of this field of research, and here a qualitative study can be a disadvantage, as the sheer participation of the researcher could impact the results, as could the trust and comfort between interviewee and interviewer. As previously mentioned, the characteristics of the interviewer could affect the results, and this needs to be taken into account. The trust aspect could also be important when selecting subjects for the interviews, as a general trust in and embrace of technology could be an important factor.

However, as the previous research revealed, there is a lack of conclusions on the subject of consumer trust in these specific wearables, so a qualitative study is the right way to go as a first step towards gaining insights into the subject.



As the purpose of this study is to gain insights and a deeper knowledge and understanding of the subject, a qualitative approach is suitable. These findings could thereafter evolve into complementary studies in which quantitative methods are used in order to reach more certain conclusions. This is common practice according to the literature (Holme & Solvang, 1996) and, in the current research field, most necessary for producing valuable results.

For this thesis it was not yet clear what we were looking for or assuming, which makes a more flexible approach necessary when reviewing consumer perceptions and thoughts. It is important to gain insight into their everyday habits and into what could affect the evolution of the health tech field from an integrity perspective. These types of insights from qualitative methods also provide a comparative narrative for practitioners and companies that could benefit them in their own research and development (Silverman, 2016).

Semi-structured interviews were chosen over other techniques such as observation or case studies simply because the latter would be unsuitable for the purpose; they are better suited to behavioral studies than to thoughts and perceptions (Holme & Solvang, 1996). A different tactic could be to use focus groups with a larger number of participants in each session. This method was however disregarded, because individual interviews benefit this research better, keeping the individual in focus rather than relying on open discussions; disclosing sensitive information could also affect a group discussion, where a social extrovert could easily take over.

The important factor of qualitative methods for this particular research is, however, their suitability for producing descriptive insights rather than numbers; the result will therefore be subjective and could benefit from further investigation.


4 RESULT & ANALYSIS

4.1 THE RESPONDENTS

All respondents' personal details and interview transcripts are undisclosed for their comfort and privacy on this potentially sensitive topic. Below, however, is a brief summary of the respondents' characteristics.

Five respondents were interviewed for this study. They all have varied knowledge of technology and different online behavior, but all respondents own and use smartphones as well as computers in their everyday lives. Three of the respondents are male and two are female; they are between 54 and 78 years old and all reside in Stockholm County. One of the respondents is an active entrepreneur in the field of wearable predictive technology, while the other four have no relation to the field but are consumers of technology in general.

All respondents were positive towards the technological advancements made in health care, saw a future with more of these devices, and saw how they themselves could benefit from these types of wearables.

The interviews were all conducted separately, without influence from the other participants; they took place face-to-face and lasted between 45 and 120 minutes.

4.2 EMPIRICAL FINDINGS

The interview analysis was made using the thematic analysis approach described in the previous chapter and continually aimed to answer the research questions at hand. The first question, "What are the concerns for users when sharing personal health data gathered from consumer wearables?", is addressed in the findings below, which are then compared with the theoretical background and the impact of the GDPR in order to answer the second question: "Could the upcoming GDPR address the concerns of users?"

4.2.1 Lack of Control

The first theme discovered was a concern about lack of control, which appeared in several subthemes. One of these is the issue of consent, where all respondents voiced a concern about not being able to make active and informed choices about what they are consenting to. None of the respondents read the Terms & Conditions of the services they use, but all choose to agree to them regardless. Many feel that the company forces them to consent in order to be able to use its services:

“You can end up in the situation of “Well, then now you’re not allowed to use our

program because you said no”, so do I really have a choice?”

Many see a possible solution in providing more active choices where they could make informed

decisions on what they are actually agreeing to.

“Well, just having different levels of choices, not just yes or no.”

“Maybe just the fact that there is a checkbox where it says “I allow company X to gather

information about me” or something like that. Then I would at least understand or

have been given the possibility of doing part-selections that might mean something.”

An important aspect of this seems to be the ability to make a part-selection of the terms they agree to, and that they will not be shut out from the service because they disagreed with one condition, but merely be given a more limited service:

“They can go out and offer the possibility to not store historical data or you’re not

allowed to do this, this and that. But that means you still have the right to the service,

but you have the right to affect the consequences of using the service.”

An aspect related to active choices is the sense of owning one's personal data and therefore choosing when to share it with other parties. One respondent felt more secure with the medical device they use today because the data is only uploaded when the device is physically handed over, i.e. it is not connected to the internet and does not upload data automatically. Another respondent did not want to be monitored by a wearable they had to wear all day, but could consider sharing or uploading data once a day.

“I believe that if you could solve it as so that you as a user can control the access, this

would be a better way or at least give an increased transparency.”

“The principle that the user owns their information and that different service- or

product providers and others are seen more as a procurator that process it, but does

not own the rights to it.”

Another concern is the time aspect of consent: the respondents expressed a concern that even though they might make an informed choice, they would do so for an undefined period of time. Regarding data collection, they see the need for a time limit as well as a reminder of what they consented to and whether they would like to withdraw.


“You shouldn’t be able to do this [collecting data] over an infinite period of time, there

should be a time window where its processed, stored and saved”

The lack of control seems to be most prominent with regard to health data being collected about or distributed on them. Many have trouble explaining why this is, as they have fewer concerns regarding, for example, bank details or shopping preferences, but place a higher value on the integrity of their health status. One respondent who was generally not very concerned about data collection expressed concerns when asked specifically about the collection of their health data:

“I wouldn’t have liked that no… In actuality there’s no difference, but it’s just one of

those feelings you have. Different emotional thoughts on integrity and where it starts

crossing the line. Otherwise, I feel that it doesn’t really matter”

There is a consensus among the respondents on this concern. Many felt that data collection in general was not really an issue for them, but when faced with the possible collection of their health status, it struck a nerve.

The topic of an emotional connection to health data is hard for the respondents to explain or elaborate on, but when asked whether it would matter if the data collected from them were anonymous, this did not have a significant impact, as the concern was related more to their privacy and emotional connection to the data than to actual tracing or labeling.

“I guess it’s because… I have such integrity. I’ve always been like this, I have integrity.

I don’t want people to know everything about me and such. Even though there might

not be anything negative or anything to know, but it’s something to do with personality

I think.”

To summarize, there is a clear concern for users that they could lose control of their information and

this is especially an issue when it comes to their health data. The issue of not knowing what could

happen as well as their own emotional connection to their sensitive data is something hard for the

respondents to explain.

However, the respondents described several measures that would make them feel safer and more in control of their information: making active part-selection choices with a clear statement of what they are agreeing to, a form of local storage where they themselves control the uploading of information, and time-limited consent.
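Purely as an illustration (not part of the study or of any product described in it), the measures the respondents asked for could be sketched in code roughly as follows. This is a minimal sketch under stated assumptions: the class names, fields, purposes and expiry period are invented for the example, and a real wearable platform would need proper secure storage and a user interface for granting, reviewing and withdrawing consent.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Callable, Dict, List, Optional

@dataclass
class ConsentRecord:
    """Time-limited, per-purpose consent, as the respondents described."""
    purposes: Dict[str, bool]   # part-selection: each purpose is opted in or out separately
    granted_at: datetime
    valid_for: timedelta        # consent expires instead of lasting forever

    def is_valid(self, purpose: str, now: Optional[datetime] = None) -> bool:
        now = now or datetime.utcnow()
        return self.purposes.get(purpose, False) and now < self.granted_at + self.valid_for

class LocalHealthStore:
    """Measurements stay on the device until the user actively triggers an upload."""
    def __init__(self) -> None:
        self._buffer: List[dict] = []

    def record(self, measurement: dict) -> None:
        self._buffer.append(measurement)   # nothing leaves the device at this point

    def upload(self, consent: ConsentRecord, send: Callable[[List[dict]], None]) -> int:
        """Upload only if the 'upload' purpose is still consented to and unexpired."""
        if not consent.is_valid("upload"):
            return 0
        sent = len(self._buffer)
        send(self._buffer)                 # 'send' would be wired to a user-triggered action
        self._buffer.clear()
        return sent

# Example: consent for uploading but not for marketing, valid for 90 days.
consent = ConsentRecord(
    purposes={"upload": True, "marketing": False},
    granted_at=datetime.utcnow(),
    valid_for=timedelta(days=90),
)
```

The point of the sketch is only that the three wishes, part-selection, local buffering with user-controlled upload, and consent that expires, are straightforward to express technically; whether vendors actually implement them is the open question this section raises.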


4.2.2 Commercial Misconduct

Directly related to the control of their information, a major concern was that companies would exploit their information for their own commercial interest. One respondent drew an analogy to searching for hotels online and summed it up like this:

“Just the fact that if I go out and purchase certain things, somewhere in the background

companies can store this information and that scares me. It scares me because in the

end they will harass me with promotional emails or phone calls or whatever. Just the

fact that you can’t go online and search for a hotel without it two minutes later popping

up on Facebook, it just comes to show you that someone is keeping track of what I’m

actually doing. That might not be that bad as long as I’m booking hotels, but if you’d

consider it for this type of sensitive information [health data] … I think that’s scary…”

The respondents also expressed concern with the commercialization of their health information and

the risk of their medical conditions being exploited for direct marketing.

“The risk is that you will be bombarded with these drugs and alternative medicines just

to stay healthy and that’s not that much fun either, you probably have other stuff to

do than answer telemarketing phone calls.”

There is also a clear differentiation of trust depending on what type of company actually distributes the product. All respondents agreed that they had higher trust in pharmacies than in a more commercialized electronics store or other similar stores. Not only did they feel that they would get better advice from the pharmacists or pharmacy employees, but they also had a sense that different regulations and demands apply to these types of stores, which made them trust them more.

“You have a sense of that this pharmacy function, ergo the business operation is more

serious than electronics stores.”

Another aspect, however, is the concern that companies would gather or alter information for their own benefit, such as recommending products based on users’ health data that could pose a risk to their health.

“You could imagine someone starting to send out false information, such as if it’s been

concluded that you should treat your high blood pressure, they’d send out “You should

eat this new miracle drug X” and then X hasn’t been properly tested or it doesn’t even

treat high blood pressure. What do you do then? And that could happen.”


Most respondents went on gut feeling and a sense of seriousness when choosing which companies’ services to use, but market size was also important. One respondent also said that they only shopped online if the store was a certified e-commerce store 3 and used a secure connection. The respondents gave this as a good example of measures that made them feel that the companies whose services they were using were serious and had good intentions, which in turn made them feel safe, or at least not troubled by a sense of uncertainty, when using those services.

4.2.3 Distrust in Information

Lastly, there is a distrust and a concern that the information they are being fed could be untrue or misleading. For the gathering of health data, this has two clear subthemes.

One is the correctness and validity of the actual health data and of any analysis or diagnosis that the wearable could provide.

“I’m a little doubtful. It could be… too much information in a way. And then there’s the

fact that you know that these medical values fluctuate during the day and then… it’s

going to be hard to know what to trust. Maybe you could put too much trust into it.”

“That you can trust the values is in one way a condition for wanting to put your time

into it in the first place.”

This aspect of being able to trust and benefit from the information has many angles. Many expressed concern about the risk of becoming fixated on their data or turning into a hypochondriac, a kind of over-information. This was also reflected in the fear of discovering untreatable conditions that they would rather not know about, a concern voiced by many of the respondents.

“Sure, it could be good [monitoring with wearables], but at the same time… if anything

really bad came up, how would you react?”

“The big question might be that, well if I discover something’s wrong, what can I do about it? If there’s nothing you can do about it, maybe it’s better not knowing at all.”

When asked if there was any limit on what they would want to know, one respondent said:

“Stuff that you can’t do anything about… I don’t see any good in getting worried before

it happens. The type of, non-actionable conditions. So that assumes, basically, that you

can do something to affect it or get active help to affect it and improve it”

3 For example: Trygghandel – an optional but well-regarded certification for Swedish online stores.


Several respondents suggested a solution: only allowing this type of data collection if professionals are actively involved behind the applications. They did not want to be left to cope on their own with the information they received; instead, they wanted small notifications on how to improve their behavior, and if anything serious were discovered, they wanted professionals to reach out for further investigation.

The second aspect of distrust is also connected to the control and commercial exploitation themes, but is more rooted in not actually understanding or trusting the information they are being fed. This, too, has many angles, one of them being distrust in the people working for the company. One of the respondents explained it as follows:

“I mean; however it is, many of these companies collecting this information probably

are very serious companies at their core and have high ethical standards and all that,

but in the end, there’s always people that do not follow this and even though they’ve

signed contracts and all that, could possibly for any reason start using this information.

And what scares you is that this information is so simple to download (…) You as a

developer can have all the good intentions in the world with what you’re developing

but you never consider it being used for alternative purposes and that’s the case with

everything.”

An even more apparent concern is the respondents’ distrust in their own ability to understand and interpret the information correctly. All respondents expressed this in relation to companies’ terms & conditions: they felt overwhelmed by the length and legal formatting of the text and gave this as the reason why they do not bother reading it.

“It could also be that the text is so small… and then there’s so many pages and so many

paragraphs. If you’d make it a bit simpler, a bit easier, then you probably would’ve

read it.”

“One reason [of not reading terms & conditions] is that it isn’t written in a way that

even if I read it, I could understand it.”

This was also a concern even in active choice selections, where one respondent drew parallels with the

handling of cookies on websites:

“I mean just what pops up with cookies, where you check the box where it says “I

understand”. I’m not really sure I do understand what it is I’m checking.”


Another angle expressed was that even if they understood the company policies, they could not trust what the policies say regardless. All respondents had examples of stories they had read in articles where companies had misbehaved or mistreated their customers in ways that were never disclosed in the companies’ terms & conditions.

“It doesn’t help that you tick a small checkbox on a website, what control do you really have of what happens in the background?”

One respondent also expressed this sentiment in the following way:

“There’s really only two alternatives. Either you’re a sceptic or you’re a happy optimist.

But you don’t really have grounds for either of them.”

In summary, the respondents have concerns about whether the health data collected is accurate and presented in a way that suits their needs. They also share a concern about not understanding the information given to them well enough to make educated decisions, as well as a general distrust that policies or companies’ security measures can ever be fully relied on.

4.3 IDENTIFIED CONCERNS COMPARED TO PREVIOUS RESEARCH

Several findings are supported by previous research on the processing of health data. It had already been concluded that users are not fond of reading long, complicated policies (Martínez-Pérez, de la Torre-Díez, & López-Coronado, 2015), which was the case in this study’s findings as well. The other similarities all relate to transparent and active choices, which were significantly important for the respondents in this study. Three aspects from previous research are strengthened as important for users by the findings of this study: the risk of dissemination of personal information (Phelps, Nowak, & Ferrell, 2000); clearly informing users of what they agree to and where they fit in (Hänsel, Wilde, Haddadi, & Alomainy, 2015); and the finding that providing easy-to-use services that are well thought out for the target audience increases trust as well as consumers’ propensity to use the service (Song & Zahedi, 2007). A main concern in this study was not clearly understanding or knowing what one has agreed to when using services.

The previous research therefore relates strongly to all concerns raised by the participants of this study. The importance of gaining control of one’s data, owning one’s information and thereby having the power to decide how it is distributed are topics that come up continually both in previous research and in this study. Using active part-selection choices and simplifying terms & conditions therefore seem to be a big part of the user experience of integrity.


The participants also seemed aware that there could be an illusion, as discussed by Cheung, of how their data is treated, and that no single individual could be aware of all potential risks when using these devices.

Contrary to previous research, nothing was discovered that strengthened the need for specific incentives, such as material ones, for users to share their health data. Rather, users seem to value their own need for the product more highly than unrelated incentives.

4.4 NEW FINDINGS ON THE CONCERNS OF USERS

There are a few new or less discussed findings that were identified in this study.

The first is the emotional connection to the individual’s health data: anonymizing the data would not make the respondents feel less violated or more private. The reasoning behind this was primarily gut feeling and personal characteristics regarding how much data they were willing to share for integrity purposes, rather than anything they believed could be regulated or managed, combined with a high level of uncertainty about what could happen to their information.

There is also an overall perception that regulations or policies cannot protect you from potential risks; you are therefore constantly aware that something could go wrong, but actively choose not to let it affect your behavior. Ultimately, the participants found that the benefits of using products and services that collect data far outweighed the possible negatives.

4.5 IDENTIFIED CONCERNS COMPARED TO THE UPCOMING GDPR

This section seeks to answer the question: could the upcoming GDPR address the concerns of users?

In summary, the three main concerns identified were:

1. Lack of Control

2. Commercial Misconduct

3. Distrust in Information

The GDPR does try to address several of the concerns voiced by the respondents, especially the first concern, lack of control. The new regulation puts higher demands on the way companies collect consent from users and strengthens users’ power to withdraw that consent. The position of the individual is significantly strengthened in the GDPR, with provisions such as “the right to be forgotten” as well as the aforementioned reform of consent and further requirements on transparency.


The new regulation does not, however, specify that this has to be done through active or part-selection choices, which was the respondents’ wish, but it could be assumed that such choices can be a big part of the solution.

A concern voiced by both critics and the respondents, from different angles, is the question of anonymization. The GDPR, like the current regulations, excludes anonymous data collection from its scope. The criticism is that this can be too general and risky, as data might never be completely anonymous. From a user standpoint, it does not address the emotional connection to health data, as the respondents do not value other personal data as highly as they value the integrity of their health data. As Sweeney showed, there are doubts about data ever being completely anonymous, since very little information was needed to re-identify the data (Sweeney, 2000), and zero risk is impossible to achieve just by anonymizing data (Stalla-Bourdillon & Knight, 2017). This research suggests a legitimate tie to the emotional connection to health data: anonymization alone will not resolve the concern, as one can never be completely sure that the data will not be traced back to the individual.
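To make the re-identification point concrete, the toy example below (illustrative only; the records and attribute names are invented and not drawn from any real dataset or from this study) counts how many records share the same combination of quasi-identifiers. In Sweeney's terms, a record whose combination of ZIP code, birth date and gender is unique in a dataset is effectively identifiable even though no name is stored.

```python
from collections import Counter

# Invented, name-free records: the quasi-identifier combination can still be unique.
records = [
    {"zip": "97185", "birth": "1948-03-02", "sex": "F", "condition": "hypertension"},
    {"zip": "97185", "birth": "1951-11-20", "sex": "M", "condition": "diabetes"},
    {"zip": "97187", "birth": "1948-03-02", "sex": "F", "condition": "arrhythmia"},
]

def group_size(record, dataset, keys=("zip", "birth", "sex")):
    """Number of records sharing this record's quasi-identifiers (1 means re-identifiable)."""
    counts = Counter(tuple(r[k] for k in keys) for r in dataset)
    return counts[tuple(record[k] for k in keys)]

for r in records:
    print(group_size(r, records))   # prints 1 three times: each record is unique on these attributes
```

A sketch like this only restates the published result in miniature; the practical implication for wearables is that stripping names from uploaded health data does not, by itself, remove the concern the respondents voiced.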

When considering concern number two, Commercial Misconduct, the GDPR is vague about how research and data collection for commercial use should be handled, particularly regarding what counts as commercial, but it does introduce extra measures to secure and limit such data, one of them being the pseudonymization of data. With this in mind, it would be highly unlikely for companies abiding by the regulation to do any direct marketing of their products based on their users’ data. However, if explicit consent has been given, the company could be free to do whatever it deems fit with the data.
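As a hedged illustration of what pseudonymization, as opposed to anonymization, can look like in practice, the sketch below replaces a direct identifier with a keyed hash. The secret key, field names and example values are invented for the example; a real controller would follow its own key-management routines and applicable GDPR guidance rather than this sketch.

```python
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-managed-secret"   # invented; would be stored apart from the data

def pseudonymize(user_id: str) -> str:
    """Deterministic pseudonym: the same user always maps to the same token,
    but only someone holding the key can link tokens back to identities."""
    return hmac.new(SECRET_KEY, user_id.encode("utf-8"), hashlib.sha256).hexdigest()

measurement = {"user_id": "anna@example.com", "resting_heart_rate": 72}
stored = {
    "subject": pseudonymize(measurement["user_id"]),   # direct identifier removed
    "resting_heart_rate": measurement["resting_heart_rate"],
}
# 'stored' no longer carries the e-mail address, but it is still personal data under
# the GDPR, because re-identification remains possible for whoever holds the key.
```

This mirrors the distinction the regulation draws: pseudonymized data remains regulated personal data, whereas truly anonymous data falls outside the GDPR's scope.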

Concern number three, the distrust in information, is less obviously addressed in the GDPR. The medical correctness and quality of diagnoses could instead be addressed through the MDR, under which quality control has to be performed on these types of products. As the respondents were positive towards certifications, knowing that their wearables are CE-marked could help ease this concern. That said, there is reason to worry about the regulations still not being followed, as discussed by Lindström in a review where several companies were not compliant with the rules of the market (Lindström, 2015).

However, the critics’ view that the GDPR is too broad is echoed by the respondents from their own perspective: one can never be completely sure that one’s information is treated fairly according to its personal value and that it is technically secure enough.


5 DISCUSSION

5.1 SUMMARY AND DISCUSSION OF RESULTS

The findings of this thesis were largely foreseeable and not that surprising in light of the theoretical background. However, it was very interesting to see the emotional connection to the participants’ health data, and that the lack of knowledge of what could happen to their data was a disturbing thought to them.

However, it seems very clear that there is trust in technology and its capabilities, as the participants are willing to use products targeting their concerns if they feel such products could provide value in their everyday life. The interview sessions bring to mind the launch of online banking, smartphones and other “revolutionary” technology, where most concerns were grounded in uncertainty about the possibilities, just as in today’s discussions around self-driving cars and artificial intelligence, where most people worry because of a lack of knowledge about the field and what it could mean for them.

The biggest challenges identified seem to be communication-related, together with how successfully the GDPR is actually implemented. Communication-wise, the GDPR features clauses such as “clear and unambiguous”, which could lead to legal actions where company and user disagree. It raises the question of whether it is ever possible to ensure that every user understands the terms of service and privacy policy, especially with the extreme amounts of data that could be stored in cloud environments all across the world. Users’ wish for a clearer view of their rights and agreements, as well as the GDPR’s aim to provide better transparency, seems excellent on paper, but the trust placed in companies to implement it in a way that addresses the staggeringly low numbers of policy readers leaves disbelief that one could ever argue every user has been left with “no doubt for the user to what he or she has agreed to”. However, the clauses on withdrawing consent and the right to be forgotten could be possible remedies if the problem arises, though they do not really address the initial consent. A comparison raised by one of the interviewees was the EU Cookie Law, which was designed to protect online privacy but probably did not change anything in reality beyond an extra checkbox for users to tick without reading or understanding.

It is plausible that most companies would be positive towards implementing the GDPR because of its trust-building potential with users as well as the actual legal implications. However, the vagueness of the GDPR’s wording leaves room for interpretation by companies, which could be a risk both for users and for the regulators checking companies’ compliance.


The skepticism about whether the EU will actually be able to enforce the regulation and reprimand companies in the ways it promises is probably warranted, and it is a classic case of having to wait and see. The regulation’s sheer existence does not seem to make much of a difference to users’ perception of data integrity.

5.2 LIMITATIONS AND METHOD EVALUATION

The choice of a qualitative method has proven to be a good one, and there is no doubt that it was the right method for this particular purpose. However, it is fair to discuss the extensiveness and validity of the study, as only five participants were interviewed. The reasons for this were primarily time constraints as well as the extensive theoretical groundwork required for the GDPR review and analysis. For broader results and groundwork, it might have been better to write the thesis in a pair, where more interviews could have been conducted and the workload shared. However, with the backing of previous research and an objective outlook on the results, the conclusions of this thesis are still deemed valid for their purpose.

Another factor worth mentioning is the early stage of both the technology presented and the regulations, which have not yet come into force. As there are currently very few products like the ones described in the thesis on the market, the interviewees’ responses were speculative and hypothetical, which means the results could differ from those of actual users of such products. The same goes for the GDPR: had it already been implemented, or been more widely discussed in the media, that could have affected the results as well.

It could also have been interesting to see whether changing the participants would have made a difference. A younger demographic or residents of other parts of the country might have had an impact, as might the inclusion of individuals more critical of technological advancements. However, it was important for the study to interview the actual user group of these sorts of devices, which statistically is of a more senior age, between 54 and 78, although it is worth noting that the results could possibly be applicable to all age groups affected by a condition. Because of the low number of interviewees and the lack of statistical value in the empirical results, it is more valid to focus on the fact that these are people affected by a medical condition rather than on their age.

Also worth discussing is the personal involvement of the interviewer. As previously mentioned, the experience and opinions of the interviewer could have pushed the discussions with the participants in a more critical or, for that matter, more positive direction. This was considered by the interviewer several times during the thesis work, and efforts were made to remain objective, but their effectiveness can never be guaranteed.


5.3 RECOMMENDATIONS ON FURTHER RESEARCH

Several of the findings of this study could be used for further research.

Mainly, a quantitative study could be done to back up the findings of this thesis and to look for patterns related to, for example, age and gender, but also experience with technology or type of medical concern, in order to find empirical evidence in the field. This would particularly give more substance to the first research question.

As time goes by, it will also be very interesting to see whether the GDPR actually addressed the issues raised in this thesis or whether it had no real effect on users’ perception of integrity. The same goes for advancements in the field: a similar study could be performed when the technology has evolved and the wearables are on the market, to see whether users’ concerns have changed.

Another finding, which because of the statistically low number of participants should not be considered definite and is therefore not reported in the results, is that there seem to be different integrity personality profiles. From the interviews, two profiles emerge: 1) the individual does not really care about data collection or its usage, regardless of purpose; and 2) the individual is concerned because they value their privacy, regardless of purpose.

It could be interesting to look further into these personality profiles, their thoughts and their reasoning.


6 CONCLUSIONS

Overall, there are several points of connection between the new rules in the GDPR and the concerns raised by users. Three main concerns were identified as the most prominent for individuals.

The first is the sense of a lack of control, where users have a clear need for ownership of their personal data. This is reflected in a demand for more control over the uploading of data, for making active choices that they can retract if they wish, and for knowing that consent does not last forever. There also seems to be a strong emotional connection to individuals’ own health data, and there is a sense of unease in sharing it with companies and other stakeholders.

The second is the concern that companies will abuse their data for commercial purposes. Users do not want their health data to be exploited for commercial benefit or to be bombarded with ads targeted at their concerns. There is also a worry that companies, in order to make money, will recommend drugs and treatments based on this data that have not been properly tested or that lack quality.

Lastly, users doubt the level of trust they can put in the information they receive. They feel unsure about how far they can trust the diagnoses from wearables and the quality of the data analysis, and the concern is also visible in a distrust of policies and of companies’ intentions. Users feel that they cannot properly understand a user agreement themselves and therefore never feel completely sure of what they have agreed to.

The GDPR does address several of these concerns by bringing ownership of data back to the users. By strengthening the requirement for explicit consent from companies, demanding more transparent policies and requiring security implementations for data integrity, the GDPR features several steps that could help ensure the privacy of users, such as control over the distribution of data and “the right to be forgotten”.

The remaining concerns, however, are how to communicate policy effectively and thereby be compliant with the new regulation, and how to ensure that users actually understand what they are agreeing to. Because of its extensiveness, it is also important to question how the GDPR can effectively be implemented in the market.


7 REFERENCES

Aïmeur, E., Lawani, O., & Dalkir, K. (2016). When changing the look of privacy policies affects user trust:

An experimental study. Computers in Human Behavior, 368-379.

Boyatzis, R. E. (1998). Transforming Qualitative Information: Thematic Analysis and Code Development.

London: SAGE Publications.

CB Insights. (2016, January 9). Digital Health Funding Hits New Highs In 2015, Reaching Nearly $6B.

Retrieved March 21, 2017, from CB Insights: https://www.cbinsights.com/blog/digital-health-

funding-2015/

Cheek, P., Nikpour, L., & Nowlin, H. D. (2005). Aging Well With Smart Technology. Nursing

Administration Quarterly, Vol. 29, 329-338.

Cherney, K. (2016, October 10). Age Of Onset For Type 2 Diabetes: Know Your Risk. Retrieved May 8,

2017, from HealthLine: http://www.healthline.com/health/type-2-diabetes-age-of-

onset#childhood3

Cheung, A. S. (2017, March 29). Moving Beyond Consent for Citizen Science in Big Data Health Research.

Retrieved April 5, 2017, from SSRN: https://ssrn.com/abstract=2943185

Datainspektionen. (1998, October 24). The Personal Data Act. Retrieved April 5, 2017, from

Datainspektionen: http://www.datainspektionen.se/in-english/legislation/the-personal-data-

act/

Datainspektionen. (2016, March). Här är hoten mot den personliga integriteten. Integritet i fokus, pp.

9-10.

De Mooy, M., & Yuen, S. (2017). Towards Privacy-Aware Research and Development in Wearable

Health. Proceedings of the 50th Hawaii International Conference on System Sciences .

Demiris, G., Rantz, M. J., Aud, M. A., Marek, K. D., Tyrer, H. W., Skubic, M., & Hussam, A. A. (2004).

Older adults’ attitudes towards and perceptions of ‘smart home’ technologies: a pilot study.

Medical Informatics and the Internet in Medicine Vol. 29, 87-94.

ERCAS. (2015, January 1). Public Integrity and Trust in Europe. Retrieved May 22, 2017, from

Government of the Netherlands:

https://www.government.nl/documents/reports/2016/01/18/public-integrity-and-trust-in-

europe


European Commission. (2017, April 24). Manufacturers - European Commission. Retrieved May 5, 2017, from European Commission: https://ec.europa.eu/growth/single-market/ce-marking/manufacturers_sv

European Commission. (2017, April 24). Revisions of Medical Device Directives - European Commission. Retrieved May 1, 2017, from European Commission: http://ec.europa.eu/growth/sectors/medical-devices/regulatory-framework/revision_en

Eurostat. (2017, January 30). Internet access and use statistics - households and individuals. Retrieved

March 1, 2017, from Eurostat: http://ec.europa.eu/eurostat/statistics-

explained/index.php/Internet_access_and_use_statistics_-_households_and_individuals

Fitbit. (n.d.). Fitbit Official Site for Activity Trackers & More. Retrieved March 20, 2017, from Fitbit:

https://www.fitbit.com/

Forni, A. (2016, December 16). The Present and Future of Wearables. Retrieved April 5, 2017, from

Gartner: http://www.gartner.com/smarterwithgartner/the-present-and-future-of-

wearables/

Goyal, R., Dragoni, N., & Spognardi, A. (2016). Mind The Tracker You Wear - A Security Analysis of

Wearable Health Trackers. Kongens Lyngby: ACM.

Groves, P., Kayyali, B., Knott, D., & Van Kuiken, S. (2013). The 'big data' revolution in healthcare:

Accelerating value and innovation. New York: Center for US Health System Reform Business

Technology Office - McKinsey & Company.

Heart Rhythm Society. (2015, May 15). ECG on Smartphones . Retrieved May 19, 2017, from HRS

Online: http://www.hrsonline.org/News/Press-Releases/20154/05/ECG-On-Smartphones

Holme, I. M., & Solvang, B. K. (1996). Forskningsmetodik - Om kvalitativa och kvantitativa metoder.

Lund: Studentlitteratur.

Hordern, V. (2016, January 20). The Final GDPR Text and What It Will Mean for Health Data. Retrieved

March 20, 2017, from Chronicle of Data Protection:

http://www.hldataprotection.com/2016/01/articles/health-privacy-hipaa/the-final-gdpr-

text-and-what-it-will-mean-for-health-data/

Hänsel, K., Wilde, N., Haddadi, H., & Alomainy, A. (2015). Challenges with Current Wearable

Technology in Monitoring Health Data and Providing Positive Behavioural Support.

MobiHealth, 14-16.


IIS. (2016). Svenskarna och Internet. Retrieved May 22, 2017, from IIS:

https://www.iis.se/docs/Svenskarna_och_internet_2016.pdf

Kelly, J. M., Strecker, R. E., & Bianchi, M. T. (2012, October 14). Recent Developments in Home Sleep-

Monitoring Devices. ISRN Neurol. Retrieved May 19, 2017, from

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3477711/

Krebs, P., & Duncan, D. T. (2015, November 4). Health App Use Among US Mobile Phone Owners: A

National Survey. Retrieved April 5, 2017, from NCBI:

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4704953/

Lindström, K. (2015, September 16). Många hälsoappar klarar inte lagkraven - nu griper

Läkemedelsverket in. Computer Sweden .

Martínez-Pérez, B., de la Torre-Díez, I., & López-Coronado, M. (2015). Privacy and Security in Mobile

Health Apps: A Review and Recommendations. J Med Syst.

Moss, L., Shaw, M., Piper, I., Hawthorne, C., & Kinsella, J. (2017). Sharing of Big Data in Healthcare:

Public Opinion, Trust, and Privacy Considerations for Health Informatics Researchers.

Proceedings of the 10th International Joint Conference on Biomedical Engineering Systems and

Technologies (BIOSTEC 2017), 5, 463-468.

Murdoch, T. B., & Detsky, A. S. (2013, April 3). The Inevitable Application of Big Data to Health Care.

JAMA, pp. 1351-1352.

NIH. (2015, September 10). Risk Factors for High Blood Pressure. Retrieved May 8, 2017, from National

Heart, Lung and Blood Institute (NIH): https://www.nhlbi.nih.gov/health/health-

topics/topics/hbp/atrisk

Obar, J. A., & Oeldorf-Hirsch, A. (2016, August 24). The biggest lie on the internet: Ignoring the privacy

policies and terms of service policies of social networking services.

Phelps, J., Nowak, G., & Ferrell, E. (2000). Privacy Concerns and Consumer Willingness to Provide

Personal Information. Journal of Public Policy & Marketing, 19:1, 27-41.

Regulation (EU) 2016/679. (2016). General Data Protection Regulation. Regulation (EU) 2016/679 of the European Parliament and of the Council. Official Journal of the European Union. Retrieved May 18, 2017, from http://ec.europa.eu/justice/data-protection/reform/files/regulation_oj_en.pdf


Robbins, R. (2016, October 11). New diabetes tech is coming. But will it make much difference?

Retrieved May 19, 2017, from STAT News: https://www.statnews.com/2016/10/11/diabetes-

technology/

Roots Analysis Private Ltd. (2016). Smart Wearables in Healthcare, 2016-2030. General Electric (GE)

Company.

Rumbold, J. M., & Pierscionek, B. (2017). The Effect of the General Data Protection Regulation on

Medical Research. Journal of Medical Internet Research.

SFS. (1998:204). Personuppgiftslag. Retrieved May 18, 2017, from

http://www.riksdagen.se/sv/Dokument-

Lagar/Lagar/Svenskforfattningssamling/Personuppgiftslag-1998204_sfs-1998-

204/?bet=1998:204

Silverman, D. (2016). Qualitative Research. London: Sage .

Song, J., & Zahedi, F. M. (2007). Trust in health infomediaries. Decision Support Systems 43, 390-407.

Stalla-Bourdillon, S., & Knight, A. (2017, March 6). Anonymous data v. Personal data—A false debate:

An EU perspective on anonymisation, pseudonymisation and personal data. Wisconsin

International Law Journal.

Statens Offentliga Utredningar. (2016). Hur står det till med den personliga integriteten? – en

kartläggning av Integritetskommittén . Stockholm: Wolters Kluwers.

Sung, D. (2015, August 3). What is wearable tech? Everything you need to know explained. Retrieved

May 18, 2017, from Wareable: https://www.wareable.com/wearable-tech/what-is-wearable-

tech-753

Sweeney, L. (2000). Simple Demographics Often Identify People Uniquely. Carnegie Mellon University.

Pittsburgh: Data Privacy Working Paper 3.

The Travelers Indemnity Company. (2015). 2015 Consumer Risk Index Summary. Retrieved May 9,

2017, from Travelers: https://www.travelers.com/resources/consumer-risk-index/2015-

summary.aspx

Validic. (2016, September 19). Insights on Digital Health Technology Survey 2016: How Digital Health

Devices and Data Impact Clinical Trials. Retrieved April 9, 2017, from Validic:

http://pages.validic.com/rs/521-GHL-

511/images/Digital_Health_Survey_Results_Pharma_2016.pdf


van Loenen, B., Kulk, S., & Ploeger, H. (2016). Data protection legislation: A very hungry caterpillar -

The case of mapping data in the European Union. Government Information Quarterly 33, 338-

345.

WHO. (2017). Technology, Health. Retrieved April 10, 2017, from World Health Organization:

http://www.who.int/topics/technology_medical/en/

Youm, K. H., & Park, A. (2016). The "Right to Be Forgotten" in European Union Law: Data Protection

Balanced With Free Speech. Journalism & Mass Communication Quarterly, 93(2), 273-295.

Ziefle, M., Röcker, C., & Holzinger, A. (2011). Medical Technology in Smart Homes: Exploring the User’s

Perspective on Privacy, Intimacy and Trust. 35th IEEE Annual Computer Software and

Applications Conference Workshops, 410-415.


8 APPENDIX

8.1 INTERVIEW STRUCTURE

1) Health Status and Technology in General

a) The Individual’s Health Status and Supervision

i) Do you have a condition that you currently monitor?

ii) How do you monitor your condition today?

iii) What are the pros and cons about how you monitor your condition today?

iv) Do you feel informed about your health status?

b) Health Technology for Supervision of Condition - SCENARIO

i) Spontaneous thoughts/feelings about this up and coming tech? Pros and cons.

ii) Pros and cons for you compared to your current method of monitoring.

iii) Looking at the greater public, pros and cons?

2) Data Integrity & Regulations

a) Devices and Online Behaviour

i) What devices do you use today? (Computer, phone etc.)

ii) What online services do you use today? (Internet banking, e-commerce etc.)

iii) Pros and cons of using these devices and services.

iv) What makes you inclined to start using a new device or service?

(1) Are there any aspects you are specifically interested in when choosing? “What’s in it

for me?”

b) Data Collection in General

i) Are you aware of the types of personal information that could be stored about you?

(1) What do you think personal data entails?

ii) Are you aware of how this data could be used?

iii) Do you feel any worries when using these services?

(1) Why/Why not?

c) Data Collection in Health Devices

i) Are you aware of what sorts of data could be gathered from wearables?

(1) What do you think sensitive data entails?

(2) Does it matter if the data is anonymous?

ii) Are you aware of how this data could be used?

iii) Would you feel any worries when using these services?


(1) Why/Why not?

iv) What makes a service trustable?

v) Is there any limit to what you could consider sharing?

vi) Are there any incentives that would make you more inclined to share this type of data?

(1) On what conditions?

vii) Does it matter where the product is purchased? Pharmacy or store?

d) Laws & Regulations

i) Do you read the Terms & Conditions? Why/Why not?

ii) Are you aware of your laws and rights to your personal information online?

(1) Do they make you feel more or less safe?

iii) Are you aware of any laws and obligations for companies when storing your personal data?

(1) How do you know about them? Any experiences?

(2) Do they make you feel more or less safe?

iv) What would make you read the privacy policy? Company strategy etc.

v) If anything were to happen with your data that you’d feel uncomfortable with, who would

you turn to?

(1) Company, relatives, police?

vi) How do you think your complaint/inquiry would be handled?

vii) How would you like it to be handled? When would you feel satisfied?

e) Company and Institutional Trust

i) Does the harmonization of regulations make you feel safer?

ii) What are your thoughts on the upcoming regulations?

iii) Would you trust the company handling your health data to process it correctly?

(1) Does it matter which company it is?

3) Additional Comments and Discussion

- Anything you believe I forgot to ask you?

- Anything you’d like to add?