The Power of Fair Information Practices - A Control Agency Approach
Australasian Conference on Information Systems
Libaque-Saenz et al. 2015, Adelaide, South Australia, Australia
The power of fair information practices
The Power of Fair Information Practices – A Control Agency Approach
Christian Fernando Libaque-Saenz
School of Engineering
Universidad del Pacífico
Lima, Peru
Email: cf.libaques@up.edu.pe
Younghoon Chang
Department of Computing & Information Systems
Sunway University
Selangor, Malaysia
Email: younghoonc@sunway.edu.my
Siew Fan Wong
Department of Computing & Information Systems
Sunway University
Selangor, Malaysia
Email: siewfanw@sunway.edu.my
Hwansoo Lee
IT Law Program, Graduate School
Dankook University
Yongin, Republic of Korea
Email: hanslee992@gmail.com
Abstract
Most companies’ new business practices are based on customer data. These practices have raised privacy concerns because of the associated risks. Privacy laws require companies to gain customer consent before using their information, which stands as the biggest roadblock to monetising this asset. Privacy literature suggests that reducing privacy concerns and building trust may increase individuals’ intention to authorise the use of personal information. Fair information practices (FIPs) are a potential means to achieve this goal. However, there is a lack of empirical evidence on the mechanisms through which FIPs affect privacy concerns and trust. This research argues that FIPs endow individuals with control, which has been found to influence privacy concerns and trust levels. We will use an experimental design methodology to conduct the study. The results are expected to have both theoretical and managerial implications.
Keywords
Privacy concerns, trust, perceived control, fair information practices.
1 INTRODUCTION
The development of information technologies (ITs) has brought significant changes to our lives. We
use ITs for a wide range of activities, such as e-commerce, Internet banking, and social networking.
The proliferation of smartphones and the mobile convenience of these devices have increased usage of
related services. These human–computer interactions are producing huge and diverse amounts of customer data, and thus we are creating digital footprints of our everyday activities (Girardin et al. 2008). Companies, for their part, have started to heavily use and analyse these data together with other datasets to improve decision-making (Global Pulse 2012). For example, a Bloomberg Businessweek (2011) survey shows that 97 per cent of respondents’ companies (those with more than $100 million in revenue) have used some form of data analytics. Therefore, companies are expected to rely more on customer data in the future to reduce costs, develop new products and services, and stay competitive in the market (McAfee and Brynjolfsson 2012; McGuire et al. 2012).
The use of customer data, however, presents privacy challenges that should not be overlooked. ITs and
the associated data collection activities have raised customers’ privacy concerns. A recent study
reported that Facebook users who committed ‘virtual identity suicide’ were mainly motivated by
concerns over privacy (Woollaston 2013). Indeed, due to potential risks such as intrusion, public
disclosure of embarrassing private facts, false light, or appropriation, individuals are uncomfortable with their data being collected. These uncertainties may lead individuals to stop using a service, or to search for other companies that provide a similar service while protecting their privacy at the same time (Dinev and Hart 2006; Liu et al. 2004).
Given the potential benefits and challenges of this trend involving customer data, the privacy domain
has received considerable attention among researchers. Previous studies (see Table 1) have focused on
individuals’ intention to participate in activities involving the collection and use of their personal data,
such as disclosing personal data and transacting on the Internet. In this study, we use the term
‘privacy-related behavioural intention’ to refer to these intentions. From an analysis of Table 1, the
general conclusion is that privacy concerns have a negative impact on privacy-related behavioural
intentions, while trust has a positive effect on these intentions. In other words, by reducing customers’ privacy concerns and/or building their trust towards organisations, these organisations may effectively monetise customer data.
| Author (Year) | Research Findings |
|---|---|
| Bart et al. (2005) | Trust has a significant positive effect on behavioural intention to disclose personal data to buy items through the Internet |
| Chellappa and Sin (2005) | Privacy concerns have a significant negative impact on behavioural intention to use online personalisation services, and trust has a significant positiv… |