Abstract
We are living in an era where data means opportunities. New technologies have made it possible to collect and store huge amounts of data. The telecommunications sector, specifically Network Operators, is one of the industries that could benefit from these opportunities. By exploiting this asset, Network Operators could offset the revenue lost to the commoditization of traditional services by other players such as Google and Skype. However, under information privacy laws, Network Operators face many limitations in using these data without the permission of their customers. In this light, the purpose of this study is to analyze the main factors that affect individuals' intention to grant their Network Operators permission to use their personal information, and to examine the differences between smartphone and non-smartphone users. We used a survey to measure intention, privacy-related attitudes, and salient beliefs, and received 475 responses. The results are expected to have both theoretical and managerial implications. Moreover, these results may suggest to managers which strategies they should focus on to encourage their customers to give permission to use their personal information.
Key Words: Privacy, data usage, personal information, Network Operators.
INTRODUCTION
We are entering a new era where data is everywhere. New technologies have been providing capabilities
to capture and store huge amounts of data. For example, when the Sloan Digital Sky Survey was launched
in 2000 in New Mexico, the data captured and stored in the first weeks by its telescope were greater in
quantity than all the data that had been collected in the entire history of astronomy (Cukier 2010).
Big Data is a term related to this explosion of data. Even though there is no concrete definition of Big Data, Manyika et al. (2011) refer to it as data sets whose size is beyond the ability of typical database software tools to capture, store, manage, and analyze. Big Data has three main features: volume (the huge amount of data), velocity (the speed of data creation), and variety (the variety of sources). However, experts have also considered a fourth characteristic: value (providing valuable insights) (DataStax 2011). Therefore, the
huge amount of data we are creating every day through many sources may reveal trends in real time,
which can improve decision-making (UN 2012). In the business field, improving decision-making could
help organizations to enhance their performance in tasks such as minimizing risks and costs, and creating
new products and services (McGuire et al. 2012). For example, Avanade (2012) conducted a survey of
569 high-level decision-makers in 18 countries, finding that 42% of the respondents' companies have leveraged data to increase an existing revenue stream, while 31% have used it to create new sources.
Of course, not all industries have the capabilities to gain benefits from Big Data. Figure 1 shows the five
potential industries that would benefit the most, one of them being the telecommunications industry,
noting that Network Operators (telcos) have the capabilities to take advantage of this trend.
Figure 1. The five industries with the greatest potential to benefit from Big Data
Directive 95/46/EC (EC 95/46 1995), issued by the European Union, states that unambiguous consent from customers is needed for the legitimate processing of personal data¹.
Table 1. Literature Review

Author (Year) | Research Topic
Culnan and Armstrong (1999) | Disclosure of personal information in an electronic world.
Phelps et al. (2000) | Privacy concerns to disclose personal information for direct marketing.
Malhotra et al. (2004) | Measurement of internet users' information privacy concerns.
Dinev and Hart (2006) | Factors influencing willingness to disclose personal information in e-commerce.
Hui et al. (2006) | Motivators for online information disclosure.
Norberg et al. (2007) | Paradox between intention and behavior for information disclosure.
Lee (2009) | Factors affecting adoption of m-banking (e.g., benefits and privacy).
Li et al. (2010) | Privacy calculus in e-commerce.
Xu et al. (2011) | Link between individual perceptions and Institutional Privacy Assurances.
Tan et al. (2012) | Privacy concerns and social network sites (SNS).
Table 1 shows some previous studies on privacy and willingness to disclose personal information. We can see that these studies focused on e-commerce, the Internet, or SNS, but no study has addressed the privacy calculus in the telecommunications sector. Moreover, the telecommunications sector differs from the mentioned domains in two features. First, telcos already store their customers' personal information at the time they provide services; hence, we address the willingness to give permission to use this information rather than to disclose it². Second, we can divide users into two groups: non-smartphone and smartphone users. The rationale is that the differences between them are remarkable in terms of data production. Indeed, non-smartphone users mainly access SMS and voice services, while smartphone users access a wider range of services such as banking, SNS, and entertainment. Thus, an extension is needed, considering the importance of the usage of personal information for telcos.
THEORETICAL FRAMEWORK
2.1 Theory of Reasoned Action (TRA)
Human behavior has been a widely studied field in social sciences. Many of these studies used TRA
(Ajzen & Fishbein 1980) and its extended version, the theory of planned behavior (TPB) (Ajzen 1988) in
a wide range of behaviors. Even though these theories present parsimonious frameworks, results have
shown that these frameworks explain considerable variance in their dependent variables: intention to
perform behaviors and behaviors themselves (Ajzen 1985; Ajzen 1991).
TRA focuses on volitional behaviors and thus is appropriate for this study. TRA postulates that a behavior
is determined by an intention, while this intention is determined by attitude (ATT) and subjective norm
(SN). ATT reflects the individual's positive or negative evaluation of the performance of the behavior, and SN refers to the influence of specific people's expectations about the behavior. However, for a deeper understanding of the behavior, we should analyze the beliefs influencing ATT and SN: behavioral and normative beliefs, respectively. The former reflects the subjective probability that the behavior will produce a certain outcome, while the latter refers to the individual's beliefs that specific groups think they should or should not perform the behavior. These beliefs are not fixed and depend on the behavior.
¹ Personal data is defined as any information relating to an identified or identifiable natural person.
² Disclosure of information is a necessary precondition for the usage of it. The costs associated with the disclosure of personal information depend on the usage of this information, and not on the disclosure itself (Brandimarte et al. 2012).
In the case of ATT, Triandis (as cited in Davis 1989) argued that behavioral beliefs and ATT are co-determinants of intentions. Accordingly, we found studies supporting this research stream, which focused
on the direct effect of behavioral beliefs on intention (Davis 1989; Dinev & Hart 2006). On the other hand,
Ajzen (1991) concluded that personal considerations are the main contributors in predicting behavioral
intention and their effect dwarfs the influence of perceived social pressure. For example, prior research
found that SN has a weak and limited role in shaping intentions (White et al. 2009). Furthermore, some
scholars deliberately dropped SN from their data analyses (e.g., Sparks et al. 1995).
Our research follows the course of these studies. Thus, we focused on behavioral beliefs, attitudes, and
intention. Previous studies found that intention predicts behavior well (Ajzen 1991), and this relationship
becomes particularly valuable when time lags are unpredictable (Krueger Jr et al. 2000), as in our case.
From a literature review, we identified privacy-related and inhibitor-related beliefs, and privacy-related
attitudes governing behaviors that threaten personal information. Finally, we followed these two TRA rules: 1)
the more favorable (unfavorable) the ATT, the stronger (weaker) the intention; and 2) the stronger the
beliefs linking the behavior with positive (negative) outcomes, the more favorable (unfavorable) the ATT.
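These two rules correspond to TRA's standard expectancy-value formulation (Ajzen & Fishbein 1980), which can be sketched as:

```latex
BI = w_1 \cdot ATT + w_2 \cdot SN, \qquad ATT \propto \sum_{i} b_i \, e_i
```

where $BI$ is behavioral intention, $b_i$ is the strength of the belief that the behavior leads to outcome $i$, $e_i$ is the evaluation of that outcome, and $w_1$, $w_2$ are empirically estimated weights. Since this study drops SN, we effectively examine the $ATT \rightarrow BI$ path with behavioral beliefs as antecedents and co-determinants.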
2.2 Privacy-Related Attitudes: Privacy Concerns and Trust
Prior studies in the privacy domain have addressed information privacy concerns as one of their main
constructs (Culnan & Armstrong 1999; Dinev & Hart 2006). However, past research found contradictory
results about the role of privacy concerns. For example, Chellappa and Sin (2005) found a significant
direct effect of privacy concerns on behavioral intention, while Tan et al. (2012) found that privacy
concerns have a moderating effect rather than a direct effect. Therefore, this research attempts to shed light on the role of privacy concerns in a different sector, specifically, telecommunications.
Trust was found by previous research to be another important antecedent of behavioral intention in the
privacy field. Indeed, results support a direct effect of trust on intention (Bart et al. 2005; McKnight et al.
2002). However, Bart et al. (2005) concluded that the strength of this relationship varies depending on the
trustee's characteristics. Hence, an analysis of this relationship in other sectors, such as telecommunications, is needed.
2.3 Privacy-Related Beliefs
Perceived information risks and perceived benefits are also an important part of the privacy calculus. Past
research shows that both of these beliefs are present during the decision process (Culnan & Armstrong
1999; Dinev & Hart 2006). Furthermore, this notion of calculus is consistent with Petronio's (2002)
communication privacy management (CPM) theory, which claims that disclosure has both benefits and
risks, and a balance of these two variables is needed for deciding whether or not to disclose information.
In this sense, the telecommunications sector should not remain indifferent to this notion of calculus.
On the other hand, Westin (1967) defined information privacy as "the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others." Prior research has operationalized this variable through perceived control over
information (Phelps et al. 2000; Malhotra et al. 2004). Even though the findings of these studies support
the important role of perceived control on intention to disclose personal information, they are mostly
focused on the commerce domain. Therefore, the present research attempts to explore the role of
perceived control in a different environment, the telecom sector, taking into account CPM, which claims
that people develop their privacy rules depending on the context.
Similarly, in their work about fair information practices, Culnan and Armstrong (1999) addressed not only the effect of control on individuals' concerns, but also the effect of notice. Moreover, in a study about online disclosure of personal information, Hoffman et al. (1999) found that 69% of Web users did not provide personal information to websites because they were not aware of how their information would be used. These findings are consistent with Foxman and Kilcoyne's (1993) argument about privacy's dual dimension, where control is the active dimension and awareness is the passive dimension. The latter refers to the degree of consumers' awareness of organizational information privacy practices (Malhotra et al. 2004). Even though these studies agree with the claim that awareness may help to build trust, few of them have tried to quantify this effect. The present research attempts to provide clearer insights.
Another term associated with privacy is security. Even though most people use these two terms
interchangeably, they are not the same. Privacy, as we mentioned before, is the individual's claim to control their information, while security is the protection of that information (Prescient 2002). The present study focuses on individuals' perception of protection as a proxy measurement of security, in the same manner as other scholars did by analyzing perceived information protection (Li et al. 2010).
2.4 Inhibitor-Related Beliefs: Personal Investment
TPB complements TRA by adding perceived behavioral control (PBC) as a variable that captures
controllability and self-efficacy in non-volitional behaviors. Even though we framed the behavior under study within TRA because of its volitional nature, we have a special interest in the controllability
dimension of PBC. Controllability refers to individuals' judgments about the availability of resources to perform the behavior (Ajzen 2002). Principally, we will focus on time and effort, as they may be needed to perform the behavior under study. However, we take a different approach from controllability in TPB by addressing time and effort as personal investment; that is, we will focus on the role of the individual's beliefs about the resources needed to perform an action as a possible inhibitor of the behavior, regardless of the availability of these resources.
Figure 2. Research Model

3.1 Privacy Concerns (CON)
CON is the individual's level of anxiety regarding their information privacy (Lanier & Saini 2008). According to Fagan et al. (2003), anxiety is a state of mind of discomfort in threatening scenarios. In addition, Allport (as cited in Wang & Pfister 2008) defined ATT as a state of mind as well. Thus, we posit that CON is an unfavorable ATT. In this sense, following TRA's first rule, we hypothesize that CON has a direct negative effect on intention because the former reflects an unfavorable evaluation of the performance of the behavior. This statement is consistent with previous findings (Chellappa & Sin 2005; Dinev & Hart 2006), and with Vroom's (1964) expectancy theory, which postulates that individuals tend to minimize negative outcomes such as anxiety.
Hypothesis 1: CON negatively influences intention.
3.2 Trust (TRU)
TRU is the individual's perception that telcos will act according to the former's expectations (Pavlou & Fygenson 2006). Trust literature posits that TRU has two elements: cognitive (perceptions) and emotional (attitude) (Komiak & Benbasat 2006). Accordingly, Jones (1996) approached TRU as the trustor's attitude of optimism about the trustee's goodwill. Thus, we claim that TRU is a favorable ATT. Following TRA's first rule, we hypothesize that TRU has a direct positive effect on intention because the former reflects a favorable evaluation of the performance of the behavior. This hypothesis is consistent with previous research supporting this relationship (Dinev & Hart 2006; McKnight et al. 2002).
Hypothesis 2: TRU positively influences intention.
3.3 Perceived Information Risks (PIR)
PIR is the individual's beliefs about the potential loss associated with giving permission to their telcos to use their personal information (Malhotra et al. 2004). This variable is closely related to CON. However, PIR refers to the apprehension of possible loss, while CON refers to the internalization of this possibility of loss (Dinev & Hart 2006).
As we mentioned in section 3.1, CON is defined as the level of anxiety regarding privacy in threatening scenarios. Thus, we claim that high PIR may lead individuals to have high levels of CON, and vice versa for low PIR. This statement is consistent with TRA's second rule, and with risk literature, which postulates that the perception of risk can lead individuals to a state of unpleasant feelings (Dowling & Staelin 1994). In other words, a positive effect of PIR on CON is expected (Dinev & Hart 2006).
Hypothesis 3: PIR positively influences CON.
On the other hand, Mayer et al. (1995) defined TRU as the willingness to take risks. Therefore, in a
threatening scenario it is expected that as individuals' perceptions of risk rise, their intentions to take these risks (i.e., trust) decline. Following this direction, as past studies did, we hypothesize a negative direct effect of PIR on TRU (Dinev & Hart 2006), which is consistent with TRA's second rule.
Hypothesis 4: PIR negatively influences TRU.
3.4 Perceived Data Control (PDC)
PDC operationalizes subjective privacy. PDC is the individual's beliefs in their ability to manage the usage of their personal information (Westin 1967). Prior research suggests that giving individuals control over their personal information mitigates CON because control loads individuals with expectations that negative outcomes will not occur (Culnan & Armstrong 1999; Malhotra et al. 2004). Following these findings, we postulate that the individual's PDC has a negative effect on CON. This hypothesis is consistent with TRA's second rule, and with psychology literature, which suggests that a lack of perceived control can lead individuals to have anxious feelings (for a review see Rapee et al. 1996).
Hypothesis 5: PDC negatively influences CON.
On the other hand, Brandimarte et al. (2012) claimed that individuals' intention to take risks increases when they feel in control. As we mentioned in section 3.3, Mayer et al. (1995) defined TRU as the willingness to take risks. Consequently, we postulate that TRU increases when PDC does so, and vice versa when the latter decreases. This hypothesized positive effect of PDC on TRU has support in previous research findings (Das & Teng 2001; Joinson et al. 2010), and is consistent with TRA's rules.
Hypothesis 6: PDC positively influences TRU.
3.5 Perceived Information Protection (PIP)
PIP is the individual's perception that telcos have the ability to safeguard their personal information from security breaches (Pavlou & Fygenson 2006). Indeed, individuals have no power to protect their information stored by telcos. In this sense, Yamaguchi (as cited in Xu 2007) claimed that in such cases where individuals lack power, they are motivated to rely on proxy control. Proxy control is the individual's belief that powerful others have the ability to achieve the former's desired outcomes. Thus, PIP and proxy control are similar concepts. Furthermore, Xu (2007) found that individuals use proxy control to alleviate their CON, which is consistent with TRA's second rule. Hence, as we did in section 3.4, here we find support in psychology literature to claim that, regardless of the control agent (i.e., self or others), the lower the perception of control, the higher the CON (i.e., a negative effect of PIP on CON).
Hypothesis 7: PIP negatively influences CON.
Moreover, Gefen et al. (2003) claimed that trust is a three-dimensional construct, made up of competence, integrity, and benevolence. We are interested in competence, which is the trustor's perception that the trustee has the ability to perform according to the former's expectations (Pavlou & Fygenson 2006). Accordingly, we hypothesize a positive effect of PIP on TRU, as previous research findings suggest (Chellappa & Pavlou 2002; Liu et al. 2004). The rationale is that individuals build their TRU based on the telcos' ability to keep information secure (i.e., PIP). This statement is consistent with TRA's second rule.
Hypothesis 8: PIP positively influences TRU.
3.6 Privacy Policy Awareness (PPA)
PPA is the individual's perception of their understanding of telcos' privacy practices (Malhotra et al. 2004). As we explained in section 2.3, this variable is a passive dimension of control. In addition, the literature about institutional privacy assurances posits that privacy policies influence decisions about the disclosure of personal information (Xu et al. 2011). The rationale is that privacy policies load individuals with proxy control over their personal information because customers attempt to align themselves with this powerful force (Xu 2007). Then, as we did in sections 3.4 and 3.5, following psychology literature, we hypothesize a negative effect of PPA (i.e., proxy control) on CON, which is consistent with TRA's second rule and with previous findings (Culnan & Armstrong 1999; Malhotra et al. 2004).
Hypothesis 9: PPA negatively influences CON.
Also, privacy policies may push companies to refrain from opportunistic behavior (Xu 2007). In other words, privacy policies may load individuals with beliefs that telcos will act according to the former's expectations, fulfilling the concept of TRU. Consequently, we hypothesize a positive effect of PPA on TRU. This hypothesis is supported by prior research findings (Liu et al. 2004; Milne & Boza 1999), and is consistent with TRA's second rule.
Hypothesis 10: PPA positively influences TRU.
3.7 Perceived Benefits (PMB and PNB)
Individuals may be willing to engage in threatening behaviors if they can obtain benefits from them
(Culnan & Armstrong 1999); that is, perceived benefits could work as facilitators (Hui et al. 2006). In
fact, perceived benefits are beliefs linking a behavior with positive outcomes. In this case, we follow the approach described in section 2.1 about beliefs and ATT as co-determinants of intention, in the same fashion as other scholars did (Lee 2009). Then, following TRA's rules, we posit that perceived benefits have a positive effect on intention. This hypothesis is consistent with expectancy theory, which claims that individuals tend to maximize positive outcomes. We divided benefits into PMB and PNB.
Hypothesis 11: PMB positively influences intention.
Hypothesis 12: PNB positively influences intention.
3.8 Personal Investment (INV)
INV is the individual's beliefs about the resources they will need to give permission to their telcos to use their personal information (adapted from Ajzen 2002). Economics theories approach resources as learning costs (before action) that could become sunk costs (after action). The former refers to the perceived resources that will be expended in performing an action, while the latter refers to the perceptions of non-recoverable resources invested in performing an action (Jones et al. 2002). Hence, INV represents the learning costs of performing the behavior under study. Following this economic approach, cost-benefit models suggest a negative effect of perceived costs on intention (Frazier 1983), which is consistent with TRA's rules. Thus, we hypothesize that INV has a negative direct effect on intention.
Hypothesis 13: INV (time and effort) negatively influences intention.
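Taken together, the hypothesized relationships can be restated compactly as three sets of structural relations, with the sign of each hypothesized path in parentheses (here $f$ denotes a generic structural function, not an estimated form):

```latex
\begin{aligned}
INT &= f\!\left(CON\,(-),\; TRU\,(+),\; PMB\,(+),\; PNB\,(+),\; INV\,(-)\right) \\
CON &= f\!\left(PIR\,(+),\; PDC\,(-),\; PIP\,(-),\; PPA\,(-)\right) \\
TRU &= f\!\left(PIR\,(-),\; PDC\,(+),\; PIP\,(+),\; PPA\,(+)\right)
\end{aligned}
```

where INT denotes intention. CON and TRU thus mediate the effects of the four belief variables, while the benefit and investment beliefs act directly on intention.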
RESEARCH METHODOLOGY
The survey method was used in this research. All measurement items were drawn from a literature review and adapted to the present research. A refinement procedure was conducted based on a pilot test, which consisted of 30 preliminary samples from mobile users at KAIST. All constructs were measured on a seven-point Likert scale. We used SmartPLS as a tool for checking reliability and validity measures. For the full-scale test, we collected 475 responses from actual mobile users in Korea. We used an online panel company (Wise Research Co.) to collect the samples. The respondents were assured that the data would be collected and the results reported with their anonymity protected. The full-scale sample was composed of 240 males (50.5%) and 235 females (49.5%). The age of the respondents was almost equally distributed, and the respondents' mobile carriers reflected the current market shares of the Korean mobile market. We are still analyzing the results, and they will be presented in future studies.
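As one illustration of the reliability checks described above, the internal consistency of a multi-item Likert construct is commonly assessed with Cronbach's alpha. The sketch below uses synthetic data; the function name, sample sizes, and the conventional 0.7 acceptance threshold are illustrative assumptions, not values taken from this study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Synthetic 7-point Likert responses: 30 respondents, 3 correlated items
rng = np.random.default_rng(42)
base = rng.integers(1, 8, size=(30, 1))    # shared latent response, 1..7
noise = rng.integers(-1, 2, size=(30, 3))  # small item-specific noise
items = np.clip(base + noise, 1, 7).astype(float)

alpha = cronbach_alpha(items)
print(f"alpha = {alpha:.2f}")  # items share a common component, so alpha is high
```

Constructs whose alpha falls below the conventional 0.7 cutoff would typically be refined in the pilot stage before the full-scale test.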
IMPLICATIONS (EXPECTED)
5.1 Theoretical Implications
Above all, this study pioneers the privacy calculus in the telecom sector. We provide a comprehensive model that describes the privacy calculus performed by individuals when deciding whether to give telcos permission to use their personal information. Moreover, a comparison between smartphone and non-smartphone users will be provided. Second, as an application of TRA, the present research establishes beliefs and attitudes that could be adapted to related behavioral studies. Moreover, the findings from our study can be referenced by other countries experiencing similar phenomena. Third, as a contribution to privacy studies, we expect to clarify the effect of privacy concerns on intention, validate the role of trust and control in a new environment, quantify the effect of policy awareness on trust, and shed light on the role of resources from a learning-cost perspective.
5.2 Managerial Implications
The proposed model shows a set of factors that managers in the telecom sector could act on to encourage their customers to authorize the usage of their personal information. Therefore, managers should focus on the significant factors that have a positive relationship with intention. By incorporating these factors into their strategies, telcos will be equipped with a valuable asset in the Big Data era.
References
Ajzen, I. (1985). 'From intentions to actions: A theory of planned behavior', in J. Kuhl and J. Bechmann
(Eds.), Action-control: From cognition to behavior, 11-39. Springer, Heidelberg.
Ajzen, I. (1988). Attitudes, personality and behavior. Dorsey Press, Chicago, IL.
Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision
Processes, 50 (2), 179-211.
Ajzen, I. (2002). Perceived behavioral control, self-efficacy, locus of control, and the theory of planned
behavior. Journal of Applied Social Psychology, 32 (4), 665-683.
Ajzen, I. and Fishbein, M. (1980). Understanding attitudes and predicting social behaviour. Prentice-Hall, Englewood Cliffs, NJ.
Avanade (2012). Global survey: Is Big Data producing big returns? Avanade, Seattle, WA,
<http://www.avanade.com/Documents/Research%20and%20Insights/avanade-big-data-executivesummary-2012.pdf>.
Bart, Y., Shankar, V., Sultan, F. and Urban, G.L. (2005). Are the drivers and role of online trust the same
for all web sites and consumers? A large-scale exploratory empirical study. Journal of Marketing, 69
(4), 133-152.
Brandimarte, L., Acquisti, A. and Loewenstein, G. (2012). Misplaced confidences: Privacy and the
control paradox. Social Psychological and Personality Science, 0 (0), 1-8.
Chellappa, R.K. and Pavlou, P.A. (2002). Perceived information security, financial liability and consumer
trust in electronic commerce transactions. Logistics Information Management, 15 (5/6), 358-368.
Chellappa, R.K. and Sin, R.G. (2005). Personalization versus privacy: An empirical examination of the
online consumer's dilemma. Information Technology and Management, 6 (2-3), 181-202.
Columbus, L. (2012). Roundup of Big Data forecasts and market estimates, 2012. Forbes, 16 August,
viewed on 20 December 2012, <http://www.forbes.com/sites/louiscolumbus/2012/08/16/roundup-ofbig-data-forecasts-and-market-estimates-2012/>.
Cukier, K. (2010). Data, data everywhere. The Economist, 25 February, viewed on 20 January 2013,
<http://www.economist.com/node/15557443>.
Culnan, M.J. and Armstrong, P.K. (1999). Information privacy concerns, procedural fairness, and
impersonal trust: An empirical investigation. Organization Science, 10 (1), 104-115.
Das, T.K. and Teng, B.-S. (2001). Trust, control, and risk in strategic alliances: An integrated framework.
Organization studies, 22 (2), 251-283.
DataStax (2011). Big Data: Beyond the hype. Why Big Data matters to you. DataStax, Burlingame, CA,
<http://odbms.org/download/WP-DataStax-BigData.pdf>.
Davis, F.D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information
technology. MIS Quarterly, 13 (3), 319-340.
Detica (2012). Make Big Data smaller. The Big Data opportunity for telecom operators. BAE Systems
Detica, Guildford,
<https://www.baesystemsdetica.com/uploads/resources/BigData_Telecoms_Vertical_slick_single_pag
es.pdf>.
Dharia, N. (2012). Ovum figures indicate that operators will lose $54bn by 2016 due to smartphone
messaging apps. OVUM, 11 October, viewed on 20 March 2013,
<http://ovum.com/press_releases/ovum-figures-indicate-that-operators-will-lose-54bn-by-2016-dueto-smartphone-messaging-apps/>.
Dinev, T. and Hart, P. (2006). An extended privacy calculus model for e-commerce transactions.
Information Systems Research, 17 (1), 61-80.
Dowling, G.R. and Staelin, R. (1994). A model of perceived risk and intended risk-handling activity.
Journal of Consumer Research, 21 (1), 119-134.
EC 95/46 (1995). Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data. Official Journal of the European Communities, No L 281, 31-50.
Fagan, M.H., Neill, S. and Wooldridge, B.R. (2003). An empirical investigation into the relationship
between computer self-efficacy, anxiety, experience, support and usage. Journal of Computer
Information Systems, 44 (2), 95-104.
Foxman, E.R. and Kilcoyne, P. (1993). Information technology, marketing practice, and consumer
privacy: Ethical issues. Journal of Public Policy & Marketing, 12 (1), 106-119.
Frazier, G.L. (1983). Interorganizational exchange behavior in marketing channels: A broadened
perspective. The Journal of Marketing, 47 (4), 68-78.
Gefen, D., Karahanna, E. and Straub, D.W. (2003). Trust and TAM in online shopping: An integrated
model. MIS Quarterly, 27 (1), 51-90.
Green, J. (2012). Ovum reveals "death of telephone" is exaggerated as OTT VoIP predicted to cost telcos
$479bn by 2020. OVUM, 25 October, viewed on 25 March 2013,
<http://ovum.com/press_releases/ovum-reveals-death-of-telephone-is-exaggerated-as-ott-voippredicted-to-cost-telcos-479bn-by-2020/>.
Hoffman, D.L., Novak, T.P. and Peralta, M. (1999). Building consumer trust online. Communications of
the ACM, 42 (4), 80-85.
Hui, K.-L., Tan, B.C. and Goh, C.-Y. (2006). Online information disclosure: Motivators and
measurements. ACM Transactions on Internet Technology (TOIT), 6 (4), 415-441.
Joinson, A.N., Reips, U.-D., Buchanan, T. and Schofield, C.B.P. (2010). Privacy, trust, and self-disclosure online. Human-Computer Interaction, 25 (1), 1-24.
Jones, K. (1996). Trust as an affective attitude. Ethics, 107 (1), 4-25.
Jones, M.A., Mothersbaugh, D.L. and Beatty, S.E. (2002). Why customers stay: Measuring the
underlying dimensions of services switching costs and managing their differential strategic outcomes.
Journal of Business Research, 55 (6), 441-450.
Komiak, S.Y.X. and Benbasat, I. (2006). The effects of personalization and familiarity on trust and
adoption of recommendation agents. MIS Quarterly, 30 (4), 941-960.
Krueger Jr, N.F., Reilly, M.D. and Carsrud, A.L. (2000). Competing models of entrepreneurial intentions.
Journal of Business Venturing, 15 (5), 411-432.
Lanier, C.D. and Saini, A. (2008). Understanding consumer privacy: A review and future directions.
Academy of Marketing Science Review, 12 (2), 1-48.
Lee, M.-C. (2009). Factors influencing the adoption of internet banking: An integration of TAM and TPB
with perceived risk and perceived benefit. Electronic Commerce Research and Applications, 8 (3),
130-141.
Li, H., Sarathy, R. and Xu, H. (2010). Understanding situational online information disclosure as a
privacy calculus. Journal of Computer Information Systems, 51 (1), 62-71.
Liu, C., Marchewka, J.T., Lu, J. and Yu, C.-S. (2004). Beyond concern: A privacy-trust-behavioral intention model of electronic commerce. Information & Management, 42 (1), 127-142.
Malhotra, N.K., Kim, S.S. and Agarwal, J. (2004). Internet users' information privacy concerns (IUIPC):
The construct, the scale, and a causal model. Information Systems Research, 15 (4), 336-355.
Manyika, J., Chui, M., Brown, B., Bughin, J., Dobbs, R., Roxburgh, C. and Byers, A.H. (2011). Big Data:
The next frontier for innovation, competition, and productivity. McKinsey Global Institute, June,
viewed on 15 December 2012,
<http://www.mckinsey.com/insights/business_technology/big_data_the_next_frontier_for_innovation
>.
Mayer, R.C., Davis, J.H. and Schoorman, F.D. (1995). An integrative model of organizational trust.
Academy of Management Review, 20 (3), 709-734.
McGuire, T., Manyika, J. and Chui, M. (2012). Why Big Data is the new competitive advantage. Ivey
Business Journal, July / August, viewed on 20 January 2013,
<http://www.iveybusinessjournal.com/topics/strategy/why-big-data-is-the-new-competitiveadvantage>.
McKnight, D.H., Choudhury, V. and Kacmar, C. (2002). Developing and validating trust measures for e-commerce: An integrative typology. Information Systems Research, 13 (3), 334-359.
Milne, G.R. and Boza, M.-E. (1999). Trust and concern in consumers' perceptions of marketing information management practices. Journal of Interactive Marketing, 13 (1), 5-24.
Norberg, P.A., Horne, D.R. and Horne, D.A. (2007). The privacy paradox: Personal information
disclosure intentions versus behaviors. Journal of Consumer Affairs, 41 (1), 100-126.
Pavlou, P.A. and Fygenson, M. (2006). Understanding and predicting electronic commerce adoption: An
extension of the theory of planned behavior. MIS Quarterly, 30 (1), 115-143.
Petronio, S.S. (2002). Boundaries of privacy: Dialectics of disclosure. Suny Press, Albany, NY.
Phelps, J., Nowak, G. and Ferrell, E. (2000). Privacy concerns and consumer willingness to provide
personal information. Journal of Public Policy & Marketing, 19 (1), 27-41.
Prescient (2002). Ensuring privacy of information encrypted relational database model (ERDM).
Prescient International Inc., North York, Ontario,
<http://www.prescient.net/pdf/ERDMTechnicalPaper.pdf>.
Rapee, R.M., Craske, M.G., Brown, T.A. and Barlow, D.H. (1996). Measurement of perceived control
over anxiety-related events. Behavior Therapy, 27 (2), 279-293.
Sheina, M. and Bali, S. (2012). Dialing into telco data. Reliable data management drives smarter telco
business. OVUM, London, <http://www.sas.com/reg/gen/uk/telco-data-report?page=reg>.
Sparks, P., Shepherd, R., Wieringa, N. and Zimmermanns, N. (1995). Perceived behavioural control,
unrealistic optimism and dietary change: An exploratory study. Appetite, 24 (3), 243-255.
Tan, X., Qin, L., Kim, Y. and Hsu, J. (2012). Impact of privacy concern in social networking web sites.
Internet Research, 22 (2), 211-233.
TRUSTe (2011). Smart privacy for smartphones. Understanding and delivering the protection consumers
want. TRUSTe, San Francisco, CA, <http://www.truste.com/why_TRUSTe_privacy_services/harrismobile-survey/>.
UN (2012). Big Data for development: Challenges and opportunities. United Nations Global Pulse, New
York, NY, <http://www.unglobalpulse.org/sites/default/files/BigDataforDevelopmentUNGlobalPulseJune2012.pdf>.
Vroom, V.H. (1964). Work and motivation. Wiley, New York, NY.
Wang, Y.A. and Pfister, R.E. (2008). Residents' attitudes toward tourism and perceived personal benefits
in a rural community. Journal of Travel Research, 47 (1), 84-93.
Westin, A.F. (1967). Privacy and freedom. Atheneum, New York, NY.
White, K.M., Smith, J.R., Terry, D.J., Greenslade, J.H. and McKimmie, B.M. (2009). Social influence in
the theory of planned behaviour: The role of descriptive, injunctive, and ingroup norms. British
Journal of Social Psychology, 48 (1), 135-158.
Xu, H. (2007). The effects of self-construal and perceived control on privacy concerns. In Proceedings of
the 28th Annual International Conference on Information Systems (ICIS 2007), Montréal, Canada.
Xu, H., Dinev, T., Smith, J. and Hart, P. (2011). Information privacy concerns: Linking individual
perceptions with institutional privacy assurances. Journal of the Association for Information Systems,
12 (12), 798-824.