Appl. Sci. 2020, 10, 6628
Article
Continuance Use of Cloud Computing in Higher
Education Institutions: A Conceptual Model
Yousef A. M. Qasem *, Rusli Abdullah *, Yusmadi Yah and Rodziah Atan
Department of Software Engineering and Information System, Faculty of Computer Science and Information
Technology, Universiti Putra Malaysia, Serdang 43400, Malaysia; yusmadi@upm.edu.my (Y.Y.);
rodziah@upm.edu.my (R.A.)
* Correspondence: y.alsharaei@gmail.com (Y.A.M.Q.); rusli@upm.edu.my (R.A.)
Received: 29 July 2020; Accepted: 1 September 2020; Published: 23 September 2020
Abstract: Resource optimization is a key concern for Higher Education Institutions (HEIs).
Cloud Computing, as the recent generation in computing technology of the fourth industrial
revolution, has emerged as the main standard of service and resource delivery. As cloud computing
has grown into a mature technology and is being rapidly adopted in many HEIs across the world,
retaining customers of this innovative technology has become a challenge to the cloud service
providers. Current research trends on cloud computing have sought to study the acceptance or
adoption of technology; however, little research has been devoted to the continuance use in an
organizational setting. To address this gap, this study aims to investigate the antecedents of cloud
computing continuance use in HEIs. Hence, drawing on the prior literature in organizational-level
continuance, this research established a conceptual model that extends and contextualizes the IS
continuance model through the lens of the TOE framework (i.e., technological, organizational,
and environmental influences). The results of a pilot study, conducted through a survey with
information and communications technology (ICT) decision makers, and based on the proposed
conceptual model, indicate that the instrument is both reliable and valid, and so point the way
towards further research. The paper closes with a discussion of the research limitations, contribution,
and future directions.
1. Introduction
Cloud computing (CC) is increasingly becoming a springboard for digital innovation and
organizational agility. Higher education institutions (HEIs) face problems with increasing numbers of
participants, a growing need for IT and infrastructure, the quality of educational provision, and the
affordability of education services [1,2]. Given the high rate at which IT changes, resource management
optimization is a key concern for HEIs [3], not least because on-premise systems can only operate
effectively when they receive adequate initial funding and resources, as well as dedicated and
systematic maintenance regimes [4,5]. Institutions looking to compete in the new world need a flexible
yet comprehensive digital transformation blueprint that integrates various technologies across the
institution with CC being at its foundation. CC, as the current generation in computing technology of
the fourth industrial revolution (IR 4.0), has emerged as the main standard of service and resource
delivery [6], and it has become an excellent alternative for HEIs to support cost reduction, quality
improvement and, through this, educational sustainability [7] by providing the required infrastructure,
software, and storage as a service [3]. Thus, CC has been adopted rapidly in both private and public
organizations, including HEIs [3,8,9]. Now the fifth most frequently used utility after gas, electricity,
water, and telephone lines [10], CC continues to grow: the CC tracking poll from the International Data
Corporation indicates that spending on CC will reach USD 370 billion by 2022, corresponding to a 22.5%
five-year compound annual growth rate [11].
However, while the subscription model of cloud services contributes to the growth of the
overall market and makes it accessible for HEIs, a new set of challenges has arisen. This research
addresses one such challenge faced by cloud service providers: retaining organizational customers.
The possibility that an institution will decide to discontinue a cloud service provider is exacerbated
by the low cost of switching between applications [12] and, in general, by competitive markets [4].
Therefore, the conceptualization of CC service in HEIs changes to a decision on ‘continuance’,
rather than ‘adoption’.
Moreover, in subscription models offered via cloud-based education systems, it is possible for HEIs
to switch vendors if they perceive greater benefits elsewhere. Thus, it is essential to understand the
conceptual differences between adoption and continuance [13]. Hence, research on CC continuance has
practical and artifact-specific motivations. Furthermore, theoretical research on organizational-level
continuance is also scarce [14], particularly in HEIs [9,15]. Typically, continuance research has been
undertaken at the individual user level; however, organizational continuance decisions are often made
by senior IS executives or others in the organization who may not be intense users of the service in
question [14]. For many of these executive decision makers, a strong influence may be attributed
to factors that are insignificant for individual users (e.g., lowering organizational costs or shifting a
strategic goal) [16].
Thus, to contribute to the body of knowledge on the organizational-level continuance, this study
draws on the prior literature in organizational-level continuance to establish a conceptual model
that extends and contextualizes the IS continuance model to improve our understanding of the
determinants of CC continuance use in HEIs through the lens of the TOE framework (i.e., environmental,
organizational, and technological influences). Following the establishment of the conceptual model [15],
the model was validated by conducting a pilot study with senior decision makers, all of whom were
asked about aspects of their organizations relating to CC services. Therefore, the question this study
sought to address was the following: “What constructs influence the organizational-level continuance of CC
in HEIs?” To address this question, our research relies on a positivist quantitative-empirical research
design. In this research, the unit of analysis (i.e., a basic element of observation indicating who or what
the researcher has generalized) [17] is the organization, and the organization-level phenomenon will be
observed by individuals involved in organizational CC subscription decisions at an organization [17].
In general, we contribute to research in several ways. First, and most importantly, this research
contributes to the body of knowledge within the IS field surrounding the continuance phenomenon.
In practical settings, many organizations are concerned with reducing capital IT expenses [3,18,19]
and IS services allow client organizations to select from various services that they can continue using
or discontinue using [8,9]. Therefore, conducting research that focuses on the continuance of IS will
play a critical role in theory and practice. Second, we develop a conceptual model that provides a clear
perspective through which HEIs can answer the question related to their use of CC services: “should
we go, or should we stay?” Third, the results of the full-scale research will assist IT decision makers
in CC when seeking to optimize institutional resource utilization, or to commission and market CC
projects. As a case in point, the results can serve as guidelines that cloud service providers will use
to focus their efforts towards retaining customers. These can also be leveraged by clients to guide
routine assessments over whether the use of a specific CC service should be discontinued. Fourth,
the study is expected to contribute to developing the literature in the best available organizational-level
continuance models for HEI settings. Last, in providing a model for CC continuance use, we provide
a new explanation for organizations’ continuance use of novel technologies. Measuring the model
constructs not only reflectively but also formatively would add little to the practical contribution of the
study. Thus, further quantitative and qualitative research about the conceptualized model and its
relationships is needed.
The structure of the rest of this study is as follows. A literature review is given in the next section.
Theoretical models are then identified and analyzed, and a conceptual model for exploring the continuance
of CC in HEIs is proposed. In turn, the method is explained, and the preliminary results of the study
are presented. Finally, the study’s results are discussed, their implications are examined, and the
contributions of the research are outlined.
Life Cycle of an Information System

This study belongs to the well-established stream of literature that has examined the phenomenon
of “technology adoption”, which was initially defined by Rogers [81] as a five-step process. Since then,
a range of models has been developed to extend Rogers’ preliminary work [82–84]. According to some
researchers, adoption is a multi-phase process rather than a binary decision [85–88], and for some
theoreticians, adoption occurs over seven rather than five stages [89,90]. However, most researchers
agree that technology adoption operates across the following five stages: awareness, interest, evaluation,
trial, and continuance. Several researchers advocated a four-phase model, which involved initiation,
adoption, decision, and implementation (e.g., [81,91,92]). However, certain studies have concentrated
their empirical understanding of technology adoption into a single phase (e.g., adoption or
pre-/post-adoption) [14]. This perspective is consistent with the wide-ranging stages of technology
adoption that have been investigated in previous studies [93–95] (see Figure 1). Certain studies have
reported that, in terms of the methods that are available in technology adoption research, many are
limited because they do not differentiate between changes in the significance of factors in the different
phases of adoption [96]. Therefore, opportunities, as well as suitable research settings for exploring
continuance as the last phase of adoption, have been limited.

In Table 1, previous studies have adapted various IS theories and empirically analyzed CC in
different contexts from an individual or organizational viewpoint in the pre-adoption or post-adoption
(i.e., continuance use) phase. However, no empirical study was found measuring the continuance use of
CC in HEIs. Therefore, the main contribution of this study is to develop an instrument and conceptualize
a model to measure the continuance use of CC in the context of HEIs. Besides this context-specific
contribution, our study also reduces the gap related to organizational IS continuance research.

Table 1. Literature on CC Continuance.

[Table 1 compares this research with prior studies ([14], [72], [75], [76] *, [71] **, [77], [45] *, [78],
[79], [70], [80], [49]) along four dimensions: level of analysis (IND, ORG), adoption phase (PRE, POST),
theoretical perspective (ISC, ISS, ISD, TOE, OTH), and type (EMP, THEO). Column sums: IND = 4, ORG = 10,
PRE = 2, POST = 13, ISC = 5, ISS = 6, ISD = 3, TOE = 2, OTH = 3, EMP = 12, THEO = 1.]

Legend: IND = Individual; ORG = Organizational; PRE = Pre-Adoption; POST = Post-Adoption;
ISC = Information System Continuance; ISS = IS Success Model; ISD = IS Discontinuance Model;
TOE = Technology–Organization–Environment Framework; OTH = Others; EMP = Empirical;
THEO = Theoretical/Conceptual. * Study examines adopters’ and non-adopters’ intention to increase the
level of sourcing; thus, it is categorized as adoption. ** Study examines adopters’ intention at individual
and organizational levels; thus, it is categorized as an individual.

Figure 1. Life cycle of an IS [95].
A range of theories have been used to study adoption (e.g., UTAUT or TAM), continuance (e.g.,
ISC), and discontinuance at the individual level. Contrastingly, scholars have used theories such as
TOE, DOI, and social contagion to examine adoption from an organizational perspective. Additionally,
the ISS and ISD models have been leveraged to investigate continuance and discontinuance, respectively,
at the organizational level. Table 2 provides an overview of the various theoretical approaches that have
been used in the literature to examine the lifecycle of an IS. Dissimilar to studies that have focused on
the individual level, those that have addressed continuance and discontinuance at the organizational
level are few and far between [9,14,16,97].
After initially adopting an IS, a user decides to either continue or discontinue the IS adoption.
Contrastingly, an organization is unlikely to retire or replace its on-premise services [106]. Nevertheless,
since almost all CC services involve a subscription model, and since this study’s focal point is the issue
of continuance, the aim of this study is not to examine factors that influence the use of CC services in
HEIs. Instead, the main area of focus in the current study is the set of constructs that contribute to IS
continuance. An implication of this is that it is possible to evaluate success and system performance,
which is dissimilar to the pre-adoption phase, in which only expectations can be used to estimate usage.
This also enables the integration of post-adoption variables as predictors of IS adoption continuance,
thereby exerting far-reaching impacts on model development.
to a sequence of adoption decisions that are not related to factors such as timing or the behavioral
stage [110]. Hence, perceived usefulness should have a direct impact on IS continuance intention and
an indirect effect on IS continuance intention via satisfaction. In the literature, the ISC model
has primarily been employed to examine continuance use from an individual perspective. However,
the model has been extended for organizational post-adoption studies (e.g., [72]); consequently, it is
associated with a substantial level of external validity. Besides, the ISC model has been used to explain
the continuance phenomenon in the education context (e.g., [111–113]).
The focal point of the ISC model is an individual’s continued acceptance of technology, whereas
the purpose of this study is to address organizational continuance use. However, several researchers
have extended the model for the organizational post-adoption context (e.g., [72]). As suggested by the
TOE model, a critical point of difference between organizational and individual continuance use
settings is that the technology adoption decisions made by organizations are typically informed both
by technology factors associated with individual beliefs (e.g., satisfaction) and by organizational factors
such as external threats and opportunities. Therefore, to fine-tune the ISC model to address the research
problem, it is necessary to supplement factors in the organizational continuance context, especially
those relating to organizational and environmental settings [72]. Given that researchers need to choose
a theory or model that is appropriate for the research setting (in this case, continuance use at the
organizational level), this study substitutes the perceived usefulness construct of the ISC model,
justifying the substitution with logical reasoning. Perceived usefulness is generally considered the most
relevant technological factor that informs IS post-adoption behavior in the ISC model, and a range of
studies, including [72,114], have employed it as a baseline model. Nevertheless, the theory of planned
behavior (TPB) [115] suggests that net benefits ought to be viewed as a behavioral belief (that is,
perceived usefulness) [116]. For this reason, the net benefit construct taken from the IS success model is
used instead of perceived usefulness, thereby achieving an effective fit with the research setting.
studies of system performance [14]. Regarding the service quality construct, this was removed for
the following reasons: firstly, it is comparable to the complete idea of IS success in several ways,
where system and quality constitute a “functional” aspect, while the effects constitute the “technical”
aspects in the “operational” IS (in this case, the system is considered a set of services); and secondly,
in order to assess the service provider’s services, a narrow perspective of service quality is involved.
In CC service research, an assessment of this kind would be an antecedent rather than a measure.
Given these considerations, several constructs, namely information quality, system quality, and net
benefits, are integrated into the conceptual model of this study.
quality, firm size and scope, and the internal social network. Finally, the environmental context
demonstrates that an organization’s IS adoption is significantly impacted by constructs that lie outside
of its direct control (e.g., competitors, government regulations, and supply chains) [102]. In view
of these considerations, it is clear that the TOE framework can play an effective role in identifying
non-technology-level factors that have not been considered in other studies on consumer software
(e.g., constructs relevant to external circumstances) [140]. Additionally, the TOE framework helpfully
interprets the notion of adoption behavior based on the following technological innovations: firstly,
innovations applied for technical tasks (i.e., type 1 innovations); secondly, innovations relating to
business administration (i.e., type 2 innovations); and thirdly, innovations integrated into core business
procedures (i.e., type 3 innovations) [141].
Along with the technological and organizational variables of continuance use, which were
derived from models of IS continuance, discontinuance, and success, constructs were identified that
affect organizational and environmental persistence, especially insofar as they relate to CC in HEIs.
As a result of this process, collaboration was identified as an organizational variable [42,142–144],
while regulatory policy [145–147] and competitive pressure [72,145] were identified as environmental
variables. Collaboration tasks lie at the heart of HEIs, and collaboration can be conceptualized as the
ability of CC services to facilitate communication among stakeholders [42,142]. In the case of digital
natives, CC services play a vital role in effective collaboration [148,149]. Table 3 presents a mapping
matrix for the continuance use constructs and theories, each of which has been obtained from the
extant and related literature [150,151].
Table 3. Mapping matrix of model constructs from ISC, ISS, ISD, and TOE.

[Table 3 maps each theory/model and its technology/dependent variable to the constructs it contributes,
grouped by TOE context: Technology (Confirmation, Satisfaction, Net Benefits, Information Quality,
System Quality, System Integration, Technology Integration), Organization (Collaboration),
and Environment (Regulatory Policy, Competitive Pressures). The mapped sources are: ISD, organizational
level information system discontinuance intentions [16]; ISC, information system continuance [101];
ISS, information system success [105,120]; ECM & TOE, Enterprise 2.0 post-adoption [72]; TAM, continuance
intention to use CC [75]; ISC & OTH, disruptive technology continuous adoption intentions [76];
ISS, CC evaluation [77]; ISS & ISD, cloud-based enterprise systems [78]; ISC, SaaS-based collaboration
tools [45]; ISC, CC client-provider relationship [70]; ISC, operational cloud enterprise system [152];
OTH, usage and adoption of CC by SMEs [143]; TOE, knowledge management systems diffusion [153];
TCT, information technology adoption behavior life cycle [154]; ISC, wearable continuance [155].]

Legend: TOE = Technology–Organization–Environment Framework; ISC = Information System Continuance Model;
ISS = IS Success Model; ISD = IS Discontinuance Model; TAM = Technology Acceptance Model; TCT = Technology
Continuance Theory; OTH = Others.
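To make the mapping in Table 3 easier to reuse (for example, when generating a questionnaire section per construct), the construct-to-context grouping described in the text can be sketched as a plain data structure. This is an illustrative sketch only: the placement of the belief and quality constructs under the technological context follows their ISC/ISS/ISD derivation as described above, and the helper name is ours, not the authors'.

```python
# Illustrative sketch (not the authors' code): TOE grouping of the model's
# constructs. Collaboration is organizational; regulatory policy and
# competitive pressures are environmental; the remaining constructs are
# treated as technological, being derived from the ISC, ISS, and ISD models.
TOE_CONSTRUCTS = {
    "technology": [
        "confirmation", "satisfaction", "net benefits",
        "information quality", "system quality",
        "system integration", "technology integration",
    ],
    "organization": ["collaboration"],
    "environment": ["regulatory policy", "competitive pressures"],
}

def context_of(construct: str) -> str:
    """Return the TOE context a construct belongs to (hypothetical helper)."""
    for context, constructs in TOE_CONSTRUCTS.items():
        if construct in constructs:
            return context
    raise KeyError(f"unknown construct: {construct}")
```

Such a structure makes the grouping assumption explicit and checkable, e.g., `context_of("collaboration")` yields `"organization"`.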
Figure 3. Research Model.
Based on a positivist, deterministic philosophical paradigm, a priori assumptions in the form of
hypotheses were established for later statistical analysis to facilitate model validation. The propositions
focus on the link between the independent variables, encompassing the IS continuance model, IS success
model, IS discontinuance model, and TOE framework, and the dependent variable, namely CC continuance use.

The relationships among perceived usefulness and confirmation, continuance intention, and satisfaction,
as noted by Bhattacherjee [101] in the context of system acceptance, are relevant for investigating CC
continuance in HEIs. In this study, perceived usefulness was substituted by net benefits, which is a
cognitive belief relevant to IS use [157]. In the context of TAM, perceived usefulness is considered a
user’s belief towards system usefulness [98], whereas in the organizational context, net benefit is
considered a belief about the degree to which IS promotes organizational objectives. This definition is
aligned with other organizational-level definitions [157,158]. We thus propose the following hypotheses:

Hypothesis 1 (H1). An institution’s satisfaction level with initial CC adoption positively influences its CC
continuance use.

Hypothesis 2a (H2a). An institution’s extent of confirmation positively influences its satisfaction with CC use.

Hypothesis 2b (H2b). An institution’s net benefits from CC use positively influence its satisfaction with CC use.

Hypothesis 3a (H3a). An institution’s net benefits from CC use positively influence its CC continuance use.

Hypothesis 3b (H3b). An institution’s extent of confirmation positively influences its net benefits from CC use.

The relationships among system quality, information quality, and continuance intention [120] in
the context of IS success can also be applied to CC continuance use in HEIs. Prior studies have
examined the relationships among system quality, information quality, and satisfaction [77,159–165].
Hence, it follows:

Hypothesis 4a (H4a). System quality positively influences an institution’s satisfaction with CC use.

Hypothesis 4b (H4b). System quality positively influences an institution’s CC continuance use.

Hypothesis 5a (H5a). Information quality positively influences an institution’s satisfaction with CC use.

Hypothesis 5b (H5b). Information quality positively influences an institution’s CC continuance use.
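The hypothesized paths among the core ISC/ISS constructs can be summarized as a small directed graph, which makes it easy to check which antecedents the model assigns to each endogenous construct. The sketch below encodes only the hypotheses listed in the text; the names and helper are illustrative, not the authors' analysis code.

```python
# Hypothesized paths (antecedent -> outcome), as stated in the text.
# Illustrative encoding only; construct labels are shortened.
HYPOTHESES = {
    "H1":  ("satisfaction", "continuance use"),
    "H2a": ("confirmation", "satisfaction"),
    "H2b": ("net benefits", "satisfaction"),
    "H3a": ("net benefits", "continuance use"),
    "H3b": ("confirmation", "net benefits"),
    "H4a": ("system quality", "satisfaction"),
    "H4b": ("system quality", "continuance use"),
    "H5a": ("information quality", "satisfaction"),
    "H5b": ("information quality", "continuance use"),
}

def antecedents(outcome: str) -> set[str]:
    """All constructs hypothesized to influence `outcome` directly."""
    return {src for src, dst in HYPOTHESES.values() if dst == outcome}
```

For instance, `antecedents("satisfaction")` lists confirmation, net benefits, system quality, and information quality, matching H2a, H2b, H4a, and H5a.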
The relationships among technical integration, system investment, and discontinuance intention
[16] can also be applied to CC continuance use in HEIs. Thus, we propose:
Presently, the success of HEIs depends in large part on effective collaboration. This is noteworthy
because, by leveraging CC, it is possible for HEIs to exploit new modes of communication between
key stakeholders [42,142]. Digital natives, many of whom are students within HEIs, now require
the Internet to undertake daily tasks [148,149], and also to participate in online group activities (e.g.,
socializing, group studying, and so on) [166]. In order to satisfy student requirements, it is necessary
for practitioners within HEIs to understand the various ways in which knowledge and content can be
delivered to them [167]. In view of this, it is important to know what types of expectations students
have, and to understand how technology can be leveraged or incorporated into teaching activities to
meet these expectations. Hence, it is not unreasonable to suggest that the competitiveness of a HEI
depends on its utilization of novel technology to satisfy student needs, and to enable streamlined
collaboration and communication [168]. Taking the context into account, we thus predict:
Regulatory policy is another critical consideration that is likely to affect an organization’s decision
to use, or to continue using, a technology. One reason for this is that the regulatory policies
established by a government play a key role in setting laws relating to the use of certain technologies
(e.g., CC) [147,169,170]. For example, the authors of [145–147] discussed how regulatory policies
have shaped adoption trends in CC in various research settings. Taking the context into account,
we therefore hypothesized:
Competitive pressure refers to the pressure that an institution’s leadership may feel regarding the
performance-related abilities that its competitors are gaining through the exploitation of CC services
(e.g., an increase in student assessment outcomes due to the use of CC platforms) [72,137,138]. In the
literature, several scholars have noted that competitive pressure plays a determining role in influencing
CC use in multiple research settings [72,153,170–173]. Taking the context into account, we thus predict:
The research model will be used to examine CC continuance use at the organizational level in
HEIs. Nevertheless, institutions are a key element of the CC ecosystem, in which diverse sets of
actors are involved (e.g., government agencies, public organizations, and researchers). The proposed
model can be referenced by CC actors in HEIs as a basis for cooperating with stakeholders, which is a
prerequisite for the creation and provision of improved products and services.
5. Methodology
A positivist quantitative survey approach is warranted to address the research objectives
illustrated through the research questions and the hypotheses. Instrument development, data collection,
and data analysis mechanisms are discussed in detail below.
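A standard step in assessing instrument reliability for a pilot study of this kind is Cronbach's alpha over each construct's items, with values above roughly 0.7 conventionally taken as acceptable. The function below is a generic, stdlib-only sketch of that computation, not the authors' analysis script.

```python
from statistics import variance  # sample variance (ddof = 1)

def cronbach_alpha(items: list[list[float]]) -> float:
    """Cronbach's alpha for one construct.

    `items` holds one score list per questionnaire item, each over the
    same respondents: alpha = k/(k-1) * (1 - sum(item variances) / variance
    of the per-respondent total scores).
    """
    k = len(items)
    if k < 2:
        raise ValueError("alpha requires at least two items")
    item_var_sum = sum(variance(item) for item in items)
    # Per-respondent total score across all items of the construct.
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - item_var_sum / variance(totals))
```

As a sanity check, two perfectly consistent items (identical response vectors) give an alpha of 1.0, while weakly related items drive the value toward zero.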
IS discontinuance model [16], and TOE framework [102]) (See Appendix A). The instrument’s feasibility,
consistency of style and formatting, readability, and linguistic clarity [185,186] were evaluated in
interviews with academic researchers (n = 2) with experience in questionnaire design. Their feedback
on the general design and measurement scales was requested in order to improve the usability of the
questionnaire. A content-based literature review approach recommended by Webster and Watson [187]
was used for instrument conceptualization and content specification, in which constructs have been
clearly defined (See Table 3). The next stage involved producing an item pool, the purpose of which
was to represent every aspect of the construct without overlap [183]. Notably, the elimination of a
measure from a formative indicator model risks leaving out a relevant part of the conceptual domain
(or, for that matter, changing a construct’s meaning). This is because the construct is the set of all
indicators [188], and also because maintaining irrelevant items will not have the effect of introducing
bias into the results when examining the data with PLS [181]. In view of this, every dimension that
was identified was retained and changed into an item.
The questionnaire comprises three parts: firstly, a preamble; secondly, a demographic section;
and finally, a section on the constructs relating to continuance use of CC in HEIs. For the first section,
we applied the key informant approach [198], in which two questions were used to eliminate participants
from the sample: first, checking for participants whose institutions had not yet adopted CC services;
and second, checking for participants who did not participate in the ICT adoption decision. Participants
eligible for inclusion in the study could then complete the next two sections of the questionnaire. In the
second section, demographic data about each institution's age, faculty, student population, years of CC
service adoption, and type of CC service model were gathered. A service provider variable was also
measured by asking respondents who their CC service provider was (e.g., Oracle, Microsoft,
Google, Salesforce, and Amazon, among others). Notably, the service providers did not affect the
final dependent variable. In the third section of the questionnaire, each item sought to address an
aspect of the research question, measuring the constructs hypothesized to lead to
organizational continuance use. These included satisfaction (SAT), confirmation (Con), net benefits
(NB), technical integration (TE), system quality (SQ), information quality (IQ), system investment (SI),
collaboration (Col), regulatory policy (RP), and competitive pressure (CP). As shown in Appendix A,
a 5-point Likert scale, ranging from "strongly agree" to "strongly disagree", was used to measure each
item. Additionally, SAT items were measured on a 5-point Likert scale with different options (e.g.,
1 = very dissatisfied to 5 = very satisfied; 1 = very displeased to 5 = very pleased; 1 = very frustrated to
5 = very contented; and 1 = absolutely terrible to 5 = absolutely delighted).
Further, the instrument development process took into consideration the debate surrounding
the practice of gathering perceptual data on both the dependent and independent variables from a
single respondent [199]. In this debate, a central issue is that of whether the practice may result in
excessive common method variance (CMV). Nevertheless, some studies indicate that CMV is a greater
problem for abstract constructs (e.g., attitude) when compared to concrete measures (e.g., those linked
to IS success in this research) [200]. Besides, it has been noted in the literature that the constructs
of IS success are not highly susceptible to CMV [201]. Moreover, CMV is not a major concern for
formative constructs because the items do not have to co-vary [14]. Furthermore, in the process of
operationalizing the research instrument, CMV can be further reduced by not grouping the
items of reflective constructs under their associated construct headings [199,200].
when examining new structural paths in the context of incremental studies that extend previous
models [211], or when the relationships and measures proposed are new or have not been extensively
examined in the prior literature [212,213]; and thirdly, the variance-based approach in PLS is effective
for predictive applications. Therefore, since the study’s objective was to identify the factors underlying
organizational-level CC continuance use (i.e., not to examine a particular behavioral model), PLS was
a suitable choice [214].
Figure 4. Prototype Development Processes [215].
6. Preliminary Results
A pilot study was conducted to increase the consistency of the measures used throughout the
research. Notably, the main objective of a pilot study is to validate the initial instrument
and to identify any inconsistencies that could undermine the accuracy of the results [227].
The pilot results are reported, together with the inter-item correlation values, in Table 5. According to
Briggs and Cheek [233], inter-item correlations in a reflective measurement scale offer data pertaining
to the scale's dimensionality. Furthermore, the mean inter-item correlation is distinct from a
reliability estimate because it is not affected by scale length; as a result, it provides a less
ambiguous sense of item homogeneity. A mean inter-item correlation below 0.3 suggests that an item is
not strongly correlated with the other items in the same construct [234]. At the same time, inter-item
correlations that exceed 0.9 are indicative of multicollinearity issues [235]. Cronbach Alpha values
that exceed 0.7 suggest favorable internal consistency among the items on the scale [236]. Specifically,
Cronbach Alpha values that exceed 0.9 are considered "excellent", while values exceeding 0.8 and 0.7
are considered "good" and "acceptable", respectively [237]; values exceeding 0.6 are considered
"questionable", and values below 0.5 are considered "unacceptable" [237]. In this study, only one of
the inter-item correlations fell below the recommended value of 0.3, so the implications of eliminating
that item from the survey instrument were considered. Although the item correlated weakly with the
other items measuring the confirmation construct, the Cronbach Alpha value would not have increased
after its deletion, so the item was retained (see Table 5).
Table 5. Reliability results of the pilot study.
Construct No. Items Min Inter-Item Correlation Max Inter-Item Correlation Cronbach's Alpha
CC Continuance Use 3 0.656 0.828 0.907
Satisfaction 4 0.684 0.81 0.916
Confirmation 5 0.124 0.795 0.78
Net Benefit 13 - - 0.916
Technical Integration 3 0.711 0.788 0.891
System Quality 13 - - 0.927
Information Quality 7 - - 0.928
System Investment 3 0.67 0.731 0.836
Collaboration 5 0.504 0.742 0.899
Regulatory Policy 5 0.398 0.821 0.894
Competitive Pressure 4 0.577 0.772 0.86
All Items 65 0.862 0.913
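The reliability screening described above (Cronbach's alpha plus the 0.3/0.9 inter-item correlation bounds) can be sketched in plain Python with NumPy. The data below are synthetic 5-point Likert responses, not the study's actual survey data; the thresholds mirror those cited in the text.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x k_items) array."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

def inter_item_range(items):
    """Min and max off-diagonal inter-item correlations."""
    corr = np.corrcoef(np.asarray(items, dtype=float), rowvar=False)
    off = corr[~np.eye(corr.shape[0], dtype=bool)]
    return off.min(), off.max()

# Synthetic responses for a hypothetical 3-item construct:
# a shared "true score" plus item-level noise, clipped to the 1-5 scale.
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(30, 1))
data = np.clip(base + rng.integers(-1, 2, size=(30, 3)), 1, 5)

alpha = cronbach_alpha(data)
lo, hi = inter_item_range(data)
# Decision rules from the text: flag inter-item r < 0.3 or > 0.9,
# and judge alpha against the 0.7 "acceptable" threshold.
print(f"alpha={alpha:.3f}, min r={lo:.3f}, max r={hi:.3f}")
```

A borderline item (minimum inter-item correlation below 0.3) would then be re-checked by recomputing alpha without it, as was done for the confirmation construct.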
Drawing on partial least squares (PLS) analysis, the validity of the reflective and formative models
was evaluated based on the recommendations of Hair et al. [208], which are discussed below.
Convergent validity was computed as the average variance extracted (AVE) for
every construct, and it was greater than 0.5 in each case [239]. The square root of each AVE was
greater than the corresponding latent variable correlations, thereby indicating a satisfactory level of
discriminant validity (see Table 7).
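These two checks (AVE above 0.5 for convergent validity, and the Fornell–Larcker criterion of the square root of AVE exceeding the latent variable correlations for discriminant validity) can be sketched as follows. The loadings and correlations are illustrative placeholders, not the study's estimates.

```python
import math

def ave(loadings):
    """Average variance extracted from standardized indicator loadings."""
    return sum(l ** 2 for l in loadings) / len(loadings)

def fornell_larcker_ok(ave_by_construct, corr):
    """True if sqrt(AVE) of every construct exceeds its correlations
    with all other constructs (Fornell-Larcker criterion)."""
    names = list(ave_by_construct)
    for i, a in enumerate(names):
        root = math.sqrt(ave_by_construct[a])
        for j, _ in enumerate(names):
            if i != j and root <= abs(corr[i][j]):
                return False
    return True

# Hypothetical standardized loadings for two reflective constructs
aves = {"SAT": ave([0.85, 0.82, 0.88, 0.80]), "CON": ave([0.78, 0.81, 0.76])}
corr = [[1.0, 0.62], [0.62, 1.0]]  # illustrative latent variable correlations

assert all(v > 0.5 for v in aves.values())  # convergent validity holds
assert fornell_larcker_ok(aves, corr)       # discriminant validity holds
```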
The reliability and convergent validity of the constructs were established using AVE and composite
reliability (CR) tests. Based on the feedback received from the pilot study's participants, as well as
on the statistical analysis, a revised survey instrument was devised for the full-scale research.
The reflective measurement model was assessed by computing internal consistency, discriminant validity,
and convergent validity; in each case, the instrument yielded satisfactory results. Formative measures
were evaluated for convergent validity, collinearity issues, and significance and relevance, revealing
satisfactory convergent validity. Collinearity was examined by computing the variance inflation
factor (VIF) for every indicator, most of which did not exceed the maximum threshold of 5 [242].
Certain VIFs exceeded this value, but because formative measurement models are based on regression
(which is informed both by measure intercorrelations and by sample size), these indicators will be
re-examined with the full-scale data. Additionally, indicators were examined for significance and
relevance against the initial research model.
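The collinearity screen for formative indicators (VIF below 5, per the threshold cited above) can be sketched with plain NumPy: each indicator is regressed on the remaining indicators of the same construct, and VIF = 1/(1 − R²). The indicator data here are synthetic, not the study's.

```python
import numpy as np

def vif(X):
    """VIF for each column of X: 1 / (1 - R^2) from regressing that
    column on the remaining columns (intercept included)."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    out = []
    for j in range(k):
        y = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ beta
        r2 = 1 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
        out.append(1.0 / (1.0 - r2))
    return out

# Three synthetic formative indicators sharing a common component
rng = np.random.default_rng(1)
base = rng.normal(size=(50, 1))
X = np.hstack([base + rng.normal(size=(50, 1)) for _ in range(3)])

vifs = vif(X)
flagged = [j for j, v in enumerate(vifs) if v > 5]  # candidates for re-examination
```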
7.3. Limitations
This study has some limitations that point to directions for subsequent research. First, as in other
organizational studies, the results may be biased by representing individual views
rather than a shared opinion within the HEIs. This can be addressed if the constructs in the full-scale
study are used for longitudinal rather than only cross-sectional evaluations, so that client
organizations can learn about critical "pain-points". Second, the proposed model is intended
for organizational-level usage; however, CC ecosystems also involve operationalizing constructs such as
IT infrastructure availability and computer sophistication [244,245]. Future research will have to take
additional perspectives to understand continuance at the organizational level. Finally, the proposed
model contextualized the IS continuance model, based on the previous literature on organizational-level
continuance, to better suit HEIs; future research may have to consider further contextual constructs
to understand continuance in HEIs.
Author Contributions: Conceptualization, Y.A.M.Q.; Data curation, Y.A.M.Q. and R.A. (Rusli Abdullah); Formal
analysis, Y.A.M.Q., R.A. (Rodziah Atan) and Y.Y.; Methodology, Y.A.M.Q., R.A. (Rusli Abdullah) and Y.Y.;
Project administration, R.A. (Rusli Abdullah), Y.Y. and R.A. (Rodziah Atan); Resources, R.A. (Rusli Abdullah),
Y.Y. and R.A. (Rodziah Atan); Supervision, R.A. (Rusli Abdullah), Y.Y. and R.A. (Rodziah Atan); Validation,
Y.A.M.Q. and R.A. (Rodziah Atan); Writing—original draft, Y.A.M.Q.; Writing—review & editing, Y.A.M.Q.
and R.A. (Rusli Abdullah). All authors have read and agreed to the published version of the manuscript.
Funding: This research was funded by Research Management Center (RMC), Universiti Putra Malaysia (UPM),
UPM Journal Publication Fund (9001103).
Conflicts of Interest: The authors declare no conflict of interest.
Appendix A
Measurement Items
CC Continuance Intention (CCA) — Reflective. Adapted from [101]; previous studies: [45,72,76]; theories: ECM & ISD.
(1 = Strongly Disagree to 7 = Strongly Agree)
CCA1: Our institution intends to continue using the cloud computing service rather than discontinue.
CCA2: Our institution's intention is to continue using the cloud computing service rather than use any other means (traditional software).
CCA3: If we could, our institution would like to discontinue the use of the cloud computing service. (reverse coded)

Satisfaction (SAT) — Reflective. Adapted from [101]; previous studies: [45,72,76]; theory: ECM.
How do you feel about your overall experience with your current cloud computing service (SaaS, IaaS, or PaaS)?
SAT1: Very dissatisfied (1)–Very satisfied (7)
SAT2: Very displeased (1)–Very pleased (7)
SAT3: Very frustrated (1)–Very contented (7)
SAT4: Absolutely terrible (1)–Absolutely delighted (7)

Confirmation (Con) — Reflective. Adapted from [101]; previous studies: [45,72]; theory: ECM.
(1 = Strongly Disagree to 7 = Strongly Agree)
CON1: Our experience with using cloud computing services was better than what we expected.
CON2: The benefits of using cloud computing services were better than we expected.
CON3: The functionalities provided by cloud computing services for team projects were better than what we expected.
CON4: Cloud computing services support our institution more than expected.
CON5: Overall, most of our expectations from using cloud computing services were confirmed.

Net Benefits (NB) — Formative. Adapted from [105,120], with NB13 from [116]; previous studies: [14,77,78,152]; theory: ECM.
Our cloud computing service . . .
NB1: . . . increases the productivity of end-users.
NB2: . . . increases the overall productivity of the institution.
NB3: . . . enables individual users to make better decisions.
NB4: . . . helps to save IT-related costs.
NB5: . . . makes it easier to plan the IT costs of the institution.
NB6: . . . enhances our strategic flexibility.
NB7: . . . enhances the ability of the institution to innovate.
NB8: . . . enhances the mobility of the institution's employees.
NB9: . . . improves the quality of the institution's business processes.
NB10: . . . shifts the risks of IT failures from my institution to the provider.
NB11: . . . lowers the IT staff requirements within the institution to keep the system running.
NB12: . . . improves outcomes/outputs of my institution.
NB13: . . . has brought significant benefits to the institution.

Technical Integration (TE) — Reflective. Adapted from [16]; previous studies: [14,78]; theory: ISD.
TI1: The technical characteristics of the cloud computing service make it complex.
TI2: The cloud computing service depends on a sophisticated integration of technology components.
TI3: There is considerable technical complexity underlying the cloud computing service.

System Quality (SQ) — Formative. Adapted from [105,120]; previous studies: [14,77,78,152]; theory: ISS.
Our cloud computing service . . .
SQ1: . . . operates reliably and stably.
SQ2: . . . can be flexibly adjusted to new demands or conditions.
SQ3: . . . effectively integrates data from different areas of the company.
SQ4: . . . makes information easy to access (accessibility).
SQ5: . . . is easy to use.
SQ6: . . . provides information in a timely fashion (response time).
SQ7: . . . provides key features and functionalities that meet the institution's requirements.
SQ8: . . . is secure.
SQ9: . . . is easy to learn.
SQ10: . . . meets different user requirements within the institution.
SQ11: . . . is easy to upgrade from an older to a newer version.
SQ12: . . . is easy to customize (after implementation, e.g., user interface).

Collaboration (Col) — Reflective. Adapted from [195,196]; previous studies: [42,142–144]; theory: TOE.
Col1: Interaction of our institution with employees, industry, and other institutions is easy with the continuance use of the cloud computing service.
Col2: Collaboration between our institution and industry is increased by the continuance use of the cloud computing service.
Col3: The continuance use of the cloud computing service improves collaboration among institutions.
Col4: If our institution continues using the cloud computing service, it can communicate with its partners (institutions and industry).
Col5: Communication with the institution's partners (institutions and industry) is enhanced by the continuance use of the cloud computing service.

Regulatory Policy (RP) — Reflective. Adapted from [172,252,253]; previous studies: [145–147]; theory: TOE.
RP1: Our institution is under pressure from some government agencies to continue using the cloud computing service.
RP2: The government is providing us with incentives to continue using the cloud computing service.
RP3: The government is active in setting up the facilities to enable the cloud computing service.
RP4: The laws and regulations that exist nowadays are sufficient to protect the use of the cloud computing service.
RP5: There is legal protection in the use of the cloud computing service.

Competitive Pressure (CP) — Reflective. Adapted from [170,171,197]; previous studies: [72,145]; theory: TOE.
CP1: Our institution thinks that continuance use of the cloud computing service has an influence on competition among other institutions.
CP2: Our institution will lose students to competitors if it does not keep using the cloud computing service.
CP3: Our institution is under pressure from competitors to continue using the cloud computing service.
CP4: Some of our competitors have been using the cloud computing service.
References
1. Alexander, B. Social networking in higher education. In The Tower and the Cloud; EDUCAUSE: Louisville, CO,
USA, 2008; pp. 197–201.
2. Katz, N. The Tower and the Cloud: Higher Education in the Age of Cloud Computing; EDUCAUSE: Louisville, CO,
USA, 2008; Volume 9.
3. Sultan, N. Cloud computing for education: A new dawn? Int. J. Inf. Manag. 2010, 30, 109–116. [CrossRef]
4. Son, I.; Lee, D.; Lee, J.-N.; Chang, Y.B. Market perception on cloud computing initiatives in organizations:
An extended resource-based view. Inf. Manag. 2014, 51, 653–669. [CrossRef]
5. Salim, S.A.; Sedera, D.; Sawang, S.; Alarifi, A.H.E.; Atapattu, M. Moving from Evaluation to Trial: How do
SMEs Start Adopting Cloud ERP? Australas. J. Inf. Syst. 2015, 19. [CrossRef]
6. Mell, P.; Grance, T. The NIST Definition of Cloud Computing; U.S. Department of Commerce, National Institute
of Standards and Technology: Gaithersburg, MD, USA, 2011.
7. González-Martínez, J.A.; Bote-Lorenzo, M.L.; Gómez-Sánchez, E.; Cano-Parra, R. Cloud computing and
education: A state-of-the-art survey. Comput. Educ. 2015, 80, 132–151. [CrossRef]
8. Qasem, Y.A.M.; Abdullah, R.; Jusoh, Y.Y.; Atan, R.; Asadi, S. Cloud Computing Adoption in Higher Education
Institutions: A Systematic Review. IEEE Access 2019, 7, 63722–63744. [CrossRef]
9. Rodríguez Monroy, C.; Almarcha Arias, G.C.; Núñez Guerrero, Y. The new cloud computing paradigm:
The way to IT seen as a utility. Lat. Am. Caribb. J. Eng. Educ. 2012, 6, 24–31.
10. IDC. IDC Forecasts Worldwide Public Cloud Services Spending. 2019. Available online: https://www.idc.
com/getdoc.jsp?containerId=prUS44891519 (accessed on 28 July 2020).
11. Hsu, P.-F.; Ray, S.; Li-Hsieh, Y.-Y. Examining cloud computing adoption intention, pricing mechanism,
and deployment model. Int. J. Inf. Manag. 2014, 34, 474–488. [CrossRef]
12. Dubey, A.; Wagle, D. Delivering Software as a Service. The McKinsey Quarterly, 6 May 2007.
13. Walther, S.; Sedera, D.; Urbach, N.; Eymann, T.; Otto, B.; Sarker, S. Should We Stay, or Should We Go?
Analyzing Continuance of Cloud Enterprise Systems. J. Inf. Technol. Theory Appl. 2018, 19, 4.
14. Qasem, Y.A.; Abdullah, R.; Jusoh, Y.Y.; Atan, R. Conceptualizing a model for Continuance Use of Cloud
Computing in Higher Education Institutions. In Proceedings of the AMCIS 2020 TREOs, Salt Lake City, UT,
USA, 10–14 August 2020; p. 30.
15. Furneaux, B.; Wade, M.R. An exploration of organizational level information systems discontinuance
intentions. MIS Q. 2011, 35, 573–598. [CrossRef]
16. Long, K. Unit of Analysis. Encyclopedia of Social Science Research Methods; SAGE Publications, Inc.: Los Angeles,
CA, USA, 2004.
17. Berman, S.J.; Kesterson-Townes, L.; Marshall, A.; Srivathsa, R. How cloud computing enables process and
business model innovation. Strategy Leadersh. 2012, 40, 27–35. [CrossRef]
18. Stahl, E.; Duijvestijn, L.; Fernandes, A.; Isom, P.; Jewell, D.; Jowett, M.; Stockslager, T. Performance Implications
of Cloud Computing; Red Paper: New York, NY, USA, 2012.
19. Thorsteinsson, G.; Page, T.; Niculescu, A. Using virtual reality for developing design communication.
Stud. Inform. Control 2010, 19, 93–106. [CrossRef]
20. Pocatilu, P.; Alecu, F.; Vetrici, M. Using cloud computing for E-learning systems. In Proceedings of the 8th
WSEAS International Conference on Data networks, Communications, Computers, Baltimore, MD, USA,
7–9 November 2009; pp. 54–59.
21. Sasikala, S.; Prema, S. Massive centralized cloud computing (MCCC) exploration in higher education.
Adv. Comput. Sci. Technol. 2011, 3, 111.
22. García-Peñalvo, F.J.; Johnson, M.; Alves, G.R.; Minović, M.; Conde-González, M.Á. Informal learning
recognition through a cloud ecosystem. Future Gener. Comput. Syst. 2014, 32, 282–294. [CrossRef]
23. Nguyen, T.D.; Nguyen, T.M.; Pham, Q.T.; Misra, S. Acceptance and Use of E-Learning Based on Cloud
Computing: The Role of Consumer Innovativeness. In Computational Science and Its Applications—Iccsa 2014;
Pt, V.; Murgante, B., Misra, S., Rocha, A., Torre, C., Rocha, J.G., Falcao, M.I., Taniar, D., Apduhan, B.O.,
Gervasi, O., Eds.; Springer: Cham, Switzerland, 2014; pp. 159–174.
24. Pinheiro, P.; Aparicio, M.; Costa, C. Adoption of cloud computing systems. In Proceedings of the
International Conference on Information Systems and Design of Communication—ISDOC ’14, Lisbon,
Portugal, 16 May 2014; ACM: Lisbon, Portugal, 2014; pp. 127–131.
25. Behrend, T.S.; Wiebe, E.N.; London, J.E.; Johnson, E.C. Cloud computing adoption and usage in community
colleges. Behav. Inf. Technol. 2011, 30, 231–240. [CrossRef]
26. Almazroi, A.A.; Shen, H.F.; Teoh, K.K.; Babar, M.A. Cloud for e-Learning: Determinants of its Adoption by
University Students in a Developing Country. In Proceedings of the 2016 IEEE 13th International Conference
on E-Business Engineering (Icebe), Macau, China, 4–6 November 2016; pp. 71–78.
27. Meske, C.; Stieglitz, S.; Vogl, R.; Rudolph, D.; Oksuz, A. Cloud Storage Services in Higher Education—Results
of a Preliminary Study in the Context of the Sync&Share-Project in Germany. In Learning and Collaboration
Technologies: Designing and Developing Novel Learning Experiences; Pt, I.; Zaphiris, P., Ioannou, A., Eds.;
Springer: Berlin/Heidelberg, Germany, 2014; pp. 161–171.
28. Khatib, M.M.E.; Opulencia, M.J.C. The Effects of Cloud Computing (IaaS) on E- Libraries in United Arab
Emirates. Procedia Econ. Financ. 2015, 23, 1354–1357. [CrossRef]
29. Arpaci, I.; Kilicer, K.; Bardakci, S. Effects of security and privacy concerns on educational use of cloud
services. Comput. Hum. Behav. 2015, 45, 93–98. [CrossRef]
30. Park, S.C.; Ryoo, S.Y. An empirical investigation of end-users’ switching toward cloud computing: A two
factor theory perspective. Comput. Hum. Behav. 2013, 29, 160–170. [CrossRef]
31. Riaz, S.; Muhammad, J. An Evaluation of Public Cloud Adoption for Higher Education: A case study from
Pakistan. In Proceedings of the 2015 International Symposium on Mathematical Sciences and Computing
Research (ISMSC), Ipoh, Malaysia, 19–20 May 2015; pp. 208–213.
32. Militaru, G.; Purcărea, A.A.; Negoiţă, O.D.; Niculescu, A. Examining Cloud Computing Adoption Intention
in Higher Education: Exploratory Study. In Exploring Services Science; Borangiu, T., Dragoicea, M., Novoa, H.,
Eds.; 2016; pp. 732–741.
33. Wu, W.W.; Lan, L.W.; Lee, Y.T. Factors hindering acceptance of using cloud services in university: A case
study. Electron. Libr. 2013, 31, 84–98. [CrossRef]
34. Gurung, R.K.; Alsadoon, A.; Prasad, P.W.C.; Elchouemi, A. Impacts of Mobile Cloud Learning (MCL) on
Blended Flexible Learning (BFL). In Proceedings of the 2016 International Conference on Information and
Digital Technologies (IDT), Rzeszow, Poland, 5–7 July 2016; pp. 108–114.
35. Bhatiasevi, V.; Naglis, M. Investigating the structural relationship for the determinants of cloud computing
adoption in education. Educ. Inf. Technol. 2015, 21, 1197–1223. [CrossRef]
36. Yeh, C.-H.; Hsu, C.-C. The Learning Effect of Students’ Cognitive Styles in Using Cloud Technology.
In Advances in Web-Based Learning—ICWL 2013 Workshops; Chiu, D.K.W., Wang, M., Popescu, E., Li, Q., Lau, R.,
Shih, T.K., Yang, C.-S., Sampson, D.G., Eds.; Springer: Berlin/Heidelberg, Germany, 2015; pp. 155–163.
37. Stantchev, V.; Colomo-Palacios, R.; Soto-Acosta, P.; Misra, S. Learning management systems and cloud file
hosting services: A study on students’ acceptance. Comput. Hum. Behav. 2014, 31, 612–619. [CrossRef]
38. Yuvaraj, M. Perception of cloud computing in developing countries. Libr. Rev. 2016, 65, 33–51. [CrossRef]
39. Sharma, S.K.; Al-Badi, A.H.; Govindaluri, S.M.; Al-Kharusi, M.H. Predicting motivators of cloud computing
adoption: A developing country perspective. Comput. Hum. Behav. 2016, 62, 61–69. [CrossRef]
40. Kankaew, V.; Wannapiroon, P. System Analysis of Virtual Team in Cloud Computing to Enhance Teamwork
Skills of Undergraduate Students. Procedia Soc. Behav. Sci. 2015, 174, 4096–4102. [CrossRef]
41. Yadegaridehkordi, E.; Iahad, N.A.; Ahmad, N. Task-Technology Fit and User Adoption of Cloud-based
Collaborative Learning Technologies. In Proceedings of the 2014 International Conference on Computer and
Information Sciences (ICCOINS), Kuala Lumpur, Malaysia, 3–5 June 2014.
42. Arpaci, I. Understanding and predicting students’ intention to use mobile cloud storage services. Comput.
Hum. Behav. 2016, 58, 150–157. [CrossRef]
43. Shiau, W.-L.; Chau, P.Y.K. Understanding behavioral intention to use a cloud computing classroom: A multiple
model comparison approach. Inf. Manag. 2016, 53, 355–365. [CrossRef]
44. Tan, X.; Kim, Y. User acceptance of SaaS-based collaboration tools: A case of Google Docs. J. Enterp. Inf. Manag.
2015, 28, 423–442. [CrossRef]
45. Atchariyachanvanich, K.; Siripujaka, N.; Jaiwong, N. What Makes University Students Use Cloud-based
E-Learning?: Case Study of KMITL Students. In Proceedings of the 2014 International Conference on
Information Society (I-Society 2014), London, UK, 10–12 November 2014; pp. 112–116.
46. Vaquero, L.M. EduCloud: PaaS versus IaaS Cloud Usage for an Advanced Computer Science Course.
IEEE Trans. Educ. 2011, 54, 590–598. [CrossRef]
47. Ashtari, S.; Eydgahi, A. Student Perceptions of Cloud Computing Effectiveness in Higher Education.
In Proceedings of the 2015 IEEE 18th International Conference on Computational Science and Engineering
(CSE), Porto, Portugal, 21–23 October 2015; pp. 184–191.
48. Qasem, Y.A.; Abdullah, R.; Atan, R.; Jusoh, Y.Y. Cloud-Based Education As a Service (CEAAS) System
Requirements Specification Model of Higher Education Institutions in Industrial Revolution 4.0. Int. J. Recent
Technol. Eng. 2019, 8. [CrossRef]
49. Huang, Y.-M. The factors that predispose students to continuously use cloud services: Social and technological
perspectives. Comput. Educ. 2016, 97, 86–96. [CrossRef]
50. Tashkandi, A.N.; Al-Jabri, I.M. Cloud computing adoption by higher education institutions in Saudi Arabia:
An exploratory study. Clust. Comput. 2015, 18, 1527–1537. [CrossRef]
51. Tashkandi, A.; Al-Jabri, I. Cloud Computing Adoption by Higher Education Institutions in Saudi Arabia:
Analysis Based on TOE. In Proceedings of the 2015 International Conference on Cloud Computing, ICCC,
Riyadh, Saudi Arabia, 26–29 April 2015.
52. Dahiru, A.A.; Bass, J.M.; Allison, I.K. Cloud computing adoption in sub-Saharan Africa: An analysis using
institutions and capabilities. In Proceedings of the International Conference on Information Society, i-Society
2014, London, UK, 10–12 November 2014; pp. 98–103.
53. Shakeabubakor, A.A.; Sundararajan, E.; Hamdan, A.R. Cloud Computing Services and Applications to
Improve Productivity of University Researchers. Int. J. Inf. Electron. Eng. 2015, 5, 153. [CrossRef]
54. Md Kassim, S.S.; Salleh, M.; Zainal, A. Cloud Computing: A General User’s Perception and Security
Awareness in Malaysian Polytechnic. In Pattern Analysis, Intelligent Security and the Internet of Things;
Abraham, A., Muda, A.K., Choo, Y.-H., Eds.; Springer International Publishing: Cham, Switzerland, 2015;
pp. 131–140.
55. Sabi, H.M.; Uzoka, F.-M.E.; Langmia, K.; Njeh, F.N. Conceptualizing a model for adoption of cloud computing
in education. Int. J. Inf. Manag. 2016, 36, 183–191. [CrossRef]
56. Sabi, H.M.; Uzoka, F.-M.E.; Langmia, K.; Njeh, F.N.; Tsuma, C.K. A cross-country model of contextual
factors impacting cloud computing adoption at universities in sub-Saharan Africa. Inf. Syst. Front. 2017, 20,
1381–1404. [CrossRef]
57. Yuvaraj, M. Determining factors for the adoption of cloud computing in developing countries. Bottom Line
2016, 29, 259–272. [CrossRef]
58. Surya, G.S.F.; Surendro, K. E-Readiness Framework for Cloud Computing Adoption in Higher Education.
In Proceedings of the 2014 International Conference of Advanced Informatics: Concept, Theory and
Application (ICAICTA), Bandung, Indonesia, 20–21 August 2014; pp. 278–282.
59. Alharthi, A.; Alassafi, M.O.; Walters, R.J.; Wills, G.B. An exploratory study for investigating the critical
success factors for cloud migration in the Saudi Arabian higher education context. Telemat. Inform. 2017, 34,
664–678. [CrossRef]
60. Mokhtar, S.A.; Al-Sharafi, A.; Ali, S.H.S.; Aborujilah, A. Organizational Factors in the Adoption of Cloud
Computing in E-learning. In Proceedings of the 3rd International Conference on Advanced Computer
Science Applications and Technologies (ACSAT), Amman, Jordan, 29–30 December 2014; pp. 188–191.
61. Lal, P. Organizational learning management systems: Time to move learning to the cloud! Dev. Learn. Organ. Int. J.
2015, 29, 13–15. [CrossRef]
62. Yuvaraj, M. Problems and prospects of implementing cloud computing in university libraries. Libr. Rev.
2015, 64, 567–582. [CrossRef]
63. Koch, F.; Assunção, M.D.; Cardonha, C.; Netto, M.A.S. Optimising resource costs of cloud computing for
education. Future Gener. Comput. Syst. 2016, 55, 473–479. [CrossRef]
64. Qasem, Y.A.; Abdullah, R.; Atan, R.; Jusoh, Y.Y. Towards Developing A Cloud-Based Education As A Service
(CEAAS) Model For Cloud Computing Adoption in Higher Education Institutions. Complexity 2018, 6, 7.
[CrossRef]
65. Qasem, Y.A.M.; Abdullah, R.; Yah, Y.; Atan, R.; Al-Sharafi, M.A.; Al-Emran, M. Towards the Development of a
Comprehensive Theoretical Model for Examining the Cloud Computing Adoption at the Organizational Level.
In Recent Advances in Intelligent Systems and Smart Applications; Al-Emran, M., Shaalan, K., Hassanien, A.E.,
Eds.; Springer International Publishing: Cham, Switzerland, 2021; pp. 63–74.
66. Jia, Q.; Guo, Y.; Barnes, S.J. Enterprise 2.0 post-adoption: Extending the information system continuance
model based on the technology-Organization-environment framework. Comput. Hum. Behav. 2017, 67,
95–105. [CrossRef]
67. Tripathi, S. Understanding the determinants affecting the continuance intention to use cloud computing.
J. Int. Technol. Inf. Manag. 2017, 26, 124–152.
68. Obal, M. What drives post-adoption usage? Investigating the negative and positive antecedents of disruptive
technology continuous adoption intentions. Ind. Mark. Manag. 2017, 63, 42–52. [CrossRef]
69. Ratten, V. Continuance use intention of cloud computing: Innovativeness and creativity perspectives.
J. Bus. Res. 2016, 69, 1737–1740. [CrossRef]
70. Flack, C.K. IS Success Model for Evaluating Cloud Computing for Small Business Benefit: A Quantitative
Study. Ph.D. Thesis, Kennesaw State University, Kennesaw, GA, USA, 2016.
71. Walther, S.; Sarker, S.; Urbach, N.; Sedera, D.; Eymann, T.; Otto, B. Exploring organizational level continuance
of cloud-based enterprise systems. In Proceedings of the ECIS 2015 Completed Research Papers, Münster,
Germany, 26–29 May 2015.
72. Ghobakhloo, M.; Tang, S.H. Information system success among manufacturing SMEs: Case of developing
countries. Inf. Technol. Dev. 2015, 21, 573–600. [CrossRef]
73. Schlagwein, D.; Thorogood, A. Married for life? A cloud computing client-provider relationship continuance
model. In Proceedings of the European Conference on Information Systems (ECIS) 2014, Tel Aviv, Israel,
9–11 June 2014.
74. Hadji, B.; Degoulet, P. Information system end-user satisfaction and continuance intention: A unified
modeling approach. J. Biomed. Inf. 2016, 61, 185–193. [CrossRef]
75. Esteves, J.; Bohórquez, V.W. An Updated ERP Systems Annotated Bibliography: 2001–2005; Instituto de Empresa
Business School Working Paper No. WP; Instituto de Empresa Business School: Madrid, Spain, 2007; pp. 4–7.
76. Gable, G.G.; Sedera, D.; Chan, T. Re-conceptualizing information system success: The IS-impact measurement
model. J. Assoc. Inf. Syst. 2008, 9, 18. [CrossRef]
77. Sedera, D.; Gable, G.G. Knowledge management competence for enterprise system success. J. Strateg.
Inf. Syst. 2010, 19, 296–306. [CrossRef]
78. Walther, S.; Plank, A.; Eymann, T.; Singh, N.; Phadke, G. Success factors and value propositions of software as
a service providers—A literature review and classification. In Proceedings of the 2012 AMCIS: 18th Americas
Conference on Information Systems, Seattle, WA, USA, 9–12 August 2012.
79. Ashtari, S.; Eydgahi, A. Student perceptions of cloud applications effectiveness in higher education.
J. Comput. Sci. 2017, 23, 173–180. [CrossRef]
80. Ding, Y. Looking forward: The role of hope in information system continuance. Comput. Hum. Behav. 2019,
91, 127–137. [CrossRef]
81. Rogers, E. Diffusion of Innovations; Macmillan Press Ltd.: London, UK, 1962.
82. Ettlie, J.E. Adequacy of stage models for decisions on adoption of innovation. Psychol. Rep. 1980, 46, 991–995. [CrossRef]
83. Fichman, R.G.; Kemerer, C.F. The assimilation of software process innovations: An organizational learning perspective. Manag. Sci. 1997, 43, 1345–1363. [CrossRef]
84. Salim, S.A.; Sedera, D.; Sawang, S.; Alarifi, A. Technology adoption as a multi-stage process. In Proceedings
of the 25th Australasian Conference on Information Systems (ACIS), Auckland, New Zealand,
8–10 December 2014.
85. Fichman, R.G.; Kemerer, C.F. Adoption of software engineering process innovations: The case of object
orientation. Sloan Manag. Rev. 1993, 34, 7–23.
86. Choudhury, V.; Karahanna, E. The relative advantage of electronic channels: A multidimensional view.
MIS Q. 2008, 32, 179. [CrossRef]
87. Karahanna, E.; Straub, D.W.; Chervany, N.L. Information technology adoption across time: A cross-sectional
comparison of pre-adoption and post-adoption beliefs. MIS Q. 1999, 23, 183. [CrossRef]
88. Pavlou, P.A.; Fygenson, M. Understanding and predicting electronic commerce adoption: An extension of
the theory of planned behavior. MIS Q. 2006, 30, 115. [CrossRef]
89. Shoham, A. Selecting and evaluating trade shows. Ind. Mark. Manag. 1992, 21, 335–341. [CrossRef]
90. Mintzberg, H.; Raisinghani, D.; Theoret, A. The Structure of “Unstructured” Decision Processes. Adm. Sci. Q.
1976, 21, 246–275. [CrossRef]
Appl. Sci. 2020, 10, 6628 30 of 36
91. Pierce, J.L.; Delbecq, A.L. Organization structure, individual attitudes and innovation. Acad. Manag. Rev.
1977, 2, 27–37. [CrossRef]
92. Zmud, R.W. Diffusion of modern software practices: Influence of centralization and formalization. Manag. Sci.
1982, 28, 1421–1431. [CrossRef]
93. Aguirre-Urreta, M.I.; Marakas, G.M. Exploring choice as an antecedent to behavior: Incorporating alternatives
into the technology acceptance process. J. Organ. End User Comput. 2012, 24, 82–107. [CrossRef]
94. Schwarz, A.; Chin, W.W.; Hirschheim, R.; Schwarz, C. Toward a process-based view of information technology
acceptance. J. Inf. Technol. 2014, 29, 73–96. [CrossRef]
95. Maier, C.; Laumer, S.; Weinert, C.; Weitzel, T. The effects of technostress and switching stress on discontinued
use of social networking services: A study of Facebook use. Inf. Syst. J. 2015, 25, 275–308. [CrossRef]
96. Damanpour, F.; Schneider, M. Phases of the adoption of innovation in organizations: Effects of environment,
organization and top managers 1. Br. J. Manag. 2006, 17, 215–236. [CrossRef]
97. Jeyaraj, A.; Rottman, J.W.; Lacity, M.C. A review of the predictors, linkages, and biases in IT innovation
adoption research. J. Inf. Technol. 2006, 21, 1–23. [CrossRef]
98. Davis, F.D. Perceived usefulness, perceived ease of use, and user acceptance of information technology.
MIS Q. 1989, 13, 319. [CrossRef]
99. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User acceptance of information technology: Toward a
unified view. MIS Q. 2003, 27, 425–478. [CrossRef]
100. Oliver, R.L. A cognitive model of the antecedents and consequences of satisfaction decisions. J. Mark. Res.
1980, 17, 460–469. [CrossRef]
101. Bhattacherjee, A. Understanding information systems continuance: An expectation-confirmation model.
MIS Q. 2001, 25, 351. [CrossRef]
102. Tornatzky, L.G.; Fleischer, M.; Chakrabarti, A.K. Processes of Technological Innovation; Lexington Books:
Lanham, MD, USA, 1990.
103. Rogers, E.M. Diffusion of Innovations; The Free Press: New York, NY, USA, 1995; p. 12.
104. Teo, H.-H.; Wei, K.K.; Benbasat, I. Predicting intention to adopt interorganizational linkages: An institutional
perspective. MIS Q. 2003, 27, 19. [CrossRef]
105. DeLone, W.H.; McLean, E.R. Information systems success: The quest for the dependent variable. Inf. Syst. Res.
1992, 3, 60–95. [CrossRef]
106. Eden, R.; Sedera, D.; Tan, F. Sustaining the Momentum: Archival Analysis of Enterprise Resource Planning
Systems (2006–2012). Commun. Assoc. Inf. Syst. 2014, 35, 3. [CrossRef]
107. Chou, S.-W.; Chen, P.-Y. The influence of individual differences on continuance intentions of enterprise
resource planning (ERP). Int. J. Hum. Comput. Stud. 2009, 67, 484–496. [CrossRef]
108. Lin, W.-S. Perceived fit and satisfaction on web learning performance: IS continuance intention and
task-technology fit perspectives. Int. J. Hum. Comput. Stud. 2012, 70, 498–507. [CrossRef]
109. Karahanna, E.; Straub, D. The psychological origins of perceived usefulness and ease-of-use. Inf. Manag.
1999, 35, 237–250. [CrossRef]
110. Roca, J.C.; Chiu, C.-M.; Martínez, F.J. Understanding e-learning continuance intention: An extension of the
Technology Acceptance Model. Int. J. Hum. Comput. Stud. 2006, 64, 683–696. [CrossRef]
111. Dai, H.M.; Teo, T.; Rappa, N.A.; Huang, F. Explaining Chinese university students’ continuance learning
intention in the MOOC setting: A modified expectation confirmation model perspective. Comput. Educ.
2020, 150, 103850. [CrossRef]
112. Ouyang, Y.; Tang, C.; Rong, W.; Zhang, L.; Yin, C.; Xiong, Z. Task-technology fit aware expectation-
confirmation model towards understanding of MOOCs continued usage intention. In Proceedings of the 50th
Hawaii International Conference on System Sciences, Hilton Waikoloa Village, HI, USA, 4–7 January 2017.
113. Joo, Y.J.; So, H.-J.; Kim, N.H. Examination of relationships among students’ self-determination, technology
acceptance, satisfaction, and continuance intention to use K-MOOCs. Comput. Educ. 2018, 122, 260–272.
[CrossRef]
114. Thong, J.Y.; Hong, S.-J.; Tam, K.Y. The effects of post-adoption beliefs on the expectation-confirmation model
for information technology continuance. Int. J. Hum. Comput. Stud. 2006, 64, 799–810. [CrossRef]
115. Ajzen, I. The theory of planned behavior. Organ. Behav. Hum. Decis. Process. 1991, 50, 179–211. [CrossRef]
116. Wixom, B.H.; Todd, P.A. A theoretical integration of user satisfaction and technology acceptance. Inf. Syst. Res.
2005, 16, 85–102. [CrossRef]
117. Lokuge, S.; Sedera, D. Deriving information systems innovation execution mechanisms. In Proceedings of the
25th Australasian Conference on Information Systems (ACIS), Auckland, New Zealand, 8–10 December 2014.
118. Lokuge, S.; Sedera, D. Enterprise systems lifecycle-wide innovation readiness. In Proceedings of the PACIS
2014 Proceedings, Chengdu, China, 24–28 June 2014; pp. 1–14.
119. Melville, N.; Kraemer, K.; Gurbaxani, V. Information technology and organizational performance:
An integrative model of IT business value. MIS Q. 2004, 28, 283–322. [CrossRef]
120. Delone, W.H.; McLean, E.R. The DeLone and McLean model of information systems success: A ten-year
update. J. Manag. Inf. Syst. 2003, 19, 9–30.
121. Urbach, N.; Smolnik, S.; Riempp, G. The state of research on information systems success. Bus. Inf. Syst. Eng.
2009, 1, 315–325. [CrossRef]
122. Wang, Y.S. Assessing e-commerce systems success: A respecification and validation of the DeLone and
McLean model of IS success. Inf. Syst. J. 2008, 18, 529–557. [CrossRef]
123. Urbach, N.; Smolnik, S.; Riempp, G. An empirical investigation of employee portal success. J. Strateg.
Inf. Syst. 2010, 19, 184–206. [CrossRef]
124. Barki, H.; Huff, S.L. Change, attitude to change, and decision support system success. Inf. Manag. 1985, 9,
261–268. [CrossRef]
125. Gelderman, M. The relation between user satisfaction, usage of information systems and performance.
Inf. Manag. 1998, 34, 11–18. [CrossRef]
126. Seddon, P.B. A respecification and extension of the DeLone and McLean model of IS success. Inf. Syst. Res.
1997, 8, 240–253. [CrossRef]
127. Yuthas, K.; Young, S.T. Material matters: Assessing the effectiveness of materials management IS. Inf. Manag.
1998, 33, 115–124. [CrossRef]
128. Burton-Jones, A.; Gallivan, M.J. Toward a deeper understanding of system usage in organizations: A multilevel
perspective. MIS Q. 2007, 31, 657. [CrossRef]
129. Sedera, D.; Gable, G.; Chan, T. ERP success: Does organisation size matter? In Proceedings of the PACIS 2003 Proceedings, Adelaide, Australia, 10–13 July 2003; p. 74.
130. Sedera, D.; Gable, G.; Chan, T. Knowledge management for ERP success. In Proceedings of the PACIS 2003
Proceedings, Adelaide, Australia, 10–13 July 2003; p. 97.
131. Abolfazli, S.; Sanaei, Z.; Tabassi, A.; Rosen, S.; Gani, A.; Khan, S.U. Cloud Adoption in Malaysia: Trends,
Opportunities, and Challenges. IEEE Cloud Comput. 2015, 2, 60–68. [CrossRef]
132. Arkes, H.R.; Blumer, C. The psychology of sunk cost. Organ. Behav. Hum. Decis. Process. 1985, 35, 124–140.
[CrossRef]
133. Ahtiala, P. The optimal pricing of computer software and other products with high switching costs.
Int. Rev. Econ. Financ. 2006, 15, 202–211. [CrossRef]
134. Benlian, A.; Vetter, J.; Hess, T. The role of sunk cost in consecutive IT outsourcing decisions. Z. Betriebswirt. 2012, 82, 181.
135. Armbrust, M.; Fox, A.; Griffith, R.; Joseph, A.D.; Katz, R.; Konwinski, A.; Lee, G.; Patterson, D.; Rabkin, A.;
Stoica, I. A view of cloud computing. Commun. ACM 2010, 53, 50–58. [CrossRef]
136. Wei, Y.; Blake, M.B. Service-oriented computing and cloud computing: Challenges and opportunities.
IEEE Internet Comput. 2010, 14, 72–75. [CrossRef]
137. Bughin, J.; Chui, M.; Manyika, J. Clouds, big data, and smart assets: Ten tech-enabled business trends to
watch. McKinsey Q. 2010, 56, 75–86.
138. Lin, H.-F. Understanding the determinants of electronic supply chain management system adoption:
Using the Technology–Organization–Environment framework. Technol. Forecast. Soc. Chang. 2014, 86, 80–92.
[CrossRef]
139. Oliveira, T.; Martins, M.F. Literature review of information technology adoption models at firm level. Electron.
J. Inf. Syst. Eval. 2011, 14, 110–121.
140. Chau, P.Y.; Tam, K.Y. Factors affecting the adoption of open systems: An exploratory study. MIS Q. 1997, 21,
1–24. [CrossRef]
141. Galliers, R.D. Organizational Dynamics of Technology-Based Innovation; Springer: Boston, MA, USA, 2007;
pp. 15–18.
142. Yadegaridehkordi, E.; Iahad, N.A.; Ahmad, N. Task-technology fit assessment of cloud-based collaborative learning technologies. Int. J. Inf. Syst. Serv. Sect. 2017, 371–388. [CrossRef]
143. Gupta, P.; Seetharaman, A.; Raj, J.R. The usage and adoption of cloud computing by small and medium businesses. Int. J. Inf. Manag. 2013, 33, 861–874. [CrossRef]
144. Chong, A.Y.-L.; Lin, B.; Ooi, K.-B.; Raman, M. Factors affecting the adoption level of c-commerce: An empirical
study. J. Comput. Inf. Syst. 2009, 50, 13–22.
145. Oliveira, T.; Thomas, M.; Espadanal, M. Assessing the determinants of cloud computing adoption: An analysis
of the manufacturing and services sectors. Inf. Manag. 2014, 51, 497–510. [CrossRef]
146. Senyo, P.K.; Effah, J.; Addae, E. Preliminary insight into cloud computing adoption in a developing country.
J. Enterp. Inf. Manag. 2016, 29, 505–524. [CrossRef]
147. Klug, W.; Bai, X. The determinants of cloud computing adoption by colleges and universities.
Int. J. Bus. Res. Inf. Technol. 2015, 2, 14–30.
148. Cornu, B. Digital Natives: How Do They Learn? How to Teach Them; UNESCO Institute for Information
Technology in Education: Moscow, Russia, 2011; Volume 52, pp. 2–11.
149. Oblinger, D.; Oblinger, J.L.; Lippincott, J.K. Educating the Net Generation; EDUCAUSE: Boulder, CO, USA, 2005.
150. Wymer, S.A.; Regan, E.A. Factors influencing e-commerce adoption and use by small and medium businesses.
Electron. Mark. 2005, 15, 438–453. [CrossRef]
151. Qasem, Y.A.; Abdullah, R.; Atan, R.; Jusoh, Y.Y. Mapping and Analyzing Process of Cloud-based Education
as a Service (CEaaS) Model for Cloud Computing Adoption in Higher Education Institutions. In Proceedings
of the 2018 Fourth International Conference on Information Retrieval and Knowledge Management (CAMP),
Kota Kinabalu, Malaysia, 26–28 March 2018; pp. 1–8.
152. Walther, S.; Sedera, D.; Sarker, S.; Eymann, T. Evaluating Operational Cloud Enterprise System Success:
An Organizational Perspective. In Proceedings of the ECIS, Utrecht, The Netherlands, 6–8 June 2013; p. 16.
153. Wang, M.W.; Lee, O.-K.; Lim, K.H. Knowledge management systems diffusion in Chinese enterprises:
A multi-stage approach with the technology-organization-environment framework. In Proceedings of the
PACIS 2007 Proceedings, Auckland, New Zealand, 4–6 July 2007; p. 70.
154. Liao, C.; Palvia, P.; Chen, J.-L. Information technology adoption behavior life cycle: Toward a Technology
Continuance Theory (TCT). Int. J. Inf. Manag. 2009, 29, 309–320. [CrossRef]
155. Li, Y.; Crossler, R.E.; Compeau, D. Regulatory Focus in the Context of Wearable Continuance. In Proceedings
of the AMCIS 2019 Conference Site, Cancún, México, 15–17 August 2019.
156. Rousseau, D.M. Issues of level in organizational research: Multi-level and cross-level perspectives. Res. Organ. Behav. 1985, 7, 1–37.
157. Walther, S. An Investigation of Organizational Level Continuance of Cloud-Based Enterprise Systems.
Ph.D. Thesis, University of Bayreuth, Bayreuth, Germany, 2014.
158. Petter, S.; DeLone, W.; McLean, E. Measuring information systems success: Models, dimensions, measures,
and interrelationships. Eur. J. Inf. Syst. 2008, 17, 236–263. [CrossRef]
159. Robey, D.; Zeller, R.L. Factors affecting the success and failure of an information system for product quality. Interfaces 1978, 8, 70–75. [CrossRef]
160. Aldholay, A.; Isaac, O.; Abdullah, Z.; Abdulsalam, R.; Al-Shibami, A.H. An extension of Delone and McLean
IS success model with self-efficacy: Online learning usage in Yemen. Int. J. Inf. Learn. Technol. 2018, 35,
285–304. [CrossRef]
161. Xu, J.D.; Benbasat, I.; Cenfetelli, R.T. Integrating service quality with system and information quality: An empirical test in the e-service context. MIS Q. 2013, 37, 777–794. [CrossRef]
162. Spears, J.L.; Barki, H. User participation in information systems security risk management. MIS Q. 2010, 34, 503. [CrossRef]
163. Lee, S.; Shin, B.; Lee, H.G. Understanding post-adoption usage of mobile data services: The role of
supplier-side variables. J. Assoc. Inf. Syst. 2009, 10, 860–888. [CrossRef]
164. Alshare, K.A.; Freeze, R.D.; Lane, P.L.; Wen, H.J. The impacts of system and human factors on online learning
systems use and learner satisfaction. Decis. Sci. J. Innov. Educ. 2011, 9, 437–461. [CrossRef]
165. Benlian, A.; Koufaris, M.; Hess, T. Service quality in software-as-a-service: Developing the SaaS-Qual
measure and examining its role in usage continuance. J. Manag. Inf. Syst. 2011, 28, 85–126. [CrossRef]
166. Oblinger, D. Boomers, Gen-Xers, and Millennials: Understanding the new students. EDUCAUSE Rev. 2003, 38, 37–47.
167. Monaco, M.; Martin, M. The millennial student: A new generation of learners. Athl. Train. Educ. J. 2007, 2,
42–46. [CrossRef]
168. White, B.J.; Brown, J.A.E.; Deale, C.S.; Hardin, A.T. Collaboration using cloud computing and traditional
systems. Issues Inf. Syst. 2009, 10, 27–32.
169. Nkhoma, M.Z.; Dang, D.P.; De Souza-Daw, A. Contributing factors of cloud computing adoption:
A technology-organisation-environment framework approach. In Proceedings of the European Conference
on Information Management & Evaluation, Melbourne, Australia, 2–4 December 2013.
170. Zhu, K.; Dong, S.; Xu, S.X.; Kraemer, K.L. Innovation diffusion in global contexts: Determinants of
post-adoption digital transformation of European companies. Eur. J. Inf. Syst. 2006, 15, 601–616. [CrossRef]
171. Zhu, K.; Kraemer, K.L.; Xu, S. The process of innovation assimilation by firms in different countries:
A technology diffusion perspective on e-business. Manag. Sci. 2006, 52, 1557–1576. [CrossRef]
172. Shah Alam, S.; Ali, M.Y.; Jaini, M.M.F. An empirical study of factors affecting electronic commerce adoption
among SMEs in Malaysia. J. Bus. Econ. Manag. 2011, 12, 375–399. [CrossRef]
173. Ifinedo, P. Internet/e-business technologies acceptance in Canada’s SMEs: An exploratory investigation.
Internet Res. 2011, 21, 255–281. [CrossRef]
174. Creswell, J.W.; Creswell, J.D. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches;
Sage Publications: Los Angeles, CA, USA, 2017.
175. Moore, G.C.; Benbasat, I. Development of an instrument to measure the perceptions of adopting an
information technology innovation. Inf. Syst. Res. 1991, 2, 192–222. [CrossRef]
176. Wu, K.; Vassileva, J.; Zhao, Y. Understanding users’ intention to switch personal cloud storage services:
Evidence from the Chinese market. Comput. Hum. Behav. 2017, 68, 300–314. [CrossRef]
177. Kelley, D.L. Measurement Made Accessible: A Research Approach Using Qualitative, Quantitative and Quality
Improvement Methods; Sage Publications: Los Angeles, CA, USA, 1999.
178. McKenzie, J.F.; Wood, M.L.; Kotecki, J.E.; Clark, J.K.; Brey, R.A. Establishing content validity: Using qualitative
and quantitative steps. Am. J. Health Behav. 1999, 23, 311–318. [CrossRef]
179. Zikmund, W.G.; Babin, B.J.; Carr, J.C.; Griffin, M. Business Research Methods, 9th ed.; South-Western Cengage
Learning: Nelson, BC, Canada, 2013.
180. Sekaran, U.; Bougie, R. Research Methods for Business: A Skill Building Approach; John Wiley & Sons: New York,
NY, USA, 2016.
181. Mathieson, K.; Peacock, E.; Chin, W.W. Extending the technology acceptance model. ACM SIGMIS Database
Database Adv. Inf. Syst. 2001, 32, 86. [CrossRef]
182. Diamantopoulos, A.; Winklhofer, H.M. Index construction with formative indicators: An alternative to scale
development. J. Mark. Res. 2001, 38, 269–277. [CrossRef]
183. MacKenzie, S.B.; Podsakoff, P.M.; Podsakoff, N.P. Construct measurement and validation procedures in MIS
and behavioral research: Integrating new and existing techniques. MIS Q. 2011, 35, 293–334. [CrossRef]
184. Petter, S.; Straub, D.; Rai, A. Specifying formative constructs in information systems research. MIS Q. 2007,
31, 623. [CrossRef]
185. Haladyna, T.M. Developing and Validating Multiple-Choice Test Items; Routledge: London, UK, 2004.
186. DeVon, H.A.; Block, M.E.; Moyle-Wright, P.; Ernst, D.M.; Hayden, S.J.; Lazzara, D.J.; Savoy, S.M.;
Kostas-Polston, E. A psychometric toolbox for testing validity and reliability. J. Nurs. Sch. 2007, 39,
155–164. [CrossRef] [PubMed]
187. Webster, J.; Watson, R.T. Analyzing the past to prepare for the future: Writing a literature review. MIS Q.
2002, 26, 13–23.
188. MacKenzie, S.B.; Podsakoff, P.M.; Jarvis, C.B. The problem of measurement model misspecification in behavioral and organizational research and some recommended solutions. J. Appl. Psychol. 2005, 90, 710–730. [CrossRef] [PubMed]
189. Briggs, R.O.; Reinig, B.A.; Vreede, G.-J. The Yield Shift Theory of Satisfaction and Its Application to the IS/IT
Domain. J. Assoc. Inf. Syst. 2008, 9, 267–293. [CrossRef]
190. Rushinek, A.; Rushinek, S.F. What makes users happy? Commun. ACM 1986, 29, 594–598. [CrossRef]
191. Oliver, R.L. Measurement and evaluation of satisfaction processes in retail settings. J. Retail. 1981, 57, 24–48.
192. Swanson, E.B.; Dans, E. System life expectancy and the maintenance effort: Exploring their equilibration.
MIS Q. 2000, 24, 277. [CrossRef]
193. Gill, T.G. Early expert systems: Where are they now? MIS Q. 1995, 19, 51. [CrossRef]
194. Keil, M.; Mann, J.; Rai, A. Why software projects escalate: An empirical analysis and test of four theoretical
models. MIS Q. 2000, 24, 631. [CrossRef]
195. Campion, M.A.; Medsker, G.J.; Higgs, A.C. Relations between work group characteristics and effectiveness: Implications for designing effective work groups. Pers. Psychol. 1993, 46, 823–847. [CrossRef]
196. Baas, P. Task-Technology Fit in the Workplace: Affecting Employee Satisfaction and Productivity; Erasmus
Universiteit: Rotterdam, The Netherlands, 2010.
197. Doolin, B.; Troshani, I. Organizational Adoption of XBRL. Electron. Mark. 2007, 17, 199–209. [CrossRef]
198. Segars, A.H.; Grover, V. Strategic Information Systems Planning Success: An Investigation of the Construct
and Its Measurement. MIS Q. 1998, 22, 139. [CrossRef]
199. Sharma, R.; Yetton, P.; Crawford, J. Estimating the effect of common method variance: The method—Method
pair technique with an illustration from TAM Research. MIS Q. 2009, 33, 473–490. [CrossRef]
200. Gorla, N.; Somers, T.M.; Wong, B. Organizational impact of system quality, information quality, and service
quality. J. Strateg. Inf. Syst. 2010, 19, 207–228. [CrossRef]
201. Malhotra, N.K. Questionnaire design and scale development. In The Handbook of Marketing Research: Uses, Misuses, and Future Advances; Sage: Thousand Oaks, CA, USA, 2006; pp. 83–94.
202. Hertzog, M.A. Considerations in determining sample size for pilot studies. Res. Nurs. Health 2008, 31,
180–191. [CrossRef]
203. Saunders, M.N. Research Methods for Business Students, 5th ed.; Pearson Education India: Bengaluru, India, 2011.
204. Sekaran, U.; Bougie, R. Research Methods for Business, A Skill Building Approach; John Wiley & Sons Inc.: New York, NY, USA, 2003.
205. Tellis, W. Introduction to case study. Qual. Rep. 1997, 3, 2.
206. Whitehead, A.L.; Julious, S.A.; Cooper, C.L.; Campbell, M.J. Estimating the sample size for a pilot randomised
trial to minimise the overall trial sample size for the external pilot and main trial for a continuous outcome
variable. Stat. Methods Med. Res. 2016, 25, 1057–1073. [CrossRef]
207. Hair, J.F., Jr.; Hult, G.T.M.; Ringle, C.; Sarstedt, M. A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM); Sage Publications: Thousand Oaks, CA, USA, 2016.
208. Chin, W.W.; Marcolin, B.L.; Newsted, P.R. A partial least squares latent variable modeling approach
for measuring interaction effects: Results from a Monte Carlo simulation study and an electronic-mail
emotion/adoption study. Inf. Syst. Res. 2003, 14, 189–217. [CrossRef]
209. Hulland, J. Use of partial least squares (PLS) in strategic management research: A review of four recent
studies. Strateg. Manag. J. 1999, 20, 195–204. [CrossRef]
210. Gefen, D.; Rigdon, E.E.; Straub, D. Editor’s comments: An update and extension to SEM guidelines for
administrative and social science research. MIS Q. 2011, 35, 3–14. [CrossRef]
211. Chin, W.W. How to write up and report PLS analyses. In Handbook of Partial Least Squares; Springer:
Berlin/Heidelberg, Germany, 2010; pp. 655–690.
212. Ainuddin, R.A.; Beamish, P.W.; Hulland, J.S.; Rouse, M.J. Resource attributes and firm performance in
international joint ventures. J. World Bus. 2007, 42, 47–60. [CrossRef]
213. Henseler, J.; Ringle, C.M.; Sinkovics, R.R. The use of partial least squares path modeling in international
marketing. In New Challenges to International Marketing; Emerald Group Publishing Limited: Bingley, UK,
2009; pp. 277–319.
214. Urbach, N.; Ahlemann, F. Structural equation modeling in information systems research using partial least
squares. J. Inf. Technol. Theory Appl. 2010, 11, 5–40.
215. Sommerville, I. Software Engineering, 9th ed.; Pearson Education Limited: Harlow, UK, 2011; p. 18.
ISBN 0137035152.
216. Venkatesh, V.; Bala, H. Technology acceptance model 3 and a research agenda on interventions. Decis. Sci.
2008, 39, 273–315. [CrossRef]
217. Venkatesh, V.; Davis, F.D. A theoretical extension of the technology acceptance model: Four longitudinal
field studies. Manag. Sci. 2000, 46, 186–204. [CrossRef]
218. Taylor, S.; Todd, P.A. Understanding information technology usage: A test of competing models. Inf. Syst. Res.
1995, 6, 144–176. [CrossRef]
219. Faulkner, L. Beyond the five-user assumption: Benefits of increased sample sizes in usability testing.
Behav. Res. Methods Instrum. Comput. 2003, 35, 379–383. [CrossRef]
220. Turner, C.W.; Lewis, J.R.; Nielsen, J. Determining usability test sample size. Int. Encycl. Ergon. Hum. Factors
2006, 3, 3084–3088.
221. Zaman, H.; Robinson, P.; Petrou, M.; Olivier, P.; Shih, T.; Velastin, S.; Nystrom, I. Visual Informatics: Sustaining Research and Innovations; LNCS; Springer: Berlin/Heidelberg, Germany, 2011.
222. Hadi, A.; Daud, W.M.F.W.; Ibrahim, N.H. The development of history educational game as a revision tool for
Malaysia school education. In Proceedings of the International Visual Informatics Conference, Selangor, Malaysia, 9–11 November 2011; Springer: Berlin/Heidelberg, Germany, 2011.
223. Marian, A.M.; Haziemeh, F.A. On-line mobile staff directory service: Implementation for the Irbid University College (IUC). Ubiquitous Comput. Commun. J. 2011, 6, 25–33.
224. Brinkman, W.-P.; Haakma, R.; Bouwhuis, D. The theoretical foundation and validity of a component-based
usability questionnaire. Behav. Inf. Technol. 2009, 28, 121–137. [CrossRef]
225. Mikroyannidis, A.; Connolly, T. Case Study 3: Exploring open educational resources for informal learning.
In Responsive Open Learning Environments; Springer: Cham, Switzerland, 2015; pp. 135–158.
226. Shanmugam, M.; Yah Jusoh, Y.; Jabar, M.A. Measuring Continuance Participation in Online Communities.
J. Theor. Appl. Inf. Technol. 2017, 95, 3513–3522.
227. Straub, D.; Boudreau, M.C.; Gefen, D. Validation guidelines for IS positivist research. Commun. Assoc. Inf. Syst.
2004, 13, 24. [CrossRef]
228. Lynn, M.R. Determination and quantification of content validity. Nurs. Res. 1986, 35, 382–385. [CrossRef]
229. Dobratz, M.C. The life closure scale: Additional psychometric testing of a tool to measure psychological
adaptation in death and dying. Res. Nurs. Health 2004, 27, 52–62. [CrossRef]
230. Davis, L.L. Instrument review: Getting the most from a panel of experts. Appl. Nurs. Res. 1992, 5, 194–197.
[CrossRef]
231. Polit, D.F.; Beck, C.T. The content validity index: Are you sure you know what’s being reported? Critique
and recommendations. Res. Nurs. Health 2006, 29, 489–497. [CrossRef] [PubMed]
232. Qasem, Y.A.M.; Asadi, S.; Abdullah, R.; Yah, Y.; Atan, R.; Al-Sharafi, M.A.; Yassin, A.A. A Multi-Analytical
Approach to Predict the Determinants of Cloud Computing Adoption in Higher Education Institutions.
Appl. Sci. 2020, 10. [CrossRef]
233. Coolican, H. Research Methods and Statistics in Psychology; Psychology Press: New York, NY, USA, 2017.
234. Briggs, S.R.; Cheek, J.M. The role of factor analysis in the development and evaluation of personality scales.
J. Personal. 1986, 54, 106–148. [CrossRef]
235. Tabachnick, B.G.; Fidell, L.S. Principal components and factor analysis. Using Multivar Stat. 2001, 4, 582–633.
236. Gliem, J.A.; Gliem, R.R. Calculating, interpreting, and reporting Cronbach's alpha reliability coefficient for Likert-type scales. In Proceedings of the 2003 Midwest Research-to-Practice Conference in Adult, Continuing, and Community Education, Columbus, OH, USA, 8–10 October 2003.
237. Mallery, P.; George, D. SPSS for Windows Step by Step: A Simple Guide and Reference; Allyn & Bacon: Boston,
MA, USA, 2003.
238. Nunnally, J.C. Psychometric Theory, 3rd ed.; McGraw-Hill Education: New York, NY, USA, 1994.
239. Bagozzi, R.P.; Yi, Y. On the evaluation of structural equation models. J. Acad. Mark. Sci. 1988, 16, 74–94.
[CrossRef]
240. Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement
error. J. Mark. Res. 1981, 18, 39–50. [CrossRef]
241. Gefen, D.; Straub, D.; Boudreau, M.-C. Structural Equation Modeling and Regression: Guidelines for Research
Practice. Commun. Assoc. Inf. Syst. 2000, 4, 7. [CrossRef]
242. Chin, W.W.; Marcoulides, G. The partial least squares approach to structural equation modeling. Mod. Methods
Bus. Res. 1998, 295, 295–336.
243. Diamantopoulos, A.; Siguaw, J.A. Formative versus reflective indicators in organizational measure
development: A comparison and empirical illustration. Br. J. Manag. 2006, 17, 263–282. [CrossRef]
244. Hair, J.F.; Ringle, C.M.; Sarstedt, M. Partial least squares structural equation modeling: Rigorous applications,
better results and higher acceptance. Long Range Plan. 2013, 46, 1–12. [CrossRef]
245. Cenfetelli, R.T.; Bassellier, G. Interpretation of formative measurement in information systems research.
MIS Q. 2009, 33, 689–707. [CrossRef]
246. Yoo, Y.; Henfridsson, O.; Lyytinen, K. Research commentary—The new organizing logic of digital innovation:
An agenda for information systems research. Inf. Syst. Res. 2010, 21, 724–735. [CrossRef]
247. Nylén, D.; Holmström, J. Digital innovation strategy: A framework for diagnosing and improving digital
product and service innovation. Bus. Horiz. 2015, 58, 57–67. [CrossRef]
248. Maksimovic, M. Green Internet of Things (G-IoT) at engineering education institution: The classroom of
tomorrow. Green Internet Things 2017, 16, 270–273.
249. Fortino, G.; Rovella, A.; Russo, W.; Savaglio, C. Towards cyberphysical digital libraries: Integrating IoT
smart objects into digital libraries. In Management of Cyber Physical Objects in the Future Internet of Things;
Springer: Berlin/Heidelberg, Germany, 2016; pp. 135–156.
250. Picciano, A.G. The evolution of big data and learning analytics in American higher education. J. Asynchronous
Learn. Netw. 2012, 16, 9–20. [CrossRef]
251. Ifinedo, P. An empirical analysis of factors influencing Internet/e-business technologies adoption by SMEs in
Canada. Int. J. Inf. Technol. Decis. Mak. 2011, 10, 731–766. [CrossRef]
252. Talib, A.M.; Atan, R.; Abdullah, R.; Murad, M.A.A. Security framework of cloud data storage based on
multi agent system architecture—A pilot study. In Proceedings of the 2012 International Conference on
Information Retrieval and Knowledge Management, CAMP’12, Kuala Lumpur, Malaysia, 13–15 March 2012.
253. Adrian, C.; Abdullah, R.; Atan, R.; Jusoh, Y.Y. Factors influencing to the implementation success of big data
analytics: A systematic literature review. In Proceedings of the International Conference on Research and
Innovation in Information Systems, ICRIIS, Langkawi, Malaysia, 16–17 July 2017.
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).