
Coherence and transparency: some advice for qualitative researchers

Crispin Coombs*
Loughborough University, Loughborough, Leicestershire, United Kingdom

*c.r.coombs@lboro.ac.uk

Abstract
There is relatively little advice in the Engineering domain for undertaking qualitative studies. Researchers have to rely on
generic guidance that may result in imprecise application of qualitative methods. A related discipline to Engineering is
Information Systems (IS) and the experiences of the IS domain may provide some useful insights for undertaking qualitative
studies. This paper synthesizes the guidance from the IS community for crafting high quality qualitative studies and
manuscripts. It reports on five themes: i) Establishing philosophical underpinnings; ii) Clarifying theoretical aims; iii) Selecting
qualitative methods; iv) Demonstrating rigour in qualitative data analysis; and v) Grappling with generalisation. The review
stresses the importance of coherence and transparency for crafting qualitative research manuscripts and provides a list of
reflective questions for qualitative research design.
Keywords
Qualitative studies. Guidance. Data analysis. Case study. Generalisability. Information systems. Engineering.

How to cite this article: Coombs, C. (2017). Coherence and transparency: some advice for qualitative researchers.
Production, 27, e20170068. http://dx.doi.org/10.1590/0103-6513.006817

Received: Aug. 28, 2017; Accepted: Sept. 5, 2017.

1. Introduction
Engineering research has traditionally adopted positivist research approaches using quantitative methods
(Borrego et al., 2009; Gawlik, 2016; Koro-Ljungberg & Douglas, 2008; Zhou, 2012). However, the dominance
of this research paradigm and method is starting to change in some Engineering disciplines. For example,
Daly et al. (2013) observe that numerous studies have used qualitative methods to study Engineering Design.
However, because of the dominance of positivist traditions much work is still judged by positivist criteria and
expectations, which may be limiting the publication opportunities for qualitative studies. One way to address
this challenge is for researchers to ensure that they are adopting appropriate research paradigms, suitable
methodological techniques and crafting convincing and persuasive arguments in their manuscripts. Unfortunately,
there is relatively little advice in the Engineering domain for undertaking qualitative studies. Consequently,
researchers have to rely on generic guidance that may result in imprecise application of qualitative methods.
A related discipline to engineering that has experienced similar positivist dominance is Information Systems
(IS) (Davison & Martinsons, 2011). Like the Engineering disciplines, IS is distinct from the approaches of the
natural sciences that are concerned with describing or explaining what exists or what has existed. By contrast,
the Engineering and IS disciplines aim to describe and explain how to create what does not now exist or has not
yet existed (Lee, 2014). IS has seen steady growth over the past 20 years in the number of qualitative research
studies published in high quality international IS journals and conferences. Therefore, the experiences of the IS
domain may provide some useful insights for Engineering researchers wishing to undertake qualitative studies.
There has been considerable advice provided by the IS research community regarding the design and
application of qualitative research. However, much of this advice is scattered across journals either as editorial



reviews, literature reviews, method papers, or commentary or viewpoint articles. Consequently, it is easy for
researchers to miss or omit key points in terms of their qualitative method options or choices. This narrative
review (Paré et al., 2015) synthesizes existing advice and guidance from the IS community for crafting high
quality qualitative studies and manuscripts. Thus, the research question for this paper was: what guidance has been provided by the IS community for undertaking qualitative research? It was envisaged that this review would help researchers publish high quality qualitative studies in the Engineering disciplines.
The paper begins by conceptualising qualitative research and then discusses the core themes that emerged from
reviewing the extant literature. These themes are structured around the core components of a research method:

(i) Establishing philosophical underpinnings;
(ii) Clarifying theoretical aims;
(iii) Selecting qualitative methods;
(iv) Demonstrating rigour in qualitative data analysis; and
(v) Grappling with generalisation.

The paper then presents a discussion of two overarching themes, transparency and coherence, that permeate
throughout the review and concludes with a series of reflective questions and associated resources to help
researchers focus on key considerations for crafting qualitative studies.

2. Conceptualising qualitative research


So what is qualitative research? The literature provides a range of definitions and interpretations of qualitative
research. Conboy et al. (2012) state that qualitative research aims to empirically investigate a variety of phenomena
through qualitative data from a variety of sources, such as interviews, observations, design efforts, interventions,
and archival materials. Gephart (2004) stresses that it studies phenomena where they naturally occur, and that it strives to grasp the meanings social actors give them. Similarly, Borrego et al. (2009, p. 56) characterise qualitative research by the textual nature of many of its data sources and “[...] by its emphasis on the context within which the study occurs [...]”. Indeed, drawing on arguments presented by Orlikowski & Iacono (2001), Sarker et al. (2013) observe
that qualitative researchers’ focus on social/human dynamics can mean that the role of the material artefact
is neglected. This focus on the social aspects in research may help to explain the emergence of a common
misconception that equates qualitative research with interpretivism and quantitative research with positivism
(Conboy et al., 2012; Silverman, 1998).

3. Method
In order to collate the relevant literature to inform this study, the Association for Information Systems (AIS) ‘basket of eight’ IS journals was searched between 1991 and 2017. These journals are considered the top journals
in the IS field and include European Journal of Information Systems; Information Systems Journal; Information
Systems Research; Journal of AIS; Journal of Information Technology; Journal of MIS; Journal of Strategic
Information Systems; and MIS Quarterly. Each journal was searched using the term ‘qualitative’. The search
results were then manually reviewed by scanning the titles and abstracts of the identified articles, which were included if they provided guidance on qualitative study design. Articles on action research and design science were not included, as they involve aspects that are not typically core to qualitative studies (Sarker et al., 2013). This filtering
process provided a sample of research essays, editorials, viewpoint articles, commentaries and literature reviews.
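
As a purely illustrative aid, the screening logic described above can be sketched in a few lines of Python. The journal list and exclusion topics come from the text, but the metadata fields, record structure and example entries below are hypothetical; this is not the tooling actually used for this review.

```python
# A minimal sketch of the screening step described above, not the tooling
# actually used for this review; metadata fields and records are hypothetical.

AIS_BASKET = {
    "European Journal of Information Systems", "Information Systems Journal",
    "Information Systems Research", "Journal of AIS",
    "Journal of Information Technology", "Journal of MIS",
    "Journal of Strategic Information Systems", "MIS Quarterly",
}

# Action research and design science articles were excluded (Sarker et al., 2013).
EXCLUDED_TOPICS = ("action research", "design science")

def include(article: dict) -> bool:
    """Apply the stated inclusion criteria to one article's metadata."""
    text = (article["title"] + " " + article["abstract"]).lower()
    return (
        article["journal"] in AIS_BASKET
        and 1991 <= article["year"] <= 2017
        and "qualitative" in text                        # the search term used
        and not any(t in text for t in EXCLUDED_TOPICS)
    )

# Hypothetical records standing in for the manual title/abstract scan.
search_results = [
    {"journal": "MIS Quarterly", "year": 2013,
     "title": "Qualitative studies in information systems",
     "abstract": "A critical review and some guiding principles."},
    {"journal": "MIS Quarterly", "year": 2013,
     "title": "Evaluating design science artefacts",
     "abstract": "A qualitative evaluation within design science research."},
]
sample = [a for a in search_results if include(a)]  # keeps only the first record
```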

4. Advice for crafting qualitative research studies


Reviewing the sample of literature revealed five themes. These themes are discussed in the following sections.

4.1. Establishing philosophical underpinnings


Several authors have highlighted that a researcher’s philosophical assumptions will influence their methodological
design decisions (Keutel et al., 2013). The majority of IS studies draw on one of three main philosophical
paradigms: positivism, interpretivism and critical research. Positivism has long been the dominant paradigm



that emerged from the natural sciences. Positivists assume that there is an objective reality that is independent
from the observer, and that cause-effect relations that are not bounded by context or time can be identified.
By contrast, interpretivists see the world as socially constructed and believe it can be understood by exploring
the meanings that people assign to phenomena. Critical researchers aim to understand and explain phenomena
but also to challenge established social structures and norms in organisations and society (Keutel et al., 2013).
Writing with regard to case study research, Keutel et al. (2013) observe that it is important for researchers to
establish and state their philosophical position and ensure that their methodological design decisions align
with their espoused philosophical paradigm. This is important as the execution of a particular research method
may change depending on the underlying philosophical position of the researcher. For example, case study
research can be conducted in positivist, interpretivist, critical research or critical realism traditions, or may be
part of a pluralist methodology (Conboy et al., 2012). Each philosophical paradigm will influence the case study
research design and execution. Researchers following the positivist position will tend to favour multiple case
studies often following replication logic. Interpretivists will use a smaller number or a single case because they
are seeking detailed in-depth observations to explain the phenomena of interest. Consequently, it is important
that the philosophical worldview of the researcher is clearly stated and the application of qualitative research
methods is consistent with this view. The worldview of the researcher will also influence the theoretical aims
set for the study.

4.2. Clarifying theoretical aims


One of the primary objectives for many studies is to make a contribution to theory as well as practice. It has
been argued that in some respects the demand for theory contributions may have become too strong, with
researchers prioritising theory contribution over and above the practical relevance of the research. This debate
was most recently illustrated in the 2014 special issue of the Journal of Information Technology where
Avison & Malaurent (2014) argue for more ‘theory light’ contributions. The ensuing debate from senior scholars
suggested that theory was still relevant but that there needed to be greater clarity from researchers regarding
the type of theory contribution they were making, the significance of that theoretical contribution and the
adoption of appropriate research methods for theory development.
Gregor (2006) discusses five main types of theory: (i) theory for analysing; (ii) theory for explaining; (iii) theory
for predicting; (iv) theory for explanation and prediction; and (v) theory for design and action. The more common
applications of theory types in IS tend to be either to explain, predict or explain and predict. Studies that aim
to build theory for explaining are not aiming to make testable predictions about the future, but instead seek
to understand how and why events occur. These studies provide a basis for process theory building, helping
to explain how and why events occurred, e.g. structuration theory (Giddens, 1986) or actor-network theory (Latour, 1990). By contrast, studies that investigate theory for predicting will aim to predict what will be but
do not explain why. These studies essentially accept that the underlying causal mechanisms remain in a ‘black
box’, e.g. Moore’s Law (Moore, 1965). Theories for predicting and explaining imply an understanding of the
underlying causal factors that will lead to particular events or outcomes, e.g. Technology Acceptance Model
(TAM) (Davis et al., 1989). While Gregor (2006) argues that no theory type belongs to a particular research
paradigm, she acknowledges that researchers who align with a particular worldview will tend to focus on some
theory types over others. For example, interpretivists will tend to advocate theory for explaining and positivists
will tend to focus on theory for predicting or theory for predicting and explaining. However, it is important to
note that while these norms have become evident in IS research, “[...] there is no clear and direct connection
between any theory type and any one paradigm [...]” (Gregor, 2006, p. 632). In other words, it is reasonable for
researchers that advocate a positivist paradigm to develop theory for explaining and interpretivists to develop
theory for explaining and predicting. A researcher’s worldview is also likely to be reflected in their preference
for particular methods.

4.3. Selecting qualitative methods


Having established the theoretical aims for the research it is important to select appropriate methods
that align with these aims. Studies that attempt to develop theory for explaining, commonly advocated by
researchers that follow an interpretivist paradigm, tend to employ case studies or ethnographic field studies
often drawing on qualitative research techniques. Studies that investigate theory for explaining and predicting



tend to employ case studies, surveys, archival studies, experiments and statistical analysis and tend to be more
aligned with quantitative research techniques. These studies are often undertaken by researchers that follow a
positivist paradigm (Gregor, 2006). However, it is important to stress that a research paradigm does not dictate the choice of research method; i.e. a study that employs qualitative research methods does not mean that the researchers are following an interpretivist paradigm (Conboy et al., 2012). For example, researchers that take a
positivist position have used theory to establish hypotheses to be tested empirically in qualitative case studies
(Sarker et al., 2013).
IS case study research often features qualitative research methods. In their review of qualitative research in
four leading IS journals Sarker et al. (2013) found that 93% of the studies were case based. Case study research
has grown in popularity in the IS discipline accounting for about 25% of the research publications in core IS
journals between 2001 and 2010 (Keutel et al., 2013). Case study research is suitable for studying broad and
complex phenomena where there is insufficient knowledge to determine clear causal questions (Keutel et al., 2013).
Case study research in IS tends to be positioned in either a positivist or interpretive research stream
(Keutel et al., 2013). In the past, case study research was dominated by positivist researchers and would be informed by methodological authorities such as Benbasat et al. (1987), Eisenhardt (1989), Paré (2004) and Yin (2009). More recently, the number of interpretive case studies has grown, drawing on advice from authorities such as Klein & Myers (1999), Walsham (1995) and Yin (2009). The observant reader will note that Yin (2009) features as
an authority for both positivist and interpretivist studies, illustrating the need for researchers to be mindful that
qualitative methods are not exclusive to particular research traditions. As noted earlier the choice of research
paradigm is important as it influences how case study research is applied as illustrated in Table 1.

Table 1. Contrasting research paradigms in case study research (Keutel et al., 2013).

Positivist case study research: Seek universal facts that are generalisable beyond the investigated cases.
Interpretive case study research: Believe that reality is perceived individually and that knowledge is created through social interaction, arguing that the primary purpose of theorising is to understand and explain the case studied (Klein & Myers, 1999).

Positivist: Take the view that all aspects of the research design have to be defined prior to entering the field (Yin, 2009).
Interpretive: Adopt a flexible approach to case study design and are open to adjusting initial assumptions and theories (Walsham, 1995).

Positivist: Prefer multiple case studies (often between 4 and 10) to enable analytical generalisability (Eisenhardt, 1989).
Interpretive: Seek in-depth understanding and are therefore content with a small number of cases or a single case study approach (Walsham, 1995).

Positivist: Seek multiple data sources to arrive at a single explanation, generally preferring quantitative data (Yin, 2009).
Interpretive: Seek different viewpoints that may conflict in order to understand the case (Klein & Myers, 1999).

Reviews of IS case study research (Keutel et al., 2013) and qualitative research (Sarker et al., 2013) reveal
a number of common characteristics of qualitative studies. The theoretical aim of the majority of interpretive
studies was to develop theory to explain, whereas positivist studies tended to test constructs and hypotheses, i.e. theory to explain and predict. For positivist case study research there was a slight preference for multiple
cases, but a significant number of positivist case studies adopted a single case approach. By contrast, for
interpretivist case study research the majority of studies adopted a single case approach (Keutel et al., 2013).
Sarker et al. (2013) found a similar pattern with over half of their sample of qualitative studies adopting a single
case study design. Sarker et al. add that the majority of papers in their sample provided a rationale for
their sampling logic and explained why certain cases were chosen. Both Keutel et al. (2013) and Sarker et al.
(2013) found interviews to be the dominant mode of data collection in positivist and interpretivist studies,
often supported by document and observation data. In Sarker et al.’s sample the number of interviews ranged from 6 to 175, with an overall average of 40, and the majority of studies (64%) stated that the interviews had been recorded and transcribed. Many of the studies in Sarker et al.’s sample used a combination of document,
observation or field notes to support interview data.
However, the reviews also revealed several weaknesses in reporting qualitative case study research. Keutel et al.
(2013) observed that the majority of researchers did not state the philosophical paradigm that underpinned their study. A significant number of studies (44%) did not include a rationale for their sampling logic or explain
why certain cases were chosen (Sarker et al., 2013). The majority of studies did not state the unit of analysis
adopted for the study (Keutel et al., 2013), provide details of the interview schedule or topic list used to guide
interviews, or explain and illustrate the role of supporting materials (e.g. document, observation or field notes)
in their analysis.



Based on these observations, Keutel et al. (2013) conclude that researchers should take a mindful approach to case study research, carefully considering their research design decisions and articulating these choices and their underpinning rationale. Researchers need to be detailed in their explanations
for these decisions and provide illustrations and examples of how they were applied. Many qualitative studies
include some of this detail but the observations of Keutel et al. (2013) and Sarker et al. (2013) demonstrate
that there is room for further improvement. Adopting this transparent approach may also help researchers to
demonstrate rigour in study design and analysis.

4.4. Demonstrating rigour in qualitative data analysis


Qualitative researchers face a number of challenges when undertaking data analysis. They are often faced
with a large, unstructured data set, strict page limit requirements to present results for journal and conference
paper submissions and a need to demonstrate rigorous data collection and analysis (Conboy et al., 2012).
One way to address this challenge is to refer to established guidelines for undertaking qualitative research and
show how the research adheres to reference criteria (e.g. providing a table of reference criteria and explaining
how each requirement has been addressed). However, qualitative research includes a range of different methods
and techniques and therefore it is important that the researchers are careful to maintain internal coherence in
the methodological decisions that they present in their work. For example, Sarker et al. (2013) describe how
some studies have conflated positivist requirements of validity and reliability with claiming an interpretivist
approach in a single study.
Qualitative IS papers tend to follow an inductive approach to data analysis. Sarker et al. (2013) found that qualitative studies adopted grounded theory analysis (using open, axial and selective coding), thematic coding, content analysis or hermeneutics. They add that while many studies report the use of coding structures, relatively few provide a transparent explanation of these procedures through illustrations and
worked examples. Further, when reporting the outputs of analysis many researchers struggle to write succinct
and persuasive arguments. To address this weakness, Conboy et al. (2012) recommend that researchers look for
exemplar studies published in leading journals to guide their writing combined with advice on qualitative analysis
(Miles et al., 2013), scientific writing (Day, 1998) and qualitative reporting (Golden-Biddle & Locke, 1997).
For example, when adopting an inductive approach to analysis for an interpretive study it is important to
ensure that the findings of the study are ‘data-led’. That is, to present a detailed narrative of the relevant events
and associated perceptions and opinions of participants concerning the phenomena of interest supported by a
range of illustrative quotes from different participants. Presenting these data allows the researcher to extract the
key features and to offer an interpretation that may be theory informed. For an exemplar see Abubakre et al.
(2015).
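
To make the idea of a worked coding example concrete, the fragment below sketches the kind of audit trail that links raw data to open codes and higher-level abstractions. The quotes, codes and themes are invented purely for illustration; this is an assumption-laden sketch, not a prescribed tool or the coding of any study cited here.

```python
# Hypothetical fragment of a coding audit trail; the quotes, codes and
# categories are invented solely to illustrate a worked example.
from collections import defaultdict

coding_trail = [
    {"quote": "We only found out about the rollout when the system went live.",
     "open_code": "late communication",
     "axial_category": "implementation process",
     "selective_theme": "trust in the project team"},
    {"quote": "The old spreadsheet still gets used because everyone knows it.",
     "open_code": "workaround persistence",
     "axial_category": "resistance behaviours",
     "selective_theme": "trust in the project team"},
]

# Group open codes under each selective theme, mirroring how a manuscript's
# worked example might show the movement from raw data towards theory.
themes = defaultdict(list)
for entry in coding_trail:
    themes[entry["selective_theme"]].append(entry["open_code"])

print(dict(themes))
# {'trust in the project team': ['late communication', 'workaround persistence']}
```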

4.5. Grappling with generalisation


Qualitative researchers are often unclear about the role of generalisation in their studies (Conboy et al., 2012).
Indeed, as the purpose of many qualitative case studies is to develop a deep understanding of a complex situated phenomenon, some researchers would argue that generalisability should not be an aim. However, Lee & Baskerville (2003) observe that there are several forms of generalisability, and statistical, sample-based generalisation is only one of these. For example, it is possible for researchers to argue that they are generalising to theory, employing analytic generalisation, rather than directly to other situations or contexts. Taking this approach,
qualitative researchers can generalise from their empirical findings to build new propositions for theory building.
These propositions may then be investigated through future studies, thereby allowing the original empirical
findings from the initial study to be generalised to other contexts via a theoretical framework or model
(Conboy et al., 2012).
A further dimension to the issue of generalisability has been highlighted in the recent debate in the Journal
of Information Technology. In their research essay ‘Context is King!’ Davison & Martinsons (2016) argue that
many IS researchers do not give explicit consideration to the context and its key characteristics when reporting
empirical data and thus risk overstating the validity of research findings and conclusions. Sarker (2016) adds that
it is important to strike a balance between contextualising findings and generalisability, and it is important
not to neglect context as it may change the nature of theory and empirical results. He notes that for deductive
theory testing researchers tend to only consider context when they need to find an explanation for anomalous
results. As these studies only consider contextual issues in an ad hoc manner it is difficult to determine whether
the most important contextual issues are being considered. To address this issue Sarker suggests that theory



could be systematically adapted to context, thereby responding to issues of transferability of theoretical concepts.
For example, investigating whether Western context derived theoretical concepts are transferable to Eastern
contexts. For studies adopting an inductive approach to data analysis researchers need to consider the levels of
abstraction that they can draw from their findings. Thus, researchers have to be mindful of how far they have
to abstract away from the context characteristics of their study to provide universal results.
One possible reason why researchers may choose not to discuss context is that they are working in non-Western cultures.
Western, and economically developed, countries especially North America and Europe have historically dominated
academic thought and are home to the majority of leading academic journals. Thus, for researchers outside of
these countries there are concerns that there is less interest in their contexts because abstractions from these
studies may not be transferable to developed countries (Cheng et al., 2016). Consequently, some researchers have taken
theories created in developed, or Western countries and applied them to their local contexts. However, Davison
& Martinsons (2016) argue that Western-based theories need rigorous testing and that their cross-cultural
applicability should not be assumed. They suggest that studies that take these theories and attempt to apply
them in other cultures without considering constructs that are specific to the local environment are missing
an important opportunity for informing their theorising. However, they acknowledge that theories developed
in non-Western contexts are likely to include ‘concepts somewhat alien to Western minds’. Thus, non-Western
researchers, or researchers from developing countries, may choose to adopt different strategies. They may take
theory derived from Western or developed country contexts and investigate its applicability and possible extension
to non-Western, or developing country contexts through the inclusion of contextual concepts. However, as
Davison & Martinsons (2016) observe, this means that the study is still founded on Western or developed country
thinking. Alternatively, researchers in non-Western or developing countries may wish to develop native theory and
compare it to Western or developed country based theories to potentially enhance either one of them, or to
illustrate where context differentiates each theoretical position.

5. Discussion and concluding remarks


The preceding sections suggest that when designing qualitative studies researchers have considerable freedom
in terms of positioning their research, their theoretical aims and the methods employed. However, while there
are no prescriptive guidelines that require particular method choices or theoretical aims to be aligned with
particular research paradigms, it is important that researchers are mindful of the coherence and transparency of
their qualitative research design.
In terms of coherence, it is important to demonstrate a logical structure that aligns the researcher’s advocated
research paradigm with the theoretical aims of the study and the chosen methodological techniques. Stating the researcher’s position with regard to research paradigm, theoretical aim and research method allows the research
community to understand the value and contribution of the study. Establishing the position of the study in
these respects also helps the researcher to explain and justify their research design decisions. Establishing this
internal coherence is particularly important as it provides an opportunity for researchers to connect with
established norms and traditions. These norms and traditions are manifested in editors’, reviewers’ and researchers’
interpretations of the quality of a qualitative research study. While researchers may wish to deviate from these
norms they should be mindful that following established norms means that less explanation and justification
is required in their manuscripts. Thus, ensuring strong internal coherence will allow evaluators to follow the
logic of the researcher’s design choices and is likely to increase the confidence of reviewers and readers that
the study is rigorous and of high quality.
One way of demonstrating the coherence of a qualitative study is by adopting a transparent approach to
reporting. This review reveals that qualitative studies often omit important information that is vital for assessing
the quality of the study. Many researchers do not state their research paradigm, their sampling logic, the unit
of analysis, or the interview schedule they used. These omissions could be considered akin to leaving out
reliability and validity checks in quantitative studies. Thus, providing a detailed and transparent explanation of
all methodological aspects of a qualitative study is key to establishing the rigour of qualitative work. For example,
researchers need to provide details of the research paradigm, theoretical aims and methodological principles used
to guide the study. The choice of study context, selection of cases and participants along with supporting data
sources and interview schedule need to be stated. Details of the approach to data analysis and illustrations of
how the analysis was performed for all data sources also need to be included. Essentially, researchers need to
make a conscious effort to be meticulous when explaining their qualitative work.



To help researchers ensure that their qualitative studies have high levels of coherence and transparency Table 2
attempts to encapsulate the key guidance from this paper through a series of reflective questions. Although by
no means a definitive list, it is hoped that these questions will highlight some of the key considerations for
qualitative research reporting.

Table 2. Questions to consider when developing qualitative studies (sources for guidance in parentheses).

1. Have you stated your philosophical position? (Conboy et al., 2012; Sarker et al., 2013)
2. Have you explained the theoretical aims of your study? (Avison & Malaurent, 2014; Gregor, 2006)
3. Have you explained how you chose your data collection site and context, and considered their limitations? (Davison & Martinsons, 2016; Keutel et al., 2013; Sarker et al., 2013)
4. Have you explained your sampling strategy, e.g. multiple, single, extreme, replication? (Cheng et al., 2016; Keutel et al., 2013; Sarker et al., 2013)
5. Are you using more than interview data, e.g. observations, archival data, quantitative data, documents? (Keutel et al., 2013; Sarker et al., 2013; Silverman, 1998)
6. Have you described your sample, e.g. number of interviewees, job roles, number of follow-up interviews, number of observations, data collection timespan? (Sarker et al., 2013)
7. Have you included your interview schedule? (Sarker et al., 2013)
8. Have you explained how you analysed your qualitative data (including supporting data), providing a worked example of each analysis stage? (Sarker et al., 2013)
9. Have you considered the generalisability of your study? (Cheng et al., 2016; Conboy et al., 2012; Davison & Martinsons, 2016; Sarker, 2016)

It is hoped that this review provides a resource for doctoral researchers, early career researchers, established
researchers and evaluators to guide the development and evaluation of qualitative research studies without
being prescriptive. There are many detailed text books (e.g. Corbin & Strauss, 2014; Miles et al., 2013) and
established web based resources (e.g. the AIS living scholarship site to support qualitative methods in IS research
(Myers, 1997)) that already exist. This review is not intended to replace these and researchers are encouraged
to refer to these sources for precise details regarding design and method.
Qualitative studies provide a valuable contribution to research knowledge complementing and enhancing
findings from more commonly applied research methods. They provide an opportunity to explain and understand
phenomena in a deeper way than quantitative studies. However, in order for the contribution of qualitative
studies to be heard, they have to be able to demonstrate the same levels of coherence, transparency and rigour
as we have grown to expect from positivist quantitative studies. Hopefully, this paper will be a useful resource
to help researchers achieve this goal.

References
Abubakre, M. A., Ravishankar, M. N., & Coombs, C. R. (2015). The role of formal controls in facilitating information system diffusion.
Information & Management, 52(5), 599-609. http://dx.doi.org/10.1016/j.im.2015.04.005.
Avison, D., & Malaurent, J. (2014). Is theory king?: questioning the theory fetish in information systems. Journal of Information
Technology, 29(4), 1-10. http://dx.doi.org/10.1057/jit.2014.8.
Benbasat, I., Goldstein, D. K., & Mead, M. (1987). The case research strategy in studies of information systems. Management Information
Systems Quarterly, 11(3), 369-386. http://dx.doi.org/10.2307/248684.
Borrego, M., Douglas, E. P., & Amelink, C. T. (2009). Quantitative, qualitative, and mixed research methods in engineering education.
Journal of Engineering Education, 98(1), 53-66. http://dx.doi.org/10.1002/j.2168-9830.2009.tb01005.x.
Cheng, Z., Dimoka, A., & Pavlou, P. A. (2016). Context may be King, but generalizability is the Emperor! Journal of Information
Technology, 31(3), 257-264. http://dx.doi.org/10.1057/s41265-016-0005-7.
Conboy, K., Fitzgerald, G., & Mathiassen, L. (2012). Qualitative methods research in information systems: motivations, themes, and
contributions. European Journal of Information Systems, 21(2), 113-118. http://dx.doi.org/10.1057/ejis.2011.57.
Corbin, J. M., & Strauss, A. L. (2014). Basics of qualitative research : techniques and procedures for developing grounded theory.
London: Sage. Retrieved in 2017, August 17, from https://books.google.co.uk/books?hl=en&lr=&id=Dc45DQAAQBAJ&oi=fnd&pg
=PP1&dq=strauss+and+corbin&ots=M2EI3QjYur&sig=EW4ndA40fHGiFZxlUWrfcgQcLBo#v=onepage&q=strauss and corbin&f=false
Daly, S., McGowan, A., & Papalambros, P. (August, 2013). Using qualitative research methods in engineering design research. In
Proceedings of the 19th International Conference on Engineering Design (pp. 203-212). Seoul: ICED. Retrieved in 2017, August 17,
from https://www.engineeringvillage.com/share/document.url?mid=cpx_M34eddd49145902ed14eM711f10178163125&database=cpx
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: a comparison of two theoretical models.
Management Science, 35(8), 982-1003. http://dx.doi.org/10.1287/mnsc.35.8.982.



Davison, R. M., & Martinsons, M. G. (2011). Methodological practice and policy for organisationally and socially relevant IS research: an
inclusive–exclusive perspective. Journal of Information Technology, 26(4), 288-293. http://dx.doi.org/10.1057/jit.2011.19.
Davison, R. M., & Martinsons, M. G. (2016). Context is king! Considering particularism in research design and reporting. Journal of
Information Technology, 31(3), 241-249. http://dx.doi.org/10.1057/jit.2015.19.
Day, R. (1998). How to write and publish a scientific paper (5th ed.). Phoenix: Oryx Press.
Eisenhardt, K. M. (1989). Building theories from case study research. Academy of Management Review, 14(4), 532-550.
Gawlik, R. (2016). Methodological aspects of qualitative-quantitative analysis of decision-making processes. Management and Production
Engineering Review, 7(2), 3-11. http://dx.doi.org/10.1515/mper-2016-0011.
Gephart, R. P. (2004). Qualitative research and the academy of management journal. Academy of Management Journal, 47(4), 454-462.
http://dx.doi.org/10.5465/AMJ.2004.14438580.
Giddens, A. (1986). The constitution of society: outline of the theory of structuration. Berkeley: University of California Press. Retrieved
in 2017, August 18, from http://www.ucpress.edu/book.php?isbn=9780520057289
Golden-Biddle, K., & Locke, K. (1997). Composing qualitative research. Thousand Oaks, CA: Sage Publications.
Gregor, S. (2006). The nature of theory in information systems. Management Information Systems Quarterly, 30(3), 611-642.
Keutel, M., Michalik, B., & Richter, J. (2013). Towards mindful case study research in IS: a critical analysis of the past ten years. European
Journal of Information Systems, 23(3), 256-272. http://dx.doi.org/10.1057/ejis.2013.26.
Klein, H. K., & Myers, M. D. (1999). A set of principles for conducting and evaluating interpretive field studies in information systems.
Management Information Systems Quarterly, 23(1), 67-94. http://dx.doi.org/10.2307/249410.
Koro-Ljungberg, M., & Douglas, E. P. (2008). State of qualitative research in engineering education: meta-analysis of JEE articles,
2005-2006. Journal of Engineering Education, 97(2), 163-175. http://dx.doi.org/10.1002/j.2168-9830.2008.tb00965.x.
Latour, B. (1990). Technology is society made durable. The Sociological Review, 38(1 Suppl), 103-131. http://dx.doi.org/10.1111/j.1467-954X.1990.tb03350.x.
Lee, A. S. (2014). Theory is king? But first, what is theory? Journal of Information Technology, 29(4), 350-352. http://dx.doi.org/10.1057/jit.2014.23.
Lee, A. S., & Baskerville, R. L. (2003). Generalizing generalizability in information systems research. Information Systems Research, 14(3),
221-243. http://dx.doi.org/10.1287/isre.14.3.221.16560.
Miles, M. B., Huberman, A. M., & Saldaña, J. (2013). Qualitative data analysis: a methods sourcebook. London: Sage. Retrieved in 2017,
August 18, from https://books.google.co.uk/books?hl=en&lr=&id=3CNrUbTu6CsC&oi=fnd&pg=PR1&dq=miles+and+huberman&
ots=Lh1WolTO9h&sig=U2yXDtGygPzHYUIMM7B89fDwL7E#v=onepage&q=miles and huberman&f=false
Moore, G. E. (1965). Cramming more components onto integrated circuits. Electronics, 38(8), 114-117. http://dx.doi.org/10.1109/JPROC.1998.658762.
Myers, M. D. (1997). Qualitative research in information systems. Management Information Systems Quarterly, 21(2), 241-242. http://dx.doi.org/10.2307/249422.
Orlikowski, W. J., & Iacono, C. S. (2001). Research commentary: desperately seeking the “IT” in IT research—a call to theorizing the IT
artifact. Information Systems Research, 12(2), 121-134. http://dx.doi.org/10.1287/isre.12.2.121.9700.
Paré, G. (2004). Investigating information systems with positivist case study research. Communications of the Association for Information
Systems, 13(1), 233-264.
Paré, G., Trudel, M.-C. C., Jaana, M., & Kitsiou, S. (2015). Synthesizing information systems knowledge: a typology of literature reviews.
Information & Management, 52(2), 183-199. http://dx.doi.org/10.1016/j.im.2014.08.008.
Sarker, S. (2016). Building on Davison and Martinsons’ concerns: a call for balance between contextual specificity and generality in IS
research. Journal of Information Technology, 31(3), 250-253. http://dx.doi.org/10.1057/s41265-016-0003-9.
Sarker, S., Xiao, X., & Beaulieu, T. (2013). Qualitative studies in information systems: a critical review and some guiding principles.
Management Information Systems Quarterly, 37(4), 3-18.
Silverman, D. (1998). Qualitative research: meanings or practices? Information Systems Journal, 8(1), 3-20. http://dx.doi.org/10.1046/j.1365-2575.1998.00002.x.
Walsham, G. (1995). Interpretive case studies in IS research: nature and method. European Journal of Information Systems, 4(2), 74-81.
http://dx.doi.org/10.1057/ejis.1995.9.
Yin, R. K. (2009). Case study research: design and methods (4th ed.). London: Sage Publications.
Zhou, Z. (2012). Qualitative research in engineering management. In Proceedings of the American Society for Engineering Management
2012 International Annual Conference (pp. 707-713). Virginia Beach: American Society for Engineering Management. Retrieved in
2017, August 19, from http://arrow.dit.ie/engscheleart

