
Research Methods in

Information Systems
The Scientific Method in Research
Topics
 Data collection
 Analysis in Qualitative research
 Analysis in Quantitative research
 Analysis in Design Science research
 Discussion and interpretation
Data Collection
 Data collection is the process of gathering
and measuring information on variables of
interest
 Data collection is conducted in a systematic fashion that enables the researcher to answer the stated research questions, test hypotheses, and evaluate outcomes.
 Regardless of the type of research, accurate data collection is essential for maintaining the integrity of the research.
Cont’d..
 Data collection is conducted after:
 developing the instruments (in the case of survey and qualitative studies)
 determining the data sources (in the case of experimental, analytical, or predictive studies)
 setting the objectives of the design artifacts, and again after the construction of the artifact (in the case of design science)
Primary and Secondary data
 Primary data is data that is collected from firsthand
experience
 Since primary data is collected based on the specific problem
at hand, it is more reliable than secondary data.
 Methods of primary data collection include:
 Questionnaires
 Interviews
 Focus Group Interviews
 Observation
 Sources of secondary data include:
 Books
 Journals
 Magazines
 Government and public sector records
Questionnaire
 A questionnaire is a research instrument
consisting of a series of questions and other
prompts for the purpose of gathering
information from respondents.
 For best response, questions are usually sequenced to flow from:
 The least sensitive to the most sensitive
 The factual and behavioral to the attitudinal
 The more general to the specific
Cont’d..
 Advantages of a questionnaire:
 Large amounts of information can be collected from a large number of people in a short period of time and in a relatively cost-effective way
 Can be carried out by the researcher or by other people with limited effect on validity and reliability
 The results can be easily and quickly quantified
 Can be analyzed more objectively
Cont’d..
 Disadvantages of a questionnaire:
 Inadequate for capturing some kinds of information, such as changes in emotions, behavior, feelings, etc.
 There is no way to tell how truthful a respondent is
 Respondents may read questions differently and answer based on their own interpretation
Interview
 An interview is a means of asking questions and getting answers from participants in a study.
 Interviews can be:
 Structured
 Each respondent is asked the same series of questions, prepared ahead of the interview (mainly close-ended questions)
 Semi-structured
 An interview guide is used to ask questions in a set order; questions outside the guide may be raised as required
 Unstructured
 A preconceived goal or objective of the interview guides how it is conducted, without following a structured flow
Cont’d..
 Advantages of interview:
 Yields rich information
 Interviewer can clarify and correct questions
that are likely to be misunderstood
 The interviewer can observe body gestures,
facial expressions etc.
 Disadvantages of interview:
 It is demanding in terms of cost, energy, and
time
 Interviewer bias may distort results
Focus Group Discussion (FGD)
 A focus group discussion (FGD) is an in-depth field method that brings together a small homogeneous group (usually six to twelve persons) to discuss topics on a study agenda.
 The purpose is to stimulate participants to reveal underlying opinions, attitudes, and reasons for their behavior with the help of a facilitator.
 The ideal size of the group is usually 6-10 people because:
 smaller groups may limit the amount of information collected, and
 larger groups may make it difficult for all participants to participate and interact, and for the facilitator to make sense of the information given.
Cont’d..
 In combination with other methods, FGD can be used to:
 Explore new research areas
 Explore a topic that is difficult to observe
 Collect a concentrated set of observations in a short time span
 Clarify findings from other methods
Observation
 Observation is a systematic examination of
people and their behavior in their naturally
occurring situations.
 It involves:
 prolonged engagement in a setting or social situation;
 clearly expressed, self-conscious notations of how
observing is done;
 methodical and tactical improvisation in order to
develop a full understanding of the setting of interest;
 recording one’s observations.
Cont’d..
 Qualitative records of behavior:
 Observation can provide rich qualitative data,
sometimes described as ‘thick description’ where
the relevant phenomena have been carefully
observed and detailed field notes have been
recorded
 Quantitative records of behavior:
 Researchers often obtain quantitative measures
such as frequency or duration of occurrence when
they seek to describe specific behaviors or events
Analysis in Qualitative Studies
 In general, the intent is to make sense out of text and image data.
 It involves segmenting and taking apart the data (like peeling back the layers of an onion) as well as putting it back together.
 Data analysis goes hand-in-hand with data collection and the write-up of findings.
 Analysis of the data focuses on aggregating it into a small number of themes.
Cont’d..
 Some specific analytic techniques:
 Pattern Matching
 Evaluating the identified patterns/themes against predefined patterns
 Explanation Building/Narration
 Using the themes/codes to build a description or explanation
 Time-Series Analysis
 Examining how themes change, or remain unchanged, over time
 Cross-Case Synthesis
 Summarizing the themes identified across cases
General steps in qualitative data analysis
Cont…
 Some general procedures
 Step 1. Organize and prepare the data for analysis.
 This involves transcribing interviews, optically
scanning material, typing up field notes, cataloguing
all of the visual material, and sorting and arranging
the data into different types depending on the
sources of information.
 Step 2. Read or look at all the data.
 This step provides a general sense of the information
and an opportunity to reflect on its overall meaning.
What general ideas are participants saying? What is
the tone of the ideas?
Cont…
 Step 3. Start coding all of the data.
 Coding is the process of organizing the data by
bracketing chunks (or text or image segments) and
writing a word representing a category in the margins
(Rossman & Rallis, 2012).
 Coding is the process of examining the raw qualitative
data in the transcripts and extracting sections of text
units (words, phrases, sentences or paragraphs) and
assigning different codes or labels so that they can
easily be retrieved at a later stage for further
comparison and analysis, and the identification of any
patterns.
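To make the mechanics of coding concrete, the following minimal sketch (not part of the original slides; the codes and quotes are hypothetical) shows how extracted text units might be stored with their codes in Python so that they can be retrieved later for comparison and theme building:

```python
from collections import defaultdict

# Hypothetical coded segments: (text unit, assigned code)
coded_segments = [
    ("The system saved me a lot of time", "perceived_usefulness"),
    ("I was never trained on the new tool", "lack_of_training"),
    ("My supervisor insisted we use it", "management_pressure"),
    ("It is faster than the paper process", "perceived_usefulness"),
]

# Group text units by code so they can be retrieved for comparison
# and later aggregated into a small number of themes.
by_code = defaultdict(list)
for text, code in coded_segments:
    by_code[code].append(text)

for code, quotes in by_code.items():
    print(f"{code} ({len(quotes)} segments)")
    for quote in quotes:
        print("  -", quote)
```

In practice this bookkeeping is usually handled by qualitative data analysis software, but the underlying idea is the same: every code points back to the segments that support it.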
Tesch’s (1990) Eight Steps of Coding
Cont…
 Step 4. Use the coding process to generate a description of the
setting or people as well as categories or themes for analysis.
 Description involves a detailed rendering of information about
people, places, or events in a setting. Researchers can generate
codes for this description.
 Use the coding as well for generating a small number of themes
or categories— perhaps five to seven themes for a research
study.
 These themes are the ones that appear as major findings in
qualitative studies and are often used as headings in the
findings sections (or in the findings section of a dissertation or
thesis) of studies.
 They should display multiple perspectives from individuals and
be supported by diverse quotations and specific evidence
Cont…
 Beyond identifying the themes during the coding
process, qualitative researchers can do much with
themes to build additional layers of complex
analysis.
 For example, researchers interconnect themes
into a story line (as in narratives) or develop
them into a theoretical model (as in grounded
theory).
 In case study research, themes are analyzed for each individual case and across the different cases.
Cont…
 Step 5. Advance how the description
and themes will be represented in the
qualitative narrative.
 The most popular approach is to use a narrative
passage to convey the findings of the analysis.
 This might be a discussion that mentions:
 a chronology of events,
 the detailed discussion of several themes (complete
with subthemes, specific illustrations, multiple
perspectives from individuals, and quotations) or
 a discussion with interconnecting themes.
Cont…
 Step 6. A final step in data analysis involves
making an interpretation in qualitative
research of the findings or results.
 It could be a meaning derived from a
comparison of the findings with information
gleaned from the literature or theories
confirming past information or diverging from
it.
 Data and findings that the inquirer had not foreseen earlier in the study may also lead to new questions that need to be asked.
Additional Key issues in Validity and reliability of
qualitative results
Analysis in Quantitative Studies
 Quantitative analysis deals with the analysis of numeric data using statistical tools, in two ways:
 Descriptive analysis:
 refers to statistically describing, aggregating, and presenting the constructs of interest or the associations between these constructs
 Inferential analysis:
 refers to the statistical testing of hypotheses (theory testing)
 It is important to understand independent variables (IV) and dependent variables (DV), but at times we may need to include other types of variables as well, such as mediating, moderating, and control variables.
Cont’d..
 Intervening or mediating variables stand between the independent and dependent variables, and they mediate the effect of the independent variable on the dependent variable.
 The purpose of a mediator variable is to explain the relationship between the IV and DV: the IV does not influence the DV directly, but rather indirectly, through the mediator variable.
 For example, salary (IV) positively influences education (mediator variable), and education in turn positively influences health-screening expenses (DV).
 When the effect of education is removed, the relationship between salary and health-screening expenses disappears.
Cont…
 Moderating variables are independent variables that affect the direction and/or the strength of the relationship between the independent and dependent variables.
 A moderator variable is a third variable that modifies the relationship between an independent variable (IV) and a dependent variable (DV).
 The purpose of the moderator variable is to assess how the strength of the relationship between the IV and DV changes across its levels.
 For example, if age is a moderator variable between salary (IV) and health-screening expenses (DV), then the relationship between salary and health-screening expenses may be stronger for older men and weaker for younger men.
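As an illustration of how a moderating effect might be examined, the sketch below simulates data echoing the slide's salary/age/health-screening example and tests moderation with an interaction term in an ordinary least squares model. This is a minimal, hypothetical example (simulated data; statsmodels assumed available), not a prescribed procedure:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data for illustration only: salary (IV), age (moderator),
# and health-screening expenses (DV).
rng = np.random.default_rng(42)
n = 500
salary = rng.normal(50, 10, n)
age = rng.normal(45, 12, n)
# The effect of salary on expenses grows with age (a moderation effect).
expenses = 5 + 0.2 * salary + 0.1 * age + 0.03 * salary * age + rng.normal(0, 5, n)
df = pd.DataFrame({"salary": salary, "age": age, "expenses": expenses})

# Moderation is commonly tested with an interaction term: a significant
# salary:age coefficient suggests age moderates the salary -> expenses link.
model = smf.ols("expenses ~ salary * age", data=df).fit()
print(model.summary().tables[1])
```

A mediation analysis would instead look at whether the IV predicts the mediator and whether the mediator predicts the DV once the IV is controlled for, often with a dedicated mediation routine.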
Cont…
 A confounding (or spurious) variable is not actually measured or observed in a study.
 It exists, but its influence cannot be directly detected.
 Researchers comment on the influence of confounding variables after the study has been completed, because these variables may have operated to explain the relationship between the independent and dependent variables, but they were not, or could not easily be, assessed (e.g., a confounding variable such as discriminatory attitudes).
General steps in analysis of survey studies
 Step 1.
 Report information about the number of members of the
sample who did and did not return the survey.
 A table with numbers and percentages describing
respondents and nonrespondents is a useful tool to
present this information
 Step 2.
 Discuss the method by which response bias will be determined and managed.
 Response bias is the effect of nonresponses on survey
estimates.
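A minimal sketch of how Steps 1 and 2 might look in practice, assuming the survey tracking data sits in a pandas DataFrame with hypothetical "returned" and "wave" columns; wave analysis (comparing early and late respondents) is one common way to check for response bias:

```python
import pandas as pd

# Hypothetical survey tracking data: one row per sampled member.
sample = pd.DataFrame({
    "returned": [True, True, False, True, False, True, True, False],
    "wave":     ["early", "late", None, "early", None, "late", "early", None],
})

# Step 1: report respondents vs. nonrespondents with counts and percentages.
summary = sample["returned"].value_counts().to_frame("count")
summary["percent"] = (summary["count"] / len(sample) * 100).round(1)
print(summary)

# Step 2 (one common check): wave analysis compares early and late
# respondents; late respondents are assumed to resemble nonrespondents,
# so large differences hint at response bias.
print(sample[sample["returned"]].groupby("wave").size())
```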
Cont…
 Step 3.
 Discuss a plan/present a descriptive analysis of
data for all independent and dependent variables in
the study.
 This analysis should indicate the means, standard
deviations, and range of scores for these variables.
 In some quantitative studies, the analysis stops here with descriptive analysis, especially if the number of participants is too small for more advanced, inferential analysis.
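For Step 3, a short pandas sketch (with made-up scale scores and hypothetical variable names) showing how the means, standard deviations, and range of scores could be reported for each variable:

```python
import pandas as pd

# Hypothetical survey data for the independent and dependent variables.
df = pd.DataFrame({
    "perceived_usefulness": [3.2, 4.1, 2.8, 4.5, 3.9, 3.4],
    "ease_of_use":          [3.0, 4.4, 2.5, 4.8, 3.6, 3.1],
    "intention_to_use":     [3.5, 4.6, 2.9, 4.9, 4.0, 3.3],
})

# Means, standard deviations, and range of scores for each variable.
desc = df.agg(["mean", "std", "min", "max"]).T
desc["range"] = desc["max"] - desc["min"]
print(desc.round(2))
```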
Cont…
 Step 4.
 Assuming that you proceed beyond descriptive
approaches, if the proposal contains an
instrument with scales or a plan to develop
scales (combining items into scales),
 identify the statistical procedure (i.e., factor
analysis) for accomplishing this.
 Factor analysis is a technique that is used to reduce a large number of variables to a smaller number of factors.
Cont…
 Thus factor analysis is a statistical approach that can be used
to analyze interrelationships among a large number of
variables and to explain these variables in terms of a smaller
number of common underlying dimensions.
 It is a process in which the values of observed data are
expressed as functions of a number of possible causes in
order to find which are the most important
 This involves finding a way of condensing the information
contained in some of the original variables into a smaller set
of implicit variables (called factors) with a minimum loss of
information
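A minimal sketch of this condensing step using scikit-learn's FactorAnalysis; the iris measurements stand in for survey scale items here, so the variables and the choice of two factors are purely illustrative:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

# Example data stands in for a set of scale items; in a real survey the
# item responses (e.g., Likert items) would be used instead.
X = load_iris().data
X = StandardScaler().fit_transform(X)

# Condense the observed variables into two underlying factors.
fa = FactorAnalysis(n_components=2, random_state=0)
scores = fa.fit_transform(X)     # factor scores per respondent
loadings = fa.components_.T      # how each variable loads on each factor
print(np.round(loadings, 2))
```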
Cont….
 Step 5.
 Identify the statistics and the statistical computer program for
testing the major inferential research questions or hypotheses
in the proposed study.
 The inferential questions or hypotheses relate variables or
compare groups in terms of variables so that inferences can be
drawn from the sample to a population.
 Provide a rationale for the choice of statistical test and mention
the assumptions associated with the statistic.
 Inferential statistics are used to make inferences about the population from the sample data and to examine relationships between variables:
• T-test: to compare the means of two groups
• Analysis of Variance (ANOVA): when there are more than two groups
• Regression, …
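A brief sketch of the two inferential tests named above, using SciPy and made-up group scores; the groups and numbers are hypothetical:

```python
from scipy import stats

# Hypothetical scores for three groups of respondents.
group_a = [72, 85, 78, 90, 66, 81]
group_b = [68, 74, 70, 79, 65, 72]
group_c = [60, 64, 71, 58, 66, 63]

# T-test: compare the means of two groups.
t_stat, p_two = stats.ttest_ind(group_a, group_b)
print(f"t-test: t = {t_stat:.2f}, p = {p_two:.3f}")

# One-way ANOVA: compare the means of more than two groups.
f_stat, p_anova = stats.f_oneway(group_a, group_b, group_c)
print(f"ANOVA:  F = {f_stat:.2f}, p = {p_anova:.3f}")
```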
Cont…
 Step 6.
 A final step in the data analysis is to present the
results in tables or figures and interpret the
results from the statistical test.
 An interpretation in quantitative research
means that the researcher draws conclusions
from the results for the research questions,
hypotheses, and the larger meaning of the
results.
 Implications of the results for practice or for future research on the topic are also discussed.
Analysis in Experimental Studies
 In computing fields, in addition to the observation of subjects before and after a treatment, experimental studies also refer specifically to research conducted in a laboratory setting using tools and materials such as algorithms.
 Conducting experiments to analyze data
 Assumption
 You have selected (or developed) tools and techniques/algorithms based on the literature
 Inputs
 The problem and the data
 Algorithms/techniques
 Various setting options
Cont…
 Procedure
 Plan the number of experiments by specifying the data input, the algorithms to be used, and experimental settings such as test options, …
 Write/report the results of each experiment
 Set criteria and compare the experimental results
 Output
 Accuracy/performance, etc., of an experiment
 Lessons learned, new insights, new knowledge
 Description of how it is going to be used
Cont…

Experiment   Algorithm      Test option                Attributes to be used   Instances
Exp-One      RandomForest   10-fold cross-validation   17                      10000
Exp-Two      J48            10-fold cross-validation   17                      10000
Exp-Three    PART           10-fold cross-validation   17                      10000
Exp-Four     RandomForest   80/20 split
Exp-Five     J48            80/20 split
Exp-Six      PART           80/20 split
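The table above describes Weka-style experiments. As a rough illustration of the same experimental pattern in Python, the sketch below compares two classifiers under 10-fold cross-validation and an 80/20 split using scikit-learn; the dataset is a stand-in, and a decision tree substitutes for the Weka-specific J48 and PART learners:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.tree import DecisionTreeClassifier

# Example dataset stands in for the 17-attribute, 10000-instance data
# shown in the table.
X, y = load_breast_cancer(return_X_y=True)
models = {
    "RandomForest": RandomForestClassifier(random_state=0),
    "DecisionTree": DecisionTreeClassifier(random_state=0),
}

for name, model in models.items():
    # Test option 1: 10-fold cross-validation.
    cv_acc = cross_val_score(model, X, y, cv=10).mean()
    # Test option 2: 80/20 percentage split.
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.2, random_state=0)
    split_acc = model.fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{name}: 10-fold CV = {cv_acc:.3f}, 80/20 split = {split_acc:.3f}")
```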
Cont…
 Simulation Analysis
 Simulation is a different flavor of laboratory experiment.
 Simulation analysis is a descriptive modeling technique.
 While a laboratory experiment focuses on measuring the impact of variables on a target issue, simulation measures the effectiveness of a given model/system in an artificial environment.
Cont…
 Elements of Simulation Analysis
 Problem Formulation: the questions for which answers are sought, the variables involved, and the measures of system performance to be used
 Data Collection and Analysis: assembling the information necessary to further refine our understanding of the problem
 Model Development: building and testing the model of the real system, selecting a simulation tool (programming language), coding the model, and debugging it
 Model Verification and Validation: establishing that the model is an appropriately accurate representation of the real system
 Model Experimentation and Optimization: precision issues, such as how large a sample (simulation time) is necessary to estimate the performance of the system, and the design of effective experiments with which to answer the questions asked in the problem formulation
 Implementation and Simulation Results: acceptance of the results by the users and improved decision making stemming from the analysis
Cont….
Major Iterative Loops in a Simulation Study (figure):
Problem Formulation → Initial Data Collection and Analysis → Model Development → Model Verification and Validation → Model Experimentation and Optimization (the center of analysis) → Implementation of Simulation Results
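To make the model-experimentation element concrete, here is a minimal Monte Carlo sketch of a single-server queue; the arrival and service rates and the performance measure (average waiting time) are illustrative choices, not from the slides:

```python
import random

def simulate_queue(arrival_rate, service_rate, n_customers, seed=0):
    """Monte Carlo simulation of a single-server FIFO queue: estimates the
    average waiting time as a simple system-performance measure."""
    rng = random.Random(seed)
    arrival, finish_prev, total_wait = 0.0, 0.0, 0.0
    for _ in range(n_customers):
        arrival += rng.expovariate(arrival_rate)
        start = max(arrival, finish_prev)       # wait if the server is busy
        service = rng.expovariate(service_rate)
        total_wait += start - arrival
        finish_prev = start + service
    return total_wait / n_customers

# Experimentation step: vary a model parameter and compare performance.
for service_rate in (1.0, 1.2, 1.5):
    w = simulate_queue(arrival_rate=0.9, service_rate=service_rate,
                       n_customers=100_000)
    print(f"service rate {service_rate}: average wait = {w:.2f}")
```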
Validity and Reliability
 Through standard instruments/procedures and testing
 Construct Validity
 Concepts being studied are operationalized and
measured correctly
 Internal Validity
 Does the data support the conclusion?
 External Validity
 Establish the domain to which a study’s findings can be
generalized
 Can the result be generalized?
 Experimental Reliability
 Demonstrate that the study can be repeated with the
same results
 Is the method/conduct of the research systematic and
logical?
Analysis in Design Science Research
 Analysis is done through an iterative design and evaluation process
 Suggest, design, construct, and evaluate an artifact as a search process
 Iterations and improvements have to be reported
 May need input mainly from qualitative types of analysis, but also from quantitative analysis
 Analysis is mainly a deductive process of checking abductively identified/suggested solutions through an iterative design, development, and evaluation process
Main perspectives in design deductive analysis
 Structure of the artifact: the information space the artifact spans
 the basis for deducing all required information about the artifact
 determines the configurational characteristics necessary to enable the evaluation of the artifact
 Evaluation criteria: the dimensions of the information space that are relevant for determining the utility of the artifact
 can differ depending on the purpose of the evaluation
 Evaluation approach: the procedure for how to practically test an artifact
 defines all roles concerned with the assessment and the way of handling the evaluation
 the result is a decision on whether or not the artifact meets the evaluation criteria, based on the available information
 can be qualitative or quantitative
Methods
 Structure: process-based meta model; intended applications; conditions of applicability; products and results of the method; reference to constructs
 Evaluation criteria: appropriateness; completeness; consistency
 Evaluation approach: laboratory research; field inquiries; surveys; case studies; action research; practice descriptions; interpretative research; application
Models
 Structure: domain; scope, purpose; syntax and semantics; terminology; intended application
 Evaluation criteria: correctness; completeness; clarity; flexibility; simplicity; applicability; implementability
 Evaluation approach: syntactical validation; integrity checking; sampling using selective matching of data to actual external phenomena or a trusted surrogate; integration tests; risk and cost analysis; user surveys
Instantiations
 Structure: executable implementation in a programming language; reference to a design model; reference to a requirement specification; reference to the documentation; reference to quality management documents; reference to configuration management documents; reference to project management documents
 Evaluation criteria: functionality; usability; reliability; performance; supportability
 Evaluation approach: code inspection; testing; code analysis; verification; acceptance study; usability analysis
Validity and reliability in design research
 Pragmatic utility through scientific methods
 Does it solve the initial problem?
 Is it correct and repeatable?
 Is it acceptable to the respective beneficiaries?
 Rigorous evaluation
Exercise
 Relating research problems, objectives, and results
 Problem statements should lead to defining clear objectives
 Objectives should lead to clear analysis and results
 Read the following problem descriptions and:
1. Craft candidate general objectives for the three cases.
2. Suggest data sources and data collection methods.
3. Suggest potential types of analysis.
4. Suggest the possible/logical results of the studies and the relevant interpretations expected.
Example 1
 Distance learning puts learners in isolation, with less observation by teachers and more freedom for learners.
 Researchers in distance learning are interested in developing a collaborative tool that supports student interactions.
 This is not sufficient; collaboration among tutors is also necessary for effective distance learning.
 There is no system yet that supports tutors.
Example 2
 Due to the advent of the Internet, many Amharic documents are now available online. Additionally, the popular search engine Google provides an Amharic interface.
 However, to date, no tolerant-retrieval mechanism based on spelling correction has been employed for Amharic; there is not even any prior published work on spelling correction for the Amharic language.
Example 3
 Technology adoption is one of the areas of study in information technology. E-learning is a technology that enables an effective teaching-learning process. Educational institutions are trying to adopt such platforms in their processes. However, appropriate adoption requires prior investigation of the perceptions and feelings of end users. In relation to this, there has been no comprehensive study on the adoption level of e-learning technology among academic institutions, and the factors that affect such adoption have been inconsistent from one study to another.
Sample solutions
Example 3
 Objective
 To identify adoption level and factors affecting e-learning
adoption
 Data sources
 Human data sources (students, instructors, educational
admins)
 Data collection methods
 It depends, but: interview, questionnaire, observation, …
 Analysis techniques
 Will depend on the data collection methods
 But can be descriptive statistics, inferential statistical methods, or thematic analysis
 Expected outputs
 Level of adoption determined
 Factors for adoption listed
Interpretation and Discussion
 To be precise
 Interpretation is telling what the result means
 Discussion is telling how good/better or related
the result is compared to other works.
 Explain the results in light of:
 Previous literature and theories
 Your own RQs and/or objectives
 There is no clear distinction between interpretation and discussion in the case of qualitative research
 In design science research, this involves reporting the results of demonstration and evaluation
Interpreting the Data
 Interpreting the data means several things
including:
1) Relating the findings to the original research
problem and to the specific research
questions and hypotheses.
Researchers must eventually come full circle
to their starting point – why they conducted a
research study in the first place and what they
hoped to discover – and relate their results to
their initial concerns and questions
Cont…
2) Relating the findings to preexisting
literature, concepts, theories, and research
studies.
To be useful, research findings must in some way be
connected to the larger picture – to what people
already know or believe about the topic in
question.

 Perhaps the new findings confirm a current theoretical perspective, perhaps they cast doubt on common “knowledge”, or perhaps they simply raise new questions that must be addressed before we can truly understand the phenomenon in question.
Cont…
3) Determining whether the findings have
practical significance as well as statistical
significance.
Statistical significance is one thing; practical
significance – whether findings are actually useful –
is something else altogether.
4) Identifying limitations of the study. Finally,
interpreting the data involves outlining the
weaknesses of the study that yielded them.
No research study can be perfect, and its
imperfections inevitably cast at least a hint of
doubt on its findings. Good researchers know – and
they also report – the weaknesses along with the
strengths of their research
Review questions
 Explain the central concept in qualitative
analysis
 What are the two forms of analysis in
quantitative research?
 Explain the essence of analysis in design
science research
 How do you interpret your research findings?
 Mention potential challenges in data
collection phase of a research.