
Association for Information Systems
AIS Electronic Library (AISeL)

AMCIS 2007 Proceedings
Americas Conference on Information Systems (AMCIS)

December 2007

An Information Technology Literacy Self-Assessment Instrument: Development and Pilot Results

Jorge Perez, Kennesaw State University
Meg Murray, Kennesaw State University
Martha Myers, Kennesaw State University

Follow this and additional works at: http://aisel.aisnet.org/amcis2007

Recommended Citation
Perez, Jorge; Murray, Meg; and Myers, Martha, "An Information Technology Literacy Self-Assessment Instrument: Development and Pilot Results" (2007). AMCIS 2007 Proceedings. 229.
http://aisel.aisnet.org/amcis2007/229

This material is brought to you by the Americas Conference on Information Systems (AMCIS) at AIS Electronic Library (AISeL). It has been accepted
for inclusion in AMCIS 2007 Proceedings by an authorized administrator of AIS Electronic Library (AISeL). For more information, please contact
elibrary@aisnet.org.
AN INFORMATION TECHNOLOGY LITERACY SELF-ASSESSMENT INSTRUMENT: DEVELOPMENT AND PILOT RESULTS

Jorge Pérez
Kennesaw State University
jperez@kennesaw.edu
Meg Murray
Kennesaw State University
mcmurray@kennesaw.edu
Martha Myers
Kennesaw State University
mmyers@kennesaw.edu

Abstract

The information technology quotient is on the rise in every field imaginable – computers
and the Internet are everywhere. Essential information technology (IT) competencies are
often taken for granted, to the detriment of students who lack computing and Internet
skills. A standard set of computer skills that clearly define the IT competent individual
has yet to be determined. However, an upsurge of interest in online learning has
prompted many institutions of higher education to implement an assessment measure that
aims to ascertain student readiness for distance education via the Internet. This paper
summarizes commonly employed IT assessment instruments and introduces a new self-
assessment instrument that focuses on three areas: computer hardware and system
software; software applications; and networking, the Internet and information literacy. A
preliminary administration of the instrument is described.
KEYWORDS: assessment, curriculum, education, computer skills, information systems, computer science, Internet,
computer literacy, digital literacy, IT literacy, IT competency.

Introduction

No industry is untouched by the information technology (IT) revolution. The IT quotient – work that is dependent upon
computing technology and the Internet – is on the rise in every imaginable field. Commerce, education, communication and
socialization in particular have undergone radical transformations. Nonetheless, fundamental IT skills are not necessarily
treated as equal to the other must-have competencies: math and English. Critical IT competencies are often taken for granted,
to the detriment of students who lack computing and Internet skills.

Attention to IT competencies has taken a few twists and turns in higher education. Not too many years ago, students on many
college campuses were required to demonstrate a basic level of computer competency. Several institutions developed
computer literacy courses that all students were required to take. The primary focus of these courses was developing basic
skills in the use of applications such as word processing, spreadsheet and presentation software. Eventually, these computer
skill-based courses began to disappear as many educators believed that students were entering the university well-versed in
basic computer usage.
While most students today have exposure to computers and experience using the Internet, a gap is emerging between
functional and analytical uses of computing technologies. That is, exposure does not equate with understanding. Students
may be able to use a word processor and surf the Internet, but they often do not understand the fundamentals of how and why
these technologies work. An analogy is often put forth that relates using a computer to driving a car. Certainly, one does not
need to know the engineering principles or mechanical processes behind an automobile to be a successful driver. But the
comparison of the automobile to the computer is incomplete. Whereas the results of driving a car are distinct and finite, the
outcomes of using the resources and information that emanate from computer technologies are infinite and complex. Cars get
us from point A to point B; information technology allows us to reinterpret the journey.

We are in the age of ubiquitous and pervasive computing. Universities are again pondering what defines computer literacy
and what specific skills are needed to effectively utilize computer technologies. Computer literacy has traditionally been
defined as the ability to use computers to perform a variety of tasks, but that definition is no longer adequate. Computer
literacy no longer simply means viewing a computer as a collection of applications; it also means using the computer as a
means of communication and a source of information [Hoffman, Blake, McKeon, Leone, and Schorr, 2005]. These same
authors provide an expanded definition of computer literacy. Specifically, they state that computer literacy includes both
“information literacy, the ability to evaluate information found online, and critical computer literacy, the ability to
incorporate computing technology in support of critical thinking” [p. 164]. Moreover, computer literacy must now be
extended to digital literacy, as students are increasingly expected to interact with information and content made omnipresent
by the fact that it is digitized and available on a variety of devices.

The challenge for institutions of higher education is to operationalize the expanded definition of computer literacy. This is a
major undertaking, as the functional definition of computer literacy is expansive. One of the first steps for many universities
is to assess the IT skills and competencies possessed by all incoming freshmen and then to provide remediation vectors for
students who do not demonstrate adequate mastery of those competencies. The approaches taken differ greatly among
institutions. This paper introduces a self-assessment instrument and describes results from a pilot administration of the
instrument.

Assessment of IT Competencies
For quite some time, institutions of higher education have espoused the need to produce graduates who demonstrate a set of IT
competencies. While there is no standard set of computer skills that clearly defines the IT-competent individual, many
institutions have implemented assessment measures that attempt to ascertain the specific computer skills students possess.
For the most part, these assessments focus on basic computer operations, use of functional software such as word processors
and basic skills in searching the Internet and using e-mail. A typical computer literacy assessment instrument evaluates
student skill levels in such tasks as creating a document in a word processor, naming the parts of a computer, sending an e-
mail, participating in a chat session and using a search engine on the WWW. Some instruments even include sections related
to creating a Web page and setting up a small network.

A broad array of assessment initiatives has been undertaken at the university level. Most of these are focused on evaluating
the IT competencies of incoming freshmen, with the goal of addressing and remediating gaps in individual student skill levels.
These assessment instruments fall into two general categories: published skills-based proficiency tests and in-house developed
assessment instruments.

Published Skills-Based Proficiency Assessments

Several skills-based assessment instruments are available that test IT proficiency. Many of these assessments lead to some
type of formal certification. These tests generally include evaluation of skills or knowledge in general computing concepts,
Internet use and application software (word processing, spreadsheets, presentations and databases). The most widely known
of these assessments is Certiport’s Internet and Computing Core Certification (IC3). The certification is divided into three
exams covering the areas of “computing fundamentals, key applications and living online” [Certiport, 2006]. The
“Computing fundamentals” component tests for basic understanding of computer hardware, purchasing and maintenance
decisions, identifying different types of software and what they are best suited for, fundamental concepts about the use of
databases and understanding of basic operating system and file manipulation operations. The “Key applications” component
tests basic skill in using word processing, spreadsheets and presentation software. The “Living online” component tests basic
concepts in networking, skill in using electronic mail and searching the Internet, understanding the different types of
information sources found on the Internet and understanding risks and responsible use of computers and the Internet.

Another example of a widely available computer literacy assessment instrument for undergraduate students is the
Tek.Xam. Tek.Xam is a partnership between the Virginia Foundation for Independent Colleges and ACT, Inc., an
internationally known educational assessment corporation. Tek.Xam has 12 different online tests that evaluate student
proficiency in seven areas, including general computing, knowledge and use of the Internet, word processing, spreadsheets,
presentations, databases and web authorship. Separate tests are administered for each area, and immediately upon completion
students are provided a report of their test results outlining their strengths and weaknesses.

Another widely available assessment instrument is sponsored by the International Computer Driving License (ICDL), an
international essential IT skills certification in use in over 140 countries. The ICDL has been noted as providing a standard
for assessing computer literacy worldwide, with the main intent of providing individuals a way to demonstrate their IT
proficiency to potential employers. The assessment instruments are built upon a standard syllabus of IT competencies
identified by the European Computer Driving License (ECDL) Foundation. The ICDL assesses skills in seven basic areas:
IT concepts, operating environments, word processing, spreadsheets, databases, presentation graphics, and the
Internet and e-mail. Obtaining certification, or the ICDL 'license', requires passing tests in all seven areas.

Finally, textbook publishers have entered the IT competency assessment arena by offering series of skills-based assessments
coupled with computer-based learning systems for end-user applications. In late 2004, Course Technology launched a
program termed the "SAM Challenge" [Course Technology, 2004], built from their computer-based training series. SAM,
short for Skills Assessment Manager, is designed as a series of IT assessment instruments that coincide with Course
Technology's computer-based training programs. The assessments employ simulated software environments.
Tests cover areas such as basic computer skills using the Windows XP operating system, application software skills using
Microsoft Office, and Internet skills using Microsoft IE. Colleges and universities can build their own exams from Course
Technology's 60,000-item test bank. Provisions are also available for customizing the test by including in-house designed
questions.

Prentice Hall offers a similar program using the title “Train and Assess IT” [Prentice Hall, 2006]. The program is heavily
weighted towards Microsoft Office applications but also includes topics such as basic computer and Internet use. Like SAM
Challenge, this program uses performance-based testing utilizing simulations of software applications. If desired, Train and
Assess IT will provide immediate student feedback and map students to appropriate ‘Train and Assess IT’ computer-based
learning modules or Prentice Hall texts from their ‘Go!’ series.

A summary of the skills and competencies evaluated by published skills-based proficiency assessments is presented in Table
1. Competencies were divided into 11 different domain categories. As noted, these instruments are, for the most part, task and
skill oriented. While each of these instruments assesses needed and worthwhile IT skills, they represent only one aspect of
computer literacy. Further, these certification-based evaluations tend to assess IT skills in isolation. They focus on the
functional use of computers; they do not attempt to evaluate the analytical skills students need to be able to use computing
technologies to support critical thinking activities.

Table 1. Competency Area Represented in Published IT Assessment Instruments
(Instruments reviewed: IC3, Tek.Xam, SAM Challenge, Train & Assess IT, ICDL)

Competency/Skill Area                                                          Instruments Covering This Area (of 5)
Basic concepts of IT                                                            3
Hardware and hardware components                                                3
Operating System basics including file management                              5
Word processing                                                                 5
Spreadsheets                                                                    5
Presentation Software                                                           5
Database fundamentals                                                           5
Use of the Internet (Web browsing and search engine)                            4
Web Page Authorship                                                             1
E-mail                                                                          2
Societal Impact of Computing Technologies [including legal aspects, ethics]     2
In-House Developed Assessment Surveys

A number of colleges and universities assess IT competencies among incoming freshmen using in-house developed
instruments. These instruments represent two basic forms: self-reported student evaluations or objective-based tests. In
general, self-reported assessments are optional and provide a guide for students to evaluate areas in which they need
remediation. Objective tests, on the other hand, are usually required.

A review of 10 publicly available in-house developed assessment instruments was undertaken to identify commonly cited IT
competencies. Competencies were categorized into 13 different domains: 11 identical to those outlined for the published
assessment instruments, and two additional areas not previously reported, attitudes and computer security. These data are
presented in Table 2.

Table 2. Competency Area Represented in In-House Developed Assessment Instruments

Competency/Skill Area                                                          Number of Instruments Covering This Topic (of 10)
Basic concepts of IT                                                            5
Hardware and hardware components                                                5
Operating System basics including file management                              10
Word processing                                                                10
Spreadsheets                                                                   10
Presentation Software                                                           7
Database fundamentals                                                           1
Use of the Internet (Web browsing and search engine)                            9
Web Page Authorship                                                             1
E-mail                                                                          9
Societal Impact of Computing Technologies [including legal aspects, ethics]     3
Computer Security (primarily related to virus protection)                       5
Attitudes towards computing technologies                                        1

An analysis of these self-assessment instruments shows that they are primarily competency or skill-based. In other words,
these instruments attempt to identify ‘what’ a student can do. While this has merit, it is also short-sighted. Having a level of
technical competence in using computer technologies does not necessarily translate into being computer literate.

The Educational Testing Service (ETS) has developed a standardized instrument that attempts to assess IT proficiency. The
Information and Communication Technology (ICT) test is focused on information literacy and uses a novel approach based
on interactive scenarios. A description of each of these proficiencies and task descriptions may be found at the ETS web site
(www.ets.org under the ICT Literacy Assessment link).

Use of Assessment Instruments

The primary objective of these assessment instruments is to provide students with information for the purpose of remediation.
Based on results, students are often advised on how to gain a level of IT competency deemed appropriate for a beginning
college student. For instance, students who do poorly on an assessment might be advised to take a particular entry level
computer literacy course. Students who perform poorly on particular sections of the assessment might be advised to take a
workshop or engage in a computer-based training program. The assessment serves as a guide for the student in terms of the
IT competencies they are expected to have mastered and a warning as to what proficiencies they are expected to acquire.
The IT Competency Self-Assessment

After reviewing existing instruments, we decided to blend the best of the published surveys with more contemporary IT
issues, especially those we see lacking in our own students. Students entering universities today have had more exposure to
IT than ever before, and technology itself is fundamentally different today than during previous iterations of computer
literacy. We therefore identified a need for a more flexible instrument, one that could be quickly adapted to rapid changes
in the technology landscape and to the needs of our students.

Our instrument contains items that address three broad areas of IT competency: computer hardware and systems software;
application software; and networking, the Internet and information literacy (Table 3). The competencies and skills were
divided into these categories to parallel the modules in the IT literacy course developed by the authors. A five-point Likert
scale was used for all items in the instrument.

At our university, students may elect to enroll in a first year class titled “Computers and Your World.” Organized around the
three areas described above, the course includes varied topics such as updating an operating system, designing a database,
conducting research on the web, securing a home network, and creating a web site. Other topics include privacy,
globalization, diversity and ethics.

Table 3. Competency Area Represented in IT Competency Self-Assessment

Competency/Skill Area                                   Items That Address This Area
Computer Hardware and Systems Software                  4, 5, 6, 10, 18, 19, 23, 26, 27, 28, 31, 37, 38, 42, 43, 46, 47, 49
Application Software                                    2, 3, 9, 11, 16, 21, 22, 28, 35, 36, 39, 40, 44, 46
Networking, the Internet and information literacy       7, 12, 13, 14, 15, 16, 20, 24, 25, 28, 29, 32, 33, 34, 41, 45, 48, 50
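
To make the scoring concrete, the short Python sketch below groups a respondent's answers by the item-to-area mapping in Table 3 and averages within each area. The item lists are copied verbatim from Table 3 (note that items 16, 28 and 46 map to more than one area); the function and sample responses are illustrative assumptions, not part of the instrument itself.

    # A minimal scoring sketch, assuming each response is an integer on
    # the instrument's five-point Likert scale (1-5). Item lists are
    # taken from Table 3; the sample responses are hypothetical.

    AREAS = {
        "Computer Hardware and Systems Software":
            [4, 5, 6, 10, 18, 19, 23, 26, 27, 28, 31, 37, 38, 42, 43, 46, 47, 49],
        "Application Software":
            [2, 3, 9, 11, 16, 21, 22, 28, 35, 36, 39, 40, 44, 46],
        "Networking, the Internet and information literacy":
            [7, 12, 13, 14, 15, 16, 20, 24, 25, 28, 29, 32, 33, 34, 41, 45, 48, 50],
    }

    def area_scores(responses):
        """Average the 1-5 responses for the items in each competency area."""
        return {area: sum(responses[i] for i in items) / len(items)
                for area, items in AREAS.items()}

    # One hypothetical respondent who answered "neutral" (3) on all 50 items.
    responses = {item: 3 for item in range(1, 51)}
    for area, score in area_scores(responses).items():
        print(f"{area}: {score:.2f}")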

Results

The instrument was administered to 95 students in five sections of our course. All five sections were taught by
the three authors, two of whom taught two sections each. The instrument was administered on the first day of class and again
at the end of the term. Most of the students were freshmen, because all five sections were coupled with other required first-year
courses. That is, students met a first-year requirement by enrolling in a "learning community," which bundles two or more
required first-year courses for registration purposes.

Table 4 shows the average across all sections on a sample of items that address attitudes toward computers, use of specific
Internet applications, and the degree to which respondents considered themselves computer savvy. Table 5 shows the average
across all sections on a sample of items that address specific competencies. Because this is an exploratory examination of the
instrument, no strong quantitative analysis has been conducted yet. Tables 4 and 5 show primitive descriptive reports of our
results.
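
The figures reported in Tables 4 and 5 are simple per-item means computed separately for the first-day and end-of-term administrations, with the difference taken between the two. A minimal sketch of that computation, using hypothetical response lists rather than the study data:

    # Descriptive summary for one item: mean before, mean after, and the
    # difference, as reported in Tables 4 and 5. Data are hypothetical
    # 1-5 Likert responses, one entry per student.

    def mean(xs):
        return sum(xs) / len(xs)

    before = [2, 3, 3, 2, 4, 3, 3, 2]   # first-day responses to one item
    after  = [4, 4, 5, 3, 5, 4, 4, 3]   # end-of-term responses, same item

    b, a = mean(before), mean(after)
    print(f"Before: {b:.1f}  After: {a:.1f}  Difference: {a - b:.1f}")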

Table 4. Sample of Attitude and Use Items Before and After Taking IT Literacy Course

Question                                      Before   After   Difference
I surf the web every day.                       4.5     4.8       0.3
I check my e-mail at least once a day.          4.3     4.4       0.1
I frequently use IM to chat with friends.       3.7     3.9       0.2
I enjoy learning about new technologies.        4.1     4.2       0.1
I enjoy using computers.                        4.5     4.5       0.0

Table 5. Sample Competency Items Before and After Taking IT Literacy Course

Question                                                              Before   After   Difference
I know how to determine how much RAM is installed on a computer.       2.8     4.2       1.4
I know how to change BIOS settings.                                    1.9     3.1       1.2
I know how to use an FTP or SSH program to transfer files.             1.9     3.4       1.5
I know how to receive information via an RSS feed.                     1.8     3.0       1.2
I know how to disable and enable cookies in a Web browser.             2.9     4.0       1.1
I consider myself computer savvy.                                      3.4     4.0       0.6

Conclusion

This paper reports on a preliminary examination of a new instrument to assess IT literacy in first-year university students.
Results are largely descriptive, so there are limits on our ability to interpret them and draw conclusions. Rather, the results
can guide a qualitative, reflective discussion of the current state of IT literacy among our first-year students. For example,
the greatest changes were observed in items that referred to file transfer and hardware configuration. Students perceived that
their knowledge of these two areas had increased. It is unclear whether their knowledge actually increased, but
increased awareness is a step in the right direction. An important goal of the IT literacy course is to manage student
perceptions and expectations about what they need to know in order to succeed as college students.

Next steps for this research include the following. First, a larger number of students is needed to participate in the study.
Ideally, students from multiple institutions will participate in the future, allowing for more rigorous statistical analysis,
including tests of significance, and possibly for refinement of the three clusters we have identified so far.
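
With a larger sample, the natural first significance test for this pre/post design is a paired comparison of each item's scores. The sketch below shows what that might look like with SciPy, on hypothetical data; because Likert responses are ordinal, the nonparametric Wilcoxon signed-rank test is shown alongside the paired t-test as a more conservative alternative.

    # Paired pre/post significance tests for a single item (hypothetical data).
    from scipy import stats

    before = [2, 3, 3, 2, 4, 3, 2, 3, 4, 2]   # first-day responses
    after  = [4, 4, 5, 3, 5, 4, 3, 4, 5, 4]   # end-of-term responses

    # Paired t-test: treats the 1-5 responses as interval data.
    t_stat, t_p = stats.ttest_rel(after, before)
    print(f"paired t = {t_stat:.2f}, p = {t_p:.4f}")

    # Wilcoxon signed-rank test: ordinal-friendly alternative.
    w_stat, w_p = stats.wilcoxon(after, before)
    print(f"Wilcoxon W = {w_stat:.1f}, p = {w_p:.4f}")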

IT literacy is a moving target and instruments used to assess IT literacy must be fluid in design. Many of the competencies
encompassed by what has variously been referred to as IT literacy, computer literacy, information literacy, and computer
fluency change at the same rate as the underlying technologies themselves. IT is indeed the third literacy on par with English
and mathematics.
References
Bartholomew, K. C. (2004, October). Computer literacy: Is the emperor still exposed after all these years? Journal of
Computing Sciences in Colleges, 20(1), 323-331.
BCS (2006). European Computer Driving License. Retrieved August 28, 2006, from ECDL Home Web site:
http://www.bcs.org/server.php?show=nav.5829
Certiport (2006). IC3 Certification. Retrieved August 22, 2006, from http://info.certiport.com/yourpersonalpath/ic3Certification/.
Course Technology (2004, December 20). Thomson Course Technology launches SAM Challenge, an industry first in
computer skills assessment software. Retrieved August 22, 2006, from
http://www.course.com/news/SAMchallenge_1204.cfm.
Course Technology (2006). SAM Challenge. Retrieved August 22, 2006, from
http://samcentral.course.com/sam_challenge.cfm.
Ehrmann, S. C. (2004). Beyond computer literacy: Implications of technology for the content of a college education. Liberal
Education. Retrieved July 30, 2006, from http://www.aacu-edu.org/liberaleducation/le-fa04/le-fa04feature1.cfm
George Mason University: Technology Across the Curriculum (2005). Retrieved July 30, 2006, from
http://cas.gmu.edu/tac/index.html.
Harvey, B. (1983). Stop saying "computer literacy"! Retrieved July 30, 2006, from
http://www.cs.berkeley.edu/~bh/stop.html
Hawkins, B. and Oblinger, D. G. (2006, July/August). The myth about the digital divide: "We have overcome the digital
divide." EDUCAUSE Review, 41(4), 12-13.
Hoffman, M. and Blake, J. (2003, May). Computer literacy: Today and tomorrow. Journal of Computing Sciences in Colleges,
18(5), 221-233.
Hoffman, M., Blake, J., McKeon, J., Leone, S., and Schorr, M. (2005, May). A critical computer literacy
course. Journal of Computing Sciences in Colleges, 20(5), 163-175.
Holisky, D. A. (n.d.). Ten IT goals: Information technology goals for liberal arts students. George Mason University.
Retrieved July 30, 2006, from http://cas.gmu.edu/tac/goals/tenitgoals.html.
International Computer Driving License (2003). About ICDL. Retrieved August 22, 2006, from
http://www.acs.org.au/icdl/category.asp?category_id=3
International Computer Driving License - US (2005). The ICDL Program. Retrieved July 30, 2006, from
http://www2.icdlus.com/icdlus-lms-webclient/homepage/about/program.html.
Oblinger, D. G. and Hawkins, B. (2006, March/April). The myth about student competency: "Our students are
technologically competent." EDUCAUSE Review, 41(2), 12-13.
Prentice Hall (2006). Train and Assess IT Faculty Support Home Page. Retrieved August 22, 2006, from
http://www2.phgenit.com/support/support/HomeContent.asp.
Tek.Xam (2005). Retrieved August 22, 2006, from http://www.tekxam.com.
About the Authors

Jorge Pérez is the CETL Faculty Fellow for E-Learning and an Assistant Professor in the Computer Science and Information
Systems Department at Kennesaw State University. He holds a Ph.D. in Information Systems and has over nineteen years of
experience in the field as a consultant, systems analyst, web developer and educator. Professor Pérez teaches e-business, web
development and informatics at the undergraduate and graduate levels. He has published and presented research on
information security, diffusion of innovations, and assessment. His current research focuses on identifying, assessing and
amplifying essential IT and Internet competencies.

Meg Murray is an Associate Professor in the Department of Computer Science and Information Systems at Kennesaw State
University, part of the higher education system of the state of Georgia. She holds a Ph.D. in Information Systems, an MBA in
Finance and an MS in Computer Science. She has been in the field of computing for more than twenty years and has served
in both higher education and industry. Dr. Murray specializes in emerging technologies and the development and
implementation of those technologies to meet business and organizational needs, with a special interest in technology infusion
in healthcare. Her focus in teaching is to inspire students to create and devise new and innovative ways to implement
information technologies to solve real-world problems. Her most recent work is in the area of devising strategies to assess
and remediate the IT skills needed by an educated workforce, to ensure they are able to use the power of technology as a
means for innovation, the driver necessary to sustain economic growth.

Martha Myers is Professor of Computer Science and Information Systems (CSIS) at Kennesaw State University (KSU),
where she has taught since 1990. Dr. Myers holds three degrees from UT Austin: a BA in mathematics, an MA in mathematics
and computer science education, and a PhD in information systems. She served as CSIS Department Chair from 1993 to 2000.
From 1985 to 1988, she was Vice President of Systems for Continental Insurance Company. She has also taught high school
mathematics and computer science. Teaching interests include database systems, systems analysis and design, and project and
team management. Research subjects include diversity in the IT workplace, real-time writing, and database pedagogy. Dr.
Myers is faculty principal for KSU's Women In Technology student organization and chair of OWLS (Outfitting Women
Leaders in the Sciences), a faculty organization at KSU.
Appendix A
