Benchmarking Open Government: An Open Data Perspective

Keywords: Open government data; Conceptual model; Open government benchmark; Open data

Abstract: This paper presents a benchmark proposal for Open Government and its application from the open data perspective, using data available on the U.S. government's open data portal (data.gov). The benchmark is developed over the adopted Open Government conceptual model, which describes Open Government through openness, transparency, participation and collaboration. Through its two resulting measures, the e-government openness index (eGovOI) and Maturity, the benchmark indicates the progress of government over time, the efficiency of recognizing and implementing new concepts and the willingness of the government to recognize and embrace innovative ideas.

© 2014 Elsevier Inc. All rights reserved. http://dx.doi.org/10.1016/j.giq.2013.10.011

Please cite this article as: Veljković, N., et al., Benchmarking open government: An open data perspective, Government Information Quarterly (2014), http://dx.doi.org/10.1016/j.giq.2013.10.011
knowing the place of a government on an Openness scale could significantly influence a government's developmental path. Not only could a government have a general view of its stage of accomplishment, but it could also question its development with regard to each indicator and gain a clear and realistic insight into its openness, transparency and collaboration level. Benchmark results could point government development toward weak features and thus help to strengthen the relationship between the governments and the governed. Throughout this article, we will explain the benchmark's scope, its indicators and the resulting measures of government openness, the e-government Openness Index (eGovOI) and Maturity. We will apply the OpenGovB from the open data perspective using freely available data from the official U.S. government's open data portal, data.gov. A discussion of the results and of the OpenGovB model is given at the end of the paper and will provide the reader with a better picture of the significance of this model, its strengths and its weaknesses.

2. Open government

The Open Government movement was initiated in 2009, when the president of the United States of America, Barack Obama, issued a Memorandum on Transparency and Open Government that set "creating an unprecedented level of openness in government" as a primary goal (White House, 2009a). Soon afterward, Obama issued another Memorandum on the Freedom of Information Act, described as "the most prominent expression of a profound national commitment to ensuring an open government" (White House, 2009b). These memoranda served as a foundation for issuing the Open Government Directive in December 2009, which addressed ministerial departments and agencies through a series of tasks leading toward creating a more open government (Orszag, 2009). The Obama Administration has built the OG concept on the ideas of transparency, participation and collaboration in government. The road toward creating the opportunity for Open Government to flourish included building the necessary legal environment. McDermott (2010) describes in detail the legal history of OG, starting with "The Paperwork Reduction Act" from 1980 and its journey toward the 2010 Memorandum "Information Collection under the Paperwork Reduction Act", over the "E-government Act" initiated in 2001, and finalizing with "The Freedom of Information Act", which has existed in many versions since 1966. The research notes the extensive legal background of openness, emphasizing the effort behind the development and implementation of the OG concept and setting a valuable example for others to follow.

Other countries have followed the U.S. government's pioneering initiative for OG and have announced their own ideas that could be directly or indirectly categorized as OG initiatives. For example, the Australian Government 2.0 Taskforce report (Gruen, 2009) defines three key principles for introducing openness and transparency in government: informing, engaging and participating. These principles, which are centered on citizens, tend to lead the government toward an informed, connected and democratic community. The government of the United Kingdom has published an action plan for smarter, more efficient government, entitled "Putting the Frontline First: Smarter Government" (Britain & Treasury, 2009). This plan aimed at opening the government and promoting transparency through the following actions: strengthening the role of citizens and civic society, recasting the relationship between the center and the frontline and saving money through sharper delivery (Huijboom & Van den Broek, 2011). In addition to individual government initiatives, there are global efforts for promoting government openness. The Open Government Partnership (OGP) was launched in September 2011 with the aim of committing governments to promote transparency, empower citizens, fight corruption, and harness new technologies to strengthen governance. By the beginning of 2013, the Open Government Partnership had welcomed 58 members willing to increase governmental openness by embracing the Open Government Declaration, delivering action plans and reporting on their progress (Open Government Partnership, 2012).

In the scope of the mentioned OG initiatives, we can summarize the principles that permeate the Open Government idea:

• opening public sector information and enabling citizens and entrepreneurs to access government-held data in a uniform way (data transparency);
• opening government processes and operations to the public (government transparency);
• explaining decisions and actions to the citizens, acting on requirements expected for the task and accepting responsibility for failure (government accountability);
• engaging citizens in decision making (participation);
• enabling cooperation across different levels of government, between the government and private institutions and between the government and the citizens (collaboration).

2.1. The conceptual model of Open Government

One of the reasons that hinder the development of an Open Government benchmark is the lack of OG conceptual clarity. Developing a conceptual model of OG will help us to better understand the Open Government idea and will guide us through the process of defining benchmark indicators for evaluating OG. To adopt such a model, we will examine the existing OG conceptualizations. Obama's three pillars of OG (transparency, participation and collaboration) are assumed, or differently named, in the aforementioned OG initiatives around the world. For example, the Australian government principles (informing, engaging and participating) can be directly mapped onto transparency, collaboration and participation. The OGP's goals (transparency, citizens' engagement, fighting corruption and strengthening governance) can be interpreted as transparency, participation and accountability. Thus, the question is: which ideas can we consider as foundations of OG that enable us to better understand OG, develop indicators for the benchmark model, and grasp the OG character? As observed by Martin (2013), the main issues with the U.S. government's OG concept are that it does not draw a clear boundary between participation and collaboration and that the social media tools and activities that constitute each of them depend on individual viewpoints. This is one of the reasons why others suggest a different set of pillars as the foundation of the OG concept, namely, participation, accountability and transparency (Martin, 2013; Open Government Standards, 2012).

It is obvious that all OG models are centered on three pillars. Two of the pillars repeat themselves, namely, transparency and participation, while collaboration and accountability are used interchangeably. Although some argue that it is not clear whether the participation is collaborative or the collaboration is participatory, they are debating the tools that should be used to strengthen these OG dimensions (Martin, 2013); at the same time, we can see the line between the two concepts in the definitions of expressive and collaborative social media. Expressive social media enables people to state their opinions by sharing text, pictures, video and music with others, while collaboration enables people to join their efforts with government and work together toward achieving a common goal (Kotler, Kartajaya, & Setiawan, 2010). Having these scenarios in mind, we can look at participation as the tool that utilizes the input of the public through expressive social media for the enhancement of policy decisions and government services. Collaboration can thus be perceived as the engagement of citizens, businesses and government agencies in complex tasks or projects that aim to produce specific outputs (Lee & Kwak, 2011). Accountability has been suggested as one of the foundations of the OG idea (Martin, 2013; Open Government Standards, 2012), but we believe that it is better seen as an outcome of Open Government. Accountable government is something that could be achieved through the idea of
government transparency (Campaign for Vermont, 2013; Hollyer, Rosendorff, & Vreeland, 2013).

Originating from the aforementioned government memoranda, strategies and action plans, and referring to our own study (Veljković, Bogdanović-Dinić, & Stoimenov, 2012) and the findings of other authors regarding Open Government around the world (Huijboom & Van den Broek, 2011; Lathrop & Ruma, 2010; Lee & Kwak, 2011), this paper will rely on the conceptual model of Open Government depicted in Fig. 1.

Open data has truly defined the Open Government concept but has also directed its growth toward data rather than services (Veljković, Bogdanović-Dinić, & Stoimenov, 2011b). Open data are governmental data of public interest that are available without any restrictions and that can be easily found and accessed. These data can include data on transport, spatial data, weather information, reports, pictures and other information of public importance. David Eaves (2009) has a specific vision of open data, according to which open data must comply with the following principles: find, use and share. This means that the government should provide data in a way that allows data indexing and searching, in a reusable format and with no restrictions on data re-purposing. Given the widespread use and impact of governmental information, it is important for it to meet the following characteristics: complete, primary, timely, accessible, machine processable, non-discriminatory, non-proprietary and license free (Open Government Working Group, 2007).

Transparency is a crucial ingredient of OG; more transparency means better governance, more efficiency and legitimacy. Transparency can be further divided into transparency of government operations, procedures and tasks (government transparency) and transparency of government-held data (data transparency). Transparency of a government is a means for achieving an accountable government that measures and tracks the outcomes of its actions and takes responsibility for the results. Letting people see the internal government flows and investigate whether their representatives have met their expectations is an important step toward achieving accountable government. Data openness is a necessary prerequisite for transparency, and it is being promoted around the world as part of the Open Government initiatives (Open data portals, 2011). However, the transparency of data goes beyond data openness and availability, because transparency of data is about ensuring that the data are well known, comprehensible, easily accessible, and open to all. Transparency does not have only short-term considerations that regard information availability to all; additionally, there are long-term considerations that pertain to information usability by all. This further requires establishing tasks related to achieving information usability and accessibility, promoting government and information and technology literacy, making appropriate and accurate content and services available, meeting user expectations, promoting trust, and encouraging lifelong usage (Jaeger & Bertot, 2010).

Participation aims at including citizens in the democratic processes of the state (Parycek & Sachs, 2010), and it represents an issue that is extensively treated in the academic literature (Åström et al., 2012; Karlsson, Holgersson, Söderström, & Hedström, 2012; Siddiqi, Akhgar, Gamble, & Zaefarian, 2006). As observed by Gant and Turner-Lee (2011), the interaction between government and the governed is a core pillar of democratic society. Governments can rely on expressive social media and simple interactive communications (blogging, microblogging, tagging, photo and video sharing) to connect people, help them share their ideas, receive their valuable feedback on various matters and involve them in the policy-making process (Lee & Kwak, 2011).

Collaboration is aimed at more responsive decision making based on collaborative work and feedback information. Collaboration enables the involvement of all stakeholders in government operations and decision making. By relying on collaborative social media tools such as wikis or Google Docs for dedicating specific tasks to the wider community, governments can accomplish the goal of collaboration (Lee & Kwak, 2011). There are different types of collaboration in government: internal collaboration within the government (G2G — government to government), intra-collaboration between government and non-profit organizations and the private sector (G2B — government to businesses) and external collaboration between the government and the citizens (G2C — government to citizens).

2.2. Benchmarking open government

Benchmarking is used as a tool for making comparisons between two or more entities based on a defined set of indicators (Rorissa, Demissi, & Pardo, 2011). Schellong (2009) emphasizes the importance of benchmarking e-government by explaining that it indicates progress in reaching e-government goals and that it can be used as a tool for
learning, information sharing, goal setting or supporting performance management. E-government benchmarks developed prior to the OG are not applicable for this model because OG is more focused on data than on developing electronic services. Furthermore, the missing clear conceptualization of the OG could be the reason why the idea of OG evaluation has very limited academic research (Sandoval-Almazán, 2011). Ren and Glissmann (2012) propose an approach for open data assessment that is based on theory and practice in business architecture and information quality. They propose six criteria for identifying information assets as open data: accessibility and availability, understandability, completeness, timeliness, error free and security. Government transparency has been conceived variously, from the flow of information to government accountability; as a result, there are different measures that address these aspects of transparency (Hollyer et al., 2013). Transparency has often been viewed through the lens of government corruption. The Corruption Perceptions Index (CPI) from Transparency International (TI) measures the perceived levels of public sector corruption. It is based on expert assessments and survey data, covering issues such as access to information, bribery of public officials, kickbacks in public procurement, and the enforcement of anti-corruption laws (Lambsdorff, 2007). One of the World Bank's Worldwide Governance Indicators, Control of Corruption, captures perceptions of the extent to which public power is exercised for private gain. In comparison to the CPI, Control of Corruption measures corruption both in the public and the private sector as perceived by experts and opinion polls (Rohwer, 2009).

Kalampokis, Tambouris, and Tarabanis (2011) suggested a benchmark model that evaluates Open Government through four stages: government data aggregation, government data integration, integration of government data with non-government formal data and integration of government data with non-government formal and social data. There are also some private sector attempts at measuring open data and government openness. Socrata (2011) has performed an Open Government data benchmark study that addresses open data as a strategic aspect in an OG model. They have studied the presence of open data in government portals, the availability of open data portals and the high-value datasets that are present on the portals. This study clearly addresses one aspect of the Open Government model, which is open data, but it is necessary that an Open Government benchmark also addresses other Open Government features, namely, transparency, collaboration and participation, and defines appropriate criteria and scales for measurement.

3. Benchmark model proposal for open government

Our OG benchmark model proposal (OpenGovB) was developed to explore the boundaries of government openness and discover whether the goals of the OG concept, which were proposed in Section 2.1, are fulfilled. The benchmark results in two measures:

• e-government Openness Index (eGovOI), which reflects the government's state regarding governmental efforts, readiness to publish open datasets in a timely manner, orientation toward user needs and user involvement in government.
• Maturity, which reflects the government's progression speed and openness toward new concepts.

The main building blocks of our benchmark (data sources, indicators and results) are depicted in Fig. 2.

The sources identified for the benchmark indicators are open data and user involvement. Behind the OG data is the principle that data are a "strategic asset that should be shared with the public to increase government accountability, deliver services more efficiently and stimulate economic growth" (Socrata, 2011). User involvement is considered to be a valuable source for government openness, and it is included in the benchmark model to express the willingness and readiness of government to accept and utilize users' perspectives and points of view.

In the proposed benchmark model, open data are used as a data source for three indicators: basic dataset, data openness and data transparency. User involvement is used as a source for the participation and collaboration indicators. Within the participation indicator, user involvement addresses citizens' involvement in the government process and decision making. In the scope of the collaboration indicator, user involvement encompasses citizens', businesses' and government agencies' involvement in the process of collaborative decision making. Each of the five indicators (basic dataset, openness, transparency, participation and collaboration) has a set of rules that are applied to scoring open data and user involvement.

Because the focus of this paper is on open data, the indicators that are founded on open data (basic dataset, data openness, data transparency) will be thoroughly described in the following subsections. Although developed, the participation and collaboration indicators will not be discussed here because they are beyond the scope of this paper.

3.1. Basic dataset indicator

The basic dataset (BDS) indicator determines the presence of a predefined set of high-value open data categories. These categories can vary in different countries, but to establish a standard assessment model, it is crucial for a basic set of categories to be defined and adopted. Upon analysis of open data portals around the world (Veljković, Bogdanović-Dinić, & Stoimenov, 2011a), we have singled out the nine most common data categories: Finance and Economy, Environment, Health, Energy, Education, Transportation, Infrastructure, Employment
and Population. These categories are present in each analyzed portal, which emphasizes the common importance of updating and publishing such information. Consequently, these categories have been adopted to form our predefined basic dataset.

The assessment of the BDS indicator requires the analysis of the available categories on a government's data portal. A category is considered to be present if it has at least one dataset published. Government portals should have all of the data categories from the basic dataset along with corresponding data for each category. The BDS indicator takes values from the range (0, 1). The value 0 indicates that no high-value dataset category is present on the open data portal; the value 1 indicates that all categories are present. The calculation is performed according to Eq. (1), where n stands for the number of categories that are present on the open data portal, while N marks the total number of categories (N = 9):

BDS = n / N. (1)

3.2. Data openness indicator

The data openness (DO) indicator is focused on evaluating the degree of openness of the published data and thus comprises eight criteria that are consistent with the Open Government Working Group's list of eight preferable characteristics for open data (2007). Table 1 gives a short description of each criterion along with its value and the total value of the DO indicator.

Completeness is calculated according to five recognized features: the presence of a data meta description, the possibility of data downloading, whether the data are machine readable, whether the data are linked (meaning that a data link is available), and ease of data accessibility (e.g., embedding data in a custom web application, linking to other data). Data are primary if the data are published raw, in the original format, to be suitable for different analyses; if data are published in the form of a chart or any other pre-analyzed format, then the data are not considered to be primary. Data are timely if the data are up-to-date and/or regularly updated; each dataset should have, in its description, a part that describes the timeliness of the data: what time period is covered by the data in the dataset, how often the data are updated, and when the last data update was. Data accessibility considers that the data should be accessible to everyone equally, without being asked for the purpose of use, while the non-discriminatory feature reflects freely available data. Machine processable means that the data can be processed by a computer; in that regard, the proposed model recognizes three evaluation levels: formats that are not machine processable (e.g., PDF), structured formats that can be automatically processed (e.g., CSV) and structured formats that include meta-description and semantics (e.g., XML, RDF). The non-proprietary feature relates to the previous feature by considering data formats from the aspect of supported processing programs; in that manner, for datasets that are available in a format such as XLS, which requires the commercial Microsoft Excel program for access, this feature is given a value of 0, while for formats such as CSV, XML and RDF, which do not require any specific commercial program, this feature is given the value 1. Finally, the license-free feature relates to free access to data and is scored 1 if the data are published under an open license.

To perform the assessment of data openness, it is necessary to choose a relevant subset of the data for each published dataset category, as explained in Section 3.2.1. Data openness for the whole portal can then be calculated as the average openness over all dataset categories. The data openness indicator for the whole portal takes values in the range (0, 1). Table 2 gives a five-level openness scale that is based on the calculated values of the DO indicator. The lowest level is called cradle, and it only recognizes an initiative toward adopting data openness, while the other identified levels indicate the government's advancement regarding the eight openness principles.

3.2.1. Choosing a relevant data subset

Determining a sample size for each dataset category is an important step, not only for the DO calculation but also for the Transparency indicator. A statistical approach is needed here to provide a reliable method for sample size determination under given restrictions, such as the confidence level and margin of error. Eqs. (2) and (3) represent the formulas that are used to determine the sample size (Sematech, 2012):

ss = Z^2 · p(1 − p) / c^2 (2)

ss′ = ss / (1 + (ss − 1) / pop). (3)

Eq. (2) explains the process of calculating the sample size based on the confidence level, margin of error and expected accuracy. The margin of error (also called the confidence interval) is a plus/minus value that indicates the precision of a chosen sample and allows deviation of the expected results. The margin of error is denoted as c in the equation, and in this research, it was valued at 10%. Z represents the chosen confidence level, which is expressed as a percentage and represents how often the true percentage of the sampled data satisfies the required characteristic and lies within the confidence interval. Usually, Z is chosen to be 90 or 95%. We have chosen a 95% confidence level, for which Z takes on the value of 1.65 in the calculation, according to the table of standard normal curve area values. Finally, p represents the accuracy, which is expressed as the percentage of the sampled data that would truly satisfy the required characteristics. Because there is no trustworthy way to reliably predict such a percentage, we have used the 50% value. Eq. (3) represents a correction of the calculated sample size, ss′, according to the true size of the data category, which is denoted as pop.

3.3. Transparency indicator

The transparency (T) indicator is composed of two indicators, Government Transparency (GT) and Data Transparency (DT), and is calculated as the average of their values. We chose the average function because we consider both types of transparency, data and government, to be equally important for building a transparent government. The T indicator takes values from the range (0, 1), where 0 stands for a lack of transparency and 1 is associated with high transparency.

Government Transparency is observed as a measure of insight into government tasks, processes and operations. This benchmark does not propose new measures for GT because there are already well-developed and acknowledged measures, such as the previously referenced Corruption Perceptions Index (CPI) and Control of Corruption (CC). We leave it to the government to choose a measure for the GT indicator, but we recommend that one of the previous two indices be involved.

The Data Transparency indicator is calculated as the average of the Authenticity, Understandability and Data Reusability values. This indicator takes values from the range (0, 1), where the value 0 stands for a complete lack of data transparency and the value 1 represents the highest transparency level. Table 3 provides an overview of the five data transparency levels, where cradle gives recognition to initial governmental efforts toward embracing data transparency principles, while high transparency signifies the highest efforts and results regarding providing a transparent view of the data and access to it.

Authenticity (A) relates to Data Sources (DS) and Data Accuracy and Integrity (DAI). Data sources are considered to be government authorities and non-governmental institutions and agencies that publish raw data on an open data portal. These institutions should be well known and have a good reputation so that users can utilize the data safely and without prejudice. Government should publish information about data sources on the portal, thus making it freely available and accessible to
Table 1
Calculation of the DO indicator.

Criterion | Description | Max
Primary | Are data provided in original form and can be used for further analyses? | 1
DO | | 8/8
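The calculations in Eqs. (1)-(3) can be sketched in code as follows. This is a minimal illustration using the parameters stated in Section 3.2.1 (Z = 1.65, c = 0.10, p = 0.50); the function and variable names are our own, not part of the benchmark definition.

```python
import math

BASIC_CATEGORIES = 9  # N in Eq. (1): the nine high-value categories

def bds(present_categories: int, total: int = BASIC_CATEGORIES) -> float:
    """Eq. (1): share of basic-dataset categories present on the portal."""
    return present_categories / total

def sample_size(pop: int, z: float = 1.65, c: float = 0.10, p: float = 0.50) -> int:
    """Eqs. (2) and (3): sample size with finite-population correction.

    z   -- standard normal score for the chosen confidence level
    c   -- margin of error (10% in this research)
    p   -- expected accuracy (50% when no better estimate exists)
    pop -- true size of the data category
    """
    ss = (z ** 2) * p * (1 - p) / (c ** 2)    # Eq. (2)
    ss_corrected = ss / (1 + (ss - 1) / pop)  # Eq. (3)
    return math.ceil(ss_corrected)

# Example: 7 of the 9 categories present, and a category holding 1000 datasets.
print(round(bds(7), 3))   # 0.778
print(sample_size(1000))  # 64
```

With the paper's parameters, the uncorrected sample size is about 68 datasets; the finite-population correction in Eq. (3) shrinks it for smaller categories.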
all of the users. Additionally, a user should be able to grade each data influence the final value. The final user grading value is calculated as
source according to their experience and provide the grading information user_grade ∗ (graders / number_of_downloads), where user_grade
to the other users. Having that in mind, this benchmark model proposes represents the average grade scaled to the range (0, 1), graders
the following features to be valued in the process of estimating a DS represents a number of users that have graded the dataset, and
indicator along with the proposed involvement of each feature in the final DS value:

• List of data sources available on the data portal (F1) — 30%;
• The possibility of reviewing datasets published by a specific data source (F2) — 30%;
• Available data source description with basic information on the data source (F3) — 20%;
• User grading information (F4) — 20%.

A higher influence on building the DS indicator is given to the features marked as F1 and F2 because they represent high-frequency user requests. The F3 and F4 features are also important, but they are considered to have less influence. Considering all four features, the information provided by the Government (F1, F2 and F3) makes 80% of the DS value, while user-provided feedback is involved with 20%. Eq. (4) gives the formula for the DS calculation.

DS = 0.30 F1 + 0.30 F2 + 0.20 F3 + 0.20 F4. (4)

F1, F2 and F3 are scored 1 if they are satisfied; otherwise, they are scored 0. F4 represents an average grade scaled to the range (0, 1). DS takes its values from the range (0, 1), where 0 signifies that none of the features are present on an open data portal, while 1 marks the maximum data sources' credibility.

It is of crucial importance to fully trust the accuracy and integrity of the published data, but it is also very tricky to evaluate such characteristics. To provide a meaningful and reliable evaluation method, this model proposes 3 assessment features for DAI evaluation:

• Government grading (F1) — 35%;
• User grading (F2) — 35%;
• Quality certification (F3) — 30%.

Government grading implies the existence of government-provided feedback on data accuracy and integrity for each published dataset. In the evaluation process, the average grade is taken for the calculation and scaled to the range (0, 1). User grading represents feedback on published data provided by users. For calculation purposes, the number of users that have provided their feedback should be taken into consideration, and number_of_downloads stands for the number of users that have utilized data and are potential graders. Government and user grading are very important because they represent an opinion, an experience and a point of view, which is why they are equally scored in the proposed model and together make 70% of the DAI feature. The third feature is quality certification, which simply acknowledges the existence of a certifying document for a dataset; quality certification represents electronic proof of the data accuracy and integrity of the contained information. The final value of the DAI feature is calculated according to Eq. (5):

DAI = 0.35 F1 + 0.35 F2 + 0.30 F3. (5)

DAI takes its values from the range (0, 1), where 0 stands for a total lack of trust in the data accuracy and integrity, and 1 signifies the maximal satisfaction of all 3 features.

Authenticity is then calculated based on the DS and DAI values. Having in mind the fact that DAI is calculated based on each dataset from a relevant subset, while DS represents a global view on the data providers, we have determined that DS should make 40% while DAI should make 60% of the Authenticity indicator. Eq. (6) represents the formula for the Authenticity indicator calculation.

A = 0.40 DS + 0.60 DAI. (6)

Table 2
Data openness levels.

DO indicator value | Data openness level
0–5%   | 0 — cradle
6–35%  | 1 — basic openness
36–75% | 2 — average openness
76–90% | 3 — openness
>90%   | 4 — high openness

Understandability (U) of each dataset category as well as of the contained raw data must be provided. This task can be accomplished by publishing textual descriptions of data categories that explain, in detail, the types of data that are published under each category, as well as descriptions of each contained dataset.
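Eqs. (4)–(6) above can be collected into a small computational sketch; the function and parameter names are ours, not the paper's, and the 0/1 encoding of the binary features is an assumption:

```python
def data_source_credibility(f1, f2, f3, f4):
    """DS, Eq. (4): F1-F3 are 0/1 feature scores, F4 is the average
    user grade already scaled to the range (0, 1)."""
    return 0.30 * f1 + 0.30 * f2 + 0.20 * f3 + 0.20 * f4

def data_accuracy_integrity(gov_grade, user_grade, certified):
    """DAI, Eq. (5): average government and user grades in (0, 1),
    plus a 0/1 flag for the existence of a quality certificate."""
    return 0.35 * gov_grade + 0.35 * user_grade + 0.30 * certified

def authenticity(ds, dai):
    """A, Eq. (6): DS gives a global view of the data providers,
    DAI a dataset-level view, weighted 40/60."""
    return 0.40 * ds + 0.60 * dai
```

For example, a portal that lists its data sources and lets users review their datasets, but offers no source descriptions or user grades, scores DS = 0.60.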
Table 3
Data transparency scale.

The Understandability indicator is calculated by checking whether each data category from a basic dataset has a description and whether each dataset contained in a chosen subset within each category has a description. The first feature, Data Categories' Descriptions (DCD), is evaluated against three features:

• Textual description (34%) (F1) — existence of a textual data description for a dataset.
• Tags (33%) (F2) — existence of searchable tags associated with a dataset.
• Linked information (33%) (F3) — links toward additional information regarding the dataset.

All three features are considered to be equally important and are equally scored. Each category is scored either with the value of 1 (if the feature exists) or 0 (otherwise). The DCD feature is then calculated according to Eq. (7); it takes its values from the range (0, 1), where 0 means that not even one data category has a description and the value 1 signifies that all of the categories have adequate descriptions.

DCD = (1/9) Σ_i (0.34 F1_i + 0.33 F2_i + 0.33 F3_i), i = 1, …, 9. (7)

The second feature, Description of Datasets within the data category (DSD), is calculated against the same three characteristics as DCD but with regard to each dataset from a chosen data subset. Each dataset is valued from 0 to 1, following the same equation as the DCD feature, and the final DSD value is calculated as the average value over all of the datasets. The DSD values range from 0 to 1, where 0 implies a total lack of data descriptions, while 1 signifies that all of the datasets are properly described.

The Understandability indicator is calculated based on the DCD and DSD features, as given in Eq. (8). The Understandability indicator ranges from 0 to 1, where 1 implies the complete availability of both the data category and dataset descriptions. In contrast, if there are no descriptions for the data categories and datasets, the Understandability indicator has the value 0.

U = 0.4 DCD + 0.6 DSD. (8)

Data Reusability (DR) is the last transparency aspect that we address. It refers to providing data in open formats so that a user can search, index and download data using common tools without any prior knowledge of the data structures. We have made a DR scale for measuring the openness of linked open data by adapting Sir Tim Berners-Lee's (2010) five-star model for linked data to a 4-level model depicted in Fig. 3.

To calculate the Data Reusability, it is necessary to check the chosen data subsets within the data categories and evaluate their stage. Level 0 is scored with the value of 0, Level 1 with the value of 1, Level 2 with the value of 2 and, finally, Level 3 with the value of 3. A dataset is scored according to the level it belongs to and then scaled to the range (0, 1). The DR indicator's value is the average value of the datasets' reusability. It takes its values from the range (0, 1), where the value 0 represents the lowest data availability level and the value 1 represents the highest level of data availability.

3.4. E-government openness index

The E-government Openness Index is an overall measure of each of the five assessment indicators and a base indicator for Maturity (Fig. 4). The evaluation process is very detailed, and it provides an in-depth analysis of not only the five principal indicators but also the indicators' key components. The results obtained could point to the exact weak spots of the government, addressing with great precision the areas that must be improved and, thus, helping to build steady trust with the government. The calculation of eGovOI is given in Eq. (9). The basic dataset indicator makes 15% of eGovOI, the Openness indicator is involved with 33%, the Transparency indicator is involved with 26%, and the Collaboration and Participation indicators are included with 13% each. This division is drafted based on the assessment of the importance of the participation of each of the five indicators in the final measure of eGovOI.

eGovOI = 0.15 BDS + 0.33 O + 0.26 T + 0.13 P + 0.13 C. (9)

The eGovOI maps its values on a 0–100% scale, which reflects the current state of government openness. A total lack of openness is determined with 0%, while 100% represents full openness. Because the basic dataset and data openness are prerequisites for Open Government, they should have priority over the other three indicators. A government could not be considered to be open if it only has these two features implemented, but it can be considered to be on the "openness-road". This possibility actually means that Basic Dataset and Data Openness make the foundation for Open Government, while Transparency, Participation and Collaboration determine the degree of openness.
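The description-based features and the reusability scale lend themselves to a direct translation. This sketch follows Eqs. (7) and (8) and the 4-level DR scale; the names and input encoding are our assumptions:

```python
def description_score(has_text, has_tags, has_links):
    """Per-item description score used by both DCD and DSD
    (F1 text 34%, F2 tags 33%, F3 linked information 33%)."""
    return 0.34 * has_text + 0.33 * has_tags + 0.33 * has_links

def dcd(category_scores):
    """DCD, Eq. (7): description scores of the nine basic-dataset
    categories, summed and divided by 9."""
    return sum(category_scores) / 9

def understandability(dcd_value, dsd_value):
    """U, Eq. (8): dataset descriptions (DSD) weigh more than
    category descriptions (DCD)."""
    return 0.4 * dcd_value + 0.6 * dsd_value

def data_reusability(levels):
    """DR: each sampled dataset sits on the 4-level scale (0-3);
    scores are scaled to (0, 1) and averaged."""
    return sum(level / 3 for level in levels) / len(levels)
```

With the use-case values reported later (DCD = 0, DSD = 0.7889), understandability(0, 0.7889) reproduces the published U = 0.4733.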
This statement also imposes the need for basic dataset and openness implementation prior to implementing the other two concepts. Thus, it was necessary to establish a scale of openness based on eGovOI values that would define the position of a Government regarding the final goal — 100% openness. Table 4 illustrates the defined openness levels from the perspective of all five openness indicators and eGovOI.

The lowest level is level 0, which recognizes the openness initiative and encourages further development. Basic openness implies that there is solid development of the BDS and DO indicators and that the Transparency development has been initiated. A value of eGovOI between 26 and 65% indicates that government is in a stage of average openness, which means that the BDS and openness indicators are highly developed, the transparency is approximately halfway toward 100%, and the participation and collaboration are at a starting position. The openness level indicates that the first 3 indicators are developed to be over 75% and the latter 2 are over 30%, while a high openness level reflects that they are all over 75%.

3.5. Maturity

The described benchmark has an additional result, called Maturity, which is related to the government's readiness for change and its embracement of open concepts. In other words, Maturity reflects the speed of government progress. The timely identification of necessary changes and their application for the purpose of government evolvement is an essential part of maturity. Maturity is calculated by combining the eGovOI measure and the time necessary for government to make changes and improve its openness. This further means that a government that progresses in a short period of time is more mature than a government that makes the same amount of progress but much more slowly. It can be expected that a more mature government makes a larger amount of progress in the next period of time.

Maturity is a measure that relies on the amount of progress that has been made since the last benchmark as well as the experience gained during the entire open government development process. The amount of progress is measured based on the current and previous eGovOI values and the number of years that passed between two adjacent benchmarks, as indicated in Eq. (10).

F = (eGovOI_new − eGovOI_old) / No_Years. (10)

The progress has been marked as F in the equation because it represents a factor that influences the change of Maturity in the maturity calculation.

Table 4
Openness stages based on the eGovOI value.

Basic dataset | Data openness | Transparency | Participation | Collaboration | eGovOI | Openness level
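The progress factor of Eq. (10) is a plain rate of change; as a sketch (our naming):

```python
def progress_factor(egovoi_new, egovoi_old, no_years):
    """F, Eq. (10): openness gained per year between two adjacent benchmarks."""
    return (egovoi_new - egovoi_old) / no_years
```

A government moving from 40% to 60% openness over two years has F = 0.1 per year.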
The experience represents the knowledge gained from previous efforts in open government development, and it is expressed as the old maturity value. New Maturity is calculated according to Eq. (11):

           ⎧ 0.01,                         Maturity_old = undefined
Maturity = ⎨ Maturity_old + log2(F + 1),   0% < eGovOI < 100%          (11)
           ⎩ Maturity_old + log2(1.01),    F = 0 and eGovOI = 100%

There are three case scenarios for calculating Maturity. The first calculation simply assigns a very small value to Maturity as an indication of starting the development process. We have chosen this value to be 1% (0.01). During the progression, Maturity is calculated according to a logarithmic rule that is based on experience and a progression factor. At the point when eGovOI reaches 100%, the progression factor obtains the value of 0.01, which enables the increase of Maturity over time regardless of having a constant openness index. In this way, credit is given to a Government for reaching 100% openness, and experience is being acknowledged. The Maturity values are in the range (0–100)%. Because there are no empirical results of this model's application over time, we give a hypothetical example of Maturity changes over a period of 6 years in two countries that have different progression rates (Fig. 5).

Based on a concrete value, a government could be classified in one of the following maturity periods: infancy (0–10%), adolescence (11–20%), youth (21–30%), adulthood (31–50%), mid-age (51–75%) and mature-age (76–100%).

4. Use case study: focus on open data

Considering that the U.S. government's data.gov is the first open data portal, one that influenced others to embrace and develop their own open data portals and OG initiatives, we have chosen data.gov to test the capabilities of the proposed OG benchmark model. During the application, we focused solely on open data and, therefore, we have scaled data.gov on the Openness scale based on the openness of data published on the portal and the data transparency. This use case study was performed in October 2012, while data.gov was still running on the Socrata open data platform. Because it is currently in the process of transferring to the CKAN open data platform, it is possible that there are some mismatches with the results that are further presented.

Step 1: BDS calculation
The first step was to determine the presence of the proposed set of mandatory categories. The analysis has shown that data.gov lacks only one of the nine proposed data categories, namely the Infrastructure category. The other categories are all present, and the majority of them hold their data in several subcategories, as listed in Table 5. The value of the BDS indicator for the data.gov portal is 0.89, which is calculated according to Eq. (1), where n takes the value of 8.

Table 5
Determination of the relevant data subset sizes for the U.S. Government's open data.
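The three cases of Eq. (11) and the maturity periods can be sketched as follows; the function names and the use of None to encode an undefined old Maturity are our assumptions:

```python
import math

def maturity(maturity_old, f, egovoi):
    """Maturity, Eq. (11): a 0.01 seed on the first benchmark, logarithmic
    growth driven by the progress factor F while 0% < eGovOI < 100%, and a
    fixed log2(1.01) increment once 100% openness is sustained (F = 0)."""
    if maturity_old is None:
        return 0.01
    if f == 0 and egovoi == 1.0:
        return maturity_old + math.log2(1.01)
    return maturity_old + math.log2(f + 1)

def maturity_period(m):
    """Classify a Maturity value in (0, 1) into the proposed periods."""
    bounds = [(0.10, "infancy"), (0.20, "adolescence"), (0.30, "youth"),
              (0.50, "adulthood"), (0.75, "mid-age"), (1.00, "mature-age")]
    for upper, name in bounds:
        if m <= upper:
            return name
    return "mature-age"
```

Starting undefined, a government seeds at 0.01; a progress factor of 1.0 then adds a full log2(2) = 1 to Maturity on the next benchmark.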
Table 6
The results of the DO indicator calculation for the U.S. government's open data.

Category | SubCategory | Complete | Primary | Timely | Accessible | Machine processable | Non-discriminatory | Non-proprietary | License free | Subcategory total | Category total
Finance and economy | | 0.5622 | 0.5889 | 0.4489 | 0.5889 | 0.5267 | 0.5889 | 0.5778 | 0.5889 | | 0.5589
 | Economic | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 |
 | Financials | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 |
 | Banking, finance and insurance | 1.0000 | 1.0000 | 0.8000 | 1.0000 | 0.9000 | 1.0000 | 1.0000 | 1.0000 | 0.9625 |
 | Federal government finances and employment | 0.9111 | 0.9444 | 0.9444 | 0.9444 | 0.7333 | 0.9444 | 0.8889 | 0.9444 | 0.9069 |
 | State and local government finances and employment | 0.9000 | 1.0000 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 0.9250 |
Environment | | 0.9750 | 0.9750 | 0.9250 | 1.0000 | 0.8225 | 1.0000 | 0.9500 | 1.0000 | | 0.9559
 | Environment | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 |
 | Geography and environment | 0.9500 | 0.9500 | 0.8500 | 1.0000 | 0.6450 | 1.0000 | 0.9000 | 1.0000 | 0.9119 |
Health | | 0.4867 | 0.5000 | 0.2333 | 0.5000 | 0.2700 | 0.5000 | 0.4000 | 0.5000 | | 0.4238
 | Health | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 |
 | Health and nutrition | 0.9733 | 1.0000 | 0.4667 | 1.0000 | 0.5400 | 1.0000 | 0.8000 | 1.0000 | 0.8475 |
Energy | | 1.0000 | 1.0000 | 0.5714 | 1.0000 | 0.2667 | 1.0000 | 0.1429 | 1.0000 | | 0.7476
 | Energy and utilities | 1.0000 | 1.0000 | 0.5714 | 1.0000 | 0.2667 | 1.0000 | 0.1429 | 1.0000 | 0.7476 |
Education | | 0.9385 | 0.9231 | 0.7692 | 0.9231 | 0.4154 | 0.9231 | 0.7692 | 0.9231 | | 0.8231
 | Education | 0.9385 | 0.9231 | 0.7692 | 0.9231 | 0.4154 | 0.9231 | 0.7692 | 0.9231 | 0.8231 |
Transportation | | 0.9333 | 0.7500 | 0.8333 | 0.9167 | 0.2750 | 0.9167 | 0.1667 | 0.9167 | | 0.7135
 | Transportation | 0.9333 | 0.7500 | 0.8333 | 0.9167 | 0.2750 | 0.9167 | 0.1667 | 0.9167 | 0.7135 |
Population | | 0.9118 | 0.9412 | 0.7647 | 0.9412 | 0.5471 | 0.9412 | 0.7059 | 0.9412 | | 0.8368
 | Population | 0.9118 | 0.9412 | 0.7647 | 0.9412 | 0.5471 | 0.9412 | 0.7059 | 0.9412 | 0.8368 |
Employment | | 0.8881 | 0.9370 | 0.7704 | 0.9370 | 0.7822 | 0.9370 | 0.8963 | 0.9370 | | 0.8856
 | Labor force, employment and earnings | 0.8533 | 0.8667 | 0.8667 | 0.8667 | 0.6133 | 0.8667 | 0.8000 | 0.8667 | 0.8250 |
 | State and local government finance and employment | 0.9000 | 1.0000 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 0.9250 |
 | Federal government finances and employment | 0.9111 | 0.9444 | 0.9444 | 0.9444 | 0.7333 | 0.9444 | 0.8889 | 0.9444 | 0.9069 |
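The totals in Table 6 follow a simple two-stage averaging, which can be verified directly against the table (the helper names are ours):

```python
def subcategory_total(criterion_scores):
    """A subcategory's total is the mean of its eight openness-criterion scores."""
    return sum(criterion_scores) / len(criterion_scores)

def category_total(subcategory_totals):
    """A category's total is the mean of its subcategory totals."""
    return sum(subcategory_totals) / len(subcategory_totals)

# 'Banking, finance and insurance' row from Table 6:
banking = [1.0, 1.0, 0.8, 1.0, 0.9, 1.0, 1.0, 1.0]
```

subcategory_total(banking) reproduces the 0.9625 in the table, and averaging the five Finance and economy subcategory totals (0, 0, 0.9625, 0.9069, 0.9250) reproduces the category total 0.5589.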
The selected data.gov categories have been developed as data communities that gather citizens, scholars, businesses and innovators around the same interest areas and that enable collaboration and knowledge sharing (McClure, 2011).

Step 2: Calculation of the sample sizes for the categories
Once the categories have been determined, a sample size for each category must be calculated according to Eqs. (2) and (3). Where a category is comprised of several subcategories, a sample size for each subcategory is calculated. Table 5 shows the results of the applied formula on the U.S. government's open data, which indicates the sample sizes for each category/subcategory.

Step 3: Data openness calculation
After determining the size of the samples, data were chosen randomly from the data categories and subcategories, and the results of the DO calculation are shown in Table 6. The final DO value for data.gov was calculated to be 67.5%, which is average openness. The best evaluated category is Environment, with a score of 0.9559, which is slightly over 95%, attaining a high openness level. On the other hand, the worst scored category is Health, with a result of 0.4238, which is under 45%, placing it in the lower average openness level. This category is comprised of two subcategories: Health and Health and Nutrition. The Health and Nutrition subcategory has achieved a relatively high result of 0.8475, but the Health subcategory does not have any dataset and is thus scored with a 0. This analysis points toward data categories for which improvement could directly influence the improvement of the overall DO indicator, such as the Health subcategory. With regard to the analysis of the 8 DO indicators, the overall lowest score has been achieved by the Machine processable indicator (0.4882), which is a direct consequence of the fact that the majority of datasets are available in CSV, XLS or PDF formats, ignoring the need for semantic data enrichment. The Timely and Non-proprietary indicators have also been placed in the lower graded group, with the values 0.6645 and 0.5761, respectively. These results identify challenging indicators that require more attention in the future, to increase their scores and come closer to the other, highly valued, indicators. A similar analysis could be performed per category, where the results could be utilized for identifying weak indicators in every category and thus help to define the strategy for improving the overall DO value.

Step 4: Data transparency
Data Transparency for data.gov has received the value of 0.4548, which is the average data transparency level. The values for its three indices were determined as follows:
Data Authenticity has received a final value of 0.3823 according to Eq. (6). The DS feature has scored 0.6151. The F1 and F2 features were maximally scored, while there was a complete lack of data source descriptions and user grading, influencing the scores of 0.0151 and 0 for F3 and F4, respectively. For the F3 calculation, we determined a sample size in the same way as determining the sample sizes for the datasets. There are currently 213 publishers, and the calculated size of a sample was 66. F3 has received a 0.0151 score because only one out of 66 randomly chosen publishers had a textual description and, therefore, received a value of 1. The DAI feature has scored a value of 0.23713.
Understandability is calculated as indicated in Eq. (8), based on the DCD and DSD feature values, with the value of 0.4733. The DCD feature scored the value of 0 because no data category descriptions were available on the portal, while the DSD feature scored 0.7889.
The Data Reusability feature was estimated with the value of 0.5088, which is slightly over 50% according to the adopted DR scale.

Step 5: E-Gov openness index
We have obtained values for all of the indicators except for Participation and Collaboration. These values are sufficient for determining the openness level, but they are not sufficient for determining the precise eGovOI value. Therefore, the openness index value is represented as a possible range. According to the BDS, DO and T indicators, U.S. Open Government is currently at Level 2, which represents average openness, while the openness index could take its values from the range (51.23–59.03)%.
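Step 5's interval can be sketched with Eq. (9). Treating the unmeasured Participation and Collaboration indicators as bounded unknowns is our illustration; the default bounds are assumptions, not the paper's exact procedure:

```python
def egovoi(bds, o, t, p, c):
    """eGovOI, Eq. (9)."""
    return 0.15 * bds + 0.33 * o + 0.26 * t + 0.13 * p + 0.13 * c

def egovoi_range(bds, o, t, p_bounds=(0.0, 1.0), c_bounds=(0.0, 1.0)):
    """Possible eGovOI interval when P and C are only known to lie
    within the given bounds."""
    return (egovoi(bds, o, t, p_bounds[0], c_bounds[0]),
            egovoi(bds, o, t, p_bounds[1], c_bounds[1]))
```

With fully developed indicators, egovoi(1, 1, 1, 1, 1) returns 1.0; leaving P and C completely unknown widens any point estimate by up to 0.26, their combined weight.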
We have calculated the Transparency indicator as the average of DT, which was previously determined in Step 4 of the use case, and the GT indicator (the World Bank's CC index for the U.S.).

4.1. Discussion of the results

Based on the obtained results, we could conclude that data.gov has more work to perform with regard to the DO and DT indicators. Because the DO indicator has a higher coefficient in the equation for the eGovOI calculation, improvements regarding this indicator are likely to have a greater influence on making an advancement on the Openness Scale. As discussed previously, there are many categories that have scored low regarding data openness, and there are, as well, many indicators that point to potential weaknesses. Government officials could focus on those indicators when developing strategies for improving the data openness. Policy makers, on the other hand, could also benefit from those results because they could incorporate the study findings in new acts that regulate the process of publishing open data. This means that they could recommend that publishing organizations give special attention to certain data features that were recognized as weak in the analysis (e.g., data formats, data updates).

Data transparency analysis has demonstrated significant weaknesses regarding all three indicators: data authenticity, understandability and data reusability. The DAI feature of the Authenticity indicator has achieved a very low score (under 25%), which is a direct consequence of a lack of governmental rating of datasets and extremely rare user feedback. If governmental officials would provide rating information for each dataset, it would significantly increase the value of this indicator; additionally, it would increase the citizens' trust in the published data. Understandability has scored under 50% owing to a complete lack of data category descriptions. One of the recommendations for the data publishers could involve requiring mandatory descriptions of the data that belong to a certain category, because this step would significantly increase the clarity of the contained information. The data reusability's low score is a consequence of the data formats. Considering the increasing importance of semantics in every field, the data publishers could be advised to strive toward semantically enriching their data whenever possible.

5. Discussion and future remarks

The proposed OpenGovB model is developed according to the adopted Open Government concept, which recognizes open data, data transparency, government transparency, participation and collaboration as driving background ideas. The OpenGovB therefore consists of five indicators: Basic Dataset, Data Openness, Transparency, Participation and Collaboration, among which the first three are related to open data, while the latter two require the involvement of governmental officers and thus provide back-office feedback. Because the primary focus of this paper was on open data, we have performed a calculation of the data-oriented indicators (the first three). One of the significant advantages of this model is the ease of its application, because the data are easily accessible on open data portals.

Bannister (2007) has identified three practical issues related to the conceptualization of e-government benchmark models: the purpose, what is to be measured, and the type of benchmark. Considering the first issue, OpenGovB was first initiated for academic research purposes, but our intention and final goal is to apply it for tracking the progress of one government's openness over time and enabling comparison between governments. A government's openness is reflected through eGovOI and Maturity. Although these are the obvious results of the OpenGovB, when defining what is to be measured it is necessary to directly address open data, transparency, user participation and collaboration as building blocks of the adopted open government concept. Finally, regarding the type of the proposed benchmark, OpenGovB is intended to be standard oriented and to enable comparison of how different governments conform to established Open Government standards. It is currently oriented toward open data definitions (Open Government Working Group, 2007), which are implemented through the data openness and data transparency indicators.

OpenGovB introduced some constraints in the process of the eGovOI calculation. These constraints are reflected in the order of calculation of the indicators' values. In this way, transparency cannot be calculated before the basic dataset and data openness indicators. This arrangement further means that we have given priority to some indicators on the basis of their influence on overall government openness. It is evident that the foundations for government openness lie mainly in the first three indicators (BDS, data openness and transparency) and that the participation and collaboration indicators are rather independent. For this reason, we have given those three indicators priority in the calculation, because without them we cannot talk at all about the existence of an Open Government model.

Each of the indicators is represented by a developed formula, which is based on several recognized important aspects of each indicator and acts as a guide through the calculation process. The formulas are usually summations in which each of the summands is weighted with a chosen coefficient. We have decided to assign values to those coefficients according to our research outcomes regarding the importance of specific features, rather than to leave it to governments to choose their own values, for two main reasons. The first reason relates to developing a general model that could be used for comparing the openness level of different governments. The second reason is to avoid possible tuning of the results, which could be driven by the ambition of achieving higher scores. As we previously discussed, OpenGovB is standard oriented, and it is principally designed as a comparative framework that enables tracking and comparing of governments' openness progress. However, it is possible for it to be used as a self-assessment tool, but only to test the extent to which a government conforms to openness standards, and without any possibility of addressing the specific interests of that government.
Table 7
Activity theory-based OpenGovB.

Activity | Benchmarking EGOV | EGOV Benchmarking: focus on open data | Benchmarking the openness of public government data: basic dataset, data openness and data transparency
Subject | Benchmarker | Benchmarker | E-government researcher, Government or Official eGOV benchmarking organization
Object | Benchmarking purpose (purpose, policy, priorities) | Open data portal | Determine openness of government held data as a measure named DO Index, by analyzing published data on government open data portal
Artifact | Benchmarking approach | OpenGovB model | Define benchmark indicators, perform calculation and obtain measurable value of benchmark outcomes
Outcome | Benchmarking result | eGovOI and maturity | DOI ranking
Community | Benchmarking partners | e-government experts | researchers; could be extended to e-government institutions responsible for the open data portal, i.e., open data providers
Rules | Community rules | Open data policies | policies related to publication of open data on open government portals, open data definition
Roles | Partner roles | Partner roles | commitment of data publishers (agencies, government) to publish data on open data portals in accordance with Open data policies; commitment of data consumers to participate in the open data publication
Several questions arise concerning the OpenGovB model, starting with the following: How should the feasibility of the model be evaluated? Since an OpenGovB evaluation against several open governments had not been done at the time of writing, we will discuss the pragmatics of the model to provide a theoretical validation. Ojo, Janowski, and Estevez (2011) have proposed a benchmarking conceptualization for e-government based on Activity Theory (Almeida & Roque, 2000). The rationale behind using the Activity Theory in the e-government domain is that e-government is an activity situated within a government context, carried out by policy makers, strategists and researchers to achieve learning-oriented outcomes. To provide a theory-based verification of OpenGovB, we will provide a mapping between OpenGovB and the Activity Theory-based generic benchmarking model proposed by Ojo et al. (2011) (we will further refer to it as AT-EGovB) (Table 7). The conceptualization of AT-EGovB is based on mapping eight generic concepts of the Activity Theory — Activity, Subject, Artifact, Object, Outcome, Community, Roles and Rules — into the corresponding concepts in the e-government benchmarking domain.

Table 7 demonstrates the mappings between AT-EGovB dimensions and OpenGovB dimensions based on core Activity Theory concepts. In other words, it provides a step-by-step guide for the practical application of the OpenGovB model. Breaking the Benchmark Activity down into its fundamental constituents, each of which is associated with a specific OpenGovB aspect, it acts as a comprehensive framework for planning the realization of each benchmarking step. However, OpenGovB needs to be confirmed in practice. Although accompanied by examples regarding the U.S. government's open data portal, to be confirmed and broadly accepted OpenGovB should be applied to several governments that are in different developmental stages and thus provide a comparative framework for tracking the openness progression. The example presented in this paper illustrates in detail the practical approach to benchmark application. As part of our future work, we plan to apply OpenGovB to several governments from developed and developing countries and thus obtain the necessary information regarding the value and utility of this model.

Considering the practical implementation of OpenGovB, the following question arises: Could maturity be calculated based solely on eGovOI and the amount of time that has passed? The answer to this question cannot be summarized in one word, "yes" or "no"; instead, the answer is rather descriptive. There are many different factors in different countries that could affect the Maturity component in an Open Government model. One of the most obvious and common factors is budgetary limi-

References

Akman, I., Yazici, A., Mishra, A., & Arifoglu, A. (2005). E-government: A global view and an empirical evaluation of some attributes of citizens. Government Information Quarterly, 22(2), 239–257.

Almeida, A., & Roque, L. (2000). Simpler, better, faster, cheaper, contextual: Requirements analysis for methodological approach to interaction systems development. Paper presented at the 8th European Conference on Information Systems, Trends in Information and Communication Systems for the 21st Century, Vienna (pp. 17–22).

Åström, J., Karlsson, M., Linde, J., & Pirannejad, A. (2012). Understanding the rise of e-participation in non-democracies: Domestic and international factors. Government Information Quarterly, 29(2), 142–150.

Bannister, F. (2007). The curse of the benchmark: An assessment of the validity and value of e-government comparisons. International Review of Administrative Sciences, 73(2), 171–188.

Baum, C., & Di Maio, A. (2000). Gartner's four phases of e-government model. Gartner Group (Retrieved from https://www.gartner.com/doc/317292).

Berners-Lee, T. (2010). Linked data. Retrieved March 17, 2012, from http://www.w3.org/DesignIssues/LinkedData.html

Britain, G., & Treasury, H. M. (2009). Putting the frontline first: Smarter government. Great Britain: Stationery Office (Retrieved from www.official-documents.gov.uk/document/cm77/7753/7753.pdf).

Campaign for Vermont (2013). Achieving accountability: Transforming state government into a modern, transparent 21st century system. Retrieved from http://www.campaignforvermont.org/pdfs/06.03.13_CFV_Achieving_Accountability.pdf

Capgemini (2007). The user challenge, benchmarking the supply of online public services: Report of the seventh measurement. EU: Directorate General for the Information Society (Retrieved from http://www.epractice.eu/node/281763).

Csetenyi, A. (2000). Electronic government: Perspectives from e-commerce. Paper presented at the 11th International Workshop on Database and Expert Systems Applications (pp. 294–298).

Eaves, D. (2009). The three laws of open data. Retrieved March 17, 2012, from http://gov2.net.au/blog/2009/10/20/the-three-laws-of-open-data

Eggers, W. D. (2007). Government 2.0: Using technology to improve education, cut red tape, reduce gridlock, and enhance democracy. Rowman & Littlefield Publishers.

Gant, J., & Turner-Lee, N. (2011). Government transparency: Six strategies for more open and participatory government. Washington, DC: The Aspen Institute.

Gruen, N. (2009). Engage: Getting on with government 2.0. Report of the Government 2.0 Taskforce. Australian Government Information Management Office (Retrieved from http://www.finance.gov.au/publications/gov20taskforcereport/).

Gupta, M. P., & Jana, D. (2003). E-government evaluation: A framework and case study. Government Information Quarterly, 20(4), 365–387.

Hollyer, J., Rosendorff, B., & Vreeland, J. (2013). Measuring transparency. Retrieved December, 2012, from http://ssrn.com/abstract=2113665

Huijboom, N., & Van den Broek, T. (2011). Open data: An international comparison of strategies. European Journal of ePractice, 12(1), 1–13.

Hunter, D., & Jupp, V. (2001). E-government leadership. Rhetoric vs reality — Closing the gap.
Accenture (Retrieved from www.epractice.eu/files/media/media_846.pdf).
tations, but the factors could also be political, technical or other aspects.
Jaeger, P. T., & Bertot, J. C. (2010). Transparency and technological change: Ensuring equal
However, to be able to apply the model uniformly on all governments and sustained public access to government information. Government Information
and later compare and analyze them, we consider that the best solution Quarterly, 27(4), 371–376.
is to keep the calculation as general as possible. Further on, we are Jansen, A. (2005). Assessing e-government progress — Why and what. Paper presented at
the Norwegian Conference on the Use of IT in Organizations — NOKOBIT, Bergen.
intrigued with the question: What happens when a Government Kalampokis, E., Tambouris, E., & Tarabanis, K. (2011). Open government data: A stage
reaches 100%? The idea behind a model is that a government will model for electronic government. Paper presented at the 10th IFIP WG 8.5 international
most likely never reach 100% owing to the constant improvement and conference on Electronic government (pp. 235–246).
Karlsson, F., Holgersson, J., Söderström, E., & Hedström, K. (2012). Exploring user partici-
expansion of Open Government concepts. However, if it does, then the pation approaches in public e-service development. Government Information
Maturity should be treated as a constant. Quarterly, 29(2), 158–168.
Concerning the calculation of government transparency, we could Kotler, P., Kartajaya, H., & Setiawan, I. (2010). Marketing 3.0: From products to customers to
the human spirit. John Wiley & Sons.
ask: Is corruption a significant aspect for measuring Government Trans- Lambsdorff, J. G. (2007). The methodology of the corruption perceptions index 2007.
parency? Corruption is obviously an important aspect of government Retrieved from. http://www.stt.lt/documents/soc_tyrimai/KSI_methodology_2007.
transparency, but it is not the only aspect. A combination of several pdf
Lathrop, D., & Ruma, L. (2010). Open government: Collaboration, transparency, and participation
different indices, measuring several distinct GT aspects, would be a
in practice. Sebastopol: O'Reilly Media.
better choice for the implementation of this indicator. Here, we have Lee, G., & Kwak, Y. H. (2011, June). Open government implementation model: A stage
presented only a simplified version of the indicator, leaving more model for achieving increased public engagement. Paper presented at the 12th Annual
International Digital Government Research Conference: Digital Government Innovation
complex calculations for future model development.
in Challenging Times (pp. 254–261).
Having in mind the impact of ICT on a society in general along with all Martin, P. P. (2013, March 8). Open government beyond open data and transparen-
of the benefits and facilities that it brings to common activities, a signifi- cy. Retrieved April, 2012, from. http://blog.opengovpartnership.org/2013/03/
cant milestone of this research is the development of a benchmarking open-government-beyond-open-data-and-transparency/
McClure, D. (2011). Citizen engagement keynote. [PowerPoint slides]. Retrieved from.
framework that would enable a semi-automated assessment of the Open- http://www.slideshare.net/FedScoop/dr-david-mcclure-associate-administrator-
ness level in a fast and reliable manner. The evaluation performed on office-of-citizen-services-and-innovative-technologies-gsa
McDermott, P. (2010). Building open government. Government Information Quarterly, 27(4), 401–413.
Ojo, A., Janowski, T., & Estevez, E. (2011). Building theoretical foundations for electronic governance benchmarking. Paper presented at the 10th IFIP Conference on Electronic Government, Delft (pp. 13–25).
Open data portals (2011). Retrieved from http://www.data.gov/opendatasites
Open Government Partnership (2012). About. Retrieved September, 2012, from http://www.opengovpartnership.org/about
Open Government Standards (2012). Standards. Retrieved September, 2012, from http://www.opengovstandards.org
Open Government Working Group (2007). Open government data principles. Retrieved March, 2012, from http://www.opengovdata.org/home/8principles
Orszag, P. R. (2009). Memorandum for the heads of executive departments and agencies: Open government directive. Washington, DC: Executive Office of the President (Retrieved from http://www.whitehouse.gov/sites/default/files/omb/assets/memoranda_2010/m10-06.pdf).
Parycek, P., & Sachs, M. (2010). Open government — information flow in Web 2.0. European Journal of ePractice, 9(1), 1–70.
Ren, G. J., & Glissmann, S. (2012). Identifying information assets for open data: The role of business architecture and information quality. Paper presented at the 14th International Conference on Commerce and Enterprise Computing (pp. 94–100).
Roberts, D. (2007). Leadership in customer service: Delivering on the promise. Accenture (Retrieved from http://nstore.accenture.com/acn_com/PDF/2007LCSReport_DeliveringPromiseFinal.pdf).
Rohwer, A. (2009). Measuring corruption: A comparison between Transparency International's Corruption Perceptions Index and the World Bank's Worldwide Governance Indicators. Retrieved from http://econpapers.repec.org/article/cesifodic/v_3a7_3ay_3a2009_3ai_3a3_3ap_3a42-52.htm
Rorissa, A., Demissi, D., & Pardo, T. (2011). Benchmarking e-government: A comparison of frameworks for computing e-government index and ranking. Government Information Quarterly, 28(3), 354–362.
Sakowicz, M. (2004). How should e-government be evaluated? Different methodologies and methods. NISPAcee Occasional Papers in Public Administration and Public Policy, 5(2), 18–25.
Salem, F. (2007). Benchmarking the e-government bulldozer: Beyond measuring the tread marks. Measuring Business Excellence, 11(4), 9–22.
Sandoval-Almazán, R. (2011). The two door perspective: An assessment framework for open government. eJournal of eDemocracy & Open Government, 3(2), 166–181.
Schellong, A. (2009). EU eGovernment benchmarking 2010+: General remarks on the future of benchmarking digital government in the EU. Retrieved March 17, 2012, from http://www.iq.harvard.edu/blog/netgov/papers/schellong_2009_wp_eu_egovernment_benchmarking_future_methodology.pdf
Seifert, J. W. (2003, January). A primer on e-government: Sectors, stages, opportunities, and challenges of online governance. Washington, DC: Congressional Research Service (Retrieved from http://www.fas.org/sgp/crs/RL31057.pdf).
Sematech, N. I. S. T. (2012). e-Handbook of statistical methods. Retrieved December 17, 2012, from http://www.itl.nist.gov/div898/handbook/
Siddiqi, J., Akhgar, B., Gamble, T., & Zaefarian, G. (2006). A framework for increasing participation in e-government. Paper presented at the International Conference on E-Learning, E-Business, Enterprise Information Systems, E-Government, & Outsourcing, Las Vegas, Nevada (pp. 60–66).
Socrata (2011). Open data benchmark study report. Retrieved March 17, 2012, from http://www.socrata.com/benchmark-study/download-report/
Tambouris, E., Gorilas, S., Spanos, E., Ioannides, A., & Lopez, G. (2001). European cities platform for realising online transaction services. Paper presented at the 14th Bled Electronic Commerce Conference, Bled (pp. 198–214).
Tat-Kei Ho, A. (2002). Reinventing local governments and the e-government initiative. Public Administration Review, 62(4), 434–444.
Veljković, N., Bogdanović-Dinić, S., & Stoimenov, L. (2011). eGovernment openness index. Paper presented at the 11th European Conference on eGovernment, University of Ljubljana, Ljubljana (pp. 571–577).
Veljković, N., Bogdanović-Dinić, S., & Stoimenov, L. (2011). Municipal open data catalogues. Paper presented at the Conference for E-Democracy and Open Government, Danube University Krems, Krems an der Donau (pp. 195–207).
Veljković, N., Bogdanović-Dinić, S., & Stoimenov, L. (2012). Web 2.0 as a technological driver of democratic, transparent and participatory government. In C. G. Reddick & S. K. Aikins (Eds.), Web 2.0 technologies and democratic governance: Political, policy and management implications (pp. 137–151). New York: Springer.
West, D. M. (2001). WMRC global e-government survey. Taubman Center for Public Policy (Retrieved December, 2012, from http://www.insidepolitics.org/egovt01int.html).
White House (2009, January 21). Memorandum on transparency and open government. Washington, DC: White House (Retrieved from http://edocket.access.gpo.gov/2009/pdf/E9-1777.pdf).
White House (2009, January 21). Memorandum on freedom of information act. Washington, DC: White House (Retrieved from http://edocket.access.gpo.gov/2009/pdf/E9-1777.pdf).

Nataša Veljković received the BSc and MSc degrees in computer science at the University of Niš, Serbia. She is currently working as a Teaching Assistant at the Faculty of Electronic Engineering with the Department of Computer Science. Her areas of research include e-government, e-systems, sensor web and GIS.

Sanja Bogdanović-Dinić received the BSc and MSc degrees in computer science at the University of Niš, Serbia. She is currently a PhD student at the Faculty of Electronic Engineering and a scholar of the Ministry of Science and Technology Development. Her research involves sensor web, GIS and e-systems.

Leonid Stoimenov received the BSc, MSc and PhD degrees in computer science at the University of Niš, Serbia. He is currently an Associate Professor and Head of the Computer Science Department at the Faculty of Electronic Engineering at this university. His research interests in computer science include e-systems, GIS, databases, ontologies and semantic interoperability. He is a member of IEEE and IAENG and a representative in the AGILE association of GIS laboratories in Europe.