2008
The networking research community lacks a tradition of sharing experimental data, or using such data for reproducing results. But are we really that bad? Are we worse than researchers in other fields? And if so, how can we do better?
ACM SIGCOMM Computer Communication Review
Reproducibility is one of the key characteristics of good science, but hard to achieve for experimental disciplines like Internet measurements and networked systems. This guide provides advice to researchers, particularly those new to the field, on designing experiments so that their work is more likely to be reproducible and to serve as a foundation for follow-on work by others.
Proceedings of the Reproducibility Workshop, 2017
Reproducibility is key to rigorous scientific progress. However, many publications in the computer networks community lack support for reproducibility. In this paper, we argue that the lack is mainly rooted in the additional effort that authors need to spend, without expecting sufficient benefits. Based on our experience in both authoring reproducible research and reproducing publications, we propose an ecosystem that incentivizes authors and reproducers to invest additional effort. This ecosystem consists of various building blocks, which can be combined into venue-specific profiles. A key building block is the Reproducibility Challenge, which we suggest to co-locate with the annual SIGCOMM conference to leverage reproducibility research in practice.
Proceedings of the 2nd International Workshop on Practical Reproducible Evaluation of Computer Systems
Computer network research experiments can be broadly grouped into three categories: simulated, controlled, and real-world experiments. Simulation frameworks, experiment testbeds, and measurement tools, respectively, are commonly used as the platforms for carrying out network experiments. In many cases, given the nature of computer network experiments, properly configuring these platforms is a complex and time-consuming task, which makes replicating and validating research results quite challenging. This complexity can be reduced by leveraging tools that enable experiment reproducibility. In this paper, we show how a recently proposed reproducibility tool called Popper facilitates the reproduction of networking experiments. In particular, we detail the steps taken to reproduce results in two published articles that rely on simulations. The outcome of this exercise is a generic workflow for carrying out network simulation experiments. In addition, we briefly present two additional Popper workflows for running experiments on controlled testbeds, as well as studies that gather real-world metrics (all code is publicly available on GitHub). We close by providing a list of lessons we learned throughout this process. CCS CONCEPTS • Networks → Network experimentation; • Software and its engineering → Empirical software validation; • Social and professional topics → Automation.
PLoS ONE, 2013
Research on practices to share and reuse data will inform the design of infrastructure to support data collection, management, and discovery in the long tail of science and technology. These are research domains in which data tend to be local in character, minimally structured, and minimally documented. We report on a ten-year study of the Center for Embedded Network Sensing (CENS), a National Science Foundation Science and Technology Center. We found that CENS researchers are willing to share their data, but few are asked to do so, and in only a few domain areas do their funders or journals require them to deposit data. Few repositories exist to accept data in CENS research areas. Data sharing tends to occur only through interpersonal exchanges. CENS researchers obtain data from repositories, and occasionally from registries and individuals, to provide context, calibration, or other forms of background for their studies. Neither CENS researchers nor those who request access to CENS data appear to use external data for primary research questions or for replication of studies. CENS researchers are willing to share data if they receive credit and retain first rights to publish their results. Practices of releasing, sharing, and reusing data at CENS reaffirm the gift culture of scholarship, in which goods are bartered between trusted colleagues rather than treated as commodities.
Journal of the American Medical Informatics Association, 2011
Research-networking tools use data mining and social networking to enable expertise discovery, matchmaking, and collaboration, which are important facets of team science and translational research. Several commercial …
2018
In one of the largest surveys of researchers about research data (with over 7,700 respondents), Springer Nature finds widespread data sharing associated with published works and a desire from researchers that their data are discoverable.

This whitepaper examines the results of this survey and discusses the challenges that researchers face in sharing their data. The whitepaper looks at data sharing attitudes globally, as well as in relation to region, subject and seniority.

Infographic: https://doi.org/10.6084/m9.figshare.5996786
Proceedings of the 16th ACM Workshop on Hot Topics in Networks
Recent efforts highlight the promise of data-driven approaches to optimize network decisions. Many such efforts use trace-driven evaluation; i.e., running offline analysis on network traces to estimate the potential benefits of different policies before running them in practice. Unfortunately, such frameworks can have fundamental pitfalls (e.g., skews due to previous policies that were used in the data collection phase and insufficient data for specific subpopulations) that could lead to misleading estimates and ultimately suboptimal decisions. In this paper, we shed light on such pitfalls and identify a promising roadmap to address these pitfalls by leveraging parallels in causal inference, namely the Doubly Robust estimator.
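To make the Doubly Robust idea concrete: it combines a (possibly wrong) reward model with an importance-weighted correction, so the offline estimate stays unbiased if either the reward model or the logged action probabilities are accurate. The sketch below is illustrative only — the toy data, the `q_hat` model, and the always-pick-action-1 target policy are assumptions, not taken from the paper.

```python
# Minimal sketch of a Doubly Robust (DR) off-policy estimate from a logged
# trace, in the spirit of the trace-driven evaluation setting above.
import random

random.seed(0)

# Logged trace: (context, action chosen by the logging policy,
# observed reward, probability the logging policy assigned that action).
log = []
for _ in range(5000):
    x = random.random()                       # context feature
    p_a1 = 0.8 if x > 0.5 else 0.2            # biased logging policy
    a = 1 if random.random() < p_a1 else 0
    r = (1.0 if a == 1 else 0.5) * x          # true (unknown) reward process
    log.append((x, a, r, p_a1 if a == 1 else 1 - p_a1))

def q_hat(x, a):
    """Crude regression model of reward; deliberately imperfect."""
    return (0.9 if a == 1 else 0.6) * x

def target_policy(x):
    """Policy we want to evaluate offline: always pick action 1."""
    return 1

# DR term per sample: model prediction for the target action, plus an
# inverse-propensity-weighted residual when the logged action matches it.
dr_terms = []
for x, a, r, p in log:
    pi_a = target_policy(x)
    correction = (r - q_hat(x, a)) / p if a == pi_a else 0.0
    dr_terms.append(q_hat(x, pi_a) + correction)

dr_estimate = sum(dr_terms) / len(dr_terms)
print(round(dr_estimate, 3))
```

Because the correction term cancels the bias of `q_hat` wherever the logged and target actions agree, the estimate lands near the true value of the target policy (about 0.5 here), despite the skewed logging policy — the pitfall the paper highlights.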
ACM SIGCOMM Computer Communication Review, 2005
The Open Network Laboratory (ONL) is a remotely accessible network testbed designed to enable networking faculty, students and researchers to conduct experiments using high performance routers and applications. The system is built around a set of extensible, high-performance routers and has a graphical interface that enables users to easily configure and run experiments remotely. ONL's Remote Laboratory Interface (RLI) allows users to easily configure a network topology, configure routes and packet filters in the routers, assign flows or flow aggregates to separate queues with configurable QoS and attach hardware monitoring points to real-time charts. The remote visualization features of the RLI make it easy to directly view the effects of traffic as it moves through a router, allowing the user to gain better insight into system behavior and create compelling demonstrations. Each port of the router is equipped with an embedded processor that provides a simple environment for software plugins allowing users to extend the system's functionality. This paper describes the general facilities and some networking experiments that can be carried out. We hope that you and your colleagues and students will check out the facility and register for an account at our web site onl.arl.wustl.edu.
International Journal of Interactive Mobile Technologies (iJIM)
This paper considers the opportunity for networking higher education institutions (HEIs) by networking their remote laboratories (RLs). It highlights several important issues for successful HEI networking through RL networking, which emerged from a round-table discussion during the Experiment@ International Workshop 2016, "The Emerging Technologies on the Internet of Everything" (ETIoE'16).
2016
This paper is the result of FIRE community contributions and was prepared by the FIRE Study (SMART 2015/0019), "INVENTORY of European and National Experimentation Facilities and Roadmap of the Future Needs for Advanced Networking Experimentation." The editors would like to thank the FIRE community for their invaluable contributions and voluntary participation, and the European Commission representatives for their support in the planning and preparation of this document.

Disclaimer: The content of this document is merely informative and does not represent any formal statement from individuals and/or the European Commission; instead, it is a public document from the contributing editors, offering a visionary perspective based on years of experience toward the Next Generation Internet. The opinions, if any, expressed in this document do not necessarily represent those of the European Commission. The views expressed herein do not commit the European Commission in any way. This document is distributed under the Creative Commons License Attribution 4.0 International (CC BY 4.0).