Artifacts Evaluation
In support of ACM's effort to promote reproducibility in its Digital Library, the annual Workshop on ns-3 (WNS3) will review published papers for artifacts and award badges according to ACM guidelines. The artifact review will be conducted by the technical program committee (TPC) as part of the review process.
ACM has defined three badges for artifact review and two badges for reproducing results. Although five levels of badging are available from ACM, WNS3 will review only to the first level (Artifacts Available, v1.1), as described below.
Artifacts for WNS3 papers may include the following (one possible repository layout is sketched after this list):
- Simulation programs
- Simulation execution scripts
- Scripts to plot data
- Raw simulation data
- Associated measurement data
- Training data for machine learning models
- Exported machine learning models
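For illustration only, a hypothetical artifact repository covering several of these items might be laid out as follows (all file and directory names are invented examples, not requirements):

    wns3-paper-artifacts/         (hypothetical repository)
    ├── README.md                 (build, run, and plotting instructions)
    ├── sim/my-simulation.cc      (simulation program, for ns-3's scratch/ directory)
    ├── run-experiments.sh        (simulation execution script)
    ├── plot-results.sh           (script to plot data)
    ├── data/raw/                 (raw simulation data)
    └── models/                   (exported machine learning models, if any)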
Artifacts Available, v1.1
The following definition is interpreted in the context of Workshop on ns-3 papers and its review committee. Italicized text is copied from the ACM website.
ACM definition: Author-created artifacts relevant to this paper have been placed on a publicly accessible archival repository. A DOI or link to this repository along with a unique identifier for the object is provided.
We do not mandate the use of specific repositories. Publisher repositories (such as the ACM Digital Library), institutional repositories, or open commercial repositories (e.g., figshare or Dryad) are acceptable. In all cases, repositories used to archive data should have a declared plan to enable permanent accessibility. Personal web pages are not acceptable for this purpose.
Artifacts do not need to have been formally evaluated in order for an article to receive this badge. In addition, they need not be complete in the sense described above. They simply need to be relevant to the study and add value beyond the text in the article. Such artifacts could be something as simple as the data from which the figures are drawn, or as complex as a complete software system under study.
Applicability to WNS3: According to ACM's notes on this badge, reviewers are asked to look in the paper for evidence of available artifacts and to assess the permanence of the repository. Publicly accessible GitHub or GitLab branches are acceptable. Compressed archives (zip, tar) published on an institutional website are acceptable, although ACM has declared that personal web pages are not. At the time of review, artifacts may be under submission or intended for submission to a public repository such as the ACM Digital Library itself. In that case, the decision to award the badge will be deferred until the artifacts are posted. If the authors do not plan to submit artifacts to the ACM Digital Library upon paper publication, and the camera-ready version does not otherwise note a URL, DOI, or other public resource where the artifacts are available, the badge will not be awarded.
Reviewers should check that published URLs are not empty and, if simulation code is provided, that it compiles and its scripts run without noticeable errors. Otherwise, artifacts should be reviewed for “relevance” and “value added” as defined above by ACM, without going deeper into a formal functional review (next section). If problems arise while working with the provided software, the reviewer may attempt to resolve them with assistance from the authors, as mediated by the TPC chairs (see Review Process below). A minimal smoke test for ns-3-based artifacts is sketched below.
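For ns-3-based artifacts, such a smoke test might look like the following, assuming the artifact ships a simulation program meant for ns-3's scratch/ directory (the release version and the program name my-simulation.cc are placeholders, not taken from any particular paper):

    # Fetch and unpack a recent ns-3 release (version shown is only an example)
    wget https://www.nsnam.org/releases/ns-allinone-3.41.tar.bz2
    tar xjf ns-allinone-3.41.tar.bz2
    cd ns-allinone-3.41/ns-3.41

    # Copy the artifact's simulation program into scratch/
    cp /path/to/artifact/my-simulation.cc scratch/

    # Configure and build (the ./ns3 wrapper replaced ./waf as of ns-3.36)
    ./ns3 configure --enable-examples
    ./ns3 build

    # Run the program and confirm it finishes without errors
    ./ns3 run my-simulation

If the artifact targets a release older than ns-3.36, the same steps apply with ./waf in place of ./ns3.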
Review process
Artifact review will be conducted by TPC members. If a reviewer seeks outside help or delegates the review to someone else, that delegate should be listed as part of the committee unless they prefer to be anonymous.
If an artifact review badge is granted by the committee, completed reviews will be publicly archived on the WNS3 web site.
If a reviewer is unable to access a URL or archive, or to successfully compile or run software, they may report the problem to the WNS3 TPC chairs, who will relay the message to the authors to attempt to resolve it. The purpose of relaying communications through the TPC chairs is to protect the anonymity of the artifact reviewer.
If a more formal review of artifacts requires access to specialized computing resources, reviewers may request access, and authors may either decline the request or take steps to facilitate the reviewer's access.
Each paper will receive at least two reviews from committee members having no conflicts of interest with the authors. Each review will recommend either awarding or not awarding the badge. After reviews are complete, the artifact review committee will meet and attempt to reach a decision based on the reviews. Members with a conflict on a given paper will recuse themselves from its discussion. In the case of conflicting reviews, the committee may seek additional reviews to resolve the decision, or may lean one way or the other based on how similar decisions were made for other papers. The committee will apply consistent evaluation criteria across all papers. Authors may raise any concern about the evaluation outcome with the TPC chairs for further consideration. Decisions will be conveyed to the TPC chairs and the Proceedings Chair (for coordination with the ACM Digital Library).
Papers may be initially reviewed for Artifacts Available level, and later re-evaluated (by some future reviewer or committee) for Artifacts Functional and Artifacts Reusable levels.
Reviewers will complete and return the review form to the TPC co-chairs.
Author guidelines
Please provide a URL in your submission from which reviewers may download the artifacts for review, and include instructions on how to use the artifacts to generate the paper's data or plots.
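For example, such instructions might read as follows (a hypothetical sketch: the repository URL, script names, and options are placeholders invented for illustration):

    # Fetch the artifacts (URL is a placeholder)
    git clone https://gitlab.com/example-user/wns3-paper-artifacts.git
    cd wns3-paper-artifacts

    # Regenerate the raw data behind the paper's figures
    ./run-experiments.sh --output data/raw/

    # Regenerate the plots from the raw data
    ./plot-results.sh data/raw/ figures/

Instructions of roughly this shape give reviewers a clear path from the published artifacts to the paper's data and plots.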