PLDI 2018 includes an artifact evaluation process, continuing a practice that began at PLDI 2014 and was repeated at PLDI 2015, 2016, and 2017. Artifact evaluation gives authors the opportunity to submit for evaluation any artifacts that accompany their papers.

Submit the artifact for your accepted paper to artifact evaluation via HotCRP.

Background

A paper consists of a constellation of artifacts that extend beyond the document itself: software, proofs, models, test suites, benchmarks, and so on. In some cases, the quality of these artifacts is as important as that of the document itself, yet most of our conferences offer no formal means to submit and evaluate anything but the paper. PLDI 2018 has created an Artifact Evaluation Committee (AEC) to remedy this situation.

Goals

The goal of artifact evaluation is twofold: to reward and to probe. It rewards authors who take the trouble to create useful artifacts to accompany the work in their paper, and it probes how well those artifacts support the claims the paper makes.

The aspirational goal of the artifact evaluation process is to accept every artifact that is submitted, provided it meets the evaluation criteria listed below. Artifact evaluation is optional, and authors opt in only after their paper has been accepted.

Criteria

The artifact evaluation committee will read each artifact’s paper and judge how well the artifact conforms to the expectations set by the paper. The specific artifact evaluation criteria are:

  • consistency with the paper
  • completeness
  • documentation
  • ease of reuse

Note that artifacts will be evaluated with respect to the claims and presentation in the submitted version of the paper, not the camera-ready version.

Benefits

The evaluation and dissemination of artifacts improves reproducibility and enables researchers to build on each other's work. Beyond helping the community as a whole, it confers several direct and indirect benefits on the authors themselves. Papers with accepted artifacts receive an official ACM artifact evaluation badge on the first page of the camera-ready version, indicating that the artifact was evaluated and found functional, and that the authors took the extra time and underwent the extra scrutiny needed to prepare a useful artifact.

Process

To keep paper review and artifact review separate, authors will be asked to upload their artifacts only after their papers have been accepted. Authors planning to submit to artifact evaluation should nevertheless prepare their artifacts well in advance of the submission deadline, to ensure adequate time for packaging and documentation.

Throughout the artifact review period, submitted reviews will be (approximately) continuously visible to authors. Reviewers will be able to interact continuously (and anonymously) with authors for clarifications, system-specific patches, and other logistical help in making the artifact evaluable. The goal of this continuous interaction is to avoid rejecting artifacts over a “wrong library version”-type problem. The conference proceedings will include a discussion of the continuous artifact evaluation process.

Artifact Details

The artifact evaluation will accept any artifact that authors wish to submit, broadly defined. A submitted artifact might be:

  • software
  • mechanized proofs
  • test suites
  • data sets
  • hardware (if absolutely necessary)
  • a video of a difficult- or impossible-to-share system in use
  • any other artifact described in a paper

A well-packaged artifact is more likely to be easy for reviewers to use, saving them time and frustration and conveying the value of your work more clearly during evaluation. A great way to package an artifact is as a Docker image or a virtual machine that runs “out of the box” with very little system-specific configuration.
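As a rough illustration, an artifact packaged as a Docker image might ship with a Dockerfile along the following lines, so that reviewers can rebuild and run everything with a single docker build and docker run. This is only a sketch under assumed names: the make-based build and the run_all.sh entry point are placeholders for whatever your artifact actually provides.

    # Hypothetical packaging sketch; adapt the base image, dependencies,
    # and entry point to your artifact.
    FROM ubuntu:16.04
    RUN apt-get update && apt-get install -y build-essential python
    COPY . /artifact            # copy the artifact sources into the image
    WORKDIR /artifact
    RUN make                    # build the tool described in the paper
    CMD ["./run_all.sh"]        # one command that regenerates the paper's results

The point of such a setup is that a reviewer needs no system-specific configuration beyond Docker itself; everything the artifact depends on is pinned inside the image.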

Submission of an artifact does not grant tacit permission to make its content public. AEC members will be instructed that they may not publicize any part of your artifact during or after evaluation, nor retain any part of it once evaluation is complete. You are therefore free to include models, data files, proprietary binaries, and similar items in your artifact. The AEC organizers strongly encourage you to anonymize artifacts: if your VM has a default username that reveals your identity, the anonymity of the review process is compromised. Artifact evaluation is single-blind. Please take precautions (e.g., turning off analytics and logging) to help avoid accidentally learning the identities of reviewers.

Artifact Evaluation Committee

Other than the chairs, the AEC members are senior graduate students, postdocs, or recent PhD graduates, identified with the help of current, active researchers.

Qualified graduate students are often in a much better position than more senior researchers to handle the diversity of systems and expectations that the AEC will encounter. In addition, graduate students represent the future of the community, so involving them in the AEC process early helps push the process forward. The AEC chairs devote considerable attention to both mentoring and monitoring, helping to educate the students about their responsibilities and privileges.

Highlights of the 2018 Artifact Evaluation Process

  • The goal of the AEC is to accept every artifact submitted. While some artifacts may not pass muster and will be rejected, we will evaluate each submission in earnest and make our best attempt to follow the authors’ evaluation instructions.

  • Throughout the review period, as described above, reviews will be (approximately) continuously visible to authors, and AEC reviewers will be able to interact (anonymously) with authors for clarifications, system-specific patches, and other logistical help, so that artifacts are not rejected over a “wrong library version”-type problem.

  • We expect each artifact to receive 3-4 reviews.

  • Reviews will be submitted via HotCRP.

Important Dates

  • February 27th (extended from February 20th): Author artifact submission deadline
  • March 9th (extended from March 5th): Basic artifact functionality evaluation deadline
  • April 3rd: Final artifact evaluation review deadline
  • April 10th: Acceptance notification