CHES 2024 aims to support open and reproducible research within the
field of cryptography. As such, authors of papers accepted to CHES 2024 are
invited to submit artifacts associated with their papers, such as software or
datasets, for review, in a collaborative process between authors and the
artifact review committee.
IACR CHES Artifact Badges
New in 2024, authors can choose to have their artifacts evaluated by
the CHES Artifact Evaluation Committee (AEC) against three badges: Artifacts
Available, Artifacts Functional, and Results Reproduced. Each evaluation is
optional. This system broadly follows the conventions established in recent years in
security research conferences
such as USENIX Security and NDSS.
IACR CHES Artifacts Available: To earn this badge, the AEC must judge
that artifacts associated with the paper have been made available for
retrieval. Other than making the artifacts available, this badge does
not mandate any further requirements on functionality, correctness,
or documentation. This is intended for authors who simply wish to
make some supplementary material available that supports their paper.
Examples include data sets, large appendices, and other
documentation.
IACR CHES Artifacts Functional: To earn this badge, the AEC must
judge that the artifacts conform to the expectations set by the paper
in terms of functionality, usability, and relevance. The AEC will
consider four aspects of the artifacts in particular.
Documentation: are the artifacts sufficiently documented to
enable them to be exercised by readers of the paper?
Completeness: do the submitted artifacts include all of the key
components described in the paper?
Exercisability: do the submitted artifacts include the scripts and
data needed to run the experiments described in the paper, and
can the software be successfully executed?
Reusability: are the artifacts not just functional, but of sufficient
quality that they could be extended and reused by others?
IACR CHES Results Reproduced: To earn this badge, the AEC must judge
that they can use the submitted artifacts to obtain the main results
presented in the paper. In short, is it possible for the AEC to
independently repeat the experiments and obtain results that support
the main claims made by the paper? The goal of this effort is not to
reproduce the results exactly but instead to generate results
independently within an allowed tolerance such that the main claims
of the paper are validated.
Examples of artifacts in the field of cryptography include:
Software implementations (performance, formal verification, etc.):
The source code of the implementation; a list of all dependencies
required; the test harness; instructions on how to build and run the
software and the test harness; a description of the platform on which
the results in the paper were obtained; and instructions or scripts
to process the output of the test harness into appropriate summary
statistics (a brief sketch of such a script follows this list).
Hardware implementations, physical attacks against implementations: A
precise description of any physical equipment used in the setup; the
source code of any software developed for the experiment; a list of
all dependencies required; instructions on how to build the software
and run the device or carry out the attack; instructions or scripts
to process the output and interpret the results.
Data or other non-code artifacts: Documents or reports in a widely
used non-proprietary format, such as PDF, ODF, HTML, text; data in
machine-readable format such as CSV, JSON, XML, with appropriate
metadata describing the schema; scripts used to process the data into
summary form. Where non-standard data formats cannot be avoided,
authors should include suitable viewing software.
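To make the post-processing point concrete for the software case, the script that turns raw harness output into summary statistics can be very small. The following is a minimal sketch, not drawn from any particular paper: it assumes, purely for illustration, that the test harness writes one integer cycle count per line to a file such as timings.txt (both the file name and format are placeholders).

```python
#!/usr/bin/env python3
"""Hypothetical post-processing script (illustrative only): turn raw
test-harness output into the summary statistics reported in a paper."""
import statistics
import sys


def summarize(path):
    # Assumes, for illustration, one integer cycle count per line.
    with open(path) as f:
        samples = [int(line) for line in f if line.strip()]
    return {
        "samples": len(samples),
        "median_cycles": statistics.median(samples),
        "mean_cycles": statistics.mean(samples),
        "stdev_cycles": statistics.stdev(samples) if len(samples) > 1 else 0.0,
    }


if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "timings.txt"
    for key, value in summarize(path).items():
        print(f"{key}: {value}")
```

Run with the (hypothetical) harness output file as its argument, the script prints the sample count, median, mean, and standard deviation, which can then be compared against the figures in the paper.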
Where possible, such as in software-based artifacts relying solely on
open-source components, the artifact review process will aim to run the
artifact and test harness, and check that they produce the outputs
required to assess the artifact against the results in the paper. For
artifacts that depend on commercial tools or specialized physical
hardware, the goal of the artifact review process will be to confirm
that the artifacts are functional (should the submitters wish to be
evaluated for functionality) and could plausibly be used by someone
with access to the appropriate tools to reproduce the results.
Awards
In addition to the badges, the artifact review committee may recognize
zero or more artifacts at the CHES 2024 conference as exemplars of
functionality, support for reproducibility, or reusability.
Timeline and Process
The artifact review process begins after the paper has been accepted
for publication in TCHES. Only papers accepted to CHES 2024 will be
considered under the artifact review process.
Following notification of acceptance (or acceptance with minor
revisions) to CHES 2024, the artifact may be submitted for review up to
the next artifact submission deadline.
Artifact Submission Deadlines
28 Nov 2023 (extended): for papers accepted to TCHES Volume 2024 Issue 1
28 Jan 2024: for papers accepted to TCHES Volume 2024 Issue 2
28 Apr 2024: for papers accepted to TCHES Volume 2024 Issue 3
28 Jul 2024: for papers accepted to TCHES Volume 2024 Issue 4
Once the artifact is submitted, two or more members of the artifact
review committee will be assigned to review the artifact. The artifact
review process will be a continuous process and may involve requests
from the reviewers for additional help on how to run the artifact,
interpret its results, etc. It is acceptable (and expected) that the
interaction between the reviewers and the authors leads to the artifact
being updated during the review process. Updates that affect scientific
characteristics reported in the paper (such as changes to performance)
should be clearly documented.
We aim for the artifact review process to be completed within 6 weeks
of the artifact being submitted, but this will vary depending on the
scale of the artifact and the timeliness of interaction between the
authors and reviewers. Authors of artifacts that are accepted for
archiving will be provided instructions on how to submit the archival
version of their artifacts.
We ask authors to be understanding and to join us in viewing this as a
collaborative process aimed at producing better artifacts for the
scientific community.
Confidentiality
The artifact review process will be single-blinded: the authors of the
paper and artifact are not anonymous, but the reviewers will be
anonymous. Communication between the authors and the reviewers will be
facilitated via the HotCRP review site. Authors should not attempt to
learn the identities of the reviewers, for example by embedding
analytics or tracking elements in the artifact or an associated website;
if you cannot comply with this for a reason outside your control, please
notify the chairs immediately to discuss.
Conflict of Interest
The CHES 2024 artifact review process follows the same conflict of
interest policy as TCHES, which is the IACR policy with respect to
conflicts of interest. A conflict of interest is considered to occur
automatically whenever an author of a submitted paper and a reviewer
were advisee/advisor at any time,
have been affiliated with the same institution in the past 2 years,
have published 2 or more jointly authored papers in the past 3 years, or
are immediate family members.
Conflicts may also arise for reasons other than those just listed.
Examples include closely related technical work, cooperation in the
form of joint projects or grant applications, business relationships,
close personal friendships, and instances of personal enmity. For more
information please see the
IACR Policy on Conflicts of Interest.
Authors will be asked to identify conflicts of
interest with the committee members at time of artifact registration.
Copyright and Licensing Conditions
In order for the IACR to distribute Artifacts, it requires permission
to do so. You are asked to grant the IACR this permission under an
open-source license of your choice, such as an OSI-approved license.
As some Artifacts may combine portions
created by you and third-party materials obtained elsewhere,
you must ensure that you have obtained a license
to redistribute all third-party materials included in the Artifact
that were not created by you,
for example by including only open-source components
or by otherwise obtaining and demonstrating the required permission.
It is not a requirement that any patent rights be granted.
Artifact Submission Contents
The artifact submission should include the following:
The authors of the accepted paper and their affiliations
Email addresses for the contact authors for the artifact
The PDF of the submitted paper, or an updated/camera-ready version, if available
A brief description of the artifact
If the artifact is less than 20MB: a .zip or .tar.gz containing the artifact
If the artifact is larger than 20MB: instructions on how to obtain the artifact
A link to a GitHub repository or similar for the artifact, if available, along with the commit/tag of the submission
The artifact itself shall include at least the following files:
LICENSE: The license(s) under which the artifact is released
README: The main starting point for anyone attempting to use the artifact. It should include:
Dependencies required to build and run the artifact, including specific version
numbers of dependencies
Instructions for building and running the artifact
Options for configuring the artifact to run in different modes, if
applicable
Instructions on how to interpret the output of the artifact, including which
scripts to run if appropriate
An explanation of how the source code is organized
Files such as LICENSE and README can be plain text files or Markdown files.
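For example, a README for a software artifact might be organized along the following lines; the section names mirror the list above, and the concrete tools, versions, and commands are placeholders only:

```
# Artifact for "<paper title>"
## Dependencies
gcc 12.2, cmake 3.25, python 3.11 (versions illustrative; pin your own)
## Building
mkdir build && cd build && cmake .. && make
## Running the experiments
which script reproduces which table or figure in the paper
## Configuration options
flags or environment variables for alternative modes, if any
## Interpreting the output
how the raw output maps to the numbers reported in the paper
## Source code layout
one line per top-level directory
```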
Source code files within the artifact are encouraged to be organized, formatted, and documented using best practices and conventions appropriate to the programming language in question.
For example, formatted using a consistent style such as PEP8 for Python; documentation of APIs using JavaDoc for Java or Doxygen for C; unit tests using an appropriate framework.
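As a toy illustration of these conventions, not tied to any specific artifact, a Python project might pair a PEP8-formatted, documented helper with a small unit test (shown in a single file here for brevity):

```python
"""Illustrative artifact helper with an accompanying unit test."""
import unittest


def mean_cycles(samples):
    """Return the arithmetic mean of a list of cycle counts.

    Raises ValueError if the list is empty.
    """
    if not samples:
        raise ValueError("no samples provided")
    return sum(samples) / len(samples)


class MeanCyclesTest(unittest.TestCase):
    def test_mean_of_known_values(self):
        self.assertEqual(mean_cycles([10, 20, 30]), 20)

    def test_empty_input_rejected(self):
        with self.assertRaises(ValueError):
            mean_cycles([])


if __name__ == "__main__":
    unittest.main()
```

Running the file directly executes the tests, which gives reviewers a quick sanity check that the environment is set up correctly.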
Hardware Submission Tips and Suggestions
This section provides guidance on how researchers and engineers can package their hardware projects as part
of the CHES Artifact Review Process. It is designed to capture best practices for improving the
reusability and reproducibility of hardware projects in the cryptographic research community.
Packaging of the Artifact
The primary form of the artifact should be as source code, with suitable build scripts and instructions on how to install the appropriate dependencies.
For artifacts with complex dependencies or build requirements, the authors are encouraged to also package the artifact in the manner that makes it most amenable to successful execution. Potential formats include:
A virtual machine or container image (VirtualBox, Docker, …) containing the artifact and all
dependencies already installed, and the artifact compiled, configured, and ready to
run. It is preferable to also include the Dockerfile or script used to create the
image if possible (a minimal example follows this list).
A binary installable package, such as a .rpm or .deb package on Linux, or an MSI
installer on Windows.
A video demonstrating the use of the artifact and the results, especially in the case
of an artifact that requires commercial software, specialized hardware, or long
computation times.
A "live notebook" (Jupyter, Sage,...) for demonstrating a sequence of
mathematical calculations, especially of data artifacts.
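As a concrete sketch of the container option mentioned above, the Dockerfile used to create such an image might look like the following; the base image, package list, build command, and entry point are placeholders to be adapted to the artifact at hand:

```dockerfile
# Hypothetical Dockerfile for a software artifact; all names are placeholders.
FROM ubuntu:22.04

# Pin the dependencies the artifact needs (versions are illustrative).
RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential cmake python3 \
    && rm -rf /var/lib/apt/lists/*

# Copy the artifact sources into the image and build them.
COPY . /artifact
WORKDIR /artifact
RUN make

# Default command: run the test harness described in the README.
CMD ["./run_experiments.sh"]
```

Shipping this file alongside the pre-built image lets reviewers both run the image directly and see exactly how it was produced.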
When in doubt, imagine a first-year grad student in 2029 who is told by their supervisor
"See if you can change this artifact from CHES 2024 to do X."
We want to give them the best chance of success with the least amount of pain.