Due to the exceptionally low number of submissions to the Spring 2021 edition of the Reproducibility Challenge, we have forwarded the papers to be considered for the Fall 2021 edition. However, the forwarded papers will be reviewed separately from the pool of submissions we expect to receive in the Fall edition, so you still benefit from submitting early (i.e., the acceptance rate will be calculated only within this cohort, not across all submissions). The reviews will not be made public until the Fall 2021 submissions have been reviewed; however, you will be notified of both reviews and decisions earlier. We will shortly announce when you can expect to hear the results of the reviewing process. The selected papers from the Spring 2021 submissions will be featured in the 2022 edition of the ReScience journal's Reproducibility Reports.
You can claim a paper using our OpenReview Portal. Log in to the portal, and you will find a list of papers available to claim. You can search by any paper metadata (title, authors) in the search box, as well as choose the conference in the drop-down menu. Then, once you click the link of the paper you are interested in, you will be taken to the paper's forum, where you will find an “Add Claim” button to submit your claim.
Yes, you can claim multiple papers. However, we advise against claiming too many papers, as that would increase your workload.
You can if you want to, but try to work on papers which haven’t been claimed. This will ensure broader coverage of the challenge, as well as give you a competitive advantage. (Remember multiple reports on the same paper have to compete among themselves!)
Yes, you can delete your claim(s). Head over to the “Your Consoles” tab and click “Author Console”, where you can find the list of claims associated with your account. Click the delete button to remove your claim.
Any number of team members is allowed.
Not at all, we encourage cross-institute participation! In that case, add the institutions of your team members in a comma-separated format in the claim and report submission forms in OpenReview.
No, author order will be determined by what you list in your final report.
Yes, you can add more authors if you want during your final report submission.
No, you can participate independently on your own. Participation from industry is especially welcome!
Many thanks for your participation! You can just drop us a mail (reproducibility.challenge@gmail.com) with details of your course, and we will list it on our website!
Yes! If your course ends earlier than the submission deadline and you have already graded the assignments, you can directly send us the evaluations! Have your students submit their report in our OpenReview portal, and drop us a mail at reproducibility.challenge@gmail.com with the evaluations paired with the submission links.
Please contact your course instructor or TA to send us a mail (reproducibility.challenge@gmail.com) to register your course. We will update the website periodically and add new courses.
Yes you can! Please consider sharing the word about the challenge to your peers in your company too!
First and foremost, you will add to the knowledge around the original paper. Peer-reviewed reports will be showcased on PapersWithCode and published in the ReScience journal. We are also planning a worldwide “ML Reproducibility Day” event, where authors of high-quality peer-reviewed reports will be invited to give remote talks. Keep checking our website for more information.
Check the Resources tab for more information.
Yes! It is highly recommended to contact the authors of the paper you are reproducing, to clarify doubts and implementation details.
You can email the authors directly to initiate a discussion. Their contact details can usually be found in the PDF of the paper, which is linked for each paper.
OpenReview now supports commenting and subscriptions on all papers listed in the ML Reproducibility Challenge, so you can easily opt in to receive notifications. First log in to your OpenReview account, then navigate to your paper(s) from the ML Reproducibility Challenge homepage. In the forum associated with your paper, you will find a “Notification Subscription:” dropdown, where you can choose “Subscribe” or “Unsubscribe” to follow or unfollow activity surrounding your paper.
You can either search the PDF of the paper for the code, or follow the link to the paper's PapersWithCode page (the “HTML” button), which is usually updated with the authors' publicly released code.
There is no restriction on the extent of the original code you can use for the reproducibility effort.
Yes, the submitted report must be anonymized for double-blind review. When submitting code for review, include your codebase in the Supplementary Materials, or link to an Anonymous GitHub URL.
You can find the style files of the report here.
Yes! You should use our style files and include the “Reproducibility Summary” on the first page of your report. Make sure this summary does not exceed the first page. Failure to include this summary will result in desk rejection.
You should copy your Reproducibility Summary into the abstract field. Add a line separator between sections, and ensure the summary is properly formatted.
Once your report is accepted, you will be required to submit the final draft by the camera-ready deadline in the ReScience journal format. Details of this process will be communicated to you after the acceptance notification.
Thanks for your interest in our challenge! You can help out by spreading the news. If you are a course instructor, you can help by enrolling your course in the challenge. You can also sign up to be a reviewer when we share the call for reviewers! If you are a company, you can help by sponsoring compute resources. Please contact us at reproducibility.challenge@gmail.com to list your generous offer in the Resources section.