Instructions For AAAI 2026 Reviewers

Review Timeline

AAAI-26 will follow a two-phase reviewing process as in previous years, with two additions: an AI-generated review in Phase 1, and an AI-generated summary of the discussions at the end of the discussion phase. The AI-generated content is part of a pilot program to evaluate the ability of AI tools to assist in the peer review process.

Phase 1:

Phase 1 review starts: Tue Aug 12
Phase 1 reviews end: Mon Sep 1
SPC phase 1 recommendations: Sat Sep 6
AC phase 1 recommendations: Sat Sep 8
Phase 1 reject notifications: Mon Sep 15

This year, the two human-written reviews in Phase 1 will be supplemented by an AI-generated review. As an experiment for the pilot program, this AI review will not include any ratings or recommendations. No human reviewers are being replaced by AI reviewing. All decisions about whether a paper proceeds to Phase 2 will be made solely by human reviewers; no automated methods will be used in the decision-making process.

Phase 2:

Phase 2 review starts: Tue Sep 16
Phase 2 reviews end: Sun Oct 5

If the paper proceeds to Phase 2, it is assigned additional human reviewers.

During the author response phase, all reviews, including the AI-generated review, become visible to the reviewers and the authors. The AI-generated review will be clearly labeled as such. Just as in previous years, authors may submit a response to the reviews, including to the AI-generated review.

Phase 2 Discussion Phase:

Author feedback starts: Tue Oct 7
Author feedback ends: Mon Oct 13
PC discussion starts: Tue Oct 14
PC discussion ends: Mon Oct 20
SPC metareviews: Fri Oct 24
AC recommendations due: Mon Oct 29

Based on all the reviews (including the AI-generated review) and the authors’ response, the human reviewers may revise their own reviews and/or their ratings. At the conclusion of the discussion phase, an AI-generated summary will recap points of consensus and disagreement among the reviews (both human-written and AI-generated), visible only to the SPC and AC.

OpenReview Help

How to bid on submissions
FAQ

Guidelines On Writing Helpful Reviews

To write a professional, respectful, and effective review:
  • In general, please do not use “I”, “you”, “the authors”, etc., in your reviews. Reviews should be depersonalized as much as possible. Use terms like “the paper”, “the work”, “the project”. The review should focus on the work and not the individuals (reviewers or authors). Avoid referring to yourself. If you must refer to yourself, it should be in the third person (e.g., “this reviewer”) and done sparingly.
  • Phrase your comments as would be appropriate if you were speaking respectfully to the authors face-to-face.
  • Instead of “What is wrong with this paper?”, ask yourself “How could this paper be better?”
  • When suggesting revisions in the review, think about whether the revisions are reasonable in terms of time and resources — which of the recommendations are essential, and which are nice-to-have but optional?
  • Is the story of the paper clear?
    • What is the problem that the paper aims to address?
    • What are the limitations in the state of the art that the paper addresses? Are the limitations clearly articulated, and reflective of the state of the art?
    • Do the empirical results really support the claims of the paper?
    • What is the key novel technical contribution in the paper?
  • Presentation of the proposed contribution:
    • Is the technical approach sound and clearly described?
    • Are there any errors, unstated assumptions, or missing details?
    • Is it expressed in sufficient detail to permit reproduction of the work?
    • Does the paper clearly describe the limitations (in scope and generalizability) of its conclusions? (All work has limits, and it is vital to understand them.)
  • Related work:
    • Is the proposed contribution placed in the appropriate context of previous and related work, so as to allow a reader to draw connections between this work and others?
  • Evaluations:
    • Does the empirical evaluation include appropriate baselines and comparisons to validate the proposed approach compared to the state of the art?
    • Does the empirical evaluation report well-established and reasonable metrics?
    • Are the evaluation benchmarks and datasets appropriately chosen? Are there any better benchmarks or datasets that should have been used?
    • Does the paper include an analysis of errors made by the proposed approach?
    • Do the empirical results really support the claims of the paper?
    • Are the evaluations fully replicable?
  • Suggestions for improvement:
    • How could the research or the presentation of the paper be improved in light of the above? This is your clearest opportunity to help the authors become better scientists and researchers through your review.


Recommended review structure for AAAI-26

We don’t require reviewers to follow any fixed structure. However, the following mini-guide describes elements that are highly desirable in all reviews. Reviews that include all of these elements will be considered substantive and given greater weight by the SPCs, ACs, and Program Chairs. Reviews missing critical elements may be deemed non-substantive, and those lacking most elements risk being flagged as irresponsible.

— Paper summary (4-10 sentences): 

The first paragraph of the review should be a summary of the paper. The purpose of this summary is both to show that the reviewer has a clear understanding of the paper and to provide the best possible interpretation before going into critiques. The summary should contain the following elements (each typically 1 or 2 sentences):

  1. summarize the main contribution of the paper in one sentence
  2. identify the core problem being addressed by the paper
  3. describe the key idea of the paper and how it addresses the problem
  4. summarize how the idea of the paper is realized as an implementation
  5. identify the conclusion that is claimed (or could be claimed) from the findings

— Review summary (4-10 sentences):

The second paragraph of the review should provide an overall assessment. The first sentence of this paragraph should state the overall conclusion of the review. This introductory sentence is followed by individual sentences that assess the paper with respect to its clarity, technical and experimental soundness/validity, intellectual novelty, and relevance to the field. The most important points of feedback for improving the paper should also be included.

— Specific points of feedback (bullet list, variable level of depth):

After the first two paragraphs, a bulleted list should be provided that addresses specific points of feedback. This list can be as long as needed to cover all points of feedback for the review. Please be sure these comments are both critical and helpful. The points can vary in length, depending on the amount of description needed. For example, points about technical shortcomings tend to be about a paragraph in length, whereas typographical and grammatical errors tend to be expressed briefly in one line. Review comments regarding the need for related work are expected to provide at least three citations from non-overlapping authors to be considered substantive.

Please consider how you would react to receiving your review comments if you were the submitting author. These comments should identify the strengths of the work as well as the flaws and shortcomings in the paper. Most importantly, they should provide suggestions for improvement to enable continued progress by the authors. Our goal as reviewers is to strengthen every submission to AAAI-26 so that each paper is either accepted to our conference or improved for acceptance at a future AI publication venue, especially since growing these ideas is crucial for the continued success and health of AI research.


Using the AI-Generated Reviews and Summaries

  • When will the AI-generated review be visible?
    • To ACs and SPCs: as soon as the review is generated.
    • To reviewers:
      • If the paper is rejected in Phase 1, the AI-generated review, along with the other reviews, will become visible to all reviewers of the paper when the decision is released.
      • If the paper proceeds to Phase 2, the AI-generated review will only become visible to reviewers during the author rebuttal phase, as is done with human reviews.
    • To authors: the AI review is made visible along with the other reviews according to the normal review process. If the paper is rejected in Phase 1, all reviews are made available then; if the paper proceeds to Phase 2, all reviews are made available at the start of the rebuttal period.
  • How can I flag issues with the AI-generated review?
    • All reviewers will be able to make an official comment on the AI-generated review (distinct from the reviewer discussion), along with review ratings. These comments will only be visible to SPCs and higher. The Program Chairs, Workflow Chairs, and Area Chairs will monitor comments made on AI-generated reviews to identify issues, and will take action as appropriate, such as re-generating the review or excluding it entirely.
  • What can a reviewer do with the AI-generated review?
    • As with human-written reviews, reviewers will have the opportunity during the rebuttal period to read and reflect on others' reviews, including the AI-generated review, and on the author responses. They may update their own reviews as they see fit, for example if a misunderstanding has been clarified. Reviewers must not duplicate content from other reviews (AI-generated or human), but are free to express their level of agreement with the points raised.
  • What can the SPC and the AC do with the AI-generated review?
    • Phase 1: The SPC recommends which papers to promote from Phase 1 to Phase 2 after reading all reviews, including the AI-generated review. Note that only the human reviews carry a paper rating and recommendations; the AI-generated review has neither. The SPC makes its recommendation based on all the information available: the two human reviews, the AI review, and the recommendations from the human reviewers.
    • Phase 2: The SPC and AC make decision recommendations for the paper based on 1) all the reviews available from the human reviewers and the AI system; 2) the author responses; 3) discussions among the reviewers; and 4) the AI-generated summary of author responses and reviewer discussions.