Analysis of IOS preliminary proposal submission system, Part 3 of 3

In January 2012, IOS implemented a new procedure for assessing research proposals. Instead of receiving full proposals twice per year, the Division now has a single annual date for receipt of short preliminary proposals (pre-proposals). Authors of the most meritorious pre-proposals are then invited to submit full proposals, and awards are made from this pool. This change was made in response to a steady increase in the proposal workload both within the Division and in the scientific community. For example, in 2011, program officers made 14,000 requests for ad hoc reviews. This increased workload ultimately jeopardized our ability to perform high-quality merit review of the proposals submitted. The goals of the new submission procedure were to reduce the burden on the principal investigator (PI) and reviewer communities while maintaining high-quality merit review.

To date, we have completed one full cycle, from submission of pre-proposals to final awards (FY 2013). The final decisions on the second cycle of full proposals (FY 2014) and the review of pre-proposals for the third cycle (FY 2015) are ongoing. We have heard a number of concerns regarding the outcomes of this transition. Therefore, we performed a preliminary analysis of the proposal submission and success rates for the first full cycle compared to the four prior years, which operated under a semiannual cycle of full proposals. Through surveys given to panelists, we have also been assessing the impact of the pre-proposal process on the quality of proposals, on the review process, and on the quality of merit review. We will discuss some of our major findings in a series of blog posts.


Results of Review Panelist Survey

As we have implemented the new pre-proposal system, we have surveyed review panelists to get their perspectives on the differences between the previous two-cycle full-proposal system and the pre-proposal system. Specifically, we asked them to rate their overall experience, their ability to evaluate the merit review criteria, the time they spent preparing for panels, and their opinions on differences in proposal quality. Of the panelists surveyed, over 90% at least somewhat agreed that the content in pre-proposals was adequate for evaluation under the merit review criteria. Furthermore, greater than 90% of panelists agreed or strongly agreed that they were able to provide adequate feedback to PIs about the proposals they reviewed.

[Survey charts: adequate content; feedback opportunity]

Panelists were asked to consider the quality of the top 20% of pre-proposals that they reviewed. Overall, greater than 95% of panelists stated that the creativity of the research ideas was the same as or better than that of full proposals from previous years. In terms of the merit review criteria, a large majority of panelists felt that the quality of proposals was the same as or better than in previous years (greater than 90% for Intellectual Merit, 80-90% for Broader Impacts). These responses suggest that PIs with the best proposals are still able to adequately convey their ideas and meet the merit review criteria in the shorter pre-proposal format.


[Survey chart: merit review criteria]

Finally, about 50% of the panelists noted that they spent less time preparing for pre-proposal panels than for full-proposal panels. Furthermore, about 80% of panelists said that their overall experience reviewing pre-proposals was the same or better than reviewing full proposals under the previous system. These results are notable because they point to a reduced burden on panelists under the new system.

[Survey charts: prep time; overall experience]

