Fall 2013 Knowledge Synthesis Pilot: Final Report – Long Descriptions

Figure 1. Applicant and Research Administrator Impressions of the Structured Application Process.

Figure 1-A. Describe the ease of use of the structured application form.

Very Easy Easy Neutral Complicated Very Complicated
8.77 50.88 24.56 14.04 1.75

« Back to figure 1-A

Figure 1-B. The structured application format was intuitive and easy to use.

Strongly Agree Agree Neutral Disagree Strongly Disagree
Research Administrator 0.00 66.67 25.00 8.33 0.00
Applicant 9.26 48.15 29.63 9.26 3.70

« Back to figure 1-B

Figure 1-C. Level of satisfaction with the structured application process:

Very Satisfied Satisfied Neutral Dissatisfied Very Dissatisfied
Research Administrator 7.69 84.62 7.69 0.00 0.00
Applicant 9.09 45.45 36.36 9.09 0.00

« Back to figure 1-C

Figure 2. Use of the Structured Application Process Compared to Previous Applications to the Knowledge Synthesis Competition.

Figure 2-A. Compared to last time, completing the structured application form was:

Much Easier to Use Easier to Use As Easy to Use More Difficult to Use Much More Difficult to Use
0.00 30.00 36.67 33.33 0.00

« Back to figure 2-A

Figure 2-B. Compared to last time, submitting the structured application form was:

Much Less Work Somewhat Less Work Same Amount of Work Somewhat More Work Much More Work
0.00 11.11 88.89 0.00 0.00

« Back to figure 2-B

Figure 2-C. Compared to a previous Knowledge Synthesis competition, submitting an application using the structured application format was:

Much Better Better Neutral Worse Much Worse
Research Administrator 0.00 11.11 88.89 0.00 0.00
Applicant 0.00 44.83 34.48 20.69 0.00

« Back to figure 2-C

Figure 3. Character Limits of the Structured Application Form.

Figure 3-A. Was the character limit adequate to respond to each adjudication criterion?

Yes No
44.44 55.56

« Back to figure 3-A

Figure 3-B. Ideal page limits according to applicants:

Up to one page 1-2 pages 2-3 pages 3-5 pages More than 5 pages
Quality of the Idea 16.00 68.00 16.00 0.00 0.00
Importance of the Idea 66.67 23.81 4.76 4.76 0.00
Approach 0.00 0.00 5.56 66.67 27.78
Expertise, Experience and Resources 30.43 47.83 21.74 0.00 0.00

« Back to figure 3-B

Figure 4. Stage 1 Reviewer Reactions to the Structured Application – Character Limits.

Figure 4-A. The character limit provided to respond to each adjudication criterion was:

Too Much OK As Is Too Little
2.00 92.00 5.00

« Back to figure 4-A

Figure 4-B. Did the character limits allow the applicant to include sufficient information?

Yes No
Quality of the Idea 33.30 66.60
Importance of the Idea 33.30 66.60
Approach 66.60 33.30
Expertise, Experience, Resources 100.00 0.00
Budget 100.00 0.00

« Back to figure 4-B

Figure 5. Stage 1 Reviewer Reactions to the Structured Application – Allowable Attachments.

Figure 5-A. Value level of allowable attachments:

High Medium Low None
Figures 37.00 55.00 5.00 2.50
Tables 45.00 42.50 12.50 0.00
References 45.00 42.50 12.50 0.00
Letters from Collaborators 37.50 40.00 22.50 0.00
Letters of Support from Knowledge Users 65.00 22.50 10.00 2.50
Letters of Support from Partners 44.74 36.84 13.16 5.26

« Back to figure 5-A

Figure 5-B.

% Reviewers who found it useful % Reviewers who found limits appropriate
Yes No Yes No
Research Funding History 100.00 0.00 92.00 8.00
Publications 100.00 0.00 82.00 18.00
Intellectual Property 46.00 54.00 69.00 31.00
Knowledge and Technology Translation 82.00 18.00 97.00 3.00
International Collaborations 74.00 26.00 91.00 9.00
Presentations 69.00 31.00 86.00 14.00
Interviews and Media Relations 46.00 54.00 77.00 23.00
Community Volunteer Activities 29.00 71.00 88.00 12.00

« Back to figure 5-B

Figure 6. Non-Technical Problems Encountered in Completing the Structured Application Form.

Did applicants experience problems completing the structured application form?

Yes No
25.00 75.00

« Back to figure 6

Figure 7. Stage 1 Reviewer Workload.

Figure 7-A. Workload assigned to Stage 1 reviewers was:

Light Manageable Challenging Excessive
7.50 50.00 30.00 12.50

« Back to figure 7-A

Figure 7-B. Compared to a previous Knowledge Synthesis review experience, the workload of the following review activities was:

Much More More The Same Less Much Less
Reading one application 0.00 5.26 42.11 47.37 5.26
Looking up additional information  0.00 10.53 68.42 15.79 5.26
Writing one review 0.00 10.53 36.84 52.63 0.00
Entering review information 0.00 31.58 36.84 31.58 0.00
Compared to last time, reviewer workload was 0.00 26.00 21.00 53.00 0.00

« Back to figure 7-B

Figure 8. Stage 2 Reviewer Workload.

Figure 8-A. Workload assigned to Stage 2 reviewers was:

Light Manageable Challenging Excessive
13.33 73.33 13.33 0.00

« Back to figure 8-A

Figure 8-B. Compared to previous experiences reviewing for a Knowledge Synthesis competition, the peer review process took:

Much Less Time Less Time The Same Time More Time Much More Time
66.67 16.67 16.67 0.00 0.00

« Back to figure 8-B

Figure 8-C. Compared to previous experiences reviewing for a Knowledge Synthesis competition, the peer review process was:

Much Easier to Use Easier to Use As Easy to Use More Difficult to Use Much More Difficult to Use
25.00 41.67 33.33 0.00 0.00

« Back to figure 8-C

Figure 9. Distinction between "Quality of the Idea" and "Importance of the Idea".

Figure 9-A. Was the distinction between "Quality of the Idea" and "Importance of the Idea" clear?

Yes No
53.85 46.15

« Back to figure 9-A

Figure 9-B. Was the distinction between "Quality of the Idea" and "Importance of the Idea" clear?

Yes No
38.00 62.00

« Back to figure 9-B

Figure 10. Adjudication Criteria.

Figure 10-A. Should the adjudication criteria be weighted equally?

Yes No
57.00 43.00

« Back to figure 10-A

Figure 10-B. Ideal weighting of the adjudication criteria according to Stage 1 reviewers:

0-10% 11-20% 21-30% 31-40% 41-50%
Quality of the Idea 33.30 26.70 33.30 6.70 0.00
Importance of the Idea 20.00 33.30 40.00 6.70 0.00
Approach 0.00 13.30 40.00 26.70 20.00
Expertise, Experience, Resources 0.00 33.30 60.00 6.70 0.00

« Back to figure 10-B

Figure 11. Characteristics of the Adjudication Scale.

Strongly Agree Agree Neutral Disagree Strongly Disagree
Descriptors for the adjudication scale were clear and useful 24.00 54.00 8.00 11.00 3.00
Adjudication scale range was sufficient to describe meaningful differences 27.00 49.00 8.00 11.00 5.00

« Back to figure 11

Figure 12. Use of the Adjudication Scale.

Figure 12-A. Stage 1 reviewers used the full range of the adjudication scale.

Strongly Agree Agree Neutral Disagree Strongly Disagree
According to Stage 1 Reviewers 19.44 44.44 8.33 25.00 2.78
According to Stage 2 Reviewers 0.00 21.43 35.71 28.57 14.29

« Back to figure 12-A

Figure 14. Integrated Knowledge Translation Approach.

Figure 14-A. Adjudication criteria allowed applicants to convey their integrated knowledge translation approach.

Strongly Agree Agree Neutral Disagree Strongly Disagree
3.77 56.60 18.87 15.09 5.66

« Back to figure 14-A

Figure 14-B. Compared to last time, applicants could more easily convey their integrated knowledge translation approach.

Much Better Better Neutral Worse Much Worse
0.00 19.23 46.15 34.62 0.00

« Back to figure 14-B

Figure 15. Integrated Knowledge Translation (IKT) Approach.

Figure 15-A. Reviewer assessment of the IKT approach.

Strongly Agree Agree Neutral Disagree Strongly Disagree
Information provided was sufficient to assess the IKT approach 21.00 62.00 13.00 5.00 0.00
Adjudication criteria allowed for appropriate assessment of the IKT approach 11.00 65.00 11.00 14.00 0.00

« Back to figure 15-A

Figure 15-B. Compared to a previous Knowledge Synthesis review experience, pilot components allowed reviewers to assess/provide feedback regarding the IKT approach.

Much Better Better Neutral Worse Much Worse
Adjudication Worksheet 16.00 26.00 58.00 0.00 0.00
Adjudication Criteria 21.00 26.00 42.00 11.00 0.00
Information contained within the structured application 11.00 26.00 47.00 16.00 0.00

« Back to figure 15-B

Figure 16. Adjudication Worksheet.

Figure 16-A. The adjudication worksheet was easy to work with.

Strongly Agree Agree Neutral Disagree Strongly Disagree
20.51 51.28 17.95 7.69 2.56

« Back to figure 16-A

Figure 16-B. Did the character limit allow reviewers to provide good feedback to applicants?

Yes No
90.00 10.00

« Back to figure 16-B

Figure 16-C. Did the adjudication worksheet have sufficient space to allow reviewers to provide useful feedback to applicants?

Yes No
Quality of the Idea 40.00 60.00
Importance of the Idea 40.00 60.00
Approach 0.00 100.00
Expertise, Experience, Resources 50.00 50.00
Budget 75.00 25.00

« Back to figure 16-C

Figure 17. Reading Preliminary Reviews.

Figure 17-A. Did reviewers read the other reviewers’ Stage 1 reviews?

Yes No
59.00 41.00

« Back to figure 17-A

Figure 17-B. Reading other reviewers’ comments influenced reviewer assessment:

Very Often Often Occasionally Rarely Never
0.00 5.00 36.00 27.00 32.00

« Back to figure 17-B

Figure 17-C. Additional time spent reading other reviewers’ reviews:

Less than 1 hour 1-2 hours 3 hours or more
48.00 48.00 4.00

« Back to figure 17-C

Figure 18. Online Discussion Participation.

Figure 18-A. Did Stage 1 reviewers participate in an online discussion?

Yes No
26.00 74.00

« Back to figure 18-A

Figure 18-B. Reviewers did not participate in an online discussion because:

Reviews Not Completed Not Available Nothing to Discuss Other
46.00 12.00 8.00 35.00

« Back to figure 18-B

Figure 18-C. Was 7 days a sufficient amount of time for the online discussion period?

Yes No
23.00 77.00

« Back to figure 18-C

Figure 19. Online Discussion Initiation.

Figure 19-A. Did Stage 1 reviewers initiate an online discussion?

Yes No
70.00 30.00

« Back to figure 19-A

Figure 19-B. Factors used to determine whether an online discussion was required:

Scoring Discrepancy Content Clarification Quality Check
42.90 14.20 42.90

« Back to figure 19-B

Figure 19-C. Who should determine whether an online discussion is required?

CIHR Chair Reviewer
16.00 48.00 36.00

« Back to figure 19-C

Figure 19-D. Should criteria be used to determine when an online discussion takes place?

Yes No
49.00 51.00

« Back to figure 19-D

Figure 20. Impact of Online Discussion.

Very Often Often Occasionally Rarely Never
Your online contribution influenced the assessment of other reviewers 0.00 0.00 22.20 33.30 44.40
Online discussion influenced your assessment 11.10 0.00 44.40 11.10 33.30

« Back to figure 20

Figure 21. Stage 2 Reviewer Comments to Stage 1 Reviewers.

Stage 1 reviewers provided clear feedback to support their ratings.

Strongly Agree Agree Neutral Disagree Strongly Disagree
0.00 42.86 7.14 42.86 7.14

« Back to figure 21

Figure 22. Stage 2 Reviewer Reactions to Stage 1 Reviews.

Figure 22-A. Did Stage 2 reviewers consult both the applications and Stage 1 reviews?

Yes No
100.00 0.00

« Back to figure 22-A

Figure 22-B. Reading both the applications and Stage 1 reviews is necessary.

Strongly Agree Agree Neutral Disagree Strongly Disagree
28.57 28.57 14.29 21.43 7.14

« Back to figure 22-B

Figure 23. Stage 2 Pre-Meeting Activities - Binning Process.

Was the number of "Yes/No" allocations for the binning process appropriate?

Yes No
50.00 50.00

« Back to figure 23

Figure 24. Stage 2 Pre-Meeting Activities - Consultation of Other Reviewers’ Comments.

Figure 24-A. Did reviewers read the other reviewers’ Stage 2 comments?

Yes No
85.71 14.29

« Back to figure 24-A

Figure 24-B. Reading other reviewers’ comments/binning decisions influenced assessment:

Rarely Occasionally Often Very Often Always
33.33 33.33 16.67 8.33 8.33

« Back to figure 24-B

Figure 24-C. Additional time (total) spent reading other reviewers’ comments:

Less than 1 hour 1-2 hours 3 hours or more
33.33 58.33 8.33

« Back to figure 24-C

Figure 24-D. Was the character limit appropriate for Stage 2 reviewer comments?

Yes No
100.00 0.00

« Back to figure 24-D

Figure 25. Face-to-Face Meeting Requirements.

Yes No
The face-to-face committee meeting is required 81.25 18.75
Instructions provided at the meeting were clear and easy to follow 93.75 6.25
Conflicts were handled appropriately at the committee meeting 100.00 0.00

« Back to figure 25

Figure 26. Face-to-Face Meeting – Validating the Application List.

Strongly Agree Agree Neutral Disagree Strongly Disagree
Focusing the discussion on applications in Group B is appropriate 62.50 25.00 12.50 0.00 0.00
Process of moving applications between groups is efficient 43.75 50.00 0.00 6.25 0.00
Moving applications from Group A or C to Group B is easy 37.50 37.50 18.75 6.25 0.00

« Back to figure 26

Figure 27. Face-to-Face Meeting – Voting Process.

Yes No
The voting tool was effective and easy to use 100.00 0.00
Instructions regarding the voting tool were clear 93.75 6.25

« Back to figure 27

Figure 28. Face-to-Face Meeting – Funding Cut-Off Line.

Did the funding cut-off line help to inform the discussion?

Yes No
87.50 12.50

« Back to figure 28

Figure 29. Using ResearchNet as part of the Stage 1 Review Process.

Strongly Agree Agree Neutral Disagree Strongly Disagree
RNet was easy to use 12.50 70.00 12.50 5.00 0.00
Instructions in RNet on how to conduct peer review were clear 15.00 70.00 5.00 10.00 0.00
Enough information was provided in RNet to accurately declare conflicts 35.00 57.00 0.00 8.00 0.00
Application bookmarks made it easy to navigate through the application 40.54 29.73 24.32 5.41 0.00
It was easy to rank applications 11.11 58.33 11.11 13.89 5.56
It was clear how to re-rank applications 18.75 53.13 12.50 9.38 6.25
I was able to re-rank applications efficiently 20.00 53.33 13.33 6.67 6.67
It was clear to me how to break ties 24.24 36.36 9.09 24.24 6.06
I was able to break ties efficiently 25.00 50.00 6.25 15.63 3.13
I was able to complete my reviews efficiently using RNet 15.00 75.00 8.00 2.00 0.00
The structured review on RNet was user-friendly 18.00 68.00 12.00 2.00 0.00

« Back to figure 29

Figure 30. Using ResearchNet as part of the Stage 2 Review Process.

Strongly Agree Agree Neutral Disagree Strongly Disagree
RNet was easy to use 42.86 57.14 0.00 0.00 0.00
Instructions in RNet on how to conduct peer review were clear 14.29 85.71 0.00 0.00 0.00
Enough information was provided in RNet to accurately declare conflicts 71.43 21.43 0.00 7.14 0.00
Application bookmarks made it easy to navigate through the application 28.57 57.14 7.14 7.14 0.00
Completing Stage 2 reviews using RNet was efficient 64.29 28.57 0.00 0.00 7.14
It was clear how many applications could be assigned to the Yes/No bins 50.00 21.43 14.29 14.29 0.00
It was clear how to assign grant applications to Yes/No bins 21.43 71.43 7.14 0.00 0.00
Applications could be assigned to Yes/No bins efficiently 35.71 57.14 7.14 0.00 0.00
The Yes/No binning process in ResearchNet was user-friendly 42.86 57.14 0.00 0.00 0.00

« Back to figure 30

Figure 31. Overall Satisfaction with Stage 1 Review Process.

Very Satisfied Satisfied Neutral Dissatisfied Very Dissatisfied
11.00 67.00 17.00 6.00 0.00

« Back to figure 31

Figure 32. Overall Satisfaction with Stage 2 Review Process.

Very Satisfied Satisfied Neutral Dissatisfied Very Dissatisfied
43.75 50.00 6.25 0.00 0.00

« Back to figure 32

Figure 33. Value of the Structured Review Process.

Figure 33-A. The reviews are consistent such that written justifications align with respective ratings.

Strongly Agree Agree Neutral Disagree Strongly Disagree
Funded 37.50 50.00 0.00 12.50 0.00
Reviewed at Stage 1- Not funded 8.33 66.67 8.33 0.00 16.67
Not funded 0.00 38.46 7.69 30.77 23.08

« Back to figure 33-A

Figure 33-B. Reviews provide information that will be useful in refining the research project.

Strongly Agree Agree Neutral Disagree Strongly Disagree
Funded 50.00 25.00 12.50 0.00 12.50
Reviewed at Stage 1- Not funded 25.00 33.33 8.33 16.67 16.67
Not funded 15.38 61.54 0.00 15.38 7.69

« Back to figure 33-B

Figure 33-C. There is value in the structured review process (rating and justification are provided for each adjudication criterion).

Strongly Agree Agree Neutral Disagree Strongly Disagree
Funded 50.00 37.50 0.00 0.00 12.50
Reviewed at Stage 1- Not funded 9.09 90.91 0.00 0.00 0.00
Not funded 15.38 61.54 15.38 0.00 7.69

« Back to figure 33-C

Figure 33-D. The review process was fair and transparent.

Strongly Agree Agree Neutral Disagree Strongly Disagree
Funded 50.00 37.50 0.00 0.00 12.50
Reviewed at Stage 1- Not funded 9.09 36.36 36.36 18.18 0.00
Not funded 7.69 30.77 23.08 30.77 7.69

« Back to figure 33-D

Figure 34. Applicant Satisfaction with the Structured Review Process.

Figure 34-A. Consistency of Reviews.

Very Satisfied Satisfied Neutral Dissatisfied Very Dissatisfied
Funded 75.00 12.50 0.00 0.00 12.50
Reviewed at Stage 1- Not funded 9.09 45.45 9.09 18.18 18.18
Not funded 0.00 30.77 0.00 30.77 38.46

« Back to figure 34-A

Figure 34-B. Clarity of Adjudication Criteria.

Very Satisfied Satisfied Neutral Dissatisfied Very Dissatisfied
Funded 15.38 38.46 30.77 7.69 7.69
Reviewed at Stage 1- Not funded 27.27 27.27 36.36 0.00 9.09
Not funded 50.00 12.50 12.50 12.50 12.50

« Back to figure 34-B

Figure 34-C. Quality of Reviewer Comments.

Very Satisfied Satisfied Neutral Dissatisfied Very Dissatisfied
Funded 50.00 37.50 0.00 12.50 0.00
Reviewed at Stage 1- Not funded 36.36 27.27 18.18 9.09 9.09
Not funded 0.00 46.15 15.38 23.08 15.38

« Back to figure 34-C

Figure 34-D. Clarity of Rating System.

Very Satisfied Satisfied Neutral Dissatisfied Very Dissatisfied
Funded 50.00 25.00 0.00 12.50 12.50
Reviewed at Stage 1- Not funded 27.27 27.27 36.36 0.00 9.09
Not funded 15.38 7.69 23.08 46.15 7.69

« Back to figure 34-D

Figure 34-E. Confidence in New Review Process.

Very Satisfied Satisfied Neutral Dissatisfied Very Dissatisfied
Funded 37.50 37.50 12.50 0.00 12.50
Reviewed at Stage 1- Not funded 9.09 36.36 9.09 45.45 0.00
Not funded 0.00 30.77 30.77 23.08 15.38

« Back to figure 34-E

Figure 35. Overall Satisfaction with the Adjudication Process.

Very Satisfied Satisfied Neutral Dissatisfied Very Dissatisfied
Funded 50.00 37.50 0.00 0.00 12.50
Reviewed at Stage 1- Not funded 0.00 72.73 18.18 9.09 0.00
Not funded 0.00 38.46 15.38 30.77 15.38

« Back to figure 35

Figure 36. Usefulness of the Documentation Developed for the Knowledge Synthesis Pilot.

Figure 36-A. Applicants.

Materials were used? Materials were helpful
Yes No Yes No
Knowledge Synthesis Funding Opportunity 98.11 1.89 93.62 6.38
ResearchNet “Application” Phase Instructions 94.23 5.77 97.73 2.27
Interpretation Guidelines for Adjudication Criteria 94.34 5.66 84.44 15.56
Peer Review Manual 45.10 54.90 70.83 29.17
Knowledge Synthesis – Tips for Success 71.70 28.30 84.85 15.15
About KT 55.77 44.23 84.62 15.38
CIHR’s Guide to Knowledge Translation Planning 58.49 41.51 75.00 25.00
CIHR Guide to Writing Letters of Support 60.38 39.62 92.86 7.14

« Back to figure 36-A

Figure 36-B. Research Administrators.

Materials were used? Materials were helpful
Yes No Yes No
Knowledge Synthesis Funding Opportunity 86.67 13.33 100.00 0.00
ResearchNet “Application” Phase Instructions 73.33 26.67 80.00 20.00
Interpretation Guidelines for Adjudication Criteria 40.00 60.00 100.00 0.00
Peer Review Manual 26.67 73.33 50.00 50.00
Knowledge Synthesis – Tips for Success 26.67 73.33 100.00 0.00
About KT 33.33 66.67 100.00 0.00
CIHR’s Guide to Knowledge Translation Planning 26.67 73.33 100.00 0.00
CIHR Guide to Writing Letters of Support 20.00 80.00 100.00 0.00

« Back to figure 36-B

Figure 36-C. Stage 1 Reviewers.

Materials were used? Materials were helpful
Yes No Yes No
Knowledge Synthesis Funding Opportunity 78.00 22.00 89.00 11.00
ResearchNet “Application” Phase Instructions 41.00 59.00 93.00 7.00
Interpretation Guidelines for Adjudication Criteria 86.00 14.00 97.00 3.00
Peer Review Manual 75.00 25.00 96.00 4.00
Knowledge Synthesis – Tips for Success 20.00 80.00 71.00 29.00
About KT 20.00 80.00 86.00 14.00
CIHR’s Guide to Knowledge Translation Planning 15.00 85.00 83.00 17.00
CIHR Guide to Writing Letters of Support 9.00 91.00 67.00 33.00

« Back to figure 36-C

Figure 36-D. Stage 2 Reviewers.

Materials were used? Materials were helpful
Yes No Yes No
Knowledge Synthesis Funding Opportunity 87.50 12.50 100.00 0.00
ResearchNet “Application” Phase Instructions 68.75 31.25 100.00 0.00
Interpretation Guidelines for Adjudication Criteria 87.50 12.50 100.00 0.00
Peer Review Manual 62.50 37.50 100.00 0.00
Knowledge Synthesis – Tips for Success 40.00 60.00 80.00 20.00
About KT 37.50 62.50 100.00 0.00
CIHR’s Guide to Knowledge Translation Planning 13.33 86.67 100.00 0.00
CIHR Guide to Writing Letters of Support 6.25 93.75 0.00 100.00

« Back to figure 36-D

Figure 37. Usefulness of the Learning Lessons Developed for the Knowledge Synthesis Pilot.

Figure 37-A. Applicants.

Materials were used? Materials were helpful
Yes No Yes No
Overview of the Knowledge Synthesis Competition 46.15 53.85 90.00 10.00
Application Process 46.00 54.00 90.00 10.00
Interpretive Guidelines 41.18 58.82 71.43 28.57
Stage 1 Review Process 36.54 63.46 88.24 11.76
Ranking Process 26.92 73.08 80.00 20.00
Asynchronous Online Discussion Tool 7.84 92.16 42.86 57.14

« Back to figure 37-A

Figure 37-B. Research Administrators.

Materials were used? Materials were helpful
Yes No Yes No
Overview of the Knowledge Synthesis Competition 33.33 66.67 71.43 28.57
Application Process 33.33 66.67 83.33 16.67
Interpretive Guidelines 13.33 86.67 50.00 50.00
Stage 1 Review Process 0.00 100.00 0.00 0.00
Ranking Process 7.14 92.86 100.00 0.00
Asynchronous Online Discussion Tool 0.00 100.00 0.00 0.00

« Back to figure 37-B

Figure 37-C. Stage 1 Reviewers.

Materials were used? Materials were helpful
Yes No Yes No
Overview of the Knowledge Synthesis Competition 57.00 43.00 94.00 6.00
Application Process 31.00 69.00 80.00 20.00
Interpretive Guidelines 57.00 43.00 89.00 11.00
Stage 1 Review Process 63.00 37.00 89.00 11.00
Ranking Process 26.00 74.00 50.00 50.00
Asynchronous Online Discussion Tool 57.00 43.00 93.00 7.00

« Back to figure 37-C

Figure 37-D. Stage 2 Reviewers.

Materials were used? Materials were helpful
Yes No Yes No
Overview of the Knowledge Synthesis Competition 50.00 50.00 100.00 0.00
Application Process 46.67 53.33 100.00 0.00
Interpretive Guidelines 40.00 60.00 100.00 0.00
Stage 1 Review Process 62.50 37.50 100.00 0.00
Ranking Process 56.25 43.75 88.89 11.11
Asynchronous Online Discussion Tool 12.50 87.50 100.00 0.00

« Back to figure 37-D
