Summative Evaluation of the Regional Partnerships Program (RPP) - Final Report: Part 1
Table of Contents
Executive Summary
Preface
Background
Evaluation Methodology
Focus of Report
RPP Objectives
Executive Summary
The Regional Partnerships Program (RPP) began funding in 1997-98 under the Medical Research Council of Canada (MRC) in response to a decline in funding to medical researchers in four provinces, all of which had medical schools and relatively small populations. In 1999-2000, RPP, soon to become a CIHR program, expanded its focus to include health research and added two other provinces, both without medical schools and with small populations. The recommendation of this evaluation is that a renewed RPP be authorized. The need to support Advisory Committees in developing research strengths for their jurisdictions remains relevant and has not yet been satisfied. There are no other CIHR programs of significance that address this need.
RPP was designed as a response to declining funding for health researchers in Saskatchewan (SK), Nova Scotia (NS), Newfoundland (NL) and Manitoba (MN) from the Medical Research Council of Canada. This report addresses the question of whether the Regional Partnerships Program has been successful. As such, the focus is on the program as a whole. Attention is not given in this report to individual regions or provinces with one exception: the central question of the difference between the receipt within a province of CIHR research funds and the province's share of Canada's population (referred to as the "gap" analysis). Extensive Working Papers contain the detailed findings from the evaluation.
In each of the six provinces with RPP Advisory Committees, the gap between the actual amount of CIHR funds obtained and the amount that would be expected if CIHR funds were allocated solely on the basis of population size was examined. No criterion had been established for determining how large a gap must be before the difference between actual and expected funding could be said to require special effort such as that provided by RPP. If the criterion were 20%, two of the six provinces (MN and NS) were just at that threshold in 2003. Comparison of the slopes of the lines for actual and expected funding leads to the conclusion that RPP has not yet resulted in a removal of the gap.
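For clarity, the "gap" referred to above is the difference between a province's actual CIHR funding and the funding it would receive if CIHR funds were distributed strictly by population share. The notation below is an illustrative restatement, not taken from the program documents:

\[
\text{Expected}_{p,t} \;=\; \frac{\text{Population}_{p,t}}{\text{Population}_{\text{Canada},t}} \times \text{Total CIHR funds}_{t},
\qquad
\text{Gap}_{p,t} \;=\; \text{Expected}_{p,t} - \text{Actual}_{p,t}
\]

A criterion such as the 20% figure mentioned above would be applied to the relative gap, \( 100 \times \text{Gap}_{p,t} / \text{Expected}_{p,t} \).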
As of 1999, the slope of the actual-funding line turns noticeably upward in all six RPP provinces, although the change is slight for NB. The improvement in this slope is attributed to some combination of the influence of RPP and the increased funding levels that became available with the establishment of CIHR. Other evidence, such as post-RPP success rates, project conversion to non-RPP CIHR funding and perceptions within the provinces, suggests that the mechanisms put in place by RPP have, in general, had noticeable success and that progress is being achieved.
Every grant or award funded within RPP was reviewed in terms of annual contributions by CIHR and annual payments by partners. This showed that the RPP allotments were being used by the Advisory Committees. By 2001-2002, usage of the RPP allotments had begun to approach the maximum available - $4,400,000 per year - and usage has stayed close to that level. In almost one-quarter of the projects, partners provided funds in excess of the CIHR contribution. There was a negligible number of cases in which partner funding did not at least match the CIHR contribution.
Fundamental to the theory of RPP is that researchers will move from RPP funding to full CIHR funding. The RPP "graduates" have done very well. The post-RPP success rate for researchers is 72%. Although this appears to be very high, it is not possible to determine whether it is unusually high because the rate spans several competitions and cannot be compared directly with the success rate of any single competition. Equally positive, at least 41% of the RPP projects were converted to non-RPP CIHR funding for continued research in the same substantive subject area studied in the RPP project.
In response to survey questions, the Principal Investigators (PIs) revealed that RPP has almost no influence in attracting PIs to a province and only a small influence in retaining PIs in a province. This finding is in contrast to the view of many Advisory Committee members who stated the importance of RPP as a recruitment factor. The majority of RPP researchers are not recent graduates. For 96%, at least five years have passed since their highest degree was obtained. The PIs obtaining RPP projects are, in general, also able to attract other sources of funding.
As a result of RPP funding, many of the PIs were engaging people for their RPP project at more than one level: about 77% engaged technicians, 60% engaged Master's level students and 50% engaged PhD students. Most (89%) of these PIs said that their RPP involvement led to other research opportunities or benefits and they credited their RPP projects with having a primary influence on scientific presentations (96%), publications (84%) and commercialization opportunities (17%). About half (45%) reported that the RPP grant/award did not affect or change their research path or career plans in any important way.
In summary, RPP is the sole CIHR program focused on the development of health research capacity in lower resourced regions. It has not been in place long enough for substantive development to have occurred and may, by itself, be insufficient to the task. It is very positively rated by the participant researchers, Advisory Committee members, and other stakeholders. This suggests that RPP requires more time to prove itself. Using the experience to date, a more rigorously designed program is now possible.
A renewed program could benefit from a number of significant program design considerations. These include: program partnership; performance measurement; and allotment levels. Consideration of each area is offered in Annex B: Program Design Suggestions at the end of this report. Suggestions for policy direction in a renewed program include:
- CIHR should continue to view a renewed RPP as a partnership program.
- Local Chairs should be asked to establish a consolidated partner fund under the management of the Advisory Committee.
- CIHR should actively support the Chairs of local Advisory Committees.
- A program of performance measurement should be designed at the start of a renewed program and measures should be collected on a cyclic basis to be consolidated at least once a year and examined as trend lines. The reports of the performance measurement should be shared.
- CIHR should consider increasing the allotment for a renewed Regional Partnerships Program.
Preface
The context within which the Regional Partnerships Program was designed had two elements that deserve mention. One is a predecessor program and the other is an underlying belief in the relationship between medical research and medical services.
In the latter part of the 1980s, the Medical Research Council implemented a program of Development Grants initially directed to "under-developed" colleges of medicine. This program, which had directed noticeable levels of funds to the medical colleges in four provinces (Newfoundland and Labrador, Nova Scotia, Manitoba and Saskatchewan), ended in the early 1990s. The foundation for the Regional Partnerships Program, which began funding in 1997-98, is set out in the following passage by James Wood,1 written in response to the termination of the program of Development Grants.
"The Medical Research Council's long standing policy of providing support for the development of research in "have not" colleges has been discarded, initially by opening up the competition to all colleges, and more recently by the total abolishment of the program. The deleterious effect of this action on the efforts of the "have not" colleges to develop and maintain viable and dynamic research programs cannot be over-emphasized. This situation raises the wider issue as to whether MRC, as a Federal Government Agency, has some social responsibility to ensure that a viable health research base is maintained in the different regions of the country.
Every effort should therefore be made by the involved constituencies to have the Developmental Research program reinstated for the "have not" colleges. Such a reinstated program should have more stringent regulations and a more focused objective than previously, and could well have participation by the appropriate provincial government."
The other context to recognise is the belief that the presence of medical researchers in a region has a direct and positive impact on the quality of medical services available to the population of the region. This belief, extant at the time of the Medical Research Council, has been extended to health research and now serves as an underlying rationale for the Regional Partnerships Program of the Canadian Institutes of Health Research. Given this belief, it follows that some minimal level of funding for health research should be maintained.
One other prefatory concept is appropriate, for it underlies many of the arguments for and against the Regional Partnerships Program. The question may be asked: "What is a region?" The Program itself is not prescriptive; it does not define a region. It does recognise that provinces should be involved in such definition, and some would suggest that a region is expected to be a province. However, the Program was not established as a partnership with provinces but as a partnership with Advisory Committees, initially linked to medical schools and then expanded to centres of health research. The issue is not moot, for there are many more centres of health research (including medical schools) now in the early stages of their development than was the case in 1996. Could Northern Ontario, the Coquitlam Region of British Columbia or Iqaluit, Nunavut be considered regions? And how many regions might be recognised within a province if the understanding of region is other than province? These remarks are offered as context. The remainder of the report focuses on the evidence relevant to an evaluation of the Regional Partnerships Program in its current form.
Background
RPP was designed as a response to a decline in funding from the Medical Research Council of Canada (MRC) to health researchers in Saskatchewan (SK), Nova Scotia (NS), Newfoundland (NL) and Manitoba (MN).2 These four provinces, all with at least one medical school, were recognized as receiving smaller proportions of MRC funding, relative to their population bases, through the open competition of MRC funding programs.
The Regional Partnerships Program (RPP) was initiated in 1995-1996.3 The first applications to MRC competitions were submitted for the September 1996 deadline and RPP funding began in the 1997-1998 fiscal year. Not all four provinces invited to participate during the initial two years of this partnership program accepted the invitation; Saskatchewan, for example, did not accept the initial program terms.4
The program in its current form began in 1999-2000. At that point, Saskatchewan joined the program.5 The program expanded, in June 1999, to include the provinces of New Brunswick (NB) and Prince Edward Island (PEI), neither of which has a medical school but both of which have research capacity in some of the four pillars of health research - Biomedical, Clinical, Health systems and services, and Population and public health. The expansion from medical to health research was coincident with the development of the Canadian Institutes of Health Research (CIHR).6
Under the RPP, research funding and personnel support applications that are judged through peer review to be of high scientific merit, but that fall below the funding capacity of CIHR's base budget in CIHR regular competitions, are eligible to receive funding if there is a partner to co-fund the proposal. In its original form (1997-98 and 1998-99), the ratio of co-funding was one MRC dollar to two partner dollars, with an annual maximum of $500,000 in MRC co-funding for each institution. Given the additional resources made available for health research in the 1999 federal budget under the Canadian Institutes of Health Research Initiative, MRC decided at its March 1999 Council meeting to enhance the RPP by increasing the annual maximum of MRC co-funding to $1M for each of the original four provinces (Manitoba, Newfoundland, Nova Scotia and Saskatchewan) and changing the funding ratio to allow 1:1 partner funding. Funding for the newly admitted provinces (New Brunswick and Prince Edward Island) was set at $200,000 per annum, also with a 1:1 partner funding ratio. CIHR's current commitment to the program is $4.4M per annum. The allotment to a province is an expenditure limit for a fiscal year: the sum of the authorized fiscal-year expenditures for all RPP projects within a province may not exceed the allotment for that province.7 The actual expenditures for projects supported by RPP will always be at least double the allotment use, since partners must provide at least equal funding to the CIHR contribution and may choose to pay more.
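The allotment arithmetic can be summarized as follows, using only the figures stated above:

\[
4 \times \$1{,}000{,}000 \;+\; 2 \times \$200{,}000 \;=\; \$4{,}400{,}000 \text{ per annum (CIHR commitment)}
\]
\[
\text{Total expenditures on RPP projects in a province} \;\ge\; 2 \times \text{CIHR contribution} \quad \text{(1:1 ratio; partners may exceed it)}
\]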
There is a general expectation that a significant proportion of the activity in this program will be directed to the recruitment and retention of promising and/or excellent scientists. These talented individuals should, in turn, be able to generate new sources of peer-reviewed funding through normal competitive programs. However, RPP proposals may also contain initiatives that are provided for in any of CIHR's other programs (e.g., training awards, operating grants, equipment).
The specific guidelines and procedures used in each province vary, sometimes in significant ways. Although all operate within a set of general operational guidelines and financial rules, there remains much scope for each Advisory Committee to design RPP rules and procedures that best serve its own situation. The general RPP process is outlined in Figure 1.
Individual provinces thus have significant discretion within the guidelines issued for RPP. When the program started, there were few guidelines and even fewer were codified in written form.
Over time, Chairs of the local8 Advisory Committees have met with CIHR managers to discuss the program and share experiences. These in-person and teleconference meetings have yielded RPP Guidelines and individual local Guidelines. Three of the program parameters are mentioned here to show the interplay between RPP Guidelines and local practices.
- Cut-off for eligibility? RPP started with 3.0 as the minimum point rating for a project to be considered eligible. That threshold was later raised to 3.5, although some Advisory Committees occasionally still consider projects rated below 3.5.
- Rated points or other criteria? Some Advisory Committees designate funding for the projects with the highest point ratings; others favour projects that address other priorities, such as a submission from a young researcher or one that aims to study a specific health research topic.
- Requirement for advance review of submission to CIHR? All Advisory Committees require at least a minimal advance review of a submission that is pre-registered for RPP; some insist on a formal review process that may also include mentoring opportunities.
The main client for this evaluation is CIHR management, which must make a decision on renewal of the program. The focus of this summative evaluation is to assess the effectiveness of the program in achieving its objectives, describe its impacts (both intended and unintended), judge its continued relevance, and identify alternative ways of achieving the expected results.
Evaluation Methodology
There were four approaches used in this evaluation: document review; data mining; interviews; and Web-based surveys.
Document review. Among the documents read were: MRC Regional Partnerships Program; Announcements of Competition Awards in 2002, 2003 and 2004; Various RPP Policies; RPP Competition and Administration Timelines; Evaluation and Performance Measurement Framework, CIHR, April 2004; Minutes, RPP Meeting, November 29, 1999, Sheraton Hotel (location not given); Teleconference, Advisory Committee Chairs, March 15, 2000; Meeting Minutes and Related Documents, March 23, 2004, Charlottetown, Prince Edward Island; Student Training Workbook, Electronic Information System, Medical Research Council of Canada, April 14, 1999; Organisational Design of CIHR, Process Maps and Profiles (February 20, 2002); regional evaluation reports; a study titled Health Research Funding in Colleges of Medicine Located in Provinces with Relatively Small Populations, James Wood, c.1994; and the many documents collected during visits to the six RPP regions.
Data mining. CIHR made available extensive data from the MRC/CIHR databases. It provided spreadsheets on request detailing submissions for funding and the outcomes of those submissions; information on CIHR contributions and partner funds for given RPP projects; information on the history of interactions between individual researchers and CIHR; and listings of project titles, project participants and other descriptive information. To these were added data on populations and gross domestic products from Statistics Canada.
Interviews. The bulk of the interviews took place during field visits to each of the six RPP provinces. On-site interviews were conducted with the Chair and members of each region's Advisory Committee, researchers who had participated in the RPP processes, and representatives of involved universities, provincial governments, institutes and foundations. At CIHR headquarters, interviews were conducted with a range of managers, including officers with previous and current RPP experience in both policy and operational roles and with MRC as well as CIHR experience. All interviews followed the standard procedure of being preceded by a Backgrounder to advise the interviewee of the issues and questions to be explored. Building on the Backgrounder, the interviewers were guided by an interview protocol which addressed the several issues and questions and gave particular attention to the details of strategic planning, including RPP guidelines for the region and operational procedures. Interview notes were prepared after each meeting and were organised using a predetermined data grid.
Surveys. Three surveys were conducted on the Web using the facilities of Fair Surveys. The three target populations were Principal Investigators, Other Researchers, and Partners. Lists for the first two were developed from CIHR data. A list of Partners was prepared with assistance from the Advisory Committee Chairs or their administrative assistants. Surveys were designed, reviewed with program-knowledgeable people, written in HTML and posted, tested internally, revised as necessary, and tested with RPP researchers who would not be part of the actual target populations because of the timing of their projects. All survey administration followed the same pattern. An initial e-mail was sent to confirm the validity of the e-mail address and the name of the person who would be responding to the survey. Lists were corrected as required. The survey was then sent and followed, as needed, by up to two e-mail reminders spaced at intervals of three working days. Finally, a telephone call was made to those who had not yet responded.
Data analysis. The data from all sources was both quantitative and qualitative. It was collected, integrated and applied to the evaluation questions. The key question was whether RPP was successful in reversing the decline in funding observed in the earlier part of the 1990s. This was to be achieved by creating partnerships, by leveraging local funds, and by promoting the recruitment and retention of promising and/or excellent researchers working in areas of local strengths and priority interests. These were the mechanisms intended to increase the success rate of individual researchers in CIHR funding opportunities other than RPP. Additionally, the evaluation sought to understand the program and its implementation in ways that would serve CIHR well should it renew the program either in its current form or in a revised form.
The data lent itself to analysis by descriptive statistics and content analysis. Other than confidence intervals for survey responses, there was no need for inferential statistics. There had been an intention to apply a quasi-experimental design to add meaning to the findings but that intention was not realised.
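The confidence intervals mentioned above for survey responses are of the standard kind for a sample proportion. As an illustration only (the exact method used in the Working Papers is not specified here), under simple random sampling and a normal approximation a 95% interval for a reported proportion \( \hat{p} \) from \( n \) respondents is:

\[
\hat{p} \;\pm\; 1.96 \sqrt{\frac{\hat{p}(1-\hat{p})}{n}}
\]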
Quasi-Experimental Design
As noted above, in the planning for this evaluation it was intended to apply a quasi-experimental design. In effect, this design would have classified the work of the Advisory Committees and the quality of their strategic plans and, having sorted these into "relatively good", "average" and "relatively weak", compared the performance of these groups on the success criteria. It has not been possible to do this because the six situations sorted themselves into five relatively good strategic plans and one relatively poor one.
Therefore, this intended feature of the evaluation cannot be carried out.
Focus of Report
An evaluation study is required to reach valid conclusions on the merit, worth or value of the evaluand - in this case the Regional Partnerships Program. The essence of the process is to identify relevant standards, investigate the performance of the evaluand against those standards and, through integration or analysis of the results, arrive at an overall evaluation. For CIHR, RPP is a first stage program. With this evaluation, it completes its first cycle of program design, implementation and feedback on results. The focus of this report is program results with one key results question: Is the distribution of CIHR research funds to the target regions (defined as provinces) proportionate to the populations of the provinces? The standard for this intended result is straightforward. If the difference between the province's share of CIHR research funds and the province's share of Canada's population is zero (or acceptably close to zero), then this intended result has been achieved.
As well, there are very extensive results from three areas:
- the CIHR database, with information on submissions for funding and the outcomes of those submissions; on CIHR contributions and partner funds for given RPP projects; on the history of interactions between individual researchers and CIHR; and on project titles, project participants and so forth;
- Web-based surveys of Principal Investigators, of other researchers, and of RPP partners; and
- data collected during on-site visits to each of the six provinces with RPP Advisory Committees, covering strategic planning, the procedures in place for the direction and management of RPP within the region, and the views on RPP of Advisory Committee members, academics, RPP researchers, and representatives of provincial governments, foundations, institutes and so forth.

The findings from data mining, interviews and surveys are placed in Working Papers. These are very large documents. The Working Paper on the analysis of the CIHR database is about 50 pages. There are three Working Papers for the survey results, each with approximately 40 pages of tables and content analyses. There is a Working Paper of about 85 pages that presents six local profiles, one for each province, and an integration across the six. The material in these Working Papers has been studied and has informed the preparation of this Final Report.9
This Final Report addresses the question of whether the Regional Partnerships Program is successful and, if renewed, what are the important parameters to consider in the design of a renewed program. As such, the focus is on the program as a whole. Attention will not be given to individual regions or provinces except for the very specific and central question of the difference between the receipt within a province of CIHR research funds and the province's share of Canada's population (what will later be referred to as the "gap" analysis).
RPP Objectives
The program objectives of the Regional Partnerships Program, as presented by CIHR, are to:
- (a) create partnerships with the smaller provinces by leveraging local funds;
- (b) promote the recruitment and retention of promising and/or excellent researchers by building on local strengths and priority interests of the institutions;
- (c) reverse the decline in funding observed in the earlier part of the 1990s.
To these three, a fourth, implicit objective (understood to follow from objectives (b) and (c) taken together10) has been added:
- (d) increase the success rate of individual researchers in CIHR funding opportunities other than RPP.
1 James D. Wood, Health Research Funding in Colleges of Medicine Located in Provinces with Relatively Small Populations (undated – either 1994 or 1995), page 18.
2 James D. Wood, Health Research Funding in Colleges of Medicine Located in Provinces with Relatively Small Populations (undated – either 1994 or 1995). James D. Wood, Department of Biochemistry, University of Saskatchewan, Saskatoon, Saskatchewan, S7N 0W0.
3 Letter of August 11, 1995 from Dr. Henry Friesen announcing the new MRC Regional Partnerships Program.
4 Program file notes. Report of the Mid-Term Evaluation of the Sask-CIHR RPP, December 2002
5 Saskatchewan was the only province to enter into an agreement with CIHR for participation in this program.
6 The CIHR Act came into force in June, 2000.
7 There is a recent variation to the use of RPP funds which may allow the amount spent within a province to exceed the stated limit. On March 1st of the year, any unused portions of provincial allotments may be redirected to other provinces which have projects which they wish to fund under RPP but for which they lack sufficient funds within their annual allotment.
8 The use of the term “local” is “code” for provincial. RPP is not a federal-provincial program, although MRC and later CIHR clearly intended important participation by provincial representatives. RPP began as (and remains) a partnership between CIHR, operating under a federal mandate, and a university-based Advisory Committee in each of the provinces for which an allotment had been provided.
9 The Working Papers will be available upon request.
10 "d" is also justified by this phrase from the original Program Guidelines: "These talented individuals should, in turn, be able to generate new sources of peer-reviewed funding through normal competitive programs."