Keeping Programs and Resources in Sync
Drake University was long on programs and short on budgets—a mismatch corrected by a comprehensive review.
By Victoria F. Payseur
Key to Drake’s subsequent transformation was the adoption of a process to thoroughly review academic and administrative programs. By doing so, Drake has generated millions of dollars of permanent budget savings. Establishing clear funding priorities and deliberately allocating resources has strengthened the university and clarified its focus. The process we employed may provide ideas for your institution.
Outlining a Model Process
In January 2000, President Maxwell organized a retreat involving key administrators and the Faculty Senate Executive Committee. Robert C. Dickeson, author of the 1999 book Prioritizing Academic Programs and Services: Reallocating Resources to Achieve Strategic Balance, led the discussion that established the framework for Drake’s program review. Shortly after the retreat, all university programs were divided into two categories—academic (instructional/degree) programs and administrative (nonacademic) programs. The academic program review, led by the provost with assistance from the Faculty Senate Executive Committee and various other faculty members, generally followed the approach outlined in Dickeson’s book. The administrative program review followed a parallel path, but its details required significantly more developmental work because Dickeson’s book focused primarily on reviewing academic programs. Since the two reviews shared many similarities—and the administrative review is of more relevance to chief business officers—this article will focus on that process. Both reviews, however, were conducted concurrently, and their results were reported at the same time.
Soon after the retreat, Maxwell appointed the Administrative Program Review Committee (APRC), cochaired by the vice president for business and finance and the vice provost for human resources. Thirteen other appointees were selected for their ability to “see the big picture” and to be credible advocates for change. To keep the process fair, participants were required to set aside their individual biases and focus on what was best for Drake. The final committee included representatives of a cross section of administrative units: one academic dean, five midlevel program administrators, two faculty representatives, one nonexempt support staff member, one bargaining unit steward, one student, and two members of the Drake Board of Governors.
Clarifying tasks. The primary purpose of the group was to serve as the architects of the administrative program review process and as guarantors of the transparency, integrity, and validity of that process. The APRC’s tasks were to:
- Establish procedures and a timetable for its work. (The final written report was due to the president in September.)
- Develop a working definition of “program” and a list of programs to be reviewed.
- Identify and define criteria for evaluating each program.
- Develop the format and template for data collection and reporting.
- Develop a methodology to ensure objective and thorough review of data collected.
- Prepare a rating recommendation for each administrative program included in the APRC’s final written report.
- Design and execute an ongoing communications strategy for the administrative program review process.
Identifying goals. Before starting its tasks, the committee articulated two desired outcomes.
- To reduce administrative costs by eliminating low-priority programs; eliminating unnecessary duplication and redundancy; improving efficiency and planning; and/or recommending internal and external strategic partnering, including outsourcing.
- To improve and enhance administrative services to academic programs, students, and other university constituents.
Establishing principles. The group adopted several key principles to guide and inform future recommendations.
- Institutional goals (and the greater good) take precedence over departmental and individual employee goals.
- The process will respect individual employees and will attempt to protect their interests as much as possible.
- All recommendations and/or decisions will be carefully reviewed to assess the potential impact on students and to minimize any adverse effects.
The committee quickly began communicating its goals. Information was distributed to the campus community via e-mail. Frequent e-mail updates and Web postings regarding the administrative program review activities continued throughout the next several months, while the review procedures and the criteria were being developed.
Defining and Collecting Data
Meanwhile, the group began addressing the first of its tasks so that input could be gathered during the spring semester.
Defining administrative program. Dickeson defined a program as “any activity or collection of activities of the institution that consumes resources (dollars, people, space, equipment, time).” For Drake’s administrative review, the APRC defined an administrative program as “a discrete administrative functional unit or department with a definable budget.” Then all program administrators defined the separate programs under their management. For example, programs within athletics included each intercollegiate sport, sports information, ticket office, sports medicine, and other related areas. Student life administration identified orientation, counseling center, health center, Greek life, residence life, and the office of the dean of students among its discrete programs. About 130 administrative programs were identified and subsequently reviewed, including a number of discrete programs in the accounting and finance area: budget office, internal audit, endowment accounting, controller’s office, office of the vice president of business and finance, accounting, student accounts, payroll office, loan office, grants accounting, and purchasing.
Setting guidelines. After defining and listing all administrative programs to be reviewed, the APRC established review guidelines, which were provided in advance to the entire campus community via a dedicated Web site. Posted information included the criteria and data collection template to be used for the program evaluation, the procedures for the review, the rating system and voting rules, and the timetable.
Establishing review criteria. The APRC modified Dickeson’s approach for academic review, adopting five broad criteria to serve as the basis for evaluating each administrative program: (1) importance to Drake, (2) external/internal demand, (3) quality, (4) cost-effectiveness, and (5) opportunity analysis.
The criteria were then presented to Drake’s key administrative staff for reaction and feedback before being put into use. All managers would be asked to evaluate every administrative program in their units based on these five criteria. For each criterion, a series of specific responses was required.
Creating the data-collection template. To ensure that required responses netted uniform and consistent information, a template was created to provide a series of questions that had to be answered for each criterion. The template guidelines specifically limited the length of the responses; even the type style and font size were specified to give everyone an equal amount of space in which to justify each program. (See sidebar: “Designing a Data Collection Template.”)
The completed template served as the sole instrument for data gathering regarding each administrative program. No appendices or attachments were permitted other than an organizational chart identifying the program’s employees and listing the top several functions performed by each. The template and instructions were provided during the first week of April, and a follow-up workshop for those needing assistance was conducted within 10 days. The completed templates were due to the APRC by June 1, allowing managers approximately two months to complete the responses for all discrete programs. The administrators of only a few programs requested and received extensions.
Applying the Review Process
Designing a Data Collection Template
To ensure that uniform and consistent information was gathered as a basis for its administrative program review, Drake University developed five criteria around which to frame questions related to administrative programs. In addition, imposed response lengths and other specifications gave program reviewers equal space in which to justify programs.
1. Importance to Drake
2. External/Internal Demand
3. Quality
4. Cost-Effectiveness
5. Opportunity Analysis
Each member of the review committee received electronic copies of all program data collection templates submitted for review. The cochairs developed a review calendar that assigned each program a specific date and time for APRC review and discussion. The reports were to be read by each APRC member in advance of the program’s scheduled discussion by the committee. From early June through early August, review participants met twice a week for two to four hours to review the reports.
Confidentiality. Although the early review process steps were transparent, the APRC informed the campus community that once it began meeting to analyze the individual programs, no interim information regarding the committee’s progress would be provided to anyone outside the review committee. Strict confidentiality rules were observed, forestalling rumors and politicking. In fact, no further information was provided to the campus community until the final written report had been reviewed by the president.
Rating and voting. A simplified system was used to rate each program: After reading a program’s data collection template, each committee member independently reviewed and rated the program by assigning a plus (+), minus (-), or neutral (0) to each of the five criteria and calculating an overall rating for the program. Each program discussion opened with a sharing of the individual ratings determined for the programs under review. The full committee then discussed the initial individual ratings for each program and decided whether to conduct a committee vote on the program, to assign the program to a subcommittee for further analysis, or to contact the program manager for additional information.
Eventually, two votes—one preliminary and one final—were taken on every administrative program. APRC members with a vested interest in particular programs recused themselves from the initial discussion and the preliminary vote for those programs but did participate in the final vote. While the preliminary votes reflected a mixture of opinions, the final votes, taken at the end of the process after all programs had been analyzed, were usually unanimous.
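For readers who want to picture the mechanics, the plus/minus/neutral scheme can be sketched as a simple tally. The article does not specify how members “calculated an overall rating,” so the summing rule below is an assumption, and the criterion names are taken from the five criteria listed earlier:

```python
# Hypothetical sketch of one member's rating tally. The actual APRC
# calculation method is not described in the article; summing +1/0/-1
# scores across the five criteria is an assumed, illustrative rule.

CRITERIA = [
    "importance to Drake",
    "external/internal demand",
    "quality",
    "cost-effectiveness",
    "opportunity analysis",
]

SYMBOL_TO_SCORE = {"+": 1, "0": 0, "-": -1}

def overall_rating(ratings):
    """Sum one +/0/- mark per criterion into an overall score."""
    assert set(ratings) == set(CRITERIA), "rate every criterion exactly once"
    return sum(SYMBOL_TO_SCORE[mark] for mark in ratings.values())

# One committee member's marks for a single program:
member_ratings = {
    "importance to Drake": "+",
    "external/internal demand": "0",
    "quality": "+",
    "cost-effectiveness": "-",
    "opportunity analysis": "0",
}
print(overall_rating(member_ratings))  # prints 1
```

Under this assumed rule, a program scoring well on most criteria nets a positive total, giving the committee a quick starting point for its discussion.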
Ratings and Recommendations
The objective of the final vote was to group each administrative program into one of four categories: enhance, maintain, reduce and/or restructure, and eliminate. Although Dickeson’s process recommended using five categories with an equal number of programs assigned to each one, the APRC decided against applying such a formula to its ratings allocation process. The idea was not to limit widespread change on the administrative side to a specified percentage of programs.
Here are representative examples of the four categories adopted by the administrative program review committee.
Enhance: Only those administrative programs with real mission-enhancing potential were placed in this category, which generally was earmarked for additional budgetary support. Several programs recommended for enhancement fell within information technology—for example, network administration, instructional technology, and user services. Human resources also was placed in the category, with specific emphasis on improving employee training and development services, employee morale programs, compensation planning, and performance appraisal systems.
Maintain: These programs were to be maintained at the current level of staffing and other support. This category was used primarily for programs deemed necessary to the institution and exhibiting sufficient quality, demand, and cost-effectiveness. Consequently, nearly all of the accounting and finance programs identified earlier were included here. (See the next category for a review of the exceptions.) Two other programs that fell into this category were campus security and undergraduate admissions.
Reduce and/or restructure: This category housed programs to be considered for reduced financial and staffing support, combined with other programs, or restructured to provide savings. Generally, the identified programs were (1) non-core, but necessary, programs that would benefit from some type of rethinking, and (2) programs with opportunities for significant cost savings and/or service improvement potential.
Almost half of all administrative programs reviewed were identified for some type of reduction or restructuring. One good example is the Athletics Academic Liaison program. The review committee recommended moving the program from athletics to the Provost’s Office to ensure a uniform approach to student academic success and to strengthen the coordination between athletics and academics.
In the finance and accounting areas, internal audit was recommended for restructuring to provide expanded analytical and financial services to the university administration and to its board of governors. The internal audit function was later outsourced to a CPA firm (separate from the university’s external auditors) with excellent results: less cost and much better risk management and review. In addition, the office of the vice president for business and finance, the loan office, and purchasing were recommended for restructuring or consolidation.
In many other cases, the program reduction/restructuring recommendations were far-reaching and included the following actions:
- Reduce staff.
- Consolidate decentralized and duplicative operations (e.g., marketing).
- Outsource sports medicine and the print shop.
- Physically relocate programs to better match staff size and space capacity, encourage better synergy between programs, or permit the sale or rental of off-campus properties.
The time frame for reduction/restructuring of administrative programs was generally no more than two years.
Eliminate: Programs placed in this category exhibited limited demand, limited future opportunity, and minimal connection to the mission of Drake. In all, the APRC identified about 10 percent of administrative programs for total elimination. Programs in this category generally required phasing out over time; most were completely eliminated within one year.
One example of a program identified for elimination was the Drake Business Center, which brokered and facilitated corporate training programs for third-party vendors at a leased, off-campus facility. The program did not seem to fit Drake’s mission and it was not financially viable. A six-month transition plan was developed, three staff positions eliminated, and the lease terminated.
Another program recommended for elimination was the physical plant store, which ordered, inventoried, and managed facility supplies. The staff position and the accumulated inventory were replaced by a just-in-time ordering and delivery system.
Additional themes: In addition to the findings regarding particular programs, the administrative program review process uncovered several problematic issues that crossed program boundaries and related to administrative practices and procedures of the university as a whole. For example, the APRC discovered a significant amount of duplicative services within the organizational structure as well as inconsistencies in personnel practices involving workload policies, overtime pay, and processes for employee performance evaluations. Those cross-program themes were addressed and described in the final written report.
The final task of the APRC was to prepare a written report that included the findings and rating recommendations for all programs. The report also assigned an accountable person and an implementation deadline for each recommendation. Each APRC member reviewed and approved the draft report before it was issued in final form to President Maxwell, who received the report of the academic review by the provost’s committee at the same time. The president then took several weeks to privately review both the administrative and the academic program review reports before making them available to the entire campus community and the public.
Feedback: Following the public release of the reports in mid-October, the president held a series of town meetings to allow interested staff, faculty, and students to comment on the findings and recommendations in the two reports. The purpose of gathering input was not to defend the recommendations but to receive feedback from the campus community that might inform and influence the next levels of the review process.
Final review steps: The concluding steps in the review process occurred across a three-month period from mid-October 2000 through mid-January 2001. A committee of eight faculty and five staff elected by their peers, the student body president, President Maxwell, and his six-member cabinet (the provost, the vice president for business and finance, the vice president for institutional advancement, the executive assistant to the president, the president of the faculty senate, and the assistant to the president) reviewed and discussed each program recommendation from both reports. Meeting weekly, this Review and Priorities Advisory Committee (RPAC) discussed each recommendation until arriving at a consensus regarding the appropriate action to be taken. Its recommendations did not always agree with those made in the program review reports. This assured the campus community of fair and thoughtful secondary evaluations, which became the next layer of the program review report.
Finally, President Maxwell independently reviewed the academic and administrative committee reports along with the RPAC’s recommendations and provided his written recommendation regarding each program. The financial impact of each final recommendation was calculated by the vice president for business and finance, and the estimated savings were scheduled into a four-year budget plan. As the last formal step, in January 2001, the university’s Board of Governors reviewed the final report and voted to approve it in its entirety.
Implementation and Results
The implementation of the comprehensive program review took approximately three and a half fiscal years. In that time, as a result of the program review process, Drake generated more than $4 million in permanent budget savings from eliminated or restructured programs. Furthermore, Drake’s operating budget improved incrementally in each of the past six years. FY06 resulted in an operating surplus of $1.4 million, approximately 1.4 percent of operating revenues.
By identifying clear funding priorities and reallocating resources, Drake has emerged a stronger and more focused institution, one that is better able to manage its future. Program review became the first step in an ambitious and ongoing planning process at Drake that has subsequently included the development of a multiyear strategic plan and a four-year strategic budget; a thorough review and analysis of the institutional mission; and the creation of a Vision 2025 document to inspire Drake to become a truly great university. Thanks to program review and its outcomes, Drake is now positioned to create an exceptional learning experience for its students, while ensuring that tuition dollars and other revenues are being used strategically and efficiently.
VICTORIA F. PAYSEUR is vice president for business and finance and treasurer, Drake University, Des Moines, Iowa.