Open to Assessment
Rapid growth led Philadelphia University to adopt an evaluation program to better manage its physical plant. The results: an insightful action plan, and honors from APPA.
By Randall D. Gentzler
In the early 2000s, Philadelphia University's success was highly visible. The institution was experiencing a period of rapid enrollment growth and academic program expansion. And more students were living on campus than ever before. As a result of this rapid growth and changing demands, the physical plant department was losing its edge.
We faced tremendous challenges in trying to meet the needs of campus building and grounds maintenance. Limited manpower, unsatisfactory delays in responding to requests, and constant pressure on available resources made it increasingly difficult to maintain our targeted quality level of facilities maintenance.
Also, during this period, major capital construction was under way on campus. The physical plant department was managing the design and construction of a 34,000-square-foot, high-tech academic building. In addition, we installed a new, major internal pathway that connected various areas of our campus, and we had just purchased a 108-unit, three-building apartment complex that required deferred-maintenance and life-safety upgrades before it could be used as new residence space for students. Adding to the chaos, the city was conducting a major street reconstruction on the main thoroughfare intersecting our campus, requiring detours and continually changing traffic patterns. All this activity quickly increased the workload for our already-burdened staff. The department was spread too thin, and concerns over quality and response time were heard loud and clear from our university stakeholders.
Because Philadelphia University takes great pride in the appearance of its campus and in its ability to meet the needs of the campus community, it was becoming clear that we needed a new strategy to turn things around.
How Philadelphia University pulled itself back from a strained maintenance situation is a story of motivation, self-discovery, and the willingness to open our campus to an external evaluation process. Within seven years, we not only achieved our goal of elevating the physical plant operation, but the university won the facilities management Award for Excellence in 2009, the highest institutional honor awarded by APPA: The Association of Higher Education Facilities Officers.
Scope and Strategy
To put things in perspective, Philadelphia University's physical plant department is responsible for the basic operation and managed care of the 52 buildings on the 100-acre campus. These buildings include historic century-old mansions as well as much newer, architecturally award-winning academic buildings. The facilities inventory tallies just over 1 million square feet, with buildings dating from 1779 to 2006. The buildings are the university's most significant asset. It was important that we quickly identify a process for improvement while working within the funding constraints caused by rapid institutional growth and the competing demands for resources. Over a five-year period in the early 2000s, our full-time undergraduate enrollment had grown by more than 15 percent, while our facilities maintenance budgets grew by less than half that amount. Achieving the required level of facilities maintenance and stewardship of our assets was crucial to our continuing institutional success.
After a thorough review of our options, we decided on the APPA Facilities Management Evaluation Program. The FMEP process would use our internal resources and allow our staff to be an integral part of a strategic assessment of our overall facilities inventory and, eventually, a plan of action. We believed this process could ultimately lead to positive changes within the department and was the best approach by which to garner the internal support necessary to move the initiative forward.
Another key feature of the APPA program is an audit conducted by professional peers, which represented a cost-effective approach for the university, as the overall cost of this program was significantly less than other external options we explored. In the tradition of campuswide collegiality, we wanted an objective third-party evaluation of our overall condition. The evaluation program would benchmark our plant to national standards, drill down to find our customer concerns, recognize deficiencies, propose areas for improvement, and justify resources where needed.
The program is customized to each institution and leverages the available resources. We were prepared to participate in the five steps of the exercise:
- Perform self-evaluations by an internal team representing various campus departments and functions.
- Undergo a thorough review by an external team of colleagues from similar institutions during a campus visit.
- Make a formal assessment identifying areas of concern.
- Share results and objectives among the internal and external teams.
- Develop the university's corrective action plan and implement it on an ongoing basis.
According to Thomas Becker, associate vice president for operations, "This seemed to be the best option for Philadelphia University. All involved knew that it is difficult for a department working under a great deal of pressure to be placed under the microscope. However, the process and structure of this program allowed us to absorb the information and feedback more constructively. Ultimately, it allowed us to clarify where we needed to change; guard against resistance to change; ensure a comprehensive, collaborative approach to change; and make change happen."
Let the Process Begin
Our hope was for the university to see its strengths and weaknesses from both internal and external perspectives and be able to base future strategies on this information. Based on the concept that well-maintained facilities go hand-in-hand with the ability and openness to learn, APPA's plan offers an assessment and comprehensive evaluation of the quality of the institution it is focused on, while keeping in mind that every institution is different in culture, size, shape, and purpose. Therefore, recommendations may differ as to how to achieve and maintain success.
"We all have blind spots, so an APPA evaluation team can help an institution's leadership see what they might not be able to see for themselves," said E. Lander Medlin, executive director of APPA. "We liken it to a visit to the doctor for a physical examination. We may not want to hear everything the doctor has to say, but ultimately the knowledge he or she provides is a powerful tool, and with it we can make informed decisions."
Believing in those benefits, we launched into the evaluation process.
Looking within. To begin the process, we put together a team representing a broad spectrum of staff. The intent was to conduct interviews with those who have a stake in the success of the department, those who receive and appreciate the services of facilities management, and to acknowledge and respond to their feedback, even if it was negative.
There are two parts to the self-evaluation. The first is putting together a profile of the department in what is called an "Organizational Description." This is an opportunity to paint a clear picture of the way the department works; its main products and services and how they are delivered; the values, vision, and mission of the department; and the employee demographic profile. In addition, we looked at technologies that are used; regulations that affect our work; department governance; basic customers and stakeholders; and those with whom the department interacts in terms of partners, suppliers, and distributors.
The second part of the self-evaluation, "Organizational Challenges," focuses on the issues that the department is facing. We asked ourselves important questions, including those regarding:
- Institutional position. Where does the institution fall in the education market in terms of size, and what other institutions are comparable?
- Success factors. How do we determine and measure success? What are the factors taking place that have affected this?
- Comparative data. What are the key measures that determine how this institution is doing compared to other similar institutions?
Critical analysis. Once we gathered all the information, APPA evaluated it based on seven important criteria, which also play a big role in the peer-evaluation element of the process that follows the self-evaluation. The seven assessment areas, steeped in the tradition of the Malcolm Baldrige Performance Excellence Program, include:
- Leadership. How do the university's senior leaders set standards and guide the focus to the customers? Do they uphold clear and visible values and high expectations in line with the institution's mission and vision?
- Strategic and operational planning. Are goals being identified and actions taken to achieve success? Many issues might come up at this point, including changes in customer expectations, possibilities for creating business partnerships, and developments in technology.
- Customer focus. Are the people being served having their needs met; do they feel heard, understood, and responded to? Tools might be considered that improve communication with customers and follow-through on requests.
- Information and analysis. How is performance evaluated? Once that is established, the information can be used to move forward with improvements.
- Development and management of human resources. This criterion focuses on the department's employees and partners, pointing toward the belief that having the right people in the right positions can be crucial to success.
- Process management. The focus is on processes such as performance standards, work management, planning, and design.
- Performance results. This is the ultimate criterion, measuring success through campus appearance, employee satisfaction, effectiveness of systems operations, customer satisfaction, and financial bottom line.
External experts. Once we completed the self-evaluation and forwarded the results to APPA, the association assigned us a review team made up of physical plant senior management from universities around the country. Our team included a retired facilities management director from Arizona State University, Tempe, who also happened to be a former APPA president, and the director of facilities at Montana State University, Bozeman. The team of two came to our campus and interacted and engaged with faculty, staff, and students throughout the institution in many different departments over a three-day period.
The site visit was not unlike an academic accreditation process. It began with a dinner at which we met and got to know the review team. We began the following day with a structured agenda that called for a series of meetings throughout the duration of the visit.
The evaluation team met with 24 different groups across campus to gain staff impressions of the plant operations and facilities maintenance. Involving all these groups—including student life, various academic schools within the university, the president's council, and the president himself—resulted in stakeholders becoming more vested in the process and more likely to support the efforts that we would need to make.
"Participants were able to convey to the review team information about the areas where they felt improvement was necessary. It was also a chance to collect information about what we were doing right," says Becker. "In the end, when we made the recommended changes, there was less resistance. If we needed resources to improve things, people understood the reasoning behind the request. It was a win-win proposition all around."
"The site team interviews as many constituents as possible," adds Medlin. "They want to get a 360-degree view of the university."
An important element of the review team dynamic was that, overall, individuals felt free to talk openly to the team members, more so than they might have if interviewed by their supervisors. As with the self-evaluation, this team looks to make sure that specific jobs and roles are clearly defined within the department and that the institution is operating at an efficient level. Also, student endorsement is something that can't be ignored. The team's work included determining whether students are happy with the services they are receiving and with the overall maintenance of their institution.
Where Do We Stand?
At the conclusion of the site visit, Tom Becker and I sat down with the review team to discuss its findings. We were very much aware of the five grades (ranging from the best—Level 1, "Showpiece Facility," to the worst—Level 5, "Crisis Response") that determine the level at which each university is rated (see Figure 1). To be honest, we were concerned that we would be ranked in the less-favorable categories.
Based on its overall assessment, the team confirmed that, although we were not placed in one of the less-favorable categories, there were clear indications that we were regressing in certain areas, as we had previously suspected.
We used this baseline assessment as our starting point to acknowledge where we were and then set our goals to clarify what we wanted to accomplish. We decided that we would target overall improvement while not necessarily targeting the top level initially. Instead, we committed to the policy that no performance aspect should slip below the middle level of "Managed Care," while striving for the second-highest status of "Comprehensive Stewardship." In other words, we made the choice to support the advancement of the entire university from a resource and funding standpoint, but gave priority to meeting the needs of our new and growing academic programs. However, at a minimum, we wanted to operate the physical plant department at a level that benefited everyone and in which the university could take pride.
Tackling the Top 40
The review team's verbal assessment was followed by the final, 103-page confidential report published by APPA. In it, the FMEP team indicated that our department's team was knowledgeable and hardworking. At the same time, the report included 40 recommendations for enhancement or improvement, all of which the university accepted.
Now our path was set and we were ready to implement the recommended changes. Again, we involved as many stakeholders as possible in the process. Based on APPA's national criteria that aligned with the information captured in the FMEP review, we created dashboard reports by which to record and update our progress on an annual basis.
We listed the 40 objectives in priority order based on project urgency and the institution's funding timeline. Our goal was to categorize the 40 recommendations as short-, mid-, or long-term goals, and we planned to accomplish all 40 within a six-year time frame. Our annual reports would carefully track and measure our progress against the objectives.
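To make the tracking approach concrete, here is a minimal sketch of how recommendations might be bucketed by time horizon and rolled up for an annual dashboard report. The data and field names are illustrative assumptions, not the university's actual system or its real 40 recommendations.

```python
from dataclasses import dataclass

@dataclass
class Recommendation:
    """One FMEP recommendation as tracked on an annual dashboard (hypothetical)."""
    title: str
    horizon: str          # "short", "mid", or "long"
    target_year: int      # fiscal year by which it should be complete
    completed: bool = False

def dashboard(recs, current_year):
    """Summarize progress by horizon, the way an annual report might."""
    summary = {}
    for r in recs:
        bucket = summary.setdefault(r.horizon, {"done": 0, "total": 0, "overdue": 0})
        bucket["total"] += 1
        if r.completed:
            bucket["done"] += 1
        elif current_year > r.target_year:
            bucket["overdue"] += 1
    return summary

# Hypothetical example entries, not drawn from the actual report
recs = [
    Recommendation("Online work-order entry", "short", 2004, completed=True),
    Recommendation("Incremental staffing growth", "mid", 2006, completed=True),
    Recommendation("Expand building automation", "long", 2009),
]
print(dashboard(recs, 2007))
```

A simple roll-up like this makes it easy to see at a glance which horizon is slipping, which mirrors the article's point about measuring progress against objectives year over year.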
Figure 2 shows the way we allocated our plant resources as of FY09. The percentages shown are based on actual work orders received and time spent. At the top of the list of objectives was our overarching mandate, "safe and legal." We worked clockwise each day around the areas of the chart. Early in the process, a greater share of our effort went to reactive work; now, we are investing more resources in stewardship.
We learned that larger aspects of facilities stewardship can be managed during off-peak hours and with timely capital infusions from year-end margins created by growing and above-budget enrollment levels. Customer service, however, must be ongoing and not be allowed to dip below a certain level since it may prove nearly impossible to regain lost goodwill with our customers. This plan has worked well for us.
Here are some specific examples of a few of our more immediate tasks.
Focus on a new system for service delivery, including ways to gather perceptions of the quality of customer service. We typically get 1,000 work orders each month. Before we made any of our planned changes, a request could take up to a month to complete. We had no way of getting feedback from customers, and customers had no way of tracking what they needed to have done. Many times they didn't even know if the job had been completed. It was clear that we needed to put a new system in place as quickly as possible.
The first thing we did, at minimal cost, was to update our work-order entry system to eliminate the middleman. As an example, resident assistants (RAs) in residence halls had been responsible for initiating all work orders. The process and systems in place were not working, as shown by the disparity in feedback we got from students during the evaluation process: Some thought we were doing a great job, others thought our service was poor. The answer came down to the fact that some RAs were filing the work orders immediately, some were taking a long time to do it, and others may not have been doing it at all.
Our newly implemented process allows all students to input requests directly to us by completing the work order online. We provide an immediate acknowledgment that we've received the request, and we send updates as to how the process is going. If a part needs to be ordered and the job will be delayed, we automatically notify the student. When the job is completed, we send a message confirming that, followed by a survey asking for the student's feedback. If work is done in a student's room, a door tag is left letting the student know we were there. Work orders are now typically completed within a week, and even sooner for smaller jobs.
All of this was achieved at a minimal cost. Most of it was procedural change and improvement to our Web site. It's automatic, and not a lot of time is required to keep up that level of service. The door tags themselves are inexpensive, and customers have confidence that their concern has received the required level of attention. If we didn't understand the concern or somehow did not fix the problem, the method is there to get us on track before discontent sets in. The immediate feedback was very powerful in improving perceptions of our service level.
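The notification workflow described above can be sketched in a few lines. This is a hypothetical illustration of the pattern (status names, messages, and the in-memory notification list are all assumptions), not the university's actual software:

```python
from enum import Enum

class Status(Enum):
    RECEIVED = "received"
    PARTS_ORDERED = "parts ordered"
    COMPLETED = "completed"

class WorkOrder:
    """A student-submitted request that notifies the requester at each step."""
    def __init__(self, requester, description):
        self.requester = requester
        self.description = description
        self.notifications = []   # stand-in for email or portal messages
        self._set_status(Status.RECEIVED)

    def _set_status(self, status):
        self.status = status
        # Every status change triggers an automatic update to the requester.
        self.notifications.append(
            f"To {self.requester}: '{self.description}' is now {status.value}."
        )

    def delay_for_parts(self):
        self._set_status(Status.PARTS_ORDERED)

    def complete(self):
        self._set_status(Status.COMPLETED)
        self.notifications.append(
            f"To {self.requester}: please take our feedback survey."
        )

wo = WorkOrder("student@example.edu", "Leaky faucet in Room 214")
wo.delay_for_parts()
wo.complete()
for msg in wo.notifications:
    print(msg)
```

The key design point the article makes is that the notifications are automatic side effects of the status change, so keeping customers informed costs the staff essentially no extra time.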
Grow the department's workforce, as the Facilities Management Evaluation Process determined that our staffing had not kept pace with institutional growth and demands. This recommendation was combined with a caution that we be careful not to change the culture of the plant staff. A significant and immediate increase in size would certainly do that. Economically, that was not feasible anyway, so we have systematically ramped up our staffing, incrementally adding to its numbers each year.
Overall, we have increased the department by nine people since the evaluation, going from 21 to 30. Where that manpower is employed has also shifted. We are now a 24/7 operation during semester operations. We schedule our staff around the needs of our customers. Classroom work is done from midnight to 8 a.m.; office work is completed between 3 p.m. and midnight. This further provides for service in buildings at more times of the day to take care of incidents before they reach an emergency level. Our second- and third-shift workers and a portion of our day shift are designated as RAM techs—Right-Away Maintenance technicians. These employees are multicraft technicians who are ready and available around the clock.
Expand our Honeywell building automation system. The system uses computers to monitor boilers and air conditioning units around the campus. This allows us to easily check status without actually going to the buildings, ultimately saving money because the equipment runs more efficiently while reducing the required manpower.
Before the evaluation, maintenance staff had to drive around checking all the boilers on campus, since we had the monitoring system in only one building. Now we have the system in 62 percent of our buildings, with the hope to one day have it throughout the university. With the new system, boilers and building temperatures can be viewed through Web-based access both on campus and remotely, without requiring that staff be dispatched to each location.
Develop a facilities master plan. We completed a full assessment of the deferred maintenance for each structure—including our older, nonhistoric, deteriorating buildings—and did a financial analysis that included comparing the cost of renovation to that of new construction. We determined that instead of investing in the existing structures, it would be more economically prudent to demolish some of them and build new. This decision also had a positive impact on operating costs. The new buildings' spaces provide improved program serviceability and also operate more efficiently.
In the past seven years, we've achieved what we consider a paradigm shift, reversing course from going down a path of "reactive maintenance" to meeting performance criteria that fall mainly into the national standards of ongoing "comprehensive stewardship." Figure 1 shows how we've plotted our annual performance on APPA's Maintenance Level Matrix, noting our starting point at Level 4 and our 2009 performance at Level 2.
We did this without significant infusion of annual operating dollars. We did, however, strategically focus our capital improvements in a way that also addressed or eliminated maintenance concerns rather than increasing their numbers. We are proud to say we now have a significantly more efficient deployment of manpower, are available to take customer calls around the clock, have made overall improvements to our physical facilities, and have cared for our grounds such that they are an enhancement to the campus.
You could say our plan was a leap of faith, since we did not have the funds to add extra supervision for the increased staffing to cover extended hours, but we moved forward anyway. We felt we had a good staff of responsible workers, and we put in the systems that would allow work to be tracked and assessed. We made customer service a priority, but did so in a way that ultimately allowed more dedicated time for facilities stewardship.
There is no question that winning the APPA Award for Excellence in facilities management in 2009 was true recognition of all of our hard work and the university's commitment to make significant improvements. One of the most rewarding observations was seeing how many other departments championed the physical plant department when the Award for Excellence audit team came to campus.
The results prove that once institution leaders know where they stand, where they want to go, and how to put the funding methodology in place to get there, the rest is a matter of communication, campus buy-in, and finally, execution. It was a testament to the willingness of Philadelphia University's administration and physical plant department to assess performance, and ultimately, implement new processes that achieved results and national recognition.
RANDALL D. GENTZLER is vice president for finance and administration and treasurer, Philadelphia University.