Business Officer Magazine

Ready for a Close-Up

To facilitate its NACUBO-supported pilot IT project, Marist College used a hybrid evaluation process. The work included a self-study phase useful not only for the department’s continuous process improvement but also for the college’s upcoming accreditation in 2012–13.

By Reba-Anna Lee and JoAnn DePue

We are nothing without our accreditation. At the same time, the accreditation process is challenging for any institution: it is time-consuming and tedious, and it takes countless people to get the job done right. So why do we do it? Every institution wants to be the best at what it does, and accreditation is a key step toward a visible symbol of status and quality. Marist College, Poughkeepsie, New York, is no different.

Yes, Marist has been recognized for excellence by U.S. News & World Report, Time, the Princeton Review, and Barron's Best Buys in College Education. And, we are noted for our leadership in using technology to enhance the teaching and learning process. But, we also must earn accreditation by the Middle States Commission on Higher Education in academic year 2012–13. With early awareness of the coming process, many departments—including information technology (IT)—began the self-study phase of the accreditation process in late 2009.

Selected Sakai Tools Facilitate Collaboration

The Sakai suite of collaboration tools provided Marist College with the kind of open communication and group collaboration functions essential for the information technology department's self-study. Study participants captured discussion points and interactions in several formats for use as information resources and collective planning.

The six tools used for the Marist College pilot self-assessment project included such interactive communication functions as:

  • Blogs—a format for real-time note taking or journaling that allowed site participants to create and maintain something similar to an online newsletter published for public viewing. Blogs were well suited for reflecting upon our ongoing learning process, as well as for group development and project collaboration. Because blogs are intended for others to read, we used this format in our study to keep notes from discussions at the face-to-face meetings.
  • Portable podcasts—a way for self-study participants to upload audio files from their computer systems and display these files in a user-friendly way. Site participants could download these files to the podcaster of their choice (including most computers with podcaster software) and play them on their own schedule.

For example, during the self-study, Brent Ruben, the Rutgers consultant who helped NACUBO create the Excellence in Higher Education workshop, made a campus visit. He met separately with each department within the IT group, and we recorded each meeting and made audio podcasts available for all participants to review.

See these figures for examples of the other four tools.

At that time, IT leadership was already aware of certain challenges. Some stemmed from the fact that our department is made up of multiple subdivisions that must work together to address all the technology needs and uses on campus. The diversity of the department staff also creates a varied set of communication issues, not only among employees themselves but also in customer interaction and in staff knowledge of the services offered. We expected the self-study to reveal other improvement opportunities as well.

In September 2010, NACUBO selected Marist College to take part in its Lumina-funded Challenge 2010 Project, directed at building a culture of assessment and accountability. NACUBO indicated that it chose Marist for three reasons: (1) The college's institutional leadership demonstrated strong, committed support for continuous improvement, and our institution showed a deep appreciation for the value of implementing a data-collection and reporting system to ensure institutional excellence. (2) The project was designed to develop protocols that could be used in a broad range of college departments; with our IT department as a relevant pilot, we could leverage results. (3) The project would demonstrate the close connection between a systematic approach to continuous improvement and effective self-assessment and accreditation processes.

The NACUBO self-study framework combines the Baldrige National Quality Program with the association's own Excellence in Higher Education (EHE) model to form a hybrid approach for conducting a self-study on a college campus. While similar in structure to the Baldrige approach, EHE provides higher education context and vocabulary. NACUBO support included training and the services of a Baldrige criteria consultant, who added expertise to the process.

This methodical self-assessment resulted in myriad findings for the Marist College IT department, some of which focused on overall customer service, learning outcomes for the department's student employees, and support for the student body as a whole.

Where to Begin?

After reviewing NACUBO's self-assessment model, the IT department's leadership decided that it fit the department's interests well. "We thought the recommended framework would work well for our rather eclectic information technology department," says Bill Thirsk, chief information officer and vice president of information technology. "When you combine this culture with the wide array of other services we provide, it was very hard to find a template to measure our performance. The seven performance excellence criteria outlined by Baldrige and EHE gave us that template." (See sidebar, "Define the Discussion," for a list of the self-assessment criteria.)

The bottom line is that by combining these two sets of tools, we thought Marist College's IT department could continue to strive for excellence, particularly as we prepared for accreditation.

To facilitate and document the self-assessment activities, we added Sakai's collaborative learning environment, an open-source course management system that we had used many times before. We further refined the hybrid collaboration tool, which seemed a natural progression for the IT group, as we simultaneously sought to reach a broader staff audience and capture the thought process behind our self-assessment initiative along the way. (By a "broader audience" we meant that we wanted our assessment approaches to reach beyond the full-time staff to student employees and to staff working the third shift, all of whom tend to be less engaged in departmental activities and less able to attend meetings.)

Designing a Workable Process

We envisioned a process that blended real-time meetings with an archive to capture meeting suggestions, decisions, and outcomes via the Sakai collaboration site, which would include a wiki, blog, and other forums. Since the accreditation process would include a campus visit and a full self-study report for all college departments, we planned to base our final report on the information captured with Sakai.

"The Baldrige Criteria for Performance Excellence guidelines gave us a road map to create a service-oriented environment upon which we could continually improve," Thirsk notes. The road map he refers to consisted in large part of creating a way to ask ourselves the guiding questions in each of the Baldrige/EHE excellence criteria categories. This systematic assessment of the IT department's effectiveness as a whole allowed us to navigate the process clearly, with our managers on board.

Engaging Others in Self-Scrutiny

We saw one big problem looming: How would we engage our participants? As the project's co-leads, we were excited about the project, as were the other key stakeholders. But, we knew that the manner in which we pulled everyone together and spread that enthusiasm would determine in great part the success of our self-study.

Cascading communication. We decided to employ a top-down "enthusiasm technique." The idea was to first motivate the IT department managers, who would then go back and energize their staffs, and so on.

We started by dividing the seven Baldrige categories into reasonably sized pieces for a series of presentations to the managerial level. We then reorganized the schedule for existing weekly meetings of the various department teams such that managers could incorporate aspects of the self-study at each of those meetings. In that way, they could educate team members to be a part of the process.

We used this technique for two main reasons: (1) It was an easy way to disseminate a great deal of information and generate relevant questions from a large audience; and (2) everyone could hear the information, process it, and then offer their feedback in a timely fashion. No one was being put on the spot for answers or feedback.

Another advantage of this approach was that we could align the plan with our existing continuous improvement plan, outlined in the Marist Middle States Periodic Review Report. This report applies to the entire college and outlines Marist's strategic mission and goals for the next five years, or until the next accreditation visit. A key part of continuous improvement is self-assessment and looking for strengths and areas for improvement, which corresponded precisely with our goals for the department.

Knowledge Transfer Tools

Tampa Program Features NACUBO-Lumina Foundation Projects

At the NACUBO 2011 Annual Meeting in Tampa, July 9-12, college and university leaders will report what they learned from projects designed to deliver examples of data-driven, results-oriented improvement strategies. Selected by NACUBO, the institutions took part in an initiative funded by the Lumina Foundation for Education to develop a cadre of institutions committed to using the Baldrige National Quality Program methodology and associated tools.

Make a point to attend the following sessions, where you can hear about systematic and productive approaches to leading change during difficult economic times:

Sunday, July 10, 2011

10:15–11:30 a.m.    "Tools for Driving Change: Strengthening Existing Practices"

2:00–3:15 p.m.        "Overcoming Resistance to Change: New Programs and Practices"

3:30–4:45 p.m.        "Creating an Appetite for Change: Building a Culture"

The anticipated outcomes for the self-assessment were to gain a working knowledge of the IT department and identify areas that needed improvement. Employee involvement was essential, if only because of the size of the staff, which includes roughly 75 full-time and part-time professional staff in addition to a large student employee population. In all, more than 100 people staff the department.

Using the Sakai project site to which all staff had access, we (1) set up a timeline for the various assessment categories, (2) determined dates when certain materials would be covered at department meetings, and (3) handed out assignments. These included general questions, driven by the Baldrige method, for each category for managers to discuss and answer with their teams. We organized the questions in batches of five or six at a time to move things forward gradually.

We also used several of the Sakai system's wide range of collaborative tools (see sidebar, "Selected Sakai Tools Facilitate Collaboration").

The first step in employee involvement was for managers to disseminate the information properly to their respective staffs and begin to transition them into the self-assessment process. This took place at weekly face-to-face, managers-only team meetings. The second step called for each manager to meet with his or her staff as a group and answer the guideline questions.

A key part of the process was to capture answers and comments that would later be used to compile results and strategies. That's where the Sakai site came in. Documenting the material from various meetings provided a workflow trail and kept the majority of employees informed and able to participate in the process and feedback loop, which enriched the experience. The feedback loop consisted of the staff posting answers to the criteria-based questions and the managers then commenting on the postings. This led to a wider discussion of topics that left the staff feeling as if their needs had been heard.

For example, the manager of the academic technology department asked employees to answer a number of questions in Category 3, "Beneficiaries and Constituencies," such as: "What groups and organizations (either within or outside Marist) benefit most directly from the work your department performs? What programs and services do you provide for each?" and "What other constituency groups are important to your department's success? In other words, who influences or is influenced by your department's work?"

Questions like these helped the department think and reflect upon its constituents and the kind of service being offered to them. This group format helped build a large feedback support cycle for the self-study. People in nonmanagerial positions were also able to make suggestions and provide valuable input. Each department followed this formula for information flow, asking key questions, discussing the answers, and reflecting upon the responses. This helped each department, and the division as a whole, build upon our existing knowledge base and consider ideas that would improve results.

All assignments had due dates as well as guidelines for keeping a standard format for feedback. When managers completed a group of assignments, each gave a short presentation at one of the monthly face-to-face meetings. They structured their presentations around which questions were answered, how they were answered, and what participants learned during the process. We archived the presentations and notes taken at each meeting in a blog tool within the online Sakai site, which enabled us to document our progress chronologically.

Getting Down to Business Practices

As we moved from the general to the specific, we reviewed self-study results and began to focus on the anticipated challenges revealed in the self-assessment. These included the need for increased customer awareness, improved and expanded services, and a plan for continuous improvement for information technology overall, as well as each individual department.

Since an entire section of the study was devoted to beneficiaries and constituencies—based on the Baldrige Performance Excellence Criteria—the process intrinsically created awareness of customer issues and needs. Our way of evaluating the effectiveness of our services was to look into past survey data, as well as to conduct focus groups representing all departments on campus, which allowed us to directly interact with our customers.

We then combined the survey data and focus group responses to identify the IT department's strengths and develop goals and plans for areas of weakness.

Practical goals to strengthen service. Based on this information, we found that we needed to: (1) create a presence in each academic building to support professors who teach online courses, (2) develop a plan to enlarge and relocate our digital publication center so that services can be expanded, and (3) provide on-site documentation for students and faculty using higher-end equipment in our computer labs.

Intangible and unplanned outcomes. We also realized that some of the findings of our self-assessment were intangible and not necessarily intended, such as:

  • Departmental impact. One great outcome, Thirsk notes, is that "everyone became more aware of the impact our department has on the entire college. The staff learned how their work affects the learning environment, and they gained several great tools to assess how their work affects their own division." For example, Christine Mulvey, director of special projects, telecommunications, and networking, describes the biggest change she noticed in her department: "Everyone is more aware of how their everyday work impacts the Marist campus community and where that work fits in the college's strategic plan. The process still brings us together regularly to present our challenges and to share our plans for moving forward." Mulvey also says that mutual challenges have translated into something of a departmental movement toward planned change. "Early on in the process," she says, "we created mission statements for each of our departments and shared these with each other. As a result, we now develop interrelated plans that collectively support our departmental and college missions."
  • Organizational change. As we worked through the evaluation process, we began to notice some behavioral changes. While the original intention was directed outward—that is, toward improving services to our customer base—a simultaneous shift was occurring in our internal culture. We gained a deeper understanding of our department as a whole by examining where our collective opportunities for improvement exist and how we are each responding to the opportunity. 

As is typical in an IT department, we face each day with uncertainty as to the problems and challenges our customers will bring to us. It is very easy to get swept up in our individually driven work. However, this process forces us to take a broader view of our work and apply it to other college departments. "It has been quite beneficial for our staff to understand the challenges other departments are faced with and the ways they are each addressing these," says Peggy Kuck, director of the client services and enterprise solutions group. "We have a much deeper understanding of what they do and a greater appreciation for it."  

Other Optimistic Outcomes

Interactions among departments have strengthened as well. This is a welcome change given that IT departments typically find themselves pulled in the direction of knowledge silos, with each technical performer as a specialist in his or her field rather than a generalist across all of IT. We tend to have the closest relationships with those we work with directly, and we typically don't understand the work of those outside our immediate areas of expertise. In addition, the speed at which we work does not often afford us the luxury of building those relationships.

The self-assessment forced our IT departments to meet regularly as a group and to discuss strengths and weaknesses. Sharing this information helped break down those independent knowledge centers and transformed us into a more cohesive group that better understands how to approach individuals in ways that achieve our collective objectives.

We saw other benefits as well:

Useful learning experience. Our previous planning process, recordkeeping, and statistics did not always complement one another, and were not very meaningful. Working through this structured process has forced us to identify goals, indicators, measures, benchmarks, and intended outcomes as a complete unit.

"One of the greatest benefits," says Kuck, "is an understanding of how to use a framework to capture quantifiable measures that support our institutional goals. The key performance indicators we now look at tell a story of process improvement rather than simply work progress."

Different approaches to planning and implementation. "In our culture," explains James Curran, network manager, "we are in the habit of very quickly responding to customers and situations. That sometimes spills over into regular work and planning. While we always got the work done, we were not always as diligent about documenting work and systematically capturing statistics that tie back to our original plan. This has changed."

For example, when his department installed more wireless access points on campus this year, Curran's team not only documented the work completed, but also created network maps that identify the location points graphically. Now it is much easier for management to quickly determine coverage and identify gaps that need to be addressed next. "This creates that cycle of continuous improvement," says Curran, "in which output from our last effort allows us to plan for the next deployment."

Cycle of continuous process improvement. What Curran describes is, of course, our ultimate goal: using the information from each project to improve our approach to the next similar work effort. Melissa Egan, assistant director, enterprise solutions group, who manages the Web services department, concurs. "The most powerful aspect of this project," she says, "has been developing a cycle of continuous improvement. In the past, each new initiative in our department was met, to a certain extent, with a new set of parameters and methods. After developing a planned approach to change, however, we are now using the output from previous projects to guide us through the next similar effort."

Egan says this is particularly true in the area of customer relations, when guiding technology users through the process of developing applications for the Web. "Instead of reinventing how we approach each project with a new set of customers, we reuse effective methods that have worked in the past and reengineer those that did not prove to be particularly useful," she says.

Student employee professional development. During the course of our self-evaluation process, we've implemented improvement plans that directly address constituents' concerns. This has had a positive impact on teaching and learning, particularly for the more than 100 of our employees who are also undergraduate and graduate students. "Everyone in my area," says Mulvey, "now is more focused on seeking out teaching and learning opportunities for their student employees."

What's in the Spotlight Now?

In Baldrige Goes to College (a NACUBO workshop based on Excellence in Higher Education), presenter Brent Ruben of Rutgers University states, "Review, planning, and continuous improvement are fundamental to institutional effectiveness." Executive director of Rutgers' Center for Organizational Development and Leadership, New Brunswick, New Jersey, Ruben also makes clear that these elements of continuous improvement "should be thoroughly integrated into the fabric of every institution aspiring to excellence."

That's our goal at Marist College. We've gathered, organized, and processed the data collected in our predetermined areas. We've shared it with all participants and stakeholders. And, in following the Baldrige method of coordination with any accreditation process (in this case we have used EHE standards), we've documented all parts of the process for departmental knowledge sharing as well as for archival purposes.

We're now at the implementation phase, carrying out all the pieces of the plans we outlined in the months of planning and documentation. For example, one of the issues that came out during our focus groups was the need to develop student-employee skills important for future professional success. Thus, we've instituted learning outcomes for student employees in all areas of the IT department. Another finding had to do with the need to develop a faculty-based IT toolkit, which is now in progress. But, most important, we were able to hear the needs and concerns directly from our constituents and then act upon the issues that arose.

Marist College anticipates that our future Middle States accreditation visit will be the proving ground for all our hard work. Meanwhile, the IT department is busy implementing the many plans that developed from our self-study. Since we've been involved in the self-assessment process for almost two years now, we are confident that we'll represent Marist well during our accreditation visit. Also, we are looking to branch out and share our model with other departments on campus.

Even when next year's accreditation is behind us, says Thirsk, "We plan to continue to use the Malcolm Baldrige Performance Excellence Program guidelines on a regular basis. It will be a continuous cycle to make sure that we take a good look at ourselves, see what we've learned, and focus on ways to improve even more."

REBA-ANNA LEE is assistant director, and JOANN DEPUE is project manager, information technology, Marist College, Poughkeepsie, New York.


Define the Discussion

The Baldrige National Quality Program methodology and the Excellence in Higher Education model share similar performance criteria. Marist College used the seven EHE topics (which articulate the criteria in the context and vocabulary of higher education) as a basis for crafting questions that would lead to systematic self-assessment. Note the variances between the two frameworks: Baldrige applies more generally, to education as well as to organizations in other industries.

EHE Criteria                              Baldrige Criteria

1. Leadership.                            1. Leadership.
2. Purposes and plans.                    2. Strategic planning.
3. Beneficiaries and constituencies.      3. Customer focus.
4. Programs and services.                 4. Measurement, analysis, and knowledge management.
5. Staff and workplace.                   5. Workforce focus.
6. Assessment and information use.        6. Operations focus.
7. Outcomes and achievements.             7. Results.


Negotiating the Rapids of Permanent Change

More than a decade ago, I learned of the concept of "permanent white water," the unnerving notion that technological, organizational, and social change would occur explosively, continuously, and disruptively. Survival requires not only the flexibility to quickly jettison outmoded tools and values and embrace new ones, but also the psychological strength to accept the demands and reality of continuous change. The accompanying article is about a self-assessment process at Marist College, Poughkeepsie, New York, and provides an excellent example of the way higher education institutions are learning to navigate changed circumstances.

A pilot project in the college's information technology (IT) department was supported in part by a NACUBO grant funded by the Lumina Foundation for Education, whose recent focus on higher education is gaining momentum.

A description of the foundation's work—and NACUBO's role in the Marist College project that demonstrates business practice improvement—provides further context for the article as well as thoughts on how you might involve your institution in these important efforts.

Lumina confronts the productivity challenge. In 2007, the Lumina Foundation started investing in productivity-related grants to help college and university leaders recognize and adjust to higher education's "permanent white water." Often called the "new normal," this condition means higher expectations for continuous improvement, with reduced taxpayer resources to achieve these goals. The rationale for Lumina's advocacy is this:

  • The nation's economy will need significantly more and better-educated workers in the future.
  • At current degree and certificate production rates, colleges and universities will not award a sufficient number of degrees and credentials to meet projected demand.
  • With flat or even declining revenues, state policies and institutional practices have to change in transformative ways to enable the nation's colleges and universities to better serve higher numbers of less-prepared students.

To respond to this challenge, Lumina invited states to compete for four-year grants focused on what it calls the "Four Steps to Finishing First": (1) reward colleges and universities for student success, (2) reward students for completion, (3) expand lower-cost options for postsecondary education, and (4) adopt good business practices.

The initiative is intended to inspire systemic innovation, first in a few states that are the early adopters, and then to disseminate the new policies, practices, and innovation culture more broadly. The latter is done through a comprehensive strategy that includes state, regional, and national outreach and gatherings; focused research and publications; and direct technical assistance.

National consensus grows. A broad coalition of policy makers, associations, and stakeholders has embraced much of the productivity challenge, with special focus on college completion. The Obama administration and a number of associations and foundations have joined the effort.

We're seeing some early results. For example, better business practices are proliferating in states like Ohio and Maryland, where state efficiency councils systematically create and implement improvements in campus and system business practices to free up resources for core academic missions.

In summer 2009, NACUBO announced another program to be funded by the Lumina Foundation, with the intent to develop a cadre of colleges and universities committed to combining the methodologies of the Baldrige National Quality Program and NACUBO's Excellence in Higher Education (EHE) model to build a culture of assessment and accountability. Selected institutions would serve as models of the approach, providing examples of data-driven, results-oriented improvement strategies.

The goal of the program was to help institutions make the best possible use of their resources to support their teaching, research, and service missions. NACUBO offered a training program; consulting support to the selected institutions, such as Marist College; and opportunities to share the results of the projects.

NACUBO's participation is absolutely critical to this national reform's continuing improvement and success, especially in regard to the effective business practices reflected in the Marist pilot project. After all, the position descriptions of business and finance officers may vary across institutions, but adopting good business practices is in fact what you all try to do every day. Lumina's numerous efforts provide opportunities to help you do that even better, and faster.

RICH PETRICK, former vice chancellor for finance for the Ohio Board of Regents, is a senior consultant for HCM Strategists and lead for the Lumina Foundation for Education's work in promoting good business practices.


Self-Study Learning Curve

After working through a lengthy self-study process, Marist College leaders share some lessons that may help other institutions considering similar efforts.

  • Create a timeline, but remain flexible and realistic. It helps to view this as a process as well as a project. We have slid deadlines a few times because the group needed us to clarify something, give different examples, define the framework, or simply review the information again. It is more important to achieve group understanding than simply to meet a deadline, particularly since each phase builds on the previous one.
  • Encourage feedback and accept it when it comes. Just as you would in a continuous improvement process, take participant input and modify how or what your current approach might be. Through the feedback we received, we slowed down the process, reframed material, and reviewed details with our chief information officer, our human resources department, and our vice president for institutional research. We even brought in experts. We took great care to listen and respond quickly.
  • Engage the entire department in the work. Sometimes, to shield certain employees from extra work, managers may not involve all levels of the organization. Find this out early by talking with employees, and ensure that the entire organization has an opportunity to be involved.
  • Always keep in mind the end goal, which is strategic and operational planning. You will find that participants, especially early on, may have difficulty integrating this process into their normal routine. Especially if particular employees have not previously been involved in the planning process, it is essential to the project's success that you keep them focused. Help staff see clearly the bridge between their specific planning goals and the college's strategic plan goals—and the way their contribution influences the institution's plans and outcomes.
  • Be consistent in your delivery. Decide ahead of time on ways you'll present, disseminate, and review information. Although you may modify this as you receive feedback from participants, you want to continue to move the project forward by being consistent and persistent. Along those lines, if you are learning the process along with your participants, try to stay at least two steps ahead of them. Review the upcoming material, and decide whether you need to alter your schedule because a section is particularly detailed. In short, be organized and prepared.