Ready for a Close-Up
To facilitate its NACUBO-supported pilot IT project, Marist College used a hybrid evaluation process. The work included a self-study phase useful not only for the department’s continuous process improvement but also for the college’s upcoming accreditation in 2012–13.
By Reba-Anna Lee and JoAnn DePue
We are nothing without our accreditation. At the same time, the accreditation process is challenging for any institution. It is time-consuming and tedious and takes countless people to get the job done right. So why do we do it? Every institution wants to be the best at what it does, and accreditation is a key step in attaining a visible symbol of status and quality. Marist College, Poughkeepsie, New York, is no different.
Yes, Marist has been recognized for excellence by U.S. News & World Report, Time, the Princeton Review, and Barron's Best Buys in College Education. And, we are noted for our leadership in using technology to enhance the teaching and learning process. But, we also must earn accreditation by the Middle States Commission on Higher Education in academic year 2012–13. With early awareness of the coming process, many departments—including information technology (IT)—began the self-study phase of the accreditation process in late 2009.
At that time, IT leadership was already aware of certain challenges. Some stemmed from the fact that our department is made up of multiple subdivisions that must work together to address all the technology needs and uses on campus. The diversity of the department staff also creates communication issues, not only among employees themselves but also in customer interactions and in staff knowledge of the services offered. We expected the self-study to reveal other improvement opportunities as well.
In September 2010, NACUBO selected Marist College to take part in its Lumina-funded Challenge 2010 Project, directed at building a culture of assessment and accountability. NACUBO indicated that it chose Marist for three reasons: (1) The college's institutional leadership demonstrated strong, committed support for continuous improvement, and our institution showed a deep appreciation for the value of implementing a data-collection and reporting system to ensure institutional excellence. (2) The project was designed to develop protocols that could be used in a broad range of college departments; with our IT department as a relevant pilot, we could leverage results. (3) The project would demonstrate the close connection between a systematic approach to continuous improvement and effective self-assessment and accreditation processes.
The NACUBO self-study framework combines the Baldrige National Quality Program with the association's own Excellence in Higher Education (EHE) model to form a hybrid approach for conducting a self-study on a college campus. While similar in structure to the Baldrige approach, EHE provides higher education context and vocabulary. NACUBO's support included training and a Baldrige criteria consultant who added expertise to the process.
This methodical self-assessment resulted in myriad findings for the Marist College IT department, some of which focused on overall customer service, learning outcomes for the department's student employees, and support for the student body as a whole.
Where to Begin?
After reviewing NACUBO's self-assessment model, the IT department's leadership decided that it fit the department's needs well. "We thought the recommended framework would work well for our rather eclectic information technology department," says Bill Thirsk, chief information officer and vice president of information technology. "When you combine this culture with the wide array of other services we provide, it was very hard to find a template to measure our performance. The seven performance excellence criteria outlined by Baldrige and EHE gave us that template." (See sidebar, "Define the Discussion," for a list of the self-assessment criteria.)
The bottom line is that by combining these two sets of tools, we thought Marist College's IT department could continue to strive for excellence, particularly as we prepared for accreditation.
To facilitate and document the self-assessment activities, we added Sakai's collaborative learning environment, an open-source course management system we had used many times before. Refining it into a hybrid collaboration tool seemed a natural progression for the IT group, as we simultaneously sought to reach a broader staff audience and to capture the thought process behind our self-assessment initiative along the way. (By a "broader audience" we meant that we wanted our assessment approaches to reach beyond the full-time staff to student employees and to staff working the third shift, all of whom tend to be less engaged in departmental activities and less able to attend meetings.)
Designing a Workable Process
We envisioned a process that blended real-time meetings with an archive to capture meeting suggestions, decisions, and outcomes via the Sakai collaboration site, which would include a wiki, blog, and other forums. Since the accreditation process would include a campus visit and a full self-study report for all college departments, we planned to base our final report on the information captured with Sakai.
"The Baldrige Criteria for Performance Excellence guidelines gave us a road map to create a service-oriented environment upon which we could continually improve," Thirsk notes. The road map he refers to consisted in large part of creating a way to ask ourselves the guiding questions in each of the Baldrige/EHE excellence criteria categories. This systematic assessment of the IT department's effectiveness as a whole allowed us to navigate the process clearly, with our managers on board.
Engaging Others in Self-Scrutiny
We saw one big problem looming: How would we engage our participants? As the project's co-leads, we were excited about the project, as were the other key stakeholders. But, we knew that the manner in which we pulled everyone together and spread that enthusiasm would determine in great part the success of our self-study.
Cascading communication. We decided to employ a top-down "enthusiasm technique." The idea was to first motivate the IT department managers, who would then go back and energize their staffs, and so on.
We started by dividing the seven Baldrige categories into reasonably sized pieces for a series of presentations to the managerial level. We then reorganized the schedule for existing weekly meetings of the various department teams such that managers could incorporate aspects of the self-study at each of those meetings. In that way, they could educate team members to be a part of the process.
We used this technique for two main reasons: (1) It was an easy way to disseminate a great deal of information and generate relevant questions from a large audience; and (2) everyone could hear the information, process it, and then offer their feedback in a timely fashion. No one was being put on the spot for answers or feedback.
Another advantage of this approach was that we could align the plan with our existing continuous improvement plan, outlined in the Marist Middle States Periodic Review Report. This report applies to the entire college and outlines Marist's strategic mission and goals for the next five years, or until the next accreditation visit. A key part of continuous improvement is self-assessment and looking for strengths and areas for improvement, which corresponded precisely with our goals for the department.
Knowledge Transfer Tools
The anticipated outcomes for the self-assessment were to gain a working knowledge of the IT department and identify areas that needed improvement. Employee involvement was essential, if only because of the size of the staff, which includes roughly 75 full-time and part-time professional staff in addition to a large student employee population. In all, more than 100 people staff the department.
Using the Sakai project site to which all staff had access, we (1) set up a timeline for the various assessment categories, (2) determined dates when certain materials would be covered at department meetings, and (3) handed out assignments. These included general questions, driven by the Baldrige method, for each category for managers to discuss and answer with their teams. We organized the questions in batches of five or six at a time to move things forward gradually.
We also used several of the Sakai system's wide range of collaborative tools (see sidebar, "Selected Sakai Tools Facilitate Collaboration").
The first step in employee involvement was for managers to disseminate the information properly to their respective staffs and begin to transition them into the self-assessment process. This took place at weekly face-to-face, managers-only team meetings. The second step called for each manager to meet with his or her staff as a group and answer the guideline questions.
A key part of the process was to capture answers and comments that would later be used to compile results and strategies. That's where the Sakai site came in. Documenting the material from various meetings provided a workflow trail and kept the majority of employees informed and able to participate in the process and feedback loop, which enriched the experience. The feedback loop consisted of the staff posting answers to the criteria-based questions and the managers then commenting on the postings. This led to a wider discussion of topics that left the staff feeling as if their needs had been heard.
For example, the manager of the academic technology department asked employees to answer a number of questions in Category 3 "Constituents and Beneficiaries," such as: "What groups and organizations (either within or outside Marist) benefit most directly from the work your department performs? What programs and services do you provide for each?" and "What other constituency groups are important to your department's success? In other words, who influences or is influenced by your department's work?"
Questions like these helped department members think and reflect upon their constituents and the kind of service being offered to them. This group format helped build a large feedback support cycle for the self-study. People in nonmanagerial positions were also able to make suggestions and provide valuable input. Each department followed this formula for information flow: asking key questions, discussing the answers, and reflecting upon the responses. This helped each department, and the division as a whole, build upon our existing knowledge base and consider ideas that would improve results.
All assignments had due dates as well as guidelines for keeping a standard format for feedback. When managers completed a group of assignments, each gave a short presentation at one of the monthly face-to-face meetings. They structured their presentations around which questions were answered, how they were answered, and what participants learned during the process. We archived the presentations and notes taken at each meeting in a blog tool within the online Sakai site, which enabled us to document our progress chronologically.
Getting Down to Business Practices
As we moved from the general to the specific, we reviewed self-study results and began to focus on the anticipated challenges revealed in the self-assessment. These included the need for increased customer awareness, improved and expanded services, and a plan for continuous improvement for information technology overall, as well as each individual department.
Since an entire section of the study was devoted to beneficiaries and constituencies—based on the Baldrige Performance Excellence Criteria—the process intrinsically created awareness of customer issues and needs. Our way of evaluating the effectiveness of our services was to look into past survey data, as well as to conduct focus groups representing all departments on campus, which allowed us to directly interact with our customers.
We then combined the survey data and focus group responses to identify the IT department's strengths and develop goals and plans for areas of weakness.
Practical goals to strengthen service. Based on this information, we found that we needed to: (1) create a presence in each academic building to support professors who teach online courses, (2) develop a plan to enlarge and relocate our digital publication center so that services can be expanded, and (3) provide on-site documentation for students and faculty using higher-end equipment in our computer labs.
Intangible and unplanned outcomes. We also realized that some of the findings of our self-assessment were intangible and not necessarily intended, such as:
- Departmental impact. One great outcome, Thirsk notes, is that "everyone became more aware of the impact our department has on the entire college. The staff learned how their work affects the learning environment, and they gained several great tools to assess how their work affects their own division." For example, Christine Mulvey, director of special projects, telecommunications, and networking, describes the biggest change she noticed in her department: "Everyone is more aware of how their everyday work impacts the Marist campus community and where that work fits in the college's strategic plan. The process still brings us together regularly to present our challenges and to share our plans for moving forward." Mulvey also says that mutual challenges have translated into somewhat of a departmental movement toward planned change. "Early on in the process," she says, "we created mission statements for each of our departments and shared these with each other. As a result, we now develop interrelated plans that collectively support our departmental and college missions."
- Organizational change. As we worked through the evaluation process, we began to notice some behavioral changes. While the original intention was directed outward—that is, toward improving services to our customer base—a simultaneous shift was occurring in our internal culture. We gained a deeper understanding of our department as a whole by examining where our collective opportunities for improvement exist and how we are each responding to the opportunity.
As is typical in an IT department, we face each day with uncertainty as to the problems and challenges our customers will bring to us. It is very easy to get swept up in our individually driven work. However, this process forces us to take a broader view of our work and apply it to other college departments. "It has been quite beneficial for our staff to understand the challenges other departments are faced with and the ways they are each addressing these," says Peggy Kuck, director of the client services and enterprise solutions group. "We have a much deeper understanding of what they do and a greater appreciation for it."
Other Optimistic Outcomes
Interactions among departments have strengthened as well. This is a welcome change given that IT departments typically find themselves pulled in the direction of knowledge silos, with each technical performer as a specialist in his or her field rather than a generalist across all of IT. We tend to have the closest relationships with those we work with directly, and we typically don't understand the work of those outside our immediate areas of expertise. In addition, the speed at which we work does not often afford us the luxury of building those relationships.
The self-assessment forced our IT departments to meet regularly as a group and to discuss strengths and weaknesses. Sharing this information helped break down those independent knowledge centers and transformed us into a more cohesive group that better understands how to approach individuals in ways that achieve our collective objectives.
We saw other benefits as well:
Useful learning experience. Our previous planning process, recordkeeping, and statistics did not always complement one another and were not very meaningful. Working through this structured process has forced us to identify goals, indicators, measures, benchmarks, and intended outcomes as a complete unit.
"One of the greatest benefits," says Kuck, "is an understanding of how to use a framework to capture quantifiable measures that support our institutional goals. The key performance indicators we now look at tell a story of process improvement rather than simply work progress."
Different approaches to planning and implementation. "In our culture," explains James Curran, network manager, "we are in the habit of very quickly responding to customers and situations. That sometimes spills over into regular work and planning. While we always got the work done, we were not always as diligent about documenting work and systematically capturing statistics that tie back to our original plan. This has changed."
For example, when his department installed more wireless access points on campus this year, Curran's team not only documented the work completed, but also created network maps that identify the location points graphically. Now it is much easier for management to quickly determine coverage and identify gaps that need to be addressed next. "This creates that cycle of continuous improvement," says Curran, "in which output from our last effort allows us to plan for the next deployment."
Cycle of continuous process improvement. What Curran describes is, of course, our ultimate goal: using the information from each project to improve our approach to the next similar work effort. Melissa Egan, assistant director, enterprise solutions group, who manages the Web services department, concurs. "The most powerful aspect of this project," she says, "has been developing a cycle of continuous improvement. In the past, each new initiative in our department was met, to a certain extent, with a new set of parameters and methods. After developing a planned approach to change, however, we are now using the output from previous projects to guide us through the next similar effort."
Egan says this is particularly true in the area of customer relations, when guiding technology users through the process of developing applications for the Web. "Instead of reinventing how we approach each project with a new set of customers, we reuse effective methods that have worked in the past and reengineer those that did not prove to be particularly useful," she says.
Student employee professional development. During the course of our self-evaluation process, we've implemented improvement plans that directly address constituents' concerns. This has had a positive impact on teaching and learning, particularly for the more than 100 of our employees who are also undergraduate and graduate students. "Everyone in my area," says Mulvey, "now is more focused on seeking out teaching and learning opportunities for their student employees."
What's in the Spotlight Now?
In Baldrige Goes to College (a NACUBO workshop based on Excellence in Higher Education), presenter Brent Ruben of Rutgers University states, "Review, planning, and continuous improvement are fundamental to institutional effectiveness." Executive director of Rutgers' Center for Organizational Development and Leadership, New Brunswick, New Jersey, Ruben also makes clear that these elements of continuous improvement "should be thoroughly integrated into the fabric of every institution aspiring to excellence."
That's our goal at Marist College. We've gathered, organized, and processed the data collected in our predetermined areas. We've shared it with all participants and stakeholders. And, in following the Baldrige method of coordination with any accreditation process (in this case we have used EHE standards), we've documented all parts of the process for departmental knowledge sharing as well as for archival purposes.
We're now at the implementation phase, carrying out all the pieces of the plans we outlined in the months of planning and documentation. For example, one of the issues that came out during our focus groups was the need to develop student-employee skills important for future professional success. Thus, we've instituted learning outcomes for student employees in all areas of the IT department. Another finding had to do with the need to develop a faculty-based IT toolkit, which is now in progress. But, most important, we were able to hear the needs and concerns directly from our constituents and then act upon the issues that arose.
Marist College anticipates that our future Middle States accreditation visit will be the proving ground for all our hard work. Meanwhile, the IT department is busy implementing the many plans that developed from our self-study. Since we've been involved in the self-assessment process for almost two years now, we are confident that we'll represent Marist well during our accreditation visit. Also, we are looking to branch out and share our model with other departments on campus.
Even when next year's accreditation is behind us, says Thirsk, "We plan to continue to use the Malcolm Baldrige Performance Excellence Program guidelines on a regular basis. It will be a continuous cycle to make sure that we take a good look at ourselves, see what we've learned, and focus on ways to improve even more."