Assessment and Analytics in Institutional Transformation

Freeman A. Hrabowski III is President of the University of Maryland, Baltimore County (UMBC). Jack Suess is Vice President, Information Technology, and CIO at UMBC. John Fritz is Assistant Vice President, Instructional Technology & New Media, at UMBC.

Assessment and analytics, supported by information technology, can change institutional culture and drive the transformation in student retention, graduation, and success.
For more on lessons learned in undertaking transformational initiatives, the value of assessment and analytics, and the role of information technology and IT leaders, listen to the podcast featuring President Freeman Hrabowski, Interim Provost Philip Rous, and Vice President for IT Jack Suess, moderated by Assistant Vice President for Instructional Technology John Fritz: http://www.umbc.edu/ER2011.

U.S. higher education has an extraordinary record of accomplishment in preparing students for leadership, in serving as a wellspring of research and creative endeavor, and in providing public service. Despite this success, colleges and universities are facing an unprecedented set of challenges. To maintain the country’s global preeminence, those of us in higher education are being called on to expand the number of students we educate, increase the proportion of students in science, technology, engineering, and mathematics (STEM), and address the pervasive and long-standing underrepresentation of minorities who earn college degrees—all at a time when budgets are being reduced and questions about institutional efficiency and effectiveness are being raised.

Since fewer than 30 percent of American adults over the age of twenty-five hold a bachelor’s degree and barely 60 percent of all students who start college actually finish, college and university leaders are asking: “How effective have we been in helping students to graduate?” We need to know graduation and retention rates, of course, but we also must go deeper. We must understand our institutions’ impact on the success of distinct groups of students, including women in engineering and computer science, underrepresented minorities in science, and students from low-income backgrounds in general. A recent National Academies report found that underrepresented minority populations aspire to earn STEM degrees at roughly the same rate as students from other backgrounds but that only about 20 percent complete undergraduate STEM programs within five years. What may be surprising is that although white and Asian-American students have higher graduation rates, most of these students also do not succeed in STEM—only 33 percent and 42 percent respectively.1

Those institutions most effective in retaining and graduating students have focused on supporting their students by creating a climate that encourages “(1) asking good questions, (2) being honest about both strengths and challenges, and (3) developing innovative problem-solving strategies and initiatives that address particular issues.”2 Indeed, to address societal imperatives, higher education must begin by transforming its own culture, which is reflected in the questions we ask (and those we don’t), the achievements we measure and highlight (and those we ignore), and the initiatives we support (or don’t support).3

Developing a Culture of Assessment

In his recent book The Social Animal: The Hidden Sources of Love, Character, and Achievement, David Brooks postulates that culture has everything to do with our habits, beliefs, practices, and relationships—even with the kinds of tensions that shape our lives.4 In higher education, we define our role on campus in terms of how we envision ourselves as individuals and where we want the institution to go. This is why the culture of an institution is so hard to change: people are accustomed to performing their duties in a set way. When institutions realize they need to improve and determine the priorities most critical to that improvement, the most important challenge is convincing people to be open-minded and to consider the evidence that there may be a better way of achieving goals and distributing resources.

At the University of Maryland, Baltimore County (UMBC), we believe that process is an important factor in creating cultural change. We thus approach transformational initiatives by using the same scholarly rigor that we expect of any researcher. This involves (1) reviewing the literature and prior work in the area, (2) identifying critical factors and variables, (3) collecting data associated with these critical factors, (4) using rigorous statistical analysis and modeling of the question and factors, (5) developing hypotheses to influence the critical factors, and (6) collecting data based on the changes and assessing the results.

Increasingly, national standards of excellence are also emphasizing a culture of assessment. These standards focus both on the outcomes that students should achieve and on how colleges and universities are positioned to find out about these outcomes. For example, the Association of American Colleges and Universities (AAC&U) identified four “essential learning outcomes” that students need to meet 21st-century challenges:

  1. Knowledge of human cultures and the physical and natural world
  2. Intellectual and practical skills
  3. Personal and social responsibility
  4. Integrative learning

In addition, the AAC&U identified institutional assessment as critical in two of its seven “Principles of Excellence” for implementing these essential outcomes:

  • Principle Two: Give Students a Compass. Focus each student’s plan of study on achieving the essential learning outcomes—and assess progress.
  • Principle Seven: Assess Students’ Ability to Apply Learning to Complex Problems. Use assessment to deepen learning and to establish a culture of shared purpose and continuous improvement.5

Similarly, the Council of Graduate Schools (CGS) now highlights “monitoring graduate student progress” as one of ten “lessons learned” from UMBC’s experience with improving Ph.D. student retention and success.6

Perhaps not surprisingly, both the AAC&U and the CGS identify academic departments, and faculty in particular, as the key agents in changing how an institution implements and evaluates interventions to improve student success. As such, the faculty culture, which is built on critical inquiry and healthy skepticism, must be understood (and collegially challenged) in order to change deeply held beliefs and attitudes. Most faculty teach the way they were taught, so any attempt to change this approach (if there is evidence that it should be changed) must also rely on evidence and rigorous assessment to demonstrate that the proposed alternative is in fact better.

Strong leadership can help create the vision, set the tone of the climate, emphasize the values that are most critical, and build trust among people. Strong management ensures appropriate execution and follow-through, both of which are enabled through assessment. Ultimately, however, changing the culture is the engine that drives transformation.

Transformational Initiatives at UMBC

UMBC has received national recognition for two major transformational initiatives undertaken during the tenure of President Freeman Hrabowski: (1) the Meyerhoff Scholars Program, which focuses on achievement in STEM fields by underrepresented minorities, and (2) the Council of Graduate Schools (CGS) Ph.D. Completion Project, which focuses on increasing Ph.D. completion rates.

  • Meyerhoff Scholars Program (http://www.umbc.edu/meyerhoff/): Begun in 1988, the Meyerhoff Scholars Program (named after its founders, Baltimore philanthropists Robert and Jane Meyerhoff) focuses on producing bachelor’s degree recipients, particularly African Americans, who go on to doctoral programs in science and engineering. Since the start of the program, more than 700 students have earned undergraduate degrees in STEM fields, and more than 600 have completed or are pursuing graduate degrees (the program currently enrolls approximately 230 undergraduates). In addition, among predominantly white higher education institutions in the United States, UMBC has become the leading producer of African-American bachelor’s degree recipients who go on to earn Ph.D.’s in STEM fields. The program has been recognized by the National Science Foundation and the National Academies as a national model.
  • Ph.D. Completion Project (http://www.phdcompletion.org/): In 2003, the Council of Graduate Schools (CGS) launched a transformational effort to dramatically increase the completion rates of U.S. and Canadian doctoral students through a set of pilot projects led by graduate school deans. UMBC is among the 29 universities participating in the project, with a goal of focusing first on projects that will have a direct impact on the production of minority and women Ph.D. graduates in the social sciences and humanities, as well as in science, engineering, and mathematics fields. UMBC has made dramatic changes to its graduate advising and mentoring system and has produced a paper on its work for the CGS.

Beyond these two programs, UMBC has recently begun a major effort focused on the success of transfer students in STEM majors. This effort, with pilot funding from the Bill & Melinda Gates Foundation, will examine how universities can partner with community colleges to prepare community college graduates to successfully complete a bachelor’s degree in a STEM field.

____

See Scott A. Bass, Janet C. Rutledge, Elizabeth B. Douglass, and Wendy Y. Carter, “The University as Mentor: Lessons Learned from UMBC Inclusiveness Initiatives,” CGS Occasional Paper Series on Inclusiveness, volume 1, 2007, <http://www.cgsnet.org/Default.aspx?tabid=290>.

The Role of Information Technology

In such an environment, how can information technology help? First, IT leaders can align their organizations around an understanding of the institution’s strategic goals and transformational initiatives, including how information technology can help change institutional culture and achieve campus priorities. One important way to do so is to use technology to build a campus culture of evidence-based decision-making and management. It is impossible for any leader to understand, through personal experience or instincts alone, the challenges encountered by the thousands of students on a campus. Institutions thus need rigorous data modeling and analysis to reveal the obstacles to student success and to evaluate any attempts at intervention. They need to integrate data from a variety of systems—student information, learning management, and alumni systems, as well as systems managing experiences outside the classroom.
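
To make that kind of integration concrete, the sketch below shows one minimal way to combine extracts from a student information system, an LMS, and a co-curricular tracking system into a single per-student view. It is an illustration only, not UMBC’s actual pipeline; the file names and column names are hypothetical.

```python
# Minimal sketch of cross-system data integration (hypothetical files/columns).
import pandas as pd

# Extracts from three separate systems, each keyed by a common student ID.
sis = pd.read_csv("sis_enrollment.csv")        # student_id, major, credits_attempted, gpa
lms = pd.read_csv("lms_activity.csv")          # student_id, course_id, logins, submissions
programs = pd.read_csv("program_rosters.csv")  # student_id, program_affiliation

# Summarize LMS usage per student before joining.
lms_summary = (
    lms.groupby("student_id")
       .agg(total_logins=("logins", "sum"),
            total_submissions=("submissions", "sum"))
       .reset_index()
)

# One integrated record per student; left joins keep students missing from a source.
integrated = (
    sis.merge(lms_summary, on="student_id", how="left")
       .merge(programs, on="student_id", how="left")
)

print(integrated.head())
```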

Second, IT organizations can help institutional assessment efforts by being seen as partners by units outside the IT organization. Institutional assessment is hard work that requires time, patience, and healthy working relationships built on trust and mutual respect. Finding the meaning, significance, or even a recurring pattern or trend in the mountain of data at our disposal is difficult. Too often, IT organizations try to help by providing an analytics “dashboard” designed by a vendor that doesn’t know the institution. As a result, the dashboard indicators don’t focus on the key factors most needed at the institution and quickly become window-dressing. At UMBC, we have found that a collaborative approach with Institutional Research (IR), as well as with the academic programs and student affairs, improves not only our data-analysis capability but also our relationships across units. Through work with our IR department, we are able to identify critical factors for success, then ask questions, quickly get a response, see whether the response is what we need, analyze the data, and subsequently ask more questions. We have substantially strengthened assessment with this iterative process, which has also brought us all together as colleagues for succeeding projects.

Third, IT organizations can support assessment by showing how data in separate systems can become very useful when captured and correlated. For example, UMBC has spent considerable effort to develop a reporting system based on our learning management system (LMS) data. This effort, led from within the IT organization, has helped the institution find new insights into the way faculty and students are using the LMS and has helped us improve the services we offer. We are now working to integrate this data into our institutional data warehouse and are leveraging access to important demographic data to better assess student risk factors and develop interventions.7
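
As an illustration of what such LMS reporting can look like, the sketch below relates per-course LMS activity to final grades and flags students whose activity falls well below their course’s norm. It follows the spirit of comparisons like UMBC’s “Check My Activity” tool (see note 7) but is not its actual implementation; the data file and column names are hypothetical.

```python
# Minimal sketch: relate LMS activity to course grades (hypothetical data).
import pandas as pd

activity = pd.read_csv("lms_course_activity.csv")  # student_id, course_id, logins, grade_points

# How strongly does LMS activity track final grades within each course?
activity_grade_corr = activity.groupby("course_id").apply(
    lambda g: g["logins"].corr(g["grade_points"], method="spearman")
)
print(activity_grade_corr.sort_values())

# Simple risk flag: activity well below the median for the same course.
course_median = activity.groupby("course_id")["logins"].transform("median")
activity["below_median_activity"] = activity["logins"] < 0.5 * course_median
print(activity.loc[activity["below_median_activity"], ["student_id", "course_id"]].head())
```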

Finally, as institutional interventions are developed, information technology can help by refining the associated business processes to collect critical data that might not otherwise have been collected institutionally. For example, through the use of technology and institutional research at UMBC, we realized that students who were affiliated with a specific group on campus—whether a scholarship program or the Honors College—performed better academically than those who simply received merit aid but were not affiliated. In the past, these affiliations were tracked locally, and manually, in departments. The IT organization engaged the groups, redesigned the business process so that they could easily report affiliations, and provided them with tools for tracking the performance of these students. As a result, correlating affiliation with student performance became far more efficient and far more useful in decision-making.
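
The kind of comparison this enables is simple once affiliations are reported centrally. The sketch below, with hypothetical file and column names, contrasts the academic performance of affiliated and unaffiliated merit-aid recipients.

```python
# Minimal sketch: compare performance of affiliated vs. unaffiliated merit-aid
# students once affiliations are captured centrally (hypothetical columns).
import pandas as pd

students = pd.read_csv("students.csv")  # student_id, gpa, merit_aid (bool), affiliation (blank if none)

merit = students[students["merit_aid"]].copy()
merit["affiliated"] = merit["affiliation"].notna()

# Group comparison: count, mean, and spread of GPA by affiliation status.
print(merit.groupby("affiliated")["gpa"].agg(["count", "mean", "std"]))
```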

Learning Analytics

Interest in learning analytics is on the rise. The 2011 Horizon Report (http://www.nmc.org/publications/2011-horizon-report), produced by the New Media Consortium (NMC) and the EDUCAUSE Learning Initiative (ELI), identified learning analytics as one of two technologies that will be widely adopted in the next four to five years. In addition, the 1st International Conference on Learning Analytics and Knowledge (LAK) was held in Banff, Alberta, Canada, in early 2011 (https://tekri.athabascau.ca/analytics/). Reporting on the conference, a recent ELI Brief defined the field as “the collection and analysis of usage data associated with student learning.” The brief added that the purpose of learning analytics is “to observe and understand learning behaviors in order to enable appropriate interventions.”

As early as 2005, the EDUCAUSE Center for Analysis and Research (ECAR) reported survey results from 380 institutions describing their primary use of “academic analytics” along five key stages:

  • Stage 1: Extraction and reporting of transaction-level data (70% of respondents)
  • Stage 2: Analysis and monitoring of operational performance
  • Stage 3: “What-if” decision support (e.g., scenario building)
  • Stage 4: Predictive modeling and simulation
  • Stage 5: Automatic triggers of business processes (e.g., alerts)

This report’s discussion of academic analytics is broader than the current definitions of learning analytics, which focus more narrowly on students. However, a 2009 ECAR follow-up survey of 309 institutions found that a majority of institutions (58.4 percent) were still in Stage 1, involving activities primarily focused on accessing data.
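
To ground Stages 4 and 5, the sketch below fits a simple predictive model of course success from hypothetical activity and preparation features and then produces an alert list of students whose predicted chance of passing is low. It illustrates the idea only, not a production early-warning system; the data files, features, and threshold are assumptions.

```python
# Minimal sketch of Stage 4 (predictive modeling) and Stage 5 (automatic alerts).
# Data files, feature names, and the alert threshold are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

history = pd.read_csv("course_outcomes.csv")   # logins, submissions, prior_gpa, passed (0/1)
features = ["logins", "submissions", "prior_gpa"]

X_train, X_test, y_train, y_test = train_test_split(
    history[features], history["passed"], test_size=0.25, random_state=0
)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Holdout accuracy:", model.score(X_test, y_test))

# Stage 5 in miniature: flag currently enrolled students with a low predicted
# probability of passing so that advisors can reach out early.
current = pd.read_csv("current_students.csv")  # student_id plus the same features
pass_prob = model.predict_proba(current[features])[:, 1]
alerts = current.loc[pass_prob < 0.5, "student_id"]
print("Students flagged for outreach:", alerts.tolist())
```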

If the 2011 Horizon Report’s prediction is correct—and if adoption of learning analytics includes the ELI emphasis on intervention—there is much work to be done by IT leaders to prepare their institutions.

____

See Malcolm Brown, “Learning Analytics: The Coming Third Wave,” EDUCAUSE Learning Initiative (ELI) Brief, April 2011, p. 1, <http://www.educause.edu/Resources/227287>;
Philip J. Goldstein, “Academic Analytics: The Uses of Management Information and Technology in Higher Education,” EDUCAUSE Center for Analysis and Research (ECAR) Research Study, 2005, p. 60, <http://www.educause.edu/ir/library/pdf/ers0508/rs/ers0508w.pdf>;
Ronald Yanosky, “Institutional Data Management in Higher Education,” EDUCAUSE Center for Analysis and Research (ECAR) Research Study, 2009, p. 65, <http://www.educause.edu/ir/library/pdf/ers0908/rs/ers0908w.pdf>.

Taking Ownership of the Problem

In Leaving College, his seminal work on student success, Vincent Tinto asserts that institutions alone cannot solve the retention and student success problem and should not absolve students from at least partial responsibility for their own education.8 Every time students enter the Chemistry Discovery Center at UMBC, they see the following three quotations over a window to the left of the door:

  • I cannot teach anybody anything. I can only make them think. —Socrates
  • Knowledge must be gained by ourselves. —Benjamin Disraeli
  • The only real object of education is to have a man in the condition of continually asking questions. —Mandell Creighton

At UMBC, we are using analytics and assessment to shine a light on students’ performance and behavior and to support teaching effectiveness. What has made the use of analytics and assessment particularly effective on our campus has been the insistence that all groups—faculty, staff, and students—take ownership of the challenge involving student performance and persistence. Change can be sustained only when the people who are affected believe in that change. This is not a problem for leadership alone: “The best leaders, the people do not notice. . . . When the best leader’s work is done, the people say: ‘We did it ourselves!’”9

We believe the process of cultural change begins with a focus on inclusiveness, bringing all campus members into the discussions about problems and strategies and showing them the evidence that forms the basis of our approach. Shared governance and broad consultations harness the ingenuity and creativity of faculty, students, and staff. IT professionals play an important role through their understanding of technology and how to effectively innovate using technology. Learning analytics and assessment, supported by information technology, can thus change institutional culture and drive the transformation in student retention, graduation, and success.

Notes

1. National Academies, Expanding Underrepresented Minority Participation: America’s Science and Technology Talent at the Crossroads (Washington, D.C.: National Academies Press, 2010).

2. Freeman A. Hrabowski III and Jack Suess, “Reclaiming the Lead: Higher Education’s Future and Implications for Technology,” EDUCAUSE Review, vol. 45, no. 6 (November/December 2010), <http://www.educause.edu/library/ERM1068>.

3. Elliot L. Hirshman and Freeman A. Hrabowski, “Meet Societal Challenges by Changing the Culture on Campus,” Chronicle of Higher Education, January 16, 2011, <http://chronicle.com/article/Meet-Societal-Challenges-by/125937/>.

4. David Brooks, The Social Animal: The Hidden Sources of Love, Character, and Achievement (New York: Random House, 2011).

5. Association of American Colleges and Universities, “College Learning for the New Global Century,” A Report from the National Leadership Council for Liberal Education & America’s Promise (LEAP), 2007, p. 6, <http://www.aacu.org/advocacy/leap/documents/GlobalCentury_ExecSum_final.pdf>.

6. Scott A. Bass, Janet C. Rutledge, Elizabeth B. Douglass, and Wendy Y. Carter, “The University as Mentor: Lessons Learned from UMBC Inclusiveness Initiatives,” CGS Occasional Paper Series on Inclusiveness, volume 1, 2007, <http://www.cgsnet.org/Default.aspx?tabid=290>.

7. John Fritz, “Video Demo of UMBC’s ‘Check My Activity’ Tool for Students,” EDUCAUSE Quarterly (EQ), vol. 33, no. 4 (2010), <http://www.educause.edu/library/EQM1049>.

8. Vincent Tinto, Leaving College: Rethinking the Causes and Cures of Student Attrition, 2d ed. (Chicago: University of Chicago Press, 1993), p. 144.

9. Lao Tzu, Tao Te Ching.

EDUCAUSE Review, vol. 46, no. 5 (September/October 2011)