Thursday, January 14, 2010

Putting the 'A' in IR&A

When I first arrived on my campus nearly four years ago, Institutional Research and Analysis (IR&A) focused primarily on survey administration and the development of historical reports. Over time, we have maintained these responsibilities and added many more. For example, several years ago we assumed enrollment reporting from the University Registrar. We have also taken on the mandatory federal reporting responsibilities of both the Registrar and Human Resources. In addition, we were tasked with improving the institutional Data Factbook, which is now in its third iteration and currently includes 74 pages, 98 charts, and 76 tables describing the admissions, enrollment, academic, and human resource patterns at the institution. We have developed additional reports as well, including the Academic Department Factbook, Human Resources Factbook, Tommie Almanac, Survey Factbook, and many others. The number of surveys administered by this office has increased, with many legacy surveys converted from the more expensive paper format to a more efficient and cost-effective electronic version. In addition to improved efficiency in the survey administration process, response rates have increased. In the interest of transparency, we have developed an IR&A ‘Wiki’ which makes many of these reports (and others) available to the UST community.


Much of this effort has been geared toward providing descriptive statistics to better understand the overall effort put forth by the UST community. I like to think of this as the institutional research part of our job. Of course, this descriptive information does not account for the ‘analysis’ portion of IR&A. For example, many of us know the fall-to-fall retention rate of freshmen entering the institution in the fall of 2008 is approximately 86%. By itself, this number tells us little about whether or not we are being successful. Intuitively, this should make sense to all of us. After all, how many of us ‘define’ ourselves by our gender alone? Or our age? Marital status? Clearly, we are complex beings possessing multiple characteristics with which to identify or describe ourselves. I would hope we could agree that we are the sum of our parts and not easily defined by a single category or a number on a sliding scale. The same could be said for much of the information generated by our office.


This is where the analysis aspect of our work can help to guide us and ultimately lead to the ‘informed’ decision-making process critically important to the success of UST. Getting back to the retention number, we can tell right away that approximately 86 of every 100 new freshmen entering UST in the fall of 2008 returned for the fall 2009 semester. This may sound pretty good; however, the number also tells us that more than 190 students from the 2008 freshman class did not return. That number may not sound as good. If we analyze it a bit more, we can establish that the departure of these 190 students equates to approximately 1.8 million dollars in unrealized revenue that would have been generated during their sophomore year alone. When the junior and senior years of these 190 students are considered, the unrealized revenue balloons to more than 5.4 million dollars. Of course, this sounds really bad when one considers the current economic situation.
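
For anyone who wants to see the arithmetic behind these figures, a minimal sketch follows. The per-student revenue amount is simply backed out of the numbers above (roughly $1.8 million divided by 190 students), and the cohort size is inferred from the retention rate; neither is an official budget figure.

```python
# Rough sketch of the unrealized-revenue arithmetic described above.
# The per-student revenue figure is backed out of the numbers in the
# post (about $1.8 million / 190 students); it is not an official UST
# budget amount, and the cohort size is inferred rather than reported.

retention_rate = 0.86                     # fall-to-fall retention, fall 2008 cohort
non_returners = 190                       # students who did not return (from the post)
cohort_size = round(non_returners / (1 - retention_rate))   # roughly 1,360 entering freshmen

revenue_per_student = 1_800_000 / non_returners   # implied net revenue, about $9,500 per year

sophomore_loss = non_returners * revenue_per_student   # about $1.8 million
three_year_loss = 3 * sophomore_loss                   # sophomore + junior + senior, about $5.4 million

print(f"Implied cohort size: {cohort_size}")
print(f"Sophomore-year unrealized revenue: ${sophomore_loss:,.0f}")
print(f"Three-year unrealized revenue:     ${three_year_loss:,.0f}")
```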


The obvious response may be to question why we do not maintain a 100% retention rate. Certainly, this is a fair question, as everyone on campus wants each student entering UST to succeed. Further analysis can help us determine whether or not our retention rate is at an acceptable level. In 2007, the Division of Academic Affairs contracted with the American Council on Education (ACE) to measure one aspect of the learning that occurs at UST. As part of this process, ACE utilized incoming freshman characteristics (ACT scores, high school GPA, etc.) to predict our ‘expected’ fall-to-fall retention rate. The results of this analysis indicated our retention rate should be 81%. When considering the actual rate of 86%, this looks pretty good. In addition to the ACE analysis, I worked with student researchers (Allison Liebl & Megan Leners) in our office to develop an internal prediction. In this case, we gathered Catholic institution data from the federal government. We considered many factors, including admission rates, ACT scores, tuition rates, professor salaries, size of institution, etc. Our prediction determined the UST fall-to-fall retention rate should be 84%. Again, the predicted rate is below the actual rate. This information provides us with strong evidence that UST is doing something right.
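
Neither ACE’s model nor our internal one is reproduced here, but the general approach (regress peer institutions’ retention rates on their published characteristics, then plug in UST’s own profile) can be sketched roughly as follows. The file name, column names, placeholder values, and choice of ordinary least squares are illustrative assumptions rather than the actual methodology of either analysis.

```python
# Hedged sketch of an "expected retention" model of the general kind
# described above: regress peer institutions' fall-to-fall retention on
# their published characteristics, then plug in UST's own values. The
# CSV name, column names, placeholder figures, and the use of ordinary
# least squares are illustrative assumptions, not the ACE or IR&A method.

import numpy as np
import pandas as pd

peers = pd.read_csv("catholic_institutions.csv")   # hypothetical federal-data extract

predictors = ["admit_rate", "act_composite_75th", "tuition",
              "avg_full_prof_salary", "enrollment"]
X = np.column_stack([np.ones(len(peers)), peers[predictors].to_numpy()])
y = peers["retention_rate"].to_numpy()

# Ordinary least squares: coefficients minimizing squared prediction error.
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)

# Plug in UST's profile (the admit rate and ACT score come from this post;
# the tuition and enrollment values here are placeholders).
ust = np.array([1.0, 0.80, 27, 28_000, 90_915, 6_000])
expected = ust @ coefs

print(f"Expected fall-to-fall retention: {expected:.1%}")
print(f"Actual rate (86%) differs from expectation by {0.86 - expected:.1%}")
```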


But the story does not end there. Further analysis can assist us in understanding the characteristics of an institution that are related to higher retention rates. For example, we can look at institutions such as Notre Dame and Georgetown, which have retention rates of 97% and 96%, respectively. Again, using federal data, we can analyze similarities and differences between UST and these high-performing institutions. This information can help to inform a discussion on whether to work toward improving retention rates or simply to work toward maintaining them. Nationally, institutions accepting students with high ACT scores tend to have high retention rates. In this case, Notre Dame (34) and Georgetown (33) report very high 75th-percentile ACT Composite scores. These scores are significantly higher than UST's (27). The discrepancy holds true when considering faculty salaries as well: the average full professor salaries at Notre Dame ($131,108) and Georgetown ($150,336) are significantly higher than UST professor salaries ($90,915). Further, the admission rates of Notre Dame (24%) and Georgetown (21%) are dramatically lower than the UST admission rate of 80%. Taken together, UST would need to change its current academic and financial profile dramatically in order to realistically achieve the retention rates of high-performing Catholic institutions such as Notre Dame and Georgetown.
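
A quick way to keep these comparisons in one place is a small peer-profile table built from the figures cited above. The sketch below assembles one with pandas; only the numbers come from the federal data quoted in this post, while the layout and the 'gap' calculation are illustrative.

```python
# Peer-profile table assembled from the figures cited above (as reported
# in this post from federal data); the layout and the gap calculation
# are shown only to make the contrast with UST explicit.

import pandas as pd

profile = pd.DataFrame(
    {
        "retention_rate": [0.86, 0.97, 0.96],
        "act_composite_75th": [27, 34, 33],
        "avg_full_prof_salary": [90_915, 131_108, 150_336],
        "admit_rate": [0.80, 0.24, 0.21],
    },
    index=["UST", "Notre Dame", "Georgetown"],
)

# How far each high-performing peer sits from UST on every measure.
gaps = profile.loc[["Notre Dame", "Georgetown"]] - profile.loc["UST"]

print(profile)
print(gaps)
```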


As we consider the value of analysis at UST, IR&A staff members hope to extend our ongoing dialogue with multiple campus divisions to include Student Affairs. My experience in academia has taught me that Student Affairs staff members are a valuable asset in describing data from a holistic viewpoint rather than as a simple number incapable of telling “the rest of the story.” Indeed, the same argument could be made for IR&A. To date, advanced regression analysis has failed to adequately explain ‘why’ students leave UST. This is due, in part, to our lack of knowledge concerning the financial, work, and behavioral patterns of students who drop out of UST. We may be the only department on campus with the word ‘analysis’ in our title, but we can’t do it alone. An informed institutional community requires an inclusive environment in which all members feel comfortable participating in the analysis portion of the decision-making process.
