1. Objectivity and neutrality. Working in a context where different groups of stakeholders sometimes have conflicting priorities, it is crucial that OASIR neither be, nor appear to be, beholden to any particular individual, group, unit, office, department, or academic division. If OASIR is not both actually neutral and objective and perceived as such, the information and analyses it provides will be of little use to the College or to its constituent parts, not least because the Office will not be regarded as trustworthy. IR offices do not take sides in disputes; they provide information and analysis to those who request it. The IR Office's only commitment is to the accurate collection, presentation, and analysis of data as requested across the College.
  2. Balancing proactivity and responsiveness. IR offices serve their institutions through a combination of addressing specific requests and proactively obtaining data and carrying out analyses. This balancing act can be delicate and challenging, since it depends both on the nature of the tasks facing different units and on the personal preferences of those involved. A successful IR office must be prepared to respond effectively to requests while also knowing when and how to initiate research on behalf of constituents.
  3. Advocating appropriate transparency. Both because institutional research requires reasonable access to information, and as a general principle, IR offices advocate openness and transparency in sharing data appropriately across college units. Institutional researchers understand that some data are extremely sensitive and should not be shared broadly; IR offices support a process of discernment to determine the appropriateness of sharing specific data, rather than responding to the reality of data sensitivity with policies of blanket restriction.
  4. A pile of great stuff versus a great pile of stuff. Grinnell College is a relatively small organization, but it is complex and data-rich. One of the biggest challenges faced by our IR office (or that of any college) is to help communicate matters of substance with clarity, precision, and efficiency. With access to many sources of data concerning the resources, processes, and outcomes of the College, the IR Office strives to create a “pile of great stuff from a great pile of stuff.”
  5. The difference between data and information. Data are ways of expressing things (numbers, words, sounds, images), and information is the arrangement of data into meaningful patterns.[1] Data are the “unorganized sludge” of the information age.[2] Finding and then communicating the meaning in data is at the core of the work of any IR office.
  6. A single version of institutional statistics. One of the primary reasons we developed an IR office at Grinnell was to create a single place where one can find the official statistics of the institution – a single version of the truth. While those in the IR office cannot be expected to be the campus experts in all areas of College activity, they can collect and “sanity check” trends in our data and compare our data with those of other institutions.
  7. The power of graphical presentation. Because humans have different learning styles, it is often helpful to present information in multiple forms, e.g., tabular, graphical, and narrative. However, “Often the most effective way to describe, explore, and summarize a set of numbers – even a very large data set – is to look at pictures of those numbers… of all methods for analyzing and communicating statistical information, well-designed data graphics are usually the simplest and at the same time the most powerful.”[3]
  8. Quantitative and qualitative information.  While quantitative measures and statistics are a large part of the work of any IR office, a balance with qualitative information is often necessary. Quantitative measures help us understand the “what” of a situation, but not necessarily the “why.” Such qualitative measures as interviews, focus groups and open-ended survey questions are important because they capture the perspective of the people being studied, particularly in terms of how these people make sense of the situations they are in as well as what motivates them to make the choices they do. In addition, website research can provide qualitative support for some kinds of benchmarking projects.
  9. Benchmarking, trending, and forecasting. Most colleges do some amount of benchmarking, i.e., comparing their resources, processes, or outcomes with those of a well-defined set of peers. Two questions routinely arise in any such exercise: 1) are we sure we are comparing “apples to apples”? and 2) what is right for our institution, given its unique history, culture, and ethos? These questions place definite limits on the return on investment in benchmarking work. At least as powerful as benchmarking is trending and forecasting – examining how our important metrics change over time and extrapolating those trends into the future. Because trend work uses one’s own data and data definitions, it also mitigates the comparability issues noted above.
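As a small illustration of the trending-and-forecasting idea, the sketch below fits an ordinary least-squares line to a metric tracked over several years and extrapolates it forward. The helper names, the data values, and the choice of a graduation rate as the metric are all hypothetical, not actual Grinnell figures or OASIR tooling.

```python
# Minimal sketch: fit a linear trend to an institutional metric and
# extrapolate it one or more years forward. Values are illustrative only.

def fit_linear_trend(years, values):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    sxx = sum((x - mean_x) ** 2 for x in years)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

def forecast(years, values, future_year):
    """Extrapolate the fitted trend line to a future year."""
    slope, intercept = fit_linear_trend(years, values)
    return slope * future_year + intercept

years = [2016, 2017, 2018, 2019, 2020]
grad_rate = [84.0, 84.5, 85.1, 85.4, 86.0]   # illustrative percentages

print(round(forecast(years, grad_rate, 2022), 1))  # prints 87.0
```

A linear fit is only a starting point; in practice an office would inspect the trend visually (see principle 7) before trusting any extrapolation.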
  10. Human intelligence and analytics. Small, private, residential liberal arts colleges have long relied on human intelligence networks made up of faculty, professional advisors, other administrators, and the students themselves to find the right balance of challenge and support for individualized learning and to monitor student progress toward a degree. In fact, such networks are, and will continue to be, a primary source of distinctiveness and strength for these campuses. While OASIR will continue to provide research support for such networks, we are also very interested in the ongoing integration of predictive learning analytics at Grinnell to complement these networks. Learning analytics has been defined as “the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.”[i] With the advent of new analytics techniques, including data mining and machine learning, liberal arts colleges are in a position to join other colleges and universities that are developing or enhancing alert systems and predictive models based on these techniques.
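To make the idea of a predictive alert model concrete, here is a minimal sketch of the kind of early-alert score such a system might compute. The feature names, coefficients, and threshold are entirely hypothetical; a real model would be fit to institutional data, validated carefully, and used only to prompt human follow-up, never as a verdict.

```python
# Minimal sketch of an early-alert score of the kind a predictive learning
# analytics system might compute. All features and weights are hypothetical.
import math

def alert_score(attendance_rate, lms_logins_per_week, midterm_gpa):
    """Logistic model: estimated probability a student may need outreach."""
    # Hypothetical coefficients; a real model would be fit to campus data.
    z = 4.0 - 2.5 * attendance_rate - 0.15 * lms_logins_per_week - 0.8 * midterm_gpa
    return 1 / (1 + math.exp(-z))

def needs_outreach(student, threshold=0.5):
    """Flag the student for human follow-up when the score crosses threshold."""
    return alert_score(**student) >= threshold

struggling = {"attendance_rate": 0.6, "lms_logins_per_week": 2, "midterm_gpa": 2.1}
thriving = {"attendance_rate": 0.95, "lms_logins_per_week": 8, "midterm_gpa": 3.7}

print(needs_outreach(struggling))  # prints True
print(needs_outreach(thriving))    # prints False
```

The flag feeds the human intelligence network described above; the model narrows attention, and people decide what support is appropriate.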
  11. Separating information by type and stakeholder. Benchmarking information should be separated from trend data when both appear in a document, as they address different kinds of comparisons (comparisons across institutions vs. comparisons over time). Similarly, IR offices do well to provide information specifically targeted by stakeholder, meaning (for example) that faculty-related data are provided to the Dean's Office and admissions-related data are provided to the VP for Enrollment Management.
  12. Campus training in research issues. While teaching other campus units how to carry out their own research is generally a small part of IR office work, it can benefit both the IR office and the unit in question when some kinds of data collection and analysis can be done outside the IR office, not least because IR offices always face more work than they can complete in a timely fashion. If another unit wishes to speed the data collection and analysis process and has staff capable of doing such work, it may ask the IR office for training or advice; where possible, the IR office does well to support research independence in such situations.

Notes

[1] Stan Davis and Jim Botkin, The Monster Under the Bed: How Business Is Mastering the Opportunity of Knowledge for Profit, Simon & Schuster, New York, 1994, p. 42.
[2] Robert Lucky, former director of AT&T Bell Labs.
[3] Edward R. Tufte, The Visual Display of Quantitative Information, Graphics Press, Cheshire, Connecticut, 2001, p. 9.

*Note: this list of principles is derived from a similar list developed by Randy Stiles and Amanda Udis-Kessler at Colorado College in 2010.