Academic Analytics, Big Data, and the Tenure Track

A recent controversy at the University of Missouri (UM) has re-ignited a debate about higher education’s use of information from Academic Analytics. Scores of major universities contract the big-data analysis company to measure and benchmark the productivity of their academic departments, programs, and faculty.

The latest issue concerns UM’s plan to use data from Academic Analytics to measure the scholarly impact of faculty members’ research products - their grants, papers, journal articles and books - and to include that analysis as part of some promotion-and-tenure dossiers. The plan at UM is to add this measurement to several other dossier elements, including the traditional letters of reference written by external peers who are asked to evaluate promotion-and-tenure candidates based on scholarly reputation and influence in their field of study.

Such letters are a routine element in the advancement dossiers of candidates at most research universities, like the University of Missouri, which includes four campuses. These dossiers also include many other materials, such as published journal articles, awards, grants, books and evidence of effective teaching and service. Academic Analytics compiles information about many of these products and also provides comparative data for scholars in various fields and among various institutions.

Professors Push Back

Faculty members at UM have voiced their objections to the school’s use of Academic Analytics in the promotion and tenure process. As reported in the Columbia Daily Tribune, faculty expressed their concerns to Mun Choi, UM System president and chancellor of the flagship campus at Columbia (MU), as well as to MU Provost Latha Ramchand, during the Spring General Faculty Meeting, conducted over Zoom.

Faculty objections focused on three issues:

Transparency

First, faculty appeared surprised by the use of Academic Analytics in personnel decisions, criticizing it as a departure from the transparency that should characterize the process. As quoted in the Tribune, faculty member Stephen Karlan said, “I have concerns about using Academic Analytics data for promotion and tenure decisions. This is a radical break with previous practice.” Others noted that the company itself has not promoted the use of its data for individual personnel decisions.

The university responded by noting that Academic Analytics data are just one piece of information to be considered in a comprehensive tenure process, and that the data are not more important or valid than other information included in the review. According to the university, the promotion and tenure process will continue to rely on expertise, input, and evaluation from several layers of faculty and university leadership.

Access

There were also concerns that faculty members under evaluation would not be given access to their data, a restriction President Choi acknowledged should not be imposed. “I don’t see why faculty members would not be able to see the data for their own cases,” he said. The university has made some data available to faculty for at least a year through an Academic Analytics summary tool, but it also plans to provide faculty with more of the underlying data and comparative information that are now available to department chairs and deans.

Accuracy

And finally, there were worries that the data were not always complete or accurate. One faculty member said the reason everyone involved in promotion and tenure decisions must see the data is that they don’t always make sense. Another opined that they can be “inaccurate, confusing and fuzzy.” The university says faculty can request additions or corrections to their profile information, and it encourages them to engage with the data both to ensure all their scholarship is captured and to connect with potential collaborators, grants, and award opportunities.

A Recent History of Dispute

UM has had a contract with Academic Analytics for about ten years. Under the current terms, the university will pay $527,000 next year for the full suite of services, according to university spokesman Christian Basi.

But this is not the first dispute over the use of Academic Analytics. Five years ago, Georgetown University dropped its subscription with the company for a time after concluding that there were problems with the comprehensiveness and accuracy of its data on individuals. Around the same time, the American Association of University Professors (AAUP), weighing in on faculty concerns at Rutgers University, warned:

“Colleges and universities and their faculty members should exercise extreme caution in deciding whether to subscribe to external sources of data like Academic Analytics and should always refrain from reliance on such data in tenure, promotion, compensation or hiring decisions. In cases where such data is made available, it must be employed subordinate to a process of effective peer review in accordance with longstanding principles of academic freedom and shared governance.”

The Academic Analytics Value Proposition

That advice gets to the nub of the current issue. To what extent should large databases about faculty productivity be used for high-stakes personnel decisions in today’s higher education landscape? Should such number-crunching be prohibited, or should it be accepted, as the University of Missouri intends, as a tool - one additional piece of information that administrators can use to evaluate the achievements of their faculty members, particularly in cases perceived to be “close calls”?

If these data are included in the dossier, it is not clear why they should be considered “subordinate,” as the AAUP suggests, to the subjective opinions of individual reviewers, who - as is well known in the academy - are often curated precisely because their views of a tenure candidate, whether favorable or unfavorable, can be anticipated. What’s more, institutions that have used Academic Analytics typically praise the company’s willingness to respond to feedback and correct occasional errors and omissions. That sort of iterative correction is not possible with individual reviewers, who are free to assert their opinions with virtually no constraints.

Promotion and tenure dossiers typically contain a lot of data of variable reliability and validity. These data must be evaluated with discretion. Ideally, one looks for consistency or convergence in the data that leads to a “best judgment” about the quality of a scholar’s work. Privileging one source of data is risky. So is prohibiting it.

Including comparative indices of a faculty member’s scholarly work is a knife that cuts both ways. The typical faculty member worries that the data might undercut the reputation for scholarly superiority he or she enjoys on campus, but the data may more often work in the opposite direction, invalidating the isolated negative comments in external letters of reference that so often receive disproportionate attention in faculty dossiers.

And there is another potential advantage that users of these comparative data might come to appreciate: they allow comparisons of apples to apples in the academic orchard. The productivity of faculty in the humanities is compared to that of their disciplinary peers, rather than to that of colleagues in engineering or the social and life sciences, who almost always have more publications, citations, and grant awards.

Who knows? Some day, faculty might even want to turn the tables and use Academic Analytics’ metrics to evaluate administrators - chairs, deans, provosts, and yes, even chancellors and presidents.

The Future of Big Data in Higher Ed

Like it or not, data mining and analysis companies like Academic Analytics will probably become more common in the future. Universities must constantly make major strategic determinations about, for instance:

  • which academic departments should be strengthened and which should be diminished;
  • which research areas represent an institution’s best prospects for more research funding, and which are being overlooked by major funding agencies; and
  • how successful graduates of one academic program are versus those completing others.

Addressing these questions requires the best data possible.

Moreover, university administrators, boards of trustees, accrediting bodies, and policy makers will increasingly want to know how a given school compares to peer institutions or others within the same state or region. Well-informed internal allocations and performance-based public appropriations depend on the most comprehensive benchmarking data possible.

For these reasons, contracting with external data mining companies will be embraced as colleges seek customized evaluations and solutions for high-stakes institutional decisions. Moneyball - welcome to the academy.

Look for two other trends to unfold in the future:

  1. more companies starting up and competing in this market; and
  2. closer collaboration between universities and data companies that will enable more tailored, accurate, and high-quality databases that institutions and faculty alike can trust.
