Netherlands Research Integrity Network (NRIN)’s Post

Netherlands Research Integrity Network (NRIN) reposted this

Andrew W. Wilkins

Reader & Director of Research at Goldsmiths, University of London

THIS 👇 We need to reclaim the thinking time necessary to produce quality research that moves, advances, or disrupts knowledge. Fundamentally, this means pushing back against university managers who wrongly celebrate quantity over quality. Write better papers; write fewer papers.

Roeland Buckinx

Manager Health Campus Limburg DC > #healthcare #lifesciences #innovation #innovationareas #economicdevelopment #collaboration

1w

For anyone following leading journals such as Nature, Science, and the like, the sheer volume of op-eds on this topic, year after year, is stunning. At the same time, the academic sector remains, by its nature, a bottom-up, stakeholder-driven sector. Academics often populate committees, governance boards, etc… Yet it seems impossible to break this cycle of (new-public-management-driven) quantitative, output-oriented policies. That seems counter-intuitive. It is easy to point to “managers”, but one has to acknowledge that (leading) academics themselves often support this system and hinder change.

Véronique Vitry

Professeur/ Professor at UMONS - Conseillère pour l'Internationalisation at Home - Présidente at A3TS

1w

And it's not only the pressure to publish that we need to recover time from. The pressure of admin can be just as damaging, if not worse. I have colleagues whose PhD students are required to have a formal meeting with their supervisory team every other week, with slides and minutes that must be posted to a repository as proof. Not only does that take a lot of their time, but it also keeps them permanently on a two-week deadline. How can you think creatively, or even reflect on your findings, when you're always working towards a short-term aim?

Sonia Peter

Executive Director - Biocultural Education and Research Programme and Founding Director - Heritage Teas Barbados

1w

Valid points are being made in this discussion. From my perspective as a scientist managing a nonprofit in the biocultural diversity space, I argue that the bulk of knowledge gained by targeted research is of no use to the community when held behind electronic barriers and library vaults. The impact factor that fuels recognition of research value should not be based solely on the number of citations or the reach within the scientific community, but on the contribution made towards real-time solutions to challenges faced by the global community.

Douglas Marchuk

Professor at Duke University Medical Center

1w

It’s so much easier to measure success by metrics that merely involve counting, i.e. the number of papers and the amount of grant money. But success measured in impact and significance requires the passage of time.

So do we encourage academics to publish less and write fewer grants, and industry scientists to work on fewer projects? How else do we create 'more time to think'? How do we change incentives like tenure to accommodate these changes? Calling for more time to think is easy; implementing workplace practices is hard.

Alessandro Palermo

Professor in Structural Engineering at the University of California, San Diego and part-time MBA Student at University of Canterbury

1w

I don’t think the metric is the real problem. If we look at good parameters, as in Scopus SciVal, we can understand whether what we do is impactful and design our research pathway. Most people use it simply to advertise or show how good they are. The real problem behind the lack of thinking, especially in the U.S., is the capitalistic environment that pushes academics to chase fund after fund, such that the publications do come, but I feel they are secondary and sometimes rushed. Moreover, being new to the U.S., this constant pressure to run, run, run doesn’t really help one step back, think, and look at things differently. All the good ideas I had were nurtured when I was not working and actually relaxing! Sorry for the long comment!

Quantity is obviously an indirect measure of impact, but I don't think managers understand how incredibly poor a measure it is. No one has ever introduced two peers by saying, "Meet so-and-so; you may recognize them from their 15 publications last year." It may be hard to quantify an introduction like, "Meet so-and-so; you may recognize them from their work on ____," but that's the hard work I expect from someone who wants to manage researchers.

The “concept of money” is very useful, and it is the most effective way to organize many things; yet when it becomes the sole determinant of all human activity, its cumulative and progressive aspect suddenly becomes regressive and averaging. Therefore, when time becomes money, the end product is inevitably an increase in quantity at the expense of a decrease in quality. The result is hardly a surprise. Are we looking for a simple and cheap solution? I think we need to increase the number of “curious children”, regardless of their profession, whether they are academics or something else, and then we need to reduce the social and economic barriers between curious children.

Seyed Mehdi Ahmadi, PhD

Dedicated Nutrition Researcher | Evidence Synthesis | Academic Contributor

1w

To fix the overfocus on publishing for promotions, universities need to value more than just papers. They should reward impactful work like mentoring, community engagement, and real-world problem-solving. Recognizing teamwork, innovative ideas, and alternative outputs like tools or policy briefs can shift the focus from quantity to quality. Giving researchers more time to think, reducing admin burdens, and using holistic reviews with peer and community feedback can help, too. The goal is to align promotions with meaningful contributions that benefit society, not just academic metrics.

Abdul Aziz Malik Diop

Associate Professor Umm al Qura University

1w

Thanks for sharing this most pertinent piece. Once prestigious universities decided that quantity was a key evaluation metric (perhaps the key metric), other universities had no choice but to toe the line. Universities around the world are replete with thousands of publications from faculty members; yet no one seems to worry about the impact any of these publications have on the advancement of knowledge. In fact, the only research paradigm that seems to matter is for the researcher to fill the proverbial gap, regardless of whether doing so tells us anything pertinent. The cycle continues…
