ACX Unjournal application
(David Reinstein, and possibly others, involved in the project)

Short one-sentence description of your proposed project

The "Unjournal" will organize and fund 'public, journal-independent evaluation' of EA-relevant/adjacent research, encouraging this research by making it easier for academics and EA-organization researchers to get feedback and credible ratings.

Longer description of your proposed project

Peer review is great, but academic publication processes are wasteful, slow, and rent-extracting, and they discourage innovation.

  • Academic publishers extract rents and discourage progress, but there is a coordination problem in 'escaping' this. Funders like Open Philanthropy and EA-affiliated researchers are not stuck; we can facilitate an exit.

  • The traditional binary ‘publish or reject’ system wastes resources (wasted effort and gamesmanship) and adds unnecessary risk. I propose an alternative, the “Evaluated Project Repo”: a system of credible evaluations, ratings, and published reviews (linked to an open research archive/curation). This will also enable more readable, reliable, and replicable research formats, such as dynamic documents; and allow research projects to continue to improve without “paper bloat”. (I also propose some ‘escape bridges’ from the current system.)
  • Global priorities and EA research organizations are looking for 'feedback and quality control', dissemination, and external credibility. We would gain substantial benefits from supporting and working with the Evaluated Project Repo (or related peer-evaluation systems), rather than (only) submitting our work to traditional journals. We should also place some direct value on open science and open access themselves, and on the strong impact we may have in supporting them.

I am asking for funding to help replace this system, with EA 'taking the lead'. My goal is permanent and openly-hosted research projects, and efficient journal-independent peer review, evaluation, and communication. (I have been discussing and presenting this idea publicly for roughly one year, and have received a great deal of feedback; I return to this in the next section.)

I propose the following 12-month Proof of Concept:
Proposal for EA-aligned research 'unjournal' collaboration mechanism
  1. Build a ‘founding committee’ of 5-8 experienced and enthusiastic EA-aligned/adjacent researchers at EA orgs, research academics, and practitioners (e.g., drawn from speakers at recent EA Global meetings).
I will publicly share my procedure for choosing this group. (In the long run we will aim for a transparent and impartial process for choosing ‘editors and managers’, as well as for decentralized forms of evaluation and filtering.)

  2. Host a meeting (and shared collaboration space/document) to come to a consensus/set of principles on:
  • A cluster of EA-relevant research areas we want to start with
  • A simple outreach strategy
  • How we determine which work is 'EA-interesting’
  • How we will choose ‘reviewers’ and avoid conflicts of interest
  • How we will evaluate, rate, rank, and give feedback on work
  • The platforms we will work with
  • How to promote and communicate the research work (to academics, policymakers, and the EA community)

  3. Post and present our consensus (on various fora, especially in the EA, Open Science, and relevant academic communities, as well as through proactive interviews with key players). Solicit feedback. Hold a brief ‘follow-up period’ (1 week) to consider adjusting the consensus plan in light of the feedback.

  4. Set up the basic platforms and links.
  • Note: I am strongly leaning towards https://prereview.org/ as the main platform; they have indicated willingness to give us a flexible ‘experimental space’.

  5. Reach out to researchers in relevant areas and organizations, asking them to 'submit' their work for 'feedback and potential positive evaluations and recognition', and for a chance at a prize.
  6. The unjournal will *not be an exclusive outlet.* Researchers are free to also submit the same work to 'traditional journals' at any point.
  7. Their work must be publicly hosted, with a DOI. Ideally the 'whole project' is maintained and updated, with all materials, in a single location. We can help them host their work and obtain DOIs through (e.g.) Zenodo; even hosted 'dynamic documents' can be assigned DOIs.

Researchers are encouraged to write and present work in 'reasoning transparent' (as well as 'open science' transparent) ways. They are encouraged to make connections with core EA ideas and frameworks, but without being too heavy-handed. Essentially, we are asking them to connect their research to 'the present and future welfare of humanity/sentient beings'. 

Reviews will, by default, be made public and connected with the paper. However, our committee will discuss (i) whether/when authors are allowed to withdraw/hide their work, and (ii) when reviews will be ‘signed’ versus anonymous. In my conversations with researchers, some have been reluctant to ‘put themselves out there for public criticism’, while others seem more comfortable with this.

We aim to have roughly 25 research papers/projects reviewed/evaluated and 'communicated' (to EA audiences) in the first year.

My suggestions on the above, as a starting point...
  • Given my own background, I would lean towards ‘empirical social science’ (including Economics) and impact evaluation and measurement (especially for ‘effective charitable interventions’) 
  • Administration should be light-touch, to also be attractive to aligned academics
  • We should build "editorial-board-like" teams with subject/area expertise
  • We should pay reviewers for their work (I propose $250 for 5 hours of quality reviewing work)
  • Create a set of rules for 'submission and management': which projects enter the review system (relevance, minimal quality, stakeholders, any red lines or 'musts'), how projects are submitted (see above, but let's be flexible), and how reviewers are assigned and compensated (or 'given equivalent credit')