Ugh, "SocArXiv"
Inefficiency and Inequity of the Law Review Submission System
Chad M. Topaz¹,²,³,*
¹Williams College, Williamstown, MA, USA; ²University of Colorado Boulder, Boulder, CO, USA; ³QSIDE Institute, Williamstown, MA, USA
*Corresponding author: cmt6@williams.edu
Abstract: Where a legal scholar works shapes publication outcomes nearly as much as what they write. In the law review submission system—the primary publication market for legal scholarship in the United States—student editors face thousands of submissions for a handful of slots and rely heavily on institutional prestige as a proxy for article quality. We build a calibrated agent-based simulation of this market and benchmark it against deferred acceptance, a centralized matching algorithm used in markets like medical residencies. The simulation predicts severe misallocation: more than 60% of top-tier placements differ from what centralized signal-based matching would produce, and the rank correlation between article quality and journal prestige is 0.45 versus 0.79 under centralized matching. Which system produces better placements overall depends on how many authors are competing for how many slots. As competition intensifies—a trend already underway—the current system's disadvantage grows, with the model predicting up to a 13.4% loss in match quality. Partial reforms like extending deadlines have negligible effects; in the simulation, the primary source of inefficiency is the decentralized structure of the market itself. The simulation also reveals that credential dependence produces inequity that persists even among articles of comparable quality: authors from prestigious institutions receive markedly better placements regardless of the matching mechanism. Centralized matching fixes the sorting problem but not this equity problem—prestige bias is embedded in editorial signals and would require changes to how articles are evaluated, not just how they are assigned.
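For readers unfamiliar with the benchmark named in the abstract: deferred acceptance (the Gale-Shapley algorithm) is the standard centralized matching procedure used in markets like medical residencies. Here is a minimal sketch of the author-proposing variant with journal capacities; the function and variable names and data layout are illustrative, not taken from the paper's code:

```python
def deferred_acceptance(article_prefs, journal_scores, capacity):
    """Author-proposing deferred acceptance: each article proposes down its
    preference list; a journal tentatively holds its best proposals up to
    capacity and rejects the rest."""
    next_choice = {a: 0 for a in article_prefs}   # index of next journal to try
    held = {j: [] for j in journal_scores}        # tentatively accepted articles
    free = list(article_prefs)                    # articles not held anywhere yet
    while free:
        a = free.pop()
        prefs = article_prefs[a]
        if next_choice[a] >= len(prefs):
            continue                              # exhausted its list: unmatched
        j = prefs[next_choice[a]]
        next_choice[a] += 1
        held[j].append(a)
        # the journal keeps only its top `capacity` proposals by its own score
        held[j].sort(key=lambda x: journal_scores[j][x], reverse=True)
        free.extend(held[j][capacity:])           # rejected articles try again
        held[j] = held[j][:capacity]
    return held
```

The key property: the outcome is stable with respect to the journals' scores, so placement quality is limited only by how noisy those scores are, which is exactly the comparison the paper's simulation makes.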
This'll be my last post on this (unless/until publication) but the fruits of my rage are now "officially" posted on SocArXiv and have been submitted for publication, yay!
"Inefficiency and inequity of the law review submission system"
Link: osf.io/preprints/so...
Ah, the two genders
The model assumes that articles have a latent "quality" that no one observes perfectly and that editors perceive noisily. The noise levels and related quantities are all parameters that are varied in the model. See results.
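That setup can be sketched in a few lines. This is a hypothetical illustration only: the names, the additive prestige bump, and the Gaussian noise are my assumptions for exposition, not the paper's calibrated specification.

```python
import random

def perceived_score(true_quality, elite_affiliation,
                    prestige_weight=0.5, noise_sd=1.0):
    """Editor's noisy perception of an article's latent quality.

    The article's true quality is unobserved; the editor sees it plus a
    prestige bonus for elite affiliations plus random noise. Both
    prestige_weight and noise_sd would be swept as model parameters.
    """
    prestige_bonus = prestige_weight if elite_affiliation else 0.0
    return true_quality + prestige_bonus + random.gauss(0.0, noise_sd)
```

With a prestige bump baked into the signal itself, any matching mechanism that consumes these scores inherits the bias, which is why centralized matching alone can't fix the equity problem.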
My extra radical far loony left position is that no one should ever be allowed to make a profit from academic publishing. It is a public good. See also: education, healthcare (on a good day, public transportβ¦)
Anyone in my orbit in the exclusive club of 9,000 handling editors for PLOS One? Asking for real.
🚨 What if some bitches simulated the bonkers law review submission market to show exactly how it's wildly inefficient and deeply inequitable? It's me, I'm bitches!
(Here's a *much* refined version of yesterday's preprint.)
Sharing is caring!
#LawSky #AcademicSky
drive.google.com/file/d/1dsDm...
Ok but for real, asking for advice. Where should I send work like this? I actually have no idea.
Told my kid that with St Patrick's Day coming up, he should get ready for a week of me making Irish-themed dad jokes. He glared at me. "Glare all you want, I'm Dublin' down on it."
💯
Said as a very quanty person: some dude I donβt know on here shitting on academic fields for having theoretical frameworks and not using causal inferenceβ¦ earns an instablock.
In other news, trans rights are human rights.
We are a three-electric-car family and itβs never been a better time.
Yes exactly!!
omg I don't know how I missed this three months ago. I'm ☠️.
It's interesting for me to ponder this. I have been using some Claude Code. I'm at a tiny undergrad-only school, where I only get, at best, a semester to work on research with a student. The LLM hasn't replaced any person's labor but it has helped finish projects that I would otherwise leave undone.
^^^This.
Lol I have not submitted anywhere because I am honestly not sure where to send it, but I am SHAMELESSLY soliciting on social media here. Tell all your friends or students or whomever. First one to solicit this article from me gets it.
For some of the folx who chimed in yesterday: there's a much-refined version posted now. Just click through to skeet below. @hoffprof.bsky.social @bdgesq.bsky.social @davidasimon.bsky.social @msmith750.bsky.social @profferguson.bsky.social @narosenblum.bsky.social
oh p.s., I have not yet submitted this anywhere and I literally dare any legal publication to solicit this from me.
Take note, @abovethelaw.com !
I agree with some of this and think my model disagrees with some of it! Thanks for sharing!
Fair point, and I will read! Thanks!
lol omg QSIDE is not in Orono. This is a LaTeX copy-paste error and why it is good to post a preprint before submitting somewhere.
Thanks — and to be clear, the paper doesn't argue for peer review. The benchmark is a centralized matching algorithm using the same noisy student-editor signals. The finding is that even without changing who evaluates, changing how articles are assigned fixes most of the misallocation. The equity problem is harder — that one lives in the evaluation itself, and you're right that peer review wouldn't automatically fix it either.
🤣 Here's the preprint in my Google Drive while I wait for it to post on SocArXiv. Also I literally did this in like 72 hrs of not sleeping so Imma need to check my work (even) more before submitting. drive.google.com/file/d/1dsDm...
Hopefully this will post to SocArXiv within a day or two (I chose them instead of SSRN in the end) but for now, you can "enjoy" this version in my google drive. Like I said, this one's for you, #LawSky! drive.google.com/file/d/1dsDm...