
William Sessions

@william.maxoutput.ai

Helping clinical research sites get answers to protocol queries in seconds. Chief Strategy Officer & Co-Founder @MaxOutput.

23 Followers · 127 Following · 152 Posts · Joined 08.12.2025

Latest posts by William Sessions @william.maxoutput.ai

This week's Site Sanity breaks down the budget mechanics behind it and what sites can actually negotiate to fix it.

10.03.2026 22:20 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

80% of clinical sites are running on six months of cash or less

10.03.2026 22:20 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

Time to First Patient In isn't just a timeline metric. It's a survival signal.
Slow FPI tells sponsors your site can't execute. And they remember.
Speed = system health. If enrollment drags, fix the backend first.

#ClinicalResearch

09.03.2026 21:25 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

Research suggests 37% of sites consistently under-enroll. When you're a 2-3 person independent site, every empty slot is payroll you're covering out of pocket.

The break-even math is tough. The admin burden makes it worse.

#ClinicalResearch #MSky

08.03.2026 22:16 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

Research suggests CRCs toggle between 22+ systems per trial. Not a training problem. A retention problem.

You can't fix burnout with pizza parties when the cognitive load is structural.

#ClinicalResearch

07.03.2026 22:15 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

Most sites track screen failures. Almost none track them by referral source.

That's not a documentation problem. It's a data architecture problem.

Fix the structure, and suddenly you know which pipelines work.

#ClinicalResearch

06.03.2026 22:20 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

Screen failures aren't recruitment failures. They're systems failures.

Most sites lose candidates in the middle - between contact and consent. That's where Cora lives.

cora.getmaxoutput.com

#ClinicalResearch #ScienceFeed

05.03.2026 23:45 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

Research suggests screen failures cost sites ~$1,200 in unrecoverable labor. Per occurrence.

Most sites track conversion rates. Few track cost by referral source.

That's the actual admin burden no one's measuring.

#ClinicalResearch

04.03.2026 22:24 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

https://open.substack.com/pub/maxoutput/p/everyone-at-scope-heard-80-heres?r=6o81r5&utm_campaign=post&utm_medium=web&showWelcomeOnShare=true

03.03.2026 22:21 πŸ‘ 1 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

This week's newsletter: the recruitment math nobody talks about.

03.03.2026 22:21 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

AI tools can't fix a broken enrollment funnel. But they can show you exactly where it's breaking.

03.03.2026 22:21 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

The average trial now generates ~296 protocol deviations. Nearly 33% of the data collected isn't even scientifically essential.

This isn't a training problem. It's a design problem.

E6(R3) finally gives sites the language to push back on bloated protocols.

#ClinicalResearch

26.02.2026 23:22 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

The average site operates under conflicting protocol versions for 215 days per trial. Multiply that by 10 active studies. You aren't disorganized - you're managing structural chaos with a calendar and a prayer. #ClinicalResearch #CRC

25.02.2026 22:25 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

Protocol Complexity breaks down the math: https://open.substack.com/pub/maxoutput/p/why-clinical-trial-complexity-is?r=6o81r5&utm_campaign=post&utm_medium=web&showWelcomeOnShare=true

24.02.2026 23:24 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

This isn't negligence. It's arithmetic.

24.02.2026 23:24 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

301 procedures. 215 days juggling conflicting protocol versions. 24% of sites now declining trials outright.

24.02.2026 23:24 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

296 deviations per trial. A 3x increase in a decade.

24.02.2026 23:24 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0
Image from Airtable

301 procedures per trial. 4.9M data points. 32.5% of it non-essential.

24% of sites are now declining trials because the complexity is operationally impossible.

#ClinicalResearch #CRC

23.02.2026 23:29 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

Who called it "eligibility screening" instead of "reading a 47-page mystery novel to find out the patient was disqualified on page 12 by a footnote that references an appendix"?

#ClinicalResearch #CRC

20.02.2026 23:22 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0
Image from Airtable

30-35 eligibility criteria. 300+ procedures to verify. 80% of EMR data buried in unstructured notes.

Eligibility screening isn't a memory test - it's a systems design problem.

When CRCs spend 31% of their week hunting through charts, that's broken architecture.

#ClinicalResearch

19.02.2026 22:24 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

55-69% of screen failures? Same I/E criteria every time. 60% of RCTs have at least one poorly justified exclusion criterion.

SCRS calls them "unicorn protocols."

Predictable failures, labeled unavoidable. The system isn't broken by accident.

#ClinicalResearch

18.02.2026 23:22 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0
The 41-Minute Guessing Game: Why Eligibility Screening Is Broken

41 minutes to screen one patient. 36% fail anyway. $1,200 lost per failure because the protocol was written like a mystery novel.

The eligibility system is designed to fail CRCs.

New newsletter breaks down why (and what actually works).

18.02.2026 01:15 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

β€œAI that summarizes PDFs” β‰  β€œAI that changes outcomes.”
CRCs need tools that cut screening time, surface eligibility risks, and reduce deviations across active protocols, not just shorter text.
If it doesn’t change your workflow, it’s just decoration.

16.02.2026 23:22 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0
Image from Airtable

Protocol deviations are like Valentine's Day plans - no matter how carefully you document them, something unexpected is getting reported to the IRB.

Happy Friday. πŸ’”

13.02.2026 23:23 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0
Image from Airtable

FDA's new safety reporting guidance clarifies what many sites get wrong: sponsors determine if an event is "expected" or not. You assess causality. They see the full safety database across all sites.

12.02.2026 23:22 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0
Image from Airtable

FDA rejected both January warning letter responses. Not because sites didn't promise to fix things. Because they didn't explain HOW.

Root cause analysis. Specific procedures. Timelines. Verification methods.

Promises don't pass inspections. Systems do.

11.02.2026 23:25 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

Protocol adherence shows up in 40-45% of FDA inspection severity changes. It's the #1 reason clinical research sites get upgraded to more serious findings.

https://open.substack.com/pub/maxoutput/p/two-warning-letters-in-five-days?r=6o81r5&utm_campaign=post&utm_medium=web&showWelcomeOnShare=true

10.02.2026 23:33 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

So the Wegovy pill ad basically tried to promise weight loss, emotional healing, and a whole new life arc, but the FDA said this isn't therapy in a capsule.

10.02.2026 19:43 πŸ‘ 1 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

FDA inspectors upgrade severity in 40-45% of cases for one reason: protocol adherence failures.

Not data integrity. Not safety reporting. Protocol compliance.

Tomorrow's newsletter breaks down what sites miss most - and what to fix now. <https://maxoutput.substack.com/>

#ClinicalResearch #MSky

09.02.2026 23:30 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

Wild to see modern imaging used on a 28,000-year-old skull like this. Do you think it actually makes sense to diagnose specific conditions like NF1 in remains this old, or does it risk over-interpreting what the bones show?

08.02.2026 03:28 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0