Ah, I think he likes it in many ways, though it's unfortunately harder to get research funding at a small liberal arts college like the one he's at.
He liked it, although despite being just off campus, it wasn't an area with many students living nearby.
He's a college professor in a small town and during his first couple years the college put him up in a house just off campus.
I decided to check what the longest piece was, and it's that long because you're supposed to play it 840 times in a row.
This reminds me that a friend used to have a ~3 minute walk to work so I made a playlist of all the <=3 min songs I like: open.spotify.com/playlist/7h5...
It's not exactly what you're looking for, but in the DeepSeek OCR paper they were able to compress context by ~10x by using image tokens instead of text: arxiv.org/abs/2510.18234
I only recently learned there are people who hate WebP. I naively assumed it must be about wanting to adopt JXL or AVIF instead, but no, it's just not wanting a new, better format at all.
I think @jefferyharrell.bsky.social
my flatmate btw
The smallest image models I know of are the Stable Diffusion models: ~1B parameters, something like ~3 GB once you include the VAE and text encoder. I'm surprised there hasn't been significant progress since then on how small a coherent image model can be.
Welcome to the part of Bluesky that takes AI seriously! Here are a couple starter packs for serious AI discussion on Bluesky if you want to find more:
bsky.app/starter-pack...
bsky.app/starter-pack...
Her piece does talk about how it loses meaning at long lengths:
The circle-packing-in-a-circle page dates to 1999.
This site has been around a very long time, I remember first finding it over a decade ago.
1T seems possible a few years hence: the largest companies by revenue today are Walmart and Amazon, each at about two-thirds of a trillion, and 1T is also less than 1% of world GDP. But this requires speculating more than three years out, which is problematic.
yeah, surprisingly weak relationship between water % and how wet they seem
Same! I remember learning as a kid that watermelons were ~92% water and assuming that other fruits must be so much lower.
Many fruits and veg are surprisingly high % water
I agreed with Dario at the time, and still do. It's good to be cautious on account of the known unknown of future growth. But unknown unknowns like this supply chain risk make not overextending yourself look even better.
yup
A couple reasons:
-It's a good indicator of capabilities progress, more useful than benchmarks IMO, and I'm more worried about all types of risk in a world where AI capabilities progress is very rapid.
-More revenue is itself one of the drivers of infrastructure spend which drives further progress.
At their current 10x/year growth rate, they should automate all labor in about 3.5 years.
Anthropic revenue growth is terrifyingly fast www.bloomberg.com/news/article...
These are four particularly good long reads on the topic of military software provisioning from the past few days.
www.hyperdimensional.co/p/clawed
thezvi.substack.com/p/secretary-...
jessicatillipman.com/what-rights-...
www.astralcodexten.com/p/all-lawful...
TBH I'm not sure; it may be speculation. Lin was an open-source champion, and many people on X are suggesting the firing signals a shift toward further commercialization. I'm not so sure myself, though.
i.e., targeting daily active users as the key metric to maximize
Some more context on Lin, via SCMP's Vincent Chow: x.com/vince_chow1/...
TBC I'm not sure about all the claims in this post, like the claim that it's about DAUmaxxing.