Oh nice! Thanks for mentioning it! I didn’t know.
LIT licensed 🤣 … some typos are too good to fix. And Bluesky still doesn’t sport an edit button 🥲
It’s LIT licensed though, so even if the original authors left it, it’s still there. Anyone can pick it up and run with it if they’d like 🤷♂️ … that’s the beauty of open source 🥲
I’d be curious to hear your/others’ take on what a good replacement is? Gitlab? Or simply running your own git server and using it as it was originally intended? (Y’know… decentralized?)
The only feat harder than finding an available domain name is finding an available GitHub name.
@felixgv.ninja (ex-LinkedIn) and @olimpiupop.bsky.social explore Venice DB, planetary-scale data systems, CAP trade-offs, and chaos testing at peak load, revealing how distributed architectures remain resilient today!
Tune in and listen to former #VeniceDB tech lead and all around swell guy Félix GV as he talks with Olimpiu P. on building #VeniceDB for planet scale workloads!! youtu.be/eg6EFeGSx6M?...
Is the goal to make Bluesky a bit more like 4chan 🙃🥲 ?
And there we go, 366 push ups in a day, on January 2nd. What will the rest of the year have in store? We’ll see, but I ain’t waiting around all year to do it, when a single day suffices.
And neither should you! Which next year goal can you complete tomorrow?
It’s not that hard. I just need to grease the groove and take the time to do it.
So today I did 24 sets of 10 reps before lunch, and another 12 sets of 10 before dinner, with 10-minute breaks after each. And I topped it off with 3 sets of 2 reps each with 27 kg (60 lbs) on my back.
Don’t Sell Yourself Short With New Year Resolutions
Last night, I was thinking of a new year’s resolution where I would do one more push up each day, until I reached 365 in a day at the end of the year.
Then I thought: why wait? I can already do 365 push ups in a day.
As for breaking APIs, I certainly wasn’t suggesting this cardinal sin. Your borked APIs are the past crimes you need to carry on your conscience forever. They cannot be undone. You can only try to do better going forward.
Happy holidays 😂
Thread-local session?
Some ideas (you probably thought of) to make usage easier:
1. For cases where the watermark needs to be transferred, collapse the two params into a single blobby payload.
2. For cases where it doesn’t need to be transferred, let a client init param make all further calls carry it by default?
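A minimal sketch of idea 1, assuming nothing about the actual API: the value and its watermark get packed into one opaque token, so callers only ever pass a single parameter around (all names here are hypothetical):

```python
import base64
import json


def pack(value, watermark):
    """Collapse a value and its watermark into one opaque blob,
    so it can be transferred as a single parameter."""
    blob = json.dumps({"v": value, "w": watermark}).encode()
    return base64.urlsafe_b64encode(blob).decode()


def unpack(payload):
    """Split the opaque blob back into (value, watermark)."""
    blob = json.loads(base64.urlsafe_b64decode(payload))
    return blob["v"], blob["w"]


# Callers shuttle a single token around instead of two params.
token = pack({"user": 42}, watermark=17)
value, watermark = unpack(token)
```

The round trip preserves both pieces, and intermediaries never need to know the payload has internal structure.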
Nice post. Clear content and honest take. Appreciate the retrospective.
OMFG Christmas came early this year 🤩
Thank you @dancarlin.bsky.social 🎅🏻 !!
podcasts.apple.com/us/podcast/d...
These indicators should all be gathered automatically, but could be overridden if the author wishes to clarify, beyond the automated heuristics, what it is they actually saw or not in their prior context.
A permanent record in the sense of links in the graph: this node is linked to that ancestor node which is the most recent one the author had seen, and to these other nodes which were received (but probably NOT seen) while the author was typing.
We could go further and indicate not only this, but also a "typing indicator". Not the fleeting "so and so are typing" which appears briefly then disappears forever, but a permanent record.
What I’m thinking is it should be a DAG. Each node (vertex) has links (edges) to denote the ancestry relation to each previous node the author "knew" about. Furthermore, these links can be annotated with the degree of knowledge the author had (received, seen, read).
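The DAG idea above could be sketched roughly like this (a toy illustration, not any real protocol; all names are made up):

```python
from dataclasses import dataclass, field
from enum import Enum


class Knowledge(Enum):
    """How well the author knew an ancestor message when writing."""
    RECEIVED = 1  # arrived on device, likely not noticed
    SEEN = 2      # visible on screen at some point
    READ = 3      # actually read by the author


@dataclass
class Message:
    author: str
    text: str
    # Edges to ancestor messages (by index in the log), each
    # annotated with the author's degree of knowledge of it.
    ancestors: dict[int, Knowledge] = field(default_factory=dict)


# Hypothetical thread: carol replies to alice's message (read),
# while bob's message only arrived while carol was typing.
log = [
    Message("alice", "Shall we ship Friday?"),
    Message("bob", "Unrelated: anyone up for lunch?"),
    Message("carol", "Friday works for me.",
            ancestors={0: Knowledge.READ, 1: Knowledge.RECEIVED}),
]

# A UI could use the annotations to render carol's message as a
# reply to alice's, and flag bob's as probably unseen by her.
replied_to = [i for i, k in log[2].ancestors.items()
              if k is Knowledge.READ]
```

The renderer gets enough information to show who knew what when each message was written, instead of forcing everything into one linear order.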
Sounds great. I think a lot of misunderstandings are just that: people assuming a given message is a reply to the one that precedes it, while in fact it is a reply to the one 3-4 messages behind it in the feed.
Explicit replies and threading are opt-in, when in fact they should probably be default.
Reminds me of the case of real-time telephony, in which we do not care for infra-level support for re-sending lost packets, since a late packet is ultimately just a noisy distraction.
The re-send happens already at the higher semantic level of human understanding. "Come again? I didn’t catch that."
"Semantic ordering" is a nice way to put it. It is reminiscent of "causal ordering" from the data infra literature, but at a higher level… i.e., at the level of human understanding rather than packets sent and acked.
Interesting post, though I’m slightly disappointed in this paragraph which (to me at least) is the crux of the issue.
On one hand, the approach of the "linear feed" in which a central authority decides the order and imposes it on all participants is the classical one, so it makes sense to copy it.
But it’s fine, no one project should try to solve everything at once. Roomy is a great stepping stone if it can improve ownership democratization. Solving the UX issues can be a different project 😊
The point is: a UI which can illustrate the fundamental non-linearity of messaging can make it easier to understand who knew what when they wrote each message, not only in a multi-planetary scenario but ALSO in a scenario where people are typing concurrently.
But besides sci-fi scenarios, it could simply be a matter of attention. While a user is typing, they’re probably not reading the latest incoming messages. If the UI provides a full-screen text editor then for sure they’re not seeing them. So it matters not that they’ve received the messages.
High latency could be temporary due to connectivity issues (the example in the blog), or it could be permanent (one-way Earth-Mars latency is 4-20 minutes, with ~2 weeks of solar occlusion a year).
On the other hand, I personally think there is a big opportunity to improve the UX as well, by offering a better alternative to the classical "linear feed".
But why? What’s wrong with the linear feed? One issue, but not the only one, is high latency scenarios, as the blog post points out…
By offering similar UX and semantics as FB Messenger and all the other mainstream chat apps, Roomy can reduce friction for new users.
Since Roomy intends to focus (IIUC) on changing the data ownership and not the UX, this makes perfect sense.