S3 API underneath, so rclone, AWS CLI, and any SDK work too. ~10MB binary. Files stay exactly where they are.
If you run a NAS or home server and want to try it before launch, reply or DM me your setup.
github.com/deepjoy/shoebox
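For anyone wondering what "any S3 client works" looks like in practice, here's a sketch of an rclone remote pointed at a local instance. The endpoint port and the credentials are placeholders, not Shoebox defaults:

```
[shoebox]
type = s3
provider = Other
# placeholder endpoint and credentials
endpoint = http://localhost:9000
access_key_id = YOUR_KEY
secret_access_key = YOUR_SECRET
```

With that in rclone.conf, `rclone ls shoebox:backup` lists a bucket like any other remote.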
The twist: I built this to find duplicate photos. That meant building an S3 server. Then I realized a terminal can't show you which photos are duplicated, so I built a companion webapp too.
One curl command enables CORS. Point your browser at your Shoebox instance.
deepjoy.github.io/shoebox-webapp
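The CORS document itself is just the standard S3 shape. A minimal sketch, assuming the webapp origin above and a permissive method list:

```
{
  "CORSRules": [
    {
      "AllowedOrigins": ["https://deepjoy.github.io"],
      "AllowedMethods": ["GET", "HEAD", "PUT", "POST", "DELETE"],
      "AllowedHeaders": ["*"]
    }
  ]
}
```

Saved as cors.json, `aws s3api put-bucket-cors --bucket backup --cors-configuration file://cors.json --endpoint-url http://localhost:9000` would apply it, assuming the standard S3 CORS API and a placeholder port.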
The Shoebox application's Buckets page. A card displays a bucket named "backup" with a database icon. It shows 106,715 files and 71.4 GB of storage. Highlighted in amber are three warnings: 18,847 duplicate files, 581 duplicate folders, and 8.2 GB reclaimable.
Ran Shoebox against a directory. 106K files, 71 GB. It found 18,847 duplicates hiding in 581 folders. 8.2 GB reclaimable.
Here's what that looks like in a browser. Open source, MIT licensed.
#buildinpublic #rustlang #selfhosted #selfhosting
The real comparison isn't "laptops at this price point." It's "laptops that will still work in 6 years at this price point." And that list is very short.
This is the right distinction. My whole setup is the second kind. All local. Nothing phones home. If my modem dies, the lights still automate. If my router dies, that's a different story, but I keep a spare for that too. The "needs internet" part is a business-model choice, not a technical requirement.
They don't need to be connected to the internet. That's the whole point. My light switches and thermostat run on Zigbee through Home Assistant, completely local. If my internet goes down, everything still works. The problem was never "smart." The problem is cloud.
Enterprise hardware doesn't retire. It just gets interesting.
The real move is when you find the flag you need buried in a test file that was never mentioned in the docs. Peak open source experience. AI turning "read the source, Luke" from a mass punishment into something actually doable is a solid win.
Yeah, the model didn't fail, it got bought. I'd love to see someone prove a sustainable middle ground exists. Pay for the thing, get the thing, dev still eats. But every attempt flames out or gets swallowed by a bigger fish. The problem isn't the model, it's the incentives around it.
Did this for years. Can confirm. Email is the one self-hosting hill I eventually climbed back down from. Still self-host literally everything else. But email? Nah. Life's too short.
Same arc here. Ran Postfix and Dovecot for years, had SPF/DKIM/DMARC all set up correctly, and still played whack-a-mole with blacklists. The technical part was fine. The political part, convincing Google you're not spam just because you're small, that's the part nobody warns you about.
Aseprite is the exception that proves the rule. One dev, passion project, $20 forever. But updates still cost labor. The real split is FOSS (labor of love) vs subscription (labor for profit). Everyone wants a third option. Nobody's found one that sustains even 2-3 long term.
That missing middle between corporate subscription bloat and FOSS-with-no-maintainer is so real. But people are building in that gap more than you'd think. I came across github.com/terraincognita07/ovumcy recently. It's exactly the "does one thing, doesn't spy on you" software you're describing.
The "taking my data back" framing resonates. I've been running a reverse tunnel into Docker Traefik for years. Same itch, just wired it up myself. Pangolin looks like a nice way to get there without the custom plumbing, especially at 90 services.
That's the best way to build. Solve it for yourself first. The local history angle really sets it apart from the usual community platforms. Looking forward to seeing where it goes!
I moved my cameras to wired PoE Reolink. They're completely offline and have solid open source NVR support (Frigate).
Reolink also makes a video doorbell that might fit the bill: reolink.com/us/product/r... . I picked one up but haven't installed it yet so can't vouch for it firsthand. Worth a look though.
The "tedious" part is a feature. I run a home server with btrfs RAID 10, NextCloud, the works. I could use a managed service for any of it, but owning the pipeline end-to-end means when something breaks, you actually understand why. Respect the no-CMS choice.
Privacy is what kept me away from Synology too. Been self-hosting on Btrfs RAID 10 for about a decade. The tradeoff is you're on your own for the software layer. Does UGOS Pro support encryption now?
Congrats on the release! Love the idea of map-linked local posts. What inspired the hyperlocal focus — was there a specific gap you were trying to fill vs. existing community platforms like Nextdoor or local Facebook groups?
Started this to find duplicate photos. But "any S3 client works against your local files" turns out to solve problems I wasn't thinking about.
Open source, MIT licensed. Curious what people would connect to a local S3 endpoint. What's your first tool?
This is why open standards matter. I built a storage server for local files. I didn't write a single line of integration code. The S3 spec did the work.
Same reason the AT Protocol matters here: interoperability you don't have to negotiate for.
A nighttime photo of a large tech company campus. In the foreground, there's an oversized green Android mascot statue standing on decorative blocks, arms outstretched in a welcoming pose. Behind it, a modern glass-walled office building is lit from within, revealing open workspaces across multiple floors. The company's logo is displayed prominently on the building's facade. Colorful bicycles are parked along a walkway to the right, and landscaped plantings border the curved paved paths. The scene has a quiet, after-hours feel with the building's interior lights contrasting against the dark sky.
Lesson from three weeks of building: the most valuable thing about S3 isn't S3. It's that every tool already speaks it.
Build to the standard and rclone, Cyberduck, boto3, and the AWS CLI all work on day one. No plugins. No integrations. Just the protocol.
#buildinpublic #rustlang #selfhosted #selfhosting
Honestly? Trusting the scanner with real data. I had a data corruption incident even with RAID 10.
That taught me the scanner had to be read-only and non-destructive above all else. No moving files, no reorganizing. Just indexing what's there before writing anything.
Also shipped since Thursday: multipart uploads (any file size), CopyObject, tagging, conditional requests. rclone and AWS CLI just connect. No plugins, no config beyond credentials.
MIT licensed, building in the open. What would you point a local S3 server at?
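One fiddly corner of multipart worth knowing as a client: S3's convention is that a multipart object's ETag is the MD5 of the concatenated per-part MD5 digests, suffixed with the part count, rather than a plain content MD5. A runnable sketch of that convention (whether Shoebox mirrors it byte-for-byte is an assumption):

```python
import hashlib

def multipart_etag(data: bytes, part_size: int) -> str:
    """S3-style multipart ETag: MD5 over the concatenated per-part
    MD5 digests, with '-<part count>' appended."""
    digests = [
        hashlib.md5(data[i:i + part_size]).digest()
        for i in range(0, len(data), part_size)
    ]
    combined = hashlib.md5(b"".join(digests)).hexdigest()
    return f"{combined}-{len(digests)}"

payload = b"x" * (12 * 1024 * 1024)  # 12 MiB upload
# split into 5 MiB parts -> three parts, so the ETag ends in "-3"
print(multipart_etag(payload, 5 * 1024 * 1024))
```

This is why a multipart ETag never matches a local `md5sum` of the file, a classic gotcha when verifying uploads.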
Update on the local S3 server: filesystem scanner landed. Point it at a directory, it walks every file, indexes it, and watches for changes. Drop a file in the folder, it shows up in S3 automatically. No upload step.
Good progress in two weeks.
#buildinpublic #rustlang #selfhosted #selfhosting
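Not Shoebox's actual code, but a minimal sketch of the idea: take a size/mtime snapshot of the tree on each pass and diff snapshots to spot added, removed, or modified files (a real watcher would use inotify/FSEvents instead of rescanning):

```python
import os

def snapshot(root: str) -> dict:
    """Walk a directory tree and record each file's size and mtime,
    keyed by path relative to the root."""
    index = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            index[os.path.relpath(path, root)] = (st.st_size, st.st_mtime)
    return index

def diff(old: dict, new: dict):
    """Compare two snapshots: files added, removed, or modified."""
    added = sorted(new.keys() - old.keys())
    removed = sorted(old.keys() - new.keys())
    changed = sorted(k for k in old.keys() & new.keys() if old[k] != new[k])
    return added, removed, changed
```

Drop a file into the folder between two snapshots and it shows up in `added`, which is exactly the "no upload step" behavior.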
Love the archival mindset! S3 has built-in object versioning: every version of a file is kept and retrievable. That's part of what Shoebox brings to local files. Interesting use case for sure.
And sometimes that person 3 time zones away becomes your co-creator, your first user, or just the one person who gets it when nobody else does. Building in public is less about the public part and more about finding your people.
The best tool is the one that actually ships features 🚢 Claude Code has been solid for me too. What are you building?
Guess the year and location.
#selfhosted #selfhosting #buildinpublic
The real value: making local files accessible via S3 API. Works with rclone, AWS CLI, any SDK. Dedup is just one use case that falls out of content hashing. Open source, MIT licensed, building in the open.
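The hashing part is simple enough to sketch: group files by a digest of their contents, and any group with more than one path is a set of duplicates. A minimal stdlib version, not Shoebox's code (a real scanner would compare sizes first and hash in chunks rather than reading whole files):

```python
import hashlib
import os
from collections import defaultdict

def find_duplicates(root: str) -> list[list[str]]:
    """Group every file under root by SHA-256 of its contents and
    return the groups containing more than one path."""
    by_digest = defaultdict(list)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            by_digest[digest].append(path)
    return [paths for paths in by_digest.values() if len(paths) > 1]
```

Note it only reads and reports, never moves or deletes, the same read-only stance described above.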