Does anyone know how I can get access to Anthropic's 100K LLM?
Chatting with the framework's technical documentation is developer experience on a different level
Today's wordle
When doing Software 3.0 we can't assume Software 2.0 principles will apply
Turns out AI will replace developers and lawyers long before it will replace drivers
Headache. If I used it before going to bed, I had trouble sleeping. Overall, it was nice as a one-time experience, but just too intense.
I sold my Oculus a month after I got it
Alternative moats: hard work, serving your customers, great UX/UI, creative marketing, out-of-the-box thinking, and a huge market.
"What's your tech moat?" investors often ask. Google's leaked memo proves that "technological defensibility" is nothing but a myth. Investors should outgrow this question.
How did I miss that ai.com links to ChatGPT?
This is just insane how much of my code is now generated by AI
AI agents will become an unfair advantage for many small startups.
In the near future, we'll see founders overseeing AI agents and achieving what a 500-person organization can't.
LLMs make side projects fun 🤩
Lovely. Austin is so beautiful ✨
Let's see what happens w/ Bluesky
I'm looking to improve my note-taking system. What are the best note-taking workflows out there?
There's an AI for that > There's an app for that 🦾
ChatGPT folks, what are some good prompting guidelines for software development?
OpenAI's "building in public" approach is fascinating. I won't be surprised if this strategy is one of their top revenue drivers
How many of the Bluesky team are ex-Twitter?
Forget AI, we now have Time Machines!
So far it's nice here
I wonder if, now that AI is taking over knowledge work, traditional businesses offering physical goods and services will become more valuable.
Illustration: AI is eating the world
AutoGPT now has 109K stars on GitHub (a one-month-old project). For comparison, React has 209K stars (a decade-old project that basically runs the internet). Wild.
Yo
Researchers have reported success in extending an LLM's context length to 2M tokens. For comparison, the GPT-4 model takes inputs of up to 32K tokens.
This might address one of the biggest limitations of today's LLMs.
https://arxiv.org/abs/2304.11062
Hello World!