Weekly Startup Digest #8
Apr 21, 2024
"If you want to increase your success rate, double your failure rate." - Thomas J. Watson
Meta just released its new open-weight 8B and 70B models a few days ago, making waves in the AI industry. Notably, both models were trained on over 15 trillion tokens of data, allowing them to perform at the level of Gemini Pro 1.5 and Claude 3 Sonnet on benchmarks. However, the 8K-token context window looks quite small compared to the current landscape. This release is a significant landmark for being open-weight and will push the industry to further commoditize LLMs.
They share specific metrics and give a sense of how they evaluate companies. I think the 4 P's make a valuable framework for getting to PMF: Persona, Problem, Promise (Pitch), and Product. They also offer a special 8-session program for B2B founders.
A great write-up on marketing products to developers.
Mistral has released its latest open-weight Mixture of Experts (MoE) model, Mixtral, which offers great cost efficiency for its size.
After decades of research, Boston Dynamics unveils its electric version of Atlas, retiring the famous hydraulic Atlas humanoid.
Version 2.0 of the API-driven rich text editor has just been released.
The new Boston Dynamics robot gets up in the most normal way possible.
After almost a decade, the hydraulic Atlas retires to make way for the new electric Atlas.
Supabase just released support for the S3 protocol this week, making integrations much easier.
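Since Supabase Storage now speaks the S3 protocol, any standard S3 client should be able to talk to it directly. A minimal sketch, assuming the S3-compatible endpoint lives at `https://<project-ref>.supabase.co/storage/v1/s3` and that you have generated S3 access keys in the Supabase dashboard (the project ref, region, and keys below are all placeholders):

```shell
# Point the AWS CLI at Supabase Storage's S3-compatible endpoint.
# <project-ref>, the region, and both keys are placeholders -- use
# the values from your own Supabase project settings.
aws configure set aws_access_key_id <your-access-key-id>
aws configure set aws_secret_access_key <your-secret-access-key>
aws configure set region <your-project-region>

# List buckets over the S3 protocol instead of the Supabase API.
aws s3 ls --endpoint-url https://<project-ref>.supabase.co/storage/v1/s3
```

Because it is plain S3, the same endpoint should also work with boto3, the AWS SDKs, or tools like rclone.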
The BitTorrent protocol, implemented at the filesystem level.
Join 300+ subscribers to get the latest weekly insights and articles about Startups, Growth Strategy, Artificial Intelligence, Coding, and more. All links are curated by hand.