Heyy there! Welcome to the second edition of The Neuron. I hope you’re doing well. I have changed the format a bit, just doing some A/B testing. Let’s see how this one goes.
Latest AI trends and developments
Sonnet 3.5 crushes GPT-4o and other LLMs
Anthropic, the creators of Claude.ai, have released their new model, Sonnet 3.5. It stands ahead of all other LLMs in most categories, but what I’m more excited about is that it’s free and available to all users.
I also love the new Artifacts feature: Claude opens a side pane to write code and, where possible, renders the output. Here’s an app I made as a joke using Sonnet 3.5 on Claude.ai:
I’ll skip explaining the joke, but Claude.ai made this in 4 prompts, and the whole conversation took less than 5 minutes. Here’s my tweet about it if you’re interested: https://x.com/inclinedadarsh/status/1805065809674281062
There’s one turn-off though: Claude cannot access the internet, whereas GPT-4o can. Even so, since the release of Sonnet 3.5, I’ve started ditching ChatGPT for my own good.
You can learn more about the release here: https://www.anthropic.com/news/claude-3-5-sonnet
SSI Inc.’s first message for the world
Ilya Sutskever, co-founder and former chief scientist of OpenAI, has started a new project. SSI (Safe Superintelligence Inc.) is an AI startup that prioritizes safety over ‘commercial pressure’.
Though SSI hasn’t released much information yet, Ilya shared a message for the world describing their goal, mission, vision, and why one should join them. In a recent interview, Ilya also said that their first product will be safe superintelligence and that the company will not focus on anything else until then.
Read the full message here: https://x.com/ssi/status/1803472825476587910
OpenAI acquires Rockset
Rockset is a real-time indexing database that lets developers search and index vectors, JSON, and more within milliseconds (yes, we’re talking about a database that’s fast af). With the recent developments in AI, Rockset had been shifting its focus towards vector databases, which is one of the major reasons behind OpenAI’s acquisition.
A lot of people are speculating that OpenAI is building an LLM OS (an operating system with an LLM at its core, using the model to perform various tasks and functions). If that’s true, I’d be interested in learning how an LLM OS works.
For now, you can read more about the acquisition here: https://openai.com/index/openai-acquires-rockset
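If “searching and indexing vectors” sounds abstract, here’s a toy sketch of the core idea behind vector search: find the stored embeddings closest to a query. This is plain NumPy for illustration, not Rockset’s actual API, and the shapes and names are my own placeholders:

```python
import numpy as np

# Pretend these are embeddings of documents, one row per document.
docs = np.random.rand(10_000, 384).astype(np.float32)
# Normalize rows so a plain dot product equals cosine similarity.
docs /= np.linalg.norm(docs, axis=1, keepdims=True)

def search(query_vec, k=5):
    """Return indices of the k documents most similar to the query."""
    q = query_vec / np.linalg.norm(query_vec)
    scores = docs @ q                      # cosine similarity with every doc
    return np.argsort(scores)[::-1][:k]    # indices of the top-k matches

top = search(np.random.rand(384).astype(np.float32))
print(top)
```

A production system like Rockset serves this over far more vectors, typically with approximate indexes instead of the brute-force scan above, which is how it stays in the millisecond range; but the brute-force version is the mental model.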
Insights and updates
Logistic regression with the ‘Deep learning’ mindset
While developing Inclinet, I felt a bit lost at many points, so I decided to take a step back and return to the basics. Since then, I’ve been learning the fundamentals of neural networks and implementing them in Python.
Very recently, I made a Jupyter notebook (inspired by this deep learning course’s assignment) that builds a very simple model, logistic regression trained like a tiny neural network, to classify whether an image is a cat or not. This helped me understand the underlying concepts much better.
If you found Inclinet complex to understand, or if you’re just interested in how deep learning works behind the scenes, I highly encourage you to check out my cat v/s non-cat notebook: https://github.com/inclinedadarsh/logistic-regression-deep-learning
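If you’re wondering what “logistic regression with the deep learning mindset” looks like, here’s a minimal NumPy sketch of the idea. This isn’t the notebook’s code (the variable names and shapes are mine), but it shows the same pattern: a forward pass through a single sigmoid “neuron”, a loss, and a gradient-descent update.

```python
import numpy as np

def sigmoid(z):
    # Squashes any real number into (0, 1), read as P(image is a cat).
    return 1 / (1 + np.exp(-z))

def train(X, Y, lr=0.005, iters=2000):
    """X: (n_features, m) flattened images, Y: (1, m) labels (1 = cat)."""
    n, m = X.shape
    w = np.zeros((n, 1))  # one weight per pixel/feature
    b = 0.0               # bias

    for _ in range(iters):
        # Forward pass: a single neuron with a sigmoid activation.
        A = sigmoid(w.T @ X + b)                    # (1, m) predictions
        # Binary cross-entropy loss (tracked for monitoring).
        cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
        # Backward pass: gradients of the loss w.r.t. w and b.
        dZ = A - Y
        dw = (X @ dZ.T) / m
        db = np.sum(dZ) / m
        # Gradient-descent update.
        w -= lr * dw
        b -= lr * db

    return w, b

def predict(w, b, X):
    return (sigmoid(w.T @ X + b) > 0.5).astype(int)
```

The point of the exercise is that this single sigmoid unit is already a neural network in miniature; stacking more units and layers is where deep learning starts.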
Nights & Weekends
After a lot of thought, I realized this is what I want to do: the newsletter itself. So I finally released my idea slide for n&w s5; here it is:
Yess! I’ll be documenting my journey through this very newsletter and my Twitter. Besides this weekly newsletter, I’ll be publishing tons of updates on my Twitter account, hopefully including some video content.
Awesome resources
This week I had a lot of stuff to share, but I’ll limit myself to just a few things so I don’t bore you all.
I shouldn’t work on LLMs?!
Yes, you read that right. And if you’re a student, neither should you. In a recent talk, Yann LeCun explained why you shouldn’t work on LLMs as a student or academic researcher.
I know plenty of experts are saying the opposite, but the argument Yann LeCun presented made a lot of sense to me. You can read more about it in this Twitter post: https://x.com/prajdabre1/status/1804696922562949288
What are your thoughts on this? I would love to hear that!
AI wrapper ideas for side projects
Okay, straight to the point: here’s a list of AI wrapper ideas that could become your next side project and maybe even generate some side revenue.
Find it here — https://github.com/bleedline/aimoneyhunter/blob/main/README_en.md
Nope, I didn’t make this list, nor do I claim to. All credit goes to its original creator.
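In case the term is new to you: an “AI wrapper” is usually just a thin product layer around an existing model API, a prompt template plus some glue code and a nice UI. Here’s a rough sketch using the OpenAI Python SDK; the model name, prompt, and function are my own placeholders, not something from the list above.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarize_for_tweet(article_text: str) -> str:
    """A toy 'wrapper': a fixed prompt wrapped around the user's input."""
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder; any chat model works
        messages=[
            {"role": "system", "content": "You summarize articles as a single tweet."},
            {"role": "user", "content": article_text},
        ],
    )
    return response.choices[0].message.content

print(summarize_for_tweet("Paste any long article here..."))
```

Most of the value in these projects comes from picking a niche and building the UX around that one function, not from the model call itself.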
Smolorg: A treasure for ML enthusiasts
So if you’re an ML enthusiast like me and like to implement stuff from scratch, here’s smolorg!
The official description reads: “smol implementations of cool things.” It’s a GitHub org by Maharshi containing interesting implementations from the machine learning/deep learning world. Currently it contains:
smolgrad (an autograd engine)
smolar (NumPy-like arrays in C)
smolvecstore (a vector store in NumPy)
Maharshi has recently been a source of inspiration for me. I highly encourage you to check out the organization to learn more about how these things work behind the scenes.
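To give a flavour of what an autograd engine like smolgrad does, here’s a heavily simplified scalar version (my own sketch, not smolgrad’s code): each value remembers how it was produced, so calling backward() walks that graph in reverse and fills in gradients.

```python
class Value:
    """A scalar that tracks the operations used to produce it."""
    def __init__(self, data, parents=(), grad_fn=None):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._grad_fn = grad_fn  # distributes this node's grad to its parents

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def grad_fn():
            self.grad += out.grad      # d(a+b)/da = 1
            other.grad += out.grad     # d(a+b)/db = 1
        out._grad_fn = grad_fn
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def grad_fn():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._grad_fn = grad_fn
        return out

    def backward(self):
        # Topological order so every node's grad is complete before it is used.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            if v._grad_fn:
                v._grad_fn()

# y = x * x + x  ->  dy/dx = 2x + 1 = 7 at x = 3
x = Value(3.0)
y = x * x + x
y.backward()
print(x.grad)  # 7.0
```

Real engines do the same thing over tensors instead of scalars, but the bookkeeping idea is identical, which is why these smol implementations are such a nice way to learn.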
Makers’ schedule v/s Managers’ schedule
Here’s a very interesting blog explaining the core difference between a manager’s schedule and a maker’s schedule. As a young maker, I could relate to this.
I highly recommend giving this a read, whatever your profession. It will ultimately help you build a better work environment, one where productivity is prioritized over everything else.
Read the blog here: https://www.paulgraham.com/makersschedule.html
That’s it for this week. I’ve already got my first 20 subscribers, and I can’t tell y’all how happy that makes me! Thanks to everyone who subscribed to The Neuron.
To end with, here’s something I deeply resonate with:
Hey, did you like this post? Let me know by messaging me on Discord or tweeting about it (make sure to tag me, @inclinedadarsh).
You can connect with me on Twitter, Discord, WhatsApp or you can find all other links here.
Thank you for your time.