Learning deep learning

Thoughts on lesson 1 of the fast.ai course (https://course.fast.ai)

Param Singh
Feb 26

In the interest of not becoming entirely redundant within the next year, I’ve decided to pick up fast.ai’s deep learning course. The course is free on YouTube, and so is the book it’s based on.

I’ve only watched Lecture 1 so far, so I thought I’d write down a few small takeaways for myself (and for readers who are interested). I still highly recommend going through the course yourself, because it’s pretty cool.

So here we go:

  1. Creating applications with deep learning is easier than I thought. In essence, most state-of-the-art (or near state-of-the-art) models are freely available and easy to fine-tune for your own application.

  2. Hardware is available with an ease that wasn’t really possible even just a few years ago. Most of these models need some sort of GPU to run in a reasonable amount of time in practical applications. In the past, especially when I was in university, it was hard and expensive to get access to that kind of hardware. These days, you can just spin up a Kaggle notebook or a Google Colab session to experiment in, and for production you can use tools like Replicate.

  3. Most of the actual work is data cleaning and that sort of labor. From what I’ve experienced so far, 90% of the time I’m getting the data into a format that’s actually useful for the model, and then putting the answers into a format that’s actually useful for me. The fine-tuning and the predictions are the easy part.

  4. It feels amazing and a little weird to train (or fine-tune) a model. I solved the digit recognizer problem by fine-tuning resnet34 on the training dataset (there’s a rough sketch of what that looks like right after this list). At a high level, it’s similar to teaching a kid digits: you show them a few examples and tell them to remember them. That’s essentially what you’re doing when you fine-tune a model too. I wonder how the creators of things like ChatGPT feel, seeing a nearly general intelligence be born out of their code.
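
Here’s a minimal sketch of what that fine-tuning looks like with the fastai library from the course. A couple of assumptions: it uses fastai’s bundled MNIST sample (only 3s and 7s) rather than the actual Kaggle digit-recognizer data, and the epoch count is arbitrary, so treat it as an illustration of the workflow rather than my exact solution.

    from fastai.vision.all import *

    # Grab fastai's bundled MNIST sample: images of 3s and 7s,
    # already split into train/ and valid/ folders by label.
    path = untar_data(URLs.MNIST_SAMPLE)

    # Point 3 in practice: most of the work is getting the data into a shape
    # the library understands. Here the folder structure does it for us.
    dls = ImageDataLoaders.from_folder(path, train='train', valid='valid')

    # Fine-tune a pretrained resnet34 for a few epochs (point 4).
    learn = vision_learner(dls, resnet34, metrics=accuracy)
    learn.fine_tune(3)

    # Turn the answer into a format that's useful to me: a plain label string.
    test_img = get_image_files(path/'valid')[0]  # any validation image
    label, _, probs = learn.predict(PILImage.create(test_img))
    print(label, probs.max().item())

On a free Kaggle or Colab GPU (point 2), something like this should train in a minute or two. The striking part is how few of those lines are actually about the model.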

Other news

  • I’m working on a cool project to build a Q&A bot over all of YCombinator’s YouTube channel. If you’re interested and want to chat, hit me up on Twitter:

    @iliekcomputers (Feb 20, 2023): “guess who's transcribing literally the entirety of @ycombinator's YouTube channel”
  • Interesting post about Stripe this week:

    Net Interest, “Striped Down” by Marc Rubinstein: “Since switching on the paid tier of Net Interest eighteen months ago, I’ve processed a lot of payments. On average, I do around 12 a day and have administered around 7,000 in total. They originate from all over the world: 40 states in America, practically every country in Western Europe, and countries as diverse as Colombia, Kenya and Vietnam. Most subscribers pay by credit card, but some use debit and prepaid cards…”

Thanks for reading newsletter.param.codes! Subscribe for free to receive new posts and support my work.
