STOP Taking Random AI Courses - Read These Books Instead

    Oct 3, 2025

    11,826 characters

    8 min read

    SUMMARY

    Egor Howell, an AI practitioner with over four years of experience, recommends targeted books and courses in programming, math, machine learning, deep learning, and AI engineering to build AI skills effectively, emphasizing practice over random courses.

    STATEMENTS

    • To work in AI, strong software engineering and programming skills are essential, as supported by OpenAI CTO Greg Brockman.
    • Python is the most commonly used language for AI infrastructure and machine learning, and that is unlikely to change soon.
    • AI engineering roles lean more toward software engineering than machine learning, potentially requiring backend languages like Java, Go, or Rust.
    • Practice is the best way to learn programming or any AI skill, using resources only for fundamentals before implementing.
    • Understanding the underlying math is crucial for top AI practitioners, even if not building models from scratch.
    • Key math areas for AI include statistics, linear algebra, and calculus, best learned through targeted resources.
    • Modern AI, often called generative AI, builds on machine learning foundations dating back to the 1950s and earlier concepts like the Turing Test.
    • Hands-on Machine Learning with Scikit-Learn, Keras, and TensorFlow is the single best book for comprehensive AI and machine learning knowledge.
    • Deep learning, which powers generative AI such as LLMs and transformers, requires learning at least one library, with PyTorch favored over TensorFlow.
    • AI engineering focuses on deploying existing models to production, emphasizing infrastructure over building from scratch.
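
    The point above that calculus and linear algebra sit underneath even simple models can be made concrete. Below is a minimal sketch of gradient descent fitting a one-variable linear regression, using only the Python standard library; the function name, learning rate, and toy data are illustrative choices, not from the article.

```python
# Minimal gradient descent for 1-D linear regression: fit y = w*x + b
# by minimizing mean squared error. Illustrative sketch only; the
# hyperparameters are arbitrary choices.

def fit_linear(xs, ys, lr=0.01, epochs=2000):
    """Fit y = w*x + b with full-batch gradient descent on MSE."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Partial derivatives of MSE = (1/n) * sum((w*x + b - y)^2).
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy data generated from y = 2x + 1, no noise.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]

w, b = fit_linear(xs, ys)
print(round(w, 2), round(b, 2))  # should approach 2.0 and 1.0
```

    The same two-line update rule is what libraries like PyTorch automate at scale, which is why the calculus behind it is worth knowing.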

    IDEAS

    • Python's dominance in AI stems from its role in machine learning libraries, making it the safest starting point despite emerging backend languages.
    • Many believe math is unnecessary for AI due to pre-built models, but deep understanding enhances proficiency in using and innovating with them.
    • AI's history predates generative tools, tracing to 1950s neural networks and Alan Turing's wartime ideas on thinking machines.
    • A single book can cover nearly all machine learning fundamentals, from basics to advanced topics like reinforcement learning and LLMs.
    • Andrew Ng's courses remain top-tier despite age, now updated for Python, blending theory with practical notebooks.
    • Short, reference-style books like The Hundred-Page Machine Learning Book offer quick conceptual overviews without dense details.
    • PyTorch's rise to 92% of Hugging Face models and 77% of 2021 research papers signals its future as the primary deep learning library.
    • Building a GPT from scratch using only NumPy in courses like Karpathy's reveals the raw mechanics of state-of-the-art LLMs.
    • AI engineering prioritizes productionizing existing foundational models like Llama or Claude, as individual training is resource-prohibitive.
    • Learning AI through concrete projects and summarizing in your own words outperforms broad, bottom-up study approaches.

    INSIGHTS

    • True expertise in AI demands blending theoretical math with hands-on coding, turning abstract concepts into deployable solutions that drive real value.
    • Python's ecosystem locks in its AI relevance, but versatility in languages like Rust prepares for evolving engineering roles beyond pure ML.
    • Generative AI's hype overshadows broader machine learning foundations, yet mastering classics like statistical learning unlocks deeper innovation.
    • Practice via projects, not exhaustive reading, accelerates skill-building, as partial resource use yields faster, applicable knowledge.
    • Deployment trumps model creation in modern AI jobs, shifting focus from invention to scalable infrastructure for business impact.
    • Iterative, self-comparative learning fosters sustainable growth, avoiding intimidation from peers while building personalized expertise.

    QUOTES

    • "If you want to work in AI, you have to have good software engineering skills and also good programming skills."
    • "The only real way to learn something is through consistent practice and building and getting hands-on experience."
    • "One, iteratively take on concrete projects and accomplish them depthwise. Learning on demand. Don't learn bottom up breadth-wise."
    • "You're not necessarily building models from scratch because a lot of the best models like Llama, Claude, GPT are kind of already built."
    • "If you learn everything in that textbook, then your math skills will be more than sufficient for a lifelong career in AI and machine learning."

    HABITS

    • Start with fundamentals from one resource, then immediately apply by implementing projects to reinforce learning.
    • Use platforms like HackerRank and LeetCode for daily problem-solving to build hands-on Python and interview skills.
    • Summarize everything learned in your own words to deepen understanding and track personal progress.
    • Compare your growth only to your past self, avoiding self-doubt from benchmarking against others.
    • Select relevant sections from dense textbooks, focusing on AI-applicable topics rather than reading cover-to-cover.

    FACTS

    • PyTorch powered 77% of AI research papers in 2021 and 92% of Hugging Face models.
    • Andrew Ng's Machine Learning Specialization originated in 2012 using Octave but has been revamped for Python.
    • Zero to Mastery has helped over 1,000 students land jobs at companies like Meta, Google, and Nvidia.
    • AI concepts date back to Alan Turing's work during World War II, with the first neural network proposed in the 1950s.
    • The Elements of Statistical Learning emphasizes theory in statistical methods, which underpin much of modern machine learning.

    REFERENCES

    • Learn Python - Full Course for Beginners (freeCodeCamp video).
    • Python for Everybody Specialization (Coursera).
    • NeetCode (for data structures, algorithms, and system design).
    • CS50: Introduction to Computer Science (Harvard course).
    • Practical Statistics for Data Science (book by Peter Bruce et al.).
    • Mathematics for Machine Learning (book by Deisenroth et al.).
    • Mathematics for Machine Learning and Data Science Specialization (DeepLearning.AI).
    • Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow (book by Aurélien Géron).
    • Machine Learning Specialization (Coursera by Andrew Ng).
    • The Hundred-Page Machine Learning Book (by Andriy Burkov).
    • The Elements of Statistical Learning (book by Hastie et al.).
    • Deep Learning Specialization (DeepLearning.AI by Andrew Ng).
    • Hands-On Large Language Models (book by Jay Alammar and Maarten Grootendorst).
    • Practical MLOps (book by Noah Gift et al.).
    • AI Engineering (book by Chip Huyen).
    • PyTorch Tutorials (video series).
    • Introduction to LLMs (1-hour talk by Andrej Karpathy).
    • Neural Networks: Zero to Hero (course by Andrej Karpathy).
    • Zero to Mastery AI/ML Bootcamp.

    HOW TO APPLY

    • Begin with Python fundamentals using freeCodeCamp's 4-hour course, then solve problems on HackerRank to practice coding basics daily.
    • Study statistics and math via Practical Statistics for Data Science and Mathematics for Machine Learning, applying concepts with Python examples in notebooks.
    • Complete Andrew Ng's Machine Learning Specialization, implementing theory in Python assignments and building simple models like regression predictors.
    • Learn PyTorch through tutorials, then tackle Karpathy's Zero to Hero course to construct a neural network from scratch using NumPy.
    • Deploy a model using Practical MLOps techniques: containerize with Docker, integrate into a cloud system, and test for production scalability on a sample project like image classification.
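
    As a taste of the "neural network from scratch" step above, here is a minimal pure-Python sketch (standard library only, so it stays self-contained): a 2-2-1 network with sigmoid units trained on XOR by backpropagation. The architecture, seed, and learning rate are illustrative choices, not prescribed by any of the courses listed.

```python
import math
import random

# A tiny 2-2-1 network trained on XOR with full-batch gradient descent.
# Standard library only; all hyperparameters are illustrative choices.

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w_ih = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # input -> hidden
b_h = [0.0, 0.0]
w_ho = [random.uniform(-1, 1) for _ in range(2)]                      # hidden -> output
b_o = 0.0

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
lr = 0.5

def forward(x):
    h = [sigmoid(w_ih[j][0] * x[0] + w_ih[j][1] * x[1] + b_h[j]) for j in range(2)]
    o = sigmoid(w_ho[0] * h[0] + w_ho[1] * h[1] + b_o)
    return h, o

def loss():
    return sum((forward(x)[1] - y) ** 2 for x, y in data)

loss_before = loss()
for _ in range(5000):
    # Accumulate gradients over the whole batch, then take one step.
    g_w_ih = [[0.0, 0.0], [0.0, 0.0]]
    g_b_h, g_w_ho, g_b_o = [0.0, 0.0], [0.0, 0.0], 0.0
    for x, y in data:
        h, o = forward(x)
        d_o = 2 * (o - y) * o * (1 - o)  # error signal at the output unit
        g_b_o += d_o
        for j in range(2):
            g_w_ho[j] += d_o * h[j]
            d_h = d_o * w_ho[j] * h[j] * (1 - h[j])  # backpropagated to hidden
            g_b_h[j] += d_h
            for i in range(2):
                g_w_ih[j][i] += d_h * x[i]
    for j in range(2):
        w_ho[j] -= lr * g_w_ho[j]
        b_h[j] -= lr * g_b_h[j]
        for i in range(2):
            w_ih[j][i] -= lr * g_w_ih[j][i]
    b_o -= lr * g_b_o

loss_after = loss()
print(f"loss: {loss_before:.3f} -> {loss_after:.3f}")
```

    Writing the backward pass by hand like this is exactly the intuition Karpathy's course builds before handing the bookkeeping over to a framework.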

    ONE-SENTENCE TAKEAWAY

    Master AI by prioritizing targeted books, practice-driven courses, and deployment skills over scattered online tutorials.

    RECOMMENDATIONS

    • Focus on Python first for its ML ecosystem, then explore Rust for backend AI engineering versatility.
    • Invest in Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow as your core reference for all AI fundamentals.
    • Build projects iteratively, learning on demand rather than exhaustive upfront study, to gain depth quickly.
    • Understand math under the hood via specialized resources to excel beyond basic model usage.
    • Pursue AI engineering by mastering MLOps and deployment, as most jobs involve productionizing existing models.
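
    To make the deployment recommendation above concrete, here is a minimal sketch of "productionizing" a model: wrapping a predictor behind an HTTP endpoint using only the standard library. The model is a hard-coded stand-in and the endpoint shape and feature names are hypothetical; in practice you would load serialized weights and use a production server.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(features):
    # Hypothetical stand-in for a real trained model: a fixed linear rule.
    score = 0.8 * features.get("x1", 0.0) - 0.2 * features.get("x2", 0.0)
    return {"score": score, "label": int(score > 0.5)}

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON feature payload and return the model's prediction.
        length = int(self.headers.get("Content-Length", 0))
        features = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps(predict(features)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep request logging quiet

def serve(port=0):
    """Start the prediction server on a background thread; return (server, port)."""
    server = HTTPServer(("127.0.0.1", port), PredictHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server, server.server_address[1]
```

    In a real MLOps pipeline this process would be containerized (e.g. with Docker) and deployed behind cloud infrastructure, which is the kind of work Practical MLOps and AI Engineering cover in depth.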

    MEMO

    In a field exploding with hype and half-baked online courses, Egor Howell cuts through the noise. With four years deep in AI and machine learning trenches, the British data scientist and educator urges aspiring practitioners to ditch the scattershot approach. Instead, he prescribes a curated arsenal of books and courses that build unshakeable foundations. "If you want to work in AI," Howell says, "you have to have good software engineering skills and also good programming skills." This isn't casual advice; it's echoed by heavyweights like OpenAI's CTO Greg Brockman. At the heart of Howell's roadmap lies Python, the lingua franca of machine learning, powering everything from neural networks to vast infrastructures. Yet, as AI engineering roles proliferate—blending code with deployment savvy—languages like Rust may carve out niches in scalable systems.

    Math, often dismissed as optional in the era of plug-and-play models like ChatGPT, gets a firm defense from Howell. "To understand how these LLMs work under the hood," he argues, "you need the fundamental maths." He spotlights three essentials: statistics, linear algebra, and calculus, distilled through targeted texts like Practical Statistics for Data Science and Mathematics for Machine Learning. These aren't abstract tomes; they're laced with Python examples, ensuring relevance to AI's real-world grind. Howell's history lesson underscores AI's depth: far from a Silicon Valley novelty, it echoes Alan Turing's wartime musings on thinking machines and 1950s neural network blueprints. This broader lens reveals generative AI as just one vibrant thread in machine learning's tapestry.

    For machine learning mastery, Howell crowns Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow as the indispensable bible. "If you could literally get only one book for your whole AI machine learning career, it would be this one," he declares. Andrew Ng's timeless Coursera specialization follows, now Python-fied for modern relevance. Lighter fare like The Hundred-Page Machine Learning Book serves as bedside fodder, while denser works like The Elements of Statistical Learning dive into statistical roots. Howell praises Zero to Mastery's bootcamp for its project-heavy ethos—think heart disease detectors and image classifiers—fostering a community that's launched careers at Meta and Google.

    Deep learning, the engine of today's generative boom, demands PyTorch fluency, Howell insists, citing its dominance in 92% of Hugging Face models. Ng's follow-up specialization unpacks convolutions and transformers, paving the way for Andrej Karpathy's illuminating talks and courses. Building a GPT from raw NumPy arrays? That's the zero-to-hero thrill that cements intuition. Jay Alammar's Hands-On Large Language Models demystifies these beasts with the clarity of his famed "Illustrated Transformer" blog. Yet Howell reminds: knowledge alone fizzles without action. AI engineering, the field's practical pinnacle, hinges on deploying models via MLOps—containerizing with Docker, cloud integration—to deliver business value.

    Howell's parting wisdom, drawn from Karpathy's tweet, flips traditional learning on its head: tackle concrete projects depth-first, summarize in your words, and measure against your younger self. No need to devour every page; cherry-pick relevance and iterate. This pragmatic path, honed over years, demystifies AI's ascent, empowering anyone to contribute meaningfully. As tools evolve and jobs demand hybrid skills, Howell's blueprint offers not just entry but enduring edge.