Python in Action: 60 Mini Projects to Automate Everything — Part 2 is where Python goes online.
If Part 1 helped you build solid foundations—project structure, clean CLIs, logging, validation, and real-world data handling—
Part 2 (Mini Projects 21–40) teaches you how to work with the outside world: websites that change, services that rate-limit you, flaky connections, inconsistent responses, and data you can’t trust until you validate it.
This volume is built for doing, not skimming. Each mini project ends with a practical tool you can reuse: scrapers that export clean datasets, monitors that detect changes, downloaders that save files with sane names, and API clients that behave like professional software.
What you’ll build in Part 2

Block 3 — Responsible Web Scraping & Web Automation (21–30)
- Headline scraper (title/date/link) → CSV
- Pagination scraper that consolidates results
- Rate limiting with retries and backoff
- HTML table extraction and normalization
- Change monitor (diff + alerts)
- Price/stock monitor (report mode)
- PDF downloader with clean filenames
- Page archiver (HTML + metadata + hash)
- Feed builder (scraping → RSS/JSON)
- A “polite” scraper: cache, user-agent, basic robots awareness, retries, and logs
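The retry-and-backoff pattern that runs through Block 3 (the rate-limiting project and the "polite" scraper) can be sketched in a few lines of standard-library Python. The function name and parameters below are illustrative assumptions, not the book's actual code:

```python
import random
import time


def retry_with_backoff(func, max_attempts=4, base_delay=1.0, max_delay=30.0):
    """Call func(); on failure, wait base_delay * 2**attempt seconds
    (capped at max_delay, plus a little jitter) before retrying."""
    for attempt in range(max_attempts):
        try:
            return func()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the original error
            delay = min(max_delay, base_delay * (2 ** attempt))
            time.sleep(delay + random.uniform(0, delay * 0.1))  # jitter avoids synchronized retries
```

Wrapping each HTTP request in a helper like this keeps flaky-connection handling in one place instead of scattered across every scraper.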
Block 4 — APIs: Consume Services Like a Pro (31–40)
- Simple GET/POST client with consistent error handling
- Token authentication + simulated refresh
- Universal pagination with generators
- Rate limit handler that respects response headers
- Local TTL caching to reduce calls
- Offline mode: fall back to cache when the API fails
- Response validation with Pydantic + readable errors
- Batch requests with controlled concurrency
- Sync remote data to a local store using upsert logic
- A mini SDK package with docs, examples, and automated tests
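The local TTL cache behind the caching and offline-mode projects can be sketched with nothing but the standard library. The `TTLCache` class below is an illustrative assumption, not the book's implementation:

```python
import time


class TTLCache:
    """Tiny in-memory cache: entries expire ttl seconds after being stored."""

    def __init__(self, ttl=60.0):
        self.ttl = ttl
        self._store = {}  # key -> (expires_at, value)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        expires_at, value = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazily evict stale entries on read
            return default
        return value

    def set(self, key, value):
        self._store[key] = (time.monotonic() + self.ttl, value)
```

An API client can check a cache like this before every call, and fall back to the (possibly stale) cached value when the API is unreachable.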
Who this book is for
- Python learners who want real projects, not toy examples
- Developers who can write scripts but want tools that don’t break
- Anyone who wants a practical roadmap to web automation and API engineering
If you want Python to do more than run locally—if you want it to collect, verify, store, and deliver information reliably—
Part 2 is your next step.