
feat: replace litellm with nanollm — 108x less code, 1 dependency #1871

Open
hafezparast wants to merge 9 commits into unclecode:main from hafezparast:feat/replace-litellm-with-nanollm

Conversation

@hafezparast
Contributor

Summary

  • Replaces litellm dependency (544K lines, 61 packages, 151 MB installed) with nanollm (5K lines, 1 dep, 5.5 MB installed)
  • All from litellm import ... changed to from nanollm import ...
  • Zero functional changes to crawl4ai — same API surface, same behavior
  • Depends on unclecode/nanollm#2

What changed

  • pyproject.toml: litellm>=1.53.1 → nanollm @ git+...
  • All Python files: litellm → nanollm imports (utils.py, cli.py, extraction_strategy.py, legacy/llmtxt.py)
  • nanollm.drop_params = True added where crawl4ai calls completion (handles O-series/GPT-5 compat)
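The edits above are mechanical; a sketch of the two kinds of change (names taken from this description; the exact git URL is elided in the original, so it stays elided here):

```diff
 # imports swapped one-for-one in each affected file
-from litellm import completion, acompletion, batch_completion, aembedding
+from nanollm import completion, acompletion, batch_completion, aembedding

 # added where crawl4ai calls completion: drop params the target
 # model doesn't accept (O-series/GPT-5 compatibility)
+nanollm.drop_params = True
```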

Why

litellm is 544K lines with 61 transitive dependencies. crawl4ai uses exactly 4 functions from it: completion, acompletion, batch_completion, aembedding. NanoLLM provides those same functions (plus more) in 5K lines with 1 dependency (httpx, which crawl4ai already uses).
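Because the dependency is touched through just those four names, the swap can be sanity-checked mechanically before merging. A minimal sketch using only the standard library (`check_api_surface` and the stand-in module are hypothetical, for illustration; neither library is imported):

```python
from types import SimpleNamespace

# The four litellm entry points crawl4ai actually uses.
REQUIRED_API = {"completion", "acompletion", "batch_completion", "aembedding"}

def check_api_surface(module, required=frozenset(REQUIRED_API)):
    """Return the subset of required names the module fails to expose
    as callables; an empty set means the replacement is drop-in."""
    return {name for name in required if not callable(getattr(module, name, None))}

# Stand-in for nanollm so the sketch stays self-contained.
fake_nanollm = SimpleNamespace(
    completion=lambda **kw: None,
    acompletion=lambda **kw: None,
    batch_completion=lambda **kw: None,
    aembedding=lambda **kw: None,
)

print(check_api_surface(fake_nanollm))  # set() -> nothing missing
```

The same check run against a module missing any of the four names would return the missing subset rather than an empty set.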

Test plan

  • 15/15 crawl4ai unit tests pass
  • 355/356 regression tests pass (1 pre-existing transformers issue, unrelated)
  • Zero litellm references in crawl4ai source
  • All import patterns verified
  • Real API integration tests

🤖 Generated with Claude Code

hafezparast and others added 9 commits March 26, 2026 18:34
Complete migration from unclecode-litellm to nanollm. Zero litellm
references remain in the crawl4ai source code.

Version: 0.8.7a1 (pre-release for nanollm integration testing)

Changes:
- pyproject.toml, requirements.txt: swap unclecode-litellm for nanollm
- crawl4ai/utils.py: all litellm imports → nanollm (completion,
  acompletion, batch_completion, aembedding, RateLimitError, drop_params)
- crawl4ai/cli.py: litellm import → nanollm, provider docs URLs updated
- crawl4ai/legacy/llmtxt.py: litellm imports → nanollm, set_verbose
- crawl4ai/__version__.py: bump to 0.8.7a1

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Use hafezparast/nanollm for easier access and independent control
during testing.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Reflects nanollm dependency now pointing to hafezparast fork.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Pin nanollm dependency to tagged release v0.1.0 instead of branch
name for reproducible installs.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Pin to commit 35a0c4b (v0.1.0) which includes the
completion_tokens_details and prompt_tokens_details fix.
Using commit hash instead of tag ensures pip won't serve a
stale cached version.

Bump to 0.8.7a4.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
nanollm v0.1.1 wraps completion_tokens_details and prompt_tokens_details
in _AttrDict so crawl4ai's .__dict__ access pattern works correctly.

Bump to 0.8.7a5.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
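The `.__dict__` access pattern referenced in this commit is easy to reproduce in isolation. A minimal sketch of an attribute-dict wrapper (hypothetical; nanollm's actual `_AttrDict` may differ) showing why wrapping the token-details mappings makes both access styles work:

```python
class _AttrDict:
    """Expose a mapping's keys as instance attributes, so callers can read
    both obj.reasoning_tokens and obj.__dict__["reasoning_tokens"]."""

    def __init__(self, data):
        # Copy the mapping into the instance dict; attribute access and
        # __dict__ access now see the same keys.
        self.__dict__.update(data)

details = _AttrDict({"reasoning_tokens": 128, "audio_tokens": 0})

print(details.reasoning_tokens)              # 128, attribute-style
print(details.__dict__["reasoning_tokens"])  # 128, the pattern crawl4ai uses

# A plain dict supports only details["reasoning_tokens"];
# plain_dict.reasoning_tokens would raise AttributeError.
```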
Point to hafezparast/nanollm-core (Approach 1 — general-purpose litellm
replacement) instead of hafezparast/nanollm (Approach 2 — fork).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Point dependency to hafezparast/nanollm-approach1 v0.2.0 — built by
AST-analyzing litellm's 544K-line codebase, extracting the core
completion subgraph, and rewriting it with multimodal/vision support.

3,731 lines, 1 dep (httpx), 605 tests, 7 adapters, 25+ providers.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Combines Approach 1 (httpx, module functions, 600+ tests) with
Approach 2 (class-based providers, NanoLLM client, built-in retry,
structured output, thinking/reasoning).

5,026 lines, 1 dep (httpx), 609 tests, 25 providers.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
