
airllm

pypi · v2.11.0

AirLLM allows a single 4GB GPU card to run 70B large language models without quantization, distillation, or pruning; 8GB of VRAM suffices to run 405B Llama 3.1.

License: MIT (permissive) · 33 versions · 1 maintainer · 8 dependencies · 3,536 weekly downloads
Repository: lyogavin/airllm

[email protected] is safe to use (health: 50/100).

Health breakdown (overall 0–100):
- maintenance: 5/25
- popularity: 6/20
- security: 25/25
- maturity: 12/15
- community: 2/15
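The overall score of 50/100 appears to be the plain sum of the five category scores, with the category maxima (25 + 20 + 25 + 15 + 15) adding up to 100. A minimal sketch of that arithmetic, assuming simple summation with no extra weighting:

```python
# Health-score breakdown as shown on the page: category -> (score, maximum).
breakdown = {
    "maintenance": (5, 25),
    "popularity": (6, 20),
    "security": (25, 25),
    "maturity": (12, 15),
    "community": (2, 15),
}

# Overall health appears to be the simple sum of category scores.
total = sum(score for score, _ in breakdown.values())        # 50
max_total = sum(maximum for _, maximum in breakdown.values())  # 100
print(f"health: {total}/{max_total}")
```

This reproduces the displayed 50/100; whether DepScope applies any additional normalization is not documented on this page.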
Vulnerabilities: 0 (none known)

Dependencies (8): tqdm, torch, transformers, accelerate, safetensors, optimum, huggingface-hub, scipy
API access

Get this data programmatically: it's free and requires no authentication.

curl https://depscope.dev/api/check/pypi/airllm
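The same endpoint can be called from Python. A minimal sketch, assuming the `/api/check/{ecosystem}/{name}` endpoint returns a JSON body (the response fields are not documented on this page, so only the URL shape below comes from the source):

```python
import json
from urllib.request import urlopen

def build_check_url(ecosystem: str, name: str) -> str:
    """Build the DepScope check URL, e.g. for ecosystem 'pypi', name 'airllm'."""
    return f"https://depscope.dev/api/check/{ecosystem}/{name}"

def check_package(ecosystem: str, name: str) -> dict:
    """Fetch package intelligence; no authentication required per the page.

    The JSON structure of the response is an assumption here.
    """
    with urlopen(build_check_url(ecosystem, name), timeout=10) as resp:
        return json.load(resp)
```

Usage would look like `check_package("pypi", "airllm")`, mirroring the curl command above.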

Last updated · 2024-09-21T02:52:22.091498Z

DepScope

Package intelligence for AI agents. 19 ecosystems.

© 2026 Cuttalo srl, Italy · VAT IT03242390734