edgemodelr

cran · v0.2.0

Local Large Language Model Inference Engine. Enables R users to run large language models locally using 'GGUF' model files and the 'llama.cpp' inference engine. Provides a complete R interface for loading models, generating text completions, and streaming responses in real time. Supports local inference without requiring cloud APIs or internet connectivity.

License: MIT + file LICENSE · 0 versions · 1 maintainer · 3 deps · 60 weekly dl
PawanRamaMali/edgemodelr
Health score: 47/100 · safe to use

edgemodelr@0.2.0 is safe to use (health: 47/100)

Health breakdown (0–100):
- maintenance: 20/25
- popularity: 0/20
- security: 25/25
- maturity: 0/15
- community: 2/15
Vulnerabilities: 0 (none known)

Dependencies (3)
API access

Get this data programmatically — free, no authentication.

curl https://depscope.dev/api/check/cran/edgemodelr
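The endpoint shown above appears to follow an `/api/check/<ecosystem>/<package>` path pattern; that generalization beyond this single CRAN example is an assumption. A minimal Python sketch that builds such URLs (fetching the JSON, commented out below, requires network access):

```python
# Build a depscope.dev check URL. The /api/check/<ecosystem>/<package>
# pattern is inferred from the one CRAN example on this page and is an
# assumption for other ecosystems.
BASE = "https://depscope.dev/api/check"

def check_url(ecosystem: str, package: str) -> str:
    """Return the health-check endpoint URL for a package."""
    return f"{BASE}/{ecosystem}/{package}"

print(check_url("cran", "edgemodelr"))
# → https://depscope.dev/api/check/cran/edgemodelr

# To actually retrieve the data (no authentication required, per the
# page above; response schema not documented here):
# import json, urllib.request
# data = json.load(urllib.request.urlopen(check_url("cran", "edgemodelr")))
```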

First published · 2026-02-26 03:22:00

Last updated · 2026-02-25T23:50:02+00:00