edgemodelr: Local Large Language Model Inference Engine

Enables R users to run large language models locally using 'GGUF' model files and the 'llama.cpp' inference engine. Provides a complete R interface for loading models, generating text completions, and streaming responses in real time. Inference runs entirely on the local machine, with no cloud APIs or internet connectivity required, ensuring complete data privacy and control. Based on the 'llama.cpp' project by Georgi Gerganov (2023) <https://github.com/ggml-org/llama.cpp>.
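The workflow described above (load a local 'GGUF' model, generate a completion, release the model) can be sketched in R roughly as follows. This is an illustrative sketch only: the function names `el_load_model()`, `el_completion()`, and `el_free_model()`, their arguments, and the model path are assumptions, not confirmed API; consult the package reference manual for the actual interface.

```r
library(edgemodelr)

## Assumed API sketch -- verify names against the reference manual.
## Path to a locally downloaded GGUF model file (hypothetical path).
model_path <- "models/tinyllama-1.1b-q4_k_m.gguf"

if (file.exists(model_path)) {
  ## Load the model into memory (context size is an assumed parameter).
  ctx <- el_load_model(model_path, n_ctx = 2048)

  ## Generate a text completion locally; no network access is needed.
  out <- el_completion(ctx, "Explain what a GGUF file is in one sentence.",
                       n_predict = 100)
  cat(out, "\n")

  ## Free the model's memory when done.
  el_free_model(ctx)
}
```

Because inference runs in-process via 'llama.cpp', no prompt text leaves the machine.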

Version: 0.1.4
Depends: R (≥ 4.0)
Imports: Rcpp (≥ 1.0.0), utils, tools
LinkingTo: Rcpp
Suggests: testthat (≥ 3.0.0), knitr, rmarkdown, curl
Published: 2026-01-22
DOI: 10.32614/CRAN.package.edgemodelr
Author: Pawan Rama Mali [aut, cre, cph], Georgi Gerganov [aut, cph] (Author of llama.cpp and GGML library), The ggml authors [cph] (llama.cpp and GGML contributors), Jeffrey Quesnelle [ctb, cph] (YaRN RoPE implementation), Bowen Peng [ctb, cph] (YaRN RoPE implementation), pi6am [ctb] (DRY sampler from Koboldcpp), Ivan Yurchenko [ctb] (Z-algorithm implementation), Dirk Eddelbuettel [ctb] (Connection handling fix)
Maintainer: Pawan Rama Mali <prm at outlook.in>
BugReports: https://github.com/PawanRamaMali/edgemodelr/issues
License: MIT + file LICENSE
Copyright: see file COPYRIGHTS
URL: https://github.com/PawanRamaMali/edgemodelr
NeedsCompilation: yes
SystemRequirements: C++17, GNU make or equivalent for building
Citation: edgemodelr citation info
CRAN checks: edgemodelr results

Documentation:

Reference manual: edgemodelr.html, edgemodelr.pdf

Downloads:

Package source: edgemodelr_0.1.4.tar.gz
Windows binaries: r-devel: not available, r-release: not available, r-oldrel: not available
macOS binaries: r-release (arm64): edgemodelr_0.1.4.tgz, r-oldrel (arm64): edgemodelr_0.1.4.tgz, r-release (x86_64): edgemodelr_0.1.4.tgz, r-oldrel (x86_64): edgemodelr_0.1.4.tgz
Old sources: edgemodelr archive

Linking:

Please use the canonical form https://CRAN.R-project.org/package=edgemodelr to link to this page.