LevitateOS

Documentation

LLM Helpers

AI-assisted helpers for extracting versions and download URLs from unstructured sources. These helpers shell out to a local agent CLI (Codex or Claude).

Section

Overview

LLM helpers are useful when version numbers or download URLs aren't available via a clean API (like GitHub releases). Recipe does not try to parse the model output; it just runs your configured provider and returns the final text.

WARNING

Safety: LLM output is untrusted. Review the extracted version/URL and anything that will be executed. The default update model is A/B immutable (slot updates + rollback); mutable mode is an explicit opt-in for daredevils and is unsafe if you let an LLM author recipes without review. See Atomic Updates (A/B).

Configure the provider via XDG config: $XDG_CONFIG_HOME/recipe/llm.toml (default: ~/.config/recipe/llm.toml). You must set default_provider to either codex or claude (equal footing; no implicit fallback). Optionally define named profiles under [profiles.<name>] and select them per run via recipe --llm-profile <name> ....
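A minimal llm.toml needs only default_provider and the matching provider table; the full schema, including profiles, is shown under Configuration below.

Command
toml
version = 1
default_provider = "claude"

[providers.claude]
bin = "claude"
args = ["-p", "--output-format", "text"]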

Section

llm_extract

Extract arbitrary information from unstructured text (HTML, a changelog, etc.) using a natural-language prompt.

Command
rhai
let html = http_get("https://example.com/downloads");
let result = llm_extract(
    html,
    "Find the SHA256 checksum for the Linux x86_64 download. Return only the checksum."
);

Section

llm_find_latest_version

Find the latest version number from a download page. The LLM parses the page and returns just the version string.

Command
rhai
fn check_update(ctx) {
    // For software without GitHub releases
    let latest = llm_find_latest_version("https://example.com/downloads/", "ExampleProject");
    if latest != ctx.version {
        latest
    } else {
        ()
    }
}

Section

llm_find_download_url

Find a download URL matching criteria from unstructured text (HTML, release notes, etc.). Useful for complex download pages.

Command
rhai
fn acquire(ctx) {
    let html = http_get("https://example.com/downloads/");
    let url = llm_find_download_url(html, "Linux x86_64 tarball for version " + ctx.version);
    let archive = download(url, join_path(BUILD_DIR, ctx.name + ".tar.gz"));
    // ...
}

Section

When to Use

Prefer structured APIs when available:

Source              Use
GitHub releases     github_latest_release(), github_download_release()
GitHub tags         github_latest_tag()
Direct URLs         download()
Unstructured pages  llm_* helpers

LLM helpers are slower and require a locally configured agent CLI (Codex or Claude), so only use them when no structured alternative exists.
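For comparison, a check_update against a project that does publish GitHub releases needs no LLM at all. A sketch, assuming github_latest_release takes an owner/repo slug and returns the latest version string:

Command
rhai
fn check_update(ctx) {
    // Structured API: fast, deterministic, no local agent CLI required
    let latest = github_latest_release("exampleorg/exampleproject");
    if latest != ctx.version { latest } else { () }
}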

Section

Configuration

Create $XDG_CONFIG_HOME/recipe/llm.toml (default: ~/.config/recipe/llm.toml):

Command
toml
version = 1
default_provider = "codex" # or "claude"
default_profile = "kernels_nightly" # optional

[providers.codex]
bin = "codex"
args = ["--sandbox", "read-only", "--skip-git-repo-check"]

[providers.claude]
bin = "claude"
args = ["-p", "--output-format", "text", "--no-chrome"]

[profiles.kernels_nightly]
default_provider = "codex"

[profiles.kernels_nightly.providers.codex]
model = "gpt-5.3-codex"
effort = "xhigh" # passed to codex as --config model_reasoning_effort=xhigh