    ⚙️ CLI Usage — PromptStream.AI

    The PromptStream.AI CLI lets you build, validate, and analyze prompt templates directly from your terminal. It provides a clean developer workflow for experimenting with the Flow.AI ecosystem (Flow.AI.Core, TokenFlow.AI, and PromptStream.AI).


    🚀 Installation

    Install the global tool via NuGet:

    dotnet tool install --global promptstream.ai.cli
    

    Once installed, you can invoke it from anywhere with the promptstream command.
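
    Because it is installed as a standard .NET global tool, the usual dotnet tool commands can be used to check that it is present or to update it later:

    dotnet tool list --global
    dotnet tool update --global promptstream.ai.cli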


    🧱 Basic Commands

    🧩 Build

    Render a template and substitute its variables.

    promptstream build --template "Hello {{name}}" --var name=Andrew
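
    As noted under Options below, --var may be repeated and --template also accepts a path to a .json template file. A sketch with an illustrative file name and variables:

    promptstream build --template ./prompts/greeting.json --var name=Andrew --var tone=friendly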
    

    🔍 Validate

    Check a prompt's structure, token limits, and syntax.

    promptstream validate --template "Summarize {{text}}" --var text="PromptStream.AI is awesome"
    

    📊 Analyze

    Estimate token counts and cost for a given prompt.

    promptstream analyze --template "Explain quantum computing simply."
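
    The --model and --output options listed below can make the estimate explicit and machine-readable; gpt-4o-mini is the example model ID from the Options table, and it is an assumption here that analyze accepts both flags:

    promptstream analyze --template "Explain quantum computing simply." --model gpt-4o-mini --output json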
    

    💬 Generate

    Build and validate the prompt, then request a response from the model provider.

    promptstream generate --template "Write a haiku about AI" --save context.json
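
    To continue an existing conversation, the documented --context option can point at a previously saved file (assuming generate reads it the same way the context command does):

    promptstream generate --template "Now write one about the ocean" --context context.json --save context.json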
    

    🧠 Context

    Inspect, summarize, or clear conversation history.

    promptstream context --load context.json --summarize
    

    🧩 Command Reference

    Command     Description
    build       Renders and substitutes variables in a template.
    validate    Validates prompt structure and token usage.
    analyze     Estimates token usage and cost metrics.
    generate    Generates model output for a prompt.
    context     Loads, saves, summarizes, or clears context data.

    💡 Options

    Option                 Description
    --template <value>     Path to a .json file or inline text content.
    --var <key=value>      Variables to substitute (multiple allowed).
    --context <path>       Path to an existing context JSON file.
    --save <path>          Destination path to save context or output.
    --model <id>           Optional model ID (e.g. gpt-4o-mini).
    --output <format>      Output format (table or json).
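
    Several options can be combined in a single call. The sketch below uses only the flags listed above; the template path, variable value, and which flags each command honors are illustrative assumptions:

    promptstream generate --template ./prompts/haiku.json --var topic=autumn --model gpt-4o-mini --save context.json --output json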

    📸 Example Output

    📊 Prompt Analysis
    ────────────────────────────────────────
    Model: gpt-4o-mini
    Prompt Tokens:     128
    Completion Tokens: 142
    Total Tokens:      270
    Estimated Cost:    $0.00052
    ────────────────────────────────────────
    ✅  Analysis complete — total tokens: 270
    

    🧠 Integration Notes

    • The CLI uses PromptStream.AI services under the hood, leveraging Flow.AI.Core’s shared models.
    • Variable substitution is handled via the same builder logic used by the library itself.
    • Each command mirrors a service call: Build → Validate → Analyze → Generate (see the end-to-end sketch below).
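
    A minimal end-to-end pass over that pipeline, using only the commands and options documented above (the template text and file name are illustrative):

    promptstream build    --template "Summarize {{text}}" --var text="Release notes for 0.8.3"
    promptstream validate --template "Summarize {{text}}" --var text="Release notes for 0.8.3"
    promptstream analyze  --template "Summarize {{text}}" --var text="Release notes for 0.8.3"
    promptstream generate --template "Summarize {{text}}" --var text="Release notes for 0.8.3" --save context.json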

    🗺️ Links

    • PromptStream.AI on GitHub
    • NuGet Package
    • PromptStream.AI Docs

    Author: AndrewClements84
    License: MIT
    Version: 0.8.3
