OpenHosta Documentation

Version 4.1 · GitHub · PyPI

Welcome to the OpenHosta documentation. OpenHosta is the semantic layer for Python: it transforms human language and type annotations into executable, type-safe Python functions powered by Large Language Models.


Quick Navigation

🚀 Getting Started

Set up your environment, configure a local or remote model, and run your first emulate() call.

โš™๏ธ Core Functions

Learn about emulate, emulate_async, emulate_iterator, closure, ask, and test.

🔧 Models & Setup

Connect any OpenAI-compatible endpoint (Ollama, vLLM, Azure OpenAI), customize prompts, enable audit mode, and track costs.

🧩 Types & Pydantic Validation

OpenHosta natively supports int, str, List, Dict, Enum, dataclass, Pydantic V2, and even Callable return types.

๐Ÿ›ก๏ธ Safe Context & Error Handling

Handle uncertainty, catch ambiguous LLM responses, and build robust production workflows.

๐Ÿ“ Guarded Types

Deep dive into OpenHosta's type validation and conversion system with configurable tolerance.


Cookbook

Example | Description
📚 Text Classification | Classify text into Enum states
🗃️ Data Extraction | Populate a dataclass or Pydantic model from unstructured text
👁️ Local OCR | Image processing with PIL.Image + Ollama
⚡ Parallel Processing | Async batch workloads with emulate_async

Compatibility

  • Python: 3.10, 3.11, 3.12, 3.13, 3.14 (details)
  • Models: OpenAI (GPT-4.1, GPT-5), Ollama (Qwen, Mistral), Azure, vLLM (type matrix)

Contributing

We welcome contributions! See our Contribution Guide and Code of Conduct.

OpenHosta is licensed under the MIT License.