OpenHosta Documentation¶
Welcome to the OpenHosta documentation. OpenHosta is the semantic layer for Python: it transforms human language and type annotations into executable, type-safe Python functions powered by Large Language Models.
Quick Navigation¶
Getting Started¶
Set up your environment, configure a local or remote model, and run your first emulate() call.
Core Functions¶
Learn about emulate, emulate_async, emulate_iterator, closure, ask, and test.
Models & Setup¶
Connect any OpenAI-compatible endpoint (Ollama, vLLM, Azure OpenAI), customize prompts, enable audit mode, and track costs.
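For example, pointing OpenHosta at a local Ollama server might look like the sketch below (a sketch under assumptions: the Model constructor arguments and config helper follow the general pattern described in the Models & Setup page, exact parameter names can vary between versions, and the model name and URL are placeholders):

```python
# Hypothetical setup sketch: connect OpenHosta to a local Ollama
# endpoint. Check Models & Setup for the exact API in your version.
from OpenHosta import config, Model

local_model = Model(
    model="qwen2.5",                                        # any model your Ollama server exposes
    base_url="http://localhost:11434/v1/chat/completions",  # Ollama's OpenAI-compatible route
    api_key="ollama",                                       # local servers typically ignore the key
)
config.set_default_model(local_model)
```

Because the endpoint only needs to speak the OpenAI chat-completions protocol, the same pattern applies to vLLM or Azure OpenAI by swapping the base URL, model name, and key.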
Types & Pydantic Validation¶
OpenHosta natively supports int, str, List, Dict, Enum, dataclass, Pydantic V2, and even Callable return types.
Safe Context & Error Handling¶
Handle uncertainty, catch ambiguous LLM responses, and build robust production workflows.
Guarded Types¶
Deep dive into OpenHosta's type validation and conversion system with configurable tolerance.
Cookbook¶
| Example | Description |
|---|---|
| Text Classification | Classify text into Enum states |
| Data Extraction | Populate a dataclass or Pydantic model from unstructured text |
| Local OCR | Image processing with PIL.Image + Ollama |
| Parallel Processing | Async batch workloads with emulate_async |
Compatibility¶
- Python: 3.10, 3.11, 3.12, 3.13, 3.14 (details)
- Models: OpenAI (GPT-4.1, GPT-5), Ollama (Qwen, Mistral), Azure, vLLM (type matrix)
Contributing¶
We welcome contributions! See our Contribution Guide and Code of Conduct.
OpenHosta is licensed under the MIT License.