A JIT-Compiled Language with Built-In AI
Write AI, ML, and vision apps in fewer lines of code. Native ARM64 and x86-64 performance. One toolchain across Linux, macOS, and Windows.
use API.Ollama;

class Chat {
  function : Main(args : String[]) ~ Nil {
    prompt := args[0];
    Completion->Generate("llama3:latest", prompt)
      ->PrintLine();
  }
}
Run Phi-3 and Phi-3 Vision models locally — text generation, image analysis, and more. Accelerated via DirectML, CoreML, or CUDA, with a plain CPU fallback.
Language Highlights
JIT Compiled
Automatic hot-code detection with ARM64 and x86-64 native compilation. Direct JIT-to-JIT calling and generational GC with bump allocation.
AI & ML Ready
Chat with LLMs, run ONNX models, and process images with OpenCV — all from the standard library.
OOP + Functional
Generics, closures, lambdas, first-class functions, and reflection. One language, both paradigms.
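As a sketch of the functional side, a lambda can be bound to a typed function variable and called like any other function. This example is illustrative — the lambda syntax shown is drawn from the language's functional features and should be checked against the current Objeck documentation:

class Lambdas {
  function : Main(args : String[]) ~ Nil {
    # lambda with an explicit (Int, Int) ~ Int signature
    add := \(Int, Int) ~ Int : (a, b) => a + b;
    add(2, 3)->PrintLine();
  }
}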
Text Processing
Tokenization, TF-IDF, text similarity, and sentiment analysis out of the box.
Web & Networking
HTTP client/server, TLS 1.3, OAuth, JSON, XML, and CSV. Ship networked apps without extra dependencies.
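For example, parsing a JSON payload with the bundled Data.JSON library might look like the following. This is a hedged sketch — the JsonParser method names reflect the standard library's style but should be verified against the API docs:

use Data.JSON;

class Parse {
  function : Main(args : String[]) ~ Nil {
    parser := JsonParser->New("{\"model\": \"llama3\", \"done\": true}");
    if(parser->Parse()) {
      root := parser->GetRoot();
      # read the "model" string out of the parsed document
      root->Get("model")->GetValue()->PrintLine();
    };
  }
}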
Developer Tools
REPL, LSP support for VS Code, Sublime Text, and Kate, an interactive debugger, and an API doc generator.