A JIT-Compiled Language with Built-In AI

Write AI, ML, and vision apps in fewer lines of code. Native ARM64 and x86-64 performance. One toolchain across Linux, macOS, and Windows.

use API.Ollama;

class Chat {
  function : Main(args : String[]) ~ Nil {
    # pass the prompt as the first command-line argument
    prompt := args[0];
    Completion->Generate("llama3:latest", prompt)
      ->PrintLine();
  }
}
New: Local AI with ONNX + Phi-3
Run Phi-3 and Phi-3 Vision models locally for text generation, image analysis, and more. Supports DirectML, CoreML, CUDA, and CPU acceleration.
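As a rough sketch of what a local Phi-3 call could look like, mirroring the structure of the Ollama sample above. The `API.Onnx` module name and the `GenAI->Generate(...)` call are illustrative assumptions, not the documented ONNX binding:

```objeck
# hypothetical sketch: module and method names below are assumptions
# modeled on the Ollama example; consult the API docs for the real names
use API.Onnx;

class LocalChat {
  function : Main(args : String[]) ~ Nil {
    # load a local Phi-3 model and generate a completion on-device
    prompt := args[0];
    GenAI->Generate("phi3-mini", prompt)
      ->PrintLine();
  }
}
```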

Language Highlights

JIT Compiled

Automatic hot-code detection with ARM64 and x86-64 native compilation. Direct JIT-to-JIT calling and generational GC with bump allocation.

AI & ML Ready

Chat with LLMs, run ONNX models, and process images with OpenCV — all from the standard library.

OOP + Functional

Generics, closures, lambdas, first-class functions, and reflection. One language, both paradigms.
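To give a feel for treating functions as values, here is an approximate sketch; the exact declaration and reference syntax for first-class functions may differ from what is shown, so treat this as illustrative rather than canonical:

```objeck
# sketch only: the function-reference syntax here is an approximation;
# see the language guide for the exact forms
class Funcs {
  function : Main(args : String[]) ~ Nil {
    # bind a named function to a variable, then call it like any value
    f := Square(Int) ~ Int;
    f(7)->PrintLine();
  }

  function : Square(x : Int) ~ Int {
    return x * x;
  }
}
```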

Text Processing

Tokenization, TF-IDF, text similarity, and sentiment analysis out of the box.
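A minimal tokenization sketch in the style of the samples above. The `Data.Text` module and `Tokenizer->Split(...)` call are assumed names for illustration, not the documented NLP API:

```objeck
# hypothetical sketch: module, class, and method names are assumptions;
# check the standard-library docs for the real text-processing API
use Data.Text;

class Tokens {
  function : Main(args : String[]) ~ Nil {
    # split a sentence into tokens and print each one
    tokens := Tokenizer->Split("Objeck makes text processing simple");
    each(i : tokens) {
      tokens[i]->PrintLine();
    };
  }
}
```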

Web & Networking

HTTP client/server, TLS 1.3, OAuth, JSON, XML, and CSV. Ship networked apps without extra dependencies.
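A fetch-a-page sketch to show the shape of a dependency-free networked app. The `HttpsClient` class name and `Get` signature are assumptions for illustration; the real `Web.HTTP` API may differ:

```objeck
# hypothetical sketch: the HttpsClient class and Get signature are
# assumptions; consult the Web.HTTP documentation for the real API
use Web.HTTP;

class Fetch {
  function : Main(args : String[]) ~ Nil {
    # fetch a page over TLS and print the response body
    client := HttpsClient->New();
    client->Get("https://www.objeck.org")->PrintLine();
  }
}
```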

Developer Tools

REPL, LSP support for VS Code/Sublime/Kate, interactive debugger, and API doc generator.

Cross-Platform

Linux x64, Linux ARM64, macOS ARM64, Windows x64, Windows ARM64, and Raspberry Pi.