ai
Numeric vectors, small matrices, activations, batching, and prompt token budgeting. Pure ilusm - no host model required.
Load with: use ai
What this module does
ai provides the numeric building blocks you need to work with ML data in ilusm.
It operates on plain lists of numbers as vectors and lists-of-lists as matrices.
There's no native model or GPU here - this is pure ilusm math for things like:
preprocessing embeddings, computing dot products and softmax, splitting text into windows,
estimating token counts, and running a single feed-forward layer.
Quick example
use ai
a = [1.0, 2.0, 3.0]
b = [4.0, 5.0, 6.0]
prn(aiudo(a, b)) # dot product → 32.0
prn(aiunr(a)) # L2 norm → 3.74...
prn(aiusf(a)) # softmax → [0.09, 0.24, 0.67]
# Rough token estimate for a prompt
prn(aiest("Hello, how are you?")) # → 5
Functions
Vector operations
Vectors are flat lists of numbers. Operations that take two vectors require them to have the same length and will error if they don't.
aiule(v)
Returns the length of the vector (number of elements).
aiusu(v)
Returns the sum of all elements in the vector. Errors if v is not a list.
aiudo(a, b)
Dot product of two vectors - multiplies element-wise and sums the results. Both vectors must be the same length.
aiuad(a, b)
Element-wise addition of two vectors. Returns a new vector.
aiusb(a, b)
Element-wise subtraction of two vectors (a - b). Returns a new vector.
aiusc(v, s)
Scalar multiplication - multiplies every element of v by s.
aiumu(a, b)
Element-wise multiplication (Hadamard product) of two vectors.
aiunr(v)
L2 norm (Euclidean length) of the vector - the square root of its dot product with itself.
aiurl(v)
ReLU activation - returns a new vector where negative values are replaced with 0.
aiusf(v)
Softmax over the vector. Uses the numerically stable max-shift method. Errors if the vector is empty or all exponents sum to zero.
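For example, combining the element-wise operations above (outputs follow from the definitions; printed formatting may vary):

use ai
prn(aiuad([1.0, 2.0], [3.0, 4.0]))   # → [4.0, 6.0]
prn(aiusb([5.0, 5.0], [1.0, 2.0]))   # → [4.0, 3.0]
prn(aiusc([1.0, -2.0], 3.0))         # → [3.0, -6.0]
prn(aiumu([1.0, 2.0], [3.0, 4.0]))   # → [3.0, 8.0]
prn(aiurl([-1.0, 0.0, 2.0]))         # → [0.0, 0.0, 2.0]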
Matrix operations
Matrices are lists of rows, where each row is a list of numbers.
aimrw(m)
Returns the number of rows in the matrix.
aimcl(m)
Returns the number of columns (length of the first row).
aimmv(m, v)
Matrix-vector multiplication. Multiplies each row of m by v using dot product. The vector length must match the number of columns.
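For example, with a 2×2 matrix:

use ai
m = [[1.0, 2.0], [3.0, 4.0]]
prn(aimrw(m))               # → 2
prn(aimcl(m))               # → 2
prn(aimmv(m, [1.0, 1.0]))   # each row dotted with the vector → [3.0, 7.0]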
Loss and distance
ail2s(a, b)
Squared L2 distance between two vectors - subtracts them and takes the dot product of the difference with itself.
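For example:

use ai
prn(ail2s([1.0, 2.0], [4.0, 6.0]))   # (-3)² + (-4)² → 25.0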
Batching
aibch(xs, k)
Splits a flat list into chunks of size k. The last chunk may be shorter if the list doesn't divide evenly.
aibcv(vs, k)
Like aibch, but for a list of vectors: splits vs into batches of size k.
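For example, a list of five elements in batches of two (the last batch is shorter):

use ai
prn(aibch([1, 2, 3, 4, 5], 2))   # → [[1, 2], [3, 4], [5]]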
Text chunking and token budgeting
aichn(s, w)
Splits a string into chunks of w characters. Useful for sliding-window context processing.
aiest(s)
Rough token count estimate: divides the character length by 4 and rounds up. This is a heuristic only - actual tokenisation varies by model.
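For example, chunking a short string and estimating its token count:

use ai
prn(aichn("abcdef", 4))    # → ["abcd", "ef"]
prn(aiest("0123456789"))   # 10 chars ÷ 4, rounded up → 3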
Feed-forward layer
aiffn(m, b, x)
Runs a single feed-forward layer: computes relu(m * x + b). m is the weight matrix, b the bias vector, and x the input vector. The length of x must match the number of columns of m (as in aimmv), and the bias length must match the number of rows.
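For example, a tiny 2×2 layer:

use ai
m = [[1.0, 0.0], [0.0, -1.0]]
b = [0.5, 0.5]
x = [1.0, 2.0]
prn(aiffn(m, b, x))   # relu([1.0, -2.0] + [0.5, 0.5]) → [1.5, 0.0]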
Notes
- All vectors must be flat lists of numbers. The module errors with "ai:vec" if you pass something else.
- Token estimation (aiest) is chars ÷ 4, rounded up. It is a rough heuristic, not exact tokenisation.
- No GPU or native model is involved - everything runs in pure ilusm.