
Rough-Path Signatures

What goes wrong with a naive approach

You want to compare two runtime traces — “is today’s showcase run performing equivalently to yesterday’s?” A trace is a multivariate time series of frame time, allocation bytes, effect-queue depth, input latency, etc. Comparing them with pointwise distances ($L^2$, DTW, etc.) fails because:

  • Reparameterisation. The same workload with different frame rates produces two traces that are pointwise different but semantically identical. Pointwise metrics flag noise as regression.
  • Variable length. Yesterday’s trace was 30 seconds, today’s 33. You have to crop or stretch; both choices distort.
  • Cross-channel coupling. What matters is not just the frame time, but the order in which frame time and allocation spikes interleave. Separate summaries lose that ordering.

You need a summary that is (a) invariant to time-reparameterisation, (b) captures cross-channel coupling, and (c) has a clean “update with a new sample” rule so it can be computed online.

Path signatures — the central object of rough-path theory (Lyons, 1998) — are that summary.

Mental model

A path is a function $X: [0, T] \to \mathbb{R}^d$. Its signature is a graded object whose degree-$k$ component is the $k$-fold iterated integral:

$$S(X)^k_{i_1 \ldots i_k} = \int_{0 < t_1 < \cdots < t_k < T} dX^{i_1}_{t_1} \cdots dX^{i_k}_{t_k}$$
  • Degree 1: the path increment $\Delta X^i$.
  • Degree 2: signed area between pairs of coordinates.
  • Degree 3+: higher-order interactions.
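For a piecewise-linear path these integrals have closed forms per segment, so the low-degree terms can be computed directly. Here is a minimal sketch — the helper `sig_levels_1_2` is hypothetical, not part of ftui-runtime — showing that degree 1 is the total increment and that the antisymmetric part of degree 2 is a signed area:

```rust
/// Degree-1 and degree-2 signature terms of a piecewise-linear path
/// in d = 2 channels, from the per-segment closed form.
/// Hypothetical helper for illustration, not the ftui-runtime API.
fn sig_levels_1_2(points: &[[f64; 2]]) -> ([f64; 2], [[f64; 2]; 2]) {
    let mut s1 = [0.0; 2];      // running increment  X_t - X_0
    let mut s2 = [[0.0; 2]; 2]; // iterated integrals S^2_{ij}
    for w in points.windows(2) {
        let d = [w[1][0] - w[0][0], w[1][1] - w[0][1]]; // segment increment
        for i in 0..2 {
            for j in 0..2 {
                // Contribution of this segment to ∫ dX^i dX^j:
                // (increment so far)_i * d_j  +  d_i * d_j / 2
                s2[i][j] += s1[i] * d[j] + d[i] * d[j] / 2.0;
            }
        }
        for i in 0..2 {
            s1[i] += d[i];
        }
    }
    (s1, s2)
}

fn main() {
    // Counterclockwise unit square: a closed loop.
    let square = [[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.0, 0.0]];
    let (s1, s2) = sig_levels_1_2(&square);
    // Degree 1 vanishes because the loop closes ...
    assert!(s1[0].abs() < 1e-12 && s1[1].abs() < 1e-12);
    // ... but the signed area (antisymmetric part of degree 2) is the
    // enclosed area, +1 for this orientation:
    let area = (s2[0][1] - s2[1][0]) / 2.0;
    assert!((area - 1.0).abs() < 1e-12);
    println!("signed area = {area}");
}
```

This is why degree 2 sees what degree 1 cannot: a closed loop has zero increment in every channel, yet its signed areas record how the channels wound around each other.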

Two crucial properties:

  1. Time-reparameterisation invariance. Stretch or squeeze $X$ in time; the signature does not change.
  2. Chen’s identity. The signature of a concatenation is the tensor product of signatures: $S(X \star Y) = S(X) \otimes S(Y)$. This is the online-update rule.

Truncate at depth $K$ (typically 3 or 4) and you get a finite vector that compresses the whole path’s “shape” into $O(d^K)$ numbers.

The signature is to paths what Fourier coefficients are to periodic signals: a coordinate system that turns complex objects into vectors you can feed into classifiers, distance functions, and regression detectors. It is the right encoding for multi-channel runtime traces.

The math

Iterated integrals

Signature at depth $K$:

$$\mathbf{S}_{\le K}(X) = \left(1,\; S^1(X),\; S^2(X),\; \ldots,\; S^K(X)\right), \qquad S^k(X) \in \mathbb{R}^{d^k}$$

Chen’s identity

For two paths $X$ on $[0, s]$ and $Y$ on $[s, t]$ concatenated:

$$\mathbf{S}(X \star Y) = \mathbf{S}(X) \otimes \mathbf{S}(Y)$$

The tensor product is applied degree by degree in the truncated tensor algebra — complexity $O(K \cdot d^K)$ per update, not the $O(n^K)$ a naive recomputation over all $n$ samples would suggest. That is why this is fast online.
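The update can be sketched concretely for a piecewise-linear path, assuming increments arrive one segment at a time. The `Sig` type below is illustrative, not the ftui-runtime implementation: a linear segment with increment $\Delta$ has $S^k = \Delta^{\otimes k}/k!$, and Chen's identity folds it into the running signature degree by degree.

```rust
// Truncated signature with Chen-identity updates; a sketch, not the
// ftui-runtime API. Level k is a flat tensor of d^k entries, row-major.
struct Sig {
    d: usize,
    levels: Vec<Vec<f64>>,
}

impl Sig {
    fn new(d: usize, k_max: usize) -> Self {
        let levels = (0..=k_max)
            .map(|k| {
                let mut v = vec![0.0; d.pow(k as u32)];
                if k == 0 {
                    v[0] = 1.0; // empty path: S^0 = 1
                }
                v
            })
            .collect();
        Sig { d, levels }
    }

    /// Append one linear segment with increment `delta` (length d).
    /// The segment alone has S^m = delta^{⊗m}/m!; Chen's identity then
    /// gives new S^k = Σ_{j=0}^{k}  S^j ⊗ delta^{⊗(k-j)}/(k-j)!.
    fn extend(&mut self, delta: &[f64]) {
        let k_max = self.levels.len() - 1;
        // Tensor powers of the segment: seg[m] = delta^{⊗m}/m!
        let mut seg: Vec<Vec<f64>> = vec![vec![1.0]];
        for m in 1..=k_max {
            let prev = &seg[m - 1];
            let mut next = Vec::with_capacity(prev.len() * self.d);
            for &p in prev {
                for &x in delta {
                    next.push(p * x / m as f64);
                }
            }
            seg.push(next);
        }
        // Work top-down so lower levels remain "old" while still needed.
        for k in (1..=k_max).rev() {
            let mut out = vec![0.0; self.levels[k].len()];
            for j in 0..=k {
                let (a, b) = (&self.levels[j], &seg[k - j]);
                for (ia, &va) in a.iter().enumerate() {
                    for (ib, &vb) in b.iter().enumerate() {
                        out[ia * b.len() + ib] += va * vb;
                    }
                }
            }
            self.levels[k] = out;
        }
    }
}

fn main() {
    let mut whole = Sig::new(2, 3);
    whole.extend(&[1.0, 2.0]); // one straight segment

    let mut halves = Sig::new(2, 3);
    halves.extend(&[0.5, 1.0]); // same segment, split in two
    halves.extend(&[0.5, 1.0]);

    for (a, b) in whole.levels.iter().zip(&halves.levels) {
        for (x, y) in a.iter().zip(b) {
            assert!((x - y).abs() < 1e-12); // the split point is invisible
        }
    }
    println!("signatures agree across the split");
}
```

The assertion checks the reparameterisation claim in concrete form: splitting one segment into two collinear halves — the same path, sampled at a different rate — leaves every level unchanged.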

Truncation

A depth-$K$ signature has size:

$$|\mathbf{S}_{\le K}| = \sum_{k=0}^{K} d^k = \frac{d^{K+1} - 1}{d - 1}$$

For $d = 4$ and $K = 4$: 341 doubles per trace — cheap.
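The count is easy to sanity-check; `sig_len` below is just the geometric series above, named hypothetically for illustration:

```rust
// Entry count of a depth-K truncated signature over d channels:
// 1 + d + d^2 + ... + d^K  (a sketch, not the ftui-runtime API).
fn sig_len(d: u64, k: u32) -> u64 {
    (0..=k).map(|i| d.pow(i)).sum()
}

fn main() {
    assert_eq!(sig_len(4, 4), 341); // the d = 4, K = 4 case above
    assert_eq!(sig_len(10, 6), 1_111_111); // the pitfall case below: ~1.1M
    println!("{}", sig_len(4, 4));
}
```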

Distance between signatures

Compare two signatures by any standard vector metric on the concatenated components (weighted $\ell^2$ works well):

$$\text{dist}(X, Y) = \sum_{k=0}^K \beta^k \|S^k(X) - S^k(Y)\|_2$$

$\beta < 1$ down-weights high-order interactions; $\beta = 1$ treats them equally.
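A sketch of this weighted metric, assuming signatures are stored level by level; `sig_distance` is illustrative and distinct from whatever `Signature::distance` does internally:

```rust
// Weighted ℓ2 distance between two truncated signatures stored as one
// Vec<f64> per level; illustrative sketch, not the ftui-runtime API.
fn sig_distance(a: &[Vec<f64>], b: &[Vec<f64>], beta: f64) -> f64 {
    a.iter()
        .zip(b)
        .enumerate()
        .map(|(k, (la, lb))| {
            let sq: f64 = la.iter().zip(lb).map(|(x, y)| (x - y).powi(2)).sum();
            beta.powi(k as i32) * sq.sqrt() // β^k ‖S^k(X) − S^k(Y)‖₂
        })
        .sum()
}

fn main() {
    // Two d = 2 signatures truncated at K = 2, differing only at level 2.
    let a = vec![vec![1.0], vec![1.0, 0.0], vec![0.5, 0.0, 0.0, 0.5]];
    let b = vec![vec![1.0], vec![1.0, 0.0], vec![0.5, 1.0, -1.0, 0.5]];
    let d = sig_distance(&a, &b, 0.5);
    // Level-2 difference has norm √2, weighted by β² = 0.25.
    assert!((d - 0.25 * 2f64.sqrt()).abs() < 1e-12);
    println!("dist = {d}");
}
```

Note the weighting acts on whole levels, so two traces that agree on increments and signed areas but differ in third-order interleaving score a small, not zero, distance.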

Uses in FrankenTUI

  • Workload fingerprinting. Capture a signature at the end of a showcase scene; store it as the scene’s canonical fingerprint.
  • Trace comparison. Compute the signature distance between the current run and a stored baseline. Large distance = regression candidate.
  • Regression detection. Pair with an e-process (see e-processes) to get an anytime-valid alert on signature drift.

Rust interface

crates/ftui-runtime/src/rough_path.rs
```rust
use ftui_runtime::rough_path::{Signature, SignatureConfig};

let mut sig = Signature::new(SignatureConfig {
    channels: 4, // d
    depth: 4,    // K
});

// Online update — Chen's identity under the hood:
sig.extend(&observation); // &[f64; 4]

// At any time, get the current truncated signature:
let fingerprint = sig.vector();

// Compare to a baseline:
let dist = Signature::distance(&fingerprint, &baseline, 0.7);
if dist > threshold {
    flag_regression();
}
```

How to debug

Signature dumps and comparisons emit rough_path_signature lines:

```json
{"schema":"rough_path_signature","channels":4,"depth":4,
 "sig_size":341,"dist_to_baseline":0.72,"baseline":"showcase-v2.1",
 "hash":"b4c1e2...","triggered":false}
```

```bash
FTUI_EVIDENCE_SINK=/tmp/ftui.jsonl cargo run -p ftui-demo-showcase

# Regression candidates in the last run:
jq -c 'select(.schema=="rough_path_signature" and .triggered) | {baseline, dist_to_baseline}' /tmp/ftui.jsonl
```

Pitfalls

Signature size explodes with depth. At $d = 10$, $K = 6$ you store about 1.1M doubles. Keep $K \le 4$ unless you have a specific need for higher-order interactions, and use a depth-adaptive truncation if different channels have different effective depth.

Paths must be of bounded variation (or piecewise linear). Jump discontinuities — e.g., a subscription restart — break the iterated-integral interpretation unless you smooth first. The runtime wraps raw observations in a piecewise-linear interpolator before feeding them to the signature.

Cross-references

  • /testing/determinism-fixtures — where baseline signatures are captured and stored.
  • E-processes — for anytime-valid regression alerts on top of signature distance.
  • IVM — a different cross-cutting mechanism for incremental state summaries.

Where next