ftui-core — Overview
ftui-core is the crate that owns the boundary between a live terminal and
deterministic application logic. Everything that has to touch stdin,
environment variables, raw mode, multiplexer passthrough, or the OS signal
mask lives here — and nowhere else in the workspace.
If you are reading a terminal byte, detecting a capability, classifying a
gesture, or setting the alternate screen, you are in ftui-core. If you are
painting a cell, computing a diff, or laying out a widget tree, you are not.
That split — input on one side, rendering on the other — is the organizing
principle of FrankenTUI. The runtime (ftui-runtime) sits above both and
glues them into an Elm-style update loop.
This page orients you to the crate as a whole: what it exports, where each concept lives, and how the parts compose. Follow the links in the table to drill into specific subsystems.
Why a separate crate?
A TUI toolkit that tangles I/O with rendering ends up with every test
reaching for a real terminal, every panic leaving the user’s shell in raw
mode, and every new widget tempted to read its own bytes from stdin.
ftui-core exists to make those failure modes impossible by construction:
it owns raw-mode lifecycle via RAII, decodes bytes into a canonical
Event enum, classifies gestures, and detects capabilities — and then
hands the results to the runtime as plain data. Nothing downstream needs
to know whether a click came from X10, SGR, or a harness fixture.
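The RAII raw-mode lifecycle mentioned above can be sketched without a real terminal. This is a minimal, self-contained illustration of the pattern, not the crate's actual implementation: the `RawModeGuard` type and the `Rc<Cell<bool>>` stand-in for the terminal's raw-mode flag are invented for this example (the real crate drives crossterm instead).

```rust
use std::cell::Cell;
use std::rc::Rc;

/// Stand-in for the terminal's raw-mode flag. In the real crate this would
/// toggle actual terminal state via crossterm; here it is a plain flag so
/// the RAII shape is visible without opening a terminal.
struct RawModeGuard {
    raw: Rc<Cell<bool>>,
}

impl RawModeGuard {
    fn enable(raw: Rc<Cell<bool>>) -> Self {
        raw.set(true); // enter "raw mode"
        RawModeGuard { raw }
    }
}

impl Drop for RawModeGuard {
    fn drop(&mut self) {
        // Drop runs on normal scope exit *and* during unwinding, so a
        // panic never strands the user's shell in raw mode.
        self.raw.set(false);
    }
}

fn main() {
    let raw = Rc::new(Cell::new(false));
    {
        let _guard = RawModeGuard::enable(raw.clone());
        assert!(raw.get()); // raw mode active inside the scope
    }
    assert!(!raw.get()); // restored on Drop
}
```

The guard pattern is why "every panic leaving the user's shell in raw mode" is listed as a failure mode the crate rules out by construction.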
Mental model
```
┌────────────────────────────────────────────┐
│ Terminal (raw mode, alt-screen, mouse)     │ ← OS boundary
└──────────────────┬─────────────────────────┘
                   │ raw byte stream
                   ▼
            ┌─────────────┐
            │ InputParser │  state machine, DoS-bounded
            └──────┬──────┘  (CSI 256 B, OSC 102 KB, paste 1 MB)
                   │ [Event]
                   ▼
        ┌──────────────────┐
        │ GestureRecognizer│  raw → semantic
        └──────┬───────────┘
               │ [SemanticEvent]
               ▼
     ftui-runtime::Model::update()
```

TerminalSession wraps the whole pipeline with an RAII guard, and
TerminalCapabilities supplies the feature profile the runtime uses to
pick codepaths (synchronized output, scroll region, truecolor, …).
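The "DoS-bounded" note in the diagram can be made concrete with a toy decoder. The sketch below shows only the CSI branch of the idea: accumulate bytes until a final byte arrives, but refuse to grow past a fixed cap. The function name `decode_csi` and the `Decoded` enum are invented for this example; the real InputParser is a fuller state machine covering OSC, paste, and UTF-8 as well.

```rust
/// Cap mirroring the 256-byte CSI limit quoted in the diagram above.
const CSI_MAX: usize = 256;

#[derive(Debug, PartialEq)]
enum Decoded {
    Csi(Vec<u8>), // complete CSI body, e.g. b"A" for Up arrow
    Aborted,      // sequence exceeded CSI_MAX; bytes discarded
}

fn decode_csi(bytes: &[u8]) -> Option<Decoded> {
    // Expects input beginning with ESC '['.
    let body = bytes.strip_prefix(b"\x1b[")?;
    let mut acc = Vec::new();
    for &b in body {
        if acc.len() >= CSI_MAX {
            // The DoS bound: never buffer an unbounded sequence.
            return Some(Decoded::Aborted);
        }
        acc.push(b);
        // A CSI sequence terminates on a "final byte" in 0x40..=0x7E.
        if (0x40..=0x7e).contains(&b) {
            return Some(Decoded::Csi(acc));
        }
    }
    None // incomplete; wait for more bytes
}

fn main() {
    assert_eq!(decode_csi(b"\x1b[A"), Some(Decoded::Csi(b"A".to_vec())));
    let flood = [b"\x1b[".as_slice(), &[b'9'; 300]].concat();
    assert_eq!(decode_csi(&flood), Some(Decoded::Aborted));
}
```

Bounding each accumulator is what lets the parser accept arbitrary hostile byte streams without unbounded memory growth.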
Public surface at a glance
Every item in this table is a stable public export of ftui-core. Read the
linked pages for semantics and pitfalls.
| Area | Type | Page |
|---|---|---|
| Lifecycle | TerminalSession, SessionOptions | terminal-session |
| Input | Event, KeyEvent, MouseEvent, InputParser | events-and-input |
| Gestures | GestureRecognizer, GestureConfig, SemanticEvent | gestures |
| Capabilities | TerminalCapabilities, CapabilityLedger | capabilities |
| Screen modes | InlineStrategy, InlineRenderer | screen-modes |
The crate is compiled with #![forbid(unsafe_code)]. It depends on
crossterm for cross-platform raw-mode primitives (see
ADR-003 for the rationale) but does not re-export it.
Minimal tour
The following snippet exercises every layer above — acquire a session, feed a byte stream to the parser, and ask the recognizer for a semantic verdict. In real applications the runtime calls these on your behalf; showing them side by side clarifies ownership.
```rust
use std::time::Instant;

use ftui_core::event::Event;
use ftui_core::gesture::{GestureConfig, GestureRecognizer};
use ftui_core::input_parser::InputParser;
use ftui_core::terminal_session::{SessionOptions, TerminalSession};

fn main() -> std::io::Result<()> {
    // 1. Raw-mode + alt-screen + mouse + paste (RAII — restored on Drop).
    let _session = TerminalSession::new(SessionOptions {
        alternate_screen: true,
        mouse_capture: true,
        bracketed_paste: true,
        focus_events: true,
        ..Default::default()
    })?;

    // 2. Decode a byte chunk into Events.
    let mut parser = InputParser::new();
    let events: Vec<Event> = parser.parse(b"\x1b[A"); // Up arrow

    // 3. Lift raw events into semantic gestures.
    //    `process` returns Vec<SemanticEvent> and needs a monotonic `now`.
    let mut gr = GestureRecognizer::new(GestureConfig::default());
    for ev in &events {
        for semantic in gr.process(ev, Instant::now()) {
            // Feed each semantic event into Model::update()
            tracing::info!(?semantic, "gesture");
        }
    }
    Ok(())
}
```

Crate-boundary mental model
ftui-core is deliberately a narrow bridge: it does not know what a
widget is, what a frame looks like, or how the runtime stores state. The
only crate that may call into it directly is ftui-runtime; widgets and
style code consume the data it produces via the runtime context. This
asymmetry keeps tests cheap — ftui-harness injects Event values
without ever opening a PTY — and keeps the render pipeline stateless
with respect to input.
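This is what "injects Event values without ever opening a PTY" buys in practice. The sketch below uses simplified stand-ins (a two-variant `Event` and a `Counter` model, both invented here) to show the shape of an Elm-style update test driven by literal events rather than a terminal.

```rust
/// Simplified stand-in for ftui-core's Event; the real enum carries key,
/// mouse, paste, focus, and resize variants.
#[derive(Debug, Clone, PartialEq)]
enum Event {
    KeyUp,
    KeyDown,
}

/// Toy model in the Elm style: state plus a pure-ish update function.
struct Counter {
    value: i32,
}

impl Counter {
    fn update(&mut self, ev: &Event) {
        match ev {
            Event::KeyUp => self.value += 1,
            Event::KeyDown => self.value -= 1,
        }
    }
}

fn main() {
    let mut model = Counter { value: 0 };
    // Injected event stream: no terminal, no raw mode, fully deterministic.
    for ev in [Event::KeyUp, Event::KeyUp, Event::KeyDown] {
        model.update(&ev);
    }
    assert_eq!(model.value, 1);
}
```

Because the model only ever sees plain data, the same test runs identically in CI, under a harness, or against a live terminal.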
The module layout maps one-to-one onto those responsibilities:
- terminal_session.rs (RAII guard, SessionOptions)
- event.rs (Event, KeyEvent, MouseEvent)
- input_parser.rs (DoS-bounded byte decoder)
- gesture.rs (GestureRecognizer state machine)
- semantic_event.rs (Click, Drag, LongPress, Chord)
- terminal_capabilities.rs (feature profile)
- caps_probe.rs (Bayesian evidence ledger)
- inline_mode.rs (DECSTBM / overlay / hybrid)
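To give a feel for the raw-to-semantic lift that gesture.rs and semantic_event.rs perform, here is a deliberately tiny classifier: press plus release within a threshold is a Click, anything longer is a LongPress. The `Recognizer` type, its API, and the 500 ms threshold are all invented for this sketch; the real GestureRecognizer additionally handles drags, chords, and double-clicks.

```rust
use std::time::{Duration, Instant};

#[derive(Debug, PartialEq)]
enum Semantic {
    Click,
    LongPress,
}

struct Recognizer {
    long_press: Duration,
    pressed_at: Option<Instant>,
}

impl Recognizer {
    fn new(long_press: Duration) -> Self {
        Recognizer { long_press, pressed_at: None }
    }

    /// Record a button press at a caller-supplied monotonic timestamp.
    fn press(&mut self, now: Instant) {
        self.pressed_at = Some(now);
    }

    /// On release, classify the gesture by how long the button was held.
    /// Returns None if there was no matching press.
    fn release(&mut self, now: Instant) -> Option<Semantic> {
        let start = self.pressed_at.take()?;
        Some(if now.duration_since(start) >= self.long_press {
            Semantic::LongPress
        } else {
            Semantic::Click
        })
    }
}

fn main() {
    let mut gr = Recognizer::new(Duration::from_millis(500));
    let t0 = Instant::now();
    gr.press(t0);
    assert_eq!(gr.release(t0 + Duration::from_millis(50)), Some(Semantic::Click));
    gr.press(t0);
    assert_eq!(gr.release(t0 + Duration::from_millis(800)), Some(Semantic::LongPress));
}
```

Note that time enters as an explicit parameter rather than being read inside the recognizer; that is the same design choice that makes the real `process(ev, now)` call deterministic under test.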
Do not bypass ftui-core to read input directly. If a widget pulls
bytes off stdin, it will race the parser, steal keypresses, and make
determinism tests impossible. All input flows through
TerminalSession → parser → recognizer → runtime. No exceptions.
Cross-references
- Terminal session lifecycle — RAII guard details.
- Screen modes — inline vs. alt-screen trade-offs.
- Frame API — what the runtime hands to your widgets.
- Model trait — where events become updates.