A framework for UI that doesn't exist until it's needed. Function over form. Context over canvas.


In 1964, Marshall McLuhan wrote "The medium is the message."

His argument: the channel shapes the message more than the content does. Television isn't radio with pictures. The web isn't print with links. Each medium restructures thought.

He died in 1980. But his insight applies to territory he never mapped: UI as a function of context.

The punchline: the interfaces that actually work aren't designed. They're derived. Builders just haven't noticed yet.


The Core Thesis

UI is not an artifact. It's a rendering.

Screens don't scale. Fixed layouts work for brochures. They collapse under complexity, multiplicity, and the actual conditions of use.

If you're building interfaces that work across contexts, you're probably already building contextual systems.

You just call them "responsive."


The Tension

Most UI architectures live here:

FIXED ←————————————————————————→ FLUID

Designed screens    vs.    Derived views
Media queries       vs.    Context signals
Viewport width      vs.    Full environment
Static              vs.    Generated
Brittle             vs.    Adaptive

Left side: controllable, predictable, dead on arrival. Right side: emergent, contextual, alive.

The instinct is to design screens. The systems that survive derive interfaces.


The Paradigm Shift

Two models. Only one scales.

Traditional:
Data → Fixed UI → User
        ↓
    (hope it fits)

Contextual:
Data → Context → Generated UI → User
         ↓
    (always fits)

Function is what the system knows and can do. Data. APIs. Logic. Function is stable.

Frontend is how function appears. Presentation should be derived, not designed. Built on demand. Destroyed on exit.

Same revenue number:

Context            Rendering
Dashboard          Metric card with sparkline
Voice assistant    "Revenue is up twelve percent"
Notification       Badge: ↑12%
Command palette    revenue: $1.2M (+12%)
Chat thread        Inline bubble
Watch face         1.2M ↑

Same function. Six frontends. None pre-designed.


What Is Context?

Context is the sum of conditions under which UI is consumed.

Not one dimension. Many.

Embedding Context

Where does the UI physically live?

Context            UI Implication
Full application   Complete controls, navigation, depth
Chat message       Inline card, single action, conversational
Notification       Glanceable, dismissible, urgent
Command palette    One-line, keyboard-driven
Email digest       Static, scannable, no interaction
Ambient display    Large type, auto-refresh, no input

Interaction Modality

How is the user engaging?

Modality           Implication
Visual + pointer   Rich hover states, dense information
Visual + touch     Larger targets, swipe gestures
Voice-first        Spoken output, minimal visuals
Glanceable         No interaction expected, status only

Attention Level

How much focus does the user have?

Level         Implication
Deep work     Full detail, all options exposed
Monitoring    Exceptions only, auto-updating
Interrupted   Key insight, single action
Presenting    Simplified, narrative, no chrome

Temporal Context

When in their workflow?

When              Implication
Morning briefing  Summary of overnight changes
Real-time         Live updates, streaming
End of day        Wrap-up, comparisons, trends
Pre-meeting       Talking points, shareable snapshots

Width is one signal. These are the others.
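The four dimensions above fit in a single context object. A minimal TypeScript sketch — the union members mirror the tables, but the names are illustrative, not a standard:

```typescript
// One context object carrying all four signal dimensions.
// Real systems would extend or collapse these unions as needed.
type Embedding =
  | "application" | "chat" | "notification"
  | "palette" | "email" | "ambient";
type Modality = "pointer" | "touch" | "voice" | "glance";
type Attention = "deep" | "monitoring" | "interrupted" | "presenting";
type Temporal = "briefing" | "realtime" | "wrapup" | "premeeting";

interface RenderContext {
  embedding: Embedding;
  modality: Modality;
  attention: Attention;
  temporal: Temporal;
}

// A phone on a car mount: same device, distinct context.
const carMount: RenderContext = {
  embedding: "ambient",
  modality: "glance",
  attention: "interrupted",
  temporal: "realtime",
};
```

Viewport width, if tracked at all, becomes just one more field on this object rather than the whole story.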


Mechanisms

Nature solved adaptive interfaces. We're catching up.

UI-Agnostic Data

Your backend shouldn't return "dashboard data." It should return semantic data that any frontend can interpret:

{
  "metric": "revenue",
  "value": 1247500,
  "change": { "percent": 12.4, "direction": "up" },
  "period": "mtd",
  "benchmark": "target"
}

The frontend decides representation. The backend doesn't know or care.
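The JSON above can be typed once and interpreted many ways. A sketch of two frontends reading the same semantic payload — the function names are hypothetical:

```typescript
// Semantic shape matching the JSON above; no presentation fields.
interface MetricData {
  metric: string;
  value: number;
  change: { percent: number; direction: "up" | "down" };
  period: string;
  benchmark: string;
}

const revenue: MetricData = {
  metric: "revenue",
  value: 1247500,
  change: { percent: 12.4, direction: "up" },
  period: "mtd",
  benchmark: "target",
};

// Two frontends interpret the same data; the backend knows neither.
const asBadge = (m: MetricData) =>
  `${m.change.direction === "up" ? "↑" : "↓"}${m.change.percent}%`;

const asSpeech = (m: MetricData) =>
  `${m.metric} is ${m.change.direction} ${Math.round(m.change.percent)} percent`;
```

Note that nothing in `MetricData` hints at badges or speech — representation lives entirely on the consuming side.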

Context-Aware Renderers

Instead of building a component, build a renderer that accepts context:

<Metric 
  data={revenue} 
  context={{ 
    embedding: "chat", 
    attention: "interrupted", 
    modality: "touch" 
  }} 
/>

The component switches representations based on signals. No conditionals in consuming code.
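Stripped of any particular framework, such a renderer is one function that branches on signals. A sketch with invented context values — the thresholds and strings are illustrative:

```typescript
type Embedding = "dashboard" | "chat" | "notification" | "watch";
interface Ctx { embedding: Embedding; attention: "deep" | "interrupted" }
interface Metric { name: string; value: number; deltaPct: number }

// One renderer, many representations. Consuming code passes
// context signals and never branches on them itself.
function renderMetric(m: Metric, ctx: Ctx): string {
  const arrow = m.deltaPct > 0 ? "↑" : "↓";
  // Interrupted attention or a notification slot: delta only.
  if (ctx.attention === "interrupted" || ctx.embedding === "notification")
    return `${m.name} ${arrow}${Math.abs(m.deltaPct)}%`;
  // Watch face: number + arrow, nothing else.
  if (ctx.embedding === "watch")
    return `${(m.value / 1e6).toFixed(1)}M ${arrow}`;
  // Chat: one inline line.
  if (ctx.embedding === "chat")
    return `${m.name}: $${(m.value / 1e6).toFixed(1)}M (+${m.deltaPct}%)`;
  // Dashboard floor: full detail.
  return `${m.name} | $${m.value} | ${m.deltaPct}% vs target`;
}
```

The point is where the conditionals live: inside the renderer, behind the context object, not scattered through every call site.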

Transformation Rules

A design system is no longer a component library. It's a set of rules:

revenue + dashboard     → trend chart
revenue + notification  → delta + emoji
revenue + voice         → spoken comparison
revenue + watch         → number + arrow

Rules compose. Context resolves.
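The rule table above can be made literal: a lookup the context resolver consults. A sketch — adding a context means adding an entry, not redesigning a screen (emoji and phrasing are illustrative):

```typescript
type Embedding = "dashboard" | "notification" | "voice" | "watch";
interface Metric { name: string; value: number; deltaPct: number }
type Rule = (m: Metric) => string;

// Transformation rules as data, not as scattered conditionals.
const rules: Record<Embedding, Rule> = {
  dashboard:    m => `[trend chart: ${m.name}]`,
  notification: m => `${m.deltaPct > 0 ? "📈" : "📉"} ${m.deltaPct}%`,
  voice:        m => `${m.name} is ${m.deltaPct > 0 ? "up" : "down"} ${m.deltaPct} percent`,
  watch:        m => `${(m.value / 1e6).toFixed(1)}M ${m.deltaPct > 0 ? "↑" : "↓"}`,
};

// Context resolves: pick the rule, apply the data.
const resolve = (m: Metric, e: Embedding) => rules[e](m);
```

Because rules are plain functions keyed by context, they compose and can be overridden per product without touching the data layer.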

AI as Rendering Engine

Given semantic data, context signals, and transformation rules—AI generates appropriate UI on the fly.

Not "AI-generated design" in the slop sense. AI as a compiler. Data and context in. Interface out.
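The compiler framing can be sketched as a pipeline. The model call is stubbed here as an injected function — a real system would substitute an actual model client and validate the returned spec against the design system; all names are hypothetical:

```typescript
interface CompileInput {
  data: unknown;                    // semantic payload
  context: Record<string, string>;  // detected signals
  rules: string[];                  // design-system constraints, as text
}

// AI as compiler: data and context in, interface out.
function compileUI(
  input: CompileInput,
  model: (prompt: string) => string, // stub for a real model client
): string {
  const prompt = [
    "Render this data for this context. Obey the rules.",
    `DATA: ${JSON.stringify(input.data)}`,
    `CONTEXT: ${JSON.stringify(input.context)}`,
    `RULES:\n${input.rules.join("\n")}`,
  ].join("\n");
  return model(prompt);
}
```

The design system survives as the `rules` constraint set — that is what keeps generated output inside the brand rather than in slop territory.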


Framework Principles

Eight principles. No fluff.

1. Function Is Stable, Frontend Is Fluid

Separate what you know from how you show it. Data doesn't change. Presentation always does.

2. Context Is Multidimensional

Width is one signal. Embedding, modality, attention, and time are equally important. Designing for "mobile" is designing for one dimension of a four-dimensional space.

3. UI Is Derived, Not Designed

Define transformation rules, not fixed layouts. The layout is an output, not an input.

4. Same Data, Appropriate Density

A notification and a dashboard show the same truth at different densities. Density is a context variable.

5. Design for Context, Not Device

A phone in your hand and a phone on a car mount are different contexts, same device. Device is a proxy. Context is the territory.

6. Graceful Degradation Across Contexts

If voice fails, fall back to visual. If rich interaction isn't available, provide static output. Every context has a floor.
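A fallback chain makes the floor explicit: try the richest renderer first, degrade until one succeeds. A minimal sketch, assuming each renderer returns `null` when its context is unavailable:

```typescript
type Renderer = (value: string) => string | null;

// Walk the chain from richest to floor; the floor never returns null.
function degrade(value: string, chain: Renderer[]): string {
  for (const render of chain) {
    const out = render(value);
    if (out !== null) return out;
  }
  throw new Error("no floor renderer provided");
}

// Example chain: voice → rich visual → static text (the floor).
const voiceAvailable = false;
const richAvailable = false;
const chain: Renderer[] = [
  v => (voiceAvailable ? `speak: ${v}` : null),
  v => (richAvailable ? `card: ${v}` : null),
  v => `static: ${v}`, // every context has a floor
];
```

Ordering the chain is a design decision; guaranteeing the last entry always renders is the contract.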

7. Screens Are Dead

Users don't want screens. They want answers, status, actions, confirmations. These can be delivered anywhere, in any form, through any modality.

8. The Screen Is an Implementation Detail

"The settings screen" is an artifact of how we build, not what users need. Kill your screens.


The Test

One question:

Would this UI make sense if you didn't know where it was being rendered?

If the UI assumes a specific viewport → not contextual.
If the data layer knows about cards or charts → not decoupled.
If changing context requires a redesign → not derived.

If it passes → contextual system.


Failure Modes

Nothing's free. Know the risks.

Failure               What happens                          Fix
Context collapse      Treated all contexts the same         Explicit context detection
Over-derivation       Everything generated, nothing stable  Anchor points, core components
Signal noise          Too many context dimensions           Prioritize, collapse related signals
Transformation bloat  Rules multiply without bound          Composable primitives
Uncanny output        AI-generated UI feels wrong           Stronger design system constraints
Data coupling         Semantic data leaks presentation      Strict schema validation
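The last row — data coupling — is the easiest to guard mechanically. A sketch of a leak check that flags presentation vocabulary in payload keys; the word list is illustrative and would be tuned per codebase:

```typescript
// Reject semantic payloads whose keys leak presentation concerns.
// If your API returns cardTitle, this is the check that catches it.
const PRESENTATION_WORDS = ["card", "chart", "color", "icon", "widget", "layout"];

function findPresentationLeaks(payload: Record<string, unknown>): string[] {
  return Object.keys(payload).filter(key =>
    PRESENTATION_WORDS.some(w => key.toLowerCase().includes(w)),
  );
}
```

Run as a schema test in CI, this turns "semantic data only" from a convention into an enforced boundary.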

Prototyping This

Minimum viable experiment. No infrastructure.

  1. Audit your data layer. Is it UI-agnostic? If your API returns cardTitle, you've already failed.
  2. Identify real contexts. Not "mobile" and "desktop." Where do users actually engage? Chat? Email? Voice? Ambient?
  3. Sketch one data type in five contexts. Same data. Five renderings. What changes? What stays?
  4. Build one context-aware component. Accepts context object. Renders differently based on signals.
  5. Expand from there.

That's your day-one prototype.

More structured experiments:

  • Context matrix: Map your core data types against all known contexts. Find the gaps.
  • Transformation audit: Document your implicit rules. Make them explicit.
  • Graceful degradation test: Start with richest context. Degrade one dimension at a time. Where does it break?

What This Enables

  • Truly omnichannel products — not ports, but native experiences everywhere
  • AI-native interfaces — AI determines presentation, not just content
  • Progressive disclosure by context — complexity when needed, simplicity when not
  • Faster iteration — change the data once, every context updates
  • Accessibility by default — voice, visual, and glanceable are all first-class

Why This Matters

The interfaces that scale are contextual in structure.

Not because of ideology. Because of physics.

Fixed layouts are a bottleneck. Single-context design fails in multi-context reality. Screens create information loss at every boundary.

McLuhan spent his life arguing that media shape thought. That the channel restructures the content.

Turns out the same is true for UI.

The interface isn't the product. The function is the product. The interface is just rendering.


Summary

Your UI is contextual. That shouldn't surprise you.

Build systems where:

  • Data is semantic and UI-agnostic
  • Context is detected, not assumed
  • Presentation is derived, not designed
  • Transformation rules replace fixed layouts
  • Screens are outputs, not inputs

Build systems that would work even if you didn't know where they'd be rendered.

Build contextual systems.


"The medium is the message."

— Marshall McLuhan (1911–1980)