What are Human Standards?

Human Standards — also known as Human Factors and Ergonomics (HF/E) — is the discipline focused on designing systems that fit human capabilities and limitations: physical, cognitive, and organizational.

The term “Human Standards” emphasizes our goal: clear, actionable, machine-readable principles for building technology that truly serves people — and that AI agents can use to validate and improve interface designs.


Technology is built by people who deeply understand their systems but often overlook the humans who use them. The result:

  • Interfaces that work for experts but confuse newcomers
  • Features that assume perfect attention, memory, and motor control
  • Designs that exclude people with disabilities
  • Error-prone systems that blame users for predictable mistakes

Human Factors research has generated decades of evidence about human capabilities and limitations. But this knowledge is scattered across academic papers, expensive consultancies, and tribal knowledge within organizations.

Human Standards makes this knowledge:

Property | Traditional HF/E | Human Standards
Accessible | Academic papers, expensive training | Free, searchable documentation
Actionable | Principles and theories | Specific implementation guidance
Machine-readable | Prose descriptions | YAML validation rules, JSON tokens
Current | Published research (months/years lag) | Continuously updated
Integrated | Separate from development | MCP server for AI-assisted design

Human Factors and Ergonomics (HF/E) is the scientific discipline concerned with understanding interactions among humans and other elements of a system, and the profession that applies theory, principles, data, and methods to design in order to optimize human well-being and overall system performance.

— International Ergonomics Association

In simpler terms: designing things so people can use them effectively, safely, and comfortably.


Safety

Design systems that:

  • Prevent predictable errors before they occur
  • Tolerate mistakes without catastrophic consequences
  • Enable easy recovery when errors happen

Efficiency

Enable people to:

  • Complete tasks faster and more accurately
  • Learn systems with minimal training
  • Maintain performance over extended use

Inclusion

Ensure systems work for people with:

  • Visual differences (blindness, low vision, color blindness)
  • Motor differences (limited mobility, tremors, missing limbs)
  • Cognitive differences (attention, memory, processing speed)
  • Hearing differences (deafness, hearing loss)
  • Temporary impairments (injury, environment, situational)

Wellbeing

Create experiences that:

  • Feel natural and intuitive
  • Build confidence through appropriate feedback
  • Respect users with honest, transparent design
  • Support wellbeing rather than exploit vulnerabilities

Physical Ergonomics

How human bodies interact with systems.

Factor | Design Consideration
Anthropometrics | Body measurements for sizing and reach
Biomechanics | Movement patterns, force requirements
Posture | Sustainable body positions
Environment | Lighting, noise, temperature
Touch | Target sizes, spacing, haptic feedback

Cognitive Ergonomics

How human minds process information and make decisions.

Factor | Design Consideration
Perception | How we see, hear, and feel interfaces
Attention | What we notice, what we miss
Memory | What we can hold in mind (7±2 chunks)
Learning | How we build mental models
Decision-making | How we choose under uncertainty
Error | Slips, mistakes, and their prevention

Organizational Ergonomics

How human systems and workflows operate.

Factor | Design Consideration
Communication | Information flow between people
Collaboration | Multi-user coordination
Teamwork | Roles, responsibilities, handoffs
Training | Skill development and maintenance
Culture | Norms, expectations, safety climate

This documentation currently focuses on digital interfaces:

Domain | Coverage
Web applications | Full guidance, code patterns, validation
Mobile apps | Touch targets, gestures, responsive design
Desktop software | Keyboard shortcuts, window management
Digital accessibility | WCAG 2.2 compliance, ARIA patterns
Cognitive principles | For screens and digital interaction
Input ergonomics | Keyboard, mouse, touch, stylus

On the roadmap but not yet covered:

Domain | Planned Coverage
Voice interfaces | Conversational design, speech recognition
VR/AR | Spatial computing, immersive UX
AI systems | Human-AI interaction, explainability
IoT/Ambient | Smart environments, multimodal
Wearables | Body-worn devices, health tech
Automotive HMI | In-vehicle interfaces, driving context
Physical products | Industrial design ergonomics

See Scope & Roadmap for expansion plans.


Foundational concepts from human factors research, expressed as clear guidelines:

Miller’s Law: People can hold approximately 7±2 chunks of information in working memory. Keep menus, options, and steps to 7 or fewer items where possible.
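
As a rough illustration, a guideline like this can be turned into an automated check. The sketch below counts the items in a menu and flags anything over seven; the selector and the `checkMenuLength` helper are hypothetical conventions for this example, not part of Human Standards.

// Hypothetical lint-style check for Miller's Law (TypeScript).
const MAX_MENU_ITEMS = 7;

function checkMenuLength(menu: Element): string | null {
  // Assumes menu items are marked up as role="menuitem" or list items.
  const count = menu.querySelectorAll('[role="menuitem"], li').length;
  return count > MAX_MENU_ITEMS
    ? `Menu has ${count} items; consider grouping into ${MAX_MENU_ITEMS} or fewer.`
    : null;
}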

Concrete, measurable requirements for implementation:

Specification | Value | Source
Minimum touch target | 44×44pt (iOS), 48×48dp (Android) | Platform guidelines
Maximum response time | 100ms for a response to feel instantaneous | Nielsen research
Minimum contrast ratio | 4.5:1 (AA), 7:1 (AAA) | WCAG 2.2
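
To make the contrast specification concrete, here is a sketch of the WCAG 2.x relative-luminance and contrast-ratio calculation in TypeScript. The function names are illustrative, but the arithmetic follows the formula published with WCAG.

// WCAG 2.x contrast ratio between two sRGB colors (channels 0–255).
function channelToLinear(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

function relativeLuminance([r, g, b]: [number, number, number]): number {
  return (
    0.2126 * channelToLinear(r) +
    0.7152 * channelToLinear(g) +
    0.0722 * channelToLinear(b)
  );
}

function contrastRatio(a: [number, number, number], b: [number, number, number]): number {
  const [hi, lo] = [relativeLuminance(a), relativeLuminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// Black text on a white background: 21:1, passing both AA (4.5:1) and AAA (7:1).
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // "21.0"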

Machine-readable rules that AI agents and linters can apply:

rules:
  - id: touch-target-minimum
    severity: error
    check: "Interactive elements are at least 44×44px"
    wcag: "2.5.5 AAA"
  - id: contrast-ratio-text
    severity: error
    check: "Text contrast ratio is at least 4.5:1"
    wcag: "1.4.3 AA"

Implementation examples showing how to apply principles:

/* Touch target sizing */
.button {
  min-width: 44px;
  min-height: 44px;
  padding: 12px 24px;
}

/* Safe spacing between targets */
.button + .button {
  margin-left: 8px; /* Prevent accidental activation */
}

Pseudo-code showing when and how to apply guidelines:

FUNCTION evaluateTouchTarget(element):
    IF element.width < 44 OR element.height < 44:
        RETURN error("Touch target too small")
    IF nearbyTargets(element).any(distance < 8):
        RETURN warning("Targets too close together")
    RETURN pass
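
The same decision logic as a TypeScript sketch. The element's size and the distances to neighboring targets are passed in directly, since how nearby targets are found is not specified above.

type Verdict =
  | { level: "pass" }
  | { level: "error" | "warning"; message: string };

// Sketch of the pseudo-code above; sizes and distances are in CSS pixels.
function evaluateTouchTarget(
  size: { width: number; height: number },
  nearbyDistances: number[],
): Verdict {
  if (size.width < 44 || size.height < 44) {
    return { level: "error", message: "Touch target too small" };
  }
  if (nearbyDistances.some((d) => d < 8)) {
    return { level: "warning", message: "Targets too close together" };
  }
  return { level: "pass" };
}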

For human readers, this documentation provides:

  • Explanations of why design decisions matter
  • Examples showing good and bad practices
  • Research citations for deeper learning
  • Checklists for practical implementation

For AI agents, this documentation provides:

  • Structured data (tables, YAML) for parsing
  • Validation rules for automated checking
  • Decision logic for design recommendations
  • Specifications for code generation
  • MCP server integration for real-time guidance

Human Standards provides the evidence base for UX design:

  • UX designers create specific designs
  • Human Standards ensures those designs fit human capabilities

Accessibility is a subset of Human Standards:

  • All accessibility is human factors
  • Not all human factors is accessibility

Usability is one measure of success:

  • Human Standards addresses effectiveness, efficiency, and satisfaction
  • Plus safety, wellbeing, and inclusion

Ergonomics and Human Factors are largely synonymous:

  • “Ergonomics” emphasizes physical factors (common in Europe)
  • “Human Factors” emphasizes cognitive factors (common in US)
  • Both terms describe the same discipline

Human Standards produces several types of output:

Output Type | Description | Example
Principles | Foundational concepts | “Recognition over recall”
Guidelines | Specific recommendations | “Use 44×44px minimum targets”
Patterns | Reusable solutions | Modal dialog ARIA pattern
Tokens | Design values | Color, spacing, typography
Validation | Automated checks | Contrast ratio linter
Evaluation | Assessment methods | Usability testing protocols
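
For example, design tokens can be published as structured values that tools read directly. The token names and most values below are purely illustrative, apart from the 44px target size, 8px spacing, and contrast ratios used elsewhere on this page.

// Hypothetical design tokens; names are illustrative, not a published set.
const tokens = {
  touchTarget: { minSize: 44, minSpacing: 8 },      // px
  contrast: { textMinimum: 4.5, textEnhanced: 7 },  // ratio
  type: { bodySize: 16, lineHeight: 1.5 },          // px, unitless
} as const;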
