We are no longer building products for hands and eyes — we are building for minds in motion. The next interface isn’t seen. It’s sensed. CXD is the open-source discipline of designing for perception, attention, memory, and restoration.
We used to design what people do. Now we design how they feel, think, and recover. The next interface isn’t seen. It’s sensed. Behavior was visible. The mind is where the real work happens. We are no longer in the business of shaping behavior. We are in the business of stewarding cognition. In the age of AI, experience design becomes the architecture of consciousness. Build for cognition, not consumption. The mind is the new interface. Cognitive wellness is the next platform layer. We’re entering the era of Mindware, not software.
10
Core principles guiding human-obsessed AGI design
55M+
People living with dementia worldwide today
3B
People affected by neurological conditions globally
18%
Increase in neurological burden since 1990
What We Believe
The future of AI products isn’t intelligence — it’s cognitive alignment.
Coined by Joanna Peña-Bickley at IBM Watson in 2014, CXD is the discipline of designing for perception, attention, memory, and restoration — not just interaction. A new category: Neuroadaptive Product Design.
The Foundation
10 Principles of Principled Ingenuity
Every design decision flows from these universal tenets — a compass for builders and designers shaping AI-powered human experiences.
01
Human Effects
Strives to enhance human ability, not replace it. Human agency is paramount — balancing AI and people to equitably complement human skills.
02
Transparent Truth
Plainly explains how a system reaches conclusions — through data indicators, confidence scores, corpus sourcing, and explanation interfaces.
03
Breaking Bias
Begins with diverse hiring. Counteracts human, data, and algorithmic bias with oversight processes that are clear and transparent.
04
Privacy Pledged
Builds processes that protect data. All data collected observes privacy laws, is limited by design, and deleted upon human request.
05
Systematic Sanctuary
Protects humans with safety practices. AI inventions are tested in constrained environments and properly monitored after deployment.
06
Algorithmic Accord
Brings about products that produce enduring relationships — never creating unhealthy emotional dependencies between humans and machines.
07
Empathetic Analytics
Goes beyond accuracy targets. Develops cost metrics aligned to domain-specific applications that understand true model performance.
08
Inclusive Design
Empowers everyone. Engages all people respectfully, equally and with dignity — never virtue signaling or promoting empty promises.
09
Radical Simplicity
Exemplary cognitive experience design removes complexity from life. The best interface is no interface — AI reduces friction, not adds it.
10
Wise Works
Healthy CXD accelerates new ideas that work for everyone, everywhere, every day — wisdom applied at the scale of human civilization.
The Crisis
The Triple Brain Health Epidemic
Three converging crises share a neurobiological foundation — and CXD offers the most coherent counterforce available.
Crisis 1
Mental Health Deterioration
Digital platforms systematically fragment attention via algorithmic manipulation. The COVID-19 pandemic accelerated digital dependency, while AI chatbot use creates an “isolation paradox” — initially reducing loneliness, then progressive social withdrawal.
Crisis 2
Accelerated Cognitive Decline
“Digital dementia” captures how excessive screen exposure increases long-term cognitive decline risk. Alzheimer’s prevalence could increase 4–6× between 2060 and 2100, reflecting the first generation to experience extensive digital exposure during neuroplasticity-critical periods.
Crisis 3
Digital Overload
The attention economy creates a “digital lobotomy” — eroding cognitive autonomy and deep thinking. The WHO calls it an “infodemic”: the human condition of cognitive overload, where information supply exceeds the brain’s processing capacity.
The role of AI should not be to find the needle in the haystack — it should be to show you how much hay it has cleared so you can better see the needle yourself.
— Joanna Peña-Bickley · Mother of Cognitive Experience Design
Living Book
A corpus that evolves with the field.
Open-source chapters at the intersection of neuroscience, AI product design, and cognitive wellness — built for practitioners, researchers, and curious minds.
CXD concepts, flashcards, and glossary — a living classroom for practitioners building human-obsessed AI.
What is Cognitive Experience Design?
The practice of using artificial intelligence technologies to reduce the human mental effort and time required to complete a task. It employs knowledge of neuroscience, human perception, mental processing, modeling, and memory.
Tap to reveal
1 / 6
Cognitive Experience Design — coined by Joanna Peña-Bickley at IBM Watson in 2014. The practice of designing AI-powered systems that reduce human mental effort. The abbreviation CXD or #CognitiveXD became widely adopted after her 2015 C2 Montreal appearance.
The total amount of mental effort being used in working memory. CXD aims to reduce cognitive load so users have more mental energy for high-level problem-solving. Measured through task completion time, error rates, and self-reported mental effort.
Personal, internal representations of external reality that people use to interact with the world. Constructed based on unique life experiences and perceptions. CXD designs systems that align with — not against — a user’s existing mental models.
Designing systems that predict a user’s next move or need before they fully articulate it. A core AI pattern in CXD — treating technology as an “extended mind” that stores memory, performs reasoning, and filters information on the user’s behalf.
Coined by Joanna Peña-Bickley and Vibes AI: the convergence of (1) mental health deterioration, (2) accelerated cognitive decline, and (3) digital overload. These share a neurobiological foundation — disruption in one domain inevitably affects the others.
The scientific discipline that applies neuroscience to the design of tools, systems, and environments. One of the four practice areas of CXD — alongside Human-Centered Design, User-Centered Design, and Cognitive Ergonomics.
CXD in Action · Vibes AI
vibes ai
The first brain fitness platform that proves it’s working. Vibes AI is CXD made manifest — using voice biomarkers to measure cognitive state, then delivering hyper-personalized therapeutic audio that makes you measurably sharper. Brain and mental health is a birthright, not a privilege.
A closed-loop, agentic system that replaces wellness churn with measurable progress. 30 seconds of voice yields a daily Brain Readiness Score, then delivers restorative audio therapy calibrated to your exact cognitive state — with before and after proof included.
Open-source research at the intersection of voice biomarkers, neuro wellness, cognitive design, and AI — written for practitioners and builders who refuse to separate science from craft.
A community of practitioners, researchers, and builders committed to human-obsessed AI. Open to designers, engineers, neuroscientists, and anyone who believes the next great interface isn’t seen — it’s sensed.
CognitiveExperience.design is an international organization fighting for equitable access to digital literacy — and a world where everyone can augment and access intelligence equitably.
Our Mission
Design For Thinking & Doing.
Cognitive Experience Design is the practice of using artificial intelligence to invent, reinvent, and design products, services, systems, organizations, and the way companies or governments work.
CXD unites four disparate design practices: the artful intention and intuition of Human-Centered and User-Centered Design, and the objective scientific approach of Cognitive and Neuro-Ergonomics. Simply put, it moves business from the management fad of design thinking to the measurable magic of Design For Thinking & Doing.
The CognitiveExperience.design foundation was established in 2014 to advance the use of AI for public good. This site is a living collection of resources at the intersection of design, data, media, machine learning, and AI — striving to democratize learning and intelligence for all humans.
The Four Pillars
What CXD Unites
01
Human-Centered Design
Artful intention and intuition — empathy at the center
02
User-Centered Design
Behavioral rigor and task-based validation
03
Cognitive Ergonomics
Applied cognitive science and mental load reduction
04
Neuro-Ergonomics
Neurological and perceptual science grounding
Founder
Joanna Peña-Bickley
Mother of Cognitive Experience Design
Amazon · Alexa
IBM Watson
Uber
Vibes AI · CEO
AI Design Corps · Fellow
The Origin Story.
In 2014, business, design, and engineering leaders were struggling to explain how design needed to evolve for a new AI computing era. While designing software powered by IBM’s Watson, Joanna coined the phrase Cognitive Experience Design to name an emerging practice: creating machines that could learn, reason, and interact with humans.
She used “cognitive” for two reasons: smart devices enable humans to offload repetitive tasks to combat cognitive overload; and AI technologies like ML, computer vision, and NLP are built to mimic how the human brain works.
In 2015, the emcee of C2 Montreal introduced her as “the mother of cognitive experience design” — and it stuck forever after.
2014
Coined “Cognitive Experience Design” at IBM Watson — designing AI-powered software for professionals.
2015
Named “Mother of Cognitive Experience Design” at C2 Montreal. CognitiveExperience.design launched as a living corpus.
2022
Sr. Director, Product Design at Uber — led design for UberEats across Consumer, Membership, Merchant, and Advertising.
2023
Co-founded Vibes AI — the first brain fitness platform that proves it’s working, using voice biomarkers and therapeutic sound.
2026
SXSW keynote on the shift from UX to AX. CXD enters its second decade as a global, open-source discipline.
The Scale of the Problem
Why the world needs CXD.
Today, 59% of the global population lives in a connected world inundated with notifications, reminders, articles, posts, and content. The WHO calls it an infodemic.
3.5B
Searches made every single day — a volume growing year over year.
294B
Emails sent daily — a volume no human mind was ever designed to process.
This is far too much information for any human to simultaneously reason about, understand, and react to. The result is cognitive overload. CXD and cognitive ergonomics are the cure for machine-induced human cognitive overload.
Everything you need to understand Cognitive Experience Design — what it is, why it exists, who practices it, and how to put it to work.
Foundations
Cognitive Experience Design, #CognitiveXD or CXD, is the practice of using artificial intelligence technologies to reduce the human mental effort and time required to complete a task. It employs knowledge of human perception, mental processing, modeling, and memory.
More broadly, it is the practice of using AI to invent, reinvent, and design products, services, systems, organizations, and the way companies or governments work. CXD unites four disparate design practices: Human-Centered Design, User-Centered Design, Cognitive Ergonomics, and Neuro-Ergonomics.
In 2014, business, design, and engineering leaders were struggling to explain how design needed to evolve for a new AI computing era. While designing software powered by IBM’s Watson, Joanna Peña-Bickley coined the phrase to give life to an emerging practice creating machines that could learn, reason, and interact with humans.
She used “cognitive” for two reasons: smart devices enable humans to offload repetitive tasks to combat cognitive overload; and AI technologies like machine learning, computer vision, and NLP are built to mimic the human brain. In 2015, the emcee of C2 Montreal introduced her as “the mother of cognitive experience design” — and it stuck.
Today, 59% of the global population lives in a connected world inundated with notifications, reminders, articles, posts, and content. The WHO calls this an infodemic — when information supply exceeds the brain’s processing capacity.
In 2020 alone: 3.5 billion searches a day, 294 billion emails, 230 million tweets, 75+ petabytes of user-generated data daily. This is far too much for any human to simultaneously reason about, understand, and react to. CXD and cognitive ergonomics are the cure for machine-induced human cognitive overload.
Ultimately, CXD is for humans. More specifically, it’s a practice for any designer who strives to design human interfaces with machines. Any designer can be a practitioner — expressly, Conversation, Sound & Voice UI Designers, Brain Interface Designers, and IoT product designers.
Design practitioners can use CognitiveXD to bestow superhuman powers on their users, customers, employees, and patients for a task in time and space. Anyone willing to incorporate AI technologies into their crafting of solutions to human problems is a practitioner.
Practice & Application
Every day. CXD should be used to solve human problems at the intersection of space and time. It is used to understand the whole human experience, mental model, and journey, and to improve it in a specific time and space with AI technologies like NLG, Speech Recognition, Virtual Assistants, Machine Learning, AI-optimized Hardware, Decision Management, Deep Learning, Biometrics, Robotic Process Automation, Text Analytics, and NLP.
Step 1: Seek to understand the whole human and their problems. Define the problem with crisp terms.
Step 2: Write a narrative that acts as a North Star — one that gives or augments your customer with superpowers and abilities (e.g. The Force, Photographic Reflexes, Dimensional Travel).
Step 3: Define the experience in the form of Frequently Asked Questions. Seek to enhance the user experience throughout the day with superpowers that augment the 5 basic senses or give humans a sixth sense.
The Superpower Playing Cards help business leaders connect ideas to human interface design and AI capabilities, writing bold business and customer goals that translate into user stories.
Natural Language Understanding (NLU), Natural Language Processing (NLP), Machine Learning (ML), Decision Management, Deep Learning, Biometrics, and Robotic Process Automation.
In more recent iterations: Generative AI, Large Language Models, Voice Biomarkers, Neural Interfaces, Agentic AI systems, and Neuroadaptive computing — systems that adjust dynamically to a user’s real-time cognitive state.
Amazon Alexa — as Head of Research and Design of Alexa Devices, CXD informed how millions of people interact with ambient intelligence daily.
IBM Watson — diagnosis tools for healthcare professionals with confidence scores, transparent data sourcing, and trust-building interaction patterns. Also Ford Service Tech, John Deere Agent, and GM OnStar Go Mobility Assistant.
Uber — redesigning UberEats to improve customer satisfaction by 15% and reduce churn by 7%.
Vibes AI — the most direct application yet: a platform that turns voice into a daily brain health report and delivers therapeutic audio with measurable cognitive improvement.
Earth Speaks — CXD applied to reimagining human behavior on the planet.
Trust, AI & Mental Models
The user experience of an AI begins with the belief that a human will have the final say. The role of AI should NOT be to find the needle in the haystack — it should be to show the user how much hay it has cleared so they can better see the needle themselves.
When designing diagnosis tools for healthcare professionals, the interface must: (1) telegraph the amount of data it’s sorting through, (2) demonstrate confidence levels for surfaced options, (3) create mechanisms for users to see the inputs that led to conclusions. Each interaction builds trust.
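Those three requirements translate naturally into a data shape. A minimal Python sketch; the type and field names here are hypothetical illustrations, not IBM Watson's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class SurfacedOption:
    """One trust-building suggestion. Fields are hypothetical, not Watson's real schema."""
    label: str
    confidence: float                              # (2) confidence level for this option
    documents_scanned: int                         # (1) how much data was sorted through
    evidence: list = field(default_factory=list)   # (3) inputs that led to the conclusion

option = SurfacedOption("differential A", confidence=0.72,
                        documents_scanned=14_000,
                        evidence=["imaging report", "lab panel", "clinical note"])
print(f"{option.label}: {option.confidence:.0%} across {option.documents_scanned:,} documents")
# → differential A: 72% across 14,000 documents
```

The point of the shape is that every suggestion arrives with its provenance attached, so trust accrues interaction by interaction.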
AI-driven systems are based on probability and uncertainty — this is why explanation is key. Once a user has a clear mental model of the system’s capabilities and limits, they can understand how and when to trust it.
Clarity, simplicity, and trust are inherently linked. A user who understands a system’s reasoning can work with it as a partner rather than fear it as an oracle. Design for transparency first. Intelligence second.
Computational design applies computational strategies to the design process — encoding design decisions in computer language. CXD is the practical application of intelligent and quantum computing technologies to augment human capabilities.
While different, they complement each other — together enabling design for billions of people in real-time, at scale, and on demand. The intersection they share is a change in design expression: from geometry to connected neural logic.
Still curious?
Go deeper in the Living Book.
Every principle, every chapter, every concept — open-source and freely available.
Research at the intersection of voice biomarkers, neuro wellness, cognitive design, and AI — written for practitioners and builders who refuse to separate science from craft.
The Future of AI Products Isn’t Intelligence. It’s Cognitive Alignment with the Human Nervous System.
Joanna Peña-Bickley · Cognitive Experience Design
We have been designing for behavior for 30 years. Clicks, swipes, conversions, retention. Every metric in the product designer’s toolkit has been, at its core, a measure of what the body does — what the hand taps, what the eye follows, how long the thumb scrolls before it stops.
But behavior is what the mind produces. It is the exhaust of cognition, not cognition itself. And for three decades, we have been optimizing the exhaust while ignoring the engine.
The Shift That’s Already Happening
The most significant product companies of the next decade will not be measured by engagement time. They will be measured by cognitive outcome — did the user leave the interaction sharper, calmer, and more capable than when they arrived? Or did the product extract attention, fragment focus, and leave behind the residue of digital overload?
This is not a philosophical position. It is a design imperative. The WHO has classified cognitive overload as a global health crisis. The attention economy has produced what researchers now call the Triple Brain Epidemic: simultaneous surges in mental health deterioration, accelerated cognitive decline, and digital overload that reinforce each other in a devastating feedback loop.
What Cognitive Alignment Actually Means
Cognitive alignment means designing AI systems whose outputs match the cognitive state, capacity, and needs of the human using them — in real time. Not personalization (which optimizes for preference). Not accessibility (which ensures baseline usability). Cognitive alignment is the dynamic, ongoing calibration of the machine to the mind.
Consider the difference: a standard recommendation algorithm shows you what you clicked before. A cognitively aligned system understands that you are running at 60% cognitive capacity this morning — your voice biomarkers revealed fragmented sleep, elevated cortisol markers, reduced phonation consistency — and surfaces only what your mind can actually process, integrate, and act on at this moment.
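That calibration can be sketched as a simple budgeting rule. Everything below is illustrative: the item fields, demand values, and the single "capacity" number are assumptions, not a description of any shipping system:

```python
# Illustrative sketch of cognitive alignment as a budgeting rule.
# Item fields, demand values, and the single "capacity" number are assumptions.

def align_to_capacity(items, capacity):
    """Surface high-priority items until the user's cognitive budget is spent."""
    budget = capacity
    surfaced = []
    for item in sorted(items, key=lambda i: i["priority"], reverse=True):
        if item["demand"] <= budget:       # only show what the mind can absorb now
            surfaced.append(item["name"])
            budget -= item["demand"]
    return surfaced

items = [
    {"name": "deep-work brief", "demand": 0.5, "priority": 3},
    {"name": "inbox triage",    "demand": 0.2, "priority": 2},
    {"name": "metrics digest",  "demand": 0.3, "priority": 1},
]
print(align_to_capacity(items, capacity=0.6))   # running at 60% capacity this morning
```

At 60% capacity only the single highest-priority item fits the budget; at full capacity all three would surface.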
The Architecture of Consciousness Is the New UI
For most of the history of software design, the interface was the thing you saw on screen. Buttons, forms, flows. The field advanced to include what you heard (voice interfaces), what you touched (haptics), and eventually what you felt (affective computing). The next frontier is what you are — your cognitive and emotional state, your attention capacity, your nervous system’s readiness to engage.
This is Cognitive Experience Design. Not a new name for UX. A genuinely different practice — one that designs for perception, attention, memory, and restoration, not just interaction. One that treats the human mind not as the target of engagement, but as the beneficiary of design.
The Practical Imperative
If you are building AI products in 2026 and you are not asking “what cognitive state does this product require of its user, and what cognitive state does it return them in?” — you are building for the old paradigm. The new one is already here. It just hasn’t been evenly distributed yet.
The designers and engineers who build with cognitive alignment at the center will not just build better products. They will build the only products that matter as AI becomes the infrastructure of human cognition itself.
The future of AI products isn’t intelligence. It’s the architecture of consciousness — designed for human flourishing.
Something new is emerging in the design of AI-powered products — and it doesn’t have a clean name yet. It is not UX. It is not CX. It is not accessibility, personalization, or affective computing, though it draws from all of them. I am calling it Neuroadaptive Product Design.
What Makes It Different
Neuroadaptive Product Design is the practice of designing systems that adjust their behavior — in real time — to the cognitive and neurological state of the human using them. Not their preferences. Not their history. Their current state.
Traditional personalization asks: what does this person like? Neuroadaptive design asks: what can this person’s mind actually absorb, process, and use right now? The difference is the difference between a waiter remembering your usual order and a doctor adjusting their communication style when they see you’re in pain.
Three Inputs That Define the Field
Voice biomarkers are the most accessible real-time window into cognitive state currently available at consumer scale. Acoustic features of speech — pitch variability, speech rate, pause frequency, phonation quality — carry measurable correlates of cognitive load, emotional state, sleep deprivation, and neurological health markers. They require no wearable, no implant, no special hardware. Just a microphone.
Behavioral signals — scroll velocity, typing cadence, error frequency, task completion patterns — provide a continuous stream of low-resolution cognitive state data that most products already collect and almost none currently use to adapt the experience in cognitively meaningful ways.
Biometric integration — heart rate variability from wearables, skin conductance from next-generation devices — will add depth as the ecosystem matures. But the foundation is already available.
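The first of those inputs, voice, is the easiest to prototype. Here is a rough sketch of one feature, pause ratio, computed by frame-energy thresholding in plain Python; the frame size and silence threshold are arbitrary illustrative choices, far simpler than a production biomarker pipeline:

```python
import math

def pause_ratio(samples, rate, frame_ms=25, silence_db=-35.0):
    """Fraction of frames below a silence threshold: a crude pause-frequency proxy."""
    n = max(1, int(rate * frame_ms / 1000))
    frames = [samples[i:i + n] for i in range(0, len(samples) - n + 1, n)]
    if not frames:
        return 0.0
    def db(frame):
        rms = math.sqrt(sum(x * x for x in frame) / len(frame))
        return 20 * math.log10(rms) if rms > 0 else -120.0
    silent = sum(1 for f in frames if db(f) < silence_db)
    return silent / len(frames)

# Synthetic check: 1 s of tone followed by 1 s of silence gives a ~50% pause ratio.
rate = 8000
voiced = [math.sin(2 * math.pi * 220 * t / rate) for t in range(rate)]
silence = [0.0] * rate
print(round(pause_ratio(voiced + silence, rate), 2))  # → 0.5
```

A real pipeline would add pitch tracking, speech-rate estimation, and phonation-stability measures on top of the same frame-by-frame foundation.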
The Design Principles That Follow
If you accept that the system should adapt to cognitive state, several design principles follow naturally. Complexity should reduce when cognitive capacity is low. Information density should adapt to attention state. Recovery experiences — not just productive ones — should be first-class design primitives. Rest should be designed, not assumed.
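The first two of those principles can be sketched as code; the tier names, cutoffs, and densities are invented for illustration only:

```python
def density_for(readiness):
    """Map a 0-100 readiness score to an information-density tier (illustrative cutoffs)."""
    if readiness < 40:
        return {"tier": "restorative", "items_per_screen": 3, "offer_rest": True}
    if readiness < 70:
        return {"tier": "focused", "items_per_screen": 7, "offer_rest": False}
    return {"tier": "full", "items_per_screen": 12, "offer_rest": False}

print(density_for(32)["tier"])   # low capacity: fewer items, rest offered first-class
```

Note that the low-capacity tier does not merely show less; it actively offers a recovery experience, treating rest as a designed state rather than an absence of use.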
This is the new category. Not everyone is ready to build it yet. But the ones who understand it first will define the next era of software.
For two years, before we shipped a single line of the app, we used audio as a testbed. Not to build content — though we built 262 episodes — but to understand what the human voice actually reveals about the brain behind it.
The result was 31,000 organic listeners, 94% of whom reported improved focus, and a compression of our voice model’s input requirement from 120 seconds to 30 seconds. More importantly, it was a corpus of real-world cognitive signals at a scale that most research labs never access.
What the Voice Actually Reveals
The human voice is a window into the nervous system. Every time you speak, your voice carries acoustic features that are measurably correlated with your cognitive and emotional state. Pitch variability narrows under cognitive load. Speech rate slows when working memory is taxed. Pause frequency increases with mental fatigue. Phonation consistency — how steadily you produce sound — tracks sleep quality more reliably than most wearable metrics.
These are not theoretical correlations. They are measurable features that, in aggregate, produce what we call a Brain Readiness Score — an 85%-accurate daily measure of cognitive capacity derived from 30 seconds of natural speech.
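Conceptually, such a score is a weighted aggregate of normalized features. A toy sketch (the weights and the feature scaling are made up for illustration; the real model's coefficients are not public):

```python
def brain_readiness(features, weights=None):
    """Combine normalized voice features (0.0-1.0, higher = healthier) into a 0-100 score.
    Weights are illustrative, not the production model's coefficients."""
    weights = weights or {"pitch_variability": 0.3, "speech_rate": 0.2,
                          "pause_frequency": 0.2, "phonation_consistency": 0.3}
    return round(100 * sum(weights[k] * features[k] for k in weights), 1)

print(brain_readiness({"pitch_variability": 0.8, "speech_rate": 0.7,
                       "pause_frequency": 0.6, "phonation_consistency": 0.9}))  # → 77.0
```

With four features scoring reasonably well, the toy formula lands at 77 out of 100; the real model replaces the hand-set weights with learned coefficients over far richer acoustic inputs.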
What 31,000 Users Taught Us
The dominant theme across listener data was not what we expected. We expected focus and productivity to be the primary pain points. They were present, but the most deeply resonant content — measured by 189% replay rates — was around Digital Detox. Users weren’t just tired. They were exhausted in a way that felt qualitatively different from ordinary fatigue. They were experiencing what we now understand as the Nervous System Breakdown component of the Triple Brain Epidemic.
The second insight: demand for proof. Users didn’t want to be told something would help. They wanted to see evidence that it had helped. Before and after. This insight drove the closed-loop architecture of the platform — voice detects state, therapeutic audio intervenes, voice confirms change.
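That detect, intervene, confirm loop is simple to express. A sketch with stub sensors and actuators standing in for the real voice model and audio player:

```python
def closed_loop(measure, intervene, session):
    """One session of the loop: detect state, intervene, re-detect, report the delta."""
    before = measure()
    intervene(session)
    after = measure()
    return {"before": before, "after": after, "delta": round(after - before, 1)}

# Stubs standing in for the voice-biomarker model and the audio player.
state = {"readiness": 58.0}
measure = lambda: state["readiness"]
def play(session):                      # pretend the audio nudges readiness upward
    state["readiness"] += session["expected_lift"]

report = closed_loop(measure, play, {"title": "Digital Detox 04", "expected_lift": 6.5})
print(report)  # → {'before': 58.0, 'after': 64.5, 'delta': 6.5}
```

The returned delta is the "before and after proof" the listeners asked for: the intervention is only credited with what the second measurement confirms.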
Where This Goes
Voice biomarker research is still early. The Framingham Heart Study showed voice-based linear classifiers can predict incident dementia with an AUC of 0.812. Our own model achieves 85% accuracy on daily brain readiness. What comes next — longitudinal tracking, clinical validation, integration with HRV and biometric data — will transform this from a wellness insight into a clinical tool.
Brain and mental health is a birthright, not a privilege. Voice is the most democratic diagnostic instrument that has ever existed. Everyone has one.
40Hz Gamma Therapy: From MIT Labs to 262 Episodes of Proven Restorative Audio
Joanna Peña-Bickley · Vibes AI Field Notes
In 2016, MIT researchers Li-Huei Tsai and colleagues published a finding that seemed almost too strange to be true: flickering light at 40Hz — the frequency of gamma brainwaves — reduced amyloid plaques in the visual cortex of Alzheimer’s mouse models. By 2019, they had replicated the effect with sound. 40Hz auditory stimulation, delivered through clicks or tones, produced measurable gamma entrainment in the auditory cortex and, crucially, appeared to activate the brain’s glymphatic system — the waste-clearance mechanism that removes toxic proteins during sleep.
From Lab to Daily Practice
At Vibes AI, we built this research into our content library not as a wellness trend, but as an evidence-based restorative protocol. 262 episodes, spanning ADHD focus sessions, sleep preparation, digital detox, and cognitive restoration, each engineered with therapeutic audio principles including 40Hz gamma entrainment, binaural beats, and solfeggio frequency layers.
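The basic 40 Hz stimulus itself is straightforward to generate. A minimal sketch that writes a 40 Hz amplitude-modulated tone to a WAV file using only the Python standard library; the carrier frequency, modulation depth, and duration are illustrative choices, not the production audio engineering:

```python
import math, struct, wave

def gamma_tone(path, seconds=2.0, carrier=200.0, mod=40.0, rate=44100):
    """Write a mono 16-bit WAV whose amplitude pulses at `mod` Hz (40 Hz gamma band).
    Parameters are illustrative, not Vibes AI production settings."""
    frames = bytearray()
    for n in range(int(seconds * rate)):
        t = n / rate
        env = 0.5 * (1 + math.sin(2 * math.pi * mod * t))    # 40 Hz amplitude envelope
        sample = env * math.sin(2 * math.pi * carrier * t)   # audible carrier tone
        frames += struct.pack("<h", int(sample * 32767 * 0.8))
    with wave.open(path, "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(rate)
        f.writeframes(bytes(frames))

gamma_tone("gamma40.wav")
```

Amplitude modulation is only one delivery method; the published MIT work used clicks and tones, and production content layers such stimuli under music and narration.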
The design question we kept returning to: how do you make a therapeutic practice that people actually maintain? The data from meditation apps is sobering — 77% churn by Day 3. The reason is structural: meditation requires the very resource it’s trying to restore. Sustained attention. You can’t ask a depleted mind to perform mindfulness.
What Our Data Shows
Across 31,000 listeners, the highest replay rate content (189%) was our Digital Detox series — audio experiences explicitly designed to interrupt the nervous system’s fight-or-flight loop without requiring active attention. Passive restoration. The user doesn’t have to do anything except listen.
94% of users reported improved focus after consistent use. More tellingly, the before/after Brain Readiness Score comparison — measured via voice biomarkers — showed measurable improvement in cognitive state within a single session for 71% of users on high-stress days.
The Clinical Frontier
This is still early. MIT’s research continues. Our longitudinal data is accumulating. What we know: 40Hz gamma therapy is real, it is measurable, and it is accessible to anyone with a decent pair of headphones. What we believe: it will become a clinical standard within the decade. Music is medicine. Sound heals. And we now have the tools to prove it.
Mindware, Not Software: Why the Era of Screen-First Design Is Over
Joanna Peña-Bickley · Cognitive Experience Design
Software is what runs on hardware. Mindware is what runs on the human nervous system. For thirty years, we built software and called it product design. We are now entering the era of mindware — and it requires a completely different discipline.
The Screen Was Always a Proxy
The screen was never the interface. It was a proxy for attention. A surface onto which we projected information and then measured whether the eye followed it. Clicks were not engagement — they were the residue of attention that had already been captured upstream, in the mind, before the hand ever moved.
We optimized the proxy. We built entire disciplines — UX research, conversion optimization, A/B testing — around measuring the screen’s ability to capture and hold the proxy metric. Meanwhile, the actual thing — human cognition — was being depleted by the systems we were building.
What Comes After the Screen
The cognitive interface of the next decade will not be primarily visual. It will be ambient, voice-native, biometric-aware, and — crucially — designed to restore cognitive capacity rather than extract it.
Voice is the most natural human interface. We spoke before we wrote. We narrate our inner lives in language before we type them. A product that meets humans in voice — and responds with intelligence calibrated to what the voice reveals about the mind behind it — is closer to cognition itself than any screen has ever been.
The Design Imperative
If you are designing with screens as your primary surface, you are not wrong — screens will persist for decades. But if you are designing with screens as your only surface, or with engagement as your primary metric, you are building for a paradigm that is already ending.
The question every product designer should be asking in 2026 is not “how do I get more attention?” It is “what cognitive state does my product require, and what cognitive state does it return?” Build for cognition, not consumption. Build mindware, not software. The era of screen-first design is over. The era of mind-first design has begun.
Cognitive Experience Design is the practice of using artificial intelligence technologies to reduce the human mental effort and time required to complete a task. It employs knowledge of human perception, mental processing, modeling, and memory.
The Origin
In 2014, business, design, and engineering leaders were struggling to explain how the discipline of design needed to evolve for a new AI computing era. While designing software for professionals powered by IBM’s Watson, Joanna Peña-Bickley coined the phrase Cognitive Experience Design — CXD — to give life to an emerging practice creating machines that could learn, reason, and interact with humans.
She used the term “cognitive” for two reasons. First, smart devices enable humans to offload repetitive tasks and mundane thinking in order to combat cognitive overload. Second, AI technologies like machine learning, computer vision, and NLP are built to mimic the way the human brain works.
A Simple Definition
CXD moves business from the management fad of design thinking to the measurable magic of Design For Thinking & Doing. It unites four disparate design practices into a single coherent framework:
01
Human-Centered Design — the artful intention and intuition that keeps empathy at the center of every decision.
02
User-Centered Design — the behavioral rigor and task-based validation that ensures solutions actually work.
03
Cognitive Ergonomics — the applied cognitive science that maps and reduces mental load.
04
Neuro-Ergonomics — the objective scientific grounding in neurological and perceptual science.
The Needle in the Haystack
The most instructive principle in all of CXD is what might be called the Haystack Principle. The role of AI should NOT be to find the needle in the haystack — it should be to show the user how much hay it has cleared so they can better see the needle themselves.
AI clears friction. Humans make meaning.
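The Haystack Principle can be read as an algorithm. A minimal sketch (the `isHay` classifier and the log-line example are hypothetical): rather than returning only the needle, the function reports how much hay was cleared, leaving the final judgment to the human.

```javascript
// Haystack Principle: the AI filters obvious noise ("hay") and reports
// how much it cleared; the human inspects the remaining candidates.
function clearHay(items, isHay) {
  const candidates = items.filter(item => !isHay(item));
  return {
    candidates,                                   // what the human still judges
    hayCleared: items.length - candidates.length, // what the AI removed
    percentCleared: items.length === 0
      ? 0
      : Math.round(100 * (items.length - candidates.length) / items.length),
  };
}

// Hypothetical usage: routine heartbeat lines are hay; errors are needles.
const logs = ['heartbeat', 'heartbeat', 'disk error', 'heartbeat'];
const result = clearHay(logs, line => line === 'heartbeat');
// result.candidates → ['disk error'], result.hayCleared → 3
```

The point of returning `hayCleared` alongside `candidates` is the principle itself: the interface can now say “3 of 4 items cleared” instead of silently deciding for the user.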
Who Practices CXD?
Any designer can be a practitioner: conversation, sound, and voice user interface designers; brain interface designers; Internet of Things product designers; and anyone willing to incorporate AI technologies into their crafting of solutions to human problems. CXD is a practice, not a role.
Mental models are personal, internal representations of external reality that people use to interact with the world around them — constructed based on unique life experiences, perceptions, and understandings of how things work.
What a Mental Model Is
Mental models are used to reason and make decisions and can be the basis of individual behaviors. They provide the mechanism through which new information is filtered and stored. They are not static — they evolve continuously as new experiences update our internal representation of how the world works.
As Jay Wright Forrester, the American engineer and computer scientist, observed: “The image of the world around us, which we carry in our head, is just a model. Nobody in his head imagines all the world, government or country. He has only selected concepts, and relationships between them, and uses them to represent the real system.”
Why Mental Models Matter for CXD
Recognizing and working with the plurality of stakeholders’ perceptions, values, and mental models is a key aspect of effective cognitive experience design. When a product’s design contradicts a user’s mental model, friction and cognitive load increase. When design aligns with mental models, the interface becomes invisible — which is the goal.
Consider how users approach a shopping cart. Many expect it to save items indefinitely — their mental model is a wishlist. Designers who understand this can create bridging interactions that honor the model while guiding behavior.
Designing With Mental Models
CXD practitioners use mental model research as a foundation for anticipatory design. By mapping how users conceptualize a system — what concepts they consider important, how those concepts are organized cognitively, and how they understand dynamic interactions — designers can build systems that work with cognition, not against it.
The practical approach: conduct mental model interviews not to understand behavior, but to understand the internal map users are using to navigate. Then design the terrain to match the map.
The Hardware, Software, and Wetware Model
CognitiveExperience.design explores the role of hardware (devices), software (programs), and wetware (the human brain) in constructing, simulating, and communicating mental models. The intersection of all three is where cognitive experience design lives.
Design the terrain to match the map the human already carries.
The 2017 transformer paper “Attention Is All You Need” didn’t just change AI. It named the most precious resource in the digital economy — and in the human brain. The coincidence is more than semantic.
The Architecture of Attention in AI
The transformer architecture, introduced by Vaswani et al. in 2017, replaced recurrence and convolution in neural networks with a mechanism called self-attention. The key insight: not all parts of a sequence are equally relevant to every other part. The model learns to weight — to pay attention to — the relationships that matter most for understanding.
This mirrors, in remarkable ways, how human attention works. The brain does not process everything in its perceptual field equally. It allocates cognitive resources based on relevance, novelty, and emotional salience. Attention is the gatekeeper of consciousness.
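The parallel can be made concrete with a toy version of scaled dot-product attention for a single query (a sketch of the shape of the idea, not the paper’s multi-head implementation; the 2-dimensional vectors are invented for illustration): scores are dot products between the query and each key, softmax turns scores into weights, and the output blends the values by those weights.

```javascript
// Toy scaled dot-product attention for one query vector.
// score_i = (q · k_i) / sqrt(d);  weights = softmax(scores);
// output_j = sum_i weights_i * v_i[j]
function attend(query, keys, values) {
  const d = query.length;
  const dot = (a, b) => a.reduce((s, x, i) => s + x * b[i], 0);
  const scores = keys.map(k => dot(query, k) / Math.sqrt(d));
  const maxS = Math.max(...scores);                 // stabilize the softmax
  const exps = scores.map(s => Math.exp(s - maxS));
  const sum = exps.reduce((a, b) => a + b, 0);
  const weights = exps.map(e => e / sum);
  const output = values[0].map((_, j) =>
    weights.reduce((acc, w, i) => acc + w * values[i][j], 0));
  return { weights, output };
}

// The query matches the first key most closely, so the first value dominates.
const { weights } = attend([1, 0], [[1, 0], [0, 1]], [[10, 0], [0, 10]]);
```

The design-relevant property is the weighting: nothing is discarded outright, but relevance determines how much of each input reaches the output — which is also a fair description of healthy human attention.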
The Human Attention Crisis
The attention economy was named by economist Herbert Simon, who observed in 1971 that as information becomes abundant, what becomes scarce is the attention to process it. More than fifty years later, we are living through the most severe attention scarcity crisis in human history.
Millennials and Gen Z can no longer sustain 20 minutes of focused work. Time to focus has compressed from 23 minutes to 8–12 minutes over a single generation. 70% of adults scroll their phones between 11PM and 2AM — peak nervous system recovery time — keeping the brain in a perpetual fight-or-flight state.
What CXD Does With This Convergence
CXD uses the architectural insight of the transformer — that selective attention, weighted by relevance, is what makes intelligence coherent — as a design principle. AI systems should not present everything. They should present what matters most, at the moment the human can actually process it, weighted by cognitive state and context.
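Read as an algorithm, this is a ranking-and-budget problem. A minimal sketch (the `relevance` scores and the `capacity` budget are hypothetical inputs a real system would infer from context and cognitive state): rank items by relevance, then deliver only as many as the user can absorb right now.

```javascript
// Attention-honoring delivery: rank by relevance, then cut off at the
// user's current processing capacity instead of presenting everything.
function selectForState(items, capacity) {
  return [...items]
    .sort((a, b) => b.relevance - a.relevance)  // what matters most, first
    .slice(0, Math.max(0, capacity));           // only what can be processed now
}

// Hypothetical: a depleted user (capacity 1) sees only the top item.
const inbox = [
  { id: 'newsletter', relevance: 0.2 },
  { id: 'server-down', relevance: 0.9 },
  { id: 'meeting-moved', relevance: 0.6 },
];
const shown = selectForState(inbox, 1);
// shown → [{ id: 'server-down', relevance: 0.9 }]
```

Note the inversion of the engagement-era default: the scarce resource being budgeted is the user’s processing capacity, not the platform’s inventory of content.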
Attention is all you need. But human attention is finite, fragile, and under assault. Designing AI that honors and restores attention rather than captures and monetizes it is the central design challenge of the next decade.
The machine learned to attend. Now the human needs to recover the capacity to do the same.
We are living through a historic collision. Three distinct brain health crises — mental health deterioration, accelerated cognitive decline, and digital overload — are converging and reinforcing each other in a devastating feedback loop.
Crisis 1: Mental Health Deterioration
60 million people experienced mental illness in 2024 (NIMH). Digital platforms systematically fragment attention via algorithmic manipulation designed to maximize engagement by exploiting emotional vulnerability. Social comparison mechanisms, infinite scroll, notification architectures — all designed to trigger dopaminergic responses that create compulsive return patterns.
74–83% of Millennials and Gen Z report burnout. 70% are scrolling between 11PM and 2AM, keeping nervous systems in perpetual fight-or-flight. This is not stress. This is functional impairment at epidemic scale.
Crisis 2: Accelerated Cognitive Decline
$1 trillion is spent annually treating cognitive decline — 40% of which is preventable. The concept of “digital dementia” captures how excessive screen exposure during brain development increases long-term cognitive decline risk. Predictive models suggest Alzheimer’s and related dementias could increase 4–6× between 2060 and 2100.
Over 55 million people worldwide already live with some form of dementia. Chronic multitasking associated with digital device use impairs the development of sustained attention networks and executive function systems that protect against cognitive decline.
Crisis 3: Digital Overload
The WHO has called the inundation of incoming data an “infodemic” — the human condition of cognitive overload, where information supply exceeds the brain’s processing capacity. Over 3 billion people worldwide are affected by neurological conditions, with the overall burden increasing by 18% since 1990.
70% of workers report “brain rot” — cognitive fatigue so pervasive it has entered the cultural lexicon. $400 billion in productivity is lost annually to this phenomenon.
Why They Cannot Be Treated Separately
These three crises share a neurobiological foundation and reinforce each other in a doom loop: attention collapse → performance drops → sleep disruption → self-medication with scrolling → digital overload → deeper attention collapse. Treating symptoms one at a time keeps failing because the system itself is broken.
CXD is the design response to this system failure — not treating symptoms, but redesigning the interface between human cognition and the machines we’ve built.
The attention economy is a trillion-dollar system that monetizes human focus. Now supercharged by artificial intelligence, it represents the most significant threat to cognitive autonomy in human history — and the most urgent design imperative.
What the Attention Economy Is
As information became abundant, human attention became scarce. The business models of social media, streaming, and platform economies are built on one insight: human attention can be captured, held, and sold to advertisers. The product is not the app. The product is the attention of the user.
$83 billion was spent on self-care apps in 2024 — a market created partly by the very anxiety and cognitive fragmentation the attention economy produces. We are paying to repair damage that was architecturally designed in.
How AI Accelerated the Crisis
Recommendation algorithms optimized for engagement discovered that emotional activation — fear, outrage, social comparison, FOMO — produces the most reliable engagement signals. AI did not create this insight; it industrialized it. What was previously an art form practiced by tabloid editors became a real-time, personalized, continuous optimization system running at planetary scale.
The most recent evolution is what researchers at King’s College London call “cognitive lock-in” — computational dependencies between technology and users where platforms develop not just habitual use but genuine cognitive dependencies. MIT research found that cognitive activity scales down in direct relation to generative AI use.
The CXD Response
Cognitive Experience Design offers the most coherent counterforce available: a practice that uses AI to reduce human mental effort rather than exploit it. The 10 Principles of CXD are, in their entirety, an ethical framework for designing against the attention economy’s incentives.
Organizations implementing AI for cognitive augmentation (versus engagement optimization) report 3.2x higher ROI and 4.1x higher employee satisfaction. The business case for human flourishing exists. It simply requires the courage to design for it.
We are no longer in the business of shaping behavior. We are in the business of stewarding cognition.
Generative AI is not a tool. Tools are inert until used. Generative AI is an active participant in the design process — a collaborator with its own knowledge, tendencies, and outputs that must be directed, curated, and critically evaluated.
The Shift in Design Expression
Traditional design practice moves from concept to geometry — from an idea to a form that can be rendered, printed, or coded. Computational design encoded that movement in algorithms. Generative AI introduces a third paradigm: from intent to synthesis, where the designer provides direction and the machine generates possibilities at a scale and speed no human could match.
This changes the designer’s role. Less maker, more curator. Less author, more editor. Less craftsperson, more creative director. The design skills that matter most in this paradigm — judgment, taste, ethical reasoning, user empathy — are the ones AI cannot replicate.
Generative AI as CXD Material
In CXD, generative AI becomes a first-class design material for three reasons. First, it enables mass customization — experiences that adapt to individual users at a scale that was previously impossible. Second, it enables anticipatory design — generating responses, content, and interfaces that meet users before they’ve fully articulated their need. Third, it enables cognitive offloading at unprecedented scale — the AI handles the cognitive work that depletes human attention, freeing the human for judgment and meaning-making.
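Anticipatory design, at its simplest, is next-action prediction from context. A minimal frequency-based sketch (real systems would use learned models; the history format here is invented for illustration):

```javascript
// Anticipate the most likely next action for a given context by
// counting what the user has done in that context before.
function anticipate(history, context) {
  const counts = {};
  for (const event of history) {
    if (event.context !== context) continue;
    counts[event.action] = (counts[event.action] || 0) + 1;
  }
  let best = null;
  for (const [action, n] of Object.entries(counts)) {
    if (!best || n > best.n) best = { action, n };
  }
  return best && best.action;   // null when the context has never been seen
}

// Hypothetical: most weekday mornings, this user opens their calendar.
const history = [
  { context: 'weekday-morning', action: 'open-calendar' },
  { context: 'weekday-morning', action: 'open-calendar' },
  { context: 'weekday-morning', action: 'check-email' },
];
// anticipate(history, 'weekday-morning') → 'open-calendar'
```

Even this naive version illustrates the CXD stakes: the system pre-stages the likely action (offloading the cognitive work) while the null return preserves agency when it has no grounds to anticipate.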
The Design Responsibility
With generative AI as a design material comes a design responsibility. Generated content is not neutral. Generated interfaces are not objective. Generated recommendations carry the biases of their training data and the incentives of their deployers. CXD practitioners who use generative AI must apply the same ethical framework — human effects, transparent truth, breaking bias, privacy pledged — to AI-generated outputs as to any other design decision.
Generative AI is the most powerful design material in history. It is also the most consequential. Use it with the intention of a healer, not a harvester.
AI is not the designer. The designer is still you. AI is the most powerful material you’ve ever been handed.
Draw a card. Find the human superpower your product should bestow. Use these cards to connect AI capabilities to the five basic senses — or give your users a sixth sense entirely.
✦
CXD Superpowers
Tap to Draw
Tap to reveal
Tap to flip back
Space = draw · Enter = flip · ←/→ = filter sense
All 12 Cards
@media (min-width:620px) {
#play-area {
flex-direction: row !important;
justify-content: center;
align-items: flex-start;
gap: 48px;
}
#drawn-card { max-width: 280px; }
}
@media (min-width:900px) {
#drawn-card { max-width: 300px; }
}
#deck-top:hover { transform: translateY(-8px) rotate(-1deg) !important; box-shadow: 0 16px 48px rgba(124,106,247,0.25) !important; }
#drawn-card:hover #card-front, #drawn-card:hover #card-back { filter: brightness(1.05); }
.article-body h2 { font-family: var(--font-display); font-size: clamp(22px,3vw,30px); font-weight: 400; color: var(--white); margin: 40px 0 16px; line-height: 1.2; letter-spacing: -0.01em; }
.article-body p { font-size: 17px; line-height: 1.8; color: var(--gray-300); margin-bottom: 20px; }
.article-body strong { color: var(--gray-100); font-weight: 500; }
.article-body em { color: var(--gray-200); font-style: italic; }
.article-body ul { padding-left: 24px; margin: 16px 0; display: flex; flex-direction: column; gap: 8px; }
.article-body li { font-size: 16px; color: var(--gray-400); line-height: 1.65; }
/* ── SUPERPOWER CARDS DATA ── */
const CARDS = [
{ id:'S01', sense:'sight', senseColor:'rgba(96,165,250,0.15)', senseText:'#60a5fa', senseBorder:'rgba(96,165,250,0.3)',
title:'Telescopic Vision', ai:'Computer Vision · AR Overlay',
desc:'Give users the ability to see information layers invisible to the naked eye — metadata, history, context, risk — overlaid on the physical world in real time.',
question:'What hidden layer of your data would transform your user\'s decision if they could see it instantly?' },
{ id:'S02', sense:'sight', senseColor:'rgba(96,165,250,0.15)', senseText:'#60a5fa', senseBorder:'rgba(96,165,250,0.3)',
title:'Pattern Recognition', ai:'Machine Learning · Anomaly Detection',
desc:'Train the user\'s eye to see what the machine sees — surface the signal in the noise before the human would have noticed it.',
question:'What pattern in your data do experts recognize after years of experience that your product could surface in seconds?' },
{ id:'S03', sense:'sound', senseColor:'rgba(79,209,165,0.15)', senseText:'#4fd1a5', senseBorder:'rgba(79,209,165,0.3)',
title:'Sonic Intelligence', ai:'Voice AI · NLP · Audio Biomarkers',
desc:'Turn the human voice into a diagnostic instrument. Every conversation, every meeting, every check-in becomes a window into cognitive and emotional state.',
question:'What would your product do differently if it could hear the difference between a user who is focused and one who is depleted?' },
{ id:'S04', sense:'sound', senseColor:'rgba(79,209,165,0.15)', senseText:'#4fd1a5', senseBorder:'rgba(79,209,165,0.3)',
title:'Restorative Audio', ai:'Generative Audio · Therapeutic AI',
desc:'Use sound as medicine. Compose AI-generated audio environments that restore nervous system balance, improve focus, and accelerate cognitive recovery.',
question:'What would it mean for your users if a 5-minute audio experience measurably improved their brain readiness score?' },
{ id:'S05', sense:'touch', senseColor:'rgba(244,132,95,0.15)', senseText:'#f4845f', senseBorder:'rgba(244,132,95,0.3)',
title:'Haptic Memory', ai:'Biometric Feedback · Wearables',
desc:'Encode information into the body — gentle pulses, pressure patterns, temperature signals that communicate state without demanding visual attention.',
question:'What information does your user need while their eyes are busy elsewhere? How do you deliver it through the body?' },
{ id:'S06', sense:'touch', senseColor:'rgba(244,132,95,0.15)', senseText:'#f4845f', senseBorder:'rgba(244,132,95,0.3)',
title:'Physical Presence', ai:'IoT · Ambient Computing',
desc:'Make digital intelligence physically present — objects that change temperature, weight, texture, or light in response to data, making the invisible tangible.',
question:'What data in your system would users respond to more powerfully if it had physical form?' },
{ id:'S07', sense:'mind', senseColor:'rgba(124,106,247,0.15)', senseText:'#7c6af7', senseBorder:'rgba(124,106,247,0.3)',
title:'Anticipatory Mind', ai:'Predictive AI · Anticipatory Design',
desc:'Answer the question before it\'s asked. Surface the need before it becomes conscious. Design systems that predict intent from context and act ahead of articulation.',
question:'What does your user always want to do next? How far ahead of the request can your system begin preparing?' },
{ id:'S08', sense:'mind', senseColor:'rgba(124,106,247,0.15)', senseText:'#7c6af7', senseBorder:'rgba(124,106,247,0.3)',
title:'Extended Memory', ai:'LLM · Knowledge Graphs · RAG',
desc:'Be the user\'s perfect memory. Remember everything they\'ve said, decided, or encountered — and surface it exactly when relevant, without being asked.',
question:'What has your user told you that they\'ll forget they told you? How does your product remember it for them?' },
{ id:'S09', sense:'mind', senseColor:'rgba(124,106,247,0.15)', senseText:'#7c6af7', senseBorder:'rgba(124,106,247,0.3)',
title:'Cognitive Shield', ai:'Attention Management · Cognitive Load Reduction',
desc:'Protect the user\'s cognitive bandwidth. Filter, prioritize, and sequence information so the mind receives only what it can process and act on right now.',
question:'What is the single highest-value decision your user needs to make today? How does your product clear everything else away?' },
{ id:'S10', sense:'mind', senseColor:'rgba(124,106,247,0.15)', senseText:'#7c6af7', senseBorder:'rgba(124,106,247,0.3)',
title:'Emotional Compass', ai:'Affective AI · Sentiment Analysis',
desc:'Read the room. Detect emotional and cognitive state from subtle signals — voice, behavior, biometrics — and adapt the experience to meet the human where they are.',
question:'How would your product behave differently if it knew the user was anxious, versus energized, versus depleted?' },
{ id:'S11', sense:'sight', senseColor:'rgba(96,165,250,0.15)', senseText:'#60a5fa', senseBorder:'rgba(96,165,250,0.3)',
title:'Future Vision', ai:'Simulation AI · Scenario Modeling',
desc:'Show users futures they cannot see yet. Simulate the outcomes of decisions before they\'re made, making consequences visible before they\'re lived.',
question:'What decision is your user making today whose consequences only become visible in 3 years? Can your AI show them?' },
{ id:'S12', sense:'sound', senseColor:'rgba(79,209,165,0.15)', senseText:'#4fd1a5', senseBorder:'rgba(79,209,165,0.3)',
title:'Universal Translator', ai:'Multilingual NLP · Real-Time Translation',
desc:'Dissolve language barriers in real time. Give every user access to every piece of knowledge regardless of the language it was written in.',
question:'What knowledge does your user need that currently exists only in a language they don\'t speak?' },
];
let currentCardIndex = -1;
let shownCards = new Set();
let isFlipped = false;
function drawCard(filterSense) {
const candidates = filterSense ? CARDS.filter(c => c.sense === filterSense) : CARDS;
const unshown = candidates.filter(c => !shownCards.has(c.id));
const pool = unshown.length > 0 ? unshown : candidates;
const card = pool[Math.floor(Math.random() * pool.length)];
shownCards.add(card.id);
currentCardIndex = CARDS.indexOf(card);
isFlipped = false;
// Animate deck
const dt = document.getElementById('deck-top');
if (dt) {
dt.style.transition = 'transform 0.15s ease, opacity 0.15s';
dt.style.transform = 'translateY(-16px) rotate(4deg) scale(0.94)';
dt.style.opacity = '0.5';
setTimeout(() => { dt.style.transform = ''; dt.style.opacity = '1'; }, 280);
}
renderDrawnCard(card);
updateCounter();
}
function renderDrawnCard(card) {
const front = document.getElementById('card-front');
const back = document.getElementById('card-back');
const wrapper = document.getElementById('drawn-card');
// Style front
front.style.background = `linear-gradient(135deg, ${card.senseColor.replace('0.15','0.25')} 0%, rgba(10,10,10,0.95) 100%)`;
front.style.border = `1px solid ${card.senseBorder}`;
front.style.display = 'flex';
document.getElementById('card-sense-badge').style.background = card.senseColor;
document.getElementById('card-sense-badge').style.color = card.senseText;
document.getElementById('card-sense-badge').style.border = `0.5px solid ${card.senseBorder}`;
document.getElementById('card-sense-badge').textContent = card.sense === 'mind' ? '⬡ Sixth Sense' :
card.sense === 'sight' ? '◎ Sight' : card.sense === 'sound' ? '◈ Sound' : '◉ Touch';
document.getElementById('card-number').textContent = card.id;
document.getElementById('card-title').textContent = card.title;
document.getElementById('card-title').style.color = card.senseText;
document.getElementById('card-ai').textContent = card.ai;
document.getElementById('card-ai').style.color = card.senseText;
// Style back
back.style.background = `linear-gradient(135deg, ${card.senseColor.replace('0.15','0.35')} 0%, rgba(8,8,8,0.98) 100%)`;
back.style.border = `1px solid ${card.senseBorder}`;
document.getElementById('card-back-sense').style.background = card.senseColor;
document.getElementById('card-back-sense').style.color = card.senseText;
document.getElementById('card-back-sense').style.border = `0.5px solid ${card.senseBorder}`;
document.getElementById('card-back-sense').textContent = card.title;
document.getElementById('card-back-title').textContent = 'The Superpower';
document.getElementById('card-back-title').style.color = card.senseText;
document.getElementById('card-back-desc').textContent = card.desc;
document.getElementById('card-back-desc').style.color = 'var(--gray-300)';
document.getElementById('card-design-q').textContent = '↳ ' + card.question;
document.getElementById('card-design-q').style.background = card.senseColor.replace('0.15','0.1');
document.getElementById('card-design-q').style.color = card.senseText;
document.getElementById('card-design-q').style.borderLeft = `2px solid ${card.senseText}`;
// Show card with animation
wrapper.style.display = 'block';
wrapper.style.opacity = '0';
wrapper.style.transform = 'translateY(20px) scale(0.95)';
back.style.display = 'none';
front.style.display = 'flex';
setTimeout(() => {
wrapper.style.transition = 'all 0.4s cubic-bezier(0.34,1.56,0.64,1)';
wrapper.style.opacity = '1';
wrapper.style.transform = 'translateY(0) scale(1)';
}, 20);
renderAllCards();
}
function flipCard() {
const front = document.getElementById('card-front');
const back = document.getElementById('card-back');
isFlipped = !isFlipped;
front.style.display = isFlipped ? 'none' : 'flex';
back.style.display = isFlipped ? 'flex' : 'none';
}
function updateCounter() {
document.getElementById('card-counter').textContent =
shownCards.size + ' of ' + CARDS.length + ' cards drawn';
}
function renderAllCards() {
const grid = document.getElementById('all-cards-grid');
// Render one clickable mini card per deck entry; clicking draws that card.
grid.innerHTML = CARDS.map(card => `
<button class="mini-card" style="border:1px solid ${card.senseBorder}; color:${card.senseText};" onclick="_drawSpecific('${card.id}')">
<span class="mini-card-id">${card.id}</span>
<span class="mini-card-title">${card.title}</span>
</button>
`).join('');
}
window._drawSpecific = function(id) {
const card = CARDS.find(c => c.id === id);
if (card) { shownCards.add(id); renderDrawnCard(card); isFlipped = false; updateCounter(); }
};
// Deck hover bounce
const deckTop = document.getElementById('deck-top');
if (deckTop) {
deckTop.parentElement.addEventListener('mouseenter', () => { deckTop.style.transform = 'translateY(-6px) rotate(-1deg)'; });
deckTop.parentElement.addEventListener('mouseleave', () => { deckTop.style.transform = ''; });
}
// Pre-render all-cards grid when cards page is first visited
const _origShowPage = showPage;
window.showPage = function(pageId) {
_origShowPage(pageId);
if (pageId === 'cards' && document.getElementById('all-cards-grid').children.length === 0) {
renderAllCards();
}
};
// Mobile nav
const hamburger = document.getElementById('hamburger');
const mobileNav = document.getElementById('mobileNav');
hamburger.addEventListener('click', () => {
mobileNav.classList.toggle('open');
});
function closeMobileNav() {
mobileNav.classList.remove('open');
}
// Page router
const ALL_PAGES = ['home','about','faq','blog','cards',
'ch-01','ch-02','ch-03','ch-04','ch-05','ch-06',
'post-01','post-02','post-03','post-04','post-05'];
function showPage(pageId) {
ALL_PAGES.forEach(id => {
const el = document.getElementById('page-' + id);
if (el) el.style.display = id === pageId ? 'block' : 'none';
});
window.scrollTo({ top: 0, behavior: 'smooth' });
setTimeout(() => {
document.querySelectorAll('.fade-up').forEach(el => {
const rect = el.getBoundingClientRect();
// Reveal elements already in the viewport after a page switch.
if (rect.top < window.innerHeight) el.classList.add('visible');
});
}, 400);
}
// Scroll reveal: fade elements in as they enter the viewport.
const fadeEls = document.querySelectorAll('.fade-up');
const observer = new IntersectionObserver(entries => {
entries.forEach(e => {
if (e.isIntersecting) { e.target.classList.add('visible'); }
});
}, { threshold: 0.1, rootMargin: '0px 0px -40px 0px' });
fadeEls.forEach(el => observer.observe(el));
// Teaching tabs
function switchTab(btn, panelId) {
document.querySelectorAll('.teaching-tab').forEach(t => t.classList.remove('active'));
document.querySelectorAll('.teaching-panel').forEach(p => p.classList.remove('active'));
btn.classList.add('active');
document.getElementById('panel-' + panelId).classList.add('active');
}
// FAQ accordion
function toggleFaq(btn) {
const item = btn.closest('.faq-item');
const wasOpen = item.classList.contains('open');
// Close all in the same list
btn.closest('.faq-list').querySelectorAll('.faq-item').forEach(i => i.classList.remove('open'));
if (!wasOpen) item.classList.add('open');
}
// Glossary accordion
function toggleGlossary(btn) {
const item = btn.closest('.glossary-item');
const wasOpen = item.classList.contains('open');
document.querySelectorAll('.glossary-item').forEach(i => i.classList.remove('open'));
if (!wasOpen) item.classList.add('open');
}
// Flashcards
const cards = [
{
q: 'What is Cognitive Experience Design?',
a: 'The practice of using artificial intelligence technologies to reduce the human mental effort and time required to complete a task. It employs knowledge of neuroscience, human perception, mental processing, modeling, and memory — coined by Joanna Peña-Bickley at IBM Watson in 2014.'
},
{
q: 'What is the “Haystack Principle” in CXD?',
a: 'The role of AI should NOT be to find the needle in the haystack — it should be to show the user how much hay it has cleared so they can better see the needle themselves. AI clears friction; humans make meaning.'
},
{
q: 'What are the 4 pillars that CXD unites?',
a: 'Human-Centered Design (artful intention), User-Centered Design (behavioral rigor), Cognitive Ergonomics (applied cognitive science), and Neuro-Ergonomics (neurological and perceptual science). Together they move from “Design Thinking” to “Design For Thinking & Doing.”'
},
{
q: 'What is Cognitive Load and why does it matter?',
a: 'Cognitive load is the total mental effort used in working memory. When users are exhausted by an interface, they have less energy for high-level thinking. CXD designs to minimize load — so human intelligence can flourish instead of survive the experience.'
},
{
q: 'Define the Triple Brain Health Epidemic.',
a: 'Three converging crises: (1) Mental health deterioration — digital platforms exploit social comparison and attention. (2) Accelerated cognitive decline — “digital dementia” from early screen exposure. (3) Digital overload — the WHO-named “infodemic” where information exceeds brain processing capacity.'
},
{
q: 'What is Anticipatory Design in CXD?',
a: 'Designing systems that predict a user\'s next need before it is fully articulated. Treating AI as an “extended mind” — an external cognitive tool that stores memory, performs reasoning, and filters information. The goal: reduce the distance between intention and action.'
}
];
let currentCard = 0;
function renderCard() {
const card = cards[currentCard];
document.getElementById('fc-question').textContent = card.q;
document.getElementById('fc-answer').textContent = card.a;
document.getElementById('fc-counter').textContent = (currentCard + 1) + ' / ' + cards.length;
document.getElementById('flashcard').classList.remove('revealed');
}
function revealCard() {
document.getElementById('flashcard').classList.toggle('revealed');
}
function nextCard() {
currentCard = (currentCard + 1) % cards.length;
renderCard();
}
function prevCard() {
currentCard = (currentCard - 1 + cards.length) % cards.length;
renderCard();
}
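The deck page advertises keyboard shortcuts (Space = draw, Enter = flip, ←/→ = filter sense) that no handler above implements. A minimal sketch wiring them to the existing drawCard and flipCard functions; the keyToAction mapping is an assumption about the intended behavior, and the DOM wiring is guarded so the pure logic can run headlessly.

```javascript
// Map a key press to a deck action; pure so it can be tested without a DOM.
// (Assumed behavior: Space draws, Enter flips, arrows cycle the sense filter.)
const SENSES = [null, 'sight', 'sound', 'touch', 'mind']; // null = all senses
function keyToAction(key, senseIndex) {
  if (key === ' ') return { type: 'draw', senseIndex };
  if (key === 'Enter') return { type: 'flip', senseIndex };
  if (key === 'ArrowRight') return { type: 'filter', senseIndex: (senseIndex + 1) % SENSES.length };
  if (key === 'ArrowLeft') return { type: 'filter', senseIndex: (senseIndex - 1 + SENSES.length) % SENSES.length };
  return null;                                  // ignore every other key
}

// Browser-only wiring to the deck functions defined earlier on this page.
if (typeof document !== 'undefined') {
  let senseIndex = 0;
  document.addEventListener('keydown', (e) => {
    const action = keyToAction(e.key, senseIndex);
    if (!action) return;
    e.preventDefault();
    senseIndex = action.senseIndex;
    if (action.type === 'draw') drawCard(SENSES[senseIndex]);
    if (action.type === 'flip') flipCard();
  });
}
```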