Research Statement – Academic Highlights – Methods


I research and design digital products used by millions of users.

I continue to build on my doctoral research in applying Activity-Goal Alignment theory, which explores how a digital product’s aims can be embodied in the moment-to-moment user experience.

For example: Can we convert teens’ passion for dramatic stories into real-world self-improvement outcomes?  This is an example of ‘transfer’, a long-studied problem in educational games, VR therapy, and other digital-entertainment-based approaches to impact.  In my research I argue that transfer is difficult, but possible.

My research responds to 30 years of disappointing attempts to apply video game and digital entertainment product design to non-entertainment products. I agree with theorists who argue these points:

  • Video games have great promise as tools for change
  • Past approaches, such as gamification[1] and edutainment[2], have largely failed to deliver on that promise
  • Activity-goal alignment theory can help address those failures
  • Design Research theory can be applied to improve practice

From 2012 to 2017, my colleagues and I explored how to apply activity-goal alignment theory to Cognitive Behavioral Therapy (CBT), the leading evidence-based approach to preventing and treating depression and anxiety.  Given the general problem of adherence (many users begin, but do not complete, CBT treatment), and in particular the many self-directed digital CBT products that have failed to engage users enough to achieve the impact seen in laboratory settings, I argue that our prioritization of engagement is appropriate.

Specifically, I aim to “raise the bar” for CBT games.  I critique top-cited CBT games (e.g. SPARX) as being too easy: their designers allow players to win without struggle or challenge, and they provide CBT knowledge with only a weak connection to gameplay outcomes.

To address this, I argue that good therapy requires patients to face difficult challenges and struggle to overcome them.  In support, I cite Activity-Goal Alignment, basic game design theory, and widely accepted psychological approaches, such as exposure therapy, that embody stressful challenge.

Specifically, I review and reject simple design moves, such as modifying existing games to make challenges merely “harder”.  I argue that different game designs are required, and I suggest design requirements such as:

  • CBT games should embed CBT theory in the game’s rules and mechanics, rather than in informative text content.  For example, a game design method may identify “rumination”, a specific behavior from CBT theory, and create a lose state around it.
  • No easy wins. New players should iterate on hard problems, failing before succeeding.
  • Players must discover CBT tools in ways that feel authentic to the game experience.
  • When CBT tools are embodied in in-game items, the value of the tool must fit the player’s perception of that in-game item’s value.
  • Players must understand the real-life purpose, meaning, and/or value of a CBT tool to use it in-game.
  • Players must use CBT tools to win the game.
  • Players must transfer in-game CBT tools to improve their real-life experiences.


Formal (e.g. NIH-funded) research design might include:

  • Two studies (N=30 pilot, N=300 final)
  • Five Stages (recruit, screen, pre survey, treatment, post survey)
  • Justification for setting (lab vs natural), participant selection, exposure
  • Analysis of methodological weaknesses (e.g. biases due to compensation) and mitigation efforts
  • Quantitative data from valid, reliable published measures of knowledge and attitudinal change
  • Power and bias reviews by statistics, subject-matter, and other research experts, prior to the study
  • IRB approval
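The power review above can be illustrated with a quick normal-approximation power calculation. This is only a sketch: the function name, the medium effect size d = 0.5, and the per-arm counts (15 per arm for the N=30 pilot, 150 per arm for the N=300 final study) are illustrative assumptions, not figures from the original studies.

```python
from statistics import NormalDist

def two_sample_power(n_per_group: int, d: float, alpha: float = 0.05) -> float:
    """Approximate power of a two-sided, two-sample comparison of means
    for a standardized effect size d (normal approximation)."""
    z = NormalDist()
    z_crit = z.inv_cdf(1.0 - alpha / 2.0)      # two-sided critical value
    ncp = d * (n_per_group / 2.0) ** 0.5       # noncentrality parameter
    return (1.0 - z.cdf(z_crit - ncp)) + z.cdf(-z_crit - ncp)

# N=30 pilot (15 per arm) vs. N=300 final study (150 per arm), d = 0.5:
print(round(two_sample_power(15, 0.5), 2))     # ~0.28: pilot is underpowered
print(round(two_sample_power(150, 0.5), 2))    # ~0.99: final study is well powered
```

A review like this makes explicit why a pilot can only surface promising signs, while the final study is sized to detect the effect.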

My study designs begin with literature reviews, and they address known weaknesses. For example, many studies of game-based interventions were unable to distinguish engagement effects attributable specifically to the novelty of a video game experience, or relied on anecdotal data to support claims of engagement comparable to commercial games. To address this in a recent project, I added a second control group to create a three-group design: one control group plays a top-selling commercial game, the experimental group plays our prototype, and a third group engages in Treatment As Usual (an online e-learning course with known levels of efficacy).

A third stage of exploratory (‘promising signs’) user research may also precede the pilot, during production.
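The allocation step of the three-group design could be operationalized roughly as follows; the arm names and the shuffled round-robin scheme are illustrative assumptions, not the study’s actual procedure.

```python
import random

ARMS = ["commercial_game_control", "prototype", "treatment_as_usual"]

def assign_arms(participant_ids, seed=0):
    """Shuffle participants, then deal them round-robin across the
    three arms, producing a balanced random allocation."""
    rng = random.Random(seed)        # fixed seed for a reproducible assignment
    ids = list(participant_ids)
    rng.shuffle(ids)
    return {pid: ARMS[i % len(ARMS)] for i, pid in enumerate(ids)}

groups = assign_arms([f"P{i:03d}" for i in range(300)])
print(list(groups.values()).count("prototype"))   # 100 per arm
```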

As an applied researcher, I often employ modern product design approaches (iterative, user-data-driven), combining and modifying methods from the following academic and commercial fields:

  • psychology (behavioral, social)
  • digital health
  • design
  • media and communication
  • commercial video game production

I often work on early-stage (concept) projects, typically employing the following methods:

  • Basic ethnographic methods (e.g. natural and laboratory behavioral observation, semi-structured interviews, and/or attitudinal/knowledge surveys)
  • Secondary source research (academic meta-analyses, competitive product analyses, business reports based on prior user surveys, published essays from designers and user researchers)

For example, I might evaluate a nonfunctional low-fidelity prototype using a protocol like this:

  • Typical N=2-10
  • Recruiting via friends & family and online services (Mechanical Turk, Sermo); compensated
  • Screening, survey, 60 minute videoconference, intro-play-discuss
  • Often single-blinded (we show two prototypes per session)
  • Data, qualitative (notes and transcripts from semi-structured interviews; observation of behavioral and social interactions with prototypes, for function/feature definition)
  • Data, quantitative (sorts, speech-based measures, observed actions/body language, data recorded by the prototype)
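The mixed qualitative and quantitative data from such a session can be kept in a simple structured record; the field names below are an illustrative sketch, not the original study’s instrument.

```python
from dataclasses import dataclass, field

@dataclass
class PlaytestSession:
    """One participant's 60-minute intro-play-discuss session."""
    participant_id: str
    prototype_shown: str                                    # which blinded prototype
    interview_notes: list = field(default_factory=list)     # qualitative
    observed_behaviors: list = field(default_factory=list)  # qualitative
    sort_results: list = field(default_factory=list)        # quantitative
    prototype_events: list = field(default_factory=list)    # logged by prototype

session = PlaytestSession("P01", "prototype-A")
session.interview_notes.append("confused by the inventory screen")
session.observed_behaviors.append("leaned toward screen during challenge")
```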

As an applied researcher, my aims are often broader than those of traditional academic research. In addition to typical UX research aims (adoption, engagement, commercial success), my work often has an efficacy-assessment aim as well.  I customize methods to achieve these combinations.  For example, for a nonprofit-funded socioemotional treatment experiment in 2015, my colleagues and I used an innovative pre-pilot study method I named “Rapid Evaluation”.

  • Aim: estimate (rapidly, if imprecisely, measure) efficacy and engagement prior to pilot
  • Timeline: iterative, in 1-3 week cycles
  • Product: functional prototype of proposed product
  • Protocol: online recruiting via mTurk, pre-screening, pre-interview 10-item survey, 60 minute play-discuss session via videoconference
  • Coding: single-rater coding of body proximity, presence/absence of emotion (frustration/confusion), and conversation topic (social rejection experience)
  • Data: a spreadsheet of coded behaviors, brief written statements on 3 topics per playtest, and videos of playtests
  • Outcome: we believe the “Rapid Eval” method fills an important gap between non-behavioral methods (surveys, secondary sources) and a typical 8-month academic pilot test. Its findings are too low-quality to substitute for a pilot, but they can be used to justify resources for one.
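The single-rater coding step can be summarized across a 1-3 week cycle with a simple tally; the code labels and rows below are illustrative examples, not data from the 2015 study.

```python
from collections import Counter

# Each row pairs a participant with one coded observation.
coded_rows = [
    ("P01", "frustration"),
    ("P01", "social_rejection_topic"),
    ("P02", "confusion"),
    ("P02", "frustration"),
    ("P03", "frustration"),
]

def tally_codes(rows):
    """Count how many distinct participants exhibited each code."""
    seen = {(pid, code) for pid, code in rows}   # dedupe repeats per participant
    return Counter(code for _, code in seen)

print(tally_codes(coded_rows)["frustration"])    # 3 of 3 participants
```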

For preproduction, or for larger-scale research during production that is sufficiently resourced, I collaborate with experts in laboratory settings to design experiments suited to the budget and the specific project.

[1] Gamification is the addition of generic video game mechanics, such as point scoring and leaderboards, to non-game products.

[2] Edutainment is didactically delivered educational content interleaved with entertaining experiences.




  • Twice invited presenter and design participant, “Depression Game Jam” workshop, University of Southern California (2014), Radboud University (2015)
    • Presentation title: “Prototyping Methods to find High Alignment between Game Mechanics and CBT for Teen Depression.”
  • Accepted panel, Foundations of Digital Games (FDG) 2015, Asilomar, CA.
    • “Game Design Prototyping Methods To Find High Alignment Between Game Mechanics And CBT For Teen Depression For Android Devices.”
  • Invited speaker, “Youth Technology Health Live 2015”, San Francisco, April 2015.  “Methods and Findings for Engagement Studies of ‘Surviving Independence’ Game-Based Behavior Change Intervention”
  • Invited participant, HealthFoo 2015
  • Won $250k grant to Northwest Media Inc., titled “Online Training for Foster and Primary Parents of Neglected Children”, NIH. Co-PI. Designed, developed product, completed pilot study, 2014.
  • Won $2.4M grant to Northwest Media Inc., titled “VSG”: a video game that teaches independent living skills to at-risk youth, NIH. Co-PI. Designed, developed product, completed final study, 2014.
  • 2011-2012 “Lead Researcher” (PI), successful $30,000 YAWCRC “Young People and Game Developers Working Together: Modelling a Process for Video Game Design Co-Creation” link