Research Statement – Academic Highlights – Methods


I research and design digital products for millions of users.

I continue to build on my doctoral research, which explores how a digital therapeutic intervention’s aims can be embodied in the moment-to-moment user experience. For example: can we convert teen passion for dramatic stories into real-world self-improvement outcomes? This is an example of ‘transfer’, a long-studied problem in educational games, VR therapy, and other digital-entertainment-based approaches to impact. In my research I argue that transfer is not easy, but it is possible.

My work responds to 30 years of mostly failed attempts to apply video game and digital entertainment product design to non-entertainment products. I agree with theorists who argue these points:

  • Video games have great promise as tools for change
  • Past approaches, such as gamification[1] and edutainment[2], have failed to deliver on that promise
  • Activity-goal alignment theory can address those failures
  • A good method to advance this research area is to apply design theory in building innovative products

From 2012 to 2017, my colleagues and I explored how to apply activity-goal alignment theory to Cognitive Behavioral Therapy (CBT), the leading evidence-based approach to preventing and treating depression and anxiety. Many self-directed digital CBT products have failed to engage their users enough to have major impact. In light of those failures, I argue that our prioritization of engagement is appropriate and has the potential to be more effective than top-cited work (e.g. SPARX), because our users discover, internalize, and employ CBT strategies through playful, iterative, challenge-based practice, mastering game systems whose rules embed CBT.


Formal (e.g. NIH-funded) research design might include:

  • Two studies (N=30 pilot, N=300 final)
  • Five stages (recruit, screen, pre-survey, treatment, post-survey)
  • Justification for setting (lab vs natural), participant selection, exposure
  • Analysis of methodological weaknesses (e.g. biases due to compensation) and mitigation efforts
  • Quantitative data from valid, reliable published measures of knowledge and attitudinal change
  • Power and bias reviews from statistics, subject-matter, and other research experts prior to the study
  • IRB approval
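As a rough illustration of the kind of power review listed above (the effect sizes here are hypothetical, not drawn from any of my actual studies), a normal-approximation sample-size calculation for a two-group comparison can be sketched in a few lines of Python:

```python
import math
from statistics import NormalDist

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate per-group N for a two-sample comparison of means.

    Uses the normal approximation; a t-based calculation gives a
    slightly larger N. Purely illustrative of a pre-study power check.
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_beta = NormalDist().inv_cdf(power)           # quantile for desired power
    n = 2 * ((z_alpha + z_beta) / effect_size) ** 2
    return math.ceil(n)

# A medium effect (Cohen's d = 0.5) needs roughly 63 participants per group,
# one way to sanity-check an N=30 pilot vs. N=300 final design.
print(n_per_group(0.5))  # 63
```

A calculation like this is only a starting point; the expert power and bias reviews above would refine it for the actual measures and design.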

My study designs begin with literature reviews and address known weaknesses. For example, many studies of game-based interventions could not distinguish engagement effects attributable specifically to the novelty of a video game experience, or relied on anecdotal data to support claims of engagement levels comparable to commercial games. To address this in a recent project, I added a second control group to create a three-group design: one control group plays a top-selling commercial game, the experimental group plays our prototype, and a third group receives Treatment As Usual (an online e-learning course with known levels of efficacy).

I also conduct third-stage (‘promising signs’) exploratory user research prior to the pilot, during production.
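The three-group design described above could be operationalized as a simple randomized assignment; the sketch below is my illustration, not the actual study protocol (arm names and the seeded RNG are assumptions for reproducibility):

```python
import random

ARMS = ("commercial_game_control", "prototype_experimental", "treatment_as_usual")

def assign_arms(participant_ids, seed=None):
    """Randomly assign participants to three arms of (near-)equal size."""
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)
    # Round-robin over the shuffled list keeps arm sizes within one of each other.
    return {pid: ARMS[i % len(ARMS)] for i, pid in enumerate(ids)}

groups = assign_arms(range(300), seed=42)
print({arm: sum(1 for a in groups.values() if a == arm) for arm in ARMS})
# 300 participants split evenly, 100 per arm
```

In practice a real protocol would add stratification (e.g. by age or baseline severity), but simple randomization is the core of the three-group comparison.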

As an applied researcher, I take a modern product design approach (iterative, user-data-driven), combining and modifying methods from the following academic and commercial fields:

  • psychology (behavioral, social)
  • digital health
  • design
  • media and communication
  • commercial video game production

I often work on early-stage (concept) projects, typically employing the following methods:

  • Basic ethnographic methods (e.g. natural and laboratory behavioral observation, semi-structured interviews, and/or attitudinal/knowledge surveys)
  • Secondary source research (academic meta-analyses, competitive product analyses, business reports based on prior user surveys, published essays from designers and user researchers)

For example, I might evaluate a nonfunctional low-fidelity prototype using a method like this:

  • Typical N=2-10
  • Recruiting via friends & family and online services (Mechanical Turk, Sermo), compensated
  • Screening, survey, 60-minute videoconference (intro, play, discuss)
  • Often single-blinded (we show two prototypes per session)
  • Qualitative data: notes and transcripts from semi-structured interviews; observation of behavioral and social interactions with prototypes (function/feature definition)
  • Quantitative data: sorting tasks, speech-based measures, observed actions/body language, data recorded by the prototype

As an applied researcher, my aims are often broader than those of traditional academic research. In addition to typical UX research aims (adoption, engagement, commercial success), my work often has an efficacy assessment aim as well. I customize methods to achieve these combinations. For example, for a nonprofit-funded socioemotional treatment experiment in 2015, my colleagues and I developed an innovative pre-pilot study method I named “Rapid Evaluation”.

  • Aim: estimate (rapidly and imprecisely measure) efficacy and engagement prior to the pilot
  • Timeline: iteration in 1-3 week cycles
  • Product: functional prototype of the proposed product
  • Protocol: online recruiting via mTurk, pre-screening, pre-interview 10-item survey, 60-minute play-discuss session via videoconference
  • Coding: single-rater coding of body proximity, presence/absence of emotion (frustration/confusion), and conversation topic (social rejection experience)
  • Data: spreadsheet of coded behaviors, brief written statements on 3 topics per playtest, videos of playtests
  • Outcome: we feel the “Rapid Eval” method fills an important gap between non-behavioral methods (surveys, secondary sources) and a typical 8-month academic pilot test. Its findings are too low in quality to substitute for a pilot, but they can justify resources for one.
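The coded-behavior spreadsheet described above could be modeled with a minimal record type; the field names below are my illustration of such a schema, not the actual coding sheet used in the study:

```python
import csv
import io
from dataclasses import dataclass, asdict, fields

@dataclass
class CodedObservation:
    """One single-rater coding entry per playtest moment (illustrative schema)."""
    playtest_id: str
    timestamp_s: int
    body_proximity: str      # e.g. "leaning_in", "leaning_away"
    emotion_present: bool    # frustration/confusion observed?
    conversation_topic: str  # e.g. "social_rejection_experience"

rows = [
    CodedObservation("PT-01", 120, "leaning_in", False, "social_rejection_experience"),
    CodedObservation("PT-01", 340, "leaning_away", True, "other"),
]

# Serialize the coded observations to CSV, as for a shared spreadsheet.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=[f.name for f in fields(CodedObservation)])
writer.writeheader()
writer.writerows(asdict(r) for r in rows)
print(buf.getvalue())
```

Keeping the coding in a structured, machine-readable form like this makes it easy to tally behaviors per playtest and compare across 1-3 week iteration cycles.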

For preproduction, or for larger-scale research during production that is sufficiently resourced, I collaborate with experts in laboratory settings to design experiments suited to the specific project and its budget.

[1] Gamification is the addition of generic video game mechanics, such as point scoring and leaderboards, to non-game products.

[2] Edutainment is didactically delivered educational content interleaved with entertaining experiences.




  • Twice invited presenter and design participant, “Depression Game Jam” workshop, University of Southern California (2014), Radboud University (2015)
    • Presentation title: “Prototyping Methods to find High Alignment between Game Mechanics and CBT for Teen Depression.”
  • Accepted panel, Foundations of Digital Games (FDG) 2015, Asilomar CA.
    • “Game Design Prototyping Methods To Find High Alignment Between Game Mechanics And CBT For Teen Depression For Android Devices.”
  • Invited speaker, “Youth Technology Health Live 2015”, San Francisco, April 2015. Presentation title: “Methods and Findings for Engagement Studies of ‘Surviving Independence’ Game-Based Behavior Change Intervention”
  • Invited participant, HealthFoo 2015
  • Won $250k grant to Northwest Media Inc., titled “Online Training for Foster and Primary Parents of Neglected Children”, NIH. Co-PI. Designed and developed the product; completed pilot study, 2014.
  • Won $2.4M grant to Northwest Media Inc., titled “VSG”, a video game that teaches independent living skills to at-risk youth, NIH. Co-PI. Designed and developed the product; completed final study, 2014.
  • 2011-2012: Lead Researcher (PI) on a successful $30,000 YAWCRC grant, “Young People and Game Developers Working Together: Modelling a Process for Video Game Design Co-Creation”