HabitLab : in-the-wild behavior change experiments at scale

Abstract/Contents

Abstract
Behavior change systems help people manage their time online. However, existing productivity systems have tended to assume a one-size-fits-all solution, whereas there are many factors - novelty effects, attrition, influences from other apps and devices, and differences in individual motivation - that we must take into account. These effects have been researched mostly in small-scale lab studies in domains other than online behavior change, so there is a large space of opportunities for studying how they manifest in real-world online behavior change contexts, and how to design better behavior change systems using these insights. In this thesis we present HabitLab, an in-the-wild experimentation platform we developed for conducting behavior change experiments, as well as a set of studies we ran on the platform. HabitLab is a browser extension and mobile phone app with over 12,000 voluntary daily active users, who install it to help them reduce the time they spend online and on their phones. It works by displaying one of 20+ interventions whenever the user opens an app or site they wish to spend less time on. We use HabitLab as a large-scale experiment platform to understand behavior change. In our first set of studies, we investigate novelty effects of interventions, finding that, compared to always showing the same intervention, a strategy of rotating between different interventions improves intervention effectiveness, but at the cost of increased attrition. This attrition is partly due to users being unfamiliar with rotating interventions, and improving users' mental models with a notice displayed whenever a new intervention appears reduces this attrition. In our second set of studies, we investigate whether reducing time on one site or app by intensifying interventions influences time on other sites, apps, and devices. We find that on the browser, reducing time on one site reduces time spent elsewhere, but we do not observe this effect on mobile devices, and we do not observe cross-device effects. In our third set of studies, we investigate users' motivation levels over time, as indicated by the difficulty of the interventions they select. We find that users initially overestimate how difficult an intervention they want, and that their choices of difficulty progressively decline over time. Thus, we have found that online behavior change is a domain where the incentives of users and researchers align: researchers can run large-scale in-the-wild experiments that yield ecologically valid insights about how theories from behavioral psychology and economics play out in the real world, while users benefit from the more effective, scientifically informed behavior change systems that these experiments and data let us develop.
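The abstract contrasts two intervention-assignment strategies: always showing the same intervention for a given site versus rotating through the available interventions on each visit. The following is a minimal, hypothetical TypeScript sketch of that distinction only; the intervention names and functions are illustrative assumptions, not HabitLab's actual code or API.

```typescript
// Sketch (not HabitLab's implementation) of "same" vs. "rotate" intervention
// assignment, the two strategies compared in the first set of studies.

type Intervention = string;

// Hypothetical intervention pool; HabitLab ships 20+ such interventions.
const interventions: Intervention[] = [
  "show timer overlay",
  "require click-through delay",
  "hide news feed",
  "show time-spent reminder",
];

// "Same" strategy: pick one intervention per site and keep it fixed,
// using a deterministic hash of the site name so the choice is stable.
function sameIntervention(site: string): Intervention {
  let hash = 0;
  for (const ch of site) hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  return interventions[hash % interventions.length];
}

// "Rotate" strategy: cycle through interventions so successive visits see
// different ones, which the thesis finds more effective but with higher attrition.
const visitCounts = new Map<string, number>();
function rotateIntervention(site: string): Intervention {
  const n = visitCounts.get(site) ?? 0;
  visitCounts.set(site, n + 1);
  return interventions[n % interventions.length];
}

// Example: three successive visits to the same site.
for (let visit = 0; visit < 3; visit++) {
  console.log("same:", sameIntervention("facebook.com"),
              "| rotate:", rotateIntervention("facebook.com"));
}
```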

Description

Type of resource text
Form electronic resource; remote; computer; online resource
Extent 1 online resource.
Place California
Place [Stanford, California]
Publisher [Stanford University]
Copyright date 2019; ©2019
Publication date 2019
Issuance monographic
Language English

Creators/Contributors

Author Kovács, Géza
Degree supervisor Bernstein, Michael S, 1984-
Thesis advisor Bernstein, Michael S, 1984-
Thesis advisor Fogg, Brian J
Thesis advisor Landay, James A, 1967-
Degree committee member Fogg, Brian J
Degree committee member Landay, James A, 1967-
Associated with Stanford University, Computer Science Department.

Subjects

Genre Theses
Genre Text

Bibliographic information

Statement of responsibility Geza Kovacs.
Note Submitted to the Computer Science Department.
Thesis Thesis (Ph.D.), Stanford University, 2019.
Location electronic resource

Access conditions

Copyright
© 2019 by Geza Kovacs
License
This work is licensed under a Creative Commons Attribution-NonCommercial 3.0 Unported license (CC BY-NC 3.0).
