Evaluation beyond usability: Validating sustainable HCI research

Research output: Contribution to book/anthology/report/proceeding › Article in proceedings › Research › peer-review


  • Christian Remy, Physik-Institut, Universität Zürich-Irchel
  • Oliver Bates, Lancaster University
  • Alan Dix, University of Birmingham
  • Vanessa Thomas
  • Mike Hazas, Lancaster University
  • Adrian Friday, Lancaster University
  • Elaine M. Huang, Physik-Institut, Universität Zürich-Irchel

The evaluation of research artefacts is an important step in validating research contributions. Sub-disciplines of HCI often pursue primary goals other than usability, such as Sustainable HCI (SHCI), HCI for development, or health and wellbeing. For such disciplines, established evaluation methods are not always appropriate or sufficient, and new conventions for identifying, discussing, and justifying suitable evaluation methods need to be established. In this paper, we revisit the purpose and goals of evaluation in HCI and SHCI, and elicit five key elements that can provide guidance for identifying evaluation methods for SHCI research. Our essay is meant as a starting point for discussing current evaluation practice in SHCI and improving future practice; we also believe it holds value for other sub-disciplines in HCI that encounter similar challenges in evaluating their research.

Original language: English
Title of host publication: CHI 2018 - Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems: Engage with CHI
Publisher: Association for Computing Machinery
Publication date: 20 Apr 2018
ISBN (Electronic): 9781450356206, 9781450356213
Publication status: Published - 20 Apr 2018
Event: 2018 CHI Conference on Human Factors in Computing Systems, CHI 2018 - Montreal, Canada
Duration: 21 Apr 2018 – 26 Apr 2018


Conference: 2018 CHI Conference on Human Factors in Computing Systems, CHI 2018

Research areas

  • Evaluation, Sustainability, Sustainable HCI, Validation


ID: 143264284