
Evaluating HCI Research beyond Usability

Research output: Contribution to book/anthology/report/proceeding › Article in proceedings › Research › peer-review


  • Christian Remy
  • Oliver Bates, Lancaster University, United Kingdom
  • Jennifer Mankoff, University of Washington, United States
  • Adrian Friday, Lancaster University, United Kingdom
Evaluating research artefacts is an important step in demonstrating the validity of a chosen approach. The CHI community has developed and agreed upon a large variety of evaluation methods for HCI research; however, sometimes those methods are not applicable or not sufficient. This is especially the case when the contribution lies within the context of the application area, such as for research in sustainable HCI, HCI for development, or design fiction and futures studies. In this SIG, we invite the CHI community to share their insights from projects that encountered problems in evaluating research, and we aim to discuss solutions for this difficult topic. We invite researchers from all areas of HCI who are interested in engaging in a debate on issues in the process of validating research artefacts.
Original language: English
Title of host publication: CHI EA '18: Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems
Editors: R. Mandryk, M. Hancock
Number of pages: 4
Volume: 2018
Place of publication: New York, NY, USA
Publisher: Association for Computing Machinery
Publication year: 20 Apr 2018
Edition: April
Article number: SIG13
ISBN (Electronic): 978-1-4503-5621-3
Publication status: Published - 20 Apr 2018

Research areas

  • Evaluation, Research Methods, Validation, Sustainable HCI, HCI4D, Design Fiction, Futures Studies

