
Evaluating HCI Research beyond Usability

Publication: Contribution to book/anthology/report/proceedings › Conference contribution in proceedings › Research › peer review

  • Christian Remy
  • Oliver Bates, Lancaster University, United Kingdom
  • Jennifer Mankoff, University of Washington, USA
  • Adrian Friday, Lancaster University, United Kingdom
Evaluating research artefacts is an important step in showcasing the validity of a chosen approach. The CHI community has developed and agreed upon a large variety of evaluation methods for HCI research; however, those methods are sometimes not applicable or not sufficient. This is especially the case when the contribution lies within the context of the application area, such as research in sustainable HCI, HCI for development, or design fiction and futures studies. In this SIG, we invite the CHI community to share their insights from projects that encountered problems in evaluating research, and we aim to discuss solutions to this difficult topic. We invite researchers from all areas of HCI who are interested in engaging in a debate on issues in the process of validating research artefacts.
Original language: English
Title: CHI EA '18 Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems
Editors: R. Mandryk, M. Hancock
Number of pages: 4
Volume: 2018
Place of publication: New York, NY, USA
Publisher: Association for Computing Machinery
Publication date: 20 Apr. 2018
Edition: April
Article number: SIG13
ISBN (electronic): 978-1-4503-5621-3
Status: Published - 20 Apr. 2018

ID: 140732181