Evaluation in the Crowd. Crowdsourcing and Human-Centered Experiments : Dagstuhl Seminar 15481, Dagstuhl Castle, Germany, November 22-27, 2015, Revised Contributions /

Bibliographic Details
Corporate Authors: SpringerLink Online service
Group Author: Archambault, Daniel; Purchase, Helen; Hoßfeld, Tobias
Published: Springer International Publishing : Imprint: Springer,
Publisher Address: Cham
Publication Dates: 2017.
Literature type: eBook
Language: English
Series: Lecture Notes in Computer Science, 10264
Online Access: http://dx.doi.org/10.1007/978-3-319-66435-4
Summary: As the outcome of the Dagstuhl Seminar 15481 on Crowdsourcing and Human-Centered Experiments, this book is a primer for computer science researchers who intend to use crowdsourcing technology for human-centered experiments. The focus of this Dagstuhl seminar, held in Dagstuhl Castle in November 2015, was to discuss experiences and methodological considerations when using crowdsourcing platforms to run human-centered experiments to test the effectiveness of visual representations. The inspiring Dagstuhl atmosphere fostered discussions and brought together researchers from different research domains.
Carrier Form: 1 online resource (VII, 191 pages): illustrations.
ISBN: 9783319664354
Index Number: QA76
CLC: TP11
Contents: Crowdsourcing Versus the Laboratory: Towards Human-centered Experiments Using the Crowd -- Understanding The Crowd: Ethical and Practical Matters in the Academic Use of Crowdsourcing -- Crowdsourcing Technology to Support Academic Research -- Crowdsourcing for Information Visualization: Promises and Pitfalls -- Cognitive Information Theories of Psychology and Applications with Visualization and HCI through Crowdsourcing Platforms -- Crowdsourcing Quality of Experience Experiments.