Research Contributions at HICSS 2020

Saturday, 28 December 2019, 5:17 p.m.

At the Hawaii International Conference on System Sciences (HICSS), Thorsten Schoormann presents work on "Software Tools for Supporting Reflection in Design Thinking Projects" and "Taxonomy Evaluation".

We are delighted that ISUM has had two papers accepted at the Hawaii International Conference on System Sciences:


Schoormann, T., Hofer, J., & Knackstedt, R. (2020): Software Tools for Supporting Reflection in Design Thinking Projects. In: Proceedings of the Hawaii International Conference on System Sciences (HICSS 2020), Hawaii, USA.

In creative work such as design thinking projects, teams typically seek to solve complex (wicked) problems and to cope with uncertainty and value conflicts. To design solutions that address these aspects, teams usually start doing something, reflect on their results, and adjust their process. By actually doing something, the tacit knowledge (i.e., knowing-in-action) of individuals is disclosed, which can benefit an entire project team because it allows drawing on information and experiences that go beyond single individuals. Accordingly, the present study investigates how tools can be designed to support collaborative reflection in creativity-driven projects. Drawing on reflection theory and several expert interviews, we derive design requirements and present a concrete software-based prototype as an expository instantiation.


Szopinski, D., Schoormann, T., & Kundisch, D. (2020): Criteria as a Prelude for Guiding Taxonomy Evaluation. In: Proceedings of the Hawaii International Conference on System Sciences (HICSS 2020), Hawaii, USA.

Taxonomies are design science artifacts used by researchers and practitioners to describe and classify existing or future objects of a domain. As such, they constitute a necessary foundation for theory building. Yet despite the great interest in taxonomies, there is virtually no guidance on how to rigorously evaluate them. Based on a literature review and a sample of 446 articles, this study explores the criteria currently employed in taxonomy evaluations. Surprisingly, we find that only a minority of taxonomy-building projects actually evaluate their taxonomies and that there is no consistency across the multiplicity of criteria used. Our study provides a structured overview of the taxonomy evaluation criteria used by IS researchers and proposes a set of potential guidelines to support future evaluations. The purposeful and rigorous taxonomy evaluation our study advances contributes to design science research (DSR) by bridging the gap between generic evaluation criteria and concrete taxonomy evaluation criteria.