Hosseini, M., Groen, E., Shahri, A. and Ali, R., 2017. CRAFT: A Crowd-Annotated Feedback Technique. In: CrowdRE: 2nd International Workshop on Crowd-Based Requirements Engineering, 4 September 2017, Lisbon, Portugal.
Full text available as:
PDF: Crowd Annotated Feedback Technique.pdf (Accepted Version, 449kB). Available under License: Creative Commons Attribution Non-commercial No Derivatives.
Copyright to original material in this document is with the original owner(s). Access to this content through BURO is granted on condition that you use it only for research, scholarly or other non-commercial purposes. If you wish to use it for any other purposes, you must contact BU via BURO@bournemouth.ac.uk. Any third party copyright material in this document remains the property of its respective owner(s). BU grants no licence for further use of that third party material.
Official URL: https://crowdre.github.io/ws-2017/program.html
Abstract
The ever-increasing accessibility of the web to the crowd, offered by various electronic devices such as smartphones, has facilitated the communication of the needs, ideas, and wishes of millions of stakeholders. To cater for the scale of this input and reduce the overhead of manual elicitation methods, data mining and text mining techniques have been utilised to automatically capture and categorise this stream of feedback, which stakeholders also use, amongst other things, to communicate their requirements to software developers. Such techniques, however, fall short of identifying some of the peculiarities and idiosyncrasies of the natural language that people use colloquially. This paper proposes CRAFT, a technique that utilises the power of the crowd to support richer, more powerful text mining by enabling the crowd to categorise and annotate feedback through a context menu. This, in turn, helps requirements engineers to better identify user requirements within such feedback. This paper presents the theoretical foundations as well as the initial evaluation of this crowd-based feedback annotation technique for requirements identification.
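To make the idea of crowd-annotated feedback concrete, the following is a minimal illustrative sketch in Python of what context-menu annotation and simple aggregation might look like. The category names, class names, and the majority-vote aggregation are assumptions for illustration only; CRAFT's actual annotation taxonomy and processing are defined in the paper, not in this record.

```python
from collections import Counter
from dataclasses import dataclass, field

# Hypothetical category labels for illustration only; the paper's
# actual CRAFT taxonomy is not reproduced in this record.
CATEGORIES = {"bug report", "feature request", "praise", "question"}

@dataclass
class Annotation:
    """One crowd member's label for a span of a feedback text."""
    annotator_id: str
    start: int       # index of the first annotated character
    end: int         # index one past the last annotated character
    category: str

@dataclass
class Feedback:
    """A piece of user feedback plus the crowd's annotations of it."""
    text: str
    annotations: list[Annotation] = field(default_factory=list)

    def annotate(self, annotator_id: str, start: int, end: int,
                 category: str) -> None:
        """Record one annotation, e.g. as selected from a context menu."""
        if category not in CATEGORIES:
            raise ValueError(f"unknown category: {category}")
        if not (0 <= start < end <= len(self.text)):
            raise ValueError("span out of range")
        self.annotations.append(Annotation(annotator_id, start, end, category))

    def majority_category(self):
        """Aggregate annotations by simple majority vote (a common
        crowdsourcing aggregation scheme; not necessarily CRAFT's)."""
        if not self.annotations:
            return None
        counts = Counter(a.category for a in self.annotations)
        return counts.most_common(1)[0][0]

# Usage: three crowd members label the same feedback text.
fb = Feedback("The app crashes when I rotate my phone. Please add a dark mode.")
fb.annotate("worker-1", 0, 40, "bug report")
fb.annotate("worker-2", 0, 40, "bug report")
fb.annotate("worker-3", 41, 65, "feature request")
print(fb.majority_category())  # -> "bug report"
```

Annotations labelled this way could then feed a text mining pipeline as training or filtering signals, which is the role the abstract describes for the crowd's input.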
| Item Type: | Conference or Workshop Item (Paper) |
|---|---|
| Uncontrolled Keywords: | crowdsourcing; requirements elicitation; feedback categorisation; crowdsourced text mining |
| Group: | Faculty of Science & Technology |
| ID Code: | 29514 |
| Deposited By: | Symplectic RT2 |
| Deposited On: | 25 Jul 2017 08:59 |
| Last Modified: | 14 Mar 2022 14:06 |