2019
DOI: 10.1109/tse.2017.2759112

Listening to the Crowd for the Release Planning of Mobile Apps

Cited by 65 publications (55 citation statements: 1 supporting, 54 mentioning, 0 contrasting)
References 29 publications
“…Villarroel et al [43] and Scalabrino et al [44] conducted a semi-structured interview with 3 project managers of software companies developing mobile apps in order to evaluate the usefulness of their tool (which extracts and clusters user reviews from the app store into bug report or new feature request). The tool was first demonstrated to the managers, then they were asked about the usefulness of reviews (do you analyse user reviews when planning a new release?…”
Section: Related Work (mentioning)
confidence: 99%
“…Using spaCy, we created the part-of-speech tags for the title, the body, and the phrases of a post. While Chaparro et al also used NLP patterns, we opted for a simple, effective, and pretty consolidated approach to classify text, such as the one successfully used by Villarroel et al (2016) and Scalabrino et al (2017), when classifying app reviews.…”
Section: Experimental Setup Using Machine Learning Algorithms (mentioning)
confidence: 99%
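The excerpt above describes generating part-of-speech tags with spaCy for a post's title, body, and phrases. The following is a minimal illustrative sketch of that kind of tagging; the model name and the example review text are assumptions, not taken from the cited study.

import spacy

# Minimal sketch of spaCy part-of-speech tagging as described in the excerpt.
# "en_core_web_sm" and the example review below are illustrative assumptions.
nlp = spacy.load("en_core_web_sm")

review = "The app crashes every time I try to upload a photo."
doc = nlp(review)

for token in doc:
    # token.pos_ is the coarse universal POS tag, token.tag_ the fine-grained tag
    print(token.text, token.pos_, token.tag_)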
“…Following the general idea of incorporating user feedback into typical development process, Di Sorbo et al [16], [17] and Scalabrino et al [14], [66] proposed SURF and CLAP, two approaches aiming at recommending the most important reviews to take into account while planning a new release of a mobile application. CLAP improves AR-MINER by clustering reviews into specific categories (e.g., reports of security issues) and by learning from the app history (or from similar apps) which reviews should be addressed [66]. SURF proposed a first strategy to automatically summarize user feedback in more structured and recurrent topics [17], [34] (e.g., GUI, app pricing, app content, bugs, etc.).…”
Section: User Feedback Analysis and App Success (mentioning)
confidence: 99%
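CLAP, as described in the excerpt above, clusters user reviews into categories such as bug reports and feature requests. The sketch below illustrates only that general classification idea with a generic bag-of-words classifier; it is not the authors' pipeline, and the toy reviews and labels are invented.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled reviews (invented) standing in for a real training set.
reviews = [
    "App crashes when I open the camera",
    "Login fails after the latest update",
    "Please add an option to export notes as PDF",
    "Would love a dark theme for night use",
]
labels = ["bug report", "bug report", "feature request", "feature request"]

# TF-IDF features plus a linear classifier: a deliberately simple stand-in
# for the review-categorization step described in the excerpt.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(reviews, labels)

print(clf.predict(["The app freezes on startup", "Add support for home screen widgets"]))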