User Story Quality in Practice: Comparison

User stories are widely used as requirements in Agile development. However, few studies have assessed the quality of user stories in practice.

  • user stories
  • problem-oriented requirements
  • case study

1. Introduction

User stories have their origins in extreme programming (XP). Kent Beck, the founder of XP, stated that user stories were created to address the specific needs of software development, conducted by small teams in the face of changing and vague requirements [1].
In general, user stories are brief descriptions of a system feature written from the perspective of the customer who wants the system [2]. For example, a widely used template reads: "As a <role>, I want <goal>, so that <benefit>". Different user story templates and practices have been proposed by Beck [1], Jeffries et al. [3], Beck and Fowler [4], and Cohn [2][5].
According to a 2014 survey, user stories have become the most used form of requirements in agile environments [6]. In fact, user stories are integrated into many agile techniques, such as release and iteration planning and tracking the progress of a project. Thus, user stories are increasingly a means of communication with end-users and customers, as well as the basis for developing the related functionality and building it into the system [7].
Despite their popularity, there is little evidence of the effectiveness of user stories [8]. According to some authors, only simple, customer-visible functional requirements can be expressed by user stories [9], while the consistency and verifiability of user stories are difficult to validate [10].

2. User Stories as Requirements

Many organizations prefer user stories as high-level requirements over traditional requirements [9]. User stories cover the essential parts of a requirement: who it is for, what is needed from the system, and, optionally, why it is important [11][12]. Further, user stories serve as conversation starters with customers, and they may change before, or during, implementation [2]. Despite their popularity, there are many concerns with user stories. For instance, they are often imprecise, require significant implementation effort [12][13], and are unsuited to expressing requirements for large or complex software systems; in such cases, separate system and subsystem requirements are needed [12][13]. Another major concern is the lack of requirements traceability, which creates problems when requirements, code, and tests change over time [14][15]. Non-functional requirements (NFRs) are often ill-defined or ignored in user stories, as stakeholders focus on core functionality and neglect scalability, maintainability, security, portability, and performance [9][10].

The requirements engineering community has made several attempts to address these concerns and improve the quality of user stories [11][12]. For instance, some authors [16][17] suggested guidelines for documenting NFRs that take the diversity, scope, and level of detail of the NFRs into account. Other authors [17][18] suggested tools for visually modeling NFRs to help reason about them. Some approaches to improving user stories rely on qualitative heuristics, such as those of the INVEST framework (Independent, Negotiable, Valuable, Estimable, Small, Testable) [18][19] and the general guidelines for ensuring quality in agile requirements engineering proposed by Heck and Zaidman [19][20]. Lucassen et al. [11][12] proposed the Quality User Story (QUS) Framework, containing syntactic, semantic, and pragmatic criteria that user stories should conform to. These frameworks do not advocate IEEE criteria, such as verifiability, traceability, and modifiability, but they recommend other criteria that overlap with the IEEE criteria, such as completeness and unambiguity.

A notable contribution to enhancing the quality of user stories in practice is a unified model that covers the essential elements of user stories: who it is for, why it is important, and what is needed [20][21], as well as the visual representation of user stories using Rational Trees (RTs) [21][22]. RTs help analysts view the interdependencies among user stories and could potentially help them identify inconsistencies or missing requirements. However, despite their benefits, building RTs can be challenging for analysts [22][23]. Further, it was found that using a visual representation (a Rational Tree) did not help analysts determine whether user stories were missing requirements [23][24].
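
To make the idea of syntactic quality criteria more concrete, the sketch below shows one possible automated check of template conformance plus a rough atomicity heuristic, in the spirit of the template-based guidelines discussed above. It is only a minimal illustration under assumed conventions: the regular expression, the function name check_story, and the 'and'-based heuristic are inventions for this example, not the tooling proposed by any of the cited authors.

```python
import re

# Hypothetical, minimal syntactic check for user stories. It tests whether a
# story follows the common "As a <role>, I want <goal>, so that <benefit>"
# template and flags goals that bundle several requirements with "and".
# This sketch is illustrative only; it is not the QUS framework's tooling.

# Accepts "As a/an <role>, I want <goal>" with an optional "so that <benefit>" clause.
TEMPLATE = re.compile(
    r"^As an? (?P<role>.+?), I want (?P<goal>.+?)"
    r"(?:,? so that (?P<benefit>.+?))?\.?$",
    re.IGNORECASE,
)

def check_story(story: str) -> list[str]:
    """Return a list of human-readable issues found in a single user story."""
    issues = []
    match = TEMPLATE.match(story.strip())
    if not match:
        issues.append("does not follow the 'As a ..., I want ..., so that ...' template")
        return issues
    if match.group("benefit") is None:
        issues.append("missing the optional 'so that ...' clause (rationale)")
    # Very rough atomicity heuristic: a goal containing 'and' may bundle two requirements.
    if " and " in match.group("goal").lower():
        issues.append("goal contains 'and'; the story may not be atomic")
    return issues

if __name__ == "__main__":
    stories = [
        "As a registered user, I want to reset my password so that I can regain access to my account.",
        "As an admin, I want to export reports and delete old accounts.",
        "The system shall support password resets.",
    ]
    for s in stories:
        problems = check_story(s)
        verdict = "OK" if not problems else "; ".join(problems)
        print(f"- {s}\n  -> {verdict}")
```

Such a check covers only surface-level (syntactic) issues; semantic and pragmatic criteria, such as whether a story is valuable or estimable, still require human judgment.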

3. Requirements in Practice

Fernandez et al. [24][25] conducted a mapping study of empirical evaluations of software requirement specification techniques and found that most authors conducted their experiments in academic environments. A recent example of such an experiment can be found in [25][26]: the authors performed a controlled experiment with 118 undergraduate students to assess the benefits of user stories versus use cases as part of a course. They concluded that participants could derive a more complete conceptual model with user stories because of the conciseness and focus of user stories and the repetition of entities across them.

A few articles have examined how requirements engineering methods are used in practice. For instance, a recent study [26][27] assessed the quality of conceptual models derived from two notations, user stories and use cases, and found that the requirement notation has little effect on the quality of the derived conceptual models. Another empirical study explored how practitioners use user stories and perceive their effectiveness (defined as the extent to which user stories improve productivity and the quality of work deliverables): Lucassen et al. conducted a survey with 182 practitioners and 21 follow-up interviews, and the results show that practitioners agree that using a user story template and quality guidelines, such as INVEST, improves the quality of user story requirements [8]. Related studies [8][25] assessed the perceived benefits and effectiveness of user stories from the perspective of students and practitioners. Since perceived effectiveness is subjective, there is a need to assess the effectiveness of user stories more objectively. Another study [6] found that most practitioners do not use requirements standards and prefer their own personal style.

The aforementioned studies have undoubtedly contributed to the body of knowledge. Nonetheless, none of them evaluated the quality of the requirements of a real-life project expressed with user stories. A recent systematic literature review suggests that agile requirements engineering needs more empirical investigation to better understand its impact in practice [27][28]. One attempt to evaluate the quality of non-agile requirements was an experiment comparing use cases with task descriptions, in which 15 professionals specified requirements for a hotline system with either notation [28][29]. The experiment showed that traditional use cases covered stakeholder needs poorly in areas where improvement was important but difficult, and that they severely restricted the solution space.