A lightweight forum-based distributed requirement elicitation process for open source community
Nowadays, many open source communities adopt forums to acquire requirements from scattered stakeholders. However, the requirements collection process often suffers from unformatted descriptions and unfocused discussions. In this paper, we establish a framework, ReqForum, that defines the metamodel of the requirement elicitation forum. Based on it, we propose a lightweight forum-based requirements elicitation process comprising six steps: template-based requirements creation, opinion collection, requirements collection, requirements management, capability identification, and the incentive mechanism. Following the proposed process, the prototype SKLSEForum is built by composing Discuz and its existing plug-ins. The implementation indicates that the process is feasible and the cost is economical.
💡 Research Summary
The paper addresses two persistent problems in open‑source communities that rely on web forums for gathering requirements: (1) unstructured, free‑form requirement descriptions that make later analysis difficult, and (2) unfocused, sprawling discussions that dilute the value of stakeholder input. To overcome these issues, the authors introduce a conceptual framework called ReqForum, which defines a metamodel comprising six core entities—Requirement, Stakeholder, Thread, Opinion, Capability, and Incentive—and the relationships among them. This metamodel provides a formal backbone for representing requirements within a forum environment, enabling traceability, classification, and contributor evaluation.
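The six entities of the ReqForum metamodel and their relationships can be sketched as plain data classes. This is only an illustration: the paper names the entities, but the field names below (e.g. `user_id`, `kind`, `points`) are assumptions for the sake of a concrete example.

```python
from dataclasses import dataclass, field

# Illustrative sketch of the six ReqForum metamodel entities.
# Entity names come from the paper; field names are assumed.

@dataclass
class Stakeholder:
    user_id: str
    capability_score: float = 0.0

@dataclass
class Opinion:
    author: Stakeholder
    kind: str              # e.g. "support", "oppose", "suggest improvement"
    text: str = ""

@dataclass
class Thread:
    title: str
    opinions: list = field(default_factory=list)  # Opinions posted in the thread

@dataclass
class Requirement:
    title: str
    creator: Stakeholder
    state: str = "Proposed"   # workflow state (see Requirement Management)
    thread: Thread = None     # discussion thread attached to this requirement

@dataclass
class Capability:
    stakeholder: Stakeholder
    score: float              # aggregated from forum activity metrics

@dataclass
class Incentive:
    stakeholder: Stakeholder
    points: int = 0           # gamified reward points
```

Linking each `Requirement` to a `Thread` of typed `Opinion`s is what gives the forum traceability: every piece of feedback stays attached to the artifact it concerns.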
Building on ReqForum, the authors propose a lightweight, six‑step requirement elicitation process:
- Template‑Based Requirement Creation – Users must fill out a predefined template that captures mandatory fields such as title, background, goal, and expected benefit. The template enforces consistency, automatically tags the post, and assigns it to a logical category, thereby reducing ambiguity from the outset.
- Opinion Collection – When posting a comment, contributors select a predefined opinion type (e.g., “support”, “oppose”, “suggest improvement”). This structured feedback keeps discussions on topic and facilitates later quantitative analysis (e.g., vote counts per opinion type).
- Requirement Consolidation – The system automatically detects threads that discuss the same underlying need using text similarity measures, merges them, and flags duplicate submissions. This step curtails redundancy and consolidates community effort around a single, evolving requirement artifact.
- Requirement Management – Each requirement progresses through a visual workflow with states such as Proposed, Under Review, Approved, Implemented, and Closed. State transitions are logged, providing a full audit trail and making the status of any requirement instantly visible to all participants.
- Capability Identification – Contributor “capability” is quantified by aggregating forum activity metrics: number of posts, acceptance rate of answers, voting participation, and comment length. The resulting scores are used to automatically assemble a pool of domain experts who can be invited to review or implement high‑priority requirements.
- Incentive Mechanism – A gamified reward system (points, badges, levels) is tied to the capability scores and to specific actions such as posting a high‑quality requirement or receiving many “support” votes. High‑scoring users gain additional privileges (e.g., the ability to approve requirements) and public recognition, which sustains long‑term engagement.
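Template enforcement (step 1) boils down to rejecting submissions whose mandatory fields are missing or empty. A minimal sketch, in which the field names are taken from the summary but the function and its return convention are assumptions:

```python
# Sketch of template-based requirement creation (step 1): reject posts
# that leave any mandatory template field missing or blank.
REQUIRED_FIELDS = ("title", "background", "goal", "expected_benefit")

def validate_requirement(post: dict) -> list:
    """Return the list of mandatory fields that are missing or empty."""
    return [f for f in REQUIRED_FIELDS if not post.get(f, "").strip()]

post = {"title": "Export to CSV", "background": "reporting needs",
        "goal": "one-click export", "expected_benefit": ""}
missing = validate_requirement(post)
# missing == ["expected_benefit"], so the post would be rejected with a hint
```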
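Because every comment carries a predefined opinion type (step 2), feedback on a requirement can be tallied mechanically. A small sketch of that quantitative analysis; the underscore-style type identifiers are an assumption:

```python
from collections import Counter

# Sketch of structured opinion collection (step 2): comments with a
# predefined opinion type can be tallied per requirement.
OPINION_TYPES = {"support", "oppose", "suggest_improvement"}

def tally_opinions(comments: list) -> Counter:
    """Count comments per opinion type, ignoring malformed entries."""
    counts = Counter()
    for c in comments:
        if c.get("kind") in OPINION_TYPES:
            counts[c["kind"]] += 1
    return counts

comments = [{"kind": "support"}, {"kind": "support"}, {"kind": "oppose"}]
# tally_opinions(comments) == Counter({"support": 2, "oppose": 1})
```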
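The summary says duplicate threads are detected with "text similarity measures" but does not name one. As an illustration only, cosine similarity over word counts is a common lightweight choice; the threshold value below is likewise an assumption:

```python
import math
from collections import Counter

# Illustrative duplicate detection for requirement consolidation (step 3).
# The paper does not specify the similarity measure; cosine similarity
# over bag-of-words vectors is one simple possibility.

def cosine_similarity(a: str, b: str) -> float:
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(v * v for v in va.values()))
            * math.sqrt(sum(v * v for v in vb.values())))
    return dot / norm if norm else 0.0

def find_duplicates(thread_titles: list, threshold: float = 0.6) -> list:
    """Return index pairs of threads similar enough to flag for merging."""
    return [(i, j)
            for i in range(len(thread_titles))
            for j in range(i + 1, len(thread_titles))
            if cosine_similarity(thread_titles[i], thread_titles[j]) >= threshold]
```

In practice the flagged pairs would be shown to a moderator rather than merged blindly, since purely lexical similarity produces false positives.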
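The requirement workflow (step 4) is a small state machine with a logged audit trail. The five states come from the summary; the allowed-transition table below is an assumption for illustration:

```python
# Sketch of requirement management (step 4): a state machine whose
# transitions are logged, giving the audit trail the summary describes.
# The transition table is an assumed, simplified example.
TRANSITIONS = {
    "Proposed":     {"Under Review"},
    "Under Review": {"Approved", "Closed"},
    "Approved":     {"Implemented"},
    "Implemented":  {"Closed"},
    "Closed":       set(),
}

class RequirementWorkflow:
    def __init__(self):
        self.state = "Proposed"
        self.audit_log = [("system", "Proposed")]  # (actor, new_state)

    def transition(self, new_state: str, actor: str):
        if new_state not in TRANSITIONS[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state
        self.audit_log.append((actor, new_state))  # full audit trail
```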
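Capability identification (step 5) and the incentive mechanism (step 6) can be sketched together as a weighted aggregation of activity metrics feeding badge thresholds. The four metrics are the ones listed in the summary; the weights and badge cut-offs are invented for illustration:

```python
# Sketch of capability scoring (step 5) and badges (step 6).
# Metrics come from the summary; weights and thresholds are assumptions.
WEIGHTS = {
    "posts": 1.0,            # number of posts
    "acceptance_rate": 50.0, # fraction of answers accepted (0..1)
    "votes_cast": 0.5,       # voting participation
    "avg_comment_len": 0.1,  # average comment length in characters
}

def capability_score(activity: dict) -> float:
    """Weighted sum of a contributor's forum activity metrics."""
    return sum(WEIGHTS[k] * activity.get(k, 0) for k in WEIGHTS)

# Badge thresholds, checked highest first.
BADGES = [(200, "expert"), (100, "contributor"), (0, "newcomer")]

def badge_for(score: float) -> str:
    for threshold, name in BADGES:
        if score >= threshold:
            return name

activity = {"posts": 40, "acceptance_rate": 0.8,
            "votes_cast": 120, "avg_comment_len": 200}
# capability_score(activity) == 160.0 -> badge "contributor"
```

With scores in hand, the expert pool for reviewing a high-priority requirement is simply the set of contributors above a chosen threshold.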
For validation, the authors implemented a prototype named SKLSEForum by extending the open‑source forum platform Discuz with custom plugins that realize each of the six steps. The choice of Discuz allowed the authors to leverage an existing, widely deployed forum engine, thereby minimizing development effort and cost. The plugins provide template enforcement, opinion‑type UI widgets, automatic thread merging, workflow management, capability scoring, and a points‑based incentive system—all without requiring a separate back‑end infrastructure.
A pilot deployment within an actual open‑source project yielded promising quantitative results: structured requirements rose to over 70 % of all submissions, average discussion length decreased by roughly 30 %, and overall contributor activity (as measured by points earned) increased by about 15 %. These figures suggest that the process not only improves the quality and manageability of requirements but also stimulates more focused and frequent community participation.
In conclusion, the paper demonstrates that a modest augmentation of a conventional forum—guided by a well‑defined metamodel and a six‑step workflow—can transform an otherwise chaotic requirement‑gathering environment into a disciplined, traceable, and motivating platform. The approach is economically attractive because it reuses existing open‑source software, and it is technically scalable, as the underlying metamodel can be extended to incorporate advanced features such as automated priority ranking, sentiment analysis of opinions, or cross‑forum integration. Future work outlined by the authors includes exploring machine‑learning techniques for richer requirement classification and extending the incentive model to support monetary or reputation‑based rewards.