Capturing Evidence From Wireless Internet Services Development
The merging of the Internet with the wireless services domain has created a potential market characterized by new technologies and time-to-market pressure. The lack of knowledge about these new technologies, combined with the need to become competitive quickly, demands that software organizations learn rapidly about this domain and its characteristics. Additionally, the effects of development techniques in this context need to be understood. Learning from previous experiences in such a changing environment requires a clear understanding of which evidence should be captured and how it could be used in the future. This article presents definitions of quantitative and qualitative evidence, along with templates for capturing such evidence in a systematic way. The templates were used in the context of two pilot projects dealing with the development of Wireless Internet Services.
💡 Research Summary
The paper addresses the emerging domain of Wireless Internet Services (WIS), where rapid technological change and intense time‑to‑market pressure create a knowledge gap for software organizations. To remain competitive, firms must acquire domain knowledge quickly and understand how development techniques behave under these constraints. The authors argue that systematic capture of both quantitative and qualitative evidence from past projects is essential for building a reusable knowledge base that can guide future work.
First, the authors define quantitative evidence as objectively measurable data such as response times, packet loss rates, battery consumption, defect density, test coverage, and other performance or quality metrics. Qualitative evidence is described as subjective information gathered through interviews, observations, retrospectives, and narrative reports that capture stakeholder perceptions, developer attitudes, and contextual factors not reflected in raw numbers.
To collect this evidence in a repeatable way, the paper introduces two Evidence Capture Templates (ECTs). The quantitative template includes fields for the measurement target, measurement method, collection point, expected value, actual value, deviation analysis, and corrective actions. It is designed to be linked with automated logging, performance‑testing scripts, and monitoring tools, and it incorporates metadata (project phase, responsible person, version) for traceability. The qualitative template contains structured interview questions, observation checkpoints, free‑form narrative sections, and role identifiers, enabling systematic capture of insights from developers, designers, product owners, and end‑users.
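The quantitative template's fields can be sketched as a simple record type. The field names below follow the paper's description of the template; the class name, helper method, and example values are illustrative assumptions, not the authors' actual implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class QuantitativeEvidence:
    """One quantitative evidence entry, mirroring the template fields
    described in the paper. Names and structure are illustrative."""
    measurement_target: str         # e.g. "network latency (ms)"
    measurement_method: str         # e.g. "automated performance-test script"
    collection_point: str           # where/when in the process the value is taken
    expected_value: float
    actual_value: float
    # traceability metadata, as the template prescribes
    project_phase: str
    responsible_person: str
    version: str
    corrective_action: Optional[str] = None  # filled in after deviation analysis

    def deviation(self) -> float:
        """Deviation of the actual value from the expected one."""
        return self.actual_value - self.expected_value

# Hypothetical latency entry in the spirit of the mobile-payment pilot
entry = QuantitativeEvidence(
    measurement_target="network latency (ms)",
    measurement_method="performance-test script",
    collection_point="end of weekly test run",
    expected_value=200.0,
    actual_value=290.0,
    project_phase="construction",
    responsible_person="test lead",
    version="0.3",
)
print(entry.deviation())  # 90.0 -> triggers deviation analysis
```

A qualitative entry would follow the same pattern but replace the numeric fields with role identifiers, interview responses, and free-form narrative text, so both evidence types can live in one version-controlled repository.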
The authors applied these templates in two pilot projects:
- Mobile Payment Prototype – The team measured network latency and battery usage using the quantitative template, while weekly sprint retrospectives fed into the qualitative template. Early quantitative data revealed an unexpected network bottleneck; qualitative feedback highlighted user expectations for instantaneous UI response. The combined evidence prompted a redesign of the communication layer and UI optimizations, reducing latency by 30 % and improving perceived responsiveness.
- Location‑Based Advertising Service – Here, A/B testing results (click‑through rate, conversion rate) were logged quantitatively, and user interviews plus usability observations were recorded qualitatively. When quantitative metrics plateaued, qualitative data uncovered that users found ad placement intrusive, leading to a redesign of the ad positioning algorithm. Subsequent testing showed a 15 % lift in conversion and higher user satisfaction scores.
Across both pilots, the evidence‑driven approach yielded several key insights:
- Early Risk Identification – Quantitative metrics flagged performance problems before they became critical, while qualitative inputs surfaced usability and expectation gaps that numbers alone missed.
- Improved Decision Making – Having both data types in a single, version‑controlled repository allowed the teams to prioritize fixes based on a balanced view of technical impact and stakeholder value.
- Cultural Benefits – Transparent documentation of evidence fostered trust among cross‑functional teams, encouraging a learning‑oriented culture where failures were openly discussed and lessons systematically archived.
- Scalability and Extensibility – By embedding metadata and version control, the templates support reuse across projects and can be extended to new domains with minimal effort.
The paper concludes that systematic evidence capture is not a peripheral activity but a core component of agile, high‑velocity development in the WIS context. The proposed templates provide a practical, domain‑agnostic framework that can be integrated with existing development pipelines and automated tooling. Future work is suggested in the direction of automated evidence extraction (e.g., mining logs, sentiment analysis of retrospectives) and the creation of a centralized evidence repository that supports query‑based retrieval for knowledge reuse across the organization. Ultimately, the authors argue that disciplined evidence management enables faster learning cycles, reduces time‑to‑market, and improves the overall quality and sustainability of wireless internet services.