The relationship between internet user type and user performance when carrying out simple vs. complex search tasks
It is widely known that people become better at an activity if they perform it often and over a long period. Yet the question of whether being active in related areas, such as communicating online, writing blog articles, or commenting on community forums, has an impact on a person's ability to perform Web searches is still unanswered. Web searching has become a key task conducted online; in this paper we present our findings on whether the user type, which categorises a person's online activities, has an impact on his or her search capabilities. We show (1) the characteristics of different user types when carrying out simple search tasks; (2) their characteristics when carrying out complex search tasks; and (3) the user type characteristics that differ significantly between simple and complex search tasks. The results are based on an experiment with 56 ordinary Web users in a laboratory environment. The Search-Logger study framework was used to analyze and measure user behavior when carrying out a set of 12 predefined search tasks. Our findings include the fact that, depending on task type (simple or complex), significant differences can be observed between users of different types.
💡 Research Summary
The paper investigates whether a person’s predominant online activities—such as information seeking, social communication, or content creation—affect their ability to perform web searches, especially when the tasks vary in complexity. Fifty‑six ordinary internet users were recruited for a laboratory experiment. Prior to the search session, participants completed a questionnaire that classified them into four “user types”: (1) Information‑seeker, (2) Communicator, (3) Creator/Sharer, and (4) Mixed. Each participant then tackled twelve predefined search tasks, six of which were simple fact‑finding queries (e.g., “When was X born?”) and six were complex, multi‑step tasks that required gathering, comparing, and synthesizing information from multiple sources (e.g., “Identify the fastest‑growing eco‑friendly car brand after 2020”).
The Search‑Logger framework recorded detailed interaction logs: query strings, number of query reformulations, clicks on result links, dwell time on pages, and overall session duration. Performance was measured by task accuracy (correct answer) and efficiency (time and clicks). Statistical analysis (ANOVA with Tukey post‑hoc) examined differences among user types for the two task categories.
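To make the efficiency metrics concrete, here is a minimal sketch of how per-task measures like query reformulations, click count, and session duration could be derived from interaction logs of the kind the summary describes. The event schema (`task_id`, timestamp, event type) is a hypothetical stand-in; the actual Search-Logger log format is not given in the source.

```python
# Hedged sketch: aggregating Search-Logger-style interaction events into
# per-task metrics (reformulations, clicks, duration). The event schema
# below is hypothetical, not the framework's actual format.
from collections import defaultdict

# Each event: (task_id, timestamp_seconds, event_type)
events = [
    (1, 0.0, "query"), (1, 12.5, "click"), (1, 30.2, "query"),
    (1, 41.0, "click"), (1, 55.8, "click"),
]

def summarize(events):
    """Group events by task and compute simple per-task efficiency metrics."""
    tasks = defaultdict(list)
    for task_id, ts, kind in events:
        tasks[task_id].append((ts, kind))
    summary = {}
    for task_id, evs in tasks.items():
        evs.sort()  # order events by timestamp
        queries = sum(1 for _, k in evs if k == "query")
        clicks = sum(1 for _, k in evs if k == "click")
        summary[task_id] = {
            # The first query starts the task; only later queries count
            # as reformulations.
            "reformulations": max(queries - 1, 0),
            "clicks": clicks,
            "duration_s": evs[-1][0] - evs[0][0],
        }
    return summary

print(summarize(events))
# → {1: {'reformulations': 1, 'clicks': 3, 'duration_s': 55.8}}
```

Per-task dictionaries like these would then feed directly into the group-level comparison across user types.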
Results show that for simple tasks all user types performed similarly—average accuracy exceeded 85 % and there were no significant differences in time or click count. This suggests that basic search skills are sufficient for straightforward information retrieval regardless of a user’s habitual online behavior. In contrast, complex tasks revealed pronounced disparities. Information‑seekers and Creators/Sharers achieved the highest accuracy (≈68 %), used fewer clicks (≈12 per task), and completed tasks faster (≈3 min 45 s). Their logs indicate frequent query reformulation, systematic scanning of result snippets, and deliberate multi‑source comparison. Communicators performed the worst on complex tasks (≈48 % accuracy, ≈18 clicks, ≈5 min 20 s), reflecting a lower propensity to engage in the meta‑cognitive strategies required for synthesis. Mixed users fell in the middle (≈58 % accuracy) and showed the most adaptive behavior, often altering their search strategy mid‑session.
The statistical test confirmed that user type significantly influences complex‑task performance (F(3,52)=4.27, p<0.01), with the largest gap between Communicators and Information‑seekers. The authors argue that everyday online habits shape internalized search heuristics: frequent information‑seeking cultivates efficient query refinement and source evaluation, while heavy social interaction does not.
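The reported degrees of freedom follow directly from the design: four user-type groups and 56 participants give F(4 − 1, 56 − 4) = F(3, 52). The sketch below works through a one-way ANOVA in plain Python to show where those numbers come from. The equal split of 14 participants per group and the accuracy values are hypothetical assumptions for illustration; the paper does not report per-group sizes or raw data.

```python
# Hedged sketch: how the reported F(3, 52) degrees of freedom arise from a
# one-way ANOVA over four user types and 56 participants. All sample values
# below are hypothetical placeholders, not the study's data.

def one_way_anova(groups):
    """Return (F, df_between, df_within) for a list of sample groups."""
    k = len(groups)                      # number of groups (user types)
    n = sum(len(g) for g in groups)      # total participants
    grand_mean = sum(x for g in groups for x in g) / n
    # Between-group sum of squares: group means vs. grand mean
    ssb = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: observations vs. their group mean
    ssw = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between, df_within = k - 1, n - k
    f_stat = (ssb / df_between) / (ssw / df_within)
    return f_stat, df_between, df_within

# Four user types, 14 participants each (hypothetical complex-task accuracies)
groups = [
    [0.68 + 0.01 * (i % 3) for i in range(14)],  # Information-seekers
    [0.48 + 0.01 * (i % 3) for i in range(14)],  # Communicators
    [0.67 + 0.01 * (i % 3) for i in range(14)],  # Creators/Sharers
    [0.58 + 0.01 * (i % 3) for i in range(14)],  # Mixed
]
f_stat, dfb, dfw = one_way_anova(groups)
print(dfb, dfw)  # → 3 52  (matches the reported F(3, 52))
```

A significant omnibus F only says that at least one pair of group means differs, which is why the study follows up with Tukey post-hoc comparisons to locate the gap between Communicators and Information-seekers.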
Limitations include the modest sample size, the artificial laboratory setting, and reliance on self‑reported activity patterns, which may introduce bias. The paper calls for larger, cross‑cultural studies, longitudinal log analysis, and machine‑learning models that can infer user type in real time to deliver adaptive search assistance.
In sum, the study provides empirical evidence that user type matters for complex web search tasks. It suggests that search engines and information‑retrieval systems could improve user experience by offering type‑specific support—such as step‑by‑step guides for communicators or advanced source‑comparison tools for creators—thereby enhancing overall search effectiveness.