Purpose – Describes the basic premises of three metatheories that represent important or emerging perspectives on information seeking, retrieval and knowledge formation in information science: constructivism, collectivism, and constructionism.
Design/methodology/approach – Presents a literature-based conceptual analysis. Pinpoints the differences between the positions in their conceptions of language and the nature and origin of knowledge.
Findings – Each of the three metatheories addresses and solves specific types of research questions and design problems. The metatheories thus complement one another. Each of the three metatheories encourages and constitutes a distinctive type of research and learning.
Originality/value – Outlines each metatheory's specific fields of application.
The study investigates the ways in which people experience information overload when monitoring everyday events through media such as newspapers and the internet. The findings are based on interviews with 20 environmental activists in Finland in 2005. Perceptions of the seriousness of the problems caused by information overload varied among the participants. On the one hand, information overload was experienced as a real problem, particularly in networked information environments. On the other hand, information overload was perceived as an imagined problem with some mythical features. Two major strategies for coping with information overload were identified. The filtering strategy is based on the determined weeding out of material deemed useless; this strategy is favoured in networked information environments. The withdrawal strategy is more affectively oriented, emphasizing the need to protect oneself from an excessive supply of information by keeping the number of information sources to a minimum.
Evaluation is central in research and development of information retrieval (IR). In addition to designing and implementing new retrieval mechanisms, one must also show through rigorous evaluation that they are effective. A major focus in IR is IR mechanisms' capability of ranking relevant documents optimally for the users, given a query. Searching for information in practice involves searchers, however, and is highly interactive. When human searchers have been incorporated in evaluation studies, the results have often suggested that better ranking does not necessarily lead to better search task, or work task, performance. Therefore, it is not clear which system or interface features should be developed to improve the effectiveness of human task performance. In the present article, we focus on the evaluation of task-based information interaction (TBII). We give special emphasis to learning tasks to discuss TBII in more concrete terms. Information interaction is here understood as behavioral and cognitive activities related to task planning, searching for information items, selecting between them, working with them, and synthesizing and reporting. These five generic activities contribute to task performance and outcome and can be supported by information systems. As a step toward task-based evaluation, we introduce program theory as the evaluation framework. Such evaluation can investigate whether a program consisting of TBII activities and tools works, and how it works, and can further provide a causal description of program (in)effectiveness. Our goal in the present article is to structure TBII on the basis of the five generic activities and consider the evaluation of each activity using the program theory framework. Finally, we combine these activity-based program theories in an overall evaluation framework for TBII. Such an evaluation is complex due to the large number of factors affecting information interaction. Instead of presenting tested program theories, we illustrate how the evaluation of TBII can be accomplished using the program theory framework, covering systems and behaviors, and their interactions, comprehensively in context.