Bite-sized learning is a current educational trend in which educators divide content into relatively small, easily comprehensible chunks, called nuggets. In this paper, we introduce an authoring toolkit that relies on a VR implementation of nuggets and show that a nugget-based approach also facilitates the authoring of VR learning content. In particular, we present Immersive Nugget Tiles (IN-Tiles), a novel authoring toolkit aimed at authors who are not experts in VR. With IN-Tiles, manipulating VR nuggets and authoring VR learning content can be accomplished directly within a virtual environment, allowing authors to immediately experience the results of their authoring efforts in VR. We discuss the underlying concepts of IN-Tiles, specifically how to visualize VR nuggets in a virtual environment and how to present affordances that support authoring and manipulating VR nuggets. We report the results of a user study in which we evaluated the IN-Tiles toolkit and compared it to a conventional 2D authoring environment that also relies on component-based VR. The results support the hypothesis that nugget-based immersive authoring tools are suitable for successfully creating bite-sized VR applications and that authoring directly in VR has added value, particularly for authors who are not IT specialists.
Asymmetric Virtual Reality (VR) applications are a substantial subclass of multi-user VR in which not all participants have the same possibilities for interacting with the virtual scene. While one user might be immersed using a VR head-mounted display (HMD), another user might experience the VR through a common desktop PC. In an educational scenario, for example, learners can use immersive VR technology to inform themselves at different exhibits within a virtual scene. Educators can use a desktop PC setup to follow and guide learners through virtual exhibits while still being able to pay attention to safety aspects in the real world (e.g., preventing learners from bumping into a wall). In such scenarios, educators must ensure that learners have explored the entire scene and have been informed about all virtual exhibits in it. Appropriate visualization techniques can support educators and facilitate conducting such VR-enhanced lessons. One common technique is to render the learners' view on the 2D screen available to the educators. We refer to this solution as the shared view paradigm. However, this straightforward visualization involves challenges. For example, educators have no control over the scene, and collaboration in the learning scenario can be tedious. In this paper, we differentiate between two classes of visualizations that can help educators in asymmetric VR setups. First, we investigate five techniques that visualize the view direction or field of view of users (view visualizations) within virtual environments. Second, we propose three techniques that can help educators understand which parts of the scene learners have already explored (exploration visualizations). In a user study, we show that our participants preferred a volume-based rendering and a view-in-view overlay solution for view visualizations. Furthermore, we show that our participants tended to use combinations of different view visualizations.
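The abstract does not detail how the view and exploration visualizations are computed. As a purely illustrative sketch (the function name and geometry are our assumptions, not the authors' implementation), a basic building block for both classes of techniques is a test of whether a scene point lies inside a user's view cone:

```python
import math

def in_view_cone(eye, forward, target, fov_deg):
    """Return True if `target` lies within a symmetric view cone of
    `fov_deg` degrees around the `forward` direction from position `eye`.
    All arguments except `fov_deg` are 3-tuples; `forward` need not be
    normalized. Hypothetical helper, not taken from the paper."""
    to_target = tuple(t - e for t, e in zip(target, eye))
    dist = math.sqrt(sum(c * c for c in to_target))
    if dist == 0.0:
        return True  # target coincides with the eye position
    norm_f = math.sqrt(sum(c * c for c in forward))
    cos_angle = sum(f * t for f, t in zip(forward, to_target)) / (norm_f * dist)
    # Inside the cone when the angle to the target is at most half the FOV.
    return cos_angle >= math.cos(math.radians(fov_deg / 2.0))

# A learner at the origin looking along +z with a 90-degree field of view:
print(in_view_cone((0, 0, 0), (0, 0, 1), (0.2, 0.1, 5.0), 90))  # True
print(in_view_cone((0, 0, 0), (0, 0, 1), (5.0, 0.0, 0.2), 90))  # False
```

A view visualization could render this cone for the educator, while an exploration visualization could accumulate, per exhibit, whether the test has ever returned True for a learner.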
Creating Augmented Reality (AR) applications can be an arduous process. With most current authoring tools, authors must complete multiple authoring steps in a time-consuming process before they can try their AR application and get a first impression of it. Especially for laypersons, complex workflows set a high barrier to getting started with creating AR applications. This work presents a novel authoring approach for creating mobile AR applications. Our idea is to provide authors with small, ready-to-use AR applications that can be executed and tested directly as a starting point. Authors can then focus on customizing these AR applications to their needs without programming knowledge. We propose to use patterns from application domains to further facilitate the authoring process. Our idea is based on the learning nugget approach from the educational sciences, where a nugget is a small and self-contained learning unit. We transfer this approach to the field of AR authoring and introduce an AR nugget authoring tool. The authoring tool provides pattern-based self-contained AR applications, called AR nuggets. AR nuggets use simple geometric objects to give authors an impression of the AR application. By replacing these objects and making further adaptations, authors can realize their AR applications. Our authoring tool combines non-immersive desktop computers and AR devices: it synchronizes all changes to an AR nugget between an AR device and a non-immersive device. This enables authors to use both devices, e.g., a desktop computer to type text and an AR device to place virtual objects in the 3D environment. We evaluate our proposed authoring approach and tool in a user study with 48 participants. Our users installed the AR nugget authoring tool on their own devices, worked with it for 3 weeks, and filled out a questionnaire. They were able to create AR applications and found the AR nugget approach supportive.
The users mainly used the desktop computer for the authoring tasks but found the synchronization to the AR device helpful for experiencing the AR nuggets at any time. However, the users had difficulties with some interactions and gave the AR nugget authoring tool an overall neutral rating.
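The abstract describes synchronizing changes to an AR nugget between a desktop computer and an AR device but does not specify the mechanism. A minimal sketch, assuming a simple observer pattern (the class and method names here are hypothetical, not the tool's actual API), could look like this:

```python
class ARNugget:
    """Illustrative model of a synchronized AR nugget.

    Assumption for this sketch: each connected device registers a
    listener, and every property change made on one device is
    broadcast to all listeners, keeping the devices in sync."""

    def __init__(self, name):
        self.name = name
        self.properties = {}
        self._listeners = []

    def connect(self, listener):
        # `listener` is called with (key, value) on every change.
        self._listeners.append(listener)

    def set_property(self, key, value):
        self.properties[key] = value
        for notify in self._listeners:
            notify(key, value)

# The desktop edits a label; a listener standing in for the AR device
# receives the update immediately.
received = []
nugget = ARNugget("museum-exhibit")
nugget.connect(lambda k, v: received.append((k, v)))
nugget.set_property("label", "T. rex skeleton")
print(received)  # [('label', 'T. rex skeleton')]
```

This mirrors the workflow described in the abstract: text entered on the desktop becomes visible on the AR device, and object placements made in AR would propagate back the same way.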