Transfer optimization attempts to minimize the overall inconvenience to passengers who must transfer between lines in a transit network. Bus trips are scheduled to depart from their terminals so as to minimize some objective function measuring that inconvenience. In this paper, the transit network is assumed to be given, and the scheduled headway on each line is treated as fixed. We denote by t_i the departure time of the first bus on line i; the set {t_i}, termed "offset times," constitutes the decision variables of our model. To account for the stochastic travel times of buses, our treatment of transfer optimization combines a simulation procedure with an optimization model, which turns out to be a relaxation of the Quadratic Assignment Problem. The model can incorporate a wide range of objective functions (measures of overall passenger disutility) and a variety of policies for holding buses at a transfer point. For the case where buses are not held at all, we show, for a number of different objective functions and transit networks, the negative consequences of optimizing transfers under a deterministic bus-travel-time assumption when those travel times are in fact random variables. Suggestions are then made for future research.
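The core idea of the abstract above can be illustrated with a minimal sketch: estimate, by Monte Carlo simulation, the mean transfer wait implied by a candidate offset time, then pick the offset that minimizes it. All names, parameter values, and the normal travel-time model here are illustrative assumptions, not the paper's actual objective function or QAP-relaxation formulation.

```python
import random

def expected_transfer_wait(offset_a, offset_b, headway_a, headway_b,
                           travel_mean, travel_sd, n_trials=10000, horizon=10):
    """Monte Carlo estimate of the mean wait for a passenger who rides a
    bus on line A (with a noisy travel time) and transfers to line B."""
    random.seed(0)
    total = 0.0
    for _ in range(n_trials):
        # pick a random bus run on line A; the passenger reaches the
        # transfer point after a normally distributed travel time
        k = random.randrange(horizon)
        arrival = offset_a + k * headway_a + random.gauss(travel_mean, travel_sd)
        # board the first line-B departure at or after the arrival time
        m = 0
        while offset_b + m * headway_b < arrival:
            m += 1
        total += offset_b + m * headway_b - arrival
    return total / n_trials

# choose the line-B offset (in minutes) that minimizes simulated mean wait
best = min(range(0, 15),
           key=lambda t: expected_transfer_wait(0, t, 15, 15, 20, 3))
```

Repeating the same search under a deterministic travel-time assumption (travel_sd = 0) and evaluating the resulting offset with noise restored is one way to see the degradation the abstract warns about.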
This paper discusses the design and implementation of processes and tools to support the collaborative creation and maintenance of multilingual wiki content. A wiki is a website where a large number of participants are allowed to create and modify content using their Web browser. This simple concept has revolutionized collaborative authoring on the web, enabling, among other things, the creation of Wikipedia, the world's largest online encyclopedia. On many of the largest and highest-profile wiki sites, content needs to be provided in more than one language. Yet current wiki engines do not support the efficient creation and maintenance of such content. Consequently, most wiki sites deal with the issue of multilingualism by spawning a separate and independent site for each language. This approach leads to much wasted effort, since the same content must be researched, tracked, and written from scratch for every language. In this paper, we investigate what features could be implemented in wiki engines in order to deal more effectively with multilingual content. We look at how multilingual content is currently managed in more traditional industrial contexts, and show why this approach is not appropriate in a wiki world. We then describe the results of a User-Centered Design exercise performed to explore what a multilingual wiki engine should look like from the point of view of its various end users. We describe a partial implementation of those requirements in our own wiki engine (LizzyWiki), which deals with the special case of bilingual sites. We also discuss how this simple implementation could be extended to provide even more sophisticated features, and in particular to support the general case of a site with more than two languages.
Finally, even though the paper focuses primarily on multilingual content in a wiki context, we argue that translating in this "Wiki Way" may also be useful in some traditional industrial settings, as a way of better coping with the fast-moving, ever-changing nature of the modern internet.
Wikis are simple-to-use, asynchronous, Web-based collaborative hypertext authoring systems which are quickly gaining in popularity. In spite of much anecdotal evidence that wikis are usable by people without technical expertise, this has never been studied formally. In this paper, we study the usability of a wiki through observation of, and problem-solving interaction with, several children who used the tool to collaboratively author hypertext stories over several sessions. The children received a minimal amount of instruction, but were able to ask for help during their work sessions. Despite minimal instruction, 5 out of 6 teams were able to complete their story. Our data indicate that the major usability problems were related to hyperlink management. We report on this and other usability issues, and provide suggestions for improving the usability of wikis. Our analysis and conclusions also apply to hypertext authoring with non-wiki-based tools.
In this paper, we investigate a novel approach to correcting grammatical and lexical errors in texts written by second-language authors. Contrary to previous approaches, which tend to use unilingual models of the user's second language (L2), this new approach uses a simple roundtrip Machine Translation method which leverages information about both the author's first (L1) and second languages. We compare the repair rate of this roundtrip translation approach to that of an existing approach based on a unilingual L2 model with shallow syntactic pruning, on a series of preposition-choice errors. We find no statistically significant difference between the two approaches, but find that a hybrid combination of both does perform significantly better than either one in isolation. Finally, we illustrate how the translation approach has the potential to repair very complex errors that would be hard to treat without leveraging knowledge of the author's L1.
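The roundtrip idea described above can be sketched in a few lines: translate the learner's L2 sentence into their L1, then translate it back, letting the MT system's fluent L2 output normalize the original error. The sketch below is a toy illustration only; `roundtrip_correct`, `toy_fr`, and `toy_en` are hypothetical names, and the dictionary lookups stand in for real statistical or neural MT engines.

```python
def roundtrip_correct(sentence_l2, mt_l2_to_l1, mt_l1_to_l2):
    """Roundtrip correction: pivot the learner's L2 sentence through
    their L1 and back, keeping the MT system's (fluent) L2 output."""
    pivot_l1 = mt_l2_to_l1(sentence_l2)
    return mt_l1_to_l2(pivot_l1)

# toy stand-in "translators" covering a single preposition-choice error
# made by a French-speaking learner of English
toy_fr = {"I depend of you": "Je dépends de toi"}   # EN -> FR
toy_en = {"Je dépends de toi": "I depend on you"}   # FR -> EN
corrected = roundtrip_correct("I depend of you",
                              lambda s: toy_fr[s],
                              lambda s: toy_en[s])
# corrected == "I depend on you"
```

The appeal of the approach, as the abstract notes, is that the L2-to-L1 leg can succeed even on errors that mirror L1 structure, which a purely unilingual L2 model would struggle to detect.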