As datacenter speeds scale to 100 Gb/s and beyond, traditional congestion control algorithms like TCP and RCP converge slowly to steady sending rates, which leads to poorer and less predictable user performance. These reactive algorithms use congestion signals to perform a gradient descent toward ideal sending rates, which causes poor convergence times. In this paper, we propose PERC, a proactive congestion control algorithm that explicitly computes rates in a decentralized fashion, independently of congestion signals. Inspired by message-passing algorithms that have gained traction in other fields (e.g., modern Low-Density Parity-Check decoding), PERC improves convergence times by a factor of 7 compared to reactive explicit rate control protocols such as RCP. This fast convergence significantly reduces tail flow completion time (FCT) in high-speed networks; for example, simulations of realistic workloads in a 100 Gb/s network show that PERC achieves up to 4× lower 99th-percentile FCT than RCP.
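To make the idea of explicit rate computation concrete, here is a minimal sketch of max-min fair rate allocation via progressive filling, the kind of bottleneck-based rate assignment a proactive scheme computes directly rather than discovering through congestion signals. This is an illustration only, not PERC's actual message-passing algorithm; the link/flow names and the centralized formulation are assumptions for clarity.

```python
def max_min_rates(link_capacity, flow_paths):
    """Centralized max-min fair allocation by progressive filling.

    link_capacity: {link: capacity in Gb/s}
    flow_paths: {flow: [links the flow traverses]}
    Returns {flow: rate}.
    """
    remaining = dict(link_capacity)
    on_link = {l: [f for f, path in flow_paths.items() if l in path]
               for l in link_capacity}
    rates = {}
    unassigned = set(flow_paths)
    while unassigned:
        # Fair share each link can offer its still-unassigned flows.
        candidates = {
            l: remaining[l] / sum(1 for f in on_link[l] if f in unassigned)
            for l in link_capacity
            if any(f in unassigned for f in on_link[l])
        }
        # The most constrained link is the next bottleneck; its flows
        # are frozen at that link's fair share.
        bottleneck = min(candidates, key=candidates.get)
        share = candidates[bottleneck]
        for f in [f for f in on_link[bottleneck] if f in unassigned]:
            rates[f] = share
            unassigned.discard(f)
            for l in flow_paths[f]:
                remaining[l] -= share
    return rates
```

For example, with a 10 Gb/s link shared by flows a and b, where b also crosses a 4 Gb/s link, b is frozen at 4 Gb/s by its bottleneck and a receives the remaining 6 Gb/s.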
In the past five years, the graduate networking course at Stanford has assigned over 200 students the task of reproducing results from over 40 networking papers. We began the project as a means of teaching both engineering rigor and critical thinking, qualities that are necessary for careers in networking research and industry. We have observed that reproducing research can be simultaneously a tool for education and a means for students to contribute to the networking community. In this editorial, we describe our project on reproducing networking research and show, through anecdotal evidence, that it benefits both the classroom and the networking community at large; we hope to encourage other institutions to host similar class projects.
Educators have faced new challenges in effective course assessment during the recent, unprecedented shift to remote online learning caused by the COVID-19 pandemic. In place of typical proctored, timed exams, instructors must now rethink their methodology for assessing course-level learning goals. Are exams appropriate, or even feasible, in this new online, open-internet learning environment? In this experience paper, we present unique exams: our framework for upholding exam integrity and student privacy. In our Probability for Computer Scientists course at an R1 university, we developed autogenerated, unique exams in which each student received the same four problem skeletons with unique numeric variations per problem. Without changing the traditional exam process, unique exams provide a layer of security for both students and instructors regarding exam reliability in any classroom environment, in person or online. In addition to sharing our experience designing unique exams, we also present a simple end-to-end tool and example question templates for different CS subjects that other instructors can adapt to their own courses.
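The skeleton-plus-numeric-variation idea can be sketched as follows: a shared problem template is instantiated with parameters drawn from a PRNG seeded by the student's ID, so each student sees the same problem structure with different numbers and regrading is reproducible. The template, exam salt, and parameter ranges below are invented for illustration; they are not the paper's actual tool.

```python
import hashlib
import random

# Hypothetical problem skeleton; every student gets this structure.
PROBLEM_TEMPLATE = ("A fair {sides}-sided die is rolled {rolls} times. "
                    "What is the expected sum?")

def generate_problem(student_id: str, exam_salt: str = "final-v1"):
    """Instantiate the skeleton with per-student numeric variations."""
    # Seed a PRNG from a hash of (salt, student ID): deterministic for
    # regrading, but parameters differ across students.
    seed = int(hashlib.sha256((exam_salt + student_id).encode()).hexdigest(), 16)
    rng = random.Random(seed)
    sides = rng.choice([6, 8, 10, 12, 20])
    rolls = rng.randint(3, 9)
    # Expected sum of n rolls of a fair s-sided die is n * (s + 1) / 2.
    answer = rolls * (sides + 1) / 2
    return PROBLEM_TEMPLATE.format(sides=sides, rolls=rolls), answer
```

Because the seed is derived only from the salt and the student ID, an instructor can regenerate any student's exact exam and answer key on demand.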