Clinical IE has been used for a wide range of applications; however, there is a considerable gap between clinical studies using EHR data and studies using clinical IE. This study enabled us to gain a more concrete understanding of that gap and to propose potential solutions for bridging it.
In Electronic Health Records (EHRs), much of the valuable information about patients’ conditions is embedded in free-text format. Natural language processing (NLP) techniques have been developed to extract clinical information from such free text. One challenge in clinical NLP is that the meaning of clinical entities is heavily affected by modifiers such as negation. NegEx, a widely used negation detection algorithm, applies a simple rule-based approach that has proven powerful in clinical NLP. However, because it does not consider the contextual relationships between words within a sentence, NegEx fails to correctly capture the negation status of concepts in complex sentences. Incorrect negation assignment can lead to inaccurate assessment of a patient’s condition or to contaminated study cohorts. We developed a negation algorithm called DEEPEN to reduce NegEx’s false positives by taking into account the dependency relationships between negation words and concepts within a sentence, using the Stanford dependency parser. The system was developed and tested on EHR data from Indiana University (IU) and was further evaluated on a Mayo Clinic dataset to assess its generalizability. The evaluation results demonstrate that DEEPEN, which incorporates dependency parsing into NegEx, reduces the number of incorrect negation assignments for patients with positive findings, and therefore improves the identification of patients with the target clinical findings in EHRs.
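The dependency-based idea behind DEEPEN can be illustrated with a short sketch. The snippet below is a minimal illustration, not the DEEPEN rule set: it uses spaCy's dependency parser as a stand-in for the Stanford parser, and the single rule shown (a negation cue attached to the concept token or one of its syntactic ancestors) is a deliberate simplification of the actual algorithm.

```python
# Minimal sketch of dependency-based negation checking (NOT the actual
# DEEPEN rules; spaCy stands in for the Stanford dependency parser).
import spacy

nlp = spacy.load("en_core_web_sm")

NEG_CUES = {"no", "without"}  # "not"/"n't" already carry the "neg" relation

def is_negated(sentence: str, concept: str) -> bool:
    """Rough check: is `concept` syntactically governed by a negation cue?"""
    doc = nlp(sentence)
    for token in doc:
        if token.lower_ != concept.lower():
            continue
        # Walk from the concept token up through its syntactic ancestors.
        for node in [token] + list(token.ancestors):
            # A child with the "neg" relation (e.g., "not") or a negative
            # determiner (e.g., "no") signals negation of this subtree.
            if any(c.dep_ == "neg" or c.lower_ in NEG_CUES
                   for c in node.children):
                return True
    return False

# A fixed-window rule can wrongly negate "fever" here, whereas the
# dependency path ties "no" to "cough" only (subject to parser output).
print(is_negated("The patient has no cough but reports fever.", "cough"))  # True
print(is_negated("The patient has no cough but reports fever.", "fever"))  # False
```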
The concept of optimizing health care by understanding and generating knowledge from previous evidence, i.e., the Learning Health-care System (LHS), has gained momentum and now has national prominence. Meanwhile, the rapid adoption of electronic health records (EHRs) enables the data collection required to form the basis of an LHS. A prerequisite for using EHR data within an LHS is an infrastructure that provides longitudinal access to EHR data for health-care analytics and real-time access for knowledge delivery. Additionally, significant clinical information is embedded in free text, making natural language processing (NLP) an essential component of an LHS implementation. Herein, we share our institutional implementation of a big data-empowered clinical NLP infrastructure, which not only enables health-care analytics but also provides real-time NLP processing capability. The infrastructure has been used for multiple institutional projects, including MayoExpertAdvisor, an individualized care recommendation solution for clinical care. We compared the big data infrastructure against two other computing environments; it significantly outperformed both in computing speed, demonstrating its value in making the LHS a possibility in the near future.
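To make the computing-speed comparison concrete, here is a minimal, self-contained sketch of the pattern such an infrastructure parallelizes: the same per-document NLP workload run serially and then across worker processes. Python's multiprocessing is only a single-machine stand-in for a real big-data cluster, and extract_concepts is a hypothetical placeholder workload, not the institutional pipeline described above.

```python
# Serial vs. parallel processing of a synthetic note corpus; multiprocessing
# stands in for a distributed big-data environment.
import time
from multiprocessing import Pool

def extract_concepts(note: str) -> int:
    # Placeholder "NLP" workload: count tokens. A real pipeline would run
    # named-entity recognition, negation detection, etc.
    return len(note.split())

if __name__ == "__main__":
    notes = ["patient denies chest pain " * 2000] * 200  # synthetic corpus

    start = time.perf_counter()
    serial = [extract_concepts(n) for n in notes]
    print(f"serial:   {time.perf_counter() - start:.2f}s")

    start = time.perf_counter()
    with Pool() as pool:  # one worker per CPU core by default
        parallel = pool.map(extract_concepts, notes)
    print(f"parallel: {time.perf_counter() - start:.2f}s")

    assert serial == parallel  # same results, different wall-clock time
```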
Consider k robots initially located at a point inside a region T. Each robot can move anywhere in T independently of the other robots, with maximum speed one. The goal of the robots is to evacuate T through an exit at an unknown location on the boundary of T. The objective is to minimize the evacuation time, defined as the time at which the last robot reaches the exit. We consider the face-to-face communication model for the robots: a robot can communicate with another robot only when they meet in T. In this paper, we give upper and lower bounds on the face-to-face evacuation time for k robots initially located at the centroid of a unit-sided equilateral triangle or square. For the triangle with k = 2 robots, we give a lower bound of 1 + 2/√3 ≈ 2.154 and an algorithm with an upper bound of 2.3367 on the worst-case evacuation time. We show that, for any k, any algorithm for evacuating k ≥ 2 robots requires at least √3 time. This bound is asymptotically optimal, as we show that even a straightforward evacuation strategy for k robots gives an upper bound of √3 + 3/k. For k = 3 and 4, we give better algorithms with evacuation times of 2.0887 and 1.9816, respectively. For the square and k = 2, we give an algorithm with evacuation time 3.4645 and show that any algorithm requires at least 3.118 time in the worst case. Moreover, for k = 3 and 4, we give algorithms with evacuation times 3.1786 and 2.6646, respectively. The algorithms given for k = 3 and 4 for evacuation in the triangle or the square can easily be generalized to larger values of k.
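The constants quoted above can be checked numerically. The snippet below introduces no new results; it simply evaluates the stated closed-form expressions and contrasts the generic √3 + 3/k upper bound with the improved algorithmic bounds reported for the triangle.

```python
# Numerical check of the bounds quoted in the abstract.
from math import sqrt

print(f"triangle, k=2 lower bound: 1 + 2/sqrt(3) = {1 + 2/sqrt(3):.4f}")  # ~2.1547
print(f"triangle, any-k lower bound: sqrt(3)     = {sqrt(3):.4f}")         # ~1.7321

# The straightforward strategy gives sqrt(3) + 3/k; the tailored algorithms
# reported above (2.3367, 2.0887, 1.9816 for k = 2, 3, 4) are all better.
for k in (2, 3, 4):
    print(f"k={k}: straightforward bound sqrt(3) + 3/k = {sqrt(3) + 3/k:.4f}")
```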
Let P be an orthogonal polygon. Consider a sliding camera that travels back and forth along an orthogonal line segment s ⊆ P as its trajectory. The camera can see a point p ∈ P if there exists a point q ∈ s such that pq is a line segment normal to s that is completely contained in P. In the minimum-cardinality sliding cameras problem, the objective is to find a set S of sliding cameras of minimum cardinality that guards P (i.e., every point in P can be seen by some sliding camera in S), while in the minimum-length sliding cameras problem the goal is to find such a set S that minimizes the total length of the trajectories along which the cameras in S travel. In this paper, we first settle the complexity of the minimum-length sliding cameras problem by showing that it is solvable in polynomial time, even for orthogonal polygons with holes, answering a question posed by Katz and Morgenstern [9]. We then show that the minimum-cardinality sliding cameras problem is NP-hard when P is allowed to have holes, which partially answers another question posed by Katz and Morgenstern [9].
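The visibility predicate in this definition is straightforward to operationalize for an axis-parallel camera. The following is a minimal sketch under stated assumptions: it handles only a vertical trajectory segment (the horizontal case is symmetric), relies on the shapely library for the containment test, and the L-shaped polygon in the usage example is an arbitrary illustration, not taken from the paper.

```python
# Sketch of the sliding-camera visibility test for a vertical trajectory.
from shapely.geometry import Polygon, LineString

def sees(polygon: Polygon, s_x: float, s_y1: float, s_y2: float,
         px: float, py: float) -> bool:
    """Can a camera sliding on the vertical segment x = s_x, y in [s_y1, s_y2]
    see the point p = (px, py)?"""
    # pq must be horizontal (normal to the vertical trajectory), so the only
    # candidate q is (s_x, py), which must lie on the segment s.
    if not (min(s_y1, s_y2) <= py <= max(s_y1, s_y2)):
        return False
    pq = LineString([(s_x, py), (px, py)])
    return polygon.covers(pq)  # pq must be completely contained in P

# An L-shaped orthogonal polygon; a camera on x = 1 between y = 0 and y = 2.
P = Polygon([(0, 0), (3, 0), (3, 1), (1, 1), (1, 2), (0, 2)])
print(sees(P, 1.0, 0.0, 2.0, 2.5, 0.5))  # True: the segment pq stays in P
print(sees(P, 1.0, 0.0, 2.0, 2.5, 1.5))  # False: (2.5, 1.5) lies outside P
```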