Preface

Welcome to the 1st Workshop on Sense, Concept and Entity Representations and their Applications (SENSE 2017). SENSE 2017 focuses on addressing one of the most important limitations of word-based techniques: they conflate the different meanings of a word into a single representation. The workshop brings together researchers in lexical semantics, and in NLP in general, to investigate and propose sense-based techniques, as well as to discuss effective ways of integrating sense, concept and entity representations into downstream applications.

The workshop covers the following topics:

• Utilizing sense/concept/entity representations in applications such as Machine Translation, Information Extraction or Retrieval, Word Sense Disambiguation, Entity Linking, Text Classification, Semantic Parsing, Knowledge Base Construction or Completion, etc.
• Exploration of the advantages/disadvantages of using sense representations over word representations.
• Proposing new evaluation benchmarks or comparison studies for sense vector representations.
• Development of new sense representation techniques (unsupervised, knowledge-based or hybrid).
• Compositionality of senses: learning representations for phrases and sentences.
• Construction and use of sense representations for languages other than English, as well as multilingual representations.

We received 21 submissions and accepted 15 of them (acceptance rate: 71%).

We would like to thank the Program Committee members, who reviewed the papers and helped to improve the overall quality of the workshop. We also thank Aylien for their support in funding the best paper award. Finally, a word of thanks goes to our invited speakers, Roberto Navigli (Sapienza University of Rome) and Hinrich Schütze (University of Munich).

Jose Camacho-Collados and Mohammad Taher Pilehvar
Co-Organizers of SENSE 2017
Abstract

This article describes a method to build semantic representations of composite expressions in a compositional way, using WordNet relations to represent the meaning of words. The meaning of a target word is modelled as a vector in which its semantically related words are assigned weights according to both the type of the relationship and their distance to the target word. Word vectors are compositionally combined through syntactic dependencies. Each syntactic dependency triggers two complementary compositional functions: the head function and the dependent function. The experiments show that the proposed compositional method performs on par with the state of the art for subject-verb expressions, and clearly outperforms the best system for transitive subject-verb-object constructions.
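The idea in the abstract can be sketched in a few lines of Python. The relation weights, the distance decay factor, and the component-wise product used for the head/dependent functions below are illustrative assumptions, not the actual functions defined in the paper; the sketch only shows the overall shape of the method: sparse vectors weighted by relation type and distance, combined per syntactic dependency.

```python
from collections import defaultdict

# Assumed weights per WordNet relation type and an assumed per-step decay;
# the paper defines its own weighting scheme.
RELATION_WEIGHTS = {"synonym": 1.0, "hypernym": 0.8, "hyponym": 0.6}
DECAY = 0.5

def word_vector(related):
    """Build a sparse vector for a target word.

    `related` is a list of (word, relation_type, distance) triples drawn
    from the WordNet neighbourhood of the target word. Each related word
    is weighted by its relation type, attenuated by its distance.
    """
    vec = defaultdict(float)
    for word, rel, dist in related:
        vec[word] += RELATION_WEIGHTS[rel] * (DECAY ** (dist - 1))
    return dict(vec)

def compose(head_vec, dep_vec):
    """Combine the vectors of a head and its dependent for one dependency.

    The dependency triggers two complementary functions: one returns a new
    (contextualized) head vector, the other a new dependent vector. Here
    both are approximated by a component-wise product over the shared
    dimensions (an assumption for illustration).
    """
    shared = head_vec.keys() & dep_vec.keys()
    new_head = {w: head_vec[w] * dep_vec[w] for w in shared}
    new_dep = {w: dep_vec[w] * head_vec[w] for w in shared}
    return new_head, new_dep

# Toy usage: a verb and its subject sharing one related word.
verb = word_vector([("sprint", "synonym", 1), ("move", "hypernym", 2)])
noun = word_vector([("sprint", "hyponym", 1)])
contextualized_verb, contextualized_noun = compose(verb, noun)
```

In this toy run, "move" is two steps away from the verb, so its weight is 0.8 × 0.5 = 0.4, and the composed vectors retain only the shared dimension "sprint".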
Introduction

The principle of compositionality (Partee, 1984) states that the meaning of a complex expression is a function of the meanings of its constituent parts and of the mode of their combination. In recent years, different distributional semantic models endowed with a compositional component have been proposed. Most of them define words as high-dimensional vectors where dimensions ...