Documentation

Measuring Online Debaters’ Persuasive Skill from Text over Time

Measuring Online Debaters’ Persuasive Skill from Text over Time Kelvin Luu1 Chenhao Tan2 Noah A. Smith1,3 1Paul G. Allen School of Computer Science & Engineering, University of Washington 2Department of Computer Science, University of Colorado Boulder 3Allen Institute for Artificial Intelligence {kellu,nasmith}@cs.washington.edu chenhao.tan@colorado.edu Abstract Online debates allow people to express their persuasive abilities and provide exciting opportunities for understanding persuasion. Prior studies have focused

Read more »

Where’s My Head? Definition, Data Set, and Models

Where’s My Head? Definition, Data Set, and Models for Numeric Fused-Head Identification and Resolution Yanai Elazar† and Yoav Goldberg†∗ †Computer Science Department, Bar-Ilan University, Israel ∗Allen Institute for Artificial Intelligence {yanaiela,yoav.goldberg}@gmail.com Abstract We provide the first computational treatment of fused-heads constructions (FHs), focusing on the numeric fused-heads (NFHs). FHs constructions are noun phrases in which the head noun is missing and is said to

Read more »

Natural Questions: A Benchmark for Question Answering Research

Natural Questions: A Benchmark for Question Answering Research Tom Kwiatkowski♣♦♠∗ Jennimaria Palomaki♠ Olivia Redfield♦♠ Michael Collins♣♦♠♥ Ankur Parikh♥ Chris Alberti♥ Danielle Epstein♤♦ Illia Polosukhin♤♦ Jacob Devlin♤ Kenton Lee♥ Kristina Toutanova♥ Llion Jones♤ Matthew Kelcey♤♦ Ming-Wei Chang♥ Andrew M. Dai♣♦ Jakob Uszkoreit♣ Quoc Le♣♦ Slav Petrov♣ Google Research natural-questions@google.com Abstract We present the Natural Questions corpus, a question answering data set. Questions consist of real anonymized,

Read more »

Learning End-to-End Goal-Oriented Dialog with Maximal User Task

Learning End-to-End Goal-Oriented Dialog with Maximal User Task Success and Minimal Human Agent Use Janarthanan Rajendran†∗, Jatin Ganhotra‡, and Lazaros C. Polymenakos‡ †University of Michigan ‡IBM Research rjana@umich.edu, jatinganhotra@us.ibm.com, and lcpolyme@us.ibm.com Abstract Neural end-to-end goal-oriented dialog systems showed promise to reduce the workload of human agents for customer service, as well as reduce wait time for users. However, their inability to handle new user

Read more »

A Generative Model for Punctuation in Dependency Trees

A Generative Model for Punctuation in Dependency Trees Xiang Lisa Li∗ and Dingquan Wang∗ and Jason Eisner Department of Computer Science, Johns Hopkins University xli150@jhu.edu, {wdd,jason}@cs.jhu.edu Abstract Treebanks traditionally treat punctuation marks as ordinary words, but linguists have suggested that a tree’s “true” punctuation marks are not observed (Nunberg, 1990). These latent “underlying” marks serve to delimit or separate constituents in the syntax tree.

Read more »

Syntax-aware Semantic Role Labeling without Parsing

Syntax-aware Semantic Role Labeling without Parsing Rui Cai and Mirella Lapata Institute for Language, Cognition and Computation School of Informatics, University of Edinburgh 10 Crichton Street, Edinburgh EH8 9AB Rui.Cai@ed.ac.uk mlap@inf.ed.ac.uk Abstract In this paper we focus on learning dependency aware representations for semantic role labeling without recourse to an external parser. The backbone of our model is an LSTM-based semantic role labeler

Read more »

On the Complexity and Typology of Inflectional Morphological Systems

On the Complexity and Typology of Inflectional Morphological Systems Ryan Cotterell and Christo Kirov and Mans Hulden and Jason Eisner Department of Computer Science, Johns Hopkins University Department of Linguistics, University of Colorado {ryan.cotterell,ckirov1,eisner}@jhu.edu, first.last@colorado.edu Abstract We quantify the linguistic complexity of different languages’ morphological systems. We verify that there is a statistically significant empirical trade-off between paradigm size and irregularity: A language’s inflectional

Read more »

Calculating the Optimal Step in Shift-Reduce Dependency Parsing:

Calculating the Optimal Step in Shift-Reduce Dependency Parsing: From Cubic to Linear Time Mark-Jan Nederhof School of Computer Science University of St Andrews, UK markjan.nederhof@googlemail.com Abstract We present a new cubic-time algorithm to calculate the optimal next step in shift-reduce dependency parsing, relative to ground truth, commonly referred to as dynamic oracle. Unlike existing algorithms, it is applicable if the training corpus contains non-projective

Read more »
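
For readers unfamiliar with the transition system this work analyzes, the short Python sketch below shows the standard arc-standard shift-reduce actions over a stack, a buffer, and a growing arc set. It is background only, under common textbook assumptions, and does not implement the dynamic-oracle computation described in the abstract.

# Arc-standard transitions for projective dependency parsing
# (background sketch, not the dynamic-oracle algorithm from the paper).

def shift(stack, buffer, arcs):
    # Move the next buffer word onto the stack.
    stack.append(buffer.pop(0))

def left_arc(stack, buffer, arcs):
    # The stack top becomes the head of the second-topmost item.
    dependent = stack.pop(-2)
    arcs.append((stack[-1], dependent))   # (head, dependent)

def right_arc(stack, buffer, arcs):
    # The second-topmost item becomes the head of the stack top.
    dependent = stack.pop()
    arcs.append((stack[-1], dependent))

# Toy run on a 3-word sentence (0 is the artificial root, 1-3 are word positions).
stack, buffer, arcs = [0], [1, 2, 3], []
for action in (shift, shift, left_arc, shift, right_arc, right_arc):
    action(stack, buffer, arcs)
print(arcs)   # [(2, 1), (2, 3), (0, 2)]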

What You Say and How You Say it: Joint Modeling of

What You Say and How You Say it: Joint Modeling of Topics and Discourse in Microblog Conversations Jichuan Zeng1∗ Jing Li2∗ Yulan He3 Cuiyun Gao1 Michael R. Lyu1 Irwin King1 1Department of Computer Science and Engineering The Chinese University of Hong Kong, HKSAR, China 2Tencent AI Lab, Shenzhen, China 3Department of Computer Science, University of Warwick, United Kingdom 1{jczeng, cygao, lyu, king}@cse.cuhk.edu.hk 2ameliajli@tencent.com, 3yulan.he@warwick.ac.uk Abstract

Read more »

Learning Neural Sequence-to-Sequence Models from

Learning Neural Sequence-to-Sequence Models from Weak Feedback with Bipolar Ramp Loss Laura Jehl∗ Carolin Lawrence∗ Computational Linguistics Heidelberg University 69120 Heidelberg, Germany {jehl, lawrence}@cl.uni-heidelberg.de Stefan Riezler Computational Linguistics & IWR Heidelberg University 69120 Heidelberg, Germany riezler@cl.uni-heidelberg.de Abstract In many machine learning scenarios, supervision by gold labels is not available and consequently neural models cannot be trained directly by maximum likelihood estimation. In a

Read more »
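
As background on the loss family named in the title, here is a minimal PyTorch sketch of a generic hope/fear ramp loss over scored candidate outputs; the scores and rewards are hypothetical inputs, and this is not claimed to match the paper’s exact bipolar formulation.

import torch

def ramp_loss(scores, rewards):
    # Generic hope/fear ramp loss (illustrative sketch, not the exact loss from the paper).
    # scores:  model scores for candidate outputs (requires grad)
    # rewards: external weak feedback for the same candidates
    hope = torch.argmax(scores + rewards)   # high score and high reward
    fear = torch.argmax(scores - rewards)   # high score but low reward
    return scores[fear] - scores[hope]      # raise the hope, lower the fear

# Toy usage with three hypothetical candidates.
scores = torch.tensor([0.2, 1.5, 0.7], requires_grad=True)
rewards = torch.tensor([1.0, -1.0, 0.4])
loss = ramp_loss(scores, rewards)
loss.backward()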

DREAM: A Challenge Data Set and Models for

DREAM: A Challenge Data Set and Models for Dialogue-Based Reading Comprehension Kai Sun♠ ∗ Dian Yu♥ Jianshu Chen♥ Dong Yu♥ Yejin Choi♦, ♣ Claire Cardie♠ ♠Cornell University, Ithaca, New York, USA ♥Tencent AI Lab, Bellevue, WA, USA ♦University of Washington, Seattle, WA, USA ♣Allen Institute for Artificial Intelligence, Seattle, WA, USA ks985@cornell.edu {yudian,jianshuchen,dyu}@tencent.com yejin@cs.washington.edu cardie@cs.cornell.edu Abstract We present DREAM, the first dialogue-based multiple-choice reading comprehension data

Read more »

Complex Program Induction for Querying Knowledge

Complex Program Induction for Querying Knowledge Bases in the Absence of Gold Programs Amrita Saha1 Ghulam Ahmed Ansari1 Abhishek Laddha∗1 Karthik Sankaranarayanan1 Soumen Chakrabarti2 1IBM Research India, 2Indian Institute of Technology Bombay amrsaha4@in.ibm.com, ansarigh@in.ibm.com, laddhaabhishek11@gmail.com, kartsank@in.ibm.com, soumen@cse.iitb.ac.in Abstract Recent years have seen increasingly complex question-answering on knowledge bases (KBQA) involving logical, quantitative, and comparative reasoning over KB subgraphs. Neural Program Induction

Read more »

SECTOR: A Neural Model for Coherent Topic

SECTOR: A Neural Model for Coherent Topic Segmentation and Classification Sebastian Arnold Rudolf Schneider Beuth University of Applied Sciences Berlin, Germany {sarnold, ruschneider}@beuth-hochschule.de Philippe Cudré-Mauroux University of Fribourg Fribourg, Switzerland pcm@unifr.ch Felix A. Gers Alexander Löser Beuth University of Applied Sciences Berlin, Germany {gers, aloeser}@beuth-hochschule.de Abstract When searching for information, a human reader first glances over a document, spots relevant sections, and then

Read more »

Autosegmental Input Strictly Local Functions

Autosegmental Input Strictly Local Functions Jane Chandlee Tri-College Department of Linguistics Haverford College jchandlee@haverford.edu Adam Jardine Department of Linguistics Rutgers University adam.jardine@rutgers.edu Abstract Autosegmental representations (ARs; Goldsmith, 1976) are claimed to enable local analyses of otherwise non-local phenomena (Odden, 1994). Focusing on the domain of tone, we investigate this ability of ARs using a computationally well-defined notion of locality extended from Chandlee (2014). The result

Read more »

GILE: A Generalized Input-Label Embedding for Text Classification

GILE: A Generalized Input-Label Embedding for Text Classification Nikolaos Pappas James Henderson Idiap Research Institute, Martigny 1920, Switzerland {nikolaos.pappas,james.henderson}@idiap.ch Abstract Neural text classification models typically treat output labels as categorical variables that lack description and semantics. This forces their parametrization to be dependent on the label set size, and, thus, they are unable to scale to large label sets and generalize to unseen ones. Existing

Read more »
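
To make the input-label embedding idea concrete, the sketch below scores (text, label) pairs in a joint space so the output layer no longer depends on a fixed label-set size; it is an illustrative PyTorch module under simple assumptions, not the paper’s exact GILE parameterization.

import torch
import torch.nn as nn

class InputLabelScorer(nn.Module):
    # Scores each (document, label) pair via a joint projection, so new labels
    # only require a label vector, not a new output layer (hypothetical sketch).
    def __init__(self, text_dim, label_dim, joint_dim):
        super().__init__()
        self.text_proj = nn.Linear(text_dim, joint_dim)
        self.label_proj = nn.Linear(label_dim, joint_dim)
        self.out = nn.Linear(joint_dim, 1)

    def forward(self, text_vecs, label_vecs):
        # text_vecs: (batch, text_dim); label_vecs: (num_labels, label_dim)
        h = self.text_proj(text_vecs).unsqueeze(1)    # (batch, 1, joint_dim)
        e = self.label_proj(label_vecs).unsqueeze(0)  # (1, num_labels, joint_dim)
        joint = torch.tanh(h + e)                     # (batch, num_labels, joint_dim)
        return self.out(joint).squeeze(-1)            # (batch, num_labels) scores

# Toy usage: 2 documents scored against 5 labels, which could include unseen ones.
scorer = InputLabelScorer(text_dim=8, label_dim=6, joint_dim=16)
print(scorer(torch.randn(2, 8), torch.randn(5, 6)).shape)   # torch.Size([2, 5])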

Rotational Unit of Memory: A Novel Representation

Rotational Unit of Memory: A Novel Representation Unit for RNNs with Scalable Applications Rumen Dangovski*,1, Li Jing*,1, Preslav Nakov2, Mićo Tatalović1,3, Marin Soljačić1 *equal contribution 1Massachusetts Institute of Technology 2Qatar Computing Research Institute, HBKU 3Association of British Science Writers {rumenrd, ljing}@mit.edu, pnakov@qf.org.qa, {mico, soljacic}@mit.edu Abstract Stacking long short-term memory (LSTM) cells or gated recurrent units (GRUs) as part of

Read more »

Synchronous Bidirectional Neural Machine Translation

Synchronous Bidirectional Neural Machine Translation Long Zhou1,2, Jiajun Zhang1,2∗, Chengqing Zong1,2,3 1National Laboratory of Pattern Recognition, CASIA, Beijing, China 2University of Chinese Academy of Sciences, Beijing, China 3CAS Center for Excellence in Brain Science and Intelligence Technology, Shanghai, China {long.zhou, jjzhang, cqzong}@nlpr.ia.ac.cn Abstract Existing approaches to neural machine translation (NMT) generate the target language sequence token-by-token from left to right. However, this kind of

Read more »
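
For contrast with the synchronous bidirectional decoding the title announces, here is a minimal sketch of the conventional left-to-right, token-by-token decoding loop that the abstract describes as the existing approach; step_fn is a hypothetical stand-in for a model’s next-token distribution.

def greedy_decode(step_fn, bos_id, eos_id, max_len=50):
    # Standard left-to-right decoding: each token is chosen conditioned only on
    # the prefix generated so far (illustrative sketch, not the model from the paper).
    prefix = [bos_id]
    for _ in range(max_len):
        probs = step_fn(prefix)                                  # next-token probabilities
        next_id = max(range(len(probs)), key=probs.__getitem__)  # greedy choice
        prefix.append(next_id)
        if next_id == eos_id:
            break
    return prefix

# Toy usage with a fake 4-token vocabulary (id 3 is EOS).
print(greedy_decode(lambda prefix: [0.1, 0.2, 0.1, 0.6], bos_id=0, eos_id=3))   # [0, 3]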