Search results: showing 16-30 of 82 records matching "Theoretical Linguistics: Grammar" (query time: 0.024 s)
We explore the consequences of letting the incremental and integrative
nature of language processing inform the design of competence
grammar. What emerges is a view of grammar as a system of local...
Half a century ago, Noam Chomsky introduced the field of linguistics to new mathematical
tools drawn largely from recursive function theory. These were exciting tools that imparted
mathematical precis...
Gradient grammar: An effect of animacy on the syntax of give in New Zealand and American English
New Zealand English US English Dative alternation Animacy Probabilistic grammar
2015/6/17
Bresnan et al. (2007) show that a statistical model can predict United States (US) English speakers’ syntactic choices with ‘give’-type verbs extremely accurately. They argue that these results are co...
Harmonic grammar with linear programming: From linear systems to linguistic typology
Harmonic Grammar Optimality Theory linear programming typology Lango ATR harmony positional markedness positional faithfulness
2015/6/15
Harmonic Grammar (HG) is a model of linguistic constraint interaction in which well-formedness is calculated in terms of the sum of weighted constraint violations. We show how linear programming algor...
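The weighted-violation calculation the abstract describes can be sketched in a few lines. The constraint names, weights, and candidates below are illustrative only (not taken from the paper): each candidate's harmony is the negative sum of its weighted constraint violations, and the candidate with the highest harmony wins.

```python
# Minimal sketch of Harmonic Grammar scoring.
# Constraints, weights, and candidates are invented for illustration.

def harmony(violations, weights):
    """Harmony = -(sum over constraints of weight * violation count)."""
    return -sum(weights[c] * v for c, v in violations.items())

# Hypothetical constraint weights.
weights = {"Onset": 2.0, "NoCoda": 1.0, "Max": 3.0}

# Hypothetical candidates with their violation counts per constraint.
candidates = {
    "pat": {"Onset": 0, "NoCoda": 1, "Max": 0},   # harmony = -1.0
    "pa":  {"Onset": 0, "NoCoda": 0, "Max": 1},   # harmony = -3.0
}

# The winner is the candidate with the highest (least negative) harmony.
winner = max(candidates, key=lambda c: harmony(candidates[c], weights))
```

Finding weights that make the attested forms win across a whole typology is exactly the linear-programming problem the paper addresses.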
Natural Language Grammar Induction using a Constituent-Context Model
Natural Language Grammar Induction Constituent
2015/6/12
This paper presents a novel approach to the unsupervised learning of syntactic analyses of natural language text. Most previous work has focused on maximizing likelihood according to generative PCFG m...
A Generative Constituent-Context Model for Improved Grammar Induction
Generative Constituent Context Model Grammar Induction
2015/6/12
We present a generative distributional model for the unsupervised induction of natural language syntax which explicitly models constituent yields and contexts. Parameter search with EM produces higher...
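The yield/context statistics that a constituent-context model scores can be illustrated with a small span enumerator (a sketch of the feature extraction only; the EM parameter search the abstract mentions is omitted, and the boundary symbols are assumed names):

```python
# Sketch: enumerate the (yield, context) pairs a constituent-context
# model considers for every contiguous span of a sentence.
# "<s>" and "</s>" are assumed sentence-boundary symbols.

def span_features(tokens):
    """Return (span_yield, (left_context, right_context)) for all spans."""
    out = []
    n = len(tokens)
    for i in range(n):
        for j in range(i + 1, n + 1):
            yld = tuple(tokens[i:j])
            left = tokens[i - 1] if i > 0 else "<s>"
            right = tokens[j] if j < n else "</s>"
            out.append((yld, (left, right)))
    return out

pairs = span_features("the dog barks".split())
# A 3-token sentence has 6 contiguous spans.
```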
Feature Selection for a Rich HPSG Grammar Using Decision Trees
Feature Selection Rich HPSG Grammar Decision Trees
2015/6/12
This paper examines feature selection for log linear models over rich constraint-based grammar (HPSG) representations by building decision trees over features in corresponding probabilistic context fr...
In this paper, we describe experiments on HPSG parse disambiguation using the Redwoods HPSG treebank (Oepen et al. 2002a,b,c). HPSG is a constraint-based lexicalist (“unification”) grammar formalism.
Panini’s grammar is universally admired for its insightful analysis of Sanskrit. In addition, some of its features have a more specialized appeal. Sanskritists prize the completeness of its descriptiv...
Universal Grammar Is a Universal Grammar
Universal Grammar Turing completeness language evolution
2015/6/10
Is Universal Grammar a universal grammar? From Chomsky's hierarchy we deduce that for each grammar there is a Turing machine, and conversely. Following this equivalence, it is immediate to conclude th...
Lateen EM: Unsupervised Training with Multiple Objectives, Applied to Dependency Grammar Induction
Lateen EM Unsupervised Training Multiple Objectives Dependency Grammar Induction
2015/6/10
We present new training methods that aim to mitigate local optima and slow convergence in unsupervised training by using additional imperfect objectives. In its simplest form, lateen EM alternates bet...
Capitalization Cues Improve Dependency Grammar Induction
Capitalization Cues Grammar Induction
2015/6/10
We show that orthographic cues can be helpful for unsupervised parsing. In the Penn Treebank, transitions between upper- and lowercase tokens tend to align with the boundaries of base (English) noun p...
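The orthographic cue the abstract describes, transitions between uppercase- and lowercase-initial tokens, is easy to detect. The sketch below (with an invented example sentence) flags those transition points, which the paper reports tend to align with base noun-phrase boundaries:

```python
# Illustrative sketch: find transitions between lowercase- and
# uppercase-initial tokens. The example sentence is invented.

def case_transitions(tokens):
    """Return indices i where tokens[i] and tokens[i+1] differ in
    whether their first character is uppercase."""
    def upper(t):
        return t[:1].isupper()
    return [i for i in range(len(tokens) - 1)
            if upper(tokens[i]) != upper(tokens[i + 1])]

tokens = "The White House said Congress will act".split()
print(case_transitions(tokens))  # → [2, 3, 4]
```

The transition after "House" and around "Congress" brackets the capitalized noun phrases, which is the kind of boundary signal the induction system exploits.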
Three Dependency-and-Boundary Models for Grammar Induction
Three Dependency Boundary Models Grammar Induction
2015/6/10
We present a new family of models for unsupervised parsing, Dependency and Boundary models, that use cues at constituent boundaries to inform head-outward dependency tree generation. We build on three...
Bootstrapping Dependency Grammar Inducers from Incomplete Sentence Fragments via Austere Models
Dependency Grammar Induction Unsupervised Dependency Parsing Curriculum Learning Partial EM Punctuation Unsupervised Structure Learning
2015/6/10
Modern grammar induction systems often employ curriculum learning strategies that begin by training on a subset of all available input that is considered simpler than the full data. Traditionally, fil...
Breaking Out of Local Optima with Count Transforms and Model Recombination: A Study in Grammar Induction
Local Optima Count Transforms Model Recombination Grammar Induction
2015/6/10
Many statistical learning problems in NLP call for local model search methods. But accuracy tends to suffer with current techniques, which often explore either too narrowly or too broadly: hill-climber...