Selectional Constraints: an Information-theoretic Model and its Computational Realization – Resnik (1996)

Resnik proposes an information-theoretic model of verb selectional restrictions.  He uses WordNet to model noun conceptual classes and shows that the relative entropy (KL divergence) between the prior distribution over noun classes and the distribution of noun classes conditioned on the verb is a good measure of how strong a selectional constraint the verb imposes.  He shows that this measure of selectional restriction correlates with human judgments of plausibility and can predict which verbs undergo implicit object alternations and when a speaker may choose to use an implicit object construction.

Overall, the correlations with human judgments were weaker than expected.  Resnik argues that this may be due to the minimal nature of the model and the fact that it does not distinguish between verb or noun senses.  He argues that this minimality is actually an advantage, since it corresponds to the information available to the language learner.  It could indicate that sense disambiguation happens early in language learning and is not reliant on selectional constraints.  In all likelihood, both selectional constraint information and sense disambiguation are learned together and reinforce each other.
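The core measure can be sketched in a few lines.  This is a minimal illustration, not Resnik's implementation: the (verb, noun-class) counts below are invented toy data standing in for WordNet-smoothed corpus counts, and the class labels are hypothetical.  The function computes the selectional preference strength S(v) = D(P(c|v) || P(c)), i.e. the relative entropy between the class distribution conditioned on the verb and the prior class distribution.

```python
from collections import Counter
from math import log2

# Toy (verb, noun-class) co-occurrence counts standing in for
# WordNet-smoothed corpus data; class labels are illustrative only.
pairs = [
    ("eat", "food"), ("eat", "food"), ("eat", "food"), ("eat", "artifact"),
    ("see", "food"), ("see", "artifact"), ("see", "person"), ("see", "person"),
]

class_counts = Counter(c for _, c in pairs)   # prior counts P(c)
total = sum(class_counts.values())

def selectional_preference_strength(verb):
    """S(v) = D( P(c|v) || P(c) ): relative entropy between the
    noun-class distribution given the verb and the prior distribution."""
    verb_counts = Counter(c for v, c in pairs if v == verb)
    n = sum(verb_counts.values())
    return sum((cnt / n) * log2((cnt / n) / (class_counts[c] / total))
               for c, cnt in verb_counts.items())

# "eat" concentrates its objects in one class, so it should impose a
# stronger selectional constraint than the indiscriminate "see".
assert selectional_preference_strength("eat") > selectional_preference_strength("see")
```

A verb like "eat", whose objects cluster in a single conceptual class, yields a larger divergence from the prior than a verb like "see", which accepts objects from many classes; that asymmetry is exactly what the model uses as the strength of the selectional constraint.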