Natural Language Generation

Semantic Noise Matters for Neural Natural Language Generation

Neural natural language generation (NNLG) systems are known for their pathological outputs, i.e., generating text that is unrelated to the input specification. In this paper, we show the impact of semantic noise on state-of-the-art NNLG models which …

Arguing for consistency in the human evaluation of natural language generation systems

Learning Sentence Planning Rules with Bayesian Methods

Data-driven natural language generation (NLG) is not a new concept. For decades, researchers have been studying corpora to inform their development of NLG systems. More recently, this interest has shifted away from rule-based systems to fully …

Getting Started With OpenCCG

OpenCCG is a Java library that handles both parsing and generation. I've mostly used it for surface realization, converting fairly syntactic meaning representations into natural language text, but you can also use it for parsing or for generation from higher-level semantic representations. This tutorial is intended to help you start exploring OpenCCG with the tccg utility. If you haven't installed OpenCCG yet, see the first post in this series, Installing OpenCCG.
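For orientation before diving into tccg, surface realization through OpenCCG's Java API looks roughly like the sketch below. This is a minimal sketch based on my reading of the realizer documentation, not code from this tutorial: the class and method names (Grammar, Realizer, Edge, Sign) and the file names grammar.xml and testbed.xml are assumptions you should verify against the docs shipped with your OpenCCG install.

    // Minimal sketch: loading a grammar and realizing one logical form (LF).
    // Class/method names follow the OpenCCG realizer docs; verify against your install.
    import java.net.URL;
    import org.jdom.Document;              // OpenCCG bundles JDOM for its XML formats
    import opennlp.ccg.grammar.Grammar;
    import opennlp.ccg.realize.Edge;
    import opennlp.ccg.realize.Realizer;
    import opennlp.ccg.synsem.LF;
    import opennlp.ccg.synsem.Sign;

    public class RealizeDemo {
        public static void main(String[] args) throws Exception {
            // Load the compiled grammar (grammar.xml) and set up the chart realizer.
            Grammar grammar = new Grammar(new URL("file:grammar.xml"));
            Realizer realizer = new Realizer(grammar);

            // Read an XML file containing a logical form and extract the LF.
            Document doc = grammar.loadFromXml("testbed.xml");
            LF lf = Realizer.getLfFromDoc(doc);

            // Realize the LF and print the orthography of the best edge's sign.
            Edge bestEdge = realizer.realize(lf);
            Sign bestSign = bestEdge.getSign();
            System.out.println(bestSign.getOrthography());
        }
    }

In everyday use you would more likely just run tccg from the grammar directory, which wraps this same machinery in an interactive session.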

From OpenCCG to AI Planning: Detecting Infeasible Edges in Sentence Generation

The search space in grammar-based natural language generation tasks can get very large, which is particularly problematic when generating long utterances or paragraphs. Using surface realization with OpenCCG as an example, we show that we can …

Search Challenges in Natural Language Generation with Complex Optimization Objectives

Automatic natural language generation (NLG) is a difficult problem even when merely trying to come up with natural-sounding utterances. Ubiquitous applications, in particular companion technologies, pose the additional challenge of flexible …