“…With recent advances in unconstrained language generation (Radford et al., a,b; Brown et al., 2020), an emerging direction is to adapt such pre-trained language models to follow certain stylistic constraints (Wang et al., 2019; Syed et al., 2020). These approaches rely on the inherent properties of the training corpus to tailor generation to target characteristics; for example, implicitly learning author-stylized text generation by training on an author-specific corpus (Syed et al., 2020), or learning to generate formal text (Wang et al., 2019). However, it is desirable to have explicit control over certain stylistic aspects of such generation, e.g., emulating an author's lexical choices, capturing syntactic constructs, or inducing sentential preferences (active vs. passive) in the generated text.…”