Unlocking Creativity in NLP: Advancements in Metaphor Generation
Introduction
Natural Language Processing (NLP) is at the forefront of AI innovation, continuously redefining the limits of machine comprehension and creativity. Among the most intriguing and complex tasks within NLP is metaphor generation—a process that allows machines to draw parallels between seemingly unrelated concepts, enriching language with vividness and depth. Metaphors not only make communication more engaging but also facilitate the understanding of abstract or unfamiliar ideas by linking them to more familiar ones. Given its critical role in applications such as machine translation and sentiment analysis, metaphor generation has become a focal point of research in computational linguistics. Recent advancements in AI have accelerated the development of sophisticated models and frameworks, propelling this area of study into new and exciting territories.
Understanding Metaphor Generation
Metaphor generation involves crafting expressions that describe one concept in terms of another, enriching communication with deeper meaning and emotional resonance. In the field of Natural Language Processing (NLP), metaphor generation focuses on teaching machines to create these imaginative expressions, enabling them to produce more nuanced and human-like text.
Key Sub-Tasks in Metaphor Generation
Verb Substitution
Verb substitution is a natural focus in metaphor generation: verbs are used metaphorically with high frequency, and a literal verb can be replaced with a metaphorical one without altering the sentence’s syntax. For the literal input “The wildfire spread through the forest at an amazing speed,” a generated metaphor is “The wildfire danced through the forest at an amazing speed.” Yu and Wan (2019)[1] introduced an end-to-end framework that used WordNet synonyms and hypernyms for candidate selection and a PoS-constrained language model, followed by a joint beam search to ensure metaphoricity. Stowe et al. (2020)[2] developed a lexical replacement method and a metaphor-masking Seq2Seq model, selecting candidate words based on the cosine similarity of word embeddings. Chakrabarty et al. (2021)[3] fine-tuned a BART model for metaphorical sentence generation in poetry, incorporating a metaphor discriminator to enhance creativity. Stowe et al. (2021a)[4] compared free and controlled generation using T5, finding that controlled models produced more novel metaphors, while free generation improved fluency. They also explored correlations between automatic and human evaluations, identifying SentBERT and MoverScore as effective measures of semantic similarity, and perplexity as a measure of fluency. Stowe et al. (2021b)[5] further advanced metaphor generation with two models, CM-Lex and CM-BART, which leverage FrameNet concept mappings. Despite their success, verb substitution methods often limit creativity by restricting syntactic and stylistic diversity.
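To make the substitution pipeline concrete, here is a minimal sketch in Python. The candidate table and its scores are hand-made stand-ins for the WordNet synonym/hypernym lookup and embedding cosine similarities used in the cited systems; only the overall shape of the approach is illustrated.

```python
# Toy sketch of metaphoric verb substitution: swap a literal verb for the
# highest-scoring metaphorical candidate. The table below is a hypothetical
# stand-in for WordNet candidate retrieval plus embedding-based scoring.

# Hypothetical candidates: literal verb -> [(metaphorical candidate, score)]
CANDIDATES = {
    "spread": [("danced", 0.82), ("raced", 0.74), ("moved", 0.31)],
    "rose":   [("climbed", 0.77), ("soared", 0.85), ("went", 0.20)],
}

def substitute_verb(sentence: str, literal_verb: str, threshold: float = 0.5) -> str:
    """Replace the literal verb with the best-scoring metaphorical candidate."""
    candidates = CANDIDATES.get(literal_verb, [])
    viable = [(word, score) for word, score in candidates if score >= threshold]
    if not viable:
        return sentence  # no sufficiently metaphorical candidate: keep literal
    best, _ = max(viable, key=lambda pair: pair[1])
    # Replace only the verb token, so the sentence syntax is untouched.
    return sentence.replace(literal_verb, best, 1)

print(substitute_verb(
    "The wildfire spread through the forest at an amazing speed.", "spread"))
# -> The wildfire danced through the forest at an amazing speed.
```

Because only one token changes, the output inherits the grammaticality of the input, which is exactly why this sub-task is the simplest entry point into metaphor generation.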
Metaphor Surface Realization (MSR)
Zheng et al. (2019)[6] proposed that metaphors can enhance user engagement in human-computer conversations. They developed an unsupervised method for metaphor generation by selecting target words from poetry themes and source words from chatbot logs, filtered by concreteness and frequency. The task involved finding a connecting word representing a shared property between the source and target words, determined by a score based on a distance function. However, the source-target pairs were chosen randomly. For the target word “Time” with the property “Valuable,” the generated metaphor is “Time is as valuable as gold.” Song et al. (2020)[7] approached metaphor surface realization (MSR) as a knowledge graph completion task. Their model generated a source concept based on a target word and an attribute, embedding metaphor knowledge into the graph and constructing figurative expressions through simile templates. Current MSR methods primarily focus on nominal metaphors and rely on pre-defined phrases, limiting their applicability in open-domain metaphor generation tasks, where the need for more flexibility poses challenges.
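The template-filling step can be sketched in a few lines. The shared-property score table here is a hypothetical stand-in for the distance-based scoring over word representations described above; in the cited work the source and target words come from chatbot logs and poetry themes rather than a fixed list.

```python
# Toy sketch of metaphor surface realization via a simile template: choose the
# source word with the highest shared-property score for the target, then fill
# "TARGET is as PROPERTY as SOURCE." The score table is a hand-made stand-in
# for the distance-function scoring used in the cited approach.

# Hypothetical shared-property scores: (target, property, source) -> score
SCORES = {
    ("time", "valuable", "gold"): 0.91,
    ("time", "valuable", "water"): 0.44,
    ("time", "valuable", "sand"): 0.58,
}

def realize_simile(target: str, prop: str, sources: list) -> str:
    """Pick the best source word for (target, property) and fill the template."""
    best = max(sources, key=lambda s: SCORES.get((target, prop, s), 0.0))
    return f"{target.capitalize()} is as {prop} as {best}."

print(realize_simile("time", "valuable", ["gold", "water", "sand"]))
# -> Time is as valuable as gold.
```

The reliance on a fixed template is also the weakness the paragraph above notes: the output is fluent but locked into one surface form.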
Sentence Generation
Sentence generation involves producing entire sentences that contain metaphors, often using advanced language models. For the input target “Autumn,” a generated sentence is: “Autumn is like the heavy ginkgo, shining as a rainbow behind the treetop.” Brooks and Youssef (2020)[8] explored sentence generation in metaphor production, noting that certain syntactic patterns in metaphors are absent in literal texts. They developed a framework using an unsupervised LSTM language model to generate sentences under specific constraints and to evaluate their metaphoricity and novelty based on unique syntactic patterns. Li et al. (2022)[9] tackled Chinese nominal metaphor generation with a GPT-2-based model, leveraging large-scale unlabelled data and self-training to address data sparsity. By incorporating an auxiliary task for metaphor identification, they enhanced the model’s focus on metaphorical components, resulting in more diverse outputs. However, the unplanned nature of the generated concepts led to aimless metaphors. Overall, sentence generation methods offer greater syntactic flexibility and creativity in metaphor production but face challenges in producing purposeful metaphors with controllable mappings between source and target concepts.
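The self-training loop that Li et al. use against data sparsity can be sketched with stub components. Both the generator and the metaphor identifier below are hypothetical placeholders (in the paper they are GPT-2-based models with an auxiliary identification task); the sketch only shows how the identifier filters generated candidates before they are reused as training data.

```python
# Toy sketch of self-training for metaphor generation: a generator proposes
# candidate sentences, a metaphor identifier keeps only those it judges
# metaphorical, and the survivors become extra training data. Both components
# are stubs; in the cited work they are neural models.

def generate_candidates(target: str) -> list:
    # Stub generator: in practice a fine-tuned language model.
    return [
        f"{target} is like the heavy ginkgo, shining behind the treetop.",
        f"{target} arrives after summer.",  # literal; should be filtered out
    ]

def is_metaphorical(sentence: str) -> bool:
    # Stub identifier using a trivial cue-word check; in practice this is an
    # auxiliary classifier trained jointly with the generator.
    return " like " in sentence or " as " in sentence

def self_train_round(targets: list) -> list:
    """One round: keep only the candidates the identifier accepts."""
    accepted = []
    for target in targets:
        for candidate in generate_candidates(target):
            if is_metaphorical(candidate):
                accepted.append(candidate)
    return accepted

print(self_train_round(["Autumn"]))
```

The filtering step is what regulates metaphoricity, but nothing in the loop plans the source concept, which is why the outputs can drift into the "aimless" metaphors noted above.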
Advances and Challenges
Recent advancements in metaphor generation have primarily focused on three key sub-tasks: replacing literal verbs with metaphorical ones, generating sentences with nominal metaphors, and creating sentences with different syntactic patterns, regardless of the part of speech (PoS). These tasks rely heavily on lexical resources like WordNet, COMET, and MetaNet to support conceptual mapping. Seq2Seq models are commonly used, though they often face limitations, such as dependency on literal parallel corpora. To address this, researchers have proposed various methods, including masking frameworks and auxiliary tasks to regulate metaphoricity.
Evaluation and Future Directions
Evaluation of metaphor generation models typically involves both automatic and human assessments. Creativity, metaphoricity, and fluency are key metrics, with tools like BLEU, Sentence-BERT, and perplexity playing crucial roles. Despite progress, challenges remain, particularly in ensuring that generated metaphors are meaningful and appropriate for practical use. Studies have demonstrated the ability to generate metaphors using arbitrary concept mappings, but further work is needed to refine control over source concepts and target metaphorical meanings. As research evolves, the goal is to develop more sophisticated techniques that can produce metaphors that are not only creative but also contextually relevant and useful in real-world applications.
Conclusion
Metaphor generation in NLP is a fascinating blend of creativity and technology. The recent advancements highlight the potential of machines to produce imaginative and engaging language, transforming how we interact with technology. As research progresses, we can look forward to even more sophisticated and versatile metaphor generation models that will continue to push the boundaries of creativity in NLP.
Summary
Metaphor generation in Natural Language Processing (NLP) enriches machine-generated text by drawing imaginative parallels between concepts. Key methods include verb substitution, metaphor surface realization, and sentence generation. Despite advancements, challenges remain in ensuring metaphors are creative, meaningful, and contextually relevant. Ongoing research aims to enhance the sophistication and utility of metaphor generation models.

Jathushan Raveendra
Undergraduate
Department of Computer Science and Engineering
University of Moratuwa

Co-author: Uthayasanker Thayasivam
Senior Lecturer
Department of Computer Science and Engineering
University of Moratuwa