Authors :
Lakshin Pathak; Kajal Lochab; Veena Gidwani
Volume/Issue :
Volume 9 - 2024, Issue 8 - August
Google Scholar :
https://shorturl.at/rrkPp
Scribd :
https://shorturl.at/fldwV
DOI :
https://doi.org/10.38124/ijisrt/IJISRT24AUG1043
Abstract :
This paper presents a pioneering approach to
text generation employing Recurrent Neural Networks
(RNN) with Long Short-Term Memory (LSTM)
architecture, inspired by the rich and timeless prose of
William Shakespeare. The motivation stems from the
enduring allure of Shakespearean language, which has
captivated audiences across centuries, and the challenge
of replicating its intricate style using modern computational
techniques. Our research contributes a novel methodology
that leverages the capabilities of RNN LSTM networks to
emulate the linguistic nuances of Shakespeare with
remarkable fidelity. The paper begins by providing a
comprehensive overview of RNN LSTM networks,
highlighting their suitability for sequential data processing
tasks and their ability to capture long-range dependencies.
A review of related work in the field sets the stage for our
proposed approach, shedding light on recent advancements
and methodologies employed in text generation using
similar techniques. We formulate the problem by defining
the mathematical framework, optimization objectives, and
evaluation metrics for our proposed model. The
architecture consists of three layers: the data layer for
preprocessing input text data, the intelligence layer
comprising multiple LSTM units for capturing different
aspects of Shakespearean language, and the application
layer for generating output text based on learned
representations. Experimental results demonstrate the
effectiveness of our approach, with evaluations
conducted on a corpus of Shakespearean texts. In
conclusion, our research presents a significant
advancement in the field of natural language generation,
opening new avenues for exploring the intersection of
literature and artificial intelligence.
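The data and application layers described above can be sketched in code. The following is a minimal, illustrative example only, not the authors' implementation: it assumes a character-level setup, a hypothetical window length, and a standard temperature-sampling scheme for turning model logits into generated characters.

```python
# Hypothetical sketch of the abstract's "data layer" (preprocessing input
# text into training windows) and "application layer" (sampling output
# characters from a model's distribution). All names, the window length,
# and the toy corpus are illustrative assumptions.
import math
import random

def make_windows(text, seq_len):
    """Data layer: map characters to integer ids and slice the corpus
    into (input window, next character) training pairs."""
    chars = sorted(set(text))
    char_to_idx = {c: i for i, c in enumerate(chars)}
    idx_to_char = {i: c for c, i in char_to_idx.items()}
    encoded = [char_to_idx[c] for c in text]
    windows = [
        (encoded[i : i + seq_len], encoded[i + seq_len])
        for i in range(len(encoded) - seq_len)
    ]
    return windows, char_to_idx, idx_to_char

def sample_with_temperature(logits, temperature=1.0, rng=random):
    """Application layer: convert logits to a softmax distribution,
    sharpened or flattened by the temperature, and draw one id."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                              # subtract max for stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return rng.choices(range(len(probs)), weights=probs, k=1)[0]

corpus = "to be or not to be"
windows, c2i, i2c = make_windows(corpus, seq_len=4)
print(len(windows))        # → 14 training pairs
print(i2c[windows[0][1]])  # → 'e', the character after the first window
```

The intelligence layer (the stacked LSTM units) would sit between these two steps, consuming the integer windows and producing the logits that the sampler consumes; lower temperatures yield more conservative, repetitive text, higher ones more adventurous output.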
Keywords :
Text Generation, RNN LSTM, Shakespearean Language, Natural Language Processing