July 2023

Welcome to the IUP Journal of Computer Sciences

Focus

Generative Pre-trained Transformer (GPT) models and Artificial Intelligence (AI) are two interconnected and rapidly evolving trends in Natural Language Processing (NLP) and Machine Learning. GPT, developed by OpenAI, is a series of state-of-the-art language models built on the transformer architecture. These models are pre-trained on massive amounts of text and can generate coherent, contextually relevant text from a given prompt. GPT models such as GPT-3 have demonstrated impressive capabilities across language-related tasks, including text generation, translation, summarization and conversational interaction. AI, on the other hand, refers to the broader field of developing intelligent machines that can perform tasks which typically require human intelligence. It encompasses techniques and approaches such as Machine Learning, Deep Learning, NLP and computer vision, and aims to mimic or simulate human intelligence to solve complex problems, automate tasks and improve decision-making processes. Recent years have seen an increasing focus on developing and advancing large-scale language models like GPT, which have significantly pushed the boundaries of what machines can achieve in language understanding and generation. These models have garnered widespread attention for their impressive language capabilities and potential applications across industries and domains.


This issue consists of four papers. The first paper, "Sentiment Analysis of YouTube Comments Using Deep Neural Networks and Pre-Trained Word Embedding" by A C Nanayakkara and G A D M Thennakoon, discusses the importance of sentiment analysis for various NLP tasks, particularly in the context of the massive data generated by social media. The study handles sentiment analysis using Deep Learning models with GloVe word embeddings, employing a basic Neural Network (NN), Convolutional Neural Networks (CNNs) and a Long Short-Term Memory (LSTM) NN architecture to categorize the sentiment of comments on a popular YouTube video. The three models are compared on training accuracy, testing accuracy and overfitting indicators. The results indicate that the LSTM model with GloVe word embeddings outperforms both the CNN and the simple NN, and the paper recommends it as a way to overcome the limitations of traditional recurrent NNs.

The second paper, "Analysis of GPT Sentiments Using Blog Mining" by Sandeep Bhattacharjee, focuses on understanding the evolution of language models, particularly Generative Pre-trained Transformers (GPTs). The author used text mining analytics in the R 4.2.2 console to analyze 72 blogs published on the OpenAI website and identified key terms related to GPT, such as "learning," "openAI," "models" and "model." A correlation analysis further revealed associations between terms like "appreciated," "creativity," "flex," "combine" and "connections," which may offer valuable insights into the relationships between different concepts in the context of language models and their applications. The study is relevant for anyone interested in language models, specifically GPTs, and their potential applications in various fields.

The third paper, "Dzyaloshinskii-Moriya Interactions of Skyrmions and Antiskyrmions in Mimicking Electron-Hole Pairs for Logic and Memory Applications" by Mosiori Cliff Orori, presents a model and simulation of magnetic skyrmions stabilized by the Dzyaloshinskii-Moriya interaction. Skyrmions are swirling quasi-particles with a unique magnetic texture: tiny whirls of magnetic configuration, characterized by their topological charge, that can be thought of as localized magnetic knots. Antiskyrmions are their counterparts with opposite topological charge, and both structures can be stabilized in certain materials by the Dzyaloshinskii-Moriya interaction. The growing demand for novel memory and logic devices in communication and information technology has driven interest in skyrmions and antiskyrmions, whose topological properties and stability make them potential candidates for ultra-dense magnetic memories. Recent laboratory observations of skyrmions and antiskyrmions at room temperature have spurred further research into their transport and dynamic properties, as well as efforts to explore applications in reservoir computing, a field that requires large memory storage and fast access capabilities. These findings suggest that skyrmions and antiskyrmions could serve as memory elements and potentially even as logic elements.

The last paper, "A Comparative Analysis of Traditional and Lightweight Algorithms" by Rimpi Rani and R K Bathla, discusses the importance of cryptographic algorithms in safeguarding sensitive data and communications, and notes that such algorithms can be compared on various criteria. The aim is a comparative analysis between traditional cryptographic algorithms such as AES, RSA and SHA and newer lightweight algorithms such as Speck, Simon and Clefia. The paper stresses that the choice of algorithm should be tailored to the specific application and its security requirements: traditional algorithms like AES, RSA and SHA suit scenarios that demand high levels of security, while lightweight algorithms like Speck, Simon and Clefia are better suited to resource-constrained environments where efficiency and low computational complexity are essential. Overall, the paper contributes valuable insights into the strengths and limitations of both traditional and lightweight algorithms, helping users select the most appropriate algorithm for their particular use case.

B Seetharamulu
Consulting Editor

Article : Price (₹)
Sentiment Analysis of YouTube Comments Using Deep Neural Networks and Pre-Trained Word Embedding : 100
Analysis of GPT Sentiments Using Blog Mining : 100
Dzyaloshinskii-Moriya Interactions of Skyrmions and Antiskyrmions in Mimicking Electron-Hole Pairs for Logic and Memory Applications : 100
A Comparative Analysis of Traditional and Lightweight Algorithms : 100
Contents: (July 2023)

Sentiment Analysis of YouTube Comments Using Deep Neural Networks and Pre-Trained Word Embedding
A C Nanayakkara and G A D M Thennakoon

Sentiment analysis of textual content is important for a wide range of natural language processing tasks. In particular, the growth of social media creates a huge need for sentiment analysis to extract relevant information from the massive data on the Internet. Motivated by the accomplishments of Deep Learning, the study handles the sentiment analysis problem by employing Deep Learning models with GloVe word embedding. A basic Neural Network (NN) was used as the experiment's baseline, together with Convolutional NNs (CNNs) and a Long Short-Term Memory (LSTM) NN architecture, to categorize the tone of comments left on one of the most popular YouTube videos. The study recommends the LSTM model to overcome the shortcomings of the traditional recurrent neural network. In the comparison between the densely connected NN model, the CNN model and the LSTM model, based on training accuracy (81%, 86%, 87%), testing accuracy (64%, 74%, 84%) and overfitting indicators (15, 12, 3), the LSTM model with GloVe word embedding outperformed both the CNN and the simple NN. Future studies are encouraged to test different embedding methods on diversified datasets.
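The LSTM architecture recommended above can be illustrated with a single-cell forward pass. The following is a minimal numpy sketch of the standard LSTM gate equations, not the authors' implementation; the tiny dimensions, random weights and the 5-token "comment" are placeholders.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step. W stacks the four gate weight matrices
    (input, forget, candidate, output) applied to [h_prev, x]."""
    hidden = h_prev.shape[0]
    z = W @ np.concatenate([h_prev, x]) + b
    i = sigmoid(z[0 * hidden:1 * hidden])   # input gate
    f = sigmoid(z[1 * hidden:2 * hidden])   # forget gate
    g = np.tanh(z[2 * hidden:3 * hidden])   # candidate cell state
    o = sigmoid(z[3 * hidden:4 * hidden])   # output gate
    c = f * c_prev + i * g                  # new cell state
    h = o * np.tanh(c)                      # new hidden state
    return h, c

# Toy run: embedding dim 4 (stand-in for a GloVe vector), hidden size 3
rng = np.random.default_rng(0)
hidden, embed = 3, 4
W = rng.standard_normal((4 * hidden, hidden + embed))
b = np.zeros(4 * hidden)
h, c = np.zeros(hidden), np.zeros(hidden)
for word_vec in rng.standard_normal((5, embed)):  # a 5-token comment
    h, c = lstm_step(word_vec, h, c, W, b)
```

In the paper's setting, each word vector would be a pretrained GloVe embedding, and the final hidden state h would feed a dense layer that outputs the sentiment class.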


© 2023 IUP. All Rights Reserved.

Article Price : Rs.100

Analysis of GPT Sentiments Using Blog Mining
Sandeep Bhattacharjee

Language models are thought to be a mechanism for machines to comprehend and anticipate human languages as a component of contextually appropriate human communication. The paper attempts to trace the development of language models, commonly referred to as Generative Pre-trained Transformers (GPTs). Using text mining analytics in the R 4.2.2 console, the study locates keywords in 72 GPT blogs that appeared on the OpenAI website; among the terms found were "learning," "openAI," "models" and "model." Moreover, correlation analysis revealed associations between terms like "appreciated," "creativity," "flex," "combine" and "connections." Academics, researchers and professionals working on business and information technology applications can all benefit greatly from this study.
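The blog-mining workflow described here (term frequencies across a corpus, then term-term correlations, in the spirit of R's tm::findAssocs) can be sketched in a few lines of Python. The three "blog" strings below are invented placeholders, not actual OpenAI content; only the general technique follows the paper.

```python
from collections import Counter
import math

blogs = [  # hypothetical stand-ins for the 72 OpenAI blog posts
    "gpt models learn from text and models improve with learning",
    "openai models use deep learning to train gpt",
    "creativity and learning combine in new gpt models",
]

# Term frequency across the whole corpus
tokens = [doc.split() for doc in blogs]
freq = Counter(word for doc in tokens for word in doc)

# Per-document occurrence vectors for a term
def occurrences(term):
    return [doc.count(term) for doc in tokens]

# Pearson correlation between two terms' occurrence vectors
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

r = pearson(occurrences("creativity"), occurrences("combine"))
```

Terms that always co-occur in the same documents (here "creativity" and "combine") correlate perfectly, which is the kind of association the study reports.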


© 2023 IUP. All Rights Reserved.

Article Price : Rs.100

Dzyaloshinskii-Moriya Interactions of Skyrmions and Antiskyrmions in Mimicking Electron-Hole Pairs for Logic and Memory Applications
Mosiori Cliff Orori

The demand for novel memory and logic devices has grown in recent years with advances in technology. Particular attention has been drawn to the use of skyrmions and antiskyrmions in memory access and storage, and recent laboratory observations at room temperature further encourage such studies. So far, some investigations have focused on skyrmions for reservoir computing applications, which typically require very large memory storage and fast access capabilities. Only recently have material physicists proposed skyrmions for ultra-dense magnetic memories, though this has not yet been implemented. In this paper, we present a model and simulation of a magnetic skyrmion and discuss the findings. The results suggest that a magnetic skyrmion paired with an antiskyrmion has the capacity to act as a memory element. As a result, adopting them for memory applications could simplify the fabrication of logic elements if their magnetic spin textures are taken into account. These findings are a promising pointer toward future applications of skyrmions and antiskyrmions.
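The Dzyaloshinskii-Moriya contribution that stabilizes these textures can be evaluated numerically. The sketch below sums an interfacial DMI energy of the form D·(S_i × S_j) over nearest-neighbor bonds of a small 2D spin grid; the grid size, DMI strength, sign convention and spin fields are illustrative placeholders, not the paper's simulation.

```python
import numpy as np

def dmi_energy(S, D=1.0):
    """Interfacial DMI energy of a 2D spin field S with shape (nx, ny, 3).
    For a bond along x the DM vector is taken as D*y_hat, for a bond
    along y as -D*x_hat (one common interfacial convention; actual sign
    conventions vary between materials)."""
    Sx_pairs = np.cross(S[:-1, :, :], S[1:, :, :])  # S_i x S_j, x bonds
    Sy_pairs = np.cross(S[:, :-1, :], S[:, 1:, :])  # S_i x S_j, y bonds
    e = D * Sx_pairs[..., 1].sum()    # D*y_hat  . (S_i x S_j)
    e -= D * Sy_pairs[..., 0].sum()   # -D*x_hat . (S_i x S_j)
    return e

# Uniform (ferromagnetic) texture: every cross product vanishes, so E = 0
uniform = np.zeros((8, 8, 3))
uniform[..., 2] = 1.0
e_uniform = dmi_energy(uniform)

# Spin spiral along x: neighbors are tilted, so the DMI energy is nonzero
theta = np.arange(8) * 0.4
spiral = np.zeros((8, 8, 3))
spiral[..., 0] = np.sin(theta)[:, None]
spiral[..., 2] = np.cos(theta)[:, None]
e_spiral = dmi_energy(spiral)
```

This captures the key qualitative point: the DMI term is zero for collinear spins and rewards a fixed sense of rotation between neighbors, which is what stabilizes skyrmion and antiskyrmion whirls against collapsing into a uniform state.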


© 2023 IUP. All Rights Reserved.

Article Price : Rs.100

A Comparative Analysis of Traditional and Lightweight Algorithms
Rimpi Rani and R K Bathla

Cryptography algorithms are an essential tool for protecting sensitive data and communications, and they can be compared based on several criteria. The paper compares the traditional AES, RSA and SHA algorithms with the lightweight Speck, Simon and Clefia cryptography algorithms based on key size, block size, number of rounds and security. It is worth noting that the choice of algorithm depends on the specific use case and the level of security required. In general, traditional algorithms may be more appropriate for applications that require high levels of security, while lightweight algorithms may be more suitable for resource-constrained environments.
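To make the lightweight side concrete, here is a minimal sketch of Speck64/128 (32-bit words, 27 rounds, rotation constants 8 and 3, per the Speck specification). It illustrates why such add-rotate-xor ciphers suit constrained devices; it is an educational sketch with placeholder key and plaintext, not a vetted implementation for production use.

```python
MASK = 0xFFFFFFFF  # 32-bit words (the Speck64 variants)

def ror(v, r):
    return ((v >> r) | (v << (32 - r))) & MASK

def rol(v, r):
    return ((v << r) | (v >> (32 - r))) & MASK

def round_fn(x, y, k):
    """One Speck round: only addition, rotation and xor."""
    x = (ror(x, 8) + y) & MASK
    x ^= k
    y = rol(y, 3) ^ x
    return x, y

def expand_key(key_words):
    """Speck64/128 key schedule: 4 key words -> 27 round keys."""
    k = [key_words[0]]
    l = list(key_words[1:])
    for i in range(26):
        li, ki = round_fn(l[i], k[i], i)
        l.append(li)
        k.append(ki)
    return k

def encrypt(x, y, round_keys):
    for k in round_keys:
        x, y = round_fn(x, y, k)
    return x, y

def decrypt(x, y, round_keys):
    for k in reversed(round_keys):
        y = ror(y ^ x, 3)
        x = rol(((x ^ k) - y) & MASK, 8)
    return x, y

rk = expand_key([0x03020100, 0x0B0A0908, 0x13121110, 0x1B1A1918])
ct = encrypt(0x3B726574, 0x7475432D, rk)
pt = decrypt(ct[0], ct[1], rk)
```

The entire cipher fits in a few word-sized operations per round, which is the efficiency argument for lightweight designs; by contrast, AES needs S-box tables or dedicated instructions, and RSA needs multi-precision arithmetic.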


© 2023 IUP. All Rights Reserved.

Article Price : Rs.100