AI-Based Natural Language Processing: Emerging Approaches and Applications

A special issue of Future Internet (ISSN 1999-5903). This special issue belongs to the section "Big Data and Augmented Intelligence".

Deadline for manuscript submissions: 31 May 2025 | Viewed by 663

Special Issue Editors


Prof. Dr. William Hsu
Guest Editor
Department of Computing and Information Sciences, Kansas State University, Manhattan, KS 66506, USA
Interests: machine learning; data science; artificial intelligence; deep learning; natural language processing

Dr. Huichen Yang
Guest Editor
CSIRO's Data61, Eveleigh, Sydney, NSW 2015, Australia
Interests: deep learning; natural language processing; pattern recognition; document understanding; high-performance computing

Special Issue Information

Dear Colleagues,

AI-based natural language processing (NLP) technologies, driven by advances in deep learning and the rise of large language models, have found broad real-world application in tasks such as information extraction, information retrieval, machine translation, question answering, and conversational systems. These technologies are also used extensively in scientific fields, including scientific text understanding, biomedical information processing, and domain knowledge construction. However, many theoretical and technological challenges in this field remain unresolved and require further investigation. This Special Issue aims to address these open challenges by inviting scholarly contributions that focus on emerging approaches and applications in NLP. We welcome original, unpublished research on all aspects of AI-based computational linguistics and natural language processing.

Prof. Dr. William Hsu
Dr. Huichen Yang
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Future Internet is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • information extraction
  • artificial intelligence
  • text mining
  • natural language processing
  • natural language generation
  • natural language understanding
  • machine learning for NLP
  • question answering
  • speech recognition
  • NLP for document intelligence
  • NLP applications, including domain-specific applications in fields such as science and medicine

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (1 paper)


Research

22 pages, 1242 KiB  
Article
Accelerating and Compressing Transformer-Based PLMs for Enhanced Comprehension of Computer Terminology
by Jian Peng and Kai Zhong
Future Internet 2024, 16(11), 385; https://doi.org/10.3390/fi16110385 - 22 Oct 2024
Viewed by 433
Abstract
Pretrained language models (PLMs) have significantly advanced natural language processing (NLP), establishing the "pretraining + fine-tuning" paradigm as a cornerstone approach in the field. However, the vast size and computational demands of transformer-based PLMs present challenges, particularly regarding storage efficiency and processing speed. This paper addresses these limitations by proposing a novel lightweight PLM optimized for accurately understanding domain-specific computer terminology. Our method involves a pipeline parallelism algorithm designed to accelerate training. It is paired with an innovative mixed compression strategy that combines pruning and knowledge distillation to effectively reduce the model size while preserving its performance. The model is further fine-tuned using a dataset that mixes source and target languages to enhance its versatility. Comprehensive experimental evaluations demonstrate that the proposed approach successfully achieves a balance between model efficiency and performance, offering a scalable solution for NLP tasks involving specialized terminology.
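
The abstract's compression strategy combines pruning with knowledge distillation. As a rough illustration of how such a combination is commonly implemented (not the authors' actual pipeline-parallel training code), the minimal PyTorch sketch below applies built-in magnitude pruning to a toy student model and computes a standard softened-logit distillation loss against a teacher; the temperature, sparsity ratio, and model shapes are assumptions for illustration only.

# Illustrative sketch only: a generic pruning + knowledge-distillation step.
# Hyperparameters (temperature, alpha, 30% sparsity) and the toy linear models
# are assumptions, not details taken from the published paper.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.nn.utils import prune

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend soft-target KL loss (teacher -> student) with hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy teacher/student classifiers standing in for transformer-based PLMs.
teacher = nn.Linear(768, 10)
student = nn.Linear(768, 10)

# Magnitude (L1) pruning masks out 30% of the student's smallest weights.
prune.l1_unstructured(student, name="weight", amount=0.3)

x = torch.randn(8, 768)               # dummy batch of pooled sentence embeddings
labels = torch.randint(0, 10, (8,))   # dummy class labels
with torch.no_grad():
    t_logits = teacher(x)
loss = distillation_loss(student(x), t_logits, labels)
loss.backward()
print(f"distillation loss: {loss.item():.4f}")

In practice, the pruned student would be trained for many steps with an optimizer, and the pruning mask made permanent afterwards (e.g., via prune.remove); the sketch shows only a single loss computation.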
