Tuning Dari Speech Classification Employing Deep Neural Networks


Author :  Mursal Dawodi and Jawid Ahmad Baktash

Affiliation :  Avignon University

Country :  France

Category :  Linguistics

Volume, Issue, Month, Year :  12, 2, April, 2023

Abstract :


Recently, many researchers have focused on building and improving speech recognition systems to facilitate and enhance human-computer interaction. Today, Automatic Speech Recognition (ASR) systems have become important and common tools in applications ranging from games to translation systems and robots. However, there is still a need for research on speech recognition systems for low-resource languages. This article deals with isolated-word recognition for the Dari language, using the Mel-frequency cepstral coefficients (MFCC) feature extraction method and three different deep neural networks: a Convolutional Neural Network (CNN), a Recurrent Neural Network (RNN), and a Multilayer Perceptron (MLP). We evaluate our models on the isolated Dari word corpus we built, which consists of 1000 utterances of 20 short Dari terms. This study obtained an average accuracy of 98.365%.
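The abstract describes the pipeline only at a high level: MFCC features extracted from each utterance feed a CNN, RNN, or MLP classifier over a 20-word vocabulary. The sketch below illustrates one plausible version of that pipeline in Python (librosa + TensorFlow/Keras). The frame count, number of coefficients, and layer sizes are assumptions for illustration, not the authors' actual configuration.

```python
# Minimal sketch of an MFCC + CNN isolated-word classifier, assuming a
# 20-class vocabulary as in the abstract. All other parameters below
# (sample rate, n_mfcc, frame length, layer sizes) are assumptions.
import numpy as np
import librosa
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 20        # 20 short Dari terms (from the abstract)
N_MFCC = 13             # assumed number of cepstral coefficients
MAX_FRAMES = 100        # assumed fixed length after padding/truncation

def extract_mfcc(wav_path, sr=16000):
    """Load one utterance and return a fixed-size MFCC matrix."""
    y, _ = librosa.load(wav_path, sr=sr)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=N_MFCC)  # (N_MFCC, frames)
    # Pad or truncate along the time axis so every sample has the same shape.
    if mfcc.shape[1] < MAX_FRAMES:
        mfcc = np.pad(mfcc, ((0, 0), (0, MAX_FRAMES - mfcc.shape[1])))
    else:
        mfcc = mfcc[:, :MAX_FRAMES]
    return mfcc

def build_cnn():
    """A small 2-D CNN over MFCC 'images' -- one of the three model
    families named in the abstract; layer sizes are assumptions."""
    return tf.keras.Sequential([
        layers.Input(shape=(N_MFCC, MAX_FRAMES, 1)),
        layers.Conv2D(32, (3, 3), activation="relu", padding="same"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu", padding="same"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

if __name__ == "__main__":
    model = build_cnn()
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()
    # Training would use arrays built with extract_mfcc(), e.g.:
    # X = np.stack([extract_mfcc(p) for p in wav_paths])[..., np.newaxis]
    # model.fit(X, labels, epochs=30, validation_split=0.2)
```

An RNN or MLP variant would reuse the same MFCC features, swapping the convolutional stack for recurrent layers over the time axis or dense layers over the flattened features; the paper compares all three.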

Keywords :  Dari, deep neural network, speech recognition, recurrent neural network, multilayer perceptron, convolutional neural network

Journal/ Proceedings Name :  International Journal on Natural Language Computing (IJNLC)

URL :  https://aircconline.com/abstract/ijnlc/v12n2/12223ijnlc03.html

User Name : Darren
Posted on 25-05-2023 at 15:51:49 AEDT



Related Research Work

  • Lexical Features of Medicine Product Warnings in the Philippines
  • Stress Test for BERT and Deep Models: Predicting Words from Italian Poetry
  • Adversarial Grammatical Error Generation: Application to Persian Language
  • Factors Affecting Students’ Low Competence in Reading English at Primary Level in Pakistan
