This paper presents a truly meshless approximation strategy for solving partial differential equations based on the local multiquadric (LMQ) and the local inverse multiquadric (LIMQ) approximations. It differs from the traditional global multiquadric (GMQ) approximation in that it is a purely local procedure. In constructing the approximation function, the only geometrical data needed is the local configuration of nodes falling within its influence domain. Besides this distinct characteristic of localization, in the context of meshless-type approximation strategies, other major advantages of the present strategy include: (i) the existence of the shape functions is guaranteed provided that all the nodal points within an influence domain are distinct; (ii) the constructed shape functions strictly satisfy the Kronecker delta condition; (iii) the approximation is stable and insensitive to the free parameter embedded in the formulation; and (iv) the computational cost is modest, as the matrix operations require only the inversion of small matrices whose size equals the number of nodes inside the influence domain. Based on the present LMQ and LIMQ approximations, a collocation procedure is developed for the solution of 1D and 2D boundary value problems. Numerical results indicate that the present LMQ and LIMQ approximations are more stable than their global counterparts. In addition, they demonstrate that both approximation strategies are highly efficient and able to yield accurate solutions regardless of the chosen value of the free parameter.

Keywords: Local multiquadric approximation, Local inverse multiquadric approximation, Radial basis functions, Meshless method, Collocation procedure

Introduction

Recently, the so-called ''meshless methods'' have emerged as a potential alternative for solutions in computational mechanics, and a variety of approaches bearing the ''meshless'' label have appeared.
They can be categorised into two main groups with respect to their approximation techniques: (i) Methods based on the Galerkin integration technique. For methods in this category, the final numerical equations are generated by substituting the approximation functions into a (Galerkin) integral equation. This group includes the element-free Galerkin method [1, 2], the reproducing kernel particle method [3, 4], the h-p-cloud method [5], the partition-of-unity method [6], the meshless local Petrov-Galerkin method [7] and the point interpolation method [8]. (ii) Methods based on the point collocation technique. This group embraces methods that rely on some form of collocation technique to generate the final numerical equations. Examples include the finite point method [9, 10], the finite cloud method [11], the multiquadric (MQ) method, and other methods whose formulations are based on radial basis functions (RBFs) [12-18]. In the above-mentioned meshless methods, the way their approximate field functions are constructed is one of the most important features affecting their performance, stability and efficiency. A wealth o...
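The localization idea described above can be illustrated with a short sketch (not code from the paper; the 1D setting, the helper name `lmq_shape_functions`, and the shape parameter `c` are illustrative assumptions). It builds local multiquadric shape functions from only the nodes inside an influence domain and exhibits the Kronecker delta property:

```python
import numpy as np

def lmq_shape_functions(nodes, x, c=1.0):
    """Local multiquadric shape functions for a 1D influence domain.

    nodes : coordinates of the nodes inside the influence domain
    x     : evaluation point
    c     : free shape parameter of the MQ basis phi(r) = sqrt(r^2 + c^2)
    """
    nodes = np.asarray(nodes, dtype=float)
    # n x n interpolation matrix over the local nodes only;
    # invertible whenever the nodes are distinct
    A = np.sqrt((nodes[:, None] - nodes[None, :]) ** 2 + c ** 2)
    # MQ basis evaluated at x against each local node
    b = np.sqrt((x - nodes) ** 2 + c ** 2)
    # shape functions N_i(x) = (A^{-1} b)_i; A is symmetric
    return np.linalg.solve(A, b)
```

Evaluating at a node reproduces the Kronecker delta condition: `lmq_shape_functions([0.0, 0.5, 1.0], 0.5)` returns `[0, 1, 0]` up to round-off, and only the small local matrix (size equal to the number of nodes in the influence domain) is ever inverted.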
In recent years, recurrent neural network language models (RNNLMs) have been successfully applied to a range of tasks including speech recognition. However, an important issue that limits the quantity of data that can be used, and their possible application areas, is the computational cost of training. A significant part of this cost is associated with the softmax function at the output layer, as this requires a normalization term to be explicitly calculated. This impacts both training and testing speed, especially when a large output vocabulary is used. To address this problem, noise contrastive estimation (NCE) is explored in RNNLM training. NCE does not require the above normalization during either training or testing, and it is insensitive to the output layer size. On a large vocabulary conversational telephone speech recognition task, a doubling in training speed on a GPU and a 56-fold speed-up in test-time evaluation on a CPU were obtained.
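The contrast between explicit softmax normalization and NCE-style unnormalized scoring described above can be sketched as follows (a toy illustration, not the paper's implementation; treating the log-normalizer as a constant `ln_z` is the modelling assumption NCE relies on):

```python
import numpy as np

def softmax_logprob(logits, idx):
    # Explicit softmax: the normalization term sums over the entire
    # output vocabulary, so scoring one word costs O(V).
    return logits[idx] - np.log(np.sum(np.exp(logits)))

def nce_logprob(logits, idx, ln_z=0.0):
    # NCE-trained model: the log-normalizer is learned to be approximately
    # a constant ln_z, so a word is scored in O(1) w.r.t. vocabulary size.
    return logits[idx] - ln_z
```

For a well-trained NCE model the two scores agree closely; e.g. with `logits = np.log([0.2, 0.3, 0.5])` the normalizer is exactly 1, and both functions return log 0.5 for the last word.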
In recent years, recurrent neural network language models (RNNLMs) have become increasingly popular for a range of applications including speech recognition. However, the training of RNNLMs is computationally expensive, which limits the quantity of data, and the size of network, that can be used. In order to fully exploit the power of RNNLMs, efficient training implementations are required. This paper introduces an open-source toolkit, the CUED-RNNLM toolkit, which supports efficient GPU-based training of RNNLMs. RNNLM training with a large number of word-level output targets is supported, in contrast to existing tools which use class-based output targets. Support for N-best and lattice-based rescoring of both HTK and Kaldi format lattices is included. An example of building and evaluating RNNLMs with this toolkit is presented for a Kaldi-based speech recognition system using the AMI corpus. All necessary resources including the source code, documentation and recipe are available online.
In natural languages, multiple word sequences can represent the same underlying meaning. Modelling only the observed surface word sequence can result in poor context coverage, for example, when using n-gram language models (LMs). To handle this issue, this paper presents a novel form of language model, the paraphrastic LM. A phrase-level transduction model that is statistically learned from standard text data is used to generate paraphrase variants. LM probabilities are then estimated by maximizing their marginal probability. Significant error rate reductions of 0.5%-0.6% absolute were obtained on a state-of-the-art conversational telephone speech recognition task using a paraphrastic multi-level LM modelling both word and phrase sequences.
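The marginalization over paraphrase variants described above can be sketched as follows (toy code; the helper names and the toy probabilities are illustrative assumptions, not the paper's actual transduction model):

```python
def paraphrastic_prob(sentence, variants, lm_prob, trans_prob):
    # P(W) = sum over paraphrase variants W' of P_LM(W') * P(W | W'),
    # marginalizing out the latent paraphrase sequence.
    return sum(lm_prob(v) * trans_prob(sentence, v) for v in variants)

# Toy example: two paraphrase variants of the same surface sentence.
variants = ["buy a car", "purchase a car"]
lm = {"buy a car": 0.010, "purchase a car": 0.004}
trans = {("buy a car", "buy a car"): 0.9,
         ("buy a car", "purchase a car"): 0.1}
p = paraphrastic_prob("buy a car", variants,
                      lm_prob=lm.get,
                      trans_prob=lambda w, v: trans.get((w, v), 0.0))
# p = 0.9 * 0.010 + 0.1 * 0.004 = 0.0094
```

The point of the marginal is that probability mass from unseen but paraphrastically related word sequences contributes to the score of the observed sentence, improving context coverage.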