Low-dimensional word vectors have long been used in a wide range of natural language processing applications. In this paper we shed light on estimating query vectors in ad-hoc retrieval, where only limited information is available in the original query. Pseudo-relevance feedback (PRF) is a well-known technique for updating query language models and expanding queries with a number of relevant terms. We formulate query updating in low-dimensional spaces, first by rotating the query vector and then by scaling it. These consecutive steps are embedded in a query-specific projection matrix capturing both angle and scale. Based on this query projection algorithm, we propose a new, though not necessarily the most effective, PRF technique for language modeling. We learn an embedded coefficient matrix for each query, whose aim is to improve the vector representation of the query by transforming it to a more reliable space, and then update the query language model. The proposed embedded coefficient divergence minimization model (ECDMM) takes the top-ranked documents retrieved by the query and obtains positive and negative sample sets; these samples are used to learn the coefficient matrix, which in turn projects the query vector and updates the query language model using a softmax function. Experimental results on several TREC and CLEF data sets in several languages demonstrate the effectiveness of ECDMM. The results reveal that the new query formulation works as well as state-of-the-art PRF techniques overall and significantly outperforms them on a TREC collection in terms of MAP, P@5, and P@10.
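The projection-then-softmax update described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the embedding matrix `E`, query vector `q`, and coefficient matrix `M` are stand-in random values (in ECDMM, `M` would be learned per query from the positive and negative feedback samples).

```python
import numpy as np

# Hypothetical sketch: project the query vector with a query-specific
# coefficient matrix, then derive an updated query language model via softmax.
rng = np.random.default_rng(0)
V, d = 5, 4                                   # toy vocabulary size and embedding dim
E = rng.normal(size=(V, d))                   # term-embedding matrix (one row per term)
q = rng.normal(size=d)                        # original query vector
M = np.eye(d) + 0.1 * rng.normal(size=(d, d)) # stands in for the learned matrix

q_proj = M @ q                                # rotate and scale the query vector
scores = E @ q_proj                           # similarity of each term to the projected query
p = np.exp(scores - scores.max())
p /= p.sum()                                  # softmax -> updated query language model

print(p)                                      # term probabilities summing to 1
```

The softmax turns the term-query similarities into a proper probability distribution over the vocabulary, which is what a query language model requires.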