2020
DOI: 10.1139/cjp-2019-0494
Mixed fluid cosmological model in f(R, T) gravity

Abstract: We construct a Locally Rotationally Symmetric (LRS) Bianchi type-I cosmological model in f(R, T) theory of gravity when the source of gravitation is a mixture of barotropic fluid and dark energy (DE), employing a time-varying deceleration parameter (DP). Through the behavior of the statefinder parameters (r, s), we observe that our model begins from the Einstein static era and approaches the ΛCDM era. The EoS parameter (ω_d) for DE varies from the phantom (ω < -1) phase to the quintessence (ω…
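For context, the statefinder pair (r, s) and the EoS phases named in the abstract follow standard definitions in the dark-energy literature; the sketch below uses the conventional forms in terms of the scale factor a(t), Hubble parameter H, and deceleration parameter q (these definitions are supplied for reference and are not quoted from the paper itself):

```latex
% Statefinder diagnostic pair, defined from the scale factor a(t):
r = \frac{\dddot{a}}{a H^{3}}, \qquad
s = \frac{r - 1}{3\left(q - \tfrac{1}{2}\right)}, \qquad
q = -\frac{\ddot{a}}{a H^{2}}.

% The \Lambda CDM model corresponds to the fixed point (r, s) = (1, 0).

% Dark-energy equation-of-state parameter \omega = p/\rho:
%   phantom phase:       \omega < -1
%   quintessence phase:  -1 < \omega < -\tfrac{1}{3}
```

On these conventions, the abstract's claim that the model evolves toward the ΛCDM era means the trajectory in the (r, s) plane approaches (1, 0), and the DE sector crosses from ω < -1 into the -1 < ω < -1/3 range.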


Cited by 8 publications (3 citation statements)
References: 74 publications
“…Given the increased training and use costs of LLMs, it is necessary to investigate whether smaller language models trained on relevant data may achieve the desired performance at a lower cost. For example, researchers at the Center for Research on Foundation Models at Stanford University created a model called Alpaca with 4% as many parameters as OpenAI’s text-davinci-003, matching its performance at a cost of $600 to create …”
Section: Are the LLMs Being Trained With the Relevant Data and the Ri...
confidence: 99%
“…The level of openness and democratization of LLMs is a topic of concern. Compared with OpenAI's GPT‐3, Meta's LLaMA model [36] is positioned as an “open‐source research tool” that uses various publicly available datasets, including Common Crawl, Wikipedia, and C4 (Table 1). Both models use pretraining data, and LLaMA's pretraining data is publicly available, while GPT‐3.5 currently only has CC data available, making LLaMA more user‐friendly in terms of data accessibility.…”
Section: Large-scale AI Models
confidence: 99%
“…Recently, [46] studied the theory and predicted the conditions for obtaining an expanding universe in the absence of any dark component. In [47], the authors investigate a mixture of barotropic fluid and DE in f(R, T) gravity, where the model evolves from the Einstein static era and approaches ΛCDM. A study of the cosmological dynamics of DE within the theory can be found in [48].…”
Section: Introduction
confidence: 99%