2010
DOI: 10.1007/s13253-010-0040-8
Managing the Essential Zeros in Quantitative Fatty Acid Signature Analysis

Abstract: Quantitative fatty acid signature analysis (QFASA) is a recent diet estimation method that depends on statistical techniques. QFASA has been used successfully to estimate the diet of predators such as seals and seabirds. Given the potential species in the predator's diet, QFASA uses statistical methods to obtain point estimates of the proportion of each species in the diet. In this paper, inference for a population of predators is considered. The estimated diet is compositional and often with zeros correspondin…

Cited by 43 publications (39 citation statements)
References 17 publications
“…In both our NFS and SSL samples, higher CCs were indeed associated with higher variance. This observation is particularly important given the recent interest in using bootstrap resampling techniques for dietary FA investigations (Stewart & Field 2011, Thiemann et al. 2011).…”
Section: Importance of CCs
confidence: 98%
“…To examine the effect of essential zeros on the stress, we used a reduced prey base containing eight important species, as described in [18]. This allowed us to vary the amount of zeros in the true diets and produce diet estimates that are generally closer to the true diet than would be expected if the larger prey database was used.…”
Section: Results
confidence: 99%
“…Pseudo-predators were first introduced in [8] and a modified algorithm was developed in [18]. A pseudo-predator is created by sampling with replacement from a prey database with probability weights given by a selected 'true' diet.…”
Section: QFASA
confidence: 99%
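The pseudo-predator construction quoted above, sampling prey individuals with replacement using probability weights given by a chosen 'true' diet, can be sketched as follows. This is a minimal illustration under my own assumptions; the function and variable names are hypothetical and not taken from [8] or [18]:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_pseudo_predator(prey_signatures, true_diet, n_prey=30):
    """Simulate one pseudo-predator fatty acid signature.

    prey_signatures: dict mapping species name -> array of shape
        (n_individuals, n_fatty_acids); each row is a prey FA
        signature summing to 1.
    true_diet: dict mapping species name -> diet proportion
        (proportions sum to 1).
    n_prey: number of prey individuals drawn with replacement.
    """
    species = list(true_diet)
    weights = np.array([true_diet[s] for s in species])
    # Draw species labels with replacement, weighted by the 'true' diet
    chosen = rng.choice(len(species), size=n_prey, p=weights)
    sampled = []
    for idx in chosen:
        pool = prey_signatures[species[idx]]
        # Pick one individual of that species uniformly at random
        sampled.append(pool[rng.integers(len(pool))])
    # Average the sampled signatures and renormalise to a composition
    sig = np.mean(sampled, axis=0)
    return sig / sig.sum()
```

A diet estimation method can then be run on the simulated signature and the estimate compared against the known 'true' diet, which is the point of the pseudo-predator exercise.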
“…Work by Stewart and Field (2011) uses a multiplicative logistic normal mixture model that allows them to consider the univariate log odds for the i th component to be normally distributed where the i th component is not zero. It works well for their applications, in particular regression, but does not capture covariance easily.…”
Section: Previous Work
confidence: 99%
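The multiplicative logistic transform underlying Stewart and Field's model can be illustrated with a small sketch. This is my own minimal version of the transform and its inverse; their model additionally accommodates zero components and fits normal distributions to the nonzero log odds:

```python
import numpy as np

def mult_logit(x):
    """Multiplicative log-ratio transform of a composition x.

    x is a D-part composition summing to 1; the last part acts as
    the fill-up value. Returns y of length D-1 where
    y_i = log(x_i / (1 - x_1 - ... - x_i)).
    """
    x = np.asarray(x, dtype=float)
    csum = np.cumsum(x)
    return np.log(x[:-1] / (1.0 - csum[:-1]))

def inv_mult_logit(y):
    """Inverse transform: map y back to a composition summing to 1."""
    e = np.exp(np.asarray(y, dtype=float))
    x = np.empty(len(e) + 1)
    remaining = 1.0
    for i, ei in enumerate(e):
        # Each component takes a logistic share of the mass left over
        x[i] = remaining * ei / (1.0 + ei)
        remaining -= x[i]
    x[-1] = remaining
    return x
```

Modelling each y_i as univariately normal gives the "univariate log odds" treatment described in the citation; as the quote notes, this is convenient for regression but does not capture covariance between components easily.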