2015
DOI: 10.1093/gbe/evv086

Metagenome Skimming of Insect Specimen Pools: Potential for Comparative Genomics

Abstract: Metagenomic analyses are challenging in metazoans, but high-copy-number and repeat regions can be assembled from low-coverage sequencing by “genome skimming,” which is applied here as a new way of characterizing metagenomes obtained in an ecological or taxonomic context. Illumina shotgun reads from two pools of Coleoptera (beetles) of approximately 200 species each were assembled into tens of thousands of scaffolds. Repeated low-coverage sequencing recovered similar scaffold sets consistently, although appr…
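The abstract's premise rests on a simple coverage argument: in a pooled, low-depth shotgun run, single-copy nuclear DNA stays far below assemblable depth, while high-copy-number regions (mitochondrial genomes, rRNA clusters) are sampled proportionally more often. The back-of-the-envelope sketch below illustrates this; the read numbers, genome sizes, and copy number are assumed for illustration and are not figures from the study.

```python
# Illustrative only: all values below are assumptions, not data from the paper.

def locus_coverage(total_bp, pooled_nuclear_bp, copy_number=1):
    """Expected depth over a locus: copies per genome times the baseline single-copy depth."""
    return copy_number * total_bp / pooled_nuclear_bp

total_bp = 2 * 100_000_000 * 150           # 100 M 150-bp read pairs (assumed)
pooled_nuclear_bp = 200 * 1_000_000_000    # ~200 species at ~1 Gb each (assumed)

single_copy = locus_coverage(total_bp, pooled_nuclear_bp)              # ~0.15x: too sparse to assemble
mito = locus_coverage(total_bp, pooled_nuclear_bp, copy_number=200)    # ~30x: assemblable
print(f"single-copy nuclear: {single_copy:.2f}x, mitochondrial: {mito:.0f}x")
```

Under these assumed numbers, single-copy nuclear loci sit far below 1x while mitochondrial sequence reaches depths at which standard assemblers can build scaffolds, which is the rationale behind skimming pooled specimens.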

Cited by 36 publications (37 citation statements); references 64 publications (78 reference statements).
“…In our study, the reduced number of false positives found with the de novo assembly approach (Supporting Information S9) also indicates that an exhaustive database can considerably improve the outcome of MG. Second, higher coverage rates could help reduce false discovery rates by filtering out all mappings under a certain threshold or by adding replicates to cross-validate species presence/absence, as we did here on the MB data set. In general, sequencing depth is a major limitation for MG, as the vast majority of sequences produced with MG do not correspond to mitochondrial sequences and are therefore currently uninformative (although see Linard et al.). In our study, approximately 0.02% of all reads mapped to the COI reference database for the raw read mapping pipeline (Supporting Information S5).…”
Section: Discussion (mentioning; confidence: 99%)
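The excerpt above names two concrete filters for reducing false discovery: a minimum read-count threshold per mapping target and cross-validation across replicates. A minimal sketch of combining both, with made-up species names, counts, and thresholds (none of these values come from the cited studies):

```python
# Hypothetical per-replicate read counts per species (illustrative only).
replicate_counts = {
    "rep1": {"sp_A": 412, "sp_B": 3, "sp_C": 97},
    "rep2": {"sp_A": 380, "sp_B": 0, "sp_C": 105},
}

MIN_READS = 10        # discard mappings below this read-count threshold (assumed value)
MIN_REPLICATES = 2    # require detection in at least this many replicates (assumed value)

def detected_species(replicate_counts, min_reads=MIN_READS, min_replicates=MIN_REPLICATES):
    """Keep a species only if it passes the read threshold in enough replicates."""
    passing = {}
    for counts in replicate_counts.values():
        for species, n in counts.items():
            if n >= min_reads:
                passing[species] = passing.get(species, 0) + 1
    return {sp for sp, n_reps in passing.items() if n_reps >= min_replicates}

print(detected_species(replicate_counts))   # {'sp_A', 'sp_C'}; sp_B is treated as a false positive
```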
“…Where previous approaches have inferred a phylogenetic signal in interaction structure by combining interaction webs with phylogeny inferred from external sources, molecular information will allow the reconstruction of phylogenies from the samples themselves (Elias et al. 2013; Evans et al. 2016). Here, genomic approaches will add manifold to the information content as compared to classic DNA barcodes only (Papadopoulou et al. 2015; Andújar et al. 2015; Linard et al. 2015), again building a natural extension from previous studies advancing from one to multiple (Nyman et al. 2007, 2015; Kress et al. 2009, 2015) gene regions. Thus, it is evident how the use of single-locus DNA barcodes has laid a solid foundation for making full use of the information fountains now emerging.…”
Section: Towards Networks of Ecological Networks in Space and Time (mentioning; confidence: 99%)
“…; Linard et al.) – techniques that are not only much simpler to apply, but might in theory also confer an additional layer of data that is challenging to obtain with PCR approaches: quantification of proportional abundances. This still leaves challenges to be overcome, such as reference database incompleteness and accurate taxonomic identification (e.g.…”
(mentioning; confidence: 99%)
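The “additional layer of data” mentioned above, proportional abundances, can in principle be derived from shotgun mappings by normalizing per-taxon read counts by reference length and rescaling to proportions. The sketch below uses assumed counts and mitogenome lengths and shows one possible normalization, not a method prescribed by the quoted text.

```python
# Illustrative counts and reference lengths (assumed values).
mapped_reads = {"taxon_A": 1200, "taxon_B": 300, "taxon_C": 4500}
reference_bp = {"taxon_A": 16000, "taxon_B": 15500, "taxon_C": 16200}   # mitogenome lengths

# Reads per base of reference corrects for differing reference lengths.
density = {t: mapped_reads[t] / reference_bp[t] for t in mapped_reads}
total = sum(density.values())
proportions = {t: d / total for t, d in density.items()}

for taxon, p in sorted(proportions.items(), key=lambda kv: -kv[1]):
    print(f"{taxon}: {p:.1%}")
```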
“…; Linard et al.), as well as the development of appropriate means for ensuring data quality should it be used in academic or monitoring studies, although as neither is unique to this quest, it is realistic to expect that they will develop in parallel with the data-generation techniques. Thus, in my idle moments, I like to think that on that sad day in the future when I have to pack Buddy off to the big recycling centre in the sky, his replacement might have a few new options built in, to simultaneously clean and screen!…”
(mentioning; confidence: 99%)