Parameter estimation in large-scale systems biology models: a parallel and self-adaptive cooperative strategy (2017)

DOI: 10.1186/s12859-016-1452-4
Abstract: Background: The development of large-scale kinetic models is one of the current key issues in computational systems biology and bioinformatics. Here we consider the problem of parameter estimation in nonlinear dynamic models. Global optimization methods can be used to solve this type of problem, but the associated computational cost is very large. Moreover, many of these methods need the tuning of a number of adjustable search parameters, requiring a number of initial exploratory runs and therefore further incre…
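To make the abstract's problem statement concrete: parameter estimation in a nonlinear dynamic model can be posed as a least-squares fit of ODE parameters to time-course data, solved with a global optimizer. The sketch below is illustrative only and is not the paper's saCeSS method; the Lotka-Volterra model, the synthetic data, and all names are hypothetical, and SciPy's differential evolution stands in for the cooperative strategy.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)

# Hypothetical nonlinear dynamic model: Lotka-Volterra, parameters (a, b, c, d).
def rhs(t, x, a, b, c, d):
    prey, pred = x
    return [a * prey - b * prey * pred,
            c * prey * pred - d * pred]

t_obs = np.linspace(0.0, 10.0, 25)          # observation times
x0 = [10.0, 5.0]                            # known initial state
p_true = (1.0, 0.1, 0.075, 1.5)
y_obs = solve_ivp(rhs, (0.0, 10.0), x0, args=p_true, t_eval=t_obs).y
y_obs += rng.normal(0.0, 0.1, y_obs.shape)  # synthetic noisy measurements

# Least-squares cost: simulate the model for a candidate parameter vector
# and compare the trajectories against the observations.
def cost(p):
    sol = solve_ivp(rhs, (0.0, 10.0), x0, args=tuple(p), t_eval=t_obs)
    if not sol.success:
        return 1e10                         # penalize failed integrations
    return float(np.sum((sol.y - y_obs) ** 2))

# A global optimizer searches the bounded parameter space.
bounds = [(0.01, 5.0)] * 4
result = differential_evolution(cost, bounds, seed=0, maxiter=30)
print(result.x, result.fun)
```

Each cost evaluation requires a full ODE simulation, which is why the computational burden grows so quickly with model size and motivates the parallel strategies discussed in the paper.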

Cited by 341 publications (56 citation statements)
References 72 publications (59 reference statements)
“…Leaving aside the even more complicated task of network inference, parameter optimization and uncertainty analysis are currently the key challenges for large-scale models, for which no satisfactory approaches exist. Due to the high computation times, toolboxes suitable for computing clusters are necessary and have recently been developed (Penas et al., 2017; Schmiester et al., 2019). Moreover, new approaches have to be explored, such as transferring the concept of mini-batching from the field of deep learning to optimization (Goodfellow et al., 2016) or MCMC sampling (Seita et al., 2017) of dynamic models; these must be adapted to ODE models.…”
Section: Discussion
confidence: 99%
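The mini-batching transfer mentioned in this statement can be illustrated with a stochastic-gradient step that uses only a random subset of experimental conditions per iteration, as in deep-learning training. This is a toy sketch under hypothetical assumptions: simple quadratic per-condition losses stand in for per-condition ODE simulations, and all names are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data: each "experimental condition" contributes one target
# vector; the full cost is the sum of per-condition squared errors.
n_exp, dim = 200, 3
targets = rng.normal(1.0, 0.2, (n_exp, dim))

def batch_grad(theta, batch):
    # Gradient of sum_{e in batch} ||theta - targets[e]||^2,
    # rescaled so it estimates the full-data gradient.
    g = 2.0 * (theta - targets[batch]).sum(axis=0)
    return n_exp / len(batch) * g

theta = np.zeros(dim)
for step in range(500):
    batch = rng.choice(n_exp, size=16, replace=False)
    theta -= 1e-4 * batch_grad(theta, batch)   # SGD step on the mini-batch

print(theta)   # approaches the mean of the targets, ~1.0
```

The appeal for ODE models is that each iteration then needs only a fraction of the (expensive) per-condition simulations, at the cost of a noisy objective.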
“…As was suggested in [26], (parallel) genetic algorithms are known to be good exploratory search methods as well. Moreover, with respect to our implementation of scatter search, recently a more advanced version of scatter search has become available, called saCeSS (self-adaptive cooperative enhanced scatter search) [51]. It uses multiple instances of scatter search set at different levels of exploration and exploitation, multiple local search methods, and utilizes the small-scale parallelism of current-day workstations.…”
Section: Discussion
confidence: 99%
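A minimal sketch of the island idea this statement describes: several searches run the same metaheuristic at different levels of exploration and exploitation and periodically share their best solution. This is a toy serial illustration of the cooperative scheme, not the saCeSS code (which runs asynchronously over MPI with adaptive rules); the objective, step sizes, and exchange interval are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def rosenbrock(x):
    # Stand-in objective; the real use case is an ODE-calibration cost.
    return float(np.sum(100 * (x[1:] - x[:-1] ** 2) ** 2 + (1 - x[:-1]) ** 2))

dim, n_islands, iters = 5, 4, 200
# Each island gets its own step size: small steps exploit, large steps explore.
step_sizes = np.geomspace(0.01, 1.0, n_islands)
islands = [rng.uniform(-2, 2, dim) for _ in range(n_islands)]
best_x, best_f = None, np.inf

for it in range(iters):
    for i in range(n_islands):
        # Local perturbation with island-specific aggressiveness.
        cand = islands[i] + rng.normal(0.0, step_sizes[i], dim)
        if rosenbrock(cand) < rosenbrock(islands[i]):
            islands[i] = cand
        f = rosenbrock(islands[i])
        if f < best_f:
            best_x, best_f = islands[i].copy(), f
    if it % 25 == 0:
        # Cooperation step: every island restarts near the global best so far.
        islands = [best_x + rng.normal(0.0, step_sizes[i], dim)
                   for i in range(n_islands)]

print(best_f)
```

The exploitative islands refine promising regions while the explorative ones keep probing new ones, which is the balance the self-adaptive scheme tunes automatically.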
“…Algorithm 2 shows a simple schematic pseudocode of the proposed saCMM method. Currently, two metaheuristics are performed by the islands: Differential Evolution (DE), implemented with the enhancements described in, and the enhanced Scatter Search (eSS), using the implementation outlined in. Nevertheless, as already pointed out before, this paper aims to be a proof-of-concept, and saCMM could easily be extended by including more metaheuristics.…”
Section: Self-adaptive Cooperative Multimethods
confidence: 99%
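The multi-method idea can be sketched as islands that run different metaheuristics on the same problem and exchange their incumbents. In the sketch below, two off-the-shelf SciPy optimizers stand in for the DE and eSS implementations named in the statement; the objective, the sequential exchange loop, and all names are hypothetical, whereas saCMM runs its islands in parallel with asynchronous cooperation.

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

def sphere(x):
    # Stand-in objective for an expensive ODE-fitting cost.
    return float(np.sum(np.asarray(x) ** 2))

bounds = [(-5.0, 5.0)] * 4
x_shared = np.random.default_rng(2).uniform(-5, 5, 4)  # shared incumbent

for epoch in range(3):
    # Island 1: a population-based global method (stand-in for DE).
    de = differential_evolution(sphere, bounds, x0=x_shared, maxiter=20,
                                seed=epoch, polish=False)
    # Island 2: a local refinement method (stand-in for eSS's local searches).
    loc = minimize(sphere, de.x, method="Nelder-Mead")
    # Cooperation: the better result becomes the next shared starting point.
    x_shared = loc.x if loc.fun < de.fun else de.x

print(x_shared, sphere(x_shared))
```

The point of mixing methods is robustness: whichever metaheuristic happens to suit the problem at hand pulls the shared incumbent forward, and the others restart from it.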