2020
DOI: 10.1109/lra.2020.2965418

Improving Robotic Cooking Using Batch Bayesian Optimization

Abstract: With advances in the field of robotic manipulation, sensing and machine learning, robotic chefs are expected to become prevalent in our kitchens and restaurants. Robotic chefs are envisioned to replicate human skills in order to reduce the burden of the cooking process. However, the potential of robots as a means to enhance the dining experience remains unrecognised. This work introduces the concept of food quality optimization and its challenges with an automated omelette cooking robotic system. The design and con…
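
The abstract frames cooking as a black-box optimization problem in which parameters are proposed in batches and scored by tasters. As a rough illustration of that framing only, the following minimal sketch tunes a few hypothetical cooking parameters (pan temperature, whisking time, cooking time) with scikit-optimize's ask/tell interface; the parameter ranges and the `taste_score` stand-in for human ratings are assumptions for illustration, not the paper's actual system.

```python
# Minimal sketch of batch Bayesian optimization for hypothetical cooking
# parameters using scikit-optimize; taste_score() is a placeholder for
# human taste ratings, not the paper's setup.
import numpy as np
from skopt import Optimizer
from skopt.space import Real, Integer

search_space = [
    Real(140.0, 220.0, name="pan_temp_c"),   # hypothetical pan temperature
    Real(30.0, 180.0, name="whisk_time_s"),  # hypothetical whisking time
    Integer(60, 300, name="cook_time_s"),    # hypothetical cooking time
]

def taste_score(params):
    """Placeholder for a taster's rating (higher is better)."""
    temp, whisk, cook = params
    return -((temp - 180) ** 2 / 400 + (cook - 150) ** 2 / 900) + 0.01 * whisk

opt = Optimizer(search_space, base_estimator="GP", acq_func="EI")

for round_idx in range(5):
    # Ask for a batch of candidates; "cl_min" is the constant-liar heuristic
    # that lets several points be proposed before any result is observed.
    batch = opt.ask(n_points=4, strategy="cl_min")
    scores = [taste_score(x) for x in batch]   # cook and rate in parallel
    # skopt minimizes, so feed it negated scores to maximize taste.
    opt.tell(batch, [-s for s in scores])

best_idx = int(np.argmin(opt.yi))
print("best parameters:", opt.Xi[best_idx], "score:", -opt.yi[best_idx])
```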

Citations: cited by 47 publications (29 citation statements)
References: 18 publications
“…The main acquisition functions are PI (probability of improvement), EI (expected improvement), and UCB (upper confidence bound). Currently, Bayesian optimization has been demonstrated as a powerful tool for optimal design problems, such as industrial control [45], robotics [46], and chemical experiments [47]. In this paper, a novel Bayesian optimization algorithm, named DART-EI Bayesian optimization, is proposed for wind speed forecasting models.…”
Section: Online Sequential Extreme Learning Machine
confidence: 99%
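
The statement above names the three standard acquisition functions: PI, EI, and UCB. Below is a minimal sketch of all three computed from a scikit-learn Gaussian-process surrogate for a generic maximization problem; it is a textbook illustration, not the DART-EI variant the citing paper proposes.

```python
# Sketch of the PI, EI, and UCB acquisition functions on a GP surrogate
# (generic maximization example, not the DART-EI algorithm).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(8, 1))                    # observed inputs
y = np.sin(6 * X[:, 0]) + 0.1 * rng.normal(size=8)    # noisy observations

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-2, normalize_y=True)
gp.fit(X, y)
y_best = y.max()                                      # incumbent for PI / EI

def acquisitions(X_cand, xi=0.01, kappa=2.0):
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - y_best - xi) / sigma
    pi = norm.cdf(z)                                              # probability of improvement
    ei = (mu - y_best - xi) * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
    ucb = mu + kappa * sigma                                      # upper confidence bound
    return pi, ei, ucb

X_grid = np.linspace(0, 1, 200).reshape(-1, 1)
pi, ei, ucb = acquisitions(X_grid)
print("next point by EI:", X_grid[np.argmax(ei)])
```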
“…Firstly, the existing extensions of MES supporting common BO extensions like Multi-fidelity BO (Moss et al, 2020d) and batch BO (Takeno et al, 2020) require additional approximations beyond those of vanilla MES, typically through the numerical integration of low-dimensional integrals. Multi-fidelity BO (also known as multi-task BO) leverages cheap approximations of the objective function to speed up optimisation, for example through exploiting coarse resolution simulations when calibrating large climate models (Prieß et al, 2011), whereas batch BO allows multiple objective function evaluations to be queried in parallel, a scenario arising often in science applications, for example when training a collection of robots to cook (Junge et al, 2020). Therefore, although still cheaper than their ES-and PES-based counterparts, extensions of MES for multi-fidelity and batch BO do not inherit the simplicity and low-cost of vanilla MES.…”
Section: Introduction
confidence: 99%
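
The statement above contrasts sequential BO with batch BO, where several objective evaluations are queried in parallel. One common way to build such a batch is the "constant liar" heuristic sketched below: each greedy pick is temporarily assigned the incumbent value and the surrogate is refit, so later picks are pushed elsewhere. This is a generic sketch only, not the batch MES of Takeno et al. (2020) nor the method used by Junge et al. (2020).

```python
# Sketch of the "constant liar" heuristic for batch Bayesian optimization
# (generic illustration; not a specific method from the cited papers).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expected_improvement(gp, X_cand, y_best, xi=0.01):
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - y_best - xi) / sigma
    return (mu - y_best - xi) * norm.cdf(z) + sigma * norm.pdf(z)

def propose_batch(X_obs, y_obs, X_pool, batch_size=4):
    X_aug, y_aug = list(X_obs), list(y_obs)
    pool = X_pool.copy()
    batch = []
    for _ in range(batch_size):
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(np.array(X_aug), np.array(y_aug))
        ei = expected_improvement(gp, pool, max(y_aug))
        idx = int(np.argmax(ei))
        batch.append(pool[idx])
        X_aug.append(pool[idx])    # "lie": pretend this point was already observed
        y_aug.append(max(y_aug))   # ...at the incumbent best value
        pool = np.delete(pool, idx, axis=0)
    return np.array(batch)

rng = np.random.default_rng(1)
X_obs = rng.uniform(0, 1, size=(6, 1))
y_obs = np.sin(6 * X_obs[:, 0])
X_pool = np.linspace(0, 1, 100).reshape(-1, 1)
print(propose_batch(X_obs, y_obs, X_pool))   # 4 points to evaluate in parallel
```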
“…Although BO has been widely used in the experimental design community since the 1990s [15,13], it was not until the last decade that BO became extremely popular in the machine learning community as an efficient tool for tuning hyper-parameters in various algorithms, e.g., deep learning [5,7], natural language processing [29], and preference learning [10]. BO has also been embraced by new areas such as robotics [16], automatic control [1], and pharmaceutical product development [21].…”
Section: Introduction
confidence: 99%