2017
DOI: 10.1145/3124643

Evolved Control of Natural Plants

Abstract: Mixing societies of natural and artificial systems can provide interesting and potentially fruitful research targets. Here we mix robotic setups and natural plants in order to steer the motion behavior of plants while growing. The robotic setup uses a camera to observe the plant and uses a pair of light sources to trigger phototropic response, steering the plant to user-defined targets. An evolutionary robotic approach is used to design a controller for the setup. Initially, preliminary experiments are perform…
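
The abstract outlines a closed-loop setup: a camera tracks the growing tip, and a controller designed with an evolutionary-robotics approach switches one of two light sources to exploit the plant's phototropic response and steer it toward user-defined targets. Below is a minimal sketch of such a loop; the hardware/vision helpers (`capture_frame`, `detect_tip`, `set_lights`) and the small evolved weight vector used as the controller are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical hardware/vision helpers; placeholders, not the paper's API.
def capture_frame():
    """Grab one image from the camera."""
    raise NotImplementedError

def detect_tip(frame):
    """Return the (x, y) pixel position of the plant tip in the frame."""
    raise NotImplementedError

def set_lights(left_on, right_on):
    """Switch the two light sources on or off."""
    raise NotImplementedError

def evolved_controller(weights, tip, target):
    """Tiny linear controller mapping the tip-to-target offset to a light choice.

    In an evolutionary-robotics workflow, `weights` would be optimised offline
    by an evolutionary algorithm against a fitness such as the final distance
    of the tip to the target.
    """
    dx = target[0] - tip[0]
    dy = target[1] - tip[1]
    activation = weights[0] * dx + weights[1] * dy + weights[2]
    # Positive activation -> steer right (right lamp on); otherwise steer left.
    return activation <= 0.0, activation > 0.0

def control_loop(weights, target, steps=1000):
    """Closed loop: observe the tip, let the controller pick a lamp, repeat."""
    for _ in range(steps):
        tip = detect_tip(capture_frame())
        left, right = evolved_controller(weights, tip, target)
        set_lights(left, right)
```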

Cited by 8 publications (11 citation statements)
References 26 publications (43 reference statements)
“…Two general approaches were followed, differing in scale (in space and time) and precision. (1) A system consisting of a single-board computer with a camera and control over two light sources, together with a single freshly sprouted bean plant, was used to guide the growing shoots to multiple targets in space using image detection and machine learning (detailed in Hofstadler et al., 2017). In these experiments, it typically took the bean shoot 2-3 days to grow out of the space monitored by the camera, corresponding to ∼50 cm of bean shoot.…”
Section: Plant and Robot Experimentation (mentioning)
confidence: 99%
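
The image-detection step mentioned in this excerpt can be approximated very simply when the shoot is photographed against a dark background: threshold the frame and take the topmost foreground pixel as the growing tip. The sketch below uses OpenCV and is one plausible pipeline assumed for illustration, not the detection method actually used in Hofstadler et al. (2017).

```python
import cv2
import numpy as np

def detect_shoot_tip(frame_bgr, thresh=60):
    """Roughly locate the bean shoot tip in a camera frame.

    Assumes a dark background so that the plant shows up as bright foreground
    pixels; the threshold value is illustrative and would need tuning.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None                      # nothing detected in this frame
    i = np.argmin(ys)                    # topmost foreground pixel ~ growing tip
    return int(xs[i]), int(ys[i])
```
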
“…We showcase the model mimicking the behaviour of the closed-loop bean tip controllers artificially evolved in Hofstadler et al. (2017; Figure 11). The task is to guide a single growing and nutating tip through specific targets on the 2D plane of the camera projection during its (growth-)journey through the image.…”
Section: Experiments With Robots and Plants (mentioning)
confidence: 99%
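
Guiding a nutating tip through a sequence of 2D targets can be phrased as a waypoint scheduler: steer toward the current target and advance once the tip comes within a tolerance. The sketch below is a hedged illustration of that scheme, assuming tip-detection and light-switching helpers like those sketched above; it is not the evolved controller or the model from the cited work.

```python
import math

def steer_through_targets(targets, get_tip, set_lights, tol=15.0, max_steps=100000):
    """Steer the tip through a list of 2D targets (pixel coordinates).

    `get_tip()` returns the current (x, y) tip position or None, and
    `set_lights(left_on, right_on)` switches the two lamps; both are assumed
    interfaces for illustration, not the cited implementation.
    """
    for tx, ty in targets:
        for _ in range(max_steps):
            tip = get_tip()
            if tip is None:
                continue                          # tip not visible in this frame
            x, y = tip
            if math.hypot(tx - x, ty - y) < tol:
                break                             # close enough: next target
            # Bang-bang steering: switch on the lamp on the target's side.
            set_lights(left_on=tx < x, right_on=tx >= x)
```
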
“…Ad hoc approaches to this problem (e.g. [109,110]) construct a data-driven model by image processing time-lapse records of a certain species in a given set-up, from a few initial experiments. More generalized approaches could be investigated, building from a variety of models in plant science literature.…”
Section: Hybridizing Robots and Biology (mentioning)
confidence: 99%
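
One minimal reading of such a data-driven model is a regression that predicts the next tip displacement from the current tip position and the applied light stimulus, fitted to time-lapse records. The sketch below assumes that formulation with an ordinary least-squares fit; the actual models in [109,110] may be quite different.

```python
import numpy as np

def fit_motion_model(tips, lights):
    """Least-squares model of tip displacement from time-lapse data.

    tips:   (T, 2) array of tip positions over time (pixels)
    lights: (T,) array of stimuli, e.g. -1 = left lamp on, +1 = right lamp on
    Returns weights W such that  d_tip[t] ~= [x, y, light, 1] @ W.
    """
    X = np.column_stack([tips[:-1], lights[:-1], np.ones(len(tips) - 1)])
    Y = np.diff(tips, axis=0)                  # observed tip displacements
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return W

def predict_step(W, tip, light):
    """Predict the next tip position under a given light stimulus."""
    return np.asarray(tip) + np.array([tip[0], tip[1], light, 1.0]) @ W
```
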
“…Research on steering the morphological development of individual plants is rare, as agricultural concerns, for instance, do not motivate such studies. However, there is a line of research on shaping plants that develops an automated process of evolving controllers that direct the growth of a single plant to certain goals [109,110,162,163]. Machine vision was used to understand the behaviour of single bean plants in reaction to external light stimuli, and to construct data-driven models of the plant’s growth and motion.…”
Section: Hybridizing Robots and Biology (mentioning)
confidence: 99%