2010
DOI: 10.3844/ajassp.2010.390.394

A Back Propagation Neural Networks for Grading Jatropha curcas Fruits Maturity

Abstract: Problem statement: Jatropha curcas has the potential to become one of the world's key energy crops. Crude vegetable oil, extracted from the seeds of the Jatropha plant, can be refined into high-quality biodiesel. Traditional identification of Jatropha curcas fruits is performed by human experts. The Jatropha curcas fruit quality depends on the type and size of defects as well as skin color and fruit size. Approach: This research develops a back propagation neural network t…
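As a concrete illustration of the approach described in the abstract, the sketch below trains a one-hidden-layer back propagation network that maps hand-crafted fruit features to a maturity grade. This is not the authors' implementation: the feature set (mean RGB, fruit size, defect ratio), the three grade labels, and the synthetic data and labelling rule are assumptions made purely for illustration.

```python
# Minimal sketch (illustrative only): one-hidden-layer back propagation
# network grading fruit from assumed hand-crafted features.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic feature vectors: [mean_R, mean_G, mean_B, size, defect_ratio]
X = rng.random((120, 5))
# Toy rule standing in for expert labels: grade driven by the defect ratio
# (0 = unripe, 1 = ripe, 2 = overripe) -- an assumption, not the paper's rule.
y = (X[:, 4] * 3).astype(int)
Y = np.eye(3)[y]                       # one-hot targets

# Network: 5 inputs -> 8 sigmoid hidden units -> 3 softmax outputs
W1 = rng.normal(0, 0.5, (5, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 3)); b2 = np.zeros(3)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.1
for epoch in range(2000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)

    # Backward pass: cross-entropy gradients propagated layer by layer
    d_logits = (p - Y) / len(X)
    dW2 = h.T @ d_logits; db2 = d_logits.sum(axis=0)
    d_h = d_logits @ W2.T * h * (1 - h)
    dW1 = X.T @ d_h; db1 = d_h.sum(axis=0)

    # Gradient-descent weight update
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

grade = np.argmax(sigmoid(X @ W1 + b1) @ W2 + b2, axis=1)
print("training accuracy:", (grade == y).mean())
```

In practice the inputs would come from image processing of the fruit skin, and the hidden-layer size, learning rate and number of epochs would be tuned on labelled fruit samples.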

Cited by 34 publications (14 citation statements)
References 18 publications

“…One of the most popular classes of artificial neural networks is the Multilayer Perceptron (MLP) with the backpropagation algorithm as the training method. The MLP has been applied to several areas, for example agriculture (Effendi et al., 2010), medicine (Benamrane et al., 2005; Isa et al., 2007; Eiamkanitchat et al., 2010), face recognition (Rizon et al., 2006) and electric power systems (Benslimane et al., 2006).…”
Section: Introduction
confidence: 99%
“…Each subsequent layer receives connections from the previous layer, and the final layer generates the network's output. Feedforward networks are readily used for this sort of input-to-output mapping (Effendi et al., 2010). Hence a feedforward network with a single hidden layer and enough hidden neurons can fit any finite input-output mapping problem (Fortin et al., 2010).…”
Section: Feed Forward Neural Network Training and Linear Regression M…
confidence: 99%
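The single-hidden-layer claim quoted above can be checked with a small numerical sketch (illustrative only, not taken from the cited papers): one tanh hidden layer, trained with back propagation on a mean-squared-error loss, fits a finite set of samples from a nonlinear mapping such as sin(x). The layer width, learning rate and iteration count are assumed values.

```python
# Sketch: a single hidden layer with enough tanh units fits a finite
# input-output mapping (50 samples of sin(x)) via back propagation.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-np.pi, np.pi, 50).reshape(-1, 1)
t = np.sin(x)                              # target mapping to be fitted

W1 = rng.normal(0, 1.0, (1, 20)); b1 = np.zeros(20)   # 20 hidden units
W2 = rng.normal(0, 1.0, (20, 1)); b2 = np.zeros(1)

lr = 0.05
for _ in range(10000):
    h = np.tanh(x @ W1 + b1)               # hidden layer
    yhat = h @ W2 + b2                     # linear output layer
    err = yhat - t                         # MSE gradient source

    dW2 = h.T @ err / len(x); db2 = err.mean(axis=0)
    d_h = err @ W2.T * (1 - h**2)          # backpropagate through tanh
    dW1 = x.T @ d_h / len(x); db1 = d_h.mean(axis=0)

    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print("final MSE:", float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - t) ** 2)))
```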
“…This work has used supervised learning with a back propagation neural network. For pattern classification, the back propagation architecture is the most commonly used type of neural network [13][14][15]. Another reason to choose back propagation is its ability to perform pattern classification on data where the input and the output have no linear relationship, as is the case in this application [16].…”
Section: Classifier Design
confidence: 99%
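The nonlinearity argument quoted above is the classic motivation for back propagation; the XOR mapping is the textbook example, sketched here with assumed layer sizes, learning rate and iteration count (not from the cited work).

```python
# Sketch: XOR has no linear input-output relationship, so a linear model
# cannot separate it, but a small back propagation network can.
import numpy as np

rng = np.random.default_rng(2)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)    # XOR truth table

W1 = rng.normal(0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1.0, (8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    d_y = (y - t) * y * (1 - y)            # output-layer error term
    d_h = d_y @ W2.T * h * (1 - h)         # hidden-layer error term
    W2 -= 0.5 * h.T @ d_y; b2 -= 0.5 * d_y.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h; b1 -= 0.5 * d_h.sum(axis=0)

# Outputs should approach [0, 1, 1, 0] once training succeeds
print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 2))
```

A linear classifier cannot separate the XOR classes, which is exactly the "no linear relationship" situation the quoted passage refers to.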