1996
DOI: 10.1007/978-1-4612-0745-0
Bayesian Learning for Neural Networks

Abstract: except for brief excerpts in connection with reviews or scholarly analysis. Use in connection with any form of information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed is forbidden. The use of general descriptive names, trade names, trademarks, etc., in this publication, even if the former are not especially identified, is not to be taken as a sign that such names, as understood by the Trade Marks and Merchandise Marks …

Cited by 2,809 publications (2,481 citation statements); references 0 publications. Citing publications span 1997–2013.
“…To improve the signal-to-background discrimination further, we employed a Bayesian neural network (BNN) trained on a variety of kinematic variables to distinguish W H events from the background [16,17]. For this analysis, we employ distinct BNN discriminant functions that were optimized separately for the different tagging categories and each Higgs boson mass in order to maximize the sensitivity.…”
Section: B. Bayesian Neural Network Discriminant (mentioning)
confidence: 99%
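The discriminant described in the excerpt above averages a network's signal-vs-background output over posterior samples of the weights. A minimal sketch of that predictive-averaging idea (all names, the network architecture, and the use of prior draws as stand-in "posterior" samples are illustrative assumptions, not the cited analysis):

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, W1, b1, w2, b2):
    # one-hidden-layer network with a logistic output: "signal probability"
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))

def bnn_discriminant(x, weight_samples):
    # Bayesian predictive mean: average the output over sampled weight sets,
    # rather than using a single trained network.
    return np.mean([mlp(x, *ws) for ws in weight_samples], axis=0)

# stand-in weight samples (here simple prior draws, for illustration only;
# a real BNN would draw these from the posterior, e.g. via MCMC)
D, H = 5, 8  # five kinematic input variables, eight hidden units (arbitrary)
samples = [(rng.normal(size=(D, H)), rng.normal(size=H),
            rng.normal(size=H), rng.normal()) for _ in range(200)]

events = rng.normal(size=(3, D))        # three hypothetical events
scores = bnn_discriminant(events, samples)
```

Averaging over weight samples is what distinguishes the BNN discriminant from a conventionally trained network: the output reflects uncertainty in the weights, not a single point estimate.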
“…Searches for the standard model Higgs boson using the same final state have been reported before by CDF [13,14] and D0 [15] with data corresponding to an integrated luminosity of 5.6 fb −1 and 5.3 fb −1 , respectively. Compared to the previously reported analysis, we have employed a Bayesian artificial neural network (BNN) discriminant [16,17] to improve discrimination between signal and background. The signal acceptance is improved by using additional triggers based on jets and missing transverse energy, as well as a novel method to combine them into a single analysis stream in order to maximize the event yield while properly accounting for correlations between triggers.…”
Section: Introduction (mentioning)
confidence: 99%
“…Gaussian processes with such a covariance function fall within the family of automatic relevance determination (ARD) models [10,9]. This ARD model, through assigning w d 's for each covariate, is capable of assessing the magnitude of relevance of the corresponding variable to the prediction, and it naturally provides a mechanism for variable selection.…”
Section: Alternative Covariance Function (mentioning)
confidence: 99%
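The per-covariate weights w_d described in this excerpt can be sketched with a standard ARD squared-exponential covariance (a minimal illustration of the general construction; the function name and parameterization are assumptions, not code from the cited work):

```python
import numpy as np

def ard_kernel(X1, X2, w, sigma_f=1.0):
    """ARD squared-exponential covariance:
        k(x, x') = sigma_f^2 * exp(-0.5 * sum_d w_d * (x_d - x'_d)^2)
    A weight w_d near zero makes dimension d barely affect the covariance,
    marking that input as irrelevant -- the variable-selection mechanism
    the excerpt refers to.
    """
    diff = X1[:, None, :] - X2[None, :, :]   # pairwise differences, (n1, n2, D)
    sq = np.sum(w * diff**2, axis=-1)        # per-dimension weighted distance
    return sigma_f**2 * np.exp(-0.5 * sq)
```

In practice the w_d are hyperparameters fitted to data (e.g. by maximizing the marginal likelihood), and their fitted magnitudes are read off as relevance scores.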
“…Initially proposed by [8], Gaussian process was viewed as an alternative approach to neural networks since it was demonstrated that a large class of Bayesian regression models, based on neural networks, converged to a Gaussian process in the limit of an infinite network [9]. Gaussian processes can also be derived from the perspective of non-parametric Bayesian regression [10], by directly placing Gaussian prior distributions over the space of regression functions.…”
Section: Introduction (mentioning)
confidence: 99%
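The non-parametric view in the excerpt above, placing a Gaussian prior directly over regression functions and conditioning on data, reduces to a few lines of linear algebra. A minimal sketch (1-D inputs, a fixed squared-exponential kernel, posterior mean only; all names and hyperparameter values are illustrative):

```python
import numpy as np

def rbf(a, b, ell=1.0):
    # squared-exponential covariance between 1-D input vectors a and b
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def gp_predict(x, y, xs, noise=0.1):
    # Condition the Gaussian-process prior on observations (x, y)
    # and return the posterior mean at test inputs xs.
    K = rbf(x, x) + noise**2 * np.eye(len(x))
    return rbf(xs, x) @ np.linalg.solve(K, y)

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.sin(x)
mean = gp_predict(x, y, np.array([1.5]))
```

With a small noise level the posterior mean nearly interpolates the training points; the kernel plays the role that the (infinitely wide) hidden layer plays in the neural-network limit mentioned in the excerpt.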
“…In the latter context we additionally investigate how Automatic Relevance Determination (ARD) [15] [16] can be used to identify which sensors in the array contribute most to the estimate of the concentration posterior distribution. Finally, we analyze to which extent considering past sensor readings can improve the accuracy of probabilistic calibration.…”
(mentioning)
confidence: 99%