2009 IEEE International Conference on Systems, Man and Cybernetics
DOI: 10.1109/icsmc.2009.5346119
Bernoulli's principle of insufficient reason and conservation of information in computer search

Abstract: Conservation of information (COI), popularized by the no free lunch theorem, is a great leveler of search algorithms, showing that on average no search outperforms any other. Yet in practice some searches appear to outperform others. In consequence, some have questioned the significance of COI to the performance of search algorithms. An underlying foundation of COI is Bernoulli's Principle of Insufficient Reason (PrOIR), which imposes a uniform distribution on a search space in the absence of all prior knowledge…
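The averaging claim in the abstract is concrete enough to check by brute force on a toy space: enumerate every fitness function on a tiny domain and compare two fixed, non-repeating search orders. A minimal Python sketch (the names and toy sizes are mine, not from the paper):

```python
# Brute-force check of the "no free lunch" averaging claim on a tiny space.
# Averaged over ALL functions f: X -> Y, any two non-repeating search
# orders see the same distribution of values, so mean performance is equal.
from itertools import product

X = range(4)            # tiny search space
Y = (0, 1)              # possible fitness values (1 = "target")

def best_after_k(order, f, k):
    """Best fitness seen in the first k evaluations of a fixed search order."""
    return max(f[x] for x in order[:k])

order_a = [0, 1, 2, 3]  # "left to right" search
order_b = [3, 1, 0, 2]  # an arbitrary different order

fs = list(product(Y, repeat=len(X)))            # all |Y|^|X| = 16 functions
for k in range(1, 5):
    # Average best-found value over every possible fitness function.
    mean_a = sum(best_after_k(order_a, f, k) for f in fs) / len(fs)
    mean_b = sum(best_after_k(order_b, f, k) for f in fs) / len(fs)
    assert mean_a == mean_b                     # identical at every budget k
    print(f"k={k}: mean best = {mean_a:.3f} for both orders")
```

The assertion never fires: once performance is averaged over every function, the choice of search order is irrelevant, which is exactly the leveling effect the abstract describes.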

Cited by 34 publications (36 citation statements); citing publications span 2013–2024. References 46 publications.

Citation statements (ordered by relevance):
“…Given the uncertainties around the evaluation of the objectives, we incorporated them in the HRV in a way that enhanced visual steering, allowing a more informed and responsible decision‐making process. This was achieved by using uncertainty ranges based on a concept borrowed from the Physical Programming approach for multi‐objective optimization (Dembski and Marks, 2009). A percent confidence/deviation value for each objective attribute in the trade‐off set made it possible to compute the corresponding range bounds for each trade‐off point:

$$F_{i,lb} = F_i^j - F_i^j \times \frac{\delta_i}{100} \quad \text{and} \quad F_{i,ub} = F_i^j + F_i^j \times \frac{\delta_i}{100}, \qquad i = 1,\dots,14 \ \text{and} \ j = 1,\dots,100$$

where $i$ is the objective index, $j$ is the trade‐off point index and $\delta_i$ is the allowable percent deviation value for each objective = [3, 10, 10, 10, 3, 1, 1, 1, 2, 5, 5, 2, 5, 10] for water quantity, nitrate leaching, phosphorus loss, CO₂e, sediments, discounted costs, discounted income, discounted EBIT, milksolids, sawlog, pulpwood, beef, sheepmeat and wool, respectively.…”
Section: Hyper‐radial Visualization (mentioning)
confidence: 97%
“…Given the uncertainties around the evaluation of the objectives, we incorporated them in the HRV in a way that enhanced visual steering, allowing a more informed and responsible decision-making process. This was achieved by using uncertainty ranges based on a concept borrowed from the Physical Programming approach for multi-objective optimization (Dembski and Marks, 2009). A percent confidence/deviation value for each objective attribute in the trade-off set made it possible to compute the corresponding range bounds for each trade-off point:…”
Section: Uncertainty Ranges and Visualization (mentioning)
confidence: 99%
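The range-bound formula quoted in the two statements above is a one-liner in code. A minimal Python sketch of that arithmetic (the function name and example numbers are mine; the δ values are the ones listed in the citation):

```python
# Range bounds F_lb, F_ub for each objective value F_i^j, given an
# allowable percent deviation delta[i] per objective (values as quoted above).
DELTA = [3, 10, 10, 10, 3, 1, 1, 1, 2, 5, 5, 2, 5, 10]  # percent, 14 objectives

def range_bounds(F_ij: float, delta_i: float) -> tuple[float, float]:
    """F_lb = F - F*delta/100 and F_ub = F + F*delta/100 for one trade-off point."""
    half_width = F_ij * delta_i / 100.0
    return F_ij - half_width, F_ij + half_width

# Example: objective i=0 (water quantity, delta = 3%) at one trade-off point.
lb, ub = range_bounds(250.0, DELTA[0])
print(lb, ub)  # 242.5 257.5
```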
“…2) Fitness Value Representations: If the distribution of fitness values in a fitness map is known beforehand, a compression scheme such as Huffman encoding can be used to assign short binary strings to frequently occurring fitness values and longer strings to less frequently occurring values. If distribution information is not available, however, we assume an equiprobable distribution of values, so that each value in the set $Y$ of possible fitness values has a $1/|Y|$ probability of occurring [19], [20], [21]. This assumption results in an encoding length of $l = \log_2 |Y|$ binary digits per value, following Shannon's entropy formula [11].…”
Section: B. Fitness Functions and Fitness Maps (mentioning)
confidence: 99%
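To make that encoding argument concrete: under the equiprobable assumption each fitness value costs $\lceil \log_2 |Y| \rceil$ bits, whereas a known skewed distribution admits a shorter Huffman code. A small Python sketch (illustrative only; names and example probabilities are mine):

```python
# Bits per fitness value: uniform assumption vs. a known skewed distribution.
import heapq
from math import ceil, log2

def uniform_bits(num_values: int) -> int:
    """l = ceil(log2 |Y|) integer bits when every value is assumed equiprobable."""
    return ceil(log2(num_values))

def huffman_mean_bits(probs: list[float]) -> float:
    """Expected code length of a Huffman code for the given value probabilities."""
    # Each heap entry: (total probability of subtree, expected depth so far).
    heap = [(p, 0.0) for p in probs]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, c1 = heapq.heappop(heap)
        p2, c2 = heapq.heappop(heap)
        # Merging two subtrees adds one bit to every symbol beneath them,
        # which adds their combined probability to the expected length.
        heapq.heappush(heap, (p1 + p2, c1 + c2 + p1 + p2))
    return heap[0][1]

print(uniform_bits(4))                          # 2 bits per value
print(huffman_mean_bits([0.7, 0.1, 0.1, 0.1]))  # 1.5 bits per value on average
```

With |Y| = 4 the uniform scheme needs 2 bits per value, while the skewed example compresses to 1.5 expected bits, which is why the citing authors fall back to the uniform assumption only when the distribution is unknown.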
“…As illustrated in Figure 1, assume Pirate X searches for the location in a specific order such as (1, 2, 3). Assuming uniformity [3], the probability of the treasure being found in any of those locations is the same, $1/3$. However, this changes if both pirates are hunting for treasure at the same time. Pirate Y has chosen locations such that in two of the three cases, he will have searched a location and taken the treasure before Pirate X.…”
Section: Introduction (mentioning)
confidence: 99%
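The two-pirate claim above can be checked by enumerating the three equally likely treasure locations directly. A short Python sketch (Pirate Y's order is my illustrative choice, consistent with the quoted "(1, 2, 3)" for Pirate X):

```python
# Two pirates search three locations simultaneously, one location per turn.
# Treasure location is uniform over {1, 2, 3}; whoever reaches it first wins.
from fractions import Fraction

order_x = (1, 2, 3)   # Pirate X's fixed search order
order_y = (2, 3, 1)   # Pirate Y's order, chosen to lead X at two locations

wins_y = 0
for treasure in (1, 2, 3):                  # uniform: probability 1/3 each
    turn_x = order_x.index(treasure)        # turn on which X would find it
    turn_y = order_y.index(treasure)
    if turn_y < turn_x:                     # Y gets there strictly first
        wins_y += 1

print(Fraction(wins_y, 3))  # 2/3: Y takes the treasure in two of three cases
```

Each location alone is still a 1/3 bet, yet Y's ordering beats X in two of the three equally likely cases, which is the point the citing authors use to motivate competition between searchers.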