Selecting parameters can be a powerful mechanism for constructing a new evolving connectionist network. However, if a parameter carries only partial information, so that some of its values are relevant and others are not, then selecting the subset of relevant values is more appropriate. Treating the possible values of a parameter of a processing connectionist network as the outcomes of a variable, this research focuses on selecting interval values of the variable. It also considers the partitioning schemes used to generate the intervals from the outcomes of a variable. The goal of this work is to explore variable value selection and its effect on an evolving connectionist network. Using the input variables of a backpropagation network, the proposed method evaluates their effect during training on a dataset and eliminates those intervals of variable values that contribute negatively when processed by the network. When a value falls into an interval that has been selected to be ignored, the effect is analogous to a network that does not process the corresponding variable, and vice versa. Two approaches to interval partitioning are considered, based on equal-probability (or maximum-entropy) and equal-width partitioning schemes. Comparing the best-performing network with selection against the one without, the experimental results show that the best network with selection achieves better accuracy and a smaller network size.
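The two partitioning schemes named above can be sketched as follows; this is a minimal illustration of equal-width versus equal-probability (equal-frequency) binning, not the paper's actual implementation, and the function names and sample data are hypothetical.

```python
def equal_width_edges(values, k):
    """Return k+1 edges splitting [min, max] into k intervals of equal length."""
    lo, hi = min(values), max(values)
    step = (hi - lo) / k
    return [lo + i * step for i in range(k)] + [hi]


def equal_probability_edges(values, k):
    """Return k+1 edges so each interval holds roughly len(values)/k outcomes,
    approximating the maximum-entropy (equal-probability) scheme."""
    s = sorted(values)
    n = len(s)
    edges = [s[0]]
    for i in range(1, k):
        edges.append(s[(i * n) // k])  # empirical quantile cut point
    edges.append(s[-1])
    return edges


# Hypothetical outcomes of one input variable, partitioned into 3 intervals:
values = [1, 2, 2, 3, 10, 11, 12, 50, 60, 100]
print(equal_width_edges(values, 3))        # → [1, 34.0, 67.0, 100]
print(equal_probability_edges(values, 3))  # → [1, 3, 12, 100]
```

Note how the skewed sample pulls the equal-probability cut points toward the dense low end, while the equal-width edges ignore the data distribution entirely; the selection method described above would then keep or discard each resulting interval according to its measured effect on training.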