A perceptron that "learns" the opposite of its own output is used to generate a time series. We analyze properties of the weight vector and the generated sequence, such as the cycle length and the probability distribution of generated sequences. A remarkable suppression of the autocorrelation function is explained, and connections to the Bernasconi model are discussed. If a continuous transfer function is used, the system displays chaotic and intermittent behavior, with the product of the learning rate and amplification as a control parameter.
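The generator described above can be sketched in a few lines. This is an illustrative implementation only, assuming the standard anti-Hebbian setup: the input window holds the last N generated bits, the perceptron emits the sign of its local field, and the weights are then updated *away* from the emitted bit with learning rate `eta` (the function name `confused_bit_generator` and all parameter values are hypothetical choices, not taken from the paper).

```python
import numpy as np

def confused_bit_generator(N=20, eta=1.0, steps=2000, seed=0):
    """Sketch of a perceptron that 'learns' the opposite of its own output.

    Assumed update rule (anti-Hebbian): w -> w - (eta / N) * S * x,
    where S is the bit just emitted and x the window of past bits.
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(size=N)            # weight vector
    x = rng.choice([-1.0, 1.0], N)    # window of the N most recent bits
    seq = []
    for _ in range(steps):
        S = 1.0 if w @ x >= 0 else -1.0   # perceptron output: sign of local field
        w -= (eta / N) * S * x            # move weights AWAY from own output
        x = np.roll(x, 1)                 # shift window by one step
        x[0] = S                          # newest bit enters the window
        seq.append(int(S))
    return np.array(seq)

seq = confused_bit_generator()
# empirical autocorrelation of the generated sequence at small lags
c = [np.mean(seq[:-k] * seq[k:]) for k in range(1, 5)]
```

With such a generator one can empirically probe the quantities the abstract mentions: the cycle length of the eventually periodic sequence and the (suppressed) autocorrelation function.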