We study the storage properties of generalized Hebbian learning rules that contain four free parameters allowing for asymmetry. We also introduce two extra parameters into the post-synaptic potentials in order to further improve the critical capacity. Using a signal-to-noise analysis, as well as computer simulations of an analog network, we discuss the performance of these rules for arbitrarily biased patterns and find that the critical storage capacity αc is maximal for a particular symmetric rule (αc diverges in the sparse-coding limit). Departures from symmetry decrease αc but can increase the robustness of the model.
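A common four-parameter generalization of the Hebbian rule takes the form J_ij = (1/N) Σ_μ [A ξ_i^μ ξ_j^μ + B ξ_i^μ + C ξ_j^μ + D], with asymmetry arising whenever B ≠ C. The sketch below illustrates this parameterization as an assumption; the exact rule studied in the paper may differ, and the function name and parameter choices are hypothetical.

```python
import numpy as np

def hebbian_couplings(xi, A=1.0, B=0.0, C=0.0, D=0.0):
    """Illustrative four-parameter generalized Hebbian rule (assumed form):
    J_ij = (1/N) sum_mu [A xi_i^mu xi_j^mu + B xi_i^mu + C xi_j^mu + D],
    with J_ii = 0. Choosing B != C makes the couplings asymmetric."""
    p, N = xi.shape  # p patterns, N neurons
    J = (A * xi.T @ xi                      # pairwise Hebbian term
         + B * xi.sum(0)[:, None]           # pre-synaptic bias term
         + C * xi.sum(0)[None, :]           # post-synaptic bias term
         + D * p) / N                       # constant offset per pattern
    np.fill_diagonal(J, 0.0)                # no self-couplings
    return J

rng = np.random.default_rng(0)
xi = rng.choice([-1.0, 1.0], size=(5, 100))   # 5 random unbiased patterns
J_sym = hebbian_couplings(xi, A=1.0, B=0.3, C=0.3)    # symmetric: B == C
J_asym = hebbian_couplings(xi, A=1.0, B=0.3, C=0.0)   # asymmetric: B != C
print(np.allclose(J_sym, J_sym.T), np.allclose(J_asym, J_asym.T))
```

With B = C the coupling matrix is symmetric (J_ij = J_ji); breaking this equality produces the departures from symmetry discussed in the abstract.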