People often misrecognize objects that are similar to those they have previously encountered. These mnemonic discrimination errors are attributed to shared memory representations (gist), typically characterized in terms of meaning. In two experiments, we investigated multiple semantic and perceptual relations that may contribute to such errors: at the concept level, a feature-based measure of concept confusability quantified each concept’s tendency to activate other similar concepts via shared features; at the item level, rated semantic exemplarity indexed the degree to which the specific depicted objects activated their particular concepts. We also measured perceptual confusability across items using a computational model of vision, HMax, and an index of color confusability. Participants studied single (Experiment 1, N = 60) or multiple (Experiment 2, N = 60) objects for each basic-level concept, followed by a recognition memory test including studied items, similar lures, and novel items. People were less likely to recognize studied items with high concept confusability, and more likely to correctly reject their lures. This points to weaker basic-level semantic gist representations for objects with more confusable concepts, reflecting a greater emphasis on coarse processing of shared features relative to fine-grained processing of individual concepts. In contrast, people were more likely to misrecognize lures that were better exemplars of their concept, suggesting that enhanced basic-level semantic gist processing increased errors due to gist shared across items. Mnemonic discrimination errors were also more frequent for more perceptually confusable lures. The results implicate semantic similarity at multiple levels and highlight the importance of perceptual as well as semantic relations.