I remember old games as having better graphics than they actually had. The reason is nostalgia, and a couple of factors feed into it.
The first is the fact that you didn't have the new games of today to compare them with. Obviously if you took DOOM and put it up against Battlefield 4, then DOOM would look awful. But back in its prime, DOOM was the best of the best. Those were cutting edge graphics and that was a cutting edge game.
A lot of it boils down to comparative thinking and gradual progression.
Graphics are perpetually increasing in quality, with some combination of improved technique and Moore's Law doing the driving. We never, however, measure real-time content (e.g. games) against reality; even the most graphically stunning titles today aren't close to photorealistic. You can tell you're playing a game, but it's at the forefront of what can currently be achieved, and that is the standard we care about. No doubt at some point you'll think that a game made in 2012-14, at the pinnacle of what can be achieved at the time, is visually stunning, because it exceeds our previous measure for games; you've never seen a game with better graphics. Similarly, it's easy to remember a game made in 1994 as visually stunning, because it did exactly the same thing. Now, you can't really picture the graphics from 20 years ago (unless you have an eidetic memory), so your imagination fills in the blanks. Of course, when you go back to that older game, it's certainly not visually stunning by modern standards; we're used to considerably better, so the older game looks terrible, and the gap between your misremembered perception and reality closes sharply.
The other is imagination. When you play a game and really get lost in it, then it's like reading a book. You're not only playing the game and looking at pixels on a screen but you're building this image in your head as well.
You can actually demonstrate this process to yourself over a much shorter time scale. On a decent TV, sit and watch a standard-definition channel for a while, and rate the picture quality out of 10. Most people say around 7 or 8. Then swap to a high-definition channel. You'll probably think it looks a little better, but the difference between SD and HD seems fairly minimal, so you might rate it one point higher at 8-9. Now swap back to SD and rate it again; you'll find that by comparison to HD it looks awful, far worse than it did before, with most people revising their rating to around a 5. This is much the same process.
It's quite interesting to consider that even the most visually spectacular games today will be considered retro, or even ugly in a few years time, and people will probably still be talking about this, but using Crysis 3 as the outdated example.
For really old games there is another reason: Cathode-ray tube TVs -- you know, the big chunky ones, pre-LCD -- were a bit blurry.
Back in the NES, SNES, and N64 days, developers were actually banking on this blurriness to cover up the harsh lines of early 3D models, and the blurring smoothed out the pixels in pixel-art SNES games. When you play those games on a modern LCD, the harsh lines and square pixels show up with sharp clarity and can look a lot clunkier. If you go into the menus of consoles that offer retro games in their stores, many of them have video filters designed to blur things a little, add scanlines, and otherwise mimic the look of older TVs, which can make a lot of things look, unintuitively, nicer.
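To make the idea concrete, here's a rough sketch of what such a "CRT look" filter does at its simplest. This is my own toy illustration, not any console's actual filter, and all function names and parameters here are invented: a small blur softens the hard pixel edges, then every other row is dimmed to mimic scanlines.

```python
def box_blur(image):
    """Average each grayscale pixel (0-255) with its 3x3 neighborhood."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += image[ny][nx]
                        count += 1
            out[y][x] = total // count
    return out

def add_scanlines(image, darken=0.6):
    """Dim every other row to mimic the gaps between CRT scanlines."""
    return [
        [int(p * darken) for p in row] if y % 2 else list(row)
        for y, row in enumerate(image)
    ]

def crt_filter(image):
    # Blur first to soften edges, then overlay the scanline pattern.
    return add_scanlines(box_blur(image))

# A harsh white "pixel block" on black gets its edges smeared outward,
# which is exactly what hid the blocky look on a real CRT.
sprite = [
    [0,   0,   0,   0],
    [0, 255, 255,   0],
    [0, 255, 255,   0],
    [0,   0,   0,   0],
]
softened = crt_filter(sprite)
```

Real retro filters are fancier (phosphor masks, bloom, curvature), but the core trick is the same: deliberately throwing away sharpness that the original art was never designed to have.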
Likewise, the stuff you were comparing those games to was of a similarly low quality. If you were watching your movies on a long-play VHS on a 20" TV, then you would never really have seen a ton of background detail, and you would have been used to some blur. So putting on a good N64 game like Perfect Dark wouldn't have seemed as far from the movies as it does today, when you've been watching 50" Blu-rays.
Some fuzziness and a lack of background detail would just be par for the course.