I'm guessing it has something to do with how the early innovations had already been handled, so people felt they could start getting wild with it and using it excessively, whereas pre-1997 films tended to use it in a few brief shots that were buffered by practical effects and had to be carefully planned around. I'm curious why you think it seemed to get better around 2010, though.
I think around 2005, studios began to realise the limitations, so they started hiding the CGI instead of blanketing their movies with it... from 2000-2005, movies looked awful. Like you said, CGI was easier to fall back on and easier to access, but the tech itself hadn't been perfected. It was still jerky and plastic-looking.
Studios were simply trying to push too far beyond the limitations, and it always came out looking sh*te.
... but by around 2010 there were some leaps in tech that made CGI a little more photo-realistic, and we got stuff like de-aging tech.
Ok, de-aging was kinda crappy the very first time it was used, in X-Men 3... but the second time, in Benjamin Button, it was already pretty damned close to perfected.
TRON Legacy did a decent job too, and it really pushed the envelope as to how close to photo-real the tech could get.
Sadly, The Matrix 2 and 3... were just far too grand a vision for the available tech.
The 100-Smith fight was exciting on first watch, but even back then, the moment Neo switches over to a fully CGI double, it looked like crap.
Part 3, with the CGI fists hitting CGI faces in slow motion, in extreme close-up, with CGI rain... it was all just too much for the technology to render convincingly.