Or is it now possible?
It used to be said that you should never future-proof your PC, because it's not possible: thanks to Moore's Law, your PC will just become obsolete and your money wasted. Now it's starting to look like that may not be true for much longer. CPUs in particular have really stagnated. The CPU I upgraded to recently isn't all that much better than the i5 2500K I upgraded from, and that concerns me.
We're fast approaching the physical limits of how far we can shrink transistors, where it's no longer an engineering issue but an issue of the laws of the universe. From the look of things, we're going to hit either a paradigm shift or a stagnation around 2025. Of course, I don't have a crystal ball or anything, but unless I'm mistaken, we'll hit 5 nm mainstream chips by 2025 at the earliest.
And after that, well, we'll pretty much have to find something else to do, like "3D" chip design or putting more and more cores on chips, or we're stuck where we are. I'm no expert on economics, but that also probably means prices stop dropping, since hardware would no longer be going obsolete.
So what I'm saying is, maybe I should be buying the best CPUs on the market right now, for gaming and in general. Usually that would be a waste of money, but I'm starting to wonder if that's still the case. The whole "don't future-proof your PC" rule is based on Moore's Law, and that law is ending soon.