I have neural networks that would easily use all the virtual memory Windows would let me have, on the order of 1000 TB.
Nothing says you have to store all of that data in memory at the same time. You have to be somewhat conservative; working sets that outgrow physical memory are a problem that will probably always exist.
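Streaming over the data in chunks is the usual way around it. A minimal sketch of the idea using a NumPy memory map (the file name and array sizes here are made up for illustration):

```python
import numpy as np

n_samples, n_features = 1_000_000, 512    # a much larger set works the same way
path = "activations.dat"                  # hypothetical on-disk array

# Create the backing file once; only the pages actually touched stay resident.
data = np.memmap(path, dtype=np.float32, mode="w+",
                 shape=(n_samples, n_features))

# Walk the array in chunks; the OS pages data in and out as needed,
# so peak RAM use is roughly one chunk, not the whole array.
chunk = 65_536
total = 0.0
for start in range(0, n_samples, chunk):
    block = np.asarray(data[start:start + chunk])  # load just this slice
    total += float(block.sum())
print(total)
```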
Stuff that requires thousands of TBs of memory is most likely not designed to run on your average computer, so it isn't really a valid counterexample to the claim that 64 GB is plenty.
4 GB is the usual standard today. I am looking at 8 GB myself, but anything beyond that is something I have not even considered. I think 64 GB is plenty.
You have to be conservative if you don't have that much memory. If you do have it, but an arbitrarily set OS limit won't let you use it, that's another story.