@TRega123 You'd think a software developer would have a better understanding of how RAM works. 4GB is more than enough for a dedicated gaming console and will be for the foreseeable future. 8 would be overkill and 16 would be absurd. It doesn't work exactly like this, but 4GB in a console is almost equivalent to 16GB in a PC. Look at what's out now. Consoles have 512MB and perform as well as, if not better than, a PC with 2GB. All the hardware and numbers you're familiar with for PCs mean next to nothing when dealing with a console.
Yeah... we software developers have no idea what we're talking about. Apparently neither does the CEO of Crytek, who says he wants 32GB of RAM in the next-gen consoles. So please ignore us and instead trust random people on internet forums. After all, they know how to open their task manager and read their memory utilization!
If you really want to know how memory utilization works, you can read my previous post:
@Demoskinos said: @TRega123 Uh, because they really don't need it. Look what they have done with 512MB. It's amazing The Witcher 2 runs on the 360 at all. You also aren't running an operating system, so pretty much all of that memory can be utilized.
Garbage collection in a nutshell (yes, I wrote this in its entirety; I clearly know too much about how modern applications manage memory):
When a small object is allocated, it is written to a contiguous block of available memory in the heap as a generation 0 object. Periodically the collector traverses the heap, building a graph of which objects are still reachable from live references and freeing the unreachable ones (this also catches reference cycles that plain reference counting would miss), then marking the survivors as generation 1. This is a mark-and-sweep pass. Less frequently, a generation 1 collection traverses the heap, freeing unused generation 1 objects and marking survivors as generation 2, and so on; for most garbage collectors, two or three generations are enough. 'Object churn' (i.e. the constant freeing and allocating of objects) fragments the heap, and once it becomes difficult to find contiguous blocks for new objects, the garbage collector performs heap compaction, which is a very expensive operation. To avoid compaction, the application must be well tuned so that most non-permanent objects die in generation 0. A well-tuned application also stays under roughly 80% of total available memory, leaving room to churn objects and greatly reducing the likelihood of heap compaction.
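You can poke at a real generational collector from Python. CPython's scheme differs in the details (it is primarily reference counting, with a generational collector only for cycles, and it never compacts), but it shows generations and cycle detection concretely. This is just an illustrative sketch, not how a console engine would do it:

```python
import gc

gc.disable()  # take manual control so nothing gets collected behind our back

# Three numbers, one allocation threshold per generation (0, 1, 2).
print(gc.get_threshold())  # e.g. (700, 10, 10) on a default CPython build

# Build a reference cycle that pure reference counting can never free.
class Node:
    def __init__(self):
        self.ref = None

a, b = Node(), Node()
a.ref, b.ref = b, a  # a -> b -> a
del a, b             # refcounts stay above zero because of the cycle

# A sweep over all generations finds the cycle via reachability and frees it.
found = gc.collect()
print("unreachable objects found:", found)
gc.enable()
```

The count printed includes the two `Node` instances (and, depending on the CPython version, their instance dictionaries), which is exactly the garbage a pure refcounter would have leaked.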
What about when a large object is allocated? It is common for garbage collectors to treat large objects differently, often giving them their own heap, called the large object heap. Large objects also require a contiguous block of available memory; however, unlike the standard heap, the large object heap is not compacted. If no contiguous block is available for your large object, you are simply out of memory. This isn't as bad as it sounds, because in a well-tuned application large objects should be infrequent or interchangeable. An uncompressed high-resolution texture, for example, can be freed and have its memory block reallocated to another texture of the same resolution and size (all uncompressed RGB textures of the same resolution occupy the same number of bytes). This is called object pooling.
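Object pooling in miniature: a minimal sketch in Python, with fixed-size buffers standing in for same-resolution textures (the `BufferPool` class and the sizes are my own invention, not from any real engine). Freed buffers go back on a free list and get handed out again instead of triggering a fresh large allocation:

```python
class BufferPool:
    """Reuses fixed-size buffers instead of allocating new ones."""

    def __init__(self, buffer_size):
        self.buffer_size = buffer_size
        self._free = []  # blocks returned by release(), ready for reuse

    def acquire(self):
        if self._free:
            return self._free.pop()         # reuse an existing block
        return bytearray(self.buffer_size)  # pool empty: allocate once

    def release(self, buf):
        self._free.append(buf)              # make the block reusable

# A 1024x1024 "texture" at 3 bytes per pixel (uncompressed RGB).
pool = BufferPool(1024 * 1024 * 3)
tex_a = pool.acquire()   # fresh allocation
pool.release(tex_a)      # back to the pool, memory block kept alive
tex_b = pool.acquire()   # same memory block handed out again
print(tex_a is tex_b)    # True: no new allocation, no fragmentation
```

Because every buffer in the pool is the same size, a released block always fits the next request exactly, which is why pooling sidesteps the large-object fragmentation problem described above.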
If that was too long to read (or you really don't care about the details), I can sum it up in one sentence: running above roughly 80% memory utilization under a garbage collector triggers heap compaction and will significantly slow down your application, whether it's productivity software or a game.