Non-Sequential Memory   (+4, -1)
Non-sequential computer memory.

Even true random-access devices like modern electronic memory share a disadvantage with primitive technologies like magnetic-tape storage: though we can access any location at any time, the locations are arranged in a fixed order which we cannot change. Operating systems therefore include memory-allocation and garbage-collection schemes, which cost CPU overhead. There might be another, more elegant way to deal with this problem.

Current memory chips have relatively simple interfaces - usually a set of address lines which select a location, and a set of data lines which allow reading from and writing to that location. It would be useful to create a new type of memory - a non-sequential random-access memory - whose interface would resemble a hardware version of "malloc()".

Rather than simply perform memory allocation in hardware, which has been proposed before, why not eliminate the need for memory allocation as we know it? Do not arrange the individual memory cells in a fixed sequential order; instead, allow them to form chains on command, using an architecture similar to that of an FPGA (field-programmable gate array), in which connections between elements may be programmed dynamically at nearly any physical distance from one another. The new chip could thus perform the basic functions of memory allocation with very little actual processing; nothing resembling "garbage collection" would be needed, as unused cells would simply be disconnected from their chains (in our case, blocks of contiguous data) and become free for use by any other chain.
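
Here is a rough software model of the idea (all names hypothetical; in the real chip the "next" links would be programmable interconnect, not code):

/* Toy model of chain-forming memory cells.  In hardware the
 * "next" links would be FPGA-style programmable connections;
 * here they are just indices into a cell pool. */
#include <stdio.h>

#define NCELLS 16
#define NIL    -1

struct cell { int data; int next; };

static struct cell pool[NCELLS];
static int free_head = 0;

/* Link every cell into one big free chain at power-up. */
static void init(void) {
    for (int i = 0; i < NCELLS; i++)
        pool[i].next = (i + 1 < NCELLS) ? i + 1 : NIL;
}

/* "Allocation" just detaches n cells from the free chain and
 * chains them together; no search, no bookkeeping tables. */
static int claim_chain(int n) {
    int head = free_head;
    int c = head;
    while (--n > 0 && c != NIL) c = pool[c].next;
    if (c == NIL) return NIL;     /* out of cells */
    free_head = pool[c].next;
    pool[c].next = NIL;           /* terminate the new chain */
    return head;
}

/* "Freeing" re-links the chain onto the free chain; nothing
 * resembling garbage collection ever has to run. */
static void release_chain(int head) {
    int c = head;
    while (pool[c].next != NIL) c = pool[c].next;
    pool[c].next = free_head;
    free_head = head;
}

int main(void) {
    init();
    int a = claim_chain(4);       /* a 4-cell "block" */
    printf("chain starts at cell %d\n", a);
    release_chain(a);             /* cells instantly reusable */
    return 0;
}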
-- dsm, Mar 11 2002

Very interesting idea... but wouldn't this mean that every memory access now involves following several pointers down the linked list to find its 'chunk' first? That would be pretty inefficient, I think...
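
Roughly speaking (borrowing the cell structure from the sketch above), reaching the i-th word of a chain takes i steps instead of one:

#include <stdio.h>

struct cell { int data; int next; };   /* same shape as in the idea above */

/* Conventional RAM: one step. */
static int read_flat(const int *mem, int i) { return mem[i]; }

/* Chained cells: i pointer chases before the data appears. */
static int read_chain(const struct cell *pool, int head, int i) {
    while (i-- > 0)
        head = pool[head].next;
    return pool[head].data;
}

int main(void) {
    int flat[4] = { 10, 11, 12, 13 };
    struct cell chain[4] = { {10,1}, {11,2}, {12,3}, {13,-1} };
    printf("%d %d\n", read_flat(flat, 3), read_chain(chain, 0, 3));
    return 0;
}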
-- Jeremi, Mar 11 2002


Yeah. This would involve additional hardware, firmware and processing overhead. And you'd still have programs that don't properly allocate memory or release it when done. I don't see how it would accomplish anything.
-- phoenix, Mar 11 2002


What would be better would be to have languages and processors with built-in support for a pointer type consisting of a handle and an offset. If segment selectors were extended to 32 bits, the x86 architecture could probably work pretty well here. Pointers would be eight bytes rather than four, but intelligent hardware bounds-checking would be quite feasible.
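
A minimal sketch of what such a handle+offset pointer could look like in C (names hypothetical; the table lookup and bounds check are what the hardware would perform on every dereference):

#include <stdint.h>
#include <stdio.h>

/* Hypothetical fat pointer: 4-byte handle plus 4-byte offset,
 * eight bytes total, as suggested above. */
typedef struct { uint32_t handle; uint32_t offset; } fatptr;

struct block { uint8_t *base; uint32_t limit; };

static uint8_t heap[64];
static struct block table[] = { { heap, 16 }, { heap + 16, 48 } };

/* Every dereference goes through the descriptor table, so the
 * bounds check could be done by hardware rather than by code. */
static int load(fatptr p) {
    struct block b = table[p.handle];
    if (p.offset >= b.limit) {
        fprintf(stderr, "bounds fault\n");  /* hardware would trap here */
        return -1;
    }
    return b.base[p.offset];
}

int main(void) {
    fatptr p = { 1, 5 };          /* block 1, byte 5 */
    printf("%d\n", load(p));
    return 0;
}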
-- supercat, Mar 11 2002


<Elvis>I forgot to remember to forget</Elvis>
-- thumbwax, Mar 11 2002


CAM - Content Addressable Memory. Very specialized, but it exists. Yes, there is more overhead inside it. You "address" it by specifying the partial contents you're looking for, and it returns the entire memory word. So the software would use part of each word for a tag and part for the data. You can see that it would be very easy to look for empty words and easy to throw away words no longer needed. No need for garbage collection in the conventional sense, since it is always easy and automatic to find unused words. The Goodyear STARAN computer used this kind of memory.
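
A toy model of the tag/data scheme (a real CAM compares the tag against every word at once; the loop here just stands in for that parallel match):

#include <stdio.h>
#include <stdint.h>

#define WORDS 8
#define FREE  0               /* reserve tag 0 to mark unused words */

struct word { uint32_t tag; uint32_t data; };
static struct word cam[WORDS];    /* all tags start out FREE */

/* Return the index of a word whose tag matches, or -1. */
static int match(uint32_t tag) {
    for (int i = 0; i < WORDS; i++)
        if (cam[i].tag == tag) return i;
    return -1;
}

int main(void) {
    int w = match(FREE);          /* "allocation": find any free word */
    cam[w].tag = 42; cam[w].data = 1234;
    printf("tag 42 -> %u\n", cam[match(42)].data);
    cam[w].tag = FREE;            /* "free": word is instantly reusable */
    return 0;
}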
-- syzygy, Mar 11 2002


CAM may be 'specialized', but it's also very widespread; nearly every modern cache controller makes extensive use of it in some form.
-- supercat, Mar 12 2002


Why not just integrate all memory into the CPU? Put all hard-disk data and RAM into the CPU; it would be a big CPU, but all information would be available at all times.
-- JoeLounsbury, Nov 12 2003


Every Intel processor since the 80386 has had the ability to arbitrarily rearrange 4K blocks of memory/address space. Other CPU architectures have similar abilities. To be sure, the 32-bit address registers are a little limiting by modern standards, but the Intel architectures also support 48-bit pointers (though I'll admit I don't know if anybody actually uses them).
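
A toy model of that remapping (the 386's paging unit does this lookup in hardware; here the page table is just an array):

#include <stdio.h>
#include <stdint.h>

/* Virtual page -> physical page, 4K pages.  Rearranging a block
 * means rewriting one table entry; no data ever moves. */
#define PAGE 4096u
static uint32_t page_table[4] = { 3, 2, 1, 0 };  /* reversed mapping */

static uint32_t translate(uint32_t vaddr) {
    uint32_t vpage = vaddr / PAGE, off = vaddr % PAGE;
    return page_table[vpage] * PAGE + off;
}

int main(void) {
    printf("vaddr 0x1008 -> paddr 0x%x\n", translate(0x1008));
    /* "Rearrange": swap two 4K blocks by editing the table. */
    uint32_t t = page_table[0];
    page_table[0] = page_table[1];
    page_table[1] = t;
    printf("after remap: 0x1008 -> 0x%x\n", translate(0x1008));
    return 0;
}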
-- supercat, Nov 14 2003


The phrase "Non-Sequential Memory" does not get a lot of hits. I wonder what they decided to call it. Likely it's built into computer games or something.

Cloud computing is just off-loading all your troubles to some computer in the sky, but it has some of the all-you-can-eat character of this idea.
-- popbottle, Jul 14 2015


