Modular neural network

modular blocks to build 3D circuits, in particular neural networks
  (+7, -2)

Each block is shaped as a space-filling polyhedron (or a space-filling polyhedron with face extrusion). The simplest example of a space-filling polyhedron is a cube, and with face extrusion a cube becomes a 3D cross. To form an array of 3D cross blocks, the extruded face of one block connects to the extruded face of the next, and so on. This gives a 'porous' array, which aids in the construction and cooling of the array.

Each face (or extruded face) of each block acts as an electrical interface to the circuitry within the block (called the centre circuit) and to the touching face of the adjacent block. Some or all faces of the blocks have a magnet (ferromagnet or electromagnet) embedded therein, which helps connect blocks and maintain electrical contact between them. The polarity of the magnet at each face indicates whether that face acts as an input or output to the centre circuit. A diode can be used as well as (or instead of) the magnet to determine whether the face acts as an input or output.

The blocks can also include a conductor running between opposite faces of the block. When the blocks are connected, the conductors electrically connect and act as 'rails', carrying a supply voltage (and ground) to power the centre circuit. The rails could also carry a clock signal.
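
As a software analogy for the block just described, here is a minimal sketch (hypothetical Python; the class, the field names and the six-face layout are my own assumptions, not an existing library): each block sits at a grid position, each of its six faces is marked as an input or an output to the centre circuit (standing in for the magnet polarity or diode), and a rail voltage is carried through the block.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, Tuple

# The six face directions of a cubic block (or the six extrusions of a 3D cross).
FACES = {"+x": (1, 0, 0), "-x": (-1, 0, 0),
         "+y": (0, 1, 0), "-y": (0, -1, 0),
         "+z": (0, 0, 1), "-z": (0, 0, -1)}

@dataclass
class Block:
    position: Tuple[int, int, int]            # grid coordinates within the array
    centre_circuit: Callable[[list], float]   # maps input-face values to an output value
    face_is_input: Dict[str, bool] = field(   # True = this face feeds the centre circuit
        default_factory=lambda: {f: f != "+x" for f in FACES})  # e.g. 5 inputs, 1 output on +x
    rail_voltage: float = 5.0                 # supplied by the conductors running through the block

    def neighbour_position(self, face: str) -> Tuple[int, int, int]:
        """Grid position of the block touching this block at the given face."""
        dx, dy, dz = FACES[face]
        x, y, z = self.position
        return (x + dx, y + dy, z + dz)

def mating_face(face: str) -> str:
    """The face of the neighbouring block that mates with this face (+x mates with -x, etc.)."""
    return ("-" if face.startswith("+") else "+") + face[1]

# Example: a block at the origin whose centre circuit is an AND gate.
block = Block(position=(0, 0, 0),
              centre_circuit=lambda inputs: float(all(v > 0 for v in inputs)))
print(block.neighbour_position("+x"), mating_face("+x"))   # (1, 0, 0) -x
```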

Centre circuits could be analogue or digital, and synchronous or asynchronous. The centre circuit could be, for example:

-analogue: amplifier/inverter, capacitor/inductor/resistor, etc.

-digital: memory/EEPROM, timer/delay, multiplexer, AND gate, OR gate, etc.

So for example, consider a block that has 5 inputs and one output (a quick simulation of both cases is sketched below):

-A digital centre circuit could be an AND gate, which only outputs an ON signal if all inputs are ON.

-An analogue centre circuit could be an averaging circuit. If the inputs are (0.3V, 0.2V, 0V, 0.5V, 0V) then the output would be 0.2V.
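
As a software illustration of those two centre circuits (hypothetical Python; just the arithmetic of the example above, not real hardware behaviour):

```python
def and_gate(inputs):
    """Digital centre circuit: output is ON only if every input is ON."""
    return all(v > 0 for v in inputs)

def averaging_circuit(inputs):
    """Analogue centre circuit: output voltage is the mean of the input voltages."""
    return sum(inputs) / len(inputs)

inputs = [0.3, 0.2, 0.0, 0.5, 0.0]      # the five input-face voltages from the example
print(averaging_circuit(inputs))        # 0.2, matching the example
print(and_gate([1, 1, 1, 1, 1]))        # True: all five inputs ON
print(and_gate([1, 1, 0, 1, 1]))        # False: one input OFF
```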

The blocks can be of any size. For educational purposes they could be 5 cm wide for building arrays of about 10x10x10; for research purposes they could be 5 mm wide for building arrays of about 100x100x100. Even smaller blocks are conceivable, with the size limited only by the tools used to manufacture and manipulate the blocks.

Scaffolding could be threaded through the gaps formed between the blocks to add strength. Alternatively, the array of blocks could be immersed in a fluid to dissipate heat.

The array of blocks could be deconstructed by 'slicing' planes between blocks with a sheet of paper or other suitable material (the edges of the blocks would need to be chamfered to allow this). In this way the array could be completely dismantled, or individual blocks could be accessed and replaced.

The blocks could be assembled (and disassembled) by a robotic arm with an electromagnet to pick up and drop blocks into place. Alternatively, a 2D electromagnet array could be used to pick up and drop several blocks, or an entire slice, at a time.

The blocks at the faces of the array are connected to 2D circuits. These circuits interface to input/output devices (eg a computer, a camera, a display, etc), or electrically couple block faces to other block faces.

This system of constructing neural networks would be suitable for an 'open source' approach to AI. For example, several independent researchers could be working on neural networks for differentiating between images of cancerous and non-cancerous skin. They could 'breed' their most successful neural networks to create an even more successful neural network.

The system is also suitable for 'pasting' or merging neural networks together. So one researcher could be working on a neural network that extracts and outputs only the human voices from a noisy recording. Another researcher could be working on a neural network that performs voice recognition. The two researchers' modular neural network arrays could be attached at the appropriate faces (ie the output faces of one to the input faces of the other). In this way the 'parallelism' of the neural networks is preserved in the merger (ie not just a single output of one neural net being fed into the input of the next).
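
A minimal sketch of that face-to-face 'pasting', assuming each mating surface is simply a 2D grid of block faces (hypothetical Python; the function and argument names are invented for illustration):

```python
def paste_arrays(output_face_grid, input_face_grid):
    """Connect two modular arrays face-to-face.

    Each argument is a 2D grid (list of lists) representing the block faces on
    the mating surface of one array. Every output face is paired with the input
    face directly opposite it, so the full width of the signal crosses the join
    and the 'parallelism' of the two networks is preserved.
    """
    rows, cols = len(output_face_grid), len(output_face_grid[0])
    assert rows == len(input_face_grid) and cols == len(input_face_grid[0]), \
        "mating surfaces must be the same size"
    return [(output_face_grid[r][c], input_face_grid[r][c])
            for r in range(rows) for c in range(cols)]

# Example: two 10x10 mating surfaces give 100 parallel connections.
extractor_out = [[("extractor", r, c) for c in range(10)] for r in range(10)]
recognizer_in = [[("recognizer", r, c) for c in range(10)] for r in range(10)]
print(len(paste_arrays(extractor_out, recognizer_in)))   # 100
```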

xaviergisz, Aug 11 2005

sphericon lattices http://www.pjrobert...on/arrangements.php
[JesusHChrist, Aug 12 2005]

Creatures from primordial silicon http://www.netscrap...ail.cfm?scrap_id=73
Using genetic algorithms on an FPGA to evolve AI. A similar approach could be used in this modular system. [xaviergisz, Aug 12 2005]

Space-Filling Polyhedron http://mathworld.wo...lingPolyhedron.html
[xaviergisz, Aug 14 2005]

Andreini tessellations http://en.wikipedia...dreini_tessellation
Space filling using polyhedra with regular polygon faces. (The rendered images on this page are my first contribution to Wikipedia.) [xaviergisz, May 15 2006]

Sugarcube CPU Sugarcube_20CPU
Snap! This is what I'm talking about. [xaviergisz, May 19 2007]

smart sand http://web.mit.edu/...otic-sand-0402.html
[xaviergisz, Apr 02 2012]


       Sphericons rotate against each other in 3D lattices. Not sure if that is applicable.
JesusHChrist, Aug 11 2005
  

       That's cool, I didn't know you could just join two neural nets together at the appropriate nodes and have a system which performed the two tasks at once.   

       Not sure how you would breed inanimate conductive tesseracts (if I understand your description rightly) but it all sounds like a thoroughly interesting idea. Great for learning about neural nets too. [+]
pooduck, Aug 11 2005
  

       [pooduck], have a look at the link to get an idea of how to 'breed' neural networks.
xaviergisz, Aug 12 2005
  

       That's a cool article but how does it apply to your idea? (don't mean to sound flip there, sorry)   

       What would do the selective deleting of unfit modules?
pooduck, Aug 12 2005
  

       [pooduck], admittedly there are a lot more parameters in this system than the one in the article, but the theory is the same. In the article, the chip is programmed with binary logic, so for example a particular gate could be programmed to be an AND gate or an OR gate. The only difference with this system is that instead of programming the gate, the gates (ie blocks) are physically replaced.   

       Obviously it is more time consuming and difficult to replace blocks rather than programming them, but it could be done.   

       So to apply a genetic algorithm to this system you would build about 100 small random arrays (eg 10x10x10) of logic gates and test each of them for the ability to perform a task. Then identify the best performers. To breed the two best performers you could slice both arrays into 10x10x5 arrays and stick them together to form two new 10x10x10 arrays*. Again test for ability. Mutations could be introduced by slicing an array open, removing a random block and replacing it with a block with a different logic operation.

       *Actually a more careful slice (ie with a jagged edge) would need to be made, if that is possible at all, so that the inputs and outputs would match. I'm sure there would be ways to overcome this problem.
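
A sketch of that breeding loop (hypothetical Python; the block types and the fitness test are placeholders, and the jagged-edge matching problem in the footnote is ignored):

```python
import copy
import random

BLOCK_TYPES = ["AND", "OR", "NOT", "DELAY"]   # stand-ins for the available centre circuits
SIZE = 10                                     # each array is SIZE x SIZE x SIZE blocks

def random_array():
    """A random 10x10x10 array of logic-gate blocks."""
    return [[[random.choice(BLOCK_TYPES) for _ in range(SIZE)]
             for _ in range(SIZE)] for _ in range(SIZE)]

def fitness(array):
    """Placeholder: in practice the physical array would be tested on the real task."""
    return random.random()

def breed(a, b):
    """Slice both parents into 10x10x5 halves and recombine them into two children."""
    half = SIZE // 2
    child1 = copy.deepcopy(a[:half] + b[half:])
    child2 = copy.deepcopy(b[:half] + a[half:])
    return child1, child2

def mutate(array):
    """Open the array and swap one random block for a block with a different operation."""
    x, y, z = [random.randrange(SIZE) for _ in range(3)]
    array[x][y][z] = random.choice(BLOCK_TYPES)

population = [random_array() for _ in range(100)]
for generation in range(20):
    population.sort(key=fitness, reverse=True)   # identify the best performers
    child1, child2 = breed(population[0], population[1])
    mutate(child1)
    mutate(child2)
    population[-2:] = [child1, child2]           # children replace the two worst arrays
```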
xaviergisz, Aug 12 2005
  

       Ah I understand now. In fact it could all be automated and done very quickly.   

       Even without commercial uses it would still be good for educational purposes.
pooduck, Aug 12 2005
  

       //The simplest example of a space-filling polyhedron is a// tetrahedron. Otherwise known as the first Platonic solid.
baconbrain, Aug 14 2005
  

       [baconbrain], from the link:   

       Although even Aristotle himself proclaimed in his work 'On the Heavens' that the tetrahedron fills space, it in fact does not. The cube is the only Platonic solid possessing this property. However, a combination of tetrahedra and octahedra do fill space.
xaviergisz, Aug 14 2005
  

       That is a rather uncharitable analysis [bigsleep]. When I suggested that two modular neural networks could be spliced together, I didn't mean they would automatically merge perfectly into a better neural net. It would no doubt require more evolution using the method described.   

       Also, my example was to aid understanding rather than describe a 'best method of construction'. The principle I was trying to explain is that it is good to retain the 'parallelism' inherent in neural networks when developing them.
xaviergisz, May 15 2006
  

       [bigsleep], where did you post //a theory on how to stitch two neural nets together//?
xaviergisz, May 15 2006
  

       I've got a variation on the original idea. I'm thinking that if you tried to make the modules small (i.e. micrometre scale), it'd be difficult to make and arrange them precisely. Instead the modules could be made approximately spherical, with the surface covered by input/output contacts.   

       The spherical modules could then be simply mixed together (kind of like mixing different kinds of sand in a bowl) to form the neural networks. So developing neural networks would be about finding the right ratios of the 'ingredient' modules.
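
A sketch of what searching for 'the right ratios' could look like (hypothetical Python; the module types and the scoring function are invented placeholders):

```python
import random

MODULE_TYPES = ["amplifier", "delay", "AND gate", "averager"]   # invented 'ingredient' modules

def mix(ratios, n_modules=10000):
    """Draw a batch of spherical modules according to the given type ratios."""
    types, weights = zip(*ratios.items())
    return random.choices(types, weights=weights, k=n_modules)

def score(batch):
    """Placeholder: in practice the mixed network would be tested on its task."""
    return random.random()

# Random search over the ratio of each ingredient module.
best_ratios, best_score = None, float("-inf")
for trial in range(50):
    ratios = {t: random.random() for t in MODULE_TYPES}
    trial_score = score(mix(ratios))
    if trial_score > best_score:
        best_ratios, best_score = ratios, trial_score
print(best_ratios)
```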
xaviergisz, Nov 06 2006
  


 
