“Shared Nothing Architecture” for large-scale analytics

“Shared Nothing (SN) Architecture” is an old concept that is gaining popularity again.

SN Architecture (or SNA) is a distributed architecture in which each node is independent and self-sufficient, which eliminates single points of failure and bottlenecks in large-scale systems.

Today TIBCO announced the release of ActiveSpaces 2.0.
To improve scalability and enable deeper analysis of “data in motion”, TIBCO is relying on SNA for storage (in the Google world this would be called “DB sharding”).
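To make the idea concrete, here is a minimal sketch (not TIBCO’s actual API, just a generic illustration in Python) of hash-based sharding across shared-nothing nodes: each node owns its own storage, and a key is routed to exactly one node.

    import hashlib

    class Node:
        """One self-sufficient node: owns its own storage, shares nothing."""
        def __init__(self, name):
            self.name = name
            self.store = {}  # local key-value storage, invisible to other nodes

        def put(self, key, value):
            self.store[key] = value

        def get(self, key):
            return self.store.get(key)

    class ShardedCluster:
        """Routes each key to exactly one node via hashing (hash sharding)."""
        def __init__(self, node_names):
            self.nodes = [Node(n) for n in node_names]

        def _node_for(self, key):
            digest = hashlib.md5(key.encode()).hexdigest()
            return self.nodes[int(digest, 16) % len(self.nodes)]

        def put(self, key, value):
            self._node_for(key).put(key, value)

        def get(self, key):
            return self._node_for(key).get(key)

    cluster = ShardedCluster(["node-a", "node-b", "node-c"])
    cluster.put("user:42", {"name": "Ada"})
    print(cluster.get("user:42"))  # served entirely by the node that owns the key

Because no node depends on another node’s memory or disk, any single node can fail or be replaced without taking the whole system down, which is exactly the property SNA trades on.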

(As a side note, Teradata uses SNA as well.)

With this release TIBCO is pushing further into the big-data market, which according to Wikibon is predicted to be a $53.4 billion hardware, software and services market by 2017.

Intelligent Computers … of the other kind

Computer intelligence is getting another boost through an innovative design for artificial neural networks.

Known for her work on Artificial Recurrent Neural Networks (ARNN), Hava Siegelmann of the University of Massachusetts Amherst is responsible for the breakthrough.

She has taken Alan Turing’s work to its next logical step. Her neural network models can morph and completely rearrange their design every time a new fact is learned. As a result, her networks learn an order of magnitude more effectively and faster than traditional neural networks. Siegelmann claims that her model is superior not just in capability but also in efficiency, requiring significantly less computational power.

She received funding to develop the first ever “Super-Turing” computer based on ARNN.
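To illustrate the general idea of a network that rearranges its own structure as it learns (this is only a toy sketch in Python, not Siegelmann’s ARNN model; the class and threshold below are invented for illustration), here is a prototype-based learner that allocates a new unit whenever it encounters a genuinely unfamiliar fact:

    import numpy as np

    class GrowingPrototypeNet:
        """Toy network whose structure changes during learning: a new
        prototype ("unit") is allocated when an example is not close to
        any existing one."""
        def __init__(self, threshold=1.0):
            self.threshold = threshold
            self.prototypes = []  # list of (vector, label) pairs

        def learn(self, x, label):
            x = np.asarray(x, dtype=float)
            for i, (p, l) in enumerate(self.prototypes):
                if l == label and np.linalg.norm(p - x) < self.threshold:
                    # familiar fact: nudge the existing unit instead of growing
                    self.prototypes[i] = ((p + x) / 2.0, l)
                    return
            # genuinely new fact: the structure itself changes by adding a unit
            self.prototypes.append((x, label))

        def predict(self, x):
            x = np.asarray(x, dtype=float)
            if not self.prototypes:
                return None
            dists = [np.linalg.norm(p - x) for p, _ in self.prototypes]
            return self.prototypes[int(np.argmin(dists))][1]

    net = GrowingPrototypeNet(threshold=1.0)
    net.learn([0.0, 0.0], "A")
    net.learn([5.0, 5.0], "B")  # far from anything known, so a new unit is grown
    print(len(net.prototypes), net.predict([4.8, 5.1]))

The point of the sketch is simply that the network’s topology is not fixed up front; what Siegelmann claims goes far beyond this, with the model continuously reorganizing itself in response to its inputs.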

At the same time, scientists from the University of Gothenburg have demonstrated growing nerve cells (a biological neural network) on nanocellulose, which can be used to construct 3D brain models.

One way or the other we will get to true AI … the question is which approach will prevail: reverse engineering or an evolutionary process!