Friday, 19 February 2016

Memcomputing

Memcomputing is a new computing paradigm based on the idea that memory can and should be used to compute. The idea is inspired by the functionality of the brain, where neurons both store information and process it. In particular, the following drawing shows the way a memcomputer operates. The zigzag arrow specifies that a signal is sent; all other arrows designate flow of information.


Now compare this architecture with the "traditional" von Neumann architecture:

I think the difference is obvious. The interesting thing with memcomputing is that the people who designed this computer architecture published a paper in which they claim that memcomputers can solve NP-complete problems. In particular, they claim that their machine can solve instances of the subset sum problem, which can be phrased as follows: given a finite set G of integers having n elements, is there a non-empty subset K of G whose elements sum to s? As happens in this and other similar cases, a number of people came forward just to question the last claim without recognizing the general contribution of its authors. The Shtetl-Optimized blog is such a medium. What is really annoying with such media is that their authors completely ignore Socrates's dictum: I know that I know nothing…
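To see why the claim attracted so much attention, here is a naive brute-force check for subset sum (the function name and sample values are mine, for illustration only). It examines all 2^n − 1 non-empty subsets, so its running time grows exponentially with n; an efficient solution on any machine would be a major result.

```python
from itertools import combinations

def has_subset_sum(G, s):
    """Brute-force subset sum: try every non-empty subset K of G
    and report whether any of them sums to s."""
    G = list(G)
    for r in range(1, len(G) + 1):
        for K in combinations(G, r):
            if sum(K) == s:
                return True
    return False

# Example: 9 + (-2) == 7, but no subset of these integers sums to 20.
print(has_subset_sum({3, 9, -2, 5}, 7))    # True
print(has_subset_sum({3, 9, -2, 5}, 20))   # False
```

The exhaustive loop above is exactly the exponential blow-up that an NP-complete problem poses for conventional machines, and exactly what the memcomputing proposal claims to sidestep.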


Creator of EAC implementation passed away

Today I was informed that Jonathan Wayne Mills, the creator of an implementation of the Extended Analog Computer (EAC), passed away on Wednesday, January 27, 2016, at the age of 64, after a six-month fight against cancer. I am really saddened when I hear such tragic news.

Jonathan Wayne Mills


Thursday, 11 December 2014

Deep Neural Networks are Easily Fooled!

In an article that was recently posted to the arXiv, entitled Deep Neural Networks are Easily Fooled: High Confidence Predictions for Unrecognizable Images, the authors discuss how deep neural networks (DNNs) can be fooled when performing visual classification. In particular, they show how easy it is to produce images that are completely unrecognizable to humans, yet which DNNs believe are recognizable objects with 99.99% confidence...!
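The paper searches for such images with evolutionary algorithms; the toy sketch below conveys the flavour with simple hill climbing instead (my simplification, not the authors' method), and the "classifier" is a deliberately silly stand-in rather than a real DNN. Starting from pure noise, we keep any random pixel tweak that raises the classifier's confidence, without ever caring whether a human would recognize the result.

```python
import random

def toy_classifier(img):
    """A stand-in 'classifier' (hypothetical, not a real DNN):
    its confidence is simply the mean pixel value."""
    return sum(img) / len(img)

def hill_climb_fooling_image(confidence, size=64, steps=2000, seed=0):
    """Crude hill climbing in the spirit of the paper's evolutionary
    search: start from random noise and keep any random pixel tweak
    that does not lower the classifier's confidence."""
    rng = random.Random(seed)
    img = [rng.random() for _ in range(size)]
    best = confidence(img)
    for _ in range(steps):
        i = rng.randrange(size)
        old = img[i]
        img[i] = min(1.0, max(0.0, old + rng.uniform(-0.1, 0.1)))
        new = confidence(img)
        if new >= best:
            best = new          # keep the tweak
        else:
            img[i] = old        # undo the tweak
    return img, best

img, conf = hill_climb_fooling_image(toy_classifier)
print(round(conf, 2))  # confidence climbs far above the random-noise baseline
```

Against a real DNN the same loop (with the paper's far more capable evolutionary search) produces noise-like images classified as concrete objects with near-certain confidence.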

Saturday, 20 September 2014

Monday, 25 August 2014

Constructive Hypercomputation

Constructivists assert that one has to construct a mathematical object in order to show that it exists. Thus they reject hypercomputation. However, Rasoul Ramezanian notes correctly in A Hypercomputation in Brouwer's Constructivism that for Brouwer, who was the founder of the mathematical philosophy of intuitionism, something exists as long as there is a mental construction for it. Some constructivists do not accept that there are infinite objects at all. In fact, some assert that there are 2^1000 elementary particles in the Universe and so this is the largest number! To me such ideas are absurd. So Ramezanian concludes that intuitionism can co-exist with hypercomputation. Moreover, he presents his Persistent Evolutionary Turing Machines: such a machine is a pair N = (⟨z0, z1,…, zi⟩, f), where z0, z1,…, zi is a growing sequence of codes of deterministic Turing machines, and f (called the persistently evolutionary function) is a computable partial function from Σ* × Σ* to Σ*. Ramezanian demands that f have certain properties, and from there he goes on to explore the hypercomputational capabilities of this machine.

Implementing an Analog Recurrent Neural Network

A. Steven Younger, Emmett Redd, and Hava Siegelmann published a paper entitled Development of Physical Super-Turing Analog Hardware, where they report their efforts to build a real hypercomputer. In particular, they present their work on the realization of Analog Recurrent Neural Networks (ARNNs, for short). The theory of ARNNs is presented in Neural Networks and Analog Computation. In a nutshell, ARNNs are in general more powerful than Turing machines and so they are classified as hypercomputers. Younger et al. have designed and developed an OpticARNN, which is depicted in the figure that follows:
Also, they have developed an electronic ARNN whose functional schematic follows:
These systems have not been tested thoroughly, so one cannot draw definitive conclusions yet. The authors plan to build larger devices and continue their studies.
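For readers unfamiliar with the model: an ARNN in Siegelmann's theory updates its state as x(t+1) = σ(A·x(t) + B·u(t) + c), where σ is the saturated-linear activation and the weights are real numbers; it is the infinite precision of those real weights that lifts the model beyond Turing power. The sketch below (with hypothetical 2-neuron weights chosen purely for illustration) shows one such update rule; note that any floating-point or physical realization, like the hardware discussed above, necessarily truncates the weights.

```python
def sat(z):
    """Saturated-linear activation of the ARNN model:
    clips its argument to the interval [0, 1]."""
    return min(1.0, max(0.0, z))

def arnn_step(x, u, A, B, c):
    """One ARNN update x(t+1) = sat(A x(t) + B u(t) + c)
    for state vector x, input vector u, weight matrices A and B,
    and bias vector c."""
    n = len(x)
    return [
        sat(sum(A[i][j] * x[j] for j in range(n))
            + sum(B[i][k] * u[k] for k in range(len(u)))
            + c[i])
        for i in range(n)
    ]

# Tiny 2-neuron network with made-up weights, driven by a short input signal.
A = [[0.5, 0.0], [0.0, 0.5]]
B = [[1.0], [0.0]]
c = [0.0, 0.25]
x = [0.0, 0.0]
for u in ([1.0], [0.0], [0.0]):
    x = arnn_step(x, u, A, B, c)
print(x)  # [0.25, 0.4375]
```

In the super-Turing results the matrix entries are arbitrary reals; with rational weights the same dynamics is only Turing-equivalent, which is why physically approximating the analog model is the hard part of the hardware effort.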