Posts

Showing posts from 2011

Hypercomputation and Economics

For many years I have been wondering why economists usually fail to make any reliable predictions. And when one of them does make some reliable predictions, she is considered phenomenal! But if economics is to be considered a science, it should be able to make reliable predictions; otherwise it is completely worthless. First of all, economics is a social science. Thus, in order to make trustworthy predictions, one must ensure that an economic system is computable and, why not, deterministic. Economic systems consist of agents (ordinary people) who interact and create the world we see around us. But how can we be so sure that the behavior of these agents is computable? In fact we are not and, furthermore, we should not be! One of the basic ideas of hypercomputation is that the human mind, and therefore our agents, has computational, hypercomputational, and paracomputational capabilities (i.e., abilities that lie outside computation as we presently know it). In other words, eco…

The universe as a quantum computer

The other day I was skimming through Lee Smolin's The Trouble with Physics. On pages 317-318 one can read the following: "In the context of quantum gravity, it resulted in a new approach to quantum cosmology, made by Fotini Markopoulou and her collaborators. Markopoulou emphasized that describing the exchange of information between different subsystems is the same as describing the causal structure that limits which system can influence each other. She thus found that a universe can be described as a quantum computer, with a dynamically generated logic." With all due respect, the idea that the universe is a computer was put forth by Konrad Zuse in his Rechnender Raum. Furthermore, Zuse's ideas form, in a way, the basis of digital philosophy. Whether the universe is a computer or not is another discussion, which I have addressed briefly in an older post.

A brain simulator

SpiNNaker is a project to create a simulator of the human brain. The machine will consist of up to a million ARM cores hosting the simulation. But as Steve Furber of the University of Manchester has correctly noted, "[t]here are about 100 billion neurons with 1,000 trillion connections in the human brain. Even a machine with one million of the specialized ARM processor cores developed at Manchester would only allow modeling of about 1 percent of the human brain." Nevertheless, I am sure such a project will provide some scientific knowledge.
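Furber's 1 percent figure is easy to sanity-check with a back-of-envelope calculation. The sketch below assumes roughly 1,000 modelled neurons per ARM core, a figure I am supplying for illustration rather than taking from the article:

```python
# Back-of-envelope check of the SpiNNaker coverage estimate.
# The ~1,000 neurons per core is an assumed figure, not one quoted in the article.
human_neurons = 100e9          # ~100 billion neurons in the human brain
cores = 1_000_000              # up to a million ARM cores
neurons_per_core = 1_000       # assumed modelling capacity of a single core

modelled = cores * neurons_per_core            # about 1e9 neurons
coverage = 100 * modelled / human_neurons      # as a percentage of the whole brain

print(f"Neurons modelled: {modelled:.0e}")               # 1e+09
print(f"Coverage: {coverage:.0f}% of the human brain")   # about 1%
```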

New Worlds of Computation 2011

The second workshop New Worlds of Computation, which was organized by the Laboratoire d'Informatique Fondamentale d'Orléans, took place in Orléans from May 23 to May 24. A number of researchers, mostly from France, gathered and presented their work and ideas regarding computation. Françoise Chatelin talked about the necessity of using mathematical tools in computability theory that have been largely ignored until now. Obviously, such mathematical tools include fuzzy sets, quaternions, etc. Sama Goliaei talked about her work in optical computing. Mike Stannett talked about his joint work on cosmological computation (i.e., the exploitation of the properties of space-time to perform hard and "impossible" computations). Yaroslav D. Sergeyev presented his "numbering system of infinity" and its use in computation (a possibility that was mentioned in my book on hypercomputation). My talk was about vagueness and its use in computation. Unfortunately, some s…

Commercial Quantum Computer

D-Wave, a Canadian technology company, has announced that it has sold its first commercial quantum computer to Lockheed Martin Corporation. The intriguing thing about D-Wave's technology is that the company claims to use the technology that Tien D. Kieu used in his adiabatic quantum computing method, which is a hypercomputational method.

Building a brain?

Today Spiegel Online International posted an article entitled Researchers Hope to Build a Brain. The article discusses the efforts of the Blue Brain Project team (the article wrongly states that the team's name is Human Brain Project). The problem is that the article, as well as the people involved with this project, uses the terms simulation and building almost interchangeably, which is wrong. To build a brain means to actually construct something that will function as a brain, whereas to simulate one means that the team will write software that behaves similarly to a brain. I can imagine that such a simulation could be implemented in an object-oriented way (see the sketch below), where each neuron would be simulated by a very complex object. Obviously, all these objects would form a network. Now, how will they respond to external stimuli? Moreover, what will count as an external stimulus? All in all, even the simulation of the brain is a very ambitious project, and I don't think we are ready to implement…
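To make the object-oriented picture a little more concrete, here is a minimal sketch of what I have in mind. The class, the thresholding rule, and the way a stimulus is injected are all my own illustrative assumptions and have nothing to do with the Blue Brain Project's actual software:

```python
# Illustrative sketch only: each neuron is an object, and the objects form a network.
class Neuron:
    def __init__(self, threshold=1.0):
        self.threshold = threshold     # firing threshold
        self.potential = 0.0           # accumulated input
        self.connections = []          # list of (target Neuron, synaptic weight) pairs

    def connect(self, target, weight):
        self.connections.append((target, weight))

    def receive(self, signal):
        """Accumulate an incoming signal; fire and propagate if the threshold is reached."""
        self.potential += signal
        if self.potential >= self.threshold:
            self.potential = 0.0                   # reset after firing
            for target, weight in self.connections:
                target.receive(weight)             # propagate to downstream neurons

# A toy two-neuron "network": an external stimulus drives `a`, which in turn excites `b`.
a = Neuron(threshold=1.0)
b = Neuron(threshold=0.5)
a.connect(b, 0.6)
a.receive(1.2)        # the external stimulus: enough to make `a` fire into `b`
```

Even in this toy form, the hard questions remain exactly the ones raised above: what counts as an external stimulus, and how realistically each "very complex object" must model an actual neuron.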

Technological singularity

Technological singularity is described in a recent issue of Time magazine. I read the article because I was curious, but I was disappointed when I read the following paragraph: "Computers are getting faster. Everybody knows that. Also, computers are getting faster faster — that is, the rate at which they're getting faster is increasing. True? True. So if computers are getting so much faster, so incredibly fast, there might conceivably come a moment when they are capable of something comparable to human intelligence. Artificial intelligence. All that horsepower could be put in the service of emulating whatever it is our brains are doing when they create consciousness — not just doing arithmetic very quickly or composing piano music but also driving cars, writing books, making ethical decisions, appreciating fancy paintings, making witty observations at cocktail parties." The problem here is that intelligence is somehow equated with processing speed, when, for example, it…

Does Nature Compute?

Recently, I received an email about a new book entitled Randomness Through Computation: Some Answers, More Questions. Although I fail to see how computation could create randomness, the title of one article seemed particularly… intriguing: "What is Computation? (How) Does Nature Compute?" I haven't read the article, so I do not know what it is about, but I can speculate that it advocates the idea that the Universe, that Nature, somehow computes. Well, to me the idea that Nature computes is pure Pythagorean mysticism (also known as Pythagoreanism). In other words, it is my opinion that a chair, a wall, or even a black hole does not compute anything at all. I dare say that people who believe such things are suffering from schizophrenia! But why is this idea that computation is everywhere so appealing to many thinkers and researchers? It seems to me this happens because it is so fascinating to say that my cup of tea computes the trajectory of a spacecraft that is located million…