
Networks of probabilistic processors

Rais G. Bukharaev

Kazan State University

18, Kremlyovskaya Str., Kazan, 420008, Russia

Rais.Bukharaev@ksu.ru

A probabilistic automaton (PA) is a mathematical model of a probabilistic processor (PP): a device controlled by special commands that realizes a prescribed behavior of its output parameters, a behavior that is statistically stable and determined in advance. Probabilistic computations have numerous advantages over deterministic ones, especially when used in computing structures built from networks of PPs. The paper gives a brief comparison of the characteristics of deterministic and probabilistic automata. By means of PPs it is possible to design probabilistic machines that realize a required random behavior with great efficiency; probabilistic machines composed of networks of specialized or universal PPs are the most efficient. A great variety of constructive means can increase the speed and computing power of such machines by executing probabilistic algorithms an enormous number of times, providing very high efficiency of problem solving. A list of possible aids for accelerating computation and simulation is presented.

These properties of PPs and networks of PPs follow from the characteristics of the PA model. A survey of several approaches to the problem of PP synthesis is given.

Introduction

Among the uses of computing machinery for solving computational tasks or simulating complex systems, probabilistic algorithms occupy a special place. The method of statistical simulation is the first in the line of applications of probabilistic algorithms: having arisen practically simultaneously with computers as the so-called Monte Carlo method, it has gained wide circulation and development. At the heart of the method lies the principle of simulating some random variable x(T), chosen in such a manner that its expectation E(x) is the solution of the task T, or is simply related to it. Thus the approximate solution of the task T is the average of a great number of sample values of the random variable x(T). In this way it is possible to compute multivariate and functional integrals; to solve systems of linear algebraic equations of high order, and hence to implement the endless set of computational methods reduced to their solution; to solve integral equations and boundary-value problems for partial differential equations of various types; and so on. Many problems of physical simulation form another class of tasks, in which randomness manifests itself as a behavioral characteristic of the process, such as queuing problems or the simulation of a nuclear reactor. Finally, the computational process can be organized so that the probability of a random code is treated as its behavioral numerical characteristic; in other words, the random code serves in the process as an analogue carrier of numerical information.
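As a minimal sketch of this principle (my illustration; the function names and the integrand are assumptions, not taken from the paper), one can estimate an integral over [0, 1] by averaging samples of x(T) = f(U) with U uniform on [0, 1], since E(x) then equals the integral:

```python
import random

def mc_integral(f, n_samples=100_000):
    """Monte Carlo estimate of the integral of f over [0, 1].

    The random variable x = f(U), with U ~ Uniform(0, 1), has
    expectation E(x) equal to the integral, so the mean of many
    independent sample values approximates the solution.
    """
    total = 0.0
    for _ in range(n_samples):
        u = random.random()        # one sample of U
        total += f(u)              # one sample of x(T)
    return total / n_samples       # average of the sample values

# Illustrative integrand: 4 / (1 + u^2) integrates to pi over [0, 1].
print(mc_integral(lambda u: 4.0 / (1.0 + u * u)))
```

The statistical error of such an estimate shrinks only as one over the square root of the number of samples; this redundancy of repeated trials is exactly what the discussion below turns to advantage.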

In rating the comparative efficiency of probabilistic algorithms one can make a number of simple observations which nevertheless clarify the problem. The obvious advantages of probabilistic (statistical) methods of computation include:

•        Their small connectedness, that is, the small volume of RAM needed to store intermediate results. This essentially extends the possibility of complicating the computations performed (the sketch following this list illustrates the point);

•        The simplicity and very high degree of recurrence of the computing (or simulating) algorithms, which essentially raises the efficiency of hardware utilization in comparison with the deterministic case. This contributes further to the possibility of complicating the tasks solved, and to the simplification of programming and debugging;

•        The high reliability of the computing (or simulating) circuit, since many categories of failures have no effect on the productivity of the algorithm. This new sort of reliability depends only very weakly on the complexity of the scheme, which in total makes it possible to complicate a scheme practically without limit while preserving operational reliability.
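The first of these points, low connectedness, admits a one-screen sketch (my illustration, with assumed names): a statistical estimate can be maintained in constant memory, a count and a running mean, no matter how many samples are processed:

```python
import random

class RunningMean:
    """Constant-memory accumulator for a statistical estimate.

    Only the sample count and the current mean are stored, so RAM
    usage does not grow with the number of samples drawn: the
    "small connectedness" of statistical computation.
    """
    def __init__(self):
        self.n = 0
        self.mean = 0.0

    def add(self, x):
        self.n += 1
        self.mean += (x - self.mean) / self.n   # incremental mean update

# Samples are consumed one at a time; no intermediate results are kept.
acc = RunningMean()
for _ in range(100_000):
    acc.add(random.random() ** 2)   # samples of U^2, expectation 1/3
print(acc.mean)
```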

So the more difficult the problem to be solved, the more effective the application of probabilistic methods to its solution. This circumstance is not accidental. According to a hypothesis of A. N. Kolmogorov, randomness can in a certain sense be regarded as a manifestation of complexity. From this point of view, when a problem does not admit an analytical solution (or its deterministic description is "too difficult"), the probabilistic outcome can be viewed as a randomness "patch" substituting for the unavailable analytical solution. In the aspect considered here this means that as the complexity of problems grows, the sphere of applicability of probabilistic algorithms will widen, displacing deterministic methodologies more and more.

Despite the seeming constructive elusiveness of randomness, it can be used effectively for positive purposes. Moreover, the presence of a large number of degrees of freedom in the behavior of random objects gives a number of advantages that are very important in the simulation of complicated systems. The possibilities of unobstructed parallel and pipelined data processing in probabilistic computations, and of the free composition of probabilistic computers into networks of PAs and PPs, prove so considerable that they outweigh their drawbacks many times over. One can see, then, that the probabilistic approach to computation permits overcoming several drawbacks intrinsic to deterministic computers, namely:

1. Poor operational reliability. As the complexity of a deterministic device grows, its reliability inevitably decreases. The basic quality of probability is that a "failure", a serious and irremovable interference in deterministic computation, becomes an accounted-for constructive element of the device in the case of probabilistic computation. As a corollary, the outcome of a probabilistic computation has high and adjustable reliability.

On the basis of probabilistic devices it is possible to design arbitrarily complicated devices without loss of reliability. This principle operates in nature in the course of evolution.
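A hedged sketch of this graceful degradation (the failure model below is my assumption, not the paper's): if every sample can be lost with some probability, as when a unit of the network fails, the surviving samples still yield an unbiased estimate, only with slightly larger variance:

```python
import random

def estimate_with_failures(f, n_samples, p_fail):
    """Monte Carlo estimate in which each sample is lost with probability p_fail.

    A lost sample models a failed processor: it is simply not counted.
    The surviving samples remain independent and identically distributed,
    so the estimate stays unbiased; only its variance grows as samples
    are lost.
    """
    total, survived = 0.0, 0
    for _ in range(n_samples):
        if random.random() < p_fail:
            continue                         # this sample's unit "failed"
        total += f(random.random())
        survived += 1
    return total / survived if survived else float("nan")

f = lambda u: 4.0 / (1.0 + u * u)            # integrates to pi over [0, 1]
print(estimate_with_failures(f, 100_000, 0.0))   # no failures
print(estimate_with_failures(f, 100_000, 0.3))   # 30% of units fail
```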

2. The difficulty of parallelizing deterministic algorithms of a general type. As a corollary, as the complexity of an algorithm grows, the duration of the computation increases while its speed and reliability decrease. Despite the growing speed of modern deterministic processors, the situation in general remains deceptive, and there are limits to their complexity, reliability, and speed. Probabilistic algorithms, on the contrary, lend themselves very easily to parallelization, and this circumstance creates the premises for building arbitrarily composite and arbitrarily reliable devices.
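A minimal sketch of how easily such algorithms parallelize (the worker count and names here are assumptions): independent workers each compute a partial estimate from their own random stream, and the partial means are simply averaged:

```python
import random
from concurrent.futures import ProcessPoolExecutor

def partial_mean(args):
    """One worker's independent Monte Carlo estimate."""
    seed, n = args
    rng = random.Random(seed)       # independent random stream per worker
    return sum(4.0 / (1.0 + rng.random() ** 2) for _ in range(n)) / n

if __name__ == "__main__":
    n_workers, n_each = 8, 25_000
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        parts = list(pool.map(partial_mean,
                              [(s, n_each) for s in range(n_workers)]))
    # All workers draw equal sample counts, so the overall estimate
    # is simply the mean of the partial means; no coordination needed.
    print(sum(parts) / n_workers)
```

No worker depends on another's intermediate state, which is the sense in which the algorithm is trivially parallel.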

3. Poor efficiency in the use of redundancy in deterministic computations.

Effective utilization of redundancy in probabilistic computation is the foundation of all its positive qualities. This principle operates in nature as well. A probabilistic computer, a network of universal and specialized probabilistic processors, would be free of the drawbacks enumerated above. It would be a hybrid computer in the sense that data circulate in its channels as probabilities of random signals, while the machine design itself is deterministic. Such a computer is flexible in operation and can work as a particular specialized computer according to a program assigned to it in two forms: as a preliminary adjustment of its memory and hardware structure, and as an execution algorithm realized in the course of the solution. The solution process consists in bringing the memory to a state satisfying particular criteria (self-controlled random search). Probabilistic computers of the described type do not yet exist; there are only isolated experiments in the instrumental implementation of probabilistic algorithms. The theory of probabilistic automata, which forms the foundation for the design of probabilistic processors and their networks, is by now practically complete. A considerable number of projects for specialized probabilistic computers exist; in particular, there is a project for a universal probabilistic processor. At Kazan State University a complex of specifications for building such a probabilistic processor, together with its command system, has been elaborated; the practical implementation of these specifications, however, demands more concentrated effort than has been applied up to now. One purpose of this report is precisely to attract attention to this problem.
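As a hedged sketch of what "data circulating as probabilities of random signals" can mean, in the spirit of stochastic computing (the encoding below is my illustration; the paper does not fix one), a number p in [0, 1] is carried by a bit stream whose bits equal 1 with probability p, and a single AND gate per bit then multiplies two such numbers:

```python
import random

def to_stream(p, n_bits, rng):
    """Encode p in [0, 1] as a bit stream with P(bit = 1) = p."""
    return [1 if rng.random() < p else 0 for _ in range(n_bits)]

def from_stream(bits):
    """Decode: the carried value is the fraction of 1s in the stream."""
    return sum(bits) / len(bits)

rng = random.Random(42)
a, b = 0.6, 0.5
sa = to_stream(a, 100_000, rng)
sb = to_stream(b, 100_000, rng)
# Bitwise AND of independent streams: P(1 AND 1) = a * b, so one
# gate per bit performs a multiplication on the carried values.
product = [x & y for x, y in zip(sa, sb)]
print(from_stream(product))   # close to 0.3 = a * b
```

The redundancy of the long bit stream is what buys the reliability: flipping a few bits perturbs the carried value only slightly, instead of corrupting it outright as a bit flip in a binary word would.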

Modern life is characterized by a steady increase in the complexity of the problems that arise. In studying the behavior of super-complicated systems, methods of probabilistic simulation will come to the foreground more and more, as a counterweight to analytical methods. The orientation toward simulation rather than computed solutions also has a philosophical aspect: nature does not solve any of its tasks by calculation. In particular, living systems solve their problems by simulation, by means of directed random search based on an evaluation of predicted outcomes.

The second philosophical argument is that randomness lies at the foundation of evolution, regarded as a process of growing complexity, while determinism exhibits itself only as an average value. From this point of view the development of the theory of probabilistic computer design is not only a promising business but an urgently necessary matter. The theoretical basis for such designs has been fully elaborated, but the transition to technological interest is braked by the victorious procession of modern computers. It is necessary, nevertheless, to realize that movement in the direction of probabilistic computers and their networks is unavoidable.



