A theorem giving an a priori estimate for the solution of the Tricomi problem for the Lavrentiev-Bitsadze equation is proved. In particular, the uniqueness of a regular solution of the problem under investigation follows from this theorem.
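For context, a brief hedged sketch of the setting (the abstract itself fixes no notation): the Lavrentiev-Bitsadze equation is the model mixed-type equation

```latex
u_{xx} + \operatorname{sgn}(y)\, u_{yy} = 0 ,
```

which is elliptic in the half-plane $y>0$ and hyperbolic in $y<0$. An a priori estimate of the form $\|u\| \le C\,\|\varphi\|$ in suitable norms yields uniqueness at once: the difference of two regular solutions satisfies homogeneous boundary data, so the estimate forces it to vanish.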
In this work we construct a fundamental solution of a second-order continuous differential equation with regularized derivatives of segment order and obtain an explicit solution of the Cauchy problem in terms of this fundamental solution.
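As context for the terminology (an assumption, since the abstract does not define it): in this literature a "regularized" fractional derivative commonly means the Caputo derivative,

```latex
\partial_{0t}^{\alpha} u(t) \;=\; \frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} \frac{u'(s)}{(t-s)^{\alpha}}\, ds , \qquad 0 < \alpha < 1 ,
```

and "segment order" suggests that the order $\alpha$ is not a single number but ranges over a segment of values.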
In this paper we propose a method for representing many-valued logic functions in a logical neural network. The network preserves the set of cause-and-effect relationships identified by means of many-valued logic functions within a given subject area. It thus becomes possible to carry over a logical algorithm for detecting hidden patterns in the given area to the case when the values of the logical variables are not well defined and are fuzzy values between zero and one. The logic operations are implemented by special logical neural cells: conjunctors and disjunctors.
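The abstract does not define the conjunctor and disjunctor cells; a minimal sketch, assuming the common Zadeh fuzzy operators (min for conjunction, max for disjunction), which reduce to Boolean AND/OR on crisp 0/1 inputs:

```python
# Hedged sketch: the abstract does not specify the cell operations;
# Zadeh's min/max fuzzy operators are assumed here for illustration.

def conjunctor(inputs):
    """Fuzzy AND cell: returns the minimum of its inputs in [0, 1]."""
    return min(inputs)

def disjunctor(inputs):
    """Fuzzy OR cell: returns the maximum of its inputs in [0, 1]."""
    return max(inputs)

# A two-layer logical network computing (a AND b) OR (c AND d).
def network(a, b, c, d):
    return disjunctor([conjunctor([a, b]), conjunctor([c, d])])

# Crisp inputs reproduce Boolean logic; fuzzy inputs interpolate.
print(network(1.0, 1.0, 0.0, 0.0))  # 1.0
print(network(0.7, 0.4, 0.2, 0.9))  # 0.4
```

On crisp inputs the network behaves exactly like the corresponding Boolean formula, so a logical algorithm built from many-valued functions transfers unchanged to the fuzzy case.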
A method of constructing an optimal cognitive map consists in optimizing the input data and the dimension of the data structure of the cognitive map. The optimization problem arises when the amount of input data is large. Optimization of the data dimension is performed by clustering the input data; the hierarchical agglomerative method is used to form the clusters. Cluster analysis makes it possible to divide a data set into a finite number of homogeneous groups. Optimization of the structure of the cognitive map consists in automatic tuning of the weights of the mutual influence of its concepts by machine learning methods, in particular by training a neural network.
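The abstract names hierarchical agglomerative clustering but gives no details; a minimal average-linkage implementation on toy one-dimensional data (all data values below are illustrative, not from the paper):

```python
# Hedged sketch: minimal hierarchical agglomerative clustering
# (average linkage) used to split input data into homogeneous groups.

def agglomerate(points, k):
    """Merge singleton clusters bottom-up until k clusters remain."""
    clusters = [[p] for p in points]

    def dist(a, b):  # average-linkage distance between two clusters
        return sum(abs(x - y) for x in a for y in b) / (len(a) * len(b))

    while len(clusters) > k:
        # find the closest pair of clusters and merge it
        i, j = min(
            ((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
            key=lambda ij: dist(clusters[ij[0]], clusters[ij[1]]),
        )
        clusters[i] += clusters.pop(j)
    return clusters

# Toy "input data": three loose groups of indicator values; each
# resulting homogeneous group would become one concept of the map.
data = [0.1, 0.2, 0.15, 5.0, 5.1, 4.9, 9.8, 10.0, 10.2]
groups = [sorted(c) for c in agglomerate(data, 3)]
print(sorted(groups))  # three homogeneous groups
```

Reducing the data to a few homogeneous groups in this way bounds the number of concepts, which is exactly the dimension reduction the abstract attributes to cluster analysis.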
This paper considers a logical approach to the theoretical justification of constructing correct algorithms that extend the domain of the solutions obtained on the basis of existing algorithms. The proposed method makes it possible, on the basis of a given set of pattern recognition algorithms, to identify additional knowledge about a given subject area and to construct a minimal rule that provides additional training of these algorithms.
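The abstract does not specify the construction; a minimal hedged sketch of the general idea is a "corrector" over a set of base recognition algorithms: a rule learned on the tuple of their outputs, which can be correct even where every individual algorithm errs on part of the data (the data and the majority-vote rule below are illustrative only):

```python
# Hedged sketch: learn a lookup rule over the joint outputs of several
# base algorithms; the rule can extend the set of correctly solved
# objects beyond what any single base algorithm achieves.
from collections import Counter, defaultdict

def fit_corrector(base_outputs, labels):
    """base_outputs: one tuple of base predictions per object;
    labels: true classes. Returns a rule mapping tuple -> class."""
    votes = defaultdict(Counter)
    for pattern, label in zip(base_outputs, labels):
        votes[pattern][label] += 1
    # for each observed output pattern, keep the majority class
    return {p: c.most_common(1)[0][0] for p, c in votes.items()}

# Two base algorithms, each wrong on some of the four objects:
outs   = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [1, 1, 0, 0]
rule = fit_corrector(outs, labels)
print([rule[p] for p in outs])  # [1, 1, 0, 0] -- all four objects correct
```

Here algorithm 1 (first component) is wrong on every object and algorithm 2 is wrong on two of them, yet the learned rule over their joint outputs classifies all four objects correctly.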
The paper proposes a new scheme for the gradient solution of the problem of minimizing averaged losses. It is an analogue of the scheme used in the SAG algorithm for the case when the risk is computed via the arithmetic mean. An illustrative example of constructing a robust classifier based on maximization of the surrogate median margin is given.
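The abstract gives no formulas; one hedged reading of a median-based analogue is a subgradient scheme in which each step descends the loss of the sample currently at the median of the per-sample losses (the data, logistic surrogate loss, and step size below are illustrative assumptions, not the paper's method):

```python
# Hedged sketch: subgradient descent on the median of per-sample
# logistic losses for a linear classifier on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
n, d = 101, 2
X = rng.normal(size=(n, d))
y = np.sign(X @ np.array([1.0, -2.0]) + 0.1 * rng.normal(size=n))

def logistic_loss(w):
    return np.log1p(np.exp(-y * (X @ w)))        # per-sample losses

def grad_i(w, i):
    m = y[i] * (X[i] @ w)
    return -y[i] * X[i] / (1.0 + np.exp(m))      # gradient of sample i

w = np.zeros(d)
for t in range(500):
    losses = logistic_loss(w)
    i_med = np.argsort(losses)[n // 2]           # sample at the median loss
    w -= 0.5 * grad_i(w, i_med)                  # descend the median term

print(float(np.median(logistic_loss(w))))        # median loss after training
```

Because the median ignores the largest losses, outlying or mislabeled samples do not dominate the updates, which is the robustness property the abstract appeals to.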