This paper introduces a knowledge recognition algorithm (KRA) that is both a Turing machine algorithm and an oracle Turing machine algorithm. By definition, KRA is a non-deterministic language recognition algorithm; at the same time, it can be implemented as a deterministic Turing machine algorithm. KRA applies mirrored perceptual-conceptual languages to learn member-class relations between the two languages iteratively and to retrieve information through deductive and reductive recognition from one language to the other. The novelty of KRA is that the conventional concept of relation is adjusted, so that the computation becomes efficient bidirectional string mapping.
Deep Dive into Knowledge Recognition Algorithm enables P = NP.
The knowledge recognition (also called relation recognition) algorithm (KRA) was originally designed by Han (Han08) to simulate the mirrored language structure of the human brain. The human brain contains a mirrored perceptual-conceptual language structure that stores member-class relations between the two languages as knowledge. That is, KRA has two levels of language, in which the perceptual language L_p supplies the members of the conceptual language L_c, and the conceptual language serves as the class of the perceptual. Based on this continuous iterative structure, four "innate" logic functions exist, defined by four axioms:

Sensation: An innate one-to-one correspondence L_p ∋ p = c ∈ L_c exists between the perceptual language L_p and the conceptual language L_c. The existence of sensation can equivalently be presented as the "diagonal" set {(p, c) | p = c} (see Fig. 1).

Reduction: A membership recognition function exists for recognizing relations from the conceptual language L_c to the perceptual language L_p, defined as follows: Suppose that L_k is a
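The Sensation axiom above can be illustrated directly. The following is a minimal sketch, assuming a toy shared vocabulary of my own invention; only the diagonal-set construction {(p, c) | p = c} comes from the text.

```python
# Hedged illustration of the Sensation axiom: the one-to-one
# correspondence L_p ∋ p = c ∈ L_c, realized as the "diagonal" set
# {(p, c) | p = c}. The sample strings are illustrative assumptions.

L_p = {"red", "round", "soft"}   # toy perceptual language
L_c = set(L_p)                   # mirrored conceptual language

# The diagonal set: every perceptual string corresponds to the
# identically spelled conceptual string.
diagonal = {(p, c) for p in L_p for c in L_c if p == c}

print(sorted(diagonal))
```

Because the two languages are mirrored, the diagonal contains exactly one pair per string, which is what makes the correspondence one-to-one.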
The notion of string mapping is to map member-class relations: deduction is the mapping from the perceptual to the conceptual language, denoted L_p ≥ L_c, and reduction is the mapping from the conceptual to the perceptual language, denoted L_p ≤ L_c.
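The paper's claim that computation becomes "efficient bidirectional string mapping" can be sketched with a pair of hash maps, one per direction. The data-structure choice and the names `learn`, `deduce`, and `reduce_` are my own assumptions, not the paper's implementation.

```python
# A minimal sketch, assuming the two mapping directions can be held
# as a pair of dictionaries. All function names are illustrative.

member_of = {}  # deduction direction: perceptual string -> its classes
members = {}    # reduction direction: conceptual string -> its members

def learn(p, c):
    # Record one member-class relation in both directions at once.
    member_of.setdefault(p, set()).add(c)
    members.setdefault(c, set()).add(p)

def deduce(p):
    # Deduction: map a perceptual string to its classes (L_p to L_c).
    return member_of.get(p, set())

def reduce_(c):
    # Reduction: map a conceptual string to its members (L_c to L_p).
    return members.get(c, set())

learn("tabby", "cat")
learn("siamese", "cat")
print(deduce("tabby"))   # {'cat'}
print(reduce_("cat"))    # {'tabby', 'siamese'}
```

Under this representation, each lookup in either direction is a single expected-constant-time hash access, which is one way to read the "efficient bidirectional" claim.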
Fig. 1. Iterations of member-class relations between perceptual and conceptual languages.

2. KRA enables P = NP

We denote by t_M(w) the number of steps in the computation of M on input w. We denote by T_M(n) the worst-case run time of M; that is, T_M(n) = max{t_M(w) : w ∈ Σ^n}.
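The two quantities t_M(w) and T_M(n) can be made concrete with a toy machine. The machine below (one step per scanned symbol plus a halting step) is an assumption for illustration; only the definitions t_M(w) and T_M(n) = max over inputs of length n come from the text.

```python
# Hedged sketch of the step-count definitions. The "machine" here is
# a toy that scans its input once; it is not KRA itself.
from itertools import product

def t_M(w):
    # Steps of the toy machine M on input w: one per symbol, plus one
    # halting step.
    return len(w) + 1

def T_M(n, alphabet="01"):
    # Worst-case run time: the maximum of t_M(w) over all inputs w of
    # length n drawn from the alphabet.
    return max(t_M("".join(w)) for w in product(alphabet, repeat=n))

print(T_M(3))  # 4
```

For this toy machine every length-n input takes the same time, so the maximum is n + 1; for a real machine T_M(n) is dominated by the slowest input of each length.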
It is easy to see that KRA can answer membership and class-relation questions of the form L_p ∋ p ∈ c ∈ L_c correctly in polynomial time.
Theorem 1. L_k is both a P and an NP language (P = L_k = NP) iff L_k is over Σ_k, k = p, c, such that, for every y_0 ∈ Δ*, the set {⟨y_0, y⟩ | ⟨y_0, y⟩ ∈ τ} has fewer than k_A elements, where k_A is a constant. The computation of A on input x ∈ D is a sequence y_1, y_2, …, which ends with y_K, such that y_1 = E(x), ⟨y_i, y_{i+1}⟩ ∈ τ for all i, and y_K ∈ {ACCEPT, REJECT}. [Coo00, Kar72]
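The cited definition of a nondeterministic computation, a next-configuration relation τ with a constant bound k_A on branching, acceptance iff some sequence from y_1 = E(x) reaches ACCEPT, can be sketched as follows. The particular toy language (strings containing a '1') and all function names are my own assumptions.

```python
# Hedged sketch of the bounded-branching relation tau from the cited
# definition. The toy tau recognizes strings that contain a '1'.

K_A = 3  # constant: each configuration has fewer than K_A successors

def tau(y):
    # Next-step relation: at a leading '1' the machine may
    # nondeterministically accept or keep scanning; otherwise it scans.
    if y == "":
        return ["REJECT"]
    if y[0] == "1":
        return ["ACCEPT", y[1:]]
    return [y[1:]]

def accepts(x, E=lambda s: s):
    # x is accepted iff SOME sequence y_1 = E(x), <y_i, y_{i+1}> in tau
    # ends in ACCEPT (the existential reading of nondeterminism).
    frontier = [E(x)]
    while frontier:
        y = frontier.pop()
        if y == "ACCEPT":
            return True
        if y == "REJECT":
            continue
        succs = tau(y)
        assert len(succs) < K_A  # the bounded-branching condition
        frontier.extend(succs)
    return False

print(accepts("0010"), accepts("000"))  # True False
```

The search enumerates all computation sequences, so it takes exponential time in the worst case; the point of the bounded-branching condition is only that each configuration offers a constant number of choices, not that the search is fast.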