Using a generalization of forward elimination, it is proved that functions $f_1,...,f_n:X\to\mathbb{A}$, where $\mathbb{A}$ is a field, are linearly independent if and only if there exists a nonsingular matrix $[f_i(x_j)]$ of size $n$, where $x_1,...,x_n\in X$.
Criterion for linear independence of functions
Suppose we are dealing with a separable kernel (see, e.g., [1], p. 4) $K : [a, b] \times [a, b] \to \mathbb{R}$ of an integral operator, i.e.
$$K(t, s) = \sum_{i=1}^{n} T_i(t) S_i(s), \qquad (1)$$
where $T_1, \dots, T_n, S_1, \dots, S_n : [a, b] \to \mathbb{R}$.
Suppose $K \not\equiv 0$. Then we may assume that each of the systems $\{T_1, \dots, T_n\}$, $\{S_1, \dots, S_n\}$ is linearly independent. Indeed, starting from an expression of kind (1), we successively reduce the number of items as long as necessary.
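For instance, here is one such reduction step (the coefficients $c_i$ are hypothetical): if $T_n = \sum_{i=1}^{n-1} c_i T_i$, then substituting into (1) and regrouping gives

```latex
% One reduction step: if T_n depends linearly on T_1, ..., T_{n-1},
% substitute T_n = \sum_{i<n} c_i T_i into (1) and regroup:
K(t, s) = \sum_{i=1}^{n-1} T_i(t) S_i(s) + \Bigl(\sum_{i=1}^{n-1} c_i T_i(t)\Bigr) S_n(s)
        = \sum_{i=1}^{n-1} T_i(t) \bigl(S_i(s) + c_i S_n(s)\bigr),
```

a representation of kind (1) with $n - 1$ items; repeating this for either system until no dependence remains yields linearly independent systems.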
Assume that we need to express the functions (2) in terms of $K$. To do this, we find points (a proof of existence and a way of finding them will follow)
$$t_1, \dots, t_n, s_1, \dots, s_n \in [a, b] \qquad (3)$$
such that the square matrices $T = [T_j(t_i)]$, $S = [S_j(s_i)]$ of size $n$ are nonsingular, write out the identities
$$K(t_i, s) = \sum_{j=1}^{n} T_j(t_i) S_j(s), \qquad K(t, s_i) = \sum_{j=1}^{n} T_j(t) S_j(s_i), \qquad i = 1, \dots, n,$$
and obtain the desired expressions
$$[S_i(s)] = T^{-1}[K(t_i, s)], \qquad [T_i(t)] = S^{-1}[K(t, s_i)]. \qquad (4)$$
The formulas (4) allow one, for example, to prove smoothness of the functions (2) when $K$ is smooth.
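The recovery (4) can be sketched numerically. In this sketch the factor functions, the points $t_i$, and the interval are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# A separable kernel K(t, s) = sum_i T_i(t) * S_i(s) on [0, 1],
# with hypothetical factor functions chosen for illustration.
T_funcs = [lambda t: 1.0, lambda t: t, lambda t: t ** 2]          # T_1, T_2, T_3
S_funcs = [lambda s: np.cos(s), lambda s: np.sin(s), lambda s: np.exp(s)]

def K(t, s):
    return sum(Ti(t) * Si(s) for Ti, Si in zip(T_funcs, S_funcs))

# Points t_1, ..., t_n at which T = [T_j(t_i)] is nonsingular
# (here a Vandermonde-type matrix for 1, t, t^2).
t_pts = np.array([0.0, 0.5, 1.0])
T_mat = np.array([[Tj(ti) for Tj in T_funcs] for ti in t_pts])

# Formula (4): [S_j(s)] = T^{-1} [K(t_i, s)], i.e. each S_j is a linear
# combination of the kernel sections s -> K(t_i, s).
def S_recovered(s):
    k_col = np.array([K(ti, s) for ti in t_pts])
    return np.linalg.solve(T_mat, k_col)

s = 0.7
print(np.allclose(S_recovered(s), [Si(s) for Si in S_funcs]))  # True
```

The same computation with the roles of $T$ and $S$ exchanged recovers the $T_i$ from the sections $t \mapsto K(t, s_i)$.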
The existence of such points (3) seems doubtless: if we considered the set $\{1, \dots, m\}$ instead of $[a, b]$, the matrices $[T_j(i)]$ and $[S_j(i)]$ of size $m \times n$ would be of full rank (see [2]) and would correspondingly have nonsingular submatrices $T = [T_j(t_i)]$ and $S = [S_j(s_i)]$ of size $n$. But what about a strict proof?
Let $X$ be a nonempty set, let $\mathbb{A}$ be a field, let $F$ be the linear space of functions $X \to \mathbb{A}$, and let $F_{m,n}$, $\mathbb{A}_{m,n}$ denote the spaces of $m \times n$ matrices with entries in $F$ and $\mathbb{A}$ respectively.

Lemma 1. Let the entries of the column $f = [f_i] \in F_{n,1}$ be linearly independent functions, let the matrix $A \in \mathbb{A}_{n,n}$ be nonsingular and let $g = [g_i] = Af$. Then the entries of $g$ are linearly independent functions.

Proof. Suppose $\beta_1, \dots, \beta_n \in \mathbb{A}$ and $\sum_{i=1}^{n} \beta_i g_i = 0$. We are going to prove that
$$\beta_1 = \dots = \beta_n = 0. \qquad (5)$$
Indeed, let's denote the row $[\beta_j] \in \mathbb{A}_{1,n}$ by $\beta^T$ and multiply the equality $g = Af$ by $\beta^T$ from the left. We have $0 = (\beta^T A)f$. Note that $\beta^T A \in \mathbb{A}_{1,n}$ is a row. Since the entries of $f$ are linearly independent functions, $\beta^T A = 0$. The matrix $A$ is nonsingular, therefore the equalities (5) hold.
Further, for $x = (x_1, \dots, x_n) \in X^n$ and $f = [f_i] \in F_{n,1}$, let $f(x)$ denote the matrix $[f_i(x_j)] \in \mathbb{A}_{n,n}$. Then
$$(Af)(x) = A \cdot f(x) \qquad (6)$$
for any $A \in \mathbb{A}_{n,n}$.
Lemma 2. Let the matrices $A, (Af)(x) \in \mathbb{A}_{n,n}$ be nonsingular. Then the matrix $f(x)$ is nonsingular.
Proof. It follows from the formula (6): $f(x) = A^{-1} \cdot (Af)(x)$ is a product of nonsingular matrices.
Further, given a column $f = [f_i] \in F_{n,1}$ of linearly independent functions, we will find such a vector $x \in X^n$ and such a matrix $A \in \mathbb{A}_{n,n}$ of kind (7) that the matrix $(Af)(x)$ is of kind (8).
By the nonsingularity of the matrices $A, (Af)(x) \in \mathbb{A}_{n,n}$ and Lemma 2, the matrix $f(x)$ will then be nonsingular.
Theorem 1. Let the entries of the column $f = [f_i] \in F_{n,1}$ be linearly independent functions. Then there exist such a vector $x \in X^n$ and such a matrix $A \in \mathbb{A}_{n,n}$ of kind (7) that the matrix $(Af)(x)$ is of kind (8).
Proof. Let's use mathematical induction on $n$.
Let $n = 1$. Then there exists such $x_1 \in X$ that $f_1(x_1) \ne 0$, because otherwise $f_1 = 0$ and hence the system $\{f_1\}$ is linearly dependent.
Let $n > 1$. As in the case $n = 1$, we find such $x_1 \in X$ that $f_1(x_1) \ne 0$. Let $M \in \mathbb{A}_{n,n}$ be the matrix of kind (7) that subtracts from the $i$-th row ($i = 2, \dots, n$) the first row multiplied by $f_i(x_1)/f_1(x_1)$, and let $g = Mf$. By the nonsingularity of the matrix $M$ and Lemma 1, $g$ is a column of linearly independent functions. Also $g_1 = f_1$ and $g_i(x_1) = 0$ for $i = 2, \dots, n$.

Let's consider the block partition $g = \begin{bmatrix} f_1 \\ \hat g \end{bmatrix}$, where $\hat g \in F_{n-1,1}$. Since any subsystem of a linearly independent system is itself linearly independent, the entries of $\hat g$ are linearly independent functions. Moreover,
$$\hat g(x_1) = 0. \qquad (9)$$
Let's find such a vector $\hat x = (\hat x_1, \dots, \hat x_{n-1}) \in X^{n-1}$ and such a matrix $\hat B \in \mathbb{A}_{n-1,n-1}$ of kind (7) that the matrix $(\hat B \hat g)(\hat x)$ is of kind (8); this is possible by the induction hypothesis. Put
$$B = \begin{bmatrix} 1 & 0 \\ 0 & \hat B \end{bmatrix} \in \mathbb{A}_{n,n}, \qquad x = (x_1, \hat x_1, \dots, \hat x_{n-1}) \in X^n.$$
Then the matrix
$$(Bg)(x) = \begin{bmatrix} f_1(x_1) & \ast \\ 0 & (\hat B \hat g)(\hat x) \end{bmatrix}$$
is of kind (8), because $f_1(x_1) \ne 0$, $(\hat B \hat g)(x_1) = \hat B \cdot \hat g(x_1) = 0$ by (6) and (9), and $(\hat B \hat g)(\hat x)$ is of kind (8).
Let $A = BM$. Then $A \in \mathbb{A}_{n,n}$, $A$ is of kind (7) (as a product of matrices of such kind) and $(Af)(x) = (BMf)(x) = (Bg)(x)$ is of kind (8).
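The induction in Theorem 1 is effectively a forward-elimination algorithm, and it can be sketched over a finite sample of $X$ (the sampling, the tolerance, and the function choices here are assumptions of the sketch, not part of the proof):

```python
import numpy as np

def find_points(funcs, domain, tol=1e-12):
    """Sketch of Theorem 1's induction: pick x_1 with f_1(x_1) != 0,
    eliminate f_1 from the remaining functions (the matrix M of the
    proof), and recurse on the eliminated column."""
    if not funcs:
        return []
    f1 = funcs[0]
    # A point where f_1 does not vanish exists on `domain` provided
    # f_1 is not identically zero there.
    x1 = next(x for x in domain if abs(f1(x)) > tol)
    # g_i = f_i - (f_i(x_1)/f_1(x_1)) * f_1 vanishes at x_1 for i >= 2,
    # mirroring g = Mf with M of kind (7).
    rest = [(lambda f, a: (lambda t: f(t) - a * f1(t)))(f, f(x1) / f1(x1))
            for f in funcs[1:]]
    return [x1] + find_points(rest, domain, tol)

# Illustrative functions 1, t, t^2 sampled on [0, 1].
funcs = [lambda t: 1.0, lambda t: t, lambda t: t ** 2]
xs = find_points(funcs, np.linspace(0.0, 1.0, 11))
matrix = np.array([[f(x) for x in xs] for f in funcs])  # [f_i(x_j)]
print(abs(np.linalg.det(matrix)) > 1e-9)  # True: the points witness independence
```

Note the double lambda binds each coefficient at creation time; a plain closure over the loop variable would capture the wrong function.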
Theorem 2 (criterion for linear independence of functions). The functions $f_1, \dots, f_n : X \to \mathbb{A}$ are linearly independent if and only if there exists such $(x_1, \dots, x_n) \in X^n$ that the matrix $[f_i(x_j)] \in \mathbb{A}_{n,n}$ is nonsingular.
Proof. Suppose that the entries of the column $f = [f_i] \in F_{n,1}$ are linearly independent functions. Then, by Theorem 1 and Lemma 2, there exists such $x \in X^n$ that $f(x)$ is nonsingular. Now let $x \in X^n$ and let the matrix $[f_i(x_j)] \in \mathbb{A}_{n,n}$ be nonsingular. Assume that $\alpha_1, \dots, \alpha_n \in \mathbb{A}$ and $\sum_{i=1}^{n} \alpha_i f_i = 0$. In particular, we have
$$\sum_{i=1}^{n} \alpha_i f_i(x_j) = 0, \qquad j = 1, \dots, n. \qquad (10)$$
Considering (10) as a nondegenerate system of linear algebraic equations in the unknowns $\alpha_1, \dots, \alpha_n$, we conclude that $\alpha_1 = \dots = \alpha_n = 0$. Thus the functions $f_1, \dots, f_n$ are linearly independent.

Example 1. … (see [3]) and therefore nonsingular. Thus, by Theorem 2, the functions $f_1, f_2, f_3$ are linearly independent.
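Theorem 2 gives a practical test: exhibit any points at which $[f_i(x_j)]$ is nonsingular. A minimal numeric sketch, with illustrative functions and points not taken from the paper's examples:

```python
import numpy as np

# By Theorem 2, f_1, f_2, f_3 are linearly independent iff [f_i(x_j)]
# is nonsingular for SOME choice of points x_1, x_2, x_3.
funcs = [np.sin, np.cos, np.exp]          # illustrative functions on R
xs = np.array([0.0, 1.0, 2.0])            # illustrative points
M = np.array([[f(x) for x in xs] for f in funcs])  # [f_i(x_j)]
print(abs(np.linalg.det(M)) > 1e-9)  # True: nonsingular => independent
```

A singular matrix at one choice of points proves nothing; only a nonsingular one is a certificate of independence.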
Example 2. … Analogously to the previous example, the matrix $[f_i(x_j)] \in \mathbb{C}_{3,3}$ is nonsingular and thus the functions $f_1, f_2, f_3$ are linearly independent.
Example 3. Let $X = Y \times Z$ and let $f_1, \dots, f_n \in F$. Suppose that $z^* \in Z$ and $\varphi_i : Y \to \mathbb{A}$ ($i = 1, \dots, n$) are such linearly independent functions that $\varphi_i(y) \equiv f_i(y, z^*)$, $i = 1, \dots, n$. Then, by Theorem 2, there exists such $(y_1, \dots, y_n) \in Y^n$ that the matrix $[\varphi_i(y_j)] \in \mathbb{A}_{n,n}$ is nonsingular. Note that this matrix equals $[f_i(x_j)] \in \mathbb{A}_{n,n}$, where $x_j = (y_j, z^*)$, $j = 1, \dots, n$. Thus, by Theorem 2, the functions $f_1, \dots, f_n$ are linearly independent.
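Example 3's reduction can also be sketched numerically; the bivariate functions, the section $z^*$, and the points $y_j$ below are illustrative assumptions:

```python
import numpy as np

# To test independence of functions on Y x Z, fix z* and test the
# one-variable restrictions phi_i(y) = f_i(y, z*), as in Example 3.
fs = [lambda y, z: np.exp(y + z),   # illustrative f_1
      lambda y, z: y * np.cos(z),   # illustrative f_2
      lambda y, z: z + y ** 2]      # illustrative f_3
z_star = 0.5
ys = np.array([0.0, 1.0, 2.0])
M = np.array([[f(y, z_star) for y in ys] for f in fs])  # = [f_i(y_j, z*)]
print(abs(np.linalg.det(M)) > 1e-9)  # True: restrictions independent => f_i independent
```

The matrix built here is exactly $[f_i(x_j)]$ with $x_j = (y_j, z^*)$, so its nonsingularity certifies independence of the bivariate functions themselves.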
Taking into account the notion of the rank of a system of vectors (see [4], p. 52) and, in particular, of the rank of a system of functions in the linear space $F$, we prove a more general theorem.
Let's prove that $r' \ge r$. Indeed, if $r = 0$, the inequality $r' \ge r$ holds. Let $r > 0$. Then there exists a subset $\{f_{k_1}, \dots, f_{k_r}\} \subseteq \{f_1, \dots, f_n\}$ of $r$ linearly independent functions …