The Replacement Theorem

Theorem (Theorem 1.10). Let V be a vector space and suppose G and L are finite subsets of V such that V = Span(G), |G| = n, L is linearly independent, and |L| = m. Then m ≤ n and there is a set H ⊂ G such that |H| = n − m and Span(H ∪ L) = V.
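As a concrete illustration (not part of the original slides), take V = R³ with the hypothetical sets G = {e₁, e₂, e₃} (so n = 3) and L = {(1,1,0), (0,1,1)} (so m = 2). The theorem predicts some H ⊂ G with |H| = 3 − 2 = 1 such that Span(H ∪ L) = R³. A minimal sketch checking this with NumPy rank computations:

```python
import numpy as np

# Spanning set G = standard basis of R^3, so n = 3
G = [np.array([1., 0., 0.]), np.array([0., 1., 0.]), np.array([0., 0., 1.])]
# Linearly independent set L with m = 2
L = [np.array([1., 1., 0.]), np.array([0., 1., 1.])]

# L is linearly independent: the matrix with rows L has rank m = 2
assert np.linalg.matrix_rank(np.vstack(L)) == len(L)

# The theorem guarantees some H in G with |H| = n - m = 1 and Span(H u L) = V.
# Here H = {e3} works: the 3 x 3 matrix with rows L u H has full rank 3.
H = [G[2]]
assert np.linalg.matrix_rank(np.vstack(L + H)) == 3
```

Note that not every singleton H ⊂ G works (H = {e₂} fails here, since (1,1,0) + (0,1,1) − 2e₂ kills independence only in other configurations; one must choose H so the combined set still spans), which is why the theorem asserts existence of *some* H, not an arbitrary one.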
Proof. We proceed by induction on m. If m = 0 then L = ∅, and we let H = G, so that m = 0 ≤ n and Span(H ∪ L) = Span(G) = V. Now suppose the statement is true for m = μ, where μ is a fixed nonnegative integer, and assume L = {v_1, v_2, ..., v_μ, v_{μ+1}} is linearly independent. Then the set {v_1, v_2, ..., v_μ} is linearly independent, so by the induction hypothesis there is a subset {u_1, u_2, ..., u_{n−μ}} ⊂ G such that Span(u_1, u_2, ..., u_{n−μ}, v_1, v_2, ..., v_μ) = V.
Proof (continued). Hence
v_{μ+1} = a_1 u_1 + a_2 u_2 + ··· + a_{n−μ} u_{n−μ} + b_1 v_1 + b_2 v_2 + ··· + b_μ v_μ
for some scalars a_1, a_2, ..., a_{n−μ}, b_1, b_2, ..., b_μ. We note that n − μ > 0, since otherwise v_{μ+1} would be a linear combination of v_1, v_2, ..., v_μ, which would contradict L being linearly independent. Therefore n − μ ≥ 1 and n ≥ μ + 1. Similarly, at least one of the scalars a_1, a_2, ..., a_{n−μ} must be nonzero, since L is linearly independent. Suppose without loss of generality that a_1 ≠ 0. Then
u_1 = (−a_2/a_1) u_2 + ··· + (−a_{n−μ}/a_1) u_{n−μ} + (−b_1/a_1) v_1 + ··· + (−b_μ/a_1) v_μ + (1/a_1) v_{μ+1}.
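The exchange step above can be checked numerically. In the hypothetical setup below (vectors and scalars chosen purely for illustration, not taken from the slides), v_{μ+1} is expressed in terms of the current spanning set with a_1 ≠ 0, u_1 is recovered by the formula just derived, and swapping u_1 out for v_{μ+1} leaves the span unchanged:

```python
import numpy as np

# Current spanning set after the mu = 1 step in R^3: {u1, u2, v1}
u1 = np.array([1., 0., 0.])
u2 = np.array([0., 1., 0.])
v1 = np.array([0., 0., 1.])

# A new independent vector v2 = a1*u1 + a2*u2 + b1*v1, with a1 = 2 != 0
a1, a2, b1 = 2., 1., 3.
v2 = a1 * u1 + a2 * u2 + b1 * v1

# Solving for u1, as in the proof:
# u1 = (-a2/a1)*u2 + (-b1/a1)*v1 + (1/a1)*v2
u1_recovered = (-a2 / a1) * u2 + (-b1 / a1) * v1 + (1 / a1) * v2
assert np.allclose(u1_recovered, u1)

# Swapping u1 out for v2 preserves the span: both sets have full rank 3
before = np.linalg.matrix_rank(np.vstack([u1, u2, v1]))
after = np.linalg.matrix_rank(np.vstack([v2, u2, v1]))
assert before == after == 3
```

The assertion that u_1 is recoverable is exactly what lets the proof replace u_1 by v_{μ+1} without shrinking the span.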