Difference of two orthogonal projections is an orthogonal projection
Premise: I have an $n \times q$ matrix $X$ and a $q \times a$ matrix $C$ with $n > q > a$.
I'm interested in the structure of the matrix
$$
M = X X^+ - X_0 X_0^+
$$
where the superscript $^+$ indicates the Moore–Penrose pseudoinverse and
$$
X_0 = X (I_q - C C^+).
$$
I assume that $X$ has full column rank and therefore $X^+ = (X' X)^{-1} X'$ (where $'$ denotes the transpose).
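As a quick sanity check of this closed form, here is a minimal sketch (the random full-column-rank $X$ is an assumption chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((10, 5))                    # n > q, full column rank almost surely
closed_form = np.linalg.inv(X.T @ X) @ X.T          # (X'X)^{-1} X'
print(np.allclose(np.linalg.pinv(X), closed_form))  # True
```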
Background: $X$ is the design matrix of a linear model, $C$ is a contrast, $X_0$ is a reduced design matrix, and $M$ occurs in the definition of standard test statistics.
$M$ is the difference of two orthogonal projection matrices, where the second projects into a subspace of the subspace the first projects into. This makes the difference an orthogonal projection matrix itself (symmetric and idempotent), which means it has a representation
$$
M = X_\Delta X_\Delta^+.
$$
Question: How do I obtain $X_\Delta$?
user1551 has correctly pointed out in an answer that $X_\Delta = M$ itself fulfills the equation. However, I'm looking for a "version" of $X$, meaning an $n \times q$ matrix of rank $a$.
My approach: I am guessing that
$$
X_\Delta = X - X_0 X_0^+ X,
$$
and this seems to be confirmed by numerical tests. But I am unable to come up with a proof, i.e. to show that
$$
(X - X_0 X_0^+ X) (X - X_0 X_0^+ X)^+ = X X^+ - X_0 X_0^+.
$$
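For concreteness, here is a minimal sketch of such a numerical test (the sizes and random matrices are assumptions chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, q, a = 10, 5, 2                                   # assumed sizes with n > q > a
X = rng.standard_normal((n, q))                      # full column rank almost surely
C = rng.standard_normal((q, a))

X0 = X @ (np.eye(q) - C @ np.linalg.pinv(C))         # reduced design matrix X0 = X(I - CC+)
M = X @ np.linalg.pinv(X) - X0 @ np.linalg.pinv(X0)  # difference of the two projectors

X_delta = X - X0 @ np.linalg.pinv(X0) @ X            # conjectured "version" of X
print(np.allclose(M, X_delta @ np.linalg.pinv(X_delta)))  # True, up to rounding
```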
The problem is how to deal with the pseudoinverse of a difference. One can write
$$
X_\Delta = (I_n - X_0 X_0^+) X,
$$
and according to Wikipedia, when one factor of a product is an orthogonal projection, that projection can redundantly be multiplied onto the opposite side of the product's pseudoinverse, meaning here
$$
X_\Delta^+ = [(I_n - X_0 X_0^+) X]^+ = [(I_n - X_0 X_0^+) X]^+ (I_n - X_0 X_0^+) = X_\Delta^+ (I_n - X_0 X_0^+),
$$
but that doesn't seem to help.
I can prove that $M$ is symmetric and idempotent, using the relations
$$
X X^+ X_0 = X_0
\quad \text{and} \quad
X_0 X_0^+ X X^+ = X_0 X_0^+,
$$
which derive from the definition of $X_0$ and the properties of the pseudoinverse. I can also show that
$$
X X_0^+ = X_0 X_0^+
$$
using the property of the pseudoinverse of a product involving an orthogonal projection (see above). But none of that helps either.
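For reference, idempotency follows from these relations in one line:
$$
M^2 = X X^+ X X^+ - X X^+ X_0 X_0^+ - X_0 X_0^+ X X^+ + X_0 X_0^+ X_0 X_0^+
= X X^+ - X_0 X_0^+ - X_0 X_0^+ + X_0 X_0^+ = M,
$$
where the cross terms reduce to $X_0 X_0^+$ by the two relations above, and the outer terms collapse because each projector is idempotent.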
linear-algebra pseudoinverse projection-matrices
asked Mar 29 at 21:27 by A. Donda, edited 2 days ago
1 Answer
With your choice of $X_\Delta$, $M$ is indeed equal to $X_\Delta X_\Delta^+$.
Proof. Let $P = I - C C^+$. Note that the column space of $M = X X^+ - (XP)(XP)^+$ is $\operatorname{col}(X) \cap \operatorname{col}(XP)^\perp$, while the column space of $X_\Delta X_\Delta^+$ is precisely the column space of $X_\Delta = \left[I - (XP)(XP)^+\right] X$.
Since $X_\Delta = X \left[I - P (XP)^+ X\right]$, $\operatorname{col}(X_\Delta) \subseteq \operatorname{col}(X)$. Also, since
$$
\begin{aligned}
(XP)^T X_\Delta
&= (X_\Delta^T X P)^T \\
&= \left[X^T \left(I - (XP)(XP)^+\right) XP\right]^T \\
&= \left[X^T \left(XP - (XP)(XP)^+ (XP)\right)\right]^T \\
&= \left[X^T \left(XP - XP\right)\right]^T = 0,
\end{aligned}
$$
we also have $\operatorname{col}(X_\Delta) \subseteq \operatorname{col}(XP)^\perp$. Thus $\operatorname{col}(X_\Delta) \subseteq \operatorname{col}(M)$.
We now show that the reverse inclusion is also true. Pick any $v \in \operatorname{col}(M) = \operatorname{col}(X) \cap \operatorname{col}(XP)^\perp$. Since $v \in \operatorname{col}(X)$, it can be written as $Xb$ for some vector $b$. Thus
$$
X_\Delta b = \left[I - (XP)(XP)^+\right] X b = v - (XP)(XP)^+ v.
$$
However, we also have $v \in \operatorname{col}(XP)^\perp$. Therefore $(XP)(XP)^+ v = 0$ and in turn $X_\Delta b = v$, meaning that $v \in \operatorname{col}(X_\Delta)$.
Thus $\operatorname{col}(X_\Delta X_\Delta^+) \equiv \operatorname{col}(X_\Delta) = \operatorname{col}(M)$. Hence $X_\Delta X_\Delta^+$ and $M$ must be equal, because they are orthogonal projections with identical column spaces.
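The orthogonality step $(XP)^T X_\Delta = 0$ is also easy to confirm numerically; a self-contained sketch with assumed random matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
n, q, a = 10, 5, 2
X = rng.standard_normal((n, q))
C = rng.standard_normal((q, a))

P = np.eye(q) - C @ np.linalg.pinv(C)                # P = I - CC+
XP = X @ P
X_delta = (np.eye(n) - XP @ np.linalg.pinv(XP)) @ X  # X_delta = [I - (XP)(XP)+] X
print(np.allclose(XP.T @ X_delta, 0.0))              # True: col(X_delta) is orthogonal to col(XP)
```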
answered 2 days ago by user1551 (edited 6 hours ago)

– A. Donda (6 hours ago): Now it's clear. Thanks again!

– user1551 (6 hours ago): @A.Donda You are welcome. Note that if you need to verify the relation numerically, you need to use the SVD and set a threshold to cut off the small singular values; otherwise the verification will likely fail, because the computation of the pseudoinverse is not numerically stable.
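Following up on that advice: a minimal sketch of an SVD-based pseudoinverse with an explicit singular-value cutoff (the tolerance value and helper name are assumptions; NumPy's own `np.linalg.pinv` exposes the same idea through its `rcond` argument):

```python
import numpy as np

def pinv_thresholded(A, rcond=1e-10):
    """Moore-Penrose pseudoinverse with small singular values cut off."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    cutoff = rcond * s.max()
    s_inv = np.where(s > cutoff, 1.0 / s, 0.0)  # invert only the large singular values
    return Vt.T @ (s_inv[:, None] * U.T)
```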