How to prove $E[g(X)] = \int_0^\infty g'(x)S(x)\,dx$
Given that $E[g(X)] = \int_{-\infty}^\infty g(x)f(x)\,dx$, how can I prove that $E[g(X)] = \int_0^\infty g'(x)S(x)\,dx$, where $S(x) = 1-F(x)$?

By integration by parts, I can get the following:
\begin{align}
E[g(X)] &= \int_{-\infty}^\infty g(x)f(x)\,dx \\
&= \int_{-\infty}^\infty g(x)F'(x)\,dx \\
&= g(x)F(x)\Big|_{-\infty}^\infty - \int_{-\infty}^\infty g'(x)F(x)\,dx \\
&= g(x)F(x)\Big|_{-\infty}^\infty - \int_{-\infty}^\infty g'(x)[1-S(x)]\,dx \\
&= g(x)F(x)\Big|_{-\infty}^\infty - \int_{-\infty}^\infty g'(x)\,dx + \int_{-\infty}^\infty g'(x)S(x)\,dx
\end{align}
Then I am stuck.
probability
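(A quick numerical sanity check, not part of the original post: the identity as stated does hold when $X\ge 0$ and $g(0)=0$, for example $X\sim\text{Exponential}(\lambda)$ with $g(x)=x^2$. A minimal sketch under those assumptions:)

```python
import numpy as np
from scipy import integrate

# Check E[g(X)] = ∫_0^∞ g'(x) S(x) dx for a nonnegative X with g(0) = 0.
# Example: X ~ Exponential(rate=lam), g(x) = x**2, so E[g(X)] = 2/lam**2.
lam = 1.5
g_prime = lambda x: 2 * x
S = lambda x: np.exp(-lam * x)          # survival function of Exponential(lam)

lhs = 2 / lam**2                        # E[X^2] in closed form
rhs, _ = integrate.quad(lambda x: g_prime(x) * S(x), 0, np.inf)
print(lhs, rhs)                         # both ≈ 0.8889
```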
asked Apr 2 at 0:18 by Jango
2 Answers
I think what you are trying to prove is not exactly correct. I will work through the problem and show the correct result at the end.

You need to split the integral into two, one for the region $x\ge 0$ and one for $x\le 0$.

In the positive region, instead of writing $f(x)=F'(x)$ and integrating by parts, you need to use $f(x)=[F(x)-1]'$. This is the only choice that ensures the resulting integral actually exists; note that $F(x)-1$ goes to zero as $x\to+\infty$, but $F(x)$ does not.
\begin{align}
\int_0^\infty g(x)f(x)\,dx
&=\int_0^\infty g(x)[F(x)-1]'\,dx \\
&=g(x)[F(x)-1]\Big|^\infty_0-\int_0^\infty g'(x)[F(x)-1]\,dx \\
&=\underbrace{\Big(\lim_{x\to\infty}g(x)[F(x)-1]\Big)}_{=0}-g(0)[F(0)-1]+\int_0^\infty g'(x)S(x)\,dx
\end{align}
It takes some doing, but you can show that this limit actually is zero, as long as $E[g(X)]$ is finite. See "Is it true that $\lim\limits_{x\to\infty}x\cdot P[X>x]=0$?" for some inspiration. Edit: Actually, I am not quite sure this is true without some additional assumptions on $g$; it is true as long as $g$ is either bounded or monotonic.

For the negative region, you do want to use $F(x)$ as the antiderivative of $f(x)$, because $\lim_{x\to-\infty}F(x)=0$.
\begin{align}
\int_{-\infty}^0 g(x)f(x)\,dx
&=\int_{-\infty}^0 g(x)F'(x)\,dx \\
&=g(x)F(x)\Big|^0_{-\infty}-\int_{-\infty}^0 g'(x)F(x)\,dx \\
&=g(0)F(0)-\underbrace{\Big(\lim_{x\to-\infty}g(x)F(x)\Big)}_{=0}-\int_{-\infty}^0 g'(x)F(x)\,dx
\end{align}
Putting this all together, we get
$$
\boxed{\int_{-\infty}^\infty g(x)f(x)\,dx=g(0)+\int_0^\infty g'(x)S(x)\,dx-\int_{-\infty}^0 g'(x)F(x)\,dx}
$$
This is correct for any function $g$ such that $E[g(X)]$ is finite (again, as long as $g$ is either bounded or monotonic).

If we make further assumptions about $g$, we can write this more nicely. Assuming $g(-\infty):=\lim_{x\to-\infty}g(x)$ exists, you would have
$$
g(0)=g(-\infty)+\int_{-\infty}^0 g'(x)\,dx,
$$
so
$$
\boxed{\int_{-\infty}^\infty g(x)f(x)\,dx=g(-\infty)+\int_{-\infty}^\infty g'(x)S(x)\,dx}
$$
You cannot in general avoid the presence of some constant such as $g(0)$ or $g(-\infty)$. This is because the $\int g'(x)S(x)\,dx$ part of the formula does not change when $g$ is shifted by a constant (shifting does not affect $g'$), but $E[g(X)]$ does change when $g$ is shifted by a constant.
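(Not part of the original answer: a quick numerical sketch of the first boxed formula, under the assumed choices $X\sim N(0,1)$ and the monotonic test function $g(x)=e^x$, for which $E[g(X)]=e^{1/2}$.)

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

# Check  E[g(X)] = g(0) + ∫_0^∞ g'(x) S(x) dx − ∫_{-∞}^0 g'(x) F(x) dx
# for X ~ N(0,1) and g(x) = e^x (monotonic), so E[g(X)] = exp(1/2).
g = np.exp
g_prime = np.exp

pos_part, _ = integrate.quad(lambda x: g_prime(x) * norm.sf(x), 0, np.inf)
neg_part, _ = integrate.quad(lambda x: g_prime(x) * norm.cdf(x), -np.inf, 0)

lhs = np.exp(0.5)                       # E[e^X] for a standard normal
rhs = g(0) + pos_part - neg_part
print(lhs, rhs)                         # both ≈ 1.6487
```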
answered Apr 2 at 4:27, edited Apr 2 at 4:54, by Mike Earnest
Thank you very much. I did not think it would be so difficult, since I read this in "Loss Models" rather than in an analysis text. Anyway, your work is clear and easy to understand. Thanks. – Jango, Apr 2 at 14:34

@Jango Can I ask where this appears in the book? – Mike Earnest, Apr 2 at 14:48
Write $1-F(x)=\int_{(x,\infty)}d\mu(y)$, where $\mu$ is the measure corresponding to $F$. Now apply the Fubini/Tonelli theorem to the right-hand side.
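(Spelling that hint out, as an addition to the original answer: assume $X\ge 0$, $g(0)=0$, and $g$ nondecreasing, so that Tonelli applies. Swapping the order of integration gives)
$$
\int_0^\infty g'(x)S(x)\,dx
=\int_0^\infty g'(x)\int_{(x,\infty)}d\mu(y)\,dx
=\int_{(0,\infty)}\int_0^y g'(x)\,dx\,d\mu(y)
=\int_{(0,\infty)}g(y)\,d\mu(y)
=E[g(X)],
$$
which is exactly the identity asked about.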
answered Apr 2 at 0:22 by Kavi Rama Murthy