Upper bound for independent Normal random variables
I want to find a condition on the variances of independent normal random variables under which the entropy integral is finite. I guess I have to show that if $\sigma_n$ decreases to $0$ then the entropy integral is finite, but I don't know how.

Let $X_n \sim N(0, \sigma_n)$ be independent. Is there a condition on $\sigma_n$ that makes Dudley's integral (the entropy integral) converge? If Dudley's integral is finite, then we can say the variables have an upper bound.

Is
$$\int_0^d \sqrt{\log N(\epsilon)}\, d\epsilon < \infty\,?$$

Tags: normal-distribution, entropy, upper-lower-bounds

asked yesterday, edited yesterday by Tara (new contributor)
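For context, here is a sketch of the chaining bound the question appeals to. It assumes, since the question does not define them, that $N(\epsilon)$ is the covering number $N(T, d, \epsilon)$ of the index set $T = \{1, 2, \dots\}$ in the canonical metric, and that $\sigma_n$ denotes the standard deviation of $X_n$; the constant $C$ varies between references. Dudley's entropy bound for a centered Gaussian process reads

$$\mathbb{E}\sup_{n} X_n \;\le\; C \int_0^{\operatorname{diam}(T)} \sqrt{\log N(T, d, \epsilon)}\, d\epsilon, \qquad d(m, n) = \bigl(\mathbb{E}(X_m - X_n)^2\bigr)^{1/2} = \sqrt{\sigma_m^2 + \sigma_n^2} \ \ (m \neq n).$$

If $\sigma_n \downarrow 0$, then all indices with $\sigma_n \le \epsilon/\sqrt{2}$ fit in a single $\epsilon$-ball, so

$$N(\epsilon) \;\le\; \#\{\, n : \sigma_n > \epsilon/\sqrt{2} \,\} + 1.$$

For example, polynomial decay $\sigma_n = n^{-\alpha}$ with $\alpha > 0$ gives $N(\epsilon) = O(\epsilon^{-1/\alpha})$, hence $\sqrt{\log N(\epsilon)} = O(\sqrt{\log(1/\epsilon)})$, which is integrable near $0$, so the entropy integral is finite. By contrast, $\sigma_n = 1/\sqrt{\log(n+1)}$ gives $N(\epsilon) \approx e^{2/\epsilon^2}$, the integrand behaves like $1/\epsilon$, and the integral diverges.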
I see problems with the square root of a negative number due to the log. Are you sure about the expression you need to integrate? Assuming you can get a correct expression, in order to prove something like you said, my suggestion would be to transform to the standardized normal distribution and then show that the transformed value of $d$, although increasing, keeps the integral bounded. – Ertxiem, yesterday

Thank you for your comment. Actually, I am looking for a condition on the variances of independent normal variables that shows these variables have an upper bound, via the entropy integral. – Tara, yesterday

I took a look at Wikipedia and I'm not sure that the $N$ that appears there is the normal distribution. If I remember correctly from statistical physics, $N$ may be the number of states or something close to it. – Ertxiem, yesterday
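A minimal Monte Carlo sanity check in Python, complementing the comments above (a sketch only: the decay rates and sample sizes below are illustrative choices, not a claim about the exact threshold):

import numpy as np

rng = np.random.default_rng(0)

def mean_sup(sigma, trials=200):
    """Monte Carlo estimate of E[max_n X_n] for X_n ~ N(0, sigma_n^2)."""
    return float(np.mean([
        np.max(rng.standard_normal(sigma.size) * sigma)
        for _ in range(trials)
    ]))

for N in (1_000, 10_000, 100_000):
    n = np.arange(1, N + 1)
    decaying = n ** -0.5     # sigma_n -> 0 polynomially: entropy integral finite
    constant = np.ones(N)    # no decay: the maximum grows like sqrt(2 log N)
    print(f"N = {N:>7}: decaying {mean_sup(decaying):.3f}, "
          f"constant {mean_sup(constant):.3f}")

# The decaying-variance estimate stabilizes as N grows, while the
# constant-variance estimate keeps increasing -- consistent with the idea
# that some decay of sigma_n is needed for a finite upper bound.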