Upper bound for independent Normal random variables

I want to find a condition on the variances of independent normal random variables which guarantees that the entropy integral is finite. I guess I have to show that if $\sigma_n$ decreases to $0$ then the entropy integral is finite, but I don't know how.

Let $X_n \sim N(0,\sigma_n)$ be independent. Is there a condition on $\sigma_n$ under which Dudley's integral (the entropy integral) is finite? If the Dudley integral is finite, then we can say that the variables have a finite upper bound.

For $X_n \sim N(0,\sigma_n)$, is
$$\int_0^d \sqrt{\log N(\epsilon)}\, d\epsilon < \infty\,?$$

Tags: normal-distribution, entropy, upper-lower-bounds
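Presumably $N(\epsilon)$ here is the covering number of the index set $T = \{1, 2, \dots\}$ under the canonical metric of the Gaussian process, not the normal distribution (the question does not define it, so this reading is an assumption). With that reading, Dudley's entropy bound is usually stated as
$$\mathbb{E}\,\sup_{n} X_n \;\le\; C \int_0^{\operatorname{diam}(T)} \sqrt{\log N(T, d, \epsilon)}\; d\epsilon ,$$
where $d(m,n) = \sqrt{\mathbb{E}(X_m - X_n)^2} = \sqrt{\sigma_m^2 + \sigma_n^2}$ for independent centred Gaussians (writing $\sigma_k^2$ for the variance of $X_k$), so finiteness of the entropy integral does give a finite upper bound on $\mathbb{E}\,\sup_n X_n$.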










  • I see problems with the square root of a negative number due to the log. Are you sure about the expression you need to integrate? Assuming you can get to a correct expression, my suggestion for proving something like what you stated would be to transform to the standardized normal distribution and then show that, although the transformed value of $d$ increases, the integral stays bounded.
    – Ertxiem, yesterday











  • Thank you for your comment. Actually, I am looking for a condition on the variances of the independent normal variables that shows these variables have an upper bound, via the entropy integral.
    – Tara, yesterday











  • I took a look at Wikipedia and I'm not sure that the $N$ that appears there is the normal distribution. If I remember correctly from statistical physics, $N$ may be the number of states, or something close to it.
    – Ertxiem, yesterday
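As a rough numerical companion to the question (a sketch under an assumed heuristic, not part of the original post): for independent $X_n$ with $\sigma_n \to 0$, the covering number of the index set under the canonical metric behaves, up to constant factors in $\epsilon$, like $1 + \#\{n : \sigma_n > \epsilon\}$, so one can at least eyeball which decay rates of $\sigma_n$ make $\int_0^d \sqrt{\log N(\epsilon)}\, d\epsilon$ finite. The helper `entropy_integral` and the two example decay rates below are illustrative choices, not from the question.

```python
import numpy as np

# Rough numerical sketch (an illustration, not a proof). For independent
# X_n ~ N(0, sigma_n^2) with sigma_n -> 0, the covering number of the index
# set under the canonical metric d(m, n) = sqrt(sigma_m^2 + sigma_n^2)
# behaves, up to constant factors, like N(eps) ~ 1 + #{ n : sigma_n > eps }.
# Under that heuristic we can eyeball whether the entropy integral
#   int_0^d sqrt(log N(eps)) d(eps)
# looks finite for a candidate decay rate of sigma_n.

def entropy_integral(sigma, n_max=10**6, d=1.0, n_grid=2000):
    """Riemann approximation of int_0^d sqrt(log N(eps)) d(eps),
    truncating the index set at n_max (so a diverging integral only
    shows up as a value that keeps growing as n_max increases)."""
    n = np.arange(1, n_max + 1)
    s = np.sort(sigma(n))                      # sigma_n, sorted ascending
    eps = np.linspace(d / n_grid, d, n_grid)
    # proxy covering number: indices with sigma_n > eps, plus one ball
    # that covers the whole small-sigma tail
    n_eps = 1 + (len(s) - np.searchsorted(s, eps, side="right"))
    return np.trapz(np.sqrt(np.log(n_eps)), eps)

# sigma_n ~ 1/log n:       N(eps) ~ e^{1/eps},   integrand ~ eps^{-1/2} -> finite
print(entropy_integral(lambda n: 1.0 / np.log(n + 2)))
# sigma_n ~ 1/sqrt(log n): N(eps) ~ e^{1/eps^2}, integrand ~ 1/eps      -> diverges
print(entropy_integral(lambda n: 1.0 / np.sqrt(np.log(n + 2))))
```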














