Conditional entropy on race outcome
The problem is:

Nine competitors are racing. The favorite has a probability of $3/4$ of winning the race; each of the other eight competitors has an equal chance of winning. If it becomes known that the favorite did not win the race, what is the uncertainty of the result?

My intuition is a conditional-entropy approach, $H(X \mid Y)$, where $X$ denotes the winning competitor and $Y$ is the information that the favorite did not win. My trouble is how to model the $P(X \mid Y)$ and $P(X, Y)$ needed to compute the entropy.

Tagged: information-theory

asked Apr 1 at 23:57 by JohnDough (edited Apr 2 at 0:26)
2 Answers
Given that the winner is one of the 8 equally probable remaining competitors, the entropy of the result is $\log_2 8 = 3$ bits.

answered Apr 2 at 1:18 by ChargeShivers
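For completeness, the arithmetic behind that number (entropy measured in bits, i.e. $\log$ taken to base $2$): conditioned on the favorite losing, each of the remaining eight runners wins with probability $1/8$, so

$$H(X \mid \text{favorite lost}) = -\sum_{i=2}^{9} \tfrac{1}{8}\log_2 \tfrac{1}{8} = \log_2 8 = 3 \text{ bits}.$$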
"My intuition would be a conditional entropy approach $H(X \mid Y)$ where $X$ denotes the competitor and $Y$ the information that the champion did not win."

When learning conditional entropy, you need to distinguish between $H(X \mid Y)$ and $H(X \mid Y = y)$. In the first, the conditioning is not on an event but on the distribution of the other variable; that is why $H(X \mid Y)$ is a plain number. By contrast, $H(X \mid Y = y)$ conditions on an event (here, the value of $Y$), so the result depends on $y$. (In other words, the notation $H(X \mid Y)$ is not analogous to other conditionals such as $E(X \mid Y)$, which is a random variable.)

In your case you are interested in the latter: you are conditioning on the event that the winner is not player (say) $1$, that is, $H(X \mid X \ne 1)$. The conditional distribution given that event is uniform over eight values, hence the entropy is $3$ bits.

answered Apr 2 at 18:26 by leonbloy
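A minimal numerical check of both quantities, written as a Python sketch; indexing the favorite as runner 0 and taking $Y = \mathbf{1}\{\text{favorite won}\}$ are modeling choices of mine, consistent with the answer above:

    import math

    # Winner X over 9 runners: the favorite (index 0) wins with probability 3/4;
    # the other 8 runners share the remaining 1/4 equally, i.e. 1/32 each.
    p_x = [3/4] + [1/32] * 8

    def entropy_bits(p):
        """Shannon entropy in bits, skipping zero-probability outcomes."""
        return -sum(q * math.log2(q) for q in p if q > 0)

    # H(X | Y = 0), i.e. given the favorite did not win:
    # renormalize over the 8 other runners.
    p_given_not_fav = [q / (1/4) for q in p_x[1:]]   # eight values of 1/8 each
    print(entropy_bits(p_given_not_fav))             # -> 3.0

    # H(X | Y), averaged over both values of Y:
    # H(X | Y=1) = 0 (winner known), H(X | Y=0) = 3 bits, P(Y=0) = 1/4.
    print((3/4) * 0 + (1/4) * entropy_bits(p_given_not_fav))  # -> 0.75

Note the two printed values illustrate the distinction in the answer: the event-conditional entropy $H(X \mid Y = 0)$ is $3$ bits, while the averaged conditional entropy $H(X \mid Y)$ is only $0.75$ bits.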