Conditional entropy on race outcome













The problem is:



9 guys are racing.



The favorite has a probability of 3/4 to win the race.



Each other competitor has an equal chance to win.



If it becomes known that the favorite did not win the race, what is the uncertainty of the result?



My intuition would be a conditional entropy approach H(X|Y) where X denotes the competitor and Y the information that the champion did not win. My trouble is how to model the P(X|Y) and P(X,Y) needed to find the entropy.
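For concreteness (labelling the favorite as competitor $1$, an arbitrary choice), the statement gives the unconditional win distribution

$$P(X = 1) = \frac{3}{4}, \qquad P(X = i) = \frac{1}{8}\cdot\frac{1}{4} = \frac{1}{32} \quad \text{for } i = 2,\dots,9.$$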










information-theory

asked Apr 1 at 23:57, edited Apr 2 at 0:26
JohnDough




















2 Answers
Given that the winner is one of the $8$ equiprobable remaining participants, the entropy of the result is $\log_2 8 = 3$ bits.
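Spelling out the conditioning step (labelling the favorite as competitor $1$): for each of the other eight runners,

$$P(X = i \mid X \ne 1) = \frac{P(X = i)}{P(X \ne 1)} = \frac{1/32}{1/4} = \frac{1}{8}, \qquad i = 2,\dots,9,$$

so

$$H(X \mid X \ne 1) = -\sum_{i=2}^{9} \frac{1}{8}\log_2\frac{1}{8} = \log_2 8 = 3 \text{ bits}.$$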







answered Apr 2 at 1:18
ChargeShivers





















"My intuition would be a conditional entropy approach H(X|Y) where X denotes the competitor and Y the information that the champion did not win."

When learning conditional entropy, you need to distinguish between $H(X \mid Y)$ and $H(X \mid Y = y)$. In the first, the conditioning is not on an event but on the distribution of the other variable; that's why $H(X \mid Y)$ is a plain number. In contrast, $H(X \mid Y = y)$ conditions on an event (here, the value of $Y$), so the result depends on $y$.

(In other words, the notation $H(X \mid Y)$ is not analogous to other conditionals such as $E(X \mid Y)$.)

In your case you want the latter: you are conditioning on the event that the winner is not player (say) $1$, that is, $H(X \mid X \ne 1)$. The conditional distribution given that event is uniform over eight values, hence the entropy is $3$ bits.
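A minimal numerical sketch of this distinction (plain Python; the `entropy` helper and the choice of listing the favorite first are just illustrative):

    from math import log2

    # Unconditional win distribution: first entry is the favorite (3/4),
    # the other 8 runners share the remaining 1/4 equally (1/32 each).
    p = [3/4] + [1/32] * 8

    def entropy(dist):
        """Shannon entropy (in bits) of a probability distribution."""
        return -sum(q * log2(q) for q in dist if q > 0)

    # H(X | X != favorite): condition on the *event* that the favorite lost,
    # i.e. renormalise the probabilities of the remaining runners.
    p_lost = 1 - p[0]
    conditional = [q / p_lost for q in p[1:]]
    h_event = entropy(conditional)            # uniform over 8 values -> 3 bits

    # H(X | Y) with Y = "did the favorite win?": the *average* of the
    # event-conditioned entropies, weighted by P(Y).
    # If the favorite won, X is determined, so H(X | Y = win) = 0.
    h_X_given_Y = p[0] * 0 + p_lost * h_event  # 3/4 * 0 + 1/4 * 3 = 0.75 bits

    print(f"H(X | favorite lost) = {h_event:.3f} bits")
    print(f"H(X | Y)             = {h_X_given_Y:.3f} bits")

The $0.75$-bit value is the average uncertainty over both possible answers to $Y$, which is why it differs from the $3$ bits obtained once we actually learn that the favorite lost.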







answered Apr 2 at 18:26
leonbloy


























