Does Wolfram MathWorld make a mistake describing a discrete probability distribution with a probability density function?


Usually a probability distribution over discrete variables is described using a probability mass function (PMF):

    When working with continuous random variables, we describe probability distributions using a probability density function (PDF) rather than a probability mass function.

    -- Deep Learning by Goodfellow, Bengio, and Courville

However, Wolfram MathWorld uses a PDF to describe a probability distribution over discrete variables:

[screenshot of the MathWorld definition omitted]

Is this a mistake, or does it not matter much?



















  • Dave (5 votes, Jul 28 at 23:39): That's sloppy, in my opinion, but not very important. It's even defensible if they approach probability from the standpoint of measure theory, though that seems like a bit much for an introduction to flipping a coin. (Weirdly enough, they don't appear to have an article on PMFs.)

  • Xi'an (9 votes, Jul 29 at 1:34): A PMF is a density against the counting measure.

  • user158565 (3 votes, Jul 29 at 2:06): When you discuss probability theory at the level of a measure space specified by three elements, PDFs and PMFs are no different, so the PMF terminology is dropped. All distributions can be specified by a PDF. Wolfram is a math website, so it is no surprise that they use higher-level math to talk about probability. Here is a good free reading: stat.washington.edu/~pdhoff/courses/581/LectureNotes/…

















Tags: probability, mathematical-statistics, terminology, pdf






asked Jul 28 at 23:30 by czlsws; edited Jul 29 at 13:12 by amoeba










2 Answers






Answer by Ben (30 votes, answered Jul 29 at 2:36):
It is not a mistake: In the formal treatment of probability, via measure theory, a probability density function is a derivative of the probability measure of interest, taken with respect to a "dominating measure" (also called a "reference measure"). For discrete distributions over the integers, the probability mass function is a density function with respect to counting measure. Since a probability mass function is a particular type of probability density function, you will sometimes find references like this that refer to it as a density function, and they are not wrong to refer to it this way.



In ordinary discourse on probability and statistics, one often avoids this terminology, and draws a distinction between "mass functions" (for discrete random variables) and "density functions" (for continuous random variables), in order to distinguish discrete and continuous distributions. In other contexts, where one is stating holistic aspects of probability, it is often better to ignore the distinction and refer to both as "density functions".
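A short way to make this concrete (standard measure-theory notation, not from the original answer): if $\mu$ is the dominating measure and $f$ is the density (Radon-Nikodym derivative) of the probability measure $P$ with respect to $\mu$, then

$$P(X \in A) = \int_A f \, d\mu.$$

Taking $\mu$ to be Lebesgue measure recovers the usual PDF of a continuous random variable, while taking $\mu$ to be counting measure on the integers turns the integral into a sum,

$$P(X \in A) = \sum_{x \in A} f(x),$$

so the PMF $f$ is literally a density with respect to counting measure.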






  • czlsws (Jul 29 at 7:13): Thanks for your answer. Does "formal treatment" in "In the formal treatment of probability" mean notation, perspective, convention, or something else?

  • Ben (Jul 29 at 7:23): When I talk here about the "formal treatment" I am referring to the modern basis of probability theory, which is a subset of measure theory. That is the mathematical theory that is accepted as the formal underpinning of probability.

  • Acccumulation (Jul 29 at 16:37): "a probability density function is a derivative of the probability measure of interest" It seems to me that in some sense it's more of an "anti-integral" than a derivative. There are discontinuous PDFs, such as the uniform distribution, and discrete distributions can be treated as sums of Dirac delta functions. In those cases, one would have to generalize the concept of a derivative far beyond the ordinary understanding for it to apply.

  • jbowman (Jul 29 at 19:33): @Acccumulation - how is the uniform distribution discontinuous? ... and measure theory is a far more general treatment of integration and differentiation than the ordinary understanding of Calc I and II provides.

  • Ben (Jul 29 at 22:09): @Acccumulation: Yes, that's a fair characterisation, and indeed, that is what is done. Technically the density is a Radon-Nikodym derivative, which is indeed a type of "anti-integral" of the type you describe.



















Answer (5 votes):

In addition to the more theoretical answer in terms of measure theory, it is also convenient not to distinguish between PMFs and PDFs in statistical programming. For example, R has a wealth of built-in distributions, and for each distribution it provides four functions. For the normal distribution (from the help file):

    dnorm gives the density, pnorm gives the distribution function, qnorm gives the quantile function, and rnorm generates random deviates.

R users rapidly become used to the d, p, q, r prefixes. It would be annoying if you had to drop the d and use m for, e.g., the binomial distribution. Instead, everything is as an R user would expect:

    dbinom gives the density, pbinom gives the distribution function, qbinom gives the quantile function and rbinom generates random deviates.





  • Matthew Drury (3 votes, Jul 29 at 16:19): scipy.stats does distinguish: some objects have a pdf method and others have a pmf method. It really annoys me!
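A minimal sketch of the split that comment describes (assuming scipy is installed; the numbers in the comments are the standard closed-form values):

```python
# scipy.stats keeps the pmf/pdf distinction that the R interface avoids:
# discrete distributions expose a .pmf method, continuous ones a .pdf method.
from scipy.stats import binom, norm

# Discrete: P(X = 3) for X ~ Binomial(n=10, p=0.5), via .pmf
mass = binom.pmf(3, n=10, p=0.5)   # C(10, 3) * 0.5**10 = 0.1171875

# Continuous: density of the standard normal at 0, via .pdf
density = norm.pdf(0.0)            # 1 / sqrt(2 * pi), about 0.3989

# The discrete object has no .pdf method at all, which is the
# asymmetry the comment complains about.
has_pdf = hasattr(binom, "pdf")    # False
```

So a caller has to know in advance whether a distribution is discrete or continuous just to evaluate its density, whereas R's single d-prefix works uniformly.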














Your Answer








StackExchange.ready(function()
var channelOptions =
tags: "".split(" "),
id: "65"
;
initTagRenderer("".split(" "), "".split(" "), channelOptions);

StackExchange.using("externalEditor", function()
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled)
StackExchange.using("snippets", function()
createEditor();
);

else
createEditor();

);

function createEditor()
StackExchange.prepareEditor(
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: false,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: null,
bindNavPrevention: true,
postfix: "",
imageUploader:
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
,
onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
);



);













draft saved

draft discarded


















StackExchange.ready(
function ()
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fstats.stackexchange.com%2fquestions%2f419557%2fdoes-wolfram-mathworld-make-a-mistake-describing-a-discrete-probability-distribu%23new-answer', 'question_page');

);

Post as a guest















Required, but never shown

























2 Answers
2






active

oldest

votes








2 Answers
2






active

oldest

votes









active

oldest

votes






active

oldest

votes









30












$begingroup$

It is not a mistake: In the formal treatment of probability, via measure theory, a probability density function is a derivative of the probability measure of interest, taken with respect to a "dominating measure" (also called a "reference measure"). For discrete distributions over the integers, the probability mass function is a density function with respect to counting measure. Since a probability mass function is a particular type of probability density function, you will sometimes find references like this that refer to it as a density function, and they are not wrong to refer to it this way.



In ordinary discourse on probability and statistics, one often avoids this terminology, and draws a distinction between "mass functions" (for discrete random variables) and "density functions" (for continuous random variables), in order to distinguish discrete and continuous distributions. In other contexts, where one is stating holistic aspects of probability, it is often better to ignore the distinction and refer to both as "density functions".






share|cite|improve this answer









$endgroup$














  • $begingroup$
    Thanks for your answer. Does treatment "In the formal treatment of probability" mean notation, perspective, convention or something else?
    $endgroup$
    – czlsws
    Jul 29 at 7:13










  • $begingroup$
    When I talk here about the "formal treatment" I am referring to the modern basis of probability theory, which is a subset of measure theory. That is the mathematical theory that is accepted as the formal underpinning of probability.
    $endgroup$
    – Ben
    Jul 29 at 7:23










  • $begingroup$
    "a probability density function is a derivative of the probability measure of interest" It seems to me that in some sense it's more of an "anti-integral" than a derivative. There are discontinuous PDFs, such as the uniform distribution, and discrete distributions can be treated as being sums of Dirac delta functions. In those cases, one would have to have to generalize the concept of a derivative far beyond the ordinary understanding for it to apply.
    $endgroup$
    – Acccumulation
    Jul 29 at 16:37










  • $begingroup$
    @Acccumulation - how is the uniform distribution discontinuous? ... and measure theory is a far more general treatment of integration and differentiation than the ordinary understanding of Calc I and II provides.
    $endgroup$
    – jbowman
    Jul 29 at 19:33










  • $begingroup$
    @Accumulation: Yes, that's a fair characterisation, and indeed, that is what is done. Technically the density is a Radon-Nikodym derivative, which is indeed a type of "anti-integral" of the type you describe.
    $endgroup$
    – Ben
    Jul 29 at 22:09
















30












$begingroup$

It is not a mistake: In the formal treatment of probability, via measure theory, a probability density function is a derivative of the probability measure of interest, taken with respect to a "dominating measure" (also called a "reference measure"). For discrete distributions over the integers, the probability mass function is a density function with respect to counting measure. Since a probability mass function is a particular type of probability density function, you will sometimes find references like this that refer to it as a density function, and they are not wrong to refer to it this way.



In ordinary discourse on probability and statistics, one often avoids this terminology, and draws a distinction between "mass functions" (for discrete random variables) and "density functions" (for continuous random variables), in order to distinguish discrete and continuous distributions. In other contexts, where one is stating holistic aspects of probability, it is often better to ignore the distinction and refer to both as "density functions".






share|cite|improve this answer









$endgroup$














  • $begingroup$
    Thanks for your answer. Does treatment "In the formal treatment of probability" mean notation, perspective, convention or something else?
    $endgroup$
    – czlsws
    Jul 29 at 7:13










  • $begingroup$
    When I talk here about the "formal treatment" I am referring to the modern basis of probability theory, which is a subset of measure theory. That is the mathematical theory that is accepted as the formal underpinning of probability.
    $endgroup$
    – Ben
    Jul 29 at 7:23










  • $begingroup$
    "a probability density function is a derivative of the probability measure of interest" It seems to me that in some sense it's more of an "anti-integral" than a derivative. There are discontinuous PDFs, such as the uniform distribution, and discrete distributions can be treated as being sums of Dirac delta functions. In those cases, one would have to have to generalize the concept of a derivative far beyond the ordinary understanding for it to apply.
    $endgroup$
    – Acccumulation
    Jul 29 at 16:37










  • $begingroup$
    @Acccumulation - how is the uniform distribution discontinuous? ... and measure theory is a far more general treatment of integration and differentiation than the ordinary understanding of Calc I and II provides.
    $endgroup$
    – jbowman
    Jul 29 at 19:33










  • $begingroup$
    @Accumulation: Yes, that's a fair characterisation, and indeed, that is what is done. Technically the density is a Radon-Nikodym derivative, which is indeed a type of "anti-integral" of the type you describe.
    $endgroup$
    – Ben
    Jul 29 at 22:09














30












30








30





$begingroup$

It is not a mistake: In the formal treatment of probability, via measure theory, a probability density function is a derivative of the probability measure of interest, taken with respect to a "dominating measure" (also called a "reference measure"). For discrete distributions over the integers, the probability mass function is a density function with respect to counting measure. Since a probability mass function is a particular type of probability density function, you will sometimes find references like this that refer to it as a density function, and they are not wrong to refer to it this way.



In ordinary discourse on probability and statistics, one often avoids this terminology, and draws a distinction between "mass functions" (for discrete random variables) and "density functions" (for continuous random variables), in order to distinguish discrete and continuous distributions. In other contexts, where one is stating holistic aspects of probability, it is often better to ignore the distinction and refer to both as "density functions".






share|cite|improve this answer









$endgroup$



It is not a mistake: In the formal treatment of probability, via measure theory, a probability density function is a derivative of the probability measure of interest, taken with respect to a "dominating measure" (also called a "reference measure"). For discrete distributions over the integers, the probability mass function is a density function with respect to counting measure. Since a probability mass function is a particular type of probability density function, you will sometimes find references like this that refer to it as a density function, and they are not wrong to refer to it this way.



In ordinary discourse on probability and statistics, one often avoids this terminology, and draws a distinction between "mass functions" (for discrete random variables) and "density functions" (for continuous random variables), in order to distinguish discrete and continuous distributions. In other contexts, where one is stating holistic aspects of probability, it is often better to ignore the distinction and refer to both as "density functions".







share|cite|improve this answer












share|cite|improve this answer



share|cite|improve this answer










answered Jul 29 at 2:36









BenBen

36.3k2 gold badges45 silver badges158 bronze badges




36.3k2 gold badges45 silver badges158 bronze badges














  • $begingroup$
    Thanks for your answer. Does treatment "In the formal treatment of probability" mean notation, perspective, convention or something else?
    $endgroup$
    – czlsws
    Jul 29 at 7:13










  • $begingroup$
    When I talk here about the "formal treatment" I am referring to the modern basis of probability theory, which is a subset of measure theory. That is the mathematical theory that is accepted as the formal underpinning of probability.
    $endgroup$
    – Ben
    Jul 29 at 7:23










  • $begingroup$
    "a probability density function is a derivative of the probability measure of interest" It seems to me that in some sense it's more of an "anti-integral" than a derivative. There are discontinuous PDFs, such as the uniform distribution, and discrete distributions can be treated as being sums of Dirac delta functions. In those cases, one would have to have to generalize the concept of a derivative far beyond the ordinary understanding for it to apply.
    $endgroup$
    – Acccumulation
    Jul 29 at 16:37










  • $begingroup$
    @Acccumulation - how is the uniform distribution discontinuous? ... and measure theory is a far more general treatment of integration and differentiation than the ordinary understanding of Calc I and II provides.
    $endgroup$
    – jbowman
    Jul 29 at 19:33










  • $begingroup$
    @Accumulation: Yes, that's a fair characterisation, and indeed, that is what is done. Technically the density is a Radon-Nikodym derivative, which is indeed a type of "anti-integral" of the type you describe.
    $endgroup$
    – Ben
    Jul 29 at 22:09

















  • $begingroup$
    Thanks for your answer. Does treatment "In the formal treatment of probability" mean notation, perspective, convention or something else?
    $endgroup$
    – czlsws
    Jul 29 at 7:13










  • $begingroup$
    When I talk here about the "formal treatment" I am referring to the modern basis of probability theory, which is a subset of measure theory. That is the mathematical theory that is accepted as the formal underpinning of probability.
    $endgroup$
    – Ben
    Jul 29 at 7:23










  • $begingroup$
    "a probability density function is a derivative of the probability measure of interest" It seems to me that in some sense it's more of an "anti-integral" than a derivative. There are discontinuous PDFs, such as the uniform distribution, and discrete distributions can be treated as being sums of Dirac delta functions. In those cases, one would have to have to generalize the concept of a derivative far beyond the ordinary understanding for it to apply.
    $endgroup$
    – Acccumulation
    Jul 29 at 16:37










  • $begingroup$
    @Acccumulation - how is the uniform distribution discontinuous? ... and measure theory is a far more general treatment of integration and differentiation than the ordinary understanding of Calc I and II provides.
    $endgroup$
    – jbowman
    Jul 29 at 19:33










  • $begingroup$
    @Accumulation: Yes, that's a fair characterisation, and indeed, that is what is done. Technically the density is a Radon-Nikodym derivative, which is indeed a type of "anti-integral" of the type you describe.
    $endgroup$
    – Ben
    Jul 29 at 22:09
















$begingroup$
Thanks for your answer. Does treatment "In the formal treatment of probability" mean notation, perspective, convention or something else?
$endgroup$
– czlsws
Jul 29 at 7:13




$begingroup$
Thanks for your answer. Does treatment "In the formal treatment of probability" mean notation, perspective, convention or something else?
$endgroup$
– czlsws
Jul 29 at 7:13












$begingroup$
When I talk here about the "formal treatment" I am referring to the modern basis of probability theory, which is a subset of measure theory. That is the mathematical theory that is accepted as the formal underpinning of probability.
$endgroup$
– Ben
Jul 29 at 7:23




$begingroup$
When I talk here about the "formal treatment" I am referring to the modern basis of probability theory, which is a subset of measure theory. That is the mathematical theory that is accepted as the formal underpinning of probability.
$endgroup$
– Ben
Jul 29 at 7:23












$begingroup$
"a probability density function is a derivative of the probability measure of interest" It seems to me that in some sense it's more of an "anti-integral" than a derivative. There are discontinuous PDFs, such as the uniform distribution, and discrete distributions can be treated as being sums of Dirac delta functions. In those cases, one would have to have to generalize the concept of a derivative far beyond the ordinary understanding for it to apply.
$endgroup$
– Acccumulation
Jul 29 at 16:37




$begingroup$
"a probability density function is a derivative of the probability measure of interest" It seems to me that in some sense it's more of an "anti-integral" than a derivative. There are discontinuous PDFs, such as the uniform distribution, and discrete distributions can be treated as being sums of Dirac delta functions. In those cases, one would have to have to generalize the concept of a derivative far beyond the ordinary understanding for it to apply.
$endgroup$
– Acccumulation
Jul 29 at 16:37












$begingroup$
@Acccumulation - how is the uniform distribution discontinuous? ... and measure theory is a far more general treatment of integration and differentiation than the ordinary understanding of Calc I and II provides.
$endgroup$
– jbowman
Jul 29 at 19:33




$begingroup$
@Acccumulation - how is the uniform distribution discontinuous? ... and measure theory is a far more general treatment of integration and differentiation than the ordinary understanding of Calc I and II provides.
$endgroup$
– jbowman
Jul 29 at 19:33












$begingroup$
@Accumulation: Yes, that's a fair characterisation, and indeed, that is what is done. Technically the density is a Radon-Nikodym derivative, which is indeed a type of "anti-integral" of the type you describe.
$endgroup$
– Ben
Jul 29 at 22:09





$begingroup$
@Accumulation: Yes, that's a fair characterisation, and indeed, that is what is done. Technically the density is a Radon-Nikodym derivative, which is indeed a type of "anti-integral" of the type you describe.
$endgroup$
– Ben
Jul 29 at 22:09














5












$begingroup$

In addition to the more theoretical answer in terms of measure theory, it is also convenient to not distinguish between pmfs and pdfs in statistical programming. For example, R has a wealth of built-in distributions. For each distribution, it has 4 functions. For example, for the normal distribution (from the help file):



dnorm gives the density, pnorm gives the distribution function, qnorm gives the quantile function, and rnorm generates random deviates.


R users rapidly become used to the d, p, q, r prefixes. It would be annoying if you had to do something like drop the d and use m for, say, the binomial distribution. Instead, everything works as an R user would expect:



dbinom gives the density, pbinom gives the distribution function, qbinom gives the quantile function and rbinom generates random deviates.
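The uniform interface is easy to mimic in other languages. Here is a minimal Python sketch (the function names `dbinom` and `dnorm` are chosen to echo R's convention; this is an illustration, not R's actual implementation): a pmf and a pdf are evaluated through the same kind of "density" call.

```python
import math

def dbinom(x, size, prob):
    # Binomial "density" (really a pmf): P(X = x) for X ~ Binomial(size, prob)
    return math.comb(size, x) * prob**x * (1 - prob)**(size - x)

def dnorm(x, mean=0.0, sd=1.0):
    # Normal density at x for X ~ N(mean, sd^2)
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Same interface, whether the underlying object is a pmf or a pdf:
print(dbinom(3, size=10, prob=0.5))  # 120 / 1024 = 0.1171875
print(dnorm(0.0))                    # 1 / sqrt(2*pi) ~= 0.3989
```

As in R, a caller never needs to care which kind of distribution is behind the `d`-function; summing the binomial version over its support gives 1, just as integrating the normal version does.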














$endgroup$










  • 3




    $begingroup$
    scipy.stats distinguishes between them: some objects have a pdf method and others have a pmf method. It really annoys me!
    $endgroup$
    – Matthew Drury
    Jul 29 at 16:19
















answered Jul 29 at 11:42









John Coleman
283 · 1 silver badge · 7 bronze badges









