Does Wolfram Mathworld make a mistake describing a discrete probability distribution with a probability density function?
Usually a probability distribution over discrete variables is described using a probability mass function (PMF):
When working with continuous random variables, we describe probability distributions using a probability density function (PDF) rather than a probability mass function.
-- Deep Learning by Goodfellow, Bengio, and Courville
However, Wolfram Mathworld uses a PDF to describe a probability distribution over discrete variables:
Is this a mistake, or does it not matter much?
probability mathematical-statistics terminology pdf
That's sloppy, in my opinion, but not very important. It's even defensible if they approach probability from the standpoint of measure theory, though that seems like a bit much for an introduction to flipping a coin. (Weirdly enough, they don't appear to have an article on PMFs.)
– Dave
Jul 28 at 23:39
A pmf is a density against the counting measure.
– Xi'an
Jul 29 at 1:34
When you discuss probability theory at the level of a measure space specified by three elements, there is no difference between a pdf and a pmf, so the pmf is dropped. All distributions can be specified by a pdf. Wolfram is a math website, so it is not surprising that they use higher-level math to talk about probability. Here is some good free reading: stat.washington.edu/~pdhoff/courses/581/LectureNotes/…
– user158565
Jul 29 at 2:06
2 Answers
It is not a mistake: In the formal treatment of probability, via measure theory, a probability density function is a derivative of the probability measure of interest, taken with respect to a "dominating measure" (also called a "reference measure"). For discrete distributions over the integers, the probability mass function is a density function with respect to counting measure. Since a probability mass function is a particular type of probability density function, you will sometimes find references like this that refer to it as a density function, and they are not wrong to refer to it this way.
In ordinary discourse on probability and statistics, one often avoids this terminology, and draws a distinction between "mass functions" (for discrete random variables) and "density functions" (for continuous random variables), in order to distinguish discrete and continuous distributions. In other contexts, where one is stating holistic aspects of probability, it is often better to ignore the distinction and refer to both as "density functions".
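Concretely (writing $P_X$ for the distribution of $X$, $\mu$ for the dominating measure, and $f$ for the density), one standard way to state this is

$$f = \frac{dP_X}{d\mu}, \qquad P(X \in A) = \int_A f \, d\mu .$$

Taking $\mu$ to be Lebesgue measure gives the familiar $\int_A f(x)\,dx$ for a continuous variable, while taking $\mu$ to be counting measure on the integers reduces the integral to $\sum_{x \in A} f(x)$, which is exactly the probability mass function.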
answered Jul 29 at 2:36 by Ben
Thanks for your answer. Does "treatment" in "In the formal treatment of probability" mean notation, perspective, convention, or something else?
– czlsws
Jul 29 at 7:13
When I talk here about the "formal treatment" I am referring to the modern basis of probability theory, which is a subset of measure theory. That is the mathematical theory that is accepted as the formal underpinning of probability.
– Ben
Jul 29 at 7:23
"a probability density function is a derivative of the probability measure of interest" It seems to me that in some sense it's more of an "anti-integral" than a derivative. There are discontinuous PDFs, such as the uniform distribution, and discrete distributions can be treated as being sums of Dirac delta functions. In those cases, one would have to have to generalize the concept of a derivative far beyond the ordinary understanding for it to apply.
– Acccumulation
Jul 29 at 16:37
@Acccumulation - how is the uniform distribution discontinuous? ... and measure theory is a far more general treatment of integration and differentiation than the ordinary understanding of Calc I and II provides.
– jbowman
Jul 29 at 19:33
@Acccumulation: Yes, that's a fair characterisation, and indeed, that is what is done. Technically the density is a Radon-Nikodym derivative, which is indeed an "anti-integral" of the type you describe.
– Ben
Jul 29 at 22:09
In addition to the more theoretical answer in terms of measure theory, it is also convenient not to distinguish between pmfs and pdfs in statistical programming. R, for example, has a wealth of built-in distributions, and for each distribution it provides four functions. For the normal distribution (from the help file):
dnorm gives the density, pnorm gives the distribution function, qnorm gives the quantile function, and rnorm generates random deviates.
R users rapidly become used to the d, p, q, r prefixes. It would be annoying if you had to do something like drop d and use m for, e.g., the binomial distribution. Instead, everything is as an R user would expect:
dbinom gives the density, pbinom gives the distribution function, qbinom gives the quantile function and rbinom generates random deviates.
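As a minimal illustration (the parameter values here are arbitrary), the same d prefix returns a density value in the continuous case and a probability mass in the discrete case, while the p prefix is the CDF in both:

    # Continuous: dnorm() evaluates the normal density at a point.
    # The result is a density, not a probability (it can exceed 1 for small sd).
    dnorm(0, mean = 0, sd = 1)          # 1/sqrt(2*pi), about 0.3989

    # Discrete: dbinom() evaluates the binomial mass function,
    # i.e. P(X = 3) for X ~ Binomial(size = 10, prob = 0.5).
    dbinom(3, size = 10, prob = 0.5)    # choose(10, 3) / 2^10 = 0.1171875

    # The p-prefixed functions are the CDFs, related to the d-functions
    # by integration (continuous) or summation (discrete):
    pnorm(1.96)                         # about 0.975
    pbinom(3, size = 10, prob = 0.5)    # equals sum(dbinom(0:3, 10, 0.5))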
answered Jul 29 at 11:42 by John Coleman
scipy.stats distinguishes: some objects have a pdf method and others have a pmf method. It really annoys me!
– Matthew Drury
Jul 29 at 16:19