
Is there a Kolmogorov complexity proof of the prime number theorem?

















Lance Fortnow uses Kolmogorov complexity to prove an Almost Prime Number Theorem (https://lance.fortnow.com/papers/files/kaikoura.pdf, after Theorem 2.1): the $i$th prime is at most $i(\log i)^2$. This differs from the bound in the Prime Number Theorem only by a factor of $\log i$.
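(A quick empirical sanity check of this bound, added for illustration and not part of Fortnow's notes: taking $\log$ to be the natural logarithm, the bound is asymptotic, but it already holds for every $4 \le i \le 10000$, as the following Python snippet using sympy's prime sieve confirms.)

```python
# Empirical check of the "almost PNT" bound: p_i <= i*(log i)^2 (natural log).
# The bound is asymptotic; with natural log it happens to hold from i = 4 on.
from math import log
from sympy import sieve

for i in range(4, 10001):
    assert sieve[i] <= i * log(i) ** 2, f"bound fails at i = {i}"
print("p_i <= i*(log i)^2 holds for 4 <= i <= 10000")
```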



  1. Is there a way to sharpen this proof to a bound of $O(i\log i)$, making good use of complexity theory? (Matt F.'s comment gives a smaller $O((\log\log i)^2)$ factor.)

Added: if 1 is not possible, what additional information do we need to get to the Prime Number Theorem?



The only limitation seems to be the creativity needed in constructing prefix-free codes. Perhaps there is an asymptotically efficient coding, or some other line of reasoning, that gets to the needed bound (a small illustration of such codes follows the questions below).



  2. Is there a connection between prefix-free codes and multiplicative number theory? Both try to characterize unique representation in some manner.
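To make question 2 concrete, here is a minimal sketch (my illustration; Elias codes are not named in Fortnow's notes) of the kind of prefix-free codes involved: the Elias gamma code spends about $2\log_2 n$ bits on $n$, and the Elias delta code about $\log_2 n + 2\log_2\log_2 n$ bits, which is exactly the source of factors like $(\log i)^2$ and $(\log\log i)^2$ in bounds of the above kind.

```python
# Elias gamma and delta codes: prefix-free encodings of the positive integers.
# gamma(n) uses about 2*log2(n) bits; delta(n) about log2(n) + 2*log2(log2(n)).

def elias_gamma(n: int) -> str:
    assert n >= 1
    b = bin(n)[2:]                  # binary expansion of n, leading 1 included
    return "0" * (len(b) - 1) + b   # unary length prefix, then the bits

def elias_delta(n: int) -> str:
    assert n >= 1
    b = bin(n)[2:]
    return elias_gamma(len(b)) + b[1:]  # gamma-code the length, drop leading 1

for n in (5, 1000, 10**6):
    print(f"n={n}: |gamma|={len(elias_gamma(n))} bits, "
          f"|delta|={len(elias_delta(n))} bits")
```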









nt.number-theory prime-numbers it.information-theory coding-theory prime-number-theorem

asked Aug 8 at 1:02, edited Aug 10 at 0:40 by Turbo









  • Matt F. (Aug 8 at 2:26): This argument can also give a bound of $i(\log i)(\log\log i)^2$, if you bound $\log m$ using the second equation from p. 3 instead of the first.

  • John Baez (Aug 8 at 8:59): By the way, the Prime Number Theorem is equivalent to an asymptotic formula $p_i \sim i \log i$. This is much stronger than just saying $p_i = O(i \log i)$.

  • kodlu (Aug 8 at 10:59): $\sim$ means the proportionality constant is $1$, so $f(x)\sim g(x)$ if $\lim_{x\to\infty} f(x)/g(x)=1$.

  • Yemon Choi (Aug 8 at 14:53): I have rolled back an edit from the user who seems determined to enforce their standards and customs for punctuation of headings, to very very very little benefit.

  • YCor (Aug 8 at 15:57): Btw, the $O(i\log i)$ upper bound is due to Chebyshev around 1850, well before the PNT was proved in its usual form.















2 Answers



















Firstly, the proof given there doesn't really show that $p_n = O(n(\log n)^2)$
(at least not without further effort). Instead what it shows is that there are
$n$ for which $p_n = O(n(\log n)^2)$, and with a small effort that this holds for arbitrarily large $n$. As noted in the comments, the argument also gives that there are $n$ with $p_n = O(n (\log n) (\log\log n)^2)$, etc.



In fact, all that is going on in this proof is that the sum of the reciprocals of the primes diverges, which also immediately gives that there must be large $n$ with $p_n \le n (\log n)^2$ etc. (else the sum of reciprocals would converge). So let me reformulate this "new proof" in more usual language: the proof is nice, but there aren't really any new ideas here, only less familiar language (which is interesting and suggestive).



Consider integers $n$ below $x$ (which is large), and sort them according to their largest prime factor $p=P(n)$. If $z$ is any fixed number, then there are at most $(\log x)^z$ numbers below $x$ all of whose prime factors are less than $z$. Therefore
$$
x \le (\log x)^z + \sum_{\substack{n\le x \\ P(n) > z}} 1
\le (\log x)^z + \sum_{p>z} \frac{x}{p},
$$
since there are at most $x/p$ integers below $x$ whose largest prime factor is $p$. This shows that for any fixed $z$
$$
\sum_{p > z} \frac{1}{p} > 1,
$$
which amounts to the divergence of the sum of reciprocals of primes. This is the content of the proof in the linked notes.
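Here is a small numerical illustration of this divergence (a sketch in Python using sympy's primerange; not part of the original argument): for a fixed $z$, the partial sums $\sum_{z<p\le x} 1/p$ climb past any fixed level as $x$ grows.

```python
# Partial sums of 1/p over primes z < p <= x grow without bound as x grows,
# reflecting the divergence of the sum of reciprocals of the primes.
from sympy import primerange

z = 10
for x in (10**3, 10**4, 10**5, 10**6):
    s = sum(1.0 / p for p in primerange(z + 1, x + 1))
    print(f"x = {x:>7}: sum of 1/p over {z} < p <= x is {s:.4f}")
```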



There is one fact that I find intriguing. The complexity argument of how to concatenate "programs" given there seems to dovetail exactly with how a sequence must grow in order to have a divergent sum (that $\sum 1/(n(\log n)^2)$ converges, $\sum 1/(n(\log n)(\log\log n)^2)$ converges, etc.) --- curious!






answered Aug 8 at 15:15 by Lucia














  • Turbo (Aug 8 at 17:37): "At least not without further effort" indicates $p_n=O(n(\log n)^2)$ is possible to state in complexity language?

  • Turbo (Aug 8 at 17:40): "The complexity argument of how to concatenate 'programs' given there seems to dovetail exactly with how a sequence must grow in order to have a divergent sum" indicates there is something more up your sleeve?

  • Turbo (Aug 8 at 17:42): Would it also be used to show there is always an $n$ such that $p_n$ is prime and $\sum 1/(n\log n)$ converges? Perhaps such arguments have use in other sequences?

  • GH from MO (Aug 9 at 10:15): This proof of $\sum 1/p=\infty$ resembles Erdős's proof of this fact (cf. proof of Theorem 19 in Hardy–Wright).

  • Lucia (Aug 9 at 13:55): @GHfromMO: Well spotted!



















I think it’s not uncommon for arguments in elementary number theory to be “philosophically” information-theoretic in nature. But this is not a very deep fact: at the end of the day it all comes down to the well-known heuristic principle that the events of a “random integer” being divisible by different primes are approximately independent. It’s making that heuristic notion precise that always requires considerably more subtle arguments involving complex analysis etc., in which information-theoretic thinking plays no apparent role. So I’m reasonably sure the answer to your title question about the prime number theorem is “no”.



Nonetheless, one can do some cute things with this idea. Here is an example I noticed many years ago, related to the standard estimate
$$
\textrm{(*)} \qquad
\sum_{p \le n} \frac{\log p}{p} = \log(n) + O(1)
\qquad (n\to \infty)
$$
(summation over primes) — a series that is very much related to the sum of the reciprocals of the primes and to estimates of $p_n$ which you and others were discussing here. I will give (*) the following information-theoretic interpretation (which will give a rigorous proof of a bound in one direction):



Take a random variable $X$ that is chosen uniformly at random from the numbers $1,\ldots,n$. For each prime $p\le n$, let $V_p$ be the $p$-valuation of $X$ (the exponent of $p$ dividing $X$). A key observation is that knowing all the $V_p$'s, one can recover $X$. That means that there is an inequality of Shannon entropies:
$$
\log_2(n) = H(X)
\le H(V_p,\ p\le n)
\le \sum_{p\le n} H(V_p),
$$
by well-known properties of the entropy function $H(\cdot)$ (subadditivity and monotonicity with respect to the $\sigma$-field, precisely analogous to the properties of Kolmogorov complexity that Fortnow uses in his notes).
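One can check this inequality numerically; the sketch below (not part of the original answer) computes $H(X)=\log_2 n$ and the exact entropies $H(V_p)$ from the distribution of the $p$-valuations of a uniform $X$ on $\{1,\ldots,n\}$:

```python
# Check log2(n) = H(X) <= sum over p <= n of H(V_p), X uniform on {1,...,n}.
from math import log2
from sympy import primerange

def valuation_entropy(p: int, n: int) -> float:
    """Entropy in bits of V_p, the exponent of p in a uniform X from 1..n."""
    h, pk = 0.0, 1
    while pk <= n:
        # Pr(V_p = k) = (floor(n/p^k) - floor(n/p^(k+1))) / n, with pk = p^k
        q = (n // pk - n // (pk * p)) / n
        if q > 0:
            h -= q * log2(q)
        pk *= p
    return h

n = 10**5
total = sum(valuation_entropy(p, n) for p in primerange(2, n + 1))
print(f"log2(n) = {log2(n):.3f}  <=  sum_p H(V_p) = {total:.3f}")
```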



Now, what is the entropy $H(V_p)$ of the random variable $V_p$? It's the expression
$$
- \sum_{k\ge 0} \operatorname{Pr}(V_p=k)
\log \operatorname{Pr}(V_p=k),
$$
which can be bounded from above by the $k=1$ term
$$
-\operatorname{Pr}(V_p=1)
\log \operatorname{Pr}(V_p=1) =
-\left( \frac{1}{n} \lfloor n/p \rfloor \right)
\log \left( \frac{1}{n} \lfloor n/p \rfloor \right),
$$
plus all the other parts (which I'll leave as an exercise to show are $O(1)$).
And this last quantity is approximately equal to $\frac{\log p}{p}$ as long as $p$ is smaller in order of magnitude than $n$. Thus after some more simple estimates one gets essentially the "$\ge$" side of (*), together with the added insight that the error term in approximations such as (*) says something about the extent to which divisibility properties of a typical number by different primes are correlated with each other.



As I was saying at the beginning, this is kind of interesting at a philosophical level, but I’m not aware that anyone has found a way to make these sorts of arguments precise enough to prove something as strong as the prime number theorem, or indeed anything stronger than the most elementary sorts of estimates that already have very short and direct proofs.






answered Aug 10 at 0:32 by Dan Romik














  • Turbo (Aug 10 at 0:39): I was just interested in whether such wood-pushing arguments get to the PNT. Perhaps I will add a note.












