Do you know how large the gaps between consecutive prime numbers get as we head off to infinity? No? Well, you're not alone. The closest thing there is to an answer is Cramér's conjecture – an unproven formulation. But a few Spanish researchers at the University of Barcelona have found a probability model which seems to follow reality very closely. It may give us some additional insight into how prime numbers (and our number system) operate, and into the foundations of mathematics.
Dirty details in the paper on arXiv:
http://arxiv.org/pdf/1402.3612v1.pdf
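To get a feel for what the conjecture is about (this is just an illustrative sketch, not the researchers' model): Cramér conjectured that the gaps between consecutive primes near p grow roughly like (ln p)², so the record gaps found below a bound can be compared against that rate.

```python
import math

def primes_up_to(n):
    """Simple sieve of Eratosthenes returning all primes <= n."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i*i::i] = [False] * len(sieve[i*i::i])
    return [i for i, is_p in enumerate(sieve) if is_p]

primes = primes_up_to(100_000)
record = 0
for p, q in zip(primes, primes[1:]):
    gap = q - p
    if gap > record:
        record = gap
        # Cramér's conjecture suggests record gaps track (ln p)^2
        print(f"gap {gap:3d} after prime {p:6d}; (ln p)^2 = {math.log(p)**2:6.1f}")
```

The bound of 100,000 is just an arbitrary choice for a quick run; the open question is what happens as this heads off to infinity.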
+John Brøndum the paper doesn't dispute that. It simply says that their model is surprisingly accurate for the first 10^11 primes, which is more than any model so far. It hasn't been verified off to infinity. That's where the next step comes in.
+Frederic Dahl cheers, yes. That should've been every integer greater than 1.
I'm slightly confused +Sophie Wrobel. I thought that we knew there are infinitely many pairs of primes with a gap of less than 70 million, even heading off to infinity.
http://www.unh.edu/news/releases/2013/may/bp16zhang.cfm
Not every integer…
0 is even, because it's a whole number multiple of 2
Is zero even or odd?
The Fundamental Theorem of Arithmetic states that every integer greater than 1 can be expressed uniquely as a product of primes. So, for example,
10 = 2 * 5
And multiplying 2 and 5 (both of which are prime) is the only way to get 10.
If 1 were prime we'd have infinitely many ways to multiply primes and get 10.
10 = 2 * 5
10 = 1 * 2 * 5
10 = 1 * 1 * 2 * 5
etc
— and the Fundamental Theorem of Arithmetic would be false!
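A quick sketch of that uniqueness in code (an illustration, not anything from the paper): trial division factors an integer greater than 1 into its unique multiset of prime factors, and it only works because 1 is excluded from the primes.

```python
def prime_factors(n):
    """Factor an integer n > 1 into its unique list of prime factors."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:   # divide out each prime completely
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # whatever remains is itself prime
        factors.append(n)
    return factors

print(prime_factors(10))   # [2, 5]
print(prime_factors(360))  # [2, 2, 2, 3, 3, 5]
```

If 1 counted as a prime, these factor lists would stop being unique — you could pad any of them with as many 1s as you like, exactly as the 10 = 1 * 2 * 5 example above shows.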
That was what I was taught in primary school as well. In secondary school I was taught it was wrong.
Ah, now we are talking +fan tai. However, I was taught in school that 1 was prime, as the definition was "all numbers that can only be divided by themselves and 1". Unity fits that bill.
But it was a Catholic school.
It never was. People made mistakes in transmitting knowledge. AFAIK, the definition has always been "a number larger than 1 that can only be divided by itself and 1".
Thank you +T. Pascal
+fan tai I know the formal definition of primes. The question was why? An answer would be: "At the [insert here] convention, in [insert here], mathematicians agreed that "1" should no longer be counted as prime, as it was before, because [insert rationale related to advances in number theory here]."
I know an upper bound for the next prime: given a list of every (heretofore discovered) prime number, multiply them together and add one. That number is absurdly large at infinity, however. 🙂
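That Euclid-style bound can be sketched in a few lines (an illustration with a hand-picked list of small primes): the product of the known primes plus one is divisible by none of them, so a new prime must exist at or below that huge number.

```python
from math import prod

known = [2, 3, 5, 7, 11]
candidate = prod(known) + 1          # 2*3*5*7*11 + 1 = 2311
# candidate leaves remainder 1 when divided by every known prime,
# so any prime factor of it is a prime not in the list
assert all(candidate % p != 0 for p in known)
print(candidate)  # 2311, which happens to be prime itself here
# Note: the construction doesn't always yield a prime directly,
# e.g. 2*3*5*7*11*13 + 1 = 30031 = 59 * 509
```

Either way, some prime factor of the candidate is new, which is why it serves as an (absurdly generous) upper bound on where the next prime lives.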
Because 1 is not a prime number. Google will explain.
+Sophie Wrobel do you know why "1" was taken off the list of primes since I attended grade school? It makes no sense to me.