this is of course, waaay too fking technical for you to actually understand... so please close this post now and go back to eating that bag of dog cum u had saved up for later, i ain't judging!
but academia is waaay too stiff for me, i much prefer the flaccid dicks rdrama has to offer, so here goes:
in his paper "on computable numbers" turing spent the first half inventing the theory of modern computing, and proving that each of the machines he theorized can be described by a single finite-length natural number, called its description number. if u have no idea what that means, or even what binary machine code is... srsly i do recommend consulting that bag of dog semen for further info.
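to make "description number" concrete, here's a toy sketch in python. to be clear: this is NOT turing's actual encoding (he mapped the letters of his "standard description" onto digits); the function and the little machine below are made up for illustration. the idea is just: serialize the finite transition table into a string over a fixed alphabet, then read that string back as one big natural number.

```
# toy sketch of a "description number": NOT turing's exact encoding,
# just the same idea of squashing a finite machine table into one integer.
ALPHABET = "0123456789,;_<>"   # every character the serialization can use

def description_number(transitions):
    """transitions: list of (state, read, write, move, next_state) tuples."""
    text = ";".join(",".join(str(field) for field in t) for t in transitions)
    base = len(ALPHABET) + 1
    n = 0
    for ch in text:
        n = n * base + ALPHABET.index(ch) + 1   # +1 so a leading char isn't lost
    return n

# a 2-state machine that writes 0 1 0 1 ... forever
toy_machine = [
    (0, "_", "0", ">", 1),
    (1, "_", "1", ">", 0),
]
print(description_number(toy_machine))   # one finite natural number
```

different table, different number; same table, same number. that's all a description number is.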
now, the fact that these machines can each be assigned a unique finite-length number, as each binary code for a machine is unique, means that they, as a set, can be bijected with, or put in a one-to-one relationship with, the set of natural numbers. this has the important implication of being able to count them, meaning we can list them out one by one, in some order, without skipping over any. yes, yes, ur mum told u that u can do anything (u can't), but mathcels get very ornery and uptight about what it means to successfully count a set of things, as not all sets are countable. for example u might think u can just count all real numbers, but actually ur too busy drinking dog sperm while cantor chuckles softly in the background, so it won't ever happen.
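and here's what "list them out one by one, in some order, without skipping over any" looks like as code. hedge: real description numbers aren't arbitrary binary strings, but the counting trick is identical: go by length, then in order within each length, and every finite code lands at some finite index n.

```
from itertools import count, product

def all_binary_codes():
    """yield every finite binary string exactly once: all of length 1, then 2, ..."""
    for length in count(1):
        for bits in product("01", repeat=length):
            yield "".join(bits)

# every possible finite machine code shows up at some finite position,
# which is exactly what pairing a set off with the naturals means.
for n, code in zip(range(10), all_binary_codes()):
    print(n, code)
```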
let's move on to the title line: "computable number". the concept is simple, even for you: it's just a number for which we can build a machine that can calculate its nth digit, for any n. for example: pi is a computable number because we can build a machine that can eventually calculate any nth digit of pi. now there are an infinite number of these, obv, and because these computable numbers can be associated with the machines that compute them, and machines are countable... these computable numbers are countable as well. muster up all of whatever the fuck that is u call a brain, and don't forget it! (u will)
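a hedged example, since an actual pi machine needs fancier math: here's the same idea for a humble rational number instead. one finite program (plain long division) that answers "what's digit n?" for every n u throw at it, which is all "computable number" means.

```
def nth_decimal_digit(p, q, n):
    """nth digit after the decimal point of the fraction p/q (n >= 1)."""
    r = p % q
    for _ in range(n):
        r *= 10
        digit, r = divmod(r, q)   # one step of long division per digit
    return digit

# 1/7 = 0.142857142857...
print([nth_decimal_digit(1, 7, n) for n in range(1, 8)])   # [1, 4, 2, 8, 5, 7, 1]
```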
this brings us to the famous §8, where basically everyone thinks turing went on to prove some groundbreaking, mind-bending result of the ages, but which was actually just a series of fuckups that has largely contributed to the ungodly world of idiocracy we witness today, in which we confused our ignorance for knowledge and declared math indefinitely incomplete. just like ur parents' marriage.
the section more or less goes like:
well, if machines that compute numbers are countable, then the count can be tricked! i could do this by writing a machine that counts through them, and for each nth machine, calculates the nth digit, returns the inverse of that nth digit, and i can use this to build an inverted diagonal number β. because this β would contain a digit different from every other number on the list, it can't exist in the count!
but since that can't be possible, what's wrong here is that counting them would be equivalent to solving the halting problem, and there is no known method of doing that, so obviously we can't acktually count them. so therefore the set of computable numbers stays countable, because they aren't able to be counted. furthermore, because this has the disadvantage that it may leave the reader with a feeling that "there must be something wrong", i will present the halting paradox that backs this up! fin. bow down and suck the dick of incompleteness u fools!
now what a fking mess: the set of computable numbers remains "countable", because we have no method to acktually count them!? i must say, i truly am left with the feeling that "there must be something wrong". i appreciate the effort for the modern theory of computing, turing, but ya shoulda stopped there... and i don't want to get into the fact that literally all of computer science from then until now just dogpiled on top with bizarre classifications to justify this, like "countable" vs "recursively enumerable", that basically do fuck all but jerk off about how right this is!
so anyone else spot the error? no, u didn't, don't fking lie to me, u can't do it. ur a fucking idiot like everyone else. u actually drank all that dog cum, and moved onto raiding the cat litter. marsey is disappoint.
but mommy put ur dumbass through a bootcamp cause u failed out of college, twice, and now u consider urself a "codecel". so let me translate the issue into a language that resembles the utter dogshit u barf up day after day:
inverse_diagonal = (n: number) -> {
    // walk the enumeration until we reach the nth machine,
    // then return the inverse of that machine's nth digit
    count = 0
    for (computable_number) in (enumerate.computable_numbers()) {
        if (count < n)
            count += 1
        else
            return computable_number(n) ? 0 : 1
    }
}

enumerate_diagonal = () -> {
    // print β digit by digit: digit n differs from the nth number on the list
    for (n = 0; true; n++) print inverse_diagonal(n);
}
running enumerate_diagonal is how turing claims β can be calculated. the problem is it can't actually do that, and it can't do that regardless of whether doing so would solve the halting problem or not. that halting connection is a totally irrelevant thread turing had no business going down, because there is a more fundamental reason why this doesn't work...
still can't see? ok, i'll throw u a bone, it's a bit healthier than that cat shit u've been munching on. recall that we are counting through the computable numbers, "enumerate" being a fancy word for doing this, so the iterable enumerate.computable_numbers() must iterate over all computable numbers. if turing's claim is to be true, this must also include inverse_diagonal itself. what happens then?
.... no? still don't get it?? god i really am gunna spell out everything line by line, eh? fuck:
- (1) inverse_diagonal is the machine that computes the digits of β.
- (2) at some input n, the variable computable_number will be the machine inverse_diagonal referring to itself, because if β is to be part of the set of computable numbers, the machine that computes it will need to eventually be enumerated upon.
- (3) at that point, inverse_diagonal will run the equivalent of inverse_diagonal(n) and get stuck in infinite recursion (see the sketch after this list).
- (4) therefore, you cannot prove the halting process contradictory by means of β, as β cannot give an inverse of its own nth digit, and therefore cannot be a proper inverse diagonal, regardless of any "hidden assumption" about solving the halting problem. that was totally necessary to bring up.
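here's a minimal runnable sketch of point (3), in python, with a hand-rolled list of digit functions standing in for the enumeration. obviously not a real enumeration of all computable numbers; the only thing that matters is that it contains inverse_diagonal itself. the diagonal happily inverts every other machine's digit and chokes the moment it reaches its own index:

```
import sys
sys.setrecursionlimit(100)   # keep the inevitable blow-up short

# toy stand-in for the enumeration: each entry maps n -> an nth binary digit
machines = []

def inverse_diagonal(n):
    """digit n of the would-be diagonal β: invert machine n's nth digit."""
    return 0 if machines[n](n) else 1

machines.extend([
    lambda n: 0,          # machine 0: 0.000...
    lambda n: 1,          # machine 1: 0.111...
    inverse_diagonal,     # machine 2: the diagonal machine itself
    lambda n: n % 2,      # machine 3: 0.0101...
])

print(inverse_diagonal(0))        # fine -> 1
print(inverse_diagonal(1))        # fine -> 0
print(inverse_diagonal(3))        # fine -> 0
try:
    print(inverse_diagonal(2))    # asks itself for the digit at its own index
except RecursionError:
    print("β can't produce the digit at its own index: infinite recursion")
```

swap the list for any enumeration u like; as long as inverse_diagonal is somewhere in it, that one index never returns.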
if u can't understand it now, u had no business reading this post in the first place, and just go jump off a building or something fun like that. ur daddy will be proud that u've learned to fly! think of it as "falling with style".
for that one slightly less of a retard still with me:
it's absolutely absurd turing didn't see this. the dude went on to proudly use an infinite recursion issue in his paradox construction on the very next page, so why didn't he see it here? heck, he even intuited a sense of wrongness about these initial thoughts on the matter, but i guess he was so enamored with the halting problem that he never dug deep enough into β to recognize the wrong path he'd inadvertently gone down. i can't explain that any more than i can explain the idiots who just jumped off a building over an rdrama suggestion they voluntarily read.
honestly, i accept the dude being fallible, i forgive u turing. u gave us the theory of modern computing, and that really was a fucking stroke of genius for the ages. this misstep is almost entirely not ur fault.
but what i really can't explain is how the flying fuck this went unnoticed for so long.
what the fuck have mathcels been doing?!
u mad bro?
it feeds my ego when people try to throw up a bunch of irrelevant info when they've been stumped, cause at that point, i know they've been stumped too hard to provide anything meaningful in terms of a counterpoint.