this is, of course, waaay too fking technical for you to actually understand... so please close this post now and go back to eating that bag of dog cum u had saved up for later. i ain't judging!
but academia is waaay too stiff for me, i much prefer the flaccid dicks rdrama has to offer, so here goes:
in his paper "on computable numbers" turing spent the first half inventing the theory of modern computing, and proving that any such machine he theorized can be described by a single finite-length natural number, called its description number. if u have no idea what that means, or even what binary machine code is... srsly, i do recommend consulting that bag of dog semen for further info.
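to make it concrete (and to be clear, this is my own toy sketch, NOT turing's actual encoding scheme): a machine's description is a finite string, and a finite string is just the digits of one big natural number:

// toy sketch, NOT turing's actual scheme: serialize the machine's description
// to bytes, then read the bytes as base-256 digits of one natural number
const describe = (machine: object): bigint => {
    const bytes = new TextEncoder().encode(JSON.stringify(machine))
    let d = 0n
    for (const b of bytes) d = d * 256n + BigInt(b)
    return d // one finite natural number per machine
}

different machine, different string, different number. that's the entire trick.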
now, the fact that these machines can each be assigned a unique finite-length number, since each binary code for a machine is unique, means that they, as a set, can be bijected with, i.e. put in a one-to-one relationship with, the set of natural numbers. this has the important implication of being able to count them, meaning we can list them out one by one, in some order, without skipping over any, as in the sketch below. yes, yes, ur mum told u that u can do anything (u can't), but mathcels get very ornery and uptight about what it means to successfully count a set of things, as not all sets are countable. for example, u might think u can just count all the real numbers, but actually ur too busy drinking dog sperm while cantor chuckles softly in the background, so it won't ever happen.
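if "countable" is too abstract for whatever's left of ur brain, counting literally looks like this: walk the naturals in order and keep every one that decodes to a machine (is_valid_machine is a hypothetical checker i'm assuming here, it only tests syntax, it doesn't run anything):

// walk the naturals in order; keep those that decode to a well-formed machine.
// the nth thing yielded is the nth machine: nothing skipped, everything reached.
declare function is_valid_machine(d: bigint): boolean // hypothetical checker
function* machines(): Generator<bigint> {
    for (let d = 0n; ; d++) {
        if (is_valid_machine(d)) yield d
    }
}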
let's move on to the title line: "computable number". the concept is simple, even for you: it's just a number we can build a machine for that can calculate any nth digit. for example: pi is a computable number because we can build a machine that can eventually calculate any nth digit of pi. now there are infinitely many of these, obv, and because these computable numbers can be associated with the machines that compute them, and machines are countable... the computable numbers are countable as well. muster up all of whatever the fuck that is u call a brain, and don't forget it! (u will)
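pi's digit machine needs an actual spigot algorithm, so here's the same idea on a number where even u can verify the rule, 1/3:

// 1/3 = 0.333... in decimal and 0.010101... in binary, so machines for
// its nth digit are about as simple as machines get
const third_decimal_digit = (n: number): number => 3     // every decimal digit is 3
const third_binary_digit = (n: number): number => n % 2  // bits go 0,1,0,1,...

"computable" just means some finite machine like that exists for the number.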
this brings us to the famous §8, where basically everyone thinks turing went on to prove some groundbreaking, mind-bending result of the ages, but which was actually just a series of fuckups that has largely contributed to the ungodly world of idiocracy we witness today, in which we confused our ignorance for knowledge and declared math indefinitely incomplete. just like ur parents' marriage.
the section more or less goes like:
well, if machines that compute numbers are countable, then the count can be tricked! i could do this by writing a machine that counts through them and, for each nth machine, calculates the nth digit and returns the inverse of that nth digit, and i can use this to build an inverted diagonal number β. because this β would contain a digit different from every other number on the list, it can't exist in the count!
but since that can't be possible, what's wrong here is that counting them would be equivalent to solving the halting problem, and there is no known method of doing that, so obviously we can't acktually count them. so therefore the set of computable numbers stays countable, because they aren't able to be counted. furthermore, because this has the disadvantage that it may leave the reader with a feeling that "there must be something wrong", i will present the halting paradox that backs this up! fin. bow down and suck the dick of incompleteness u fools!
now what a fking mess: the set of computable numbers remains "countable" because we have no method to acktually count them!? i must say, i truly am left with the feeling that "there must be something wrong". i appreciate the effort for the modern theory of computing, turing, but ya shoulda stopped there... and i don't even want to get into the fact that literally all of computer science from then until now just dogpiled on top with bizarre classifications like "countable" vs "recursively enumerable" that basically do fuck all but jerk off about how right this is!
so anyone else spot the error? no, u didn't, don't fking lie to me, u can't do it. ur a fucking idiot like everyone else. u actually drank all that dog cum, and moved onto raiding the cat litter. marsey is disappoint.
but mommy put ur dumbass through a bootcamp cause u failed out of college, twice, and now u consider urself a "codecel". so let me translate the issue into a language that resembles the utter dogshit u barf up day after day:
inverse_diagonal = (n: number) => {
    let count = 0
    // walk the enumeration out to the nth machine
    for (const computable_number of enumerate.computable_numbers()) {
        if (count < n)
            count += 1
        else
            // invert the nth digit of the nth computable number
            return computable_number(n) ? 0 : 1
    }
}

enumerate_diagonal = () => {
    // the nth digit of β is the inverted nth digit of the nth computable number
    for (let n = 0; true; n++) print(inverse_diagonal(n))
}
running enumerate_diagonal is the machine turing claims can calculate β. the problem is it can't actually do that, and it can't do that regardless of whether it would solve the halting problem or not. that halting connection is a totally irrelevant thread turing had no business going down, because there is a more fundamental reason why this doesn't work...
still can't see? ok, i'll throw u a bone, it's a bit healthier than that cat shit u've been munching on. recall that we are counting through the computable numbers, "enumerate" being a fancy word for doing this, so the iterable enumerate.computable_numbers() must iterate over all computable numbers. if turing's claim is to be true, this must also include inverse_diagonal itself. what happens then?
.... no? still don't get it?? god i really am gunna spell out everything line by line, eh? fuck:
- (1) inverse_diagonal is the machine that computes the digits of β.
- (2) at some input n, the variable computable_number will be the machine inverse_diagonal itself, referring to itself, since, if β is to be part of the set of computable numbers, then the machine that computes it must eventually be enumerated upon.
- (3) at that point, inverse_diagonal will run the equivalent of inverse_diagonal(n) and get stuck in infinite recursion.
- (4) therefore, you cannot prove the halting process contradictory through means of β, as β cannot give an inverse of its own nth digit, and therefore cannot be a proper inverse diagonal, regardless of any "hidden assumption" about solving the halting problem. that was totally necessary to bring up.
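and if u want to watch it die in real time, here's a minimal self-contained sketch of mine (the array stands in for the enumeration, and i'm planting inverse_diagonal at index 0 purely for the demo):

// the enumeration must contain β's own machine somewhere, so plant it
// in the list and watch the self-reference blow the stack
const machines: Array<(n: number) => number> = []
const inverse_diagonal = (n: number): number => {
    const nth_machine = machines[n]     // the nth machine in the enumeration
    return nth_machine(n) === 0 ? 1 : 0 // invert its nth digit
}
machines.push(inverse_diagonal)         // β's machine is itself on the list
inverse_diagonal(0)                     // RangeError: Maximum call stack size exceeded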
if u can't understand it now, u had no business reading this post in the first place, and just go jump off a building or something fun like that. ur daddy will be proud that u've learned to fly! think of it as "falling with style".
for that one reader who's slightly less of a retard and still with me:
it's absolutely absurd turing didn't see this. the dude went on to proudly use an infinite recursion issue in his paradox construction on the very next page, so why didn't he see it here? heck, he even intuited a sense of wrongness about these initial thoughts on the matter, but i guess he was so enamored with the halting problem that he just didn't dig deep enough into β to recognize the wrong path he'd inadvertently gone down. i can't explain that any more than i can explain the idiots that just jumped off a building over an rdrama suggestion that they voluntarily read
honestly, i accept the dude being fallible, i forgive u turing. u gave us the theory of modern computing, and that really was a fucking stroke of genius for the ages. this misstep is almost entirely not ur fault.
but what i really can't explain is how the flying fuck this went unnoticed for so long.
what the fuck have mathcels been doing?!
I think it's because the standard class curriculum teaches a bunch of mathematical topics that are useful to know for coding (or for engineering, which has the same problem) but it rarely teaches the fundamentals because there's not enough time and the students need more Java OOP slop classes
As a math major it took like 3-4 classes of logic and real analysis to prove rigorously all the calculus ideas that my buddies in engineering were learning in one week of their math classes. It has to be like this so they can study mechanical engineering, or circuits, or all the other applied topics.
Now, the market needs more engineers and programmers than it needs fart huffing math majors, and the engineering and programming students I tutored would have killed themselves if they had to do 2 years of useless math classes. The current system works well enough, but I think it produces a lot of engineers and coders who think they are the next Isaac Newton when in fact they have a shaky understanding of the fundamentals
right, so u too accept that we are not able to actually count countable sets?
what's the point of studying anything if u accept something that's fricking r-slurred.
what is your definition of countable here?
countable aka able-to-be-counted aka we can go: 0th computable number, 1st computable number, 2nd computable number, 3rd computable number, etc....
jeez, math bro went to college just to ask stupid questions on wtf counting is?!?!
proof academia is a waste of time
I think I'm confused by the way you write
counting too hard for u, mathcel?
might as well get it over with now, ur life will never amount to anything
Alas, as long as I fail upward it's my employer's problem, not mine
That's such a weird thing to do if they think that way, but I think it's because, despite college Calculus and Differential Equations being just foundational math, it's still more math than >90% of the population learns, so it may create this false impression of being way more knowledgeable than they actually are.
The past few weeks I've been learning that doing proofs is hard lmao, well, it takes time and practice, so yeah, I get why engineering courses jump straight to applications to keep things fast paced.
I could rant about universities for a while lol.
To give you another example, a friend in physics showed me his first-semester math exam while I was in my final year of math, and the physics freshmen had like super tough complex analysis problems they were expected to solve through magical formulas.
It's hard but fun! I enjoyed seeing the discussion of Turing's proof in this thread
I think there's no easy way to make it better; while formula memorization is kind of a braindead way to learn, it's basically the only way to have a fast-paced course, especially for physics as they have to learn so many concepts and apply them.
It is! Though I feel like an r-slur struggling with basic set proofs lmao, maybe I'll fare better in a year or two