Reported by:
  • ObamaBinLaden : Let me tell you why this fraggot from 80 years ago was wrong. Do something productive mf
  • MinecraftBee : Homophobia

EFFORTPOST :marseybigbrain: how turing was wrong: part 1

this is of course, waaay too fking technical for you to actually understand... so please close this post now and go back to eating that bag of dog cum u had saved up for later, i ain't judging!

but academia is waaay too stiff for me, i much prefer the flaccid dicks rdrama has to offer, so here goes:

in his paper "on computable numbers", turing spent the first half inventing the theory of modern computing, and proving that each machine he theorized can be described by a single finite-length natural number, called its description number. if u have no idea what that means, or even what binary machine code is... srsly, i do recommend consulting that bag of dog semen for further info.
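
if that's still fog, here's a toy sketch in python of the idea (to be clear: this is NOT turing's actual encoding scheme from the paper, just one i made up to show how a whole machine collapses into one finite number):

# toy sketch: not turing's encoding, just an illustration.
# a "machine" here is a dict mapping (state, symbol) -> (new_state, write, move)
machine = {
    ("q1", "0"): ("q1", "1", "R"),
    ("q1", "1"): ("q2", "0", "R"),
}

def description_number(m):
    # serialize the transition table into one string,
    # then read its bytes as a single big integer
    text = ";".join(f"{s},{r}->{ns},{w},{mv}"
                    for (s, r), (ns, w, mv) in sorted(m.items()))
    return int.from_bytes(text.encode(), "big")

print(description_number(machine))  # one finite natural number per machine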

now, the fact these machines can be assigned a unique finite-length number, since each binary code for a machine is unique, has the effect that they, as a set, can be bijected with, or put in a one-to-one relationship with, the set of natural numbers. this has the important implication of being able to count them, meaning we can list them out one by one, in some order, without skipping over any. yes, yes, ur mum told u that u can do anything (u can't), but mathcels get very ornery and uptight about what it means to successfully count a set of things, as not all sets are countable. for example u might think u can just count all real numbers, but actually ur too busy drinking dog sperm while cantor chuckles softly in the background, so it won't ever happen.
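
if u want to see what "counting" actually looks like, here's a quick python sketch (a toy, obviously: i'm just listing raw binary strings in shortlex order, not filtering for valid description numbers): every finite code shows up at some finite position, and nothing gets skipped.

from itertools import count, product

def all_binary_strings():
    # shortlex order: all length-1 strings, then length-2, then length-3, ...
    # every finite binary string appears at exactly one finite position
    for length in count(1):
        for bits in product("01", repeat=length):
            yield "".join(bits)

for i, code in zip(range(10), all_binary_strings()):
    print(i, code)   # 0 0, 1 1, 2 00, 3 01, 4 10, ...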

let's move on to the title line: "computable number". the concept is simple, even for you: it's just a number we can build a machine for, one that can calculate any nth digit. for example: pi is a computable number because we can build a machine that can eventually calculate any nth digit of pi. now there are an infinite number of these, obv, and because these computable numbers can be associated with the machines that compute them, and machines are countable... these computable numbers are countable as well. muster up all of whatever the fuck that is u call a brain, and don't forget it! (u will)
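
here's the idea as actual code (hedging: a python function standing in for a turing machine, and i picked sqrt(2) instead of pi because its digit formula fits in one line):

from math import isqrt

def nth_digit_sqrt2(n):
    # nth decimal digit of sqrt(2): floor(sqrt(2) * 10^n) mod 10,
    # computed with exact integer arithmetic so no float rounding nonsense
    return isqrt(2 * 10 ** (2 * n)) % 10

print([nth_digit_sqrt2(n) for n in range(8)])  # [1, 4, 1, 4, 2, 1, 3, 5]

sqrt(2) is a computable number precisely because a machine like this exists for it.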

this brings us to the famous §8, where basically everyone thinks turing went on to prove some groundbreaking, mind-bending result for the ages, but it was actually just a series of fuckups that has largely contributed to the ungodly world of idiocracy we witness today, in which we confused our ignorance for knowledge and declared math indefinitely incomplete. just like ur parents' marriage.

the section more or less goes like:

well, if machines that compute numbers are countable, then the count can be tricked! i could do this by writing a machine that counts through them and, for each nth machine, calculates the nth digit and returns the inverse of that nth digit, and i can use this to build an inverted diagonal number β. because this β would contain a digit different from every other number on the list, it can't exist in the count!
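
if the diagonal trick itself is hazy, here's a toy python sketch of it (three made-up digit-machines standing in for "the list"; nothing here is from turing's paper):

# toy diagonal: three fake "computable numbers", each given by its nth binary digit
listed = [
    lambda n: 0,        # 0.000... in binary
    lambda n: 1,        # 0.111... in binary
    lambda n: n % 2,    # 0.0101... in binary
]

# flip the nth digit of the nth number on the list
beta = [1 - listed[n](n) for n in range(len(listed))]
print(beta)  # [1, 0, 1] -> disagrees with number 0 at digit 0, number 1 at digit 1, number 2 at digit 2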

but since that can't be possible, what's wrong here is that counting them would be equivalent to solving the halting problem, and there is no known method of doing that, so obviously we can't acktually count them. so therefore the set of computable numbers stays countable, because they aren't able to be counted. furthermore, because this has the disadvantage that it may leave the reader with a feeling that "there must be something wrong", i will present the halting paradox that backs this up! fin. bow down and suck the dick of incompleteness u fools!
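
for reference, the "halting paradox" he leans on looks roughly like this (a sketch of the textbook version, not turing's exact construction: halts() below is a hypothetical oracle, which is precisely the thing that can't exist):

def halts(program, arg):
    # hypothetical oracle: pretend it returns True iff program(arg) eventually halts.
    # the whole point of the paradox is that no such machine can actually exist.
    raise NotImplementedError

def troll(program):
    # do the opposite of whatever the oracle predicts about program run on itself
    if halts(program, program):
        while True:
            pass        # loop forever
    return "halted"

# feed troll to itself: whatever halts(troll, troll) answers, troll does the opposite,
# so the oracle is wrong either way. contradiction, no oracle.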

now what a fking mess: the set of computable numbers remains "countable", because we have no method to acktually count them!? i must say, i truly am left with the feeling that "there must be something wrong". i appreciate the effort on the modern theory of computing, turing, but ya shoulda stopped there... and i don't want to get into the fact that literally all of computer science from then until now just dogpiled on top with bizarre classifications to justify this, like "countable" vs "recursively enumerable", that basically do fuck all but jerk off about how right this is!

so anyone else spot the error? no, u didn't, don't fking lie to me, u can't do it. ur a fucking idiot like everyone else. u actually drank all that dog cum, and moved on to raiding the cat litter. marsey is disappoint. :marseyunamused:

but mommy put ur dumbass through a bootcamp cause u failed out of college, twice, and now u consider urself a "codecel". so let me translate the issue into a language that resembles the utter dogshit u barf up day after day:

inverse_diagonal = (n: number) -> {
  // walk the enumeration until we reach the nth computable number,
  // then return the inverse of its nth digit
  count = 0
  for (computable_number) in (enumerate.computable_numbers()) {
    if (count < n)
      count += 1
    else
      return computable_number(n) ? 0 : 1   // flip the nth digit
  }
}

enumerate_diagonal = () -> {
  // print the digits of β one by one, forever
  for (n = 0; true; n++) print inverse_diagonal(n);
}

running enumerate_diagonal is the machine turing claims can calculate β. the problem is it can't actually do that, and it can't do that regardless of whether this would solve the halting problem or not. that halting connection is a totally irrelevant thread turing had no business going down, because there is a more fundamental reason why this doesn't work...

still can't see? ok, i'll throw u a bone, it's a bit healthier than that cat shit u've been munching on. recall that we are counting through the computable numbers, "enumerate" being a fancy word for doing this, so the iterable enumerate.computable_numbers() must iterate over all computable numbers. if turing's claim is to be true, this must also include inverse_diagonal itself. what happens then?

.... no? still don't get it?? god i really am gunna spell out everything line by line, eh? fuck:

  • (1) inverse_diagonal is the machine that computes digits of β.

  • (2) at some input n, the variable computable_number will be the machine inverse_diagonal itself: if β is to be part of the set of computable numbers, then the machine that computes it must eventually come up in the enumeration.

  • (3) at that point, inverse_diagonal will run the equivalent of inverse_diagonal(n) and get stuck in infinite recursion (see the sketch right after this list).

  • (4) therefore, you cannot prove the halting process contradictory by means of β, as β cannot give the inverse of its own nth digit, and therefore cannot be a proper inverse diagonal, regardless of any "hidden assumption" about solving the halting problem. that was totally necessary to bring up.
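
here's that failure in runnable form (a hedged sketch: the enumeration below is fake and finite, every name in it is made up, the only point is what happens once the enumeration contains the diagonal machine itself):

import sys

def nth_digit_of_half(n):
    # stand-in "computable number": binary 0.1000...
    return 1 if n == 0 else 0

def enumerate_computable_numbers():
    # pretend enumeration: a couple of ordinary machines, then inverse_diagonal itself,
    # because if β is computable its machine has to show up somewhere in the list
    yield nth_digit_of_half
    yield (lambda n: n % 2)        # binary 0.0101...
    yield inverse_diagonal         # <-- the self-reference
    # ... a real enumeration would keep going forever

def inverse_diagonal(n):
    for count, machine in enumerate(enumerate_computable_numbers()):
        if count == n:
            return 0 if machine(n) else 1   # flip the nth digit

sys.setrecursionlimit(100)             # keep the inevitable blow-up short
print(inverse_diagonal(0))             # fine: 0, flipped digit of an ordinary machine
try:
    print(inverse_diagonal(2))         # hits itself, calls itself on itself forever
except RecursionError:
    print("β choked on its own machine")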

if u can't understand it now, u had no business reading this post in the first place, and just go jump off a building or something fun like that. ur daddy will be proud that u've learned to fly! think of it as "falling with style". :marseyplanecrash:

for that one slightly less of a retard still with me:

it's absolutely absurd turing didn't see this. the dude went on to proudly use an infinite recursion issue in his paradox construction on the very next page, so why didn't he see it here? heck, he even intuited a sense of wrongness about these initial thoughts on the matter, but i guess he was so enamored with the halting problem he just didn't dig deep enough into β to recognize the wrong path he'd inadvertently gone down. i can't explain that any more than i can explain the idiots that just jumped off a building over an rdrama suggestion they voluntarily read :marseyshrug:

honestly, i accept the dude being fallible, i forgive u, turing. u gave us the theory of modern computing, and that really was a fucking stroke of genius for the ages. this misstep is almost entirely not ur fault.

but what i really can't explain is how the flying fuck this went unnoticed for so long.

what the fuck have mathcels being doing?!

:marseypeanutbrain:

How is computer science as a field full of absolute r-slurs that think they know better than everyone else? What's the deal with their massively inflated egos?

I think it's because the standard class curriculum teaches a bunch of mathematical topics that are useful to know for coding (or for engineering, which has the same problem) but it rarely teaches the fundamentals because there's not enough time and the students need more Java OOP slop classes

As a math major it took like 3-4 classes of logic and real analysis to prove rigorously all the calculus ideas that my buddies in engineering were learning in one week of their math classes. It has to be like this so they can study mechanical engineering, or circuits, or all the other applied topics.

Now, the market needs more engineers and programmers than it needs fart huffing math majors, and the engineering and programming students I tutored would have killed themselves if they had to do 2 years of useless math classes. The current system works well enough, but I think it produces a lot of engineers and coders that think they are the next Isaac Newton when in fact they have a shaky understanding of the fundamentals

right, so u too accept that we are not able to actually count countable sets?

what's the point of studying anything if u accept something that's fricking r-slurred.

:marseyconfused: what is your definition of countable here?

countable aka able-to-be-counted aka we can go: 0th computable number, 1st computable number, 2nd computable number, 3rd computable number, etc....

jeez, math bro went to college just to ask stupid questions on wtf counting is?!?!

proof academia is a waste of time

I think I'm confused by the way you write :marseythinkorino:

>academia is a waste of time

:marseyagreesuperspeed:

counting too hard for u, mathcel? :marseygigaretard:

:marseyagreestill:

might as well get it over with now, ur life will never amount to anything

:marseyropewithme:

>The current system works well enough, but I think it produces a lot of engineers and coders that think they are the next Isaac Newton when in fact they have a shaky understanding of the fundamentals

That's such a weird thing to do if they think that way, but I think it's because despite college Calculus and Differential Equations being just foundational math, it's still more math than >90% of the population learns so it may create this false impression of being way more knowledgeable than they actually are.

The past few weeks I've been learning that doing proofs is hard lmao, well it takes time and practice, so yeah I get why engineering courses jump straight to applications to make it fast paced.

I could rant about universities for a while lol.

To give you another example: a friend in physics showed me his first semester math exam while I was in my final year of math, and the physics freshmen had like super tough complex analysis problems they were expected to solve through magical formulas.

>The past few weeks I've been learning that doing proofs is hard

It's hard but fun! I enjoyed seeing the discussion of Turing's proof in this thread

>I was in my final year of math and the physics freshmen had like super tough complex analysis problems they were expected to solve through magical formulas.

I think there's no easy way to make it better; while formula memorization is kind of a braindead way to learn, it's basically the only way to have a fast-paced course, especially for physics as they have to learn so many concepts and apply them.

>It's hard but fun! I enjoyed seeing the discussion of Turing's proof in this thread

It is! Though I feel like an r-slur struggling with basic set proofs lmao, maybe I'll fare better in a year or two :marseyspecial:

Engineering also has this problem

Are American engineers really like this? Or is it !engineering students?

I never see that attitude among engineers here; they typically stay in their realm of competency or show some general knowledge in other areas, but never pretend to be experts, and it tends to be more work-related.

There's a lot of engineers that think they can break physics or math or are climate experts or doctors, or just about anything. It doesn't help that it already attracts schizos and neurodivergents.

There's 1000000% an arrogance issue with !engineering; everybody :marseyhomestar: knows the fricking best way to do things because it's their way.

Why it works, at least civil :marseysaluteconfederacy: wise, is fricking because you have 2-5 of them arguing with each other from different :marseyvenn3: agencies about the fricking best way to do it, so they have to prove they're right :marseymoreyouknow: with calculations

Students are this fricking way because they're inexperienced and don't realize they're wrong. EI/EITs realize they're fricking :marseytom: r-slurred :marseyautism: because they work under :marseyhole: more experienced PEs. PEs from their late 20s to 45 or 50 think :marseyphilosoraptor: they're hot shit because they got the fricking license. PEs after like 50 chill :marseysmokin: out generally because they have experienced being wrong :marseyxdoubt: enough :marseyitsallsotiresome:

That type of professional arrogance (I know better than your team), yes, but the guys here are referring to engineers and CS pretending to be scientists and mathematicians; that's what I never encountered

Im a scientist since i code on research projects for biology :marseysmug2:

You're a microbiologist, so yeah, you're a scientist :marseyscientist: :marseychemist:

Kate Rubins, a NASA astronaut, is also a microbiologist and she's done DNA sequencing in microgravity

https://en.wikipedia.org/wiki/Kathleen_Rubins

I'll say I'm a fricking scientist :marseyphrenology: because I have a fricking degree :marseygrad: in science :marseychimera: just to frick with my buddies/fiancé, because I enjoy being a butt. Idk, I haven't seen anything :marseycoleporter: like that in particular. Generally don't deal with scientists, but there's a fricking huge respect :marseyfingergoodjob: for soil scientists from what I've seen.

I need to click context more often before :marseyskellington: I respond lol

Ok but have you considered that you're all wrong and the best way to do things is my way?

I have to sit through 3 different :marseyvenn3: hour long meetings today :marseyclueless: alone :marseyitsdangerous: and hear something :marseysmugface: close :marseynoyouzoom: to this

Because im not smart enough to read turing and know how dna sequencing works :marseygigaretard:

There's simply way too much knowledge out there, so we gotta stick with the few areas we like or excel at.

Im reading the textbook for algorithm class and fully frick, pseudocode is so hard to understand :marseygigaretard: it takes me like an hour of rereading to actually understand algorithm design. Like i understand it conceptually but the details of the actual algorithms are hard af to understand

Don't feel bad, it took 200 IQ PhDs weeks to invent some of these.

Have you tried implementing some of these yourself? It helped me understand a few algorithms

what's the deal with everyone else thinking that they're never absolutely fricked in the head?

Unlike cstards, everyone else's heads aren't firmly lodged up their own butts, so they can just tell

i think it stems from the fact we make so much more money,

that everyone else is coping with survival too much to be anything but completely r-slurred,

comparatively

Maybe. Personally I've met several math people working in software and they were all effortlessly better than almost all their coworkers. Most smart people just don't care to make crud applications for their entire lives, even if the pay is slightly better.

I'm sure gpt5 will come around shortly and put most cs idiots out of work anyway

given that u think gpt is about to do anything more than create jobs by pooping out unmaintainable crud that'll need to be wholly rebuilt,

u definitely aren't qualified to be making this claim:

>they were all effortlessly better than almost all their coworkers

>pooping out unmaintainable crud

Sounds like a drop-in replacement for cs people

like i said, u definitely aren't qualified to be making such a claim

Yeah sure. Give me a shout out in your fields medal acceptance speech btw!
