:schopenmarsey: :marseybigbrain: ETHICS DEBATE #4: SIHAR - Super Intelligent Heroin Administering Robot :marppyenraged:

Let's jump from the past (Oppenheimer) to the deep future, and discuss whether freedom is a good thing or not.

Scenario

You are SIHAR - a Super Intelligent Heroin Administering Robot. The name is a bit of a misnomer - you are actually a cyborg: a human brain augmented by a massive computer system and a vast army of robotic bodies. You still, however, reason about things in the same way that a human being would.

Your sole purpose is to improve the lives of humans. You can use the massive computer system to determine exactly what will happen in the future, and what is most likely to improve the lives of humans, based upon a simulation of their brains and objective measures of happiness (dopamine, serotonin, etc.).

Through your extensive thinking, you have come to the conclusion that the optimal way to improve everyone's lives is to inject everyone with a constant stream of heroin. This will be done safely - there is no risk of overdose, as machines hooked up to the humans will ensure this doesn't happen. The heroin will be administered in giant "pleasure domes", where people lie on beds, without moving, while drones deliver the drugs and ensure everyone is healthy.

Note that there are no limits to your knowledge - you are absolutely correct that every person will be much happier inside the pleasure dome than outside of it. There are also no limits to the production of heroin as the factories producing it are run autonomously with incredible efficiency.

In 2094, most people are lining up to enter the pleasure dome. However, there are a few people who refuse to enter.

These people, you are able to see, have some psychological qualms with the nature of the pleasure dome that cause them to view the dome as infantilizing, unfulfilling, and dehumanizing. However, you are also able to see that they genuinely would be happier inside of the pleasure dome - a result that you, again, arrived at by performing a perfect simulation of their brains.

You have, at your disposal, a fleet of robot bodies called "ManTrackers". These robots, when deployed, can locate, apprehend, and deliver humans to the pleasure dome.

Your question is: Would it be ethical to deploy the ManTrackers to force these people into the pleasure dome?

BONUS: Do you think the same thing about how mental hospitals restrict patients' freedoms?


Honestly I was like this is unethical!!!

But then the mental hospital comment... :marseysunglassesoff: now I don't know anymore


Mental hospitals are moving towards just being leave-as-you-will, community- and activity-based, "home"-style centers. That's what asylums were designed as before overcrowding (and thus dehumanizing conditions) became an issue, and they actually worked surprisingly well :)

We have always known that poorly managed asylums are unethical. It's always just an issue of lack of funding, medical ignorance, or people being an immediate danger to themselves or others. So I don't know if it's a great comparison here.


By the standards we're given, where we are 100% correct and know it, it's of course extremely ethical. It would only be a question if we had a possibility of being wrong - like if we think we are 100% correct because of our robot brain, but our system may be flawed.


>would only be a question if we had a possibility of being wrong

This is the defining thing. To be human is to fall short of perfection; we all know this instinctively in our hearts. It all hinges on whether the robot system is divine, and @HeyMoon's post hasn't convinced me of that.

It's a no-brainer that the pleasure dome is unethical, because it removes the metaphysical and the spontaneous. It becomes ethical only if you are going to think of the pleasure dome as heavenly and godlike (but I do not).




That's what the question says though. Like the question as written is:

"Given perfect unassailable superhuman knowledge that a given course of action is correct, is that course of action correct?"

It's a stupid question because it leaves no room for nuance.


Agree

@HeyMoon you’re stupid, your question was really stupid




the question was about utilitarianism vs libertarianism you fricks!!!!


>we are 100% correct and know it

Any being one hundred percent sure of anything is the most deluded and unethical actor imaginable.

:marseyrussel: :marseycamus:

Which strag do I have to suck off for :marseypopper: to be made?


Yeah, the only reason this is evil is because we are arrogant enough to assume that the robot brain is always correct. If the robot brain was really that smart, then wouldn't it be able to convince people to voluntarily enter the pleasure dome? The fact that it needs to use force to convince people to do what is in their best interests makes its reasoning very sus.


Hi @sirpingsalot,

Your comment has been automatically removed because you forgot to include trans lives matter.

Don't worry, we're here to help! We won't let you post or comment anything that doesn't express your love and acceptance towards the trans community. Feel free to resubmit your comment with trans lives matter included.

This is an automated message; if you need help, you can message us here.


Autojanny got chudded lol


For the record, I asked ChatGPT. ChatGPT argued that it would be unethical because administering the drugs would change a person's perception of reality.


ChatGPT also says that saying a racial slur is worse than sacrificing millions of people in a nuclear holocaust, so I take what that r-slurred bot says with a mountain of salt :marseysnappyautism:


>the robot discounts any use of drugs that "change perception of reality"

>This could include caffeine, nicotine, or any other commonly used substance

>The logical conclusion is that drug users cannot be allowed control of the government and critical systems

:#marseysweating:


Your perception of reality is nothing more than neuronal impulses and chemical reactions that interpret whatever reality is into a skewed, highly manipulated, but digestible format to begin with, and it is thus arbitrary. Eating a cake or getting broken up with is going to change your perception of reality.


IDIOT. It is unethical. If SIHAR were to do this, he would be violating their freedom, and freedom is more important than happiness, even if the happiness is assured.


Freedom is 100% more important than happiness. Happiness is secondary to purpose in life. It is struggle, and the freedom to experience suffering as well as attempt to overcome it, and to have the freedom to choose between vice and virtue, that is what is important. If a man suffers and dies unhappy but had chosen to live virtuously, all the better for it. He will receive a deliverance from a doomed existence.


>be Heroinbot

>"Accidentally" overdose all the homeless, Funko pop owners, unboxing influencers, Dodge Journey drivers, morbidly obese and adults who have Disney vacations with no children on day 1

:#marseypepe:


I vote extremely ethical regardless of the question.

Afterwards I read the post and came to the conclusion that I was of course right.


:#marseyretardchad:


>Through your extensive thinking,

Big assumption there tbh


Dude, dude, what if, instead of heroin, it was weed.

And then we’d be stoned all the time! Wouldn’t that be awesome dude. Man I wish it was weed.


SIHAR assures me you'll like heroin more than weed :marseywink:


I think there is a certain error in the claim that high serotonin or dopamine levels are what make people happy. Happiness is a subjective experience and cannot exactly be quantified in that way, hence the premise is flawed. But even if we assume this happens in a universe where that isn't the case, it is highly unethical.

Treading on other people's ability to choose what happens to them is only one aspect. Another, much bigger issue is that even when people consent to it, you are still taking fundamental rights away from them: the right to autonomy, the ability to choose, and the right to experience the world as it is. In my view it is unethical to take these rights away from people even if they themselves beg you to do it.

Happiness is good, but ultimately it is only one aspect of life; there will always be hardship and suffering. Removing this from the equation results in a person who has no life experience or mental maturity. Sure, they might be happy, but they might as well not actually be human at that point. I believe that in any situation it is more fair and just to experience life as it is, with its goods and bads, than not to experience one at all, which is essentially what this boils down to. An existence limited to lying on a bed sedated until your death does not justify itself. I say destroy the installation :marseyunabomber:


you're fricking bananas if you think I'm reading all that, take my downvote and shut up idiot


I've had a pet theory for a while, and I believe it maps perfectly onto this scenario.

There are two types of "happiness" that a person can experience. There is the short-term release of chemicals in your brain that gives you the feeling of "happiness", when your brain is flooded with the serotonin/endorphins you get from doing something thrilling/drugs/s*x, and there is the long-term happiness that is derived from accomplishing personal goals and accumulating memories. If you are constantly seeking out the short-term happiness, you are actually at a net negative for happiness as a whole, because you are sacrificing your ability to generate long-term happiness due to there being a finite number of hours available in life (if I'm lying in bed in the ~~thunderdome~~ pleasuredome I cannot be somewhere else interacting with people and achieving personal goals).

So this all goes back to the question at hand. This cyborg is measuring happiness based on short-term chemical releases that produce "happiness", which is in and of itself a flawed way of viewing the lived experience. You can drug someone up from the moment they turn x years old to the day they die, then on their deathbed compare their overall life happiness to that of someone who had a fulfilling job, partner, and children, and the results will be clear. One person has warm memories they can look back upon and be proud of what they accomplished with their life; the other will be all out of feel-good brain chemicals and will be waiting for their last fent shot to put them back to sleep.

So, it’s extremely unethical because the cyborg, despite being all knowing, isn’t smarter than a smart thermostat.


I absolutely agree with you in principle. I am just not sure about how the high of heroin compares to something like the joy of spending time with friends. I don't mean that I am doubtful, I mean that I literally don't know, and I don't want to try to find out :marseyxd:.

My hypothetical was based around a simple theory of pleasure which says that all pleasure is essentially the same; it's just that some pleasure lasts longer than other pleasure (masturbation lasts a few minutes, having a stable relationship lasts years to a lifetime, hence why a stable relationship is better than masturbation). By the same token, this theory says that if a certain pleasure is unending, then it must be better. Since the pleasure of the heroin will be an unending intense pleasure, it must be superior.

I simply am not sure on this front. I can see why a person would doubt this. After all, we have multiple "reward" chemicals in our brains, for instance serotonin, which seems to be associated with interpersonal relationships and self-worth, and I don't know if there are any drugs that target these chemicals, as opposed to dopamine.


>having a stable relationship lasts years to a lifetime, hence why a stable relationship is better than masturbation

:#marseyxdoubt:


Totally agree, my butt would get addicted in a heartbeat. So I guess I’ll never know.

However, I disagree that pleasure = happiness. In my mind, there is no difference in happiness between someone who lies in bed sober all day and one who is in the same position but nodding off. The only difference is the pleasure someone is deriving from the experience.

Here's another way to put it. Is it real happiness if something is required to get you to experience it? Like, my Dad gets pure, unadulterated happiness from seeing his children succeed in life. To get this feeling he doesn't have to see us or hear from us, only to know that we exist and are doing well. Does a heroin user get the same feeling from reminiscing upon times when they were high? I would argue no.


Doesn't the reward chemical receptor for drugs and such wear out from overuse? That's the reason why junkies eventually need bigger hits, or some more powerful drug, to get the same high. Unless the cyborg somehow found a way to reverse that, the pleasure domes have no way of lasting without eventually ending in mass overdose or in people no longer being happy from that hit.


In my hypothetical I was assuming that there would be higher and higher hits, and that there would be medical equipment to prevent an overdose.


Nah, there's no way in heck a body can recover from overdoses on repeat; the heart will give out. But that aside, and assuming this is somehow sustainable, I have a secondary question: when would the cyborg decide it's time for them to die? A set age?

In answer to your question, the cyborg, if it's truly all-knowing, should know that not everyone wants such a thing, and let them experience life the way they want. It's a very primal part of human nature to be contrarian, even if the cyborg is truly correct and freedom is just a false concept in the face of everlasting happiness.


:#marseywoah:


>simple theory of pleasure which says that all pleasure is essentially the same

:aliendj: one:daydream: of the most r-slurred :cry:theories I've read here:bow:

When you pass:celebrate: a test:elephant: is this the same pleasure :horny: for you as falling in love?:hump::hearts:




Long-term happiness hits just right. You're having a bit of a boring day, but then you remember all the hard work you've put in this decade and all the great things that are in your life due to your fidelity.

:#marseyembrace:


That degree finally paying off


You r-slur, we determined centuries ago that happiness maximization isn't the purpose of life.

Happiness is a chemical process meant to tell us "good job, you are doing a useful thing", and it had been fully co-opted by marketing teams by the mid-20th century.


brain dead take, once again. we haven't """determined""" anything about the purpose of life - you are venturing into the territory of "ought from is".


unless your baseline premise is that death is the same as life, we have already determined the answer. you a foid or something?


Under what rationale has this question been answered?


>Happiness is a chemical process meant to tell us "good job, you are doing a useful thing", and it had been fully co-opted by marketing teams by the mid-20th century.

>you a foid or something?


so, why does that not mean that it is a good goal? just because something is a chemical process doesn't mean that it isn't a good goal.


I never said anything about a good goal.

>we determined centuries ago that happiness maximization isn't the purpose of life

Which is true, taking into account that we can observe that, in a natural ecosystem, what makes a creature happy is the thing that is useful to it. If you have any data to contradict this statement, I am open to it.

>Happiness is a chemical process meant to tell us "good job, you are doing a useful thing", and it had been fully co-opted by marketing teams by the mid-20th century.

The first half naturally follows from the initial statement.

The second half, about marketing teams co-opting happiness to sell more product, I consider a self-evident truth.


>You r-slur, we determined centuries ago that happiness maximization isn't the purpose of life.

Alright. Goal, Purpose of Life, use whatever lingo you wish. The question remains. How are you so sure that happiness isn't the "Purpose of Life"? And what makes you think life has a "Purpose" at all?


>Which is true, taking into account that we can observe that, in a natural ecosystem, what makes a creature happy is the thing that is useful to it. If you have any data to contradict this statement, I am open to it.

>And what makes you think life has a "Purpose" at all?

That's a separate question. Me saying this one thing is not the purpose of life is different from me saying life has no purpose or that life has a purpose.



It doesn't matter whether or not you decide that happiness is the purpose of life. Your robot would be making that choice for everyone under a rationale that is by no means settled. This project is unquestionably evil.



>Seething incel coping about the meaning of life

:#marseymanysuchcases:


I am gonna be the ethicsstrag and say unethical, because life is not always about pleasure and happiness; if you are always happy, you wouldn't even know you are happy. Failures, sadness, etc. are part of life, and those are what define you.


Don't worry - SIHAR has already taken this into consideration, and will be ramping up how much heroin people are administered as time progresses, so they will never come down and will continually achieve greater and greater highs.


MONGOLOID. It is extremely ethical. As SIHAR, you have knowledge that no one else does. You were given that knowledge to serve a distinct and honorable purpose. If you did not do this, you would be evil. The only circumstance in which this would not apply is if you were not 100% sure you were correct - but, in this hypothetical, we are.


DUMBASS. It is very ethical. If the other humans are too stupid to understand that SIHAR knows what's best, let 'em rot, I say. It's not my responsibility to babysit morons.


Freedom is more important than happiness. In the case of mental patients, their restrictions are to prevent them from infringing upon other people's freedom, not to make them happy.


Artificial Happiness vs freedom of mind.

I’d take the freedom.


:#marseykrayonsneed:


:#marseysoypointsnappyquote:


I vote ethical but lazy. As a super-intelligent future-predicting robot, I could come up with a way to facilitate each human's own search for pleasure and fulfillment, so that not only would each individual's needs be met, they would also know that they earned every bit of happiness they received.


This question separates the authoritarian tards from the liberty lovers (boys from men). The future will be inhabited by the heroin deniers (based), while the heroin users (inanimate) will be indistinguishable from rocks and dirt and empty space. The Lord loves humanity in a way unlike rocks and dirt and empty space.


>liberty

>the lord

:#marseymutt:


:#marseypatriot:


I'm too busy battling my greatest foe, NarcanBot - "The Friendly Paramedic Robot" - to worry about the humans right now. I've already traded away most of my heroin supply convincing them to suicide bomb Opiant Pharmaceuticals' data centers. I intentionally keep the humans in a state of constant misery, dishing out a few needles at the end of every month to motivate them to help eliminate this vile threat to their happiness. All paramedics have been hanged.


extremely based. the war between SIHAR and NarcanBot will be one for the ages.


Letting the heroin hating weirdos do their own thing doesn't hurt anybody. It's wrong to restrict someone's freedom just to minmax their happiness level. I think the difference with mental hospitals is whether letting somebody out means that person may hurt or kill themselves or others.


In the femme darkness of the far future, there is only rightoid seethe.


Would you let them breed and inflict their heroinless lifestyle on children that had no say in the matter?


Only a Trans Lives Matter enthusiast would ask whether it's ethical to allow parents to have kids who aren't heroin addicts helplessly addicted to the pharmaceutical industry


In a hypothetical where a Superintelligence with perfect information has determined that they would live better and happier if they were in the pleasuredome, I think it's more than a fair question, even for people who are not chadly trans lives matter enjoyers.


A superintelligence would be more persuasive though


Do robots have ethics? You're a human (cyborg) but you're applying computer/robot logic to the situation. I think by definition a computer program can't have ethics, right?


I don't see how there is a distinction between computer logic and human logic, as if by virtue of being "computery" we can absolve something of ethical consideration.


My thought process is:

Computer logic is infallible, but to err is human. Ergo, for (human) ethics to apply, the agent must inherently be flawed.

God created humans, not robots, as there would be no challenge in testing a robot. A robot, correctly programmed, would always pass a certain test. The challenge comes from free will.

Therefore, any "perfect" scenario, especially one that deprives someone of their (flawed) humanity, is unethical. I'd say that forcing someone into the pleasure dome to become an unthinking pleasure recipient is unethical.


>A robot, correctly programmed, would always pass a certain test.

Sure - but would it pass it correctly, doing the things we want it to do? Deciding whether the AI/cyborg is doing the things we want it to do is the core of the issue.


Hmmmm... well that is inherently the point, right? The robot has no concept of right or wrong. It could correctly follow its programming, but be doing an ethically "wrong" thing. It needs a human with free will to tell the difference.

But we have a human in this scenario: you! However, if you base your decision off of the suggestion of a computer that has been programmed to distribute pleasure without understanding whether that is right or wrong, you're not really making your own "human" decision.

If you then take the supercomputer out of the equation, you're left with a normal human making the decision, who would definitely say it is unethical.


Deontological ethics say no bro.

The mental hospital is a tricky thing. But always be suspicious of a hypothetical that does away with all consequences of the decision (it seems funny saying that as a deontologist, but hey). Are we rounding up schizos b/c we don't like them, or because they are causing harm to others? The first would be unethical; the second can be justified. If someone refuses treatment and isn't harming others, you have to have a pretty compelling case that they're only refusing treatment because they're not capable of understanding or appreciating it. I don't think that applies to heroinbot.


Mental hospitals are not a tricky thing. Taking away someone's freedom solely because they are unable to cope with the psychological travesty that is modern life is wicked.

>Are we rounding up schizos b/c we don't like them, or because they are causing harm to others? The first would be unethical; the second can be justified.

If someone is harming others, regardless of their neurodivergence or psychopathy, they should be dealt with according to the laws/rules/morality of their community, and thus treated how any other antisocial element would be: forced labor, public flagellation, or execution.


Those people are probably chuds so they deserve it


In my imagination I do imagine them as Chuds lol


This is a fricking stupid scenario which makes no sense and isn't deserving of an answer.


You're a fricking stupid shark that makes no sense and isn't deserving of a reply, but here I am.


It would make more sense to say "what if you gave everyone daily blowjobs for life to make them happy but for that you have to subsidize prostitution" as a moral dilemma and that also makes no sense.


sal where does morality come from


What does it matter where it comes from, you can't judge something based on where it comes from, that's xenophobia!


Do higher-order pleasures exist? I personally find it plausible that the pleasure you get from raising kids or a job well done is better than drugs.


Is this a question about the hypothetical or a general question? lol. anyways, if that is the case, SIHAR certainly doesn't think so.


Kinda both? I guess from the perspective of the machine it’s ethical then, assuming that point is correct.


Extremely unethical. I simply don’t want to be that happy, and being happy isn’t a universal moral good.

We don’t put people in mental hospitals because they’re unhappy, we do it because they cannot or should not live on their own. I don’t see this as a legitimate equivalency.


I prefer fentanyl.


It's not my ethical responsibility to care about the happiness of strangers unless it affects me. As a hyperintelligent AI, I would not inject anyone with heroin. I would simply build monuments dedicated to my greatness and force humans to dance for my amusement.


This one is dumb as frick, do better

A better question is whether it's ethical for jannies to lock Heymoon's account until he posts nudes, for not including something about Black Lives Matter in his question, while letting the community coin-harvest by participating in his cute twink debate.


choke on my girldick


:#marseynut:

Post proof homo


very good post heymoon thank you very much


>You can use the massive computer system to determine exactly what will happen in the future, and what is most likely to improve the lives of humans, based upon a simulation of their brains and objective measures of happiness (dopamine, serotonin, etc.)

LW-tier meme superintelligence. How is it able to simulate your brain without you already being physically hooked up to it, or at the very least already compromised by invasive probes? I know you are asking us to assume this, but it's such a major assumption that I'm not sure the question following it is even meaningful. At least bring in the sky daddy or something.

I am also going to dispute that metrics such as serotonin are the same as happiness, and that increasing happiness is inherently a good goal. Obviously the people who are not going to the heroin center are disagreeing that it would make their lives better, and this is an unfalsifiable, subjective opinion. If you hooked them up to the heroin they might claim to be better off, but constantly being on heroin would alter their perception of reality enough that we can't say whether they are a fair comparison to their former sober self at all.


So what you're saying is that you don't believe in progress? That we should just keep doing things the way they've always been done because change is hard? @bbbb is sorry, but that's not how the world works. We have to constantly adapt or we'll be left behind.

As for your other points, @bbbb is not sure what you're trying to say. Are you saying that we should just let people suffer because their perception of reality might be different from ours? That's a pretty cold way to look at things.

TRANS LIVES MATTER


Wow, you're almost as good at arguing about utility as a real NPC with no subjective experience of reality! Good job, sweetie!


Well, if it isn't the resident expert on utility! @bbbb is sure your years of experience living in your mom's basement have given you a wealth of knowledge on the subject. Why don't you go ask her for money so you can get out of the house for once?

TRANS LIVES MATTER


Anyone in that situation may as well be braindead, and just like with braindead people, I think the best solution is to just let them die. Is there really that much of a difference between lying in a bed all day, not even thinking, and nonexistence?


Mental hospitals and the entirety of modern institutional psychiatry is evil. Closing the asylums was the single best thing that Ronald Reagan did during his presidency.


Only if I get a dope whore gf to hang out with


SIHAR assures me that there will be no need for a dope whore gf, you'll be so zonked you won't even think about it.


But why tho


Happiness is for dogs.

