hexaglycogen [they/them, he/him]

  • 7 Posts
  • 13 Comments
Joined 2 years ago
Cake day: January 12, 2024


  • cancer doesn’t fail on any moral level. It simply is.

    a misfolded protein is the most efficient structure a protein can be in.

    there are billions of dollars being churned around in running for congress, and they’re getting a salary of like, 200k a year. good luck running for congress without representing some very deep pockets.

    I don’t judge cancer. It’s just efficient at growth, very efficient, and in a way that is unhelpful.

    I don’t think the system is evil, I think it needs treatment.


  • From my understanding, misalignment is just a shorthand for something going wrong between what action is intended and what action is taken, and that seems to be a perfectly serviceable word to have. I don’t think “poorly trained” captures stuff like goal mis-specification (i.e., asking it to clean my house and it washes my laptop and folds my dishes) and feels a bit too broad. Misalignment has to do specifically with when the AI seems to be “trying” to do something that it’s just not supposed to be doing, not just with it doing something badly.

    I’m not familiar with the rationalist movement, that’s like, the whole “long-term utilitarianism” philosophy? I feel that misalignment is a neutral enough term and don’t really think it makes sense to try to avoid using it, but I’m not super involved in the AI sphere.