• Pika@sh.itjust.works · 26 days ago

      While I expect this is fake, I don’t know what’s worse about it: the fact that Copilot got that wrong, or the fact that they used it in the first place instead of one of the three built-in functions that can summarize cells.
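      A rough sketch of the kind of built-in summaries meant here (the comment doesn’t name the three, so SUM, AVERAGE, and SUBTOTAL below are only an assumption):

          =SUM(A1:A10)          adds the values in A1:A10
          =AVERAGE(A1:A10)      arithmetic mean of A1:A10
          =SUBTOTAL(9, A1:A10)  sums A1:A10 while skipping rows hidden by a filter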

  • ellypony@lemmy.world · 26 days ago

    My Harkness test is: if you wouldn’t trust it to run a nuclear power plant, why on earth would you trust it to make changes on your own computer?

    “Help! There’s a mode conflict at startup! ECCS pressure is plummeting and core temps are over 500°C”

    ChatGPT: “Hmmm this is a complex issue that requires thought and consideration… Disable auto-SCRAM, insert rods, and pray to the god of your choosing”

    • anton@lemmy.blahaj.zone · edited · 24 days ago

      I barely trust myself to run a nuclear power plant until the next scheduled maintenance (I wouldn’t touch anything and would just wait for it to shut down on its own), but I happily trust myself with root permissions and slapped-together bash scripts (though not both at the same time).

    • pinball_wizard@lemmy.zip · 8 days ago

      My Harkness test is: if you wouldn’t trust it to run a nuclear power plant, why on earth would you trust it to make changes on your own computer?

      If I followed that standard, I also wouldn’t run Windows. And…I don’t. So yeah. It checks out.

  • rook@lemmy.zip · 26 days ago

    Of course he gets the job; he is probably smarter than the whole team at MS.