• Fuckass [none/use name]@hexbear.net
    1 year ago

    It’s incredible how LLMs started off as almost miraculous software that generated impressive answers, but now it’s just House Server of Leaves

    • GBU_28@lemm.ee
      1 year ago

      The trick is that you have to correct for the hallucinations and teach it to revert to a healthy path when it goes off course. This isn’t possible with current consumer tools.