Context: Jest is a JavaScript testing library; mocking is something you do in tests so you don't hit production services. The AI understood both terms in a non-programming context.
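For anyone unfamiliar with the terms: a "mock" here is a stand-in object a test uses instead of the real service. A minimal sketch in plain JavaScript (service and function names are hypothetical; in Jest itself, `jest.fn()` does this call-recording for you):

```javascript
// Production service the test must never actually call (hypothetical).
const paymentService = {
  charge(amount) {
    throw new Error("would hit the real payment API");
  },
};

// A hand-rolled "mock": records its calls and returns a canned value.
function createMock() {
  const calls = [];
  const fn = (...args) => {
    calls.push(args);
    return "mock-receipt";
  };
  fn.calls = calls; // expose the recorded calls for assertions
  return fn;
}

// Swap the real implementation out for the duration of the test.
paymentService.charge = createMock();

const receipt = paymentService.charge(42);
// The test can now assert on behavior without touching production:
// the canned return value came back, and the call was recorded.
```

That swap-and-record step is all "mocking" means in the joke above.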

  • Ephera@lemmy.ml · 162 points · 3 months ago

    Man, it really is like an extremely dense but dedicated intern. Does not question for a moment why it’s supposed to make fun of an interval, but delivers a complete essay.

    Just make sure to never say “let’s eat Grandpa” around an AI or it’ll have half the leg chomped down before you can clarify that a comma is missing.

    • Aggravationstation@feddit.uk · 42 points · 3 months ago

      Yeah, I hadn’t thought about that, but if someone said to an AI-powered robot “Hey, can you shred my reports?” as they left work, they could easily come back in the morning to find it tearing their junior staff into strips, like “Morning, boss, almost done.”

          • Hazzard@lemm.ee · 4 points · 3 months ago

            Yeah, this is the problem with Frankensteining two systems together: giving an LLM a prompt, plus a separate module that interprets images for it, leads to this.

            The image parser goes “a crossword, with the following hints”, when what the AI needs to do the job is an actual understanding of the grid. If one singular system understood both images and text, it could hypothetically understand the task well enough to fetch the information it needed from the image. But LLMs aren’t really an approach to any true “intelligence”, so they’ll forever be unable to do that as one piece.

          • stebo@lemmy.dbzer0.com · 2 points · edited · 3 months ago

            Well, to be fair, this isn’t what ChatGPT is designed for. It can interpret images and give information/advice/whatever, but not solve crossword puzzles entirely.

              • stebo@lemmy.dbzer0.com · 4 points · edited · 3 months ago

                There’s a difference between helping to solve puzzles and actually solving them.

                You have to be more specific.

                • Ptsf@lemmy.world · 2 points · 3 months ago

                  I later did ask it to just be helpful, specifically requesting that it give me some possible words that fit the 5-letter slot for #1. It repeated “floor it” lol.