• pelespirit@sh.itjust.works (OP) · 6 months ago

      Holy shit, does that mean there are ways to check if it’s a bot because there are triggers?

      Edit: Meaning, if it doesn’t have these weird titles and otherwise seems sort of legit.

      Edit 2:

      Product description

      Introductions:

      We apologize, but we are unable to fulfill your request for a specific product analysis and optimization

      We kindly request providing an alternate product for us to assist

      Our team specializes in optimizing product descriptions for, ensuring they meet the platform requirements and align with optimal search results

      Please provide details about your alternative product, and we will be happy to deliver an optimized description tailored to its presence

      Thank you for understanding.

  • dan1101@lemm.ee · 6 months ago

    Amazon is also losing me with all the crazy brand names for the same products. They really need to curate their products better.

  • Altima NEO@lemmy.zip · 6 months ago

I find Amazon’s AI user review summaries are always incorrect.

    Like it can only read part of the reviews, up until you need to click “read more”, so it will always miss key parts of the reviews.

  • NutWrench@lemmy.ml · 6 months ago

    Yeah, but think of all the money Amazon saved by not hiring customer service reps. Lol.

  • AutoTL;DR@lemmings.world (bot) · 6 months ago

    This is the best summary I could come up with:


    As of press time, some version of that telltale OpenAI error message appears in Amazon products ranging from lawn chairs to office furniture to Chinese religious tracts (Update: Links now go to archived copies, as the originals were taken down shortly after publication).

    Sometimes, the product names even highlight the specific reason why the apparent AI-generation request failed, noting that OpenAI can’t provide content that “requires using trademarked brand names” or “promotes a specific religious institution” or, in one case, “encourage unethical behavior.”

    The descriptions for these oddly named products are also riddled with obvious AI error messages like, “Apologies, but I am unable to provide the information you’re seeking.”

    On the contrary, in September, Amazon launched its own generative AI tool to help sellers “create more thorough and captivating product descriptions, titles, and listing details.”

    And we could only find a small handful of Amazon products slipping through with the telltale error messages in their names or descriptions as of press time.

    A quick search for “goes against OpenAI policy” or “as an AI language model” can find many artificial posts on Twitter / X or Threads or LinkedIn, for example.


    The original article contains 406 words, the summary contains 192 words. Saved 53%. I’m a bot and I’m open source!
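
    (Aside, not from the article or any commenter: a minimal Python sketch of the kind of telltale-phrase check the first commenter asks about. The phrase list and function name are hypothetical, drawn only from fragments quoted above in this thread; a real filter would need a broader, maintained list and would still miss listings where the AI output happens to read plausibly.)

        # Hypothetical sketch: flag listing text containing well-known AI refusal phrases.
        # The phrases below are illustrative, taken from fragments quoted in this thread.
        AI_REFUSAL_PHRASES = [
            "goes against openai policy",
            "as an ai language model",
            "we are unable to fulfill your request",
            "apologies, but i am unable to provide",
        ]

        def looks_ai_generated(listing_text: str) -> bool:
            """Return True if the text contains any known AI refusal phrase."""
            text = listing_text.lower()
            return any(phrase in text for phrase in AI_REFUSAL_PHRASES)

        # Example, using a description fragment quoted earlier in the thread:
        description = "We apologize, but we are unable to fulfill your request for a specific product analysis"
        print(looks_ai_generated(description))  # True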