One assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.

  • FaceDeer@kbin.social
    9 months ago

    33,000 households’ worth of electricity is not a “ridiculous amount of energy.” It’s actually quite modest. Your wild hyperbole doesn’t help your case.

    • Flying Squid@lemmy.world
      9 months ago

      You have a very strange definition of ‘modest.’ Because I would say one household’s worth of electricity is modest and 33,000 is a fuckload. Or did I miss something and we’re running houses off of AA batteries these days?

      • FaceDeer@kbin.social
        9 months ago

        OpenAI is a global service. People all over the world are using it and doing a massive amount of work with it. According to this page, there are 180.5 million users, and openai.com got 1.6 billion visits in December last year. It is extremely modest on that scale.

        You need to account for what’s being done with resources when trying to judge whether the resources are excessive.
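    For scale, the comparison in this thread can be sketched as a back-of-envelope calculation. The household consumption figure below is an assumption (a rough US residential average), not a number from the thread; only the 33,000 households and 180.5 million users come from the comments above.

    ```python
    # Rough per-user share of ChatGPT's estimated energy use.
    HOUSEHOLDS = 33_000                # article's estimate, from the thread
    KWH_PER_HOUSEHOLD_YEAR = 10_500    # ASSUMED rough US average annual usage
    USERS = 180_500_000                # user count cited in the comment

    # Total annual electricity implied by the "33,000 homes" comparison.
    total_kwh_per_year = HOUSEHOLDS * KWH_PER_HOUSEHOLD_YEAR

    # Spread across the claimed user base.
    kwh_per_user_year = total_kwh_per_year / USERS

    print(f"Total: {total_kwh_per_year:,.0f} kWh/yr")
    print(f"Per user: {kwh_per_user_year:.2f} kWh/yr")
    ```

    Under these assumptions the total works out to a few hundred million kWh per year, but only on the order of a couple of kWh per user per year, which is the scale argument the comment is making.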