Your average science guy, Linux nerd, and Minecraft player. Left Reddit for this place and haven’t looked back. :)

Website: lostxor.com

  • 0 Posts
  • 10 Comments
Joined 1 year ago
Cake day: March 3rd, 2024

  • This article estimates that GPT-4 took around 55 GWh of electricity to train. A human needs maybe 2000 kcal (about 2.3 kWh) a day and lives 75 years, for a lifetime energy consumption of roughly 63 MWh, or about 870x less than just training GPT-4.

    So not only do shitty “AI” models use >20x the energy of a human to “think,” but training them consumes the lifetime energy equivalent of hundreds of humans. It’s absolutely absurd how inefficient this technology is.
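    The arithmetic above can be checked in a few lines. This is just a sanity check of the comment's own assumptions (2000 kcal/day, 75-year lifespan, 55 GWh training cost); the exact kcal-to-kWh conversion shifts the ratio slightly from the rounded figures in the text.

    ```python
    # Sanity check of the back-of-envelope comparison above.
    # Assumptions (from the comment, not independently verified):
    #   - GPT-4 training cost: 55 GWh
    #   - human intake: 2000 kcal/day, lifespan 75 years
    KCAL_TO_KWH = 0.001163            # 1 kcal = 1.163 Wh

    daily_kwh = 2000 * KCAL_TO_KWH            # ≈ 2.33 kWh/day
    lifetime_mwh = daily_kwh * 365 * 75 / 1000  # ≈ 63.7 MWh over a lifetime
    training_mwh = 55_000                     # 55 GWh expressed in MWh
    ratio = training_mwh / lifetime_mwh       # ≈ 864 human lifetimes

    print(f"lifetime: {lifetime_mwh:.1f} MWh, ratio: {ratio:.0f}x")
    ```

    With the exact conversion factor the ratio comes out near 860-870x, consistent with the rough figure in the comment.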