There is a relatively hard upper bound on streaming video, though. It can't grow past everyone watching video 24/7. Use of genAI doesn't have a clear upper bound and could increase the environmental impact of anything it is used for (which, eventually, may be basically everything). So it could easily grow to orders of magnitude more than streaming, especially if it eventually starts being used to generate movies or shows on demand (and god knows what else).
Perhaps you are right in principle, but I think advocating for degrowth is entirely hopeless. 99% of people will simply not choose to decrease their energy usage if it lowers their quality of life even a bit (including things you might consider luxuries, not necessities). We also tend to have wars, and any idea of degrowth goes out of the window the moment there is a foreign military threat with an ideology that is not limited by such ways of thinking.
The only realistic way forward is trying to make energy generation greener (renewables, nuclear, better efficiency), not fighting to decrease human consumption.
This being said, I think that the alternatives are wishful thinking. Better efficiency is often counterproductive, as reducing the energy cost of something by, say, half, can lead to its use being more than doubled. It only helps to increase the efficiency of things for which there is no latent demand, basically.
And renewables and nuclear are certainly nicer than coal, but every energy source can lead to massive problems if it is overexploited. For instance, unfettered production of fusion energy would eventually create enough waste heat to cause climate change directly. Overexploitation of renewables such as solar would also cause climate change by redirecting the energy that heats the planet. These may seem like ridiculous concerns, but you have to look at the pattern here. There is no upper bound whatsoever to the energy we would consume if it was free. If energy is cheap enough, we will overexploit, and ludicrous things will happen as a result.
Again, I actually agree with you that advocating for degrowth is hopeless. But I don't think alternative ways forward such as what you propose will actually work.
If humanity's energy consumption is so high that there is an actual threat of causing climate change purely with waste heat, I think our technological development would be so advanced that we will be essentially immortal post-humans and most of the solar system will be colonized. By that time any climate change on Earth would no longer be a threat to humanity, simply because we will not have all our eggs in one basket.
But why do you think that? Energy use is a matter of availability, not purely of technological advancement. For sure, technological advancement can unlock better ways to produce it, but if people in the 50s somehow had an infinite source of free energy at their disposal, we would have boiled off the oceans before we got the Internet.
So the question is, at which point would the aggregate production of enough energy to cause climate change through waste heat be economically feasible? I see no reason to think this would come after becoming "immortal post-humans." The current climate change crisis is just one example of a scale-induced threat that is happening prior to post-humanity. What makes it so special or unique? I suspect there are many others down the line; it's just very difficult to understand the ramifications of scaling technology before they unfold.
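As a rough illustration of the timing question, here is a back-of-the-envelope sketch. Every input is my own ballpark assumption (roughly 19 TW of current primary power, ~2.3% annual growth, Earth's surface area, and ~1 W/m² of waste heat as the point where it rivals today's ~2-3 W/m² of greenhouse forcing), not a figure from this thread:

```python
import math

# Back-of-the-envelope sketch: how long until waste heat alone rivals
# greenhouse forcing, if energy use keeps growing? All inputs below are
# rough assumptions on my part.

CURRENT_POWER_W = 19e12      # ~19 TW of primary energy use today (assumption)
GROWTH_RATE = 0.023          # ~2.3% per year, roughly the historical trend (assumption)
EARTH_SURFACE_M2 = 5.1e14    # Earth's surface area
THRESHOLD_W_PER_M2 = 1.0     # waste-heat flux comparable to today's ~2-3 W/m^2
                             # of greenhouse forcing (assumption)

threshold_power_w = THRESHOLD_W_PER_M2 * EARTH_SURFACE_M2            # ~510 TW
years = math.log(threshold_power_w / CURRENT_POWER_W) / math.log(1 + GROWTH_RATE)

print(f"~{years:.0f} years of {GROWTH_RATE:.1%} growth to reach ~{threshold_power_w/1e12:.0f} TW")
# Roughly a century and a half -- well within the horizons people debate
# for "post-human" scenarios, which is the point of the question above.
```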
And that's the crux of the issue, isn't it? It's extremely difficult to predict what will happen once you deploy a technology at scale. There are countless examples of unintended consequences. If we keep going forward at maximal speed every time we make something new, we'll keep running headfirst into these unintended consequences. That's basically a gambling addiction. Mostly it's going to be fine, but...
I don't feel like putting together a study, but just look up the energy/CO2/environmental cost to stream one hour of video. You will see it is an order of magnitude higher than other uses like AI.
The European average is 56 grams of CO2 emissions per hour of video streaming. For comparison: driving 100 meters causes 22 grams of CO2.
80 percent of the electricity consumption on the Internet is caused by streaming services.
Telekom needs the equivalent of 91 watts for a gigabyte of data transmission.
An hour of video streaming in 4K quality needs more than three times as much energy as an HD stream, according to the Borderstep Institute. On a 65-inch TV, it causes 610 grams of CO2 per hour.
"According to the Carbon Trust, the home TV, speakers, and Wi-Fi router together account for 90 percent of CO2 emissions from video streaming. A fraction of one percent is attributed to the streaming providers' data servers, and ten percent to data transmission within the networks."
It's the devices themselves that contribute the most to CO2 emissions. The streaming servers themselves are nothing like the problem the AI data centres are.
AI energy claims are misrepresented by excluding the training steps. If it weren't using that much more energy, then they wouldn't need to build so many new data centers or use so much more water, and our power bills wouldn't increase to subsidize it.
From your last link, the majority of that energy usage is coming from the viewing device, and not the actual streaming. So you could switch away from streaming to local-media only and see less than a 10% decrease in CO2 per hour.
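A rough sketch of what that split implies, applying the Carbon Trust percentages quoted above to the 56 g/hour European average (combining the two figures is my own illustration; they come from different sources):

```python
# Rough illustration: apply the Carbon Trust split quoted upthread
# (~90% end devices, ~10% network, <1% streaming servers) to the
# 56 g CO2/hour European average. Mixing the two sources is my assumption.

TOTAL_G_PER_HOUR = 56      # European average, quoted upthread
DEVICE_SHARE = 0.90        # TV, speakers, Wi-Fi router
NETWORK_SHARE = 0.10       # data transmission; servers are a fraction of a percent

device_g = TOTAL_G_PER_HOUR * DEVICE_SHARE    # ~50 g/hour, still emitted with local playback
avoided_g = TOTAL_G_PER_HOUR * NETWORK_SHARE  # ~6 g/hour avoided by dropping the network

print(f"local playback: ~{device_g:.0f} g/hour, avoided by going local: ~{avoided_g:.0f} g/hour (~10%)")
```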
> Telekom needs the equivalent of 91 watts for a gigabyte of data transmission.
It's probably watts per gigabyte-per-unit-time, or joules/watt-hours per gigabyte; otherwise the units don't make sense. And 91 W per Gb/s (or even GB/s) is a joke. 91 Wh for a gigabyte (let alone a gigabit) of data is ridiculous.
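A quick sanity check on the 91 Wh/GB reading, against the per-hour figure quoted upthread (the HD bitrate and grid carbon intensity below are my own rough assumptions):

```python
# Sanity check on reading the Telekom figure as 91 Wh per gigabyte.
# The HD bitrate and grid intensity are my assumptions, not from the article.

WH_PER_GB = 91            # the claimed figure, read as energy per gigabyte
GB_PER_HOUR_HD = 3        # rough data volume for an hour of HD streaming (assumption)
G_CO2_PER_KWH = 300       # rough European grid intensity (assumption)

wh_per_hour = WH_PER_GB * GB_PER_HOUR_HD             # 273 Wh per streamed hour
g_co2_per_hour = wh_per_hour / 1000 * G_CO2_PER_KWH  # ~82 g CO2 per streamed hour

print(f"{wh_per_hour} Wh/hour -> ~{g_co2_per_hour:.0f} g CO2/hour for transmission alone")
# That already exceeds the 56 g/hour total quoted upthread, even before
# counting the TV and router -- which is why this reading looks implausible.
```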
Also don't trust anything Telekom says, they're cunts that double dip on both peering and subscriber traffic and charge out of the ass for both (10x on the ISP side compared to competitors), coming up with bullshit excuses like 'oh, streaming services are sooo expensive for us' (of course they are if you refuse to let CDNs plop edge cache nodes into your infra in a settlement-free agreement like everyone else does). They're commonly understood to be the reason why Internet access in Germany is so shitty and expensive compared to neighbouring countries.
And then compare that to the alternative. When I was a kid you had to drive to Blockbuster to rent the movie. If it's a 2 hour movie and the store is 1 mile away, that's 704g CO2 vs 112g to stream. People complaining about internet energy consumption never consider what it replaces.
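For what it's worth, the arithmetic checks out against the per-100 m and per-hour figures quoted upthread, assuming a 2-mile round trip at roughly 1.6 km per mile:

```python
# Checking the Blockbuster comparison against the figures quoted upthread:
# 22 g CO2 per 100 m of driving and 56 g CO2 per hour of streaming.
# The 2-mile round trip (1 mile each way) is the scenario from the comment.

G_CO2_PER_100M_DRIVE = 22
G_CO2_PER_HOUR_STREAM = 56

round_trip_m = 2 * 1_600                              # ~1.6 km per mile
drive_g = round_trip_m / 100 * G_CO2_PER_100M_DRIVE   # 704 g CO2
stream_g = 2 * G_CO2_PER_HOUR_STREAM                  # 112 g CO2 for a 2-hour movie

print(f"drive to the rental store: {drive_g:.0f} g CO2 vs stream: {stream_g} g CO2")
```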
I see GP is talking more about Netflix and the like, but user-generated video is horrendously expensive too. I'm pretty sure that, at least before the gen AI boom, ffmpeg was by far the biggest consumer of Google's total computational capacity, like 10-20%.
The ecology argument just seems self-defeating for tech nerds. We aren't exactly planting trees out here.
If you tried the same attitude with Netflix or Instagram or TikTok or sites like that, you’d get more opposition.
The exception is when it's done from more of an underdog position: hating on YouTube for how they treat their content creators, on the other hand, is quite trendy again.
I think the response would be something about the value of enjoying art and "supporting the film industry" when streaming, versus what that person sees as a totally worthless, if not degrading, activity. I'm more pro-AI than anti-AI, but I currently keep my opinions to myself IRL. The economics of the situation have really tainted being interested in the technology.
I'm not sure about that: The Expanse got killed because its ratings weren't good enough, Altered Carbon got killed because its ratings weren't good enough, and even then the last seasons before the axe are typically rushed and pushed out the door. Some of the incentives seem quite disgusting to me compared with letting the creatives tell a story and produce art, even if sometimes the earnings are less than some greedy, arbitrary metric.
YouTube and Instagram were useful and fun to start with (say, the first 10 years), and in a limited capacity they still are. LLMs went from fun to attempting to take people's jobs and screwing up personal compute costs in like 12 months.
It’s not ‘trendy’ to hate on AI. Copious disdain for AI and machine learning has existed for 10 years. Everyone knows that people in AI are scum bags. Just remember that.
The point is the resource consumption to what end.
And that end is frankly replacing humans. It's gonna be tragic (or is it… given how terrible humans are to each other, and let's not even get into how monstrous we are to non-human animals) as the world enters a collective sense of worthlessness once AI makes us realize that we really serve no purpose.
Sources are very well cited if you want to follow them through. I linked this and not the original source because it's likely the source where the root comment got this argument from.
"Separately, LLMs have been an unbelievable life improvement for me. I’ve found that most people who haven’t actually played around with them much don’t know how powerful they’ve become or how useful they can be in your everyday life. They’re the first piece of new technology in a long time that I’ve become insistent that absolutely everyone try."
It's the same one as crypto proof of work: it was super small and then hit 1%, while predominantly using energy sources that couldn't even power other use cases due to the losses in transporting the energy to population centers (and the occasional restarted coal plant), while every other industry was exempt from the ire despite collectively using the other 99%.
The difference with crypto is that it is completely unnecessary energy use. Even if you are super pro-crypto, there are much more efficient ways to do it than proof of work.