"throw more CPU at the problem" is abstracting the cost, which, well, it's "throw more watts (CO2)* at the problem"

CPU power is abundant but a watt is a watt.
*: wind and solar renewables are bad at serving a constant drain because they generate intermittently, and you're also pulling renewable watts from a shared pool: whatever was otherwise tasked to them will either run short or drain the batteries you'd want overnight
using more of the renewable capacity, unless we have more than enough for even peak loads, means someone else is using more "dirty", nonrenewable supply
buying an electric car to replace yours in California is better than in Pittsburgh

but mostly because in pgh it's largely burning coal to power your car, which works out worse than a gasoline engine in CO2 per watt-hour of useful energy
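a rough sanity check on that claim; every number in this sketch is a round assumption I'm plugging in for illustration, not a sourced measurement:

```python
# Back-of-envelope CO2-per-mile comparison. Every constant here is an
# assumed round number for illustration, not a sourced measurement.

COAL_G_CO2_PER_KWH = 1000    # assumed: ~1 kg CO2 per kWh of coal generation
GRID_LOSS = 0.07             # assumed: ~7% transmission/distribution loss
CHARGE_LOSS = 0.10           # assumed: ~10% battery charging loss
EV_KWH_PER_MILE = 0.30       # assumed: typical EV consumption at the wheel

GAS_G_CO2_PER_GALLON = 8900  # assumed: ~8.9 kg CO2 per gallon of gasoline
GAS_MPG = 30                 # assumed: average gasoline car

# energy the plant must generate per mile, after losses, times coal intensity
ev_g_per_mile = EV_KWH_PER_MILE / ((1 - GRID_LOSS) * (1 - CHARGE_LOSS)) * COAL_G_CO2_PER_KWH
gas_g_per_mile = GAS_G_CO2_PER_GALLON / GAS_MPG

print(f"coal-fed EV:  ~{ev_g_per_mile:.0f} g CO2/mile")
print(f"gasoline car: ~{gas_g_per_mile:.0f} g CO2/mile")
```

with those made-up inputs the coal-fed EV comes out a bit worse per mile; swap in a cleaner grid mix and the EV side drops fast, which is why the same purchase pencils out differently in California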
like, transmission losses mean it's a lot better to use nearby renewables than to "sell" the excess to neighboring grids, so increased local renewable use is still better, but by taking those renewable watts you're still pushing other loads onto sources that burn fuel
anyway, I've been thinking about this a lot lately, ever since I saw a lot of tweeting, right after much more efficient processors were announced, that it would mean "a spike in available CPU for the same project cost".

for scale: eliminating a 20 KB JavaScript dependency in Mailchimp for WordPress led to a reduction of 59,000* kg of CO2 emissions per month.

think about what 20% more efficient infrastructure would do if every project stayed at the same level of CPU use
(*: his calc is explained in the thread; my conservative gut feeling is to knock a zero off, so call it ~5,900 kg/month)
(probably more, but it's also not getting into computational cost on either end; the thread still works without it. a rough sketch of that kind of calc is below. see: https://twitter.com/NireBryce/status/1340370704513069057?s=19)
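for intuition, here's a minimal sketch of the kind of back-of-envelope math behind a number like that. the request volume, energy-per-GB, and grid-intensity figures are my own assumed inputs, not the ones from the linked thread:

```python
# Hypothetical back-of-envelope: CO2 saved by removing a 20 KB JS dependency.
# All inputs below are assumed, illustrative figures, not the thread's data.

PAYLOAD_KB = 20                     # size of the removed dependency
REQUESTS_PER_MONTH = 2_000_000_000  # assumed: page loads fetching the file
KWH_PER_GB = 0.5                    # assumed: network energy to move 1 GB
G_CO2_PER_KWH = 450                 # assumed: average grid carbon intensity

gb_per_month = PAYLOAD_KB * REQUESTS_PER_MONTH / 1024 / 1024
kg_co2_per_month = gb_per_month * KWH_PER_GB * G_CO2_PER_KWH / 1000

print(f"~{gb_per_month:,.0f} GB/month -> ~{kg_co2_per_month:,.0f} kg CO2/month")

# and the "20% more efficient infrastructure, same workload" thought
# experiment is just this multiplied through:
print(f"20% efficiency gain saves ~{kg_co2_per_month * 0.2:,.0f} kg CO2/month")
```

with these assumed inputs it lands nearer the knocked-off-a-zero figure; push the request volume or grid intensity up and you reach the headline number. the point survives either way: bytes shipped are watts burned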