At some point Nvidia would be better off mining with these GPUs than selling them.
The oddest solution I can think of: gamers lease graphics cards at a low, low price and Nvidia mines on them in the off-hours. With a big enough operation they could force all the other miners into becoming resellers. It's free, distributed electricity, and it gets the gamers off their backs.
(While I have your attention: I'm pretty sure proof of work is the paperclip maximizer that sci-fi warned us about.)
> The oddest solution I can think of is gamers lease graphics cards at a low low price and nvidia mines on them in the off-hours.
This would be a complete non-starter in markets with high electricity costs or hot climates. Running an RTX 3090 at 100% load through the summer where I live would cost about $140/month and would heat my room to about 65°C.
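For what it's worth, the arithmetic roughly checks out; a back-of-envelope sketch, where the wattage figures are assumptions (an RTX 3090 alone draws around 350 W, and a whole system under load draws more):

```python
# Back-of-envelope monthly electricity cost for continuous GPU load.
# Wattages and the $/kWh rate below are illustrative assumptions.

def monthly_cost(watts: float, price_per_kwh: float, hours: float = 24 * 30) -> float:
    """Cost (in the currency of price_per_kwh) of running a load 24/7 for a month."""
    kwh = watts / 1000 * hours
    return kwh * price_per_kwh

# Card alone at $0.35/kWh: roughly $88/month.
print(round(monthly_cost(350, 0.35), 2))
# Card plus the rest of the system (assumed ~550 W total): close to the $140 figure.
print(round(monthly_cost(550, 0.35), 2))
```

So the $140/month figure is plausible once the rest of the system and local rates are factored in.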
> I'm from Germany and our power is around 35 cents per khw.
I don't know if this is a typo or just a slightly different way of writing it in German, but now I'm trying to work out exactly what a "kilohour watt" would be. Since multiplication commutes, I think it's just a kilowatt hour with the factors shuffled? Definitely wouldn't want to be paying those prices!
I can't find anywhere that genuinely uses that unit, although it does appear in a True-or-False quiz on page 370 of this 1979 9th-grade Electrics curriculum: https://files.eric.ed.gov/fulltext/ED182438.pdf
Time to file it in my list of "technically correct but extremely annoying units" along with "5Mm" (5000km) and - mainly in the context of batteries - "10Ah" (10000mAh).
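A quick sanity check that each of these "annoying" spellings really does equal the familiar one (plain arithmetic, no unit library):

```python
# Each annoying unit spelling, converted back to the familiar one.

# "5Mm" in km: 1 Mm = 1000 km
print(5 * 1000)            # 5000 km

# "10Ah" in mAh: 1 Ah = 1000 mAh
print(10 * 1000)           # 10000 mAh

# 1 "kilohour watt": 1000 h * 1 W = 1000 Wh = 1 kWh,
# so the factors commute and it is exactly 1 kWh.
print(1000 * 1 / 1000)     # 1.0 kWh
```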