Howdy.
I'd like a specialist, not just a wild guesser, to tell me why the same GPU (in my case a 1080 Ti), while consuming the same amount of power (~295 W), runs at very different temperatures in different games.
So:
All games run at the maximum frame rate the GPU can deliver, since that is below the panel's maximum refresh rate.
- Rise/Shadow of the Tomb Raider and AC Odyssey get the GPU up to ~75°C ±2°C
- WoW, WoW Classic and Jedi: Fallen Order get the GPU up to ~85°C ±3°C
I am not looking for excuses for the difference in temps; I want to understand how the same wattage generates more heat, what it is related to, and what those games do that causes it.