sdt

Windows 10 runs hotter than Windows 7?


In my experience, Windows 7 runs around 5-10% hotter than Windows XP on the same hardware. I'm wondering how much hotter or cooler Windows 10 runs in comparison to Windows 7. Has anyone made a note of that while testing so far?

Edited by sdt

For example, how are build 10074's processor temperatures while playing a 4K video in VLC for around 5 minutes, versus Windows 7?
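A test along these lines could be scripted. The sketch below is a rough illustration, not a definitive procedure: it assumes a Linux system with the `psutil` package installed (whose `sensors_temperatures()` call is mostly Linux-only; on Windows a tool like HWMonitor would supply the readings instead), and the sample values at the bottom are hypothetical, not real measurements.

```python
import time
import statistics

def summarize(samples):
    """Return (mean, max) of a list of temperature samples in degrees C."""
    return statistics.mean(samples), max(samples)

def sample_cpu_temp(duration_s=300, interval_s=5):
    """Sample the CPU temperature every interval_s seconds for duration_s seconds.

    Assumes psutil on a platform where sensors_temperatures() is
    available (mostly Linux); run this while the video is playing.
    """
    import psutil
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        temps = psutil.sensors_temperatures()
        # Take the first reading from the first sensor group (e.g. 'coretemp').
        for entries in temps.values():
            if entries:
                samples.append(entries[0].current)
                break
        time.sleep(interval_s)
    return samples

# Hypothetical samples collected during a 5-minute playback on one OS:
samples = [55.0, 58.0, 60.0, 59.0, 57.0]
mean_c, max_c = summarize(samples)
print(f"mean {mean_c:.1f} C, max {max_c:.1f} C")
```

Running the same sampling loop under each OS with the same video and player, then comparing the summaries, is the comparison being proposed in the thread.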

Guys, this is a serious question! It's pretty easy to check for someone willing to oblige. Thanks!

NoelC, this one is for you if you can take it up, since you are the main Win10 tester here with oodles of time on your hands.

Thanks!

What software are you using to measure the temperature?

How exactly are you playing a 4k video?

I would suspect using VLC. :whistle:

 

> For example, how are build 10074's processor temperatures while playing a 4K video in VLC for around 5 minutes, versus Windows 7?

 

jaclaz

Yes, VLC is a video player... however, is it playing content that is local to the system, from a disc, or from a website?

My questions were for replication reasons. I have one system here that has a 4K display and thought I might be able to see this difference.

Well, where the source resides shouldn't make a difference: the increase in heat due to the friction of the 1s and 0s in the cables is a localized effect, only affecting the cables (IDE, SATA or CAT5/6) and not spreading to the CPU. :unsure:

 

My guess would be that CPU heat is correlated to CPU usage, i.e. the more you use the CPU, the more heat builds up. On modern systems with efficient cooling and variable-speed fans (or similar technologies) there will probably also be an almost exact correlation to CPU fan speed, probably near 0.99, like (say) the way US spending on science, space, and technology correlates with suicides by hanging, strangulation and suffocation.

:w00t:

http://tylervigen.com/view_correlation?id=1597

;)

 

jaclaz
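For reference, the correlation coefficient being joked about here can be computed in a few lines of plain Python. This is a sketch with made-up usage and fan-speed figures (not measurements from any system in this thread), just to show how a near-0.99 Pearson coefficient would be obtained:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical CPU usage (%) vs. fan speed (RPM) samples:
usage = [10, 30, 50, 70, 90]
rpm = [900, 1400, 1900, 2500, 3100]
print(round(pearson(usage, rpm), 3))
```

Of course, as the tylervigen.com link illustrates, a high coefficient alone proves correlation, not causation.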


@triredacus

You could download the full video and play it locally if continuous playback over the net (such as streaming via VLC) is an issue due to bandwidth limitations. Either way is OK.

Edited by sdt


I don't think it will work out for me to try this. Do you have any other examples of seeing this temperature difference that doesn't involve downloading movies?


 


You can play this [embedded video] and select the highest setting possible (1080p/1440p/2160p), and note the temperatures in both operating systems.


OK, thanks. I will give it a shot next week when I have access to the system again. Note that I will be using a notebook to do these tests; specifically, this one:

http://www.msfn.org/board/topic/173469-uefi-installation-on-a-4k-native-display/

You did not specify what type of computer or hardware you are using, so it is possible that I will get different results. Drivers may also play a part in seeing a difference. I do know that Windows 7 is not natively compatible with 4K resolution, which might also be a factor. Since the notebook I have is not meant to be used with Windows 7, I will also try Windows 8.1.



That laptop looks fine. You could play back a 4K video on a non-4K display as well, since the idea is to test processor heat only.

 

Yes, my hardware is different, but it won't matter, because the idea is to see the percentage increase/decrease in temperatures, not the absolute degrees.

 

Checking system performance this way is the ultimate benchmark, because it captures all system efficiencies and inefficiencies in one round figure of heat output! If the OS/drivers/programs are becoming more efficient with Win10, it will show up as a net percentage decrease in temperature versus Windows 7 while doing the same task with the same default software on the same hardware. Otherwise it will show a decrease in efficiency.

Edited by sdt
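The relative comparison being proposed (percentage change rather than absolute degrees) amounts to a one-line formula. A minimal sketch, with hypothetical temperature averages rather than real measurements:

```python
def pct_change(win7_temp_c, win10_temp_c):
    """Percentage change in temperature going from Windows 7 to Windows 10.

    Positive means Windows 10 ran hotter; negative means cooler.
    """
    return (win10_temp_c - win7_temp_c) / win7_temp_c * 100.0

# Hypothetical mean temps from the same 5-minute playback on each OS:
print(f"{pct_change(58.0, 61.0):+.1f}%")  # Windows 10 hotter in this example
print(f"{pct_change(58.0, 55.0):+.1f}%")  # Windows 10 cooler in this example
```

One caveat worth noting: ambient room temperature shifts absolute readings between runs, so comparing the rise over each OS's idle temperature may be fairer than comparing raw averages.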



Hmmm :dubbio: I am not so sure that this can be correlated to "efficiency", in the sense that a given hardware and OS combination may be more efficient by being able to do more (or more complex) operations in a given time, which would lead to more heat being produced.

 

All in all the proposed test is IMHO a nice one :thumbup but maybe linking its results to "efficiency" directly is not entirely accurate.

 

jaclaz


Efficiency is really determined in two ways:

1. This new hardware/software is much more efficient than this other one I am used to using! (this is opinion-based)

2. This new hardware/software is much more efficient than these other hardware/software combinations, measured using the scientific method and all sorts of detailed/complicated analyzers and gadgets! (this one is "fact"-based)

#1 is not feasible because I would not be able to say that the thing I have (which sdt has not used) is more efficient than the thing sdt has (which I have not used).

#2 is technically possible but I am not willing to put that much effort into it. That's something those benchmarking websites are for. ;)


In this case it's not about the OS, but about the codec and the app, and/or the plugin+browser.

The codec tells the GPU what to do, no matter if it's 2000, XP, "7" or "10".

The browser+plugin is a different story: every browser comes with an internal plugin for "HTML5 video", aka VP6/7 or H.264. Or, if you have Flash: Flash by itself is a resource-hungry felhound. Some browsers work OK with it, some don't.

But 2000 and/or XP take far fewer resources (tweaked, of course). You can make XP take only 64 MB of RAM by the time it loads everything to the desktop. XP also doesn't use the .NET Framework as a base, which relies on a garbage collector and JIT compilation to run managed code.

Edited by vinifera

