
Anyone Else Noticed The Newer Windows Versions are SLOWER?


NoelC


 

Installed the Windows 10 Manager beta on build 10051, lol

 

[Screenshot: Windows 10 Manager running on build 10051]

 

 

The word "software" is not a word that can be made plural by adding an 's' at the end.  Personally I would be a bit wary of downloading anything from someone who uses the word "softwares".

 

By the way, for reference, Windows normally puts the "edition" in that space...

 

[Screenshot: winver dialog showing the Windows edition]

 

-Noel



The word "software" is not a word that can be made plural by adding an 's' at the end.  Personally I would be a bit wary of downloading anything from someone who uses the word "softwares".

 

Example (if needed):

All your software are belong to us! 

;)

 

jaclaz


  • 2 weeks later...

So the manufacturer's solution is one that increases the write load on the drive, effectively wearing it out more quickly.  Presumably so you'll need to buy another Samsung SSD.

Point being: how much more quickly? I mean, it is likely that the thingy will wear out earlier, but how much earlier?

Like one year, one month, one week, one hour or a handful of minutes earlier?

http://ssdendurancetest.com/

http://www.anandtech.com/show/7173/samsung-ssd-840-evo-review-120gb-250gb-500gb-750gb-1tb-models-tested/3

 

I mean is it "smart"?

No. :no:

 

Is it actually likely to noticeably affect the end user?

Seemingly no as well, if a 256 GB drive can actually stand almost 15 years of "normal" use at 50 GB/day (almost twice the usually cited average of 10-30 GB/day, which very few people actually sustain on a 24/7 basis anyway) and almost 8 years at 100 GB/day.
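For reference, the back-of-envelope arithmetic behind those figures, as a minimal Python sketch (the ~275 TB write budget is what the numbers above imply, not a manufacturer's specification):

```python
# Back-of-envelope SSD endurance estimate; all numbers come from the
# post above, not from any datasheet.
def years_to_wearout(endurance_tb, writes_gb_per_day):
    """Years until the drive's total write budget is exhausted."""
    return endurance_tb * 1000.0 / writes_gb_per_day / 365.0

ENDURANCE_TB = 275  # rough write budget implied by the figures above

for load in (50, 100):
    print(f"{load} GB/day -> ~{years_to_wearout(ENDURANCE_TB, load):.1f} years")
# 50 GB/day -> ~15.1 years; 100 GB/day -> ~7.5 years
```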

 

I think I will sleep well :yes: notwithstanding this piece of news. 

 

jaclaz


You're right - most folks will probably trash their current computer hardware long before actually wearing out their SSDs (though frankly they still shouldn't have to manage software that rewrites their data regularly).

 

Anecdotal data:

 

I just bought a couple of used ones to augment my existing array.  They're not being made any more, but I wanted to get the identical model, which I believe was made from 2011 to 2013.

 

I checked the lifetime writes attribute in the SMART data.  The more used one had had some 17 TB written to it.  Using fairly conservative numbers, the lifetime write load these particular drives should support before wearing out is about 500 terabytes.

 

Looking at the numbers since, my load on these drives over 11 days of use has been about 50 GB/day - and most of that is actually the data I originally copied to them; daily write loads since have been quite a bit less.  Projecting forward, even at 50 GB/day, that places the projected wearout date at roughly 27 years from now.
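The projection is simple arithmetic; here it is as a minimal Python sketch using the numbers above (the 500 TB budget is my conservative assumption, not a datasheet figure):

```python
# Wearout projection from SMART lifetime writes; numbers from the post.
from datetime import date, timedelta

WRITE_BUDGET_TB    = 500  # conservative endurance assumption from above
ALREADY_WRITTEN_TB = 17   # lifetime writes reported by SMART
DAILY_WRITES_GB    = 50   # observed average over the first 11 days

remaining_gb = (WRITE_BUDGET_TB - ALREADY_WRITTEN_TB) * 1000
days_left = remaining_gb / DAILY_WRITES_GB
print(f"~{days_left / 365:.0f} years of headroom "
      f"(wearout around {date.today() + timedelta(days=days_left)})")
```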

 

-Noel


In fact, if what PcPer says is true, this new band-aid sounds as good as could reasonably be expected, given the nature of the problem: unreliable 19nm TLC needs a periodic refresh to keep read speeds up, which means sacrificing extra P/E cycles for the refreshes goes with the territory.

 

But at this point I don't trust Samsung at all. I'll only believe it if and when I see it work as advertised after some months of use.


I wonder whether the software workaround seeks to minimize the shuffling of internal structure data - ideally writing the same data back to the same flash blocks.  If not, there's some (admittedly small) additional risk that something could go wrong.  In my experience, only a small number of people in the world care enough about details to create software that's actually perfect - and with a device controller that has to handle literally terabytes of data in real time, manage the allocation of flash blocks, move data around for garbage collection, make wear-leveling decisions, etc., perfection isn't really an option.  It's a necessity.

 

-Noel


Malventano, the author of the PcPer article, who has tested the yet-to-be-released new 840 EVO firmware and Magician 4.6, has posted this in the whistleblower thread:
 

"Not only did they apparently lick the reading stale data issue (the firmware alone immediately restores read speeds), but the 'Advanced Optimization' feature triggers a sequential rewrite of all data in the background, which actually puts the 840 EVO above most other SSDs that don't have that type of feature available.

 

Rewriting all files sequentially improves the performance of *any* SSD, and now they have this feature baked into Magician, so in trying to fix one stubborn problem they have inadvertently introduced a feature that puts the 840 EVO above other drives in terms of read performance of old randomly written files."

 

http://www.overclock.net/t/1507897/samsung-840-evo-read-speed-drops-on-old-written-data-in-the-drive/2480#post_23793174
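For the curious, the effect he describes can be approximated from user space: reading a file and writing it back out makes the controller place the data in freshly programmed flash cells. A minimal, hypothetical Python sketch of the idea (refresh_file is an invented name; a real tool would also have to preserve metadata and handle power loss far more carefully):

```python
import os, shutil, tempfile

def refresh_file(path):
    """Rewrite a file so the SSD stores it in freshly programmed flash
    blocks (simplified demo: timestamps and permissions are not kept)."""
    dir_name = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=dir_name)
    try:
        with os.fdopen(fd, "wb") as tmp, open(path, "rb") as src:
            shutil.copyfileobj(src, tmp)
            tmp.flush()
            os.fsync(tmp.fileno())   # make sure the data is on the device
        os.replace(tmp_path, path)   # atomic rename on the same volume
    except BaseException:
        os.unlink(tmp_path)
        raise
```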

 

We'll see.


Wow, does that sound like someone desperate to turn lemons into lemonade, or what?

 

Rewriting all files sequentially improves the performance of *any* SSD

 

How is it that I have never heard before that old data takes longer to read on any SSD?

 

For what it's worth, I just tried copying a big 1+ GB file that has been on my SSD array since 2012, when I initially set the array up, and it copied at a speed that implies the above is not correct.  Though the timestamp difference shows 1 second, the delay seemed to be a bit less than that, which is just about what I should be seeing.

 

[Screenshot: timing of the big-file copy]
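If anyone wants to repeat the test, here is a rough Python timing sketch of the read side (the path is an example; make sure the file is not already in the OS file cache, or you'll be measuring RAM instead of the SSD):

```python
# Rough sequential-read timing of one large, old file - the read half
# of the copy test above. PATH is a placeholder.
import time

PATH = "big_old_file.bin"   # substitute a file written long ago
CHUNK = 8 * 1024 * 1024     # 8 MB reads

start, total = time.perf_counter(), 0
with open(PATH, "rb") as f:
    while chunk := f.read(CHUNK):
        total += len(chunk)
elapsed = time.perf_counter() - start
print(f"{total / 1e6:.0f} MB in {elapsed:.2f} s "
      f"-> {total / 1e6 / elapsed:.0f} MB/s")
```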

 

I really hate it when marketing people make up "facts" to suit their own motives.

 

-Noel


Well, let's say that you'd better have the PC connected to the mains through a UPS (Uninterruptible Power Supply), as this "background rewriting of all data" sounds a lot like "gambling with blackouts" (or, in the case of laptops, "gambling with the accuracy of battery meters").

 

Let me rewrite all data on the same device, and while I am at it, let me call it a "feature".

 

What could go wrong with this plan? :unsure:

 

jaclaz


Let me rewrite all data on the same device, and while I am at it, let me call it a "feature".

 

What could go wrong with this plan? :unsure:

Well, that is what defrag does, isn't it?

Not that I agree that the procedure should be necessary or helpful for a device that is inherently more random-access than a spinning hard disk, but I'm just playing devil's advocate.

Cheers and Regards


Well, that is what defrag does, isn't it?

Sure, and you rarely initiate a defrag - possibly of a huge disk - when you are at 10% battery on a laptop, or when there is a lightning storm in the area, or when you are just about to board a flight and NEED to disconnect/switch off, etc.

This is one of the things (among others) I fear most in recent OSes and devices:

automatic activities - neither user-initiated nor notified - that are potentially dangerous if interrupted.

These include untimely Windows updates (including forced reboots) and the automatic background defragmenting in Windows 7.

Seemingly this "background rewriting" happens well below the OS level (let alone the user level), so, good as the safety measures the good Samsung guys provide might be, I personally would NOT trust them to be 100% effective in the cases mentioned.

jaclaz


This is a well-known conjecture, Wirth's law:

Software is getting slower more rapidly than hardware becomes faster.

 

I've noticed this pattern time and time again. Opening a Microsoft Word document in Office 2003 on my old 2006 Pentium D takes no more than 3 seconds on a cold start. At school, we have Office 2013 running on 3.4 GHz Core i7 computers, and I see that splash screen with the flowing dots for at least 6 seconds before the document opens. As hardware gets more powerful, programmers can get away with higher-level languages and layers upon layers of object-oriented abstraction. This is pretty much what the .NET Framework is: a slow, bloated, object-oriented abstraction layer that lets programmers write code faster and more sloppily, and ties them to Microsoft platforms. And since Vista, the Windows OS has come to depend on the .NET Framework.
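A toy illustration of that abstraction cost (not a rigorous benchmark; the class names are invented for the demo): the same trivial operation, called directly and then through a stack of do-nothing wrapper objects, where each layer adds measurable overhead.

```python
import timeit

class Adder:
    def add(self, a, b):
        return a + b

class LoggingAdder:              # wrapper layer 1 (invented for the demo)
    def __init__(self, inner):
        self.inner = inner
    def add(self, a, b):
        return self.inner.add(a, b)

class ValidatingAdder:           # wrapper layer 2 (invented for the demo)
    def __init__(self, inner):
        self.inner = inner
    def add(self, a, b):
        return self.inner.add(a, b)

layered = ValidatingAdder(LoggingAdder(Adder()))

print("direct :", timeit.timeit(lambda: 2 + 3, number=1_000_000))
print("layered:", timeit.timeit(lambda: layered.add(2, 3), number=1_000_000))
```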

 

An interesting experiment:

http://hallicino.hubpages.com/hub/_86_Mac_Plus_Vs_07_AMD_DualCore_You_Wont_Believe_Who_Wins

