System Testing and Benchmarks


NoelC


I'll wait a few weeks and check whether users have issues or not. If not, this will replace my Samsung 830 256 GB SSD ...

 

Yep, good idea; also, if you let some time pass, you'll get more polished firmware.

 

The 512 GB one is the sweet spot. The 256 and 128 GB models have fewer dies per channel and slower write speeds, down to just 150 MB/s sequential in the 128 GB model.

 

 

 

@jaclaz, with draconian liposuction Se7en can be kept reasonably well trimmed:

 

[attached screenshot: Size_001.png]



@jaclaz, with draconian liposuction Se7en can be kept reasonably well trimmed:

 

Sure :yes:, though we have somewhat different definitions of "draconian", and of course it is not only a matter of the "sheer" OS.

 

Though rather "dated", these graphs show the general trend of what I meant:

http://www.oooninja.com/2008/05/openofficeorg-microsoft-office-moores.html

 

You can see a similar trend in each and every one of the "mainstream" apps (another "dated" example):

http://suretalent.blogspot.it/2009/03/software-bloat-acrobat-reader.html

 

jaclaz


Of course that is no comparison to XP or jaclaz's beloved Win2K, but I'm sure he will provide you with appropriate figures. :)

 

EDIT: As expected we cross-posted. :)

 

Cheers and Regards

Edited by bphlpt

@bphlpt

The trend of OS size increase has somehow stopped with Vista :ph34r: (possibly a good "size effect" of wanting to push the OS onto smaller/more portable devices).

 

Default install sizes (per MS requirements) of the "sheer" OS:

Windows 2000 http://support.microsoft.com/kb/304297/EN-US

A 2 GB hard disk that has 650 MB of free space. 

 

Windows XP http://support.microsoft.com/kb/314865/en-us 

At least 1.5 gigabytes (GB) of available space on the hard disk

 

Windows Vista http://support.microsoft.com/kb/919183/en-us

20-gigabyte (GB) hard disk that has 15 GB of free hard disk space

 

Windows 7 http://windows.microsoft.com/en-us/windows7/products/system-requirements 

16 GB available hard disk space (32-bit) or 20 GB (64-bit)

 

Windows 8 http://windows.microsoft.com/En-us/windows-8/system-requirements

16 GB (32-bit) or 20 GB (64-bit)

 

So, the OS growth trend has been "flat" for several years now :). Still, once you take the updates into account there is some increase, but less than in the rest of what is normally on a hard disk, which is more worrying; even the file formats themselves have (with the obvious exceptions) grown.

 

Sometimes the latter is not really a direct fault of the programmers/software houses; it is only an indirect one.

 

I will give you a recent "real life" example.

I was doing some consulting for a small construction company and asked them to send me via e-mail a complete copy of a set of documents (a bid) that they had submitted for a tender.

It failed, and failed badly.

So I had a CD (rectius, a DVD) sent to me, with the files on it amounting to more than 3 GB.

Besides some really huge AutoCAD drawings (which, once purged/DXFed and reimported, shrank to between 1/5th and 1/20th of the "original" size), I was puzzled by the size of all the administrative paperwork (mainly .doc files, one- or two-page "standard" forms or statements), each of them more than 10 MB in size :w00t:.

It turned out that some time ago an "IT guy" :whistle: had managed to create a template for all correspondence (headed paper) by plainly embedding a very, very large JPG of the firm logo in the header of the .dot template :w00t: (instead of the image being resized beforehand, the full-size original was embedded and then scaled down to 5% or so).

They had recently upgraded some ten hard disks (plus their backup storage on the local server) because they couldn't deal with the sheer amount of data. :ph34r:

Now, this is clearly a case of "computing illiteracy", but think about all the time wasted transferring useless bytes....
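Incidentally, with modern .docx files (which are plain ZIP archives) this kind of embedded-image bloat is easy to spot. A minimal Python sketch, assuming the .docx format (the legacy .doc files in the story are OLE containers and would need a different tool):

    import sys
    import zipfile

    def list_media_sizes(path):
        # Print every embedded media file in a .docx and its size in MB.
        with zipfile.ZipFile(path) as doc:
            for info in doc.infolist():
                if info.filename.startswith("word/media/"):
                    print(f"{info.filename}: {info.file_size / 1_000_000:.1f} MB")

    if __name__ == "__main__":
        list_media_sizes(sys.argv[1])  # e.g. python check_media.py letter.docx

A 10 MB "two-page letter" shows up immediately as a single oversized JPG under word/media/.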

 

jaclaz


I ran across a compiler some 10 years ago that had a /BLOAT switch, for use when people wouldn't take seriously the "too small" output that was in fact a very efficient executable.  "Nothing useful could be THAT small..." :no:

 

A bit of perspective...

 

Notwithstanding artificial or ignorance-driven bloat, people today don't often notice, nor correlate, the massive increase in the amount of data we deal with daily with the increased need for stability in today's systems.  The "golden age" operating systems we knew and loved didn't handle anywhere NEAR the amount of data we crunch through today.

 

If you think back on, say, I dunno, Windows 2.0 back in the 1980s...  It would take a couple of minutes to boot up a PC XT system, and it could run maybe half a day without crashing.  If a modern computer could boot that system, running instructions at the pace we do today, it would be booted in a fraction of a second and maybe run 30 seconds before crashing.  That's sobering.

 

A couple of decades ago, I used to get real engineering work done when a 500 MB hard drive was a significant thing (and a significant investment).  Within the past day I've saved still pictures that were several times that large.  All done in a few seconds, without flaw.

 

This even applies to fairly recent versions, such as our beloved XP.  A system based on XP would crunch through hundreds of megabytes of data without flaw.  Gigabytes, maybe.  Terabytes?  Maybe not.  It used to seem very stable back then, but the glasses are rose-colored.  I know; I run XP VMs now for testing.  It's a clunky old system compared to the new ones.  Not that the new ones are perfect, but their problems are different.

 

Here I sit with a Win 8.1 workstation that's been running now for what, 26 days on the same bootup.  Looking at Task Manager I see processes whose I/O counts have reached many terabytes.  And these are just the processes that are persistent!  We're not seeing all the data that's been crunched through by the transient programs I've run.  The System Idle Process has accumulated 15029:01:38 of CPU time (it counts up 24 CPU seconds for every clock second).  That's 2 CPU-YEARS at modern Xeon speed.  And, as they say, it keeps on ticking.
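For the curious, a quick back-of-the-envelope check of those figures in Python, assuming the 24 logical CPUs implied by "24 CPU seconds for every clock second":

    # 15029:01:38 of idle time, converted to decimal hours
    idle_hours = 15029 + 1/60 + 38/3600
    logical_cpus = 24
    print(idle_hours / logical_cpus / 24)   # ~26.1 -> matches the 26 days of uptime
    print(idle_hours / (24 * 365.25))       # ~1.71 -> call it 2 CPU-years, rounded up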

 

I shudder to think how much information is crunched per second in a typical video game session.  We speak of "hundreds of frames per second" like it's nothing.  And if you can't play for hours, you complain about that stupid unstable Windows system...

 

A side effect of data size increasing is that systems are getting more stable.  They have to.

 

-Noel

Edited by NoelC

 

A side effect of data size increasing is that systems are getting more stable.  They have to.

Interesting point. :yes:

Still, in my perverted mind, fewer bytes still mean less time, and a lower probability of an issue (of any kind).

 

And while I will gladly concur that the increased power allows for things that were unthinkable only a few years ago (think of real-time or near-real-time rendering, or of compiling a complex piece of software), the effect on more "normal" activities (which I believe are what is "normally" carried out by "normal" users on "normal" PCs in "normal" offices) has not seen *any* noticeable improvement. The "weak" part is of course the "human" side of the equation: I cannot type a letter or create an Excel spreadsheet faster than I can think about what I am typing, and the processors outperformed me many, many years ago :(.

 

Personally, having a 20-page-per-minute printer and/or the increase in data-transmission speed actually sped up my workflow much more than the increase in processing power did (in the good ol' days you had to turn auto-calculation off and press F9 a few times with complex spreadsheets, but all in all that was the least of the problems).

 

jaclaz


Do you code using Visual Studio?

 

Have you noticed how Intellisense analyzes your code as you type?  Pops up auto-completion prompts?  Lets you know immediately about basic syntax issues like missing parens or misspelled keywords or variable names?

 

It's almost irritating at first that it's so busy but by gosh if you embrace it, it can actually start to be useful and speed things up, making coding easier.  It grows on you.  I hardly ever finish typing a long variable name any more.

 

I think that's a pretty cool use of excess computer power.

 

-Noel


Do you code using Visual Studio?

 

Have you noticed how Intellisense analyzes your code as you type?  Pops up auto-completion prompts?  Lets you know immediately about basic syntax issues like missing parens or misspelled keywords or variable names?

 

It's almost irritating at first that it's so busy but by gosh if you embrace it, it can actually start to be useful and speed things up, making coding easier.  It grows on you.  I hardly ever finish typing a long variable name any more.

 

I think that's a pretty cool use of excess computer power.

 

-Noel

No (meaning BOTH that I do not code :no: and that I certainly do not use Visual Studio :ph34r:).

 

But you raised another very good point.

 

That feature is exactly the same kind of thing (which I turn off ;)) as the auto-correction/grammar-correction/spelling/whatever in Word (or similar word-processing software). It is very appropriate and useful when writing code[1], and for *anything* that has a definite, defined and "non-negotiable" syntax :yes:, or for a limited number of character substitutions (like, as an example, (R) to ®), but it soon becomes a nightmare when it tries to change capital initials, single quotes to double quotes, straight quotes to rounded quotes, etc., and attempts (at least in Italian, but I believe that American vs. British English is even worse) to change what you type into what it thinks is the right form/spelling.

I find the feature similar to the good ol' T9 madness (which, however, did have its use).

You can count me among the [who?]'s here ;):

http://en.wikipedia.org/wiki/Spell_checker#Criticism

 

Particularly:

http://en.wikipedia.org/wiki/Grammar_checker

http://itre.cis.upenn.edu/~myl/languagelog/archives/005061.html

 

... an infinite number of monkeys will analyze your writing and present you with useless grammar complaints while not alerting you to actual grammatical errors because computers don't understand grammar ...

 

 

Very appropriate and useful in some contexts (like writing code) and one of the causes of the progressive decaying of civilization in other contexts.

 

jaclaz

 

 

[1] As an example, I find it very useful for formulas in Excel, or even for some VBA scripts.

Edited by jaclaz

@jaclaz

 

No, I need the space for VMs and natively booted VHDs.

Sure :), but the VMs do not contain (say) DOS 6.22 or FreeDOS or even Win2K/XP; they are likely to be more instances of the same "biggish" recent OSes. ;)

 

jaclaz


A sampling of the space taken up on the host system's drives by some of my VMs, complete with their own disk images, snapshots, and whatever maintenance data.  Not sure whether this proves or implies anything, other than that systems which get more attention tend to grow.  (A sketch of one way to tally such totals follows the list.)

 

  46 GB - XP Pro (32 bit)

  21 GB - a really lean Vista Business (32 bit)

136 GB - Win 7 Ultimate (32 bit)

170 GB - Win 7 Ultimate (64 bit)

  54 GB - Win 8 Enterprise (64 bit)

136 GB - Win 8.1 Enterprise (64 bit)
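Not the method actually used here, just a minimal Python sketch of how one might total a VM folder's size (the path is hypothetical):

    import os

    def folder_size_gb(root):
        # Walk the directory tree and total all file sizes, in GB.
        total = 0
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                try:
                    total += os.path.getsize(os.path.join(dirpath, name))
                except OSError:
                    pass  # skip files that vanish or can't be read
        return total / 1_000_000_000

    vm_path = r"D:\VMs\Win7_x64"   # hypothetical; point at your own VM folder
    print(f"{folder_size_gb(vm_path):.0f} GB")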

 

-Noel


A bit outside the conversation, but since I said the bad word, let me say the good one too. PCMark 8 is a behemoth (a 2890 MB download), but it seems good. It is EIST-aware, and some of its tests stress the CPU quite a bit. You see the results online, but a detailed log is saved in your Documents folder too. I initially ran two tests, one with EIST enabled (score 2151) and one with EIST disabled (score 2149); so, as I said, PCMark is for the first time EIST-aware, and it doesn't make a difference anyway. It also has an OpenCL testing option, which hangs on my laptop (who knows why) on the 4th of the 8 subtests, but it is still an interesting option. If someone has the time to download it and the disk space to install it, then it is something I can recommend.


I made two very simplistic benchmarks in FreeBASIC (Windows console mode) that test the CPU. One is for single-core, non-multithreading processors (1 thread) and the other is for processors that run up to 4 threads simultaneously. Example results are as follows: 1+ MIPS for an Intel Pentium III 1 GHz processor (the multithreaded one gives a slightly higher score than the other), and 50+ MIPS (single-threaded) and 150+ MIPS (multithreaded, 100% CPU usage) for an Intel Core i5-3230M 2.6 GHz processor (2 cores, 4 threads, TurboBoost up to 3.2 GHz enabled). The single-threaded one is very simple, as simple as it could be; the other was a bit more difficult to make because of the multithreading (no previous experience; the FreeBASIC help file helped a lot). I attach the programs and the source code. Please don't laugh at me!
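The attached FreeBASIC source is what actually ran; purely to illustrate the idea, a minimal single-threaded loop of this kind might look like the following Python sketch (the loop body and iteration count are arbitrary, and an interpreted language mostly measures interpreter overhead rather than raw CPU speed):

    import time

    N = 10_000_000
    start = time.perf_counter()
    x = 0
    for _ in range(N):
        x += 1                        # one trivial integer operation per pass
    elapsed = time.perf_counter() - start
    print(f"~{N / elapsed / 1e6:.1f} million ops/s (x = {x})")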

 

Benchmarks.zip


A very simple test of simple integer and floating point math.  Nothing to be ashamed of.

 

I got 52 and 227 respectively.  Might be fun to see what a 24-thread version would do.  :)
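Only as a sketch (again, not the attached FreeBASIC code): the same loop fanned out with Python's multiprocessing, where the worker count is just a parameter, so a "24-thread version" is simply WORKERS = 24:

    import time
    from multiprocessing import Pool

    N = 10_000_000
    WORKERS = 24                      # the "24-thread version"

    def burn(_):
        # The same trivial integer loop, run once per worker process.
        x = 0
        for _ in range(N):
            x += 1
        return x

    if __name__ == "__main__":
        start = time.perf_counter()
        with Pool(WORKERS) as pool:
            pool.map(burn, range(WORKERS))
        elapsed = time.perf_counter() - start
        print(f"~{WORKERS * N / elapsed / 1e6:.1f} million ops/s total")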

 

-Noel

