Perspectives, Then vs. Now


NoelC


An individual interactive computer workstation, mouse, keyboard, interactive GUI (with decent FPS!), along with telepresence and video conferencing - complete with geeks who can barely express themselves without jargon (which we fellow geeks can still easily understand).

 

In 1968!

 

http://www.youtube.com/watch?v=yJDv-zdhzMY

 

-Noel 



Cute!

 

I wonder when we'll reach the day when kids will never have seen a computer.  Not that a tablet or smart phone is actually a replacement for one.

 

-Noel


Things moved very fast, but we are now in a phase of stagnation. Compare the speed increase of computers between 1995-2005 with that between 2005-2014. Working on a 1995 PC in 2005 was unthinkable; working on a 2005 PC now, not so much.

 

Same with cars: using a car from 1920 in 1950 was almost unthinkable, but using a car from 1990 today isn't such a shock.

 

New technologies develop fast, but once they reach a certain phase, stagnation sets in - often even regression (*cough* Windows 8 *cough*). The same is true for new art forms: there is an undeniable improvement in film making from the '20s to the '70s, but between the '70s and the 2010s? Not so much. Even the first generation of CGI feature films (Terminator 2, Jurassic Park) looks better than newer CGI FX. Compare the dinosaurs of JP with the ones in King Kong.

 

The stagnation phase for computer games came incredibly fast: many classics from the '90s/early 2000s are considerably better (in gameplay and atmosphere) than the newest iterations. We are talking about only 10-20 years here until the stagnation/regression phase was complete.


While I understand what you've written, Formfiller, it's kind of an antithesis to what's being shown in this thread.  I'm afraid I can't accept all of it.

 

I personally don't find much utility at all in running a circa 2005 computer today.  Compared to the workstation I had in 2005, the workstation I'm using now has 6 times the cores (12 vs. 2) each of which is executing instructions something like 3 times as fast.  It has 6x more RAM and 5x more disk storage capacity.  The disks are SSD vs. old spinning electromechanical drives, delivering 10x more throughput, and the video card is 35x more powerful at 1/5 the price.  I could not crunch through the work I do today on my old workstation.  Today it's literally seconds where it was minutes just 10 years ago.

 

In 2005 I had cable TV internet access - a whole 5 megabits downlink and 750 kbits uplink.  Today I have fiber optics to the house and enjoy a 40 megabit uplink.  I collaborate in real time with people across the world with real time desktop sharing, high quality audio, and file transfer all at once.

 

And if we're comparing operating systems, in 2005 I was running XP x64 (I didn't move to Vista until 2006).  While that was a decent system, it was unreliable by today's standards, and still very limited by its 32 bit heritage.

 

That said, Windows itself HAS stagnated in the last few years (I agree that 8.1 doesn't do anything useful better than 7), but I think it's a specific thing - Microsoft, for whatever reasons, decided to try to become a (mediocre) hardware company rather than continue their core business as a software company powering enterprise.

 

However, this is still a fine time for high tech advancement.  Just 10 years ago what did your top of the line smart phone look like?  How good was internet access?  Could you talk to it and have it answer?  What was the battery life?

 

[image: razr.gif]

 

-Noel

 

 


Today I have fiber optics to the house and enjoy a 40 megabit uplink.  I collaborate in real time with people across the world with real time desktop sharing, high quality audio, and file transfer all at once.

No. :no:

Meaning that you "collaborate across the world in real time, etc." with a restricted number of people who, like you, have such fast fiber optic internet access AND who have the same kind of "top hardware" you have.

Some people, like (say) the CIA or NASA, ALREADY had that kind of technology 10 years ago.

The difference is that now it is available to you and to a restricted number of your friends/colleagues, etc.

 

I.e. you seem (to me) to be confusing (a bit) the evolution of technology with the availability (or affordability) of technology (to you personally, to a restricted group of people, or to a wider one).

 

jaclaz


No. :no:

 

So?

 

It's the way of the world - NOBODY can have technology that hasn't been invented yet.

 

And once it IS invented, NOT EVERYBODY can have it right away.  But some can.  More and more can as it becomes refined and prices fall.  I've not been talking about technology that's unavailable to you.  Hell, most of the rest of the world has better internet access than we Yanks do.

 

I posted up above a link to a presentation done in 1968 that employed real-time video conferencing, overlaid image and computer graphics, and featured a demo using a mouse, copy/paste, and other things we didn't start taking for granted for another 20 years.

 

My point is simply that I'm doing telecommunications and conferencing today, with real folks, using software and devices easily available to any consumer, not just to well-funded government research agencies.

 

You're typing a forum post on some kind of high tech device.  That's not something you could have done a few decades ago.  Where is the problem acknowledging that technology is marching on?

 

I kind of intended this thread as a light-hearted look at tech evolution, not some kind of confrontational debate.

 

[image: BlindinglyObvious.jpg]

 

-Noel


I kind of intended this thread as a light-hearted look at tech evolution, not some kind of confrontational debate.

Well, it was you who started "not accepting" ... :whistle:

 

You're typing a forum post on some kind of high tech device.  That's not something you could have done a few decades ago.  Where is the problem acknowledging that technology is marching on?

Sure :), technology obviously makes progress and becomes faster, often (but not always) more reliable, and sometimes (with the exception of MS UI ;)) even better looking - these are truisms. :yes:

What FormFiller expressed (and with which I somewhat agree) is that, once you set aside these improvements in speed (and, as I highlighted - without meaning to confront you :blushing: - in the availability of this technology), lately there have been fewer "breakthroughs" or "substantial" changes than in the preceding decade.

And BTW, as always, I may well be wrong about this.

jaclaz


In terms of CPU speeds, there definitely does seem to have been a slowdown in the increase in clock rates. I remember when a 66 MHz 80486 was astounding, and that wasn't so long ago. Chip makers have compensated for this relative stagnation in the productivity suggested by clock rates by (among other things) adding extra cores; first we got dual-core processors, then quad-core, and so forth.
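
 

Just to make the idea concrete - this is only a sketch of my own, with made-up names, not anything from a CPU vendor's documentation - the way software now has to chase that lost clock-rate growth is by explicitly spreading one job across however many cores the machine reports:

```cpp
// Sketch: one job split across all available cores.
// parallel_sum and its internals are illustrative names, not a library API.
#include <algorithm>
#include <cstddef>
#include <numeric>
#include <thread>
#include <vector>

double parallel_sum(const std::vector<double>& data)
{
    // How many hardware threads (cores) does this machine report?
    unsigned n = std::max(1u, std::thread::hardware_concurrency());

    std::vector<double> partial(n, 0.0);
    std::vector<std::thread> workers;
    const std::size_t chunk = data.size() / n;

    for (unsigned t = 0; t < n; ++t)
    {
        std::size_t begin = t * chunk;
        std::size_t end   = (t == n - 1) ? data.size() : begin + chunk;
        // Each worker sums its own slice into its own slot (no locking needed).
        workers.emplace_back([&data, &partial, begin, end, t] {
            partial[t] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0.0);
        });
    }
    for (auto& w : workers) w.join();

    // Combine the per-core partial sums.
    return std::accumulate(partial.begin(), partial.end(), 0.0);
}
```

Getting "free" speed now takes explicit effort like this, which I suspect is part of why the gains feel less visible to the buying public.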

 

While recent gains are real and large, the trouble is that it's not as "sexy" to say that you have a smaller die or more processors as it is to say that you have doubled the clock rate. A marketing issue, really: CPU vendors need to come up with a replacement measure that the buying public can easily grasp.

 

This is also a useful discussion (not that it's news to most who'll be reading this  ;) ).

 

--JorgeA

 


Clock speeds may not be shooting through the roof any longer, but don't forget that modern CPU cores are also doing more per clock cycle than ever.

 

Newer instructions (e.g., SSE and its successors) also do more per instruction.  Four concurrent floating point operations at once in an __m128, for example, suit pixel manipulation quite nicely.
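
 

Something along these lines (a from-memory sketch, not code from any shipping product - the function name and the "multiple of 4" shortcut are mine):

```cpp
// Scale a buffer of pixel channel values by a constant gain,
// processing four floats per SSE instruction.
// Sketch only: assumes count is a multiple of 4.
#include <xmmintrin.h>  // SSE intrinsics: __m128, _mm_mul_ps, ...

void ScalePixels(float *pixels, int count, float gain)
{
    __m128 g = _mm_set1_ps(gain);              // broadcast gain into all 4 lanes
    for (int i = 0; i < count; i += 4)
    {
        __m128 p = _mm_loadu_ps(pixels + i);   // load 4 floats
        p = _mm_mul_ps(p, g);                  // 4 multiplies in ONE instruction
        _mm_storeu_ps(pixels + i, p);          // store 4 results
    }
}
```

One instruction, four pixels' worth of math - that's throughput growth you'll never see in a MHz figure.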

 

-Noel


While I understand what you've written, Formfiller, it's kind of an antithesis to what's being shown in this thread.  I'm afraid I can't accept all of it.

 

I personally don't find much utility at all in running a circa 2005 computer today.  Compared to the workstation I had in 2005, the workstation I'm using now has 6 times the cores (12 vs. 2) each of which is executing instructions something like 3 times as fast.  It has 6x more RAM and 5x more disk storage capacity.  The disks are SSD vs. old spinning electromechanical drives, delivering 10x more throughput, and the video card is 35x more powerful at 1/5 the price.  I could not crunch through the work I do today on my old workstation.  Today it's literally seconds where it was minutes just 10 years ago.

 

In 2005 I had cable TV internet access - a whole 5 megabits downlink and 750 kbits uplink.  Today I have fiber optics to the house and enjoy a 40 megabit uplink.  I collaborate in real time with people across the world with real time desktop sharing, high quality audio, and file transfer all at once.

 

And if we're comparing operating systems, in 2005 I was running XP x64 (I didn't move to Vista until 2006).  While that was a decent system, it was unreliable by today's standards, and still very limited by its 32 bit heritage.

 

 

Yet the workflow on your newest monster machine is exactly the same as on the XP machine in 2005.

 

Desktop computing peaked with XP - since then nothing new has come to the table (the same will happen with tablets and smartphones pretty soon, don't worry). That was my point: once stuff peaks, the improvements sound nice on paper, but the actual results are negligible. The improvements you cited (cores, etc.) didn't change a thing about how you work. I am pretty sure the thermoplastic of today's cups is not the same as thirty years ago, but you don't notice it. Are you really more productive in Office 2013 now than in Office XP in 2001 on a P4?

 

Your example about internet usage: that's actually one thing that has improved the least in the last 10 years. Websites today are indistinguishable from the ones ten years ago. Sure, new firms appeared, but MySpace offered the same stuff Facebook does, just with uglier CSS (which was not a technology issue).

 

Let's not forget we had real regressions, too: Shae has a very good point about DOS. DOS in 1992 booted faster than Windows 8.1 does on the newest machine. The whole system was indeed more predictable and speedy, and you could easily back up a given program, which is almost impossible today. Sure, we also had lots of improvements in other areas thanks to Windows, but there were undeniably some regressions.

 

Let's not forget that the desired future of the industry is actually a dumb terminal connected 24/7 to the internet, where you need to be subscribed to use even the simplest of programs (even the OS). Your data won't belong to you anymore and if something happens to the connection, you're SOL (as seen recently with the Adobe Cloud).

 

Now that would be a regression of monumental proportions (and which is only possible because of faster internet connections - so the improvements in broadband availability had a price tag attached to them, a price tag that only recently became visible).


Fascinating!  Our realities clearly differ.  It's possible the old phrase "perception is reality" may apply.

 

My time scope in starting this thread was rather longer, but even when thinking of the last 10 years and how it affects everyday life I think of things like the gargantuan leaps in GPU and CPU performance, the advent of huge yet affordable SSD storage, full graphics internet access in the pocket of everyone walking around, tablets, and many other things.

 

Sure, a lot of it is evolution of things we had already conceived of, but...  How fast does evolution have to go before it's revolutionary?  Before technological whims become enabling technology.

 

It's hard to deny that digital images from a 24 megapixel camera carry more detail than those from a 6 megapixel one and are really useful to practical photographers, or that the blood spatter in video games at high resolution AND 60+ FPS is all the more real.

 

The most interesting takeaway I have so far, in a "big picture" sense, is that some folks can have a view that things are stagnating/regressing while others feel technology is blazing nicely along into the future.  Thank you for your viewpoints!

 

-Noel


 

Let's not forget that the desired future of the industry is actually a dumb terminal connected 24/7 to the internet, where you need to be subscribed to use even the simplest of programs (even the OS). Your data won't belong to you anymore and if something happens to the connection, you're SOL (as seen recently with the Adobe Cloud).

 

Now that would be a regression of monumental proportions (and which is only possible because of faster internet connections - so the improvements in broadband availability had a price tag attached to them, a price tag that only recently became visible).

 

 

That's a great point. Something that concerns me is that, while we may adopt a "from my cold dead fingers" stance, the problem isn't that our PCs might be taken away, so much as that they'll be rendered increasingly useless, to the point where we have little choice but to submit to the "dumb terminal" model.

 

One way this is going to be accomplished is via the browser and the websites that browsers are used to visit. Already, XP machines can't use any version of Internet Explorer newer than 8, and Vista can't use anything later than 9. Sure, other browsers exist, but can you run Firefox 30 on a Windows 98 machine? And how much can you get out of, say, a cable news company's website if you're on IE5?(*) With the frenzied pace of change and development in browser technology, eventually our Windows 7 and 8 systems will no longer be compatible with emerging Web standards, and we'll face the stark choice of either getting that dumb Chromebook terminal or cutting the Internet cord altogether.

 

My hope is that enough in the computing public will resist the lure of the cloud and refuse to buy both dumb terminals and cloud-based software, such that vendors will sense the need to keep offering real PCs and software that lives and runs on our machines.

 

--JorgeA

 

(*) I wanted to test this idea out on the CNN site in IE6 on a Win98 laptop, but Microsoft intercepted my action with its obnoxious "It's time to upgrade your browser" page and the d*mn thing is taking so long to load, I got tired of waiting.  :)  Will report back on this if I ever get anything.


Sure, a lot of it is evolution of things we had already conceived of, but...  How fast does evolution have to go before it's revolutionary?  Before technological whims become enabling technology.

I was around when we installed a Telex machine.

Besides the cost of the machine (which I don't remember, but it was surely astronomical), we had (in Italy) to file a request for Government Authorization. The actual cables had to go through special reinforced/tamperproof tubing, their path had to be declared and detailed on a plan, as did the physical location of the apparatus, which had to be in a dedicated room. Access to this room was reserved to the operators (who needed a special license/course). Before the machine could be connected, a Government Inspector came to verify that everything was in accordance with the licence, and we had to keep a log not only of the transmitted (and received) messages, but also of the people who actually entered the room, including the janitor.

Some years later (but not that many) I also installed a few of the very first Fax machines.

I remember buying one (very basic, thermal paper) fax for a mere ITL 5,000,000 (i.e. roughly five months' salary for a mid-career employee) in 1984 or 1985.

 

The former was a (slow, complicated) alternative to other forms of communication (mainly telegrams); the latter was a revolution - easy, (relatively) fast, and able to transmit and receive copies of actual documents.

 

Some ten years later, in 1994/1995, I had a tablet with a touchscreen :w00t: :

http://www.msfn.org/board/topic/155290-windows-8-deeper-impressions/?p=1008812

http://www.msfn.org/board/topic/155290-windows-8-deeper-impressions/?p=1008822

http://www.msfn.org/board/topic/155290-windows-8-deeper-impressions/?p=1055477

and a modem that allowed me to transmit and receive data at "high speed" (around a quarter of an hour for a 1.44 MB floppy image - roughly 13 kbit/s, consistent with a 14.4k modem), besides faxes.

 

That was another revolution for my workflow: I had a (working) spreadsheet, a Gantt program, and an (almost working) CAD program (besides a word processor), and we could cooperate remotely on the actual files.

 

At about the same time, 1993/1994, we started having mobile phones; those changed some of the workflow, since we could communicate everywhere and at any time.

 

In the years 1994-1996 (or so), CAD programs became usable on non-dedicated workstations (thanks also to Windows NT) and affordable. That again changed part of the workflow: we moved from paper to files for *everything* (or almost everything), and we had (relatively) fast laser printers and (not-so-fast) plotters.

Every night we connected via modem to synchronize remotely, but we could also communicate - still via modem - during the day (if needed), and we started having access to the Internet (and FTP sites to exchange data).

 

Soon after came more affordable internet access and e-mail, which again changed part of the workflow: we could communicate easily not only "within" the firm but also with third parties. Yet another revolution.

 

Five years later, let's say 2001/2003, I had a later version of Windows (Windows 2000), a later version of Excel (which did the SAME things as the earlier versions), a later (much better/much more usable) version of AutoCAD (which did the SAME things as the early versions, only faster), a later version of MS Project (which did the SAME things as the earlier versions, but more easily), and an almost unchanged (for all practical uses) version of Word.

Each of these apps was faster (thanks also to hardware progress) and more capable (within limits), but none of them changed the way we worked.

 

Now I am doing more or less the same things I did 10 years ago. I obviously have faster tools, and some are actually "better", but nothing has changed substantially.

 

If I really have to find something that partially changed my way of working in the last ten years, I would probably have to mention the increase in speed of printers (much, much faster), scanners, and plotters (a smaller speed increase than printers, but still faster than before).

 

jaclaz


More of what's already there can yield fundamentally better ways of working.

 

Now pretty much anyone can afford multiple high-pixel-count monitors connected to a video card, enabling simultaneous display of much more information - vs. Alt-Tabbing from one window to another.  A bit over 10 years ago, as a high-level designer in a well-funded engineering firm, I requested and got two 20-inch CRTs, and they filled the corner of my cubicle.  Later I got one big 24" Sony CRT monitor at 1920 x 1200 pixels (an awesome beast).  Where were 30-inch monitors 15 years ago?  Now having a few and/or big LCDs is nothing.

 

I sense that people may want to see "flying cars" before they feel technology is improving life any more.

 

-Noel

