
The obesity of web pages


jaclaz


An extremely interesting article/talk, here:
http://idlewords.com/talks/website_obesity.htm
Too many good points to quote them all :thumbup, but among them I like the proposal for the Taft Test:

This project led me to propose the Taft Test:
Does your page design improve when you replace every image with William Howard Taft?

If so, then, maybe all those images aren’t adding a lot to your article. At the very least, leave Taft there! You just admitted it looks better.

 

[slide image: ob.027.jpg]

 

jaclaz



I blame this on the fact that as time went on, there were fewer people who learned to build websites from books. Books, you may recall, were the go-to learning tool for nearly all of computing up until around 2003 or so. That year also seems to be when a larger percentage of users (in the United States, or users of English-language websites) were moving to broadband. I remember my first actual software development job in 2004, where it was not actually required to install the 2-CD documentation add-on for Visual Studio, because Microsoft had everything on TechNet or MSDN and it was kept updated.

Web development from those book times taught two important things:

1. Do it yourself, only use what you need

2. Keep sizes and load times down because most people did not have a fast internet connection... see "56K Warning."

Those getting into making websites around 2000 would often ask: why is my page so slow? It was usually because the page was built with a "generator" such as FrontPage, or with an online builder like the one Homestead had. These WYSIWYG programs would often add code that was not needed (but that "you might use someday"), fail to remove commented-out markup, or (worse) include some sort of buggy or non-standard code/plugin that would confuse browsers.
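To give an idea, here is the kind of markup those generators emitted (reconstructed from memory, not copied from actual FrontPage output), next to the hand-written equivalent:

    <!-- generator-style output: vendor stamp, redundant nesting, inline presentation -->
    <meta name="GENERATOR" content="Microsoft FrontPage 4.0">
    <p align="center"><font face="Arial" size="2"><b>Welcome!</b></font></p>

    <!-- the hand-written equivalent -->
    <p class="welcome">Welcome!</p>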

So a bad thing happened. People still wanted to make webpages but did not want to do all the work from the ground up. They slowly moved away from site generators and into full-on web application packages. The first of these were CMSes and CMS add-ons for forums: a huge pile of stuff you would install and then just fill in whatever text and pictures you wanted. Then came the age of adding something to your site that provided a function another site provided. This had existed for a while with page counters, polls and webrings: you just copied some HTML, stuck it on your page somewhere, and boom! instant functionality.

This now meant that in order to load up your page, you also had to load content from x other websites and HOPE they weren't having a problem. In the early days, the thought was that you could take their code and host it locally, but since that code often changed or got updated, people didn't want to have to keep up with everything they were adding. Now site builders stick in these includes and forget about them, until one day something screws up.

It seems like nowadays the people designing a website are either not technically sound OR are not willing to tell the decision makers that their idea is not good and shouldn't be used. I've worked on a web team where the others would just accomplish any task given to them. Websites in the business world became less about functionality and more about what looks cool and about design experimentation. The PayPal example from the link above is not where I thought the author would go with it. Yes, there are usability problems with PayPal in the user control panel. However, there is no reason in the world why the main index of PayPal needs to stream a video as its page background!

One last thing we can blame is the web browsers we use. They used to be known for their rigidity in adhering to web standards, and over time they have been changed to ignore them and render the websites anyway. Browsers now will ignore mis-nested HTML tags, where in the past using those would break the page; this is mostly caused by includes from other websites. All our browsers by default allow XSS. People are still using iframes. SVG is now available to everyone, but only as images. XSL/T still doesn't work in anything besides IE6.
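As an illustration of that tag-soup tolerance (my own minimal example, not from the article): modern parsers silently repair mis-nested formatting tags that would once have broken the layout.

    <!-- mis-nested: </b> arrives while <i> is still open inside it -->
    <p>This is <b>bold and <i>bold-italic</b> just italic?</i></p>

    <!-- what today's parsers effectively reconstruct -->
    <p>This is <b>bold and <i>bold-italic</i></b><i> just italic?</i></p>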

wow i typed a lot.


Well duh. I know what you mean. The problem is that people need places to upload things, and out of fear they want to have fancy graphics and stuff like that. The ones I hate the most are artist webpages. In college we get taught to use Dreamweaver, Adobe or even Quark products, and a lot of what those products generate is JavaScript/AppleScript, not plain HTML at all.

When it comes to designing a website, most people (like myself) could use a web editor for the HTML, and then use bits of JavaScript/PHP or something else for the basics. A good example is the "float" function (which makes the words pop as you glide your mouse over them).
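(For what it's worth, the effect described there is usually a plain CSS rollover these days, no JavaScript or PHP needed; a minimal sketch:)

    /* make link text "pop" when the mouse glides over it */
    a:hover {
        font-weight: bold;
        color: #cc0000;
    }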

You want to help stop the problem? You would have to develop web browsers for game systems like the Dreamcast/PS2/GameCube/original Xbox, because the idea of web browsing is that a page must display the same everywhere, and now people want it on portable devices too.

You would have to connect with the PPC movement and somehow get PPC back into people's faces. You would need to re-invent the Apple phone box that Steve Jobs originally made.

Why all the crap? Because people making webpages feel they need more crap to get more viewers.

It is not just computer people; people nowadays are simply using things like Facebook to connect.

......................................................

jaclaz, I did some reading into that article. Look, okay: for graphic designers or any artist group, a lot of people want everybody to see their work at its best, or to have as clear a picture as possible. A little five-year-old girl said it best about the videogame Shovel Knight: "Great game, bad graphics."

The same could be said about Ico, or even Shadow of the Colossus and Okami: great games, but the graphics could be better, in terms of consumerism for videogames or for the graphics themselves. Many western companies are trying to destroy the 2D bliss of Marvel vs. Capcom, from the days of the SNES/Genesis era.

Because westerners are taught in art appreciation that the more photogenic a work is, the better. Because people have no imagination and want everything to be photogenic (including non-art buffs). We have this generation of on-the-fly, on-demand thinking people, who think about and want everything on the fly. Like those elderly people's commercials nowadays (or the commercials from the "Where's the beef?" days): "I want my money now." Again, that goes back to the poverty-stricken way of thinking.

Everybody feels they are entitled to this on-demand practice for everything, even if "on demand" is used differently.

......................

The same could be said of art and animation. People nowadays might complain that art from the 1990s looked like moving paintings, or like paint in general. So now they want everything to look like photographs of people. It is stupid, because now people have to work harder than they need to.

Go to the Smith Micro website, the "My..." page (whatever the page name is), and view it in the archives. You will see a slight evolution and changes in what you could accomplish with their program. Honestly, the earlier works look more photoshopped while the later works look unnatural... etc. etc. I am not too sure.

It defeats the purpose entirely. It is like how the box art of PSX videogames either had 3D characters similar to those in the game or used comic art, but the flat fact remains that both were recognizable. They want people to come in and do artwork that looks like Donatello, Michelangelo, Leonardo, Raphael, Picasso, and bull-crap like that. Not even the legendary Rareware had to go through so much art development.

.........................................................................................................................

That little girl I mentioned is going to grow up thinking art has to look like the Renaissance period, or like something from the Louvre in France or MoMA, without any appreciation for other nations at all. What she thinks is small is big to us. That way of thinking.........

A lot more work being done for something that is small.


The weight of today's web sites appears to be one of the main reasons why the average user would need to upgrade their computer, even if they don't work with graphics or play games. It is easy to get a browser to consume a gig of memory just by opening a couple of typical portals with a complex layout. Pages don't actually contain that many images anymore, since graphic buttons and rounded corners fell out of fashion. Typically I see a few images of mediocre quality, pulled in by a chain of JavaScript resources, onto an ugly, "chickens*** minimalistic", flat, single-color background. Sometimes the page elements get repeatedly resized and reflowed in response to some script, which again is quite slow.

I like to see good-quality photos without compression artifacts. Not all websites need them; a list of search results probably only needs one small picture at most. But what actually happens is that the same poor-quality pictures of yesterday get pulled in from linked resources at high cost (latency, bandwidth, CPU/RAM). They're not always advertisements. I've been quite annoyed by services like Gravatar. I've seen some forums (might have been this very one) contact Gravatar asking for an avatar for each and every member whose post was visible. The vast majority of users did not in fact have a picture hosted there, but my browser kept establishing SSL connections and asking anyway.
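(For context, a forum typically asks Gravatar for one image per visible poster with a request along these lines; the exact parameters are my assumption, but d=404 is what makes the server answer "no such avatar", so the browser still connects and asks even for members who never uploaded one:)

    https://www.gravatar.com/avatar/<md5 of the lowercased e-mail>?s=80&d=404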

It is difficult to imagine how people can surf without adblock to avoid "fat asset" nonsense...

Opera versions 11 and 12 have been criticised for decreased stability compared to earlier builds. I'm sure the vast majority of the blame lies with web bloat and with overcomplicated layouts and scripts, which didn't exist earlier.

Websites following "obsolete" design standards are rare nowadays, but it is quite a relief to browse one, such as a good old vBulletin forum, or Project Cerbera. That site flies, and its code is easy to read and to save for offline use. I love the attention to detail, such as how the author has used <abbr> to define terms instead of a custom popup frame.
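(For anyone unfamiliar with it, <abbr> works roughly like this; the browser shows the title text as a native tooltip, no script or popup frame required:)

    <p>The page is mostly <abbr title="Cascading Style Sheets">CSS</abbr>
    and <abbr title="Scalable Vector Graphics">SVG</abbr> these days.</p>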


Nobody has mentioned **** Google infiltrating most of the web. A lot of sites/domains - this one included - depend on certain "services" provided by them, be that text formatting, fonts, in-site search or whatever. Too bad, because I banned Google from all my computers years ago, following a blatant "F you" slapped in my face upon a search. Now try to block everything Google-related and see what this or that page looks like, whether search still works, or how fast it loads.

 

 

Truckloads of JavaScript that nobody knows what it does, some of it loaded from third-party domains - that's another reason for slow loading.

Bad browsers keeping things cached even after closing pages/tabs.

Advertising being blocked only after being downloaded.

 

Most of the time I'm on 15kB/s. Links time out/expire much sooner than I can download whatever that may be.

 

The web has become a mockery.


Yep. Though not mentioned explicitly as Google but only generically as "advertisers", that phenomenon is described as well:

 

The user's experience of your site is dominated by hostile elements out of your control.

This is a screenshot from an NPR article discussing the rising use of ad blockers. The page is 12 megabytes in size in a stock web browser.

The same article with basic ad blocking turned on is one megabyte. It’s no model of parsimony, but still, what a difference a plugin makes.

If you look at what the unblocked version pulls in, it’s not just videos and banner ads, but file after file of javascript. Every beacon, tracker and sharing button has its own collection of scripts that it needs to fetch from a third-party server. Each request comes packed with cookies.

More cookies are the last thing your overweight website needs.

These scripts get served from God knows where and are the perfect vector for malware.

Advertisers will tell you it has to be this way, but in dealing with advertisers you must remember they are professional liars.

 

 

jaclaz


Epic article! I've enjoyed reading it a lot, thanks for sharing, jaclaz.

 

... Chickenshit Minimalism. It's the prevailing design aesthetic of today's web ...

 

... Some kind of brain parasite infected designers back when the iPad came out, and they haven’t recovered. Everything now has to look like a touchscreen ...

 

... Not every interface should be designed for someone surfing the web from their toilet ...

 

... Sites that used to show useful data now look like cartoons. Interface elements are big and chunky. Any hint of complexity has been pushed deep into some sub-hamburger. Sites target novice users on touchscreens at everyone else's expense ...

... faint text separated by enormous swathes of nothingness. I shouldn't need sled dogs and pemmican to navigate your visual design ...

... This is no way to live. We're not animals! ...


Let's not bypass the swear filter.

Searching on Google is another bad sign. Nearly every time I search for something, the site decides that I must actually have meant to search for something else. It is very frustrating, and there doesn't seem to be a way to stop it from randomly including "similar" words in place of what you actually typed, short of resorting to quotes and (the limited) boolean support every time.



That is precisely the reason I parted with Google for good, back in 2012.

Somebody tell me how much resemblance "LCPRI_ constants" bears to "hotel Capri Constanţa": screenshot

 

But that's an entirely different matter. The problem is, Google now owns and controls a very large part of the web through its font dispatcher, its formatting scripts, its so-called search and whatever else. The screenshot above belongs to my WordPress blog; I don't think there is one WP theme that doesn't at least call for Google fonts. That's merely the first example that popped into my mind. Talk about privacy? Google knows everything we type here at MSFN, on our WordPress blogs and pretty much everywhere else on the web.

AOL mail - the bastards that keep sending MSFN notifications to spam - use Google search on their page. Why should Google know what I'm searching for within my private e-mails?!

 

There may be much, much more to say on this subject, but it's late here and I'm tired of playing Don Quijote's part anyway. I'll just list the Firefox add-ons I'm using that relate to restoring sanity: QuickJava, NoScript, Ghostery, RequestPolicy, YARIP (Yet Another Remove It Permanently), Redirect Cleaner, Greasemonkey with a few scripts, Simple Site Blocker, Google Disconnect, Facebook Disconnect, Twitter Disconnect. These are the versions that run with Firefox 13 on XP, because that's what I'll stick to. Most of them work (with modifications or not) in Firefox 9 on 98SE.


I don't think there is one WP theme that doesn't at least call for Google fonts.

Well... you could just write your own... Surely such things aren't required.

Wordpress itself is a mess: one of the first cookie-cutter builders where I noticed totally unneeded linking to multiple stylesheets AS WELL AS styles inline in the HTML.


 


 

Blogs hosted on the public wp.com cannot load private themes - only blogs hosted on private domains, with full access to the WP software, can load such themes, plug-ins and whatnot.

 

But that was just an example. Try to limit your bandwidth to something ridiculously slow and then test a few known sites: the browser should continuously report what's loading (usually at bottom left). Analyzing the page code might not show the full extent, due to JavaScript or other obfuscated elements.

 

From my point of view, using content delivery networks and distributed payloads is just a way to invite unwanted third parties to breach a connection. If, say, one googleapis server were to be infected with malware, dozens, hundreds or many more innocent domains would get red-flagged in an instant. An attack on a single self-hosted site/domain, on the other hand, would contain the infection to that domain only.

 

And I'm not even thinking the way the real villains think. So, why should one, whatever the page/site, retrieve (unwanted/risky/etc.) content from dozens of other domains? Why does everything have to be centralized in an overtly communist way? Why does everything have to be done in the worst way possible instead of the best way possible? Is it because we, as human beings, are defective by design…?

 

EDIT:

And related to jaclaz's link above: why do all those images on Troy Hunt's site have to be hosted on googleusercontent.com? My RequestPolicy add-on rightfully blocks them, as they're not served from the same domain as the page.


Related, though with a different point of view, centered on abuse rather than size:

http://www.troyhunt.com/2016/01/its-2016-already-how-are-websites-still.html

 

jaclaz

OK, this explains why I couldn't view the Forbes link you posted before. I ran into it on another forum today too. Forbes does a redirect, which lands me on an empty page. I'm not even concerned about it: if a website does not behave properly, I won't reconfigure everything just to make it work. It isn't worth the effort. I won't use websites that do this, or the other offender mentioned there: websites that won't let you scroll. I will make exceptions only in required cases, and in those instances only by using a specific browser just for those sites. The best example is IE, which I only use on MS websites.


I have also noticed this trend in the last 5 to 10 years, and I believe there's a combination of reasons:

  • Google - This is the obvious one. Google Analytics is on almost every webpage, and it causes many slowdowns. (Look at your browser's status bar when a page is loading; 90% of the time you will see Google Analytics there, and it usually takes the longest to load.)
  • Facebook, Twitter and other social buttons - This is another common one. On every site where you see a "Like" or "Tweet" button, external Facebook or Twitter code is being loaded, which I also presume is being used for advertising-related tracking.
  • The rise of the mobile web - As cell phones get more popular, more and more websites "optimize" their pages for mobile, often to the detriment of desktop users. This usually involves an unhealthy amount of CSS and JavaScript just to allow the same page to be rendered on a 4" screen and a 27" screen. (This also caused a major downfall in UI design.)
  • Front-end frameworks (Bootstrap) - Tons of extra CSS and JavaScript that slow down webpage rendering and add more things to be downloaded.
  • Scrolljacking, custom scrollbars, sticky headers, and webpages that have huge background images and very little text - These are just visual gimmicks that serve no practical purpose other than to let the page developer say "Look at what I can do!" (Sticky headers also get really annoying on mobile and on 1366x768 laptop screens.)
  • News pages that use a million lines of JavaScript just to add features like "endless scrolling" and slideshows, when all I want to do is read the news story that I clicked on.
  • Lazy developers in general - I can imagine that in the last few years, as Internet speeds have increased for most of the world and computers have gotten faster, developers have gotten lazier and left more stuff in their webpages, or didn't optimize their JavaScript enough. I also imagine that they took high-resolution images and downscaled them in CSS to half their original size, instead of resizing the image beforehand to reduce the file size (see the sketch after this list). Of course, these developers forget about the people still on dial-up, on metered connections or on slower computers.
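(A hedged illustration of that image anti-pattern; the file names are made up:)

    <!-- anti-pattern: ship the full-size photo and let CSS shrink it;
         the user still downloads every byte of the big file -->
    <img src="header-3000px.jpg" style="width: 50%">

    <!-- better: resize once beforehand and ship the small file -->
    <img src="header-800px.jpg" width="800" height="450" alt="header photo">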

As I previously stated, all of these changes are purely for aesthetics or for surveillance purposes, and serve no practical use. However, I did find a solution to some of these issues. Over the last few years, I have noticed that more and more sites are becoming incompatible with Firefox 3.6 on Windows 98SE, and the sites that still work have gotten much slower. I added all Google Analytics, Facebook and Twitter domains to my hosts file, and I noticed that everything is faster, with some sites loading in half the time. This has prevented numerous rogue scripts from actually crashing the browser.
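(The hosts-file trick looks roughly like the sketch below; the exact domains are my guess at the kind of entries involved, not the actual list. On 98SE the file lives at C:\WINDOWS\HOSTS.)

    # null-route common tracker/widget domains
    127.0.0.1  www.google-analytics.com
    127.0.0.1  ssl.google-analytics.com
    127.0.0.1  connect.facebook.net
    127.0.0.1  platform.twitter.com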

 

I also imagine that Firefox extensions like NoScript and RequestPolicy prevent much of this cross-site loading from happening as well.

 

Edit: I just did a test with a cacheless load of the MSFN board homepage. (Windows 98 SE, Firefox 3.6, 256MB of RAM)

 

Regular someonewhocares.org hosts file: 37 seconds

someonewhocares.org hosts file with Google Analytics, Facebook and Twitter URLs added: 31 seconds.

 

That's a full 6 seconds shaved off the load time, just because of Facebook, Twitter, and Google Analytics.


If the website does not behave properly, I won't reconfigure everything just to make it work.

 

 

Yep :thumbup

Now I was thinking whether we can do something about it. I mean, what about sending a (polite) e-mail to the site admin/support/contact explaining how you:

  1. attempted to view the site with a "good enough" browser and setup
  2. due to the stupidity with which the site has been written/set up, landed on a blank page (or on the home page and not the one you wanted to get to, or got an offending popup/popunder/etc.)
  3. that they lost at the very least one visit, or more generally a reader/contact/prospective buyer/etc., i.e. how they managed to lose the very whoever/whatever they were trying to get
  4. that you have learned your lesson and thus will try your best to avoid ever visiting the site again

Would someone ever read these letters?  :unsure:

Would the amount of similar letters ever be enough to make them at least consider the idea of simplifying their crap?   :dubbio:

 

@ rn10950

JFYI:

http://www.msfn.org/board/topic/174175-not-a-site-issue-but-maybe-worth-some-thought/

 

jaclaz 

