Tunnel overhead [On killing IPv6 transition mechanisms]

Ted Mittelstaedt tedm at ipinc.net
Thu Mar 18 19:48:11 CET 2010



nick hatch wrote:
> On Wed, Mar 17, 2010 at 1:27 PM, Ted Mittelstaedt <tedm at ipinc.net> wrote:
>> OK, I see, however from a users perspective I actually like this, because it discourages content providers from sticking in a bunch of
>> extra advertisements and such that crap up their site.
>>
>> I still maintain, though, that for 90% of the machines out there, the browser rendering engine is a far larger contributor to "unsnappy"
>> websites.  Based on our customer feedback most people have older systems or newer systems that are crapped-up with adware.
> 
> Yes, the rendering engine can be a bottleneck in performance. However,
> this bottleneck is influenced by how the website is authored: Download
> latency, for example, can affect the DOM, which affects the rendering
> engine, which affects speed. The Webkit blog has a great post with
> more details [1]:
> 
> "Introducing just 50ms of additional latency doubled the page loading
> time in the high bandwidth case (from ~3200ms to ~6300ms)."
> 
> I strongly disagree that junky PCs are the limiting factor for 90% of
> people, but don't have empirical data to back it up. In any case, what
> you're describing is the lower-bound for website performance.
> 

Most of the PCs people buy out there (for personal use) are cheap, and
the manufacturers keep them cheap by skimping on disk space and RAM.

> Website designers can't change several of those factors, and
> "advertisements and such that crap up their site" isn't the biggest
> problem.

Hello?  You just lost a lot of your credibility there.  How have many
of the recent viruses spread?  Not by infecting the primary sites -
by infecting the advertising providers' webservers.  Many times I've
had to deal with advertisements that throw up popups covering the
primary site, or that cause the browser to crash or slow to a crawl.

It sounds to me like you don't do a lot of web surfing if you're going to
minimize the role that advertisements play in degrading the web 
experience of a site.

> Designers are focused on the upper-bound of performance,
> which improves things for everyone.
> 

SOME are.  The vast majority are not - they are focused on producing
slick-looking sites to justify the money they are charging their clients 
- most of whom don't understand what good web design is, and just want
something that "looks cool".

I agree that the designers on staff at the large content providers - 
employees, that is - are probably more focused on the upper bound of
performance.  But they are not the majority of web designers, and the
sites they have created are not the majority of sites on the Internet.

> Yahoo has a great introduction to the topic [2]. Note they don't say
> "reduce complexity" or "don't use ads", or "code for crap computers!"
> but rather, "reduce the number of http requests".
> 

Since the adverts are served from different servers, each one you put
on a site creates more HTTP requests from the client, so it's pretty
obvious what they are recommending.  And more and more adverts now
have multiple elements in them.

When a site carries adverts it's basically handing off a lot of screen
real estate to some other web designer, who is probably NOT going to be
following good design practices.
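
As a rough illustration of how fast the request count grows (just a
sketch against a made-up URL, nothing from this thread), a few lines of
Python can count the distinct hosts a page pulls scripts and images
from - every ad network host beyond the first means extra DNS lookups
and TCP connections on top of the extra HTTP requests:

    # Sketch: count which hosts a page pulls resources from (hypothetical URL).
    from collections import Counter
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen

    class ResourceHosts(HTMLParser):
        def __init__(self, base):
            super().__init__()
            self.base = base
            self.hosts = Counter()

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            url = attrs.get("src") or (attrs.get("href") if tag == "link" else None)
            if url:
                host = urlparse(urljoin(self.base, url)).netloc
                if host:
                    self.hosts[host] += 1

    page = "http://www.example.com/"   # hypothetical page to inspect
    parser = ResourceHosts(page)
    parser.feed(urlopen(page).read().decode("utf-8", "replace"))
    for host, count in parser.hosts.most_common():
        print(count, host)             # third-party ad hosts show up here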

> >> The google toolbar alone adds an extra 2-3 seconds for most systems,
> I've never seen performance issues like this with the Google toolbar,

It takes RAM, and on a machine that's RAM-shy, already swapping, and
running a winmodem, then yes, it does worsen the performance.

It's only been in the last 2 years that systems were regularly
sold with more than a gig of RAM.  Walk into a computer store sometime
and chat them up.  RAM and disk upgrades are the most popular sellers -
and those go to at least the clueful people.  But many are not
clueful and just live with it.  I still regularly see people running
XP & IE8 on 512 MB RAM systems.  I'm typing this message on a 1 GB
RAM system, as a matter of fact.

> ever. As stated earlier, Google hates the slow web, and has data to
> back up why. [3] Does it really make sense to you that Google would
> release a tool that slows down the web that much?
> 

Google is making the assumption I talked about earlier, which is that
everyone has a reasonably fast machine with lots of RAM.

> >>> There's a cute tool which allows one to test it:
>>>
>>> http://www.alphaworks.ibm.com/tech/pagedetailer
>>>
>>> "A graphical tool that enables Web content providers to rapidly and
>>> accurately measure client side performance of Web pages."
>>>
>> Cute too, I wonder how many content providers actually read the
>> advice in the help to put their sites on a diet.
> 
> These tools aren't cute, they're very useful, and common:
> 

I meant cute as in pretty, not as in toy-like.  I thought it
was a neat piece of software.

> http://code.google.com/speed/page-speed/
> http://developer.yahoo.com/yslow/
> 
> Good web developers don't talk about "putting sites on diets", they
> talk about progressive enhancement, caching, compression, CSS sprites,
> Javascript include order, etc. This topic is well studied from just
> about every angle, and is  empirical for obvious reasons.
> 

Most of those things are just palliatives to make a too-fat site,
loaded with worthless eye candy that conveys no information, run
acceptably.  And they only work when the system displaying the site is
running modern software, has a fast CPU, and has a decent amount of
RAM, since they push more of the processing onto the browser.
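
To be fair, it only takes a few lines to see whether a given site even
applies the basics Nick lists.  Here's a sketch against a hypothetical
URL that just asks the server what compression and caching headers it
sends back:

    # Sketch: check a server's compression and caching headers (hypothetical URL).
    from urllib.request import Request, urlopen

    url = "http://www.example.com/"    # hypothetical site to check
    req = Request(url, headers={"Accept-Encoding": "gzip"})
    resp = urlopen(req)
    print("Content-Encoding:", resp.headers.get("Content-Encoding", "(none)"))
    print("Cache-Control:   ", resp.headers.get("Cache-Control", "(none)"))
    print("Expires:         ", resp.headers.get("Expires", "(none)"))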

Meanwhile, technologies like Scalable Vector Graphics, which would be
really useful, are put on the back burner and keep getting delayed.
That ought to tell you plenty about what's important to the content
community.

The state of current web design is to keep making sites look slicker
and slicker, with more and more eye-catching candy that adds nothing
to the usable information being transmitted.  It's a consequence of
letting all the commercial entities on the web compete with each
other, and of putting online a lot of inexperienced users who don't
have much surfing under their belts.

And yet when you go to the forums where experienced web surfers gather
and read them, they are full of posts on how to obtain different bits
of software to BLOCK the eye candy.

If that does not adequately illustrate the disconnect between most web 
designers and most experienced web surfers, I don't know what does.

> Getting back to IPv6: Poorly performing transition mechanisms can
> affect user experience in serious ways.
> 

If the content providers are that concerned about this, they need to be
pressuring the ISPs they buy service from for native IPv6, simple as
that.  Those ISPs are the same ISPs supplying connectivity to those
end users.  Rather than taking the retrograde tack of arguing that
because an IPv6 transition mechanism is slow it should be scrapped
in favor of going back to IPv4, they should be applauding and
encouraging the transition mechanisms, and if users complain, point
them to the real culprits - the ISPs they are buying service from who
are NOT running native IPv6.
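
Anyone who wants to see the tunnel overhead for themselves can time it.
Here's a rough sketch (hypothetical dual-stacked hostname, crude
wall-clock timing) that compares TCP connect times over IPv4 and IPv6
to the same host:

    # Sketch: compare TCP connect times over IPv4 and IPv6 (hypothetical host).
    import socket
    import time

    host, port = "www.example.com", 80   # hypothetical dual-stacked host

    def connect_ms(family):
        # Resolve an address for the requested family, then time the connect.
        addr = socket.getaddrinfo(host, port, family, socket.SOCK_STREAM)[0][4]
        s = socket.socket(family, socket.SOCK_STREAM)
        start = time.time()
        s.connect(addr)
        elapsed = (time.time() - start) * 1000.0
        s.close()
        return elapsed

    print("IPv4 connect: %.1f ms" % connect_ms(socket.AF_INET))
    print("IPv6 connect: %.1f ms" % connect_ms(socket.AF_INET6))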

Ted

> -Nick
> 
> [1] http://webkit.org/blog/166/optimizing-page-loading-in-web-browser/
> [2] http://developer.yahoo.com/performance/rules.html
> [3] http://code.google.com/speed/
> 

