nanog mailing list archives

Re: New minimum speed for US broadband connections


From: Mike Hammett <nanog () ics-il net>
Date: Tue, 1 Jun 2021 12:33:52 -0500 (CDT)


"Why is 100/100 seen as problematic to the industry players?" 


In rural settings, it's low density, so you're spending a bunch of money with a low probability of getting any return. 
There's also a low probability that the customer cares. 




" There's an underlying, I think, assumption that people won't use access speed/bandwidth that keeps coming up." 


On a 95th-percentile basis, no, they don't use it. 
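For anyone not familiar with the convention: 95th-percentile measurement samples the link roughly every 5 minutes and throws away the busiest 5% of samples before picking the reported rate, so occasional bursts don't show up in it at all. A rough, illustrative Python sketch with made-up numbers (not anyone's actual billing code): 

import math 
import random 

random.seed(1) 

# Hypothetical month of 5-minute utilization samples for one subscriber (Mbps). 
# Mostly near-idle, with occasional bursts toward the full access rate (made-up). 
rates = [1, 3, 5, 50, 300] 
weights = [62, 25, 10, 2, 1] 
samples = random.choices(rates, weights=weights, k=30 * 24 * 12) 

def percentile_95(values): 
    ordered = sorted(values) 
    # Discard the top 5% of samples; the highest remaining value is the 95th percentile. 
    return ordered[math.ceil(0.95 * len(ordered)) - 1] 

print("peak sample:    ", max(samples), "Mbps") 
print("95th percentile:", percentile_95(samples), "Mbps") 

With that made-up distribution, the 95th-percentile rate lands around 5 Mbps even though peak samples hit 300 Mbps, which is exactly why burst usage doesn't register on this measure. 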


On shorter time spans, sure. Does it really matter, though? If I can put a 100-meg file into Dropbox in under a second 
versus 10 seconds, does that really matter? If Netflix gets my form submission in 0.01 seconds instead of 0.1 seconds, 
does it matter? 
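To put rough numbers on the Dropbox example (ignoring protocol overhead, TCP ramp-up, and whatever the far end can absorb), the back-of-the-envelope math looks like this; the file size and speed tiers are just illustrative: 

# Rough transfer time for a 100-megabyte upload at various access speeds. 
# Ignores overhead, TCP slow start, and the remote side's own limits. 
FILE_BITS = 100 * 8 * 1_000_000   # 100 MB in bits (decimal megabytes) 

for speed_mbps in (5, 10, 25, 100, 1000): 
    seconds = FILE_BITS / (speed_mbps * 1_000_000) 
    print(f"{speed_mbps:>5} Mbps: {seconds:6.1f} s") 

Going from 100 Mbps to 1 Gbps turns roughly 8 seconds into under a second, which is the difference in question. 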




I think you'll find few who argue against "faster is better." The argument is: at what price? For what perceived benefit? 




Show me an average end-user who can tell the difference between a 10-meg upload and a 1-gig upload, aside from 
media-heavy professionals or the one-time full backup of a phone, PC, etc. Okay, show me two of them, ten of them... 




99% of the end-users I know can't tell the difference between any speeds above 5 megs. At that point, it either just works or 
it doesn't. 



----- 
Mike Hammett 
Intelligent Computing Solutions 
http://www.ics-il.com 

Midwest-IX 
http://www.midwest-ix.com 

----- Original Message -----

From: "Christopher Morrow" <morrowc.lists () gmail com> 
To: "Mike Hammett" <nanog () ics-il net> 
Cc: aaron1 () gvtc com, "nanog list" <nanog () nanog org> 
Sent: Tuesday, June 1, 2021 12:14:43 PM 
Subject: Re: New minimum speed for US broadband connections 

On Tue, Jun 1, 2021 at 12:44 PM Mike Hammett <nanog () ics-il net> wrote: 

That is true, but if no one uses it, is it really gone? 

There's an underlying assumption, I think, that keeps coming up: that people won't use the access speed/bandwidth. 
I don't think that's an accurate assumption. I don't think it has ever really been accurate. 


There are a bunch of examples in this thread of reasons why 'more than X' is a good thing for the end-user, and why 
average usage over time is a bad metric to use in the discussion. At the very least, the ability to get around 
serialization delays and microburst behavior is beneficial to the end-user. 
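To put a number on the serialization-delay point: it's just the time to clock one packet's bits onto the wire, so it scales inversely with the access rate. A quick illustrative sketch, assuming a full 1500-byte frame and ignoring L1/L2 framing overhead: 

# Serialization delay: time to clock one packet's bits onto the wire, 
# independent of distance or queuing. Assumes a 1500-byte frame. 
PACKET_BITS = 1500 * 8 

for speed_mbps in (5, 10, 25, 100, 1000): 
    delay_us = PACKET_BITS / speed_mbps   # bits / Mbps == microseconds 
    print(f"{speed_mbps:>5} Mbps: {delay_us:7.1f} us per packet") 

A packet that occupies the link for 2.4 ms at 5 Mbps takes 12 us at a gig, which is the kind of head-of-line effect a faster access link smooths out. 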


Maybe the question that's not asked (but should be) is: 
"Why is 100/100 seen as problematic to the industry players?" 
