
Re: Net Neutrality and its potential effect on Public Healthcare and Medical Research

From: William Bug <William.Bug@DrexelMed.edu>
Date: Fri, 21 Jul 2006 03:47:32 -0400
Message-Id: <DA4DD58A-5384-441D-8C81-7BF339E096AC@DrexelMed.edu>
Cc: "w3c semweb hcls" <public-semweb-lifesci@w3.org>
To: Eric Neumann <eneumann@teranode.com>
Hi All,

I've been meaning to give my strong endorsement to Eric's suggestion
about Net Neutrality (NN).

We've already run into issues related not so much to bandwidth as to
latency.  Most people miss this subtlety when discussing the NN
issue, but for those of us building low-latency, real-time image
delivery applications - e.g., network-based systems that dynamically
slice through large 3D, spatially-mapped data sets, covering not only
brain images but also spatially-mapped gene & protein datasets -
latency is just as important as bandwidth.  Where NN is concerned,
providers will be throttling latency as well as bandwidth.  In fact,
much of what they seek to do in this arena will likely focus on
control of latency.
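To make the latency-vs-bandwidth point concrete, here's a back-of-the-envelope sketch - a simple additive transfer-time model with purely illustrative numbers, not measurements from any real deployment of ours:

```python
# Back-of-the-envelope transfer-time model: time = round_trip_latency + size / bandwidth.
# All numbers below are illustrative, not measurements from any real network.

def transfer_time(size_bytes, bandwidth_bps, rtt_s):
    """Seconds to fetch one response of size_bytes over a link with
    the given bandwidth (bits/s) and round-trip latency (s)."""
    return rtt_s + (size_bytes * 8) / bandwidth_bps

# Interactive slicing fetches many small tiles, one per user gesture.
tile = 64 * 1024                             # one 64 KB image tile
fast = transfer_time(tile, 100e6, 0.010)     # 100 Mb/s link, 10 ms RTT
slow = transfer_time(tile, 100e6, 0.200)     # same bandwidth, 200 ms RTT

print(f"10 ms RTT:  {fast * 1000:.1f} ms per tile")
print(f"200 ms RTT: {slow * 1000:.1f} ms per tile")
# With identical bandwidth, the high-latency path is roughly 13x slower
# per interaction -- for small requests, latency dominates, not bandwidth.
```

Note that doubling the bandwidth in this model barely helps the slow case, while halving the RTT nearly halves it - which is exactly why a provider throttling latency hurts interactive applications even when the advertised bandwidth is untouched.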

I think this specific concern has similar ramifications when
considering SW applications, such as those broadly outlined in Eric's
message below.

There are plenty of technical tricks to get around this problem,
especially when using IPv6, but if providers start screwing around
with these parameters, it will be out of our control to provide this
sort of application unless both server and client are on a
"protected" network.  Some of the large, GRID-centric science
infrastructure projects already provide this sort of highly
specialized network space, and for some applications it is absolutely
required.  We've found, however, that many extremely useful network-
based tools can function even over the plain, vanilla Internet - but
only when there's a fairly level playing field in terms of access.
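To give one hedged example of the kind of trick alluded to above (an illustration on my part, not a description of our actual systems): an application can mark its packets with a DSCP value in the IPv6 Traffic Class byte, asking cooperating routers to expedite latency-sensitive flows - but whether any given provider honors that mark is exactly what's at stake:

```python
# Sketch: marking a socket's IPv6 Traffic Class byte with a DSCP value so
# that cooperating routers can prioritize latency-sensitive traffic.
# DSCP EF (Expedited Forwarding, 46) is the conventional low-latency class.
# Requires a platform exposing IPV6_TCLASS (e.g., Linux).

import socket

DSCP_EF = 46                  # Expedited Forwarding code point
tclass = DSCP_EF << 2         # DSCP occupies the top 6 bits of the byte

sock = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
sock.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_TCLASS, tclass)

# The mark only helps if every hop between server and client respects it --
# trivially true on a "protected" research network, entirely at the
# provider's discretion on the public Internet.
sock.close()
```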

Should the regulatory decisions on NN go in favor of the providers
who seek this control over latency & bandwidth so as to squeeze more
$$$ out of users, it will also tend to create, for scientific
applications as well, the very sort of "class" system people fear may
arise on the public Internet.

There's also the issue of Internet2 (http://abilene.internet2.edu/),
which I'm finding most campuses don't really understand.  Even for
universities that tout a connection to I2 - even a direct link to the
Abilene backbone (or whatever the "new" I2 backbone will be called) -
users on that campus only get I2-level performance if LAN-wide
policies are implemented governing how I2 packets get routed.  In my
experience, this is not being done unless the research lab itself
takes it on as an issue and works out the details with the local IT
staff - a highly impractical requirement for most of the labs we seek
to provide resources for.  If NN regulatory decisions are made in
favor of allowing providers to do as they please, this issue of
I2-level connectivity/throughput will likely get MUCH more
complicated.

By the way, I came across this hilarious recent clip from Jon  
Stewart's Daily Show that accurately depicts one of my biggest fears  
on this issue - the legislators responsible for overseeing these  
decisions really don't have a very good understanding of the  
underlying technology - which implies they also won't have much  
appreciation for the potential ramifications of the decisions they  
make on this issue:


On Jun 24, 2006, at 2:47 PM, Eric Neumann wrote:

> Many of you may be wondering what the issue around Net Neutrality  
> (NN) has to do with Healthcare and Life Sciences Research. The  
> truth is we really don't know yet for sure, but it could be  
> significant, and we shouldn't ignore the possible consequences at  
> this very critical point in time-- I'll share some of the reasons I  
> can think o...
> Net Neutrality is under siege because of the corporate interests to  
> generate large profits through the (bad) control of high-bandwidth  
> access. The Semantic Web, though itself not requiring high
> bandwidth (yet), opens the door to better access to large amounts
> of highly relevant information for the researchers, providers, and  
> consumers of healthcare. Consider the following scenarios:
> - Secure access for the Public to our private, managed electronic  
> Health Records in the future, which will include not just our data  
> and MRI scans, but intelligent references to associated background
> information and images pertaining to knowledge of diseases and
> available treatments.
> - Guaranteeing all citizens the best possible care by providing  
> full medical information to all care-givers everywhere; hospitals
> need to offer access to National Health Library information to all  
> their physicians and specialists (perhaps charters should be  
> created here, for government health orgs such as NIH and NHS).
> - Complete assembly of megavariate datasets (genes x dosing x  
> tissue x genotype) and imaging data to be used by the full research  
> community, e.g., BIRN.
> - Mega-Grid applications involving petabyte simulations and  
> analyses that can be requested by any scientist from anywhere in  
> the world.
> - Other areas of scientific research that will require high
> bandwidth, including astronomy, geospatially distributed ecological
> data (e.g., NOAA), and real-time, large-volume epidemiological
> studies of fast-spreading diseases (e.g., SARS and H5N1 reporting).
> All these require high-bandwidth network communications that should  
> remain unhampered and evenly available to all. I would like to  
> point you to TimBL's blog on this topic and on the issues  
> surrounding Net Neutrality: http://dig.csail.mit.edu/breadcrumbs/node/144
> Another blog by Jon Stokes illustrates the salient points through  
> examples:  http://arstechnica.com/news.ars/post/20060623-7127.html
> I'm in no way suggesting changing the focus of HCLS, since it
> should remain true and productive to its goals. But the NN issues
> could have far-reaching consequences for the ideas and vision we're
> proposing, and I don't want our efforts to come to naught due to
> political myopia (any ophthalmologists on the list?). If others also
> feel this is timely and needs to be addressed, we can set up a wiki
> to capture our thoughts and recommendations on the issues.
> cheers,
> Eric
> Eric Neumann, PhD
> co-chair, W3C Healthcare and Life Sciences,
> and Senior Director Product Strategy
> Teranode Corporation
> 83 South King Street, Suite 800
> Seattle, WA 98104
> +1 (781)856-9132
> www.teranode.com

Bill Bug
Senior Analyst/Ontological Engineer

Laboratory for Bioimaging  & Anatomical Informatics
Department of Neurobiology & Anatomy
Drexel University College of Medicine
2900 Queen Lane
Philadelphia, PA    19129
215 991 8430 (ph)
610 457 0443 (mobile)
215 843 9367 (fax)

Please Note: I now have a new email - William.Bug@DrexelMed.edu

Received on Friday, 21 July 2006 07:47:42 UTC
