[A2k] We need a new formulation of end-to-end analysis

Seth Johnson seth.p.johnson at gmail.com
Thu Jan 24 09:46:41 PST 2013

I'm not sure about human terms, but the point is general-purpose
processing, whether online or on a local device.  We're losing both
points, and I generally stress to people that they're losing their
computers and their ability to communicate flexibly across networks
throughout the world.  Decentralized?  More like: you can't do things
yourself unless networks interoperate in a general-purpose way.
Network neutrality is inherent once you get that point, whether as a
rule or as a natural outcome of autonomous networks interoperating in
a general-purpose way.

E2E is intuitive: if you want to be able to do things flexibly, those
things aren't going to get through unless the network of networks (one
network is another thing, and isn't the internet anyway) transfers
information without regard for what it's doing.
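The classic illustration of that point in the Saltzer/Reed/Clark line of argument is integrity checking: no check inside the network can cover the hops that come after it, so the check has to live at the endpoints, and the network can stay application-blind. A minimal sketch of that idea (the function names here are illustrative, not from any cited paper):

```python
import hashlib

def send(payload: bytes) -> tuple[bytes, str]:
    # The sending endpoint computes a checksum over the payload.
    return payload, hashlib.sha256(payload).hexdigest()

def network_transfer(payload: bytes) -> bytes:
    # The network just moves bytes; it neither knows nor cares which
    # application they belong to, and it may corrupt them in transit.
    return payload

def receive(payload: bytes, digest: str) -> bytes:
    # Only the receiving endpoint can verify that what arrived is what
    # the sender meant; a partial check at an intermediary node could
    # not guarantee this "completely and correctly".
    if hashlib.sha256(payload).hexdigest() != digest:
        raise ValueError("corrupted in transit; retransmit")
    return payload

payload, digest = send(b"hello, world")
print(receive(network_transfer(payload), digest))
```

Nothing in `network_transfer` depends on the application, which is exactly the "dumb network" property the quoted paragraph below derives from the end-to-end analysis.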

But this is academic; you know all this.  I guess I'm unclear about
the appeal you're trying to make, or the practical difference you're
inclining toward.  If you want to use new terms that say humans
interact at their own endpoints, or interact with a remote server,
what stops you from doing that now?



On Thu, Jan 24, 2013 at 10:38 AM, Philippe Aigrain (perso Wanadoo)
<philippe.aigrain at wanadoo.fr> wrote:
> End-to-end analysis is the major theorization of the Internet, proposed
> by Jerome Saltzer, David Reed and David Clark starting in 1981. In
> their seminal paper and later ones, they formulated what became known as
> the end-to-end principle, often interpreted as “application-specific
> functions ought to reside in the end hosts of a network rather than in
> intermediary nodes – provided they can be implemented ‘completely and
> correctly’ in the end hosts”. This principle is much quoted by proponents
> of strong network neutrality requirements, myself included. In reality,
> Saltzer, Reed and Clark derive this “networks had better be dumb, or at
> least not too smart” approach from an underlying analysis of what happens
> when bits travel from one end (a computer connected to a network) to
> another end of a network.
> However, both network neutrality and the end-to-end principle capture
> only part of what we try to make them say. What we have in mind is that
> the analysis of what happens in a network should be conducted by
> considering what happens between the human using one device and another
> human using another device, or between one such human and a remote
> device, such as a distant storage device, server or peer computer. We
> need an end-to-end analysis which is understood as
> <em>human-to-human</em> or <em>human-to-remote computer</em>.  What will
> it change? One must first acknowledge that with this extended approach,
> one can't hope to extend the probabilistic model which makes the
> original formulation of Saltzer, Reed and Clark so compelling. The new
> formulation can't replace the old one; it can only extend it with
> qualitative reasoning...
> more at
> http://paigrain.debatpublic.net/?p=6418&lang=en
> _______________________________________________
> A2k mailing list
> A2k at lists.keionline.org
> http://lists.keionline.org/mailman/listinfo/a2k_lists.keionline.org
