[discuss] ICANN policy and "Internet Governance"

JFC Morfin jefsey at jefsey.com
Tue Jan 7 18:24:50 UTC 2014


At 18:57 06/01/2014, Mawaki Chango wrote:
>JFC,
>Two questions:
>1. Have you implemented for yourself what you just described as 
>gathering the "'broken missing layer' functions and services into a whole"?

Hey! There is a misunderstanding.
I respond to you as I responded to Nathalie, just to be sure I did 
not hide anything from the community before they engage in the chase 
for Sao Paulo stools.

What is being discussed here is only to continue what Louis 
imagined and tested, what Vint Cerf made a success of, and what 
Berners-Lee (W3C), Mark Davis (Unicode), etc. have engaged in, 
rationalizing it from the consolidated experience that we have all 
obtained together over the last 40 years.


Ways to proceed

There are conceptually several ways of doing it that each of us may 
propose/discuss. My favored one is the approach that:

* I received and geographically deployed 35 years ago (Tymnet 
technology) and had to extend in terms of services
* I architected 20 years ago for my own business machines and operations
* and I constantly considered and experimented with on the internet 
(e.g. dot-root community testing as per ICP-3, lack of use of the 
root servers for years, the precautions I obtained from the IETF, etc.)

I will call it pragmatically agorical (I will explain this key point 
below). It led me to think that it should be developed as an 
interoperation system with an application-as-a-protocol approach, 
allowing existing applications additional "metaconnections" (call 
them networked hooks, interlinks, etc.) at a minimum development cost 
for them, for a simple and progressive transition. The cost and work 
amount to a simplification whose complexity I will now explain.


>2. Do you think there is the slightest possibility to make it so 
>that any user lambda, with some determination for sure, would be 
>able to implement all the steps needed to realize that for 
>themselves? If not what would it take --apart from the need to 
>wrestle with a lot of "engineerese" (as compared to 'legalese')?

The whole thing can only result from network open code by FLOSS 
developers with a single target: to make it simpler for end users, so 
they can do much more in their own chosen personal way.

* On this list we have many experts and practitioners of the logic of 
connecting hosts and users *through* a network of geographical dumb networks.
* What we need is new expertise that is able to think about the way 
to agorically connect persons and processes (P&P) to the network 
intelligence (inte-legere) of multi-semantic smart networks.


The agorical why

Let us understand this.

* Since Aristotle, for 2,350 years we have simplified the rhetorical 
information of the fractally deterministic chaos of reality through 
the dialectic of logic (syllogism).
* Norbert Wiener identified an efficient way to proceed further: 
to question the chaos directly through a monolectic process: 
action/reaction. A reaction is the response of that chaos we call reality.
* 45 years ago some people dug further and started dealing with chaos 
networking itself, considering it as a polylectic process comparable to 
what happens on an agora.

So we now are confronted with "agorics" as the discipline of studying 
agoras because networks are kinds of agoras. We already know a few 
things about agorics:

* by its essence, a network is polylectic
* complexity looks like the agorical form of logical simplicity 
and cybernetic unicity
* monolectic cybernetics seems to call for a vision and results in reactions
* dialectic logic seems to call for reason and results in conclusions
* polylectic agorics seems to call for reflection and results in emergences.

This is why openly discussed code and parameter sets can conceptually 
fit the bill. We want and need to facilitate reflection until we 
reach intercomprehension at every scale.


e-Empowerment for all

This is the way to interface with increasingly complex, large, 
and powerful networked applications and to permit everyone to master 
them with the help of their private and secure personal digital supervisor.

It took decades to standardize the interface with the machine CPU 
through the OS (POSIX).

The same amount of time was needed to interface Hosts with other 
Hosts, in particular through the internet.

Now, we need to extend the standardization of the interface of our 
digitalized person with the network at large (this is the presentation layer).

NB: Digitalized means that we as persons are in full empowered 
control of our human self, and of its digital appendices and 
extensions through a supervisory panel that we fully master (i.e. 
designed and trained by and for our individual personality).


By all

You, therefore, can measure to what extent the targeted emergences 
involve and call for the contributions of everyone, at least in the 
personal fitting phase. You can also see why open-minded 
developers, who are used to multicultural thinking, are an advantage.

This is all the more true given that intellition (the 
intelligent-presentation-based discipline) is by nature a semiotic 
domain and will depend on "multimecalingualization" as a continuation 
of Unicode's globalization (internationalization, localization, 
filtering). The man/machine dialog will certainly extend beyond the 
keyboard and mouse, touch screen, Google glasses, etc.


Starting with the DNS

This is why my best starting bet is the DNS as the sole utility 
where man directly relates with the internet machinery. One can use 
it (cf. BINDX kinds of work) as a kernel for netix functor add-ons. 
This is why I introduced the ML-DNS (Multi-ledger DNS) concept when 
discussing IDNA2008: a concept that would fully respect the Internet 
DNS solution and that I could use to continue, extend, and generalize 
the whole support of the digital name space.

This led us (IUCG French writers, with the support of some other 
linguistic writers) to act as a blocking filter against 
non-orthotypographic propositions 
(http://translation-blog.trustedtranslations.com/the-importance-of-orthotypography-2012-08-27.html). 
The filter was to test the ability of the IDNA2008 drafts under 
discussion to support French-language majuscules. After months and an 
appeal, a consensus was found in minutes when Pete Resnick and Paul 
Hoffman introduced the draft of RFC 5895, in which they exemplify the 
customization (subsidiarity) of the IDNA2008 interface at the user's fringe.
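To give an idea of what such a user-fringe customization looks like, here is a minimal Python sketch in the spirit of RFC 5895's local mapping. It is my own simplification, not the RFC's exact procedure (the RFC specifies Unicode casefolding plus width and ZWJ/ZWNJ handling, among other steps): majuscules are folded to lowercase and the label is normalized before it ever enters the IDNA2008 protocol proper, so French capitals are accepted at the user's interface without touching the protocol itself.

```python
import unicodedata

def local_map(label: str) -> str:
    """A user-fringe mapping in the spirit of RFC 5895 (simplified).

    Majuscules are folded to lowercase and the result is normalized
    to NFC before the label is handed to the IDNA2008 protocol.
    The protocol itself is left untouched: subsidiarity at the fringe.
    """
    return unicodedata.normalize("NFC", label.lower())

# A French label typed with majuscules is mapped at the user's fringe:
print(local_map("Électricité"))  # électricité
```

The point of the exercise is exactly the one made above: the mapping is a local, customizable policy decision at the user's side, not a change to the DNS or to IDNA2008 themselves.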


Comments

However, this calls for a few comments.

1. Much work (and, therefore, funding or dedication) is needed.

Complexity is multidimensional simplicity. Simplicity is the basic 
principle of the (by then very large) internet system architecture, 
as per RFC 3439, which continues RFC 1958. Simplification has until 
now been accepted as the most demanding part of software development. 
Complexification will probably be more demanding still: this is what 
is to be eased.


2. This cannot be addressed by an IETF area.

The reason is simple: what the user wants is to master their 
relations with and throughout the digital world (the digisphere), 
*not only the internet*. The internet is, therefore, only a socket 
into the internet layer. This has two huge advantages:

* Network neutrality MUST be built in. Relations are layer six to 
layer six and above.

* The border with the Internet technology is perfectly defined and 
protected from layer violations by the IETF vocabulary (RFC 2119): 
the keywords "MUST", "MUST NOT", and "REQUIRED" are to be used when 
applied to layer 6 issues, and layer 6 MUST consider them as 
untouchable "IS/ARE". This simple rule was discovered and proven by 
the WG/IDNA2008 consensus.


3. Dr. Lessig's "code is law" applies even more.

This is because the "code" is still nearer to the cerebric (legal) 
will. Privacy is all the more concerned since people will probably 
want to delegate more to their supervisor. Privacy concerns and 
protection will have to extend to digital intimacy, and to acts of 
thought and conscience. Brainwashing will become possible by digital 
proxy, by interfering with personal referent sources. The NSA and 
hackers have shown that this is already possible: everyone will have 
to have their own PRISM (personal reality information system monitor). 
We are naturally digitally naked: this is a reality that we will 
become familiar with.


4. Architectonical protection

This will certainly make it extremely complicated to tap exchanges 
and to hack systems, which will be good for national cyberdefense. 
However, it should also be understood that misdemeanors will be 
more easily traced, not by costly spying on them, but simply by the 
consolidated intellition of public facts. As in common, real life.


5. Opposition expected

The technical, economic, cultural, and political impacts will be 
important enough to be slowed, opposed, laughed at, fought, etc., 
until they necessarily win due to the momentum of the digital 
world's emergence. Given its various contexts and dynamic governance, 
the Sao Paulo meeting is a probable first battleground.

This is a major opportunity for developing countries, because you do 
not need mines, industries, etc. to be clever. "There is no wealth 
but men" (Jean Bodin).


6. Where to discuss that

The place to discuss it is at the crossroads between 
architecture, engineering, usership, civil society, innovative 
business, etc.: the https://www.ietf.org/mailman/listinfo/iucg 
mailing list, for those interested.

Everything is to be reviewed there, together.
For and by those who may be interested.
« one small step for a working few, one giant leap for mankind »

jfc  