[discuss] Possible approaches to solving "problem no. 1"
jefsey at jefsey.com
Sat Feb 22 16:20:00 UTC 2014
At 02:35 21/02/2014, Andrew Sullivan wrote:
>On Fri, Feb 21, 2014 at 01:24:51AM +0100, Michel Gauthier wrote:
> > globalization? How do you understand the Unicode "globalization" and
> > the normative internationalization.
>What is "normative" internationali[s|z]ation?
You are pinpointing the core of the international normalization
issues that underlie something you and I faced in two IETF WGs.
Industrial colonization strategies develop through three of the
concepts you quote: internationalization, localization, and
globalization, plus a fourth one, categorization. I have documented
them many times, so I will only describe them succinctly here.
Normalization is the description of normality. It describes things as
they are or as people want them to be. It used to be national, but
with the development of international trade it has become more and
more global.
Standardization is the way one efficiently refers to normality,
either to take advantage of it or to bring it about. Standardization
documentation is a political tool to control international industries
and markets. Whoever masters the standards masters the subsequent
normality and can benefit from it. This is achieved by
internationalizing one's standards and/or vision, i.e. getting them
internationally adopted.
The advantage of an international standard is that each local
industry and market only has to adapt to one single standard rather
than to everyone else's. The local adaptation to local norms is
localization. The local drawback is that local industry becomes
dependent on the internationalized standard of the dominant
producers: this is called industrial colonization.
Two things then have to be considered:
- national markets observed that the
internationalization/localization system favored dumping and allowed
TNCs (transnational corporations) to produce more, reduce their
costs, sell cheaper, and colonize local markets. The response of
local governments was to modify the desired local norms. This created
Technical Barriers to Trade (TBT). Some were deemed legitimate,
others not. TBT regulation is at the core of international business.
- categorization: it was also observed that localization called for
some advantageous sector-specific organization or regulation (e.g.
semi-finished parts, market segments, industrial sectors, languages,
etc.). There was, therefore, a need to categorize them, which helped
cooperation in penetrating foreign markets.
World trade and influence are composed of economy and power. The US
is a leader and its internationalization reach is global. With the
advent of computers (as Brian noted), the meaning of "global" (the
entire geographical world) extended to the logically global (this is
attested as early as IEN 48, where Vint Cerf notes that he uses the
concept of "local" loosely, ranging from geographically local to
local to a system).
Mark Davis, in order to remove the linguistic barrier between IBM
computers and their foreign users, globalized them, providing a very
efficient example of technical globalization that he generalized to
US and world industry as Unicode (the source of ISO 10646). It was
incorporated into POSIX and the Internet (which lacks the
presentation layer responsible for language and security support),
extending ASCII (i.e. binary English) to support any non-ASCII script
and to permit quoting any language, expressed in Unicode code points,
within the framework of English-based protocols. Mark Davis'
globalization's internationalization (Unicode/ISO 10646) is actually
the most impressive colonization move in human history.
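As a small illustration of my own (not anything from Mark Davis or the Unicode Consortium), the Python standard library shows how non-ASCII text is carried inside ASCII-only, English-framed protocol elements: UTF-8 for message bodies, and IDNA/Punycode for the ASCII-only repertoire that legacy DNS requires.

```python
# Sketch (editor's own): a non-ASCII script squeezed into ASCII protocols.
label = "münchen"  # a non-ASCII hostname label

# UTF-8: Unicode code points serialized as bytes.
utf8_bytes = label.encode("utf-8")
print(utf8_bytes)        # b'm\xc3\xbcnchen'

# IDNA/Punycode: the same label re-expressed in pure ASCII for the DNS.
ascii_label = label.encode("idna")
print(ascii_label)       # b'xn--mnchen-3ya'
```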
Because Internet technology does not offer unified layer-six
presentation services, Mark Davis had his first need supported by the
applications instead. But those applications still needed to filter
the data flows and, therefore, to categorize them. He already had a
very precise evaluation of application needs, which he was able to
reinforce by joining Google, and he addressed them through his RFCs
on lang-tags, building the most impressive segregation capacity in
human history. The data-flow and end-use human categorization power
of the lang-tags is tremendous, and it was initially presented as
purely technical and, therefore, outside of any national legal protection.
For example, lang-tags make it easy to automate "retro-meta-spam".
You send mails around in different languages, each with its lang-tag.
Responses can then easily be filtered by lang-tag to associate
mailing addresses with socio-linguistic categories. Another example:
you can non-neutrally favor (a US judge has now ruled this acceptable
in the USA) traffic tagged as "en-latn-us" ...
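The filtering described above can be sketched in a few lines. This is a minimal illustration of my own (the addresses and tags are invented, the tag parsing is simplified relative to BCP 47), showing how replies are sorted into socio-linguistic buckets by their language tag:

```python
# Sketch (editor's own): categorizing respondents by the language tag
# of their reply. Tags follow the language-script-region pattern,
# e.g. "en-Latn-US".

def parse_tag(tag):
    """Split a simple language-script-region tag into its subtags."""
    parts = tag.split("-")
    language = parts[0].lower()
    script = region = None
    for part in parts[1:]:
        if len(part) == 4 and part.isalpha():
            script = part.title()       # e.g. "Latn"
        elif len(part) == 2 and part.isalpha():
            region = part.upper()       # e.g. "US"
    return language, script, region

# Hypothetical replies: (address, language tag of the response).
replies = [
    ("a@example.org", "en-Latn-US"),
    ("b@example.org", "fr-FR"),
    ("c@example.org", "en-GB"),
]

# Associate each address with a socio-linguistic category (language subtag).
by_language = {}
for address, tag in replies:
    language, _, _ = parse_tag(tag)
    by_language.setdefault(language, []).append(address)

print(by_language)
# {'en': ['a@example.org', 'c@example.org'], 'fr': ['b@example.org']}
```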
As a summary, I give again the Unicode/POSIX "globalization" description as:
- internationalization of the medium: the capacity to support code
points for every language.
- localization of the end: the capacity to adapt the received data
flow to the application's/user's expectations.
- categorization of the data flow: to permit filtering, allowing
routing, processing, servicing, marketing, and sales on a
language/linguistic-market basis.
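The "localization of the end" step can itself be sketched as a tag-matching fallback, in the spirit of the RFC 4647 lookup scheme. This is my own sketch under invented assumptions (the catalog of available localizations is hypothetical):

```python
# Sketch (editor's own): adapting the received data flow to the user's
# expectations by matching their language tag against the localizations
# an application actually has, truncating subtags from the right
# (in the spirit of RFC 4647 "lookup").

def lookup(user_tag, available):
    """Return the best available tag, or None if nothing matches."""
    subtags = user_tag.lower().split("-")
    while subtags:
        candidate = "-".join(subtags)
        if candidate in available:
            return candidate
        subtags.pop()  # drop the rightmost subtag and retry
    return None

# Hypothetical catalog of localized message bundles.
available = {"en", "en-us", "fr", "pt-br"}

print(lookup("pt-BR", available))       # pt-br
print(lookup("en-Latn-US", available))  # en  (falls back past en-latn)
print(lookup("de-DE", available))       # None (no localization available)
```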
The technical lack of a presentation layer six (language, security,
etc.) explains why any "globalization" of anything related to the
Internet leads to an English/US internationalization. Parallel
effects also result from the US over-influence at ISO, JTC1, IEEE,
IETF, W3C, etc., and result in what is perceived as a de facto
e-colonization of the world. Without its multilinguistic and security
mechanisms, the "IEN 48 first motivation" can only be insecurely
American. This is not a bug at all; it is only a feature limitation
as long as the "IEN 48 second motivation" is not fulfilled (http://vgnic.org).
For decades, since the end of WWII, the world accepted this
globalization for many good and less good reasons, tempered by some
key normative requirements such as the required English/French
bilingualism of ISO documents: it obliges standards to be thought
through in two ways.
Things have slowly changed, and the responses must adapt. This is the
purpose of meetings such as the one in São Paulo.
BTW, you will note that the meeting on the governance of the *global*
network is called "NetMundial".
>I think it will be important in this discussion to distinguish among
>at least three terms. One of them is "globalization", the meaning of
>which seems in this thread to be up in the air. Therefore, I'll not
>talk about it.
>A second is "internationalization" or "internationalisation" or "i18n"
>(for "i-18-letters-n"), which in the Internet protocol world is really
>just about preparing parts of a system for use by diverse language
>users. "Just use Unicode" is an i18n slogan. The point in this case
>is that there are things that are exposed to diverse user communities
>and they need to be available to them in ways that are convenient for
>them. This is what (for instance) IDNA is about: making a protocol
>_available_ to support users.
>A third is "localization" or "localisation" or "l10n". In the
>Internet protocol (or, more accurately, "above" the Internet
>protocols) world, _this_ is the thing that is intended for users.
>I18n is useless for users. Nobody wants a user interface
>simultaneously in English and Farsi and Chinese. L10n is intended to
>make the user-facing features sensitive to the user's local
>conditions, to the extent that is automatically determinable. We have
>learned a great deal about this issue over the past 15 or 20 years,
>but there is more yet to do. If I may be permitted a modest plug, the
>IAB's Internationalization program (we use USian spelling in the IAB
>these days) is in fact currently engaged in tackling an update to RFC
>2277. It's early days, however.
>I hope these distinctions are helpful to the current discussion.
>ajs at anvilwalrusden.com