IGF, Internet Governance, News, Open Source, technology

IETF 96 remote participation from a Chennai Hub being considered

IETF 96 is scheduled to be held in Berlin, Germany, from July 17 to 22, 2016. Those interested in remote participation may register at the IETF Meeting Registration System, choosing Registration Type: Remote Participant.

If there is sufficient interest in participating from a common hub at Chennai, the Internet Society India Chennai Chapter will set one up. Please register your interest by filling out the form; seats are very limited.


DNSSEC, Future of Internet, IGF, Internet, IPv6, new gTLD, News, Open Source, technology

Y2K20: Opportunities in design and testing for freelance application developers and for IT companies of all sizes, from small to huge.

It was not uncommon for the earliest Web application developers to assume that all domain names would end in .com and that all email addresses would follow the format @xyz.com. While developers took newer top-level domains such as .info into account in due course, most continued to design applications that accept domain names and email addresses only in ASCII, much as software developers in the 80s assumed it would be unnecessary to use more than two digits to denote the year, the assumption that led to the famous Y2K issue towards the year 2000.

Imaginary logo of "Y2K20", a name that does not exist

Now there are new top-level domain names (such as .family and .game) and internationalized domain names in the various native, non-ASCII scripts of India and the world, such as .இந்தியா and .इंडिया ("India" in the Tamil and Devanagari scripts), as well as internationalized email addresses that allow users to have addresses in their native scripts.

If a browser or a form on a webpage restricts domain names or email addresses with a rule such as "a domain name must be in English and end with .com, .net or .org" or "an email address must consist of English letters or numerals", then it is archaic.
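
As a concrete illustration, here is a minimal Python sketch contrasting an archaic validator of that kind with a more lenient check. The function names and the regular expression are invented for this post, not taken from any real product:

```python
import re

# An "archaic" validator of the kind described above: it accepts only
# ASCII addresses whose domain ends in .com, .net or .org.
ARCHAIC_EMAIL = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.(com|net|org)")

def archaic_accepts(address):
    return ARCHAIC_EMAIL.fullmatch(address) is not None

# A more UA-friendly check: split at the last "@", require a non-empty
# local part and a domain containing a dot, and impose no script or
# TLD restrictions at all.
def ua_friendly_accepts(address):
    local, sep, domain = address.rpartition("@")
    return bool(sep and local and domain) and "." in domain

print(archaic_accepts("user@example.game"))              # False: rejects a real new gTLD
print(ua_friendly_accepts("user@example.game"))          # True
print(ua_friendly_accepts("उपयोगकर्ता@उदाहरण.भारत"))      # True: an internationalized address
```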

In its dimensions it is a far larger problem than the Y2K problem, which kept the IT community of the entire world talking in the run-up to the year 2000. This problem of "Universal Acceptance" appears to receive inadequate attention, both as a matter of global public interest and in terms of the commercial opportunities it presents for enterprising developers and corporations. Given the design changes to be brought about and the testing they will require, it might emerge as a huge commercial vertical in itself. #Deity #NASSCOM #WIPRO #TiE #TCS #Cognizant (If you are from a different country, please feel free to rewrite this post to suit your country and publish it. This post is not copyrighted.)

For more information, follow the publicly archived, transparent discussions on this issue in the IETF forum, at ICANN and at the Internet Society. You could also write to isocindiachennai (At) gmail (dot) com for additional pointers or any clarification, or ask your senior executives to take part in ICANN meetings, which are open and held as multi-stakeholder global meetings. And also join the Internet Society India Chennai Chapter. Such participation would involve you positively in the global Internet and also connect you to business opportunities not only in Y2K20 (there is no such term; it is coined here to describe the issue and the opportunity) but also in DNSSEC, the IPv6 transition, the Internet of Things (IoT) and new gTLDs.

What does the phrase “Universal Acceptance” mean?

“Universal Acceptance of domain names and email addresses” (or just “Universal Acceptance”, or even “UA”, for short) means that all apps and online services should accept all Internet domain names and email addresses equally.

Universal Acceptance is an important concept these days because the Internet is changing. One way that it is changing is that addresses no longer need to be composed of ASCII characters. (ASCII is the 128-character set of Latin-script letters, numerals, punctuation marks and control codes that is dominant on the Internet today. All the characters in this document so far have been ASCII characters.)

Most people on earth are not native speakers of languages written with ASCII characters, so moving from a character set limited to 128 characters to an alternative that can support more than one million is essential for those people to fully use and benefit from the Internet. This alternative is called Unicode.
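
To make the difference concrete, this small, illustrative Python snippet compares the two code spaces and shows what happens when a native-script word meets an ASCII-only assumption:

```python
ASCII_CODE_POINTS = 128              # 0x00 through 0x7F
UNICODE_CODE_POINTS = 0x10FFFF + 1   # 1,114,112 possible code points
print(UNICODE_CODE_POINTS // ASCII_CODE_POINTS)  # roughly 8,704 times larger

word = "இந்தியா"              # "India" in the Tamil script
print(word.encode("utf-8"))   # Unicode text serializes fine as UTF-8
try:
    word.encode("ascii")      # ...but has no ASCII representation
except UnicodeEncodeError as err:
    print("not representable in ASCII:", err)
```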

Another way that the Internet is changing is by allowing lots of new domain names. Not only are there simply more of them, but some are longer than any of the older domain names and many of them use the same Unicode system mentioned above.

Note: “Universal Acceptance” is sometimes confused with “Universal Access” or “Universal Accessibility”; those phrases refer to connecting everyone on earth to the Internet, and to building Internet-connected systems for all differently-abled people on earth, respectively. Universal acceptance is limited to domain names and email addresses.

A special group called the Universal Acceptance Steering Group (UASG) has been created to work on issues related to Universal Acceptance. The UASG doesn’t work on anything else (e.g. Universal Access or Universal Accessibility).

How does an app or an online service support Universal Acceptance?

Software and online services support Universal Acceptance when they offer the following capabilities (a minimal sketch in Python follows the list):

A. Can accept any domain name or email address as input from a user interface, from a document, or from another app or service

B. Can validate and process any domain name or email address

C. Can store any domain name or email address

D. Can output any domain name or email address to a user interface, to a document, or to another app or service
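
The sketch below walks through capabilities A-D for a single address. It assumes the third-party Python `idna` package (installable with `pip install idna`) for IDNA2008 conversion; the function name and the sample address are invented for illustration:

```python
import idna  # third-party: pip install idna

def handle_address(address):
    # A. Accept: take the address exactly as the user typed it, and
    # split at the last "@" into local part and domain.
    local, sep, domain = address.rpartition("@")
    if not (sep and local and domain):
        raise ValueError("expected an '@' with text on both sides")

    # B. Validate and process: converting the domain to its ASCII
    # (punycode) form raises idna.IDNAError for invalid IDNA2008 labels.
    ascii_domain = idna.encode(domain).decode("ascii")

    # C. Store: persist the original Unicode form as UTF-8 bytes.
    stored = address.encode("utf-8")

    # D. Output: decode back to Unicode for display or another service.
    return stored.decode("utf-8"), ascii_domain

# A sample internationalized address, roughly "user@example.bharat"
# in Devanagari (invented for illustration).
print(handle_address("उपयोगकर्ता@उदाहरण.भारत"))
```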

Unfortunately, older apps and online services don’t always offer those capabilities. Sometimes they lack support for Unicode; sometimes they make wrong assumptions about new domain names, or even assume they don’t exist. Sometimes they support Universal Acceptance in some features but not in all.

How can Universal Acceptance be measured?

Universal Acceptance can be measured in a few ways.

1. Source code reviews and unit testing

2. Manual testing

3. Automated testing

#1 means inspecting the source code and verifying that only the correct programming techniques, software libraries and interfaces (AKA “APIs”) have been used, then verifying that the app or service works by testing against specific test cases for the capabilities A-D listed above. #1 is only practical for app developers and online service providers.
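
As an illustration of #1, here is what a small unit-test suite for capability B might look like in Python. The validator under test is the hypothetical ua_friendly_accepts() sketched earlier; the test cases are invented for this post, not the UASG's official criteria:

```python
import unittest

# The hypothetical validator under test (from the earlier sketch).
def ua_friendly_accepts(address):
    local, sep, domain = address.rpartition("@")
    return bool(sep and local and domain) and "." in domain

class UniversalAcceptanceTests(unittest.TestCase):
    def test_new_ascii_gtld(self):
        self.assertTrue(ua_friendly_accepts("user@example.game"))

    def test_long_gtld(self):
        # New TLDs can be longer than the familiar 2-3 character ones.
        self.assertTrue(ua_friendly_accepts("user@example.technology"))

    def test_internationalized_address(self):
        self.assertTrue(ua_friendly_accepts("उपयोगकर्ता@उदाहरण.भारत"))

    def test_missing_domain_rejected(self):
        self.assertFalse(ua_friendly_accepts("user@"))

if __name__ == "__main__":
    unittest.main()
```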

UASG is reaching out directly to the community of app developers and the largest online service providers to encourage them to perform source code reviews and testing to determine the level of Universal Acceptance in their offerings. UASG is also providing a list of criteria which can be used to develop test cases for the capabilities A-D listed above.

#2 can be done by anyone, but it’s labor-intensive. An example of #2 would be submitting an email address when registering for an online service and verifying that it has been accepted. Since there are lots of potential online services to sign up for, and lots of potential new email address combinations, one must pick and choose which combinations of app, service, email address and/or domain name to test.

UASG is developing a list of top web sites, apps, email addresses and domain names suitable for testing.

#3 requires up-front technical work, but is more scalable to large measuring and monitoring efforts. An example of #3 is the recent gTLD investigation performed by APNIC on behalf of ICANN. <http://www.potaroo.net/reports/Universal-Acceptance/UA-Report.pdf>
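
In the same spirit, a toy automated probe might look like the Python sketch below. It checks whether names under a new gTLD and an internationalized TLD resolve from the local network; the sample names are assumptions for illustration, not a published test set, and the `idna` package is again a third-party dependency:

```python
import socket

import idna  # third-party: pip install idna

# Sample names chosen for illustration; registries generally operate
# nic.<tld>, but verify these names before relying on them.
SAMPLE_DOMAINS = ["nic.game", "nic.भारत"]

for name in SAMPLE_DOMAINS:
    # Convert any internationalized labels to their ASCII (punycode) form.
    ascii_name = idna.encode(name).decode("ascii")
    try:
        socket.getaddrinfo(ascii_name, None)
        print(f"{name} ({ascii_name}): resolves")
    except socket.gaierror as err:
        print(f"{name} ({ascii_name}): does not resolve - {err}")
```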

UASG is investigating methods of automated testing for Universal Acceptance and will share these as they are developed.

IGF, Internet, News, Open Source

A Walled Wide Web for Nervous Autocrats

This is a Wall Street Journal report by Evgeny Morozov.

At the end of 2010, the “open-source” software movement, whose activists tend to be fringe academics and ponytailed computer geeks, found an unusual ally: the Russian government. Vladimir Putin signed a 20-page executive order requiring all public institutions in Russia to replace proprietary software, developed by companies like Microsoft and Adobe, with free open-source alternatives by 2015.

The move will save billions of dollars in licensing fees, but Mr. Putin’s motives are not strictly economic. In all likelihood, his real fear is that Russia’s growing dependence on proprietary software, especially programs sold by foreign vendors, has immense implications for the country’s national security. Free open-source software, by its nature, is unlikely to feature secret back doors that lead directly to Langley, Va.

Picture from Wall Street Journal

Nor is Russia alone in its distrust of commercial software from abroad. Just two weeks after Mr. Putin’s executive order, Iran’s minister of information technology, citing security concerns, announced plans for a national open-source operating system. China has also expressed a growing interest. When state-owned China Mobile recently joined the Linux Foundation, the nonprofit entity behind the most famous open-source project, one of the company’s executives announced—ominously to the ears of some—that the company was “looking forward to contributing to Linux on a global scale.”

Information technology has been rightly celebrated for flattening traditional boundaries and borders, but there can be no doubt that its future will be shaped decisively by geopolitics. Over the past few years, policymakers around the world have had constant reminders of their growing dependence on—and vulnerability to—the new technology: the uncovering of the mysterious China-based GhostNet network, which spied on diplomatic missions around the globe; the purported crippling of Iran’s nuclear capability by the Stuxnet virus; and, of course, the whole WikiLeaks affair. Governments are taking a closer look at who is providing their hardware, software and services—and they are increasingly deciding that it is dangerous not to develop independent national capabilities of their own.

Open-source software can allay some of these security concerns. Though such systems are more democratic than closed ones, they are also easier to manipulate, especially for governments with vast resources at their command. But open-source solutions can’t deal with every perceived threat. As Google learned, the Chinese government continues to see Western search engines as a challenge to its carefully managed presentation of controversial subjects. Similarly, email can be read by the host government of the company offering the service, and the transmission of sensitive data can be intercepted via secret back doors and sent to WikiLeaks or its numerous local equivalents.

For these reasons, more governments are likely to start designating Internet services as a strategic industry, with foreign firms precluded from competing in politically sensitive niches. The Turkish government has emerged as the leading proponent of such “information independence,” floating the idea of both a national search engine and a national email system. Authorities in Russia, China and Iran have debated similar proposals.

Judging by last year’s standoff between the BlackBerry maker Research in Motion and the governments of India, Saudi Arabia and the United Arab Emirates, questions of access also will play a growing role in shaping technology. If a government suspects that the U.S. National Security Agency has arranged to be able to retrieve private emails sent with BlackBerry’s secure encryption technology, it starts to wonder why it doesn’t have similar streams of intelligence data, from BlackBerry as well as from services like Gmail and Skype. At a minimum, more governments will demand that data servers base their operations in their own jurisdictions, inconveniencing global Internet companies that have based their business plans on the assumption that they could run their Indian operations from Iowa.

More from Wall Street Journal