Here is a video from the European Summer School on Internet Governance by Wolfgang Kleinwächter which puts the history of Internet governance in perspective. The lecture is in the public domain.
The initial interplanetary networking technologies were developed by Vint Cerf and others. In interplanetary networking, the time delay (latency) and the lack of continuous network connectivity (while waiting, for example, for the next satellite to come into communication range) are problems that need to be solved. Hence the Interplanetary Network has to be delay tolerant and disruption tolerant, whereas on the Internet, under the present protocols, packets are simply dropped when there is a line disruption or an abnormal delay.
The experimental protocols were developed by members of the Delay & Disruption Tolerant Networking Research Group (which operates under the aegis of the Internet Research Task Force).
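The store-and-forward idea behind those protocols can be sketched in a few lines. This is a toy illustration only; the class and method names here are invented for the sketch, and real DTN implementations follow the research group's Bundle Protocol (RFC 5050, later RFC 9171), not this simplification.

```python
from collections import deque

class BundleNode:
    """Toy store-and-forward node. Unlike an IP router, it never drops
    a message when the outbound link is down; it holds it in custody."""

    def __init__(self, name):
        self.name = name
        self.stored = deque()   # bundles held while no contact exists
        self.delivered = []

    def receive(self, bundle):
        # Keep the bundle instead of dropping it during a disruption.
        self.stored.append(bundle)

    def contact(self, next_hop):
        # When a communication window opens, forward everything held.
        while self.stored:
            next_hop.delivered.append(self.stored.popleft())

earth = BundleNode("earth-relay")
mars = BundleNode("mars-lander")

earth.receive("telemetry-request")   # link to Mars is down: bundle is stored
assert len(earth.stored) == 1        # nothing lost, nothing dropped

earth.contact(mars)                  # a contact window opens
assert mars.delivered == ["telemetry-request"]
```

The essential contrast with today's Internet protocols is in `receive`: a delay of hours (or a broken link) costs nothing but storage, whereas a conventional router would have discarded the packet.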
The Internet Society has shared an update from NASA’s Human Exploration and Operations Mission Directorate (HEOMD) on its Advanced Exploration Systems (AES) program.
While much of the focus is on a human mission to Mars, there are also signs of NASA’s increased commitment to Delay & Disruption Tolerant Networking (DTN).
- Disruption Tolerant Networking (DTN): infusion of store-and-forward communications protocols into NASA and international missions.
- Demonstrated supervisory control of ESA’s Eurobot from the International Space Station (ISS) using DTN.
- Agreement with KARI (the Korea Aerospace Research Institute, South Korea’s space agency) to conduct DTN experiments.
With an increased dependency upon robotic missions to pave the way for a human mission to Mars, and an increasing realization that the data is the mission, these are encouraging signs that DTN is gathering momentum as a critical component in guaranteeing mission success.
The full update is available for download at: https://www.nasa.gov/sites/
For a better understanding of the basics of DTN watch this video:
It was not uncommon for the earliest web application developers to assume that all domain names would end in .com and all email addresses would follow the format name@xyz.com. While developers took newer domain names such as .info into account in due course, most continued to design applications that accept domain names and email addresses only in ASCII, just as software developers in the 80s assumed that it would be unnecessary to use more than two digits to denote the year, which led to the famous Y2K issue towards the year 2000.
Now there are new top-level domain names (such as .family and .game) and internationalized domain names in various native non-ASCII scripts of India and the world, such as .இந்தியா in Tamil and .भारत in Devanagari, as well as internationalized email addresses that would allow users to have addresses in their native scripts.
If a browser or a form in a webpage limits acceptance of domain names or email addresses with a rule such as “a domain name must be in English and end with .com, .net or .org” or “an email address must be in English letters or numerals”, then it is archaic.
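Such an archaic rule is easy to write down, and that is exactly the problem. The Python sketch below uses a hypothetical but typical legacy validation pattern (the regular expression and the sample addresses are invented for illustration) and shows how it silently rejects perfectly valid modern addresses:

```python
import re

# A typical legacy validator: ASCII only, and a hardcoded list of TLDs.
LEGACY = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.(com|net|org)$")

def legacy_accepts(address):
    return LEGACY.match(address) is not None

assert legacy_accepts("user@example.com")           # the only world it knows
assert not legacy_accepts("user@example.family")    # valid new gTLD, rejected
assert not legacy_accepts("उपयोगकर्ता@उदाहरण.भारत")    # valid-looking IDN address, rejected
```

Every address the rule rejects is a user the application turns away; the hardcoded TLD list also goes stale every time a new top-level domain is delegated.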
It is a problem far larger in its dimensions than the Y2K problem of the year 2000, which kept the IT community of the entire world talking. This problem of “Universal Acceptance” appears to receive inadequate attention, both as a matter of global public interest and for the commercial opportunities it presents to enterprising developers and corporations. It might emerge as a huge commercial vertical in itself, in view of the design changes to be brought about and the testing requirements. #Deity #NASSCOM #WIPRO #TiE #TCS #Cognizant (If you are from a different country, please feel free to rewrite this post to suit your country and publish it. This post is not copyrighted.)
For more information, follow the publicly archived, transparent discussions on this issue in the IETF, at ICANN and at the Internet Society. You could also write to isocindiachennai (At) gmail (dot) com for additional pointers or any clarification. Or ask your executives at a higher level to take part in ICANN meetings, which are open and held as multi-stakeholder global meetings. And also join the Internet Society India Chennai Chapter. Such participation would lead you to positive involvement in the global Internet and also connect you to business opportunities, not only in “y2k20” (there is no such term; it is coined here to describe the issue and the opportunity) but also in DNSSEC, the IPv6 transition, the Internet of Things (IoT) and new gTLDs.
What does the phrase “Universal Acceptance” mean?
“Universal Acceptance of domain names and email addresses” (or just “Universal Acceptance”, or even “UA”, for short) means that all apps and online services should accept all Internet domain names and email addresses equally.
Universal Acceptance is an important concept these days because the Internet is changing. One way that it is changing is that addresses no longer need to be composed of ASCII characters. (ASCII is the set of 128 code points, covering the Latin-script letters, numerals and punctuation marks that are dominant on the Internet today. All the characters in this document so far have been ASCII characters.)
Most people on earth are not native speakers of languages written in the Latin script, so moving from a character set limited to 128 characters to an alternative that can support more than a million characters is essential for those people to fully use and benefit from the Internet. This alternative is called Unicode.
Another way that the Internet is changing is by allowing lots of new domain names. Not only are there simply more of them, but some are longer than any of the older domain names and many of them use the same Unicode system mentioned above.
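The bridge between the Unicode world and the ASCII-only DNS is the IDNA encoding, which carries Unicode labels as ASCII “Punycode” (the `xn--` form). A small Python illustration follows; note that the standard library codec implements the older IDNA 2003 rules, so production code should prefer a library implementing the current IDNA 2008 standard.

```python
# A Unicode domain travels through the DNS in an ASCII form ("xn--..."),
# produced label by label via the IDNA/Punycode algorithm.
ascii_form = "bücher.example".encode("idna")
print(ascii_form)          # b'xn--bcher-kva.example'

# The reverse mapping restores the Unicode form for display to the user.
unicode_form = ascii_form.decode("idna")
print(unicode_form)        # bücher.example
```

An application that stores or compares only one of the two forms, without converting consistently, will treat the same domain as two different names.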
Note: “Universal Acceptance” is sometimes confused with “Universal Access” or “Universal Accessibility”; those phrases refer to connecting everyone on earth to the Internet, and to building Internet-connected systems for all differently-abled people on earth, respectively. Universal acceptance is limited to domain names and email addresses.
A special group called the “Universal Acceptance Steering Group” (UASG) has been created to work on issues related to Universal Acceptance. The UASG doesn’t work on anything else (e.g. Universal Access or Universal Accessibility).
How does an app or an online service support Universal Acceptance?
Software and online services support Universal Acceptance when they offer the following capabilities:
A. Can accept any domain name or email address as an input from a user interface, from a document, or from another app or service
B. Can validate and process any domain name or email address
C. Can store any domain name or email address
D. Can output any domain name or email address to a user interface, to a document, or to another app or service
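As a concrete, deliberately minimal sketch of what a permissive check looks like, the hypothetical Python function below touches capabilities A through C: it accepts any script, validates only the bare syntax, and stores a canonical Unicode form. It is not a complete email grammar; real validators need far more care.

```python
import unicodedata

def accept_email(address):
    """Permissive syntax check: exactly one '@', a non-empty local part,
    and a dotted domain. Deliberately makes no assumptions about script
    or top-level domain."""
    local, sep, domain = address.partition("@")
    if not (sep and local and domain and "." in domain):
        return None
    # Store a canonical form (NFC) so that visually identical Unicode
    # strings compare equal when retrieved later.
    return unicodedata.normalize("NFC", address)

assert accept_email("user@example.family") == "user@example.family"
assert accept_email("उपयोगकर्ता@उदाहरण.भारत") is not None
assert accept_email("not-an-address") is None
```

The point of the sketch is what it does not do: no hardcoded TLD list, no ASCII-only character class, no length assumption borrowed from the old two- and three-letter domains.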
Unfortunately, older apps and online services don’t always offer those capabilities. Sometimes they lack support for Unicode; sometimes they make wrong assumptions about new domain names, or even assume they don’t exist. Sometimes they support Universal Acceptance in some features but not in all.
How can Universal Acceptance be measured?
Universal Acceptance can be measured in a few ways.
1. Source code reviews and unit testing
2. Manual testing
3. Automated testing
#1 means inspecting the source code and verifying that only the correct programming techniques, software libraries and interfaces (AKA “APIs”) have been used, then verifying that the app or service works by testing against specific test cases for the capabilities A-D listed above. #1 is only practical for app developers and online service providers.
UASG is reaching out directly to the community of app developers and the largest online service providers to encourage them to perform source code reviews and testing to determine the level of Universal Acceptance in their offerings. UASG is also providing a list of criteria which can be used to develop test cases for the capabilities A-D listed above.
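For #1, the test cases derived from such criteria often take the shape of ordinary unit tests. The sketch below uses Python’s unittest with a hypothetical stand-in for the app’s own validation routine; in a real review the tests would call the application code instead.

```python
import unittest

def service_accepts(address):
    # Stand-in for the app under test: a permissive stub that accepts
    # any address with one '@' and a dotted domain.
    local, sep, domain = address.partition("@")
    return bool(sep and local and "." in domain)

class UniversalAcceptanceTests(unittest.TestCase):
    """Sample test cases in the spirit of capabilities A-D."""

    def test_new_long_gtld(self):
        self.assertTrue(service_accepts("user@example.photography"))

    def test_idn_address(self):
        self.assertTrue(service_accepts("उपयोगकर्ता@उदाहरण.भारत"))

    def test_rejects_malformed(self):
        self.assertFalse(service_accepts("no-at-sign.example"))

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(UniversalAcceptanceTests)
)
```

Running the same suite against each release is what makes #1 repeatable rather than a one-off inspection.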
#2 can be done by anyone, but it’s labor-intensive. Examples of #2 would include submitting an email address when registering for an online service and verifying that it has been accepted. Since there are lots of potential online services to sign up for, and lots of potential new email address combinations, one must pick and choose which combinations of app, services, email address and/or domain name to test.
UASG is developing a list of top web sites, apps, email addresses and domain names suitable for testing.
#3 requires up-front technical work, but is more scalable to large measuring and monitoring efforts. An example of #3 is the recent gTLD investigation performed by APNIC on behalf of ICANN. <http://www.potaroo.net/reports/Universal-Acceptance/UA-Report.pdf>
UASG is investigating methods of automated testing for Universal Acceptance and will share these as they are developed.
On Wednesday, Google gave people a clearer picture of its secret initiative called Project Glass. The glasses are the company’s first venture into wearable computing.
The glasses are not yet for sale. Google will, however, be testing them in public.
The prototype version Google showed off on Wednesday looked like a very polished and well-designed pair of wrap-around glasses with a clear display that sits above the eye. The glasses can stream information to the lenses and allow the wearer to send and receive messages through voice commands. There is also a built-in camera to record video and take pictures.
The New York Times first wrote about the glasses in late February, describing an augmented-reality display that would sit over the eye and run on the Android mobile platform.
A video released by Google on Wednesday, which can be seen below, showed potential uses for Project Glass. A man wanders around the streets of New York City, communicating with friends, seeing maps and information, and snapping pictures. It concludes with him video-chatting with a girlfriend as the sun sets over the city. All of this is seen through the augmented-reality glasses.
Project Glass could hypothetically become Project Contact Lens. Babak Parviz, a leader of the project who is also an associate professor at the University of Washington, specializes in bionanotechnology, the fusion of tiny technologies and biology. He most recently built a tiny contact lens with embedded electronics that can display pixels to a person’s eye.
Project Glass is one of many projects currently being built inside the Google X offices, a secretive laboratory near Google’s main Mountain View, Calif., campus where engineers and scientists are also working on robots and space elevators.
From The New York Times
[ This was in 1973, 39 years ago, when “computers ran on steam and the internet was still largely mechanical”. I was led to this document from a message posted by Karl Auerbach in the At-Large mailing list today ]
Most of the advanced industrial nations of Western Europe and North America share concerns about the social impact of computer-based personal data systems. Although there are minor differences in the focus and intensity of their concerns, it is clear that there is nothing peculiarly American about the feeling that the struggle of individual versus computer is a fixed feature of modern life. The discussions that have taken place in most of the industrial nations revolve around themes that are familiar to American students of the problem: loss of individuality, loss of control over information, the possibility of linking data banks to create dossiers, rigid decision making by powerful, centralized bureaucracies. Even though there is little evidence that any of these adverse social effects of computer-based record keeping have occurred on a noticeable scale, they have been discussed seriously since the late sixties, and the discussions have prompted official action by many governments as well as by international organizations.
Concern about the effects of computer-based record keeping on personal privacy appears to be related to some common characteristics of life in industrialized societies. In the first place, industrial societies are urban societies. The social milieu of the village that allowed for the exchange of personal information through face-to-face relationships has been replaced by the comparative impersonality of urban living. …
Concern about the effects of computer-based record keeping appears to have deep roots in the public opinion of each country, deeper roots than could exist if the issues were manufactured and merchandised by a coterie of specialists, or reflected only the views of a self-sustaining group of professional Cassandras. The fragility of computer-based systems may account for some of the concern… There are few computer systems designed to deal with the disruption that deliberately lost or mutilated punched cards in a billing system (to give a simple example) would cause. Thus, the very vulnerability of automated personal data systems, systems without which no modern society could function, may make careful attention to the human element transcend national boundaries.
The Response in Individual Nations
On October 7, 1970, the West German State of Hesse adopted the world’s first legislative act directed specifically toward regulating automated data processing. This “Data Protection Act” applies to the official files of the government of Hesse; wholly private files are specifically exempted from control. The Act established a Data Protection Commissioner under the authority of the State parliament whose duty it is to assure that the State’s files are obtained, transmitted, and stored in such a way that they cannot be altered, examined, or destroyed by unauthorized persons…
Thus, the Data Protection Act of Hesse seems designed more to protect the integrity of State data and State government than to protect the interests of the people of the State…
When strong opposition to the 1969 census erupted in Sweden, public mistrust centered not so much on the familiar features of the census itself as on the fact that, for the first time, much of the data gathering would be done in a form specifically designed to facilitate automated data processing. Impressed by the possibility that opposition might be so severe as to invalidate the entire census, the government added the task of studying the problems of computerized record keeping to the work of an official commission already studying policies with respect to the confidentiality of official records.
After a notably thorough survey of personal data holdings in both public and private systems, the commission issued a report containing draft legislation for a comprehensive statute for the regulation of computer-based personal data systems in Sweden.2 The aim of the act is specifically the protection of personal privacy. Its key provisions are these:
- Establishment of an independent “Data Inspectorate,” charged with the responsibility for executing and enforcing the provisions of the Data Law.
- No automated data system containing personal data may be set up without a license from the Data Inspectorate.
- Data subjects have the right to be informed about all uses made of the data about them, and no new use of the data may be made without the consent of the subject.
- Data subjects have the right of access without charge to all data about them, and if the data are found to be incorrect, incomplete, or otherwise faulty, they must either be corrected to the subject’s satisfaction, or a statement of rebuttal from the subject must be filed along with the data.
- The Data Inspectorate will act as ombudsman in all matters regarding automated personal data systems.
The Data Law has been passed by the Swedish Parliament and will become effective on July 1, 1973. A transition period of one year will be allowed to implement all the provisions of the law.
Article 9 of the French Civil Code states plainly, “Everyone has the right to have his private life respected.”3 As legal scholars in all countries have noted, however, it is very difficult to define the precise limits of privacy in every case that comes before a court, and in spite of such explicit protection, the privacy of the French, both inside and outside of automated personal data systems, seems in practice no better defended than that of most other people…
One other development on the French scene deserves mention. The 1972 annual report of the Supreme Court of Appeals went considerably out of its way, after reviewing a case of literary invasion of privacy, to comment on the subject of computers and privacy:
… The progress of automation burdens society in each country with the menace of a computer which would centralize the information that each individual is obliged to furnish in the course of his life to the civil authorities, to his employer, his banker, his insurance company, to Internal Revenue, to Social Security, to the census, to university administrations, and, in addition, the data, correct or not, which is received about him by the various services of the police. When one thinks about the uses that might be made of that mass of data by the public powers, of the indiscretions of which that data might be the origin, and of the errors of which the subjects might be the victims, one becomes aware that there lies a very important problem, not only for the private life of everyone, but even for his very liberty.
It appears to us that this eventuality, an extremely probable one, ought to be made the object of consideration of the public power … and that this consideration should take its place among the measures of precaution and of safeguard which should not lack for attention.7
To sum up, the situation in France is complex. The subject of computers and privacy has been given serious attention by a relatively small group of experts, but that group has an influence in government far out of proportion to its numbers. The attitude of the present government is strongly colored by another aspect of the privacy problem: It has been caught in a wiretap scandal, and its defensiveness in that regard appears to be influencing its actions on the computer front. The official report of the present working group is due before the end of 1973, but it does not seem realistic to expect that there will be any definitive action in France before, perhaps, mid-1974.
Britain is unique among the countries reviewed in having recently completed a thorough study of the entire subject of privacy.8 Although the committee in charge of the study, the Younger Committee, was restricted in its terms of reference to private, rather than public, organizations that might threaten privacy, the committee’s report is a model of clarity and concern. In brief, the Committee found that both the customs of society and the common law had evolved defenses against the traditional intrusions of nosy neighbors, unwelcome visitors, door-to-door salesmen, and the like. Against the new threats of technological intrusion (wiretaps, surveillance cameras, and, of course, computerized data banks), the Committee recognized that the traditional defenses are inadequate. To help deal with the threat of the computer, the Committee recommended specific safeguards to be applied to automated personal data systems, although it left the method of application up to the government to decide. The main features of the safeguards are:
- Information should be regarded as held for a specific purpose and not to be used, without appropriate authorization, for other purposes.
- Access to information should be confined to those authorized to have it for the purpose for which it was supplied.
- The amount of information collected and held should be the minimum necessary for the achievement of the specified purpose.
- In computerized systems handling information for statistical purposes, adequate provision should be made in their design and programs for separating identities from the rest of the data.
- There should be arrangements whereby the subject could be told about the information held concerning him.
- The level of security to be achieved by a system should be specified in advance by the user and should include precautions against the deliberate abuse or misuse of information.
- A monitoring system should be provided to facilitate the detection of any violation of the security system.
- In the design of information systems, periods should be specified beyond which the information should not be retained.
- Data held should be accurate. There should be machinery for the correction of inaccuracy and the updating of information.
- Care should be taken in coding value judgments.9
In its report, published in late 1972,11 the Canadian Task Force concluded that computer invasion of privacy is still far short of posing a social crisis. However, the rapidly rising volume of computerized personal data and the equally rapidly rising public expectation of a right to deeper and more secure privacy threaten to converge at the crisis level. To forestall that crisis, the Task Force recommends that a commissioner or ombudsman be established in a suitable administrative setting, that carefully prepared test cases on cogent issues be brought before the courts, and that the operation of government data systems be made to serve as a national model.