
The *NEW* Frank Coluccio Technology Forum


To: Frank A. Coluccio who wrote (20849) 4/13/2007 7:16:52 PM
From: Sailtrader
   of 46820
 
I read that post and found it rather alarming if everything George writes is true. I will admit I am not the sharpest tech tool in the bag anymore, but what am I missing, Frank?

Your comment implies it is really not worth the effort to reply, so if that is what you intended, I completely understand.



To: Sailtrader who wrote (20853) 4/13/2007 8:00:44 PM
From: Frank A. Coluccio
   of 46820
 
Hi Sailtrader. Other than noting the pot-kettle-black dynamic, you're right. I simply haven't the motivation to take it beyond. FAC



To: Frank A. Coluccio who wrote (20852) 4/13/2007 8:07:13 PM
From: ftth
   of 46820
 
I didn't read the whole thing but this caught my eye:

"Brian H. Whitton...It is very difficult to say for certain what consumers will demand in terms of higher bandwidth speeds, other then to point out that they will demand more tomorrow then what they get today, based on history of data communications over the past 2 decades."

Two points: first, if they are using the history of data communications, they should have been providing symmetric networks years ago.

Second, and using the same historic observation point, they should have been providing 10Mbps symmetric years ago, and should be well on their way to 100Mbps. Several other countries "get it," Verizon...why not you?
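
To put that extrapolation in rough numbers, here is a minimal sketch in Python. The ~50% compounded annual growth rate and the late-1990s baseline are my assumptions for illustration (roughly the trend Nielsen observed for high-end user connections), not figures from this post or from Verizon:

# Back-of-the-envelope sketch: extrapolate access bandwidth from an assumed
# historical growth rate. Baseline and growth rate are illustrative only.
BASELINE_YEAR = 1997
BASELINE_MBPS = 0.5        # assumed typical "broadband" access speed in 1997
ANNUAL_GROWTH = 0.50       # assumed ~50% compounded annual growth

def projected_mbps(year):
    """Project access bandwidth for a given year from the assumed trend."""
    return BASELINE_MBPS * (1 + ANNUAL_GROWTH) ** (year - BASELINE_YEAR)

for year in (2002, 2007, 2012):
    print(f"{year}: ~{projected_mbps(year):.1f} Mbps")
# Prints roughly 3.8, 28.8 and 218.9 Mbps on these assumptions -- i.e. the
# historical curve alone would already put access speeds in the tens of Mbps.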

They aren't even in the ball park of serving the "demand" that is prevalent in every network environment except the last mile.

Consumers can't "demand" what they don't even have available, and for that matter, what miniscule percentage of consumers in any market actually know technical operating parameters well enough to "demand" numbers. Did consumers in Japan demand 100Mbps? No; the providers translated everyday consumer activities into bandwidth...they didn't wait for consumers to get a degree in communications bandwidth metrology so they could "demand" it.
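
To make "translated everyday consumer activities into bandwidth" concrete, a minimal sketch follows; every activity and per-stream rate below is an illustrative assumption on my part, not a measurement or a provider figure:

# Minimal sketch: translate a household's concurrent activities into an
# aggregate bandwidth requirement. All rates are illustrative assumptions.
ACTIVITY_MBPS = {
    "standard-definition video stream": 2.0,
    "video call (each direction)": 0.5,
    "online backup upload": 1.0,
    "web browsing / e-mail": 0.5,
}

def household_requirement(activities):
    """Sum the assumed per-activity rates for the given concurrent counts."""
    return sum(ACTIVITY_MBPS[name] * count for name, count in activities.items())

evening_peak = {
    "standard-definition video stream": 2,
    "video call (each direction)": 1,
    "online backup upload": 1,
    "web browsing / e-mail": 2,
}
print(f"Aggregate need: ~{household_requirement(evening_peak):.1f} Mbps")
# ~6.5 Mbps for this made-up household, before anyone "demands" a number.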

Ridiculous, Verizon. It's on you to know the data transfer rates that consumers already sustain daily for everything they do outside the "last mile," and to provide a platform that supports them. Don't blame your ineptness at developing markets on consumers.

The cost and quality of this service level that they aren't providing...that's a completely different discussion. But to claim they don't even know what that level is? That's just ridiculous. Let me help you out, Verizon: provide the equivalent of a 10-year-old LAN: 10 Mbps symmetric. That's right in line with your choice of the BPON standard for your FTTP networks, since it was already about a 10-year-old, outdated standard when you selected it a few years back. ;o)



From: LindyBill 4/13/2007 8:14:23 PM
   of 46820
 
Breaking: Google Spends $3.1 Billion To Acquire DoubleClick

About 20 minutes ago Google announced that they have agreed to acquire DoubleClick for $3.1 billion in cash (nearly double the size of their YouTube acquisition). Microsoft was reportedly in a bidding war with Google for the company. Google gets access to DoubleClick's advertising software and, perhaps more importantly, their customers and network.

DoubleClick was founded in 1996. DoubleClick was taken private in 2005 by Hellman & Friedman and JMI Equity for $1.1 billion. The New York Times is reporting that DoubleClick's revenues are about $300 million/year.

10x revenue for a mature company is a…healthy…valuation. At least part of the acquisition price appears to be due to a desire by Google to keep this asset out of Microsoft's hands.
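
For the record, the multiple quoted above is just the purchase price divided by reported revenue; a quick check of the arithmetic, using the figures in the post:

price, revenue = 3.1e9, 300e6      # $3.1 billion cash, ~$300 million/year revenue
print(f"Revenue multiple: ~{price / revenue:.1f}x")   # ~10.3x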

techcrunch.com

tinyurl.com



To: ftth who wrote (20855) 4/14/2007 1:44:06 AM
From: Frank A. Coluccio
   of 46820
 
"I didn't read the whole thing but..."

Well, I did read the whole thing, and I can't say that I'm any smarter for it, from a technical perspective. Only that I may be a bit wiser as to the ways in which facts are twisted and misrepresented. I stopped listing them early on. At one point you noted:

"Consumers can't "demand" what they don't even have available, and for that matter, what miniscule percentage of consumers in any market actually know technical operating parameters well enough to "demand" numbers. Did consumers in Japan demand 100Mbps? No; the providers translated everyday consumer activities into bandwidth...they didn't wait for consumers to get a degree in communications bandwidth metrology so they could "demand" it."

As I wrote earlier in a discussion concerning this same interview series elsewhere, the larger point worth noting here is that many subscribers are complacent with what's been "delivered" to them as "broadband," as you can probably deduce while reading the following retort that I received from a journalist several days ago. It's part of a newspaper forum discussion (Gotham Gazette) in which I made several observations and criticisms of the asymmetrical delivery of "broadband" by last mile service providers. The journalist replied:

"I challenge your contention that "broadband" is only one-directional--from those who control the connections down to the content recipients. The whole idea of "Web 2.0," YouTube, "mash-ups" and tools like blogs and wikis is to establish arenas in which people can be content creators as well as content consumers. Sure, there are major companies behind broadband, as well as some of these applications--Verizon provides my DSL service, and Google own YouTube. Nevertheless, people are making use of the interactive capabilities of the Web--many of which require a "broadband" connection--much more than a few years ago. Copyright laws must be modernized to take account of these new phenomenons; increasingly, strict interpretations of copyright are a fortress for the old-line media companies. But this would be true with or without broadband connections."
--end Gotham Gazette quote

I stated earlier that I thought this might only be a regional phenomenon, or one unique to urban centers, but I have found, in fact, that most non-practitioners I speak with, no matter where they reside, as long as they're receiving DSL or CM (and even an alarming number of IT professionals), are near-totally taken with their newfound "broadband" toys. I cannot even state that user discontent is always proportional to Internet or telecoms literacy (although to a great extent it is). As I began to suggest above, I've made it a point to discuss the merits of transparent carriage, symmetry and other attributes of neutrality with clients of mine who are IT department heads, only to be told:

"I've got TW (or Optonline, or fill in the blank) Cablemodem service and it's great! What do you have, and what's wrong with what you're getting now?"

At times like this I have to ask myself if it's really worth the time and effort to inform and elucidate, only to ultimately wind up bursting some innocent soul's bubble.

FAC

------



From: Frank A. Coluccio 4/14/2007 2:04:52 AM
   of 46820
 
Researchers Explore Scrapping Internet
By ANICK JESDANUN | AP | Apr 13, 2007

news.yahoo.com

NEW YORK - Although it has already taken nearly four decades to get this far in building the Internet, some university researchers with the federal government's blessing want to scrap all that and start over.

The idea may seem unthinkable, even absurd, but many believe a "clean slate" approach is the only way to truly address security, mobility and other challenges that have cropped up since UCLA professor Leonard Kleinrock helped supervise the first exchange of meaningless test data between two machines on Sept. 2, 1969.

[Below: Leonard Kleinrock comments on the dark side of the Internet - spam, pornography, among others - at his office in the UCLA Computer Science Department in Los Angeles, Tuesday, March 27, 2007. Kleinrock created the basic principles of packet switching, the technology underpinning the Internet, while a graduate student at MIT, a decade before the Internet was founded. (AP Photo/Damian Dovarganes)]



The Internet "works well in many situations but was designed for completely different assumptions," said Dipankar Raychaudhuri, a Rutgers University professor overseeing three clean-slate projects. "It's sort of a miracle that it continues to work well today."

No longer constrained by slow connections and computer processors and high costs for storage, researchers say the time has come to rethink the Internet's underlying architecture, a move that could mean replacing networking equipment and rewriting software on computers to better channel future traffic over the existing pipes.

Even Vinton Cerf, one of the Internet's founding fathers as co-developer of the key communications techniques, said the exercise was "generally healthy" because the current technology "does not satisfy all needs."

One challenge in any reconstruction, though, will be balancing the interests of various constituencies. The first time around, researchers were able to toil away in their labs quietly. Industry is playing a bigger role this time, and law enforcement is bound to make its needs for wiretapping known.

There's no evidence they are meddling yet, but once any research looks promising, "a number of people (will) want to be in the drawing room," said Jonathan Zittrain, a law professor affiliated with Oxford and Harvard universities. "They'll be wearing coats and ties and spilling out of the venue."

The National Science Foundation wants to build an experimental research network known as the Global Environment for Network Innovations, or GENI, and is funding several projects at universities and elsewhere through Future Internet Network Design, or FIND.

Rutgers, Stanford, Princeton, Carnegie Mellon and the Massachusetts Institute of Technology are among the universities pursuing individual projects. Other government agencies, including the Defense Department, have also been exploring the concept.

The European Union has also backed research on such initiatives, through a program known as Future Internet Research and Experimentation, or FIRE. Government officials and researchers met last month in Zurich to discuss early findings and goals.

A new network could run parallel with the current Internet and eventually replace it, or perhaps aspects of the research could go into a major overhaul of the existing architecture.

These clean-slate efforts are still in their early stages, though, and aren't expected to bear fruit for another 10 or 15 years — assuming Congress comes through with funding.

Guru Parulkar, who will become executive director of Stanford's initiative after heading NSF's clean-slate programs, estimated that GENI alone could cost $350 million, while government, university and industry spending on the individual projects could collectively reach $300 million. Spending so far has been in the tens of millions of dollars.

And it could take billions of dollars to replace all the software and hardware deep in the legacy systems.

Clean-slate advocates say the cozy world of researchers in the 1970s and 1980s doesn't necessarily mesh with the realities and needs of the commercial Internet.

"The network is now mission critical for too many people, when in the (early days) it was just experimental," Zittrain said.

The Internet's early architects built the system on the principle of trust. Researchers largely knew one another, so they kept the shared network open and flexible — qualities that proved key to its rapid growth.

But spammers and hackers arrived as the network expanded and could roam freely because the Internet doesn't have built-in mechanisms for knowing with certainty who sent what.
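
For readers wondering what a "built-in mechanism for knowing with certainty who sent what" could look like, here is a minimal sketch; the pre-shared keys are assumed purely for illustration, and real clean-slate proposals (not detailed in the article) lean toward public-key identities and an identity infrastructure:

# Minimal sketch of per-packet sender authentication: every packet carries a
# tag the receiver can verify against a key tied to the claimed sender.
import hmac, hashlib

SENDER_KEYS = {"alice": b"alice-secret", "bob": b"bob-secret"}  # assumed key store

def send(sender, payload):
    tag = hmac.new(SENDER_KEYS[sender], payload, hashlib.sha256).hexdigest()
    return {"claimed_sender": sender, "payload": payload, "tag": tag}

def verify(packet):
    key = SENDER_KEYS.get(packet["claimed_sender"])
    if key is None:
        return False
    expected = hmac.new(key, packet["payload"], hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, packet["tag"])

pkt = send("alice", b"hello")
print(verify(pkt))              # True
pkt["claimed_sender"] = "bob"   # spoof the source
print(verify(pkt))              # False -- the tag no longer matches the claim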

The network's designers also assumed that computers are in fixed locations and always connected. That's no longer the case with the proliferation of laptops, personal digital assistants and other mobile devices, all hopping from one wireless access point to another, losing their signals here and there.

Engineers tacked on improvements to support mobility and improved security, but researchers say all that adds complexity, reduces performance and, in the case of security, amounts at most to bandages in a high-stakes game of cat and mouse.

Workarounds for mobile devices "can work quite well if a small fraction of the traffic is of that type," but could overwhelm computer processors and create security holes when 90 percent or more of the traffic is mobile, said Nick McKeown, co-director of Stanford's clean-slate program.
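
One way to picture the mobility problem is an identifier/locator split: a host keeps a stable identity while a lookup service tracks its current attachment point. The sketch below illustrates the general idea only, not any specific FIND or GENI design:

# Minimal sketch of an identifier/locator split: stable host identifiers map
# to whatever network locator the host currently sits behind.
locator_of = {}     # stable host id -> current network locator

def attach(host_id, locator):
    """Called whenever a mobile host joins a new access point."""
    locator_of[host_id] = locator

def deliver(host_id, message):
    """Route to wherever the host currently is, by identity rather than address."""
    loc = locator_of.get(host_id)
    if loc is None:
        return f"{host_id} is unreachable"
    return f"delivered {message!r} to {host_id} at {loc}"

attach("laptop-17", "cafe-wifi:10.0.0.42")
print(deliver("laptop-17", "hello"))
attach("laptop-17", "office-lan:192.168.1.9")   # host moved; identity unchanged
print(deliver("laptop-17", "hello again"))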

The Internet will continue to face new challenges as applications require guaranteed transmissions — not the "best effort" approach that works better for e-mail and other tasks with less time sensitivity.

Think of a doctor using teleconferencing to perform a surgery remotely, or a customer of an Internet-based phone service needing to make an emergency call. In such cases, even small delays in relaying data can be deadly.
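
The difference between "best effort" and guaranteed delivery can be pictured as a scheduler that always drains latency-critical traffic first. A minimal strict-priority sketch, with made-up traffic classes; real QoS designs also involve reservations, admission control and policing:

# Minimal sketch of strict-priority scheduling: latency-critical packets are
# always sent before bulk, best-effort packets. Packets here are made up.
import heapq

CRITICAL, BEST_EFFORT = 0, 1   # lower number = drained first

queue = []
seq = 0

def enqueue(klass, packet):
    global seq
    heapq.heappush(queue, (klass, seq, packet))   # seq keeps FIFO order per class
    seq += 1

for p in ("email-1", "email-2"):
    enqueue(BEST_EFFORT, p)
enqueue(CRITICAL, "telesurgery-frame")
enqueue(CRITICAL, "e911-audio")

while queue:
    _, _, packet = heapq.heappop(queue)
    print("sending", packet)
# The telesurgery frame and e911 audio go out before the e-mail traffic.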

And one day, sensors of all sorts will likely be Internet capable.

Rather than create workarounds each time, clean-slate researchers want to redesign the system to easily accommodate any future technologies, said Larry Peterson, chairman of computer science at Princeton and head of the planning group for the NSF's GENI.

Even if the original designers had the benefit of hindsight, they might not have been able to incorporate these features from the get-go. Computers, for instance, were much slower then, possibly too weak for the computations needed for robust authentication.

"We made decisions based on a very different technical landscape," said Bruce Davie, a fellow with network-equipment maker Cisco Systems Inc., which stands to gain from selling new products and incorporating research findings into its existing line.

"Now, we have the ability to do all sorts of things at very high speeds," he said. "Why don't we start thinking about how we take advantage of those things and not be constrained by the current legacy we have?"

Of course, a key question is how to make any transition — and researchers are largely punting for now.

"Let's try to define where we think we should end up, what we think the Internet should look like in 15 years' time, and only then would we decide the path," McKeown said. "We acknowledge it's going to be really hard but I think it will be a mistake to be deterred by that."

Kleinrock, the Internet pioneer at UCLA, questioned the need for a transition at all, but said such efforts are useful for their out-of-the-box thinking.

"A thing called GENI will almost surely not become the Internet, but pieces of it might fold into the Internet as it advances," he said.

Think evolution, not revolution.

Princeton already runs a smaller experimental network called PlanetLab, while Carnegie Mellon has a clean-slate project called 100 x 100.

These days, Carnegie Mellon professor Hui Zhang said he no longer feels like "the outcast of the community" as a champion of clean-slate designs.

Construction on GENI could start by 2010 and take about five years to complete. Once operational, it should have a decade-long lifespan.

FIND, meanwhile, funded about two dozen projects last year and is evaluating a second round of grants for research that could ultimately be tested on GENI.

These go beyond projects like Internet2 and National LambdaRail, both of which focus on next-generation needs for speed.

Any redesign may incorporate mechanisms, known as virtualization, for multiple networks to operate over the same pipes, making further transitions much easier. Also possible are new structures for data packets and a replacement of Cerf's TCP/IP communications protocols.
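
"Virtualization" in this sense means several logical networks sharing the same physical links, with each packet tagged for the slice it belongs to. The sketch below shows only the demultiplexing idea; actual GENI-style slicing also requires resource isolation and much more:

# Minimal sketch of network virtualization as packet tagging: one shared
# "pipe" carries traffic for several virtual networks, and a demultiplexer
# hands each packet to the handler for its slice.
from collections import defaultdict

received = defaultdict(list)             # slice id -> packets delivered to it

def handler_for(slice_id):
    def handle(payload):
        received[slice_id].append(payload)
    return handle

handlers = {
    "legacy-ip": handler_for("legacy-ip"),
    "experimental-v2": handler_for("experimental-v2"),
}

def physical_link_rx(tagged_packet):
    """One shared pipe: dispatch on the slice tag carried by each packet."""
    slice_id, payload = tagged_packet
    handlers[slice_id](payload)

for pkt in [("legacy-ip", "GET /index.html"),
            ("experimental-v2", "new-protocol handshake"),
            ("legacy-ip", "DNS query")]:
    physical_link_rx(pkt)

print(dict(received))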

"Almost every assumption going into the current design of the Internet is open to reconsideration and challenge," said Parulkar, the NSF official heading to Stanford. "Researchers may come up with wild ideas and very innovative ideas that may not have a lot to do with the current Internet."

Associated Press Business Writer Aoife White in Brussels, Belgium, contributed to this report.

___

On the Net:

Stanford program: cleanslate.stanford.edu

Carnegie Mellon program: 100x100network.org

Rutgers program: orbit-lab.org

NSF's GENI: geni.net

------



To: LindyBill who wrote (20856) 4/14/2007 2:39:50 AM
From: Frank A. Coluccio
   of 46820
 
Thanks, Bill. An extended read of the same story from the NY Times follows.
---

Google Buys DoubleClick for $3.1 Billion
Louise Story and Miguel Helft | April 14, 2007

[FAC: All of this talk about "ad space," as though it were something that's fungible and fully understood. Ad space where? On one's screen? Is it the characters that constitute the HTML and markup codes that one writes and copyrights? Is it part of some magical electromagnetic spectrum-like "property" domain that could be bought and sold at auction and at the same time make a claim to a specified viewing area on one's PC, TV or other multimedia appliance? If AOL or the WSJ sells "ad space" to an advertiser, does that mean that the advertiser, or WSJ, in this case, or whoever owns the Web page, owns the real estate on your screen, or has control over the bits sent to fill that 1' x 1' square box or video display area on your screen? As curious minds, we'd like to know.]

nytimes.com

Google reached an agreement today to acquire DoubleClick, the online advertising company, from two private equity firms for $3.1 billion in cash, the companies announced, an amount that was almost double the $1.65 billion in stock that Google paid for YouTube late last year.

The sale offers Google access to DoubleClick’s advertisement software and, more importantly, its relationships with Web publishers, advertisers and advertising agencies.

For months, Google has been trying to expand its foothold in online advertising into display ads, the area where DoubleClick is strongest. Google made its name and still generates most of its revenue from search and contextual text ads.

DoubleClick, which was founded in 1996, provides display ads on Web sites like MySpace, The Wall Street Journal and America Online as well as software to help those sites maximize ad revenue. The company also helps ad buyers — advertisers and ad agencies — manage and measure the effectiveness of their rich media, search and other online ads.

DoubleClick has also recently introduced a Nasdaq-like exchange for online ads that analysts say could be lucrative for Google.

“Google really wants to get into the display advertising business in a big way, and they don’t have the relationships they need to make it happen,” said Dave Morgan, the chairman of Tacoda, an online advertising network. “But DoubleClick does. It gives them immediate access to those relationships.”

The sale brings to an end weeks of a bidding battle between Microsoft and Google. Microsoft has been trying to catch Google in the online advertising business, and the loss of DoubleClick would be a major setback.

“Keeping Microsoft away from DoubleClick is worth billions to Google,” an analyst with RBC Capital Markets, Jordan Rohan, said.

Acquiring DoubleClick expands Google’s business far beyond algorithm-driven ad auctions into a relationship-based business with Web publishers and advertisers. Google has been expanding its AdSense network into video and display ads online and is selling ads to a limited degree on television, newspapers and radio.

The sale also raises questions about how Google will manage its existing business and that of the new DoubleClick unit while avoiding conflicts of interest. If DoubleClick’s existing clients start to feel that Google is using DoubleClick’s relationships to further its own ad network, some Web publishers or advertisers might jump ship.

A highflying stock in the late 1990s, DoubleClick was an early pioneer in online advertising and was one of the few online ad companies to survive the bursting of the dot-com bubble. In 2005, DoubleClick was taken private by two private equity firms, Hellman & Friedman and JMI Equity, in a deal valued at $1.1 billion. Since then, the company has sold two data and e-mail advertising businesses and acquired Klipmart, which specializes in online video.

The company generated about $300 million in revenue last year, mostly from providing ads on Web sites.

DoubleClick’s chief executive, David Rosenblatt, said a few weeks ago that a new system it had developed for the buying and selling of online ads would probably become the chief money maker within five years. The system, a Nasdaq-like exchange for online ads, brings Web publishers and advertising buyers together on a Web site where they can participate in auctions for ad space.

DoubleClick’s exchange is different from the ad auctions that Google uses on its networks because the exchange is open to any Web publisher or ad network — not just the sites in Google’s network. Offline ad sales have been handled through negotiation, but the efficiency of online auction systems has caused some advertising executives to consider using auctions for offline ads in places like television and newspapers. DoubleClick’s new exchange could function as a hub for online and offline ad sales.
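
As a rough picture of how an exchange-style auction for a single ad impression might clear, here is a minimal second-price sketch; the pricing rule and the bids are my assumptions for illustration, since the article does not describe DoubleClick's actual auction mechanics:

# Minimal sketch of an exchange-style auction for one ad impression.
# Second-price rule and bids are assumed for illustration only.
def run_auction(bids):
    """Winner pays the second-highest bid (or their own bid if unopposed)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, top_bid = ranked[0]
    clearing_price = ranked[1][1] if len(ranked) > 1 else top_bid
    return winner, clearing_price

bids = {"advertiser-A": 2.50, "advertiser-B": 1.75, "ad-network-C": 1.10}  # $ CPM
winner, price = run_auction(bids)
print(f"{winner} wins the impression at ${price:.2f} CPM")
# advertiser-A wins the impression at $1.75 CPM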

------



From: Frank A. Coluccio 4/14/2007 3:08:58 AM
   of 46820
 
[Uganda] Special computer developed for rural Ugandan Schools
HANA Staff | By David Kezio-Musoke | 13/04/2007

A new computer that is half the size of a bible and powered by a motorcycle battery makes its debut in Uganda.

hana.ru.ac.za

The Ugandan government, with the help of an American company called Inveneo Inc, has invented a computer to aid rural Uganda.

Uganda's Minister of State for ICT Alintuma Nsambu told HANA that since early this year the government has been working with the Americans to develop the computer, the first of its kind on the African continent.

"It is half the size of a bible and powered by a motorcycle-size battery. It has passed the test and ready for deployment," he said.

The computer, referred to as the Inveneo computer, can run for three days without recharging if the user utilizes it for at least eight hours per day.

It has 256 MB of Random Access Memory (RAM) and a 40 MB hard disc. The tiny computer is Internet-ready, with wireless capabilities as well.

Nsambu said that Personal Computers (PCs) use the same amount of electricity (energy) as six Inveneo computers. The user has a solar panel for recharging the Inveneo computer. Furthermore, the user does not need a CPU or any form of power inverter or converter.
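
The power claims lend themselves to a quick back-of-the-envelope check; the battery capacity and the desktop wattage below are assumed figures for illustration, not specifications from the article:

# Back-of-the-envelope check of the power claims. Battery capacity and the
# desktop-PC wattage are assumed for illustration; the article gives neither.
BATTERY_WH = 12 * 20          # assumed motorcycle-class battery: 12 V x 20 Ah
HOURS_PER_DAY, DAYS = 8, 3    # usage pattern quoted in the article

device_watts = BATTERY_WH / (HOURS_PER_DAY * DAYS)
print(f"Implied average draw: ~{device_watts:.0f} W")              # ~10 W

ASSUMED_DESKTOP_WATTS = 60    # assumed typical desktop PC draw
print(f"Desktop / Inveneo ratio: ~{ASSUMED_DESKTOP_WATTS / device_watts:.0f}x")
# ~6x, which is roughly consistent with the "six Inveneo computers" comparison.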

Nsambu said that because of its simplicity and low power consumption, the Inveneo computer is extremely fast. It also offers two operating systems, Microsoft Windows and Linux.

Nsambu wants the computer to go to the rural schools of Uganda where there is no electricity. It is affordable, costing about US$300.

Ends

------



From: Frank A. Coluccio 4/14/2007 3:21:05 AM
   of 46820
 
[Kenyan and Rwandan] Google connects Universities
By: David Kezio-Musoke | 04/04/2007

hana.ru.ac.za

Google partners with Rwandan universities to provide free e-mail and Internet telephony services to Kenya and Rwanda.

Google has started providing free online communication services to universities in Kenya and Rwanda.

According to the Rwandan Ministry of Infrastructure, Google has partnered with about four Rwandan Universities to provide free services such as email and Internet telephone calling.

Google, a California-based Internet search engine, recently announced that it had also partnered with the Kenya Education Network to provide much the same service. Students in both African countries, as well as Rwandan government officials, will get to use online tools including shared calendars, instant messaging and word processing, Google said.

"This partnership will be a boost in terms of services offered to our Rwandan Academic Institutions, allowing them to collaborate," said Rwanda's Minister of Energy and Communications, Albert Butare.

"I believe communication between students and their lecturers will be enhanced as users throughout the country will now be using the same state-of-the-art, cutting-edge technology that is available in other parts of the world," he added.

The Rwandan universities to benefit include the National University of Rwanda, the Kigali Institute for Education and the Kigali Institute for Science and Technology. The institutes will get access to Google Apps Education Edition in the initial phase of the project, while the Rwandan government ministries will use Google Apps Standard Edition.

According to Google senior vice president Shona Brown, approximately 20 000 users in Rwanda will be able to use the Google services.

In Kenya the University of Nairobi's 50 000 students will be the first to be offered Google Apps for Education in Kenya. The services will later be extended to 150 000 Kenyan students at universities across the country.

"For us, universality is crucial because we believe everyone should have access to the same services wherever they live, whatever their language and regardless of income," said Brown.

"I look forward to working with other African governments to make life-enhancing services like free email, instant messaging and PC-to-PC phone calls more widely available across Africa."

In a style considered a defining trait of Web 2.0, the software to run the applications is hosted on Google computers, freeing users from having to install or maintain programs on their own machines.

Ends

------



From: Frank A. Coluccio 4/14/2007 3:26:51 AM
   of 46820
 

[Internet Governance] CSO condemns the composition of IGF advisory group
By: Remmy Nweke | 13/04/2007

[FAC: "Civil Society Organizations: The World Bank interacts with thousands of Civil Society Organizations (CSOs) throughout the world at the global, regional, and country levels. These CSOs include NGOs, trade unions, faith-based organizations, indigenous peoples movements, and foundations. These interactions range from CSOs who critically monitor the Bank’s work and engage the Bank in policy discussions, to those which actively collaborate with the Bank in operational activities. There are many examples of active partnerships in the areas of forest conservation, AIDS vaccines, rural poverty, micro-credit, and Internet development." Continued at the World Band site: tinyurl.com ]

hana.ru.ac.za

Civil Society Organisations (CSOs) have condemned the recent composition of the Advisory Group for the Internet Governance Forum (IGF).

The Caucus is the main coordinating framework for civil society participation in Internet governance discussions at the World Summit on the Information Society (WSIS) and, subsequently, at the first IGF meeting held in Athens, Greece, last quarter. The co-coordinator of the caucus, Mr. Adam Peake, who made this known at the weekend in a report to the CSOs on Internet Governance, said that his group was not adequately represented on the advisory group.

He also said that the composition was never discussed as part of the multi-stakeholder approach agreed upon at the second phase of WSIS in 2005, and hence was not transparent.

"While supporting the concept, we note that its composition, including the proportionate representation of stakeholder groups and the crosscutting technical and academic communities, was not openly and transparently discussed prior to its appointment," he declared.

Mr. Peake stressed that the non-transparency of the composition is compounded by a lack of clear norms in terms of mandate and working principles.

"We think that clear terms and rules should be established for the advisory group between now and another meeting coming up in Rio, through an open process involving all the participants in the IGF as a shared foundation for our common work," he said.

The caucus coordinator also noted that if the aforementioned dictates were adhered to, that would pave the way for the Secretary General's endorsement of the advisory group.

"If these rules and quarters for representation from each stakeholder group were openly established, it would be possible for the Secretary-General to delegate the actual process of selection of Advisory Group members to the stakeholder groups themselves," Mr. Peake said.

He emphasised that his group is dissatisfied with the limited representation of civil society in the first instance of the Advisory Group, which amounted to about five members out of roughly 40.

"We think that the significant participation of civil society and individual users, as proved by the Working Group on Internet Governance (WGIG), is key to making Internet governance events a success both in practical and political terms," he said.

Mr. Peake also said that CSOs would like to see such participation expanded to at least one-fourth of the group, if not one-third, and brought to the same level as that of the private sector and the Internet technical community.

"We confirm our support to the civil society members of the incumbent group and stand ready to provide suggestions for additional members with direct experience from diverse civil society groups," he said.

The group also reiterated the need for the IGF to be considered as a process rather than as an event, and supports the concept of dynamic coalitions and their activities.

"However, there needs to be a way to bless their work and give some recognition, even if not binding, to their products," he asserted.

He insisted that a transparent, multi-stakeholder and democratic process should be started to develop the criteria for the recognition of dynamic coalitions by the IGF, so that the output of coalitions that satisfy those criteria could be formally received for discussion at a plenary session of the upcoming IGF meeting.

"The IGF was created to help solving global problems that could not be addressed anywhere else," he said, noting that simple discussion is not enough as that would betray what was agreed in Tunis which is clearly stated in the mandate of the IGF itself.

"We stand ready to provide more detailed procedural suggestions on how this could work in progress or to participate in any multi-stakeholder work in process to define it," he said, pointing out that future consultations before Rio should examine in detail the various parts of the IGF mandate as defined in paragraph 72 of the Tunis Agenda, and specifically, how to deal with those that were not addressed in Athens.

He cited, for example, that comments F and I required the IGF to discuss the good principles of Internet governance as agreed in Tunis and how to fully implement them inside all existing governance processes, including how to facilitate participation by disadvantaged stakeholders such as developing countries, civil society, and individual users.

End

------
