
Technology Stocks | The *NEW* Frank Coluccio Technology Forum


From: LindyBill | 4/13/2007 8:14:23 PM
Breaking: Google Spends $3.1 Billion To Acquire DoubleClick

About 20 minutes ago Google announced that it has agreed to acquire DoubleClick for $3.1 billion in cash (nearly double the size of its YouTube acquisition). Microsoft was reportedly in a bidding war with Google for the company. Google gets access to DoubleClick's advertising software and, perhaps more importantly, its customers and network.

DoubleClick was founded in 1996 and taken private in 2005 by Hellman & Friedman and JMI Equity for $1.1 billion. The New York Times reports that DoubleClick's revenues are about $300 million a year.

10x revenue for a mature company is a…healthy…valuation. At least part of the acquisition price appears to be due to a desire by Google to keep this asset out of Microsoft's hands.
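The multiple cited above is easy to check from the article's own figures; a quick back-of-the-envelope in Python:

```python
# Back-of-the-envelope check of the ~10x price/revenue multiple,
# using only the figures reported in the article.
price = 3.1e9     # acquisition price, USD
revenue = 300e6   # reported annual revenue, USD

multiple = price / revenue
print(f"Price/revenue multiple: {multiple:.1f}x")  # → 10.3x
```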

techcrunch.com

tinyurl.com



To: ftth who wrote (20855) | 4/14/2007 1:44:06 AM
From: Frank A. Coluccio
"I didn't read the whole thing but..."

Well, I did read the whole thing, and I can't say that I'm any smarter for it, from a technical perspective. Only that I may be a bit wiser as to the ways in which facts are twisted and misrepresented. I stopped listing them early on. At one point you noted:

"Consumers can't "demand" what they don't even have available, and for that matter, what miniscule percentage of consumers in any market actually know technical operating parameters well enough to "demand" numbers. Did consumers in Japan demand 100Mbps? No; the providers translated everyday consumer activities into bandwidth...they didn't wait for consumers to get a degree in communications bandwidth metrology so they could "demand" it."

As I wrote earlier in a discussion concerning this same interview series elsewhere, the larger point worth noting here is that many subscribers are complacent with what's been "delivered" to them as "broadband," as you can probably deduce while reading the following retort that I received from a journalist several days ago. It's part of a newspaper forum discussion (Gotham Gazette) in which I made several observations and criticisms of the asymmetrical delivery of "broadband" by last mile service providers. The journalist replied:

"I challenge your contention that "broadband" is only one-directional--from those who control the connections down to the content recipients. The whole idea of "Web 2.0," YouTube, "mash-ups" and tools like blogs and wikis is to establish arenas in which people can be content creators as well as content consumers. Sure, there are major companies behind broadband, as well as some of these applications--Verizon provides my DSL service, and Google own YouTube. Nevertheless, people are making use of the interactive capabilities of the Web--many of which require a "broadband" connection--much more than a few years ago. Copyright laws must be modernized to take account of these new phenomenons; increasingly, strict interpretations of copyright are a fortress for the old-line media companies. But this would be true with or without broadband connections."
--end Gotham Gazette quote

I stated earlier that I thought this might be only a regional phenomenon, or one unique to urban centers. In fact, I have found that most non-practitioners I speak with, no matter where they reside, so long as they're receiving DSL or cable modem service (and even an alarming number of IT professionals), are near-totally taken with their new-found "broadband" toys. I cannot even say that user discontent is always proportional to Internet or telecom literacy, although to a great extent it is. As I began to suggest above, I've made it a point to discuss the merits of transparent carriage, symmetry and other attributes of neutrality with clients of mine who are IT department heads, only to be told:

"I've got TW (or Optonline, or fill in the blank) Cablemodem service and it's great! What do you have, and what's wrong with what you're getting now?"

At times like this I have to ask myself whether it's really worth the time and effort to inform and elucidate, only to ultimately wind up bursting some innocent soul's bubble.

FAC

------



From: Frank A. Coluccio | 4/14/2007 2:04:52 AM
Researchers Explore Scrapping Internet
By ANICK JESDANUN | AP | Apr 13, 2007

news.yahoo.com

NEW YORK - Although it has already taken nearly four decades to get this far in building the Internet, some university researchers with the federal government's blessing want to scrap all that and start over.

The idea may seem unthinkable, even absurd, but many believe a "clean slate" approach is the only way to truly address security, mobility and other challenges that have cropped up since UCLA professor Leonard Kleinrock helped supervise the first exchange of meaningless test data between two machines on Sept. 2, 1969.

[Below: Leonard Kleinrock comments on the dark side of the Internet - spam, pornography, among others - at his office in the UCLA Computer Science Department in Los Angeles, Tuesday, March 27, 2007. Kleinrock created the basic principles of packet switching, the technology underpinning the Internet, while a graduate student at MIT, a decade before the Internet was founded. (AP Photo/Damian Dovarganes)]



The Internet "works well in many situations but was designed for completely different assumptions," said Dipankar Raychaudhuri, a Rutgers University professor overseeing three clean-slate projects. "It's sort of a miracle that it continues to work well today."

No longer constrained by slow connections and computer processors and high costs for storage, researchers say the time has come to rethink the Internet's underlying architecture, a move that could mean replacing networking equipment and rewriting software on computers to better channel future traffic over the existing pipes.

Even Vinton Cerf, one of the Internet's founding fathers as co-developer of the key communications techniques, said the exercise was "generally healthy" because the current technology "does not satisfy all needs."

One challenge in any reconstruction, though, will be balancing the interests of various constituencies. The first time around, researchers were able to toil away in their labs quietly. Industry is playing a bigger role this time, and law enforcement is bound to make its needs for wiretapping known.

There's no evidence they are meddling yet, but once any research looks promising, "a number of people (will) want to be in the drawing room," said Jonathan Zittrain, a law professor affiliated with Oxford and Harvard universities. "They'll be wearing coats and ties and spilling out of the venue."

The National Science Foundation wants to build an experimental research network known as the Global Environment for Network Innovations, or GENI, and is funding several projects at universities and elsewhere through Future Internet Network Design, or FIND.

Rutgers, Stanford, Princeton, Carnegie Mellon and the Massachusetts Institute of Technology are among the universities pursuing individual projects. Other government agencies, including the Defense Department, have also been exploring the concept.

The European Union has also backed research on such initiatives through a program known as Future Internet Research and Experimentation, or FIRE. Government officials and researchers met last month in Zurich to discuss early findings and goals.

A new network could run parallel with the current Internet and eventually replace it, or perhaps aspects of the research could go into a major overhaul of the existing architecture.

These clean-slate efforts are still in their early stages, though, and aren't expected to bear fruit for another 10 or 15 years — assuming Congress comes through with funding.

Guru Parulkar, who will become executive director of Stanford's initiative after heading NSF's clean-slate programs, estimated that GENI alone could cost $350 million, while government, university and industry spending on the individual projects could collectively reach $300 million. Spending so far has been in the tens of millions of dollars.

And it could take billions of dollars to replace all the software and hardware deep in the legacy systems.

Clean-slate advocates say the cozy world of researchers in the 1970s and 1980s doesn't necessarily mesh with the realities and needs of the commercial Internet.

"The network is now mission critical for too many people, when in the (early days) it was just experimental," Zittrain said.

The Internet's early architects built the system on the principle of trust. Researchers largely knew one another, so they kept the shared network open and flexible — qualities that proved key to its rapid growth.

But spammers and hackers arrived as the network expanded and could roam freely because the Internet doesn't have built-in mechanisms for knowing with certainty who sent what.
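One way to see what such a built-in mechanism could look like: a keyed message-authentication code lets a receiver verify both the origin and the integrity of a message, provided sender and receiver share a secret. The sketch below is a generic illustration in Python, not any specific clean-slate proposal; the key and message are invented for the example.

```python
import hashlib
import hmac

# Hypothetical pre-shared key and message -- illustrative only.
secret = b"shared-key"
message = b"hello from node A"

# Sender computes an authentication tag over the message.
tag = hmac.new(secret, message, hashlib.sha256).hexdigest()

def verify(key: bytes, msg: bytes, received_tag: str) -> bool:
    """Receiver recomputes the tag and compares in constant time."""
    expected = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, received_tag)

print(verify(secret, message, tag))      # True: genuine and unmodified
print(verify(secret, b"tampered", tag))  # False: altered in transit
```

The point of the illustration is only that origin verification is possible when it is designed in; distributing and managing such keys across a global network is precisely the hard part the original architecture never attempted.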

The network's designers also assumed that computers are in fixed locations and always connected. That's no longer the case with the proliferation of laptops, personal digital assistants and other mobile devices, all hopping from one wireless access point to another, losing their signals here and there.

Engineers tacked on improvements to support mobility and improved security, but researchers say all that adds complexity, reduces performance and, in the case of security, amounts at most to bandages in a high-stakes game of cat and mouse.

Workarounds for mobile devices "can work quite well if a small fraction of the traffic is of that type," but could overwhelm computer processors and create security holes when 90 percent or more of the traffic is mobile, said Nick McKeown, co-director of Stanford's clean-slate program.

The Internet will continue to face new challenges as applications require guaranteed transmissions — not the "best effort" approach that works better for e-mail and other tasks with less time sensitivity.

Think of a doctor using teleconferencing to perform a surgery remotely, or a customer of an Internet-based phone service needing to make an emergency call. In such cases, even small delays in relaying data can be deadly.
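For context, today's IP stack does let an application *request* priority handling by marking packets, but the marks are advisory: routers are free to ignore them, so delivery remains best-effort, which is exactly the gap described above. A minimal Python sketch (the socket option and DSCP value are standard; whether the network honors them is entirely up to the carriers):

```python
import socket

# "Expedited Forwarding" DSCP codepoint (46), shifted into the TOS byte.
EF = 0xB8

# Mark a UDP socket's outgoing packets as latency-sensitive.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, EF)

# Read the value back to confirm the kernel accepted the marking.
print(hex(sock.getsockopt(socket.IPPROTO_IP, socket.IP_TOS)))
sock.close()
```

That asymmetry — the endpoint can ask, but nothing in the architecture guarantees — is why clean-slate researchers treat quality of service as a design problem rather than a configuration problem.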

And one day, sensors of all sorts will likely be Internet capable.

Rather than create workarounds each time, clean-slate researchers want to redesign the system to easily accommodate any future technologies, said Larry Peterson, chairman of computer science at Princeton and head of the planning group for the NSF's GENI.

Even if the original designers had the benefit of hindsight, they might not have been able to incorporate these features from the get-go. Computers, for instance, were much slower then, possibly too weak for the computations needed for robust authentication.

"We made decisions based on a very different technical landscape," said Bruce Davie, a fellow with network-equipment maker Cisco Systems Inc., which stands to gain from selling new products and incorporating research findings into its existing line.

"Now, we have the ability to do all sorts of things at very high speeds," he said. "Why don't we start thinking about how we take advantage of those things and not be constrained by the current legacy we have?"

Of course, a key question is how to make any transition — and researchers are largely punting for now.

"Let's try to define where we think we should end up, what we think the Internet should look like in 15 years' time, and only then would we decide the path," McKeown said. "We acknowledge it's going to be really hard but I think it will be a mistake to be deterred by that."

Kleinrock, the Internet pioneer at UCLA, questioned the need for a transition at all, but said such efforts are useful for their out-of-the-box thinking.

"A thing called GENI will almost surely not become the Internet, but pieces of it might fold into the Internet as it advances," he said.

Think evolution, not revolution.

Princeton already runs a smaller experimental network called PlanetLab, while Carnegie Mellon has a clean-slate project called 100 x 100.

These days, Carnegie Mellon professor Hui Zhang said he no longer feels like "the outcast of the community" as a champion of clean-slate designs.

Construction on GENI could start by 2010 and take about five years to complete. Once operational, it should have a decade-long lifespan.

FIND, meanwhile, funded about two dozen projects last year and is evaluating a second round of grants for research that could ultimately be tested on GENI.

These go beyond projects like Internet2 and National LambdaRail, both of which focus on next-generation needs for speed.

Any redesign may incorporate mechanisms, known as virtualization, for multiple networks to operate over the same pipes, making further transitions much easier. Also possible are new structures for data packets and a replacement of Cerf's TCP/IP communications protocols.

"Almost every assumption going into the current design of the Internet is open to reconsideration and challenge," said Parulkar, the NSF official heading to Stanford. "Researchers may come up with wild ideas and very innovative ideas that may not have a lot to do with the current Internet."

Associated Press Business Writer Aoife White in Brussels, Belgium, contributed to this report.

___

On the Net:

Stanford program: cleanslate.stanford.edu

Carnegie Mellon program: 100x100network.org

Rutgers program: orbit-lab.org

NSF's GENI: geni.net

------



To: LindyBill who wrote (20856) | 4/14/2007 2:39:50 AM
From: Frank A. Coluccio
Thanks, Bill. An extended read of the same story from the NY Times follows.
---

Google Buys DoubleClick for $3.1 Billion
Louise Story and Miguel Helft | April 14, 2007

[FAC: All of this talk about "ad space," as though it were something that's fungible and fully understood. Ad space where? On one's screen? Is it the characters that constitute the HTML and markup codes that one writes and copyrights? Is it part of some magical electromagnetic spectrum-like "property" domain that could be bought and sold at auction and at the same time make a claim to a specified viewing area on one's PC, TV or other multimedia appliance? If AOL or the WSJ sells "ad space" to an advertiser, does that mean that the advertiser, or WSJ, in this case, or whomever owns the Web page, owns the real estate on your screen, or has control over the bits sent to fill that 1' x 1' square box or video display area on your screen? As curious minds, we'd like to know.]

nytimes.com

Google reached an agreement today to acquire DoubleClick, the online advertising company, from two private equity firms for $3.1 billion in cash, the companies announced, an amount that was almost double the $1.65 billion in stock that Google paid for YouTube late last year.

The sale offers Google access to DoubleClick’s advertisement software and, more importantly, its relationships with Web publishers, advertisers and advertising agencies.

For months, Google has been trying to expand its foothold in online advertising into display ads, the area where DoubleClick is strongest. Google made its name and still generates most of its revenue from search and contextual text ads.

DoubleClick, which was founded in 1996, provides display ads on Web sites like MySpace, The Wall Street Journal and America Online as well as software to help those sites maximize ad revenue. The company also helps ad buyers — advertisers and ad agencies — manage and measure the effectiveness of their rich media, search and other online ads.

DoubleClick has also recently introduced a Nasdaq-like exchange for online ads that analysts say could be lucrative for Google.

“Google really wants to get into the display advertising business in a big way, and they don’t have the relationships they need to make it happen,” said Dave Morgan, the chairman of Tacoda, an online advertising network. “But DoubleClick does. It gives them immediate access to those relationships.”

The sale ends a weeks-long bidding battle between Microsoft and Google. Microsoft has been trying to catch Google in the online advertising business, and the loss of DoubleClick would be a major setback.

“Keeping Microsoft away from DoubleClick is worth billions to Google,” an analyst with RBC Capital Markets, Jordan Rohan, said.

Acquiring DoubleClick expands Google’s business far beyond algorithm-driven ad auctions into a relationship-based business with Web publishers and advertisers. Google has been expanding its AdSense network into video and display ads online and is selling ads to a limited degree on television, newspapers and radio.

The sale also raises questions about how Google will manage its existing business and that of the new DoubleClick unit while avoiding conflicts of interest. If DoubleClick’s existing clients start to feel that Google is using DoubleClick’s relationships to further its own ad network, some Web publishers or advertisers might jump ship.

A highflying stock in the late 1990s, DoubleClick was a pioneer in online advertising and one of the few online ad companies to survive the bursting of the dot-com bubble. In 2005, DoubleClick was taken private by two private equity firms, Hellman & Friedman and JMI Equity, in a deal valued at $1.1 billion. Since then, the company has sold two data and e-mail advertising businesses and acquired Klipmart, which specializes in online video.

The company generated about $300 million in revenue last year, mostly from providing ads on Web sites.

DoubleClick’s chief executive, David Rosenblatt, said a few weeks ago that a new system it had developed for the buying and selling of online ads would probably become the chief money maker within five years. The system, a Nasdaq-like exchange for online ads, brings Web publishers and advertising buyers together on a Web site where they can participate in auctions for ad space.

DoubleClick’s exchange is different from the ad auctions that Google uses on its networks because the exchange is open to any Web publisher or ad network — not just the sites in Google’s network. Offline ad sales have been handled through negotiation, but the efficiency of online auction systems has caused some advertising executives to consider using auctions for offline ads in places like television and newspapers. DoubleClick’s new exchange could function as a hub for online and offline ad sales.
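The auction mechanics the article alludes to can be sketched generically. Below is a hypothetical sealed-bid, second-price auction for a single ad slot — the mechanism popularized by search-ad auctions, in which the highest bidder wins but pays the runner-up's bid. Names and amounts are illustrative; this is not DoubleClick's actual system.

```python
def second_price_auction(bids: dict[str, float]) -> tuple[str, float]:
    """Return (winning bidder, price paid) for one ad impression.

    Winner is the highest bidder; the price is the second-highest bid
    (or the sole bid, if there is only one bidder).
    """
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

bids = {"advertiserA": 2.50, "advertiserB": 1.75, "advertiserC": 0.90}
print(second_price_auction(bids))  # → ('advertiserA', 1.75)
```

A second-price rule is often chosen because it encourages bidders to bid their true valuation; whatever pricing rule DoubleClick's exchange actually used is not specified in the article.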

------



From: Frank A. Coluccio | 4/14/2007 3:08:58 AM
[Uganda] Special computer developed for rural Ugandan Schools
HANA Staff | By David Kezio-Musoke | 13/04/2007

A new computer that is half the size of a bible and powered by a motorcycle battery makes its debut in Uganda

hana.ru.ac.za

The Ugandan government, with the help of an American company called Inveneo Inc, has developed a computer to aid rural Uganda.

Uganda's Minister of State for ICT Alintuma Nsambu told HANA that since early this year the government has been working with the Americans to develop the computer, the first of its kind on the African continent.

"It is half the size of a bible and powered by a motorcycle-size battery. It has passed the test and ready for deployment," he said.

The computer, referred to as the Inveneo computer, can run for three days without recharging if the user utilizes it for at least eight hours per day.

It has 256 MB of Random Access Memory (RAM) and 40 MB hard disc memory capacity. The tiny computer is internet ready with wireless capabilities as well.

Nsambu said that a standard Personal Computer (PC) uses as much electricity as six Inveneo computers. The user has a solar panel for recharging the Inveneo computer, and does not need a separate CPU or any form of power inverter or converter.

Nsambu said that because of its simplicity and low power consumption, the Inveneo computer is extremely fast. It also ships with two operating systems: Microsoft Windows and Linux.

Nsambu wants the computer to go to the rural schools of Uganda where there is no electricity. It is affordable, costing about US$300.

Ends

------



From: Frank A. Coluccio | 4/14/2007 3:21:05 AM
[Kenyan and Rwandan] Google connects Universities
By: David Kezio-Musoke | 04/04/2007

hana.ru.ac.za

Google partners with Rwandan universities to provide free e-mail and Internet telephony services in Kenya and Rwanda.

Google has started providing free online communication services to universities in Kenya and Rwanda.

According to the Rwandan Ministry of Infrastructure, Google has partnered with about four Rwandan Universities to provide free services such as email and Internet telephone calling.

Google, the California-based Internet search company, announced recently that it had also partnered with the Kenya Education Network to provide much the same services. Students in both African countries, as well as Rwandan government officials, will get to use online tools including shared calendars, instant messaging and word processing, Google said.

"This partnership will be a boost in terms of services offered to our Rwandan Academic Institutions, allowing them to collaborate," said Rwanda's Minister of Energy and Communications, Albert Butare.

"I believe communication between students and their lecturers will be enhanced as users throughout the country will now be using the same state-of-the-art, cutting-edge technology that is available in other parts of the world," he added.

The Rwandan universities to benefit include the National University of Rwanda, the Kigali Institute for Education and the Kigali Institute for Science and Technology. The institutes will get access to Google Apps Education Edition in the initial phase of the project, while the Rwandan government ministries will use Google Apps Standard Edition.

According to Google senior vice president Shona Brown, approximately 20 000 users in Rwanda will be able to use the Google services.

In Kenya the University of Nairobi's 50 000 students will be the first to be offered Google Apps for Education in Kenya. The services will later be extended to 150 000 Kenyan students at universities across the country.

"For us, universality is crucial because we believe everyone should have access to the same services wherever they live, whatever their language and regardless of income," said Brown.

"I look forward to working with other African governments to make life-enhancing services like free email, instant messaging and PC-to-PC phone calls more widely available across Africa."

In a style considered a defining trait of Web 2.0, the software to run the applications is hosted on Google computers, freeing users from needing to install or maintain programs on their own machines.

Ends

------



From: Frank A. Coluccio | 4/14/2007 3:26:51 AM

[Internet Governance] CSO condemns the composition of IGF advisory group
By: Remmy Nweke | 13/04/2007

[FAC: "Civil Society Organizations: The World Bank interacts with thousands of Civil Society Organizations (CSOs) throughout the world at the global, regional, and country levels. These CSOs include NGOs, trade unions, faith-based organizations, indigenous peoples movements, and foundations. These interactions range from CSOs who critically monitor the Bank’s work and engage the Bank in policy discussions, to those which actively collaborate with the Bank in operational activities. There are many examples of active partnerships in the areas of forest conservation, AIDS vaccines, rural poverty, micro-credit, and Internet development." Continued at the World Bank site: tinyurl.com ]

hana.ru.ac.za

Civil Society Organisations (CSOs) have condemned the recent composition of the Advisory Group for the Internet Governance Forum (IGF).

The Caucus is the main coordinating framework for civil society participation in Internet governance discussions at the World Summit on the Information Society (WSIS) and subsequently at the first IGF meeting, held in Athens, Greece, last quarter. Co-coordinator of the caucus, Mr. Adam Peake, who made this known at the weekend in a report to the CSOs on Internet Governance, said that his group was not adequately represented on the advisory group.

He also said that the composition was never discussed as part of the multi-stakeholder approach agreed upon at the second phase of WSIS in 2005, and hence was not transparent.

"While supporting the concept, we note that its composition, including the proportionate representation of stakeholder groups and the crosscutting technical and academic communities, was not openly and transparently discussed prior to its appointment," he declared.

Mr. Peake stressed that the non-transparency of the composition also reflects a lack of clear norms in terms of mandate and working principles.

"We think that clear terms and rules should be established for the advisory group between now and another meeting coming up in Rio, through an open process involving all the participants in the IGF as a shared foundation for our common work," he said.

The caucus coordinator also noted that if the aforementioned dictates were adhered to, that would pave the way for the Secretary General's endorsement of the advisory group.

"If these rules and quotas for representation from each stakeholder group were openly established, it would be possible for the Secretary-General to delegate the actual process of selection of Advisory Group members to the stakeholder groups themselves," Mr. Peake said.

He emphasised his group's dissatisfaction with the limited representation of civil society in the first instance of the Advisory Group: about five members out of about 40.

"We think that the significant participation of civil society and individual users, as proved by the Working Group on Internet Governance (WGIG), is key to making Internet governance events a success both in practical and political terms," he said.

Mr. Peake also said that CSOs would like to see such participation expanded to at least one-fourth of the group, if not one-third, and to the same level as the private sector and the Internet technical community.

"We confirm our support to the civil society members of the incumbent group and stand ready to provide suggestions for additional members with direct experience from diverse civil society groups," he said.

The group also reiterated the need for the IGF to be considered a process rather than an event, and said it supports the concept of dynamic coalitions and their activities.

"However, there needs to be a way to bless their work and give some recognition, even if not binding, to their products," he asserted.

He insisted that a transparent, multi-stakeholder and democratic process should be started to develop criteria for the recognition of dynamic coalitions by the IGF, so that the output of coalitions satisfying those criteria could be formally received for discussion at a plenary session of the upcoming IGF meeting.

"The IGF was created to help solve global problems that could not be addressed anywhere else," he said, noting that simple discussion is not enough, as that would betray what was agreed in Tunis and what is clearly stated in the mandate of the IGF itself.

"We stand ready to provide more detailed procedural suggestions on how this could work in progress or to participate in any multi-stakeholder work in process to define it," he said, pointing out that future consultations before Rio should examine in detail the various parts of the IGF mandate as defined in paragraph 72 of the Tunis Agenda, and specifically, how to deal with those that were not addressed in Athens.

He cited, for example, comments F and I, which required the IGF to discuss the good principles of Internet governance as agreed in Tunis and how to fully implement them inside all existing governance processes, including how to facilitate participation by disadvantaged stakeholders such as developing countries, civil society, and individual users.

End

------



From: Frank A. Coluccio | 4/14/2007 3:56:45 AM
[TEAMS] This bizarre story exemplifies why Kenya can't seem to get its act together and remains mostly "unflat".
----

No let up in Teams controversy
By: Zachary Ochieng | 13/04/2007

hana.ru.ac.za

Barely a month after The East African Marine Systems (TEAMS), Kenya's much-hyped undersea fibre-optic cable, courted bad press over a US$2.7 million survey tender awarded through single sourcing, the cable was once again in the news last week, for all the wrong reasons.

This time around, the US$110 million government-fronted fibre-optic cable, connecting Mombasa and Fujairah in the Gulf of Oman, was on the spot over its true owners. Wanjuki Muchemi, the Solicitor-General, blew the whistle upon discovering that records at the Companies Registry showed TEAMS had already been registered by two individuals: Jason Wachira, a businessman, and Dickson Kahoro, a quantity surveyor.

Following the discovery of the anomaly, Muchemi wrote to various government departments, asking that the matter be treated with the urgency it deserves, given that the government had already invested US$2.86 million in the implementation of the project.

"We are also aware that the government has extensively used the name TEAMS in a number of documents, including the memorandum of understanding with Etisalat", said Muchemi.

However, it later emerged that the two individuals had been asked to reserve the name TEAMS, in equally mysterious circumstances, by the Communications Commission of Kenya (CCK), the communications industry regulator and one of the project's implementing agencies. The two individuals have since transferred ownership of the project to the government, with Information and Communications Permanent Secretary Dr Bitange Ndemo and his Treasury counterpart Joseph Kinyua holding the shares on the government's behalf. Even so, questions are being raised as to why the government, which has its own lawyers and investment advisers, allowed the company to be registered in such mysterious circumstances.

Controversy over the registration and ownership of TEAMS mirrors a similar scenario in 2000 when, as UK's Vodafone Plc was preparing to enter into an agreement with Telkom Kenya, the sole landline operator, to form Safaricom Limited, it emerged that a Nairobi businessman, Joseph Chege had already gone ahead and registered a company by the name "Vodafone". Subsequent disputes over the use of the name "Vodafone" almost scuttled plans to form Safaricom.

Last week was the second time the government legal adviser was raising issues with TEAMS. It may be recalled that in March, the Attorney-General's chambers chided the Information and Communications ministry following the award of the survey tender to Tyco International without competitive bidding, contrary to public procurement regulations. Whereas the ministry had applied for and was granted permission by the Directorate of Public Procurement to conduct the said procurement through single sourcing, the Attorney-General's office later argued that the application to the Directorate was made unprocedurally. Tyco International announced last week that it had completed the survey.

The contract for the construction of the cable, however, will be subjected to competitive bidding later this month. Already, Telkom Kenya and Dubai's Etisalat are working on the construction modalities of the project, expected to be complete next year. Information and Communications minister Mutahi Kagwe announced last week that the government had appointed Standard Chartered Bank and PriceWaterHouseCoopers as lead financial advisers of the project, with a mandate to attract local investors to put their money in the project. The cable is expected to connect East and Horn of Africa countries to the rest of the world.

TEAMS, a joint venture between the Government of Kenya and the private sector, will have 40 per cent government shareholding, with Etisalat taking 20 per cent while the rest will go to regional telecommunications companies and Internet Service Providers (ISPs), as well as fund managers.

Kenya got approval from its cabinet in September 2006 to establish the parallel cable project following squabbles over ownership and management of the East African Submarine Cable System (EASSy), an initiative to connect countries of eastern and southern Africa via a high bandwidth fibre optic cable system to the rest of the world.

Considered a milestone in the development of information infrastructure in the region, EASSy was expected to reduce unit costs for global connectivity, leading to increased profitability and lower tariffs and charges for end users. "After we got frustrated with EASSy, we decided to seek a cabinet approval for the establishment of The East African Marine Systems (Teams)", said Dr Ndemo.

But the very reasons for which Kenya abandoned EASSy have come back to haunt its own fibre-optic cable project.

End

-------



From: Frank A. Coluccio 4/14/2007 4:05:27 AM
   of 46820
 
[Comcast] Say Good Night, Bandwidth Hog
Dan Mitchell | April 14, 2007 | NY Times

[FAC: One might infer that this is a sign of pressure, and that the model may be beginning to give, but such an explanation is too convenient. Is it a coincidence, though, that out of all the MSOs the only one that is using strong-arm tactics to this degree (if the article below is correct) just happens to be the largest MSO of them all?]

nytimes.com

An article in PC Magazine last week revealed that customers across the country have received letters from Comcast warning them to limit their bandwidth consumption or face a one-year termination of service. The company, though, “is refusing to reveal how much bandwidth use is allowed, making it impossible for customers to know if they are in danger of violating Comcast’s limit,” according to the article’s writer, Chloe Albanesius.

Ms. Albanesius also spoke with customers who said they had already paid the ultimate penalty. Frank Carreiro, a resident of Utah, had his service cut off in January, and has been blogging about it at comcastissue.blogspot.com. He started the blog in part because “silence will only encourage the company to continue abusing its customers.”

Comcast’s network security department warned Mr. Carreiro in December to reduce his use of bandwidth. He called customer service several times, he said, and was told that there was no record of such a warning and that he might have been the victim of a prank call. A month later, he said, his service was disconnected. He has since switched to D.S.L. service and is “very happy.”

The company sent a response to PC Magazine, which the magazine posted on its Web site.

Comcast restricts bandwidth because, with cable broadband, heavy users can slow service for other nearby customers. The company warns only users who “typically and repeatedly consume exponentially more bandwidth than an average residential user” and in those “rare instances,” it works with customers to get them to either limit their use or upgrade to a more-expensive commercial account.

The company earlier apologized for any “miscommunication” with Mr. Carreiro.

PC Magazine surveyed other Internet service providers, including Verizon, Time Warner and Cox, and found none routinely warned customers or cut off service. Cox provides data on its Web site describing the limits of bandwidth use allowed for its various tiers of service ( cox.com ).

By contrast, “Comcast’s policy, it seems, is to disconnect first and ask questions later,” wrote Owen Thomas of Business 2.0’s Beta blog, adding: “It sounds to me like Comcast’s network technicians are incredibly lazy — or underfunded — and its security team is trigger-happy. That’s a recipe for unhappy customers, and a P.R. disaster in the making” (blogs.business2.com/beta).

But Russell Shaw, a blogger at ZDNet, writes that while suspending service might be an overreaction, “these bandwidth hogs are abusing the system.” His idea is to “hit ’em with a surcharge” (blogs.zdnet.com/ip-telephony).

Odd Jobs: Now that you’ve paid your taxes (you’ve paid your taxes, right?), you might be looking for ways to make some extra money. Donna Freedman has tried a lot of ways to make ends meet. “I earned $35 for watching a porn film,” she wrote on MSN Money this month. “I sell my blood, a couple of teaspoons at a time, to medical researchers. I get a monthly rent credit for being the go-to gal in my apartment building.”

And that’s just for starters. She baby-sits and dog-sits, she mystery shops and she undergoes medical tests. Last year, she wrote, she kept herself afloat this way. She’s since scaled back, but keeps her hand in it because “extra income is always welcome” (moneycentral.msn.com).

Not Recommended: A man in South Korea was upset when his cellphone stopped working. He reportedly called his carrier, SK Telecom, 16 times and visited its service center twice, but got no satisfaction. When a representative suggested that he replace the phone with a different model, the man was ready: he had scrawled “Delinquent SK” across his Mercedes S500 and drove it through the revolving doors at SK’s headquarters.

“Lesson to carriers: do everything in your power to replace your customers’ defective phones after a maximum of 15 support calls,” writes Chris Ziegler of the blog engadget (engadget.com). “Lesson to customers: if you aim for the glass instead of the door, you can probably make it into the building.” DAN MITCHELL

------



To: LindyBill who wrote (20856) 4/14/2007 5:33:20 AM
From: Frank A. Coluccio
   of 46820
 
Google’s ‘Really Crazy’ Acquisition Strategy
April 12, 2007, 3:54 pm

dealbook.blogs.nytimes.com

The market for initial public offerings of Internet startups has been weak in the past few years. So for venture investors and Silicon Valley entrepreneurs seeking a way to cash out, a sale to Google, Yahoo and a handful of other Web giants has become the brass ring. So what does Google, which has about $11 billion in cash and lots of high-priced stock to use as currency, look for in an acquisition target? According to its chief dealmaker, Google likes ideas that are “really crazy.”

“We look at everything very carefully,” Salman Ullah, Google’s director of corporate development, said Wednesday in a speech at a meeting of the Los Angeles Venture Association, according to Bloomberg News. “The really crazy ones do really well.”

Google’s acquisitions in the past year or so have ranged from its $1.65 billion deal for the video-sharing Web site YouTube (whose business strategy might not be crazy but has certainly spawned a few lawsuits) to small bolt-ons such as Writely, a Web-based word processing application.

Google’s acquisitions team consists of about 15 people who Mr. Ullah said meet with dozens of companies every week. They respond to every e-mail pitch they receive, he added. But don’t call them, at least unless you have a really good idea — they return just 10 percent of phone messages from entrepreneurs looking to sell.

Echoing a former marketing line from Apple, Mr. Ullah said, “The crazy ones mean they ignore the usual restraints of investment levels required or design parameters or ‘Gee, I need more servers than anyone ever thought was possible.’ When you free yourselves from these constraints, you create crazy, cool things.”

------
