From: ms.smartest.person | 6/29/2006 11:06:49 AM | Fleck: Bad data + weak Fed = Buy gold
Bill Fleckenstein, Contrarian Chronicles | 6/26/2006 12:00 AM ET
Last week, when the Commerce Department released its monthly housing-starts data, it said May starts grew 5% from April. But that didn't jibe with other available data. This illuminates a point I've touched on in previous columns -- the sad reality that government data are nearly useless, misleading at best and just plain wrong at worst.
Government data: Keep back 200 feet
The agency's estimate of 5% housing-starts growth made no sense to me, not when 1) virtually all publicly traded homebuilders have discussed the serious weakness in their orders, and 2) the May index of homebuilder confidence -- reported by the National Association of Home Builders -- registered a level seen previously only in January 1990, June 1991 and April 1995.
It is possible that the homebuilders have essentially said, "Damn the torpedoes, full speed ahead!" and kept starting new homes even as orders drop -- which would only make matters worse. They might not have adopted that approach, though there is really no way to be sure. Either way, the housing-starts data do not accurately reflect the current condition of the homebuilding business.
Were it not for the fact that government data -- distorted as they are -- matter to those who trade markets that I care about, I would ignore these statistics altogether. So, for folks who are making any decisions based on government data, be forewarned.
The Fed has talked down markets
Turning to the subject of trading generically, I suspect we'll see lots of gyrations heading into the Federal Open Market Committee meeting on Tuesday and Wednesday and the statement that follows. I have no idea how far the tape may go in any one direction. What I can say, though, is that Fed jawboning has pretty much worked its magic, because commodities have been smacked, and many markets now appear terrified of the Fed as an entity.
I find those fears to be more than slightly ironic, given that the Fed is the engine of inflation, and that it awakened only recently -- prodded by a gold market that had begun to laugh at it -- to the roughly 5% rate of inflation we've been experiencing for a couple of years.
The Fed members' ability to still command credibility (after two bubbles in five years and various other blunders) continues to amaze me. But I suspect that, before this year is out, the Fed's credibility will be in short supply. The little rampage in gold about a month ago was just a taste of what things might look like when the Fed is finally understood to be trapped and not in charge.
ETF heft
Speaking of gold, I was pleasantly surprised to see two market developments:
* The gold ETF (StreetTracks Gold Trust, ticker GLD) took in more gold as prices plunged, with total ounces held now at a record.
* Although the silver ETF (the iShares Silver Trust, ticker SLV) saw its ounces drop a bit, they are now just shy of their previous high with silver $4 lower.
What this points out: The un-leveraged "cash-type" buyers are availing themselves of dips in price to get more exposure. Meanwhile, the leveraged futures traders are being forced to sell weakness. I recently beefed up my metals exposure and will now be at full strength as we head into the upcoming week's data.
I think folks should keep that distinction in mind. Due to leverage, people who trade futures tend to chase strength and sell weakness, while cash buyers tend to do the opposite. That phenomenon is one reason why people who trade futures usually lose money.
Bill Fleckenstein is president of Fleckenstein Capital, which manages a hedge fund based in Seattle. He also writes a daily "Market Rap" column on his Fleckenstein Capital Web site. His investment positions can change at any time. Under no circumstances does the information in this column represent a recommendation to buy, sell or hold any security. The views and opinions expressed in Bill Fleckenstein's columns are his own and not necessarily those of CNBC or MSN Money. At the time of publication, Fleckenstein did not own or control shares of securities mentioned in this column.
articles.moneycentral.msn.com |
To: ms.smartest.person who wrote (5044) | 6/29/2006 11:12:48 AM | From: ms.smartest.person | Cabo Announces Drilling Contract With Wallbridge Mining Inc. | Thursday June 29, 8:30 am ET
NORTH VANCOUVER, BRITISH COLUMBIA--(CCNMatthews - June 29, 2006) - Cabo Drilling Corp. (TSX VENTURE:CBE - News; "Cabo" or the "Company") announces that its Heath & Sherwood Drilling Inc. division has been awarded a drilling contract by Wallbridge Mining Inc. to carry out deep hole surface diamond drilling on a number of their Sudbury, Ontario properties.
The project is for a minimum 2700 metres of NQ core drilling to be completed in the Sudbury Basin, with drill holes ranging in depth from 300 metres to 1200 metres. The equipment was moved into the drilling area, approximately 7 kilometres from the transport off-loading point, using a D-6 tractor. As the road is not suitable for a transport truck, all fuel and supplies will be transported to the project from the off-loading point via a wheeled skidder. The project was mobilized during the week of the 22nd of May. Heath & Sherwood's contract with Wallbridge is an open contract, and drilling may continue, at Wallbridge's request, after the minimum 2700 metres has been completed.
Cabo Drilling Corp. is a drilling services company headquartered in North Vancouver, British Columbia, Canada. The Company provides mining related and specialty drilling services through its subsidiaries Advanced Drilling Ltd. of Surrey, British Columbia; Forages Cabo Inc. of Montreal, Quebec; Heath & Sherwood Drilling Inc., of Kirkland Lake, Ontario; and Petro Drilling Company Limited of Springdale, Newfoundland. The Company's common shares trade on the TSX Venture Exchange under the symbol: CBE.
ON BEHALF OF THE BOARD
(signed "John A. Versfelt")
John A. Versfelt
Chairman, President and CEO
Further information about the Company can be found on the Cabo website (http://www.cabo.ca) and SEDAR (www.sedar.com) or by contacting Investor Relations Ms. Sheri Barton at 403-217-5830 or Mr. John A. Versfelt, Chairman, President & CEO of the Company at 604-984-8894.
This news release may contain forward-looking statements including but not limited to comments regarding the timing and content of upcoming work programs, geological interpretations, potential mineral recovery processes and other business transactions timing. Forward-looking statements address future events and conditions and therefore, involve inherent risks and uncertainties. Actual results may differ materially from those currently anticipated in such statements.
The TSX Venture Exchange does not accept responsibility for the adequacy or accuracy of this release.
Contact:
John A. Versfelt Cabo Drilling Corp. Chairman, President and CEO (604) 984-8894 (604) 983-8056 (FAX) Email: ir@cabo.ca Website: www.cabo.ca
Source: Cabo Drilling Corp.
biz.yahoo.com
To: ms.smartest.person who wrote (5028) | 7/1/2006 9:08:31 PM | From: ms.smartest.person | :•) i, Cringely - The Rich Get Richer: Google Needs Some Ad Sense
By Robert X. Cringely May 25, 2006
After 29 years of working in high-tech companies and writing about them, I have noticed how insular they tend to be, often not seeing either the world or themselves at all clearly. Whether intended or not, this cultural artifact comes to control how the world in turn sees them, which rarely works in their favor. The classic example is Microsoft, where hiring smart people fresh from school and working them 60 hours or more per week -- in an environment where they don't even leave the building to eat -- leads to a state of corporate delusion, where lying and cheating suddenly begin to make sense. But it isn't just Microsoft that does this. It is ANY high tech company that hires young people, isolates them through long hours at work, feeds them at work, and effectively determines their friends, who are their co-workers. This trend even extends to the anti-Microsoft, to Google, where the light of day is sorely needed.
Google is secretive. This started as a deliberate marketing mystique, but endures today more as a really annoying company habit. Google folks don't understand why the rest of us have a problem with this, but then Google folks aren't like you and me. The result of this secrecy and Google's "almighty algorithm" mentality is that the company makes changes -- and mistakes -- without informing its customers or even doing all that much to correct the problems. It's all just beta code, after all. But the business part is real, as is the money that some people have lost because of Google's poor communication skills combined, frankly, with poor follow-through.
First there is click fraud. Google makes its money when people click on Google ads, but some of those clicks are fraudulent -- are not honestly intended to gain information or to buy products. Click fraud generally comes in two varieties that I'll call "buy" and "sell." An example of buy-side click fraud would be my little sister religiously clicking on every Google ad on this page (What? We have no Google ads?) in the mistaken belief that doing so would make me some money. It is mass clicking by a single person without an intention to actually buy or even to gain information. Sell-side click fraud would be one advertiser clicking on the ads of a competitor with the intention of costing that competitor money without increasing their sales. Both types of click fraud ought to be detectable, and in fact, Google says it already detects the 10 percent or so of clicks that are fraudulent (Business 2.0 magazine says it is more like 30 percent), and adjusts the bill before the advertiser even knows what is happening.
But not all click fraud is detected automatically. It is one thing to notice the same IP address being used to click 30 ads in three seconds (that is obviously fraud), but quite another if the clicks are spread out or come from what appear to be a variety of users. There, too, Google pledges to make things right, though it may take some time -- too much time, I think.
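To make the easy case concrete, here is a minimal Python sketch of the kind of rate check described above -- the same IP clicking many ads in a few seconds. It is purely illustrative: Google's actual detection logic is not public, and the window and threshold below are invented.

    # Illustrative only: a naive rate check of the kind described above,
    # NOT Google's actual fraud-detection logic. Thresholds are made up.
    from collections import defaultdict, deque

    WINDOW_SECONDS = 3      # look-back window (assumed)
    MAX_CLICKS = 30         # clicks allowed per IP inside the window (assumed)

    recent_clicks = defaultdict(deque)   # ip -> deque of click timestamps

    def is_suspicious(ip, timestamp):
        """Return True if this click pushes the IP over the rate threshold."""
        clicks = recent_clicks[ip]
        clicks.append(timestamp)
        # drop clicks that have aged out of the window
        while clicks and timestamp - clicks[0] > WINDOW_SECONDS:
            clicks.popleft()
        return len(clicks) > MAX_CLICKS

The harder cases Cringely goes on to describe -- clicks spread over time or across many apparent users -- are exactly the ones a simple per-IP counter like this cannot catch.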
My friend Mario Fantoni is a victim of click fraud, which in this case is simply defined as his Google AdWords bill climbing from $250 one month to $4,000 the next with no change in the campaign or increase in sales. Mario contacted Google, which, after an "investigation," decided that he was, indeed, a victim of click fraud. Good for Google! But that was seven weeks ago and Mario is still waiting for his credit card to be reimbursed for $3,750. Google has yet to explain why it is taking so long for Mario to get his money back. For that matter, Google has yet to actually say that Mario will GET his money back. They are still "investigating," which could mean anything because the company will not explain what it means.
If you have been a victim of acknowledged click fraud on Google (where Google admits there is a problem), please let me know about it this week (bob@cringely.com), and especially tell me what it took to get reimbursed.
The next problem I have with Google came to me courtesy of Luis Dias, a software developer with IO Software in the UK. Luis's product is a mathematical equation editor cleverly called Equations! It is great for users of Don Knuth's LaTex page formatting program, especially if they want output readable in other applications, like Microsoft Word. You can find more about the program in this week's links.
Luis decided to sell his program online using a Google ad campaign, targeting terms like "physics equations," "equation editor," and of course "LaTex." Because he didn't expect much competition selling equation editors, Luis thought that he could get most of these words for about Google's minimum price, which in the UK is 1p. In practice, though, he found that the minimum price was 3p for most words, and that minimum shortly jumped and then jumped again until some words cost as much as £2.75 (about $5.15). Since there was no competition for these ads, Luis couldn't figure out what was going on, and frankly, Google wasn't much help. They said that his words had low "Quality Scores," which meant that the minimum charge per word had to go up by the amount specified. That made no sense to Luis or to me, so I contacted Jeff Huber at Google.
To his and Google's credit, Jeff became very involved in explaining this situation and Google's position. I happen to think it is the wrong position, but at least we have had good communication.
What I learned is that the Quality Score of Luis's words was low, suggesting that it was doubtful many readers would find the ads useful or click on them. "In the recent past," explained Jeff, "ads with low Quality Scores were disabled -- i.e., not shown -- which could create frustration for an advertiser since it was a binary (on/off) decision. More recently, we evolved from the binary approach to a more flexible economic model that instead of disabling lower performing ads entirely would allow them to participate as long as the minimum bid was set at an appropriate level. This feature has both provided advertisers with greater control, as well as helped reduce the number of low quality ads by better aligning economic incentives."
This explains Luis's surreal experience of having to pay more and more to get less and less. Unfortunately Google doesn't do a very good job of explaining this change, perhaps because it appears to be precisely the kind of paid placement they do at Overture Systems (now part of Yahoo). This almost total lack of explanation may have been part of the reason why Luis was scratching his head.
We have two concepts here -- a Quality Score for words and a History for campaigns. A low Quality Score can lead to an increase in minimum word price and a poor history (lots of low Quality Scores along with low clicks-through) can lead to minimum prices being raised not just for one word, but for all words. In the case of Luis, all this took place in three days and half a dozen words, which I'd hardly call much history, yet there is very little for him to do about it short of canceling his Google account and starting all over.
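Since Google does not publish the formula, here is a deliberately toy Python model of the behavior described above: a low Quality Score raising a keyword's minimum bid, with the campaign's overall history dragging every keyword along with it. The blending weights, the inverse relationship, and all the numbers are assumptions made only to show the shape of the effect, not Google's actual pricing.

    # Toy model of the behavior described above -- NOT Google's published formula.
    # All constants and the inverse relationship are assumptions for illustration.

    BASE_MIN_BID_GBP = 0.01   # the advertised 1p floor

    def keyword_min_bid(quality_score, campaign_history_score):
        """Estimate a minimum bid: the lower the scores, the higher the price."""
        # blend the keyword's own score with the whole campaign's history,
        # since the column argues the campaign drags individual keywords with it
        blended = 0.5 * quality_score + 0.5 * campaign_history_score
        blended = max(blended, 0.01)          # avoid division by zero
        return round(BASE_MIN_BID_GBP / blended, 2)

    # A niche campaign with little history and low scores:
    print(keyword_min_bid(quality_score=0.05, campaign_history_score=0.02))  # ~0.29
    # A big advertiser whose campaign history is strong:
    print(keyword_min_bid(quality_score=0.05, campaign_history_score=0.95))  # ~0.02

In a model like this, the same mediocre keyword is priced more than ten times higher for the small advertiser than for the big one -- which is the complaint the rest of the column develops.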
The funny thing about history at Google is that it exists even before people see the ads. The AdWords algorithm takes a look at ads before they are posted and makes a quality estimate that is the starting point for the history, even before an ad is shown. It apparently looks at ads in preparation, too, which is how it is able to assign minimum bid prices. Even ads you decide never to run can affect prices.
There appears to me to be a fundamental error here in Google's methodology. While they talk a lot about keywords and their Quality Scores, the Google system appears not to work with keywords at all, but with campaigns. A poorly performing keyword will drag down all the other keywords in a campaign, whether their Quality Scores are good or bad, whether they had just two impressions, or whether they were ever even active.
What's happening here is just that Luis is trying to sell something that hardly anybody wants to buy.
The Google system -- THOUGH I AM SURE IT IS NOT INTENDED TO OPERATE THIS WAY -- works poorly for small sellers trying to reach buyers of obscure products. This may come down, frankly, to Google's concept of small business, which I have never seen defined.
The system appears to be optimized for people selling goods of interest to millions of people, to huge marketers no matter what they are selling, but not to tiny businesspeople trying to mine narrow niches like equation editors. By targeting campaigns and not keywords, big advertisers can use any keywords they wish regardless of the relevance simply because they exist in a much larger corpus. Because keywords are treated by Google not individually but in the context of the whole campaign and these companies receive so many clicks overall, none of their keywords are likely to perform poorly using this algorithm. The result is that no matter how inappropriate or even offensive, those words will also be cheap to buy. And because these keywords have performed "well," their "history" is then updated with regard to the campaign of the big advertiser, effectively blocking out small advertisers who can't compete with the massive click rates of big companies.
Google says "we'd much rather show nothing (white space) than a poorly targeted or non-relevant ad." But on the basis of pure performance -- what Google actually DOES, rather than what Google SAYS -- it would appear that ad quality is irrelevant in the presence of huge ad budgets. The data suggest Google really cares about massive click rates, which under most circumstances come from big companies that have a huge built-in advantage.
So the rich get richer.
Google attracts advertisers like Luis with the idea that their ads will be cheaper because, frankly, they are selling something that is only thinly traded. The dream is that the system scales and scales fairly, only it isn't fair at all, because if Amazon wants to advertise an equation editor USING EXACTLY THE SAME AD TEXT AND FORMATTING AS LUIS -- their words will cost a hundredth of what the same words cost Luis. It's not that Amazon (or any other big Google advertiser) has better copywriters; it's just that they sell a broader range of things.
"A large percentage of impressions & clicks do have £0.01 minimum bids," said Jeff from Google, "but these are our very highest quality ads/advertisers."
In other words, the minimum word price is 1p, BUT NOT FOR YOU.
But what is Luis to do, I mean really? All his keywords are now at very high prices and will not come down. The only way to escape this vicious circle is to open a new account, but even then he'll still be at the mercy of badly behaved keywords that come with the equation editing territory. He has to describe his product SOMEHOW.
It would be far better, I suppose, for Google customer service to simply suggest Luis not use Google ads to sell his equation editor.
I am sure this is not what Google intended, but it is misleading, unfair, and poorly explained.
"The system does scale fairly, and provides a level playing field for both small and large advertisers," says Jeff Huber. "If Mr. Dias has relevant ads, keywords, and landing page, he should be able to do just as well as other advertisers, regardless of size. It does not mean, however, that Mr. Dias or any other advertiser will be able to economically show ads that are not relevant and not consistent with user intent. If Mr. Dias or other advertisers want a large quantity of untargeted impressions, there are a variety of media that offer these relatively cost effectively (e.g., web banner ads, TV, newspapers, magazines). It is fair to observe that if there are any advertisers who may have a slight advantage, it's advertisers who have strong brands that users recognize and trust, and therefore users find more compelling when they show relevant ads -- but that's very consistent with the 'real world' and value of brands."
It all comes down to the AdWords algorithm and its intent, which isn't to help Luis OR Amazon, but to simply maximize profit for Google. pbs.org |
To: ms.smartest.person who wrote (5046) | 7/1/2006 9:09:47 PM | From: ms.smartest.person | :•) i, Cringely - America's Pastime: Google Responds to Last Week's Column, but Fails to Appreciate the Difference Between Home Run Hitters and Hot Dog Vendors
By Robert X. Cringely June 1, 2006
One of my heroes is a guy named Jeff Angus, who lives in Seattle. Jeff's interests are wider than most. He was an early Microsoft employee, a sports writer, a Congressional aide, a cab driver, and so of course, he works today as a management consultant. Jeff is fiercely intellectual to the point where it can be exhausting to even speak with him, though every one of those discussions has been rewarding, at least for me. I value Jeff's judgment, too, with the sole exception of his decision one year to drive to Comdex. All this is prelude to a plug for Jeff's new book, Management by Baseball, which is a bunch of great sports stories you've never heard used to effortlessly explain how to run a business, pretty much any business. The book is, I believe, unique in its genre and well worth reading. It sure taught me a lot, and you can learn more about Management by Baseball behind one of this week's links.
It was Casey Stengel, according to Jeff, who said that baseball is a talent business and the manager's job is acquiring, nurturing, inspiring, and ultimately disposing of that talent. The same is true in most businesses today, where the heaviest lifting is often done by knowledge workers, and individual contributors can make or break the enterprise. The only significant difference between a baseball team and the typical IT business is that baseball has statistics, which make measuring success a lot simpler. Whether a corporate executive is effective or not is usually open to wide interpretation; whether you won the World Series is not.
Preparing for this second column on Google I turned to Jeff's book for inspiration, and found it in a peculiar form that you'll shortly understand.
Last week, you may recall, I wrote about Mario Fantoni having trouble getting a refund from Google for AdWords click fraud, and about Luis Dias, whose AdWords costs were dramatically (and he thought unfairly) rising because not many people were actively looking for equation editors. In the week since that column appeared, I heard from several hundred readers, many of whom wanted to point out that LaTeX (not LaTex as I called it) was written originally by Leslie Lamport and not Don Knuth. The rest all had comments to share about Google or were representing Google, itself. I heard a LOT from Google.
The Google reps (by this time I had been punted from engineering to PR) thought I was completely mistaken in my representation of Mario's click fraud situation, where I said he had waited seven weeks for a refund of more than $3,000. After a conference call with Mario and three Googlers to go over his account, they pointed out that Mario's costs had risen significantly because he had increased the number of keywords in his campaign, the daily budget, and the maximum price per word -- so if his costs rose by 300-plus percent between January and February, it was Mario's own doing, not Google's. And they HAD given Mario a click fraud refund of $115, which he had overlooked. Fair enough.
But Mario's concern about click fraud was based not on his Google bill as much as on his own web statistics, which are compiled by web-stat.com. The actual statistics are among this week's links, but here is the summary that caught Mario's eye:
Visitor Sessions per Month
Jan 06 = 386 visits (Google account active = $555.99)
Feb 06 = 328 visits (Google account active = $1,893.26)
Mar 06 = 348 visits (Google account paused)
Apr 06 = 410 visits (Google account paused)
May 06 = 916 visits (Google account active = $751.03)
When Mario saw that his monthly AdWords cost had more than tripled from January to February, but his number of visitor sessions had actually DECLINED, he suspected click fraud. When he paused his AdWords account completely for two months and his number of visitor sessions edged UP, it was even more surprising.
What's going on here? Google doubts Mario's numbers, but not their own. Mario sees no reason to doubt his numbers, which were compiled by a neutral third party. I have no idea who is right or wrong, and just throw it out to you as a curiosity.
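One way to see why Mario was suspicious is to divide his AdWords spending by the visitor sessions web-stat.com recorded. The figures below are simply the numbers quoted above; only the arithmetic is new.

    # Cost per tracked visitor session, using the web-stat.com numbers quoted above.
    months = {
        "Jan 06": (386, 555.99),
        "Feb 06": (328, 1893.26),
        "May 06": (916, 751.03),
    }
    for month, (visits, adwords_cost) in months.items():
        print(f"{month}: ${adwords_cost / visits:.2f} per visit")
    # Jan 06: $1.44 per visit
    # Feb 06: $5.77 per visit
    # May 06: $0.82 per visit

A four-fold jump in cost per tracked visit in a single month is the anomaly; whether it means fraud or mismeasurement is the part nobody can settle.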
Google was clearly annoyed by my column last week, and, well, they should be, because it generated a ton of e-mail from disgruntled AdWords and AdSense users they would probably rather not have heard from. Any system that involves billions of dollars and is used by millions of people will have disgruntled users. It doesn't matter how good it is, some people will always complain. So the fact that I received more than 200 complaints doesn't, itself, mean that much. But the nature of those complaints was interesting, since most of them had to do with customer service. And that brings us back to Casey Stengel.
Google, as a developer of very high technology on a huge scale, is very much a talent company. Individual programmers are responsible for many of Google's most important components. The outfit is known for hiring the best and the brightest, and their hiring process is among the most difficult and mysterious for those who successfully complete it. The mystique of Google is built entirely around this combination of great people and great secrecy. Of the people I know at Google, for example, the only one whose actual job I know is Vint Cerf, and that's because it appears that Vint's job is to be Vint. Otherwise, I have no idea what my Google friends do at work.
Larry Page and Sergey Brin, but most importantly CEO Eric Schmidt (well, I guess I know what HIS job is), have created a corporate culture in which they feel comfortable. For Larry and Sergey, it is collegial; for Eric, it brings all the best parts of Sun Microsystems (that is, the software) without the hardware or the big sales organization. There is nothing wrong with any of this. The problem is that while Google is a talent business, customer service ISN'T.
Customer service isn't a program to be optimized or a function where one genius can do the work of 20. Customer service scales linearly, and any attempt to alter that usually is accompanied by a degradation in service. So Google has customer service reps madly cutting and pasting boilerplate e-mails, but I can tell you the boilerplate often doesn't cover the required material in a useful way, and beyond that boilerplate there is nothing -- well, at least nothing we are allowed to know about. It turns out that there ARE ways to escalate through Google customer service, but you pretty much have to know where you are going because they won't tell you.
To the typical disgruntled Google customer, it looks like you can contest your bill, but not to any third party, just back to Google. Beyond that, if there is a credit card involved you can take action through the card company or you can go to court, which is a huge leap that most people aren't willing to take, so they don't, and Google wins.
It's not that there are so many people mad at Google, but that the people who ARE mad tend to feel they have no recourse. And it appears to me that much of this can be attributed to Google's lack of proper attention to customer service and simply explaining better both their services and customer options.
Reading this, Google will fall back on statistics, where they feel most comfortable. Their user satisfaction numbers are fine, they'll say. Well, I ran customer support for a $1 billion high tech company years ago, back when $1 billion was a lot more money than it is today, and I can tell you there is more to this problem than statistics can reveal.
And it isn't just AdWords customers who are frustrated. Here's the view of Google from the third-party developer perspective: "I have been the lead developer and architect at companies that offer automated bid management and optimization of keyword ad campaigns at Google and other search engines. In both cases, interaction with Google was, far and away, the most difficult problem to solve. Not only do they not eliminate some fraudulent clicks that were fairly easy to detect, they are absolutely impossible to actually have a business relationship with. Even when managing campaigns that totaled close to $1 million in revenue every month, getting an actual answer out of Google for ANYTHING was usually a matter of weeks, and often required walking down the street and actually entering the Google offices unannounced to 'chat' with various friends who work there and who were sat fairly close to our account manager. I've never been convinced that it is a culture of secrecy so much as it is a culture of arrogance. They are so convinced of their superiority over there that they honestly don't seem to believe you when you point out their mistakes, so they utterly fail to act."
None of my friends at Google are arrogant, not one. But that doesn't mean the COMPANY doesn't appear to be so.
Google kept explaining to me this week what an inconvenience it was for them having to answer my questions, yet doing so probably saved them many customer support calls as I work through this morass on your behalf.
What Google is failing to remember is that when your entire focus is on hitting home runs, somebody still has to sell the hot dogs. pbs.org |
To: ms.smartest.person who wrote (5047) | 7/1/2006 9:12:01 PM | From: ms.smartest.person | :•) i, Cringely - Local Heroes: Could the Key to Successful Internet Television Be...PBS?
By Robert X. Cringely June 8, 2006
Though you might not always know it from reading this column, PBS is a television network. And as a TV network, PBS is facing the same sort of technical challenges as its more commercial competitors. At this moment, that includes deciding how to play in the emerging world of digital downloads and IPTV. But there is an aspect of this that most people don't think about, and that's the difference between national and local strategies, between how the network might want to run IPTV versus how local station managers see the opportunity. Up until now, IPTV has seemed to appeal more to the network than to its affiliates, but that's just because people aren't thinking clearly. IPTV might, in fact, lead to a renaissance in local television.
The Internet television story, even as written here in columns going back as far as the late 1990s, pushed the idea of enabling the aggregation of widely-dispersed viewing audiences, allowing programming to thrive that might not be successful on any local station, much less on the national network. A good example is NerdTV, which wouldn't attract enough viewers on most PBS stations to even generate a rating, yet which, when offered as an Internet download drawing from a global population, makes some pretty good numbers. But there is no concept called "local" in this aggregation model, so stations tend to feel threatened by it; if the network can reach local viewers directly, what need is there for a local station?
But it doesn't have to be that way, because the supposed strengths of centralization aren't really strengths at all when viewed in terms of the much more imposing issue of bandwidth costs, where all the advantages are local.
I explained this to a group of PBS station managers meeting last month in Orlando, Florida. Where these folks tended to fear that IPTV portended the disintermediation of local television, I argued the exact opposite. My reasoning came down to the price differential between Internet bandwidth and intranet bandwidth, the latter being the bandwidth entirely within the ISP's local point of presence or data center. There is a lot more of this intranet bandwidth, for one thing. Depending on how their network is segmented, a local provider of cable Internet or DSL service may have gigabits of aggregate customer bandwidth attached to a much smaller Internet pipe. A 100-to-1 ratio of internal to external bandwidth is typical, meaning the effective cost of internal bandwidth is 100 times lower.
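A rough worked example shows how quickly that economics tilts toward serving viewers inside the ISP's network. The $0.16-per-gigabyte figure is the NerdTV distribution cost cited later in this column, and the 100-to-1 ratio is the one above; the viewer counts and file sizes are made up for illustration.

    # Worked example of the 100-to-1 claim. The $0.16/GB figure is the NerdTV
    # cost cited later in this column; viewer counts and file sizes are assumed.
    INTERNET_COST_PER_GB = 0.16
    INTRANET_COST_PER_GB = INTERNET_COST_PER_GB / 100   # effective cost inside the ISP

    def monthly_cost(viewers, gb_per_viewer, local_fraction):
        """Cost of serving `viewers`, with `local_fraction` of them reached
        inside the ISP's own network rather than over the public Internet."""
        total_gb = viewers * gb_per_viewer
        local = total_gb * local_fraction * INTRANET_COST_PER_GB
        external = total_gb * (1 - local_fraction) * INTERNET_COST_PER_GB
        return local + external

    # 10,000 viewers pulling 1 GB each: all-Internet vs. mostly-local delivery
    print(round(monthly_cost(10_000, 1.0, 0.0), 2))   # 1600.0
    print(round(monthly_cost(10_000, 1.0, 0.9), 2))   # 174.4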
What I advised the station general managers to do was to serve their traditional audiences as much as possible over internal, rather than external, connections. This means colocating a server down at the telephone and cable TV companies, which isn't hard to do since most communities have just two broadband providers, and the PBS station manager probably knows both of them from Rotary meetings or from the local United Way board.
In large part, this local advantage comes down to personal relationships. This is in stark contrast to my last two columns on Google -- a company that wants to avoid real human contact. Make friends with your local broadband providers, I said, then find a way to put your content INSIDE their network.
The advantages of this strategy are profound. Bandwidth costs go away completely, which not only frees up money for more programming or better servers, it becomes much more practical to display video with larger frame sizes, faster frame rates, and higher resolutions, creating a better viewing experience.
Imagine a local PBS station putting one of those lovely new eight-core Sun servers at the phone and cable companies. The servers would be packed with local programming and capable, through their multiple gig-Ethernet interfaces, of supporting as many simultaneous Internet viewers as the station serves broadcast users for many hours of the day.
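The capacity claim is easy to sanity-check. Assuming roughly DVD-quality streams at about 1.5 Mbps and a handful of gig-Ethernet ports with some headroom left for protocol overhead -- all assumptions, not a spec for any particular Sun box -- the arithmetic looks like this:

    # Back-of-the-envelope capacity for the server described above.
    # Every constant here is an assumption made for illustration.
    GIG_E_INTERFACES = 4            # assumed number of gig-Ethernet ports in use
    USABLE_FRACTION = 0.7           # leave headroom for protocol overhead (assumed)
    STREAM_KBPS = 1500              # ~1.5 Mbps per viewer (assumed)

    usable_kbps = GIG_E_INTERFACES * 1_000_000 * USABLE_FRACTION
    print(int(usable_kbps / STREAM_KBPS), "simultaneous viewers")   # 1866

Under those assumptions a single box serves well over a thousand simultaneous full-quality streams, which for many PBS markets really is in the neighborhood of the broadcast audience at off-peak hours.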
What's in this for the phone and cable companies is revenue sharing for advertising, as well as reducing demand on the ISP's own Internet connection. They'll understand instantly and see the revenue and cost-saving potential.
Put the station's live signal on each server and enable downloads of past shows. Promote the idea of using the Internet for something viewers would like to see again. Offer video in 640-by-480 at 30 frames per second. And do it all using unroutable IP addresses behind a NAT firewall, with the external interfaces allowing only lower resolutions and smaller frames for true Internet viewers outside the ISP.
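In practice that policy could be as simple as checking whether the requesting address is on the ISP's internal network before choosing a stream profile. The sketch below is only illustrative; the private address ranges and the two quality tiers are placeholders, not a description of any deployed system.

    # Sketch of the serving policy described above: full quality for clients on
    # the ISP's internal (private) network, reduced quality for everyone else.
    # Address ranges and the bitrate ladder are placeholders, not a real deployment.
    from ipaddress import ip_address, ip_network

    INTERNAL_NETS = [ip_network("10.0.0.0/8"), ip_network("172.16.0.0/12"),
                     ip_network("192.168.0.0/16")]

    def pick_profile(client_ip):
        addr = ip_address(client_ip)
        if any(addr in net for net in INTERNAL_NETS):
            return {"resolution": "640x480", "fps": 30}    # inside the ISP
        return {"resolution": "320x240", "fps": 15}        # true Internet viewer

    print(pick_profile("10.20.30.40"))    # local cable/DSL subscriber
    print(pick_profile("203.0.113.5"))    # external viewer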
Of course, this has to be limited to content actually owned by the station, which generally means locally-produced content. But on PBS nearly all shows are, at some point, qualified as locally-produced. So do a deal with WGBH or WNET to offer their content (shows like Nova and Masterpiece Theater) on your servers. Absent a good peer-to-peer distribution scheme, this is the best way to get TV over the Internet and it is inherently local.
But wait, there's more!
When John Warnock and Chuck Geschke started Adobe Systems almost 30 years ago, their goal was to build a printer -- one printer -- to serve the world. They knew the processing power required for a PostScript printer would be more than that of any computer they could buy circa 1979, so it logically followed that people wouldn't own their own printers, they'd share one. It's the same way the French look at their bread, which is never baked at home because the professional bakers do too good a job.
So John and Chuck set off to build what was essentially a service bureau for cool printing. This isn't such an odd story, either, because I've heard similar tales from other pioneers who couldn't imagine that their technologies would be eventually affordable by the masses so they planned centralized systems.
Generally, though, we tend to head the opposite direction and avoid centralization in computing. We live in the age of little iron, where large systems tend to mean clusters of smaller computers. But there are times when centralization is actually a good thing, where we don't always have to be reinventing the wheel. One such area is video serving, which I think we're handling in a remarkably stupid fashion.
If there is a 100-to-1 cost and quality advantage to intranet, rather than Internet, distribution, then we ought to do as much as possible without hitting the Internet, itself. This was the whole idea behind edge-caching and companies like Akamai. But I propose we take the model a step further and organize it in a different way: as a not-for-profit enterprise.
To the Libertarian powers of Silicon Valley, not-for-profit enterprises make little sense until they look at their own track records and realize that most high tech startups don't show a profit for years, if ever. So what if you took away the profit incentive entirely, specifically for certain types of commodity activities like Internet video serving?
What brought this idea to mind was a contact recently from a very well-known company looking for a cheaper way to distribute video. How did we do it for NerdTV? I explained the technique that keeps our costs at $0.16 per gigabyte, but petty theft just didn't seem right for a really big operation. That's when I realized that everyone is looking for a lower cost of video distribution, while most ventures aren't, themselves, about video distribution. Some are, sure, like Akamai, but most aren't.
What if we offered intranet video downloads and streaming in any format and resolution on a non-profit basis, which is to say something along the lines of cost plus three percent? It would be a huge hit! Every content business would hand over its basic bit schlepping because economies of scale would make the non-profit effort cheaper than they could possibly do themselves. The Internet video business would grow faster and fewer startups would die for technical or funding reasons. Local advertising in association with broadband ISPs would create new business models for content creators and ISPs alike.
And who would run this not-for-profit enterprise? In the U.S. it would be logical for it to be run by the 348 local PBS stations that live in every major and minor market and already operate on a not-for-profit basis. Instead of just Sesame Street, let them distribute The Sopranos and WWE wrestling, too. And in doing so they can not only serve their local markets, but those markets can help in new ways to support local programming.
The dream we have of a global network has kept us from realizing that when it comes to taking Internet video to the next level, our real heroes ought to be local. pbs.org |
To: ms.smartest.person who wrote (5048) | 7/1/2006 9:13:27 PM | From: ms.smartest.person | :•) i, Cringely - Taking One for the Team: Microsoft Has to Change and the Departure of Bill Gates (AND Steve Ballmer) Is the Only Way to Assure Their Fortunes. Also, Net Neutrality Isn't What You Think It Is
By Robert X. Cringely June 15, 2006
Here I was, barreling along on a column about Net Neutrality when Bill Gates up and announces his departure from active duty at Microsoft. That simply can't be ignored, so I'm off on a sudden change of course. But because I am never one to hold back, take a look further down and you'll find Net Neutrality covered, too.
As everyone by now knows, Gates announced that he was giving up his Software Architect and Research jobs to Ray Ozzie and Craig Mundie, respectively, and would be withdrawing from day-to-day responsibility over a period of two years. (He would, however, probably remain Microsoft's largest shareholder.) These details mean both a lot and very little depending on precisely how they are executed, but generally I think it bodes well for the company.
Microsoft is in crisis, and crises sometimes demand bold action. The company is demoralized, and most assuredly HAS seen its best days in terms of market dominance. In short, being Microsoft isn't fun anymore, which probably means that being Bill Gates isn't fun anymore, either. But that, alone, is not reason enough for Gates to leave. Whether he instigated the change or someone else did, Gates had no choice but to take this action to support the value of his own Microsoft shares.
Let me explain through an illustration. Here's how Jeff Angus described Microsoft in an earlier age in his brilliant business book, Management by Baseball:
"When I worked for a few years at Microsoft Corporation in the early '80s, the company had no decision-making rules whatsoever. Almost none of its managers had management training, and few had even a shred of management aptitude. When it came to what looked like less important decisions, most just guessed. When it came to the more important ones, they typically tried to model their choices on powerful people above them in the hierarchy. Almost nothing operational was written down...The tragedy wasn't that so many poor decisions got made -- as a functional monopoly, Microsoft had the cash flow to insulate itself from the most severe consequences -- but that no one cared to track and codify past failures as a way to help managers create guidelines of paths to follow and avoid."
Fine, you say, but that was Microsoft more than 20 years ago. How about today?
Nothing has changed except that the company is 10 times bigger, which means it is 10 times more screwed-up.
Microsoft has spent five years and $5 billion NOT shipping Windows Vista. This reflects a company deliberately built in the image of its founder, Bill Gates -- a single-tasking, technically obsolete executive with no checks or balances whatsoever who fills the back seat of his car with fast food wrappers. So Bill has to go, because as an icon, he's great, but as a manager, he sucks.
Part of this is Gates, personally, and part of it is his entourage -- a meritocracy based as much on historical proximity to Bill as anything else. That inner circle has to go, too, and if it doesn't go -- and go immediately -- the required change won't really happen because the one true Bill will just be replaced by a dozen or more Bill clones.
Up to this point, most of the top people leaving Microsoft have gone because they couldn't stand the working environment. Look now for a second bunch of top people to leave because they liked the working environment too much.
One of the two needed components Microsoft has always lacked is professionalism. Ray Ozzie and Craig Mundie, as successful leaders in other professional organizations and supposedly bona fide adults, are supposed to fix that, which they have a good chance of doing if Gates doesn't meddle.
Notice from the announcement that both men are assuming their new roles immediately? If that's the case, what will Gates be doing, then, during that two-year transition? If he's smart, Gates will do nothing at all. The two-year transition is based on his paranoid need to keep his thumb in the backs of both men and be able to return if he chooses to. But while Ozzie and Mundie are each capable of failure, it is important to remember that GATES HAS ALREADY FAILED, so coming back isn't really an option, though he may not yet get that.
If either of the new guys fail, look for Microsoft to again hire from outside, NOT to bring back Bill Gates.
The other attribute that Microsoft has historically lacked is ethics, which also comes directly from the cult of Bill, with its infinite shades of gray. Microsoft has to this point generally thrived by stealing technology from other companies. But now it is at the point where there isn't that much left to steal, so Microsoft is faced with operating in a whole new manner -- actually inventing stuff. This requires discipline -- not just discipline to do the work, but discipline not to backslide and steal a little of this and that when the going gets rough.
In short, for Microsoft to have the barest hope of preserving its monopoly, it has to build a whole new monopoly based on honest, original work devoid of politics, backstabbing, and lies. This means not only does Gates have to go, but for all practical purposes CEO Steve Ballmer should go, too, because he's as responsible as Gates for this mess.
So IF THEY DO IT THE RIGHT WAY, look for Gates to move his office to the Foundation immediately, look for several dozen of his closest and oldest associates to leave the company in the next four to six weeks, and look for Steve Ballmer to leave, too, within a year.
Anything less simply won't work.
Now to Net Neutrality -- what does it really mean and why do some telecommunication providers seem so opposed to it? The answers are neither as clear -- nor as evil -- as partisans on both sides of the aisle in Congress are suggesting. Those opposing Net Neutrality have in mind VoIP, and nothing but VoIP. Those in favor of Net Neutrality seem to think it means equal treatment under the Internet, which it doesn't really. The only thing we can be sure of, in fact, is that Congress doesn't get it and has a fair chance of making it worse.
The U.S. House of Representatives recently passed legislation allowing Internet Service Providers to do traffic shaping, giving some priority to certain types of content, which would presumably be either the ISPs' own content or that of ISP customers paying a premium for such access. The U.S. Senate is considering similar legislation, as well as other legislation designed to do exactly the opposite -- guarantee that all data packets receive equal service. The prevailing assumption, by the way, is that right now all packets ARE created equal, which of course they are not.
The Net Neutrality issue rests, in part, on the concept of the Internet as a "best effort" network. Best effort, in the minds of the Internet Engineering Task Force, means something slightly different than we are being told in the general press. It means that all packets are treated equally poorly in that no particular efforts are made to ensure delivery. The Net, itself, performs no packet life-support function. This is in keeping with the concept of the Internet as a dumb network. So even in cases of transport protocols that DO attempt to perform reliable transport (protocols like TCP), those recovery measures are negotiated between the server and the client, not by the network that connects them, simple as that.
But all packets aren't created equal. TCP packets over longer distance connections, for example, are effectively at a disadvantage, because they are more likely to have data loss and require retransmissions, thus expanding their appetites for bandwidth. By the same token, packets of all types that originate on the ISP side of its primary Internet connection have the advantage of functioning in an environment with far greater bandwidth and far fewer hops. Perhaps the best example of this disparity: packets that pass through private peering arrangements, versus those traveling from one backbone provider to another through one of the many NAPs, with their relatively high packet loss.
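The standard back-of-the-envelope model for this effect is the Mathis et al. approximation: sustained TCP throughput is roughly MSS divided by (RTT times the square root of the loss rate). Plugging in illustrative numbers for a clean privately peered path versus a long, lossy path through a NAP makes the disparity obvious; the RTT and loss figures below are assumptions, not measurements.

    # Mathis et al. approximation: throughput ~ MSS / (RTT * sqrt(loss)).
    # The RTT and loss numbers below are illustrative, not measured.
    from math import sqrt

    def tcp_throughput_mbps(mss_bytes, rtt_ms, loss_rate):
        bits_per_sec = (mss_bytes * 8) / ((rtt_ms / 1000.0) * sqrt(loss_rate))
        return bits_per_sec / 1e6

    # Short, clean path (private peering): 20 ms RTT, 0.1% loss
    print(round(tcp_throughput_mbps(1460, 20, 0.001), 1))   # ~18.5 Mbps
    # Long path through a congested NAP: 120 ms RTT, 2% loss
    print(round(tcp_throughput_mbps(1460, 120, 0.02), 1))   # ~0.7 Mbps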
This "to NAP or not to NAP" issue has been with us for a long time. Smaller and poorer ISPs that can't attract peering deals with their larger brethren are stuck with communicating through the NAPs, which requires more time and bandwidth to transfer the same number of data packets successfully. This has long been a marketing point for bigger and richer ISPs. But beyond marketing, this disparity hasn't received much public notice. There are many ISPs that have both private peering and inter-NAP connections, yet whether they send a packet through the NAP or not hasn't been a huge public issue. Perhaps it should be. It has certainly been possible for ISPs to pretty easily put a hurt on packets, and they probably have been doing so, though most pundits assume that we are still living in the good old days.
One thing ISPs supposedly aren't allowed to do is to ban packets completely. If they tried that by, for example, restricting all Internet video or VoIP phone service to a particular provider, the courts would fill with lawyers filing Restraint of Trade lawsuits. So the ISPs take the air carrier approach of not denying passage to anyone, but wanting to give priority boarding to their most loyal frequent fliers. That's the heart of their argument.
But the other position ISPs like to take is that of the common carrier, which supposedly doesn't know the difference between one packet and the next, and is therefore not liable if some of those packets carry kiddie porn or terrorist communications. The ISPs, you see, want it both ways.
And they'll probably get it, because they have the lobbying clout.
But it is important to also keep in mind just how much damage there is to be done here. The biggest culprit in terms of sheer volume of traffic is undoubtedly BitTorrent, but the failure of Net Neutrality isn't going to have much of an effect, if any, on BitTorrent, because if your bootleg copy of The Sopranos arrives 10 minutes or even 10 hours later, is it going to matter all that much? No.
ISP high jinks will have little impact on video and audio downloads.
They won't have much effect, either, on search, because the dominant search vendor -- Google -- is working so hard to optimize backbone routes that inserting one slower hop at the ISP will probably STILL leave Google faster than everyone else. That's part of the reason that Google has been buying up all that fiber.
Where this Net Neutrality issue will hit home is for Voice over IP telephone service, which becomes pitiful if there is too much latency. That's what this is all about, folks: VoIP and nothing else. The telcos want to use it to keep out the Vonages, Skypes, and Packet8s, and the cable companies do, too. It is a $1 trillion global business, so we shouldn't be surprised that the ISPs will do anything to own it, but it isn't about movies or music or even AJAX apps -- at least, not yet. pbs.org |
To: ms.smartest.person who wrote (5049) | 7/1/2006 9:15:01 PM | From: ms.smartest.person | :•) i, Cringely - Net Neutered: Why don't they tell us ending Net Neutrality might kill BitTorrent?
By Robert X. Cringely June 22, 2006
Last week's column was about Bill Gates' announced departure from day-to-day management at Microsoft and a broad view of the Net Neutrality issue. We'll get back to Microsoft next week with a much closer look at the challenges the company faces as it ages and what I believe is a clever and counter-intuitive plan for Redmond's future success. But this week is all about Net Neutrality, which turns out to be a far more complex issue than we (or Congress) are being told.
Net Neutrality is a concept being explored right now in the U.S. Congress, which is trying to decide whether to allow Internet Service Providers to offer tiers of service for extra money or to essentially prohibit them from doing so. The ISPs want the additional income and claim they are being undercompensated for their network investments, while pretty much everyone else thinks all packets ought to be treated equally.
Last week's column pointed out how shallow are the current arguments, which ignore many of the technical and operational realities of the Internet, especially the fact that there have long been tiers of service and that ISPs have probably been treating different kinds of packets differently for years and we simply didn't know it.
One example of unequal treatment is whether packets connect from backbone to backbone through one of the public Network Access Points (NAPs) or through a private peering arrangement between ISPs or backbone providers. The distinction between these two forms of interconnection is vital because the NAPs are overloaded all the time, leading to dropped packets, retransmissions, network congestion, and reduced effective bandwidth. Every ISP that has a private peering agreement still has the right to use the NAPs, and one has to wonder how they decide which packets get the diamond lane and which ones are made to take the bus.
Virtual Private Networks are another example of how packets can be treated differently. Most VPNs are created by ISP customers who want secure and reliable interconnections to their corporate networks. VPNs not only encrypt content, but to a certain extent they reserve bandwidth. But not all VPNs are created by customers. There are some ISPs that use VPNs specifically to limit the bandwidth of certain customers who are viewed as taking more than their fair share. My late friend Roger Boisvert, who ran a pioneering Japanese ISP, found that fewer than 5 percent of his customers at gol.com were using more than 70 percent of the ISP's bandwidth, so he captured just those accounts in VPNs limited to a certain amount of bandwidth. Since then I have heard from other ISPs who do the same.
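A sketch of how such a policy might identify its targets: rank accounts by usage and cap the smallest set that carries most of the traffic. This is not how gol.com actually implemented it -- the column only tells us the outcome -- and the capping mechanism itself (VPN, queueing discipline) is left abstract. The 70 percent threshold mirrors the figure above; the usage numbers are invented.

    # Sketch only: find the small set of accounts responsible for most of the
    # traffic, so they can be placed in a capped pool. Usage data is made up.
    def accounts_to_cap(usage_by_account, traffic_share=0.70):
        """Return the heaviest accounts that together carry `traffic_share`
        of total bytes, heaviest first."""
        total = sum(usage_by_account.values())
        capped, running = [], 0
        for account, used in sorted(usage_by_account.items(),
                                    key=lambda kv: kv[1], reverse=True):
            if running >= total * traffic_share:
                break
            capped.append(account)
            running += used
        return capped

    usage = {"a": 900, "b": 850, "c": 40, "d": 35, "e": 30, "f": 25}  # GB/month
    print(accounts_to_cap(usage))   # ['a', 'b']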
As pointed out last week, though, there is only so much damage that an ISP can do, and most of it seems limited to Voice-over-IP (VoIP) telephone service, where latency, dropouts, and jitter are key and problematic. Since VoIP is an Internet service customers are used to paying extra for (that, in itself, is rare), ISPs want that money for themselves, which is the major reason why they want permission to end Net Neutrality -- if it ever really existed.
The implications of this end to Net Neutrality go far beyond VoIP, though it is my feeling that most ISPs don't know that. These are bit schleppers, remember, and the advantages of traffic shaping are only beginning to dawn on most of them. The DIS-advantages are even further from being realized, though that will start to change right here.
The key question to ask is what impact will priority service levels have on the services that remain, those having no priority? In terms of the packets, giving priority to VoIP ought not to have a significant impact on audio or video downloads because those services are buffered and if they take a little longer, well that's just the price of progress, right? Wrong. Let's look at the impact of priority services on BitTorrent, the single greatest consumer of Internet bandwidth.
Though e-mail and web surfing are both probably more important to Internet users than BitTorrent, the peer-to-peer file-transfer scheme uses more total Internet bandwidth than either, at something over 30 percent. Some ISPs absolutely hate BitTorrent and have moved to limit its impact on their networks by controlling the amount of bandwidth available to BitTorrent traffic. This, too, flies in the face of our supposed current state of blissful Net Neutrality. A list of ISPs that limit BitTorrent bandwidth is in this week's links, though most of them are, so far, outside the United States.
BitTorrent blocking or limiting can be defeated by encrypting the torrents, but that increases overhead, causes a bigger bandwidth hit, and defeats local caching schemes that might help reduce bandwidth demand. So blocking BitTorrent actually makes life worse for all of us, which may be why most U.S. ISPs aren't doing it.
So let's assume that ISPs are allowed to offer tiered services. What impact will that have on BitTorrent? The answer lies in the nature of the TCP/IP protocol. Here is an analysis from a friend who is far more savvy about these things than I am:
"If you look at the amount of overhead TCP needs it's exponential to how slow each connection is; the slower (the connection) the more overhead because the window sizes are smaller and more control packets are being used for verification. And you know what? BitTorrent is FAR WORSE. Remember that for each file you download on BitTorrent you connect to dozens, possibly even hundreds of people, and the slower each of those connections is the more the overhead increases.
"About a month ago the amount of torrents I may (have been) automatically downloading at any given time was between 10 and 30. This means that I was getting no more than 1Kbps from every peer, which meant about half of my bandwidth usage was in BitTorrent protocol overhead and not in downloading file data. I brought this (overhead) down (by 40 percent) by just having five torrent downloads at a time and queuing the rest, and I even got the files faster. I then did some more scheduling and what not to get (my bandwidth use down by a total of 70 percent) and I still downloaded about the same amount of real file data.
"So what happens when everyone's VoIP or other preferred packets get preference over my torrent packets? Since I have no knowledge of the other people's usage in my aggregate network I can't adjust well for changes in the network. The BitTorrent traffic that is going will have exponentially increased overhead due to the slow downs, increasing overall Internet packet overhead (with BitTorrent already 30+ percent of all Internet traffic). Which means that allowing the telco's to subsidize the cost of improving their infrastructure by having preferred packets could exponentially increase the cost accrued by the larger internet and backbone providers just to keep costs down at the aggregate level."
To recap: Giving priority to some traffic puts a hurt on other types of traffic and when that other traffic constitutes more than 30 percent of the Internet, the results can be severe for all of us. On the Internet everything is connected, and you can't easily ignore the impact of one service on another.
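The friend's argument can be caricatured in a few lines of Python. The per-connection control cost below is invented; the only point is the shape of the curve -- spread a fixed amount of bandwidth across more and slower peer connections, and the fraction lost to protocol chatter climbs.

    # Toy model of the overhead argument above. The constants are invented;
    # only the shape of the curve matters: more peers sharing less bandwidth
    # means a larger fraction of bytes spent on protocol chatter, not file data.
    def overhead_fraction(total_kbps, peers, per_peer_control_kbps=1.0):
        """Fraction of bandwidth spent on control traffic when `total_kbps`
        is split across `peers` connections (assumed fixed per-peer cost)."""
        control = peers * per_peer_control_kbps
        return min(control / total_kbps, 1.0)

    for peers in (5, 30, 60):
        print(peers, "peers ->", round(100 * overhead_fraction(100, peers)), "% overhead")
    # 5 peers -> 5 % overhead
    # 30 peers -> 30 % overhead
    # 60 peers -> 60 % overhead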
With this new knowledge I did a simple test that you can do, too. I have a Vicomsoft Internet Gateway that does very fine traffic shaping, though there are many similar products available, like ClarkConnect and some others that are open source. Many routers can do traffic shaping, too. I did a couple of BitTorrent downloads of specific files, measuring how much time and how much total bandwidth were required. Then I deleted those files, changed my Internet Gateway settings to give priority to my Vonage VoIP packets, called my Mom on the phone, and started downloading the same two BitTorrent files.
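If you want to run a comparison like this yourself, the measurement side doesn't require anything fancy. Here is a rough sketch, not my actual Vicomsoft setup: sample the network interface's byte counters over a fixed window while the torrent client runs, once with VoIP priority turned on at the gateway and once with it off, then compare. It assumes the third-party psutil package, and "eth0" is just a placeholder interface name.

import time
import psutil  # third-party package; pip install psutil

def measure(interface: str = "eth0", seconds: int = 600) -> None:
    # Snapshot the interface counters, wait while the torrent runs, then
    # report how many bytes actually crossed the wire in that window.
    start = psutil.net_io_counters(pernic=True)[interface]
    t0 = time.time()
    time.sleep(seconds)
    end = psutil.net_io_counters(pernic=True)[interface]
    elapsed = time.time() - t0
    recv_mb = (end.bytes_recv - start.bytes_recv) / 1e6
    print(f"{recv_mb:.1f} MB received in {elapsed:.0f} s "
          f"({recv_mb * 8 / elapsed:.2f} Mbit/s average)")

if __name__ == "__main__":
    measure()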
Vonage and many other VoIP services use an Analog Telephone Adapter (ATA) device to connect your phone to the network. That ATA can do traffic shaping, too, which requires that it be connected inline between your broadband modem and router. Of course I didn't realize this until AFTER my test was done.
My test results were clear. I had no problem downloading the same BitTorrent files, but it took longer. That was no surprise. After all, I WAS talking to my Mom, which would have taken some bandwidth away from BitTorrent. But the more interesting result was that the total bandwidth required to download the same files using traffic shaping versus not using traffic shaping was almost 20 percent more, which undoubtedly came down to increased BitTorrent overhead due to contention and retransmissions involving the priority VoIP service.
Traffic shaping changes traffic patterns on the local network, especially on aggregate networks that use creative technologies to send Ethernet frames over old telephone and cable infrastructure. It's taken a very long time to get Internet technologies to where they are today, and all the protocols built on top of those technologies operate under certain assumptions. Just as a web site sent over TCP assumes TCP will make sure every packet gets to its destination, rich application protocols like BitTorrent make assumptions of their own, such as expecting familiar patterns in how bandwidth varies on aggregate networks.
Let's say Net Neutrality goes away and the broadband ISPs start offering tiered services. My simple test suggests that one possible impact is that BitTorrent traffic, which currently uses, say, 30 percent of Internet bandwidth, will expand to about 36 percent simply because of inefficiencies created by the tiered services. This will increase backbone costs for ISPs and will take back at least some of the very performance advantage they are supposedly selling to their priority customers.
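The arithmetic behind that 36 percent is nothing fancier than applying the roughly 20 percent overhead bump from my test to BitTorrent's assumed 30 percent slice. A quick sanity check, with both inputs being rough assumptions rather than hard measurements:

bt_share = 0.30         # assumed current BitTorrent share of Internet traffic
overhead_bump = 0.20    # extra bytes observed in my informal shaping test

grown = bt_share * (1 + overhead_bump)          # BitTorrent volume vs. today's total
new_share = grown / (grown + (1 - bt_share))    # its share once the total regrows

print(f"BitTorrent volume: {grown:.0%} of today's total traffic")
print(f"BitTorrent share of the new, larger total: {new_share:.0%}")

Measured against today's traffic it comes out at 36 percent; as a share of the new, slightly larger total it is closer to 34 percent. Either way, it is extra backbone load that buys nobody anything.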
The result of ending Net Neutrality under this scenario, then, is that the ISPs make money from tiered services but with higher overhead costs and lower priority-service levels than one might expect. The ISPs might then try banning BitTorrent to keep it from messing with their tiered services, but we've already established that this can't practically be done on a technical level, because torrent encryption can always get around the ban. The only way, in fact, to limit BitTorrent traffic would be to have it made illegal, and now we're back again to the clueless Congress that started this whole mess.
I don't think these latter ideas are even in the heads of broadband ISPs. They simply haven't thought that far. But eventually, as they try trimming this and expanding that to solve a problem that shouldn't have existed in the first place and can't otherwise be solved, they'll come up with something all of us will hate. I guarantee it. pbs.org |
To: ms.smartest.person who wrote (5050) | 7/1/2006 9:16:22 PM | From: ms.smartest.person | | | :•) i, Cringely - If we build it they will come: It's time to own our own last mile
By Robert X. Cringely June 29, 2006
Bob Frankston is one of the smartest people I speak to. If you don't recognize his name, Bob is best known as the programmer who wrote VisiCalc, the first spreadsheet, realizing the design of his partner, Dan Bricklin. Bob and Dan changed the world forever with VisiCalc, the first killer app. After a career at Lotus and eventually Microsoft, Bob would now like to change the world for the better again, this time by fixing the mess that we call the Internet.
The problem, to Bob's way of thinking, isn't the Internet per se, but the direction powerful political and business forces are attempting to take it. Part of this can be seen in last week's column on Net Neutrality, but Bob takes it further -- a LOT further -- to the point where it becomes logically clear that making almost any regulation specifically to hinder OR HELP the Internet can only make things worse. And by making it worse I mean severely inhibiting the growth of human knowledge, culture, and economic development. It's a choice between freedom and totalitarianism, simple as that.
To Bob the issues surrounding Net Neutrality come down to billability and infrastructure. While saying they are doing us favors, ISPs are really offering us services they can bill for. Nothing is aimed at helping us; everything is aimed at creating a billable event. Take WiFi hotspots, for example. Why should the telephone or cable company care about who connects to my WiFi access point? They are my bits, not the ISP's. I paid for them. If I can download gigabytes of pornography, why can't I share my hotspot with someone walking down the street who wants to check his e-mail? Frankston's analogy: it's like accusing someone of stealing your porch light because he used it to read a street sign.
It isn't about service; it is about creating billable events, that's all. And billable events, by definition, are things we have others do for us because we are unable or unwilling to do them ourselves. So a Verizon or a Comcast does us a favor, they say, by licensing rights to a movie and allowing us to buy or rent it over the Internet. We could buy the rights ourselves, but who would even know where to go? And wouldn't Verizon, as a big buyer, necessarily get a better price? Not really: when you have a preferred or exclusive provider instead of a competitive marketplace, prices are always higher, not lower. In this case the ISP isn't doing us a favor; they are forcing us to buy from them something that we might well be able to buy from someone else for a lot less.
But they need the money! After all, they spent billions bringing broadband to our homes in the first place. Don't they deserve to be paid back for that huge investment?
My Internet service isn't free; is yours? I'm paying Comcast every month, and from what I can glean from the company's annual report, they seem to be making a profit from my business. Is it enough of a profit? Well, they'd always like more, but the current return must be good enough, because they keep my bits flowing.
To Bob Frankston's way of thinking this all comes down to who owns the infrastructure. The phone and cable companies own the wire outside our homes, but we own the wire inside. (It didn't use to be that way, you know. There was a time when the phone company owned the wire in our walls even though we paid for its purchase and installation.) The Internet has been a huge success to date specifically because nobody much controls the electrons. This is as opposed to services like broadcasting, where some perceived scarcity of spectrum allowed governments to determine who could give or sell us entertainment and information. The ISPs (by which I mean the telcos and cable companies) would very much like to go back to that sort of system, where they, not you, are the provider and the determinant of which bits are good and which bits are bad.
No thanks.
Frankston points out that we build and finance public infrastructure in a public way using public funds with the goal of benefiting economic, social, and cultural development in our communities. So why not do the same with the Internet, which is an information infrastructure? Well we did that, didn't we, with the National Information Infrastructure program of the 1990s, which was intended to bring fiber straight to most American homes? About $200 billion in tax credits and incentives went primarily to telephone companies participating in the NII program. What happened with that? They took the money, that's what, and gave us little or nothing in return.
But just because the highway contractor ran off with the money without finishing the road doesn't mean we can go without roads. It DOES mean, however, that we ought not to buy another road from that particular contractor.
The obvious answer is for regular folks like you and me to own our own last mile Internet connection. This idea, which Frankston supports, is well presented by Bill St. Arnaud in a presentation you'll find among this week's links. (Bill is senior director of advanced networks with CANARIE, which is responsible for the coordination and implementation of Canada's next generation optical Internet initiative.) The idea is simple: run Fiber To The Home (FTTH) and pay for it as a community of customers -- a cooperative. The cost per fiber drop, according to Bill's estimate, is $1,000-$1,500 if 40 percent of homes participate. Using the higher $1,500 figure, the cost to finance the system over 10 years at today's prime rate would be $17.42 per month.
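That $17.42 is just the standard amortized-loan formula applied to a $1,500 drop over 120 months. The column doesn't say exactly what rate St. Arnaud plugged in; assuming roughly a 7 percent annual rate reproduces the figure almost exactly (at mid-2006's higher prime rate it would come out a dollar or so more):

def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    # Standard fixed-payment loan amortization.
    r = annual_rate / 12              # monthly interest rate
    n = years * 12                    # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

print(f"${monthly_payment(1500, 0.07, 10):.2f} per month")   # ~$17.42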
What we'd get for our $17.42 per month is a gigabit-capable circuit with no bits inside - just a really fast connection to some local point of presence where you could connect to ANY ISP wanting to operate in your city.
"It's honest funding," says Frankston. "The current system is like buying drinks so you can watch the strippers. It is corrupt and opaque. We should pay for our wires in our communities just like we pay for the wires in our homes."
The effect of this move would be beyond amazing. It would be astounding. No more arguments about Net Neutrality, for one thing, because we'd effectively be extending our ownership and control of the wires all the way to the ISP interconnect. Of course you'd still have to buy Internet service, but at NerdTV rates the amount of bandwidth used by a median U.S. broadband customer would be less than $2.00 per month. Though with that GREAT BIG PIPE most of us would be tempted to use a lot more bandwidth, which is exactly the point.
There would be a community-financed Internet revolution and this time, because it would be locally funded and managed, very little money would be stolen. Dark fibers would be lighting up all over America, telco capital costs would plummet, and a truly competitive market for Internet services would emerge. In 2-3 years whatever bandwidth advantage countries like Korea have would be erased and we'd be back on track building even more innovative online industries.
This would be a real marketplace, not a fake one. Today's system is fake because it depends on capturing the value of the application -- communications -- inside the transport. That would no longer be possible, because with the Internet the value is created OUTSIDE the network.
"One example of the collateral damage caused by today's approach is the utter lack of simple wireless connectivity. Another is that we have redundant capital-intensive bit paths whose only purpose is to contain bits within billing paths," Frankston explains. "In practice, the telcos are about nothing at all other than creating billable events. Isn't it strange that as the costs of connectivity were going down your phone bill was increasing -- at least until VoIP forced the issue."
"We have an alternative model in the road system: The roads themselves are funded as infrastructure because the value is from having the road system as a whole, not the roads in isolation. You don't put a meter on each driveway. Tolls, fuel taxes, fees on trucks, etc. are ways of generating money but they are indirect. Local builders add capacity; communities add capacity and large entities create interstate roads. They don't create artificial scarcity just to increase toll revenues -- at least not so blatantly."
"I refer to today's carrier networks as trollways because the model is inverted -- the purpose of the road is to pass as many trollbooths as possible. We keep the backbone unlit to assure artificial scarcity. Worse, by trying to force us within their service model we lose the opportunity to create new value and can only choose among the services that fill their coffers -- it's hard to come up with a more effective way to minimize the value of the networks."
A model in which the infrastructure is paid for as infrastructure -- privately, locally, nationally, and internationally -- can create a true marketplace in which the incentives are aligned. Instead of the strange phenomenon of carriers spending billions and then arguing that they deserve to be paid, we'd have them bidding on contracts to install and/or maintain connectivity for a marketplace that buys capacity and makes it available, so value can be created without having to be captured within the network and thus taken out of the economy.
So why not do it? Well the telcos and cable companies would hate it. Who made them gods?
My recent discussion with Bob Frankston started with talk about Microsoft and what that company might do to turn itself around. "Microsoft seems to confuse end-to-end with womb-to-tomb," Bob said. "Or at least BillG did the last time I tried to speak to him about it. The problem Microsoft has is that it hasn't really given people enough opportunity to add value to the computing. Ironically, Google's APIs and mashups go more in this direction and I do need to give Ray (Ozzie) some credit for joining in this trend. The challenge will be reconciling that with the monolithic platform company. .Net, a stupid name for a great idea, could do very well if liberated from Windows."
So what's a Microsoft to do? Concentrate less on womb-to-tomb and more on end-to-end by embracing the idea of community-owned networks. One billion dollars each in seed capital from Microsoft, AOL, Yahoo, and Google would be enough to set neighborhood network dominos falling in communities throughout America with no tax money ever required. And they'd get their money back, both directly and indirectly, many times over.
Microsoft could go it alone, but the point would have to be to build a market, not to control the last mile, and I think the temptation to fall back on old habits would be less with a consortium involved.
But this leads us to the promised question: what else might Microsoft do as it moves forward into an uncertain future? Well, the one thing they aren't doing (hardly any companies do) is plan for that uncertainty. I have a plan.
Frank Gaudette, when he was Microsoft's first-ever chief financial officer, told me that he hated having all that cash lying around because it was a drag on earnings. In the money markets he could make at most a few percent per year. Investing in Microsoft's own products was yielding more than a 50 percent annual return. The problem was that Microsoft was making so much money then (and now, frankly) that they couldn't spend it all on their core business.
Where Gaudette saw a problem, I see opportunity: spend it on something else.
In a sense Microsoft is a lot like the Roman Empire. The Roman Empire's growth and economy were driven by conquering and plundering neighboring regions. Within the Empire the Romans created a sort of safe economic zone where commerce could work and technology could be developed. That came at a price, however, as they tended to destroy everything outside the Empire as it grew.
Same for Microsoft, whose leaders were greedy and made a number of good, shrewd business decisions. They were also ruthless. Over time they managed to destroy the surrounding software industry. Within Microsoft's world was a sort of safe economic zone. If you were not a threat to Microsoft or if you did something Microsoft didn't want to do (like make PCs) you were able to grow under the shadow of Redmond. When the emperor spoke, you listened.
It is too early to predict the fall of the Microsoft Empire. Does Microsoft have the leaders and generals who can lead the company into the future? Who knows? In the software world there is nothing else to conquer or plunder. In other markets it will be hard, if not impossible, for Microsoft to dominate whole industries as it has in the past. Microsoft now needs to act like a responsible company, work well with others, and grow through cooperation and teamwork. This will be hard for Microsoft. The Romans couldn't do it. The Romans neglected one of their "partners" and eventually that partner did them in.
Today's Microsoft is a great generator of cash. With some good product refreshes, this cash generation can continue for years to come. The BIG decision is what to do with the cash. Microsoft needs to develop new businesses. Microsoft could have a great future doing things that have nothing to do with computers. They could be making a great electric car, or great new medications, or any number of other things. Microsoft could create new industries that could have a huge benefit to the economy. Microsoft could change the world, again. Ten years from now Microsoft could be a huge holding company of which PC software is but one part. They don't have to gut the software unit, which is viable enough to be a great moneymaker for another 25 years if Microsoft manages it well.
Right now Microsoft is like a deer in the headlights. They are stuck on software and computer stuff. They can't move. There are much more interesting growth opportunities out there.
And you know, there is a really simple way to proceed. Warren Buffett announced this week that he's giving $30+ billion to the Bill and Melinda Gates Foundation to continue its good work of curing diseases so we'll be around to buy more computers. Buffett is the best builder of holding companies in the history of industry. The simple answer for Microsoft is to give Buffett's Berkshire Hathaway half of Microsoft's excess cash flow every year. This year that would be about $6 billion. With Berkshire's shift toward international investing, they'd find productive places for that money.
Eventually Microsoft's value might be mainly in its Berkshire shares, which would in turn greatly increase the value of Buffett's gift to the Gates Foundation. It seems only fair. pbs.org |