Technology Stocks : Apple Inc.

To: Stock Puppy who wrote (209214)7/24/2020 4:13:31 PM
From: clochard1
Back in the day many successful and promising names dropped the ball. Burroughs, DEC, Motorola, TI, Commodore, Atari, and others failed their customers because they were full of themselves.

IBM, Intel, and Microsoft won the wars through shrewd sales and reliable hardware, but they cost the world billions by wasting their customers' time with their inferior software ecosystems. I did software development in those environments because that's where the money was.

To: Stock Puppy who wrote (209214)7/24/2020 7:00:56 PM
From: Doren
It was my impression that the PPC/RISC chips Apple used just ran too damn hot. I melted one in my dual-gig; part of that was Apple's stone-stupid cooling design (exhaust on the bottom, chip on the top, video card blocking any cool air from getting to the daughter card), but they were hot enough to cook eggs. I'd avoid using it on very hot days. Clock speeds had gotten high enough that Apple traded performance for cooler chips, particularly in notebooks, which were replacing desktops in large numbers.

But my 2.8 GHz i7 runs damn hot too, and I worry about that a lot. Apple under Ive seemed too obsessed with thinness to address the heat issue. I also worry that dust bunnies inside my iMac block the airflow.

It seems to me heat is still an issue, and I believe it's a function of the size of the etchings, so a smaller process (fewer nanometers) means a cooler chip. Plus it's obviously going to be better for Apple to write code for one chip than two.

I would think it has to be Intel's architecture that is holding them back. The chip foundries buy their machines from third parties, don't they? If so, one foundry should be able to make the same-sized channels as another, unless there are just too many channels, zigzags, and whatever else lives at the nanometer level.