 Technology Stocks | AMD, ARMH, INTC, NVDA


To: FUBHO who wrote (5772) 4/27/2012 1:56:24 PM
From: neolib
of 12105
 
Looks like Samsung is going to be more nimble than GF at 28nm. Given that they are converting memory fabs to LSI, I wonder if they will try to pick up ProMOS, which was rumored to be of interest to GF. I don't recall seeing any conclusion to that supposed sale. The rumored price was well under $1B, which for a 12" fab is a pretty good deal if it can be converted to leading-edge logic for not too much more.
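To put very rough numbers on why that could be attractive (every figure below is an illustrative assumption, not a reported price):

```python
# Back-of-envelope: buying and converting an existing 12" fab vs. building
# a new leading-edge logic fab. All dollar figures are illustrative
# assumptions for the 2012 era, not reported numbers.

new_fab_cost    = 5.0e9   # assumed cost of a brand-new leading-edge 12" fab
purchase_price  = 0.8e9   # "well under $1B" for the existing fab
conversion_cost = 2.0e9   # assumed retooling cost to leading-edge logic

converted_total = purchase_price + conversion_cost

print(f"Buy + convert: ${converted_total / 1e9:.1f}B")
print(f"Build new:     ${new_fab_cost / 1e9:.1f}B")
print(f"Difference:    ${(new_fab_cost - converted_total) / 1e9:.1f}B")
```

Even with a multi-billion-dollar conversion budget, the total would plausibly come in under the cost of a greenfield fab, which is the whole appeal.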


From: Joe Kerr 4/27/2012 5:23:42 PM
of 12105
 
Some stories on GF enabling stacking at the 20nm node:

theinquirer.net 
fudzilla.com 


To: Joe Kerr who wrote (5774) 4/27/2012 5:34:11 PM
From: FUBHO
of 12105
 
Fudzilla is confusing chip stacking using TSVs with 3D transistors. Not the same thing...



To: FUBHO who wrote (5775) 4/27/2012 7:33:14 PM
From: rzborusa
of 12105
 
TSVs are sort of reminiscent of the "local interconnects" that AMD may have pioneered: drilling through rather than wrapping around the edge to connect different layers. Stacking should play well with ASICs and add-ons from smaller designers.


From: bit3 4/27/2012 11:45:48 PM
of 12105
 
Overclocked Ivy Bridge chips run much hotter than Sandy Bridge


geek.com 


From: FUBHO 4/29/2012 1:54:14 PM
of 12105
 
Apple’s inevitable path to a post-PC era

By Geoffrey Goetz, Apr. 28, 2012, 12:00pm PT


gigaom.com 
There is definitely some sort of Zen that makes Apple, well, Apple: taking something obvious and making it somehow better, somehow cooler, somehow new. How do they do it? By observing how consumers interact with technology and experimenting ad nauseam internally, until they get it exactly the way they want it. This includes abandoning the past if it no longer makes any sense. So when Apple’s own Tim Cook declares that merging a refrigerator and a toaster is not good for the consumer, he may just have a point.

But as forward-thinking as the company is, perhaps Apple hasn’t created a new path at all. Through a technique of observe, perfect and discard, Apple has been heading for some time now in one direction — along the pre-defined path into the era of ubiquitous computing.

Ubiquitous computing defined

In some ways, this path is as logical as Moore’s Law. Look at the history of computing — from the mainframe era, where there was one computer for many consumers, to the personal computing era, where there was one computer for each consumer, to this new era, where there are many computers for each consumer — and compare the number of computer chips to the number of consumers using those chips. At its foundation, ubiquitous computing can be summed up by this simple principle of ratios.
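A minimal sketch of that ratio principle, with purely illustrative device counts (only the direction of the trend matters):

```python
# Chips-per-consumer ratio across computing eras. The counts are
# illustrative placeholders, not census data.

eras = [
    ("Mainframe era",             1, 1000),  # one computer shared by many users
    ("Personal computing era",    1,    1),  # one computer per user
    ("Ubiquitous computing era", 10,    1),  # phone, tablet, TV box, car, ...
]

for name, chips, people in eras:
    print(f"{name}: {chips / people:g} chips per person")
```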

The modern concept of ubiquitous computing came from Mark Weiser in 1988, at the Computer Science Lab at Xerox PARC (sound familiar?). The theory proposed a seamless, almost invisible connection between consumers and computers that would help drive the ratio from one computer for many people to many computers for one person.



Apple’s tabs, pads and boards

Even considering the most radical interpretation of ubiquitous computing, smart dust, the main point has remained the same: we will soon be overrun by computer chips. There are, however, three very distinct platforms in this well-defined post-PC era that we have all become accustomed to, not unlike the three platforms we see evolving within the iOS ecosystem today:

Gesturing tabs: Mobile technology already had small chips, powerful batteries, geolocation services and wireless networking. But that was not enough to win over the masses and drive us all to purchase multiple computing devices. It was the way consumers interacted with these smaller devices that needed to change. For a long time, it was thought that voice recognition was going to propel us into the next era of computing, but that never happened.

Leveraging the fact that there were approximately 100 million iPod users, Apple was able to use convergence to its advantage as it introduced those users to a series of simple touch-based gestures on a nearly buttonless device. In the early years of the iPod, we were all trained on the scroll wheel. With touch-based gestures on a wide-open screen, this paradigm was taken one step further. Just as the mouse accompanied the transition from the terminal-based mainframe era to the PC era, the post-PC era was ushered in by a new way of interacting with computer chips: touch.



Revolutionary pads: As soon as people became familiar with this new way of interacting with computers, it was time to challenge the personal computer paradigm directly. Netbooks attempted to continue the personal relationship with consumers by maintaining the 1:1 ratio. Tablets such as the iPad are more specialized and were never meant to be a total replacement for a traditional, general-purpose personal computer.

The rapid rise and immediate success of the iPad was proof positive that consumers were ready for a third major computing device in their lives. With “pads” being used by pilots, students and doctors, in restaurants and kitchens, and at work, the iPad was proving to be a specialized, place-based appliance rather than a general-purpose personal computer. As powerful as the third-generation iPad is, it will never replace the personal computer, just as the personal computer never really replaced the mainframe.



Experimental boards: The current AppleTV may be a stretch to accept as a computing platform, as it has no keyboard, no mouse, no touch display, and only a very simple IR remote. That is, unless you happen to be near one with a Mac or iOS device; then the AppleTV becomes an extension of that device on a much larger screen. Although it is marketed alongside the iPod, it is just as closely related to the AirPort Express. Perhaps Apple needs to look toward Nintendo’s Wii or Microsoft’s Kinect; otherwise the AppleTV will be doomed to being just an accessory to its tabs and pads.

Take a look at what HBO has done with the Xbox Kinect as an example. If Apple’s recently awarded gesture-based patents are any indicator, this may be where it is headed as well. The interaction between consumer and computer chip has not yet been ironed out enough for this final platform — the boards of ubiquitous computing — to take hold of our day-to-day lives.

One human relation-chip

Making each device “aware” of how consumers use all of the other devices they own is the key to accelerating the adoption of more than one computing device. While Apple may in fact be the only company in the world to have constructed a homogeneous synergy between its personal and its ubiquitous computing platforms, it is certainly not the only company trying to forge the relationship between the user and the computer chip. For the relationship between consumers and computing devices to become truly invisible, these new smart devices will need to know more and more about the consumers who own them: everything consumers have done in the past, what they are doing now, and even what they plan on doing later.



Perhaps this is the reason Tim Cook stated that Apple’s “best years lie ahead of us.” With technologies like iCloud and Siri, Apple will likely play a larger and larger role in forging the relationship between consumers and the growing number of computing devices in our daily lives. It is not about selling more of these individual devices; it is all about enabling the relationship between an individual and a collection of specialized devices. And Apple knows this.



To: FUBHO who wrote (5778) 4/29/2012 6:12:51 PM
From: neolib
of 12105
 
The Borg cometh...


To: neolib who wrote (5779) 4/29/2012 7:01:33 PM
From: FUBHO
of 12105
 
China plans national, unified CPU architecture

By Sebastian Anthony on April 27, 2012 at 7:35 am
http://www.extremetech.com/computing/127791-china-plans-national-unified-cpu-architecture


According to reports from various industry sources, the Chinese government has begun the process of picking a national computer chip instruction set architecture (ISA). This ISA would have to be used for any projects backed with government money — which, in a communist country such as China, is a fairly long list of public and private enterprises and institutions, including China Mobile, the largest wireless carrier in the world. The primary reason for this move is to lessen China’s reliance on western intellectual property.

There are at least five existing ISAs on the table for consideration — MIPS, Alpha, ARM, Power, and the homegrown UPU — but the Chinese leadership has also mooted the idea of defining an entirely new architecture. The first meeting to decide on a nationwide ISA, attended by government officials and representatives from academic groups and companies such as Huawei and ZTE, was held in March. According to MIPS vice president Robert Bismuth, a final decision will be made in “a matter of months.”

China has a long history with MIPS and Alpha. Loongson processors, which power millions of Chinese school computers, use MIPS — and the ShenWei processors found in China’s first homegrown supercomputer, the Sunway Bluelight MPP, are based on the Alpha ISA. MIPS Technologies (the company) hasn’t been doing very well recently, and it’s rumored that the Sunnyvale-based company could be up for sale — a purchase I’m sure the Chinese government could afford.

According to EE Times, there are some 34 ARM licensees in China, but at $5 million for a single Cortex-A9 core license, it’s unlikely that ARM will be China’s choice. The Power ISA is cheaper, but lacks the software ecosystems that ARM and MIPS enjoy. ShenWei/Alpha is also a possibility, but again it cannot compete with MIPS’ installed base.
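Reading that paragraph as a rough decision matrix makes the trade-off concrete. In the sketch below, the weights and the 0-10 scores are entirely assumed, loosely following the article's qualitative comparison, not data from it:

```python
# Toy weighted scoring of the candidate ISAs named above. Weights and
# scores are illustrative assumptions, not figures from the article.

weights = {"license_cost": 0.4, "ecosystem": 0.35, "installed_base": 0.25}

candidates = {
    "MIPS":  {"license_cost": 8, "ecosystem": 7, "installed_base": 9},
    "ARM":   {"license_cost": 2, "ecosystem": 9, "installed_base": 6},
    "Power": {"license_cost": 6, "ecosystem": 4, "installed_base": 2},
    "Alpha": {"license_cost": 7, "ecosystem": 3, "installed_base": 4},
}

def score(factors):
    """Weighted sum of a candidate's factor scores."""
    return sum(weights[f] * v for f, v in factors.items())

for isa, factors in sorted(candidates.items(), key=lambda kv: -score(kv[1])):
    print(f"{isa:5s} -> {score(factors):.2f}")
```

Under these assumed weights MIPS comes out ahead, which matches the article's reading of the field.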

The other option, of course, is developing a brand new ISA — a daunting task, considering you have to create an entire software (compilers, developer tools, apps) and hardware (CPU, chipset, motherboard) ecosystem from scratch. But there are benefits to building your own CPU architecture. China, for example, could design an ISA (or microarchitecture) with silicon-level monitoring and censorship — and, of course, a ubiquitous, always-open backdoor that can be used by Chinese intelligence agencies. The Great Firewall of China is fairly easy to circumvent — but what if China built a DNS and IP address blacklist into the hardware itself?

Taking a leaf out of South Korea’s hardcore gaming scene, what if the Chinese government decided to implement a hardware-level 10pm curfew for video games? Or some code that automatically turns negative mentions of Hu Jintao (the Chinese president) into positives, and inserts a few honorifics at the same time. Or a latent botnet of hundreds of millions of computers that can be activated upon the commencement of World War III. Or, or, or…
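Purely as a thought experiment, here is what that kind of baked-in policy logic looks like when expressed in software. Everything in this sketch (hostnames, times, behavior) is hypothetical:

```python
from datetime import time

# Toy model of the hypothetical silicon-level policy engine imagined
# above: a built-in hostname blacklist plus a 10pm-6am gaming curfew.
# A thought experiment only; no real hardware is described here.

BLACKLISTED_HOSTS = {"blocked.example.com"}        # hypothetical entries
CURFEW_START, CURFEW_END = time(22, 0), time(6, 0)

def dns_allowed(hostname: str) -> bool:
    """A hardware DNS filter would drop lookups for blacklisted names."""
    return hostname not in BLACKLISTED_HOSTS

def gaming_allowed(now: time) -> bool:
    """A hardware curfew would gate game workloads overnight."""
    return not (now >= CURFEW_START or now < CURFEW_END)

print(dns_allowed("blocked.example.com"))  # False: filtered below the OS
print(gaming_allowed(time(23, 30)))        # False: inside the curfew window
```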



From: neolib 4/30/2012 9:18:36 AM
of 12105
 
theinquirer.net



To: THE WATSONYOUTH who wrote (5748) 4/30/2012 10:16:19 AM
From: FUBHO
of 12105

New materials compete with graphene for the future of electronics
April 30, 2012


Scanning tunneling microscope image of the 2D honeycomb structure of silicene (credit: Patrick Vogt/TU Berlin)


Two research groups have invented two new materials that may compete with graphene as the solution for faster, more powerful electronic devices of the future.

MIT researchers have created a thin film of bismuth-antimony that allows electrons to “travel like a beam of light” — hundreds of times faster than in conventional silicon chips. In thermoelectric generators and coolers, the faster electron flow (and the material’s ability to function as an insulator) might lead to much more efficient power production. The new material could even allow electronic devices to be made of a single material with varying properties, deposited one layer atop another, rather than from layers of different materials.

Researchers at the Technical University of Berlin in Germany and Aix-Marseille University in France created silicene by condensing silicon vapor onto a silver plate to form a single layer of atoms, New Scientist reports. The new material may lead to smaller, cheaper electronic devices than graphene because it can be integrated more easily into silicon chip production lines.

Meanwhile, for solar cells, Michigan Technological University materials scientists have discovered that adding graphene to the titanium dioxide in dye-sensitized solar cells increases the current by more than 50%. Dye-sensitized solar cells don’t rely on rare or expensive materials, so they could be more cost-effective than cells based on silicon and thin-film technologies.

Ref.: Patrick Vogt et al., Silicene: Compelling Experimental Evidence for Graphenelike Two-Dimensional Silicon, Physical Review Letters, 2012, DOI: 10.1103/PhysRevLett.108.155501

Ref.: Shuang Tang et al., Constructing Anisotropic Single-Dirac-Cones in Bi1–xSbx Thin Films, Nano Letters, 2012, DOI: 10.1021/nl300064d

Ref.: Yun Hang Hu, Application of Graphene for Solar Cells, US-Egypt Joint Workshop on Solar Energy Systems, 2012 [PDF]
