Technology Stocks : New Technology


From: FJB  12/14/2018 2:51:33 AM
   of 410
 
RISC-V Will Stop Hackers Dead From Getting Into Your Computer

hackaday.com

The greatest hardware hacks of all time were simply the result of finding software keys in memory. The AACS encryption debacle — the 09 F9 key that allowed us to decrypt HD DVDs — was the result of encryption keys just sitting in main memory, where they could be read by any other program. DeCSS, the hack that gave us all access to DVDs, was again the result of encryption keys sitting out in the open.

Because encryption doesn’t work if your keys are just sitting out in the open, system designers have come up with ingenious solutions to prevent hackers from accessing these keys. One of the best solutions is the hardware enclave, a tiny bit of silicon that protects keys and other sensitive information. Apple has an entire line of chips, Intel has hardware extensions, and all of these are black-box solutions. They do work, but we have no idea whether they contain vulnerabilities. If you can’t study it, it’s just an article of faith that these hardware enclaves will keep working.

Now, there might be another option. RISC-V researchers are busy creating an Open Source hardware enclave: a project to build secure enclaves that store cryptographic keys and other secret information, designed in a way that can be accessed and studied. Trust but verify — and that’s why this is the most innovative hardware development of the last decade.

What is an enclave?

Although they may seem like a new technology, processor enclaves have been around for ages. The first one to reach the public consciousness would be the Secure Enclave Processor (SEP) found in the iPhone 5S. This generation of iPhone introduced several important technological advancements, including Touch ID, the innovative M7 motion coprocessor, and the SEP security coprocessor itself. The iPhone 5S was a technological milestone, and the then-new SEP stored fingerprint data and cryptographic keys beyond the reach of the main SoC found in the iPhone.

The iPhone 5S SEP was designed to perform secure services for the rest of the SoC, primarily relating to the Touch ID functionality. Apple’s use of a secure enclave processor was extended with the 2016 release of the Touch Bar MacBook Pro and its Apple T1 chip. The T1 chip was again used for Touch ID functionality, and demonstrates that Apple is the king of vertical integration.

But Apple isn’t the only company working on secure enclaves for its computing products. Intel has developed the SGX extension, which allows for hardware-assisted security enclaves. These enclaves give developers the ability to hide cryptographic keys and the components for digital rights management inside a hardware-protected bit of silicon. AMD, too, has hardware enclaves with Secure Encrypted Virtualization (SEV). ARM has Trusted Execution Environments. While the Intel, AMD, and ARM enclaves are bits of silicon on other bits of silicon — distinct from Apple’s approach of putting a hardware enclave on an entirely new chip — the idea remains the same. You want to put secure stuff in secure environments, and enclaves allow you to do that.

Unfortunately, while these hardware enclaves do present a small attack surface, there are problems. AMD’s SEV is already known to have serious security weaknesses, and it is believed SEV does not offer protection from malicious hypervisors, only from accidental hypervisor vulnerabilities. Intel’s Management Engine, while not explicitly a hardware enclave, has been shown to be vulnerable to attack. The problem is that these hardware enclaves are black boxes, and security through obscurity does not work at all.

The Open Source Solution

At last week’s RISC-V Summit, researchers at UC Berkeley released their plans for the Keystone Enclave, an Open Source secure enclave based on RISC-V (PDF). Keystone is a project to build a Trusted Execution Environment (TEE) with secure hardware enclaves based on the RISC-V architecture, the same architecture that’s going into completely Open Source microcontrollers and (soon) Systems on a Chip.

The goal of the Keystone project is to build a chain of trust, starting from a silicon Root of Trust stored in tamper-proof hardware. This leads to a zeroth-stage bootloader and a tamper-proof platform key store. Defining a hardware Root of Trust (RoT) is exceptionally difficult: you can always decapsulate silicon, you can always perform some sort of analysis on a chip to extract keys, and if your supply chain isn’t managed well, you have no idea whether the chip you’re basing your RoT on is actually the chip in your computer. However, because RISC-V’s HDL is Open Source, this RoT can at least be studied, unlike the black-box solutions from Intel, AMD, and ARM.

The current plans for Keystone include memory isolation, an open framework for building on top of this security enclave, and a minimal but Open Source solution for a security enclave.

Right now, the Keystone Enclave is testable on various platforms, including QEMU, FireSim, and on real hardware with the SiFive Unleashed. There’s still much work to do, from formal verification to building out the software stack, libraries, and adding hardware extensions.

This is a game changer for security. Silicon vendors and designers have been shoehorning hardware enclaves into processors for nearly a decade now, and Apple has gone so far as to create its own enclave chips. All of these solutions are black boxes, and there is no third-party verification that these designs are correct. The RISC-V project is different, and the Keystone Enclave is the best chance we have of creating a truly Open hardware enclave that can be studied and verified independently.



From: FJB  12/15/2018 7:24:12 PM
 


Serverless Computing: One Step Forward, Two Steps Back

Joseph M. Hellerstein, Jose Faleiro, Joseph E. Gonzalez, Johann Schleier-Smith, Vikram Sreekanti, Alexey Tumanov and Chenggang Wu UC Berkeley {hellerstein,jmfaleiro,jegonzal,jssmith,vikrams,atumanov,cgwu}@berkeley.edu

ABSTRACT

Serverless computing offers the potential to program the cloud in an autoscaling, pay-as-you-go manner. In this paper we address critical gaps in first-generation serverless computing, which place its autoscaling potential at odds with dominant trends in modern computing: notably data-centric and distributed computing, but also open source and custom hardware. Put together, these gaps make current serverless offerings a bad fit for cloud innovation and particularly bad for data systems innovation. In addition to pinpointing some of the main shortfalls of current serverless architectures, we raise a set of challenges we believe must be met to unlock the radical potential that the cloud—with its exabytes of storage and millions of cores—should offer to innovative developers.

arxiv.org



From: retrodynamic  2/3/2019 5:27:39 PM
 
Featured Project Development - State of the Art Novel InFlow Tech; ·1-Gearturbine RotaryTurbo:

*Wordpress Blog State of the Art Novel InFlow Gearturbine Imploturbocompressor

-YouTube: * Atypical New • GEARTURBINE / Retrodynamic = DextroRPM vs. LevoInFlow + Yin Yang thrust-way type - no waste losses

*1-GEARTURBINE BEHANCE Images Gallery Portfolio

·1; Rotary-Turbo-InFlow Tech / GEARTURBINE PROJECT: based on the same basic principle as the aeolipile, Heron of Alexandria's steam turbine device (10-70 AD). * With Retrodynamic = DextroRPM vs. LevoInFlow + Yin Yang power type - no waste losses. *8X/Y thermodynamic cycle steps. 4 turbos; higher efficiency percentage. No blade erosion by sand and a very low heat signature. Patent: 197187 IMPI MX, Dec 1991. An atypical motor/engine type.

*Note: Sorry, I didn't see the earlier post.








From: FJB  2/12/2019 9:27:45 AM
 
TSMC to move 7nm EUV process to volume production in March
Monica Chen, Hsinchu; Jessie Shen, DIGITIMES
Tuesday 12 February 2019




Taiwan Semiconductor Manufacturing Company (TSMC) is expected to kick off volume production of chips built using an enhanced 7nm node with EUV at the end of March, according to industry sources.

ASML, which provides extreme ultraviolet (EUV) litho equipment, is looking to ship a total of 30 EUV systems in 2019. Of the units to be shipped, 18 have already been reserved by TSMC, the sources indicated.

TSMC is also on track to move a newer 5nm node to risk production in the second quarter of 2019, the sources noted. The foundry will fully incorporate EUV in the 5nm node.

TSMC CEO CC Wei previously disclosed that the foundry expects to start taping out 5nm chip designs later in the first half of 2019 and to move the node to volume production in the first half of 2020.

Meanwhile, TSMC continues to expand its 7nm chip client portfolio. Wei noted that the foundry's 7nm chip client portfolio is "growing stronger" as more chip designs for applications such as HPC and automotive demand the process.

TSMC started volume producing 7nm chips in April 2018, with AMD, Apple, HiSilicon and Xilinx reportedly being among its major 7nm chip customers. The foundry will soon roll out its EUV-based 7nm process, which will boost its total 7nm chip sales to account for 25% of total wafer sales this year compared with 9% in 2018.



To: FJB who wrote (327)  2/25/2019 6:09:21 AM
From: Kate Jenkins
 
Climate change is inevitable. We can see it every day, all over the world. I wrote about this topic in more detail in my prime essay writing. It was difficult to describe, but I did it.



To: Kate Jenkins who wrote (366)  2/25/2019 6:12:55 AM
From: FJB
 
RE: Climate change is inevitable.

Climate change has been going on for 4 BILLION YEARS; modern humans have only existed for about 100k years, and the Industrial Revolution began roughly 250 years ago.

Open a book, or wikipedia, and never post here again. Your ignorance makes me sick.



From: FJB  2/26/2019 11:46:09 AM


From: FJB  2/28/2019 8:33:12 AM
 
Chip Roadmap Looks Dark, Bumpy

eetimes.com

SAN JOSE, Calif. — The semiconductor roadmap could extend a decade to a 1-nm node or it could falter before the 3-nm node for lack of new resist chemistries. Those were some of the hopes and fears that engineers expressed at an evening panel session at an annual lithography conference here.

The session was intended as a lighthearted send-up of the long-predicted death of Moore’s Law. It also showed the disturbing uncertainties that are natural outgrowths of the many challenges perpetually appearing on the path to next-generation chips.

Today, Samsung has started production of 7-nm devices using extreme ultraviolet lithography. TSMC expects to ramp its 7-nm-plus node using EUV by June. ASML aims to serve both with a 2019 upgrade of its EUV system, the 3400C, promising throughput of 170 wafers/hour and 90+% availability.

One of the next big challenges is brewing more sensitive resist materials for the 3-nm node. Today’s chemically amplified resists (CARs) “are OK for the current and maybe next generation, but we’d like new platforms,” said Tony Yen, a vice president at ASML.

Yen pointed to the long history of CARs dating back to the 1980s and 248-nm lithography. “It’s about time we put more emphasis in new platforms like molecular resists,” Yen said.

With a total market for the crucial chemicals valued at less than a billion dollars a year, “the model needs to change,” he added. “Development could be done in a pre-competitive place and then licensed to commercial resist vendors.”

Ryan Callahan from resist maker FujiFilm disagreed. “There is great competition to secure the business because those who are first will succeed and others will be gone … [but with the] market getting smaller as some [such as GlobalFoundries] abandon EUV, resist suppliers won’t do consortia for developing together,” he said.


ASML plans to release an upgrade of its current EUV system this year. (Source: ASML)
In an effort to jumpstart work on resists for next-generation EUV systems, imec and laser specialist KMLabs announced that they will form a so-called AttoLab. It will try to characterize how resists absorb and ionize photons in time frames measured in pico- and attoseconds.

“We will learn how to see the fine detail of radiation chemistry, working with suppliers to find new materials to take us to the next level … We will also look at quantum phenomena … it is pure science, but new technologies may come from this work,” said John Petersen, a principal scientist at imec who co-authored papers describing the new lab.

The resists are one way to reduce random errors known as stochastics, an old problem but one raising its head aggressively as engineers push toward the 5-nm node. Yen was bullish that ASML will deal with the defects that threaten yields.

“Stochastics are more severe now than they were with 193-nm lithography, but they can be countered by higher [light] doses,” Yen said. “Our roadmap goes to 500-W systems, so we are going up in power, and High NA systems will deliver a better image quality, so we are well-prepared to combat stochastics.”

Philippe Leray, a metrology specialist at imec, was less optimistic. “We have to tackle the defect challenge in the near future,” he said. “Time is running out, and I don’t see any solution around the corner.”




From: FJB  3/6/2019 6:35:29 PM
 
IBM announces that its System Q One quantum computer has reached its 'highest quantum volume to date'
phys.org

March 5, 2019 by Bob Yirka, Phys.org report


Credit: IBM

IBM has announced at this year's American Physical Society meeting that its System Q One quantum computer has reached its "highest quantum volume to date"—a measure by which, the company reports, the machine has doubled in performance in each of the past two years.

Quantum computers are, as their name implies, computers based on quantum bits. Many physicists and computer scientists believe they will soon outperform traditional computers. Unfortunately, reaching that goal has proven to be a difficult challenge. Several big-name companies have built quantum computers, but none are ready to compete with traditional hardware just yet. These companies have, over time, come to use the number of qubits that a given quantum computer uses as a means of measuring its performance—but most in the field agree that such a number is not really a good way to compare two very different quantum computers.

IBM is one of the big-name companies working to create a truly useful quantum computer, and as part of that effort, it has built models that it sells or leases to other companies looking to jump on the quantum bandwagon as soon as the technology becomes viable. As part of its announcement, IBM focused specifically on the term "quantum volume"—a metric that has not previously been used in the quantum computing field. IBM claims that it is a better measure of true performance, and is therefore using the metric to show that the company's System Q One quantum computer has been advancing in line with Moore's Law.



Credit: IBM

As part of its announcement, IBM published an overview of the results of testing several models of its System Q One machine on its corporate blog. One such metric, notably, was "quantum volume," a metric created by a team at IBM, which is described as accounting for "gate and measurement errors as well as device cross talk and connectivity, and circuit software compiler efficiency." The team that created the metric wrote a paper describing the metric and how it is calculated and uploaded it to the arXiv preprint server last November. In that paper, they noted that the new metric "quantifies the largest random circuit of equal width and depth that the computer successfully implements," and pointed out that it is also strongly tied to error rates.




More information: www.ibm.com/blogs/research/201 … ower-quantum-device/



From: FJB  3/8/2019 1:50:40 PM
1 Recommendation
 
THE BUGATTI 'LA VOITURE NOIRE' IS THE WORLD'S MOST EXPENSIVE NEW CAR
The Batmobile-like hypercar just sold for an astounding $18.9 million.
BRANDON FRIEDERICH  MAR 5, 2019





Bugatti

Bugatti just blew the minds of gearheads and luxury aficionados everywhere by unveiling the world's most expensive new car at the 2019 Geneva Motor Show.



The Bugatti "La Voiture Noire"—French for "the black car"—is a one-off hypercar created to commemorate the 110th anniversary of the French marque's founding.

A Bugatti enthusiast picked it up for a truly astounding $18.9 million, according to Business Insider.

That's nearly $6 million more than the Rolls-Royce "Sweptail" sold for when it set the previous record back in 2017.



And if we're being honest, La Voiture Noire looks way cooler than the Rolls. Basing the design on Jean Bugatti's Type 57 SC Atlantic, engineers aimed to sculpt an exterior that's "all of a piece" by integrating the bumpers into the body and creating a uniform windshield that flows into the side windows.



“Every single component has been handcrafted and the carbon fiber body has a deep black gloss only interrupted by the ultra-fine fiber structure," said Bugatti designer Etienne Salome.

“We worked long and hard on this design until there was nothing left that we could improve. For us, the coupe represents the perfect form with a perfect finish."



For that astronomical price tag, the anonymous buyer also got the same ludicrous, 1,500-horsepower W16 that powers the 236-mph Divo and the 260-mph Chiron, along with six freakin' tailpipes.



Enjoy viewing it now, because you'll almost certainly never see the Bugatti La Voiture Noire in real life.

