

To: FJB who wrote (322) — 10/11/2017 3:38:39 PM
From: FJB
Commercial Quantum Computing Pushes On
News & Analysis
10/11/2017
Looking to speed the arrival of commercial quantum computers, Intel has prototyped a 17-qubit superconducting chip, which research partner QuTech will test on a suite of quantum algorithms.


From: Glenn Petersen — 10/23/2017 9:20:36 PM
A Washing Machine That Tells the Future

How smart appliances are revolutionizing commerce.

By Adam Davidson
The New Yorker
October 23, 2017 Issue

Illustration by Golden Cosmos

The Smoot-Hawley Tariff Act of 1930 is perhaps the single most frequently mocked act of Congress in history. It sparked a trade war in the early days of the Great Depression, and has become shorthand for self-destructive protectionism. So it’s surprising that, while the law’s tariffs have been largely eliminated, some of its absurd provisions still hold. The other week, the American appliance-maker Whirlpool successfully employed a 1974 amendment to the act to persuade the United States government to impose as yet unidentified protections against its top Korean competitors, LG and Samsung. Whirlpool’s official argument was that these firms have been bolstering their market share by offering fancy appliances at low prices. In other words, Whirlpool is getting beat and wants the government to help it win.

This decision is more than a throwback. It shows that Whirlpool and its supporters in government have failed to understand the shift occurring in the business world as a result of the so-called Internet of Things—appliances that send and receive data. It’s easy to miss the magnitude of the change, since many of these Things seem like mere gimmicks. Have you ever wanted to change the water temperature in the middle of a wash cycle when you’re not at home, or get second-by-second reports on precisely how much energy your dryer is consuming? Probably not, but now you can. And it’s not just washing machines. There are at least two “smart” toasters and any number of Wi-Fi-connected coffeemakers, refrigerators, ovens, dishwashers, and garbage cans, not to mention light bulbs, sex toys, toilets, pet feeders, and a children’s thermos.

But this is just the early, land-rush phase of the Internet of Things, comparable to the first Internet land rush, in the late nineties. That era gave us notorious failures—cue the obligatory mentions—but it also gave us Amazon, the company that, more than any other, suggests how things will play out. For most of its existence, Amazon has made little or no profit. In the early days, it was often ridiculed for this, but the company’s managers and investors quickly realized that its most valuable asset was not individual sales but data—its knowledge about its loyal, habit-driven customer base. Amazon doesn’t evaluate customers by the last purchase they made; instead, customers have a lifetime value, a prediction of how much money each one will spend in the years to come. Amazon can calculate this with increasing accuracy. Already, it likely knows which books you read, which movies you watch, what data you store, and what food you eat. And since the introduction of Alexa, the voice-operated device, Amazon has been learning when some customers wake up, go to work, listen to the news, play with their kids, and go to sleep.

This is the radical implication of the Internet of Things—a fundamental shift in the relationship between customers and companies. In the old days, you might buy a washing machine or a refrigerator once a decade or so. Appliance-makers are built to profit from that one, rare purchase, focussing their marketing, customer research, and internal financial analysis on brief, sporadic, high-stakes interactions. The fact that you bought a particular company’s stove five years ago has no value today. But, when an appliance is sending a constant stream of data back to its maker, that company has continuous relationships with the owners of its products, and can find all sorts of ways to make money from those relationships. If a company knows, years after you bought its stove, exactly how often you cook, what you cook, when you shop, and what you watch (on a stove-top screen) while you cook, it can continuously monetize your relationship: selling you recipe subscriptions, maybe, or getting a cut of your food orders. Appliances now order their own supplies when they are about to run out. My printer orders its own ink; I assume my next fridge will order milk when I’m running low.

Whirlpool makes smart appliances, just like Samsung and LG. The president of Whirlpool North America, Joseph Liotine, e-mailed me to say that the firm has “led the way in developing cutting-edge innovations and solutions.” He pointed out that its appliances connect to various services, such as Amazon Dash, which can automatically order laundry detergent, and Yummly, a recipe app that Whirlpool owns. But having the right products isn’t the same as having the right strategy. Unlike its Korean competitors, Whirlpool hasn’t embraced the Amazon lesson: that the way to win in a data-driven business is to push prices as low as possible in order to build your customer base, enhance data flow, and cash in over the long term. Douglas Irwin, an economist at Dartmouth and the author of “Peddling Protectionism,” a book about Smoot-Hawley, told me, “Whirlpool is putting their resources into stopping competition. Maybe they should put their resources into serving their consumers better. This may just delay the reckoning.”

Irwin points out that Whirlpool’s trade complaint was first filed under the Obama Administration, which had imposed tariffs on LG and Samsung in two related cases. But most of the tariffs were small and easy for the companies to get around. President Trump, of course, views free trade more skeptically, and may well impose huge tariffs on all laundry-machine imports. Irwin suspects that this will produce a flood of trade-protection complaints from other American firms. That would be bad for anyone who wants to buy a laundry machine, but, in the long run, it will be even worse for American business.

This article appears in the print edition of the October 23, 2017, issue, with the headline “Cleaning Up.”

Adam Davidson is a staff writer at The New Yorker.


From: roger wilco — 11/2/2017 12:16:32 PM
Nasdaq company entering into the cryptocurrency business

Marathon Patent Group $MARA to Acquire Global Bit Ventures Inc., a Digital Asset Technology Company

Marathon Patent Group, Inc. (MARA), an IP licensing and management company, today announced that it has entered into a definitive purchase agreement to acquire 100% ownership of Global Bit Ventures Inc. (“GBV”), a digital asset technology company that mines cryptocurrencies.


From: Glenn Petersen — 11/13/2017 3:08:45 PM
Can Carbon-Dioxide Removal Save the World?

CO2 could soon reach levels that, it’s widely agreed, will lead to catastrophe.

By Elizabeth Kolbert
The New Yorker
November 20, 2017 Issue

Carbon-dioxide removal could be a trillion-dollar enterprise, because it not only slows the rise in CO2 but reverses it.
Photo-Illustration by Thomas Albdorf for The New Yorker

Carbon Engineering, a company owned in part by Bill Gates, has its headquarters on a spit of land that juts into Howe Sound, an hour north of Vancouver. Until recently, the land was a toxic-waste site, and the company’s equipment occupies a long, barnlike building that, for many years, was used to process contaminated water. The offices, inherited from the business that poisoned the site, provide a spectacular view of Mt. Garibaldi, which rises to a snow-covered point, and of the Chief, a granite monolith that’s British Columbia’s answer to El Capitan. To protect the spit against rising sea levels, the local government is planning to cover it with a layer of fill six feet deep. When that’s done, it’s hoping to sell the site for luxury condos.

Adrian Corless, Carbon Engineering’s chief executive, who is fifty-one, is a compact man with dark hair, a square jaw, and a concerned expression. “Do you wear contacts?” he asked, as we were suiting up to enter the barnlike building. If so, I’d have to take extra precautions, because some of the chemicals used in the building could cause the lenses to liquefy and fuse to my eyes.

Inside, pipes snaked along the walls and overhead. The thrum of machinery made it hard to hear. In one corner, what looked like oversized beach bags were filled with what looked like white sand. This, Corless explained over the noise, was limestone—pellets of pure calcium carbonate.

Corless and his team are engaged in a project that falls somewhere between toxic-waste cleanup and alchemy. They’ve devised a process that allows them, in effect, to suck carbon dioxide out of the air. Every day at the plant, roughly a ton of CO2 that had previously floated over Mt. Garibaldi or the Chief is converted into calcium carbonate. The pellets are subsequently heated, and the gas is forced off, to be stored in canisters. The calcium can then be recovered, and the process run through all over again.
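The capture loop described above can be sketched as a back-of-envelope mass balance. The one-ton-per-day figure comes from the article; the molar masses are standard chemistry values, and the assumption that the recovered calcium leaves the calciner as calcium oxide is mine, not a detail the article confirms.

```python
# Rough mass balance for a CO2 -> calcium carbonate capture loop.
# One mole of captured CO2 ends up in one mole of CaCO3, so masses
# scale by the ratio of molar masses.

M_CO2 = 44.01     # g/mol, carbon dioxide
M_CACO3 = 100.09  # g/mol, calcium carbonate (the white pellets)
M_CAO = 56.08     # g/mol, calcium oxide (assumed form of recovered calcium)

tons_co2_per_day = 1.0  # figure from the article

tons_pellets = tons_co2_per_day * M_CACO3 / M_CO2
tons_cao = tons_co2_per_day * M_CAO / M_CO2

print(f"~{tons_pellets:.2f} tons of CaCO3 pellets per ton of CO2 captured")
print(f"~{tons_cao:.2f} tons of recovered calcium oxide per cycle")
```

On these assumptions, each ton of captured CO2 yields a bit over two tons of pellets, which is why the "oversized beach bags" of white sand accumulate so quickly.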

“If we’re successful at building a business around carbon removal, these are trillion-dollar markets,” Corless told me.

This past April, the concentration of carbon dioxide in the atmosphere reached a record four hundred and ten parts per million. The amount of CO2 in the air now is probably greater than it’s been at any time since the mid-Pliocene, three and a half million years ago, when there was a lot less ice at the poles and sea levels were sixty feet higher. This year’s record will be surpassed next year, and next year’s the year after that. Even if every country fulfills the pledges made in the Paris climate accord—and the United States has said that it doesn’t intend to—carbon dioxide could soon reach levels that, it’s widely agreed, will lead to catastrophe, assuming it hasn’t already done so.

Carbon-dioxide removal is, potentially, a trillion-dollar enterprise because it offers a way not just to slow the rise in CO2 but to reverse it. The process is sometimes referred to as “negative emissions”: instead of adding carbon to the air, it subtracts it. Carbon-removal plants could be built anywhere, or everywhere. Construct enough of them and, in theory at least, CO2 emissions could continue unabated and still we could avert calamity. Depending on how you look at things, the technology represents either the ultimate insurance policy or the ultimate moral hazard.

Carbon Engineering is one of a half-dozen companies vying to prove that carbon removal is feasible. Others include Global Thermostat, which is based in New York, and Climeworks, based near Zurich. Most of these owe their origins to the ideas of a physicist named Klaus Lackner, who now works at Arizona State University, in Tempe, so on my way home from British Columbia I took a detour to visit him. It was July, and on the day I arrived the temperature in the city reached a hundred and twelve degrees. When I got to my hotel, one of the first things I noticed was a dead starling lying, feet up, in the parking lot. I wondered if it had died from heat exhaustion.

Lackner, who is sixty-five, grew up in Germany. He is tall and lanky, with a fringe of gray hair and a prominent forehead. I met him in his office at an institute he runs, the Center for Negative Carbon Emissions. The office was bare, except for a few New Yorker cartoons on the theme of nerd-dom, which, Lackner told me, his wife had cut out for him. In one, a couple of scientists stand in front of an enormous whiteboard covered in equations. “The math is right,” one of them says. “It’s just in poor taste.”

In the late nineteen-seventies, Lackner moved from Germany to California to study with George Zweig, one of the discoverers of quarks. A few years later, he got a job at Los Alamos National Laboratory. There, he worked on fusion. “Some of the work was classified,” he said, “some of it not.”

Fusion is the process that powers the stars and, closer to home, thermonuclear bombs. When Lackner was at Los Alamos, it was being touted as a solution to the world’s energy problem; if fusion could be harnessed, it could generate vast amounts of carbon-free power using isotopes of hydrogen. Lackner became convinced that a fusion reactor was, at a minimum, decades away. (Decades later, it’s generally agreed that a workable reactor is still decades away.) Meanwhile, the globe’s growing population would demand more and more energy, and this demand would be met, for the most part, with fossil fuels.

“I realized, probably earlier than most, that the claims of the demise of fossil fuels were greatly exaggerated,” Lackner told me. (In fact, fossil fuels currently provide about eighty per cent of the world’s energy. Proportionally, this figure hasn’t changed much since the mid-eighties, but, because global energy use has nearly doubled, the amount of coal, oil, and natural gas being burned today is almost two times greater.)

One evening in the early nineties, Lackner was having a beer with a friend, Christopher Wendt, also a physicist. The two got to wondering why, as Lackner put it to me, “nobody’s doing these really crazy, big things anymore.” This led to more questions and more conversations (and possibly more beers).

Eventually, the two produced an equation-dense paper in which they argued that self-replicating machines could solve the world’s energy problem and, more or less at the same time, clean up the mess humans have made by burning fossil fuels. The machines would be powered by solar panels, and as they multiplied they’d produce more solar panels, which they’d assemble using elements, like silicon and aluminum, extracted from ordinary dirt. The expanding collection of panels would produce ever more power, at a rate that would increase exponentially. An array covering three hundred and eighty-six thousand square miles—an area larger than Nigeria but, as Lackner and Wendt noted, “smaller than many deserts”—could supply all the world’s electricity many times over.
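The array-sizing claim above can be sanity-checked with a few round numbers. The area is from the Lackner–Wendt paper as quoted; the insolation, panel efficiency, and world electricity demand below are my own assumed values, not figures from the paper.

```python
# Back-of-envelope check: can a 386,000-square-mile solar array supply
# the world's electricity "many times over"? All physical parameters
# here are assumed round numbers.

SQ_MILE_TO_M2 = 2.59e6            # square metres per square mile
area_m2 = 386_000 * SQ_MILE_TO_M2

insolation_w_m2 = 200             # assumed time-averaged solar flux
panel_efficiency = 0.20           # assumed conversion efficiency
world_demand_w = 2.5e12           # assumed average world electric demand, ~2.5 TW

array_output_w = area_m2 * insolation_w_m2 * panel_efficiency
multiple = array_output_w / world_demand_w

print(f"array output ≈ {array_output_w / 1e12:.0f} TW, "
      f"≈ {multiple:.0f}x world electricity demand")
```

Under these assumptions the array delivers roughly forty terawatts, about sixteen times the assumed demand, which is consistent with "many times over."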

This same array could be put to use scrubbing carbon dioxide from the atmosphere. According to Lackner and Wendt, the power generated by a Nigeria-size solar farm would be enough to remove all the CO2 emitted by humans up to that point within five years. Ideally, the CO2 would be converted to rock, similar to the white sand produced by Carbon Engineering; enough would be created to cover Venezuela in a layer a foot and a half deep. (Where this rock would go the two did not specify.)

Lackner let the idea of the self-replicating machine slide, but he became more and more intrigued by carbon-dioxide removal, particularly by what’s become known as “direct air capture.”

“Sometimes by thinking through this extreme end point you learn a lot,” he said. He began giving talks and writing papers on the subject. Some scientists decided he was nuts, others that he was a visionary. “Klaus is, in fact, a genius,” Julio Friedmann, a former Principal Deputy Assistant Secretary of Energy and an expert on carbon management, told me.

In 2000, Lackner received a job offer from Columbia University. Once in New York, he pitched a plan for developing a carbon-sucking technology to Gary Comer, a founder of Lands’ End. Comer brought to the meeting his investment adviser, who quipped that Lackner wasn’t looking for venture capital so much as “adventure capital.” Nevertheless, Comer offered to put up five million dollars. The new company was called Global Research Technologies, or G.R.T. It got as far as building a small prototype, but just as it was looking for new investors the financial crisis hit.

“Our timing was exquisite,” Lackner told me. Unable to raise more funds, the company ceased operations. As the planet continued to warm, and carbon-dioxide levels continued to climb, Lackner came to believe that, unwittingly, humanity had already committed itself to negative emissions.

“I think that we’re in a very uncomfortable situation,” he said. “I would argue that if technologies to pull CO2 out of the environment fail then we’re in deep trouble.”

Lackner founded the Center for Negative Carbon Emissions at A.S.U. in 2014. Most of the equipment he dreams up is put together in a workshop a few blocks from his office. The day I was there, it was so hot outside that even the five-minute walk to the workshop required staging. Lackner delivered a short lecture on the dangers of dehydration and handed me a bottle of water.

In the workshop, an engineer was tinkering with what looked like the guts of a foldout couch. Where, in the living-room version, there would have been a mattress, in this one was an elaborate array of plastic ribbons. Embedded in each ribbon was a powder made from thousands upon thousands of tiny amber-colored beads. The beads, Lackner explained, could be purchased by the truckload; they were composed of a resin normally used in water treatment to remove chemicals like nitrates. More or less by accident, Lackner had discovered that the beads could be repurposed. Dry, they’d absorb carbon dioxide. Wet, they’d release it. The idea was to expose the ribbons to Arizona’s thirsty air, and then fold the device into a sealed container filled with water. The CO2 that had been captured by the powder in the dry phase would be released in the wet phase; it could then be piped out of the container, and the whole process re-started, the couch folding and unfolding over and over again.

Lackner has calculated that an apparatus the size of a semi trailer could remove a ton of carbon dioxide per day, or three hundred and sixty-five tons a year. The world’s cars, planes, refineries, and power plants now produce about thirty-six billion tons of CO2 annually, so, he told me, “if you built a hundred million trailer-size units you could actually keep up with current emissions.” He acknowledged that the figure sounded daunting. But, he noted, the iPhone has been around for only a decade or so, and there are now seven hundred million in use. “We are still very early in this game,” he said.
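Lackner's trailer arithmetic is simple enough to check directly; both figures (one ton per day per unit, thirty-six billion tons of annual emissions) are taken from the paragraph above.

```python
# Checking the "hundred million trailer-size units" claim.
tons_per_unit_per_day = 1
tons_per_unit_per_year = tons_per_unit_per_day * 365

global_emissions_tons_per_year = 36e9  # ~36 billion tons of CO2 annually

units_needed = global_emissions_tons_per_year / tons_per_unit_per_year
print(f"~{units_needed / 1e6:.0f} million trailer-size units")  # ~99 million
```

The result, just under a hundred million units, matches the figure Lackner cites.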

The way Lackner sees things, the key to avoiding “deep trouble” is thinking differently. “We need to change the paradigm,” he told me. Carbon dioxide should be regarded the same way we view other waste products, like sewage or garbage. We don’t expect people to stop producing waste. (“Rewarding people for going to the bathroom less would be nonsensical,” Lackner has observed.) At the same time, we don’t let them shit on the sidewalk or toss their empty yogurt containers into the street.

“If I were to tell you that the garbage I’m dumping in front of your house is twenty per cent less this year than it was last year, you would still think I’m doing something intolerable,” Lackner said.

One of the reasons we’ve made so little progress on climate change, he contends, is that the issue has acquired an ethical charge, which has polarized people. To the extent that emissions are seen as bad, emitters become guilty. “Such a moral stance makes virtually everyone a sinner, and makes hypocrites out of many who are concerned about climate change but still partake in the benefits of modernity,” he has written. Changing the paradigm, Lackner believes, will change the conversation. If CO2 is treated as just another form of waste, which has to be disposed of, then people can stop arguing about whether it’s a problem and finally start doing something.

Carbon dioxide was “discovered,” by a Scottish physician named Joseph Black, in 1754. A decade later, another Scotsman, James Watt, invented a more efficient steam engine, ushering in what is now called the age of industrialization but which future generations may dub the age of emissions. It is likely that by the end of the nineteenth century human activity had raised the average temperature of the earth by a tenth of a degree Celsius (or nearly two-tenths of a degree Fahrenheit).

As the world warmed, it started to change, first gradually and then suddenly. By now, the globe is at least one degree Celsius (1.8 degrees Fahrenheit) warmer than it was in Black’s day, and the consequences are becoming ever more apparent. Heat waves are hotter, rainstorms more intense, and droughts drier. The wildfire season is growing longer, and fires, like the ones that recently ravaged Northern California, more numerous. Sea levels are rising, and the rate of rise is accelerating. Higher sea levels exacerbated the damage from Hurricanes Harvey, Irma, and Maria, and higher water temperatures probably also made the storms more ferocious. “Harvey is what climate change looks like,” Eric Holthaus, a meteorologist turned columnist, recently wrote.

Meanwhile, still more warming is locked in. There’s so much inertia in the climate system, which is as vast as the earth itself, that the globe has yet to fully adjust to the hundreds of billions of tons of carbon dioxide that have been added to the atmosphere in the past few decades. It’s been calculated that to equilibrate to current CO2 levels the planet still needs to warm by half a degree. And every ten days another billion tons of carbon dioxide are released. Last month, the World Meteorological Organization announced that the concentration of carbon dioxide in the atmosphere jumped by a record amount in 2016.

No one can say exactly how warm the world can get before disaster—the inundation of low-lying cities, say, or the collapse of crucial ecosystems, like coral reefs—becomes inevitable. Officially, the threshold is two degrees Celsius (3.6 degrees Fahrenheit) above preindustrial levels. Virtually every nation signed on to this figure at a round of climate negotiations held in Cancún in 2010.

Meeting in Paris in 2015, world leaders decided that the two-degree threshold was too high; the stated aim of the climate accord is to hold “the increase in the global average temperature to well below 2°C” and to try to limit it to 1.5°C. Since the planet has already warmed by one degree and, for all practical purposes, is committed to another half a degree, it would seem impossible to meet the latter goal and nearly impossible to meet the former. And it is nearly impossible, unless the world switches course and instead of just adding CO2 to the atmosphere also starts to remove it.

The extent to which the world is counting on negative emissions is documented by the latest report of the Intergovernmental Panel on Climate Change, which was published the year before Paris. To peer into the future, the I.P.C.C. relies on computer models that represent the world’s energy and climate systems as a tangle of equations, and which can be programmed to play out different “scenarios.” Most of the scenarios involve temperature increases of two, three, or even four degrees Celsius—up to just over seven degrees Fahrenheit—by the end of this century. (In a recent paper in the Proceedings of the National Academy of Sciences, two climate scientists—Yangyang Xu, of Texas A. & M., and Veerabhadran Ramanathan, of the Scripps Institution of Oceanography—proposed that warming greater than three degrees Celsius be designated as “catastrophic” and warming greater than five degrees as “unknown??” The “unknown??” designation, they wrote, comes “with the understanding that changes of this magnitude, not experienced in the last 20+ million years, pose existential threats to a majority of the population.”)

When the I.P.C.C. went looking for ways to hold the temperature increase under two degrees Celsius, it found the math punishing. Global emissions would have to fall rapidly and dramatically—pretty much down to zero by the middle of this century. (This would entail, among other things, replacing most of the world’s power plants, revamping its agricultural systems, and eliminating gasoline-powered vehicles, all within the next few decades.) Alternatively, humanity could, in effect, go into hock. It could allow CO2 levels temporarily to exceed the two-degree threshold—a situation that’s become known as “overshoot”—and then, via negative emissions, pull the excess CO2 out of the air.

The I.P.C.C. considered more than a thousand possible scenarios. Of these, only a hundred and sixteen limit warming to below two degrees, and of these a hundred and eight involve negative emissions. In many below-two-degree scenarios, the quantity of negative emissions called for reaches the same order of magnitude as the “positive” emissions being produced today.

“The volumes are outright crazy,” Oliver Geden, the head of the E.U. research division of the German Institute for International and Security Affairs, told me. Lackner said, “I think what the I.P.C.C. really is saying is ‘We tried lots and lots of scenarios, and, of the scenarios which stayed safe, virtually every one needed some magic touch of negative emissions. If we didn’t do that, we ran into a brick wall.’ ”

Pursued on the scale envisioned by the I.P.C.C., carbon-dioxide removal would yield at first tens of billions and soon hundreds of billions of tons of CO2, all of which would have to be dealt with. This represents its own supersized challenge. CO2 can be combined with calcium to produce limestone, as it is in the process at Carbon Engineering (and in Lackner’s self-replicating-machine scheme). But the necessary form of calcium isn’t readily available, and producing it generally yields CO2, a self-defeating prospect. An alternative is to shove the carbon back where it came from, deep underground.

“If you are storing CO2 and your only purpose is storage, then you’re looking for a package of certain types of rock,” Sallie Greenberg, the associate director for energy, research, and development at the Illinois State Geological Survey, told me. It was a bright summer day, and we were driving through the cornfields of Illinois’s midsection. A mile below us was a rock formation known as the Eau Claire Shale, and below that a formation known as the Mt. Simon Sandstone. Together with a team of drillers, engineers, and geoscientists, Greenberg has spent the past decade injecting carbon dioxide into this rock “package” and studying the outcome. When I’d proposed over the phone that she show me the project, in Decatur, she’d agreed, though not without hesitation.

“It isn’t sexy,” she’d warned me. “It’s a wellhead.”

“I have to know how it’s done.”

Our first stop was a building shaped like a ski chalet. This was the National Sequestration Education Center, a joint venture of the Illinois geological survey, the U.S. Department of Energy, and Richland Community College. Inside were classrooms, occupied that morning by kids making lanyards, and displays aimed at illuminating the very dark world of carbon storage. One display was a sort of oversized barber pole, nine feet tall and decorated in bands of tan and brown, representing the various rock layers beneath us. A long arrow on the side of the pole indicated how many had been drilled through for Greenberg’s carbon-storage project; it pointed down, through the New Albany Shale, the Maquoketa Shale, and so on, all the way to the floor.

The center’s director, David Larrick, was on hand to serve as a guide. In addition to schoolkids, he said, the center hosted lots of community groups, like Kiwanis clubs. “This is very effective as a visual,” he told me, gesturing toward the pole. Sometimes farmers were concerned about the impact that the project could have on their water supply. The pole showed that the CO2 was being injected more than a mile below their wells.

“We have had overwhelmingly positive support,” he said. While Greenberg and Larrick chatted, I wandered off to play an educational video game. A cartoon figure in a hard hat appeared on the screen to offer factoids such as “The most efficient method of transport of CO2 is by pipeline.”

“Transport CO2 to earn points!” the cartoon man exhorted.

After touring the center’s garden, which featured grasses, like big bluestem, that would have been found in the area before it was plowed into cornfields, Greenberg and I drove on. Soon we passed through the gates of an enormous Archer Daniels Midland plant, which rose up out of the fields like a small city.

Greenberg explained that the project we were visiting was one of seven funded by the Department of Energy to learn whether carbon injected underground would stay there. In the earliest stage of the project, initiated under President George W. Bush, Greenberg and her colleagues sifted through geological records to find an appropriate test site. What they were seeking was similar to what oil drillers look for—porous stone capped by a layer of impermeable rock—only they were looking not to extract fossil fuels but, in a manner of speaking, to stuff them back in. The next step was locating a ready source of carbon dioxide. This is where A.D.M. came in; the plant converts corn into ethanol, and one of the by-products of this process is almost pure CO2. In a later stage of the project, during the Obama Administration, a million tons of carbon dioxide from the plant were pumped underground. Rigorous monitoring has shown that, so far, the CO2 has stayed put.

We stopped to pick up hard hats and went to see some of the monitoring equipment, which was being serviced by two engineers, Nick Malkewicz and Jim Kirksey. It was now lunchtime, so we made another detour, to a local barbecue place. Finally, Greenberg and I and the two men got to the injection site. It was, indeed, not sexy—just a bunch of pipes and valves sticking out of the dirt. I asked about the future of carbon storage.

“I think the technology’s there and it’s absolutely viable,” Malkewicz said. “It’s just a question of whether people want to do it or not. It’s kind of an obvious thing.”

“We know we can meet the objective of storing CO2,” Greenberg added. “Like Nick said, it’s just a matter of whether or not as a society we’re going to do it.”

When work began on the Decatur project, in 2003, few people besides Klaus Lackner were thinking about sucking CO2 from the air. Instead, the goal was to demonstrate the feasibility of an only slightly less revolutionary technology—carbon capture and storage (or, as it is sometimes referred to, carbon capture and sequestration).

With C.C.S., the CO2 produced at a power station or a steel mill or a cement plant is drawn off before it has a chance to disperse into the atmosphere. (This is called “post-combustion capture.”) The gas, under very high pressure, is then injected into the appropriate package of rock, where it is supposed to remain permanently. The process has become popularly—and euphemistically—known as “clean coal,” because, if all goes according to plan, a plant equipped with C.C.S. produces only a fraction of the emissions of a conventional coal-fired plant.

Over the years, both Republicans and Democrats have touted clean coal as a way to save mining jobs and protect the environment. The coal industry has also, nominally at least, embraced the technology; one industry-sponsored group calls itself the American Coalition for Clean Coal Electricity. Donald Trump, too, has talked up clean coal, even if he doesn’t seem to quite understand what the term means. “We’re going to have clean coal, really clean coal,” he said in March.

Currently, only one power plant in the U.S., the Petra Nova plant, near Houston, uses post-combustion carbon capture on a large scale. Plans for other plants to showcase the technology have been scrapped, including, most recently, the Kemper County plant, in Mississippi. This past June, the plant’s owner, Southern Company, announced that it was changing tacks. Instead of burning coal and capturing the carbon, the plant would burn natural gas and release the CO2.

Experts I spoke to said that the main reason C.C.S. hasn’t caught on is that there’s no inducement to use it. Capturing the CO2 from a smokestack consumes a lot of power—up to twenty-five per cent of the total produced at a typical coal-burning plant. And this, of course, translates into costs. What company is going to assume such costs when it can dump CO2 into the air for free?

“If you’re running a steel mill or a power plant and you’re putting the CO2 into the atmosphere, people might say, ‘Why aren’t you using carbon capture and storage?’ ” Howard Herzog, an engineer at M.I.T. who for many years ran a research program on C.C.S., told me. “And you say, ‘What’s my financial incentive? No one’s saying I can’t put it in the atmosphere.’ In fact, we’ve gone backwards in terms of sending signals that you’re going to have to restrict it.”

But, although C.C.S. has stalled in practice, it has become ever more essential on paper. Practically all below-two-degree warming scenarios assume that it will be widely deployed. And even this isn’t enough. To avoid catastrophe, most models rely on a yet to be realized variation of C.C.S., known as BECCS.

BECCS, which stands for “bio-energy with carbon capture and storage,” takes advantage of the original form of carbon engineering: photosynthesis. Trees and grasses and shrubs, as they grow, soak up CO2 from the air. (Replanting forests is a low-tech form of carbon removal.) Later, when the plants rot or are combusted, the carbon they have absorbed is released back into the atmosphere. If a power station were to burn wood, say, or cornstalks, and use C.C.S. to sequester the resulting CO2, this cycle would be broken. Carbon would be sucked from the air by the green plants and then forced underground. BECCS represents a way to generate negative emissions and, at the same time, electricity. The arrangement, at least as far as the models are concerned, could hardly be more convenient.

“BECCS is unique in that it removes carbon and produces energy,” Glen Peters, a senior researcher at the Center for International Climate Research, in Oslo, told me. “So the more you consume the more you remove.” He went on, “In a sense, it’s a dream technology. It’s solving one problem while solving the other problem. What more could you want?”

The Center for Carbon Removal doesn’t really have an office; it operates out of a co-working space in downtown Oakland. On the day I visited, not long after my trip to Decatur, someone had recently stopped at Trader Joe’s, and much of the center’s limited real estate was taken up by tubs of treats.

“Open anything you want,” the center’s executive director, Noah Deich, urged me, with a wave of his hand.

Deich, who is thirty-one, has a broad face, a brown beard, and a knowing sort of earnestness. After graduating from the University of Virginia, in 2009, he went to work for a consulting firm in Washington, D.C., that was advising power companies about how to prepare for a time when they’d no longer be able to release carbon into the atmosphere cost-free. It was the start of the Obama Administration, and that time seemed imminent. The House of Representatives had recently approved legislation to limit emissions. But the bill later died in the Senate, and, as Deich put it, “It’s no fun to model the impacts of climate policies nobody believes are going to happen.” He switched consulting firms, then headed to business school, at the University of California, Berkeley.

“I came into school with this vision of working for a clean-tech startup,” he told me. “But I also had this idea floating around in the back of my head that we’re moving too slowly to actually stop emissions in time. So what do we do with all the carbon that’s in the air?” He started talking to scientists and policy experts at Berkeley. What he learned shocked him.

“People told me, ‘The models show this major need for negative emissions,’ ” he recalled. “ ‘But we don’t really know how to do that, nor is anyone really thinking about it.’ I was someone who’d been in the business and policy world, and I was, like, wait a minute—what?”

Business school taught Deich to think in terms of case studies. One that seemed to him relevant was solar power. Photovoltaic cells have been around since the nineteen-fifties, but for decades they were prohibitively expensive. Then the price started to drop, which increased demand, which led to further price drops, to the point where today, in many parts of the world, the cost of solar power is competitive with the cost of power from new coal plants.

“And the reason that it’s now competitive is that governments decided to do lots and lots of research,” Deich said. “And some countries, like Germany, decided to pay a lot for solar, to create a first market. And China paid a lot to manufacture the stuff, and states in the U.S. said, ‘You must consume renewable energy,’ and then consumers said, ‘Hey, how can I buy renewable energy?’ ”

As far as he could see, none of this—neither the research nor the creation of first markets nor the spurring of consumer demand—was being done for carbon removal, so he decided to try to change that. Together with a Berkeley undergraduate, Giana Amador, he founded the center in 2015, with a hundred-and-fifty-thousand-dollar grant from the university. It now has an annual budget of about a million dollars, raised from private donors and foundations, and a staff of seven. Deich described it as a “think-and-do tank.”

“We’re trying to figure out: how do we actually get this on the agenda?” he said.

A compelling reason for putting carbon removal on “the agenda” is that we are already counting on it. Negative emissions are built into the I.P.C.C. scenarios and the climate agreements that rest on them.

But everyone I spoke with, including the most fervent advocates for carbon removal, stressed the huge challenges of the work, some of them technological, others political and economic. Done on a scale significant enough to make a difference, direct air capture of the sort pursued by Carbon Engineering, in British Columbia, would require an enormous infrastructure, as well as huge supplies of power. (Because CO2 is more dilute in the air than it is in the exhaust of a power plant, direct air capture demands even more energy than C.C.S.) The power would have to be generated emissions-free, or the whole enterprise wouldn’t make much sense.

“You might say it’s against my self-interest to say it, but I think that, in the near term, talking about carbon removal is silly,” David Keith, the founder of Carbon Engineering, who teaches energy and public policy at Harvard, told me. “Because it almost certainly is cheaper to cut emissions now than to do large-scale carbon removal.”

BECCS doesn’t make big energy demands; instead, it requires vast tracts of arable land. Much of this land would, presumably, have to be diverted from food production, and at a time when the global population—and therefore global food demand—is projected to be growing. (It’s estimated that to do BECCS on the scale envisioned by some below-two-degrees scenarios would require an area larger than India.) Two researchers in Britain, Naomi Vaughan and Clair Gough, who recently conducted a workshop on BECCS, concluded that “assumptions regarding the extent of bioenergy deployment that is possible” are generally “unrealistic.”

For these reasons, many experts argue that even talking (or writing articles) about negative emissions is dangerous. Such talk fosters the impression that it’s possible to put off action and still avoid a crisis, when it is far more likely that continued inaction will just produce a larger crisis. In “The Trouble with Negative Emissions,” an essay that ran last year in Science, Kevin Anderson, of the Tyndall Centre for Climate Change Research, in England, and Glen Peters, of the climate-research center in Oslo, described negative-emissions technologies as a “high-stakes gamble” and relying on them as a “moral hazard par excellence.”

We should, they wrote, “proceed on the premise that they will not work at scale.”

Others counter that the moment for fretting about the hazards of negative emissions—moral or otherwise—has passed.

“The punch line is, it doesn’t matter,” Julio Friedmann, the former Principal Deputy Assistant Energy Secretary, told me. “We actually need to do direct air capture, so we need to create technologies that do that. Whether it’s smart or not, whether it’s optimized or not, whether it’s the lowest-cost pathway or not, we know we need to do it.”

“If you tell me that we don’t know whether our stuff will work, I will admit that is true,” Klaus Lackner said. “But I also would argue that nobody else has a good option.”

One of the peculiarities of climate discussions is that the strongest argument for any given strategy is usually based on the hopelessness of the alternatives: this approach must work, because clearly the others aren’t going to. This sort of reasoning rests on a fragile premise—what might be called solution bias. There has to be an answer out there somewhere, since the contrary is too horrible to contemplate.

Early last month, the Trump Administration announced its intention to repeal the Clean Power Plan, a set of rules aimed at cutting power plants’ emissions. The plan, which had been approved by the Obama Administration, was eminently achievable. Still, according to the current Administration, the cuts were too onerous. The repeal of the plan is likely to result in hundreds of millions of tons of additional emissions.

A few weeks later, the United Nations Environment Programme released its annual Emissions Gap Report. The report labelled the difference between the emissions reductions needed to avoid dangerous climate change and those which countries have pledged to achieve as “alarmingly high.” For the first time, this year’s report contains a chapter on negative emissions. “In order to achieve the goals of the Paris Agreement,” it notes, “carbon dioxide removal is likely a necessary step.”

As a technology of last resort, carbon removal is, almost by its nature, paradoxical. It has become vital without necessarily being viable. It may be impossible to manage and it may also be impossible to manage without.

This article appears in the print edition of the November 20, 2017, issue, with the headline “Going Negative.”

Elizabeth Kolbert has been a staff writer at The New Yorker since 1999. She won the 2015 Pulitzer Prize for general nonfiction for “The Sixth Extinction: An Unnatural History.”

To: Glenn Petersen who wrote (326)11/13/2017 3:14:52 PM
From: FJB
   of 414
It makes me sad that people could be as dumb as this writer. CO2 is beneficial to planet Earth. We won't be able to feed 10 billion people without the continued greening of our planet enabled by CO2.


The István Markó Interview: The Science.

Maybe the biggest of all the lies put out by the global warming scaremongers is that the science is on their side. No it isn’t. And if you’re in any doubt at all you should read this interview with the brilliant scientist István Markó. It tells you all you need to know about the science of global warming.
Dr. Markó, who sadly died earlier this year aged only 61, was a professor and researcher in organic chemistry at the Université Catholique de Louvain, Belgium’s largest French-speaking university. More importantly for the purposes of this interview, he was one of the world’s most outspoken and well-informed climate skeptics, who contributed to several articles on the subject for Breitbart News.

Before he died, he gave an extensive interview to the French journalist Grégoire Canlorbe. Here are highlights of the English translation. As you’ll see, he doesn’t pull his punches.

CO2 is not – and has never been – a poison

Each of our exhalations, each of our breaths, emits CO2 at a concentration (some 40,000 ppm) roughly a hundred times higher than that of the atmosphere; and it is very clear that the air we expire does not kill anyone standing in front of us. What must be understood, besides, is that CO2 is the elementary food of plants. Without CO2 there would be no plants, and without plants there would be no oxygen and therefore no humans.

Plants love CO2. That’s why the planet is greening

Plants need CO2, water, and daylight. These are the mechanisms of photosynthesis, to generate the sugars that will provide them with staple food and building blocks. That fundamental fact of botany is one of the primary reasons why anyone who is sincerely committed to the preservation of the “natural world” should abstain from demonizing CO2. Over the last 30 years, there has been a gradual increase in the CO2 level. But what is also observed is that despite deforestation, the planet’s vegetation has grown by about 20 percent. This expansion of vegetation on the planet, nature lovers largely owe it to the increase in the concentration of CO2 in the atmosphere.

There have been periods where the CO2 concentration was many times higher than now. Life thrived.

During the Jurassic, Triassic, and so on, the CO2 level rose to values sometimes on the order of 7,000, 8,000, 9,000 ppm, which considerably exceeds the paltry 400 ppm that we have today. Not only did life exist in those far-off times, when CO2 was present in such large concentrations in the atmosphere, but plants such as ferns commonly attained heights of 25 meters. Conversely, far from benefiting the current vegetation, reducing the presence of CO2 in the atmosphere would be likely to compromise the health, and even the survival, of numerous plants. Falling below the threshold of 280 or 240 ppm would plainly lead to the extinction of a large variety of our vegetal species.

Animals need CO2 too. And by the way – forests are not the ‘lungs of the earth’…

In addition, our relentless crusade to reduce CO2 could be more harmful to nature as plants are not the only organisms to base their nutrition on CO2. Phytoplankton species also feed on CO2, using carbon from CO2 as a building unit and releasing oxygen. By the way, it is worth remembering that ~70 percent of the oxygen present today in the atmosphere comes from phytoplankton, not trees. Contrary to common belief, it is not the forests, but the oceans, that constitute the “lungs” of the earth.

It is not true that CO2 has a major greenhouse effect. Reports of its influence have been exaggerated

It is worth remembering here too that CO2 is a minor gas. Today it represents only 0.04 percent of the composition of the air, and its greenhouse effect is assigned the value of 1. The major greenhouse gas in the atmosphere is water vapor, which is ten times more potent than CO2 in its greenhouse effect and is present in a proportion of 2 percent in the atmosphere. Those facts are, in principle, taught at school and at university, but one still manages to incriminate CO2 alongside this learning, by using a dirty trick that presents the warming effect of CO2 as minor in itself but exacerbated, through feedback loops, by the other greenhouse effects.

Climate change is natural

Over the last 12,000 years, what we have witnessed is an oscillation between warm and cold periods, thus periods with rising and declining sea levels. Incontestably, sea and ocean levels have been on the rise since the end of the Little Ice Age that took place approximately from the beginning of the 14th century until the end of the 19th century. At the end of that period, global temperatures started to rise. That being said, the recorded rise is 0.8 degrees Celsius and is, therefore, nothing extraordinary. If the temperature goes up, ocean water obviously dilates and some glaciers recede. This is something glaciers have always done, and not a specificity of our time.

Don’t worry about shrinking glaciers. We’ve been here before…

In Ancient Roman times, glaciers were much smaller than the ones we know nowadays. I invite the reader to look at the documents dating back to the days of Hannibal, who managed to cross the Alps with his elephants because he did not encounter ice on his way to Rome (except during a snow storm just before arriving on the Italian plain). Today, you could no longer make Hannibal’s journey. He proved to be capable of such an exploit precisely because it was warmer in Roman times.

Sea level rise is normal

Sea levels are currently on the rise; but this is an overestimated phenomenon. The recorded rise is 1.5 millimeters per year, namely 1.5 cm every ten years, and is, therefore, not dramatic at all. Indeed, it does happen that entire islands do get engulfed; but in 99 percent of the cases, that is due to a classic erosion phenomenon [1] and not to rising sea levels. As far as the Italian city of Venice is concerned, the fact it has been faced with water challenges is not due to any rise of the lagoon level and is just the manifestation of the sad reality that “the City of the Doges” is sinking under its weight on the marshland. Once again, the global sea and ocean levels are rising; but the threat effectively represented by that phenomenon is far from being tangible. I note that the Tuvalu islands, whose engulfment was previously announced as imminent, not only have not been engulfed, but have seen their own land level rise with respect to that of waters around them.

[1] The island shores are eroded by the persistent pounding of the ocean waves. This is perceived as ‘sinking’ or as ‘sea level rise,’ but the upward creep of the waters is due to island soil being washed away.

The polar ice caps are fine too

Still another phenomenon we tend to exaggerate is the melting of the polar caps. The quantity of ice in the Arctic has not gone down for 10 years. One may well witness, from one year to the other, ice level fluctuations, but, on average, that level has remained constant. Right after the Little Ice Age, since the temperature went up, the Arctic started to melt; but the ice level in the Arctic finally settled down. Besides, ice has been expanding in Antarctica over the last 30 years and, similarly, we observe in Greenland that the quantity of ice increased by 112 million cubic kilometers last year. On a global scale, glaciers account for peanuts, with most of the ice being located in Antarctica and so on.

Extreme weather events are actually decreasing

From storms to tornados, extreme events are going down all around the world and, when they occur, their level is much lower, too. As explained by MIT physicist Richard Lindzen, the reduction of the temperature differential between the north hemisphere and the equatorial part of our planet makes cyclonic energy much smaller: the importance and frequency of extreme events thus tend to decrease.

Recent warming is modest – much smaller than the alarmists’ various computer models predicted

If you look at satellite data and weather balloon measurements, you then note that the temperature rise around the world is relatively modest, that it is much lower than the rise that is predicted for us by authorities, and that these predictions rely on calculations that are highly uncertain. This is because the simulation inputs cannot take into account past temperatures, for which there is no precision data [1], except by subjectively adjusting x, y, z data that are not always known. The recent temperature spikes measured by satellites and balloons are part of a classic natural phenomenon which is called El Niño. This short-term phenomenon consists of a return of the very warm waters at the surface of the equatorial Pacific Ocean. The heat thus liberated in the atmosphere pushes up the global temperature and CO2 plays no role in that process.

Claims by alarmist ‘experts’ that 2016 was that ‘hottest year ever’ are pure balderdash

The World Meteorological Organization – another emanation of the United Nations, which, like the IPCC, is an intergovernmental forum – has declared 2016 the warmest year on record. Knowing that 2016 is supposedly hotter by 0.02°C than 2015, and that the margin of error on this value is 0.1°C, we see the absurdity of this statement. For those who don’t understand, this means that the variation in temperature could be +0.12°C (global warming) or -0.08°C (global cooling). In short, we can’t say anything, and the WMO has simply lost its mind.

No, ‘climate change’ hasn’t led to an increase in tropical diseases

Climate-related diseases are relatively rare; and even malaria does not directly depend on the climate, but rather on the way we enable the parasite to reproduce and the mosquito to flourish in the place where we are located. If you find yourself in a swampy area, the odds you will get malaria are high; if you have drained the system and you no longer have that wetland, the odds you will catch the disease are very low. In the end, automatically blaming the resurgence of some disease on climate change comes down to removing the personal responsibility from the people involved: such as denying that their refusal of vaccinations, for instance, or their lack of hygiene, may be part of the problem.

Again, CO2 is greening the planet. And that’s a good thing. So stop demonizing it!

Present deserts, far from expanding, are receding; and they are receding thanks to the higher quantity of CO2 available in the air. It turns out that greenhouse operators voluntarily inject three times as much CO2 into their commercial greenhouses as is present in the atmosphere. The result is that plants grow faster and bigger, are more resistant to diseases and destructive insects, and photosynthesize so much more efficiently that they consume less water. Similarly, the rise of the CO2 level in the atmosphere means plants need less water, so they can afford to colonize arid regions.

From: FJB11/15/2017 1:46:28 PM
   of 414

To: FJB who wrote (328)11/17/2017 8:37:13 PM
From: FJB
   of 414
We operate at banana scale.

From: FJB11/27/2017 6:47:37 PM
   of 414

DNA, Nature’s Best Storage Medium, Headed for the Data Center

Inside Microsoft’s effort to solve the world’s data storage capacity problem.
The continued growth in the information we're trying to store (from IoT sensor data to log files and photos) has already outpaced the capacity of some systems. CERN stores only a tenth of the 15PB of data it gets from the Large Hadron Collider each year on disk.

For many organizations, capacity may not be such a large problem; hard drive technology continues to improve, and much of the world’s data is still stored on tape. The storage issue we haven’t yet tackled is longevity – and that’s where storing data on artificial DNA may really shine.

A smear of DNA small enough to scrape up with your fingernail could store exabytes of information, but it’s the fact that it remains readable after thousands of years that really makes it interesting. Paper and microfilm can last 500 years or more, but digital media are hard to keep for even a few decades. Accelerated testing at higher temperatures shows that DNA will stay readable for 2,000 years if it’s stored at ten degrees centigrade (and for up to 2 million years if it’s frozen); encapsulating it in spheres of silica means that humidity doesn’t affect it.

The format won’t get out of date like digital storage either. “We'll always be interested in reading DNA so we can be sure we'll always have the capability of reading it in the future -- because if we don't we'll have a real problem,” Karin Strauss, senior researcher in computer architecture at Microsoft Research and associate professor at the Department of Computer Science and Engineering at University of Washington, told Data Center Knowledge.

In the lab, researchers have been able to write and read text, photos, videos, and other files with 100 percent accuracy, and last year Microsoft bought ten million DNA molecules from Twist Bioscience to experiment with. But what does it take to turn that research into a real storage system, and when might you think about putting one in your data center?

Storing data in DNA means turning the bits in a file into the four bases in DNA -- mapping 00 to A, 01 to C, 10 to G, and 11 to T every time -- then synthesizing DNA molecules with those bases in the right order. Reading it means putting those molecules in a DNA sequencer, reading out the sequence of bases, and turning that back into bits. Today, there are some manual steps in that process, Strauss explained.
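The fixed two-bit mapping can be sketched in a few lines (a toy encoder; real DNA codecs add constraints, such as avoiding long runs of a single base, that this ignores):

```python
# Toy sketch of the 2-bits-per-base mapping described in the article:
# 00 -> A, 01 -> C, 10 -> G, 11 -> T. Real encoders add biochemical
# constraints (e.g., no long homopolymer runs) that this omits.
BIT_PAIR_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BIT_PAIR = {b: p for p, b in BIT_PAIR_TO_BASE.items()}

def bytes_to_bases(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BIT_PAIR_TO_BASE[bits[i:i + 2]]
                   for i in range(0, len(bits), 2))

def bases_to_bytes(seq: str) -> bytes:
    bits = "".join(BASE_TO_BIT_PAIR[b] for b in seq)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = bytes_to_bases(b"hi")   # 0x68 0x69 -> "CGGACGGC"
assert bases_to_bytes(strand) == b"hi"
```

Each byte becomes exactly four bases, so capacity scales linearly with strand length.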

“There's software to do the first step of translating the bits to what bases we want; the next step is manufacturing the molecules. There’s a manual interface there, because we send Twist the file, and we get back the molecules; internally they have an automated process but they still need somebody to remove the DNA from the machine and ship us the molecules. The sequencers are all automated; you throw the molecules in, and it spits out the data. And then we have the rest of the data pipeline to decode the data.”

Microsoft and Twist are working with the University of Washington to turn that into a fully automated system. Strauss predicted the end result would be something that looked like a tape library, complete with the same kind of robotic arm (and maybe with cartridges of reagents you change like toner in a laser printer). Depending on how much parallelism you need – which comes down to how much data you want to write or read at the same time – “that’s likely to look like a few racks in the data center,” she said.

Small as the DNA itself is, you can save more space by encapsulating more than one file in the same silica shell, which means chemically separating the DNA to get the file you want. Because sequencing is a batch process, you’re going to be reading back multiple files on the same sequencer anyway. Files are also encoded on multiple sequences of DNA, so the sequences are clustered together to get the full result. There’s a sequence number on each molecule; think of it like numbering the different sections that make up a large ZIP archive.
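The per-molecule sequence number works roughly like this sketch (the chunk size and layout are illustrative assumptions, not Microsoft's actual format):

```python
# Illustrative sketch: split a payload into fixed-size chunks, pairing each
# with a sequence number so the file can be reassembled after sequencing
# returns the molecules in arbitrary order. Sizes are assumptions.
import random

CHUNK = 16  # bases of payload per molecule (real strands carry more)

def split_with_indices(payload: str, chunk: int = CHUNK):
    return [(i, payload[i * chunk:(i + 1) * chunk])
            for i in range((len(payload) + chunk - 1) // chunk)]

def reassemble(indexed_chunks):
    # Sequencers return reads unordered; sort by the embedded index.
    return "".join(part for _, part in sorted(indexed_chunks))

payload = "ACGT" * 12            # 48 bases -> 3 chunks
chunks = split_with_indices(payload)
random.shuffle(chunks)           # simulate unordered reads
assert reassemble(chunks) == payload
```

This is the same idea as numbering the sections of a large ZIP archive, as the article puts it, only applied per molecule.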

Reading DNA destroys it, but that’s only because that’s what the medical and biotech applications need. “When you sequence DNA, you don't want to reuse it, you don't want contamination; you just throw the whole thing away including all the reagents.” It would be possible to recover the DNA instead, but it's probably easier just to make more copies with the standard polymerase chain reaction, which is already used in the process to make sure you have enough copies of the different sequences to read; picking which sequences to copy gives you random access to the right section of a large file.

Those copies can introduce errors, so the system has error correction built in; in fact, that’s how it’s going to scale up from the megabytes that have been stored and decoded so far to the petabytes it needs to be able to deal with. “We are engineering the system, which allows us to cut some corners; we can tolerate more errors, which is what we're counting on to be able to improve these processes. We’ll make the processes more parallel, and they may become more imperfect, both the reading and the writing, but we can tolerate and compensate for that in other ways. We have control over the sequences, so we can encode the data in a way that can make it easier for us to decode them on the way out.”

The overhead of error correction is currently around 15 percent; “That's pretty manageable; ECC in servers is 12.5 percent, so this isn’t that far off.”
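The 12.5 percent server-ECC figure follows from standard SEC-DED memory, which stores 8 check bits alongside every 64 data bits (our gloss, not the article's):

```python
# Standard server ECC (SEC-DED DIMMs) stores 8 check bits per 64 data bits,
# i.e. a 72-bit word carrying 64 bits of payload -- hence 12.5%.
ecc_overhead = 8 / 64            # 0.125
dna_overhead = 0.15              # codec overhead quoted in the article
print(f"ECC: {ecc_overhead:.1%}  DNA codec: {dna_overhead:.0%}")
```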

How Big and How Soon?

The costs of DNA sequencing and synthesis are dropping faster than the price of digital media, especially when you factor in needing to rewrite tapes every five to ten years, but it’s still going to make sense only when you need to store data for a long time rather than a few years. Cloud providers will be interested, but so will some organizations that run their own data centers.

“The type of workload is definitely archival, at least at first,” Strauss said. “The type of users we've been seeing where this would make sense are where you need to keep the data by mandate, like hospitals and clinics, or there's legal data, pension data. They’re applications where you want to keep the data for a long time and put it away and not read it very repetitively. In the end, it’s bits you’re storing, and we can store any kind of bits.”

Video archiving is also a good fit, and even the way people look at photos on Facebook fits the model pretty well; every Halloween enough people look back at photos from the previous year for Facebook to spot the pattern. “That’s a perfect workload, because you could read them in advance, and by the time a user wants to look at it, it's already there.”

Currently the throughput of reading DNA isn’t that high. Of the two systems Strauss has worked on, one produces around 2 million reads in 24 hours (with most of the reads done in the first few hours), the other, more parallel system delivers around 400 million reads in 24 hours. But the density means you could get excellent bandwidth at a very low cost if you need to send the data a long distance, because you could fit an exabyte on something the size of a postcard.

“People ship hard drives today; in the future it might be DNA. You have trucks and planes moving hard drives around; it’s done that way because you get better throughput. With DNA you can expect it to be even better, because it’s a lot more compact, and you can easily make copies for distribution.”
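Shipping DNA implies striking effective bandwidth. As a back-of-the-envelope check (a one-exabyte postcard and a 24-hour courier, figures of our own choosing, not the article's):

```python
# Back-of-the-envelope: effective bandwidth of shipping 1 EB of DNA
# via a 24-hour courier. The quantities are our illustration.
exabyte_bits = 1e18 * 8          # 1 EB expressed in bits
seconds = 24 * 3600              # one-day shipment
bps = exabyte_bits / seconds
print(f"{bps / 1e12:.1f} Tbit/s")  # roughly 92.6 Tbit/s
```

That comfortably exceeds any long-haul network link, which is why moving dense media physically can beat transmitting the same data.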

If customers are interested, Strauss suggested we could see DNA storage in action relatively soon. “We think there is a good roadmap to getting this into operation, and we see no fundamental reasons why we wouldn’t be able to put it together. It's not going to be next year, but it's not going to be in ten years either; I think it will be somewhere in between.”

From: FJB12/26/2017 9:26:27 PM
   of 414
During record-breaking holiday, Echo Dot and Fire TV Stick with Alexa remote were #1 and #2 top sellers with “tens of millions of Alexa-enabled devices” sold — Echo Dot and Fire TV Stick with Alexa Voice Remote were the #1 and #2 top-selling products across all categories on Amazon

More: Business Insider, BGR, TechCrunch, ZDNet, Ubergizmo, Forbes, San Francisco Chronicle, GeekWire, CNBC, CNET, VentureBeat, Retail Dive, 500ish Words, Android Central, SiliconBeat, Android and Me, and Seeking Alpha

From: FJB1/20/2018 8:36:43 AM
   of 414
China's Quantum-Key Network, the Largest Ever, Is Officially Online
By Rafi Letzter, Staff Writer | January 19, 2018 10:56am ET
A figure from the letter shows how the Micius satellite transfers quantum keys across vast distances. Credit: Physical Review Letters

China has the quantum technology to perfectly encrypt useful signals over distances far vaster than anyone has ever accomplished, spanning Europe and Asia, according to a stunning new research letter.

Bits of information, or signals, pass through people's houses, the skies overhead and the flesh of human bodies every second of every day. They're television signals and radio, as well as private phone calls and data files.

Some of these signals are public, but most are private — encrypted with long strings of numbers known (presumably) only to the senders and receivers. Those keys are powerful enough to keep the secrets of modern society: flirty text messages, bank-account numbers and the passwords to covert databases. But they're brittle. A sufficiently determined person, wielding a sufficiently powerful computer, could break them.

"Historically, every advance in cryptography has been defeated by advances in cracking technology," Jian-Wei Pan, a researcher at the University of Science and Technology of China and author on this research letter, wrote in an email. "Quantum key distribution ends this battle."

Quantum keys are long strings of numbers — keys for opening encrypted files, just like the ones used in modern computers — but they're encoded in the physical states of quantum particles. That means they are protected not only by the limits of computers but also by the laws of physics.

Quantum keys cannot be copied. They can encrypt transmissions between otherwise classical computers. And no one can steal them — a law of quantum mechanics states that once a subatomic particle is observed, poof, it's altered — without alerting the sender and receiver to the dirty trick.

And now, according to a new letter due for publication today (Jan. 19) in the journal Physical Review Letters, quantum keys can travel via satellite, encrypting messages sent between cities thousands of miles apart.

The researchers quantum-encrypted images by encoding them as strings of numbers based on the quantum states of photons and sent them across distances of up to 4,722 miles (7,600 kilometers) between Beijing and Vienna — shattering the previous record of 251 miles (404 km), also set in China. Then, for good measure, on Sept. 29, 2017, they held a 75-minute videoconference between researchers in the two cities, also encrypted via quantum key. (This videoconference was announced previously, but the full details of the experiment were reported in this new letter.)

The satellite

This long-distance quantum-key distribution is yet another achievement of the Chinese satellite Micius, which was responsible for smashing a number of quantum-networking records in 2017. Micius is a powerful photon relay and detector. Launched into low Earth orbit in 2016, it uses its fine lasers and detectors to send and receive packets of quantum information — basically, information about the quantum state of a photon — across vast stretches of space and atmosphere.

"Micius is the brightest star in the sky when it is passing over the station," Pan wrote to Live Science. "The star is [as] green as the beacon laser [that Micius uses to aim photons at the ground]. If there is some dust in the air, you will [also] see a red light line pointing to the satellite. No sound comes from space. Maybe there are some raised by the movement of the ground station."

Just about any time Micius does anything, it blows previous records out of the water. That's because previous quantum networks have relied on passing photons around on the ground, through the air between buildings or along fiber-optic cables. And there are limits to line of sight on the ground, and to how far a fiber-optic cable can carry a photon without losing it.

In June 2017, Micius researchers announced that they had sent two "entangled" photons to ground stations 745 miles (1,200 km) apart. (When a pair of photons gets entangled, they affect each other even when separated by large distances.) A month later, in July, they announced that they had teleported a packet of quantum information 870 miles (1,400 km) from Tibet into orbit, meaning the quantum state of a particle had been beamed directly from a particle on the ground to its twin in space.

Both of these achievements were major steps on the road to real-world quantum-key-encrypted networks.

The new letter announces that the theory has been put into action.

Micius first encrypted two photos: a small image of the Micius satellite itself and a portrait of the early quantum physicist Erwin Schrödinger. Then it encrypted that long video call. No similar act of quantum-key distribution has ever been achieved over such a distance.

Already, Pan said, Micius is ready to use to encrypt more important information.

How does a quantum key work?

Quantum-key distribution is essentially a creative application of Heisenberg's uncertainty principle, one of the foundational principles of quantum mechanics. As Live Science has previously reported, the uncertainty principle states that it's impossible to fully know the quantum state of a particle — and, crucially, that in observing part of that state, a detector forever wipes out the other relevant information that particle contains.

That principle turns out to be very useful for encoding information. As the Belgian cryptographer Gilles Van Assche wrote in his 2006 book "Quantum Cryptography and Secret-Key Distillation," a sender and receiver can use the quantum states of particles to generate strings of numbers. A computer can then use those strings to encrypt some bit of information, like a video or a text, which it then sends over a classical relay like the internet connection you're using to read this article.
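The classical half of that scheme is simple: once sender and receiver share a secret string of bits, a one-time-pad XOR makes the message unreadable without that exact key. Here is a minimal Python sketch; the `secrets`-generated key merely stands in for a quantum-derived one, and the names are illustrative, not from the letter:

```python
import secrets

def xor_otp(data: bytes, key: bytes) -> bytes:
    """XOR a message with a same-length secret key (one-time pad)."""
    assert len(key) >= len(data), "one-time pad requires a key at least as long as the message"
    return bytes(b ^ k for b, k in zip(data, key))

message = b"flirty text message"
# Stand-in for a quantum-distributed key; in QKD this string would come
# from measuring photons, not from a pseudorandom generator.
key = secrets.token_bytes(len(message))

ciphertext = xor_otp(message, key)   # sent over an ordinary classical channel
recovered = xor_otp(ciphertext, key) # XOR with the same key undoes the encryption
```

Because each key bit is used once and never reused, an attacker who sees only the ciphertext learns nothing about the message — which is why the security of the whole system reduces to distributing the key safely.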

But it doesn't send the encryption key over that relay. Instead, it sends those particles across a separate quantum network, Van Assche wrote.

In the case of Micius, that means sending photons, one at a time, through the atmosphere. The receiver can then read the quantum states of those photons to determine the quantum key and use that key to decrypt the classical message.

If anyone else tried to intercept that message, though, they would leave telltale signs — missing packets of the key that never made it to the receiver.
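That eavesdropper-detection logic can be simulated classically. The toy sketch below (an illustration of the well-known BB84 scheme, not Micius's actual implementation) shows the core idea: sender and receiver each pick random measurement bases, keep only the positions where their bases agree, and compare a sample of those bits. An eavesdropper who measures in random bases corrupts roughly a quarter of the sifted key, betraying her presence:

```python
import secrets

def random_bits(n):
    return [secrets.randbelow(2) for _ in range(n)]

def bb84(n, eavesdrop=False):
    # Alice picks random bits and random bases (0 = rectilinear, 1 = diagonal).
    alice_bits = random_bits(n)
    alice_bases = random_bits(n)

    # Each photon "in flight" carries (preparation basis, encoded bit).
    photons = list(zip(alice_bases, alice_bits))

    if eavesdrop:
        # Eve measures each photon in a random basis; measuring in the wrong
        # basis destroys the original state and re-emits a random bit.
        eve_bases = random_bits(n)
        photons = [
            (b, bit) if b == eb else (eb, secrets.randbelow(2))
            for (b, bit), eb in zip(photons, eve_bases)
        ]

    # Bob measures in his own random bases; a basis mismatch yields a random bit.
    bob_bases = random_bits(n)
    bob_bits = [
        bit if basis == bb else secrets.randbelow(2)
        for (basis, bit), bb in zip(photons, bob_bases)
    ]

    # Sifting: keep only positions where Alice's and Bob's bases agree.
    sifted = [
        (a, b)
        for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
        if ab == bb
    ]
    errors = sum(a != b for a, b in sifted)
    return len(sifted), errors

kept, errors = bb84(20_000)                      # errors == 0: undisturbed photons agree
kept_e, errors_e = bb84(20_000, eavesdrop=True)  # Eve induces an error rate near 25%
```

With no eavesdropper, every sifted bit matches; Eve's random-basis measurements push the error rate to about 25 percent, far above the 1 to 2 percent loss the Micius team reports on a clear day.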

Of course, no network is perfect, especially not one based on shooting individual photons across miles of space. As the Micius researchers wrote, the network typically loses 1 or 2 percent of its key on a clear day. But that's well within what Micius and the base station can work together to edit out of the key, using some fancy mathematics. Even if an attacker did intercept and wreck a much larger chunk of the transmission, whatever they didn't catch would still be clean — shorter, but still secure enough to encrypt transmissions in a pinch.

The connection between Micius and Earth isn't perfectly secure yet, however. As the team of Chinese and Austrian authors wrote, the flaw in the network design is the satellite itself. Right now, base stations in each linked city receive different quantum keys from the satellite, which are multiplied together and then disentangled. That system works fine, as long as the communicators trust that no secret squad of nefarious astronauts has broken into Micius itself to read the quantum key at the source. The next step toward truly perfect security, they wrote, is to distribute quantum keys from satellites via entangled photons — keys the satellites would manufacture and distribute, but never themselves be able to read.
