
Technology Stocks: New Technology


To: Glenn Petersen who wrote (326) 11/13/2017 3:14:52 PM
From: FJB
   of 414
 
It makes me sad that people could be as dumb as this writer. CO2 is beneficial to planet Earth. We won't be able to feed 10 billion people without the continued greening of our planet enabled by CO2.

=========================================================


The István Markó Interview: The Science.
breitbart.com

AP/Pablo Martinez Monsivais

Maybe the biggest of all the lies put out by the global warming scaremongers is that the science is on their side. No it isn’t. And if you’re in any doubt at all you should read this interview with the brilliant scientist István Markó. It tells you all you need to know about the science of global warming.
Dr. Markó, who sadly died earlier this year aged only 61, was a professor and researcher in organic chemistry at the Université Catholique de Louvain, Belgium’s largest French-speaking university. More importantly for the purposes of this interview, he was one of the world’s most outspoken and well-informed climate skeptics, who contributed to several articles on the subject for Breitbart News.

Before he died, he gave an extensive interview to the French journalist Grégoire Canlorbe. Here are highlights of the English translation. As you’ll see, he doesn’t pull his punches.

CO2 is not – and has never been – a poison

Each of our exhalations, each of our breaths, emits CO2 at a concentration far higher than that of the atmosphere (some 40,000 ppm or more); and it is very clear that the air we exhale does not kill anyone standing in front of us. What must be understood, besides, is that CO2 is the elementary food of plants. Without CO2 there would be no plants, and without plants there would be no oxygen and therefore no humans.

Plants love CO2. That’s why the planet is greening

Plants need CO2, water, and daylight. Those are the ingredients of photosynthesis, which generates the sugars that provide them with staple food and building blocks. That fundamental fact of botany is one of the primary reasons why anyone who is sincerely committed to the preservation of the “natural world” should abstain from demonizing CO2. Over the last 30 years, there has been a gradual increase in the CO2 level. But what is also observed is that, despite deforestation, the planet’s vegetation has grown by about 20 percent. Nature lovers largely owe this expansion of the planet’s vegetation to the increase in the concentration of CO2 in the atmosphere.

There have been periods where the CO2 concentration was many times higher than now. Life thrived.

During the Jurassic, Triassic, and so on, the CO2 level rose to values sometimes of the order of 7,000, 8,000, or 9,000 ppm, which considerably exceeds the paltry 400 ppm that we have today. Not only did life exist in those far-off times when CO2 was present in such large concentrations in the atmosphere, but plants such as ferns commonly attained heights of 25 meters. Conversely, far from benefiting current vegetation, reducing the presence of CO2 in the atmosphere would be likely to compromise the health, and even the survival, of numerous plants. To fall below the threshold of 280 or 240 ppm would plainly lead to the extinction of a large variety of our plant species.

Animals need CO2 too. And by the way – forests are not the ‘lungs of the earth’…

In addition, our relentless crusade to reduce CO2 could end up harming nature, as plants are not the only organisms whose nutrition is based on CO2. Phytoplankton species also feed on CO2, using carbon from CO2 as a building unit and releasing oxygen. By the way, it is worth remembering that roughly 70 percent of the oxygen present today in the atmosphere comes from phytoplankton, not trees. Contrary to common belief, it is not the forests, but the oceans, that constitute the “lungs” of the earth.

It is not true that CO2 has a major greenhouse effect. Reports of its influence have been exaggerated

It is worth remembering here too that CO2 is a minor gas. Today it represents only 0.04 percent of the composition of the air, and its greenhouse effect is assigned a value of 1. The major greenhouse gas in the atmosphere is water vapor, which is ten times more potent than CO2 in its greenhouse effect and is present in a proportion of about 2 percent of the atmosphere. Those facts are, in principle, taught at school and at university, yet CO2 still gets incriminated by a dirty trick: its warming effect is presented as minor in itself but exacerbated, through feedback loops, by the other greenhouse gases.

Climate change is natural

Over the last 12,000 years, what we have witnessed is an oscillation between warm and cold periods, and thus periods of rising and declining sea levels. Incontestably, sea and ocean levels have been on the rise since the end of the Little Ice Age, which lasted from approximately the beginning of the 14th century until the end of the 19th century. At the end of that period, global temperatures started to rise. That being said, the recorded rise is 0.8 degrees Celsius and is, therefore, nothing extraordinary. If the temperature goes up, ocean water obviously expands and some glaciers recede. This is something glaciers have always done, and not something specific to our time.

Don’t worry about shrinking glaciers. We’ve been here before…

In Ancient Roman times, glaciers were much smaller than the ones we know nowadays. I invite the reader to look at the documents dating back to the days of Hannibal, who managed to cross the Alps with his elephants because he did not encounter ice on his way to Rome (except during a snow storm just before arriving on the Italian plain). Today, you could no longer make Hannibal’s journey. He proved to be capable of such an exploit precisely because it was warmer in Roman times.

Sea level rise is normal

Sea levels are currently on the rise; but this is an overestimated phenomenon. The recorded rise is 1.5 millimeters per year, namely 1.5 cm every ten years, and is, therefore, not dramatic at all. Indeed, it does happen that entire islands do get engulfed; but in 99 percent of the cases, that is due to a classic erosion phenomenon [1] and not to rising sea levels. As far as the Italian city of Venice is concerned, the fact it has been faced with water challenges is not due to any rise of the lagoon level and is just the manifestation of the sad reality that “the City of the Doges” is sinking under its weight on the marshland. Once again, the global sea and ocean levels are rising; but the threat effectively represented by that phenomenon is far from being tangible. I note that the Tuvalu islands, whose engulfment was previously announced as imminent, not only have not been engulfed, but have seen their own land level rise with respect to that of waters around them.

[1] The island shores are eroded by the persistent pounding of the ocean waves. This is perceived as ‘sinking’ or as ‘sea level rise,’ but the upward creep of the waters is due to island soil being washed away.

The polar ice caps are fine too

Still another phenomenon we tend to exaggerate is the melting of the polar caps. The quantity of ice in the Arctic has not gone down for 10 years. One may well witness, from one year to the other, ice level fluctuations, but, on average, that level has remained constant. Right after the Little Ice Age, since the temperature went up, the Arctic started to melt; but the ice level in the Arctic finally settled down. Besides, ice has been expanding in Antarctica over the last 30 years and, similarly, we observe in Greenland that the quantity of ice increased by 112 million cubic kilometers last year. On a global scale, glaciers account for peanuts, with most of the ice being located in Antarctica and so on.

Extreme weather events are actually decreasing

From storms to tornadoes, extreme events are declining all around the world and, when they occur, their intensity is much lower, too. As explained by MIT physicist Richard Lindzen, the reduction of the temperature differential between the northern hemisphere and the equatorial part of our planet makes cyclonic energy much smaller: the magnitude and frequency of extreme events thus tend to decrease.

Recent warming is modest – much smaller than the alarmists’ various computer models predicted

If you look at satellite data and weather balloon measurements, you then note that the temperature rise around the world is relatively modest, that it is much lower than the rise that is predicted for us by authorities, and that these predictions rely on calculations that are highly uncertain. This is because the simulation inputs cannot take into account past temperatures, for which there are no precise data [1], except by subjectively adjusting x, y, z data that are not always known. The recent temperature spikes measured by satellites and balloons are part of a classic natural phenomenon which is called El Niño. This short-term phenomenon consists of a return of the very warm waters at the surface of the equatorial Pacific Ocean. The heat thus liberated in the atmosphere pushes up the global temperature and CO2 plays no role in that process.

Claims by alarmist ‘experts’ that 2016 was the ‘hottest year ever’ are pure balderdash

The World Meteorological Organization – another emanation of the United Nations and, like the IPCC, an intergovernmental forum – declared 2016 the warmest year on record. Knowing that 2016 is supposedly hotter by 0.02°C than 2015, and that the margin of error on this value is 0.1°C, we see the absurdity of this statement. For those who don’t understand, this means that the variation in temperature could be +0.12°C (global warming) or -0.08°C (global cooling). In short, we can’t say anything, and the WMO has simply lost its mind.

No, ‘climate change’ hasn’t led to an increase in tropical diseases

Climate-related diseases are relatively rare; and even malaria does not directly depend on the climate, but rather on the way we enable the parasite to reproduce and the mosquito to flourish in the place where we are located. If you find yourself in a swampy area, the odds you will get malaria are high; if you have drained the system and you no longer have that wetland, the odds you will catch the disease are very low. In the end, automatically blaming the resurgence of some disease on climate change comes down to removing the personal responsibility from the people involved: such as denying that their refusal of vaccinations, for instance, or their lack of hygiene, may be part of the problem.

Again, CO2 is greening the planet. And that’s a good thing. So stop demonizing it!

Present deserts, far from expanding, are receding; and they are receding due to the higher quantity of CO2 available in the air. It turns out that commercial greenhouse operators deliberately inject about three times as much CO2 into their greenhouses as is present in the atmosphere. The result is that plants grow faster and bigger, that they are more resistant to diseases and destructive insects, and that their photosynthesis is far more efficient, so they consume less water. Similarly, the rise of the CO2 level in the atmosphere makes plants need less water, so they can afford to colonize arid regions.



From: FJB 11/15/2017 1:46:28 PM
   of 414
 



To: FJB who wrote (328) 11/17/2017 8:37:13 PM
From: FJB
   of 414
 
We operate at banana scale.



From: FJB 11/27/2017 6:47:37 PM
   of 414
 

DNA, Nature’s Best Storage Medium, Headed for the Data Center


datacenterknowledge.com

Inside Microsoft’s effort to solve the world’s data storage capacity problem.
The continued growth in information we’re trying to store (from IoT sensor data to log files and photos) has already outpaced the capacity of some systems. CERN can keep on disk only a tenth of the 15PB of data it gets from the Large Hadron Collider each year.

For many organizations, capacity may not be such a large problem; hard drive technology continues to improve, and much of the world’s data is still stored on tape. The storage issue we haven’t yet tackled is longevity – and that’s where storing data on artificial DNA may really shine.

A smear of DNA small enough to scrape up with your fingernail could store exabytes of information, but it’s the fact that it remains readable after thousands of years that really makes it interesting. Paper and microfilm can last 500 years or more, but digital media are hard to keep for even a few decades. Accelerated testing at higher temperatures shows that DNA will stay readable for 2,000 years if it’s stored at ten degrees centigrade (and for up to 2 million years if it’s frozen); encapsulating it in spheres of silica means that humidity doesn’t affect it.

The format won’t get out of date like digital storage either. “We'll always be interested in reading DNA so we can be sure we'll always have the capability of reading it in the future -- because if we don't we'll have a real problem,” Karin Strauss, senior researcher in computer architecture at Microsoft Research and associate professor at the Department of Computer Science and Engineering at University of Washington, told Data Center Knowledge.

In the lab, researchers have been able to write and read text, photos, videos, and other files with 100 percent accuracy, and last year Microsoft bought ten million DNA molecules from Twist Bioscience to experiment with. But what does it take to turn that research into a real storage system, and when might you think about putting one in your data center?


Storing data in DNA means turning the bits in a file into the four bases in DNA -- mapping 00 to A, 01 to C, 10 to G, and 11 to T every time -- then synthesizing DNA molecules with those bases in the right order. Reading it means putting those molecules in a DNA sequencer, reading out the sequence of bases, and turning that back into bits. Today, there are some manual steps in that process, Strauss explained.
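To make that mapping concrete, here is a minimal Python sketch of the fixed two-bits-per-base scheme described above (00 to A, 01 to C, 10 to G, 11 to T). It is an illustration only; the real pipeline layers sequence constraints and error-correcting codes on top of this step, which are omitted here.

# Minimal sketch of the two-bits-per-base mapping described in the article.
BIT_PAIR_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BIT_PAIR = {base: pair for pair, base in BIT_PAIR_TO_BASE.items()}

def bytes_to_bases(data: bytes) -> str:
    # Expand each byte into 8 bits, then map every pair of bits to one base.
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BIT_PAIR_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def bases_to_bytes(sequence: str) -> bytes:
    # Reverse step: read the bases back into bits, then regroup them into bytes.
    bits = "".join(BASE_TO_BIT_PAIR[base] for base in sequence)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

assert bytes_to_bases(b"Hi") == "CAGACGGC"
assert bases_to_bytes("CAGACGGC") == b"Hi"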

“There's software to do the first step of translating the bits to what bases we want; the next step is manufacturing the molecules. There’s a manual interface there, because we send Twist the file, and we get back the molecules; internally they have an automated process but they still need somebody to remove the DNA from the machine and ship us the molecules. The sequencers are all automated; you throw the molecules in, and it spits out the data. And then we have the rest of the data pipeline to decode the data.”


Microsoft and Twist are working with the University of Washington to turn that into a fully automated system. Strauss predicted the end result would be something that looked like a tape library, complete with the same kind of robotic arm (and maybe with cartridges of reagents you change like toner in a laser printer). Depending on how much parallelism you need – which comes down to how much data you want to write or read at the same time – “that’s likely to look like a few racks in the data center” she said.

Small as the DNA itself is, you can save more space by encapsulating more than one file in the same silica shell, which means chemically separating the DNA to get the file you want. Because sequencing is a batch process, you’re going to be reading back multiple files on the same sequencer anyway. Files are also encoded on multiple sequences of DNA, so the sequences are clustered together to get the full result. There’s a sequence number on each molecule; think of it like numbering the different sections that make up a large ZIP archive.
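As a rough illustration of that numbering idea (the chunk size and index format here are arbitrary assumptions, not Microsoft's actual layout), a file can be split into indexed pieces so it can be reassembled even though sequencing returns the molecules in no particular order:

def split_into_sequences(payload: bytes, chunk_size: int = 16):
    # Tag each chunk with its position, like numbering the sections of a large ZIP archive.
    count = (len(payload) + chunk_size - 1) // chunk_size
    return [(i, payload[i * chunk_size:(i + 1) * chunk_size]) for i in range(count)]

def reassemble(chunks) -> bytes:
    # Sequencing is a batch process, so chunks come back unordered; sort by index first.
    return b"".join(data for _, data in sorted(chunks))

chunks = split_into_sequences(b"archival payload spread across many molecules")
assert reassemble(reversed(chunks)) == b"archival payload spread across many molecules"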

Reading DNA destroys it, but that’s only because that’s what the medical and biotech applications need. “When you sequence DNA, you don't want to reuse it, you don't want contamination; you just throw the whole thing away including all the reagents.” It would be possible to recover the DNA instead, but it's probably easier just to make more copies with the standard polymerase chain reaction, which is already used in the process to make sure you have enough copies of the different sequences to read; picking which sequences to copy gives you random access to the right section of a large file.

Those copies can introduce errors, so the system has error correction built in; in fact, that’s how it’s going to scale up from the megabytes that have been stored and decoded so far to the petabytes it needs to be able to deal with. “We are engineering the system, which allows us to cut some corners; we can tolerate more errors, which is what we're counting on to be able to improve these processes. We’ll make the processes more parallel, and they may become more imperfect, both the reading and the writing, but we can tolerate and compensate for that in other ways. We have control over the sequences, so we can encode the data in a way that can make it easier for us to decode them on the way out.”

The overhead of error correction is currently around 15 percent; “That's pretty manageable; ECC in servers is 12.5 percent, so this isn’t that far off.”

How Big and How Soon?

The costs of DNA sequencing and synthesis are dropping faster than the price of digital media, especially when you factor in needing to rewrite tapes every five to ten years, but it’s still going to make sense only when you need to store data for a long time rather than a few years. Cloud providers will be interested, but so will some organizations that run their own data centers.

”The type of workload is definitely archival, at least at first,” Strauss said. “The type of users we've been seeing where this would make sense are where you need to keep the data by mandate, like hospitals and clinics, or there's legal data, pension data. They’re applications where you want to keep the data for a long time and put it away and not read it very repetitively. In the end, it’s bits you’re storing, and we can store any kind of bits.”


Video archiving is also a good fit, and even the way people look at photos on Facebook fits the model pretty well; every Halloween enough people look back at photos from the previous year for Facebook to spot the pattern. “That’s a perfect workload, because you could read them in advance, and by the time a user wants to look at it, it's already there.”

Currently the throughput of reading DNA isn’t that high. Of the two systems Strauss has worked on, one produces around 2 million reads in 24 hours (with most of the reads done in the first few hours), the other, more parallel system delivers around 400 million reads in 24 hours. But the density means you could get excellent bandwidth at a very low cost if you need to send the data a long distance, because you could fit an exabyte on something the size of a postcard.
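For a sense of scale (the 24-hour delivery time below is an assumption for illustration, not a figure from the article): physically moving one exabyte in a day works out to an effective bandwidth of roughly 10^18 bytes / 86,400 s ≈ 1.2 × 10^13 bytes per second, or about 93 terabits per second, far beyond any long-haul network link, which is why shipping dense media can beat transmitting the data.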

“People ship hard drives today; in the future it might be DNA. You have trucks and planes moving hard drives around; it’s done that way because you get better throughput. With DNA you can expect it to be even better, because it’s a lot more compact, and you can easily make copies for distribution.”

If customers are interested, Strauss suggested we could see DNA storage in action relatively soon. “We think there is a good roadmap to getting this into operation, and we see no fundamental reasons why we wouldn’t be able to put it together. It's not going to be next year, but it's not going to be in ten years either; I think it will be somewhere in between.”



From: FJB 12/26/2017 9:26:27 PM
   of 414
 
During record-breaking holiday, Echo Dot and Fire TV Stick with Alexa remote were #1 and #2 top sellers with “tens of millions of Alexa-enabled devices” sold — Echo Dot and Fire TV Stick with Alexa Voice Remote were the #1 and #2 top-selling products across all categories on Amazon




From: FJB 1/20/2018 8:36:43 AM
   of 414
 
China's Quantum-Key Network, the Largest Ever, Is Officially Online
By Rafi Letzter, Staff Writer | January 19, 2018 10:56am ET
A figure from the letter shows how the Micius satellite transfers quantum keys across vast distances. Credit: Physical Review Letters

China has the quantum technology to perfectly encrypt useful signals over distances far vaster than anyone has ever accomplished, spanning Europe and Asia, according to a stunning new research letter.

Bits of information, or signals, pass through people's houses, the skies overhead and the flesh of human bodies every second of every day. They're television signals and radio, as well as private phone calls and data files.

Some of these signals are public, but most are private — encrypted with long strings of numbers known (presumably) only to the senders and receivers. Those keys are powerful enough to keep the secrets of modern society: flirty text messages, bank-account numbers and the passwords to covert databases. But they're brittle. A sufficiently determined person, wielding a sufficiently powerful computer, could break them.

"Historically, every advance in cryptography has been defeated by advances in cracking technology," Jian-Wei Pan, a researcher at the University of Science and Technology of China and author on this research letter, wrote in an email. "Quantum key distribution ends this battle."

Quantum keys are long strings of numbers — keys for opening encrypted files just like the ones used in modern computers — but they're encoded in the physical states of quantum particles. That means they are protected not only by the limits of computers but the laws of physics.

Quantum keys cannot be copied. They can encrypt transmissions between otherwise classical computers. And no one can steal them — a law of quantum mechanics states that once a subatomic particle is observed, poof, it's altered — without alerting the sender and receiver to the dirty trick.

And now, according to a new letter due for publication today (Jan. 19) in the journal Physical Review Letters, quantum keys can travel via satellite, encrypting messages sent between cities thousands of miles apart.

The researchers quantum-encrypted images by encoding them as strings of numbers based on the quantum states of photons and sent them across distances of up to 4,722 miles (7,600 kilometers) between Beijing and Vienna — shattering the previous record of 251 miles (404 km), also set in China. Then, for good measure, on Sept. 29, 2017, they held a 75-minute videoconference between researchers in the two cities, also encrypted via quantum key. (This videoconference was announced previously, but the full details of the experiment were reported in this new letter.)

The satellite

This long-distance quantum-key distribution is yet another achievement of the Chinese satellite Micius, which was responsible for smashing a number of quantum-networking records in 2017. Micius is a powerful photon relay and detector. Launched into low Earth orbit in 2016, it uses its fine lasers and detectors to send and receive packets of quantum information — basically, information about the quantum state of a photon — across vast stretches of space and atmosphere.

"Micius is the brightest star in the sky when it is passing over the station," Pan wrote to Live Science. "The star is [as] green as the beacon laser [that Micius uses to aim photons at the ground]. If there is some dust in the air, you will [also] see a red light line pointing to the satellite. No sound comes from space. Maybe there are some raised by the movement of the ground station."

Just about any time Micius does anything, it blows previous records out of the water. That's because previous quantum networks have relied on passing photons around on the ground, using the air between buildings or fiber optic cables. And there are limits to line-of-sight on the ground, or how far a fiber-optic cable will transfer a photon without losing it.

In June 2017, Micius researchers announced that they had sent two "entangled" photons to ground stations 745 miles (1,200 km) apart. (When a pair of photons gets entangled, they affect each other even when separated by large distances.) A month later, in July, they announced that they had teleported a packet of quantum information 870 miles (1,400 km) from Tibet into orbit, meaning the quantum state of a particle had been beamed directly from a particle on the ground to its twin in space.

Both of these achievements were major steps on the road to real-world quantum-key-encrypted networks.

The new letter announces that the theory has been put into action.

Micius first encrypted two photos: a small image of the Micius satellite itself, then a photo of the early quantum physicist Erwin Schrödinger. Then it encrypted that long video call. No similar act of quantum-key distribution has ever been achieved over that kind of distance.

Already, Pan said, Micius is ready to use to encrypt more important information.

How does a quantum key work?

Quantum-key distribution is essentially a creative application of Heisenberg's uncertainty principle, one of the foundational principles of quantum mechanics. As Live Science has previously reported, the uncertainty principle states that it's impossible to fully know the quantum state of a particle — and, crucially, that in observing part of that state, a detector forever wipes out the other relevant information that particle contains.

That principle turns out to be very useful for encoding information. As the Belgian cryptographer Gilles Van Assche wrote in his 2006 book "Quantum Cryptography and Secret-Key Distillation," a sender and receiver can use the quantum states of particles to generate strings of numbers. A computer can then use those strings to encrypt some bit of information, like a video or a text, which it then sends over a classical relay like the internet connection you're using to read this article.

But it doesn't send the encryption key over that relay. Instead, it sends those particles across a separate quantum network, Van Assche wrote.

In the case of Micius, that means sending photons, one at a time, through the atmosphere. The receiver can then read the quantum states of those photons to determine the quantum key and use that key to decrypt the classical message.
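As a minimal sketch of how such a key is consumed once both ends hold it (assuming it is used as a one-time pad, the textbook pairing with quantum key distribution; the article does not spell out the cipher used on the Beijing-Vienna link):

import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

message = b"meet at the observatory"
quantum_key = secrets.token_bytes(len(message))  # stand-in for a key delivered by QKD

ciphertext = xor_bytes(message, quantum_key)          # travels over the ordinary internet
assert xor_bytes(ciphertext, quantum_key) == message  # receiver decrypts with the same key

The security rests entirely on the key staying secret and never being reused, which is exactly what the quantum channel is meant to guarantee.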

If anyone else tried to intercept that message, though, they would leave telltale signs — missing packets of the key that never made it to the receiver.

Of course, no network is perfect, especially not one based on shooting individual photons across miles of space. As the Micius researchers wrote, the network typically loses 1 or 2 percent of its key on a clear day. But that's well within what Micius and the base station can work together to edit out of the key, using some fancy mathematics. Even if an attacker did intercept and wreck a much larger chunk of the transmission, whatever they didn't catch would still be clean — shorter, but still secure enough to encrypt transmissions in a pinch.

The connection between Micius and Earth isn't perfectly secure yet, however. As the team of Chinese and Austrian authors wrote, the flaw in the network design is the satellite itself. Right now, base stations in each linked city receive different quantum keys from the satellite, which are multiplied together and then disentangled. That system works fine, as long as the communicators trust that no secret squad of nefarious astronauts has broken into Micius itself to read the quantum key at the source. The next step toward truly perfect security, they wrote, is to distribute quantum keys from satellites via entangled photons — keys the satellites would manufacture and distribute, but never themselves be able to read.
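The combining step the authors describe is commonly implemented as a bitwise XOR in trusted-relay schemes of this kind (an assumption here, since the article does not name the operation): the satellite holds one key per ground station and publicly announces their XOR, which lets each station recover the other's key while revealing neither to an outside listener. A minimal sketch:

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

key_beijing = bytes.fromhex("5f3c9a01d2")  # shared between the satellite and Beijing
key_vienna = bytes.fromhex("a77e10c49b")   # shared between the satellite and Vienna

broadcast = xor_bytes(key_beijing, key_vienna)          # announced publicly by the satellite
assert xor_bytes(broadcast, key_vienna) == key_beijing  # Vienna recovers Beijing's key

The weakness the researchers point to is visible here: the satellite itself knows both keys, so it has to be trusted, which is why the next step is entanglement-based distribution, where the satellite never learns the keys it hands out.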



From: FJB 1/30/2018 3:22:05 PM
   of 414
 
Mazda Says Its Next-Generation Gasoline Engine Will Run Cleaner Than an Electric Car


Mazda
Mazda is staking much of its future on the continued existence of the internal-combustion engine, with clever tech like spark-controlled compression ignition set to debut in Mazda's next-generation production-car engine, Skyactiv-X. But the automaker is already thinking even further into the internal-combustion future. Automotive News reports that Mazda is working on a new gas engine, Skyactiv-3, which the automaker says will be as clean as an electric vehicle.

Speaking at a tech forum in Tokyo, Mazda powertrain chief Mitsuo Hitomi said that the main goal with Skyactiv-3 is to increase the engine's thermal efficiency to roughly 56 percent. If achieved, that would make the Skyactiv engine the first internal-combustion piston engine to turn the majority of its fuel’s energy into useful work, rather than lose it to friction and heat.


To date, the most thermally efficient automotive internal combustion engine belongs to Mercedes-AMG's Formula 1 team, with an efficiency of 50 percent; AMG hopes the F1-derived engine in the Project One street-legal supercar will achieve 41-percent thermal efficiency, which would make it the most thermally efficient production-car engine in history. Automotive News says Mazda's 56-percent goal would represent a 27-percent improvement over current Mazda engines. Hitomi didn't provide a timeline for when Skyactiv-3 would reach production, nor did he specify how Mazda hopes to achieve such an improvement.
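For context, the two figures are consistent with each other (the baseline is an inference, not a number stated in the article): a 27 percent relative improvement that lands at 56 percent implies a current thermal efficiency of roughly 56 / 1.27 ≈ 44 percent.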

Mazda's claim, that Skyactiv-3 would be cleaner to run than an all-electric vehicle, is a bold one, and requires some unpacking. Mazda bases the assertion on its estimates of "well-to-wheel" emissions, tallying the pollution generated by both fossil fuel production and utility electricity generation to compare Skyactiv-3 and EV emissions. Such analysis reflects the reality that, currently, much electricity is generated through fossil fuels. In regions where electricity comes from wind, solar, or hydroelectric, the EV would clearly win the argument, but that's not the case for many customers today.

If Mazda can make a mass-production internal-combustion engine that achieves more than 50 percent thermal efficiency, it will be an incredible feat—and would likely help guarantee the piston engine's continued survival.



From: FJB 2/1/2018 10:59:47 PM
   of 414
 
GENE EDITING WITH CRISPR


This scientifically accurate CRISPR systems animation was created by Visual Science biologists and scientific visualization experts with support from the Skoltech Biotechnology faculty. The animation is based on molecular modelling and dynamics, which allowed the team to produce accurate models of natural and engineered CRISPR complexes, as well as to visualize the interiors of the bacterial cell and human cell nucleus.

Full project description and more about CRISPR systems: visual-science.com/crispr




From: FJB 3/8/2018 4:58:32 PM
   of 414
 

The Quantum Thermodynamics Revolution


quantamagazine.org


As physicists extend the 19th-century laws of thermodynamics to the quantum realm, they’re rewriting the relationships among energy, entropy and information.




Ricardo Bessa for Quanta Magazine

In his 1824 book, Reflections on the Motive Power of Fire, the 28-year-old French engineer Sadi Carnot worked out a formula for how efficiently steam engines can convert heat — now known to be a random, diffuse kind of energy — into work, an orderly kind of energy that might push a piston or turn a wheel. To Carnot’s surprise, he discovered that a perfect engine’s efficiency depends only on the difference in temperature between the engine’s heat source (typically a fire) and its heat sink (typically the outside air). Work is a byproduct, Carnot realized, of heat naturally passing to a colder body from a warmer one.
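The formula itself, with temperatures measured in kelvin, is the standard Carnot efficiency (not quoted in the article): efficiency = 1 - T_cold / T_hot. A perfect engine running between a 500 K boiler and 300 K outside air could therefore convert at most 1 - 300/500 = 40 percent of the heat it draws into work.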

Carnot died of cholera eight years later, before he could see his efficiency formula develop over the 19th century into the theory of thermodynamics: a set of universal laws dictating the interplay among temperature, heat, work, energy and entropy — a measure of energy’s incessant spreading from more- to less-energetic bodies. The laws of thermodynamics apply not only to steam engines but also to everything else: the sun, black holes, living beings and the entire universe. The theory is so simple and general that Albert Einstein deemed it likely to “never be overthrown.”

Yet since the beginning, thermodynamics has held a singularly strange status among the theories of nature.

“If physical theories were people, thermodynamics would be the village witch,” the physicist Lídia del Rio and co-authors wrote last year in Journal of Physics A. “The other theories find her somewhat odd, somehow different in nature from the rest, yet everyone comes to her for advice, and no one dares to contradict her.”

Unlike, say, the Standard Model of particle physics, which tries to get at what exists, the laws of thermodynamics only say what can and can’t be done. But one of the strangest things about the theory is that these rules seem subjective. A gas made of particles that in aggregate all appear to be the same temperature — and therefore unable to do work — might, upon closer inspection, have microscopic temperature differences that could be exploited after all. As the 19th-century physicist James Clerk Maxwell put it, “The idea of dissipation of energy depends on the extent of our knowledge.”


In recent years, a revolutionary understanding of thermodynamics has emerged that explains this subjectivity using quantum information theory — “a toddler among physical theories,” as del Rio and co-authors put it, that describes the spread of information through quantum systems. Just as thermodynamics initially grew out of trying to improve steam engines, today’s thermodynamicists are mulling over the workings of quantum machines. Shrinking technology — a single-ion engine and three-atom fridge were both experimentally realized for the first time within the past year — is forcing them to extend thermodynamics to the quantum realm, where notions like temperature and work lose their usual meanings, and the classical laws don’t necessarily apply.

They’ve found new, quantum versions of the laws that scale up to the originals. Rewriting the theory from the bottom up has led experts to recast its basic concepts in terms of its subjective nature, and to unravel the deep and often surprising relationship between energy and information — the abstract 1s and 0s by which physical states are distinguished and knowledge is measured. “Quantum thermodynamics” is a field in the making, marked by a typical mix of exuberance and confusion.




Sandu Popescu, a professor of physics at the University of Bristol.


Anna I Popescu

“We are entering a brave new world of thermodynamics,” said Sandu Popescu, a physicist at the University of Bristol who is one of the leaders of the research effort. “Although it was very good as it started,” he said, referring to classical thermodynamics, “by now we are looking at it in a completely new way.”

Entropy as Uncertainty


In an 1867 letter to his fellow Scotsman Peter Tait, Maxwell described his now-famous paradox hinting at the connection between thermodynamics and information. The paradox concerned the second law of thermodynamics — the rule that entropy always increases — which Sir Arthur Eddington would later say “holds the supreme position among the laws of nature.” According to the second law, energy becomes ever more disordered and less useful as it spreads to colder bodies from hotter ones and differences in temperature diminish. (Recall Carnot’s discovery that you need a hot body and a cold body to do work.) Fires die out, cups of coffee cool and the universe rushes toward a state of uniform temperature known as “heat death,” after which no more work can be done.

The great Austrian physicist Ludwig Boltzmann showed that energy disperses, and entropy increases, as a simple matter of statistics: There are many more ways for energy to be spread among the particles in a system than concentrated in a few, so as particles move around and interact, they naturally tend toward states in which their energy is increasingly shared.
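Boltzmann's statistical reading of entropy is captured by his formula S = k_B ln W (standard physics, not quoted in the article), where W counts the microscopic arrangements consistent with what we observe and k_B is Boltzmann's constant: the more ways there are to spread the energy around, the larger W, and the larger the entropy.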

But Maxwell’s letter described a thought experiment in which an enlightened being — later called Maxwell’s demon — uses its knowledge to lower entropy and violate the second law. The demon knows the positions and velocities of every molecule in a container of gas. By partitioning the container and opening and closing a small door between the two chambers, the demon lets only fast-moving molecules enter one side, while allowing only slow molecules to go the other way. The demon’s actions divide the gas into hot and cold, concentrating its energy and lowering its overall entropy. The once useless gas can now be put to work.

Maxwell and others wondered how a law of nature could depend on one’s knowledge — or ignorance — of the positions and velocities of molecules. If the second law of thermodynamics depends subjectively on one’s information, in what sense is it true?



A century later, the American physicist Charles Bennett, building on work by Leo Szilard and Rolf Landauer, resolved the paradox by formally linking thermodynamics to the young science of information. Bennett argued that the demon’s knowledge is stored in its memory, and memory has to be cleaned, which takes work. (In 1961, Landauer calculated that at room temperature, it takes at least 2.9 zeptojoules of energy for a computer to erase one bit of stored information.) In other words, as the demon organizes the gas into hot and cold and lowers the gas’s entropy, its brain burns energy and generates more than enough entropy to compensate. The overall entropy of the gas-demon system increases, satisfying the second law of thermodynamics.
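Landauer's figure follows from the bound E_min = k_B T ln 2 for erasing one bit at temperature T: at room temperature, about 300 K, that is 1.38 × 10^-23 J/K × 300 K × 0.693 ≈ 2.9 × 10^-21 J, the 2.9 zeptojoules quoted above.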

The findings revealed that, as Landauer put it, “Information is physical.” The more information you have, the more work you can extract. Maxwell’s demon can wring work out of a single-temperature gas because it has far more information than the average user.

But it took another half century and the rise of quantum information theory, a field born in pursuit of the quantum computer, for physicists to fully explore the startling implications.

Over the past decade, Popescu and his Bristol colleagues, along with other groups, have argued that energy spreads to cold objects from hot ones because of the way information spreads between particles.
According to quantum theory, the physical properties of particles are probabilistic; instead of being representable as 1 or 0, they can have some probability of being 1 and some probability of being 0 at the same time. When particles interact, they can also become entangled, joining together the probability distributions that describe both of their states. A central pillar of quantum theory is that the information — the probabilistic 1s and 0s representing particles’ states — is never lost. (The present state of the universe preserves all information about the past.)

Over time, however, as particles interact and become increasingly entangled, information about their individual states spreads and becomes shuffled and shared among more and more particles. Popescu and his colleagues believe that the arrow of increasing quantum entanglement underlies the expected rise in entropy — the thermodynamic arrow of time. A cup of coffee cools to room temperature, they explain, because as coffee molecules collide with air molecules, the information that encodes their energy leaks out and is shared by the surrounding air.

Understanding entropy as a subjective measure allows the universe as a whole to evolve without ever losing information. Even as parts of the universe, such as coffee, engines and people, experience rising entropy as their quantum information dilutes, the global entropy of the universe stays forever zero.

Renato Renner, a professor at ETH Zurich in Switzerland, described this as a radical shift in perspective. Fifteen years ago, “we thought of entropy as a property of a thermodynamic system,” he said. “Now in information theory, we wouldn’t say entropy is a property of a system, but a property of an observer who describes a system.”

Moreover, the idea that energy has two forms, useless heat and useful work, “made sense for steam engines,” Renner said. “In the new way, there is a whole spectrum in between — energy about which we have partial information.”

Entropy and thermodynamics are “much less of a mystery in this new view,” he said. “That’s why people like the new view better than the old one.”

Thermodynamics From Symmetry

The relationship among information, energy and other “conserved quantities,” which can change hands but never be destroyed, took a new turn in two papers published simultaneously last July in Nature Communications, one by the Bristol team and another by a team that included Jonathan Oppenheim at University College London. Both groups conceived of a hypothetical quantum system that uses information as a sort of currency for trading between the other, more material resources.

Imagine a vast container, or reservoir, of particles that possess both energy and angular momentum (they’re both moving around and spinning). This reservoir is connected to both a weight, which takes energy to lift, and a turning turntable, which takes angular momentum to speed up or slow down. Normally, a single reservoir can’t do any work — this goes back to Carnot’s discovery about the need for hot and cold reservoirs. But the researchers found that a reservoir containing multiple conserved quantities follows different rules. “If you have two different physical quantities that are conserved, like energy and angular momentum,” Popescu said, “as long as you have a bath that contains both of them, then you can trade one for another.”

In the hypothetical weight-reservoir-turntable system, the weight can be lifted as the turntable slows down, or, conversely, lowering the weight causes the turntable to spin faster. The researchers found that the quantum information describing the particles’ energy and spin states can act as a kind of currency that enables trading between the reservoir’s energy and angular momentum supplies. The notion that conserved quantities can be traded for one another in quantum systems is brand new. It may suggest the need for a more complete thermodynamic theory that would describe not only the flow of energy, but also the interplay between all the conserved quantities in the universe.

The fact that energy has dominated the thermodynamics story up to now might be circumstantial rather than profound, Oppenheim said. Carnot and his successors might have developed a thermodynamic theory governing the flow of, say, angular momentum to go with their engine theory, if only there had been a need. “We have energy sources all around us that we want to extract and use,” Oppenheim said. “It happens to be the case that we don’t have big angular momentum heat baths around us. We don’t come across huge gyroscopes.”



Popescu, who won a Dirac Medal last year for his insights in quantum information theory and quantum foundations, said he and his collaborators work by “pushing quantum mechanics into a corner,” gathering at a blackboard and reasoning their way to a new insight, after which it’s easy to derive the associated equations. Some realizations are in the process of crystallizing. In one of several phone conversations in March, Popescu discussed a new thought experiment that illustrates a distinction between information and other conserved quantities — and indicates how symmetries in nature might set them apart.

“Suppose that you and I are living on different planets in remote galaxies,” he said, and suppose that he, Popescu, wants to communicate where you should look to find his planet. The only problem is, this is physically impossible: “I can send you the story of Hamlet. But I cannot indicate for you a direction.”

There’s no way to express in a string of pure, directionless 1s and 0s which way to look to find each other’s galaxies because “nature doesn’t provide us with [a reference frame] that is universal,” Popescu said. If it did — if, for instance, tiny arrows were sewn everywhere in the fabric of the universe, indicating its direction of motion — this would violate “rotational invariance,” a symmetry of the universe. Turntables would start turning faster when aligned with the universe’s motion, and angular momentum would not appear to be conserved. The early-20th-century mathematician Emmy Noether showed that every symmetry comes with a conservation law: The rotational symmetry of the universe reflects the preservation of a quantity we call angular momentum. Popescu’s thought experiment suggests that the impossibility of expressing spatial direction with information “may be related to the conservation law,” he said.

The seeming inability to express everything about the universe in terms of information could be relevant to the search for a more fundamental description of nature. In recent years, many theorists have come to believe that space-time, the bendy fabric of the universe, and the matter and energy within it might be a hologram that arises from a network of entangled quantum information. “One has to be careful,” Oppenheim said, “because information does behave differently than other physical properties, like space-time.”

Knowing the logical links between the concepts could also help physicists reason their way inside black holes, mysterious space-time swallowing objects that are known to have temperatures and entropies, and which somehow radiate information. “One of the most important aspects of the black hole is its thermodynamics,” Popescu said. “But the type of thermodynamics that they discuss in the black holes, because it’s such a complicated subject, is still more of a traditional type. We are developing a completely novel view on thermodynamics.” It’s “inevitable,” he said, “that these new tools that we are developing will then come back and be used in the black hole.”

What to Tell Technologists


Janet Anders, a quantum information scientist at the University of Exeter, takes a technology-driven approach to understanding quantum thermodynamics. “If we go further and further down [in scale], we’re going to hit a region that we don’t have a good theory for,” Anders said. “And the question is, what do we need to know about this region to tell technologists?”

In 2012, Anders conceived of and co-founded a European research network devoted to quantum thermodynamics that now has 300 members. With her colleagues in the network, she hopes to discover the rules governing the quantum transitions of quantum engines and fridges, which could someday drive or cool computers or be used in solar panels, bioengineering and other applications. Already, researchers are getting a better sense of what quantum engines might be capable of. In 2015, Raam Uzdin and colleagues at the Hebrew University of Jerusalem calculated that quantum engines can outpower classical engines. These probabilistic engines still follow Carnot’s efficiency formula in terms of how much work they can derive from energy passing between hot and cold bodies. But they’re sometimes able to extract the work much more quickly, giving them more power. An engine made of a single ion was experimentally demonstrated and reported in Science in April 2016, though it didn’t harness the power-enhancing quantum effect.

Popescu, Oppenheim, Renner and their cohorts are also pursuing more concrete discoveries. In March, Oppenheim and his postdoctoral researcher, Lluis Masanes, published a paper deriving the third law of thermodynamics — a historically confusing statement about the impossibility of reaching absolute-zero temperature — using quantum information theory. They showed that the “cooling speed limit” preventing you from reaching absolute zero arises from the limit on how fast information can be pumped out of the particles in a finite-size object. The speed limit might be relevant to the cooling abilities of quantum fridges, like the one reported in a preprint in February. In 2015, Oppenheim and other collaborators showed that the second law of thermodynamics is replaced, on quantum scales, by a panoply of second “laws” — constraints on how the probability distributions defining the physical states of particles evolve, including in quantum engines.

As the field of quantum thermodynamics grows quickly, spawning a range of approaches and findings, some traditional thermodynamicists see a mess. Peter Hänggi, a vocal critic at the University of Augsburg in Germany, thinks the importance of information is being oversold by ex-practitioners of quantum computing, who he says mistake the universe for a giant quantum information processor instead of a physical thing. He accuses quantum information theorists of confusing different kinds of entropy — the thermodynamic and information-theoretic kinds — and using the latter in domains where it doesn’t apply. Maxwell’s demon “gets on my nerves,” Hänggi said. When asked about Oppenheim and company’s second “laws” of thermodynamics, he said, “You see why my blood pressure rises.”




While Hänggi is seen as too old-fashioned in his critique (quantum-information theorists do study the connections between thermodynamic and information-theoretic entropy), other thermodynamicists said he makes some valid points. For instance, when quantum information theorists conjure up abstract quantum machines and see if they can get work out of them, they sometimes sidestep the question of how, exactly, you extract work from a quantum system, given that measuring it destroys its simultaneous quantum probabilities. Anders and her collaborators have recently begun addressing this issue with new ideas about quantum work extraction and storage. But the theoretical literature is all over the place.

“Many exciting things have been thrown on the table, a bit in disorder; we need to put them in order,” said Valerio Scarani, a quantum information theorist and thermodynamicist at the National University of Singapore who was part of the team that reported the quantum fridge. “We need a bit of synthesis. We need to understand your idea fits there; mine fits here. We have eight definitions of work; maybe we should try to figure out which one is correct in which situation, not just come up with a ninth definition of work.”

Oppenheim and Popescu fully agree with Hänggi that there’s a risk of downplaying the universe’s physicality. “I’m wary of information theorists who believe everything is information,” Oppenheim said. “When the steam engine was being developed and thermodynamics was in full swing, there were people positing that the universe was just a big steam engine.” In reality, he said, “it’s much messier than that.” What he likes about quantum thermodynamics is that “you have these two fundamental quantities — energy and quantum information — and these two things meet together. That to me is what makes it such a beautiful theory.”

Correction: This article was revised on May 5, 2017, to reflect that Lluis Masanes is a postdoctoral researcher, not a student.



From: FJB 3/19/2018 11:40:29 AM
   of 414
 

Acta Mathematica Sinica, English Series: Celebrates its 80th Birthday with this Special Issue
Managing Editors: Zhi-Ming Ma, Liqun Zhang

Editors: Fanghua Lin, Gang Tian



To celebrate Acta Mathematica Sinica’s 80th anniversary, we have published a special issue consisting of 9 original research articles from world-renowned mathematicians. This special issue covers several fields of mathematics, including algebraic geometry, algebraic topology, Fourier analysis, partial differential equations, dynamical systems, etc.



Read Articles from the Special Issue*


Non-hyperbolic closed characteristics on non-degenerate star-shaped hypersurfaces in R^{2n}
Hua Gui Duan, Hui Liu, Yi Ming Long, Wei Wang

Iwasawa theory of quadratic twists of X_0(49)
Junhwa Choi, John Coates



On a hybrid of bilinear Hilbert transform and paraproduct
Dong Dong, Xiao Chun Li



Smoothness of the gradient of weak solutions of degenerate linear equations
Richard L. Wheeden



A class of large solutions to the 3D incompressible MHD and Euler equations with damping
Yi Zhou, Yi Zhu

Almost sure convergence of the multiple ergodic average for certain weakly mixing systems
Yonatan Gutman, Wen Huang, Song Shao, Xiang Dong Ye

Some developments in Nielsen fixed point theory
Bo Ju Jiang, Xue Zhi Zhao

Semi-stable extensions over 1-dimensional bases
János Kollár, Johannes Nicaise, Chen Yang Xu

Decay of correlations for Fibonacci unimodal interval maps
Rui Gao, Wei Xiao Shen





We hope you enjoy reading these articles!



*Free access ends 15th May, 2018.


