Sam's miscellany

From: Sam, 10/3/2017 9:11:53 AM
Democrats Overstate Trump Tax Plan Effects
By Eugene Kiely, Lori Robertson and Robert Farley
Posted on October 2, 2017

Democrats have used outdated or inflated figures to attack the GOP tax plan, when sticking to the facts would have served them just as well.

  • Sen. Bernie Sanders tweeted an image of a chart that exaggerates how much President Donald Trump and other wealthy people would benefit from the repeal of the estate tax.
  • Senate Minority Leader Chuck Schumer used an estimate from a liberal group in saying the tax plan “will add anywhere from 5 to 7 trillion dollars to our deficit.” The figures were actually $3 trillion to $5 trillion over 10 years; his office says he misspoke. But it’s worth noting Senate Republicans announced a budget framework that would allow $1.5 trillion in deficit spending over a decade.
  • Sen. Ron Wyden said a reduction in the corporate tax rate “will result in $1.8 trillion in tax cuts for the multinationals and the powerful CEOs.” But 20 percent of the corporate tax burden falls on labor, according to the Tax Policy Center.
  • Rep. Tim Ryan used an outdated and inflated figure when he said: “Since the recession ended, about 85 percent of income growth went to the top 1 percent.” The latest estimate is 52 percent.
To be sure, there are many details missing from the outline of a tax plan that the White House and GOP leaders released on Sept. 27, and there are several measures in that outline that would benefit the wealthy.

The tax plan would cut the corporate tax rate from 35 percent to 20 percent; abolish the alternative minimum tax; collapse the seven income tax brackets, which range from 10 percent to 39.6 percent, to three (12 percent, 25 percent and 35 percent); increase the standard deduction but eliminate exemptions; eliminate most itemized deductions except for mortgage interest and charitable giving; increase the child tax credit; and abolish the estate tax.

In a Sept. 29 analysis of the GOP plan, the nonpartisan Tax Policy Center estimates that the average tax bill would decline in 2018 for “all income groups,” but “[t]hose with the very highest incomes would receive the biggest tax cuts.” Half of the “total tax benefit” would go to the top 1 percent of taxpayers who have incomes of more than $730,000.

The Democrats, of course, have the right to criticize the tax plan for favoring the wealthy. But we found several instances where they went too far.

Estate Tax Exaggeration
Sanders tweeted an image of a chart that exaggerates how much Trump, some members of his cabinet and other wealthy people would benefit from the repeal of the estate tax.

As indicated in the chart, the Democratic staff of the Senate Budget Committee produced the scratched-out figures. Sanders, an independent from Vermont who ran for the Democratic presidential nomination, is the ranking member on the committee.

Notice that the first column says “max estate tax under current law.” The maximum statutory estate tax rate is 40 percent (see Table A of IRS instructions for the estate tax). But the actual rate that estates pay — known as the effective rate — is much less.

The Tax Policy Center estimates the highest average effective rate is 18.8 percent — less than half the 40 percent applied by the Democratic staff.

Sanders’ spokesman Josh Miller-Lewis, who provided us with a copy of the analysis, confirmed for us that the committee staff did “a straight 40% tax on assets (minus the current law exemption for the first $11 million)” when calculating the tax break for Trump and others on the chart. (The first $5.49 million in an estate’s assets, or nearly $11 million for a couple, are exempt from the tax.)

The Democratic analysis also assumes, in Trump’s case, that the president is worth $10 billion. That is based on Trump’s own estimate of his net worth. But, as we have written, independent estimates use a much lower figure.

During the transition in December, Bloomberg did its own calculations on how much Trump (then the president-elect) and some candidates for his cabinet would benefit if the estate tax were repealed. In its calculations, the news service estimated that Trump is worth about $3 billion, not $10 billion.

If the estate tax is repealed, “Trump’s estate would save $564 million, based on his estimated net worth of $3 billion” and an effective tax rate of 18.8 percent, Bloomberg writes. Even if Trump is worth $10 billion, as he says, the tax windfall for Trump’s estate would be $1.9 billion — less than half the $4 billion figure used in the chart that Sanders posted on Twitter.

Asked about the Bloomberg estimate, Miller-Lewis said: “Bloomberg probably assumed that most people with this wealth will take advantage of all the ways to shield assets from the estate tax and not pay the top rate. We accounted for this in the graphic by saying ‘maximum.'”
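The gap between the chart's numbers and Bloomberg's comes down to which rate is applied. A minimal sketch of the two calculations, using only figures from the article (a flat 40 percent statutory rate above the roughly $11 million couple's exemption, versus the Tax Policy Center's 18.8 percent highest average effective rate):

```python
def statutory_savings(net_worth, exemption=11e6, rate=0.40):
    """Senate Budget Committee staff method: flat 40% on assets above the exemption."""
    return (net_worth - exemption) * rate

def effective_savings(net_worth, rate=0.188):
    """Bloomberg method: TPC's 18.8% highest average effective rate on the full estate."""
    return net_worth * rate

# Trump at his claimed $10 billion net worth:
print(statutory_savings(10e9))   # roughly $4.0 billion, the chart's figure
print(effective_savings(10e9))   # roughly $1.9 billion
# At Bloomberg's $3 billion estimate:
print(effective_savings(3e9))    # roughly $564 million
```

Both methods are simplifications; real estates use deductions and planning that push what they actually pay well below the statutory 40 percent, which is why the effective-rate estimates come in so much lower.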

Bloomberg’s calculations also showed less of a tax break for two of Trump’s cabinet officers.

The news service estimated that Commerce Secretary Wilbur Ross “might save about $545 million, based on estimated net worth of $2.9 billion,” and Small Business Administrator Linda McMahon and her husband, Vince, “might save more than $250 million based on their shared net worth of at least $1.35 billion,” Bloomberg said. Both of those figures are about half of the tax breaks that the Democratic staff projects.

Impact on the Deficit

Schumer said in a press conference on Sept. 27: “The plan will add anywhere from 5 to 7 trillion dollars to our deficit.” But on Twitter the same day, he phrased it slightly differently: “On top of that, this would cost anywhere from 5 to 7 TRILLION dollars and they have no credible plan to pay for it.”

The figures come from Americans for Tax Fairness, a group that advocates “progressive tax reform” that “requires big corporations and the wealthy to pay their fair share,” but Schumer doesn’t quite get them right. ATF said the cost of the plan could be $7 trillion to $8 trillion over 10 years, while the portion added to the deficit could be $3 trillion to $5 trillion over 10 years. The headline on the analysis said: “Trump’s Unpaid-For Tax Cuts May Total $5 Trillion In New Tax Plan.”

Schumer’s office told us that in the press conference, the senator meant to say he was referring to the total cost.

The ATF calculations rely heavily on several past reports from the Tax Policy Center, but ATF’s high-end estimate on deficit spending — $5 trillion over a decade — surpasses what TPC has found. For instance, TPC’s July 12 report on what the administration had outlined then estimated a loss in revenue of $3.5 trillion over the first 10 years. It then estimated a revenue loss of $5.7 trillion over the second decade.

More recently — but two days after Schumer made his comments — TPC estimated that the latest tax framework from the president and Republican congressional leaders “would reduce federal revenue by $2.4 trillion over ten years and $3.2 trillion over the second decade.”

As we said, there are many details missing from the tax plan that has been released so far. But Republicans on the Senate Budget Committee have drafted a 2018 budget that would allow the tax plan to contribute up to $1.5 trillion over 10 years to the deficit — far less than the figures Schumer cited. Republicans announced that framework before the tax plan was revealed, though the full Senate would still need to pass the budget and reconcile it with the House version.

Under budget reconciliation rules, the Senate could pass the tax cuts with a simple majority, and Republicans control 52 seats in the Senate.

Benefits of Corporate Tax Cuts

Wyden criticized the Republicans’ plan to reduce the corporate tax rate, saying at the Sept. 27 press conference: “Back-of-the-napkin estimates show that the Trump plan on the corporate side will result in $1.8 trillion in tax cuts for the multinationals and the powerful CEOs.”

But not all of the benefits from a corporate tax cut would go to the rich and powerful. Many economists — including those with the nonpartisan Joint Committee on Taxation, the Congressional Budget Office and Tax Policy Center — say workers bear some of the burden of corporate taxes.

The Americans for Tax Fairness analysis of the GOP plan says the reduction in the corporate tax rate from 35 percent to 20 percent, and the repeal of the corporate alternative minimum tax, would cost $1.8 trillion over 10 years. It cites the Tax Policy Center, which indeed included that figure in an analysis of a 2016 House Republican plan. But the TPC assigns 20 percent of the corporate tax burden to labor, with the rest falling on capital and shareholders.

That would mean that $1.44 trillion — not the full $1.8 trillion estimate — of the corporate tax cut would benefit those who control the corporations, “multinationals and the powerful CEOs,” as Wyden put it.
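The $1.44 trillion figure is straightforward arithmetic on the numbers above; a quick sketch splitting the $1.8 trillion by TPC's 20 percent labor share:

```python
corporate_tax_cut = 1.8e12   # ATF/TPC 10-year cost of the rate cut and corporate AMT repeal
labor_share = 0.20           # share of the corporate tax burden TPC assigns to labor

to_labor = corporate_tax_cut * labor_share          # $0.36 trillion falls on workers
to_capital = corporate_tax_cut * (1 - labor_share)  # $1.44 trillion to capital and shareholders
print(to_labor, to_capital)
```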

We recently wrote about this issue when fact-checking the near opposite claim from Treasury Secretary Steven Mnuchin, who said, “most economists believe that over 70 percent of corporate taxes are paid for by the workers.” We found there were economists who agreed with Mnuchin, but many others who didn’t. In fact, the three nonpartisan organizations we mentioned — JCT, CBO and TPC — say most, but not all, of the corporate tax burden falls on shareholders. The Treasury Department reached the same conclusion in 2008 and 2012 analyses. (Treasury has now removed the 2012 analysis from its website, the Wall Street Journal reports.)

We found that the pro-business Tax Foundation also estimated that reducing the corporate tax rate to 20 percent would cost $1.8 trillion over a decade. But the foundation goes on to say that 30 percent of that would be passed on to labor.

Wyden’s claim is another case of Democrats overstating the impact of the tax plan on the wealthy, and an example of how both sides are spinning the facts on Trump’s tax plan.

Outdated Evidence

Ryan used an outdated and inflated figure when he said, “Since the recession ended, about 85 percent of income growth went to the top 1 percent.” The latest estimate is 52 percent.

Ryan’s comment came during an interview with Neil Cavuto on Fox News on Sept. 28. The Ohio Democrat said he would “absolutely not” support the tax plan proposed by Trump. Ryan argued the plan would disproportionately benefit the wealthy.

Ryan, Sept. 28: Let’s … take a step back and look at this whole thing. Since the recession ended, about 85 percent of income growth went to the top 1 percent, OK?

So, this is just another supply-side economic plan, where you’re getting rid of the alternative minimum tax, you’re getting rid of the estate tax, which doesn’t start until 10 million bucks. So, this is going clearly to the top 1 percent. They’ve seen all of the income growth in the last seven to 10 years.

Ryan’s office referred us to a Time magazine story about a study released on June 16, 2016, by the Economic Policy Institute that did, in fact, report: “For the United States overall, the top 1 percent captured 85.1 percent of total income growth between 2009 and 2013.”

But that study is now outdated.

The EPI study mirrors the work of economist Emmanuel Saez of the University of California, Berkeley, who wrote in January 2015 that after the Great Recession, “the top 1% captured 91% of the income gains in the first three years of the recovery.” At the time, that was an update with 2013 preliminary estimates. The report has been updated twice since then, once on June 25, 2015, (updated with 2014 preliminary estimates) and again on June 30, 2016 (updated with 2015 preliminary estimates).

According to the most recent update, the “[t]op 1% families still capture 52% of total real income growth per family from 2009-2015 but the recovery from the Great Recession now looks much less lopsided than in previous years.”

Mark Price, a labor economist at the Keystone Research Center and one of the co-authors of the EPI study, told us the updated Saez figure of 52 percent is the “most accurate” figure to use now.

As Price explained, early in an economic recovery, most of the income growth goes to the very wealthiest. But as the recovery continues and the unemployment rate comes down, he said, income and wage gains tend to spread out to lower income levels.

The current 52 percent level is about where the historical average has been for the last 36 years, he said, and the top income groups overall capture a disproportionately large share of income growth.

“The overall point is that a disproportionate share of income growth is captured by the Top 1 percent,” Price said.

Nonetheless, Ryan should use the updated, more accurate figure.


From: Sam, 10/4/2017 9:32:18 AM
Using the autolyse method: unleashing the power of the pause
If artisan bread baking is a passion of yours, you’ve no doubt come across the autolyse method (a short rest after combining flour and water) in recipes or baking books. Have you wondered how using the autolyse method might transform your bread baking? Are you uncertain how best to incorporate this powerful pause?

What is an autolyse?

An autolyse is the gentle mixing of the flour and water in a bread recipe, followed by a 20- to 60-minute rest period. After the rest, the remaining ingredients are added and kneading begins. This simple pause allows for some rather magical changes to occur in your bread dough. Let's explore exactly what's going on during the autolyse, and how it can improve your bread baking.

A brief history of the autolyse method

French scientist and bread expert Prof. Raymond Calvel developed this technique in 1974, in response to what he saw as a deterioration in French bread production.

In the 1950s and ’60s two-speed electric mixers came into use in France, and bakers adopted more aggressive mixing practices. According to Calvel, this resulted in “very white and high in volume” bread — which to his dismay began to gain popularity in France.

This intensive mixing caused the dough to mature more quickly, which meant less fermentation time was required. Unfortunately, reducing fermentation resulted in bread with less flavor and keeping quality.

Excessive mixing also damaged the carotenoid pigments in the flour through over-oxidation. This caused a loss of crumb color (whiter bread) and a reduction in aroma and flavor.

Calvel saw this as a grave degradation of traditional French bread, and sought to bring back the wonderful breads he’d known in his youth.

Why use an autolyse?

Calvel demonstrated that using the autolyse method affects dough development in many positive ways:

  • The flour fully hydrates. This is particularly useful when working with whole-grain flour because the bran softens as it hydrates, reducing its negative effect on gluten development.
  • Gluten bonds begin developing with no effort on the part of the baker, and kneading time is consequently reduced.
  • Carotenoid pigments remain intact, leading to better color, aroma, and flavor.
  • Fermentation proceeds at a slower pace, allowing for full flavor development and better keeping quality.
  • The dough becomes more extensible (stretchy), which allows it to expand easily. This leads to easier shaping, greater loaf volume, a more open crumb structure, and cuts that open more fully.
Using the autolyse method requires no effort and can transform your bread baking. What's not to love?

The science behind the autolyse method

I was trying to explain the autolyse method to my sister, who's a pathologist, and her eyes lit up: "Autolysis!" In her world this process describes the self-digestion or destruction of tissue by its own enzymes.

As it turns out, this is exactly what’s going on during an autolyse.

Two enzymes that are present in flour — protease and amylase — begin their work during the autolyse:

  • The protease enzymes degrade the protein in the flour, which encourages extensibility.
  • The amylase enzymes turn the flour’s starch into sugars that the yeast can consume.
Proper dough development requires a balance of both extensibility and elasticity. By delaying the addition of yeast, sourdough starter and salt (all of which can have a tightening effect on gluten), the extensibility of the dough has a better chance to develop. Once kneading begins, the dough develops elasticity, which is the quality that allows the dough to retain its shape.

Testing the autolyse method

When I began this project I was convinced it would be easy; how hard could it be to demonstrate the well-documented positive results of using the autolyse method?

Then things got a little messy.

My test loaves didn’t always illustrate the promised dramatic results, and using the autolyse method wasn’t always as easy as I expected.

While I certainly don’t dispute Calvel’s findings, sometimes production baking practices need a little tweaking when it comes to utilizing them in your home baking.

Let’s see what my tests reveal about the best way to add the autolyse method to your bread-baking routine.

Choosing autolyse-friendly recipes

Strictly speaking, an autolyse includes just the flour and water in a bread recipe. Salt tends to tighten gluten, as does the fermentation brought about by the addition of yeast or sourdough starter. Since these ingredients work against the development of extensibility, they are omitted from the autolyse.

However, when a recipe includes a liquid sourdough starter or a preferment, it’s added to the autolyse as well. Without the significant percentage of liquid contained in these starters it would be impossible to properly hydrate the flour during the autolyse.

For testing purposes, I chose recipes that either didn’t include a liquid preferment, or called for a stiff sourdough starter that could be added after the autolyse.

While an autolyse can be added to almost any bread recipe, my rationale is that these recipes will show the purest results when it comes to the effects of adding an autolyse.

Test #1: Artisan Sourdough Bread with a Stiff Starter

This recipe makes one large boule, but I'll cut it in half and make sourdough demi-baguettes; baguettes make for an easy side-by-side comparison.

For consistency’s sake I’ll use my KitchenAid mixer with the dough hook, and try to keep the mixing and kneading times the same for each dough. My standard mix time will be 3 minutes on the lowest speed (stir). Kneading time will be 2 minutes on speed 2.

The dough is always covered during a rest period.

Here’s what I’ll test with this recipe:

Dough #1, no autolyse or pause: All the ingredients are added at once and immediately mixed and kneaded. Mix/kneading time is 3 minutes on “stir” and 2 minutes on speed 2.

Dough #2, mix and 30-minute pause: A gentle mix of all the ingredients, followed by a 30-minute pause. The dough is then kneaded.

Dough #3, 30-minute autolyse: The flour and water are gently mixed and the dough is allowed to rest for 30 minutes. After the autolyse the salt and sourdough starter are added and the dough is kneaded.

Dough #4, 60-minute autolyse: The same process as the 30-minute autolyse, but with a 60-minute rest.

Dough #1 and dough #2 come together easily, and dough #3 starts out nicely as well. For the autolyse, I simply mix the flour and water together until the flour is fully moistened. I then cover the bowl and let the dough rest for 30 minutes.

It’s truly miraculous to observe how the dough transforms in 30 short minutes from a rough blob to a smooth and stretchy dough.

Difficulties with incorporation after the autolyse

However, when it comes time to add the stiff starter and salt to dough #3, I run into a small hitch: the starter and dough don't want to come together.

I first try ripping the starter into small pieces and sprinkling the salt on top of the dough, but little bits of starter remain intact, even after kneading for 2 minutes on speed 2.

When I mix the 60-minute autolyse for dough #4, I try a different tactic.

I decide to reserve a small amount of the recipe’s water (20g) to soften the stiff starter prior to adding it (along with the salt) to the dough and kneading. This also proves challenging to mix because the water causes the dough to slide around in the mixer, preventing the hook from engaging.

It’s necessary to gently mix on the lowest speed, pausing frequently to fold the dough and help it engage with the hook.

While these added precautions work, they definitely take more time. If less required kneading is a positive result of using the autolyse method, struggling to incorporate the starter before you can begin kneading is definitely a drawback.

Test #1 results

With all that extra effort to incorporate the starter properly after the autolyse, I'm a bit disappointed to see so little bang for my buck. The loaves that include an autolyse (#3 and #4) do show greater expansion, but there's not a dramatic difference.

Dough #1: The mix that received no autolyse or pause produces a slightly smaller baguette, with a more erratic crumb structure than the other test loaves.

Dough #2: The mix and pause method produces the best color, crumb, and scoring.

Dough #3: The 30-minute autolyse yields a robust loaf with a decent crumb, but nothing to write home about.

Dough #4: The 60-minute autolyse produces the largest baguette, but the crumb isn’t as open as #1, #2 or #3.

The winner of this round is dough #2.

Test #2: French-Style Baguettes

For this recipe I'll use the same mixing/kneading process as I did in the first test: 3 minutes on "stir" to blend the ingredients, and 2 minutes on speed 2 to knead the dough. The only exception to this will be the intensive mix (Dough #1).

Here’s what I’ll test with this recipe:

Dough #1, intensive mix: Replicating the intensive mixing practices that drove Calvel to develop the autolyse method risks destroying my mixer — don’t try this at home! I mix all the ingredients and then knead for a full 15 minutes on speed 2 in my KitchenAid mixer.

Dough #2, mix and pause: All of the dough ingredients are mixed together and then allowed to rest for 30 minutes. While not quite an autolyse, this is certainly autolyse-inspired. After the pause, the dough is kneaded.

Dough #3, 60-minute autolyse: The flour and water are gently mixed and allowed to rest for 60 minutes. After the rest, the salt and instant yeast are added and the dough is kneaded.

Dough #4, 30-minute autolyse with yeast: The flour, water, and instant yeast are mixed together in the autolyse. After the 30-minute rest, the salt is added and the dough is kneaded.

Dough #5, 30-minute autolyse: No salt or instant yeast are added to the autolyse. After the 30-minute rest, the salt and instant yeast are added and kneading commences.

An unexpected snag

Doughs #1 and #2 come together without incident (my mixer doesn't die)! But when I get to dough #3, I discover an issue that I hadn't anticipated.

I mix the flour and water and allow the dough to rest for 60 minutes.

After the autolyse, the yeast and salt are added and I knead on speed 2 for 2 minutes.


The instant yeast doesn’t dissolve properly when added to the fully hydrated flour. The dough is riddled with undissolved yeast granules at the end of the first rise.

Back to the drawing board.

Test #2-A

This time when I come to dough #3 and dough #5 I reserve a small amount of the water (1 ounce) from the autolyse, to add in along with the instant yeast and salt after the autolyse is complete.

This method has its own issues. The yeast dissolves nicely; unfortunately, the dough has difficulty accepting the water/yeast/salt mixture: it’s necessary to stop and start the mixer repeatedly to make it happen.

Another option is to add the yeast with the autolyse, as I did with dough #4. This is what bakeries using instant yeast normally do; but how do the resulting baguettes compare?

Let’s bake them and see.

Test #2-A results

Dough #1: The intensive mix actually turns out better than expected, with a decent crumb structure. However, it's much more difficult to roll out, and the resulting baguette is small and pale: definite proof that more isn't better when it comes to kneading bread dough.

Dough #2: The mix and pause method also yields fairly positive results. Good crumb, decent volume, and cuts that open reasonably well.

Dough #3: Wow! The 60-minute autolyse really shines in this test (though the loaf may have been a little over-proofed). Great color, volume, and crumb, along with decent cuts.

Dough #4: The 30-minute autolyse with yeast yields another beautiful baguette. While not quite as big as #3 and #5, it’s next in line for size, and has a very open crumb.

Dough #5: The 30-minute autolyse without salt or yeast is the winner of this round. Beautiful crumb, color, and volume. I even get some ears!


One final test

Baking 5 rounds of baguettes is just about my limit for one day, so I decide to refrigerate some of the extra baguette dough from test #2-A and shape and bake it the next day.

Dough #1: After refrigeration, the intensive mix once again produces a small, pale baguette. None of the baguettes have as open a crumb structure as the baguettes that were baked the day before, but the interior of this baguette looks particularly tough and ropey.

Dough #2: The mix and pause method yields a reasonably nice baguette, but it’s also paler and smaller than dough #3.

Dough #3: The 60-minute autolyse is the clear winner of this round. It yields the largest baguette, with a great crust color and cuts that open beautifully.

To autolyse or not to autolyse?

One advantage of using an autolyse that I haven't been able to capture in these photos is the tremendous difference it makes in the feel and workability of the dough. Calvel called French bread made using the autolyse method "more seductive." I think he was referring to the baked loaves, but dough that's undergone an autolyse feels more seductive. It has a soft and yielding strength that makes rolling out baguettes a true pleasure. It's worth giving the autolyse method a try for this reason alone.

Takeaways for using the autolyse method:
  1. When mixing the autolyse be sure that all the flour is fully moistened; dry flour won’t incorporate well later in the process.
  2. Don’t use an autolyse with sourdough rye bread. Because rye flour doesn’t develop gluten the way wheat flour does, and also ferments more quickly, adding an autolyse can cause the dough to deteriorate.
  3. Liquid preferments and starters are always included in the autolyse.
  4. To add instant yeast and/or a stiff sourdough starter after the autolyse is complete, dissolve them in a small amount of water (1 ounce) reserved from the autolyse. Mix gently until the ingredients are fully incorporated before beginning to knead.
  5. Salt is fairly easy to incorporate after the autolyse, so delaying this ingredient isn’t difficult and can have a big impact. Just don’t forget to add it after the autolyse! Sprinkle the salt on top of the autolyse if you’re afraid you’ll forget to add it later.
  6. If you plan to delay adding any ingredients — salt, yeast, and/or starter — measure them out and place them next to the autolyse. This will help prevent a potential disaster!
  7. I wasn’t able to discern a significant difference in crumb color, aroma or flavor in the baguettes that included an autolyse. I suspect those with more sensitive noses and tongues may notice these differences better than I can.
  8. Is a longer autolyse more beneficial than a short one? More experimentation is necessary, but this may well vary from one recipe to the next.
  9. If you find a true autolyse inconvenient, you’ll still see benefits by adding a 30-minute rest between mixing all of the dough ingredients and kneading.
I hope you’ll feel inspired to start using an autolyse in your bread baking.

And let us know how using the autolyse method works for you!


From: Sam, 10/4/2017 3:08:45 PM
Kevin Mitnick, a legal hacker, warns of 'the new normal'

Oct. 04--Hacker-turned-security-consultant Kevin Mitnick used an arsenal of everyday items -- laptops, a Bluetooth speaker and Wi-Fi -- to demonstrate the ease with which one bad actor can wreak havoc on a business or consumer.

His Houston presentation came the same week that Equifax (EFX) and Yahoo expanded the number of people affected by their data breaches. In an era of cyberwarfare, he said, the attacks aren't going to ebb anytime soon.

"I believe it's the new normal," Mitnick told the Chronicle. "And that's because of the limited budgets companies spend on security."

He addressed the BBVA Compass Bright Perspectives forum Tuesday at downtown's JW Marriott.

"It's only fitting to have Kevin speak here today because at BBVA Compass we're focused on bringing the forefront knowledge to our clients about the world's rapid digital transformation," Houston CEO Mark Montgomery told 150 banking clients and prospects.

Equifax (EFX) disclosed Monday that an additional 2.5 million U.S. consumers could have been affected by its data breach, bringing the total to 145.5 million people. Yahoo on Tuesday announced that an August 2013 breach affected all 3 billion user accounts. That is three times the size of the impact it previously reported.

According to Identity Theft Resource Center and CyberScout, the number of U.S. data breaches tracked through June 30 hit a half-year record of 791, up 29 percent over 2016. At this pace, the number of breaches could reach 1,500 in 2017, a 37 percent increase over the record-breaking 1,093 breaches in 2016.
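The pace projection in that paragraph is a simple extrapolation; a rough sketch using only the ITRC/CyberScout counts cited above:

```python
half_2017 = 791    # U.S. breaches tracked through June 30, 2017
full_2016 = 1093   # record-breaking full-year 2016 count

implied_half_2016 = half_2017 / 1.29        # ~613, consistent with "up 29 percent"
projected_2017 = 1500                       # the article's full-year pace
increase = projected_2017 / full_2016 - 1   # ~0.37, the cited 37 percent increase
print(round(implied_half_2016), round(increase, 2))
```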

All data breaches are damaging to companies and consumers, but Mitnick said the Equifax (EFX) heist could have longer-term implications. Criminals may have gained access to Social Security numbers, birth dates and addresses.

"Because all the information in your credit profile is used to verify your identity, it's now going to be super easy for the bad guys to impersonate you for any purpose whatsoever," Mitnick said. "So it's really, really significant."

He hopes the recent string of attacks will encourage companies to invest more in cybersecurity efforts and to be proactive rather than reactive.

Mitnick is a security consultant hired by governments and Fortune 500 companies to identify and fix cybersecurity weaknesses. But he didn't always hack into companies with their permission.

In high school, Mitnick was involved with phone phreaking, a predecessor of sorts to computer hacking that used phone systems. If he called one number, it would read back the number he was calling from. Another number enabled him to use a five-digit code to call anywhere in the world for free.

Mitnick could even change his friend's home phone into a payphone. Whenever his friend's family tried to make a call, the line would ask them to deposit 25 cents.

This interest in phone phreaking led Mitnick to computers, where the first program he wrote was designed to steal his teacher's password.

"It was all about the seduction of adventure, the pursuit of knowledge and the challenge," he told the audience. He wasn't in it for the money or to cause damage, he said.

Mitnick ultimately hacked into 40 major corporations and landed on the FBI's Most Wanted list. He spent time in federal prison, some of it in solitary confinement. When he returned to hacking, it was on the side of those trying to keep their information secure.

As CEO of Mitnick Security Consulting, Mitnick and his team have a 100 percent success rate at being able to penetrate the security of any system they're paid to hack using technical exploits and social engineering. Their work exposes weaknesses that the companies can then address.

Data breaches are harmful to a company's image and bottom line. The average cost of a data breach is $3.62 million, according to the 2017 Cost of Data Breach Study: Global Overview released by IBM Security and the Ponemon Institute.

That's down from $4 million reported in fiscal 2016. Despite the decline, the 419 companies that participated in this year's study reported larger data breaches. The average size increased 1.8 percent.

"At the end of the day, a breach costs you money," Mitnick said.

He also spoke to the Chronicle about the Russians using Facebook(FB) ads to meddle with the U.S. election, an attack he described as a psychological operation. He said he doesn't think there are laws that criminalize creating false advertisements, though Mitnick acknowledged he is not an expert in the area.

"Of course it's very concerning and it should be illegal," he said, "but I don't think there's any law that has been broken."


From: Sam10/4/2017 4:40:42 PM
   of 204

'We'll pay nothing': TowerJazz seeks growth with minimal investment
REUTERS 7:09 AM ET 10/4/2017


* CEO made deals with Panasonic(PCRFF), Maxim with little cash down

* Has transformed TowerJazz into profitable, debt-free firm

* But strategy could limit growth, risk competitiveness

* TowerJazz needs to secure more capacity by end of 2018

By Tova Cohen and Steven Scheer

MIGDAL HA'EMEK, Israel, Oct 4 (Reuters) - Executives at Panasonic(PCRFF) were surprised when Russell Ellwanger, CEO of Israeli chipmaker TowerJazz, asked to partner in three of their factories without putting any cash on the table.

"They had said that to do the deal it would be between a $300 million and $400 million cost for us," he told Reuters in an interview. "I said, 'No, we want to do this deal ... but we will pay nothing'."

Instead Ellwanger sealed the joint venture with what he characterised as a win-win deal: his company filled the unused capacity at the Japanese plants and struck a five-year deal to supply chips to Panasonic(PCRFF) from the joint venture. It also agreed to hand over $7.5 million in TowerJazz stock, plus about half of what it earned from selling chips made at the factories to other customers.

The latter earns Panasonic(PCRFF) $100 million a year.

Ellwanger has managed to strike agreements - like the 2014 Panasonic(PCRFF) one - to expand the company's manufacturing operations with minimal investment. He is counting on this strategy to drive further growth - but it has its limits and risks.

Rather than building or buying a factory at huge cost, TowerJazz looks for plants owned by bigger fish with idle capacity that is hitting their profitability. It takes up that capacity while striking supply deals and offering other incentives such as a cut of profits and technical know-how.

Ellwanger used similar tactics to acquire a factory in Texas from Maxim Integrated Products(MXIM) that wasn't producing at full capacity and was thus costly to run. Rather than paying cash, TowerJazz struck a 15-year supply deal with the U.S. multinational and handed over $40 million in stock.

In the latest deal, TowerJazz said in August it was linking up with Tacoma Semiconductor Technology to establish a plant in China. It will not invest any money but will provide technological and operational expertise in exchange for half the annual capacity of 240,000 chips.

The ability to identify such opportunities has been Ellwanger's signature since 2005, when he left California to take on the task of running TowerJazz, which had $100 million in annual revenue and was saddled with $530 million of net debt.

The company, which specialises in the kind of analog chips used in cars, medical sensors and power management, is now debt-free, profitable and with annual revenue approaching $1.5 billion. Its shares are up 64 percent in 2017 versus a roughly 30 percent gain in the Philadelphia chip companies index, and have doubled over the past 12 months.


The strategy of filling the unused capacity of the likes of Panasonic(PCRFF) and Maxim is partly born of necessity. Building or buying a factory can cost hundreds of millions of dollars, a big undertaking for a company with net cash of $143 million at the end of June.

However, such opportunities do not come around very often, limiting the company's growth. And with chipmaking heavily dependent on volume, TowerJazz faces a challenge in increasing production capacity fast enough to remain competitive in the market for specialty analog chips.

At present the company only has enough capacity to increase annual revenue to $1.62 billion by the end of 2019, according to Credit Suisse semiconductor sector analyst Quang Tung Le, who last month started coverage of Tower.

Craig-Hallum analyst Richard Shannon said the company would hit the limits of its capacity by the end of 2018, precluding further growth. "The timeframe to get capacity online (in China) will not be early enough," he added.

The analog chips produced by TowerJazz regulate functions such as temperature, speed, sound and electrical current as opposed to digital chips that process binary information. Manufacturers of analog, and mixed-signal, devices tend to have more varied markets than digital.

In the analog/mixed-signal foundry sector, TowerJazz has a 30 percent market share, followed by Vanguard, Hua Hong Semiconductor, Dongbu Hi Tek and X-FAB.

"Since Russell took over he has performed a heroic turnaround of a company that was on its deathbed," said Robert Katz, managing director of New York-based Senvest Management, TowerJazz's second-largest investor with a 5.2 percent stake.

Ellwanger says TowerJazz had enough capacity without China to get through 2018, but he's actively pursuing more deals like the one with Panasonic(PCRFF) to fill the impending gap.

"There are technologies we are looking at buying," the CEO said in the interview at his offices in the northern city of Migdal Ha'emek, near TowerJazz's two Israeli plants, built before he joined the company.

He added that TowerJazz wanted to "get strongly into sensors".

Drexel Hamilton analyst Cody Acree said South Korea's Dongbu could be a target as it would be immediately accretive to TowerJazz's earnings. German-based X-FAB and Taiwan's Vanguard also make sense as targets, he said, but trade at higher valuations.

Ellwanger declined to comment on such speculation, though he said: "I would have nothing against having a facility in Korea." (Editing by Pravin Char)


From: Sam10/6/2017 2:37:49 PM
   of 204

Here are the actual tax rates the biggest companies in America pay
MARKETWATCH 2:34 PM ET 10/6/2017

Others, including 7 Dow components, currently pay tax at a rate that is lower than 20%

There's plenty of reading to do to understand President Donald Trump's tax reform proposal, but one major feature is quite simple -- a lowering of the federal corporate income-tax rate to 20% from the current 35%.

Trump has heralded the corporate rate cut as a move that would greatly benefit Corporate America, although many U.S. companies currently pay far less than the top 35% rate. That top rate, however, is widely viewed as the main reason many multinationals keep a lot of cash overseas, as they are reluctant to repatriate it as long as it would be subject to such a high rate.

Read: Why tax repatriation won't jolt the U.S. dollar

Related: These are the 5 U.S. companies with the biggest overseas cash piles

We have compiled lists of companies to review what's at stake as Congress battles over the proposal for the next several months.

Any company's tax situation can change radically from quarter to quarter or year to year, because of such items as major asset write-downs, gains on sales, realization of deferred tax assets, legal settlements, accounting adjustments or other events. So we decided to use quarterly data provided by FactSet to calculate median effective income-tax rates for the past four quarters.
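The methodology described above reduces to a simple calculation. Here is a minimal sketch (with hypothetical quarterly figures, not FactSet data) of how a median effective tax rate falls out of quarterly income tax expense and pre-tax income:

```python
# Effective tax rate per quarter = income tax expense / pre-tax income;
# the median across quarters dampens one-off items like write-downs.
from statistics import median

def effective_rates(tax_expense, pretax_income):
    """Per-quarter effective tax rates from parallel lists of figures."""
    return [t / p for t, p in zip(tax_expense, pretax_income)]

# Hypothetical quarterly figures, in millions of dollars.
tax = [350, 280, 400, 310]
pretax = [1000, 900, 1100, 950]

rates = effective_rates(tax, pretax)
print(round(median(rates), 3))  # 0.338, i.e. a 33.8% median effective rate
```

The median, rather than the mean, is what keeps a single unusual quarter (a legal settlement, a deferred-tax adjustment) from distorting the headline rate.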

These tax rates include any state and local income taxes the companies might pay. But regardless of whether a company is paying a boatload of non-federal income taxes, having a high combined effective tax rate means the stakes are high for the companies and their shareholders.

If Trump can make a deal with members of both houses of Congress to slash the corporate income-tax rate, analysts will immediately raise earnings estimates for high-tax companies, which can be reasonably expected to light a fire under the stocks.

These are the median effective income-tax rates for the 30 companies in the Dow Jones Industrial Average, over the past five reported quarters:

Company | Ticker | Median effective income-tax rate (past five reported quarters)
UnitedHealth Group Inc.(UNH) | US:UNH | 40%
Home Depot Inc.(HD) | US:HD | 36%
Verizon Communications Inc.(VZ) | US:VZ | 34%
Walt Disney Co.(DIS) | US:DIS | 33%
McDonald's Corp.(MCD) | US:MCD | 33%
American Express Co.(AXP) | US:AXP | 32%
Wal-Mart Stores Inc.(WMT) | US:WMT | 31%
Visa Inc.(V) Class A | US:V | 29%
3M Co.(MMM) | US:MMM | 28%
Caterpillar Inc.(CAT) | US:CAT | 28%
J.P. Morgan Chase & Co. | US:JPM | 28%
Goldman Sachs Group Inc.(GS) | US:GS | 27%
United Technologies Corp.(UTX) | US:UTX | 26%
Apple Inc.(AAPL) | US:AAPL | 26%
Travelers Cos.(TRV) | US:TRV | 25%
Boeing Co.(BA) | US:BA | 24%
Procter & Gamble Co.(PG) | US:PG | 23%
DowDuPont Inc.(DWDP) | US:DWDP | 22%
Intel Corp.(INTC) | US:INTC | 22%
Exxon Mobil Corp.(XOM) | US:XOM | 21%
Coca-Cola Co.(KO) | US:KO | 21%
Cisco Systems Inc.(CSCO) | US:CSCO | 21%
Merck & Co.(MRK) | US:MRK | 21%
Johnson & Johnson(JNJ) | US:JNJ | 19%
Pfizer Inc.(PFE) | US:PFE | 18%
Chevron Corp.(CVX) | US:CVX | 14%
Nike Inc.(NKE) Class B | US:NKE | 14%
Microsoft Corp.(MSFT) | US:MSFT | 12%
International Business Machines Corp.(IBM) | US:IBM | 10%
General Electric Co.(GE) | US:GE | 1%
Source: FactSet

As the table illustrates, seven of the 30 Dow components currently pay less than a 20% tax rate.

To broaden our horizons, we also looked at S&P 500 companies domiciled in the U.S., for which effective tax rate data is available for at least three of the five most recently reported fiscal quarters. Here are the 10 with the highest median effective income-tax rates:

Company | Ticker | Median effective income-tax rate (past five reported quarters)
Newmont Mining Corp.(NEM) | US:NEM | 50%
Hewlett Packard Enterprise Co.(HPE) | US:HPE | 46%
Nordstrom Inc.(JWN) | US:JWN | 45%
CenturyLink Inc.(CTL) | US:CTL | 45%
Aetna Inc.(AET) | US:AET | 44%
Centene Corp.(CNC) | US:CNC | 44%
TripAdvisor Inc.(TRIP) | US:TRIP | 43%
W.W. Grainger Inc.(GWW) | US:GWW | 41%
Hilton Worldwide Holdings Inc.(HLT) | US:HLT | 41%
Amazon.com Inc.(AMZN) | US:AMZN | 41%
Source: FactSet

And here are the 10 U.S.-domiciled S&P 500 companies with the lowest median tax rates over the past five reported quarters:

Company | Ticker | Median effective income-tax rate (past five reported quarters)
Mosaic Co.(MOS) | US:MOS | -40%
Apartment Investment and Management Co.(AIV) | US:AIV | -37%
Microchip Technology Inc.(MCHP) | US:MCHP | -35%
Arthur J. Gallagher & Co.(AJG) | US:AJG | -34%
Zimmer Biomet Holdings Inc.(ZBH) | US:ZBH | -33%
eBay Inc.(EBAY) | US:EBAY | -32%
Boston Scientific Corp.(BSX) | US:BSX | -17%
Thermo Fisher Scientific Inc.(TMO) | US:TMO | -4%
Welltower Inc.(HCN) | US:HCN | -3%
Kimco Realty Corp.(KIM) | US:KIM | -2%
Source: FactSet

If a company's tax rate is negative, it indicates a net loss, which means the company can carry the tax loss forward to other years and offset income that would otherwise be taxed.
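The loss-carryforward mechanics can be sketched with a toy model. The flat 35% statutory rate and unlimited, unrestricted carryforward below are simplifying assumptions for illustration, not actual tax law:

```python
# Toy model: a loss in one year offsets taxable income in later years,
# so the tax bill stays at zero until the accumulated loss is used up.
RATE = 0.35  # assumed flat statutory rate

def tax_with_carryforward(incomes):
    """Return yearly tax bills, carrying any losses forward."""
    carry = 0.0   # accumulated unused losses (zero or negative)
    bills = []
    for inc in incomes:
        taxable = inc + carry
        if taxable < 0:
            carry, taxable = taxable, 0.0  # still in the red: no tax owed
        else:
            carry = 0.0                    # loss fully absorbed
        bills.append(taxable * RATE)
    return bills

# A $100M loss wipes out tax on the next $100M of income (figures in millions):
bills = tax_with_carryforward([-100, 60, 80])
print(bills)  # [0.0, 0.0, 14.0] up to float rounding
```

Only in the third year, once cumulative income exceeds the earlier loss, does the remaining $40M become taxable.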

Read now: Why is Ireland the U.S. government's third-largest creditor?

-Philip van Doorn; 415-439-6400;

  (END) Dow Jones Newswires 


From: Sam10/7/2017 7:48:39 AM
   of 204
Building The Leading Facial Recognition System

Source: AP

Among the new features Apple reeled off for the iPhone X, Face ID is undoubtedly the most captivating. After six months of research, CommonWealth has determined that Tainan-based Himax Technologies Inc and industry leader Taiwan Semiconductor Manufacturing Co (TSMC) are the two major suppliers of Face ID’s key components. The two companies are thus keen to proclaim themselves Taiwan’s latest “world-leading” players.

By Liang-Rong Chen
web only
Bordering the Southern Taiwan Science Park to the east, Tree Valley Park was set up more than a decade ago to house Chimei Group’s LCD TV operations during their prime. Though Chimei has since exited that industry, the now verdant industrial park is nurturing the next prodigy of the high-tech industry.

A sturdy tech plant is taking shape, with cranes, grout trucks and hundreds of workers toiling day and night. Himax announced it is adding a new 8-inch wafer-level optics production line in its new plant, a US$80 million (approximately NT$2.4 billion) investment, its largest to date.

Construction of Himax’s wafer-level optics plant took shape in merely six months. The new plant is expected to begin operation by the start of next year, supporting facial recognition systems on Android phones.

Himax Technologies Inc was the driver IC supplier under Chimei Group, then the second-largest panel manufacturer in Taiwan. Chaired by Wu Biing-seng, the top aide to Chimei founder Hsu Wen-long, Himax and Novatek Microelectronics Corp were once the two largest driver IC companies in the world.

However, the latest momentum for its growth comes from optical lenses no bigger than a grain of rice. Himax has positioned itself as the new challenger to Largan Precision Co, whose shares carry the highest price on the Taiwan market.

“The minuscule glass lenses will be installed on the front of Apple’s forthcoming iPhone X, becoming the key component of its tantalizing Face ID system.”

Source: CNET

The Key Component of the Face Recognition System Comes From Himax

The wafer-level component is the product of a process technology of astounding efficiency and precision, made possible only by decades of painstaking effort and pooled resources from the semiconductor industry.

At the Himax headquarters some 600 meters away, the ground-floor office space, along with parts of the canteen and parking lot, has given way to the expanded optics plant, which ramped up shipments in the third quarter of this year.

Himax forecast on its earnings conference call that, with the new optics production lines in operation, its non-driver-IC business would drive a steep rise in sales, with expected revenue growth of NT$800 million.

It is public knowledge in the industry that the very specific and urgent need could only come from one client: Apple Inc.

On Himax’s second quarter conference call one month ahead Apple’s launch event, Wu Biing-seng’s brother, Himax CEO Jordan Wu was clearly animated, elaborating on for more than an hour.

Morgan Stanley analyst Charlie Chan posed a loaded question to Wu, asking what proportion of Himax’s revenue came from Android versus non-Android operating systems.

Jordan Wu declined to answer, because “as far as I know, there’s only one company (Apple Inc) running in the non-Android camp,” drawing a burst of laughter.

Yet he made mention of it in his answer to another query. “If there’s anyone claiming revenue from 3D sensors (the technology behind Apple’s Face ID), it could come from no other company than Apple Inc, for none of the competitors have such a feature yet,” he said.

It was seen as an indirect acknowledgment of the Apple partnership, since Jordan Wu had underscored 3D sensors as the top-priority shipment among Himax’s wafer-level optics products only minutes earlier.

A Himax spokesperson declined to comment. But a manager at a major company collaborating with Himax on 3D sensor products corroborated that Himax does ship its optics products to Apple.

TSMC’s secret weapon

Aside from Himax, TSMC was the other pivotal powerhouse in Apple’s 3D sensor scheme.

TSMC was commissioned to produce the NIR image sensor that sits on the front of the iPhone X, designed to receive laser light. Industry sources said TSMC might be the sole supplier, or Europe’s STMicroelectronics might split the order equally.

This was the very first time TSMC unsheathed its long-prepared secret weapon.

It’s rare knowledge that TSMC also produces digital cameras, surveillance camera and CMOS image sensor, which, however, has always trailed behind Japan’s Sony Corp and Korea’s Samsung Electronics Co., Ltd. “Despite 20 years of travails, TSMC still comes third,” said Liu Hsin-sen, the director in TSMC’s sensor and monitor sales department, during the company’s tech forum this May. It is a humongous disgrace to people working for TSMC, who seek to champion in every regard.

TSMC might finally have the chance of topping the world in the emerging NIR sensor field.

TSMC’s confidence lies in its capability of integrating the invisible-to-human-eye spectrum in the NIR range: 940nm wavelengths light.

The wavelength is considered ideal because sunlight contains little of it after scattering in the atmosphere.

The flood illuminator on the front of the iPhone X projects tens of thousands of 940-nanometer light points onto the face, and a TSMC-made camera then reads the pattern of the reflection. Because of the properties of the 940nm light source, Liu Hsin-sen said, the process “would hardly be interrupted by other rays of light,” regardless of surroundings.

The industry expects this band of laser light to be widely applied in other domains, from augmented reality devices to autonomous vehicles and other Internet of Things products, becoming a key light source in the booming field of machine vision.

Liu Hsin-sen went on to claim that future smartphones would carry at least three to four sets of “machine eyes” designed to read specific spectra, and that “according to our intelligence, TSMC is in the lead.”

Although Samsung and Sony have charged into the field, TSMC so far boasts the best quantum efficiency in the manufacturing process, reaching 35% at the 940-nanometer wavelength, which means its sensors can read the laser at lower power. As a result, TSMC’s parts offer better battery performance, which carries strategic importance in the smartphone market.
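The link between quantum efficiency and battery life follows from a simple relationship: for a fixed signal at the sensor, the photons, and hence the emitter power, needed scale roughly inversely with QE. A back-of-the-envelope sketch (my illustration, not from the article):

```python
# Rough first-order model: to collect the same number of signal electrons
# per pixel, a sensor with lower quantum efficiency (QE) needs its laser
# illuminator driven at proportionally higher power.
def relative_emitter_power(qe: float, reference_qe: float = 0.35) -> float:
    """Emitter power required, relative to a sensor with the reference QE (35%)."""
    return reference_qe / qe

# A sensor with half the QE needs roughly twice the emitter power.
print(relative_emitter_power(0.175))  # 2.0
```

This ignores optics, noise and duty cycle, but it captures why a QE advantage translates directly into lower illuminator power and better battery performance.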

Himax’s NIR sensor model is also built by TSMC, which Jordan Wu spoke highly of. “Its quantum efficiency is more than twice as much as its rivals in the market,” Wu said.

With the ultra-expensive model priced at NT$35,900 in Taiwan, iPhone X sales remain to be seen.

However, Himax is sanguine about the next two years. Not only will the new plant be completed toward the end of the year, but adjacent vacant land could accommodate another two plants, which might soon be put into operation, too.

“Judging from our client’s excitement,” said Jordan Wu in the conference call, “the plant expansion in the second quarter would happen much sooner than expected, to meet the upcoming demands of the next two or three years.”

That is because Qualcomm Inc, the largest maker of smartphone chips, has allied itself with Himax and TSMC to deliver a fully integrated 3D solution, designed to become standard equipment on billions of advanced Android phones worldwide.

“Apple’s technologies aren’t necessarily more advanced; they have their strengths, and so do we,” said Chang Chieng-chung confidently. Chang, Qualcomm’s vice president of engineering piloting the 3D solution, hails from Taiwan. He majored in electrical engineering at National Tsing Hua University and has worked at Qualcomm since completing his Ph.D. in the States.

A Taiwanese in Qualcomm

More than a month before the iPhone launch event, Chang had already demonstrated a feature resembling Apple’s Face ID to visiting Taiwanese media at Qualcomm’s San Diego headquarters. Once the user picks up the phone, it can recognize the user’s face within a few thousandths of a second and wake itself.

“This is what a true smartphone user interface should be like, booting itself once it sees you,” he said. “After all, the pet should be able to recognize the owner.”

Qualcomm architected the overall technology concept, and the most decisive part, the 3D depth-mapping algorithms that sketch out the human face, is held exclusively by Qualcomm.

Himax shouldered building the hardware from scratch. Jordan Wu proudly revealed on the conference call that Himax had formed an “A-team” to tackle the 3D sensor project. Several key components will be developed and manufactured by Himax itself, including the wafer-level optics and laser driver ICs; others include the NIR sensor made in cooperation with TSMC and the ASIC chips that accelerate the 3D depth-mapping algorithms.

An insider disclosed that the lasers, as in the iPhone X, are supplied by U.S. firm Lumentum, while Hong Kong’s Truly Opto-electronics Ltd. handles module assembly.

Jordan Wu dwelled on the company’s ability to independently develop the bulk of the major components. “We can tailor promptly to smartphone clients’ needs,” he said, “which poses exceedingly high hurdles for newcomers.”

Jordan Wu maintained that the 3D solution, dubbed “SLiM,” might sell for US$15-20 per unit, with Himax accounting for the major share of the content.

“This would be a game changer for Himax,” Wu continued. “If everything went as planned, it would take Himax to the very next level.”

As for Qualcomm, the dominant player in wireless communications, grander ambitions are at work.

Notably, SLiM works only with Qualcomm’s latest processors. Northland Capital Markets analyst Tom Sepenzis subsequently raised his target price on Qualcomm, believing Samsung and Huawei would be forced to adopt Qualcomm processors in some of their flagship models to rival the iPhone X, renouncing their recent efforts to shore up their own processor businesses.

The optimal strategy for Qualcomm is to aid the Android camp

Qualcomm has been feuding with Apple lately, facing off in court over licensing for modem chips. It may be only a matter of time before Apple expels Qualcomm from its supply chain.

The optimal and most pragmatic revenge would be to help the Android camp jockey for position against Apple’s newest innovations.

To Qualcomm’s dismay, KGI Securities’ Kuo Ming-chi, dubbed “the best Apple analyst on the planet,” alleged in his late-August report that Qualcomm lagged behind in both software and hardware development for 3D sensing and wouldn’t see significant shipments until 2019.

Qualcomm and Himax fired back. In a joint statement, they targeted mass production by the first quarter of 2018, less than three months after the iPhone X’s release date. “Our client runs a very bold schedule for mass production,” Jordan Wu said.

The market expects the first smartphone makers to adopt SLiM to be China’s Xiaomi and Oppo.

As the current smartphone strife escalates, Himax and TSMC stand to profit either way, with their products, wafer-level optics and NIR sensor manufacturing, adopted by both warring camps.

Himax’s wafer-level optics sales, and its NASDAQ-listed stock, had delivered a rather unremarkable performance, even though Google Glass and later HoloLens, Microsoft’s augmented reality (AR) viewer, both opted for Himax’s optics.

Nonetheless, with the stunning launch of Apple’s Face ID, Himax’s wafer-level optics business just might have its sales breakout.

The city of Tainan has fostered yet another “world champion” in a vanguard technology field.

Translated from the Chinese: Squirtle
Editor: Maureen Wang, Fiona Chou

Additional Reading
How Is TSMC Winning? Making Every Battle Count
Taiwan’s Champions of Transformation


From: Sam10/8/2017 11:16:44 PM
   of 204
Interesting history/pre-history from the Atlantic.

A New History of the First Peoples in the Americas
The miracle of modern genetics has revolutionized the story anthropologists tell about how humans spread out across the Earth.

Adam Rutherford Oct 3, 2017


The prevailing theory about how the people of the Americas came to those lands is via that bridge [between Alaska and Siberia]. We refer to it as a land bridge, though given its duration and size, it was simply continuous land, thousands of miles from north to south; it’s only a bridge if we view it in comparison to today’s straits. The area is called Beringia, and the first people across it the Beringians. These were harsh lands, sparse with shrubs and herbs; to the south, there were boreal woodlands, and where the land met the sea, kelp forests and seals.

Though these were still tough terrains, according to archaeological finds Western Beringians were living near the Yana River in Siberia by 30,000 BCE. There’s been plenty of debate over the years as to when exactly people reached the eastern side, and therefore at what point after the seas rose they became isolated as the founding peoples of the Americas. The questions that remain—and there are many—concern whether they came all at once or in dribs and drabs. Sites in the Yukon, near the border between Alaska and Canada, give us clues, such as the Bluefish Caves, 33 miles southwest of the village of Old Crow.

The latest radiocarbon-dating analysis of the remnants of lives in the Bluefish Caves indicates that people were there 24,000 years ago. These founding peoples spread over 12,000 years to every corner of the continents and formed the pool from which all Americans would be drawn until 1492. I will focus on North America here, and what we know so far, what we can know through genetics, and why we don’t know more.

continues at the link


From: Sam10/9/2017 8:13:16 AM
   of 204
Micron, NetFoundry and Microsoft Unlock Full Potential of IoT for All
  • Jeff Shiner
  • October 02, 2017
  • Memory Blog

The Internet of Things (IoT) has the potential to disrupt the global economy on a scale larger than any previous industrial revolution. But two major factors are holding the IoT back from the kind of massive growth where companies both big and small can reap the rewards:

  • How to monetize IoT deployments
  • How to secure IoT devices

Technology fragmentation, magnified by copious new technology types, has complicated the evaluation and implementation of IoT solutions and made it difficult to calculate their potential return on investment (ROI). These same fragmented approaches, coupled with a lack of resources to understand them, have also made security a huge roadblock. Firewall protection isn't enough. Only defense in depth, down to the device level, will assure the integrity of the entire IoT product life cycle.

    Large Fortune 100 companies tend to have vast resources for cybersecurity, networking and connectivity — along with more personnel to research and understand new technologies. Smaller companies rely heavily on industry collaborations to enable simpler end-to-end deployments that offer a clearer vision of the investment.

    Enter the new Micron, Microsoft and NetFoundry edge-to-cloud solutions that are enabling an ecosystem where companies of all sizes can flourish in the IoT.

    Device Integrity, Made Possible by Micron, Is Key

    Micron’s recently launched Authenta™ technology adds a strong layer of defense to a broad array of IoT devices. Micron’s flash memory with Authenta technology leverages existing standard nonvolatile memory sockets to add a unique level of hardware-based security that protects the integrity of the IoT device itself as well as the software that runs on the device.

    Micron Authenta technology provides protection for the lowest layers of IoT device software, starting with the boot process. By combining the unique device-specific identity only a hardware root of trust can offer, along with the measurement capabilities necessary for in-memory secure boot, Authenta technology provides a strong cryptographic fingerprint necessary to authenticate IoT devices directly with a host, such as a secure gateway, or from a host to the cloud. This kind of device integrity will enable additional functionality like hardware-based device attestation and provisioning as well as administrative remediation of the device.
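The measured-boot and attestation flow described above can be sketched generically. Authenta's actual interfaces are not given in this post, so the function names, the HMAC construction and the key below are stand-ins for the hardware root of trust, not Micron's API:

```python
# Generic sketch of measured boot + attestation: hash the boot-layer code to
# get a measurement, then authenticate that measurement with a device-unique
# key so a host (gateway or cloud) can verify device integrity.
import hashlib
import hmac

# Hypothetical secret; in real hardware this lives in the root of trust
# and is never exposed to software.
DEVICE_UNIQUE_KEY = b"example-device-secret"

def measure(boot_image: bytes) -> bytes:
    """Cryptographic measurement (fingerprint) of the boot-layer code."""
    return hashlib.sha256(boot_image).digest()

def attest(boot_image: bytes) -> bytes:
    """Tag the measurement with the device key so a verifier can check it."""
    return hmac.new(DEVICE_UNIQUE_KEY, measure(boot_image), hashlib.sha256).digest()

def verify(boot_image: bytes, tag: bytes) -> bool:
    """Host-side check: recompute the expected tag, compare in constant time."""
    return hmac.compare_digest(attest(boot_image), tag)

firmware = b"\x7fELF...bootloader bytes..."
tag = attest(firmware)
print(verify(firmware, tag))             # True: untampered image
print(verify(firmware + b"\x00", tag))   # False: modified image fails
```

The point of anchoring the measurement in memory hardware is that even a single flipped byte in the boot code changes the fingerprint, so tampering is detectable before the device is trusted.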

    In addition to the hardware, Micron will soon offer software development kits (SDKs) that make it easier to provide secure device management and connectivity for new platforms and devices. These SDKs also allow you to retrofit legacy systems, offering faster time to market with fewer resources.

    Gaining Trusted Access to Microsoft® Azure® IoT Hub

    To enable only trusted hardware to authenticate directly to the Microsoft® Azure® IoT Hub, Micron and Microsoft leveraged the Device Identifier Composition Engine (DICE), an upcoming standard from the Trusted Computing Group (TCG), and Micron’s Authenta-enabled memory. (Learn more on Microsoft’s Azure blog.) One key aspect of the combined solution is that the health and identity of an IoT device is verified in-memory where critical code is typically stored.
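The DICE idea can be illustrated in simplified form: a unique device secret (UDS) fixed in hardware is combined with a measurement of the first mutable code to derive the device identity, so any change to that code yields a different identity. The UDS value and helper names below are hypothetical, a sketch of the concept rather than the TCG specification:

```python
# Simplified DICE-style derivation: identity = HMAC(UDS, hash(first mutable
# code)), so identity is bound to both the device and the code it booted.
import hashlib
import hmac

UDS = b"unique-device-secret"  # hypothetical; fixed in hardware, never exposed

def compound_device_identifier(first_mutable_code: bytes) -> bytes:
    """Derive a device identity bound to the measured boot code."""
    measurement = hashlib.sha256(first_mutable_code).digest()
    return hmac.new(UDS, measurement, hashlib.sha256).digest()

cdi_a = compound_device_identifier(b"bootloader v1")
cdi_b = compound_device_identifier(b"bootloader v2 (modified)")
print(cdi_a != cdi_b)  # True: any code change yields a different identity
```

Because the identity changes whenever the first mutable code changes, a cloud service such as an IoT hub can refuse devices whose derived identity no longer matches the expected, healthy firmware.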

However, there is an untrusted onramp between Micron's Authenta-enabled memory and the Azure IoT cloud — a gap that is filled by NetFoundry.


    Edge-to-Cloud Connectivity Enabled by NetFoundry™ MultiCloud Connect

The NetFoundry™ MultiCloud Connect solution provides application-specific networks (ASNs) on demand over the public internet, enabling ultra-secure, high-performance edge-to-cloud and cloud-to-cloud connections with Azure, Amazon Web Services®, Google® Cloud, IBM® Bluemix® and many others. First, this solution eliminates the need for dedicated, expensive private links for your IoT devices to securely access off-site resources. Second, MultiCloud Connect lets your IoT devices be mobile, eliminating the need for fiber or hardwired communication. If you have internet access (WiFi, LAN, LTE, 4G, 5G, and so on), the NetFoundry ASN will be ready.

    NetFoundry leverages Micron’s strong device identity and hardware roots of trust to securely and reliably deliver IoT solutions over the NetFoundry platform using AppWANs. Each AppWAN is driven by the identity, context or policy, and performance requirements for the specific application. The NetFoundry platform has a flexible form factor to meet various solution needs. It can be embedded in your application via APIs, a client application running on your end device, or as a virtualized gateway running on your x86 appliance. The figure above shows the NetFoundry platform running as a virtualized gateway on the Dell® Automotive Edge 3000 Gateway appliance. (NetFoundry is a Dell IoT Solutions Partner.)

    Automation is a critical component in any IoT deployment due to the sheer volume and policy requirements that IoT endpoints have. The NetFoundry platform provides zero-touch onboarding that leverages the Micron Authenta device-specific identity so that each IoT endpoint automatically accesses network-wide services and resources based on your defined policy. These ASNs have built-in performance and path remediation to ensure highly secure sessions from the endpoint to the Azure IoT Hub or other required destinations, along with optimized performance and application responsiveness.

    Live at IoT Solutions World Congress

    This week, I’m at IoT Solutions World Congress 2017 in Barcelona where Micron, NetFoundry and Microsoft Azure will showcase these next-generation solutions that are establishing a strong, trusted link from the true edge to the cloud. If you’re attending the event, stop by and say hello. We can talk more about how you can take advantage of this end-to-end ecosystem, which is built on a strong chain of trust and set to simplify how companies quickly comprehend IoT deployment resources and the ROI model.


    From: Sam 10/10/2017 6:36:01 AM
       of 204
    Columbus Day Has Drawn Protests Almost From Day 1
    OCT. 9, 2017

    A reverend at Calvary Baptist Church in Manhattan appeared on the front page of The New York Times after he criticized Christopher Columbus, the Italian navigator who sailed to the Americas on behalf of Spain in 1492.

    The reverend, R. S. MacArthur, said Columbus was “cruel, and guilty of many crimes.”

    That complaint may sound familiar to those who condemn the explorer for opening a door to European colonialism, which brought disease, destruction and catastrophic wars to the people who already lived here.

    But Mr. MacArthur said those words more than a century ago, in 1893. His comments suggested he was more affronted by Spain, which he called “the poorest and most ignorant country in Europe,” than concerned about Native Americans.

    He was one of many to have questioned the legacy of the explorer, whose arrival in the Americas has been celebrated in the United States for hundreds of years.

    The makings of a holiday

    Americans commemorated Columbus’s first landing in the Caribbean at least as early as 1792, when members of the Tammany Society of New York and, separately, the Massachusetts Historical Society in Boston, gathered to mark the 300th anniversary of the day the Spanish ships made landfall.

    In 1892, President Benjamin Harrison said the entire country should observe “Discovery Day” to mark the 400th anniversary of Columbus’s landing. It was formally designated as a recurring national holiday on Oct. 1, 1934, when President Franklin D. Roosevelt proclaimed that Oct. 12 would be a day to display the American flag and engage in “appropriate ceremonies in schools and churches” every year. (It was later changed to the second Monday of October.)

    During the 19th and 20th centuries, the holiday was often considered a celebration for Italian-Americans and Catholics. Churches and organizations such as the Knights of Columbus sometimes used Columbus Day gatherings to publicly condemn discrimination against Catholics.

    But Columbus’s role as an Italian representative would be complicated by World War II and the rise of Benito Mussolini.

    ‘Viva Mussolini’ in New York

    In 1936, four years before Mussolini, the Italian dictator, formally declared war to fight alongside Adolf Hitler, rumors swirled that fascist sympathizers were helping to organize the Columbus Day celebrations in New York City.

    In 1938, when thousands gathered at Central Park for the Columbus Day festivities, some shouted “Viva Mussolini” while listening to speeches. “The gathering was definitely sympathetic toward the Fascist regime in Italy,” The Times reported.


    Signs calling for the abolition of Columbus Day at a protest in Flagstaff, Ariz., last year. Credit Jake Bacon/Associated Press

    And in 1943, about a month after Italy surrendered to the Allies, The Times reported that Columbus Day celebrations in recent years had been “increasingly embarrassing.”

    “All of us wanted to pay tribute to the discoverer and through him to other great Italians and to the many notable Italian achievements, but most of us drew the line at celebrating Benito Mussolini, or his friends, or his regime,” it reported.

    Still, the celebration continued annually. And during the first Columbus Day parade after World War II, “the plight of Italy was dramatized” and marchers made appeals for aid to help the country recover.

    ‘We were here first’

    In the decades since, Columbus came to be seen less as an explorer representing Italians and more as a European colonizer whose journeys led to the decimation of American indigenous populations.

    Those ideas picked up steam during the early 1990s, when criticism about the explorer’s legacy became increasingly visible in cities including Boston, Denver, Philadelphia and Berkeley, Calif., amplifying many Native Americans’ longstanding complaints about the holiday.

    “We were here first,” Ray Geer, a Paucatuck Eastern Pequot and president of the Connecticut River Powwow Society, said to The Times in 1991. “We find the notion that Columbus discovered us extremely distasteful.”

    The push is continuing. Last week, Salt Lake City and Los Angeles County decided to recognize Indigenous People’s Day on the second Monday of every October.

    In New York City, officials have entertained ideas about taking down the iconic statue at Columbus Circle, eliciting some backlash from Italian-American groups. Another statue of Columbus in Central Park was vandalized with red paint and graffiti last month.

    Who needs ‘The Sopranos’?

    Opposition to Columbus Day festivities has come in more prosaic forms, too.

    In 1911, the real estate association representative Abraham Korn urged New York City officials not to spend $50,000 on the celebration. “We’ve done enough for Christopher Columbus by making a holiday,” he said. “No money should be spent in this way for fireworks.”

    In 1949, the Fifth Avenue Association beseeched Mayor William O’Dwyer of New York to reroute the parade so as not to disrupt shoppers in Midtown Manhattan. “The association held that the disruption of traffic caused severe inconvenience to the general public, shoppers and property owners and thereby resulted in a serious loss of business.”


    Mayor Michael Bloomberg of New York and the actors Lorraine Bracco and Dominic Chianese, from the hit HBO show “The Sopranos,” did not attend the Columbus Day parade in Manhattan in 2002. Instead, they ate lunch at Dominick’s in the Bronx. Credit Kelly Guenther

    And 15 years ago, there was a Columbus Day dust-up after Italian-American groups learned that Mayor Michael R. Bloomberg wanted to march with actors from the hit television show “The Sopranos.”

    They were not happy, and the Times columnist Clyde Haberman wrote on Oct. 22, 2002, that there was indignation and offense on all sides, including that of the mayor, who “deemed himself the injured party and skipped the parade, heading instead to the Bronx for linguine marinara.”


    From: Sam 10/10/2017 6:43:56 AM
       of 204
    Is Fat Killing You, or Is Sugar?
    What we do and don’t know about dietary science.
    By Jerome Groopman
    April 3, 2017

    In the early nineteen-sixties, when cholesterol was declared an enemy of health, my parents quickly enlisted in the war on fat. Onion rolls slathered with butter, herring in thick cream sauce, brisket of beef with a side of stuffed derma, and other staples of our family cuisine disappeared from our table. Margarine dethroned butter, vinegar replaced cream sauce, poached fish substituted for brisket. I recall experiencing something like withdrawal, daydreaming about past feasts as my stomach grumbled. My father’s blood-cholesterol level—not to mention that of his siblings and friends—became a regular topic of conversation at the dinner table. Yet, despite the restrictive diet, his number scarcely budged, and a few years later, in his mid-fifties, he had a heart attack and died.

    The dangers of fat haunted me after his death. When, in my forties, my cholesterol level rose to 242—200 is considered the upper limit of what’s healthy—I embarked on a regimen that restricted fatty foods (and also cut down on carbohydrates). Six months later, having shed ten pounds, I rechecked my level. It was unchanged; genes have a way of signalling their power. But as soon as my doctor put me on just a tiny dose of a statin medication my cholesterol plummeted more than eighty points.

    In recent decades, fat has been making a comeback. Researchers have questioned whether dietary fat is necessarily dangerous, and have shown that not all fats are created equal. People now look for ways of boosting the “good cholesterol” in their blood and extol the benefits of Mediterranean diets, with their emphasis on olive oil and fatty nuts. In some quarters, blame for obesity and heart disease has shifted from fat to carbohydrates. The Atkins diet and, more recently, the paleo diet have popularized the idea that you can get slim eating high-protein, high-cholesterol foods.

    Still, I remained wary of the delicacies of my childhood. Surely it was wiser simply to avoid fats altogether? I wavered, though, in 2013, when The New England Journal of Medicine published an article endorsing the salubrious effects of Mediterranean eating habits. The article detailed the results of a study, the most rigorously scientific one yet conducted on the issue, which showed that following a Mediterranean diet rich in either olive oil or nuts could reduce the risk of heart attack, stroke, or death from cardiovascular causes by thirty per cent. I was elated until my wife, an endocrinologist who is an expert on metabolism, pointed out that the headline number of thirty per cent emerged from the complex statistical way that the study’s results were projected over time. If you looked at what happened to the people in the study, the picture was less encouraging: 3.8 per cent of the people consuming olive oil and 3.4 per cent of the people eating nuts suffered cardiovascular misfortune, compared with 4.4 per cent of the group on a regular diet. The true difference in outcome between the two diets was, at best, one per cent.
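    The gap between the study's headline number and the raw event rates is essentially the difference between relative and absolute risk reduction. A quick sketch, using the event rates quoted above (note that the study's own thirty-per-cent figure came from hazard ratios projected over time, which this simple arithmetic on raw rates does not reproduce):

```python
# Event rates for cardiovascular misfortune reported in the study:
control = 0.044    # regular diet
olive_oil = 0.038  # Mediterranean diet with olive oil
nuts = 0.034       # Mediterranean diet with nuts

def absolute_reduction(treated, control):
    """Percentage-point drop in the event rate."""
    return control - treated

def relative_reduction(treated, control):
    """Drop in the event rate as a fraction of the control rate."""
    return (control - treated) / control

print(f"olive oil: ARR {absolute_reduction(olive_oil, control):.1%}, "
      f"RRR {relative_reduction(olive_oil, control):.1%}")
print(f"nuts:      ARR {absolute_reduction(nuts, control):.1%}, "
      f"RRR {relative_reduction(nuts, control):.1%}")
# olive oil: ARR 0.6%, RRR 13.6%
# nuts:      ARR 1.0%, RRR 22.7%
```

    A large-sounding relative reduction can coexist with an absolute difference of a percentage point or less, which is exactly the author's point about the olive-oil and nut groups.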

    It’s one of many cautionary tales about assessing dietary data. Everyone wants to be healthy, and most of us like eating, so we’re easily swayed by any new finding, no matter how dubious. Publishers know this all too well and continually ply us with diet and health books of varying degrees of respectability and uplift. The most prominent on the current menu are Sylvia Tara’s “The Secret Life of Fat” (Norton) and “The Case Against Sugar,” by Gary Taubes (Knopf). Both present a range of cutting-edge dietary research, both say that fat is unfairly maligned, and both inadvertently end up revealing that the science behind their claims is complex and its findings hard to translate into usable advice.

    Sylvia Tara is a freelance writer who holds a doctorate in biochemistry and an M.B.A.; she has worked at McKinsey and on the management side of various biotech companies. Drawing on insights from both science and consulting, she has produced a book that is part physiology and part marketing pitch. Tara wants us to view lipids positively. Once we stop treating fat “like a vicious enemy,” she argues, it “could become beloved once again.”
    But Tara’s attitude to fat is more ambiguous than this statement suggests. She claims to be obsessed with her figure, measuring her worth by how well she fits into skinny jeans. In her telling, the spur to her investigations comes from her envy of a friend who stays svelte despite gorging on beer and burritos, drinking sugary lattes, and never exercising. Tara, who writes that she gains weight easily, is interested in the question of why some people eat like hogs and stay thin, while others expand no matter how abstemious they try to be.

    The book is a useful primer on the biology of fat. Fat comes in different forms, categorized by color. White fat, the type that we seek to lose when overweight, stores energy. Brown fat, normally found in the neck, back, and around the heart, is filled with tiny structures called mitochondria, and serves as a furnace to burn energy for body heat. A third type, beige fat, was identified some five years ago; during exercise, it receives messages from our muscles to morph into brown fat. Moreover, fat should not be characterized simply as inert blubber. It is the vehicle by which our cells receive certain essential nutrients, like Vitamins A, D, E, and K. The myelin sheaths around our nerves are eighty per cent lipids, “which means fat is actually required to think,” Tara writes. Studies by Jeffrey Friedman, at the Rockefeller University, have shown that the hormone leptin travels from fat cells to the hypothalamus, a part of the brain which is involved in regulating appetite. “Friedman’s discovery redefined fat,” Tara writes. “It was a verifiable endocrine organ with wide influence to our bodies. Through leptin, fat could talk. It could tell the brain to stop eating.”

    All this will be illuminating for many readers, but Tara is a less reliable guide when she uncritically embraces various new theories about the causes and effects of obesity. She trumpets the findings of a Turkish physician, Gökhan Hotamisligil, whose work suggests that a molecule known as TNF-alpha, which has potent inflammatory properties, may be the link between obesity and Type 2 diabetes—a condition arising when the body becomes resistant to insulin, a hormone that we need in order to process sugar. (Though there’s a clear correlation between diabetes and obesity, no one has yet discovered a causal link.) Hotamisligil’s experiments showed that not only is TNF-alpha produced by fat; it also can cause resistance to insulin. “This discovery was big news,” Tara writes. However, she fails to specify that the finding was in rodents, and that subsequent studies in humans, including some by Hotamisligil, did not show the same results.

    Tara also speculates that viruses may cause obesity. The research she draws on here is obscure and unconvincing. It concerns a virus called Ad-36, which infects fowl and can make chickens fat. In the studies Tara cites, more overweight people appeared to have antibodies to Ad-36—suggesting that they had been infected in the past—than slim people did. There are many reasons to be skeptical: there’s no evidence that fowl can pass Ad-36 to humans, and there are many viruses that could easily be mistaken for Ad-36.

    As with many books on diet, “The Secret Life of Fat” alternates exposition with prescription. But the idea that understanding lipids at a molecular level will help you stay trim seems far-fetched. It’s telling that Tara’s final triumph—managing to fit back into her skinny jeans—has little to do with her sophisticated understanding of fat. Rather, she follows the advice of Mark Sisson, a “fitness educator” who fasts eighteen hours a day, and who, at sixty-two, she writes, is “muscular and fit and looks every bit like the Malibu surfer he is.” Tara lost weight by restricting her daily intake to at most a thousand calories and by intermittent total fasts.

    This is hardly a healthy note to end on, yet elsewhere Tara seems to take aim at our destructive cultural fixation on body image. Fat was prized in the past, she notes, with big bellies signalling access to plentiful food and, thus, prosperity. The Buddha’s belly “is a major part of his brand,” she writes. (Such consultant-speak seems odd in the context of religion.) The porcine aristocrats one sees in eighteenth-century portraits are frequently shown near tables overflowing with delicacies. The women’s bodies depicted in canvasses by Peter Paul Rubens have long since made “Rubenesque” a euphemism for plus-size. And, if one goes far enough back, the huge bellies and buttocks of the Paleolithic “Steatopygian Venus” figures that have been found across much of Europe suggest that fat can connote fertility and desirability.

    Tara digs up examples of Americans celebrating fat as late as the latter half of the nineteenth century. Ladies’ Home Journal gave tips on gaining weight, as did an 1878 book titled “How to Be Plump.” Still, the nineteenth century in general was more notable for a growing concern with being slim, as has been shown by previous writers, such as Gina Kolata, whose “Rethinking Thin” (2007) itself draws heavily on Hillel Schwartz’s remarkable history “Never Satisfied” (1986). Lord Byron, who struggled with his weight, swore by vinegar; at other times, he ingested just a single raisin a day, supplemented by a glass of brandy. Women in the nineteenth century stuffed themselves into near-suffocating corsets to achieve an hourglass figure with an unnaturally tiny waist. Weight-loss regimens included consuming soap, chalk, pickles, digitalis, camphor tea, grapefruit (which was thought to contain fat-dissolving enzymes), potassium acetate (a diuretic), and ipecac (which induces vomiting). People tried sweating their fat away in rubber suits, or squeezing it away in a pressurized reducing machine.

    Indeed, the weight-loss fads of past centuries include precedents for all the main contemporary diets, from low-fat, low-calorie ones to high-fat, low-carbohydrate ones, like the Atkins diet. In 1825, a French lawyer, Jean Anthelme Brillat-Savarin, wrote a famous treatise, “The Physiology of Taste,” in which he contended that true carnivores and herbivores did not get fat; it was only when one ingested grain—read: bread—that the trouble started. Around the same time, an American Presbyterian minister, Sylvester Graham, reasoned that, as gluttony was the greatest sin, abstinence must lead to virtue; he advised eating vegetables and drinking water, eschewing meat, coffee, spices, and alcohol. For a while, students and faculty at Oberlin College were made to follow Graham’s diet; graham crackers were so named in order to appeal to his acolytes. Several years later, Horace Fletcher, known as “the great masticator,” touted very slow chewing as the remedy for obesity; adherents included normally skeptical people like Upton Sinclair and John D. Rockefeller.

    A genuine advance, which put nutrition on a solid scientific footing for the first time, was the work of the chemist Wilbur Atwater. In the eighteen-nineties, he began studying how the body converted food to energy, by placing subjects in a closed chamber and measuring the amount of carbon dioxide they produced and oxygen they consumed after eating various foods. Atwater came up with the idea of the food calorie, adapting a measurement previously used for heat energy. In 1917, Herbert Hoover, then the head of the U.S. Food Administration, worked to publicize Atwater’s findings. “I eat as little as I can to get going,” he said. Low-calorie foods and skipping meals became popular. The importance of calories—if energy gained exceeds output, the excess becomes fat—remains one of the few unchallengeable facts in the field of dietary science. Still, further research has shown that calories eaten are only part of what determines weight. Our metabolism reflects an interplay of things like genes, hormones, and the bacteria that populate the gut, so how much energy we absorb from what we eat varies from person to person.

    In the nineteen-fifties, the American Medical Association identified obesity as the country’s No. 1 health problem, and the diet industry exploded. The end of that decade brought the idea of the liquid diet—skimmed milk, supplemented with bananas or other fruit—which, in turn, eventually gave rise to products like Metrecal, Carnation Slender, and SlimFast. Self-help groups modelled on Alcoholics Anonymous began proliferating with the establishment, in 1948, of a movement called TOPS (the acronym stood for “take off pounds sensibly”). Overeaters Anonymous followed, in 1960; Weight Watchers, in 1963; and Jenny Craig, in 1983.

    The immediate postwar years also brought the first sustained scientific assault on dietary fat. Ancel Keys, a physiologist at the University of Minnesota, who had spent the war developing nutritionally optimal Army rations and studying the effects of starvation, became interested in the high rates of heart attack among a seemingly well-fed sector of the population—American businessmen. He soon became convinced that the saturated fats found in meat and dairy products were the cause, and thus began the war on fat that galvanized my parents. Keys became, with his wife, Margaret, an advocate for the Mediterranean diet of unsaturated fats. Their books promoting the diet were best-sellers, and Keys, who spent his latter years in Italy, lived to the age of a hundred. (Margaret lived to ninety-seven.)

    The author of “The Case Against Sugar,” Gary Taubes, gained prominence as a science writer in 2002, with a cover story in the Times Magazine—“What If It’s All Been a Big Fat Lie?”—which challenged the orthodoxy of restricting dietary fat. Carbohydrates were the real danger, he wrote—not just processed foods containing refined sugars like sucrose and fructose but also easily digestible starches from grains and vegetables. He expanded these arguments in a book, “Good Calories, Bad Calories” (2007), and, in his new book, he goes much further. Though he now allows that people can eat some carbohydrates and still live a “relatively” healthy life, he sees sugar as the devil incarnate, doing harm independent of its known role in causing obesity. Taubes believes that a wide range of seemingly unrelated diseases—“diabetes, heart disease, cancer, stroke, and Alzheimer’s, which account for five of the top ten causes of death in the U.S.”—are in fact linked, and that dietary sugar is the cause of them all, as well as of “other disorders that associate with these illnesses, among them polycystic ovary syndrome (PCOS), rheumatoid arthritis, gout, varicose veins, asthma, and inflammatory bowel disease.” In addition, he aims at showing that the food industry has systematically tried to obstruct scientific research that exposes the dangers of sugar, just as tobacco companies tried to hide the risks of smoking.

    The latter claim is the more persuasive. Taubes, a pugnacious writer who clearly relishes the role of muckraker, digs up a long history of attempts to discredit charges against sugar and to point the finger at fat as the primary dietary cause of disease. In 1943, when sugar, dismissed by the government and medical organizations as “empty calories,” was being rationed as part of the war effort, sugar companies formed a trade association “to set the record straight.” It devised a two-pronged strategy: support scientists who endorsed the notion that sugar was a valuable source of dietary energy without any specific health risks; and then mobilize these experts in a public-relations campaign. A prominent Madison Avenue firm was hired to design a public-health campaign that would “establish with the broadest possible audience—virtually everyone is a consumer—the safety of sugar as a food.” Among the scientists they supported was Ancel Keys, the Mediterranean-diet pioneer; his work influenced the dietary guidelines of the American Heart Association and the American Diabetes Association. Fred Stare, an influential nutritionist at Harvard, received not only research funding but a donation of more than a million dollars, from the General Foods Corporation (whose products included Kool-Aid and Tang), to build a new department. He proclaimed that it was not even “remotely true” that “modern sugar consumption contributes to poor health.” Taubes recounts that Stare appeared on talk shows on more than two hundred radio stations as a “front man to dismiss anti-sugar sentiments publicly.”

    The spread of diet crazes and of obesity anxiety in the fifties alerted the sugar industry to the fact that its product was at risk. Diet sodas with artificial sweeteners were gaining market share. The sugar industry responded in two ways: by stressing how important sugar was as an energy source for children (“neither a weight reducing nor fattening food”); and by discrediting artificial sweeteners such as saccharin and cyclamates as health dangers. It funded research at the Wisconsin Alumni Research Foundation and at the Worcester Foundation for Experimental Biology, which managed to find various adverse effects from consumption of cyclamates in rats. The latter achieved this by giving rats an absurd dose—the equivalent, in human terms, of five hundred and thirty cans of Fresca. Nonetheless, the F.D.A. eventually banned cyclamates as a health risk.

    Though Taubes’s account of these little-known manipulations is useful, he overreaches in blaming sugar for such a wide range of diseases. In attempting to lump them together, he cherry-picks from a variety of recent research. For instance, some epidemiological surveys have shown that when people move from the developing world to the West they change diet and often become obese, leading to an increased incidence of diseases, including diabetes and cancer. And other diseases, such as Alzheimer’s, appear on Taubes’s list, because researchers have studied whether they are linked to insulin resistance.

    Synthesizing these conjectures, Taubes sees insulin resistance as the bedrock disturbance in the body which unleashes a cascade of other hormonal and inflammatory molecules that attack the brain (causing dementia), degrade the heart and the blood vessels (causing heart attack and stroke), disturb the metabolism of uric acid (causing gout), and so on. He then attempts to build his case as a prosecuting attorney by means of a chain of “if/then” statements, such as “If sugar and high-fructose corn syrup are the cause of obesity, diabetes, and insulin resistance, then they’re also the most likely dietary trigger of these other diseases.” He invokes Occam’s razor, a concept adopted by medieval philosophers and theologians, which holds that explanations should rely on the smallest possible number of causes. “If this were a criminal investigation, the detectives assigned to the case would start from the assumption that there was one prime suspect,” Taubes writes.

    Occam’s razor is hardly a fundamental law of the universe, however. No credible scientist would ever think of using it to prove or disprove anything. And Taubes neglects findings that contradict his idea that diabetes—and, by extension, sugar—is at the root of all our troubles. A study of the diabetes drug metformin, published two years ago in The Lancet, failed to show any impact on the treatment of pancreatic cancer. A placebo-controlled trial in which insulin was given to dementia patients did not find a meaningful improvement in cognition. Indeed, there is no conclusive evidence that excess dietary sugar per se causes diabetes. Most important, Taubes’s assertion that all these diseases are “closely related” is not scientifically supported. The biological behavior of cancer—DNA mutations, aberrant growth, metastatic spread—is nothing like that of diabetes. Inflammatory-bowel disease, a complex disorder that appears to have a variety of genetic underpinnings, does not seem to be caused by any particular diet or substance, and there is no evidence that restricting sugar ameliorates it. The attempt to characterize Alzheimer’s as “type-III diabetes,” linking it to insulin resistance and inflammation, is likewise speculative.

    The temptation to draw facile connections is ever-present in medical research, and the most valuable current work on these conditions is a matter not of grand unified theories but of a multiplicity of very fine-grained observations. Taubes is critical of scientists’ tendency to see disorders as “multifactorial” and “multidimensional”—that is, as arising from a complex interplay of factors. Lung cancer, he argues, is also multifactorial (most smokers don’t get it and many non-smokers do), yet no one disputes that smoking is the primary cause. But cigarette smoke contains carcinogens, molecules that have been shown to directly transform normal cells into malignant ones by disrupting their DNA. There’s no equivalent when it comes to sugar. Taubes surmises a causal link by citing findings that cancer cells need glucose to thrive, and absorb more of it than other cells. But this proves nothing: malignant cells consume in abundance not only carbohydrates like glucose and fructose but other nutrients, like vitamins. To imagine that, just because cancer cells like glucose, elevated levels of it might prompt healthy cells to become cancerous is to take a vast, unsubstantiated leap.

    Ultimately, Taubes’s indictment of sugar as the leading culprit in virtually all modern Western maladies doesn’t provide enough evidence for us to convict. That doesn’t mean sugar is without dangers: it certainly plays a role in the development of obesity, to say nothing of dental cavities. But these are lesser charges, and they make for a less dramatic headline.

    Taubes’s big claims get our attention, of course, but for people suffering from these diseases they’re not just a harmless rhetorical strategy. A woman I know who recently emerged from chemotherapy treatment for ovarian cancer and is now in remission told me that she was terrified after reading Taubes’s book. She asked if eating chocolate would make her tumor start growing again.

    The problem with most diet books, and with popular-science books about diet, is that their impact relies on giving us simple answers, shorn of attendant complexities: it’s all about fat, or carbs, or how many meals you eat (the Warrior diet), or combinations of food groups, or intervalic fasting (the 5:2 diet), or nutritional genomics (sticking to the foods your distant ancestors may have eaten, assuming you even know where your folks were during the Paleolithic era). They hold out the hope that, if you just fix one thing, your whole life will be better.

    In laboratories, it’s a different story, and it sometimes seems that the more sophisticated nutritional science becomes the less any single factor predominates, and the less sure we are of anything. Today’s findings regularly overturn yesterday’s promising hypotheses. A trial in 2003, led by researchers at the University of Pennsylvania, compared an Atkins diet, high in fat and low in carbohydrates, with a low-fat, high-carbohydrate, low-calorie one. After a year, there were no significant differences in how much weight the people in each group had lost, or in their levels of blood lipids—including their LDL cholesterol, the primary concern for heart attack and stroke. In a follow-up study in 2010, participants who followed either a low-carbohydrate or a low-fat diet ended up losing about the same amount of weight (seven kilograms) after two years. It was impossible to predict which diet would lead to significant weight loss in any given individual, and, as most dieters well know, sustaining weight loss often fails after initial success.

    Other research seems to undermine the whole idea of dieting: extreme regimens pose dangers, such as the risk of damaged kidneys from a buildup of excess uric acid during high-protein diets; and population studies have shown that being a tad overweight may actually be fine. Even studying these issues in the first place can be problematic. Although the study of the Mediterranean diet, for example, reflects randomized controlled experiments, most nutritional studies are observational; they rely on so-called food diaries, in which subjects record what they remember about their daily intake. Such diaries are notoriously inexact. No one likes admitting to having indulged in foods that they know—or think they know—are bad for them.

    Science is an accretion of provisional certainties. Current research includes much that is genuinely promising—several groups have identified genes that predispose some people to obesity, and are studying how targeted diets and exercise can attenuate these effects—but the more one pays attention to the latest news from the labs the harder it becomes to separate signal from noise. Amid the constant back-and-forth of various hypotheses, orthodoxies, and fads, it’s more important to pay attention to the gradual advances, such as our understanding of calories and vitamins or the consensus among studies showing that trans fats exacerbate cardiovascular disease. What this means for most of us is that common sense should prevail. Eat and exercise in moderation; maintain a diet consisting of balanced amounts of protein, fat, and carbohydrates; make sure you get plenty of fruit and vegetables. And enjoy an occasional slice of chocolate cake.

    This article appears in other versions of the April 3, 2017, issue, with the headline “Food Fights.”

    Jerome Groopman, a staff writer since 1998, writes primarily about medicine and biology.



    Copyright © 1995-2018 Knight Sac Media. All rights reserved. Stock quotes are delayed at least 15 minutes - See Terms of Use.