Sam's miscellany
From: Sam 7/26/2017 7:45:48 AM
The Chipotle Corporate Sabotage Theory Returns
Some people see a simple case of foodborne illness. Others see corporate sabotage.
By Deena Shanker
July 25, 2017, 5:00 AM EDT

Yet another outbreak of foodborne illness last week at Chipotle Mexican Grill did what it usually does to the burrito chain: The stock price plummeted. It's bad news—particularly for the patrons who got sick—but it's a boon for anyone who had the foresight to short the stock.

The latest outbreak was first noted by a website that crowdsources reports of customer illnesses following visits to restaurants. The goal, it says, is "safer food, safer communities and a healthier economy." Yet, as Bloomberg reported last week, hedge funds looking to profit from others' bad luck can also access a "souped up" version of the site for a $5,000 monthly fee.

Aaron Allen, principal at Aaron Allen & Associates, a restaurant industry consultancy, posited in a LinkedIn post on Monday morning that the Chipotle illness might not just be a matter of luck. "A lot of things stacked up that made it suspicious," he told Bloomberg in an interview on Monday, "and when you look at it from a statistical point of view, even more suspicious." His group has no financial interest in the chain, Allen said, and he has previously lauded the chain's pre-scandal marketing.

He's not the first to publicly speculate about the possibility that corporate sabotage is behind Chipotle's woes. (There was even a recent plot line on Showtime's "Billions" in which a hedge fund manager contaminated the customers of a fictional company, Ice Juice, whose stock he had just shorted.) A similar theory—which was largely dismissed—circulated two years ago when the chain was hit by outbreaks of E. coli, norovirus, and salmonella. The stock price plummeted at the time, costing the company billions of dollars in market capitalization.

While Allen can't prove his theories about Chipotle, he argues that corporate sabotage of a similar nature has happened in the past. A woman planted a severed finger in her Wendy's chili more than a decade ago. In the 1980s, Tylenol capsules were purposely tainted with potassium cyanide, leading to the deaths of seven people in the Chicago area.

Plus, there was a lot of money on the table for Chipotle short sellers—as much as $459 million, according to Allen's calculations—before there were any inklings of a food safety problem. "Chipotle short-sellers saw their ambitions rewarded with $55 million in less than one day," he wrote of the latest scare.

Allen said his team, which includes a statistician and food safety experts, found a number of "statistical anomalies." First, he wrote in his post, the time of year raises questions; while 60 percent of food safety outbreaks occur from December to May, Chipotle's happened from August to December. Second, Chipotle experienced four times the number of norovirus outbreaks a chain of its size would be expected to have, and that's not even counting the E. coli and salmonella. And, he notes, significantly more people got sick from each of the outbreaks than normally do from the same pathogens. The average norovirus outbreak causes about 18 illnesses, while salmonella usually leads to about 25. At Chipotle, more than 200 people were sickened from a single August 2015 norovirus outbreak, and 64 fell ill from the same month's salmonella problem.
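As a rough illustration of why "four times the expected number" reads as an anomaly, here is a minimal sketch of the kind of back-of-the-envelope check a statistician might run. The numbers are hypothetical stand-ins, not Allen's actual figures: it assumes outbreak counts follow a Poisson distribution and that a chain of Chipotle's size would expect about one norovirus outbreak in the period, against four observed.

```python
from math import exp, factorial

def poisson_tail(k: int, lam: float) -> float:
    """P(X >= k) for a Poisson random variable with mean lam."""
    # 1 - P(X <= k-1), summed directly since k is small here
    return 1.0 - sum(exp(-lam) * lam**i / factorial(i) for i in range(k))

# Hypothetical numbers for illustration only: suppose a chain of
# Chipotle's size would expect ~1 norovirus outbreak in the period,
# but 4 were observed (the "four times expected" claim).
expected = 1.0
observed = 4
p = poisson_tail(observed, expected)
print(f"P(>= {observed} outbreaks | mean {expected}) = {p:.4f}")  # ~0.019
```

Under these assumed inputs the chance of seeing four or more outbreaks by bad luck alone comes out under 2 percent, which shows how a pattern like this can look "suspicious" statistically without proving anything about its cause.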

"We're not saying this as a definitive," he said. "But if you were a short seller and you were looking for where there would be the most financial gain in the restaurant industry, the best way is a food safety scare, and the best stock would be Chipotle."

Chipotle did not respond to Allen's post but told Bloomberg that it was aware of the 2015 theories. "We ... did not see any evidence to support them," spokesman Chris Arnold said in an email. He also pointed out that the company implemented food safety enhancements after those incidents.

"I can tell you unequivocally that none of the 2015 outbreaks were caused by some terrorist or criminal acts," said Bill Marler, a leading food safety attorney who has litigated foodborne illness claims for decades and who represented clients in each of that year's Chipotle outbreaks. "It's conspiratorial nuts."

Marler examined the fact patterns in each case and believes they were all caused by the usual suspects: sick employees, failure to pay attention to detail and ready-to-eat food showing up already tainted.

He allows that a conspiracy is theoretically possible. "But I don't believe in aliens, or that humans walked the earth with dinosaurs, either," he added.

Allen said he's just raising a question about a statistical anomaly, not accusing anyone of purposely poisoning burrito lovers. "It really is one of those situations, like who would put a finger in the chili?"


From: Sam 7/29/2017 1:42:46 PM
How to Avoid Sepsis, a Deadly Medical Emergency
More than 1 million Americans a year develop this condition, and 250,000 die from it, a new CDC report says

By Kevin McCarthy
August 23, 2016

Dana Mirman didn’t pay much attention when she noticed what she assumed was a mosquito bite on her shoulder.

“After all, I live in Florida, people get bites,” says the 41-year-old communications professional.

Even when it began throbbing overnight she didn't worry much, assuming it was a simple allergic reaction. The next day she came down with flu-like symptoms, including vomiting and fever. “I still didn’t connect the two,” she says.

But later that evening her temperature spiked to 104 and her husband rushed her to the emergency room. The staff checked her blood pressure, and saw it was plummeting. They immediately started her on I.V. fluids and powerful antibiotics, she says—a response that likely saved her life.

Mirman was a victim of sepsis, an often deadly reaction to certain infections that can quickly race through your body, shutting down vital organs. While not well known, it's actually more common than heart attacks, and just as deadly.

More than a million people in the U.S. develop sepsis each year, and more than 250,000 deaths a year are linked to it, according to the Centers for Disease Control and Prevention. That’s higher than earlier estimates—an increase the CDC attributes mainly to better reporting on the infection, as well as the growing number of older people, who are especially susceptible to sepsis.

While many cases are linked to infections contracted in hospitals, doctor offices, nursing homes, and other healthcare facilities, a report out today from the CDC says that nearly half are community-acquired, meaning from infections people contract in their homes, schools, playgrounds, and elsewhere. Any infection can lead to sepsis, whether from a manicure gone wrong, food poisoning, or a simple urinary tract infection.

One reason sepsis is so dangerous is that patients and even doctors often don’t recognize the problem until it’s too late, says Anthony Fiore, M.D., of the CDC’s Division of Healthcare Quality Promotion and an author of the new report. “Rapid early treatment can save lives," he says. “It’s just like a heart attack or stroke,” in that the longer you wait, the deadlier it is.

Here’s what you need to know about sepsis, including how to recognize it and what to do if you suspect it.

How Sepsis Harms

Sepsis often starts with a local infection, most often in the lungs, digestive tract, urinary tract, or, as in Mirman’s case, on the skin. It usually stems from a bacterial infection, but fungal or viral infections can trigger sepsis, too.

Whenever the body develops an infection, the immune system normally kicks in, producing chemicals to fight the infection. But sometimes—either because the triggering bacteria is unusually powerful or because the person’s immune system is already weakened by other health problems—those chemicals are set loose in the bloodstream and course through the body.

Instead of just fighting the local infection, those chemicals unleashed by the immune system cause widespread inflammation and damage tissues in the liver, kidneys, heart, and other organs. Within hours, blood clots can begin to form, and damage to blood vessels causes blood pressure to drop, which in turn slows the delivery of vital nutrients to those organs already under attack. In the final stages, the heart weakens and organs begin to fail.

Who’s at Risk?
The new CDC report helps identify who’s most at risk of sepsis. That includes people:

  • With chronic diseases. Seven out of 10 people in the CDC study who developed sepsis had diabetes, lung, kidney, or liver disease, or some other chronic condition. People with those conditions are at increased risk because they are susceptible to infection.
  • With conditions that weaken immune systems, such as AIDS or cancer.
  • Who are 65 or older or younger than a year old, probably because they too are more prone to infection.
  • Who have recently been in a hospital, nursing home, or other healthcare facility, in part because infection-causing bacteria are often common in those institutions.

How to Prevent Sepsis
While sepsis is often not recognized until it’s already progressed, Fiore says many cases could be prevented.

For one thing, the CDC report found that most sepsis patients have some underlying health problem that requires them to have frequent contact with doctors. Those visits, Fiore says, are an opportunity for healthcare providers to prevent infections that can turn septic (by, for example, ensuring that diabetics get thorough exams of their feet to check for wounds that could breed infections) and to educate people about warning signs for sepsis.

In addition, there are a number of steps people can take on their own to protect themselves from the underlying infections that can trigger sepsis. Here’s how:

  • Get vaccinated. Thirty-five percent of sepsis cases in the CDC study stemmed from pneumonia. Yet the CDC says that only 20 percent of high-risk adults under age 65 and 60 percent of those 65 and older have been vaccinated against that disease. And annual flu shots can also prevent respiratory infections that can sometimes turn septic. But only about a third of sepsis patients in the CDC study had a record of being vaccinated against the flu. “Thousands more deaths could be prevented with better coverage,” wrote the authors of the CDC report.
  • Treat urinary tract infections promptly. A quarter of sepsis cases resulted from urinary tract infections. So it’s important to see a healthcare provider if you have warning signs of those infections—usually a painful, burning feeling when you urinate and a strong urge to “go” often—and, when appropriate, be treated with antibiotics. UTIs are also common among hospital patients, especially those who have been catheterized, so it’s extra important to watch for those infections while in the hospital.
  • Clean skin wounds properly. About one in 10 sepsis cases follows a skin infection. So it’s essential to care for wounds and scrapes properly. That means washing with soap and water, cleaning out any dirt or debris, and covering wounds. And people with diabetes should make sure that they follow good foot care practices, since wounds there can often develop dangerous infections.
  • Avoid infections in hospitals. Since many infections that turn septic originate in hospitals and other healthcare facilities, it’s essential that you—and your healthcare providers—take steps to avoid those infections. That means, for example, insisting that everyone who comes into your hospital room—including doctors and nurses—wash their hands every time they touch you. Read more tips on how to avoid infections in hospitals.
Watch for Warning Signs
If you do develop an infection, it’s important to watch for signs of sepsis and, if you suspect it, act fast. Here’s what to do:
  • Know the symptoms. These include fever plus chills, shortness of breath, rapid breathing, increased heart rate, diarrhea, vomiting, rash, or pain, which many survivors describe as the worst they’ve ever felt. Disorientation or confusion can also be signs of sepsis.
  • Act fast. If you have any of those symptoms, “contact a healthcare provider and say, ‘I think I have an infection and it’s really gotten worse, and I think it’s possible I could have sepsis,’” Fiore says. If it’s after hours, either contact the provider’s on-call service or, if you feel very sick, go to the emergency room. “It’s not something you sit on until morning,” he says.
  • Demand attention. Healthcare providers sometimes miss sepsis, or fail to react quickly enough, says Lisa McGiffert, the director of Consumer Reports Safe Patient Project. “Patients and their advocates need to be aggressive in pushing for quick action, and there is no excuse for a hospital or emergency room to respond to sepsis slowly once it is diagnosed,” McGiffert says. “Hospitals can do a lot to prevent death from sepsis by diagnosing it early and responding with urgency.”
  • Get the right treatment. If your doctor suspects sepsis, you should get treated with IV fluids and antibiotics right away. Initially, you will probably need a broad-spectrum antibiotic, which targets multiple bacteria. But your doctors should also order tests to identify the responsible bacteria and, if possible, switch you to an antibiotic that targets that specific bacteria.

In addition, Mirman—who started volunteering for the Sepsis Alliance, an organization that raises awareness about the condition, after her experience—emphasizes the importance of having support when dealing with sepsis. Sepsis can be extremely disorienting, and cause sleepiness, she says. “I was so exhausted, all I wanted to do was sleep,” Mirman remembers. If it weren’t for her husband, she might never have made it to the emergency room. “I would not have picked up a phone and called 911.”

Video at the link


From: Sam 8/2/2017 7:56:54 AM
Some docs are like this. But I've seen some other young docs over the past year who seem really impressive. They seemed to be determined to be both caring and tough. Respectful of the patient's autonomy but not giving up their position as the expert purveyor of information. And aware of their limitations at the same time.

Medicine has become a service industry, and it's making doctors unable to confront death
  • Published on July 29, 2017
  • Featured in: Big Ideas & Innovation, Editor's Picks, Healthcare, The Weekend Essay
  • Seamus O'Mahony
    Consultant Gastroenterologist at Cork University Hospital

    Kieran Sweeney, a doctor and writer, was diagnosed with pulmonary mesothelioma in 2009. The condition, a cancer of the lining of the lungs usually caused by exposure to asbestos, is almost invariably fatal. He wrote about his diagnosis in the British Medical Journal:

    The next 48 hours are spent talking to our four beautiful kids, aged mid-teens to early 20s, whose joyous careers are currently sprinkled through school, part-time jobs and university. I can’t really convey in words the catastrophic hurt my news has inflicted on them, and it is an insult, which at their age they should never have to endure. I will die of this tumour, I say, and we must address that, neither accepting nor comprehending it. This tumour will kill my body, I say, but I will yield my spirit and personhood reluctantly. We embrace. They weep. I weep for them, for fear for myself, and for the unthinkable horror that they will continue to inhabit the world in which I will play no part.
    These are the words of a grown-up. There is no whiff of the ‘how to break bad news’ workshop. No sentimentality, no aversion of the eyes from the monumental and unavoidable prospect of impending death. No foolishness about ‘fighting’ the cancer, or travelling to the ends of the world to find a cure. This is what ‘breaking bad news’ is really like.

    He died on Christmas Eve of the same year, 2009. He had been an original and eloquent commentator on medicine and its wider role. He drew attention to the limitations of evidence-based medicine, coining the term ‘the information paradox’ to describe the danger of information-overload distracting the doctor from his core role of relieving suffering. Here is an extract from his obituary in the British Medical Journal:

    He later described this approach as metaphysical and in a prescient piece he wrote, ‘The clearest example of this transition to the metaphysical level occurs when someone starts to die or accepts that death is near. Here both the doctor and the patient are confronted by the question: “When is enough, enough? This”, he wrote, “will be the defining question for the next generation of practitioners.”’
    Enough, indeed. A word rarely used in American medicine, where the culture of medical excess is most unbridled. Atul Gawande, the American surgeon and writer, became famous for championing safer surgery by adopting practices from other spheres of human activity such as the aviation industry. He writes regularly for the New Yorker, which published a long piece by him in 2010, called ‘Letting Go: What should medicine do when it can’t save your life?’ This article formed the basis for his 2014 book, Being Mortal. He described how, in the US, dying patients are routinely subjected to futile and painful medical treatments, while their doctors fail to discuss with them the inevitable outcome: ‘Patients die only once. They have no experience to draw on. They need doctors and nurses who are willing to have the hard discussions and say what they have seen, who will help people prepare for what is to come – and to escape a warehoused oblivion that few really want.’

    But doctors are no longer brave enough. They increasingly see themselves as service-providers, a role that does not encourage Difficult Conversations, or a willingness to be brave. Consumerism, fear of litigation and overregulation have conspired to create the customer-friendly doctor, who emerged when the doctor–patient relationship became recast in a quasi-commercial mould. This type of doctor, well trained in communication skills, eminently biddable, is not what Kieran Sweeney or Atul Gawande had in mind. Doctors, by the nature of their selection and training, are conformist, and the now dominant ethos of customer-friendliness has all but silenced other, dissenting, voices. There is now an insatiable appetite for medicine: for scans, for drugs, for tests, for screening. This appetite benefits many professional groups, industries and institutions. It is difficult to call ‘enough’, but a good doctor sometimes has to tell patients things they do not want to hear. Regrettably, it is much easier, in the middle of a busy clinic, to order another scan than to have the Difficult Conversation.

    Doctors have a duty beyond that of pleasing the individual patient, a duty to society at large. The US has many so-called ‘concierge’ doctors, private physicians engaged by the wealthy, who are always on call to minister to the needs of their fastidious and demanding clients. The annual fee per patient is said to be as much as $30,000. The ultimate concierge doctor was Conrad Murray, the late Michael Jackson’s personal physician. Murray’s willingness to prescribe almost anything, including the general anaesthetic agent, propofol, for his wealthy and manipulative patient, eventually led to Jackson’s death.

    Patient autonomy now trumps all other rights and obligations. Autonomy, however, is a useful card to play when, as often happens, particularly with the diagnosis of cancer, I am ambushed by well-meaning relatives, urging me not to tell the patient, because ‘it would kill’ them. Relatives have no formal rights as such, but commonly dictate medical care to those doctors keen on a quiet life and willing to be leaned on. Inevitably there will be instances, such as in the case of patients with dementia or those of very advanced age, where giving a diagnosis of cancer is of no benefit to them. But in most cases I believe it is my duty to tell the truth.

    The difficulty, however, is this: Kieran Sweeney’s acceptance of, and confrontation of, his situation, is the exception, not the rule. He was both advantaged and disadvantaged when he was given the diagnosis of mesothelioma. As a doctor, he knew immediately what the future held in store for him, but this knowledge precluded all hope. Many of my patients lack the educational background or knowledge to fully absorb a diagnosis of something like mesothelioma. Apart at all from this ‘cognitive’ aspect, many simply do not want to know the grisly details about survival statistics and what the future might hold. It is not only relatives who wish to have the truth concealed. Many patients do not want to have the Difficult Conversation.

    The entire modern hospital system conspires against those doctors willing to have this dialogue: the relatives, the chaos and noise of the environment, the technojuggernaut of modern hospital care, the customer-friendly doctors who are happy and willing to dole out false, delusional hope, and sometimes the patients themselves, who may not want to hear what the doctor has to say. The temptation to opt for the quiet life, to avoid the Difficult Conversation, is overwhelming. And no one will ever complain. The relatives will be content, and the dying will soon be dead. Why give yourself the grief?

    Society at large purports to want leadership and professionalism from doctors, but I wonder if this is really true. Leadership and professionalism involve confronting unpleasant truths, and sometimes denying people what they want (or think they want). Many doctors routinely over-diagnose, over-investigate and over-treat; these doctors are invariably described, approvingly, by their patients as ‘thorough’. Inevitably, the ‘thorough’ doctors eventually leave their dying patients mystified and abandoned when there is no chemotherapy left, no scans to order.

    Seamus O'Mahony is Consultant Gastroenterologist at Cork University Hospital and author of The Way We Die Now: The View from Medicine’s Front Line, from which this article is excerpted.


From: Sam 8/3/2017 5:44:07 AM
    Himax Technologies, Inc. Reports Second Quarter 2017 Financial Results and Provides Third Quarter 2017 Guidance
    Thu August 3, 2017 5:00 AM|GlobeNewswire|About: HIMX


From: Sam 8/5/2017 9:57:33 PM
    November 2, 2015 Issue
    The Price of Union: The undefeatable South.
    By Nicholas Lemann

    Progress in civil rights has been matched by the Southernization of American politics.
    Photograph by Walker Evans / Courtesy Library of Congress

    When the Confederate States of America seceded, the response of the United States of America was firm: dissolving the Union was impermissible. By contrast, it took a few more years for the United States to resolve the question of whether it would permit slavery within its own borders, and it took more than a century for the U.S. to enforce civil rights and voting rights for all its citizens. This was mainly because of the South’s political power. In order to become the richest and most powerful country in the world, the United States had to include the South, and its inclusion has always come at a price. The Constitution (with its three-fifths compromise and others) awkwardly registered the contradiction between its democratic rhetoric and the foundational presence of slavery in the thirteen original states. The 1803 Louisiana Purchase—by which the U.S. acquired more slaveholding territory in the name of national expansion—set off the dynamic that led to the Civil War. The United States has declined every opportunity to let the South go its own way; in return, the South has effectively awarded itself a big say in the nation’s affairs.

    The South was the country’s aberrant region—wayward, backward, benighted—but it was at last going to join properly in the national project: that was the liberal rhetoric that accompanied the civil-rights movement. It was also the rhetoric that accompanied Reconstruction, which was premised on full citizenship for the former slaves. Within a decade, the South had raised the price of enforcement so high that the country threw in the towel and allowed the region to maintain a separate system of racial segregation and subjugation. For almost a century, the country wound up granting the conquered South very generous terms.

    The civil-rights revolution, too, can be thought of as a bargain, not simply a victory: the nation has become Southernized just as much as the South has become nationalized. Political conservatism, the traditional creed of the white South, went from being presumed dead in 1964 to being a powerful force in national politics. During the past half century, the country has had more Presidents from the former Confederacy than from the former Union. Racial prejudice and conflict have been understood as American, not Southern, problems.

    Even before the Civil War, the slave South and the free North weren’t so unconnected. A recent run of important historical studies has set itself against the view of the antebellum South as a place apart, self-destructively devoted to its peculiar institution. Instead, they show, the South was essential to the development of global capitalism, and the rest of the country (along with much of the world) was deeply implicated in Southern slavery. Slavery was what made the United States an economic power. It also served as a malign innovation lab for influential new techniques in finance, management, and technology. England abolished slavery in its colonies in 1833, but then became the biggest purchaser of the slave South’s main crop, cotton. The mills of Manchester and Liverpool were built to turn Southern cotton into clothing, which meant that slavery was essential to the industrial revolution. Sven Beckert, in “Empire of Cotton,” argues that the Civil War, by interrupting the flow of cotton from the South, fuelled global colonialism, because Europe needed to find other places to supply its cotton. Craig Steven Wilder, in “Ebony & Ivy,” attributes a good measure of the rise of the great American universities to slavery. Walter Johnson, in “River of Dark Dreams,” is so strongly inclined not to see slavery as simply a regional system that he tends to put “the South” in quotes.

    After slavery had ended and Reconstruction gave way to the Jim Crow system, the Democratic Party was for decades an unlikely marriage of the white South (the black South effectively couldn’t vote) and blue-collar workers in the North. This meant that American liberalism had a lot of the South in it. Ira Katznelson, in “Fear Itself,” adeptly identifies the deep Southern influence on the New Deal era, the country’s liberal heyday, including not just its failure to challenge segregation but also a strong pro-military disposition that helped shape the Cold War. The great black migration to the North and the West, which peaked in the nineteen-forties and fifties, partly nationalized at least one race’s version of Southern culture, and, by converting non-voters to voters through relocation, helped generate the political will that led to the civil-rights legislation of the nineteen-sixties. Once those laws had passed, the South became for the Republican Party what it had previously been for the Democratic Party, the essential core of a national coalition. The South is all over this year’s Republican Presidential race.

    I’m a fifth-generation Southerner, though long expatriated, and I know the wounded indignation with which the folks back home react to any suggestion that the South is no longer—or maybe never was—an entirely separate region. What about our hound dogs, our verandas, our charm, our football worship, our slow-moving “way of life”? Outsiders who have visited the South, going back to Alexis de Tocqueville and Frederick Law Olmsted or even further, have usually agreed with the natives about the South’s distinctiveness, though they have often seen it as something to condemn, not admire. How can the South be so American if it feels (and smells, and sounds, and looks) so Southern?

    One of the many categories of visitors to the South was concerned liberals during the New Deal, who were primarily interested not in race but in “conditions”—poverty, disease, ignorance. These included the documentary photographers dispatched by the federal government’s Farm Security Administration, who wound up creating most of the familiar images of the Depression, as well as anthropologists, sociologists, journalists, social reformers, artists, and filmmakers. James Agee and Walker Evans’s lugubrious book “Let Us Now Praise Famous Men” is one of the most enduring examples of this tradition. (The 1941 Preston Sturges film “Sullivan’s Travels” manages the nearly impossible feat of poking fun at such visitors while also making it clear that their mission had a powerful moral justification.) During the same period, white Southern novelists produced their own body of work that trafficked in Southern dispossession and dysfunction. William Faulkner was at the head of this class, which also included Erskine Caldwell (who was part of the social-documentary tradition, too, through his professional and personal partnership with Margaret Bourke-White) and, later, Carson McCullers and Flannery O’Connor.

    Paul Theroux, the veteran travel writer, seems to have prepared for “Deep South: Four Seasons on Back Roads” (Houghton Mifflin Harcourt), the first of his ten travel books set in the United States, by immersing himself in these works from the second quarter of the twentieth century. The genre in which he is working naturally organizes itself into vignettes rendered with a primary focus on literary artistry, rather than analysis, so he never has to state a full-dress argument, or even say exactly what he was looking for in those four long driving tours. The South remains more rural than the Northeast, but by now, as in the rest of the country, most people live in metropolitan areas. Still, Theroux tells us, “I stayed away from the big cities and the coastal communities. I kept to the Lowcountry, the Black Belt, the Delta, the backwoods, the flyspeck towns.” This principle may have been a way of simplifying his writing assignment: these are places where some people eat squirrels and raccoons, and are obviously unusual in a way that people in the Atlanta suburbs are not. That makes them easier to portray vividly. But Theroux is left trying to evoke the fastest-growing region of the country, where a hundred and twenty million people live, by taking us to a series of poor, deep-rural, depopulated places, like Hale County, Alabama; the Mississippi Delta; and the Ozarks, where the main noticeable changes in the past few decades are outsourcing and the advent of Gujarati Indians as motel owners.

    V. S. Naipaul, Theroux’s former mentor, wrote quite a similar book twenty-six years ago, called “A Turn in the South.” Naipaul, never one for sentimentality about oppressed people, wound up celebrating “the redneck” (you have to have pale skin to have a red neck) as the South’s heroic type. Theroux thinks of himself as a liberal, and he doesn’t go anywhere near defending the white South’s politics and attitudes. On the other hand, he also doesn’t want to play the part of the disapproving or sneering Northerner. National culture, these days, seems to connect with the part of the South that Theroux visited through rollicking reality-television carnivals like “Duck Dynasty” and “Here Comes Honey Boo Boo.” Theroux strikes an empathetic, mournful tone rather than a mocking one. The people he visits are older, settled. Many of them either work in or are clients of social-service and community-development agencies. More are white than are black. He often compares the rural South—“rotting, picturesquely hopeless, forgotten”—to the underdeveloped parts of sub-Saharan Africa, which he has been visiting intermittently since he was a Peace Corps volunteer in Malawi, in 1963, and he regularly complains that the South gets far less attention from big philanthropies and the like. (He’s especially annoyed that the Clinton Global Initiative evinces so little interest in the poorest regions of Bill Clinton’s home state.)

    In a final, confessional section, Theroux connects the book’s project to his own stage in life. At seventy-four, he finds himself contemplating the past more than the future, and wonders whether the onrushing world has left him behind. Where better to entertain such thoughts than in Allendale, South Carolina, a ghostly town bypassed by the interstate-highway system? But this turn of mind leads him inexorably to an implied theory of the South as, indeed, a region radically apart. Throughout the book, he registers the South’s religiosity and its preoccupation with guns as products of its degraded status, rather than of a culture that has always been more pious and more martial than the rest of the country’s. On one of several visits he makes to gun shows, during which he tries hard to understand rather than to condemn, he observes, “The whites felt like a despised minority—different, defeated, misunderstood, meddled with, pushed around, cheated.” His final judgment on the South, delivered at the end of the book, is this: “Catastrophically passive, as though fatally wounded by the Civil War, the South has been held back from prosperity and has little power to exert influence on the country at large, so it remains immured in its region, especially in its rural areas, walled off from the world.”

    Even if you believe the South is that separate from the rest of the country, you might still, if you look hard enough, detect tendrils of Southern influence that extend past the Mason-Dixon Line. Race provides the obvious example. The slave states developed an elaborate and distinctively American binary racial system, in which everybody across a wide range of European origins was put into one category, white, and everybody across a wide range of African origins (including those with more white forebears than black forebears) was put into another category, black. These tendentious categories have been nationalized for so long that they seem natural to nearly all Americans. They are Southern-originated, but not Southern. They powerfully determine where we live, how we speak, how we think of ourselves, whom we choose to marry. They are deeply embedded in law and politics, through the census, police records, electoral polling, and many other means.

    A frequent companion of the idea of a simple distinction between black and white is the idea of a simple distinction between racists and non-racists. There can’t be anybody left who believes that racists exist only in the South, but there are plenty of people, especially white people, who believe that racism is another simple binary and that they dwell on the better side of it. Paul Theroux marvels that Strom Thurmond, the old South Carolina arch-segregationist, fathered an out-of-wedlock black child. “Funny that a racist like Thurmond would have an affair with his black servant,” he remarks to someone he’s visiting. Come on! It’s visually evident how often this happened—“racism” as manifest in a sense of sexual entitlement, rather than of revulsion. Theroux himself displays an uncharacteristic electric jolt of resentment on the rare occasions when he contemplates urban black culture. In one passage, he refers to “the obscene, semiliterate yawp and grunt of rap,” and, in another, he describes a well-dressed black-bourgeois group he encounters at an event in Little Rock as being “like a shoal of leathery sharks” who are “suspicious, chilly, with a suggestion of hauteur in their greeting, as if they were still learning how to deal with whites.”

    Ari Berman’s “Give Us the Ballot” (Farrar, Straus & Giroux), a history of the 1965 Voting Rights Act, makes for an excellent extended example of the mechanisms by which race in the South becomes race in the nation. The Voting Rights Act followed the better-known Civil Rights Act by a year. It is properly understood as part of a wave of legislation that represents the political triumph of the civil-rights movement, but Berman, like most people, finds a precipitating event in the murder, in June, 1964, in Neshoba County, Mississippi, of three young civil-rights workers, James Chaney, Andrew Goodman, and Michael Schwerner.

    Chaney, Goodman, and Schwerner’s mission was voter registration—hence their connection to the Voting Rights Act. It’s sad but true that their murders would not have resonated so deeply if Goodman and Schwerner had not been whites from New York who had come South to participate in Freedom Summer. In fact, the grassroots organizing on behalf of voting rights was substantially black and Southern. Just before Freedom Summer, the congregation of Mt. Zion Methodist Church, in the all-black Neshoba County town of Longdale, had voted to make its church the local headquarters of the movement’s voter-registration efforts. A few days before the murders, the Ku Klux Klan burned the church down, because of the role it was playing. Chaney, Goodman, and Schwerner were on their way back from a trip to Longdale to investigate the fire when they were killed.

    “One Mississippi, Two Mississippi” (Oxford), by Carol V.R. George, a history of the Mt. Zion church, makes plain how essential the church was to the local civil-rights struggle. It was organized, with the help of Northern whites, during the period when the citizenship of former slaves was being rescinded, with the end of Reconstruction. For decades, its members were involved in every possible effort to reinstate the rights of blacks in Neshoba County, including the years of relentless activity that preceded Freedom Summer. And, after the church was rebuilt, it was deeply engaged in the long struggle to bring to justice one of Chaney, Goodman, and Schwerner’s killers, Edgar Ray Killen, whom an all-white Neshoba County jury refused to convict in 1967. That took until 2005.

    So the passage of the Voting Rights Act was actually a North-South partnership, not an imposition of the North’s will on the South. And it would be a big mistake to think of the act as a great, enduring civil-rights milestone, representing the country’s belated decision to comply fully and everywhere with the Fifteenth Amendment to the Constitution. As Berman demonstrates, the act has been, instead, the subject of half a century of ceaseless contention, leaving its meaning permanently undetermined. Most of the consequential fights about civil rights, beginning with the Reconstruction-era amendments to the Constitution, have been over the federal government’s role in enforcement. The Voting Rights Act gives Washington the power to review local voter-registration practices, and to change the boundaries of election districts in areas that have a history of discrimination or that appear to be drawing district lines so as to minimize the number of black elected officials. But the act, as written, invites conflict because its enforcement provisions come up for periodic congressional review.

    Every few years, there has been a serious attempt to discontinue these enforcement provisions. Berman makes a persuasive case that the ongoing battles over the reviews of the Voting Rights Act, beginning with the first one, in 1970, have had a major impact on who has held political power. Periods of aggressive enforcement have produced more black voters and more liberal (especially black) elected officials—including, Berman suggests, Barack Obama—and also the potential for conservative politicians to take advantage of white resentment of the Voting Rights Act.

    In August of 1980, Ronald Reagan chose to kick off his general-election Presidential campaign at the Neshoba County Fair, in Mississippi, not far from where Chaney, Goodman, and Schwerner were murdered, and to declare, “I believe in states’ rights.” Once Reagan was in office, there was a battle over the terms of one of the Voting Rights Act’s periodic extensions, in which a significant actor was John Roberts, then a young lawyer at the Justice Department and now the Chief Justice. Berman has found in the National Archives a set of memos that Roberts wrote in 1981 and 1982, demonstrating a passionate opposition to aggressive enforcement of the Voting Rights Act. Three decades later, in the case of Shelby County v. Holder (2013), Roberts led a Supreme Court majority that struck down the major enforcement provision of the act, arguing that the problem the act was passed to correct has long since been solved. This will help Republicans in subsequent elections, including the 2016 Presidential election.

    At passage, the Voting Rights Act appeared to be only about the South, but over the years it has regularly been applied elsewhere. Politics is racial, to some extent, in most places; it was impossible to keep such a major law from having national repercussions. Among the states that have now passed election laws in direct response to the Shelby decision are Arizona, Wisconsin, and Ohio. The same dynamic—in which a “regional” issue goes national—repeats itself in just about every realm: not just in politics but also in culture, business, social mores.

    “It will become all one thing or all the other,” Abraham Lincoln declared of the beleaguered, slavery-stressed Union, in his “House Divided” speech. In fact, the South and the rest of the nation have one of those hot-blooded relationships—the major one, in American history—which never settle into either trustful intimacy or polite distance. The South is too big and powerful to be vestigial; too married to the rest of the country to stand truly apart; too distinctive in its history to be fully united with the other states. Colin Powell, back in the days when, as Secretary of State, he was voicing skepticism about the Iraq War, used to say, “If you break it, you own it.” That seemed true for a while in Iraq, but, being halfway around the world, Iraq wasn’t so hard to leave. The Union’s defeat of the Confederacy makes for a better example.

    Nicholas Lemann joined The New Yorker as a staff writer in 1999, and has written the Letter from Washington and the Wayward Press columns for the magazine.



    From: Sam, 8/6/2017 6:35:35 AM
    Rise of the machines

    By Chico Harlan
    Washington Post
    August 5 at 6:00 PM

    Rob Goldiez, co-founder of Hirebotics, configures a robot at Tenere Inc. in Dresser, Wis. (Ackerman + Gruber/For The Washington Post)

    The workers of the first shift had just finished their morning cigarettes and settled into place when one last car pulled into the factory parking lot, driving past an American flag and a “now hiring” sign. Out came two men, who opened up the trunk, and then out came four cardboard boxes labeled “fragile.”

    “We’ve got the robots,” one of the men said.

    They watched as a forklift hoisted the boxes into the air and followed the forklift into a building where a row of old mechanical presses shook the concrete floor. The forklift honked and carried the boxes past workers in steel-toed boots and earplugs. It rounded a bend and arrived at the other corner of the building, at the end of an assembly line.

    The line was intended for 12 workers, but two were no-shows. One had just been jailed for drug possession and violating probation. Three other spots were empty because the company hadn’t found anybody to do the work. That left six people on the line jumping from spot to spot, snapping parts into place and building metal containers by hand, too busy to look up as the forklift now came to a stop beside them.

    In factory after American factory, the surrender of the industrial age to the age of automation continues at a record pace. The transformation is decades along, its primary reasons well-established: a search for cost-cutting and efficiency.

    But as one factory in Wisconsin is showing, the forces driving automation can evolve — for reasons having to do with the condition of the American workforce. The robots were coming in not to replace humans, and not just as a way to modernize, but also because reliable humans had become so hard to find. It was part of a labor shortage spreading across America, one that economists say stems from so many things at once. A low unemployment rate. The retirement of baby boomers. A younger generation that doesn’t want factory jobs. And, more and more, a workforce in declining health: because of alcohol, because of despair and depression, because of a spike in the use of opioids and other drugs.

    In earlier decades, companies would have responded to such a shortage by either giving up on expansion hopes or boosting wages until they filled their positions. But now, they had another option. Robots had become more affordable. No longer did machines require six-figure investments; they could be purchased for $30,000, or even leased at an hourly rate. As a result, a new generation of robots was winding up on the floors of small- and medium-size companies that had previously depended only on the workers who lived just beyond their doors. Companies now could pick between two versions of the American worker — humans and robots. And at Tenere Inc., where 132 jobs were unfilled on the week the robots arrived, the balance was beginning to shift.

    “Right here, okay?” the forklift driver yelled over the noise of the factory, and when a manager gave him a nod, he placed on the ground the boxes containing the two newest employees at Tenere, Robot 1 and Robot 2.

    Employees at Tenere in Dresser, Wis., take a smoke break. (Tim Gruber/For The Washington Post)

    Tenere is a company that manufactures custom-made metal and plastic parts, mostly for the tech industry. Five years earlier a private-equity firm acquired the company, expanded to Mexico, and ushered in what the company called “a new era of growth.” In Wisconsin, where it has 550 employees, all non-union, wages started at $10.50 per hour for first shift and $13 per hour for overnight. Counting health insurance and retirement benefits, even the lowest-paid worker was more expensive than the robots, which Tenere was leasing from a Nashville-based start-up, Hirebotics, for $15 per hour. Hirebotics co-founder Matt Bush said that, before coming to Tenere, he’d been all across America installing robots at factories with similar hiring problems. “Everybody is struggling to find people,” he said, and it was true even in a slice of western Wisconsin so attuned to the rhythms of shift work that one local bar held happy hour three times a day.
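The cost comparison above can be made concrete with a rough sketch. Only the $10.50 starting wage and the $15-per-hour robot lease are reported in the article; the benefits load below is a hypothetical figure chosen purely for illustration:

```python
# Rough sketch of the labor-vs-robot cost comparison described above.
# Only the wage and lease rate come from the article; the benefits
# load is a HYPOTHETICAL figure chosen for illustration.
starting_wage = 10.50    # reported first-shift starting wage, $/hour
benefits_load = 0.43     # ASSUMED: insurance + retirement, as a fraction of wage
robot_lease = 15.00      # reported Hirebotics lease rate, $/hour

loaded_cost = starting_wage * (1 + benefits_load)
print(f"loaded labor: ${loaded_cost:.2f}/hr vs robot lease: ${robot_lease:.2f}/hr")
```

At roughly a 43 percent load, even the lowest-paid worker edges past the robot's lease rate, which is consistent with the article's claim; the actual load at Tenere isn't reported.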

    Inside the factory, there have been no major issues with quality control, plant managers say, only with filling its job openings. In the front office, the general manager had nudged up wages for second- and third-shift workers, and was wondering if he’d have to do it again in the next few months. Over in human resources, an administrator was saying that finding people was like trying to “climb Everest” — even after the company had loosened policies on hiring people with criminal records. Even the new hires who were coaxed through the door often didn’t last long, with the warning signs beginning when they filed in for orientation in a second-floor office that overlooked the factory floor.

    “How’s everybody doing?” said Matt Bader, as four just-hired workers walked in on a day when Robot 1 was being installed. “All good?”

    “Maybe,” one person said.

    Bader, who worked for a staffing agency that helped Tenere fill some of its positions, scanned the room. There was somebody in torn jeans. Somebody who drove a school bus and needed summer work only. Somebody without a car who had hitched a ride.

    Bader told them that once they started at Tenere they had to follow a few important rules, including one saying they couldn’t drink alcohol or use illegal substances at work. “Apparently, we need to tell people that,” Bader said, not mentioning that just a few days before he had driven two employees to a medical center for drug tests after managers suspected they’d shown up high.

    One worker stifled a yawn. Another asked about getting personal calls during the shift. Another raised his hand.

    “Yes?” Bader asked.

    “Do you have any coffee?” the worker said.

    “I don’t,” Bader said.

    After an hour the workers were heading back to their cars, one saying that everything “sounds okay,” another saying the “pay sucks.” Bader guessed that two of the four “wouldn’t last a week,” because often, he said, he knew within minutes who would last. People who said they couldn’t work Saturdays. People who couldn’t work early mornings. This was the mystery for him: So many people showing up, saying they were worried about rent or bills or supporting children, and yet they couldn’t hold down a job that could help them.

    “I am so sick of hearing that,” Bader said. “And then they wonder why things are getting automated.”


    The new robots had been made in Denmark, shipped to North Carolina, sold to engineers in Nashville, and then driven to Wisconsin. The robots had no faces, no bodies, nothing to suggest anything but mechanical efficiency. If anything, they looked similar to human arms, with silver limbs and powder blue elbows and charcoal-colored wrists.

    Each had been shipped with a corresponding box of wires and controls. Each weighed 40.6 pounds. They had been specifically designed to replicate movements with such precision that any deviation was no greater than the thickness of a human hair — a skill particularly helpful for Robot 1, which had been brought in to perform one of the most repetitive jobs in the factory.

    As the engineers prepared it for operation, Robot 1 had been bolted in front of a 10-foot-tall mechanical press. It was rigged with safety sensors and programmed to make a three-foot path of motion, one that it would use to make part No. 07123571. More commonly, Tenere called this part the claw.

    The purpose of the claw was to holster a disk drive. Tenere had been making them for two years, at two separate mechanical presses, where workers fed 6-by-7-inch pieces of flat aluminum into the machine, pressed two buttons simultaneously, and then extracted the metal — now bent at the edges. Tenere’s workers were supposed to do this 1,760 times per shift.

    Robot 1, almost programmed now, started trying it out. It snatched the flat metal from its left side, then swiveled back toward the press. It moved noiselessly. It released the part into the mouth of the machine, and as soon as it withdrew, down came the press to shape the metal into a claw: Wallop. The robot’s arm then retrieved the part, swiveling back to its left, and dropping the claw on a conveyor belt.

    “How fast do you want it?” Hirebotics co-founder Rob Goldiez asked a plant manager supervising the installation.

    First the robot was cycling every 20 seconds, and then every 14.9 seconds, and then every 10 seconds. An engineer tinkered with the settings, and later the speed bumped up again. A claw was being produced every 9.5 seconds. Or 379 every hour; 3,032 every shift; 9,096 every day.
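Those totals follow directly from the cycle time. A quick back-of-the-envelope check (assuming the 8-hour shifts and three-shift production days that the article's numbers imply):

```python
# Back-of-the-envelope check of the throughput figures quoted above.
# Assumes an 8-hour shift and a three-shift, 24-hour production day,
# which is what the article's numbers imply.
cycle_seconds = 9.5

per_hour = round(3600 / cycle_seconds)       # claws per hour
per_shift = round(8 * 3600 / cycle_seconds)  # claws per 8-hour shift
per_day = 3 * per_shift                      # claws per three-shift day

print(per_hour, per_shift, per_day)  # 379 3032 9096
```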

    “This motion,” Goldiez said, “will be repeatable for years.”


    Bobby Campbell, 51, at his workstation. (Ackerman + Gruber/For The Washington Post)

    Some distance away, in front of another mechanical press, was a 51-year-old man named Bobby Campbell who had the same job as Robot 1. He’d wound up with the position because of an accident: In February, he’d had too much to drink, tumbled off a deck at his daughter’s house, and broken his neck. When he returned after three months, Tenere pulled him out of the laser department and put him on light duty. Now, as the testing continued on a robot that he said “just looks like something you see in the damned dentist’s office,” Campbell was starting his 25th consecutive workday feeding claws to the machine. He’d punched the same two buttons that activated the press 36,665 times.

    “Beat that robot today,” Campbell’s supervisor said.

    “Hah,” Campbell said, turning his back and settling in at his station, where there were 1,760 claws to make and eight hours until he drove home.

    He set his canvas lunch container on a side table and oiled his mechanical press. He cut open a box of parts and placed the first flat piece of metal under the press. A gauge on the side of the press kept count. Wallop. “1,” the counter said, and after Campbell had pressed the button 117 more times, there were seven hours to go.

    Unlike the employees on the assembly line, Campbell worked alone. His press was off in a corner. There was no foot traffic, nobody to talk to, nothing to look at. Campbell stopped his work and removed a container of pills. He took a low-dose aspirin for his neck, another pill for high blood pressure. He snacked on some peppers and homemade pickles, fed 393 more parts into the machine, and then it was time for lunch. Four hours to go.

    “Monday,” he said with a little shrug. “I’ll pick it up after I get some fuel.”

    Campbell had been at Tenere for three years. He earned $13.50 per hour. He had a bad back, a shaved and scarred head, a tear duct that perpetually leaked after orbital surgery, and aging biceps that he showed off with sleeveless Harley-Davidson shirts. He liked working at Tenere, he said. Good people. Good benefits. Some days he hit his targets, other days he didn’t, but his supervisors never got on him, and the company had always been patient with him, even as he dealt with some personal problems.

    He lived 31 miles and 40 minutes away, provided he didn’t stop. The problem was, sometimes he did. Along the drive home there were a dozen gas stations and minimarts selling beer, and Campbell said he couldn’t figure out why some days he would turn in. He’d tried everything he could think of to stop himself. Calling his daughters, calling his wife. Turning up the music and listening to Rod Stewart. He’d been to Alcoholics Anonymous meetings, he said. He’d spent 28 days at a treatment center. He’d looked for jobs that would cut down on the commute. He’d faced a family intervention where the whole family read him letters, as he sat there feeling like what he called a “kindergartner.”

    Sometimes, Campbell said, he almost thought he was through the worst — sober for weeks at a time — but then came Saturday, when he was supposed to work an eight-hour shift and instead clocked out after three hours, stopping on the way home and downing a 12-pack of beer before sundown. Then came Sunday, another 12 beers out on the lake. Now it was Monday, and Campbell said he was sure he’d be okay if he could just get home. There, his wife only allowed him to have nonalcoholic beers. But that was 31 miles away. “Just the uncertainty,” Campbell said, and he tried not to think about it, with the lunch break over, and 3 hours and 40 minutes to go.

    He stepped onto the floor pad in front of the press and got back to work. A box of flat metal pieces was to his left, a hopper of finished claws sat on his right, and Campbell’s hands moved in a rhythm, grabbing and inserting. “As long as I’ve got parts in front of me, I’m all right,” he said. Twenty minutes without looking up. Then 40. Then nearly 60. The gauge said 912.

    “All right,” Campbell said, when there was an hour left to go, still pressing the buttons.

    He hummed a song. He whistled. He fed 11 pieces of metal to the machine in a minute, and then 13, then nine. His eyes darted from left to right. He nodded his head.

    The press’s clutch was hissing and exhaling, hissing and exhaling, and Campbell added a last pump of oil to the machine with 15 minutes to go. Out came a few more parts, and he fed them into the hopper, checked the gauge, and shrugged. “Not so bad,” he said.

    Time to go home. He had punched the buttons another 1,376 times, 384 shy of his target, and now he got in the car.


    Hirebotics co-founders Matt Bush, left, and Rob Goldiez, right, work alongside a Tenere employee to set up the workstation for Robot 2. (Ackerman + Gruber/For The Washington Post)

    Robot 2 had a different job than Robot 1. It was to be part of a team — the assembly line. The team worked along a 70-foot row of tables lined with workstations that were always at least a few workers shy, where employees snapped and riveted metal pieces, building silver, rectangular containers. Each container, by the time it reached the last assembly workstation, was outfitted with either 13 or 15 miniature drawer slots. It was the job of the third-to-last worker on the line to fill each with a claw. That would become the sole task of Robot 2, one that it started to test out after days of programming and setup.

    The claws arrived at Robot 2’s station on a conveyor belt. From there, the robot made a three-foot motion of its own. Grabbing the claw with its gripper. Swiveling 90 degrees. Reaching its arm toward the container. And then, inserting the claw into one of the drawer slots with an intricate push: forward 80 millimeters, down five millimeters, forward another 20 millimeters, up eight millimeters, forward another 12.

    “A delicate move,” Bush said.

    One that Robot 2 would be able to make every seven seconds once it joined the line.
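The insertion push described above can be written out as a short sequence of relative offsets. The representation and axis naming here are illustrative only; the article gives just the five moves and their distances:

```python
# The claw-insertion push, as (forward, vertical) offsets in millimeters.
# Representation is illustrative; only the distances come from the article.
insertion_path = [
    (80, 0),   # forward 80 mm
    (0, -5),   # down 5 mm
    (20, 0),   # forward another 20 mm
    (0, 8),    # up 8 mm
    (12, 0),   # forward another 12 mm
]

total_forward = sum(forward for forward, _ in insertion_path)
print(total_forward)  # 112 mm of forward travel in all
```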


    Annie Larson assembles parts on her line at Tenere, where she’s worked for six years. (Ackerman + Gruber/For The Washington Post)

    Days earlier, Annie Larson, the woman who would work alongside Robot 2, had been at home, the end of another shift, laid out in a recliner sipping a Mountain Dew mixed with what she described as the cheapest vodka she could find. There’d been six years at Tenere of days like these. Trying to unwind. Alone in her one-bedroom apartment. Bedtime at 9. Alarm at 5:40 a.m. Out the door at 6:20. Into her old Chevy. Six miles up the street. Then into the Tenere parking lot, clocking in just before 7, the next day of trying to keep pace. Except this time, as a forklift came to a stop nearby, she saw four boxes being dropped off at the end of the line.

    “What in the hell?” she thought.

    Her line supervisor, Tom Johannsen, had told workers a few weeks earlier the robots were coming. But he hadn’t said when they would arrive, or what exactly they would do. He hadn’t described how they would look. He’d just said nobody was losing their jobs, and not to worry, and that Tenere was “supplementing some of the people we can’t find.”

    Now, though, the boxes were being opened up, wires everywhere, and Larson started to worry. The machines looked too complicated. Maybe they’d break down. Maybe they couldn’t keep pace. Maybe they’d be just one more problem at the factory, and already, their boxes were getting in the way. Only six people were on the line, which meant Larson was leapfrogging from one workstation to the next, trying to do the work of two or three. She could feel everybody falling behind. She nearly tripped over a floor mat that had buckled to make space for the robot. Larson turned to one of the robot engineers and said, “We have no room over here. It sucks.” When the end-of-the-shift buzzer sounded, the line had made 32 fewer containers than it was supposed to, and that night, Larson said, she had more drinks than her usual one or two.

    But then came the next day: Back again, on time. Always on time. Larson was one of the steadiest parts of an assembly team in which so many other workers had lasted only weeks or months. “My line,” Larson called it. Her supervisor called her “old school.” A manager called her “no nonsense.” Others moaned about the job during lunchtime breaks. Larson, wanting no part of that, pulled up a stool to the assembly line every day and ate by herself.

    “If the job is that bad, go!” she said.

    She was 48, and she had no plans to leave. Rural Wisconsin was tough, but so what; she couldn’t start over. Her roots were here. Her mother lived four blocks away. Her father lived six blocks away. Her son, daughter and grandchild were all within 15 miles. Larson couldn’t afford vacations or new clothing, but she paid every bill on time: $545 for rent, $33 for electric — every amount and due date programmed into her phone.

    But it was the numbers at work that had been leaving her feeling more drained than usual lately. The team felt as if it was forever in catch-up mode. She and her co-workers were supposed to complete 2,250 containers per week. But with so many jobs unfilled, they missed the mark by 170 the week before the robots arrived. They were off 130 the week prior. The line got a pep talk from the supervisor, Johannsen, who said he could notice Larson in particular “getting frustrated.”

    “Are there any claws in that box?” Larson said now, motioning across a table.

    Another worker checked. “No.”

    “Ugh,” Larson said, and she grabbed the empty box and darted down the line, ponytail bobbing. She returned 15 seconds later with an overflowing pile of claws. “Here,” she said, dropping the box on an assembly line table. She reached over to the pile of containers and began filling them. Fifteen claws. Then 30. Her shirt was darkened with sweat. Forty-five claws. Sixty.

    “You’re power-hauling,” another worker said.

    Her co-workers were always changing. For now, they were a Linda, another Linda, a Kevin, a Sarah, a Miah, a Valerie and a Matt. Valerie was a good worker, Larson said, and so was one of the Lindas. But a few of the others struggled to keep pace. Larson told them sometimes how they could be more efficient in their jobs. How they could line up rivets in parallel rows, for instance. But who was paying attention?

    “There’s no caring,” Larson said. “No pride.”

    Friday now, and Larson was tired. There was one more shift before the weekend, but this time, when she showed up for work, she saw something different at the end of the line. The robots no longer looked like a mess. Their wires had been tucked into control boxes. Their stations had been swept clean. They were surrounded by new conveyor belts.

    “They’re pretty,” she said, and several hours later, mid-shift, she noticed an employee who’d missed the last few weeks with knee surgery wander over, stopping at the robot.

    “Ohh,” the employee said, “they’re taking somebody’s job.”

    “No, they’re not,” she said.

    She was surprised by her response. That she had come to the robot’s defense. But then she looked at what the robot was going to do: Put a claw in the slot. Put another claw in the slot. Put another claw in the slot.

    “It’s not a good job for a person to have anyway,” she said.

    The end-of-the-shift buzzer sounded, and Larson got back into her car. Then back into the recliner, where she poured her drink and tried to think about how the assembly line was going to change. Maybe the robots would actually help. Maybe the numbers would get better. Maybe her next problem would be too many humans and not enough robots.

    “Me and Val and 12 robots,” Larson said. “I would be happy with that.”


    “Now hiring” can be seen on a sign and an inflatable air dancer outside Tenere in Dresser, Wis. (Tim Gruber/For The Washington Post)

    Eight days after the robots arrived in boxes, their first official day of work began. In between the end of the overnight shift and the start of the first shift, the engineers did a last run-through and then picked up the touch screens that controlled the robots. Robot 1 began grabbing the metal rectangles, feeding them into the mechanical press, then extracting them as claws. Robot 2 began swiveling and grabbing the claws, placing them into a few containers that had been assembled overnight. The robots were six feet apart from one another, at stations producing the only noise in an otherwise quiet factory. Every 9.5 seconds: the wallop of the press. And then, the snapping of a claw sliding into a slot.

    And then, the 7 a.m. buzzer to start the day.

    In came the workers, some of whom took a moment to stand near the robots and watch.

    “It’s pretty amazing,” one said.

    “Gosh, it doesn’t take breaks,” another said.

    “Can I smack it if I need to?” Larson asked, and then she said, “Okay, let’s go.”

    The people took their stations. In one corner, Robot 1 was pounding out claws, laying them on a conveyor belt. Along a half-empty row of workstations, six people were constructing containers. At the end of that row, Robot 2 was filling those containers with claws. And at the other side of the factory, Campbell was stamping out claws the old way, feeding the metal and pressing the buttons, 320 in the first hour, even as he pulled a tissue out of his pocket every few minutes to dab his leaky eye.

    “This machine is getting hot, I’m working it so hard,” Campbell said.

    At the assembly line, Larson and the others were moving fast because they needed to. Robot 2 was filling a container with claws every 1 1/2 minutes, and the humans could barely keep pace. They shoveled 10 containers down the line, and Robot 2 filled them with claws. For a minute, as more containers were being riveted together, the robot sat idle.

    “We have to keep leapfrogging,” Larson shouted. “The robot needs some work.”
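The cycle times in this passage imply a quick line-balance check. The sketch below uses only the two figures reported in the article (the 9.5-second press cycle and the 1.5-minute container fill); the derived rates are back-of-the-envelope, not reported numbers:

```python
# Line-balance sketch from the article's two reported cycle times.
press_cycle_s = 9.5        # one claw stamped every 9.5 seconds
container_fill_s = 90      # Robot 2 fills a container every 1.5 minutes

# Robot 1's output: ~379 claws per hour (Campbell, by hand, managed 320).
claws_per_hour = 3600 / press_cycle_s

# The human assemblers must supply ~40 empty containers per hour
# to keep Robot 2 from sitting idle.
containers_per_hour = 3600 / container_fill_s

# Consistency check: each container holds roughly 9 to 10 claws.
claws_per_container = container_fill_s / press_cycle_s
```

At roughly 40 containers an hour spread across six assemblers, each person had to finish a container about every nine minutes, which is why the human side of the line, not the robot, was the constraint.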

    Within an hour, the workers of the first shift had filled a shipping box with finished containers — the first batch made by both humans and robots. Then came a second box, and then a third, and then a buzzer sounded for a break. The work paused, and a manager, Ed Moryn, grabbed the Hirebotics engineers and asked them to follow.

    He took them through a passageway and into another building, stopping at two more workstations where he said the company needed help. A press brake job. An assembly job. “Can we do these?” Moryn asked. The engineers studied the work areas for 15 minutes, took some measurements, and two days later offered Tenere one version of a solution for a company trying to fill 132 openings. Tenere looked at the offer and signed the paperwork. In September, the engineers would be coming back, arriving this time with the boxes holding Robot 3 and Robot 4.


    From: Sam 8/16/2017 2:56:42 PM
    Apple Readies $1 Billion War Chest for Hollywood Programming
    DOW JONES & COMPANY, INC. 5:00 AM ET 8/16/2017

    AAPL: 160.83 -0.77 (-0.48%), quote as of 02:54:04 PM ET 08/16/2017


    By Tripp Mickle

    Apple Inc. (AAPL) has set a budget of roughly $1 billion to procure and produce original content over the next year, according to people familiar with the matter, as the iPhone maker shows how serious it is about making a splash in Hollywood.

    Combined with the company's marketing clout and global reach, the step immediately makes Apple (AAPL) a considerable competitor in a crowded market where both new and traditional media players are vying to acquire original shows. Apple's (AAPL) budget is about half what Time Warner Inc.'s HBO spent on content last year and on par with estimates of what Amazon.com Inc. spent in 2013, the year after it announced its move into original programming.

    Apple (AAPL) could acquire and produce as many as 10 television shows, according to the people familiar with the plan, helping fulfill Apple (AAPL) Senior Vice President Eddy Cue's vision of offering high-quality video, similar to shows such as HBO's "Game of Thrones," on the company's streaming-music service or possibly a new, video-focused service.

    Apple (AAPL) declined to comment.

    The budget will be in the hands of Hollywood veterans Jamie Erlicht and Zack Van Amburg, poached in June from Sony Corp. to oversee content acquisition and video strategy. They exited their Sony contracts a month early and started working this month from Apple's (AAPL) Los Angeles offices, where they are taking over programming responsibilities from the Apple Music team, according to the people familiar with the matter.

    Elbowing into the video business won't be easy. Amazon and Netflix Inc. have considerable head starts and far bigger programming budgets. Apple (AAPL) also has to avoid jeopardizing its 15% cut of subscription revenues from its app stores for video services like Netflix and HBO Go, money that is a growing contributor to its $24.35 billion in annual services revenue.

    Hollywood has become a major battleground as consumers increasingly sever cable subscriptions and transition to streaming services like Netflix or Hulu. The disruption has fueled competition between tech and traditional media companies eager to sell subscriptions or generate advertising revenue with new entertainment programming. It has also fueled a major increase in scripted programming, which rose to more than 500 shows during the recently concluded 2016-17 television season, nearly double the total in 2011.

    In addition to Amazon and Apple (AAPL), Facebook Inc. has begun acquiring original programming, such as a reality show about NBA player Lonzo Ball's family, and Alphabet Inc.'s Google has announced a $35-a-month streaming TV service.

    Netflix has aimed to outflank tech rivals by recruiting star TV producers like Shonda Rhimes, who developed ABC hits like "Grey's Anatomy" and "Scandal." Meanwhile, Walt Disney Co. announced this month it will pull its movies from Netflix and launch its own streaming service.

    David Hill, a former senior executive at 21st Century Fox Inc., said Apple's (AAPL) recent entertainment hires give the company a chance to catch up with established players like Netflix and Amazon. But he said the flood of new scripted shows is making it increasingly hard to attract viewers.

    "There's just not enough time," Mr. Hill said.

    Programming costs can range from more than $2 million an episode for a comedy to more than $5 million for a drama. An episode of some high-end shows such as "Game of Thrones" can cost more than $10 million to produce.

    The back-to-back success of the original shows "House of Cards" and "Orange Is the New Black" is credited with building Netflix's business. At the time they were released the company's annual budget for original and acquired programming was about $2 billion; this year Netflix is expected to spend more than $6 billion.

    For its initiative to gain relevance, Apple (AAPL) needs at least one hit, according to the people familiar with the plan. The firm's initial video efforts via Apple Music -- "Planet of the Apps," launched in June, and "Carpool Karaoke," out last week -- were criticized by reviewers.

    With $215.64 billion in revenue last fiscal year and more than $261 billion in cash on its balance sheet, Apple (AAPL) could quickly ramp up spending on content.

    Messrs. Van Amburg and Erlicht have begun meeting with Hollywood agents and holding discussions about shows Apple (AAPL) could acquire, the people familiar said. The men also hired former WGN America President Matt Cherniss to oversee development, the people said.

    Mr. Cherniss will assist with finding programming. He previously worked with Messrs. Erlicht and Van Amburg to bring the Sony shows "Underground" and "Outsiders" to WGN. Mr. Cherniss also has movie experience, having worked as a production executive at Warner Bros.

    Apple (AAPL) is eager to shore up its existing video business -- renting movies and TV shows through iTunes -- which has been challenged by the rise of video-subscription services that offer programming for a monthly fee. Last year, iTunes generated an estimated $4.1 billion in revenue, but its share of the movie rental-and-sales market has declined to less than 35% from about half in 2012.

    Apple (AAPL) is hoping original video bolsters the appeal of movie rentals and other iTunes offerings -- a critical piece of its services business, which also includes App Store sales, Apple Pay and Apple Music. It aims to double that business to about $50 billion by 2020.

    --Joe Flint contributed to this article.

    Write to Tripp Mickle at


    From: Sam 8/18/2017 1:08:25 PM
    Forget Amazon. Here's the real reason retail stocks are slumping

    Special to The Globe and Mail
    August 14, 2017

    Retail stocks have been annihilated recently, despite the economy eking out growth. The fundamentals of the retail business look horrible: Sales are stagnating and profitability is getting worse with every passing quarter.

    Jeff Bezos and Amazon get most of the blame, but that blame is largely misplaced. Today, online sales represent only 8.5 per cent of total retail sales. Amazon, at $80-billion (U.S.) in sales, accounts for only 1.5 per cent of total U.S. retail sales, which at the end of 2016 were around $5.5-trillion. Although it is human nature to look for the simplest explanation, in truth, the confluence of a half-dozen unrelated developments is responsible for weak retail sales.

    Consumption needs and preferences have changed significantly. Ten years ago, people spent a pittance on cellphones. Today, Apple sells about $100-billion worth of i-goods in the United States, and about two-thirds of those sales are iPhones. Apple's U.S. market share is about 44 per cent, thus the total market for smart mobile phones in the United States is $150-billion a year. Add spending on smartphone accessories (cases, cables, glass protectors, etc.) and we are probably looking at $200-billion total spending a year on smartphones and accessories.

    Ten years ago (before the introduction of the iPhone), smartphone sales were close to zero. Nokia was the king of dumb phones, with U.S. sales of $4-billion in 2006. The total dumb-cellphone-handset market in the United States in 2006 was probably closer to $10-billion.

    Consumer income has not changed much since 2006, thus over the past 10 years, $190-billion in consumer spending was diverted toward mobile phones.

    It gets more interesting. In 2006, a cellphone was a luxury only affordable by adults, but today 7-year-olds have iPhones. Phone bills per household more than doubled over the past decade. Not to bore you with too many data points, but Verizon Wireless' revenue in 2006 was $38-billion. Fast-forward 10 years and it is $89-billion – a $51-billion increase. Verizon's market share is about 30 per cent, thus the total spending increase on wireless services is close to $150-billion.

    Between phones and their services, this is $340-billion that will not be spent on T-shirts and shoes.

    But we are not done. In the United States, the combination of health-care inflation in the mid-single digits and the proliferation of high-deductible plans has increased consumers' direct health-care costs and further chipped away at discretionary dollars. U.S. health-care spending is $3.3-trillion, and just 3 per cent of that figure is almost $100-billion.
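The arithmetic running through the last several paragraphs can be checked in a few lines. This is a sketch using the article's own rough figures (all in billions of U.S. dollars); note that the Verizon-based estimate actually works out closer to $170-billion than the quoted "close to $150-billion":

```python
# Back-of-the-envelope check of the article's figures ($billions, estimates).

# Smartphone hardware: ~$100B of U.S. Apple sales, about two-thirds iPhones,
# at ~44% market share implies a ~$150B smartphone market.
iphone_sales = 100 * 2 / 3                  # ~ $67B
smartphone_market = iphone_sales / 0.44     # ~ $152B, rounded to $150B
handsets_and_accessories = 150 + 50         # ~ $200B including accessories

# Versus a ~$10B dumb-phone market in 2006: ~$190B diverted to phones.
diverted_to_phones = handsets_and_accessories - 10

# Wireless service: Verizon grew from $38B to $89B; at ~30% market share
# the industry-wide increase is ~$170B (the article rounds to ~$150B).
service_increase = (89 - 38) / 0.30

# The article's headline number: phone plus service spending diverted.
total_diverted = diverted_to_phones + 150   # = $340B

# Health care: 3% of $3.3T is indeed almost $100B.
healthcare_bite = 0.03 * 3300               # ~ $99B
```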

    Then there are soft, hard-to-quantify factors. Millennials and millennial-want-to-be generations (speaking for myself here) do not really care about clothes as much as people did 10 years ago. After all, high-tech billionaires wear hoodies and flip-flops to work. Lack of fashion sense did not hinder their success, so why should the rest of us care about the dress code?

    In the 1990s, casual Fridays were a big deal – yippee, we could wear jeans to work! Fast-forward 20 years, and every day is casual. Suits? They are worn to job interviews or to impress old-fashioned clients. Consumer habits have slowly changed, and we now put less value on clothes (and thus spend less money on them) and more value on having the latest iThing.

    All this brings us to a hard and sad reality: The United States is over-retailed. It simply has too many stores. Americans have four or five times more retail square footage per capita than other developed countries. This bloated square footage was created for a different consumer, the one who, in the 1990s and 2000s, was borrowing money against the house and spending it at the local shopping mall.

    Today's post-Great Recession consumer is deleveraging, paying off debt, spending money on new necessities such as mobile phones and paying more for the old ones such as health care.

    Yes, Amazon and online sales do matter. Ten years ago, only 2.5 per cent of retail sales took place online, and today that number is 8.5 per cent – about a $300-billion change. Some of those sales were captured by bricks-and-mortar retailers' own online operations, some by e-commerce giants like Amazon and some by brands selling directly to consumers.

    But as you can see, online sales are just one piece of a very complex retail puzzle. All the aforementioned factors combined explain why, when gasoline prices declined by almost 50 per cent (gifting consumers hundreds of dollars of discretionary spending a month), retailers' profitability and consumer spending did not improve: those savings were more than absorbed by other expenses.

    Understanding that online sales (when we say this we really mean Amazon) are not the only culprit responsible for horrible retail numbers is crucial in the analysis of retail stocks. If you are only looking at "who can fight back the best against Amazon?" you are solving only one variable in a multivariable problem: Consumers' habits have changed; the United States is over-retailed; and consumer spending is being diverted to different parts of the economy.

    As value investors, we are naturally attracted to hated sectors. However, we demand a much greater margin of safety from retail stocks, because estimating their future cash flows (and thus fair value) is becoming increasingly difficult. Warren Buffett has said that you want to own a business that can be run by an idiot, because one day it will be. A successful retail business in today's world cannot be run by an idiot. It requires Bezos-like qualities: being totally consumer-focused, taking risks, thinking long term.

    Vitaliy Katsenelson, CFA, is chief investment officer at Investment Management Associates in Denver, Colo. He is the author of Active Value Investing and The Little Book of Sideways Markets. His strategy for investing in an overvalued stock market is spelled out in this article.

    The author does not own or hold short positions in Amazon. His firm owns a position in Apple.


    From: Sam 8/22/2017 11:28:01 AM
    A discussion of sugar on 1A.

    Tuesday, Aug 22 2017 • 11 a.m. (ET)
    The Sugar Story: A Spoonful Of Addiction Makes The Profits Go Up?

    Our decisions about what to eat are driven by much more than hunger. Social trends, agricultural science and multimillion-dollar industries can make certain vegetables hip or carbs passé, while concerns for overall health sit on the sidelines.

    One of the major food trends of the last half-century was the movement away from fat. But research published last year found that the fight against fat was fueled in part by sugar interests. As the New York Times reports:

    The documents show that a trade group called the Sugar Research Foundation, known today as the Sugar Association, paid three Harvard scientists the equivalent of about $50,000 in today’s dollars to publish a 1967 review of research on sugar, fat and heart disease. The studies used in the review were handpicked by the sugar group, and the article, which was published in the prestigious New England Journal of Medicine, minimized the link between sugar and heart health and cast aspersions on the role of saturated fat.

    Now, with the research in doubt, with diabetes and obesity rates high and with questions rising about whether sugar is addictive, more and more people are turning away from a decades-long sugar habit.


    Gary Taubes Author of "The Case Against Sugar;" Science writer; @garytaubes

    Michael Moss Author of "Salt Sugar Fat: How the Food Giants Hooked Us;" former investigative reporter for The New York Times; @MichaelMossC

    Courtney Gaine, PhD, RD, President and CEO of the Sugar Association in Washington, DC

    Related Links

    There’s a Sugar Shock Ahead, in a World That’s Eating Smarter – Bloomberg
    50 Years Ago, Sugar Industry Quietly Paid Scientists To Point Blame At Fat – NPR
    How the Sugar Industry Shifted Blame to Fat – New York Times
    A Big Tobacco Moment for the Sugar Industry – The New Yorker


    From: Sam 8/22/2017 12:25:33 PM
    Why do people believe myths about the Confederacy? Because our textbooks and monuments are wrong.
    False history marginalizes African Americans and makes us all dumber.
    By James W. Loewen July 1, 2015
    James W. Loewen, Emeritus Professor of Sociology at the University of Vermont, is the author of "Lies My Teacher Told Me" and "The Confederate and Neo-Confederate Reader."

    [There are additional links and videos in the original. I am just posting the whole article here for people who don't have access to WaPo.]

    History is the polemics of the victor, William F. Buckley once said. Not so in the United States, at least not regarding the Civil War. As soon as the Confederates laid down their arms, some picked up their pens and began to distort what they had done and why. The resulting mythology took hold of the nation a generation later and persists — which is why a presidential candidate can suggest, as Michele Bachmann did in 2011, that slavery was somehow pro-family and why the public, per the Pew Research Center, believes that the war was fought mainly over states’ rights.

    The Confederates won with the pen (and the noose) what they could not win on the battlefield: the cause of white supremacy and the dominant understanding of what the war was all about. We are still digging ourselves out from under the misinformation they spread, which has manifested in our public monuments and our history books.

    Take Kentucky, where the legislature voted not to secede. Early in the war, Confederate Gen. Albert Sidney Johnston ventured through the western part of the state and found “no enthusiasm, as we imagined and hoped, but hostility.” Eventually, 90,000 Kentuckians would fight for the United States, while 35,000 fought for the Confederate States. Nevertheless, according to historian Thomas Clark, the state now has 72 Confederate monuments and only two Union ones.

    Neo-Confederates also won parts of Maryland. In 1913, the United Daughters of the Confederacy (UDC) put a soldier on a pedestal at the Rockville courthouse. Maryland, which did not secede, sent 24,000 men to the Confederate armed forces, but it also sent 63,000 to the U.S. Army and Navy. Still, the UDC’s monument tells visitors to take the other side: “To our heroes of Montgomery Co. Maryland: That we through life may not forget to love the thin gray line.”

    In fact, the thin gray line came through Montgomery and adjoining Frederick counties at least three times, en route to Antietam, Gettysburg and Washington. Robert E. Lee’s army expected to find recruits and help with food, clothing and information. It didn’t. Instead, Maryland residents greeted Union soldiers as liberators when they came through on the way to Antietam. Recognizing the residents of Frederick as hostile, Confederate Gen. Jubal Early demanded a $200,000 ransom from them, lest he burn their town, a sum equal to about $3 million today. But Frederick now boasts a Confederate memorial, and the manager of the town’s cemetery — filled with Union and Confederate dead — told me, “Very little is done on the Union side” around Memorial Day. “It’s mostly Confederate.”

    Neo-Confederates didn’t just win the battle of public monuments. They managed to rename the war, calling it the War Between the States, a locution born after the conflict that became one of the most common names for the war by the middle of the 20th century, after which it began to fade. Even “Jeopardy!” has used this language.

    [ Why people convince themselves that the Confederate flag represents freedom, not slavery]

    Perhaps most perniciously, neo-Confederates now claim that the South seceded over states’ rights. Yet when each state left the Union, its leaders made clear that they were seceding because they were for slavery and against states’ rights. In its “Declaration of the Causes Which Impel the State of Texas to Secede From the Federal Union,” for example, the secession convention of Texas listed the states that had offended the delegates: “Maine, Vermont, New Hampshire, Connecticut, Rhode Island, Massachusetts, New York, Pennsylvania, Ohio, Wisconsin, Michigan and Iowa.” Governments there had exercised states’ rights by passing laws that interfered with the federal government’s attempts to enforce the Fugitive Slave Act. Some no longer let slave owners “transit” across their territory with slaves. “States’ rights” were what Texas was seceding against. Texas also made clear what it was seceding for — white supremacy:

    We hold as undeniable truths that the governments of the various States, and of the confederacy itself, were established exclusively by the white race, for themselves and their posterity; that the African race had no agency in their establishment; that they were rightfully held and regarded as an inferior and dependent race, and in that condition only could their existence in this country be rendered beneficial or tolerable.

    Despite such statements, neo-Confederates erected monuments that flatly lied about the Confederate cause. For example, South Carolina’s monument at Gettysburg, dedicated in 1963, claims to explain why the state seceded: “Abiding faith in the sacredness of states rights provided their creed here.” This tells us nothing about 1863, when abiding opposition to states’ rights provided the Palmetto State’s creed. In 1963, however, its leaders did support states’ rights; politicians tried desperately that decade to keep the federal government from enforcing school desegregation and civil rights.

    [ The racist assumptions behind how we talk about shootings]

    So thoroughly did this mythology take hold that our textbooks still stand history on its head and say secession was for, rather than against, states’ rights. Publishers mystify secession because they don’t want to offend Southern school districts and thereby lose sales. Consider this passage from “The American Journey,” probably the largest textbook ever foisted on middle school students and perhaps the best-selling U.S. history textbook:

    The South Secedes

    Lincoln and the Republicans had promised not to disturb slavery where it already existed. Nevertheless, many people in the South mistrusted the party, fearing that the Republican government would not protect Southern rights and liberties. On December 20, 1860, the South’s long-standing threat to leave the Union became a reality when South Carolina held a special convention and voted to secede.

    The section reads as if slavery was not the reason for secession. Instead, the rationale is completely vague: White Southerners feared for their “rights and liberties.” On the next page, the authors are more precise: White Southerners claimed that since “the national government” had been derelict “— by refusing to enforce the Fugitive Slave Act and by denying the Southern states equal rights in the territories — the states were justified in leaving the Union.”

    [ Only white people can save themselves from racism.]

    “Journey” offers no evidence to support this claim. It cannot. No Southern state made any such charge against the federal government in any secession document I have ever seen. Abraham Lincoln’s predecessors, James Buchanan and Franklin Pierce, were part of the pro-Southern wing of the Democratic Party. For 10 years, the federal government had vigorously enforced the Fugitive Slave Act. Buchanan supported pro-slavery forces in Kansas even after his own minion, territorial governor and former Mississippi slave owner Robert Walker, ruled that they had won an election only by fraud. The seven states that seceded before Lincoln took office had no quarrel with “the national government.”

    Teaching or implying that the Confederate states seceded for states’ rights is not accurate history. It is white, Confederate-apologist history. “Journey,” like other U.S. textbooks, needs to be de-Confederatized. So does the history test we give to immigrants who want to become U.S. citizens. Item No. 74 asks them to “name one problem that led to the Civil War.” It then gives three acceptable answers: slavery, economic reasons and states’ rights. (No other question on this 100-item test has more than one right answer.) If by “economic reasons” it means issues with tariffs and taxes, which most people infer, then two of its three “correct answers” are wrong.

    The legacy of this thinking pervades Washington, too. The dean of the Washington National Cathedral has noted that some of its stained-glass windows memorialize Stonewall Jackson and Robert E. Lee. There’s a statue of Albert Pike, Confederate general and reputed leader of the Arkansas Ku Klux Klan, in Judiciary Square.

    The Army runs Fort A.P. Hill, named for a Confederate general whose men killed African American soldiers after they surrendered; Fort Bragg, named for a general who was not only Confederate but also incompetent; and Fort Benning, named for a general who, after he helped get his home state of Georgia to secede, made the following argument to the Virginia legislature:

    What was the reason that induced Georgia to take the step of secession? This reason may be summed up in one single proposition. It was a conviction . . . that a separation from the North was the only thing that could prevent the abolition of her slavery. . . . If things are allowed to go on as they are, it is certain that slavery is to be abolished. . . . By the time the North shall have attained the power, the black race will be in a large majority, and then we will have black governors, black legislatures, black juries, black everything. . . . The consequence will be that our men will be all exterminated or expelled to wander as vagabonds over a hostile Earth, and as for our women, their fate will be too horrible to contemplate even in fancy.

    With our monuments lying about secession, our textbooks obfuscating what the Confederacy was about and our Army honoring Southern generals, no wonder so many Americans supported the Confederacy until recently. We can see the impact of Confederate symbols and thinking on Dylann Roof, accused of killing nine in a Charleston, S.C., church, but other examples abound. In his mugshot, Timothy McVeigh, who bombed the Alfred P. Murrah Federal Building in Oklahoma City in 1995, wore a neo-Confederate T-shirt showing Abraham Lincoln and the words “Sic semper tyrannis.” When white students in Appleton, Wis. — a recovering “sundown town” that for decades had been all white on purpose — had issues with Mexican American students in 1999, they responded by wearing and waving Confederate flags, which they already had at home, at the ready.

    Across the country, removing slavery from its central role in prompting the Civil War marginalizes African Americans and makes us all stupid. De-Confederatizing the United States won’t end white supremacy, but it will be a momentous step in that direction.

    CORRECTION: This story has been updated. A previous version gave incorrect figures for the ransom paid by Frederick, Md., residents to Confederate troops; they paid $200,000, not $300,000, which would be worth about $3 million now, not $5 million. Also, it misstated the year South Carolina dedicated its monument at Gettysburg; it was 1963, not 1965. And an earlier version of this story incorrectly linked to a different textbook named “American Journey.”


    Copyright © 1995-2018 Knight Sac Media. All rights reserved.Stock quotes are delayed at least 15 minutes - See Terms of Use.