
Politics: Sam's miscellany


From: Sam 7/12/2017 10:50:27 PM
Why Do Geese Fly in V Formations?




It’s an iconic image of fall—a V of geese high overhead, migrating south for the winter. Several other bird species, such as flamingos, also fly in V formations, but geese are the most well known. On any given fall day, numerous Vs can be seen passing gracefully below the clouds. Most birds actually don’t migrate in the V formation; smaller birds tend to fly in huge amorphous flocks. Others fly in a simple line. But why do geese and other birds fly in V formations?

Writing in the journal “The Auk,” John Badgerow examined numerous formations of geese, hoping to come up with a definitive answer. Aerodynamic efficiency was not initially seen as the primary reason for the behavior; previous analyses had suggested that the birds were too far apart to benefit from the energy saved. Instead, the angle of the V suggested that the primary purpose was a form of visual communication—the geese composing the two diagonal lines, each slightly angled from the other, would have an unobstructed view of the lead goose, which determined the flight path of the flock.

Badgerow, unconvinced, decided to test both hypotheses. Working from film, he analyzed the geometry of the formations. Badgerow calculated that maximum energetic advantage (compared to solo flight) is achieved at exactly 0.16 meters between the wingtip of one bird and that of the bird following. The spacing requirements explain why only certain birds fly in V-formations—only birds with large wingspans and slow wingbeats can achieve the energy saving. Rapid or erratic flapping creates too much wake turbulence, which disrupts the formation. Migrating geese effectively function like airplanes.

At the same time, information about changes in course and velocity must be communicated to other members of the flock, so visual communication does play a role. The angle seems to be chosen so all members of a flock can see the leader and adjust to any changes. Flocks contain both experienced and new migrants, so communicating information about rest and feeding areas is vital.

In practice, turbulence, air currents, and other factors make it extremely difficult for the geese to maintain the exact spacing needed to maximize efficiency or communication, and actual efficiency improvements are only about 20% of the theoretical maximum. Geese go to great lengths to maintain their formation as best as possible, so Badgerow suggests that while both communication and efficiency play a role in V-formations, energy efficiency is the primary motivator. More recent studies have confirmed that flying in the lead is the most tiring position, so geese take turns at the head of the V in order to allow leaders to rest.
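To make the efficiency arithmetic concrete, here is a toy sketch (not Badgerow's model; the 0.16-meter spacing and the roughly 20-percent-of-maximum figure come from the article, while the decay shape and the theoretical maximum saving are assumptions for illustration only):

import math

# Toy model of V-formation energy savings (illustrative only).
OPTIMAL_SPACING_M = 0.16   # wingtip spacing cited in the article
S_MAX = 0.30               # hypothetical theoretical maximum saving vs. solo flight
SIGMA_M = 0.10             # assumed tolerance of the aerodynamic "sweet spot"

def estimated_saving(spacing_m):
    """Fraction of solo-flight energy saved at a given wingtip spacing (assumed Gaussian falloff)."""
    deviation = spacing_m - OPTIMAL_SPACING_M
    return S_MAX * math.exp(-(deviation ** 2) / (2 * SIGMA_M ** 2))

# A goose holding the optimum would get the full hypothetical saving;
# real flocks, buffeted by turbulence, average only about 20% of that maximum.
print(f"saving at the optimum spacing: {estimated_saving(0.16):.0%}")
print(f"typical realized saving (about 20% of maximum): {0.20 * S_MAX:.0%}")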




JSTOR citation: John P. Badgerow, “An Analysis of Function in the Formation Flight of Canada Geese,” The Auk, Vol. 105, No. 4 (Oct. 1988), pp. 749-755. American Ornithologists' Union.



From: Sam 7/13/2017 8:28:27 AM
Millennials Are Attending Events in Droves Because of Fear of Missing Out

Jaimie Seaton, Skift
- Jul 12, 2017 10:00 am
skift.com

excerpt:

“The biggest buzzword right now is FOMO [fear of missing out], and that’s a huge factor for Millennials,” said Aubri Nowowiejski, 28, an executive producer at Coterie Spark, a global meeting and corporate event-planning firm based in Houston, Texas.

“It’s all about projecting to your social media network, and painting a picture of a phenomenal lifestyle. They chase experiences over things to get those likes and comments and interactions, and that dopamine fix,” said Nowowiejski, who founded the Student Event Planners Association in 2009 while a student at Texas State University. It currently has more than 2,000 members nationwide.

In fact, the study found that nearly half of Millennials surveyed attend live events so they have something to share online; and 78 percent reported enjoying seeing other people’s experiences on social media.

The Millennial obsession with FOMO and cultivating the perfect social media feed can be a boon to consumer-focused marketing events, especially if they are free, but it also presents challenges to conference and meetings organizers who value engagement. In an effort to lure the young demographic, organizers may reach for the unique “wow” factor, and wind up with an event that is more style than substance.

“That’s one of my biggest frustrations: I want us to put down our phones,” said Nowowiejski. “We’re experiencing the whole world through a lens and we’re not present in the moment, and it is an issue. Conference planners are trying to integrate social media and meet Millennials where they are. You go to a conference and are told to live-tweet, but if I’m tweeting, am I really absorbing what the speaker is saying? It’s a double-edged sword. What did you really experience if you didn’t put your phone down?”

more at the link



From: Sam 7/15/2017 6:31:01 PM
The Lawyer, the Addict
A high-powered Silicon Valley attorney dies. His ex-wife investigates, and finds a web of drug abuse in his profession.

By EILENE ZIMMERMAN
Photographs by DAVID BRANDON GEETING
JULY 15, 2017


nytimes.com



From: Sam 7/19/2017 11:27:49 AM
Interview with Andrew Ng on The1A.org

Getting Really Smart About Artificial Intelligence
Wednesday, Jul 19 2017 • 11 a.m. (ET)


the1a.org

Chances are, you’ve already encountered artificial intelligence today.

Did your email spam filter keep junk out of your inbox? Did you find this site through Google? Did you encounter a targeted ad on your way?

We constantly hear that we’re on the verge of an AI revolution, but the technology is already everywhere. And Coursera co-founder Andrew Ng predicts that smart technology will help humans do even more. It will drive our cars, read our X-rays and affect pretty much every job and industry. And this will happen soon.

As AI rises, concerns grow about the future of humans. So how can we make sure our economy and our society are ready for a technology that could soon dominate our lives?

Guests
Andrew Ng, computer scientist and expert on artificial intelligence; co-founder and co-chairman, Coursera, an online education platform built around massive open online courses; founding head, Google Brain, an artificial intelligence initiative; former vice president and chief scientist, Baidu, a Chinese digital services company

Should We Be Worried About AI?

There’s clearly some public anxiety about artificial intelligence.



And why wouldn’t there be? One of the smartest humans alive, Stephen Hawking, says AI could end mankind.

But the question isn’t whether to worry about AI, it’s what kind of AI to worry about.

Tesla founder Elon Musk recently warned a gathering of governors that they need to act now to put regulations on the development of artificial intelligence. “I keep sounding the alarm bell, but until people see robots going down the street killing people, they don’t know how to react, because it seems so ethereal,” he said.

This kind of concern should come with a caveat, which The Verge points out:

Musk is not talking about the sort of artificial intelligence that companies like Google, Uber, and Microsoft currently use, but what is known as artificial general intelligence — some conscious, super-intelligent entity, like the sort you see in sci-fi movies. Musk (and many AI researchers) believe that work on the former will eventually lead to the latter, but there are plenty of people in the science community who doubt this will ever happen, especially in any of our lifetimes.

To understand the threats AI may or may not pose to society, it’s best to understand the types of AI that do and don’t (yet) exist. Wait But Why has a great summary:

AI Caliber 1) Artificial Narrow Intelligence (ANI): Sometimes referred to as Weak AI, Artificial Narrow Intelligence is AI that specializes in one area. There’s AI that can beat the world chess champion in chess, but that’s the only thing it does. Ask it to figure out a better way to store data on a hard drive, and it’ll look at you blankly.

AI Caliber 2) Artificial General Intelligence (AGI): Sometimes referred to as Strong AI, or Human-Level AI, Artificial General Intelligence refers to a computer that is as smart as a human across the board—a machine that can perform any intellectual task that a human being can. Creating AGI is a much harder task than creating ANI, and we’re yet to do it. Professor Linda Gottfredson describes intelligence as “a very general mental capability that, among other things, involves the ability to reason, plan, solve problems, think abstractly, comprehend complex ideas, learn quickly, and learn from experience.” AGI would be able to do all of those things as easily as you can.

AI Caliber 3) Artificial Superintelligence (ASI): Oxford philosopher and leading AI thinker Nick Bostrom defines superintelligence as “an intellect that is much smarter than the best human brains in practically every field, including scientific creativity, general wisdom and social skills.” Artificial Superintelligence ranges from a computer that’s just a little smarter than a human to one that’s trillions of times smarter—across the board.

(Really, the full post is essential reading.)

Type 1 exists. This is what we use every day. This is what is reshaping our social networks, advertising and economy. The threat here is already visible. “Fake news” designed to hoax humans also games algorithms to reach a wider audience. Automation is replacing human jobs.

Types 2 and 3 cause the anxiety. Futurist Michael Vassar, who has worked with AI, has used Nick Bostrom’s thinking on artificial intelligence to predict that “if greater-than-human artificial general intelligence is invented without due caution, it is all but certain that the human species will be extinct in very short order.”

Even though very smart people disagree over whether this AI will ever exist, the concept of a science-fiction dystopia is simultaneously terrifying and alluring. It’s easy to imagine a Terminator-like world where machines do battle with their human creators and think of it as both unlikely to happen in our lifetimes and also inevitable. And this can make it hard to think about taking steps to stop it from happening. At least one study has found that people are worried about smart machines killing them.

However, as the future of AI approaches, there are already noticeable problems in our ever-more automated present. The economy is reacting to the loss of jobs to machines. The algorithms that drive what information we see can be gamed to feed us misinformation, and even if they work as intended, they can lock us into only getting one side of every story.

“In our current society, automation pushes people out of jobs, making the people who own the machines richer and everyone else poorer. That is not a scientific issue; it is a political and socioeconomic problem that we as a society must solve,” wrote scientist Arend Hintze. “My research will not change that, though my political self – together with the rest of humanity – may be able to create circumstances in which AI becomes broadly beneficial instead of increasing the discrepancy between the one percent and the rest of us.”

more at the link plus the interview will be archived there



From: Sam 7/25/2017 5:13:32 PM
All Signs Point to a Cyclical Slowdown in Inflation
businesscycle.com

The inflation debate among Federal Reserve policy makers comes down to peering through a mud-caked windshield to figure out what's ahead, or being guided by looking in the rear-view mirror. But a third option, the U.S. Future Inflation Gauge, cuts through the confusion and has a solid track record of providing turn-by-turn directions to the road ahead like a GPS.

Traditionalists on the Federal Open Market Committee who place their faith in the Phillips curve worry that, with the jobless rate below the Fed’s estimate of “full employment,” the central bank will fall behind the curve on inflation if it stops tightening. Others argue that the Phillips curve has stopped working. They prefer the rear-view mirror approach, extrapolating core inflation, which is by definition a coincident indicator of inflation, to guide their expectations of future inflation. That methodology never foresees turns in the road ahead, instead projecting a straight extension of the road just traveled.

Some history is helpful. Following the first interest-rate hike in the 1994-95 tightening cycle -- which produced a “soft landing,” thanks to the only pre-emptive moves by the Fed in recent memory -- then Fed Chairman Alan Greenspan endorsed the work of Geoffrey H. Moore in his congressional testimony. The Wall Street Journal described Moore as “the father of leading indicators,” whose work on inflation is the basis for our firm's U.S. Future Inflation Gauge, or FIG. Justin Martin's biography of the former Fed chairman noted that the FIG “would prove to be one of Greenspan’s favorite indicators.”



For those who think that today’s economy somewhat resembles that of the late 1990s, featuring growth without inflation, we'd point out that the FIG correctly anticipated that unusual combination of circumstances, which the Phillips curve failed to foresee. Rather than using the Phillips curve or extrapolating recent inflation data, the FIG leads cyclical upturns and downturns in inflation by tracking underlying cyclical inflation pressures -- in essence, the co-movement of cyclical leading indicators of inflation.
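ECRI does not publish the FIG's recipe, so the following is only a generic sketch of the idea of tracking the co-movement of cyclical leading indicators: standardize several component series and average them into a composite whose sustained turns are read as signals. The component names, values, and equal weights below are all assumptions for illustration, not the actual FIG inputs.

from statistics import mean, stdev

# Illustrative composite of hypothetical inflation-leading series
# (monthly values are invented for the example).
components = {
    "commodity_prices": [2.0, 2.3, 2.6, 2.4, 2.1, 1.8],
    "import_prices":    [1.5, 1.7, 1.9, 1.8, 1.6, 1.3],
    "loan_growth":      [5.0, 5.4, 5.9, 5.6, 5.1, 4.7],
}

def zscores(series):
    """Standardize a series so the components are comparable."""
    mu, sd = mean(series), stdev(series)
    return [(x - mu) / sd for x in series]

# Composite index = equal-weighted average of the standardized components;
# a sustained decline would be read as a cyclical downswing in inflation pressure.
standardized = [zscores(s) for s in components.values()]
composite = [round(mean(month), 2) for month in zip(*standardized)]
print(composite)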

The news today is that even as the Fed and other central banks get more hawkish the FIG is starting to turn down, flagging a change in the direction of the inflation cycle that the Fed is likely to miss. Skeptical? The big run-up in the FIG a year ago was spot on in anticipating the reflation trade, at a time when inflation expectations were near multiyear lows. Inflation expectations then ran up with the reflation trade through the beginning of this year, falling in line behind the upswing in the FIG.

As for wage growth, most still remain confused as to why in a tight labor market it’s logical to see nominal wage growth rising when economic growth decelerates, and falling when it accelerates. Moreover, wage inflation typically lags CPI inflation at cyclical troughs and leads it at peaks. Given that dynamic and the ongoing cyclical downswing in the FIG, wage inflation is unlikely to see significant gains in the months ahead.

The FIG is clearly signaling a fresh cyclical downswing in inflation, which is being obscured, ironically, by the dip in inflation due to so-called idiosyncratic factors such as the declines in wireless phone service and prescription drug prices. The bottom line: Regardless of the decline in the jobless rate, the inflation cycle is turning down.



bloomberg.com



From: Sam 7/26/2017 7:45:48 AM
The Chipotle Corporate Sabotage Theory Returns
Some people see a simple case of foodborne illness. Others see corporate sabotage.
By Deena Shanker
July 25, 2017, 5:00 AM EDT

bloomberg.com

Yet another outbreak of foodborne illness last week at Chipotle Mexican Grill did what it usually does to the burrito chain: The stock price plummeted. It's bad news—particularly for the patrons who got sick—but it's a boon for anyone that had the foresight to short the stock.

The latest outbreak was first noted by iwaspoisoned.com, a website that crowdsources reports of customer illnesses following visits to restaurants. The goal, it says, is "safer food, safer communities and a healthier economy." Yet, as Bloomberg reported last week, hedge funds looking to profit from others' bad luck can also access a "souped up" version of the site for a $5,000 monthly fee.

Aaron Allen, principal at Aaron Allen & Associates, a restaurant industry consultancy, posited in a LinkedIn post on Monday morning that the Chipotle illness might not just be a matter of luck. "A lot of things stacked up that made it suspicious," he told Bloomberg in an interview on Monday, "and when you look at it from a statistical point of view, even more suspicious." His group has no financial interest in the chain, Allen said, and he has previously lauded the chain's pre-scandal marketing.

He's not the first to publicly speculate about the possibility that corporate sabotage is behind Chipotle's woes. (There was even a recent plot line on Showtime's "Billions" in which a hedge fund manager contaminated the customers of a fictional company, Ice Juice, whose stock he had just shorted.) A similar theory—which was largely dismissed—circulated two years ago when the chain was hit by outbreaks of E. coli, norovirus, and salmonella. The stock price plummeted at the time, costing the company billions of dollars in market capitalization.

While Allen can't prove his theories about Chipotle, he argues that corporate sabotage of a similar nature has happened in the past. A woman planted a severed finger in her Wendy's chili more than a decade ago. In the 1980s, Tylenol capsules were purposely tainted with potassium cyanide, leading to the deaths of seven people in the Chicago area.

Plus, there was a lot of money on the table for Chipotle short sellers—as much as $459 million, according to Allen's calculations—before there were any inklings of a food safety problem. "Chipotle short-sellers saw their ambitions rewarded with $55 million in less than one day," he wrote of the latest scare.

Allen said his team, which includes a statistician and food safety experts, found a number of "statistical anomalies." First, he wrote in his post, the time of year raises questions; while 60 percent of food safety outbreaks occur from December to May, Chipotle's happened from August to December. Second, Chipotle experienced four times the number of norovirus outbreaks a chain of its size would be expected to have, and that's not even counting the E. coli and salmonella. And, he notes, significantly more people got sick from each of the outbreaks than normally do from the same pathogens. The average norovirus outbreak causes about 18 illnesses, while salmonella usually leads to about 25. At Chipotle, more than 200 people were sickened from a single August 2015 norovirus outbreak, and 64 fell ill from the same month's salmonella problem.
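Allen's team does not spell out its statistics, but the flavor of a claim like "four times the expected number of outbreaks" can be illustrated with a simple Poisson tail probability. The expected-count baseline below is an assumption made purely for illustration, not a figure from Allen or Bloomberg.

import math

def poisson_tail(expected, observed):
    """P(X >= observed) for a Poisson-distributed count with the given mean."""
    cdf = sum(math.exp(-expected) * expected ** i / math.factorial(i)
              for i in range(observed))
    return 1.0 - cdf

# Hypothetical: if a chain this size were expected to see about 1 norovirus
# outbreak over the period, how surprising would 4 outbreaks be?
expected_outbreaks = 1.0
observed_outbreaks = 4
print(f"P(>= {observed_outbreaks} outbreaks) = "
      f"{poisson_tail(expected_outbreaks, observed_outbreaks):.3f}")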

"We're not saying this as a definitive," he said. "But if you were a short seller and you were looking for where there would be the most financial gain in the restaurant industry, the best way is a food safety scare, and the best stock would be Chipotle."

Chipotle did not respond to Allen's post but told Bloomberg that it was aware of the 2015 theories. "We ... did not see any evidence to support them," spokesman Chris Arnold said in an email. He also pointed out that the company implemented food safety enhancements after those incidents.

"I can tell you unequivocally that none of the 2015 outbreaks were caused by some terrorist or criminal acts," said Bill Marler, a leading food safety attorney who has litigated foodborne illness claims for decades and who represented clients in each of that year's Chipotle outbreaks. "It's conspiratorial nuts."

Marler examined the fact patterns in each case and believes they were all caused by the usual suspects: sick employees, failure to pay attention to detail and ready-to-eat food showing up already tainted.

He allows that a conspiracy is theoretically possible. "But I don't believe in aliens, or that humans walked the earth with dinosaurs, either," he added.

Allen said he's just raising a question about a statistical anomaly, not accusing anyone of purposely poisoning burrito lovers. "It really is one of those situations, like who would put a finger in the chili?"



From: Sam 7/29/2017 1:42:46 PM
How to Avoid Sepsis, a Deadly Medical Emergency
More than 1 million Americans a year develop this condition, and 250,000 die from it, a new CDC report says

By Kevin McCarthy
August 23, 2016

consumerreports.org

Dana Mirman didn’t pay much attention when she noticed what she assumed was a mosquito bite on her shoulder.

“After all, I live in Florida, people get bites,” says the 41-year-old communications professional.

Even when it began throbbing overnight she didn't worry much, assuming it was a simple allergic reaction. The next day she came down with flu-like symptoms, including vomiting and fever. “I still didn’t connect the two,” she says.

But later that evening her temperature spiked to 104 and her husband rushed her to the emergency room. The staff checked her blood pressure, and saw it was plummeting. They immediately started her on I.V. fluids and powerful antibiotics, she says—a response that likely saved her life.

Mirman was a victim of sepsis, an often deadly reaction to certain infections that can quickly race through your body, shutting down vital organs. While not well known, it's actually more common than heart attacks, and just as deadly.

More than a million people in the U.S. develop sepsis each year, and more than 250,000 deaths a year are linked to it, according to the Centers for Disease Control and Prevention. That’s higher than earlier estimates—an increase the CDC attributes mainly to better reporting on the infection, as well as the growing number of older people, who are especially susceptible to sepsis.

While many cases are linked to infections contracted in hospitals, doctor offices, nursing homes, and other healthcare facilities, a report out today from the CDC says that nearly half are community-acquired, meaning from infections people contract in their homes, schools, playgrounds, and elsewhere. Any infection can lead to sepsis, from a manicure gone wrong, food poisoning, or a simple urinary tract infection.

One reason sepsis is so dangerous is that patients and even doctors often don’t recognize the problem until it’s too late, says Anthony Fiore, M.D., of the CDC’s Division of Healthcare Quality Promotion and an author of the new report. “Rapid early treatment can save lives," he says. “It’s just like a heart attack or stroke,” in that the longer you wait, the deadlier it is.

Here’s what you need to know about sepsis, including how to recognize it and what to do if you suspect it.

How Sepsis Harms

Sepsis often starts with a local infection, most often in the lungs, digestive tract, urinary tract, or, as in Mirman’s case, on the skin. It usually stems from a bacterial infection, but fungal or viral infections can trigger sepsis, too.

Whenever the body develops an infection, the immune system normally kicks in, producing chemicals to fight the infection. But sometimes—either because the triggering bacteria is unusually powerful or because the person’s immune system is already weakened by other health problems—those chemicals are set loose in the bloodstream and course through the body.

Instead of just fighting the local infection, those chemicals unleashed by the immune system cause widespread inflammation and damage tissues in the liver, kidneys, heart, and other organs. Within hours, blood clots can begin to form, and damage to blood vessels causes blood pressure to drop, which in turn slows the delivery of vital nutrients to those organs already under attack. In the final stages, the heart weakens and organs begin to fail.

Who’s at Risk?
The new CDC report helps identify who’s most at risk of sepsis. That includes people:

  • With chronic diseases. Seven out of 10 people in the CDC study who developed sepsis had diabetes, lung, kidney, or liver disease, or some other chronic condition. People with those conditions are at increased risk because they are susceptible to infection.
  • With conditions that weaken the immune system, such as AIDS or cancer.
  • Who are 65 or older or younger than a year old, probably because they too are more prone to infection.
  • Who have recently been in a hospital, nursing home, or other healthcare facility, in part because infection-causing bacteria are often common in those institutions.

How to Prevent Sepsis
While sepsis is often not recognized until it’s already progressed, Fiore said many cases could be prevented.

For one thing, the CDC report found that most sepsis patients have some underlying health problem that requires them to have frequent contact with doctors. Those visits, Fiore says, are an opportunity for healthcare providers to prevent infections that can turn septic (by, for example, ensuring that diabetics get thorough exams of their feet to check for wounds that could breed infections) and to educate people about warning signs for sepsis.

In addition, there are a number of steps people can take on their own to protect themselves from the underlying infections that can trigger sepsis. Here’s how:

  • Get vaccinated. Thirty-five percent of sepsis cases in the CDC study stemmed from pneumonia. Yet the CDC says that only 20 percent of high-risk adults under age 65 and 60 percent of those 65 and older have been vaccinated against that disease. And annual flu shots can also prevent respiratory infections that can sometimes turn septic. But only about a third of sepsis patients in the CDC study had a record of being vaccinated against the flu. “Thousands more deaths could be prevented with better coverage,” wrote the authors of the CDC report.
  • Treat urinary tract infections promptly. A quarter of sepsis cases resulted from urinary tract infections. So it’s important to see a healthcare provider if you have warning signs of those infections—usually a painful, burning feeling when you urinate and a strong urge to “go” often—and, when appropriate, be treated with antibiotics. UTIs are also common among hospital patients, especially those who have been catheterized, so it’s extra important to watch for those infections while in the hospital.
  • Clean skin wounds properly. About one in 10 sepsis cases follows a skin infection. So it’s essential to care for wounds and scrapes properly. That means washing with soap and water, cleaning out any dirt or debris, and covering wounds. And people with diabetes should make sure that they follow good foot care practices, since wounds there can often develop dangerous infections.
  • Avoid infections in hospitals. Since many infections that turn septic originate in hospitals and other healthcare facilities, it’s essential that you—and your healthcare providers—take steps to avoid those infections. That means, for example, insisting that everyone who comes into your hospital room—including doctors and nurses—wash their hands every time they touch you. Read more tips on how to avoid infections in hospitals.
Watch for Warning Signs
If you do develop an infection, it’s important to watch for signs of sepsis and, if you suspect it, act fast. Here’s what to do:
  • Know the symptoms. These include fever plus chills, shortness of breath, rapid breathing, increased heart rate, diarrhea, vomiting, rash, or pain, which many survivors describe as the worst they’ve ever felt. Disorientation or confusion can also be signs of sepsis.
  • Act fast. If you have any of those symptoms, “contact a healthcare provider and say, ‘I think I have an infection and it’s really gotten worse, and I think it’s possible I could have sepsis,’” Fiore says. If it’s after hours, either contact the provider’s on-call service or, if you feel very sick, go to the emergency room. “It’s not something you sit on until morning,” he says.
  • Demand attention. Healthcare providers sometimes miss sepsis, or fail to react quickly enough, says Lisa McGiffert, the director of Consumer Reports Safe Patient Project. “Patients and their advocates need to be aggressive in pushing for quick action, and there is no excuse for a hospital or emergency room to respond to sepsis slowly once it is diagnosed,” McGiffert says. “Hospitals can do a lot to prevent death from sepsis by diagnosing it early and responding with urgency.”
  • Get the right treatment. If your doctor suspects sepsis, you should get treated with IV fluids and antibiotics right away. Initially, you will probably need a broad-spectrum antibiotic, which targets multiple bacteria. But your doctors should also order tests to identify the responsible bacteria and, if possible, switch you to an antibiotic that targets that specific bacteria.

In addition, Mirman—who started volunteering for the Sepsis Alliance, an organization that raises awareness about the condition, after her experience—emphasizes the importance of having support when dealing with sepsis. Sepsis can be extremely disorienting, and cause sleepiness, she says. “I was so exhausted, all I wanted to do was sleep,” Mirman remembers. If it weren’t for her husband, she might never have made it to the emergency room. “I would not have picked up a phone and called 911.”

Video at the link
consumerreports.org



From: Sam 8/2/2017 7:56:54 AM
Some docs are like this. But I've seen some other young docs over the past year who seem really impressive. They seemed to be determined to be both caring and tough. Respectful of the patient's autonomy but not giving up their position as the expert purveyor of information. And aware of their limitations at the same time.

Medicine has become a service industry, and it's making doctors unable to confront death
  • Published on July 29, 2017
  • Featured in: Big Ideas & Innovation, Editor's Picks, Healthcare, The Weekend Essay
  • Seamus O'Mahony
    Consultant Gastroenterologist at Cork University Hospital

    Kieran Sweeney, a doctor and writer, was diagnosed with pulmonary mesothelioma in 2009. The condition, a type of lung cancer usually caused by exposure to asbestos, is almost invariably fatal. He wrote about his diagnosis in the British Medical Journal:

    The next 48 hours are spent talking to our four beautiful kids, aged mid-teens to early 20s, whose joyous careers are currently sprinkled through school, part-time jobs and university. I can’t really convey in words the catastrophic hurt my news has inflicted on them, and it is an insult, which at their age they should never have to endure. I will die of this tumour, I say, and we must address that, neither accepting nor comprehending it. This tumour will kill my body, I say, but I will yield my spirit and personhood reluctantly. We embrace. They weep. I weep for them, for fear for myself, and for the unthinkable horror that they will continue to inhabit the world in which I will play no part.
    These are the words of a grown-up. There is no whiff of the ‘how to break bad news’ workshop. No sentimentality, no aversion of the eyes from the monumental and unavoidable prospect of impending death. No foolishness about ‘fighting’ the cancer, or travelling to the ends of the world to find a cure. This is what ‘breaking bad news’ is really like.

    He died on Christmas Eve of the same year, 2009. He had been an original and eloquent commentator on medicine and its wider role. He drew attention to the limitations of evidence-based medicine, coining the term ‘the information paradox’ to describe the danger of information-overload distracting the doctor from his core role of relieving suffering. Here is an extract from his obituary in the British Medical Journal:

    He later described this approach as metaphysical and in a prescient piece he wrote, ‘The clearest example of this transition to the metaphysical level occurs when someone starts to die or accepts that death is near. Here both the doctor and the patient are confronted by the question: “When is enough, enough? This”, he wrote, “will be the defining question for the next generation of practitioners.”’
    Enough, indeed. A word rarely used in American medicine, where the culture of medical excess is most unbridled. Atul Gawande, the American surgeon and writer, became famous for championing safer surgery by adopting practices from other spheres of human activity such as the aviation industry. He writes regularly for the New Yorker, which published a long piece by him in 2010, called ‘Letting Go: What should medicine do when it can’t save your life?’ This article formed the basis for his 2014 book, Being Mortal. He described how, in the US, dying patients are routinely subjected to futile and painful medical treatments, while their doctors fail to discuss with them the inevitable outcome: ‘Patients die only once. They have no experience to draw on. They need doctors and nurses who are willing to have the hard discussions and say what they have seen, who will help people prepare for what is to come – and to escape a warehoused oblivion that few really want.’

    But doctors are no longer brave enough. They increasingly see themselves as service-providers, a role that does not encourage Difficult Conversations, or a willingness to be brave. Consumerism, fear of litigation and overregulation have conspired to create the customer-friendly doctor, who emerged when the doctor–patient relationship became recast in a quasi-commercial mould. This type of doctor, well trained in communication skills, eminently biddable, is not what Kieran Sweeney or Atul Gawande had in mind. Doctors, by the nature of their selection and training, are conformist, and the now dominant ethos of customer-friendliness has all but silenced other, dissenting, voices. There is now an insatiable appetite for medicine: for scans, for drugs, for tests, for screening. This appetite benefits many professional groups, industries and institutions. It is difficult to call ‘enough’, but a good doctor sometimes has to tell patients things they do not want to hear. Regrettably, it is much easier, in the middle of a busy clinic, to order another scan than to have the Difficult Conversation.

    Doctors have a duty beyond that of pleasing the individual patient, a duty to society at large. The US has many so-called ‘concierge’ doctors, private physicians engaged by the wealthy, who are always on call to minister to the needs of their fastidious and demanding clients. The annual fee per patient is said to be as much as $30,000. The ultimate concierge doctor was Conrad Murray, the late Michael Jackson’s personal physician. Murray’s willingness to prescribe almost anything, including the general anaesthetic agent, propofol, for his wealthy and manipulative patient, eventually led to Jackson’s death.

    Patient autonomy now trumps all other rights and obligations. Autonomy, however, is a useful card to play when, as often happens, particularly with the diagnosis of cancer, I am ambushed by well-meaning relatives, urging me not to tell the patient, because ‘it would kill’ them. Relatives have no formal rights as such, but commonly dictate medical care to those doctors keen on a quiet life and willing to be leaned on. Inevitably there will be instances, such as in the case of patients with dementia or those of very advanced age, where giving a diagnosis of cancer is of no benefit to them. But in most cases I believe it is my duty to tell the truth.

    The difficulty, however, is this: Kieran Sweeney’s acceptance of, and confrontation of, his situation, is the exception, not the rule. He was both advantaged and disadvantaged when he was given the diagnosis of mesothelioma. As a doctor, he knew immediately what the future held in store for him, but this knowledge precluded all hope. Many of my patients lack the educational background or knowledge to fully absorb a diagnosis of something like mesothelioma. Apart at all from this ‘cognitive’ aspect, many simply do not want to know the grisly details about survival statistics and what the future might hold. It is not only relatives who wish to have the truth concealed. Many patients do not want to have the Difficult Conversation.

    The entire modern hospital system conspires against those doctors willing to have this dialogue: the relatives, the chaos and noise of the environment, the technojuggernaut of modern hospital care, the customer-friendly doctors who are happy and willing to dole out false, delusional hope, and sometimes the patients themselves, who may not want to hear what the doctor has to say. The temptation to opt for the quiet life, to avoid the Difficult Conversation, is overwhelming. And no one will ever complain. The relatives will be content, and the dying will soon be dead. Why give yourself the grief?

    Society at large purports to want leadership and professionalism from doctors, but I wonder if this is really true. Leadership and professionalism involve confronting unpleasant truths, and sometimes denying people what they want (or think they want). Many doctors routinely over-diagnose, over-investigate and over-treat; these doctors are invariably described, approvingly, by their patients as ‘thorough’. Inevitably, the ‘thorough’ doctors eventually leave their dying patients mystified and abandoned when there is no chemotherapy left, no scans to order.

    Seamus O'Mahony is Consultant Gastroenterologist at Cork University Hospital and author of The Way We Die Now: The View from Medicine’s Front Line, from which this article is excerpted.

    linkedin.com



    From: Sam 8/3/2017 5:44:07 AM
    Himax Technologies, Inc. Reports Second Quarter 2017 Financial Results and Provides Third Quarter 2017 Guidance
    Thu August 3, 2017 5:00 AM|GlobeNewswire|About: HIMX

    seekingalpha.com



    From: Sam 8/5/2017 9:57:33 PM
    November 2, 2015 Issue
    The Price of Union: The undefeatable South.
    By Nicholas Lemann

    newyorker.com



    Progress in civil rights has been matched by the Southernization of American politics.
    Photograph by Walker Evans / Courtesy Library of Congress

    When the Confederate States of America seceded, the response of the United States of America was firm: dissolving the Union was impermissible. By contrast, it took a few more years for the United States to resolve the question of whether it would permit slavery within its own borders, and it took more than a century for the U.S. to enforce civil rights and voting rights for all its citizens. This was mainly because of the South’s political power. In order to become the richest and most powerful country in the world, the United States had to include the South, and its inclusion has always come at a price. The Constitution (with its three-fifths compromise and others) awkwardly registered the contradiction between its democratic rhetoric and the foundational presence of slavery in the thirteen original states. The 1803 Louisiana Purchase—by which the U.S. acquired more slaveholding territory in the name of national expansion—set off the dynamic that led to the Civil War. The United States has declined every opportunity to let the South go its own way; in return, the South has effectively awarded itself a big say in the nation’s affairs.

    The South was the country’s aberrant region—wayward, backward, benighted—but it was at last going to join properly in the national project: that was the liberal rhetoric that accompanied the civil-rights movement. It was also the rhetoric that accompanied Reconstruction, which was premised on full citizenship for the former slaves. Within a decade, the South had raised the price of enforcement so high that the country threw in the towel and allowed the region to maintain a separate system of racial segregation and subjugation. For almost a century, the country wound up granting the conquered South very generous terms.

    The civil-rights revolution, too, can be thought of as a bargain, not simply a victory: the nation has become Southernized just as much as the South has become nationalized. Political conservatism, the traditional creed of the white South, went from being presumed dead in 1964 to being a powerful force in national politics. During the past half century, the country has had more Presidents from the former Confederacy than from the former Union. Racial prejudice and conflict have been understood as American, not Southern, problems.

    Even before the Civil War, the slave South and the free North weren’t so unconnected. A recent run of important historical studies have set themselves against the view of the antebellum South as a place apart, self-destructively devoted to its peculiar institution. Instead, they show, the South was essential to the development of global capitalism, and the rest of the country (along with much of the world) was deeply implicated in Southern slavery. Slavery was what made the United States an economic power. It also served as a malign innovation lab for influential new techniques in finance, management, and technology. England abolished slavery in its colonies in 1833, but then became the biggest purchaser of the slave South’s main crop, cotton. The mills of Manchester and Liverpool were built to turn Southern cotton into clothing, which meant that slavery was essential to the industrial revolution. Sven Beckert, in “Empire of Cotton,” argues that the Civil War, by interrupting the flow of cotton from the South, fuelled global colonialism, because Europe needed to find other places to supply its cotton. Craig Steven Wilder, in “Ebony & Ivy,” attributes a good measure of the rise of the great American universities to slavery. Walter Johnson, in “River of Dark Dreams,” is so strongly inclined not to see slavery as simply a regional system that he tends to put “the South” in quotes.

    After slavery had ended and Reconstruction gave way to the Jim Crow system, the Democratic Party was for decades an unlikely marriage of the white South (the black South effectively couldn’t vote) and blue-collar workers in the North. This meant that American liberalism had a lot of the South in it. Ira Katznelson, in “Fear Itself,” adeptly identifies the deep Southern influence on the New Deal era, the country’s liberal heyday, including not just its failure to challenge segregation but also a strong pro-military disposition that helped shape the Cold War. The great black migration to the North and the West, which peaked in the nineteen-forties and fifties, partly nationalized at least one race’s version of Southern culture, and, by converting non-voters to voters through relocation, helped generate the political will that led to the civil-rights legislation of the nineteen-sixties. Once those laws had passed, the South became for the Republican Party what it had previously been for the Democratic Party, the essential core of a national coalition. The South is all over this year’s Republican Presidential race.


    I’m a fifth-generation Southerner, though long expatriated, and I know the wounded indignation with which the folks back home react to any suggestion that the South is no longer—or maybe never was—an entirely separate region. What about our hound dogs, our verandas, our charm, our football worship, our slow-moving “way of life”? Outsiders who have visited the South, going back to Alexis de Tocqueville and Frederick Law Olmsted or even further, have usually agreed with the natives about the South’s distinctiveness, though they have often seen it as something to condemn, not admire. How can the South be so American if it feels (and smells, and sounds, and looks) so Southern?

    One of the many categories of visitors to the South was concerned liberals during the New Deal, who were primarily interested not in race but in “conditions”—poverty, disease, ignorance. These included the documentary photographers dispatched by the federal government’s Farm Security Administration, who wound up creating most of the familiar images of the Depression, as well as anthropologists, sociologists, journalists, social reformers, artists, and filmmakers. James Agee and Walker Evans’s lugubrious book “Let Us Now Praise Famous Men” is one of the most enduring examples of this tradition. (The 1941 Preston Sturges film “Sullivan’s Travels” manages the nearly impossible feat of poking fun at such visitors while also making it clear that their mission had a powerful moral justification.) During the same period, white Southern novelists produced their own body of work that trafficked in Southern dispossession and dysfunction. William Faulkner was at the head of this class, which also included Erskine Caldwell (who was part of the social-documentary tradition, too, through his professional and personal partnership with Margaret Bourke-White) and, later, Carson McCullers and Flannery O’Connor.

    Paul Theroux, the veteran travel writer, seems to have prepared for “Deep South: Four Seasons on Back Roads” (Houghton Mifflin Harcourt), the first of his ten travel books set in the United States, by immersing himself in these works from the second quarter of the twentieth century. The genre in which he is working naturally organizes itself into vignettes rendered with a primary focus on literary artistry, rather than analysis, so he never has to state a full-dress argument, or even say exactly what he was looking for in those four long driving tours. The South remains more rural than the Northeast, but by now, as in the rest of the country, most people live in metropolitan areas. Still, Theroux tells us, “I stayed away from the big cities and the coastal communities. I kept to the Lowcountry, the Black Belt, the Delta, the backwoods, the flyspeck towns.” This principle may have been a way of simplifying his writing assignment: these are places where some people eat squirrels and raccoons, and are obviously unusual in a way that people in the Atlanta suburbs are not. That makes them easier to portray vividly. But Theroux is left trying to evoke the fastest-growing region of the country, where a hundred and twenty million people live, by taking us to a series of poor, deep-rural, depopulated places, like Hale County, Alabama; the Mississippi Delta; and the Ozarks, where the main noticeable changes in the past few decades are outsourcing and the advent of Gujarati Indians as motel owners.

    V. S. Naipaul, Theroux’s former mentor, wrote quite a similar book twenty-six years ago, called “A Turn in the South.” Naipaul, never one for sentimentality about oppressed people, wound up celebrating “the redneck” (you have to have pale skin to have a red neck) as the South’s heroic type. Theroux thinks of himself as a liberal, and he doesn’t go anywhere near defending the white South’s politics and attitudes. On the other hand, he also doesn’t want to play the part of the disapproving or sneering Northerner. National culture, these days, seems to connect with the part of the South that Theroux visited through rollicking reality-television carnivals like “Duck Dynasty” and “Here Comes Honey Boo Boo.” Theroux strikes an empathetic, mournful tone rather than a mocking one. The people he visits are older, settled. Many of them either work in or are clients of social-service and community-development agencies. More are white than are black. He often compares the rural South—“rotting, picturesquely hopeless, forgotten”—to the underdeveloped parts of sub-Saharan Africa, which he has been visiting intermittently since he was a Peace Corps volunteer in Malawi, in 1963, and he regularly complains that the South gets far less attention from big philanthropies and the like. (He’s especially annoyed that the Clinton Global Initiative evinces so little interest in the poorest regions of Bill Clinton’s home state.)

    In a final, confessional section, Theroux connects the book’s project to his own stage in life. At seventy-four, he finds himself contemplating the past more than the future, and wonders whether the onrushing world has left him behind. Where better to entertain such thoughts than in Allendale, South Carolina, a ghostly town bypassed by the interstate-highway system? But this turn of mind leads him inexorably to an implied theory of the South as, indeed, a region radically apart. Throughout the book, he registers the South’s religiosity and its preoccupation with guns as products of its degraded status, rather than of a culture that has always been more pious and more martial than the rest of the country’s. On one of several visits he makes to gun shows, during which he tries hard to understand rather than to condemn, he observes, “The whites felt like a despised minority—different, defeated, misunderstood, meddled with, pushed around, cheated.” His final judgment on the South, delivered at the end of the book, is this: “Catastrophically passive, as though fatally wounded by the Civil War, the South has been held back from prosperity and has little power to exert influence on the country at large, so it remains immured in its region, especially in its rural areas, walled off from the world.”

    Even if you believe the South is that separate from the rest of the country, you might still, if you look hard enough, detect tendrils of Southern influence that extend past the Mason-Dixon Line. Race provides the obvious example. The slave states developed an elaborate and distinctively American binary racial system, in which everybody across a wide range of European origins was put into one category, white, and everybody across a wide range of African origins (including those with more white forebears than black forebears) was put into another category, black. These tendentious categories have been nationalized for so long that they seem natural to nearly all Americans. They are Southern-originated, but not Southern. They powerfully determine where we live, how we speak, how we think of ourselves, whom we choose to marry. They are deeply embedded in law and politics, through the census, police records, electoral polling, and many other means.

    A frequent companion of the idea of a simple distinction between black and white is the idea of a simple distinction between racists and non-racists. There can’t be anybody left who believes that racists exist only in the South, but there are plenty of people, especially white people, who believe that racism is another simple binary and that they dwell on the better side of it. Paul Theroux marvels that Strom Thurmond, the old South Carolina arch-segregationist, fathered an out-of-wedlock black child. “Funny that a racist like Thurmond would have an affair with his black servant,” he remarks to someone he’s visiting. Come on! It’s visually evident how often this happened—“racism” as manifest in a sense of sexual entitlement, rather than of revulsion. Theroux himself displays an uncharacteristic electric jolt of resentment on the rare occasions when he contemplates urban black culture. In one passage, he refers to “the obscene, semiliterate yawp and grunt of rap,” and, in another, he describes a well-dressed black-bourgeois group he encounters at an event in Little Rock as being “like a shoal of leathery sharks” who are “suspicious, chilly, with a suggestion of hauteur in their greeting, as if they were still learning how to deal with whites.”

    Ari Berman’s “Give Us the Ballot” (Farrar, Straus & Giroux), a history of the 1965 Voting Rights Act, makes for an excellent extended example of the mechanisms by which race in the South becomes race in the nation. The Voting Rights Act followed the better-known Civil Rights Act by a year. It is properly understood as part of a wave of legislation that represents the political triumph of the civil-rights movement, but Berman, like most people, finds a precipitating event in the murder, in June, 1964, in Neshoba County, Mississippi, of three young civil-rights workers, James Chaney, Andrew Goodman, and Michael Schwerner.

    Chaney, Goodman, and Schwerner’s mission was voter registration—hence their connection to the Voting Rights Act. It’s sad but true that their murders would not have resonated so deeply if Goodman and Schwerner had not been whites from New York who had come South to participate in Freedom Summer. In fact, the grassroots organizing on behalf of voting rights was substantially black and Southern. Just before Freedom Summer, the congregation of Mt. Zion Methodist Church, in the all-black Neshoba County town of Longdale, had voted to make its church the local headquarters of the movement’s voter-registration efforts. A few days before the murders, the Ku Klux Klan burned the church down, because of the role it was playing. Chaney, Goodman, and Schwerner were on their way back from a trip to Longdale to investigate the fire when they were killed.

    “One Mississippi, Two Mississippi” (Oxford), by Carol V.R. George, a history of the Mt. Zion church, makes plain how essential the church was to the local civil-rights struggle. It was organized, with the help of Northern whites, during the period when the citizenship of former slaves was being rescinded, with the end of Reconstruction. For decades, its members were involved in every possible effort to reinstate the rights of blacks in Neshoba County, including the years of relentless activity that preceded Freedom Summer. And, after the church was rebuilt, it was deeply engaged in the long struggle to bring to justice one of Chaney, Goodman, and Schwerner’s killers, Edgar Ray Killen, whom an all-white Neshoba County jury refused to convict in 1967. That took until 2005.

    So the passage of the Voting Rights Act was actually a North-South partnership, not an imposition of the North’s will on the South. And it would be a big mistake to think of the act as a great, enduring civil-rights milestone, representing the country’s belated decision to comply fully and everywhere with the Fifteenth Amendment to the Constitution. As Berman demonstrates, the act has been, instead, the subject of half a century of ceaseless contention, leaving its meaning permanently undetermined. Most of the consequential fights about civil rights, beginning with the Reconstruction-era amendments to the Constitution, have been over the federal government’s role in enforcement. The Voting Rights Act gives Washington the power to review local voter-registration practices, and to change the boundaries of election districts in areas that have a history of discrimination or that appear to be drawing district lines so as to minimize the number of black elected officials. But the act, as written, invites conflict because its enforcement provisions come up for periodic congressional review.

    Every few years, there has been a serious attempt to discontinue these enforcement provisions. Berman makes a persuasive case that the ongoing battles over the reviews of the Voting Rights Act, beginning with the first one, in 1970, have had a major impact on who has held political power. Periods of aggressive enforcement have produced more black voters and more liberal (especially black) elected officials—including, Berman suggests, Barack Obama—and also the potential for conservative politicians to take advantage of white resentment of the Voting Rights Act.

    In August of 1980, Ronald Reagan chose to kick off his general-election Presidential campaign at the Neshoba County Fair, in Mississippi, not far from where Chaney, Goodman, and Schwerner were murdered, and to declare, “I believe in states’ rights.” Once Reagan was in office, there was a battle over the terms of one of the Voting Rights Act’s periodic extensions, in which a significant actor was John Roberts, then a young lawyer at the Justice Department and now the Chief Justice. Berman has found in the National Archives a set of memos that Roberts wrote in 1981 and 1982, demonstrating a passionate opposition to aggressive enforcement of the Voting Rights Act. Three decades later, in the case of Shelby County v. Holder (2013), Roberts led a Supreme Court majority that struck down the major enforcement provision of the act, arguing that the problem the act was passed to correct has long since been solved. This will help Republicans in subsequent elections, including the 2016 Presidential election.

    At passage, the Voting Rights Act appeared to be only about the South, but over the years it has regularly been applied elsewhere. Politics is racial, to some extent, in most places; it was impossible to keep such a major law from having national repercussions. Among the states that have now passed election laws in direct response to the Shelby decision are Arizona, Wisconsin, and Ohio. The same dynamic—in which a “regional” issue goes national—repeats itself in just about every realm: not just in politics but also in culture, business, social mores.

    “It will become all one thing or all the other,” Abraham Lincoln declared of the beleaguered, slavery-stressed Union, in his “House Divided” speech. In fact, the South and the rest of the nation have one of those hot-blooded relationships—the major one, in American history—which never settle into either trustful intimacy or polite distance. The South is too big and powerful to be vestigial; too married to the rest of the country to stand truly apart; too distinctive in its history to be fully united with the other states. Colin Powell, back in the days when, as Secretary of State, he was voicing skepticism about the Iraq War, used to say, “If you break it, you own it.” That seemed true for a while in Iraq, but, being halfway around the world, Iraq wasn’t so hard to leave. The Union’s defeat of the Confederacy makes for a better example.

    Nicholas Lemann joined The New Yorker as a staff writer in 1999, and has written the Letter from Washington and the Wayward Press columns for the magazine.


