|To: Jon Koplik who wrote (8966)||1/8/2020 12:46:35 PM|
|From: Jon Koplik|
|(long) NYT obituary on Paul A. Volcker, Fed Chairman .................................................|
Dec. 9, 2019
Paul A. Volcker, Fed Chairman Who Waged War on Inflation, Is Dead at 92
Mr. Volcker helped shape American economic policy for decades, notably by leading the Federal Reserve’s brute-force campaign to subdue inflation in the 1970s and ’80s.
By Binyamin Appelbaum and Robert D. Hershey Jr.
Paul A. Volcker, who helped shape American economic policy for more than six decades, most notably by leading the Federal Reserve’s brute-force campaign to subdue inflation in the late 1970s and early ’80s, died on Sunday in New York. He was 92.
The death was confirmed by his daughter, Janice Zima, who did not specify the cause. Mr. Volcker had been treated for prostate cancer, which was diagnosed in 2018.
Mr. Volcker, a towering, taciturn and somewhat rumpled figure, arrived in Washington as America’s postwar economic hegemony was beginning to crumble. He would devote his professional life to wrestling with the consequences.
As a Treasury Department official under Presidents John F. Kennedy, Lyndon B. Johnson and Richard M. Nixon, Mr. Volcker waged a long, losing struggle to preserve the postwar international monetary system established by the Bretton Woods agreement.
As a senior Federal Reserve official from 1975 to 1987, in addition to battling inflation, he sought to limit the easing of financial regulation and warned that the rapid growth of the federal debt threatened the nation’s economic health.
In his last official post, as chairman of President Barack Obama’s Economic Recovery Advisory Board, formed in response to the 2008 financial crisis, he persuaded lawmakers to impose new restrictions on big banks -- a measure known as the “Volcker Rule.”
Mr. Volcker interlaced his long stretches of public service with a lucrative career on Wall Street, most prominently as chief executive of the investment bank Wolfensohn & Company.
His reputation for austere integrity also made him a popular choice as an independent arbiter. In one instance he oversaw the reclamation of deposits that Swiss banks had failed to return to the families of Holocaust victims.
His defining achievement, however, was his success in ending an extended period of high inflation after President Jimmy Carter chose him to be the Fed’s chairman in 1979.
He prevailed by delivering shock therapy, driving the economy into a deep recession to persuade Americans to abandon their entrenched expectation that prices would keep rising rapidly.
The cost was steep. As consumers stopped buying homes and cars, millions of workers lost their jobs. Angry home-builders mailed chunks of two-by-fours to the Fed’s marble headquarters in Washington. But Mr. Volcker managed to wring most inflation from the economy.
His victory inaugurated an era in which the leaders of both political parties largely deferred to the central bank, allowing technocrats to chart the course of monetary policy with little political interference.
Ben S. Bernanke, the Fed’s chairman from 2006 to 2014, kept on his bookshelf one of the chunks of wood that Mr. Volcker received during the anti-inflation campaign.
“He came to represent independence,” Mr. Bernanke said in an interview for this obituary. “He personified the idea of doing something politically unpopular but economically necessary.”
Proud, confident and 6-foot-7 in socks, Mr. Volcker struck many as remote and intimidating. Those who knew him well said the gruff exterior concealed a shy man with a puckish wit. His first wife told a biographer that she had waited vainly for a proposal before she finally asked him if he wanted to marry.
He was famously frugal, favoring drugstore cigars and ill-fitting suits. In the 1960s, when the driver’s seat in his Nash Rambler collapsed, Mr. Volcker propped it up with a chair and continued to drive the car. As chairman of the Fed, he lived in an apartment building populated by George Washington University students and took his laundry to his daughter’s house in the Virginia suburbs.
His time in the national spotlight began in August 1979. Mr. Carter, struggling to salvage public confidence in his administration, decided to reshuffle his cabinet, plucking the Fed chairman, G. William Miller, to serve as Treasury secretary. Mr. Volcker, who was then serving as president of the Federal Reserve Bank of New York, was not Mr. Carter’s first choice as a replacement.
Mr. Volcker was known to be frustrated with the Fed’s halfhearted efforts to curb inflation, leading Mr. Carter’s aides to warn that he might drive the economy into recession.
Meeting Mr. Carter in the Oval Office, Mr. Volcker slumped on a couch, a familiar cigar in hand, and gestured at Mr. Miller, who was in the room. “You have to understand,” Mr. Volcker said he told the president, “if you appoint me, I favor a tighter policy than that fellow.”
In taking the job, Mr. Volcker strained his finances and his family life.
The job of chairman paid half as much as his post at the New York Fed, and Mr. Volcker’s wife at the time, Barbara Volcker, who suffered for much of her life from debilitating rheumatoid arthritis as well as diabetes, remained in New York to be near her longtime physician. (She died in 1998.) Their son, James, who was born with cerebral palsy, also remained in New York.
When Mr. Volcker arrived in Washington, the national inflation rate was exceeding 1 percent a month. (By comparison, in 2017 inflation was less than 2 percent for the whole year.) Rapid and unpredictable inflation encourages spending while discouraging investment, a combination that creates economic instability and, often, political instability.
Henry C. Wallich, a Fed governor who had lived through the hyperinflation of Weimar Germany and often told of paying 150 billion marks to use a neighborhood swimming pool, was among those warning that the Fed was losing control.
Many economists still argued that the Fed could reduce inflation gently, without causing a recession, by raising interest rates just enough to slow economic activity. But Mr. Volcker said inflation had become a self-fulfilling prophecy. People had come to expect prices and wages to rise, so they borrowed and spent more and demanded larger pay increases, and prices and wages rose.
The Fed had been promising to crack down on inflation for more than a decade, but it had repeatedly caved in to intense political pressure so as to avoid a recession. Mr. Volcker decided a dramatic gesture was necessary to convince the public that this time would be different.
“I wanted to move the story at least to the front page,” he told a biographer.
Channeling the Money
On Saturday, Oct. 6, 1979, Mr. Volcker held an evening news conference in the grand boardroom at the Fed’s headquarters on Constitution Avenue. It was the first time in memory that a Fed chairman had addressed the news media, and the Fed’s staff scrambled to gather the press corps.
Pope John Paul II was visiting Washington; when CBS said that it didn’t have a spare camera crew, Mr. Volcker’s spokesman persuaded the network to abandon the pontiff. “Send your crew here,” he told a CBS producer. “Long after the pope is gone, you’ll remember this one.”
Mr. Volcker’s message was that the Fed was declaring war on inflation. “The basic message we tried to convey was simplicity itself,” he said later. “We meant to slay the inflationary dragon.”
To underscore the Fed’s determination, Mr. Volcker announced a significant change in the conduct of monetary policy. Historically, the Fed had aimed to control interest rates -- the price of money. Under the new policy, he said that the Fed would instead aim to control the supply of money. Limiting the money supply would cause interest rates to rise, but the Fed would no longer aim for a specific increase. The central bank would determine how much money was available; markets would set the price.
The change was part of a broader shift in economic policy-making toward a greater reliance on financial markets. It marked the end of the postwar era in which disciples of the British economist John Maynard Keynes had argued that governments could deftly manage economic conditions, including interest rates.
An immediate result was that markets pushed interest rates far higher than Mr. Volcker had anticipated. The prime rate, which banks charge their most creditworthy customers, nearly doubled by Election Day 1980, peaking at 21.5 percent. Farmers on tractors circled the Fed’s headquarters. Auto dealers sent the keys to cars they could not sell. “Dear Mr. Volcker,” one builder scrawled on a wooden block with a knothole. “I am beginning to feel as useless as this knothole. Where will our children live?”
Mr. Volcker later confessed to doubts, telling an interviewer in 2016 that he had worn a path into his office carpet while waiting for inflation to surrender. And, early on, he blinked: After tipping the economy into recession in early 1980, the Fed briefly took its foot off the brakes.
But when inflation showed signs of accelerating, the foot slammed back down, and a deeper recession began. Thereafter, Mr. Volcker was obdurate, insisting that the pain was necessary and ultimately worthwhile.
Asked by a reporter how much unemployment he was willing to accept, Mr. Volcker responded, “My basic philosophy is over time we have no choice but to deal with this inflationary situation.”
The harsh Fed policy no doubt contributed to Mr. Carter’s re-election defeat at the hands of Ronald Reagan; he had to campaign when interest rates were at their peak, and before the inflation fever had begun to break. Mr. Carter, in his memoirs, would offer a typically understated assessment: “Our trepidation about Volcker’s appointment was later justified.”
Unemployment rose to a peak of 10.8 percent in November 1982 -- higher than at any point during the recession that began in 2008 -- but by then the benefits of the Fed’s campaign were beginning to appear. Inflation fell below 4 percent in 1983, and Mr. Volcker’s critics were soon drowned out by a burgeoning chorus of admirers.
Some economists continued to argue that the Fed could have brought inflation under control more gently. Alan Greenspan, Mr. Volcker’s successor as Fed chairman, later described the policy as an excess of necessary medicine, although he added that it had been preferable to not doing enough.
In retrospect, it has also become clear that the developed world was on the cusp of an era of declining inflation, arriving as a result of the globalization of manufacturing and capital markets.
But Mr. Volcker’s triumph was undeniable: Inflation has remained under control ever since.
“Paul was as stubborn as he was tall,” Mr. Carter said in a statement on Monday morning, “and although some of his policies as Fed chairman were politically costly, they were the right thing to do. His strong and intelligent guidance helped to curb petroleum-driven inflation, easing a strain on all Americans’ budgets.”
President Reagan appointed Mr. Volcker to a second term as board chairman in 1983. But the relationship soon soured. Mr. Volcker was increasingly vocal in his criticism of the federal government’s growing deficits and reliance on foreign investors. In his austere view, there were no shortcuts to economic growth. The government needed budget discipline as well as low inflation.
The White House, for its part, grew increasingly unhappy with Mr. Volcker’s focus on inflation. In the summer of 1984, as Reagan campaigned for re-election, Mr. Volcker was summoned to meet the president at the White House. Mr. Volcker recounts in his memoirs, published in October 2018, that Reagan sat silently while his chief of staff, James A. Baker III, delivered a blunt message: “The president is ordering you not to raise interest rates before the election.”
Mr. Volcker was “stunned,” he wrote, but he maintained his composure and left without giving a reply. He added that he had not planned to raise rates before the election, and he did not do so.
Mr. Volcker and Mr. Baker also clashed over financial regulation. The Fed played the leading role in overseeing the nation’s largest banks; in Mr. Volcker’s view, they were getting into enough trouble already.
One of the nation’s largest banks, Continental Illinois, failed in 1984 after a long run of reckless lending, particularly for oil exploration, prompting the former Fed chairman William McChesney Martin to observe acidly that Mr. Volcker was “very good on monetary policy” and “a complete flop on bank supervision.”
The largest American banks, mostly based in New York, also got into trouble by lending heavily to Latin American countries. As the Fed raised interest rates, those countries struggled to make interest payments, precipitating a debt crisis. Mr. Volcker played a leading role in devising bailouts for the countries and the banks.
But the big New York banks, with the support of the Reagan administration, kept pressing for permission to re-enter the business of securities trading for the first time since the Great Depression.
Mr. Volcker delayed consideration of the banks’ requests, but the administration forced his hand by appointing new members to the Fed’s board. In early 1986, those new members outvoted Mr. Volcker. He threatened to resign but decided to serve out his second term.
On June 1, 1987, Mr. Volcker went to the White House to say that he did not want a third term; Reagan promptly telephoned Mr. Greenspan to offer him the job.
Over the next two decades, the Fed maintained firm control of inflation, but policymakers otherwise ignored Mr. Volcker’s advice, allowing the national debt to balloon. The government also continued to reduce regulation of the financial industry. The long era of growth that began with Mr. Volcker’s victory over inflation would come to a crashing end in 2008, bringing him back to Washington one last time.
Paul Adolph Volcker Jr. was born on Sept. 5, 1927 -- Labor Day -- in Cape May, on the southern tip of New Jersey, where his father was the city manager. Paul moved as a child to Teaneck, a leafy North Jersey suburb, which had recruited his father to keep the town from bankruptcy.
Mr. Volcker credited his father with inspiring his own career as a public servant.
Nicknamed Buddy by his family, Mr. Volcker had three older sisters -- like him, they were all more than six feet tall -- and all were so determined not to spoil him that “they leaned over backward to abuse me,” he told a biographer.
A reserved youth, he became a top student and, thanks to his height, a member of the varsity basketball team at Teaneck High School. Journalists seeking tales of teenage high jinks have come up empty-handed.
“When my buddies went shooting out streetlights, I said goodbye,” Mr. Volcker recalled in an interview for this obituary in 2010.
After graduating from high school in May 1945, with the end of World War II still months away, Mr. Volcker tried to enlist in the Army but was rejected because he was an inch too tall. He decided to apply to Princeton University, despite his father’s warning that the other students there were very smart. Mr. Volcker was soon getting top marks. He would later observe wryly of his classmates, “They weren’t as smart as my father thought.”
At Princeton, Mr. Volcker pursued a new major that combined the study of economics, politics and history. Searching for a thesis topic his senior year, he decided to write about the Federal Reserve. He produced a 250-page analysis entitled “The Problems of Federal Reserve Policy Since World War II.” It argued that the Fed needed to act more firmly to control inflation.
His adviser, Frank D. Graham, encouraged him to pursue a graduate degree in economics. Mr. Volcker applied both to the Harvard Law School and to Harvard’s Littauer School of Public Administration (later the John F. Kennedy School of Government), which he chose when it offered a full ride.
Two years later, Mr. Volcker had a master’s degree and had finished course work for a Ph.D. in political economy when he won a Rotary Club scholarship for foreign study. He decided to write his doctoral thesis at the London School of Economics. In fact, he wrote no thesis but had a grand time traveling the Continent, sometimes on a bicycle, forging relationships that later proved valuable.
He returned home to become a staff economist at the New York Fed, then was recruited by Chase Manhattan, where he served as special assistant to David Rockefeller, the bank’s vice chairman at the time, and on a commission advising the Treasury Department in Washington.
In 1962, Robert V. Roosa, a mentor to Mr. Volcker at the New York Fed who had become a Treasury official in the Kennedy administration, gave Mr. Volcker his first job in Washington, as an adviser to the Treasury. He was later a deputy under secretary.
Mr. Volcker, raised in a Republican family, said he joined the Democratic Party because he had been inspired as a young man by the politician and diplomat Adlai E. Stevenson II, but he saw himself primarily as a civil servant. He would never realize an ambition to become Treasury secretary, in part because several presidents would conclude that he was insufficiently political. But his reputation as a technocrat meant that as presidents came and went, Mr. Volcker stayed put.
“One of my old friends from abroad once told me -- I think he meant it as an ironic compliment -- that he thought of my career as a long saga of trying to make the decline of the United States in the world respectable and orderly,” Mr. Volcker said.
As a Treasury official under three presidents, Mr. Volcker struggled to preserve the postwar international monetary system that the United States and its allies had created in 1944 at the Bretton Woods resort in New Hampshire. Under the agreement, nations fixed the values of their currencies in dollars, and the United States promised to exchange dollars for gold at $35 an ounce.
The agreement was a mainstay of the postwar effort to foster global economic cooperation, which in turn was considered a powerful means of discouraging military conflict. But during the 1960s, the fixed rates became increasingly untenable because of the resurgence of the German and Japanese economies.
In 1971, Mr. Volcker played a key role in persuading Nixon to suspend the Bretton Woods agreement by closing the “gold window,” meaning the United States would no longer guarantee the value of the dollar.
Mr. Volcker hoped to negotiate a new set of fixed exchange rates. Instead, the dollar was allowed to float, leaving markets to determine the value of the dollar in British pounds or Japanese yen.
This change was celebrated by many economists as a key step in the deregulation of financial markets. Mr. Volcker long regretted it for the same reason. He foresaw, correctly, that the absence of an international system would make it easier for other nations to manipulate their currencies. And he would go on fighting, with little success, to create an alternative system for regulating exchange rates -- an idea that has attracted some new interest since the 2008 financial crisis.
Mr. Volcker left the Treasury in 1974, intending to return to Wall Street.
Instead, Arthur F. Burns, the Fed’s chairman, asked him to take the top job at the New York Fed. It was a powerful position: The bank was the Fed’s operational arm, and its president served as vice chairman of the Fed’s policymaking committee.
Mr. Volcker, a dedicated fly fisherman, headed out on a vacation to clear his mind, then called collect from a pay phone to take the job.
At the New York Fed he would watch with growing frustration as Mr. Burns and then Mr. Miller failed to deliver on promises of tough measures to rein in inflation. It would be another four years before Mr. Volcker had his own chance.
After Mr. Volcker stepped down from the Fed in 1987, at age 59, he returned to a financial industry that was being transformed by deregulation.
Paydays on Wall Street
At first he took a teaching post at Princeton and a part-time role at Wolfensohn, an investment banking firm that cultivated a reputation for providing disinterested advice. Within a few years he was running the firm. He made $89,500 in his final year at the Fed (about $203,000 in today’s money). At Wolfensohn, he earned more than a million dollars a year (about $3.5 million today). When Wolfensohn was acquired in 1995, he earned more in one day than he had in 30 years of public service.
Mr. Volcker also agreed to serve as unpaid chairman of the National Commission on the Public Service, a nonprofit organization founded in 1987 to encourage private-sector leaders to serve in government. He later described its failure to win support for civil service changes, like higher pay, as “probably my biggest regret.”
During the 1990s and 2000s, Mr. Volcker found his services as a chairman of public commissions in demand: He investigated the United Nations’ oil-for-food program in Iraq, corruption at the World Bank and the failings of Enron’s auditor, the Arthur Andersen accounting firm.
Probably his most contentious role was as chairman of a committee created in 1996 to mediate the claims of Holocaust victims and their families against Swiss banks. In some cases the banks had actively impeded efforts to reclaim money that had been deposited before World War II.
“There was a feeling of failed justice on the Jewish side,” Mr. Volcker said of the process, which resulted in a settlement of $1.25 billion. “And on the Swiss side, a feeling of unfair criticism. I think I was genuinely neutral.”
Even after making a fortune on Wall Street, Mr. Volcker never lost his distaste for bankers, whom he often criticized as avaricious and fond of risk-taking at the public’s expense.
In a 2005 speech, Mr. Volcker, who had long taken a dim view of the debt-fueled expansion of the American economy, warned of a looming financial crisis. “Baby boomers have been spending like there is no tomorrow,” he told an audience at Stanford University. “Big adjustments will inevitably come.”
After the crisis came in 2008, Mr. Obama named Mr. Volcker to an advisory role, but the embrace was lukewarm. Most of the president’s advisers were veterans of the pro-deregulation Clinton administration.
Mr. Volcker, sensing the moment, began to campaign publicly for stronger financial regulation.
“Wake up, gentlemen,” he lectured bankers at a conference in December 2009. “I can only say that your response is inadequate. I wish that somebody would give me some shred of neutral evidence about the relationship between financial innovation recently and the growth of the economy.”
He added that it was hard to think of a worthwhile financial innovation since the A.T.M.
Mr. Volcker found a more receptive audience among congressional Democrats. Judging the Obama administration’s initial proposal to overhaul regulation to be too weak, he lobbied for a new measure that would restrict risk-taking by the largest banks. The White House was forced by its allies to acquiesce.
On Jan. 21, 2010, Mr. Obama announced a late change to his plan, which had already passed the House. “I’m proposing a simple and common-sense reform which we’re calling the ‘Volcker Rule,’ after this tall guy behind me,” he said, pointing at Mr. Volcker, who loomed over the president’s other advisers.
The rule restricts banks from making investments that are not intended to benefit customers but rather only to increase the bank’s bottom line. The Trump administration has said that it plans to loosen those strictures as part of a broader review of post-crisis financial regulation.
Mr. Volcker married Barbara Bahnson in 1954. After her death, he married Anke Dening, his longtime assistant, in 2010. Besides her and his daughter, he is survived by his son, James, and four grandchildren.
Asked in the 2010 interview for this obituary about mistakes he had made, Mr. Volcker cited a personal rather than a professional one.
“The greatest strategic error of my adult life was to take my wife to Maine on our honeymoon on a fly-fishing trip,” he said, referring to his first marriage.
For his second marriage, the honeymoon was in the Virgin Islands.
Mr. Volcker, who long resisted entreaties to write a memoir, finally published an account of his life in October 2018, entitled “Keeping at It: The Quest for Sound Money and Good Government.” He said he had agreed to write the book to call attention to what he described as a crisis: “a breakdown in the effective governance of the United States.”
The government, he said, has lost the ability to address even the most obvious problems, like the need for better infrastructure.
“We’re in a hell of a mess in every direction,” he said in an interview with The New York Times in 2018. “Respect for government, respect for the Supreme Court, respect for the president, it’s all gone.”
In Mr. Volcker’s view, a key cause of this breakdown is a lack of qualified and committed public servants. In his memoir, he is particularly critical of his alma mater, Princeton, and other elite institutions for failing to prepare students for careers in public service.
“My plea is very simple,” he said. “Good policy is dependent on good management.”
He expressed the hope that it was not too late to restore trust in government, but he closed on a characteristically sober note, writing, “It will not be easy.”
-- Johnny Diaz contributed reporting.
© 2019 The New York Times Company
|To: Jon Koplik who wrote (9014)||2/9/2020 11:17:02 PM|
|From: Jon Koplik|
|NYT obituary on Kirk Douglas, a Star of Hollywood’s Golden Age ................................|
Feb. 5, 2020
Kirk Douglas, a Star of Hollywood’s Golden Age, Dies at 103
His rugged good looks and muscular intensity made him a commanding presence in films like “Lust for Life,” “Spartacus” and “Paths of Glory.”
By Robert Berkvist
Kirk Douglas, one of the last surviving movie stars from Hollywood’s golden age, whose rugged good looks and muscular intensity made him a commanding presence in celebrated films like “Lust for Life,” “Spartacus” and “Paths of Glory,” died on Wednesday at his home in Beverly Hills, Calif. He was 103.
His son, the actor Michael Douglas, announced the death in a statement on his Facebook page.
Mr. Douglas had made a long and difficult recovery from the effects of a severe stroke he suffered in 1996. In 2011, cane in hand, he came onstage at the Academy Awards ceremony, good-naturedly flirted with the co-host Anne Hathaway and jokingly stretched out his presentation of the Oscar for best supporting actress.
By then, and even more so as he approached 100 and largely dropped out of sight, he was one of the last flickering stars in a Hollywood firmament that few in Hollywood’s Kodak Theater on that Oscars evening could have known except through viewings of old movies now called classics. A vast number filling the hall had not even been born when he was at his screen-star peak, the 1950s and ’60s.
But in those years Kirk Douglas was as big a star as there was -- a member of a pantheon of leading men, among them Burt Lancaster, Gregory Peck, Steve McQueen and Paul Newman, who rose to fame in the postwar years.
And like the others he was instantly recognizable: the jutting jaw, the dimpled chin, the piercing gaze and the breaking voice, the last making him irresistible fodder for comedians who specialized in impressions.
Three Movies a Year
In his heyday Mr. Douglas appeared in as many as three movies a year, often delivering critically acclaimed performances. In his first 11 years of film acting, he was nominated three times for the Academy Award for best actor.
He was known for manly roles, in westerns, war movies and Roman-era spectacles, most notably “Spartacus” (1960). But in 80 movies across a half-century he was equally at home on mean city streets, in smoky jazz clubs and, as Vincent van Gogh, amid the flowers of Arles in the south of France.
Many of his earlier films were forgettable -- variations on well-worn Hollywood themes -- and moviegoers were slow to recognize some of his best work. But when he found the right role, he proved he could be very good indeed.
Early on he was hailed for his performances as an unprincipled Hollywood producer, opposite Lana Turner, in “The Bad and the Beautiful” (1952), and as van Gogh in “Lust for Life” (1956). Each brought an Oscar nomination.
Many critics thought he should have gotten more recognition for his work in two films in particular: Stanley Kubrick’s “Paths of Glory” (1957), in which he played a French colonel in World War I trying vainly to prevent the execution of three innocent soldiers, and “Lonely Are the Brave” (1962), an offbeat western about an aging cowboy.
Early on Mr. Douglas created a niche for himself, specializing in characters with a hard edge and something a little unsavory about them. His scheming Hollywood producer in “The Bad and the Beautiful” was “a perfect Kirk Douglas-type bum,” Bosley Crowther of The New York Times wrote.
Mr. Douglas did not disagree. “I’ve always been attracted to characters who are part scoundrel,” he told The Times in an interview in 1984. “I don’t find virtue photogenic.”
Yet he often managed to win audiences’ sympathy for even the darkest of his characters by suggesting an element of weakness or torment beneath the surface.
“To me, acting is creating an illusion, showing tremendous discipline, not losing yourself in the character that you’re portraying,” he wrote in his best-selling autobiography, “The Ragman’s Son” (1988). “The actor never gets lost in the character he’s playing; the audience does.”
‘Going Over the Line’
The only time that discipline nearly cracked was during the filming of “Lust for Life.” “I felt myself going over the line, into the skin of van Gogh,” he wrote. “Not only did I look like him, I was the same age he had been when he committed suicide.” The experience was so frightening, he added, that for a long time he was reluctant to watch the film.
“While we were shooting,” he said, “I wore heavy shoes like the ones van Gogh wore. I always kept one untied, so that I would feel unkempt, off balance, in danger of tripping. It was loose; it gave him -- and me -- a shuffling gait.”
Most people who worked with Mr. Douglas were either awed by his self-confident intensity or put off by it. He was proud of his muscular physique and physical prowess and regularly rejected the use of stuntmen and stand-ins, convinced he could do almost anything the situation required.
Preparing for “Champion,” he trained for months with a retired prizefighter. He took trumpet lessons with Harry James for “Young Man With a Horn” (although James did the actual playing on the film’s soundtrack). He became a skilled horseman and learned to draw a six-shooter with impressive speed, lending authenticity to his Doc Holliday when he and Lancaster, as Wyatt Earp, blazed away at the Clanton gang in the final shootout in “Gunfight at the O.K. Corral” (1957).
The engine that drove Mr. Douglas to achieve, again and again, was his family history.
The Ragman’s Son
He was born Issur Danielovitch on Dec. 9, 1916, in Amsterdam, N.Y., a small city about 35 miles northwest of Albany. As he put it in his autobiography, he was “the son of illiterate Russian Jewish immigrants in the WASP town of Amsterdam,” one of seven children, six of them sisters. By the time he began attending school, the family name had been changed to Demsky and Issur had become Isadore, promptly earning him the nickname Izzy.
The town’s mills did not hire Jews, so his father, Herschel (known as Harry), became a ragman, a collector and seller of discarded goods. “Even on Eagle Street, in the poorest section of town, where all the families were struggling, the ragman was on the lowest rung on the ladder,” Mr. Douglas wrote. “And I was the ragman’s son.”
A powerful man who drank heavily and got into fights, the elder Demsky was often an absentee father, letting his family fend for itself.
Money for food was desperately short much of the time, and young Izzy learned that survival meant hard work. He also learned about anti-Semitism. “Kids on every street corner beat you up,” he wrote.
Mr. Douglas once estimated that he had held down at least 40 different jobs -- among them delivering newspapers and washing dishes -- before he found success in Hollywood. After graduating from high school, he hitchhiked north to St. Lawrence University in Canton, N.Y., and was admitted and given a college loan.
He became a varsity wrestler there and, despite being rejected by fraternities because he was Jewish, was elected president of the student body in his junior year -- a first for the St. Lawrence campus.
By that time he had decided that he wanted to be an actor. He got a summer job as a stagehand at the Tamarack Playhouse in the Adirondacks and was given some minor roles. He traveled to New York City to try out for the American Academy of Dramatic Arts and performed well, but he was told no scholarships were available.
It was at the Tamarack, the summer after he graduated from college, that he decided to change his name legally to something he thought more befitting an actor than Isadore Demsky. (When he chose Douglas, he wrote, “I didn’t realize what a Scottish name I was taking.”)
Returning to New York, he studied acting for two years, played in summer stock and made his Broadway debut in 1941 as a singing Western Union messenger in “Spring Again.”
The next year he enlisted in the Navy and was trained in anti-submarine warfare. He also renewed his friendship with Diana Dill, a young actress he had met at the American Academy. They married in 1943, just before he shipped out during World War II as the communications officer of Patrol Craft 1139. They had two sons, Michael and Joel, before divorcing in 1951. She died in 2015.
In 1954 Mr. Douglas married Anne Buydens, and they too had two sons, Peter and Eric. All his sons went into the film business, either acting or producing. Michael did both.
Eric Douglas died of an accidental overdose of alcohol and prescription pills in 2004 at the age of 46.
In addition to his son Michael, Mr. Douglas is survived by his wife and his two other sons, as well as seven grandchildren and a great-grandchild.
After being injured in an accidental explosion, Mr. Douglas was discharged from the Navy in 1944. He returned to New York, did some stage work and then headed for Hollywood.
He made his screen debut in 1946 in “The Strange Love of Martha Ivers,” playing a weakling who is witness to a murder. In a big-name cast that also included Barbara Stanwyck, Van Heflin and Judith Anderson, Mr. Douglas more than held his own. He was equally solid in “I Walk Alone,” a 1948 film noir in which he played the heavy in the first of his half-dozen pairings with his close friend Burt Lancaster.
First Shot at an Oscar
But it was the 1949 film “Champion,” produced by the young Stanley Kramer, that made him a star. As Midge Kelly, a ruthless young prizefighter, he presented a chilling portrait of ambition run wild and earned his first Oscar nomination.
He had to wait nearly 50 years, however, before he actually received the golden statuette, for lifetime achievement. He never won a competitive Oscar.
The doors opened wide for him after “Champion.” A year later he appeared in “Young Man With a Horn,” in the title role of a troubled jazz trumpet player modeled on Bix Beiderbecke.
In short order came “The Glass Menagerie” (1950), the screen adaptation of Tennessee Williams’s play about a timid young woman (Jane Wyman) who finds solace in her fantasies, with Mr. Douglas as the gentleman caller; “Ace in the Hole” (1951), in which he played a cynical reporter manipulating a life-or-death situation; and, also in 1951, “Detective Story,” based on Sidney Kingsley’s play, in which Mr. Douglas played an overzealous New York detective who invites his own destruction. Mr. Crowther of The Times wrote that Mr. Douglas’s performance was, “detective-wise, superb.”
Despite his film-star status and all the trappings that came with it -- his autobiography chronicles his many sexual conquests -- Mr. Douglas still hungered for success in the theater. As it turned out he had only one more opportunity.
In 1963 he seized the chance to play the lead role in the Broadway adaptation of “One Flew Over the Cuckoo’s Nest,” Ken Kesey’s novel about authority and individual freedom, set in a mental hospital. Mr. Douglas, to mixed reviews, played Randle P. McMurphy, the all-too-sane patient who is ultimately destroyed by the system. (Jack Nicholson played the part in Milos Forman’s 1975 film adaptation.)
A few years earlier Mr. Douglas, who had worked his way free of a studio contract and formed his own company, Bryna Productions, made waves in Hollywood when he embarked on a film version of “Spartacus,” Howard Fast’s novel of slave revolt in ancient Rome.
He decided not only to hire Dalton Trumbo -- who had been blacklisted during the McCarthy era on suspicion of Communist sympathies -- to write the screenplay, but also to put Mr. Trumbo’s name in the credits rather than one of the pseudonyms he had been using.
“We all had been employing the blacklisted writers,” Mr. Douglas wrote in a 2012 memoir, “I Am Spartacus!: Making a Film, Breaking the Blacklist.” “It was an open secret and an act of hypocrisy, as well as a way to get the best talent at bargain prices. I hated being part of such a system.”
(Mr. Douglas’s role in Trumbo’s redemption -- although some people say he overstated it -- was dramatized in the 2015 biographical film “Trumbo,” which he praised, telling The Telegraph of London that “its spirit is true to the man I admired.” Dean O’Gorman played Mr. Douglas.)
“Spartacus,” released in 1960, was Mr. Douglas’s third blood-and-thunder spectacle set in the ancient past. In “Ulysses” (1955), as Homer’s wandering hero, he survived legendary perils to return to his faithful Penelope (Silvana Mangano). In “The Vikings” (1958), he and Tony Curtis were cast as half brothers who, ignorant of their blood ties, battle for control of a Norse kingdom. And in “Spartacus” it was Mr. Douglas, in the title role, who led his rebellious fellow slaves against the Roman legions (played by 5,000 Spanish soldiers).
One of the last cast-of-thousands spectacles to come out of Hollywood, “Spartacus” was notable as well for its international cast, which included Laurence Olivier, Charles Laughton, Jean Simmons and Peter Ustinov, and for its talented young director, Stanley Kubrick, who had also directed Mr. Douglas in “Paths of Glory.” Most critics were not impressed, but the movie’s popularity has been long lasting. It was restored and re-released in 1991.
Of all his films, Mr. Douglas was proudest of “Lonely Are the Brave,” also written by Mr. Trumbo, which Mr. Douglas insisted on making on a small budget and against studio advice. “I love the theme,” he said, “that if you try to be an individual, society will crush you.”
Mr. Douglas made many more films in the years ahead, but none quite lived up to his work of the 1950s and early ’60s. There were more westerns: “The Way West” (1967), with Robert Mitchum and Richard Widmark; “There Was a Crooked Man ...” (1970), with Henry Fonda; and “A Gunfight” (1971), with Johnny Cash. “Tough Guys” (1986), a comedy, was the last movie he made with Burt Lancaster.
There were more military roles. He was a Marine colonel who foils an anti-government plot in “Seven Days in May,” a 1964 Cold War thriller that also starred Lancaster. He was a naval aviator in “In Harm’s Way” (1965) and a Norwegian saboteur in “The Heroes of Telemark” (1966). In “Is Paris Burning?” (1966) he played Gen. George S. Patton, and in “The Final Countdown” (1980) he commanded a nuclear-powered aircraft carrier.
As fewer film roles came his way, Mr. Douglas turned to television. In the HBO movie “Draw!” (1984), he was an aging outlaw pitted against James Coburn as a drunken sheriff. In the CBS movie “Amos” (1985), he was a feisty nursing-home resident battling a tyrannical nurse played by Elizabeth Montgomery.
Setbacks and Triumphs
There were setbacks in his personal life. In 1986 Mr. Douglas was fitted with a pacemaker to correct an irregular heartbeat. In 1991 he survived a helicopter crash that left two other people dead. In January 1996 he suffered a debilitating stroke that left him with seriously impaired speech and depression so deep, he later said, that he considered suicide.
But he fought his way back, and by March he was able to appear at the Academy Awards ceremony, speaking haltingly, to accept an honorary Oscar for lifetime achievement.
By then he could add that statuette to his other lifetime awards: the Presidential Medal of Freedom, presented by President Jimmy Carter just days before Mr. Carter left office in 1981, and a Kennedy Center Honors award, presented in 1994 by President Bill Clinton.
In addition to acting and producing, Mr. Douglas found time to write. Besides “The Ragman’s Son,” he was the author of a number of books, including the novels “Dance With the Devil,” “The Gift” and “Last Tango in Brooklyn.” Besides his book on “Spartacus,” his memoirs include “My Stroke of Luck” (2001), about his recovery and comeback, and “Let’s Face It: 90 Years of Living, Loving, and Learning” (2007).
In his later years he devoted his time to charity, campaigning with his wife to build 400 playgrounds in Los Angeles and establishing the Anne Douglas Center for Homeless Women, for the treatment of drug and alcohol addiction; the Kirk Douglas High School, a program to help troubled students finish their education; and the Kirk Douglas Theater, to nurture young theatrical artists.
In 2015, on his 99th birthday, he and his wife donated $15 million to the Motion Picture & Television Fund in Woodland Hills toward the construction of the Kirk Douglas Care Pavilion, a $35 million facility for the care of people in the industry with Alzheimer’s disease.
Mr. Douglas’s comeback from illness extended to acting as well. In 1999, at 83, he starred in the comedy “Diamonds,” playing a former boxing champion who, while recovering from a stroke, embarks on a hunt for missing jewels. It was his first film appearance since his illness. Critics judged the movie forgettable, but Stephen Holden, writing in The Times, found Mr. Douglas’s “hard, gleaming performance” a saving grace.
The last films in which he starred shared something of a theme: the reconciliation between fathers and sons. One was a comedy, “It Runs in the Family” (2003), in which his son was played by his actual son Michael. The other was the drama “Illusion” (2004), in which he played an ailing father in search of his estranged son.
Perhaps, together, they were a fitting finale for the ragman’s son, an actor whose boyhood poverty and absent father were never far from his mind. “That’s what it’s all about,” he said in describing what had driven him. “That’s the core, that early part of you.”
He also reconciled himself to advanced age. In 2008, in an essay in Newsweek (“What Old Age Taught Me”), Mr. Douglas wrote:
“Years ago I was at the bedside of my dying mother, an illiterate Russian peasant. Terrified, I held her hand. She opened her eyes and looked at me. The last thing she said to me was, ‘Don’t be afraid, son, it happens to everyone.’ As I got older, I became comforted by those words.”
William McDonald and Julia Carmel contributed reporting.
© 2020 The New York Times Company
|To: Jon Koplik who wrote (9015)||2/29/2020 12:39:25 AM|
|From: Jon Koplik|
|NYT obituary on Freeman Dyson, Math Genius Turned Visionary Technologist ...................|
Feb. 28, 2020
Freeman Dyson, Math Genius Turned Visionary Technologist, Dies at 96
After an early breakthrough on light and matter, he became a writer who challenged climate science and pondered space exploration and nuclear warfare.
By George Johnson
Freeman J. Dyson, a mathematical prodigy who left his mark on subatomic physics before turning to messier subjects like Earth’s environmental future and the morality of war, died on Friday at a hospital near Princeton, N.J. He was 96.
His daughter Mia Dyson confirmed the death. His son, George, said Dr. Dyson had fallen three days earlier in the cafeteria of the Institute for Advanced Study in Princeton, “his academic home for more than 60 years,” as the institute put it in a news release.
As a young graduate student at Cornell University in 1949, Dr. Dyson wrote a landmark paper -- worthy, some colleagues thought, of a Nobel Prize -- that deepened the understanding of how light interacts with matter to produce the palpable world. The theory the paper advanced, called quantum electrodynamics, or QED, ranks among the great achievements of modern science.
But it was as a writer and technological visionary that he gained public renown. He imagined exploring the solar system with spaceships propelled by nuclear explosions and establishing distant colonies nourished by genetically engineered plants.
“Life begins at 55, the age at which I published my first book,” he wrote in “From Eros to Gaia,” one of the collections of his writings that appeared while he was a professor of physics at the Institute for Advanced Study -- an august position for someone who finished school without a Ph.D. The lack of a doctorate was a badge of honor, he said. With his slew of honorary degrees and a fellowship in the Royal Society, people called him Dr. Dyson anyway.
Dr. Dyson called himself a scientific heretic and warned against the temptation of confusing mathematical abstractions with ultimate truth. Although his own early work on QED helped bring photons and electrons into a consistent framework, Dr. Dyson doubted that superstrings, or anything else, would lead to a Theory of Everything, unifying all of physics with a succinct formulation inscribable on a T-shirt.
In a speech in 2000 when he accepted the Templeton Prize for Progress in Religion, Dr. Dyson quoted Francis Bacon: “God forbid that we should give out a dream of our own imagination for a pattern of the world.”
Relishing the role of iconoclast, he confounded the scientific establishment by dismissing the consensus about the perils of man-made climate change as “tribal group-thinking.” He doubted the veracity of the climate models, and he exasperated experts with sanguine predictions they found rooted less in science than in wishfulness: Excess carbon in the air is good for plants, and global warming might forestall another ice age.
In a profile of Dr. Dyson in 2009 in The New York Times Magazine, his colleague Steven Weinberg, a Nobel laureate, observed, “I have the sense that when consensus is forming like ice hardening on a lake, Dyson will do his best to chip at the ice.”
Dr. Dyson’s distrust of mathematical models had earlier led him to challenge predictions that the debris from atomic warfare could blot out the sun and bring on a devastating nuclear winter. He said he wished that were true -- because it would add to the psychological deterrents to nuclear war -- but found the theory wanting.
For all his doubts about the ability of mortals to calculate anything so complex as the effects of climate change, he was confident enough in our tool-making to propose a technological fix: If carbon dioxide levels became too high, forests of genetically altered trees could be planted to strip the excess molecules from the air. That would free scientists to confront problems he found more immediate, like the alleviation of poverty and the avoidance of war.
He considered himself an environmentalist. “I am a tree-hugger, in love with frogs and forests,” he wrote in 2015 in The Boston Globe. “More urgent and more real problems, such as the over-fishing of the oceans and the destruction of wildlife habitat on land, are neglected, while the environmental activists waste their time and energy ranting about climate change.” That was, to say the least, a minority position.
He was religious, but in an unorthodox way, believing good works to be more important than theology.
“Science is exciting because it is full of unsolved mysteries, and religion is exciting for the same reason,” he said in his Templeton Prize acceptance speech. “The greatest unsolved mysteries are the mysteries of our existence as conscious beings in a small corner of a vast universe.”
Freeman John Dyson was born on Dec. 15, 1923, in the Berkshire village of Crowthorne, England. His father, George Dyson, was a composer and conductor. In the family archives is an unfinished novel Freeman began writing when he was 8 years old about an imaginary expedition to the moon to observe the impending impact of an asteroid. (Later in life he probably would have devised, at least on paper, a means of heading off the celestial crash.) The boy’s reading included, in addition to Jules Verne, nonfiction by James Jeans and Arthur Eddington, British physicists with a flair for popularization and a literary bent.
After finishing high school at Winchester College, where his father taught music, he entered Trinity College, Cambridge, and excelled in mathematics.
Looking for a way to serve the war effort while satisfying his pacifist leanings, he took leave in 1943 to work as a civilian scientist for the Royal Air Force Bomber Command. He was charged with using mathematics to plan more efficient bombing campaigns. Years later, in an interview with the physicist and historian Silvan S. Schweber, he agonized over what he saw as his own moral cowardice, comparing himself to Nazi bureaucrats “calculating how to murder most economically.”
Excited by the theoretical frontiers opened by wartime research on nuclear fission, Dr. Dyson returned to Cambridge and concentrated on becoming a physicist. With a bachelor’s degree in mathematics, he entered the graduate physics program at Cornell in 1947, studying under Hans Bethe, who had been a leader of the Manhattan Project.
It was while touring the United States the following summer that Dr. Dyson resolved a pressing problem in theoretical physics.
Richard Feynman, a young professor at Cornell, had invented a novel method to describe the behavior of electrons and photons (and their antimatter equivalent, positrons). But two other physicists, Julian Schwinger and Sin-Itiro Tomonaga, had each independently devised a very different method. All three approaches seemed to satisfy the requirements of both quantum mechanics and special relativity -- two of nature’s acid tests. But which one was correct?
While crossing Nebraska on a Greyhound bus, Dr. Dyson was struck by an epiphany: The theories were mathematically equivalent -- different ways of saying the same thing. The result was QED. Feynman called it “the jewel of physics -- our proudest possession.”
By the time Dr. Dyson published the details in 1949, a doctorate must have seemed superfluous. He was appointed professor of physics at Cornell in 1951. Teaching, he soon realized, was not for him. In 1953, he became a scholar at the Institute for Advanced Study, where he spent the rest of his career.
Dr. Dyson did not begrudge Feynman, Schwinger and Tomonaga the Nobel they received in 1965. “I think it’s almost true without exception if you want to win a Nobel Prize, you should have a long attention span, get hold of some deep and important problem and stay with it for 10 years,” he told The Times Magazine in 2009. “That wasn’t my style.”
He preferred to move from problem to problem, both theoretical and practical. In the late 1950s, consulting for General Atomics in San Diego, he helped design the Triga reactor, which is used for scientific research and nuclear medicine, and worked on Project Orion, which aimed to explore the solar system with an enormous spaceship powered by exploding nuclear bombs.
With the signing of the Limited Nuclear Test Ban Treaty of 1963, Dr. Dyson’s dreams of reaching Saturn by 1970 were put to rest. Despite his disappointment, he came to support the treaty and, sometimes as a member of Jason, an elite group of scientific advisers, consulted with the government on disarmament and defense.
But his interests were not moored to the earth’s surface. Any advanced civilization, he observed in a paper published in 1960, would ultimately expand to the point where it needed all the energy its solar system could provide. The ultimate solution would be to build a shell around the sun -- a Dyson sphere -- to capture its output. Earthlings, he speculated in a thought experiment, might conceivably do this by dismantling Jupiter and reassembling the pieces.
In the meantime Dr. Dyson supported more conventional kinds of solar power, but he proposed that astronomers searching for extraterrestrial intelligence keep an eye out for heat radiating from occluded suns. For mankind’s own colonial efforts, he suggested the Dyson tree, altered genetically to grow on comets and generate a breathable atmosphere.
He also continued with less fanciful work. He and a colleague, Andrew Lenard, won a bottle of Champagne for proving that the Pauli exclusion principle, which states that no two fermions (electrons are an example) can occupy the same state, accounted for the stability of matter. In 1965 Dr. Dyson received a Dannie Heineman Prize, often considered the next best thing in physics to a Nobel.
Little about the world, profound or mundane, escaped his curiosity. Among his work is a short paper deriving a mathematical equation -- beautiful in his eyes -- describing the seam of a baseball.
In the late 1970s Dr. Dyson turned full force to writing. Anyone with an interest in science and an appreciation for good prose is likely to have some Dysons on the shelf: “Disturbing the Universe,” “Weapons and Hope,” “Infinite in All Directions,” “The Sun, the Genome and the Internet.”
He also entered literature in a different way. He appeared in John McPhee’s book “The Curve of Binding Energy” (1974), a portrait of Ted Taylor, the nuclear scientist who led the Orion effort, and in Kenneth Brower’s “The Starship and the Canoe” (1978). In a memorable scene, Mr. Brower wrote of Dr. Dyson’s reunion with his son, George, who had turned his back on high technology to live in a treehouse in British Columbia and build a seafaring canoe. George Dyson later returned to civilization and became a historian of technology and an author. Dr. Dyson’s daughter Esther Dyson is a well-known Silicon Valley investor.
In addition to them and Dr. Dyson’s daughter Mia, he is survived by his second wife, Imme Dyson; their three other daughters, Dorothy Dyson, Emily Dyson Scott and Rebecca Dyson; a stepdaughter, Katarina Haefeli; and 16 grandchildren. Dr. Dyson’s marriage to the mathematician Verena Huber ended in divorce. She died in 2016.
Dr. Dyson’s mind burned until the end. In 2012, when he was 88, he collaborated with William H. Press on a paper about the prisoner’s dilemma, a mathematical concept important to understanding human behavior and the nature of evolution.
In his 90s, Dr. Dyson was still consulting for the government -- on nuclear reactor design and the new gene-editing technology called CRISPR. In 2018, the year he turned 95, his book “Maker of Patterns: An Autobiography Through Letters” was published.
In his Templeton lecture, Dr. Dyson proposed that the universe is guided by “the principle of maximum diversity,” guaranteeing that it unfolds in a way that is “as interesting as possible.” Whatever its merit as a physical law, the principle goes far in describing the course of his extraordinary life.
Julia Carmel contributed reporting.
© 2020 The New York Times Company
|To: Jon Koplik who wrote (9019)||2/29/2020 3:24:56 PM|
|From: Jon Koplik|
|WSJ obituary on Trader Joe’s Founder Joe Coulombe ..........................................|
Feb. 29, 2020
Trader Joe’s Founder Joe Coulombe Dies
LOS ANGELES -- Joe Coulombe envisioned a new generation of young grocery shoppers emerging in the 1960s, one that wanted healthy, tasty, high-quality food they couldn’t find in most supermarkets and couldn’t afford to buy in the few high-end gourmet outlets.
So he found a new way to bring them everything from a then-exotic snack food called granola to California-produced wines that rivaled anything from France in flavor. And he made shopping for them almost as much fun as sailing the high seas when he created Trader Joe’s, a quirky little grocery store filled with nautical themes and staffed not by managers and clerks but by “captains and mates.”
From the time he opened his first store in Pasadena, Calif., in 1967 until his death Friday at age 89, Mr. Coulombe watched his namesake business rise to a retail giant with more than 500 outlets in over 40 states.
“He wanted to make sure whatever was sold in our store was of good value,” said Mr. Coulombe’s son, also named Joe, who added that his father died following a long illness. “He always did lots of taste tests. My sisters and I remember him bringing home all kinds of things for us to try. At his offices he had practically daily tastings of new products. Always the aim was to provide good food and good value to people.”
He achieved that by buying directly from wholesalers and cutting out the middleman, in many cases slapping the name Trader Joe’s on a bag of nuts, trail mix, organic dried mango, honey-oat cereal or Angus beef chili. He named several products after his daughters Charlotte and Madeleine and gave quirky names to others. Among them were Trader Darwin vitamins and a non-alcoholic sparkling juice called Eve’s Apple Sparkled by Adam.
He prided himself on checking out every vintage of wine from California’s Napa Valley, including Trader Joe’s standby, Charles Shaw, affectionately known as Two-Buck Chuck because it sold for $1.99. (It still does in the California stores, although shipping costs have increased the price in other states.)
“He sold a lot of better wines too,” his son noted with a laugh, recalling trips the family made to France to seek them out.
After selling Trader Joe’s to German grocery retailer Aldi in 1979, Mr. Coulombe remained as its CEO until 1988, when he left to launch a second career as what he called a “temp,” coming in as interim CEO or consultant for several large companies in transition. He retired in 2013.
Joseph Hardin Coulombe, an only child, was born on June 3, 1930, in San Diego and lived on an avocado ranch in nearby Del Mar. After serving in the Air Force, he attended Stanford University, where he earned a bachelor’s degree in economics, a master’s in business administration and met and married his wife, Alice.
A few years after graduation, he was hired by the Rexall drugstore chain, which tasked him with establishing a chain of convenience stores called Pronto. When Rexall lost interest in the stores, he bought them and had grown the chain to about a dozen outlets when the huge 7-Eleven company made a major push into Southern California.
“So I had to do something different,” he told the Los Angeles Times in 2014. “Scientific American had a story that of all people qualified to go to college, 60% were going. I felt this newly educated -- not smarter but better-educated -- class of people would want something different, and that was the genesis of Trader Joe’s.”
His wife’s parents had introduced him to a world of foods previously unfamiliar to him, including fine olive oil, fresh seafood and inexpensive quality wine, and he figured things like that would be perfect for the younger audience he was seeking.
As he bargained for those products, he would sometimes come across a particularly exceptional olive oil or vintage wine, never to find it again, and he wouldn’t stock an inferior product in its place.
He eschewed promotional gimmicks like loyalty clubs or loss-leader sales, getting the word out with brief radio spots and the Trader Joe’s “Fearless Flyer” newsletter, whose old-style appearance was inspired by another money-saving effort. He wanted to dress up the newsletter’s stories with illustrations he cut out of magazines, but he made sure he only took ones on which the copyrights had expired.
He passed such savings on not only to his customers but also to his employees, who Trader Joe’s boasts are among retail’s best compensated, with medical, dental, vision and retirement plans and annual salary increases the company says range from 7% to 10%. Many workers have remained with Trader Joe’s for decades.
“He just had a visit yesterday from employee No. 1,” his daughter Charlotte said shortly before her father’s death.
He and his wife also became well known in Southern California philanthropic circles, contributing time and money to such causes as Planned Parenthood, the Los Angeles Opera and the Huntington Library, Art Museum and Botanical Gardens.
In addition to his three children and wife of 67 years, Mr. Coulombe is survived by six grandchildren.
Copyright © 2020 Dow Jones & Company, Inc.