Orphaned Silver Is Finding Its Parent

Tyler Durden

Sat, 06/06/2020 – 07:00

Authored by Alasdair Macleod via GoldMoney.com,

This article examines the prospects for silver, which has been overlooked in favour of gold. Due to the economic and monetary consequences of the coronavirus lockdowns and the earlier turning of the credit cycle, there is an increasing likelihood of a severe and sustained downturn that will require far more monetary expansion to deal with, favouring the prospects of both gold and silver returning to their former monetary roles.

To understand the consequences for silver, this article draws on history, principally of silver standards in America and Britain, in order to appreciate the issues involved and the prospects for silver to regain its former monetary role.

Introduction

So far this year, the story in precious metals markets has been all about gold. Speculators have this idea that gold is a hedge against inflation. They don’t question it, don’t theorise; they just assume. And when every central bank issuing a respectable currency says they will print like billy-ho, the punters buy gold derivatives.

These normally tameable punters are now breaking the establishment’s control system. On Comex, the bullion establishment does not regard gold and silver as money, just an idea to suck in the punters. The punters are no longer the suckers. With their newly promised infinite monetary expansion, central banks are confirming their inflationary fears.

What makes it worse for bullion bank trading desks is that the banking system is now teetering on the edge of the greatest contraction of bank credit experienced at least since the 1930s, and banks are determined to rein in their balance sheets. We normally think of bank credit contraction crashing the real economy: this time, banks are reining in market-making activities as well, and that includes out-of-control gold and silver trading desks, foreign exchange trading, fx swaps and other derivatives — anything that is not a matched arbitrage or an agency deal on behalf of a genuine customer.

Initially, the focus on gold left silver vulnerable. Figure 1 shows how the two metals have performed in dollar terms so far this year, indexed to 31 December 2019. When the bullion banking establishment tried one of its periodic smashes in mid-February, it reduced Comex gold futures’ open interest from just under 800,000 contracts to about 480,000. The price of gold bounced back strongly to be up 14% on the year and the bullion banks are still horribly net short. But silver crashed, losing 34% and has only just recovered to be level on the year so far.

For the punters, in a proper gold bull market silver is seen as just a leveraged bet on gold. They are less interested in the dynamics that cause a relationship between the two metals to exist than they are in the momentum behind the price. For now, active traders are looking for entry points in both metals to build or add to their positions in a bullish but overbought market.

This is just short-term stuff, and much has been written on it about gold. We are generally unaware today that silver, more so than gold, has been the money of ordinary people, and in that sense it still has the greater claim as a circulating medium. It is therefore time to devote our attention to silver.

A brief history of monetary silver

Silver has a similar history to gold of being money. Following the ending of barter, communities worldwide adopted durable metals — gold, silver or copper, depending on local availability — as the principal medium of exchange. And until the 1960s this heritage, with respect to copper and silver, was still reflected in the coinage used in most nations. The British currency is still known as sterling because since the reign of Henry II (1154–1189) money was silver coinage of sterling alloy: 92.5% silver, the balance mainly copper.

Silver was the monetary standard, sometimes alongside gold in a bimetallic arrangement, for most regions from medieval times until the nineteenth century. Sir Isaac Newton reset the silver standard against gold in 1717, and it was because the British government overpriced gold and failed to adjust to the consequences of changing mine supplies, principally the subsequent expansion of gold supply from Brazil, that British commerce moved towards a gold standard during the eighteenth century.

We look in greater detail at these events later in this article.

As international trade developed, gold for trading nations assumed greater significance, leading eventually to the adoption of the British sovereign coin as the gold standard in the early nineteenth century.

In colonial America, silver was the principal circulating currency, in common with that of Britain at the time. Following Newton’s 1717 pricing of gold against the silver standard, similar practical relationships between the two metals existed for trade in nearly all Britain’s colonies; in America’s case at least until independence was formally gained by the Treaty of Paris in 1783.

When Alexander Hamilton was Treasury Secretary, the US introduced a bimetallic standard with the first coinage act in 1792, when the dollar was fixed at 371.25 grains of pure silver, minted with alloy into coins of 416 grains. Gold coins were also authorised in denominations of $10 (eagles) and $2.50 (quarter eagles). The ratio of silver to gold was set at fifteen to one. All these coins were declared legal tender, along with some foreign coins, notably the Spanish milled silver dollar, which had 373 grains of pure silver, making it a reasonable approximation to the US silver dollar.

However, not long after Hamilton’s coinage act was passed, the international market rate for the gold/silver ratio rose to 15.5:1, which led to gold being drained from domestic circulation, leaving silver as the common coinage. Effectively, the dollar was on a silver standard until 1834, when Congress approved a change in the ratio to 16:1 by reducing the gold in the eagle from 247.5 to 232 grains, or 258 grains at about nine-tenths fine. A further small adjustment, to 232.2 grains, followed in 1837. After a few years, gold coins came to dominate in circulation over silver, whose circulation declined as it became more valuable relative to gold. Gold discoveries in California and Australia then increased the quantity of gold mined relative to silver, making silver even more valuable relative to gold coinage and driving it almost totally out of circulation. This was remedied by an act of 1853 authorising subsidiary silver coins of less than $1 to be debased with less silver than called for by the official mint ratio and less than indicated by the world market price.
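As a reader’s check on these mint ratios (not part of the original article), the arithmetic follows directly from the grain weights: 371.25 grains of pure silver per dollar against 247.5, then 232, then 232.2 grains of pure gold per eagle. A minimal sketch in Python:

```python
# Back-of-the-envelope check of the US mint ratios discussed above.
# Pure-metal content in grains; one eagle = $10.
SILVER_PER_DOLLAR = 371.25      # grains of pure silver per dollar (1792 act)
GOLD_PER_EAGLE_1792 = 247.5     # grains of pure gold per eagle (1792 act)
GOLD_PER_EAGLE_1834 = 232.0     # after the 1834 reduction
GOLD_PER_EAGLE_1837 = 232.2     # after the further adjustment of 1837

def mint_ratio(gold_per_eagle: float) -> float:
    """Silver grains per dollar divided by gold grains per dollar."""
    return SILVER_PER_DOLLAR / (gold_per_eagle / 10)

print(round(mint_ratio(GOLD_PER_EAGLE_1792), 2))   # 15.0  -> the original 15:1
print(round(mint_ratio(GOLD_PER_EAGLE_1834), 2))   # 16.0  -> the 1834 move to 16:1
print(round(mint_ratio(GOLD_PER_EAGLE_1837), 2))   # 15.99 -> effectively still 16:1
```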

Under financial pressure from the civil war, in 1862 the government issued notes that were not convertible either on demand or at a specific future date. These greenbacks were legal tender for everything but customs duties, which still had to be paid in gold or silver. The government had abandoned the metallic standards. Greenbacks were issued in large quantities and the United States experienced a substantial inflation.

After the war was over Congress determined to return to the metallic standard at the same parity that existed before the war. It was accomplished by slowly removing greenbacks from circulation. The bimetallic standard, measuring the dollar primarily in silver, was finally replaced with a gold standard in 1879, reaffirmed in 1900 when silver was officially relegated to small denomination money.

In Europe, most countries on a silver standard moved to gold after the Franco-Prussian war (1870–1), when Germany imposed substantial reparations on France, which were paid in gold, enabling Germany to migrate from a silver to a gold standard. Other European nations followed suit.

More recently, silver served as money in Arab lands in the form of Maria Theresa dollars, which circulated widely in the Middle East and East Africa from the mid-nineteenth century and were still being used in Muscat and Oman in the 1970s.

These are just some examples of silver’s use as money in the past. It lives on in base metal coins today, made to look like silver. Now imagine a world where fiat currencies are discredited: gold or gold substitutes will almost certainly return as the money for larger transactions, and silver will equally certainly return as money for everyday transactions. Bimetallism might not return as official policy due to the frequent adjustments required, but history has shown that a relatively stable market rate between gold and silver is likely to ensue, and silver more than gold will ensure widespread distribution of circulating metallic money.

Supply and demand factors

Analysts are currently grappling with the effects of the coronavirus on supply and demand in their forecasts for the rest of this year. Silver mines have been affected by changes in grades and production shutdowns. According to the Silver Institute, in 2019 less than 30% of mine supply was from mines classified as primarily silver, the rest coming from lead/zinc, copper, gold mines and “others” in that order of importance. Miners of lead/zinc, copper and others made up about 56% of global silver mine supply, so that a decline in global economic activity automatically leads to a decline in silver output from base metal miners.

At the same time, falling industrial demand for silver throws a greater emphasis on investment to sustain demand overall. Last year, non-investment demand was 806 million ounces, while investment was estimated at 186 million, a relationship which in a deep recession will require a significant increase in investment demand to absorb the combination of mine, scrap and available above-ground stocks. Identifiable above-ground stocks are estimated at 1,651 million, a multiple of 1.67 times 2019 demand, and 8.9 times 2019 investment demand.
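The multiples quoted above can be verified directly from the Silver Institute figures; the small difference from 1.67 presumably reflects rounding in the underlying estimates. A quick check (figures in millions of ounces, taken from the text):

```python
# Verifying the above-ground stock multiples quoted above (millions of ounces).
non_investment_2019 = 806
investment_2019 = 186
above_ground_stock = 1651

total_demand_2019 = non_investment_2019 + investment_2019        # 992 Moz
print(round(above_ground_stock / total_demand_2019, 2))          # 1.66 -- roughly 1.67x total demand
print(round(above_ground_stock / investment_2019, 1))            # 8.9  -- 8.9x investment demand
```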

For 2020 and beyond, I am very bearish for the global economy for reasons stated elsewhere. If I am right, current estimates for mine supply, of which over half is dependent on base metal mines, will prove optimistic. But silver demand for non-investment usage is likely to decline even more, in which case investment demand will probably need to at least double if silver prices are to rise in real terms.

An interesting point is found in the comparison with gold, where above-ground stocks are many multiples of mine and scrap supply. Stock-to-flow comparisons have been popularised recently by the cryptocurrency community as a measure of future monetary stability, compared with that of infinitely expandable fiat currencies. A high stock-to-flow ratio signals a low rate of inflationary supply. Silver has a very low stock-to-flow ratio due to the low level of above-ground stocks. But it is a mistake to rely on this measure of monetary stability for a metallic money when the lack of physical liquidity should be the main consideration.

At current prices, silver’s above-ground stock is worth only $31bn, compared with gold’s at over $10 trillion. With gold’s above-ground stock worth roughly 323 times silver’s, yet annual mine supply running at only about eight ounces of silver for every ounce of gold, it appears that if gold returns to its traditional monetary role, silver will turn out to be substantially undervalued. “If” is a little word for a very big assumption; but given the unprecedented and coordinated acceleration of monetary expansion currently proposed, an ending of the current fiat currency regime and a return to gold and silver as monies is becoming increasingly likely.
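A rough check of the relative valuation, using the figures above; the $18/oz silver price in the second step is an assumed approximation for illustration, not a figure from the article:

```python
# Relative value of above-ground stocks, using the figures quoted in the text.
silver_stock_value_usd = 31e9        # ~$31bn of above-ground silver
gold_stock_value_usd = 10e12         # ">$10 trillion" of above-ground gold
print(round(gold_stock_value_usd / silver_stock_value_usd))      # ~323x

# Sanity check on the $31bn figure, assuming a silver price of about $18/oz (approximate).
above_ground_silver_oz = 1_651e6     # identifiable above-ground stock, ounces
print(round(above_ground_silver_oz * 18 / 1e9, 1))               # 29.7, i.e. roughly $30bn
```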

The relationship with gold in the numbers above suggests that a bimetallic standard today, on mine supply considerations alone, would be at almost half Isaac Newton’s 1717 exchange rate. Obviously, the issue is not so simple and will be settled by markets. But other facts suggest the gold/silver relationship is due for a radical rethink. Table 1 below lists some of the relevant ones.

The clear outlier is the gold/silver ratio.

How Newton decided the gold to silver ratio

It is natural to assume that the greatest scientific genius of the day derived a clever means to settle the gold/silver ratio when he was Master of the Royal Mint in 1717. Not so. He looked at existing exchange rates, how silver was disappearing from circulation in favour of gold at that time and set an initial rate to stop it. Furthermore, he recommended the rate be revised, most probably downwards, in the light of how trade developed. The point was that Britain operated a silver standard of money and both Newton and Parliament wished to retain it. It was, after all, the established money for day-to-day transactions.

To understand the monetary debates at the time, it will be helpful to commence with a guide to the composition of pre-decimal British money and coinage. There were twenty shillings to the pound (£), and twelve pence to the shilling. Silver coins were crowns (5 shillings) and half crowns, being 2 shillings and 6 pence, written 2s. 6d. There were silver coins of lesser value, but they are not relevant to this discussion.

Over a century before Newton, in 1601 a pound weight of old standard silver was coined into £3. 2s. 0d. in crowns and fractions thereof, and this remained the mint price of silver until 1816, a period lasting over two centuries. In 1670, a pound in weight of gold was coined into £44. 10s. 0d., represented by gold pieces of ten and twenty shillings. That was the equivalent of 14.35 times the value of silver. The gold twenty-shilling piece was called a guinea, because when first struck in 1663 the gold came from the Guinea Coast of Africa, and it was set at 44½ to the pound weight of gold because that was thought to be a stable rate of exchange.

There were some adjustments to the price of gold until it was finally fixed in 1717 at the new rate of £46. 14s. 6d. per pound weight, against the silver standard of £3. 2s. 0d. This moved the guinea from £1 in silver to £1. 1s. 0d., or 21 shillings. The crude ratio was now 15.07 to 1; but allowing for the difference in fineness between sterling silver (92.5% fine) and Crown gold coinage at 22 carats (91.67% fine), the actual ratio was 15.21 to 1.
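These figures can be reproduced with a few lines of arithmetic. A minimal sketch, using only the mint prices and fineness figures quoted above (the helper simply converts pre-decimal £ s d into decimal pounds):

```python
# Reconstructing the 1717 gold/silver ratio from the mint prices quoted above.
def to_pounds(pounds: int, shillings: int, pence: int) -> float:
    """Convert pre-decimal pounds, shillings and pence to decimal pounds (20s = £1, 12d = 1s)."""
    return pounds + shillings / 20 + pence / 240

silver_mint_price = to_pounds(3, 2, 0)    # £3 2s 0d per pound weight of sterling silver
gold_mint_price = to_pounds(46, 14, 6)    # £46 14s 6d per pound weight of crown gold

print(round(gold_mint_price / silver_mint_price, 2))   # 15.07 -- the "crude" ratio

# Adjust for fineness: sterling silver 92.5% pure, crown gold (22 carat) 91.67% pure.
pure_ratio = (gold_mint_price / (22 / 24)) / (silver_mint_price / 0.925)
print(round(pure_ratio, 2))                            # 15.21 -- pure gold to pure silver

# The guinea: 44.5 guineas were struck from a pound weight of gold.
print(round(gold_mint_price / 44.5 * 20, 1))           # 21.0 shillings per guinea
```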

This rate of exchange was introduced during Isaac Newton’s tenure as Master of the Royal Mint, a post he held from 1699. He had previously been appointed Warden of the Royal Mint in 1696 to improve the state of the silver coinage, and he organised the Great Recoinage of 1696–9.

In setting the price of gold, Newton found that gold, having been fixed as high as 22s. in 1699, had been too expensive compared with its silver value in Europe, particularly Holland, Germany, the Baltic States, France and Italy. Not only did he recommend setting the gold guinea at 21s., but he also recommended the rate be kept under review for a possible change to 20s. 6d. That review did not take place.

Trade between Britain and the Continent was increasing, and whatever the rate, merchants had a preference for gold over silver because it was more practical for large payments when they were made in specie, which was normal practice at the time. While Britain remained on a silver standard, for commercial purposes it had increasingly moved to gold, silver being worth more abroad than the rate set at home and therefore progressively driven from active circulation as gold flowed in. The problem was that neither Newton nor Parliament accepted there was more than one currency: silver was the money and gold just a commodity whose price was to be set.

Because silver was valued about 5% more relative to gold in the European countries mentioned above, silver flowed abroad despite the ban on the export of coinage, and conversely gold flowed into Britain. Furthermore, gold mining output from Brazil began to have an impact on Britain’s monetary system following Newton’s 1717 conversion, due to diplomatic and commercial ties between Britain and Portugal. The bulk of this Brazilian gold, estimated by Fay at about 23 million ounces between 1720 and 1750, ended up being shipped to London, helping it to become the European monetary centre, taking that mantle from Amsterdam.

We can conclude that it was a combination of Newton overpricing gold, thereby driving silver into Europe and gold into London, and the discovery of Brazilian gold that turned Britain onto a commercial gold standard, even though officially it remained on a silver monetary standard for ninety-nine years after Newton’s fixing. And finally, in 1816, gold was declared the sole standard measure of value, and no tender of silver coin was legal for transactions valued at over forty shillings. By 1821, Britain was on a gold standard in law as well as fact.

From the two centuries between 1601 and 1816 we learn that silver’s role as money gradually evolved towards a subsidiary role to gold. Gold was the money of merchants and goldsmiths. The latter, acting as custodians of gold, evolved into banks, so high finance was almost exclusively gold. But silver was always there as money, be it for lesser transactions. And if today’s state-issued unbacked fiat currencies disappear, silver is bound to have a monetary role again alongside gold, because having a lower value and greater abundance of supply it can be more widely circulated.

That being the case, those who believe state currencies are on their way to monetary destruction will accumulate silver as a practical version of sound money, noting that the current gold/silver ratio at about 96 times is more than six times its monetary rate in every country that operated a silver or bimetallic standard. Furthermore, those who fear their governments will confiscate gold might observe there is a lesser chance of them confiscating silver, and attempts to confiscate gold would probably increase demand for silver anyway.

The current market position for silver

Since the price of gold began to increase from August 2018, silver has lagged, its moneyness broadly ignored. Figure 2 shows how this has been reflected in the gold/silver ratio.

On 19 March a ratio of 125 was the highest ever seen, marking the most extreme undervaluation for silver. Since then, the ratio has fallen rapidly to its current level of 97. For it to fall further, a continuing advance in the gold price may be required, because in current financial markets higher gold prices would be associated with economic conditions and monetary policies heading towards a substantial, if not catastrophic, deterioration in the purchasing power of fiat currencies.

Additionally, traders manning bullion bank desks are generally finding their trading limits reduced, due to a combination of unfavourable trading conditions and pressure from their superiors to limit bank credit expansion generally. When the coronavirus paralysed China and looked set to do the same to other nations, and the inflationary response became obvious, the bullion banks mounted a concerted bear raid to balance their gold and silver positions on Comex before matters got even further beyond their control. The effect on silver’s open interest is shown in Figure 3.

Open interest was driven down to levels not seen for nearly seven years, after the silver price had fallen from a high of nearly $50 an ounce in 2011 to $18 in 2013, a price level only just now being reclaimed. From open interest’s peak of 244,705 contracts on 24 February to its low of 181,830 on 4 May, contracts for 314,375,000 ounces of silver were closed out, which compares with investment demand for the whole of last year estimated by the Silver Institute at 186,000,000 ounces. This contraction amounts to the synthetic equivalent of nearly 20% of the Institute’s estimate of above-ground silver stocks.
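The contract arithmetic behind these figures (a Comex silver future covers 5,000 ounces) can be checked in a few lines:

```python
# The open interest contraction expressed in ounces, using the figures quoted above.
OUNCES_PER_CONTRACT = 5_000           # one Comex silver future covers 5,000 oz

oi_peak = 244_705                     # contracts, 24 February
oi_low = 181_830                      # contracts, 4 May
closed_oz = (oi_peak - oi_low) * OUNCES_PER_CONTRACT
print(f"{closed_oz:,}")               # 314,375,000 ounces closed out

# Compare with 2019 investment demand and identifiable above-ground stocks.
print(round(closed_oz / 186_000_000, 2))      # 1.69 -- times last year's investment demand
print(round(closed_oz / 1_651_000_000, 3))    # 0.19 -- i.e. nearly 20% of above-ground stocks
```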

Silver held in LBMA vaults totalled 1,170 million ounces in February, the bulk of the 1,651 million ounces recorded by the Silver Institute. The ownership of that silver is not declared but is likely to be a mixture of industrial users, investors (including ETFs) and bank dealers’ liquidity. In practice, banks keep liquidity at a minimum level consistent with the desk’s trading limits, and we know from developments in other derivative markets that trading limits are tending to contract.

Figure 4 shows the latest available data for the net position of swap dealers, effectively the bullion banks’ trading desks.

The last Commitment of Traders figures, for 26 May, show a net short position of only 6,652 contracts; therefore the swap dealers’ positions are almost level. This is a different situation from the gold futures contract, where the swaps are currently short a net 182,864 contracts, representing the equivalent of 569 tonnes worth $31 billion, almost a record and a major headache for the bullion banks.
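The tonne and dollar equivalents quoted for the gold position follow from the 100-ounce Comex gold contract; the $1,700/oz gold price used below is an assumed approximation for illustration, not a figure from the article:

```python
# Converting the swap dealers' net short gold position into tonnes and dollars.
OUNCES_PER_GOLD_CONTRACT = 100        # one Comex gold future covers 100 oz
TROY_OZ_PER_TONNE = 32_150.7

net_short_contracts = 182_864
oz_short = net_short_contracts * OUNCES_PER_GOLD_CONTRACT
print(round(oz_short / TROY_OZ_PER_TONNE))            # 569 tonnes

gold_price_usd = 1_700                # assumed approximate spot price at the time
print(round(oz_short * gold_price_usd / 1e9, 1))      # 31.1 -- roughly $31 billion
```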

In conclusion, having been left behind while monetary events have been focusing on the gold price, silver is now beginning to catch up. The spike to a gold/silver ratio of 125 appears to have marked a major turning point in the relationship, and silver can therefore be expected to continue to outperform gold as the fiat money situation deteriorates. Traders at the bullion banks appear to be avoiding short positions in silver futures, in which case a rising price will see them withdrawing liquidity instead of supplying additional contracts to the buyers.

The global economic and monetary situation is dire, due both to the coronavirus and to the credit cycle, which was already turning down in late 2019. The amount of monetary debasement deployed by central banks in an attempt to save their economies promises to be unprecedented, to the point where total monetary destruction becomes an increasingly likely outcome.

That being the case, the attraction of silver over gold is to be found in a substantial fall of the gold/silver ratio, as it dawns on markets that the end of fiat money is nigh.


Alone Together in the Pandemic

The past is a different country, one I used to live in.

In that country, as I remember it, people moved about and met with others freely, passing close to strangers on the sidewalk, wearing gloves and scarves only when the weather required it. They rode crowded buses and packed subway trains, commuting to their offices so they could sit through meetings in close proximity to their yawning colleagues. They stood in lines and watched kids climb on playground equipment. They went to restaurants with dates and had intimate conversations over dinners prepared by someone else, delivered by platoons of waiters whose hands touched each and every plate. They shared sips of expensive cocktails with fussed-over garnishes containing liquors imported from all over the world. They propped themselves up on comfortingly scuzzy stools at crowded dives, drinking cheap beer as other patrons unthinkingly brushed against them. They watched sports, and they played them.

In that country, people sweated out their frustrations in gyms, sharing weights and treadmill grips. They watched suspenseful movies in darkened theaters, never knowing who might be sitting nearby, breathing in unison as killers stalked through crowded streets onscreen. They hugged each other. They shook hands.

They gathered together, to celebrate, to mourn, to plan for a future that seemed, if not precisely knowable, likely to fall within expected parameters. And, though it seems foreign now, they treated each and every one of these moments as ordinary and unremarkable, because they were.

No longer. Over the course of a few weeks in March, America, along with much of the Western world, became a different country. A novel coronavirus was spreading via physical proximity. So in order to slow its transmission, the places where people gathered together—bars, restaurants, gyms, stadiums, movie theaters, churches—were closed. Workers were sent home from offices. Much of the populace was put into lockdown, under orders to shelter in place. People were effectively sent into hiding. What they were hiding from was each other.

The toll taken by the virus and COVID-19, the deadly disease it causes, can be measured in lives, in jobs, in economic value, in businesses closed and plans forgone. That toll is, by any accounting, tremendous—more than 74,000 dead in the U.S. by the first week of May, more than 33 million unemployed, an economy that may shrink by 20 percent or more—and in the coming weeks and months, it is certain to rise.

As lockdowns swept the country, we lost something else, too, something harder to measure: connection, intimacy, the presence of others, the physical communities of shared cause and happenstance that naturally occur as people go about their lives. We lost our social spaces, and all the comforts they brought us. Yet just as quickly, we went about finding ways to reclaim those spaces and rebuild those communities, by any means we could.

Friendship Machine

Among the first casualties of the pandemic were sports. After a player for the Utah Jazz tested positive for COVID-19 in early March, the National Basketball Association (NBA) suspended its season. Major League Baseball quickly followed, canceling spring training and postponing the start of the regular season indefinitely. In the space of a few days, every other professional sport currently in season followed suit.

Athletics facilities of all kinds—from YMCAs to Pilates studios—closed their doors, with no idea when they would reopen. In multiple cities under lockdown, public officials singled out pickup games as prohibited. This was the country we were suddenly living in: You couldn’t watch sports. You couldn’t play them. Group exercise of almost every kind was all but forbidden.

Games and athletic pursuits have many purposes. They strengthen the body and sharpen the mind. They pass the time. And they build bonds of loyalty and friendship, camaraderie and common purpose. In short, they’re a way of making friends. In the days and weeks after the shutdowns, people found ways to quickly, if imperfectly, replicate or replace all of those things.

With facilities closed, fitness centers began offering online classes and instruction. Vida Fitness, a major gym chain in Washington, D.C., rolled out virtual memberships, featuring a regular schedule of live workouts and a library of full-length instructional videos. Cut Seven, a strength-focused gym in D.C.’s Logan Circle neighborhood owned by a husband-and-wife team, started a free newsletter devoted to helping people stay in shape from home. All over the country, group Pilates and yoga classes moved to videoconferencing apps like Zoom, with individual lessons available for premium fees. A mat and a laptop in a cluttered bedroom isn’t quite the same experience as a quiet studio with fellow triangle posers, but it beats nothing at all, and has even become a kind of aspirational experience. We’re all Peloton wives now.

Competitive sports went virtual as well. The National Football League held its annual draft online. Formula One organized a “virtual Grand Prix” around the F1 video game, featuring turn-by-turn commentary from professional sportscasters. Sixteen pro basketball players participated in the NBA 2K Players Tournament, which pitted the athletes against each other on Xboxes, playing an officially licensed NBA video game, with a $100,000 prize going to coronavirus relief charities of the winner’s choice. Phoenix Suns guard Devin Booker won and split the money between a first responders fund and a local food bank.

With traditional sports sidelined, video games stepped into the spotlight. The World Health Organization, which in 2019 had officially classified gaming addiction as a disorder, joined with large game publishers to promote playing at home during quarantine. On March 16, as state-based lockdowns began in earnest, Steam, a popular hub for PC gaming, set a record for concurrent users, with more than 20 million people logged in at once. A week later, it set another record, with 22 million. Live esports events were canceled, but professional players of games like League of Legends, Overwatch, Call of Duty, and Counter-Strike all continued online.

People weren’t just playing; they were watching. In April, viewership of Overwatch League events on YouTube rose by an average of 110,000 viewers. Twitch, the most popular video game streaming platform, set new audience-number records. Watching other people play video games, like watching sports, became a way to pass the time.

Video games may not exercise the body—that’s what Zoom Pilates is for—but they can sharpen the mind and help form lasting virtual communities.

That has certainly been the case for EVE Online, one of the oldest and most successful large-group online role-playing games in existence. EVE players build and control massive fleets of ships while participating in a complex virtual economy. More than most of its peers, the game’s story and gameplay are driven by players, who coordinate among themselves to form in-game alliances and trade goods and services, developing complex supply chains and running trading outposts. Many of the game’s core features, including its Alliance system, which allows player-corporations to band together to maintain in-game sovereignty, started as player innovations. In some ways, it’s less a game and more of an economic simulator and an experiment in virtual self-governance.

EVE has been going since 2003. But in the COVID-19 era it has seen a “massive and unprecedented change in the number of players coming into the game,” says Hilmar Pétursson, the CEO of publisher CCP Games.

The pandemic lockdowns may have brought new players in, but Pétursson believes it’s the community that will keep them coming back. The game’s developers recently commissioned player surveys to find out what motivates them to play. “We had this thesis that people would join for the graphics, and stay for the community,” he says.

It worked even better than the game makers expected. “Self-reported, the average EVE player has more friends than the average human on Earth,” Pétursson says. In player surveys conducted in November and December 2019, roughly three-quarters said they had made new friends through the game, and that those friends were very important to their lives. Their data, he argues, shows that “people were making real, deep, meaningful friendships within the game.”

“EVE is made to be a very harsh, ruthless, dystopian game,” he says. “It seems that condition pushes people together to bond against the elements, and against their enemies. That creates real, deep, meaningful friendships.” What the developers eventually realized, he says, was that “actually, we were making a friendship machine.”

That has lessons for a world in which everyone is suddenly shut inside their homes. “What we have been seeing from EVE is that physical distance doesn’t mean isolated at all,” Pétursson continues. A game like EVE encourages players to band together to coordinate complex group actions, from space wars to elaborate trading operations. In a time where everyone is separated, that sort of loosely organized teamwork offers “a proven way to maintain social connection.”

Sports may be benched. Gyms may be closed. Pickup games may be illegal. Yet people are still finding ways to keep their bodies fit, to compete with each other in games of skill, to pass the time by watching others do so, and to bond in the absence of physical proximity.

Let’s Not Go to the Movies

Not every communal experience lost to the pandemic will return. And those that do might be forever changed.

Few social experiences are as common as going to the movies. Even as streaming services have made at-home viewing more convenient than ever, theatrical viewing has persisted and even thrived. In 2019, global box office returns hit $42.5 billion, a new record. But it’s hard to have a global movie business when most of the places where people go to see movies around the globe are shut down. Even in boom times—despite the record box office numbers—making and showing movies is a precarious business. In the midst of a pandemic, it’s nearly impossible.

The lockdowns not only shut down movie theaters, which as large gathering places represented potential vectors of transmission; they also shut down film productions, including some of the biggest movies in the works: a fourth Matrix film, all of Marvel’s next big superhero movies, yet another sequel to Jurassic Park. Meanwhile, with theaters closed all over the planet, release dates for films that were already complete were pushed back by months or postponed indefinitely. The biggest impact was on tentpole films—the expensive-to-make franchise sequels whose budgets are predicated on making billions at the box office. Originally set to open in April, the 25th James Bond film, No Time To Die, was bumped to November. The new Fast and Furious film, originally set for May 2020, was moved back to May 2021. A long-gestating Batman follow-up was pushed to October of that year. The dark knight wouldn’t return for a while—if he ever returned at all.

With theaters empty and nothing on the release calendar, the situation looked grimmer than a gritty reboot. Under the best-case scenario, theatrical grosses are expected to drop 40 percent this year. That’s if theaters reopen at all. Most movie screens are owned by a quartet of companies—AMC, Regal, Cinemark, and Cineplex—all of which were in difficult financial positions when the year began. Even the healthiest of the bunch, AMC, was already deep in debt. In April, just weeks after analysts downgraded its credit rating, the company borrowed another $500 million in order to be able to survive in case of closures through the fall. But this was a risky maneuver. If closures persist long enough, industry analysts warned, AMC could end up going bankrupt. And if AMC bit the dust, the other big chains might follow.

Even worse, from the theaters’ perspective, was that movie studios were starting to break the agreement that had long propped up their entire business model: the theatrical window. Studios gave movie theaters exclusive rights to air first-run productions—typically for about three months—before showing them on other platforms, such as video on demand. For years, theater chains had forcefully resisted even the smallest attempts to encroach on their exclusivity. But with theaters closed, the deal was off: Universal released several smaller genre films, including The Hunt and Invisible Man, to video on demand just weeks after they debuted in theaters. Bigger-budget films followed. Trolls World Tour, an animated family film, skipped theaters entirely. And Disney decided to release Artemis Fowl, a $125 million Kenneth Branagh–directed fantasy in the mold of Harry Potter, directly to its new streaming service, Disney Plus.

This was an existential threat to theater chains—and to the modern theatrical experience. Would cinemas survive?

“I would not invest my kid’s piggy bank in any of the big-box, generic movie theaters, as very few general-audience members are brand loyal to a specific theater chain, and these same viewers will not make the efforts to go see a movie in theaters if it is going to be available in their home days later,” says Dallas Sonnier, CEO of the independent production company Cinestate, in an email. (Disclosure: I appear on Across the Movie Aisle, a podcast published by Rebeller, a Cinestate brand.) Cinestate specializes in genre fare made with modest budgets: Its best-known releases are the neo-western Bone Tomahawk, with Kurt Russell, and Dragged Across Concrete, a noirish crime thriller with Mel Gibson and Vince Vaughn, which had limited theatrical showings but found receptive audiences in home viewing. That gives Sonnier a unique perspective on the industry’s current predicament.

“Sure, there will be a brief surge of pent-up demand,” he says, “but that will wane over time, as big theater chains join the ghosts of the retail apocalypse when they cannot force studios back into traditional windows and cannot survive their mountains of debt.” Theaters probably won’t disappear entirely. But if the major chains collapse, far fewer screens will remain. And the survivors will likely be those that offer a premium experience for cinephiles, differentiated from today’s generic cineplexes.

If movie theaters as we know them go the way of the dinosaur, that leaves big questions for moviemakers, questions that producers like Sonnier are already beginning to ponder. “As much as we’d like to say ‘this will all be over soon,’” he says, “with projections being made for second and third waves of infection in this pandemic, we all have to be prepared to continue to release movies from home. If studios aren’t prepared to make that decision on some of their biggest, most anticipated titles, what happens then?”

It’s not that movie producers, large or small, would have to stop making films entirely. But they would have to build in different assumptions about how and where people will see them. That, in turn, would mean making different types of films.

One possibility is that this year’s losses could foster studio consolidation, driving more production under the umbrella of a few big players, like Disney, which recently bought Fox and already nabbed more than 60 percent of total industry profits in 2019. But in May, Disney reported that its profits were down 91 percent in the previous quarter—before the pandemic took its biggest toll. So it’s also possible the crisis could open up new opportunities for smaller-budget, smaller-scale productions that don’t depend on outsized global box office returns—movies, in other words, of the sort that Cinestate specializes in.

Whatever happens, Sonnier believes the movie business won’t emerge unscathed or unchanged. “I think that we’ve yet to witness the big, real changes that are going to happen here,” he says.

The communal experience of watching movies in a pitch-black room with hundreds of strangers might never be common again. But even still, in the weeks after theaters went dark, people found ways to watch things together. New York Times film critics, who suddenly had no films to criticize, started a weekend “viewing party” in which readers were encouraged to watch movies like Top Gun and His Girl Friday over the weekend, with follow-up discussions with Times critics later in the week. In some parts of the country, old drive-in theaters staged a comeback, and restaurants converted parking lots into neo-drive-in experiences. Netflix Party, a web browser extension, allowed viewers in different locations to sync up their shows. The South by Southwest film festival, one of the first major events to shut down in response to the virus, was resurrected in the form of a 10-day online event on Amazon Prime Video. The American Film Institute, a nonprofit that runs several movie theaters, hosted a movie club, encouraging viewers to watch a slate of classic films and releasing short video introductions featuring famous actors and filmmakers. The tagline was “movies to watch together when we’re apart.”

Cinestate found its own ways to keep viewers engaged, hosting viewing events in partnership with the horror-film streaming service Shudder and transitioning a previously scheduled theatrical release to Vimeo On Demand, with part of the proceeds benefiting theaters. “We’re certainly heartsick over the temporary loss of the theater-going experience,” says Sonnier, “but we’re finding ways to keep that spirit of community alive.”

Alone, Together

By now you may have picked up on a theme: communities staying together by going online. Under lockdown, virtually all of what passed for social life shifted to the internet—to video game streaming services and video chats, to Twitter and Facebook, to YouTube and Netflix, and, perhaps more than anything else, to Zoom.

Zoom, an online videoconferencing service that launched in 2011, was the portal through which quarantined life continued. In the weeks after the lockdowns began, it became the go-to platform not only for workplace meetings but for after-work happy hours, birthday get-togethers, dinner parties, even church services. In mid-April, the state of New York legalized Zoom weddings. (Presumably kissing the bride was still done in person.)

Every videoconferencing service saw growth, but from December 2019 to March 2020, Zoom went from 10 million users to more than 200 million. In April, when the British Parliament voted to continue operating by using the service, The Washington Post ran an article headlined “U.K. Parliament votes to continue democracy by Zoom.” In the space of a few months, Zoom became an all-purpose platform for human connection and the functioning of society.

This was a modern blessing: Humans confined to their houses could talk to each other, see each other, smile and laugh in each other’s virtual presences. Technology and human ingenuity had allowed us to preserve our social lives, our religious communities, our family gatherings and friendly outings. There was something heartening about watching people adapt to their new lives, carrying over their old habits and traditions, like immigrants from a previous time.

Yet as genuinely marvelous as the Zoomification of social life was, it was hard not to wonder: How much had really been salvaged? Yes, there was something reassuring in being able to communicate with other people, but in the course of retaining our connections, we’d transformed all of human existence into a conference call, with all of the frustrations that entails: shaky connections, bad lighting, poor audio quality, confusion about whose turn it is to speak, the inherent alienation of communication mediated through screens. This was, at best, a kind of social limbo, and sometimes it felt like something worse. Hell is other people on Zoom.

The online space we’d moved into was almost certainly better than the alternatives available to us, and it came with tangible benefits. But it was a substitute experience, a simulacrum of human connection, an ersatz social space standing in for the real thing. We’d cobbled together imperfect replicas of our old lives, cramped into tiny boxes on computer screens.

In my last days in the old country, the one I used to live in, I visited the Columbia Room with several friends. The Washington, D.C., establishment is known for its elaborate liquid concoctions; in 2017, it was named best cocktail bar in the country. We spent the better part of the evening there, sitting close together, unconcernedly breathing each other’s air, and even sharing sips of drinks.

A bar like the Columbia Room isn’t just a delivery system for cocktails. With its tufted leather seating and its intricately tiled backbar mural, its plant-walled patio and ink-colored cabinets full of obscure booze, it is also a particular space, designed for comfort and socializing, for being near other people and enjoying conversation and company. It has, in other words, a vibe. Roughly a month later, that place—and every place like it—was closed.

The Columbia Room continued to serve cocktails to go, a legal innovation intended to ease the burden of the lockdowns on businesses and imbibers alike, but it wasn’t the same. It couldn’t be.

Producing take-out cocktails is “very different from the bartending that you’re used to,” says owner Derek Brown. “The main difference is the ritual and engaging with the customer. There’s a ritual to making a cocktail. There’s an interaction to it.” And with the Columbia Room closed to in-person business, that’s gone. “We’re very sad—sad is the only word—that we can’t do that right now,” Brown says.

That’s what we lost to the coronavirus: not the cocktails themselves, but the ability to share them. Not competitive sports, but the companionship of playing games together. Not movies, but the experience of seeing stories on a big screen surrounded by friends and strangers. In the new country, we were suddenly, terribly alone.

Among the most upsetting aspects of the lockdowns, especially for those who live in dense cities, was the closure of many public parks and green spaces. Most beachgoing was prohibited. In New York, playgrounds were shuttered and parts of Central Park were cordoned off. As the orders went out, Gov. Andrew Cuomo complained about crowding in public spaces, warning that although people should try to “walk around, get some sun,” there could be “no density, no basketball games, no close contact, no violation of social distancing, period, that’s the rule.” The message was clear: Stay away from each other.

In Washington, D.C., where I live, authorities blocked road access to the Tidal Basin in late March, as the city’s famous cherry blossoms reached the peak of their annual bloom. The National Arboretum was closed, and the city parks department spent the month of April tweeting the hashtag #StayHomeDC; all the facilities the agency oversaw remained closed.

Officially, nature was more or less off-limits, just as spring arrived. Yet as the weather warmed, and the light lingered later and later into the evening, people emerged from their homes. The streets, mostly emptied of vehicle traffic, created space for runners, allowing them to leave the sidewalks for families and dog walkers. In my neighborhood, a small private park, nestled behind a block of houses and maintained by the community, became a place to stretch out and read a book under the sun.

Before the virus, I didn’t go to the park very often. But in this new country, I found myself visiting more frequently, sometimes in the middle of the day. And so, I noticed, were my neighbors.

People sat in the grass and spread out picnics, walked their dogs, played catch with their kids. The park never became genuinely crowded, but it was always populated, a place where you could see other people and, at an appropriate distance, be reminded of their existence. Somehow, going to the park on a sunny afternoon had become an act of solidarity, of necessity, of rebellion. We were all alone in this strange time, this familiar yet deeply foreign place where the authorities had told everyone to stay apart. But at least we had found a way to be alone together.

from Latest – Reason.com https://ift.tt/2UeE7Qq
via IFTTT

Alone Together in the Pandemic

The past is a different country, one I used to live in.

In that country, as I remember it, people moved about and met with others freely, passing close to strangers on the sidewalk, wearing gloves and scarves only when the weather required it. They rode crowded buses and packed subway trains, commuting to their offices so they could sit through meetings in close proximity to their yawning colleagues. They stood in lines and watched kids climb on playground equipment. They went to restaurants with dates and had intimate conversations over dinners prepared by someone else, delivered by platoons of waiters whose hands touched each and every plate. They shared sips of expensive cocktails with fussed-over garnishes containing liquors imported from all over the world. They propped themselves up on comfortingly scuzzy stools at crowded dives, drinking cheap beer as other patrons unthinkingly brushed against them. They watched sports, and they played them.

In that country, people sweated out their frustrations in gyms, sharing weights and treadmill grips. They watched suspenseful movies in darkened theaters, never knowing who might be sitting nearby, breathing in unison as killers stalked through crowded streets onscreen. They hugged each other. They shook hands.

They gathered together, to celebrate, to mourn, to plan for a future that seemed, if not precisely knowable, likely to fall within expected parameters. And, though it seems foreign now, they treated each and every one of these moments as ordinary and unremarkable, because they were.

No longer. Over the course of a few weeks in March, America, along with much of the Western world, became a different country. A novel coronavirus was spreading via physical proximity. So in order to slow its transmission, the places where people gathered together—bars, restaurants, gyms, stadiums, movie theaters, churches—were closed. Workers were sent home from offices. Much of the populace was put into lockdown, under orders to shelter in place. People were effectively sent into hiding. What they were hiding from was each other.

The toll taken by the virus and COVID-19, the deadly disease it causes, can be measured in lives, in jobs, in economic value, in businesses closed and plans forgone. That toll is, by any accounting, tremendous—more than 74,000 dead in the U.S. by the first week of May, more than 33 million unemployed, an economy that may shrink by 20 percent or more—and in the coming weeks and months, it is certain to rise.

As lockdowns swept the country, we lost something else, too, something harder to measure: connection, intimacy, the presence of others, the physical communities of shared cause and happenstance that naturally occur as people go about their lives. We lost our social spaces, and all the comforts they brought us. Yet just as quickly, we went about finding ways to reclaim those spaces and rebuild those communities, by any means we could.

Friendship Machine

Among the first casualties of the pandemic were sports. After a player for the Utah Jazz tested positive for COVID-19 in early March, the National Basketball Association (NBA) suspended its season. Major League Baseball quickly followed, canceling spring training and postponing the start of the regular season indefinitely. In the space of a few days, every other professional sport currently in season followed suit.

Athletics facilities of all kinds—from YMCAs to Pilates studios—closed their doors, with no idea when they would reopen. In multiple cities under lockdown, public officials singled out pickup games as prohibited. This was the country we were suddenly living in: You couldn’t watch sports. You couldn’t play them. Group exercise of almost every kind was all but forbidden.

Games and athletic pursuits have many purposes. They strengthen the body and sharpen the mind. They pass the time. And they build bonds of loyalty and friendship, camaraderie and common purpose. In short, they’re a way of making friends. In the days and weeks after the shutdowns, people found ways to quickly, if imperfectly, replicate or replace all of those things.

With facilities closed, fitness centers began offering online classes and instruction. Vida Fitness, a major gym chain in Washington, D.C., rolled out virtual memberships, featuring a regular schedule of live workouts and a library of full-length instructional videos. Cut Seven, a strength-focused gym in D.C.’s Logan Circle neighborhood owned by a husband-and-wife team, started a free newsletter devoted to helping people stay in shape from home. All over the country, group Pilates and yoga classes moved to videoconferencing apps like Zoom, with individual lessons available for premium fees. A mat and a laptop in a cluttered bedroom isn’t quite the same experience as a quiet studio with fellow triangle posers, but it beats nothing at all, and has even become a kind of aspirational experience. We’re all Peloton wives now.

Competitive sports went virtual as well. The National Football League held its annual draft online. Formula One organized a “virtual Grand Prix” around the F1 video game, featuring turn-by-turn commentary from professional sportscasters. Sixteen pro basketball players participated in the NBA 2K Players Tournament, which pitted the athletes against each other on Xboxes, playing an officially licensed NBA video game, with a $100,000 prize going to coronavirus relief charities of the winner’s choice. Phoenix Suns guard Devin Booker won and split the money between a first responders fund and a local food bank.

With traditional sports sidelined, video games stepped into the spotlight. The World Health Organization, which in 2019 had officially classified gaming addiction as a disorder, joined with large game publishers to promote playing at home during quarantine. On March 16, as state-based lockdowns began in earnest, Steam, a popular hub for PC gaming, set a record for concurrent users, with more than 20 million people logged in at once. A week later, it set another record, with 22 million. Live esports events were canceled, but professional players of games like League of Legends, Overwatch, Call of Duty, and Counter-Strike all continued online.

People weren’t just playing; they were watching. In April, viewership of Overwatch League events on YouTube rose by 110,000 on average. Twitch, the most popular video game streaming platform, set new audience-number records. Watching other people play video games, like watching sports, became a way to pass the time.

Video games may not exercise the body—that’s what Zoom Pilates is for—but they can sharpen the mind and help form lasting virtual communities.

That has certainly been the case for EVE Online, one of the oldest and most successful large-group online role-playing games in existence. EVE players build and control massive fleets of ships while participating in a complex virtual economy. More than most of its peers, the game’s story and gameplay are driven by players, who coordinate among themselves to form in-game alliances and trade goods and services, developing complex supply chains and running trading outposts. Many of the game’s core features, including its Alliance system, which allows player-corporations to band together to maintain in-game sovereignty, started as player innovations. In some ways, it’s less a game and more of an economic simulator and an experiment in virtual self-governance.

EVE has been going since 2003. But in the COVID-19 era it has seen a “massive and unprecedented change in the number of players coming into the game,” says Hilmar Pétursson, the CEO of publisher CCP Games.

The pandemic lockdowns may have brought new players in, but Pétursson believes it’s the community that will keep them coming back. The game’s developers recently commissioned player surveys to find out what motivates them to play. “We had this thesis that people would join for the graphics, and stay for the community,” he says.

It worked even better than the game makers expected. “Self-reported, the average EVE player has more friends than the average human on Earth,” Pétursson says. In player surveys conducted in November and December 2019, roughly three-quarters said they had made new friends through the game, and that those friends were very important to their lives. Their data, he argues, shows that “people were making real, deep, meaningful friendships within the game.”

EVE is made to be a very harsh, ruthless, dystopian game,” he says. “It seems that condition pushes people together to bond against the elements, and against their enemies. That creates real, deep, meaningful friendships.” What the developers eventually realized, he says, was that “actually, we were making a friendship machine.”

That has lessons for a world in which everyone is suddenly shut inside their homes. “What we have been seeing from EVE is that physical distance doesn’t mean isolated at all,” Pétursson continues. A game like EVE encourages players to band together to coordinate complex group actions, from space wars to elaborate trading operations. In a time where everyone is separated, that sort of loosely organized teamwork offers “a proven way to maintain social connection.”

Sports may be benched. Gyms may be closed. Pickup games may be illegal. Yet people are still finding ways to keep their bodies fit, to compete with each other in games of skill, to pass the time by watching others do so, and to bond in the absence of physical proximity.

Let’s Not Go to the Movies

Not every communal experience lost to the pandemic will return. And those that do might be forever changed.

Few social experiences are as common as going to the movies. Even as streaming services have made at-home viewing more convenient than ever, theatrical viewing has persisted and even thrived. In 2019, global box office returns hit $42.5 billion, a new record. But it’s hard to have a global movie business when most of the places where people go to see movies around the globe are shut down. Even in boom times—despite the record box office numbers—making and showing movies is a precarious business. In the midst of a pandemic, it’s nearly impossible.

The lockdowns not only shut down movie theaters, which as large gathering places represented potential vectors of transmission; they also shut down film productions, including some of the biggest movies in the works: a fourth Matrix film, all of Marvel’s next big superhero movies, yet another sequel to Jurassic Park. Meanwhile, with theaters closed all over the planet, release dates for films that were already complete were pushed back by months or postponed indefinitely. The biggest impact was on tentpole films—the expensive-to-make franchise sequels whose budgets are predicated on making billions at the box office. Originally set to open in April, the 25th James Bond film, No Time To Die, was bumped to November. The new Fast and Furious film, originally set for May 2020, was moved back to May 2021. A long-gestating Batman follow-up was pushed to October of that year. The dark knight wouldn’t return for a while—if he ever returned at all.

With theaters empty and nothing on the release calendar, the situation looked grimmer than a gritty reboot. Under the best-case scenario, theatrical grosses are expected to drop 40 percent this year. That’s if theaters reopen at all. Most movie screens are owned by a quartet of companies—AMC, Regal, Cinemark, and Cineplex—all of which were in difficult financial positions when the year began. Even the healthiest of the bunch, AMC, was already deep in debt. In April, just weeks after analysts downgraded its credit rating, the company borrowed another $500 million in order to be able to survive in case of closures through the fall. But this was a risky maneuver. If closures persist long enough, industry analysts warned, AMC could end up going bankrupt. And if AMC bit the dust, the other big chains might follow.

Even worse, from the theaters’ perspective, was that movie studios were starting to break the agreement that had long propped up their entire business model: the theatrical window. Studios gave movie theaters exclusive rights to air first-run productions—typically for about three months—before showing them on other platforms, such as video on demand. For years, theater chains had forcefully resisted even the smallest attempts to encroach on their exclusivity. But with theaters closed, the deal was off: Universal released several smaller genre films, including The Hunt and Invisible Man, to video on demand just weeks after they debuted in theaters. Bigger-budget films followed. Trolls World Tour, an animated family film, skipped theaters entirely. And Disney decided to release Artemis Fowl, a $125 million Kenneth Branagh–directed fantasy in the mold of Harry Potter, directly to its new streaming service, Disney Plus.

This was an existential threat to theater chains—and to the modern theatrical experience. Would cinemas survive?

“I would not invest my kid’s piggy bank in any of the big-box, generic movie theaters, as very few general-audience members are brand loyal to a specific theater chain, and these same viewers will not make the efforts to go see a movie in theaters if it is going to be available in their home days later,” says Dallas Sonnier, CEO of the independent production company Cinestate, in an email. (Disclosure: I appear on Across the Movie Aisle, a podcast published by Rebeller, a Cinestate brand.) Cinestate specializes in genre fare made with modest budgets: Its best-known releases are the neo-western Bone Tomahawk, with Kurt Russell, and Dragged Across Concrete, a noirish crime thriller with Mel Gibson and Vince Vaughn, which had limited theatrical showings but found receptive audiences in home viewing. That gives Sonnier a unique perspective on the industry’s current predicament.

“Sure, there will be a brief surge of pent-up demand,” he says, “but that will wane over time, as big theater chains join the ghosts of the retail apocalypse when they cannot force studios back into traditional windows and cannot survive their mountains of debt.” Theaters probably won’t disappear entirely. But if the major chains collapse, far fewer screens will remain. And the survivors will likely be those that offer a premium experience for cinephiles, differentiated from today’s generic cineplexes.

If movie theaters as we know them go the way of the dinosaur, that leaves big questions for moviemakers, questions that producers like Sonnier are already beginning to ponder. “As much as we’d like to say ‘this will all be over soon,'” he says, “with projections being made for second and third waves of infection in this pandemic, we all have to be prepared to continue to release movies from home. If studios aren’t prepared to make that decision on some of their biggest, most anticipated titles, what happens then?”

It’s not that movie producers, large or small, would have to stop making films entirely. But they would have to build in different assumptions about how and where people will see them. That, in turn, would mean making different types of films.

One possibility is that this year’s losses could foster studio consolidation, driving more production under the umbrella of a few big players, like Disney, which recently bought Fox and already nabbed more than 60 percent of total industry profits in 2019. But in May, Disney reported that its profits were down 91 percent in the previous quarter—before the pandemic took its biggest toll. So it’s also possible the crisis could open up new opportunities for smaller-budget, smaller-scale productions that don’t depend on outsized global box office returns—movies, in other words, of the sort that Cinestate specializes in.

Whatever happens, Sonnier believes the movie business won’t emerge unscathed or unchanged. “I think that we’ve yet to witness the big, real changes that are going to happen here,” he says.

The communal experience of watching movies in a pitch-black room with hundreds of strangers might never be common again. But even still, in the weeks after theaters went dark, people found ways to watch things together. New York Times film critics, who suddenly had no films to criticize, started a weekend “viewing party” in which readers were encouraged to watch movies like Top Gun and His Girl Friday over the weekend, with follow-up discussions with Times critics later in the week. In some parts of the country, old drive-in theaters staged a comeback, and restaurants converted parking lots into neo-drive-in experiences. Netflix Party, a web browser extension, allowed viewers in different locations to sync up their shows. The South by Southwest film festival, one of the first major events to shut down in response to the virus, was resurrected in the form of a 10-day online event on Amazon Prime Video. The American Film Institute, a nonprofit that runs several movie theaters, hosted a movie club, encouraging viewers to watch a slate of classic films and releasing short video introductions featuring famous actors and filmmakers. The tagline was “movies to watch together when we’re apart.”

Cinestate found its own ways to keep viewers engaged, hosting viewing events in partnership with the horror-film streaming service Shudder and transitioning a previously scheduled theatrical release to Vimeo On Demand, with part of the proceeds benefiting theaters. “We’re certainly heartsick over the temporary loss of the theater-going experience,” says Sonnier, “but we’re finding ways to keep that spirit of community alive.”

Alone, Together

By now you may have picked up on a theme: communities staying together by going online. Under lockdown, virtually all of what passed for social life shifted to the internet—to video game streaming services and video chats, to Twitter and Facebook, to YouTube and Netflix, and, perhaps more than anything else, to Zoom.

Zoom, an online videoconferencing service that launched in 2011, was the portal through which quarantined life continued. In the weeks after the lockdowns began, it became the go-to platform not only for workplace meetings but for after-work happy hours, birthday get-togethers, dinner parties, even church services. In mid-April, the state of New York legalized Zoom weddings. (Presumably kissing the bride was still done in person.)

Every videoconferencing service saw growth, but from December 2019 to March 2020, Zoom went from 10 million users to more than 200 million. In April, when the British Parliament voted to continue operating by using the service, The Washington Post ran an article headlined “U.K. Parliament votes to continue democracy by Zoom.” In the space of a few months, Zoom became an all-purpose platform for human connection and the functioning of society.

This was a modern blessing: Humans confined to their houses could talk to each other, see each other, smile and laugh in each other’s virtual presences. Technology and human ingenuity had allowed us to preserve our social lives, our religious communities, our family gatherings and friendly outings. There was something heartening about watching people adapt to their new lives, carrying over their old habits and traditions, like immigrants from a previous time.

Yet as genuinely marvelous as the Zoomification of social life was, it was hard not to wonder: How much had really been salvaged? Yes, there was something reassuring in being able to communicate with other people, but in the course of retaining our connections, we’d transformed all of human existence into a conference call, with all of the frustrations those entail: shaky connections, bad lighting, poor audio quality, confusion about whose turn it is to speak, the inherent alienation of communication mediated through screens. This was, at best, a kind of social limbo, and sometimes it felt like something worse. Hell is other people on Zoom.

The online space we’d moved into was almost certainly better than the alternatives available to us, and it came with tangible benefits. But it was a substitute experience, a simulacrum of human connection, an ersatz social space standing in for the real thing. We’d cobbled together imperfect replicas of our old lives, cramped into tiny boxes on computer screens.

In my last days in the old country, the one I used to live in, I visited the Columbia Room with several friends. The Washington, D.C., establishment is known for its elaborate liquid concoctions; in 2017, it was named best cocktail bar in the country. We spent the better part of the evening there, sitting close together, unconcernedly breathing each other’s air, and even sharing sips of drinks.

A bar like the Columbia Room isn’t just a delivery system for cocktails. With its tufted leather seating and its intricately tiled backbar mural, its plant-walled patio and ink-colored cabinets full of obscure booze, it is also a particular space, designed for comfort and socializing, for being near other people and enjoying conversation and company. It has, in other words, a vibe. Roughly a month later, that place—and every place like it—was closed.

The Columbia Room continued to serve cocktails to go, a legal innovation intended to ease the burden of the lockdowns on businesses and imbibers alike, but it wasn’t the same. It couldn’t be.

Producing take-out cocktails is “very different from the bartending that you’re used to,” says owner Derek Brown. “The main difference is the ritual and engaging with the customer. There’s a ritual to making a cocktail. There’s an interaction to it.” And with the Columbia Room closed to in-person business, that’s gone. “We’re very sad—sad is the only word—that we can’t do that right now,” Brown says.

That’s what we lost to the coronavirus: not the cocktails themselves, but the ability to share them. Not competitive sports, but the companionship of playing games together. Not movies, but the experience of seeing stories on a big screen surrounded by friends and strangers. In the new country, we were suddenly, terribly alone.

Among the most upsetting aspects of the lockdowns, especially for those who live in dense cities, was the closure of many public parks and green spaces. Most beachgoing was prohibited. In New York, playgrounds were shuttered and parts of Central Park were cordoned off. As the orders went out, Gov. Andrew Cuomo complained about crowding in public spaces, warning that although people should try to “walk around, get some sun,” there could be “no density, no basketball games, no close contact, no violation of social distancing, period, that’s the rule.” The message was clear: Stay away from each other.

In Washington, D.C., where I live, authorities blocked road access to the Tidal Basin in late March, as the city’s famous cherry blossoms reached the peak of their annual bloom. The National Arboretum was closed, and the city parks department spent the month of April tweeting the hashtag #StayHomeDC; all the facilities the agency oversaw remained closed.

Officially, nature was more or less off-limits, just as spring arrived. Yet as the weather warmed, and the light lingered later and later into the evening, people emerged from their homes. The streets, mostly emptied of vehicle traffic, created space for runners, allowing them to leave the sidewalks for families and dog walkers. In my neighborhood, a small private park, nestled behind a block of houses and maintained by the community, became a place to stretch out and read a book under the sun.

Before the virus, I didn’t go to the park very often. But in this new country, I found myself visiting more frequently, sometimes in the middle of the day. And so, I noticed, were my neighbors.

People sat in the grass and spread out picnics, walked their dogs, played catch with their kids. The park never became genuinely crowded, but it was always populated, a place where you could see other people and, at an appropriate distance, be reminded of their existence. Somehow, going to the park on a sunny afternoon had become an act of solidarity, of necessity, of rebellion. We were all alone in this strange time, this familiar yet deeply foreign place where the authorities had told everyone to stay apart. But at least we had found a way to be alone together.

from Latest – Reason.com https://ift.tt/2UeE7Qq
via IFTTT

A Conspiracy Theorist Confesses

A Conspiracy Theorist Confesses

Tyler Durden

Sat, 06/06/2020 – 00:00

Authored by Iain Davis via Off-Guardian.org,

I am what the general population, politicians and the mainstream media (MSM) would call a conspiracy theorist. While I don’t agree with their definition of the term, there’s not much point in me denying it. It is applied to me, and millions like me, whether we like it or not.

For those who deem conspiracy theorists to be some sort of threat to society, we are the social and political malcontents who lack reason and hate our democratic way of life. We are trolls, bots and disinformation agents on social media, probably employed by the Russians, the Chinese or Iranians.

We are supposedly hellbent on sowing the seeds of discontent and can be found protesting against every government policy and decision. Alternatively, we are arrogant fools, both anti-science and evidence averse, who trot out crazy theories based upon little knowledge and no evidence. Apparently this is a very dangerous thing.

Thus we come to the glaring contradiction at the heart of the concept of the loony conspiracy theorist. Conspiracy theorists are imbeciles who don’t have any proof to back up anything they say, yet simultaneously dangerous subversives who threaten to destabilise democracy and foment chaos.

Which is it? It can’t be both. Unless society is so fragile it cannot withstand the opinions of idiots.

So where does the idea that fools present a threat to “our way of life” come from? What is it that the conspiracy theorists say that is so dangerous? Why do their opinions seemingly need to be censored? What are governments so worried about?

WHAT IS A CONSPIRACY THEORY?

Some definitions are required here. From the Cambridge online English dictionary we have:

Misinformation: [noun] wrong information, or the fact that people are misinformed.

Disinformation: [noun] false information spread in order to deceive people.

Fake News: [noun] false stories that appear to be news, spread on the internet or using other media, usually created to influence political views or as a joke.

Conspiracy: [noun] the activity of secretly planning with other people to do something bad or illegal.

Theory: [noun] a formal statement of the rules on which a subject of study is based or of ideas that are suggested to explain a fact or event.

Conspiracy Theory: [noun] a belief that an event or situation is the result of a secret plan made by powerful people

It is notable that Cambridge University Press have introduced the concept of “secret” into their definition. By describing something as secret you are suggesting that it is impossible to know what it is. This added notion of secrecy is not commonly found in other dictionaries.

Nor is it present in the legal definition of conspiracy. Black’s Law Dictionary defines conspiracy as:

Conspiracy: In criminal law. A combination or confederacy between two or more persons formed for the purpose of committing, by their joint efforts, some unlawful or criminal act.

Obviously conspirators would like to keep their plans hidden. But that doesn’t mean they always remain so. If all conspiracies were “secrets” nobody would ever discover any of them.

Known conspiracies, such as Operation Gladio, Iran Contra, the Lavon Affair, the 2001 anthrax letter hoax and so on, would not have been exposed had people not highlighted the evidence which proved their existence.

The notion of the “secret conspiracy” is not one most people called conspiracy theorists would recognise. Often the whole point of our argument is that the conspiracies can be quite plainly evidenced. Most of that evidence is in the public domain and freely available.

More often conspiracy theorists are concerned with the denial or obfuscation of the evidence. It is not that the evidence doesn’t exist, rather that it either isn’t reported at all or is hidden by labelling those who do report it conspiracy theorists.

We can define “conspiracy theory” simply to mean: the reporting of evidence indicating a plan between two or more people to commit an illegal or nefarious act.

We can add that a conspiracy theory is an opinion or an argument, the merit of which is defined solely by the strength or weakness of the evidence.

However, if you read Wikipedia a very different definition is suggested. Suddenly conspiracy theory means an attempt to ignore other more plausible explanations. It is a theory based upon prejudice or insufficient evidence; it resists falsification and suffers from circular reasoning. It has left the realms of logical deduction and become a matter of faith.

This rationale is some distance away from the dictionary and legal definitions. It relies heavily upon opinion and is highly subjective. It is a pejorative definition which claims to be based in science, though the scientific evidence is feeble to non-existent.

This depiction of the delusional conspiracy theorist, as described by Wikipedia, is the popularly accepted meaning. Perhaps we can agree, the narrative we are given about alleged conspiracy theorists broadly runs like this:

Conspiracy theorists forward arguments that are unfounded. These are based upon limited knowledge and lack substantiating evidence. Most conspiracy theorists are simply wrong and unwittingly spread misinformation. However, prominent conspiracy theorists spread disinformation and have used their large followings on the Internet to create a dangerous phenomenon called ‘fake news.’

Many of those with the largest followings are agents for foreign powers. They use a global network of trolls and bots to advance their dangerous political agenda. This is designed to undermine our democratic way of life and valued political institutions. Therefore all conspiracy theory is anti-democratic and must be stopped.

It is difficult to understand how democracies, which supposedly value freedom of thought, speech and expression, can be threatened by diversity of opinion. Yet it appears many people are willing to ignore this contradiction and support government attempts to censor information and silence the voices of those it labels conspiracy theorist. Which is genuinely anti-democratic.

Consequently it has become relatively straightforward for politicians and the media to refute evidence and undermine arguments. As long as they can get the label of conspiracy theory or theorist to stick, most people will discount their arguments without ever looking at the evidence.

The label of conspiracy theorist is an umbrella term for a huge array of ideas and beliefs. Some are more plausible than others. However, by calling everyone who challenges accepted norms a “conspiracy theorist” it is possible to avoid addressing the evidence some offer by exploiting guilt by association.

For example, many people labelled as conspiracy theorists, myself included, believe even the most senior elected politicians are relatively low down the pecking order when it comes to decision making. We suggest powerful global corporations, globalist think tanks and international financial institutions often have far more control over policy development than politicians. We can cite academic research to back up this identification of “Biased Pluralism.”

We do not believe the Earth is flat or the Queen is a lizard. Yet because we believe that unelected corporate and financial power often outweighs elected politicians, the political class, mainstream academia and the media insist that we must also believe such nonsense.

Psychology is often cited as evidence to prove conspiracy theorists are deranged, or at least emotionally disturbed in some way. Having looked at some of this claimed science I found it to be rather silly and anti-scientific. But that is just my opinion.

However, unlike many of the psychologists who earn a living by writing junk science, I do not think they should be censored nor stopped from expressing their unscientific opinions. Yet governments across the world are seemingly desperate to exploit the psychologists’ ‘work’ to justify silencing the conspiracy theorists.

This desire to silence people who ask the wrong questions, by labelling all as conspiracy theorists, has been a common theme from our elected political leaders during the first two decades of the 21st century. But where did this idea come from?

THE HISTORY OF THE CONSPIRACY THEORIST LABEL

Conspiracy theory is nothing new. Nearly every single significant world event had at least one contemporary conspiracy theory attached to it. These alternative interpretations of events, which lie outside the accepted or official narratives, are found throughout history.

In 117 CE, the Roman Emperor Trajan died only two days after adopting his successor Hadrian. All his symptoms indicated a stroke brought on by cardiovascular disease.

Yet by the 4th century, in the questionable historical text Historia Augusta, a number of conspiracy theories surrounding Trajan’s death had emerged. These included claims that Trajan had been poisoned by Hadrian, the praetorian prefect Attianus and Trajan’s wife, Plotina.

While we would call this a conspiracy theory today, the term was not commonly used until the late 1960s. The earliest written reference to something approaching the modern concept of conspiracy theory appeared in the 1870s in the Journal of Mental Science, vol. 16.

“The theory of Dr Sankey as to the manner in which these injuries to the chest occurred in asylums deserved our careful attention. It was at least more plausible that the conspiracy theory of Mr Charles Beade”

This is the first time we see an association made between “conspiracy theory” and implausibility. Throughout most of the 19th and 20th century, if used at all, it usually denoted little more than a rationale to expose a criminal plot or malevolent act by a group.

After the Second World War colloquial use of “conspiracy theory” was rare. However, academics were beginning to lay the foundations for the interpretation which has produced the label we are familiar with today.

The burgeoning idea was that the large numbers of people who questioned official accounts of events, or orthodox historical interpretations, were all delusional to some degree. Questioning authority, and certainly alleging that authority was responsible for criminal acts, was deemed to be an aberration of the mind.

Karl Popper

In 1945 the philosopher Karl Popper alluded to this in his political work The Open Society and Its Enemies. Popper was essentially criticising historicism. He stated that historical events were vulnerable to misinterpretation by those who were predisposed to see a conspiracy behind them.

He argued this was because historians suffered from cognitive dissonance (the uncomfortable psychological sensation of holding two opposing views simultaneously). They could not accept that tumultuous events could just happen through the combination of error and unrelated circumstances.

In Popper’s view, these historians were too quick to reject the possibility of random, chaotic events influencing history, preferring unsubstantiated conspiratorial explanations. Usually because they made better stories, thereby garnering more attention for their work.

Popper identified what he called the conspiracy theory of society. This reflected Popper’s belief that social sciences should concern themselves with the study of the unintended consequences of intentional human behaviour. Speaking of the conspiracy theory perspective, he wrote:

“It is the view that an explanation of a social phenomenon consists in the discovery of the men or groups who are interested in the occurrence of this phenomenon (sometimes it is a hidden interest which has first to be revealed), and who have planned and conspired to bring it about.”

Popper also believed that increasing secularism had led people to ascribe power to secretive groups rather than the gods:

“The gods are abandoned. But their place is filled by powerful men or groups – sinister pressure groups whose wickedness is responsible for all the evils we suffer from – such as the Learned Elders of Zion, or the monopolists, or the capitalists, or the imperialists.”

Popper’s theory illustrates the fundamental difference between those labelled conspiracy theorists and those who, on the whole, defend the official narrative and the establishment. For conspiracy theorists the evidence shows that powerful forces have frequently conspired to shape events, control the flow of information and manipulate society. The deliberate engineering of society, suggested by the conspiracy theorists, is rejected by their opponents and critics.

For them the conspiratorial view has some minor, limited merit, but the suggested scale and prevalence of these plots is grossly exaggerated. They see nearly all world events as the result of the unintentional collision between disparate forces and the random influence of fate.

In general, they consider the powerful incapable of malice. Where disastrous national and global events have clearly been caused by the decisions of governments, influential groups and immensely wealthy individuals, these are invariably seen as mistakes.

Any suggestion that the power hierarchy’s destructive decisions may have achieved their intended objectives receives blanket rejection. Even asking the question is considered “unthinkable.”

For many people called conspiracy theorists this is a hopelessly naive world view. History is full of examples of the powerful using their influence to further their own interests at others’ expense, often costing people their lives.

For their opponents, like Popper, to reject this possibility outright, demonstrates their cognitive dissonance. They seem unable even to contemplate the possibility that the political and economic power structures they believe in could ever deliberately harm anyone. They have faith in authority and it is not shared by people they label conspiracy theorists.

Following the assassination of President Kennedy in 1963, alternative explanations proliferated, not least of all due to the apparent implausibility of the official account. Many U.S. citizens were concerned that elements within their own government had effectively staged a coup. Others, such as the prominent American historian Richard Hofstadter, were more concerned that people doubted their government.

Richard Hofstadter

Building on the work of Popper, partly as a critique of McCarthyism but also in response to the Republican nomination loss of Nelson A. Rockefeller, American historian Richard Hofstadter suggested that people’s inability to believe what they are told by government was not based upon their grasp of the evidence. Rather it was rooted in psychological need.

He claimed much of this stemmed from their lack of education (knowledge), political disenfranchisement and an unjustified sense of self importance. He also suggested these dangerous opinions threatened to pollute the body politic.

Like Popper, Hofstadter did not identify conspiracy theorists directly. But he did formulate the narrative underpinning the modern, widely accepted, definition. He wrote:

I call it the paranoid style simply because no other word adequately evokes the sense of heated exaggeration, suspiciousness, and conspiratorial fantasy that I have in mind… It is the use of paranoid modes of expression by more or less normal people that makes the phenomenon significant.

[…]

Of course, there are highbrow, lowbrow, and middlebrow paranoids, as there are likely to be in any political tendency. But respectable paranoid literature not only starts from certain moral commitments that can indeed be justified but also carefully and all but obsessively accumulates “evidence.” … [H]e can accumulate evidence in order to protect his cherished convictions.

Like most critics of so-called conspiracy theorists, Hofstadter went to great lengths to frame the “paranoid’s” tendency to highlight the evidence as if that were a failing, yet he chose neither to address nor even mention what that evidence was. He merely asserted that it was unbelievable. The reader just had to take his word for it.

The Warren Commission Report into the JFK assassination drew considerable criticism. The finding that Oswald acted alone contradicted numerous eye witness accounts, film, autopsy and ballistic evidence.

Four of the seven commissioners harshly criticised the report issued in their name. With the report widely seen as quite ridiculous, and in the absence of any sensible official account of the assassination, numerous explanatory theories inevitably sprang up.

In response to the mounting criticism, in 1967 the CIA sent an internal dispatch to all field offices called Document 1035-960: Concerning Criticism of the Warren Report.

Revealed by a New York Times Freedom of Information Request in 1976, the dispatch is the first written record we have of the combination of Popper’s “conspiracy theory of society” with Hofstadter’s “paranoid style” militant. It defined the modern concept of the conspiracy theorist.

The document states:

“Conspiracy theories have frequently thrown suspicion on our organization, for example by falsely alleging that Lee Harvey Oswald worked for us. The aim of this dispatch is to provide material countering and discrediting the claims of the conspiracy theorists.”

It can be considered as the origin of the weaponised term “conspiracy theory.” It recommends a set of techniques to be used to discredit all critics of the Warren Commission Report. Once you are familiar with them, it is obvious that these strategies are commonly deployed today to dismiss all who question official statements as “conspiracy theorists.” We can paraphrase these as follows:

  1. Deny any new evidence offered and cite only official reports stating ‘no new evidence has emerged.’

  2. Dismiss contradictory eyewitness statements and focus upon the existing, primary, official evidence such as ballistics, autopsy, and photographic evidence.

  3. Do not initiate any discussion of the evidence and suggest that large scale conspiracies are impossible to cover up in an open and free democracy.

  4. Accuse the conspiracy theorists of having an intellectual superiority complex.

  5. Suggest that theorists refuse to acknowledge their own errors.

  6. Refute any suggestion of witness assassinations by pointing out they were all deaths by natural causes.

  7. Question the quality of conspiracy research and point out that official sources are better.

The report recommended making good use of “friendly elite contacts (especially politicians and editors)” and to “employ propaganda assets to [negate] and refute the attacks of the critics.”

The CIA advocated using mainstream media feature articles to discredit people labelled conspiracy theorists.

While the use of these methods has been refined over the years, the essential process of labelling someone a conspiracy theorist, while studiously avoiding any discussion of the evidence they highlight, is extremely common in the mainstream media today. We only need look at the reports about academics who questioned the government’s narrative about COVID-19 to see the techniques in operation.

The drive to convince the public to use only “official sources” for information has seen the rise of the fact checker.

These organisations, invariably with the support of government and corporate funding, are offered as the reliable sources which provide real facts. The facts they provide are frequently wrong and the fact checking industry has settled legal claims from those who challenged their disinformation.

People have been directed by the mainstream media to abandon all critical thinking. They just need to go to their government-approved fact-checker in order to be told the truth.

Providing the public believe the people labelled conspiracy theorists are crazy, ill-informed or agents for foreign powers, the mainstream media, politicians and other commentators can undermine any and all evidence they present. In keeping with the CIA’s initial recommendations, it is extremely unlikely that the evidence will ever be openly discussed but, if it is, it can be written off as “conspiracy theory.”

However, it isn’t just the mainstream media who use the conspiracy theorist label to avoid discussing evidence. Politicians, speaking on the world’s biggest political stage, have seized the opportunity to deploy the CIA’s strategy.

THREE SPEECHES, ONE AGENDA

Even for Prime Ministers and Presidents, addressing the General Assembly of the United Nations is a big deal. These tend to be big thematic speeches as the leader impresses their vision upon the gathered dignitaries and global media.

Yet, despite the fact that conspiracy theorists are supposed to be idiots who don’t know the time of day, global “leaders” have repeatedly used this auspicious occasion to single them out as one of the greatest threats to global security.

In November 2001 George W. Bush addressed the United Nations General Assembly with the following words:

“We must speak the truth about terror. Let us never tolerate outrageous conspiracy theories concerning the attacks of September the 11th; malicious lies that attempt to shift the blame away from the terrorists, themselves, away from the guilty. To inflame ethnic hatred is to advance the cause of terror.”

Even if you accept the official account of 9/11, and there are numerous reasons why you wouldn’t, how does questioning it suggest that you support terrorism or mark you out as a racist?

The suggestion appears absurd but it does illustrate that the U.S. president wanted both to silence all criticism of the government account and link those questioning it to extremism and even terrorism.

This theme was reiterated by the UK Prime Minister David Cameron in his 2014 address. He said:

“To defeat ISIL – and organisations like it – we must defeat this ideology in all its forms… it is clear that many of them were initially influenced by preachers who claim not to encourage violence, but whose world view can be used as a justification for it. We know this world view. The peddling of lies: that 9/11 was a Jewish plot or that the 7/7 London attacks were staged […] We must be clear: to defeat the ideology of extremism we need to deal with all forms of extremism – not just violent extremism. We must work together to take down illegal online material […] we must stop the so called non-violent extremists from inciting hatred and intolerance.”


Like Bush before him, Cameron was at pains to identify what he called non-violent extremists (commonly called conspiracy theorists). According to him, all who question government accounts of major geopolitical events are, once again, tantamount to terrorists.

Cameron called for online censorship to stop such questions ever being asked, and it is this authoritarian need to avoid addressing evidence that led his successor, Prime Minister Theresa May, to propose wide-sweeping censorship of the Internet.

At the time of writing, the UK is among the many nations still in so-called “lockdown” following the outbreak of COVID-19. When UK Prime Minister Boris Johnson addressed the U.N. General Assembly in September 2019 he delivered a speech which seemed weirdly out of context. With Brexit and possible conflict with Iran high on the agenda, his address, which barely touched on those issues, was received with considerable bewilderment.

Six months later his predictive powers appear to be remarkable. It transpires that Johnson’s comments were extremely relevant. Just six months too early.

“There are today people today who are actually still anti-science […] A whole movement called the anti-Vaxxers, who refuse to acknowledge the evidence that vaccinations have eradicated smallpox […] And who by their prejudices are actually endangering the very children they want to protect […] I am profoundly optimistic about the ability of new technology to serve as a liberator and remake the world wondrously and benignly […] Together, we can vanquish killer diseases.”

Despite the wealth of scientific evidence which justifies scepticism about some vaccines, “anti-vaxxer” (a variant of conspiracy theorist) is another label used to convince people not to consider evidence. The assertion is that those who question vaccines all fundamentally reject the concept of artificially inducing an immune response against a disease.

This isn’t true but how would you know? The anti-vaxxer label alone is sufficient to convince most to turn away.

Johnson’s speech rambled across so many seemingly irrelevant subjects there is little reason to suspect any COVID-19 foreknowledge. But given the global pandemic that would occur just a few months later, it was certainly prescient. Johnson was sufficiently concerned about the supposedly baseless questions of so-called conspiracy theorists (or anti-vaxxers) to allege they killed children. A ludicrous suggestion the mainstream media strongly promoted.

It doesn’t matter that academic research has proven that the official account of 9/11 cannot possibly be true; it makes no difference that Mossad agents admitted that they had gone to New York on the morning of 9/11 to “document the event;” studies showing that approximately 90% of the total 20th Century disease reduction in the U.S. occurred prior to the widespread use of vaccines are irrelevant.

None of these facts need to be known by anyone and governments are going to censor all who try to tell others about them. All questions that reference them are crazy conspiracy theories. They are both stupid questions and a huge threat to both national security and the safety of the little children.

One of the recurring themes the people labelled conspiracy theorists discuss is that policy is made behind the closed doors of corporate boardrooms and policy think tanks. It doesn’t matter who you elect or what party you choose to rule over you, they are only capable of tinkering at the edges of the policy platform.

The policy agenda is set at a globalist level. So the fact that, over two decades, one U.S. president and two British Prime Ministers were delivering essentially the same message doesn’t surprise the conspiracy theorists.

As we move toward a world where certain ideas are forbidden and only officially approved questions can be asked, where governments and corporations have a monopoly on the truth and everything else is a conspiracy theory, only one thing really matters. The evidence.

Hofstadter believed that his paranoid-style militant’s constant citation of evidence was merely an attempt to “protect his cherished convictions.” This could be true, but the only way to find out is to look at that evidence. The label of the conspiracy theorist has been deliberately created in order to convince you not to look at it.

Regardless of whether or not you think someone’s opinion is a conspiracy theory, you owe it to yourself and your children to consider the evidence they cite. Perhaps you will reject it. There’s nothing wrong with that.

But to reject it, without knowing what it is, really is crazy. Your only other option is to unquestioningly accept whatever you are told by the government, globalist think tanks, multinational corporations and their mainstream media partners.

If you choose to believe that all who claim to have identified the malfeasance of officials, the crimes of government or the corruption of powerful global institutions are conspiracy theorists, then you have accepted that the establishment is beyond reproach.

If you also agree the same established hierarchy can not only determine what you can or cannot know, but can also set all the policies and legislation which dictates your behaviour and defines the limits of your freedom, you have elected to be a slave and don’t value democracy in the slightest.

via ZeroHedge News https://ift.tt/2z8xxUp Tyler Durden

Engineering The Perfect Human? Biotech Examines Rare DNA In Himalayan People

Engineering The Perfect Human? Biotech Examines Rare DNA In Himalayan People

Tyler Durden

Fri, 06/05/2020 – 23:40

Variant Bio has spent the last several years scouring the world for genetic outliers in human beings. It found a small group of “outlier humans” with special variations in their DNA that could affect disease risk and eventually be used to develop medicines to improve human life. 

Stephane Castel, lead geneticist at the 10-person startup, which was founded in New York, is focusing on the DNA of Sherpa people living at high altitudes in Nepal and the Himalayas. Their unique genetic characteristics allow them to live healthy lives with blood oxygen levels far below what most humans need. Most people at high altitudes suffer from hypoxia, which is the absence of enough oxygen in the tissues to sustain bodily functions. 

“They [Sherpa people] don’t suffer any ill health effects,” Castel told Bloomberg. “It’s incredible.” 

Sherpa village 

Castel’s team is betting on the sequencing of Sherpa DNA, which could lead to discoveries of new superior traits that would aid in the development of novel medicines and therapies to improve metabolism, eyesight, and immune response.

It’s up to Variant’s software and scientific analysis to find breakthrough genetic coding in Sherpa DNA. Castel said it could take several years to develop drugs and therapies based on the results. 

Sherpa man

Variant Bio recently received a $16 million capital infusion from venture firm Lux Capital to pursue the research. 

Josh Wolfe, the co-founder of Lux Capital, said:

“Wouldn’t it be amazing if some secrets of human health were possessed by these small groups of people [Sherpa people], and they could ultimately benefit the rest of the world?” 

Variant’s new CEO, Andrew Farnum, previously managed the $2 billion investment arm of the Bill & Melinda Gates Foundation that concentrated on global health and infectious diseases. 

“There are huge advantages here for drug discovery,” Farnum said while referring to Variant’s research. 

 “People will see how seriously we take this and how we conduct our projects, and other populations will want to work with us,” he said. 

Variant has held talks with Sherpa village leaders and Nepal’s research council to negotiate deals for DNA extraction with locals. 

Keolu Fox, a genome scientist and an assistant professor at the University of California at San Diego, said Variant must compensate Sherpa people for their DNA:

“If the people don’t get a cut, this is colonial,” Fox said. “It’s extractive capitalism.” 

Variant has taken the approach that human genetics has the power to transform drug development. Perhaps the startup is on to something by examining the superior genes possessed by Sherpa people. 

Could this company be in the early innings of engineering perfect humans? 

via ZeroHedge News https://ift.tt/3cDTWGX Tyler Durden

“Unreported Truths” – This Is The COVID-19 Book That Amazon ‘Quarantined’

“Unreported Truths” – This Is The COVID-19 Book That Amazon ‘Quarantined’

Tyler Durden

Fri, 06/05/2020 – 23:20

Via RealClearInvestigations.com,

Former New York Times reporter Alex Berenson has developed a wide following on Twitter for detailed posts that challenge some mainstream reporting and government declarations about COVID-19. 


Thursday morning he tweeted that Amazon had refused to offer for sale his self-published book, “Unreported Truths about COVID-19 and Lockdowns Part 1: Introduction and Death Counts and Estimates.”

Among those responding in outrage over what they called blatant censorship were SpaceX CEO Elon Musk (“This is insane @JeffBezos”) and journalist Glenn Greenwald. And late Thursday Berenson reported that Amazon had backed off and is now offering the book for sale on Kindle.

Before Amazon reversed itself (calling its earlier move an “error,” according to Fox News), RealClearInvestigations asked the award-winning novelist to elaborate on his experience. Here’s his response, followed by an excerpt from the book: 

By Alex Berenson
June 4, 2020

The booklet was the first in a series of coronavirus pamphlets I plan to put out covering various aspects of the crisis. Readers of my Twitter feed encouraged me to compile information in a more comprehensive and easier-to-read format, and when I polled people on Twitter to ask if they would be willing to pay a nominal fee for such a pamphlet, the response was strong.

Originally I only planned to write one, but I had so much information I realized that the booklet would be an awkward length – longer than a magazine article but shorter than a book.  Also, doing so would take too long, and I wanted to put it out quickly. So I decided to split the booklet into pieces. Part 1 included an introduction and a discussion of death coding, death counts, and who is really dying from COVID, as well as a worst-case estimate of deaths with no mitigation efforts.

It is about 6,500 words, and I planned to sell it for $2.99 as an ebook or $5.99 for a paperback. It is called “Unreported Truths about COVID-19 and Lockdowns: Part 1, Introduction and Death Counts and Estimates.”

I created covers for both and uploaded the book. I had published Kindle Singles (Amazon’s curated program for short Kindle pieces, which now focuses more on fiction from established writers), so I was relatively familiar with the drill. I briefly considered censorship but assumed I wouldn’t have a problem because of my background, because anyone who reads the booklet will realize it is impeccably sourced, nary a conspiracy theory to be found, and frankly because Amazon shouldn’t be censoring anything that doesn’t explicitly help people commit criminal behavior. (Books intended to help adults groom children for sexual relationships, for example, should be off-limits – though about 10 years ago Amazon did not agree and only backed down from selling a how-to guide for pedophiles in the face of public outrage.) 

I didn’t hear anything until this morning, when I found the note I posted to Twitter in my inbox (shown below).

Note that it does not offer any route to appeal. I have no idea if the decision was made by a person, an automated system, or a combination (i.e. the system flags anything with COVID-19 or coronavirus in the title and then a person decides on the content). I am considering my options, including making the booklet available on my Website and asking people to pay on an honor system, but that will not solve the problem of Amazon’s censorship. Amazon dominates both the electronic and physical book markets, and if it denies its readers a chance to see my work, I will lose the chance to reach the people who most need to learn the truth – those who don’t already know it.

Here are the first 1,000 words of Chapter 1:

Maybe the most important questions of all:

How lethal is SARS-COV-2?

Whom does it kill?

Are the death counts accurate – and, if not, are they over- or understated?

Estimates for the lethality of the coronavirus have varied widely since January. Early Chinese data suggested the virus might have an “infection fatality rate” as high as 1.4 – 2 percent.

A death rate in that range could mean the coronavirus might kill more than 6 million Americans, although even under the worst-case scenarios some people would not be exposed, and others might have natural immunity that would prevent them from being infected at all.

As we have learned more about the virus, estimates of its lethality have fallen. Calculating fatality rates is complex, because despite all of our testing for COVID, we still don’t know how many people have been infected.

Some people who are infected may have no or mild symptoms. Even those with more severe symptoms may resist going to the hospital, then recover on their own. We have a clear view of the top of the iceberg – the serious infections that require hospitalization – but at least in the early stages of the epidemic we have to guess at the mild, hidden infections.

But to calculate the true fatality rate, we need to know the true infection rate. If 10,000 people die out of 100,000 infections, that means the virus kills 10 percent of all the people it infects – making it very, very dangerous. But if 10,000 people die from 10 million infections, the death rate is actually 0.1 percent – similar to the flu.
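
To make that arithmetic concrete, here is a minimal sketch in Python of the infection fatality rate calculation the excerpt describes; the two scenarios are the hypothetical figures quoted above, and the function name is purely illustrative:

    def infection_fatality_rate(deaths, infections):
        # Fraction of infected people who die: deaths divided by the true number of infections.
        return deaths / infections

    # The two hypothetical scenarios from the paragraph above:
    print(infection_fatality_rate(10_000, 100_000))     # 0.1   -> 10%, very dangerous
    print(infection_fatality_rate(10_000, 10_000_000))  # 0.001 -> 0.1%, roughly flu-like

The difficulty, as the excerpt goes on to explain, is the denominator: the true number of infections is unknown and has to be estimated, for example from randomized antibody surveys.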

Unfortunately, figuring out the real infection rate is very difficult. Probably the best way is through antibody tests, which measure how many people have already been infected and recovered – even if they never were hospitalized or even had symptoms. Studies in which many people in a city, state, or even country are tested at random to see if they are currently infected can also help. Believe it or not, so can tests of municipal sewage. (I’ll say more about all this later, in the section on transmission rates and lockdowns.)

For now, the crucial point is this: randomized antibody tests from all over the world have repeatedly shown many more people have been infected with coronavirus than is revealed by tests for active infection. Many people who are infected with SARS-COV-2 don’t even know it.

So the hidden part of the iceberg is huge. And in turn, scientists have repeatedly reduced their estimates for how dangerous the coronavirus might be.

The most important estimate came on May 20, when the Centers for Disease Control reported its best estimate was that the virus would kill 0.26 percent of people it infected, or about 1 in 400 people. (The virus would kill 0.4 percent of those who developed symptoms. But about one out of three people would have no symptoms at all, the CDC said.) (https://www.cdc.gov/coronavirus/2019-ncov/hcp/planning-scenarios.html#box.)

Similarly, a German study in April reported a fatality rate of 0.37 percent (https://www.technologyreview.com/2020/04/09/999015/blood-tests-show-15-of-people-are-now-immune-to-covid-19-in-one-town-in-germany/). A large study in April in Los Angeles predicted a death rate in the range of 0.15 to 0.3 percent.

Some estimates have been even lower. Others have been somewhat higher – especially in regions that experienced periods of severe stress on their health care systems. In New York City, for example, the death rates appear somewhat higher, possibly above 0.5 percent – though New York may be an outlier, both because it has counted deaths aggressively (more on this later) and because its hospitals seem to have used ventilators particularly aggressively.

Thus the CDC’s estimate for deaths is probably the best place to begin. Using that figure along with several other papers and studies suggests the coronavirus has an infection fatality rate in the range of 0.15 percent to 0.4 percent.

In other words, SARS-COV-2 likely kills between 1 in 250 and 1 in 650 of the people whom it infects. Again, though, not everyone who is exposed will become infected. Some people do not contract the virus, perhaps because their T-cells – which help the immune system destroy invading viruses and bacteria – have already been primed by exposure to other coronaviruses. [Several other coronaviruses exist; the most common versions usually cause minor colds in the people they infect.] An early May paper in the journal Cell suggests that as many as 60 percent of people may have some preexisting immune response, though not all will necessarily be immune. (https://www.cell.com/cell/pdf/S0092-8674(20)30610-3.pdf).

The experience of outbreaks on large ships such as aircraft carriers and cruise liners also show that some people do not become infected. The best estimates are that the virus probably can infect somewhere between 50 to 70 percent of people. For example, on one French aircraft carrier, 60 percent of sailors were infected (none died and only two out of 1,074 infected sailors required intensive care). https://www.navalnews.com/naval-news/2020/05/covid-19-aboard-french-aircraft-carrier-98-of-the-crew-now-cured/

Thus – in a worst-case scenario – if we took no steps to mitigate its spread or protect vulnerable people, a completely unchecked coronavirus might kill between 0.075 and 0.28 percent of the United States population – between 1 in 360 and 1 in 1,300 Americans.
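
As a rough check on that worst-case range, here is a minimal sketch in Python combining the fatality range quoted earlier (0.15 to 0.4 percent of those infected) with the excerpt’s estimate that 50 to 70 percent of people can be infected; the variable names are illustrative:

    # Ranges quoted in the excerpt
    ifr_low, ifr_high = 0.0015, 0.004      # kills 0.15% to 0.4% of those it infects
    attack_low, attack_high = 0.50, 0.70   # can infect 50% to 70% of the population

    # Worst-case share of the whole population that dies, with no mitigation at all
    low = ifr_low * attack_low             # 0.00075 -> 0.075%, about 1 in 1,300
    high = ifr_high * attack_high          # 0.0028  -> 0.28%,  about 1 in 360
    print(f"{low:.3%} to {high:.2%} of the population")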

This is higher than the seasonal flu in most years. Influenza is usually said to have a fatality rate among symptomatic cases of 1 in 1,000 and an overall fatality rate of around 1 in 2,000. However, influenza mutates rapidly, and its dangerousness varies year by year. The coronavirus appears far less dangerous than the Spanish flu a century ago, which was commonly said to kill 1 in 50 of the people it infected.

It appears more comparable in terms of overall mortality to the influenza epidemics of 1957 and 1968, or the British flu epidemics of the late 1990s. (Of course, the United States and United Kingdom did not only not shut down for any of those epidemics, they received little attention outside the health-care system.)

via ZeroHedge News https://ift.tt/3eWLV1i Tyler Durden

Iran’s Nuclear Stockpile Rose By Over 50% During Three Months Of COVID Lockdown: IAEA

Iran’s Nuclear Stockpile Rose By Over 50% During Three Months Of COVID Lockdown: IAEA

Tyler Durden

Fri, 06/05/2020 – 23:00

Over the past three months of global lockdowns due to the coronavirus, Iran’s nuclear development has apparently been busy. The International Atomic Energy Agency (IAEA) on Friday circulated a confidential report, seen by the Associated Press, which details that the Islamic Republic’s nuclear stockpile rose by more than 50% in the three months prior to May 20.

“The agency said that as of May 20, Iran’s total stockpile of low-enriched uranium amounted to 1,571.6 kilograms (1.73 tons), up from 1,020.9 kilograms (1.1 tons) on Feb. 19,” AP reports. 
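
A quick check of the “over 50%” figure from the stockpile numbers just quoted (a minimal sketch in Python; the variable names are illustrative):

    feb_stockpile_kg = 1020.9   # as of Feb. 19, per the IAEA figures cited by AP
    may_stockpile_kg = 1571.6   # as of May 20

    increase = (may_stockpile_kg - feb_stockpile_kg) / feb_stockpile_kg
    print(f"Stockpile grew by {increase:.1%} in three months")  # roughly 53.9%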


Tehran months ago vowed to break from its commitments under the Obama-brokered JCPOA until the biting American-led sanctions regime is eased. Officials warned Europe that if action weren’t taken, major nuclear milestones would be reached. 

This has also included enriching uranium up to a purity of 4.5%, blowing past the 3.67% ceiling stipulated under the JCPOA. The IAEA report also noted “with serious concern that, for over four months, Iran has denied access to Agency… to two locations.”

Recently, reports suggested IAEA inspectors were being blocked in part because of Iranian authorities’ concerns over the coronavirus pandemic.

“The [IAEA] director general calls on Iran immediately to cooperate fully with the agency, including by providing prompt access to the locations specified,” the report said. However, like other warnings, this is likely to fall on deaf ears, given that Iran’s economy has been decimated and, at an already sensitive moment, a large share of the population and a failing health system have been ravaged by the COVID-19 pandemic. 

The report comes a mere day after President Trump suggested there could be a fresh opening with Iran to ‘negotiate a better deal’ – something he has long sought since withdrawing from the nuclear deal in May 2018. “Thank you to Iran, it shows a deal is possible!” Trump said in a rare positive tweet regarding the Islamic Republic, on the occasion of the return of Navy veteran Michael White from an Iranian prison. 

But the Iranians immediately poured cold water on that prospect, given Foreign Minister Javad Zarif reminded the US president, “we had a deal when you entered office.”

The Iranian position has been that it will never reenter negotiations until Washington removes sanctions, and returns to the 2015 deal it signed in the first place. 

via ZeroHedge News https://ift.tt/3cAH3xi Tyler Durden

“Somebody Cooked Up The Plot”: The Hunt For The Origins Of The Russia Collusion Narrative

“Somebody Cooked Up The Plot”: The Hunt For The Origins Of The Russia Collusion Narrative

Tyler Durden

Fri, 06/05/2020 – 22:40

Authored by John Solomon via JustTheNews.com,

Hollywood once gave us the Cold War thriller called “The Hunt for Red October.” And now the U.S. Senate and its Republican committee chairmen in Washington have launched a different sort of hunt made for the movies.

Armed with subpoenas, Sens. Lindsey Graham, R-S.C., and Ron Johnson, R-Wis., want to interrogate a slew of Obama-era intelligence and law enforcement officials, hoping to identify who invented and sustained the bogus Russia collusion narrative that hampered Donald Trump’s early presidency.

And while Graham and Johnson aren’t exactly Sean Connery and Alec Baldwin, they and their GOP cohorts have a theory worthy of a Tom Clancy novel-turned-movie: The Russia collusion investigation was really a plot by an outgoing administration to thwart the new president.

“What we had was a very quiet insurrection that took place,” Sen. Marsha Blackburn, the Tennessee Republican, told Just the News on Thursday as she described the theory of Senate investigators. “And there were probably dozens of people at DOJ and FBI that knew what was going on.

“But they hate Donald Trump so much … that they were willing to work under the cloak of law and try to use that to shield them so that they could take an action on their disgust,” she added. “They wanted to prohibit him from being president. And when he won, they wanted to render him ineffective at doing his job.”

For much of the last two years, the exact theory that congressional Republicans held about the bungled, corrupt Russia probe — where collusion between Donald Trump and Vladimir Putin was ultimately disproven and FBI misconduct was confirmed — was always evolving.

But after explosive testimony this week from former Deputy Attorney General Rod Rosenstein, who openly accused the FBI of keeping him in the dark about flaws, failures and exculpatory evidence in the case, the GOP believes it may prove the Russia case was a conspiracy to use the most powerful law enforcement and intelligence tools in America to harm Trump.

Two years of declassified memos are now in evidence that show:

  • The FBI was warned before it used Christopher Steele’s dossier as evidence to target the Trump campaign with a FISA warrant that the former British spy might be the target of Russian disinformation, that he despised Trump and that he was being paid to help Hillary Clinton’s campaign. But agents proceeded anyway.

  • The bureau was told by the CIA that its primary target, Trump adviser Carter Page, wasn’t a Russian spy but rather a CIA asset. But it hid that evidence from the DOJ and courts, even falsifying a document to keep the secret.

  • The FBI opened a case on Trump adviser George Papadopoulos on the suspicion he might arrange Russian dirt on Hillary Clinton but quickly determined he didn’t have the Russian contacts to pull it off. But the case kept going.

  • The FBI intercepted conversations between its informants and Papadopoulos and Page showing the two men made numerous statements of innocence, and kept that evidence from the DOJ and the courts.

  • The FBI investigated Trump national security adviser Michael Flynn for five months and concluded there was no derogatory evidence he committed a crime or posed an intelligence threat and recommended closing the case. But higher-ups overruled the decision and proceeded to interview Flynn.

  • The FBI and DOJ both knew by August 2017 there was no evidence of Trump-Russia collusion but allowed another 18 months of investigation to persist without announcing the president was innocent.

That is just a handful of the key evidentiary anchors of the storyline Republicans have developed. Now they want to know who helped carry out each of these acts.

“There are millions of Americans pretty upset about this,” Graham said this week. “There are people on our side of the aisle who believe this investigation, Crossfire Hurricane, was one of the most corrupt, biased criminal investigations in the history of the FBI. And we’d like to see something done about it.”

Graham tried on Thursday to win approval for 50-plus Senate Judiciary Committee subpoenas to witnesses but was forced to delay a week.

Johnson, meanwhile, successfully secured about three dozen subpoenas to get documents and interviews with key witnesses from his Senate Homeland Security and Governmental Affairs Committee.

Evidence is growing, Johnson said, that there was not a “peaceful and cooperative” transition between the Obama and Trump administrations in 2017.

“The conduct we know that occurred during the transition should concern everyone and absolutely warrants further investigation,” he said.

With Rosenstein’s testimony now behind them, the senators have some lofty targets for interviews or testimony going forward, including fired FBI Director James Comey, his deputy Andrew McCabe, ex-CIA Director John Brennan, and the former chiefs of staff for President Barack Obama and Vice President Joe Biden.

Blackburn said during an interview with the John Solomon Reports podcast that the goal of the subpoenas and witnesses was simple: to identify and punish the cast of characters who sustained a Russia collusion narrative that was never supported by the evidence.

“Somebody cooked up the plot,” she explained.

“Somebody gave the go-ahead to order, to implement it. Somebody did the dirty work and carried it out — and probably a lot of somebodies. And what frustrates the American people is that nobody has been held accountable.

“Nobody has been indicted. Nobody has been charged, and they’re all getting major book deals and are profiting by what is criminal activity, if you look at the statutes that are on the book, and if you say we’re going to abide by the rule of law and be a nation of laws.”

For Blackburn, identifying and punishing those responsible is essential for two goals: to deter anyone in the future from abusing the FBI and FISA process again, and to assure Americans that there isn’t a two-tiered system of justice in America.

“I think when you Google [Russia collusion] in future years, you’re going to see a screenshot of this cast of characters that cooked this up, because it is the ultimate plot,” Blackburn said.

via ZeroHedge News https://ift.tt/376bjPj Tyler Durden

The Fed Just Unleashed A Trillion In New Debt: Companies Took The Money And Spent It On Dividends While Firing Millions

The Fed Just Unleashed A Trillion In New Debt: Companies Took The Money And Spent It On Dividends While Firing Millions

Tyler Durden

Fri, 06/05/2020 – 22:23

It was all the way back in 2012 when we first described in “How The Fed’s Visible Hand Is Forcing Corporate Cash Mismanagement” that the era of ultra-cheap money unleashed by the Fed was encouraging corporations not to invest in capex, growth or a satisfied employee base, but to rush to spend it on cheap, short-term gimmicks such as buybacks and dividends, which benefit the company’s shareholders in the short term while rewarding management with bonuses for reaching stock-price milestones and vesting incentive compensation.

We concluded by saying that this was “the most insidious way in which the Fed’s ZIRP policy is now bleeding not only the middle class dry, but is forcing companies to reallocate cash in ways that benefit corporate shareholders at the present, at the expense of investing prudently for growth 2 or 3 years down the road.”

For years, nobody cared about what ended up being one of the most controversial aspects of capital mismanagement in a time of ZIRP/NIRP/QE. Then, after the coronavirus crisis, suddenly everyone cared, when it emerged that instead of prudently deploying capital into rainy-day funds, companies had been systematically syphoning cash out (usually by selling debt) to reward shareholders and management, confident that if a crisis struck the Fed would bail them out: after all, the Fed bailed out the banks in 2008, and by 2020 US corporate debt had reached $16 trillion, or over 75% of US GDP, making it a systemic risk and virtually assuring that expectations for a Fed bailout would be validated.

Sure enough, that’s precisely what happened.

But while none of this should come as a surprise to anyone following events over the past decade, what came next may be a shock: having created a massive debt bubble whose proceeds were used to make shareholders extremely rich at the expense of a miserable employee base and declining corporate viability, the Fed… doubled down and virtually overnight gave companies a green light to do everything they had done leading up to the current disaster.

In a Bloomberg exposé written by Bob Ivry, Lisa Lee and Craig Torres, the trio of reporters show how, eight years after we first laid out the “New Normal” capital-misallocation paradigm, we are back to square one: the Fed’s actions – which even former NY Fed president Bill Dudley admits amount to brazen moral hazard – have prompted a record debt binge, even as corporate borrowers fire millions of workers while using the debt to – drumroll – make shareholders richer.

Take food-service giant Sysco: just days after the Federal Reserve crossed the final line into centrally planned markets on March 23, when it pledged to openly purchase corporate debt, Sysco sold $4 billion of debt. Then, just a few days after that, the company announced plans to cut one-third of its workforce, more than 20,000 employees, even as dividends to shareholders would continue.

That process repeated itself in April and May as the coronavirus spread. The Fed’s promise juiced the corporate-bond market. Borrowing by top-rated companies shot to a record $1.1 trillion for the year, nearly twice the pace of 2019.

What happened then was a case study of why Fed-endorsed moral hazard is always a catastrophic policy… for the poor, while making the rich richer:

Companies as diverse as Sysco, Toyota Motor Corp., international marketing firm Omnicom Group Inc. and movie-theater chain Cinemark Holdings Inc. borrowed billions of dollars — and then fired workers.

Just two weeks ago, Fed chair Powell testified before Congress, and when asked why the Fed is buying investment grade and junk bond debt, Powell responded “to preserve jobs.” That was a blatant lie, because as Bloomberg notes, the actions of the companies that benefited from the Fed’s biggest ever handout called into question the degree to which the U.S. central bank’s promise to purchase corporate debt will help preserve American jobs.

Unlike the Small Business Administration’s Paycheck Protection Program, which has incentives for employers to keep workers on the job and is only a grant if the bulk of the proceeds are used to retain workers, the taxpayer-backed facilities that the Fed and Treasury Department created for bigger companies have no such requirements. In fact, to make sure the emergency programs help fulfill one of the Fed’s mandates – maximum employment – the central bank is simply crossing its fingers that restoring order to markets will translate to saving jobs.

Instead, what the Fed’s actions have unleashed so far is a new record debt bubble, with more than $1.1 trillion in new debt issuance in just the first five months of the year, even as companies issuing debt are quick to lay off millions!

“They could set conditions, say to companies, hire back your workers, maintain your payroll to at least a certain percentage of prior payroll, and we will help,” said Robert Reich, the former Secretary of Labor for President Bill Clinton who now teaches economics at the University of California, Berkeley. “It’s hardly clear that if you keep companies afloat they’ll hire employees.”

Just don’t tell the Fed Chair: in a May 29 webinar, Jerome Powell said that “really it’s all about creating a context, a climate, in which employees will have the best chance to either keep their job, or go back to their old job, or ultimately find a new job. That’s the point of this exercise.”

The exercise has failed, because as soon as the bailout funds expire, America will see a second wave of epic layoffs: the extra $600 a week in unemployment benefits that Congress approved in March stops on July 31, while the prohibition against firing workers in the $25 billion government rescue of U.S. airlines expires Sept. 30, and the biggest recipients have said they intend to shed employees after that date.

But where did all the hundreds of billions in newly issued debt go? Well, dividends for one. Without provisions for employees, “the credit assistance will tend to boost financial markets, but not the broad economic well-being of the great majority of the population,” Marcus Stanley, Americans for Financial Reform’s policy director, told Bloomberg.

Of course, when confronted with this reality, the Treasury Secretary did what he normally does: he lied.

“Our No. 1 objective is keeping people employed,” Mnuchin said during a May 19 Senate Banking Committee hearing after Senator Elizabeth Warren, a Massachusetts Democrat, accused him of “boosting your Wall Street buddies” at the expense of ordinary Americans. “What we put in the Main Street facility is that we expect people to use their best efforts to support jobs,” Mnuchin said.

The phrase “best efforts” echoes the original terms for the Main Street program, which required companies to attest they’ll make “reasonable efforts” to keep employees. The wording was subsequently changed to “commercially reasonable efforts,” which Jeremy C. Stein, chairman of the Harvard University economics department and a former Fed governor, called a welcome watering-down of expectations that the central bank would dictate employment policies to borrowers.

And while Stein said that it was “smart of them to weaken that”, what ended up happening is that companies entirely sidestepped preserving employees and rushed to cash out – guess who – shareholders once again.

But in keeping with the Fed’s overarching directive – that its programs are, in Powell’s words, about lending, not spending – once the Fed has triggered a new debt bubble with its explicit interventions in the secondary market, it has no control over what companies do with the virtually free funds:

“For the Fed to second-guess a corporate survival strategy would be a step too far for them,” said Adam Tooze, a Columbia University history professor. Putting explicit conditions on program beneficiaries would make the central bank “a weird hybrid of the Federal Reserve, Treasury, BlackRock and an activist stockholder,” he added, clearly unaware that we now live in a world in which this “new normal” Frankenstein monster is precisely who is in charge of capital markets, as the helicopter money resulting from the unholy merger of the Fed and Treasury is precisely what BlackRock is front-running, in its own words. But heaven forbid some of the trillions in new debt are used for employees…

And while tens of millions of jobs have been lost since March – today’s laughable and fabricated jobs report, in the BLS’ own admission, notwithstanding – there has been one clear beneficiary: the S&P 500 has jumped 38% since March 23, the day the Fed intervened, and on Friday the Nasdaq hit an all-time high. Observers of the stock market wonder how it could be so bullish at the same time as the country faces an avalanche of joblessness unsurpassed in its history.

The choices companies are making – choices which we correctly predicted back in 2012 – provide an answer.

Since selling $4 billion in debt on March 30, Sysco has amassed $6 billion of cash and available liquidity, enabling it to gobble up market share, while cutting $500 million of expenses, according to Chief Executive Officer Kevin Hourican. Sysco, which is based in Houston, will continue to pay dividends to shareholders, Chief Financial Officer Joel Grade said on a May 5 earnings call.

Countless other companies are also splurging on debt-funded dividends, while some – such as Apple and Amazon – are now issuing debt to fund their next multi-billion buyback program.

Of course, it’s not just investment-grade debt: the Fed is notoriously also active in the junk-bond space, buying billions in high-yield ETFs (which now hold bonds of bankrupt Hertz).

Movie theaters were one of the first businesses to close during the pandemic. Cinemark, which owns 554 of them, shut its U.S. locations on March 17. Three days later, the company paid a previously announced dividend. It has since said it will discontinue such distributions. Cinemark borrowed $250 million from the junk-bond market on April 13, the same day it announced the firing of 17,500 hourly workers. Managerial staff were kept on at reduced pay, according to company filings. Cinemark, which is based in Plano, Texas, said it plans to open its theaters in phases starting June 19.

The theater chain opted to go to the bond market over seeking funding from the government because “it didn’t come with any of the strings attached that government-backed facilities can include,” CEO Mark Zoradi said on the April 15 earnings call. It “was really no more complicated than that.” And why did Cinemark have no trouble accessing the bond market? Because with the Fed now buying both IG and HY bonds, there is no longer any credit risk, which is why spreads have collapsed back toward their pre-crisis lows; in effect the Fed is forcing investors to buy Cinemark’s bonds, and the company then uses the proceeds to pay shareholders a dividend or to buy back stock. As for the company’s employees? Why, they are expendable, and in a few months there will be millions of unemployed workers begging for work at or below minimum wage.

Win-win… for Cinemark’s management and shareholders. Lose-lose for everyone else.

Actually, win-win for all corporations: like Cinemark, Omnicom issued $600 million in bonds in late March. In an April 28 conference call to discuss quarterly earnings, CEO John Wren said the company was letting employees go but didn’t say how many. He said the company was extending medical benefits to July 31 for employees furloughed or fired.

Wren added: “Our liquidity, balance sheet and credit ratings remain very strong and we have no plans to change our dividend policy.”

And once again, Dividends 1 – Employees 0, because everything will be done to prevent shareholders from dumping the stock.

Toyota borrowed $4 billion from investors on March 27. Three days later, the Japan-based car company said it would continue paying dividends to shareholders. Eight days after that it said it would drop roughly 5,000 contract workers who helped staff its plants in North America.

And so on, and so on, as companies issue hundreds of billions in debt without a hitch – now that the Fed has taken over the bond market – and use the proceeds to fund dividends, while laying off millions.

In a March 24 letter, 200 academics, led by Stanford University Graduate School of Business Professor Jonathan Berk, called lending programs aimed at corporations “a huge mistake.” Better to focus help directly on people living paycheck to paycheck who lost their jobs, it said.

“Bailing out investors who chose to take high-risk investments because they wanted the high returns undermines capitalism and makes it an unfair game,” Berk said in an interview. “If you don’t have a level playing field in capitalism, it doesn’t work.”

Why dear, misguided Jonathan: whoever told you the US still has “capitalism”?

via ZeroHedge News https://ift.tt/2MvYTXy Tyler Durden