Trump Can’t Live Full-Time at Mar-A-Lago Because of 1993 Covenant

In 1993, Donald Trump reached a “use agreement” with Palm Beach, Florida, that restricted the use of Mar-A-Lago. This document appears to be a covenant. Article II, titled “Club use,” imposes restrictions on the use of the guest suites on the property:

The use of guest suites shall be limited to a maximum of three (3) non-consecutive seven (7) day periods by any one member during the year.

Generally, Article II empowers Trump. Here, it restricts him. In other words, a member can only live in a guest suite for three non-consecutive one-week periods, or at most 21 days a year. The intent is to prevent Mar-A-Lago from being used as a permanent home. Or, to put it in Property lingo, Orangeacre shall not be used for residential purposes. (Blackacre just didn’t seem to fit here.) And the document was signed by President Donald J. Trump. (President of the Mar-A-Lago Club, Inc., that is.)

President Trump has announced that he plans to live in Mar-A-Lago after he leaves the White House (on or about January 20, 2021).

Now, neighbors have complained, and seek the enforcement of the covenant:

Neighbors of Mar-a-Lago sent a letter to the Town of Palm Beach and the U.S. Secret Service on Tuesday complaining that Mr. Trump has violated the 1993 agreement he made with the town that allowed him to convert the property to a moneymaking club.

“Per the use agreement of 1993, Mar-a-Lago is a social club, and no one may reside on the property,” wrote Reginald Stambaugh, a lawyer representing the DeMoss family, which has a property next to Mar-a-Lago.

“To avoid an embarrassing situation for everyone and to give the president time to make other living arrangements in the area, we trust you will work with his team to remind them of the use agreement parameters,” Mr. Stambaugh wrote. “Palm Beach has many lovely estates for sale, and surely he can find one which meets his needs.”

Construction has been done on the president’s residential quarters at the club, where Mr. Trump is expected to spend the Christmas holiday and which Mr. Stambaugh argued already violates the use agreement.

According to the Washington Post, the town has failed to enforce the covenant over the years. For example, Palm Beach has not imposed restrictions on how many days the President has stayed there, including over Christmas break. The town has also allowed the installation of a helipad for Marine One, an addition the covenant would prohibit. And the club does not appear to be ensuring that at least 50% of its members are Palm Beach residents, as the agreement requires.

Perhaps Trump could argue that the covenant cannot be enforced based on the doctrine of acquiescence? That is, Palm Beach failed to enforce the covenant’s restrictions for so long that it cannot now sue to enforce the agreement.

Go figure. We will see Article II litigation concerning President Trump even after he leaves office.

Any Tips for Video Parties?

Any suggestions for how best to do a video party (not just a small-group video conversation)? Since it’s hard to have good video conversations with more than 6 people at once (or maybe even more than 4), the party would at least need a breakout room feature, where people can choose to join one group and then move on to others.

I was thinking of doing it on Zoom, with the option set to let people choose which breakout room to join. But are there other platforms that are better? I noticed Evite has configured one; do any of you have experience with it? Are there other tips beyond just what platform and configuration to choose? Let us all know!

The Case for Paying College Athletes

Earlier today, the Supreme Court decided to hear NCAA v. Alston, a case challenging the legality of NCAA rules barring most compensation for college athletes:

The Supreme Court will hear a landmark antitrust case against the NCAA that could upend the business model for college sports by allowing colleges to compensate student athletes.

The high court said Wednesday that it will hear appeals filed by the NCAA and one of its member conferences over a May decision that found the group’s limits on player compensation violate antitrust law…

A group of current and former players challenged the NCAA rules that prohibit athletes from accepting money or other forms of compensation. Following a 2019 trial, a federal judge found the restrictions anti-competitive and said the NCAA must allow colleges to offer student athletes education-related benefits, such as graduate school scholarships, study abroad opportunities or computers for educational use.

The U.S. Court of Appeals for the 9th Circuit affirmed that decision earlier this year.

Economists have long argued that the NCAA rules barring compensation for athletes amount to a thinly veiled cartel, complete with severe punishment for defecting participants, all the way up to the “death penalty.” The main difference from other cartels is that the NCAA system has better PR, and has thereby managed to persuade many people that it is actually serving the public interest by promoting tradition and protecting the integrity of “student athletes.” That said, I don’t know enough about antitrust law to know who should prevail on the legal issues in the case. I will leave those questions to others with greater relevant expertise.

Legal issues aside, however, there is a strong policy rationale for ending restrictions on student athletes’ compensation. I summarized it back in 2010 and 2011, building on earlier pieces by economists David Henderson and Nobel Prize winner Gary Becker. Most of what they and I said remains relevant today. As Henderson put it:

The NCAA runs a tightly controlled cartel whose “profits” go to colleges and coaches. It’s not simply a private cartel but one backed by government force. Armen Alchian and William Allen, in their 1964 textbook, University Economics, were the first people I know to point this out. They pointed out that those colleges that decided to pay athletes would find their academic accreditation at risk. So why don’t new schools sense a profit to be made and then enter and compete players away by paying them? Alchian and Allen answer: “[N]o new school could get subsidies from the state or major philanthropic foundations without recognition by the present accreditation group.” They add, “We have finally arrived at the source of the value of membership in the NCAA and related organizations: subsidized education.”

One could argue, “Well, the student athletes will cash in on their skills later when they go on to become professional athletes.” Not so, as the NCAA admits in its advertising [noting that most players don’t go on to professional careers].

Becker added:

The toughest competition for basketball and football players occurs at the Division I level. These sports have both large attendances at games (sometimes more than 100,000 persons attend college football games) and widespread television coverage…. Absent the rules enforced by the NCAA, the competition for players would stiffen, especially for the big stars…

To avoid that outcome, the NCAA sharply limits the number of athletic scholarships, and even more importantly, limits the size of the scholarships that schools can offer the best players….

It is impossible for an outsider to look at these rules without concluding that their main aim is to make the NCAA an effective cartel that severely constrains competition among schools for players. The NCAA defends these rules by claiming that their main purpose is to prevent exploitation of student-athletes, to provide a more equitable system of recruitment that enables many colleges to maintain football and basketball programs and actively search for athletes, and to insure that the athletes become students as well as athletes.

Unfortunately for the NCAA, the facts are blatantly inconsistent with these defenses….

A large fraction of the Division I players in basketball and football, the two big money sports, are recruited from poor families; many of them are African-Americans from inner cities and rural areas. Every restriction on the size of scholarships that can be given to athletes in these sports usually takes money away from poor athletes and their families, and in effect transfers these resources to richer students in the form of lower tuition and cheaper tickets for games…

 

A few of my own thoughts from those earlier posts:

[T]he NCAA cartel is not just a private arrangement. It is propped up by the federal government, which uses the threat of denying federal funding to force schools to comply with cartel rules. If this federal intervention were lifted, the cartel might well fall apart…

The traditional NCAA response to such criticism is that the players are “scholar-athletes” who get compensated with education. This is probably true for many college athletes in lesser-known sports. In Division I football and basketball, however, the players are essentially full-time professionals. Most of them have little time to spend on their studies, and many have academic credentials far weaker than those of the regular students at their schools. Few people can do well academically if placed at an institution where their credentials are far below the norm while also working at a demanding full-time job.

I don’t believe that student athletes are morally entitled to be paid for playing. If no one wants to pay to watch them, I have no objection. The reality, however, is that there is a high demand for their services which is being artificially suppressed by government coercion. Indeed, some schools and boosters pay players under the table despite the threat of severe NCAA sanctions if they get caught.

Another particularly galling element of the NCAA cartel system is the way in which it is surrounded by a veneer of righteousness. The NCAA has managed to persuade the media and most of the public that the real bad guys are actually those schools that try to undermine the cartel and pay their players at something more closely resembling market rates. Few people seem to care that most of the athletes who get shortchanged are poor minorities who are being deprived of a key opportunity to create a nest egg for their future. As [David] Henderson points out, only a small percentage of them go on to make big bucks in the pros.

There is a conceptually simple, though politically difficult, solution to this problem. The government should withdraw its support for the NCAA cartel. Universities will gradually stop pretending that Division I football and basketball players are primarily students, and start treating them as the employees they actually are. The players will get paid for their work, and they and the universities won’t have to waste time and money forcing players to attend bogus classes in order to keep up appearances. Those who have the desire and academic credentials to do real coursework should of course be allowed to do so….

It’s also worth noting that the same universities that loudly condemn the very thought of paying players often pay huge salaries to coaches and athletic administrators. I don’t begrudge these people their riches. But it seems strange to claim that paying players any salary at all will somehow sully the academic ethos, while simultaneously contending that there’s nothing wrong with paying big bucks to Mike Krzyzewski or Jerry Tarkanian.

If the Supreme Court rules against the NCAA, perhaps things will move in the direction I advocated. It is also possible that some universities will reconsider whether it is actually desirable for academic institutions to be so heavily involved in what are essentially professional sports.

N.Y. Aims to Ban “Symbols of Hate” Sold by Private Vendors at State (or State-Funded) Fairgrounds

The bill, signed yesterday, provides:

[Pub. Buildings L.] § 146. Prohibit symbols of hate.

1. The state of New York shall not sell or display any symbols of hate or any similar image, or tangible personal property, inscribed with such an image unless the image appears in a book, digital medium, museum, or otherwise serves an educational or historical purpose.

2. For the purposes of this section, the term “symbols of hate” shall include, but not be limited to, symbols of white supremacy, neo-Nazi ideology or the Battle Flag of the Confederacy.

[Ag. & Markets L. § 16, subd. 51.] [The Department of Agriculture & Markets through the commissioner shall have power to:] Take any measures necessary to prohibit the sale, on the grounds of the state fair and any other fairs that receive government funding, of symbols of hate, as defined in [Pub. Buildings L. § 146], or any similar image, or tangible personal property, inscribed with such an image, unless the image appears in a book, digital medium, or otherwise serves an educational or historical purpose.

The new § 146, despite its title, is constitutional; New York can choose what it sells or displays, and doesn’t have to display the swastika any more than it has to display the hammer and sickle.

But the “measures” contemplated by the new subdivision 51 are unconstitutional. Government-run fairs are “limited public fora,” see Heffron v. ISKCON (1981), when it comes to the “exhibitors … [who] present their products or views, be they commercial, religious, or political” through the fair. This means the government can impose reasonable content-based restrictions on speech in those places, but not viewpoint-based restrictions. See Christian Legal Society v. Martinez (2010); Cornelius v. NAACP Legal Defense & Educ. Fund (1985). A ban on display or sale of Confederate flag merchandise at fairs is based on the viewpoint that many people perceive the Confederate flag to express; likewise for “symbols of white supremacy” or “neo-Nazi ideology” or any other “symbols of hate” that the Department might use its power to ban.

Private decisions by privately run fairs aren’t subject to the First Amendment, even if the fairs “receive government funding.” But if the government tries to use the funding to require privately run fairs to ban sales of “symbols of hate,” that would be a governmental decision, which is indeed constrained by the First Amendment.

Note that subdivision 51 probably can’t be challenged at this point, because it doesn’t itself ban any sales at fairs; it only authorizes the Department to institute such bans. That’s also why the “symbols of hate” language isn’t unconstitutionally vague (even though it’s not well-defined, given the “include, but not be limited to” language). Any viewpoint discrimination or vagueness challenge would have to wait until the Department implements a particular regulation.

According to the New York Post (Bernadette Hogan & Carl Campanile),

A Cuomo spokesman said the governor’s legal team will be reviewing the bill in consultation with the state Legislature to make a possible amendment.

“There’s going to be a chapter amendment that limits the prohibitions at the state fair, to ensure that we are respecting the protections that the Supreme Court has recognized for individuals and vendors at state fairs to exercise their First Amendment rights,” explained Maya Moskowitz, press secretary of bill sponsor state Sen. Alessandra Biaggi (D-Bronx).

Hard to see exactly what the amendment would be, but I guess we’ll see. Thanks to Robby Soave’s post about this here at Reason for the pointer.

Alex Nowrasteh and Benjamin Powell: Immigrants Revitalize Faith in American Institutions

Do immigrants bring with them the worst attributes of the countries and societies they are fleeing?

That fear motivates anti-immigrant sentiments from populists and nationalists such as President Donald Trump, who famously declared at the start of his campaign for the presidency that “when Mexico sends its people, they’re not sending their best.” It also stokes anxiety from an influential group of mostly free market economists such as Harvard’s George Borjas, Britain’s Paul Collier, and George Mason’s Garett Jones, who speculate that mass immigration from countries with illiberal political and economic cultures will undermine countries such as the United States and the United Kingdom.

In Wretched Refuse?: The Political Economy of Immigration and Institutions, Alex Nowrasteh and Benjamin Powell take an exhaustive look at the data and conclude that destination countries not only benefit economically from immigration but that many key markers of liberal democracy—such as support for the rule of law, belief in private property, and trust in government—improve when newcomers arrive en masse.

“One of the things we take a look at sort of in some detail in the book is whether immigrants have more trust in our institutions—whether they have more trust in the legal system, whether they have more trust in American businesses, whether they have more trust in the general sense of government and the way things are run here,” says Nowrasteh, director of immigration studies at the libertarian Cato Institute. “Overwhelmingly they do. If it weren’t for immigrants in the United States, trust in these institutions would be a lot lower.”

Powell, an economist at Texas Tech who also heads up that school’s Free Market Institute, points to the experience of Israel in the 1990s, when its population increased by 20 percent, largely due to Jews fleeing the former Soviet Union. “There was a massive flood of people coming from a country with a 70-year history of no rule of law and no economic freedoms [who] piled into Israel with immediate voting rights,” notes Powell. “Confidence in property rights didn’t go down, it went up. [So did] economic freedoms across board…Israel went from something like 90th in the world in economic freedom to 45th during a period of massive immigration from a communist country.”

Audio production by Ian Keyser.

Photo: Rrodrickbeiler | Dreamstime.com

Pornhub Isn’t the Problem. That Won’t Stop the Politicized Crusade Against It.

It’s not hard to understand why Pornhub—a giant clearinghouse of user-posted porn clips that gets massive amounts of web traffic—makes a politically popular scapegoat for problems plaguing the internet more broadly. Those most loudly denouncing the clip site and calling for its demise say a new push to shut down Pornhub is a matter of stopping child abuse.

Pornhub isn’t without significant issues. But if advocates are most concerned about illicit content involving minors—rather than with trying to police what happens between consenting adults—then there is strong evidence that Pornhub’s problems are much smaller in scope than the problems of popular social media sites such as Facebook, Snapchat, and Instagram. That these sites generally get a pass from Pornhub foes and the press suggests there’s something more going on here than just a concern for protecting children. For politicians, activists, and media personalities looking to score an easy win, the campaign against Pornhub appears to be more about moral grandstanding and leveraging generalized shame around pornography than addressing the real problem of child abuse and exploitation.

The New Anti-Porn Coalition

Pornhub’s prime position in this discourse seems more related to high-profile public relations campaigns against it than the documented prevalence of harmful content.

Major anti-Pornhub campaigns have been led by the group formerly known as Morality in Media—now the National Center on Sexual Exploitation (NCOSE)—and by Exodus Cry, a nonprofit “which presents itself as an anti-sex-trafficking organization, but, per the mission statement laid out in their 2018 tax returns, ultimately aims to abolish sex work entirely,” as Tarpley Hitt at The Daily Beast puts it. Exodus Cry stems from a controversial evangelical Christian church called the International House of Prayer.

Calling their efforts TraffickingHub, NCOSE and Exodus Cry have been teaming up to portray Pornhub as a uniquely prolific and unrepentant purveyor of smut featuring minors and abuse. But reports from the National Center for Missing and Exploited Children (NCMEC), tech companies, and law enforcement do not support this contention.

To make their cases against Pornhub, then, crusaders often resort to using statistics in weaselly ways, designed to help casual readers draw false impressions. For instance, in a letter calling for Pornhub to be investigated, Sen. Ben Sasse (R–Neb.) weaves general numbers about Pornhub traffic and searches in with warnings about “the exploitation of human trafficking victims in pornography streamed over the internet.”

Shared Hope International peppers its calls to shut down Pornhub with general stats about child sex abuse material. “There were 18.4 million reports of child sexual abuse imagery online in the last year—making up more than one-third of all such reports in the entire history of the internet,” the group said in March before accusing Pornhub of “conditioning viewers to tolerate the sexualization of and violence against youth, as well as the objectification of women and girls.”

Promotional efforts by Exodus Cry and NCOSE (and the press and politicking that results) also rely on a few other rhetorical tricks. Without evidence, they hold up isolated tales of abuse as representative of Pornhub content more broadly. And despite evidence to the contrary, they suggest Pornhub is particularly bad for attracting and permitting illegal content, with executives at Pornhub indifferent to (or even encouraging of) pictures and videos featuring underage teen girls. Lastly, they suggest that the only feasible solution is to take drastic aim at porn or digital privacy more broadly—sometimes both.

A recent piece by New York Times columnist Nicholas Kristof neatly hits these notes. On Pornhub, “a search for ‘girls under18’ (no space) or ‘14yo’ leads in each case to more than 100,000 videos,” wrote Kristof, before adding that “most aren’t children being assaulted.” (Conflating role-playing with actual abuse is also a common feature of anti-Pornhub advocacy.)

Kristof tells us that “after a 15-year-old girl went missing in Florida, her mother found her on Pornhub—in 58 sex videos.” But police say 58 is the total number of at least semi-nude photos and videos of the runaway teen (most of which did not feature sex) that were found across a range of websites, including Periscope, Modelhub, and Snapchat, along with Pornhub.

That doesn’t absolve Pornhub, of course. But it is yet another reminder that the problem goes beyond this particular site—something anti-Pornhub crusaders tend to studiously ignore.

A Google News search reveals ample activist campaigning and political hubbub about Pornhub, but few stories of actual prosecutions involving predators who used the site. The Florida case is one of the same handful that keeps making the rounds in anti-Pornhub articles. Another oft-repeated story occurred over a decade ago. Meanwhile, countless new stories about mainstream apps and gaming sites being used for exploitation receive little attention outside local crime news.

Here’s a recent case out of the U.K. involving Facebook and at least 20 12- to 15-year-old boys. Here’s a prosecution in Arkansas involving Instagram and Snapchat. Here’s another recent prosecution involving Snapchat. And another. And another. And another. This September case out of California involved Snapchat along with Twitter, Telegram, and ICQ. Here’s one involving Snapchat and Whisper.

An April investigation in New York called “Operation Home Alone” involved Fortnite, Hot or Not, Kik, Minecraft, SKOUT, Tinder, and other apps popular with kids and teenagers. In August, a spokesperson for the New Jersey Internet Crimes Against Children Task Force said platforms used by child predators included Kik, SKOUT, Grindr, Whisper, Omegle, Tinder, Chat Avenue, Chatroulette, Wishbone, Live.ly, Musical.ly [now TikTok], Paltalk, Yubo, Hot or Not, Down, Tumblr, Fortnite, Minecraft, and Discord.

Facebook reported nearly 60 million potential child sexual abuse photos and videos on its platforms in 2019. (The company told The Verge that “not all of that content is considered ‘violating’ and that only about 29.2 million met that criteria.”) That makes Facebook the source of 94 percent of U.S. tech companies’ reports to NCMEC.

Despite aggressive (and admirable) efforts to detect and report this content, research suggests that plenty of child sexual abuse material (CSAM) is slipping by Facebook filters and staff anyway. A Tech Transparency Project study of Department of Justice press releases from 2013 through 2019 found that of at least 366 cases involving Facebook, only 9 percent originated from Facebook reporting content to authorities.

The National Society for the Prevention of Cruelty to Children (NSPCC), a U.K.-based organization, looked at “1,220 offences of sexual communication with a child” in England and Wales during the first three months of pandemic lockdowns. “Facebook-owned apps (Instagram, Facebook, WhatsApp) were used in 51% of cases where the type of communication was recorded” and Snapchat in 20 percent, NSPCC said in a November press release.

Putting Pornhub in Perspective

The Internet Watch Foundation, an independent U.K.-based watchdog group, says that it “found 118 instances of child sexual abuse imagery on Pornhub” from January 1, 2017, through October 29, 2019.

Terms like child sexual abuse imagery (and CSAM, used more commonly in the United States) are officialese for a broad range of imagery, from the truly sickening to semi-nude selfies shared by older teens.

Mindgeek, the Montreal-based parent company behind Pornhub, has said that “any assertion that we allow CSAM is irresponsible and flagrantly untrue.” For a few years now, Pornhub has at least been trying to clean up its act. The company has been doing more—though still not enough, according to many in the porn industry—to crack down on pirated videos, underage videos, and content made or shared non-consensually while also helping adult sex workers share and profit from their work.

In the past few days, Pornhub announced new policies surrounding content and removed all videos from non-verified accounts.

Many of these changes have long been on the list of porn performer and producer demands. Folks heaping praise on Kristof, Fox News host Laura Ingraham, and other Pornhub-critical members of the media overlook the years of work that those in the industry have been doing to pressure Pornhub into working harder to prevent illegal content of all kinds while also ensuring that adults who want to monetize their work can do so.

Little coverage has acknowledged this activism and lobbying from sex workers, nor the broader labor and intellectual property issues behind it. The focus in accounts like Kristof’s—or legislation like Missouri Republican Sen. Josh Hawley’s latest bill—is on eliciting disgust and condemnation about legitimate problems and then homing in on Pornhub, not taking a balanced look at the larger problem or a sober analysis of what steps might actually mitigate it.

In several cases that Kristof and Exodus Cry muster against Pornhub, the perpetrators of abuse were relatives of the victims. This is a finding that crops up again and again, no matter what type of child abuse we’re talking about: Kids are being exploited by family members and others close to them, or while under the watch of state services. Yet rather than focus on difficult but commonplace problems in people’s own communities (or favorite family-friendly social platforms) people would rather create boogeymen out of a company willing to do business with sex workers.

Abusive content is portrayed as something ignored by online platforms and untraceable by police when, in reality, perpetrators of abuse are often caught and prosecuted because of help from tech companies. Instead of seeing tech companies as allies in thwarting predators and getting justice for victims, however, legislators and activists are often scheming more ways to make a broader group of people and businesses responsible for these harms.

First, they go after the user-generated content platforms like Pornhub (or OnlyFans, or Craigslist and Backpage before them). Then they go after anyone who does business with the platform, from credit card companies and other payment processors to the web hosting services, software, and advertisers that power them. Ultimately, the target becomes anyone enabling adult sex workers to actually work more safely and independently and to profit from their work.

Last week, Visa and Mastercard announced that they would cease their business relationships with Pornhub, after facing mounting pressure to do so. “This news is crushing for the hundreds of thousands of models who rely on our platform for their livelihoods,” Pornhub said in response. 

We’ve been here before, notes sex worker, activist, and author Maggie McNeill. A few years back, activists and officials pressured credit card companies to stop doing business with Backpage—until a federal court said it was unconstitutional.

Now people like Hawley and Kristof—who both have a history of promoting sex trafficking fables—are trying the same thing with sites like Pornhub.

Of course, there will always be more underground platforms, foreign servers, and ways for criminals to evade U.S. authorities and spread their filth. And these more removed platforms, unlike mainstream porn sites, won’t be willing to cooperate with the government, making it harder to catch people perpetrating violence, fraud, theft, or abuse. But with payment processor options narrowed, consenting adult porn performers and producers are less able to profit from their legal work and more likely to need to rely on exploitative middlemen.

Once again, their efforts will only end up making things harder for the very people they claim to be concerned with, while also punishing and keeping down sex workers broadly.

Behind the Numbers, Room for Optimism

Behind the big, bad numbers, there’s evidence that detection of and action around exploitative material is growing, in a way that signals positive change from all sorts of tech companies.

The National Center for Missing and Exploited Children received 16.9 million total tips about potentially abusive online content in 2019, with about 3 percent of this content originating in the U.S.

Tips to NCMEC came from at least 148 different tech platforms last year, with 15,884,511 tips coming from Facebook; 449,283 from Google; 73,929 from Imgur; 19,480 from Discord; 123,839 from Microsoft; 7,360 from Pinterest; 82,030 from Snapchat; 596 from TikTok; 45,726 from Twitter; 13,418 from Verizon Media; 306 from Vimeo; and 57 from Zoom. (Facebook’s 15.9 million tips alone amount to roughly 94 percent of the total, consistent with the figure cited above.)

Those are staggering numbers. But it’s important to keep in mind that they don’t represent verified or discrete examples of child porn. The data reflect the number of tips about content potentially featuring minors that were reported to the NCMEC CyberTipline, often by groups who have a huge legal and financial (not just moral) incentive to overreport rather than underreport. Tips largely come from tech companies, though also from international hotlines, government officials, activist groups, and the general public. A tip simply means that someone raised suspicions about a post or picture, not that the person behind it was necessarily a minor (or even a real person; NCMEC notes that tips may be about computer-generated imagery). One image, ad, or video could be the subject of multiple reports, as well.

According to Palantir’s Angela Muller, “viral” content that spurs a lot of reports about the same image or video has contributed to the rise in tips, with much of the sharing being done by people who think they’re helping by drawing attention to questionable posts.

Limiting NCMEC data to “cases with an identified victim and one or more adult offenders” yields much smaller numbers, thankfully. From July 2002 through June 2014, NCMEC identified 518 “actively traded cases” (which it defines as those “having been reported on five or more times” to the group’s tipline) around the world, involving a total of 933 identifiable victims.

From July 2011 through June 2014, it found a total of 2,598 cases globally with identifiable underage individuals.

The number of tips to NCMEC has risen greatly over the past decade. Yet while many have reported on the rise, few offer any potential reasons for it other than rising rates of abuse and/or indifferent tech companies. This presents a misleading picture, since much of the increase can be attributed to better tech tools and an increasingly proactive approach from tech companies, as well as an increasing number of them partnering with NCMEC.

Pornhub started working with NCMEC in 2019. “In early 2021, NCMEC will release our total number of reported CSAM incidents alongside numbers from other major social and content platforms,” says the company.

Tech companies are required by law to report potential CSAM when alerted to it.

For social media and other tech entities, growing legal and political pressure around online content moderation generally has coincided with better tools for identifying and filtering out obscene content and underage imagery. This means more red flags raised, and more required reporting, even absent any actual increase in illegal content.

Notably, CSAM numbers do include pictures and videos taken and shared by minors without any adult knowing. So greater numbers of young people owning their own photo- and video-capable phones and having access to social media sites might also explain part of the rise in reported numbers.

According to the Internet Watch Foundation (IWF), “self-generated sexual content featuring under-18s now accounts for nearly a third of all actioned child sexual abuse material online by the IWF.”

Instead of chasing selfie-taking teens, tech execs, payment processors, and sex workers, authorities should focus on enforcing existing laws against those actually committing crimes or posting images of abuse. No one except abusers thinks that abusers should get away with this. But as long as tech companies are making good faith efforts to stop abusers from posting content, and cooperating with authorities to identify such content, they can be partners of law enforcement, not enemies. They are the ones in the best position to help recognize illicit content and report it, as well as to provide records that can help in prosecutions.

If you want to catch child abusers and other porn predators—while also protecting victims and workers—you should work with companies like Pornhub, not against them.

Yet for groups like NCOSE and Exodus Cry, and politicians seeking excuses to get government backdoors into encrypted communications, that will not do. Instead, we get a lot of hype about how tech companies—particularly the ones without many friends in Washington—and general data privacy protections are the real issue.

“71 Chickens Saved, 35 Arrested over Alleged Illegal Cockfighting Ring”

From news.com.au, as run in the Advertiser (Adelaide); thanks to Joe Muha for the pointer.

Whenever I run into people named Sydney, I expect them to have sisters named Adelaide—but I’ve always been disappointed.
