St. Louis group for women entrepreneurs expands to 6 cities, including Denver

ST. LOUIS — A St. Louis organization that supports women-led startup companies is expanding to six new cities.

The St. Louis Post-Dispatch reports that Brazen Global announced Tuesday that it’s creating branches in Chicago, Dallas, Denver, Detroit, Philadelphia and Fort Worth, Texas. The organization has hired acting directors in each new market.

Brazen was founded in 2014 by Jennifer Ehlen, who noticed that St. Louis ranked near the bottom of an American Express study on cities that support women entrepreneurs. Brazen has assisted more than 1,000 women entrepreneurs in St. Louis since forming.

The organization provides resources, sponsors support groups and hosts networking events for female business owners.

Ehlen says the group hopes to help women entrepreneurs realize their growth aspirations.

Information from: St. Louis Post-Dispatch

Swim beach at Chatfield State Park will be closed all summer due to construction

When that Denver summer sun starts beating down, there is nothing like retreating to a local oasis for a dip in a lake or reservoir to stay cool.

This summer, though, popular Chatfield State Park in Littleton won’t be an option for folks in the metro area.

Because of a construction project related to the expansion of reservoir capacity at the park, the popular swim beach will be closed for the summer.

To meet the increased water-storage needs of a growing Front Range population, park managers are preparing for a substantial rise in the reservoir’s water level. To accommodate that, infrastructure around the reservoir must be relocated, and that includes the swim beach.

“They’re changing the infrastructure to allow a 12-foot vertical increase in water level,” said Kris Wahlers, the park’s operations manager. “Things are having to move up in elevation and a little bit further away from the lake.”

The rise in water level won’t happen until 2020 at the earliest, but construction to accommodate the increase began last fall at several locations around the reservoir. The second phase of construction is due to begin in the fall, with completion anticipated next spring.

The lake will remain open this summer for boating. The new north boat ramps were finished this spring and the ramps by the marina will be replaced this fall.

“Both boat ramps are open right now,” Wahlers said, “and the marina is filling up with boats.”

All hope is not lost. There are other options for swimming within an hour of the city, most of which open on Memorial Day.

And while we’re talking summer and water sports, the Front Range also offers great places to stand-up paddleboard, rules of boating etiquette in Colorado worth knowing, and even places where you can surf.

A Colorado funeral home owner also sold human body parts. A new bill would make that illegal.

Colorado lawmakers are moving to make the state one of the first in the country to regulate companies that sell human body parts, following an FBI raid at a Montrose funeral home that also housed a body broker.

The state Senate on Tuesday gave final approval to a bill that would require businesses that sell human body parts not intended for transplant to register with the state and maintain records documenting the donation and sale of bodies. Tissue banks that sell organs for transplant are already regulated. The bill, SB18-234, would also prohibit anyone who owns more than a 10 percent stake in a funeral home or crematory from owning a body broker business.

“I decided that there needs to be some intervention to protect families,” said state Sen. Don Coram, a Republican from Montrose who is one of the bill’s sponsors.

The bill comes in response to allegations of misconduct at Sunset Mesa Funeral Directors, a Montrose funeral home that was raided by the FBI and had its state license suspended earlier this year following multiple complaints.

An investigation by the Reuters news agency revealed that Sunset Mesa was the only funeral home in the country that shared a building with a body broker — and both were owned by the same woman, Megan Hess. Among the complaints detailed in the state’s investigation of Sunset Mesa was one instance in which a family said they asked for their loved one to be cremated but received cement mix, not cremated human remains, in return.

Body brokers typically sell human body parts to academic or medical research facilities, and they can receive over $1,000 for certain parts. Though the practice is legal, it is also weakly regulated. In its investigation last year, Reuters found only four states that regulated the sales.

“I’m not trying to slow down science,” Coram said. “It is a legitimate business. I just want to make sure it stays legitimate.”

The bill now goes to the House for several more votes before possibly being sent to the governor for his signature.

Poll: Privacy debacle prompts social-media changes

NEW YORK — If you’ve made changes to how you use social media since Facebook’s Cambridge Analytica privacy debacle, you’re not alone.

A new poll from The Associated Press-NORC Center for Public Affairs Research finds that 7 in 10 online adults who’ve heard of the scandal — revelations that a data mining firm may have accessed the private information of some 87 million Facebook users to influence elections — have unfollowed accounts, deleted their profiles or made other changes in how they use social media.

And since 9 in 10 Americans have heard at least a little bit about Cambridge Analytica, this means the scandal has led to widespread changes in the use of social media among Americans. What’s less clear is whether these changes are permanent, and whether they will affect business at Facebook, Twitter and other social media companies.

Facebook has said that it hasn’t noticed a meaningful decline in usage since the scandal broke and it doesn’t seem to have experienced much of an advertiser exodus, either. But that doesn’t mean the social media giant is in the clear. Some high-profile tech luminaries such as Elon Musk and Steve Wozniak have disavowed Facebook, and a “DeleteFacebook” online campaign — even if it didn’t lead to mass defections — has bruised the company’s already-battered image.

Cole Bearden, 26, a musician and liquor store employee in Nashville, said he soured on Facebook a while ago, after his parents friended him and turned his app into “a perpetual recipe video-sharing machine.” That, along with his concerns about surveillance and advertisements, convinced him to drop the app from his phone a year ago. He said in an interview last month that he checks his profile only occasionally.

Still, Bearden says deleting his profile won’t mean a lot unless many other Facebook users do the same. And even that, he says, may come too late.

“The real damage has been done. Our concept of open democracy has been undermined, subverted and potentially irreparably damaged,” he said.

Some people, though, were cautious long before Cambridge Analytica. Jessica Garcia, who lives in Homewood, Illinois, said she was already “pretty strict” with all her settings and she uses social media (Facebook, mostly) only minimally. She doesn’t post much and stays out of politics.

Asked who bears responsibility for protecting people’s online privacy, large majorities of Americans said both social media companies (84 percent) and individual users (72 percent) bear a large share. Just short of half — 46 percent — see it as a large responsibility of the federal government.

Garcia agrees with the majority and said it’s a combination of individual and company responsibility.

“I don’t feel like the government needs to step in and start controlling that,” she said. “If we can’t make good decisions as people and they don’t make good decisions as companies, it’ll fall apart on its own.”

Americans who have taken some action after hearing about Facebook’s recent privacy crisis include 29 percent who have deleted certain social media accounts — the most drastic step. A larger number, 38 percent, uninstalled apps on their phone, while 42 percent said they used certain platforms less often. Nearly half, 47 percent, unfollowed or unfriended certain people, and 41 percent unfollowed groups or organizations.

Forty-five percent reviewed or changed their privacy settings — something Facebook encouraged recently by sending a notice to users through their Facebook pages. First, it notified the 87 million people whose information may have been leaked to Cambridge Analytica. This week, it began sending all 2.2 billion Facebook users a more generic notice urging them to review the settings that show which apps have access to their data.

According to the poll, women were more likely than men to have made at least one change, and younger people were more likely to say they have reviewed their privacy settings or uninstalled apps from their phones. Older Americans were more likely to say they have followed news of the scandal.

The Cambridge Analytica fiasco was not Facebook’s first privacy scandal, though it may have been its worst. The poll also found that Americans have broader concerns about how their data is used by companies like Facebook, Twitter and Google. Sixty percent said they were very or extremely concerned that such companies may not keep their personal information secure, and more than half said they were concerned that the companies might track their data even after they have tried to delete it.

African Americans were more likely to express concern about privacy than whites. For example, 72 percent of blacks and 57 percent of whites are worried about companies securing their personal information, while 62 percent of blacks and 44 percent of whites are concerned about companies tracking their location.

The AP-NORC poll of 1,140 adults was conducted April 11-16 using a sample drawn from NORC’s probability-based AmeriSpeak Panel, which is designed to be representative of the U.S. population. The margin of sampling error for all respondents is plus or minus 4 percentage points.

AP Polling Editor Emily Swanson in Washington and AP Business Writer Dee-Ann Durbin in Detroit contributed to this story.

Facebook finally explains why it bans some content, in 27 pages

SAN FRANCISCO – Among the most challenging issues for Facebook is its role as the policeman for the free expression of its 2 billion users.

Now the social network is opening up about its decision-making over which posts it decides to take down – and why. On Tuesday the company for the first time published the 27-page guidelines, called Community Standards, that it gives to its workforce of thousands of human censors. The set of guidelines encompasses dozens of topics including hate speech, violent imagery, misrepresentation, terrorist propaganda and disinformation. Facebook said it would offer users the opportunity to appeal Facebook’s decisions.

The move adds a new degree of transparency to a process that users, the public and advocates have criticized as arbitrary and opaque. The newly released guidelines offer suggestions on topics including how to determine the difference between humor, sarcasm and hate speech. They explain that images of female nipples are generally prohibited, but exceptions are made for images that promote breast-feeding or address breast cancer.

“We want people to know our standards, and we want to give people clarity,” Monika Bickert, Facebook’s head of global policy management, said in an interview. She added that she hoped publishing the guidelines would spark dialogue. “We are trying to strike the line between safety and giving people the ability to really express themselves.”

The company’s censors, called content moderators, have been chastised by civil rights groups for mistakenly removing posts by minorities who had shared stories of being the victims of racial slurs. Moderators have struggled to tell the difference between someone posting a slur as an attack and someone who was using the slur to tell the story of their own victimization.

In another instance, moderators removed an iconic Vietnam War photo of a child fleeing a napalm attack, claiming the girl’s nudity violated its policies. (The photo was restored after protests from news organizations.) Moderators have deleted posts from activists and journalists in Burma and in disputed areas such as the Palestinian territories and Kashmir and have told pro-Trump activists Diamond and Silk they were “unsafe to the community.”

The release of the guidelines is part of a wave of transparency that Facebook hopes will quell its many critics. It has also published political ads and streamlined its privacy controls after coming under fire for its lax approach to protecting consumer data.

The company is being investigated by the U.S. Federal Trade Commission over the misuse of data by a Trump-connected consultancy known as Cambridge Analytica, and Facebook chief executive Mark Zuckerberg recently testified before Congress about the issue. Bickert said discussions about sharing the guidelines started last fall and were not related to the Cambridge controversy.

The company’s content policies, which began in earnest in 2005, addressed nudity and Holocaust denial in the early years. They have ballooned from a single page in 2008 to 27 pages today.

As Facebook has come to reach nearly a third of the world’s population, Bickert’s team has expanded significantly and is expected to grow even more in the coming year. A team of 7,500 reviewers, in places like Austin, Dublin and the Philippines, assesses posts 24 hours a day, seven days a week, in more than 40 languages. Moderators are sometimes temporary contract workers without much cultural familiarity with the content they are judging, and they make complex decisions in applying Facebook’s rules.

Bickert also employs high-level experts including a human rights lawyer, a rape counselor, a counterterrorism expert from West Point and a PhD researcher with expertise in European extremist organizations as part of her content review team.

Activists and users have been particularly frustrated by the absence of an appeals process when their posts are taken down. (Facebook users are allowed to appeal the shutdown of an entire account but not individual posts.) The Washington Post previously documented how people have likened this predicament to being put into “Facebook jail” – without being given a reason why they were locked up.

Malkia Cyril, a Black Lives Matter activist in Oakland, Calif., who is also the executive director for the Center for Media Justice, was among a coalition of more than 70 civil rights groups that pressured Facebook in 2017 to fix its “racially-biased” content moderation system. Among the changes the coalition sought was an appeals process for posts that are taken down.

“At the time they told us they could not do it, they would not do it, and actually stopped engaging at that point,” Cyril said. “They told us they would get back to us when they had something new to say.”

Cyril said that Facebook’s actions Tuesday, while well-intentioned, do not go far enough in terms of addressing the white supremacist groups allowed on the platform.

“This is just a drop in the bucket,” she said. “What’s needed now is an independent audit to ensure that the basic civil rights of users are protected, especially vulnerable users being targeted on the street by hate that’s being fomented online.”

Zahra Billoo, executive director of the Council on American-Islamic Relations’ office for the San Francisco Bay area, said adding an appeals process and opening up guidelines would be a “positive development” but said the social network still has a ways to go if it wants to stay a relevant and safe space.

Billoo said that at least a dozen pages representing white supremacists are still up on the platform, even though the policies forbid hate speech and Zuckerberg testified before Congress this month that Facebook does not allow hate groups.

“An ongoing question many of the Muslim community have been asking is how to get Facebook to be better at protecting users from hate speech and not to be hijacked by white supremacists, right-wing activists, Republicans or the Russians as a means of organizing against Muslim, LGBT and undocumented individuals,” she said.

Billoo herself was censored by Facebook two weeks after Donald Trump’s election, when she posted an image of a handwritten letter mailed to a San Jose mosque and quoted from it: “He’s going to do to you Muslims what Hitler did to the Jews.”

Bickert’s team has been working for years to develop a software system that can classify the reasons a post was taken down so that users could receive clearer information – and so Facebook could track how many hate speech posts were put up in a given year, for example, or whether certain groups are having their posts taken down more frequently than others.

Currently, people who have their posts taken down receive a generic message that says that they have violated Facebook’s community standards. After Tuesday’s announcement, people will be told whether their posts violated guidelines on nudity, hate speech and graphic violence. A Facebook executive said the teams were working on building more tools. “We do want to provide more details and information for why content has been removed,” said Ellen Silver, Facebook’s vice president of operations. “We have more work to do there, and we are committed to making those improvements.”

Though Facebook’s content moderation is still very much driven by humans, the company does use technology to assist in its work. The company currently uses software to identify duplicate reports, a timesaving technique for reviewers that helps them avoid reviewing the same piece of content over and over because it was flagged by many people at once. Software also can identify the language of a post and some of the themes, helping the post get to the reviewer with the most expertise.

The company can recognize images that have been posted before but cannot recognize new images. For example, if a terrorist organization reposts a beheading video that Facebook already took down, Facebook’s systems will notice it almost immediately, said Silver, but it cannot identify new beheading videos. The majority of items flagged by the community get reviewed within 24 hours, she said.

Every two weeks, employees and senior executives who make decisions about the most challenging issues around the world meet. They debate the pros and cons of potential policies. Teams who present are required to come up with research showing each side, a list of possible solutions, and a recommendation. They are required to list the organizations outside Facebook with which they consulted.

Pruitt unveils controversial “transparency” rule Tuesday limiting what research EPA can use

WASHINGTON – Environmental Protection Agency Administrator Scott Pruitt proposed a rule Tuesday that would establish new standards for what science could be used in writing agency regulations, according to individuals briefed on the plan. The sweeping change, long sought by conservatives, could have significant implications for decisions on everything from the toxicity of household products to the level of soot that power plants can emit.

The rule would allow EPA to consider only studies for which the underlying data are made publicly available. Advocates describe this approach as an advance for transparency, but critics say it would effectively block the agency from relying on long-standing, landmark studies linking air pollution and pesticide exposure to harmful health effects.

“Today is a red-letter day. It’s a banner day,” Pruitt told a group of supporters at agency headquarters. “The science that we use is going to be transparent. It’s going to be reproducible.”

The move reflects a broader effort already underway to change how the agency conducts and uses science to guide its work. Pruitt has already changed the standards for who can serve on EPA’s advisory committees, barring scientists who received EPA grants for their research while still allowing those funded by industry.

The rule will be subject to a 30-day comment period, EPA officials said. Pruitt, who had described the change during interviews with select media over the past month, said it will “enhance confidence in our decision-making” and prove “durable” because it will be issued as a regulation.

“This is not a policy,” he said. “This is not a memo.”

Many scientists argue that applying a standard to public health and environmental studies that is not currently required by peer-reviewed journals would limit the information the EPA could take into account.

Some researchers collect personal data from subjects but pledge to keep it confidential – as was the case in a major 1993 study by Harvard University that established the link between fine-particle air pollution and premature deaths, as well as more recent research that tapped a Medicare database available to any scientific group guaranteeing confidentiality of the personal information. That practice would not be allowed under the new rule.

In an interview Tuesday, former EPA Administrator Gina McCarthy said that requiring the kind of disclosure Pruitt envisions would have disqualified the federal government from tapping groundbreaking research, such as studies linking exposure to leaded gasoline to neurological damage. Scientists will have trouble recruiting study participants if the rule is enacted, she predicted, even if they pledge to redact private information before handing it over to the government.

“The best studies follow individuals over time, so that you can control all the factors except for the ones you’re measuring,” said McCarthy, who now directs the Center for Climate, Health and the Global Environment at Harvard’s public health school. “But it means following people’s personal history, their medical history. And nobody would want somebody to expose all of their private information.”

House Science Committee Chairman Lamar Smith, R-Texas, sought to establish a requirement similar to the one Pruitt has proposed, but his legislation, titled the Honest and Open New EPA Science Treatment Act, never cleared both chambers of Congress.

Pruitt and Smith met at EPA headquarters on Jan. 9, according to Pruitt’s public calendar, and an email obtained under the Freedom of Information Act indicates that the lawmaker pressed the administrator to adopt the legislation’s goal as his own.

Smith made “his pitch that EPA internally implement the HONEST Act [so that] no regulation can go into effect unless the scientific data is publicly available for review,” Aaron Ringel, deputy associate administrator for congressional affairs at the EPA, wrote other agency staffers. His email was obtained by the Union of Concerned Scientists, a scientific advocacy organization.

Conservatives, such as Trump EPA transition team member Steve Milloy, have long tried to discredit independent research the agency used to justify limiting air pollution from burning coal and other fossil fuels. A series of studies has shown that fine particulate matter, often referred to as soot, enters the lungs and bloodstream and can cause illnesses such as asthma as well as premature death.

“During the Obama administration, the EPA wantonly destroyed 94 percent of the market value of the coal industry, killed thousands of coal mining jobs and wreaked havoc on coal mining families and communities,” Milloy said in a statement, “all based on data the EPA and its taxpayer-funded university researchers have been hiding from the public and Congress for more than 20 years.”

While the administration presses ahead, legal experts warn that the rule may be vulnerable to a court challenge. In unanimous decisions in 2002 and 2010, the U.S. Court of Appeals for the District of Columbia Circuit said the EPA is not legally obligated to obtain and publicize the data underlying the research it considers in crafting regulations.

In the 2002 case, brought by the American Trucking Associations, Inc., two judges appointed by Ronald Reagan and one named by Bill Clinton wrote that they agreed with the agency that such a requirement “would be impractical and unnecessary.” The government’s defense had noted that “EPA’s reliance on published scientific studies without obtaining and reviewing the underlying data is not only reasonable, it is the only workable approach.”

A range of scientific organizations are already campaigning to block the rule from being finalized. On Monday, 985 scientists signed a letter organized by the Union of Concerned Scientists, urging Pruitt not to forge ahead with the policy change.

“There are ways to improve transparency in the decision-making process, but restricting the use of science would improve neither transparency nor the quality of EPA decision-making,” they wrote. “If fully implemented, this proposal would greatly weaken EPA’s ability to comprehensively consider the scientific evidence across the full array of health studies.”

Under the proposed rule, third parties would be able to test and try to replicate the findings of studies submitted to EPA. But, the scientists wrote, “many public health studies cannot be replicated, as doing so would require intentionally and unethically exposing people and the environment to harmful contaminants or recreating one-time events.”

Gretchen Goldman, an expert on air pollution and research director for the organization’s Center for Science and Democracy, said the rule could put some scientists in a quandary: Keeping personal health data or proprietary information private would mean having their work ignored by the EPA.

“We have this incredible science-based process that works, and it has worked, by and large, even in the face of tremendous political pressures to not go with a science-based decision,” Goldman said.

The Environmental Protection Network, a group of former EPA employees, issued a report Tuesday stating that many older studies – in which the original data sets were either not maintained or stored in outdated formats – would be eliminated under the proposed rule.

And while there is no estimate yet for how much it would cost EPA to obtain and disseminate studies’ underlying data, the Congressional Budget Office has projected that Smith’s measure, if enacted, would cost the agency $250 million for initial compliance and then between $1 million and $100 million annually. A 2015 CBO analysis estimated that EPA would cut the number of studies it relies on by half because of the bill’s requirements.

Geophysicist Marcia McNutt, who is president of the National Academy of Sciences, said Tuesday that she is concerned the rule would prevent the EPA from relying on the best available scientific evidence.

“This decision seems hasty,” she wrote in an email. “I would be fearful that the very foundations of clean air and clean water could be undermined.”