ST. LOUIS — A St. Louis organization that supports women-led startup companies is expanding to six new cities.
The St. Louis Post-Dispatch reports that Brazen Global announced Tuesday that it’s creating branches in Chicago, Dallas, Denver, Detroit, Philadelphia and Fort Worth, Texas. The organization has hired acting directors in each new market.
Brazen was founded in 2014 by Jennifer Ehlen, who noticed that St. Louis ranked near the bottom of an American Express study on cities that support women entrepreneurs. Brazen has assisted more than 1,000 women entrepreneurs in St. Louis since forming.
The organization provides resources, sponsors support groups and hosts networking events for female business owners.
Ehlen says the group hopes to help women entrepreneurs realize their growth aspirations.
Information from: St. Louis Post-Dispatch
When that Denver summer sun starts beating down, there is nothing like retreating to a local oasis for a dip in a lake or reservoir to stay cool.
This summer, though, popular Chatfield State Park in Littleton won’t be an option for folks in the metro area.
Because of a construction project related to the expansion of reservoir capacity at the park, the popular swim beach will be closed for the summer.
To meet the demand for additional water storage from a growing Front Range population, park managers are preparing for a substantial rise in the reservoir’s water level. To accommodate that, infrastructure around the reservoir must be relocated, and that includes the swim beach.
“They’re changing the infrastructure to allow a 12-foot vertical increase in water level,” said Kris Wahlers, the park’s operations manager. “Things are having to move up in elevation and a little bit further away from the lake.”
The rise in water level won’t happen until 2020 at the earliest, but construction to accommodate the increase began last fall at several locations around the reservoir. The second phase of construction is due to begin in the fall, with completion anticipated next spring.
The lake will remain open this summer for boating. The new north boat ramps were finished this spring and the ramps by the marina will be replaced this fall.
“Both boat ramps are open right now,” Wahlers said, “and the marina is filling up with boats.”
All hope is not lost. Here are some other options for swimming within an hour of the city; most open Memorial Day.
And while we’re talking summer and water sports, here are some great places to stand-up paddleboard along the Front Range, the rules of boating etiquette in Colorado, and even places you can surf.
Colorado lawmakers are moving to make the state one of the first in the country to regulate companies that sell human body parts, following an FBI raid at a Montrose funeral home that also housed a body broker.
The state Senate on Tuesday gave final approval to a bill that would require businesses selling human body parts that aren’t intended to be used in transplants to register with the state and maintain records documenting the donation of bodies and their sale. Tissue banks that sell organs for transplant are already regulated. The bill, SB18-234, would also prohibit anyone who owns more than a 10 percent stake in a funeral home or crematory from owning a body broker business.
“I decided that there needs to be some intervention to protect families,” said state Sen. Don Coram, a Republican from Montrose who is one of the bill’s sponsors.
The bill comes in response to allegations of misconduct at Sunset Mesa Funeral Directors, a Montrose funeral home that was raided by the FBI and had its state license suspended earlier this year following multiple complaints.
An investigation by the Reuters news agency revealed that Sunset Mesa was the only funeral home in the country that shared a building with a body broker — and both were owned by the same woman, Megan Hess. Among the complaints detailed in the state’s investigation of Sunset Mesa was one instance where a family said they asked their loved one to be cremated but said they received cement mix and not cremated human remains in return.
Body brokers typically sell human body parts to academic or medical research facilities, and they can receive over $1,000 for certain parts. Though the practice is legal, it is also weakly regulated. In its investigation last year, Reuters found only four states that regulated the sales.
“I’m not trying to slow down science,” Coram said. “It is a legitimate business. I just want to make sure it stays legitimate.”
The bill now goes to the House for several more votes before possibly being sent to the governor for his signature.
NEW YORK — If you’ve made changes to how you use social media since Facebook’s Cambridge Analytica privacy debacle, you’re not alone.
A new poll from The Associated Press-NORC Center for Public Affairs Research finds that 7 out of 10 online adults who’ve heard of the scandal — revelations that a data mining firm may have accessed the private information of some 87 million Facebook users to influence elections — have unfollowed accounts, deleted their profiles or made other changes in how they use social media.
And since 9 in 10 Americans have heard at least a little bit about Cambridge Analytica, this means the scandal has led to widespread changes in the use of social media among Americans. What’s less clear is whether these changes are permanent, and whether they will affect business at Facebook, Twitter and other social media companies.
Facebook has said that it hasn’t noticed a meaningful decline in usage since the scandal broke and it doesn’t seem to have experienced much of an advertiser exodus, either. But that doesn’t mean the social media giant is in the clear. Some high-profile tech luminaries such as Elon Musk and Steve Wozniak have disavowed Facebook, and a “DeleteFacebook” online campaign — even if it didn’t lead to mass defections — has bruised the company’s already-battered image.
Cole Bearden, 26, a musician and liquor store employee in Nashville, said he soured on Facebook a while ago, after his parents friended him and turned his app into “a perpetual recipe video-sharing machine.” That, along with his concerns about surveillance and advertisements, convinced him to drop the app from his phone a year ago. He said in an interview last month that he checks his profile only occasionally.
Still, Bearden says deleting his profile won’t mean a lot unless many other Facebook users do the same. And even that, he says, may come too late.
“The real damage has been done. Our concept of open democracy has been undermined, subverted and potentially irreparably damaged,” he said.
Some people, though, were cautious long before Cambridge Analytica. Jessica Garcia, who lives in Homewood, Illinois, said she was already “pretty strict” with all her settings and she uses social media (Facebook, mostly) only minimally. She doesn’t post much and stays out of politics.
Asked who bears the responsibility to protect people’s online privacy, the poll found that vast majorities of Americans think both social media companies (84 percent) and individual users (72 percent) bear a large share of it. Just short of half — 46 percent — see that as a large responsibility of the federal government.
Garcia agrees with the majority and said it’s a combination of individual and company responsibility.
“I don’t feel like the government needs to step in and start controlling that,” she said. “If we can’t make good decisions as people and they don’t make good decisions as companies, it’ll fall apart on its own.”
Americans who have taken some action after hearing about Facebook’s recent privacy crisis include 29 percent who have deleted certain social media accounts — the most drastic step. A larger number, 38 percent, uninstalled apps on their phone, while 42 percent said they used certain platforms less often. Nearly half, 47 percent, unfollowed or unfriended certain people, and 41 percent unfollowed groups or organizations.
Forty-five percent reviewed or changed their privacy settings — something Facebook encouraged recently by sending a notice to users through their Facebook pages. First, it notified the 87 million people whose information may have been leaked to Cambridge Analytica. This week, it began sending all 2.2 billion Facebook users a more generic notice to review their settings that show what apps have access to their data.
According to the poll, women were more likely than men to have made at least one change, and younger people were more likely to say they have reviewed their privacy settings or uninstalled apps from their phones. Older Americans were more likely to say they have followed news of the scandal.
The Cambridge Analytica fiasco was not Facebook’s first privacy scandal, though it may have been its worst. The poll also found that Americans have broader concerns about how their data is used by companies like Facebook, Twitter and Google. Sixty percent said they were very or extremely concerned that such companies may not keep their personal information secure, and more than half said they were concerned that the companies might track their data even after they have tried to delete it.
African Americans were more likely to express concern about privacy than whites. For example, 72 percent of blacks and 57 percent of whites are worried about companies securing their personal information, while 62 percent of blacks and 44 percent of whites are concerned about companies tracking their location.
The AP-NORC poll of 1,140 adults was conducted April 11-16 using a sample drawn from NORC’s probability-based AmeriSpeak Panel, which is designed to be representative of the U.S. population. The margin of sampling error for all respondents is plus or minus 4 percentage points.
AP Polling Editor Emily Swanson in Washington and AP Business Writer Dee-Ann Durbin in Detroit contributed to this story.
SAN FRANCISCO – Among the most challenging issues for Facebook is its role as the policeman for the free expression of its 2 billion users.
Now the social network is opening up about its decision-making over which posts it decides to take down – and why. On Tuesday the company for the first time published the 27-page guidelines, called Community Standards, that it gives to its workforce of thousands of human censors. The set of guidelines encompasses dozens of topics including hate speech, violent imagery, misrepresentation, terrorist propaganda and disinformation. Facebook said it would offer users the opportunity to appeal Facebook’s decisions.
The move adds a new degree of transparency to a process that users, the public and advocates have criticized as arbitrary and opaque. The newly released guidelines offer suggestions on topics including how to determine the difference between humor, sarcasm and hate speech. They explain that images of female nipples are generally prohibited, but exceptions are made for images that promote breast-feeding or address breast cancer.
“We want people to know our standards, and we want to give people clarity,” Monika Bickert, Facebook’s head of global policy management, said in an interview. She added that she hoped publishing the guidelines would spark dialogue. “We are trying to strike the line between safety and giving people the ability to really express themselves.”
The company’s censors, called content moderators, have been chastised by civil rights groups for mistakenly removing posts by minorities who had shared stories of being the victims of racial slurs. Moderators have struggled to tell the difference between someone posting a slur as an attack and someone who was using the slur to tell the story of their own victimization.
In another instance, moderators removed an iconic Vietnam War photo of a child fleeing a napalm attack, claiming the girl’s nudity violated its policies. (The photo was restored after protests from news organizations.) Moderators have deleted posts from activists and journalists in Burma and in disputed areas such as the Palestinian territories and Kashmir and have told pro-Trump activists Diamond and Silk they were “unsafe to the community.”
The release of the guidelines is part of a wave of transparency that Facebook hopes will quell its many critics. It has also published political ads and streamlined its privacy controls after coming under fire for its lax approach to protecting consumer data.
The company is being investigated by the U.S. Federal Trade Commission over the misuse of data by a Trump-connected consultancy known as Cambridge Analytica, and Facebook chief executive Mark Zuckerberg recently testified before Congress about the issue. Bickert said discussions about sharing the guidelines started last fall and were not related to the Cambridge controversy.
The company’s content policies, which began in earnest in 2005, addressed nudity and Holocaust denial in the early years. They have ballooned from a single page in 2008 to 27 pages today.
As Facebook has come to reach nearly a third of the world’s population, Bickert’s team has expanded significantly and is expected to grow even more in the coming year. A team of 7,500 reviewers, in places like Austin, Dublin and the Philippines, assesses posts 24 hours a day, seven days a week, in more than 40 languages. Moderators are sometimes temporary contract workers without much cultural familiarity with the content they are judging, and they make complex decisions in applying Facebook’s rules.
Bickert also employs high-level experts including a human rights lawyer, a rape counselor, a counterterrorism expert from West Point and a PhD researcher with expertise in European extremist organizations as part of her content review team.
Activists and users have been particularly frustrated by the absence of an appeals process when their posts are taken down. (Facebook users are allowed to appeal the shutdown of an entire account but not individual posts.) The Washington Post previously documented how people have likened this predicament to being put into “Facebook jail” – without being given a reason why they were locked up.
Malkia Cyril, a Black Lives Matter activist in Oakland, Calif., who is also the executive director for the Center for Media Justice, was among a coalition of more than 70 civil rights groups that pressured Facebook in 2017 to fix its “racially-biased” content moderation system. Among the changes the coalition sought was an appeals process for posts that are taken down.
“At the time they told us they could not do it, they would not do it, and actually stopped engaging at that point,” Cyril said. “They told us they would get back to us when they had something new to say.”
Cyril said that Facebook’s actions Tuesday, while well-intentioned, do not go far enough in terms of addressing the white supremacist groups allowed on the platform.
“This is just a drop in the bucket,” she said. “What’s needed now is an independent audit to ensure that the basic civil rights of users are protected, especially vulnerable users being targeted on the street by hate that’s being fomented online.”
Zahra Billoo, executive director of the Council on American-Islamic Relations’ office for the San Francisco Bay area, said adding an appeals process and opening up guidelines would be a “positive development” but said the social network still has a ways to go if it wants to stay a relevant and safe space.
Billoo said that at least a dozen pages representing white supremacists are still up on the platform, even though the policies forbid hate speech and Zuckerberg testified before Congress this month that Facebook does not allow hate groups.
“An ongoing question many in the Muslim community have been asking is how to get Facebook to be better at protecting users from hate speech and not to be hijacked by white supremacists, right-wing activists, Republicans or the Russians as a means of organizing against Muslim, LGBT and undocumented individuals,” she said.
Billoo herself was censored by Facebook two weeks after Donald Trump’s election, when she posted an image of a handwritten letter mailed to a San Jose mosque and quoted from it: “He’s going to do to you Muslims what Hitler did to the Jews.”
Bickert’s team has been working for years to develop a software system that can classify the reasons a post was taken down so that users could receive clearer information – and so Facebook could track how many hate speech posts were put up in a given year, for example, or whether certain groups are having their posts taken down more frequently than others.
Currently, people who have their posts taken down receive a generic message that says that they have violated Facebook’s community standards. After Tuesday’s announcement, people will be told whether their posts violated guidelines on nudity, hate speech and graphic violence. A Facebook executive said the teams were working on building more tools. “We do want to provide more details and information for why content has been removed,” said Ellen Silver, Facebook’s vice president of operations. “We have more work to do there, and we are committed to making those improvements.”
Though Facebook’s content moderation is still very much driven by humans, the company does use technology to assist in its work. The company currently uses software to identify duplicate reports, a timesaving technique for reviewers that helps them avoid reviewing the same piece of content over and over because it was flagged by many people at once. Software also can identify the language of a post and some of its themes, helping the post get to the reviewer with the most expertise.
The company’s systems can recognize images that have been posted before but cannot recognize new ones. For example, if a terrorist organization reposts a beheading video that Facebook already took down, Facebook’s systems will notice it almost immediately, said Silver, but they cannot identify new beheading videos. The majority of items flagged by the community get reviewed within 24 hours, she said.
Every two weeks, employees and the senior executives who make decisions about the most challenging issues around the world meet to debate the pros and cons of potential policies. Teams who present are required to come up with research showing each side, a list of possible solutions, and a recommendation, and they must list the organizations outside Facebook with which they consulted.