The first 3D camera built on Google VR180 tech arrives

Timed with the release of the Mirage Solo headset, Lenovo is also releasing the first camera based on Google’s VR180 tech. The Lenovo Mirage Camera, which seems geared heavily toward YouTube creators, packs a pair of 13MP fisheye lenses positioned about eye-distance apart, allowing for high-quality 3D capture that’s well suited to VR viewing. At $299, the camera isn’t too expensive for creators looking to experiment, but it’s worth asking how big that niche really is. It starts shipping today.

The camera is based on YouTube’s VR180 platform, which aims to make capturing live-action VR content a little more palatable to creators. 360-degree cameras have certainly gotten a lot of attention, but for the most part creators haven’t figured out what to do with them. Google’s compromise is to simplify the medium with a camera that captures half the field of view but isn’t too expensive and delivers crisp 3D 4K video.

In terms of build, the camera is very nice. It doesn’t feel overtly high end, but it’s solid enough and, most importantly, very pocketable. As with many 360 cameras, battery life isn’t awesome at two hours, but the battery can be swapped out and the camera comes with a spare, which is a welcome touch. VR180 really does mean 180 degrees, which you’ll quickly appreciate in photos: if your finger creeps to the outer edge of the top of the camera, it will invade the 180-degree half-dome.

Users can utilize Google’s VR180 app to preview shots and live-stream footage from the camera.

It could all be a winning solution, but the real question is whether this product is arriving a little late. Plenty of YouTube creators have already experimented with VR-focused video and run into its frustrations. The number of headsets out there is still growing, but it’s not yet enough for VR viewers alone to sustain a channel, and while VR180 videos can also be watched in “magic window” mode on mobile and desktop without a headset, doing so loses the 3D effect, which is the format’s biggest draw.


Twitch solidifies its lead with viewership up 21% in Q1, while YouTube Gaming drops

Twitch further solidified its lead in the game streaming market in the first quarter of the year, with gains in both average concurrent viewership and peak concurrent viewership, while the number two streaming site, YouTube Gaming, saw losses on both fronts. According to a new report from Streamlabs, which has visibility into the market thanks to its software platform used by hundreds of thousands of streamers, Twitch viewership was up by 21 percent in the quarter, growing from 788K average concurrent viewers in Q4 2017 to 953K in Q1.

Meanwhile, YouTube Gaming dropped 12 percent from 308K average concurrent viewers to 272K during that same time.

Other streaming services also saw gains, but their viewership numbers are much smaller.

Facebook, for example, grew viewership by 103 percent to reach 56K average concurrent viewers, Periscope grew 18 percent to 94K, and Microsoft’s Mixer grew 90 percent to 9.5K. (Microsoft’s real figures are likely much higher, however, because Streamlabs can’t track Mixer’s viewership on Xbox – which is most of it. Streamlabs is also missing some of Facebook Live’s viewership, as it can’t track private live streams only shared with friends.)

It’s no surprise that Twitch had a killer quarter, however.

The company announced in February it saw a record-breaking 388,000 concurrent viewers tune into a stream by Dr. Disrespect. This milestone was then blown out of the water the following month when Ninja played Fortnite with Drake and Travis Scott, reaching 628,000 concurrent viewers.

But even without these special events, Twitch has been growing.

It also saw a 33 percent increase in average concurrent streamers in Q1, going from 27K to 36K. Mixer and Periscope gained as well, up 282 percent and 126 percent, respectively. But YouTube Gaming dropped by 13 percent on this metric, going from 8.7K average concurrent streamers in Q4 2017 to 6.1K in Q1.

As Twitch grew, streamers made more money, too, Streamlabs found.

It claims to have seen the biggest quarter ever in Streamlabs tipping volume, rising 33 percent to $34.7 million, up from $26.2 million in the prior quarter. (Keep in mind this is tipping that takes place through Streamlabs software – the total tipping volume across platforms will be even higher.)

The company chalks up these gains to a variety of factors, including streamers’ more professional-quality videos, streams from games with huge audiences like Fortnite, growth of non-game streams, and more.

Streamlabs’ full report also delves into its own gains in terms of traction, as well as a breakdown of the quarter’s most popular games.


YouTube releases its first report about how it handles flagged videos and policy violations

YouTube has released its first quarterly Community Guidelines Enforcement Report and launched a Reporting Dashboard that lets users see the status of videos they’ve flagged for review. The inaugural report, which covers the last quarter of 2017, follows up on a promise YouTube made in December to give users more transparency into how it handles abuse and decides what videos will be removed.

“This regular update will help show the progress we’re making in removing violative content from our platform,” the company said in a post on its official blog. “By the end of the year, we plan to refine our reporting systems and add additional data, including data on comments, speed of removal and policy removal reasons.”

But the report is unlikely to quell complaints from people who believe YouTube’s rules are haphazardly applied in an effort to appease advertisers upset that their commercials had played before videos with violent extremist content. The issue came to the forefront last year after a report by The Times; since then, many content creators have said YouTube’s updated policies make it very difficult to monetize on the platform, even though their videos don’t violate its rules.

YouTube, however, claims that its anti-abuse machine learning algorithm, which it relies on to monitor and handle potential violations at scale, is “paying off across high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam).”

Its report says that YouTube removed 8.2 million videos during the last quarter of 2017, most of which were spam or contained adult content. Of that number, 6.7 million were automatically flagged by its anti-abuse algorithms first.

Of the videos reported by a person, 1.1 million were flagged by a member of YouTube’s Trusted Flagger program, which includes individuals, government agencies and NGOs that have received training from the platform’s Trust & Safety and Public Policy teams.

YouTube’s report treats the number of views a video received before being removed as a benchmark for the success of its anti-abuse measures. At the beginning of 2017, 8% of videos removed for violent extremist content were taken down before clocking 10 views. After YouTube started using its machine-learning algorithms in June 2017, however, it says that percentage increased to more than 50% (in a footnote, YouTube clarifies that this figure does not include videos that were automatically flagged before they could be published and therefore received no views). From October to December, 75.9% of all automatically flagged videos on the platform were removed before they received any views.

During that same period, 9.3 million videos were flagged by people, with nearly 95% coming from YouTube users and the rest from its Trusted Flagger program and government agencies or NGOs. People can select a reason when they flag a video. Most were flagged for sexual content (30.1%) or spam (26.4%).

Last year, YouTube said it wanted to increase the number of people “working to address violative content” to 10,000 across Google by the end of 2018. Now it says it has almost reached that goal, hired more full-time anti-abuse experts and expanded its regional teams. It also claims that the addition of machine-learning algorithms enables more people to review videos.

In its report, YouTube gave more information about how those algorithms work.

“With respect to the automated systems that detect extremist content, our teams have manually reviewed over two million videos to provide large volumes of training examples, which improve the machine learning flagging technology,” it said, adding that it has started applying that technology to other content violations as well.

YouTube’s report may not ameliorate the concerns of content creators who saw their revenue drop during what they refer to as the “Adpocalypse,” or help them figure out how to monetize successfully again. On the other hand, it is a victory for people, including free speech activists, who have called for social media platforms to be more transparent about how they handle flagged content and policy violations, and it may put more pressure on Facebook and Twitter.


RapChat raises $1.6 million to help you make and share your def jams

The first thing to understand about media sharing app RapChat is that co-founder Seth Miller is not a rapper and his other co-founder, Pat Gibson, is. Together they created RapChat, a service for making and sharing raps, and the conjunction of rapper and nerd seems to be really taking off.

Since we last looked at the app in 2016, a lot has changed. The team has raised $1.6 million in funding from investors out of Oakland and the Midwest. Their app, which is sort of a musical.ly for rap, is a top-50 music app on iOS and Android and has racked up 100 million listens since launch. In short, their little social network/sharing platform is a “millionaire in the making, boss of [its] team, bringin home the bacon.”

The pair’s rap bona fides are genuine. Gibson has opened for or performed with Big Sean, Wiz Khalifa and Machine Gun Kelly, and he’s sold beats to MTV. “My music has garnered over 20M+ plays across YouTube, SoundCloud and more,” he wrote me, boasting in the semi-churlish manner of a rapper with a “beef.” Miller, on the other hand, likes to freestyle.

“I grew up loving to freestyle with friends at OU and I noticed lots of other millennials did this too (even if most suck lol) … at any party at 3am – there would always be a group of people in the corner freestyling,” he said. “At the same time Snapchat was blowing up on campus and just thought you should be able to do the same exact thing for rap.”

Gibson, on the other hand, saw it as a serious tool to help him with his music.

“I spent a lot of time, energy and resources making music,” he said. “I was producing the beats, writing the songs, recording/mixing the vocals, mastering the project, then distributing & promoting the music all by myself. With Rapchat, there’s a library of 1,000+ beats from top producers, an instant recording studio in your pocket, and the network to distribute your music worldwide and be discovered… all from a free app. Rapchat is disrupting the creation, collaboration, distribution, & discovery of music via mobile.”

“We have a much bigger but also more active community than any other music creation app,” said Miller.

While it’s clear the world needs another sharing platform like it needs a hole in the head, thanks to a rabid fanbase and a great idea, the team has ensured that RapChat is not, as they say, wicka-wicka-whack. That, in the end, is all that matters.


YouTube ads for hundreds of brands still running on extremist and white nationalist channels

It’s been more than a year since YouTube promised to improve controls over what content advertisers would find their ads in front of; eight months since it promised to demonetize “hateful” videos; two months since it said it would downgrade offensive channels; and yet CNN reports that ads from hundreds of major brands are still appearing as pre-rolls for actual Nazis.

The ongoing failure to police billions of hours of content isn’t exactly baffling — this is a difficult problem to solve — but it is disappointing that YouTube seems to have repeatedly erred on the side of monetization.

As with previous reports, CNN’s article shows that ads were running on channels that, if YouTube’s content rules are to be believed, should have been demonetized and demoted instantly: Nazis, pedophiles, extremists of the right, left, and everywhere in between. Maybe even Logan Paul.

And the system appears to be working in strange ways: one screenshot shows a video by a self-avowed Nazi, entitled “David Duke on Harvey Weinstein exposing Jewish domination. Black/White genetic differences.” Below it a YouTube warning states that “certain features have been disabled for this video,” including comments and sharing, because of “content that may be inappropriate or offensive to some audiences.”

A cheerful ad from Nissan is running ahead of this enlightening piece of media, and CNN notes that ads also ran on it coming from the Friends of Zion Museum and the Jewish National Fund! Ads from the Toy Association ran on the channel of a guy who argued for the decriminalization of pedophilia!

I can’t really add anything to this. It’s so absurd I can barely believe it myself. Remember, this is after the company supposedly spent a year (at the very least) working to prevent this exact thing from happening. I left the headline in the present tense because I’m so certain that it’s still going on.

The responsibility really is YouTube’s, and if it can’t live up to its own promises, companies are going to leave it behind rather than face viral videos of their logo smoothly fading into a swastika on the wall of some sad basement-dwelling bigot. “Subway — eat fresh! And now, some guy’s thoughts on genocide.”

Some of the other brands that had ads run against offensive content: Amazon, Adidas, Cisco, Hilton, Hershey, LinkedIn, Mozilla, Netflix, Nordstrom, The Washington Post, The New York Times, 20th Century Fox Film, Under Armour, the Centers for Disease Control, the Department of Transportation, Customs and Border Protection, Veterans Affairs and the US Coast Guard Academy.

I’ve asked YouTube for comment on how this happened — or rather, how it never stopped happening.

