The first 3D camera built on Google VR180 tech arrives

Timed with the release of the Mirage Solo headset, Lenovo is also releasing the first camera based on Google’s VR180 tech. The Lenovo Mirage Camera, which seems geared heavily toward YouTube creators, packs a pair of 13-megapixel fisheye lenses positioned roughly eye-distance apart, allowing for high-quality stereoscopic 3D capture that’s well suited to VR viewing. At $299, the camera isn’t too expensive for creators looking to experiment, but it’s worth asking how big that niche really is. It starts shipping today.

The camera is based on YouTube’s VR180 platform, which aims to make capturing live-action VR content a little more palatable to creators. 360-degree cameras have certainly gotten a lot of attention, but for the most part creators haven’t figured out what to do with them. Google’s compromise here is to simplify the medium with a camera that captures half the field of view but isn’t too expensive and delivers crisp 3D 4K video.

In terms of build, the camera is very nice. It doesn’t feel especially high-end, but it’s solid enough and, most importantly, very pocketable. As with many 360 cameras, battery life isn’t great at around two hours, but the battery can be swapped out and the camera comes with a spare, which is a welcome touch. VR180 really does mean 180 degrees, something you come to appreciate in photos: let a finger creep toward the outer edge of the top of the camera and it will invade the 180-degree half-dome.

Users can use Google’s VR180 app to preview shots and live-stream footage from the camera.

It could all be a winning solution, but the real question is whether this product is arriving a little late. Plenty of YouTube creators have already experimented with VR-focused video and run into its frustrations. The number of headsets in use is growing, but it’s still not enough for VR viewers alone to sustain a channel, and while VR180 videos can be watched in “magic window” mode without a headset on mobile and desktop, that view loses the 3D effect, which is arguably the format’s biggest draw.


Twitch solidifies its lead with viewership up 21% in Q1, while YouTube Gaming drops

Twitch further solidified its lead in the game streaming market in the first quarter of the year, with gains in both average concurrent viewership and peak concurrent viewership, while the number two streaming site, YouTube Gaming, saw losses on both fronts. According to a new report from Streamlabs, which has visibility into the market thanks to its software platform used by hundreds of thousands of streamers, Twitch viewership was up by 21 percent in the quarter, growing from 788K average concurrent viewers in Q4 2017 to 953K in Q1.

Meanwhile, YouTube Gaming dropped 12 percent from 308K average concurrent viewers to 272K during that same time.

Other streaming services also saw gains, but their viewership numbers are much smaller.

Facebook, for example, grew viewership by 103 percent to reach 56K average concurrent viewers, Periscope grew 18 percent to 94K, and Microsoft’s Mixer grew 90 percent to 9.5K. (Microsoft’s real figures are likely much higher, however, because Streamlabs can’t track Mixer’s viewership on Xbox – which is most of it. Streamlabs is also missing some of Facebook Live’s viewership, as it can’t track private live streams only shared with friends.)
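
For anyone who wants to see where these growth percentages come from, here is a minimal sketch in Python (plain percent-change arithmetic applied to the Q4 2017 and Q1 2018 averages quoted above, not Streamlabs’ own methodology):

    # Illustrative arithmetic only: checking the quarter-over-quarter changes
    # implied by the average-concurrent-viewer figures quoted above.

    def percent_change(previous: float, current: float) -> float:
        """Percent change from `previous` to `current`."""
        return (current - previous) / previous * 100

    # Average concurrent viewers, Q4 2017 -> Q1 2018 (in thousands)
    print(f"Twitch: {percent_change(788, 953):+.1f}%")          # roughly +20.9%, reported as 21%
    print(f"YouTube Gaming: {percent_change(308, 272):+.1f}%")  # roughly -11.7%, reported as a 12% drop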

It’s no surprise that Twitch had a killer quarter, however.

The company announced in February it saw a record-breaking 388,000 concurrent viewers tune into a stream by Dr. Disrespect. This milestone was then blown out of the water the following month when Ninja played Fortnite with Drake and Travis Scott, reaching 628,000 concurrent viewers.

But even without these special events, Twitch has been growing.

It also saw a 33 percent increase in average concurrent streamers in Q1, going from 27K to 36K. Mixer and Periscope gained as well, up 282 percent and 126 percent, respectively. But YouTube Gaming dropped by 13 percent on this metric, going from 8.7K average concurrent streamers in Q4 2017 to 6.1K in Q1.

As Twitch grew, streamers made more money, too, Streamlabs found.

It claims to have seen the biggest quarter ever in Streamlabs tipping volume, rising 33 percent to $34.7 million, up from $26.2 million in the prior quarter. (Keep in mind this is tipping that takes place through Streamlabs software – the total tipping volume across platforms will be even higher.)

The company chalks up these gains to a variety of factors, including streamers’ more professional-quality videos, streams from games with huge audiences like Fortnite, growth of non-game streams, and more.

Streamlabs’ full report also delves into its own gains in terms of traction, as well as the breakdown of the quarter’s most popular games.


YouTube releases its first report about how it handles flagged videos and policy violations

YouTube has released its first quarterly Community Guidelines Enforcement Report and launched a Reporting Dashboard that lets users see the status of videos they’ve flagged for review. The inaugural report, which covers the last quarter of 2017, follows up on a promise YouTube made in December to give users more transparency into how it handles abuse and decides what videos will be removed.

“This regular update will help show the progress we’re making in removing violative content from our platform,” the company said in a post on its official blog. “By the end of the year, we plan to refine our reporting systems and add additional data, including data on comments, speed of removal and policy removal reasons.”

But the report is unlikely to quell complaints from people who believe YouTube’s rules are haphazardly applied in an effort to appease advertisers upset their commercials had played before videos with violent extremist content. The issue came to the forefront last year after a report by The Times, but many content creators say YouTube’s updated policies have made it very difficult to monetize on the platform, even though their videos don’t violate its rules.

YouTube, however, claims that its anti-abuse machine learning algorithm, which it relies on to monitor and handle potential violations at scale, is “paying off across high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam).”

Its report says that YouTube removed 8.2 million videos during the last quarter of 2017, most of which were spam or contained adult content. Of that number, 6.7 million were automatically flagged by its anti-abuse algorithms first.

Of the videos reported by a person, 1.1 million were flagged by a member of YouTube’s Trusted Flagger program, which includes individuals, government agencies and NGOs that have received training from the platform’s Trust & Safety and Public Policy teams.

YouTube’s report treats the number of views a video received before being removed as a benchmark for the success of its anti-abuse measures. At the beginning of 2017, 8% of videos removed for violent extremist content were taken down before clocking 10 views. After YouTube started using its machine-learning algorithms in June 2017, however, it says that percentage increased to more than 50% (in a footnote, YouTube clarifies that this figure does not include videos that were automatically flagged before they could be published and therefore received no views). From October to December, 75.9% of all automatically flagged videos on the platform were removed before they received any views.

During that same period, 9.3 million videos were flagged by people, with nearly 95% coming from YouTube users and the rest from its Trusted Flagger program and government agencies or NGOs. People can select a reason when they flag a video. Most were flagged for sexual content (30.1%) or spam (26.4%).

Last year, YouTube said it wanted to increase the number of people “working to address violative content” to 10,000 across Google by the end of 2018. Now it says it has almost reached that goal, and that it has also hired more full-time anti-abuse experts and expanded its regional teams. It also claims that the addition of machine-learning algorithms enables more people to review videos.

In its report, YouTube gave more information about how those algorithms work.

“With respect to the automated systems that detect extremist content, our teams have manually reviewed over two million videos to provide large volumes of training examples, which improve the machine learning flagging technology,” it said, adding that it has started applying that technology to other content violations as well.

YouTube’s report may not ameliorate the concerns of content creators who saw their revenue drop during what they refer to as the “Adpocalypse” or help them figure out how to monetize successfully again. On the other hand, it is a victory for people, including free speech activists, who have called for social media platforms to be more transparent about how they handle flagged content and policy violations, and it may put more pressure on Facebook and Twitter.

