Roland announces software versions of its 808 and 909 drum machines

The Roland TR-808 and TR-909 are iconic drum machines that powered much of the music of the ’80s and ’90s. While both hardware units were recently revived as the TR-08 and TR-09, they had never been officially emulated in software. That changes now, as Roland announces VST and AU plugins for both of the iconic rhythm modules (along with a new SRX Orchestra virtual instrument) as part of the company’s Roland Cloud service. The TR-808 and TR-909 virtual instruments are full reproductions of the original hardware, according to Roland. The SRX Orchestra is the first title from the SRX Expansion Library (from the 2000s) to become available as a software instrument. Roland Cloud is a suite of high-resolution software synths and sampled instruments that musicians can pull from while creating their own work. It sounds similar to what Adobe has done with its photo- and graphics-focused Creative Cloud. All three additions arrive as updates to the Roland Cloud service starting in February of this year. Via: Fact Mag | Source: Roland


Adobe’s ‘Cloak’ experiment is a content-aware eraser for video

Glamorous show-reels from shows like Game of Thrones get all the fame, but a lot of VFX work is mundane stuff like removing cars, power lines and people from shots. Adobe’s research team is working on making all of that easier for anyone, regardless of budget, thanks to a project called “Cloak.” It’s much the same as “content-aware fill” in Photoshop, letting you select and then delete unwanted elements, with the software intelligently filling in the background. Cloak does the same thing to moving video, though, which is a significantly bigger challenge. Engadget got an early look at the tech, including a video demonstration and a chance to talk with Adobe research engineer Geoffrey Oxholm and Victoria Nece, product manager for video graphics and VFX. At the moment, the technology is in the experimental stages, with no set plans to implement it. However, Adobe likes to give the public “Sneaks” at some of its projects as a way to generate interest and market features internally to teams. An example is last year’s slightly alarming “VoCo” tech that lets you Photoshop voiceovers or podcasts. That has yet to make it into a product, but one that did is “Smartpic,” which eventually became part of Adobe’s Experience Manager. The “Cloak” tech wouldn’t just benefit Hollywood; it could be useful to every video producer. You could make a freeway look empty by removing all the cars, cut out people to get a pristine nature shot, or delete, say, your drunk uncle from a wedding shot. Another fun example: when I worked as a compositor in another life, I had to replace the potato salad in a shot with macaroni, which was a highly tedious process. Object removal will also be indispensable for VR, AR and other new video tech. “With 360-degree video, the removal of objects, the crew and the camera rig becomes virtually mandatory,” Nece told Engadget.
Content-aware fill on photos is no easy task in the first place, because the computer has to figure out what was behind the deleted object based on the pixels around it. Video increases the degree of difficulty, because you have to track any moving objects you want to erase. On top of that, the fill has to look the same from frame to frame or it will be a glitchy mess. “It’s a fascinating problem,” Oxholm said. “Everything is moving, so even if you nail one frame, you have to be consistent.” Luckily, video does have one advantage over photos. “The saving grace is that we can see behind the thing we want to remove,” says Oxholm. “If you’ve got a microphone to remove, you can see behind the microphone.” In other words, if you’re doing a shot of a church with a pole in the way, there’s a good chance you have a different angle with a clean view of the church. Another thing making content-aware fill for video much more feasible now is that motion-tracking technology has become so good. “We can do really dense tracking, using parts of the scene as they become visible,” said Oxholm. “That gives you something you can use to fill in.” The results so far, as shown in the video above, are quite promising. The system was able to erase cars from a freeway interchange, did a decent job of deleting a pole in front of a cathedral and even erased a hiking couple from a cave scene. The shots were done automatically in “one quick process,” Oxholm said, after a mask was first drawn around the object to be removed, much as you do in Photoshop. It’s not totally perfect, however. Shadow traces are visible on the cave floor, and the cathedral is blurred in spots where the pole used to be. Even at this early stage, though, the tool could do much of the grunt work, making it easier for a human to do the final touch-ups.
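That “see behind the thing” idea can be sketched in a few lines of Python. This is a toy illustration of temporal fill, not Adobe’s algorithm: it assumes a static camera and precomputed occlusion masks, whereas Cloak relies on dense motion tracking to align frames first.

```python
def fill_from_other_frames(frames, masks):
    """Toy temporal fill: for each occluded pixel, borrow the value from
    another frame where that pixel is visible. Assumes a static camera and
    precomputed masks; real tools use dense motion tracking to align frames."""
    filled = [[row[:] for row in frame] for frame in frames]
    for t, mask in enumerate(masks):
        for y, mask_row in enumerate(mask):
            for x, occluded in enumerate(mask_row):
                if not occluded:
                    continue
                # Search the other frames for a clean view of pixel (x, y).
                for t2, other_mask in enumerate(masks):
                    if t2 != t and not other_mask[y][x]:
                        filled[t][y][x] = frames[t2][y][x]
                        break
    return filled

# Two 2x2 grayscale frames; an object occludes pixel (0, 0) in frame 0 only.
frames = [[[9, 1], [2, 3]],
          [[5, 1], [2, 3]]]
masks = [[[True, False], [False, False]],
         [[False, False], [False, False]]]
print(fill_from_other_frames(frames, masks)[0][0][0])  # 5, borrowed from frame 1
```

The hard parts Oxholm describes are exactly what this sketch leaves out: tracking the mask as objects move, and keeping the fill consistent from frame to frame so it doesn’t shimmer.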
I’d love to see Adobe release it in preview as soon as possible, even if it’s not perfect, as it looks like it could be a major time saver — I sure could’ve used it for that macaroni.


Facebook’s new 360 cameras bring multiple perspectives to live videos

Last year, Facebook announced the Surround 360, a 360-degree camera that can capture footage in 3D and then render it online via specially designed software. But it wasn’t for sale. Instead, the company used it as a reference design for others to create 3D 360 content, even going so far as to open-source it on GitHub later that summer. As good as the camera was, though, it still didn’t deliver the full VR experience. That’s why Facebook is introducing two more 360-degree cameras at this year’s F8: the x24 and x6. The difference: these cameras can shoot in six degrees of freedom, which promises to make the 360 footage more immersive than before. The x24 is so named because it has 24 cameras; the x6, meanwhile, has (you guessed it) six cameras. While the x24 looks like a giant beach ball with many eyes, the x6 is shaped more like a tennis ball, which makes for a less intimidating look. Both are designed for professional content creators, but the x6 is obviously meant to be a smaller, lighter and cheaper version. Both the x24 and the x6 are part of the Surround 360 family. And, as with version one (which is now called the Surround 360 Open Edition), Facebook doesn’t plan on selling the cameras themselves. Instead, Facebook plans to license the x24 and x6 designs to a “select group of commercial partners.” Still, the versions you see in the images here were prototyped in Facebook’s on-site hardware lab (cunningly called Area 404) using off-the-shelf components. The x24 was made in partnership with FLIR, a company mostly known for its thermal-imaging cameras, while the x6 prototype was made entirely in-house. But before we get into all of that, let’s talk a little bit about what sets these cameras apart from normal 360 ones. With a traditional fixed camera, you see the world through its fixed lens.
So if you’re viewing this content (also known as stereoscopic 360) in a VR headset and you decide to move around, the world stays still as you move, which is not what it would look like in the real world. This makes the experience pretty uncomfortable and takes you out of the scene. It becomes less immersive. With content that’s shot with six degrees of freedom, however, this is no longer an issue. You can move your head to a position where the camera never was and still view the world as if you were actually there. Move your head from side to side, forwards and backwards, and the camera is smart enough to reconstruct what the view looks like from different angles. All of this is due to some special software that Facebook has created, along with the carefully designed pattern of the cameras. According to Brian Cabral, Facebook’s engineering director, it’s an “optimal pattern” for getting as much information as possible. I had the opportunity to look at a couple of different videos shot with the x24 at Facebook’s headquarters (using the Oculus Rift, of course). One was of a scene shot in the California Academy of Sciences, specifically at the underwater tunnel in the Steinhart Aquarium. I was surprised to see that the view of the camera would follow my own as I tilted my head from left to right and even when I crouched down on the floor. I could even step to the side and look “through” where the camera was, as if it wasn’t there at all. If the video had been shot with a traditional 360 camera, I likely would have seen the camera tripod when I looked down. But with the x24, I just saw the floor, as if I were a disembodied ghost floating around. Another wonderful thing about videos shot with six degrees of freedom is that each pixel has depth. Each pixel is literally in 3D. This is a breakthrough for VR content creators, and it opens up a world of possibilities in visual-effects editing.
This means that you can add 3D effects to live-action footage, a feat that usually would have required a green screen. I saw this demonstrated in the other video, which was of a scene shot on the roof of one of Facebook’s buildings. Facebook, along with Otoy, a Los Angeles-based cloud-rendering company, was able to add effects to the scene. Examples include floating butterflies, which wafted around when I swiped at them with a Touch controller. They also did a visual trick where I could step “outside” of the scene and encapsulate the entire video in a snow globe. All of this is possible because of the layers of depth that the footage provides. That’s not to say there weren’t bugs. The video footage I saw had shimmering around the edges, which Cabral said is basically a flaw in the software that they’re working to fix. Plus, the camera is unable to see what’s behind people, so there’s a tiny bit of streaking along the edges. Still, there’s lots of potential with this kind of content. “This is a new kind of media in video and immersive experiences,” said Eric Cheng, Facebook’s head of Immersive Media, who was previously the director of photography at Lytro. “Six degrees of freedom has traditionally been done in gaming and VR, but not in live action.” Cheng says that many content creators have told him that they’ve been waiting for a way to bridge live action into these “volumetric editing experiences.” Indeed, that’s partly why Facebook is partnering with post-production companies like Adobe, Foundry and Otoy to develop an editing workflow for these cameras. “Think of these cameras as content acquisition tools for content creators,” said Cheng. But what about other cameras, like Lytro’s Immerge, for example? “There’s a large continuum of these things,” said Cabral.
“Lytro sits at the very, very high end.” It’s also not nearly as portable as the x24 and x6, which are both designed for a much more flexible and nimble approach to VR capture. As for when cameras like these will make their way down to the consumer level, Facebook says that will come in future generations. “That’s the long arc of where we’re going with this,” said CTO Mike Schroepfer. “Our goal is simple: We want more people producing awesome, immersive 360 and 3D content,” said Schroepfer. “We want to bring people up the immersion curve. We want to be developing the gold standard and say this is where we’re shooting for.”
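The “each pixel has depth” point is what makes head movement work: with a depth value, every pixel can be unprojected to a 3D point, shifted by the viewer’s head offset, and projected back. The pinhole-camera sketch below illustrates the geometry only; the camera intrinsics and numbers are made up for illustration, and this is not Facebook’s reconstruction pipeline.

```python
def unproject(x, y, depth, f, cx, cy):
    """Pixel (x, y) with known depth -> 3D point in camera space (pinhole model)."""
    X = (x - cx) * depth / f
    Y = (y - cy) * depth / f
    return (X, Y, depth)

def project(X, Y, Z, f, cx, cy):
    """3D point -> pixel coordinates for a camera at the origin."""
    return (f * X / Z + cx, f * Y / Z + cy)

def reproject(x, y, depth, head_offset, f=500.0, cx=320.0, cy=240.0):
    """Where does this pixel land after the viewer moves by head_offset (metres)?"""
    X, Y, Z = unproject(x, y, depth, f, cx, cy)
    dx, dy, dz = head_offset
    return project(X - dx, Y - dy, Z - dz, f, cx, cy)

# A pixel at the image centre, 2 m away; the viewer steps 0.1 m to the right.
print(reproject(320.0, 240.0, 2.0, (0.1, 0.0, 0.0)))  # (295.0, 240.0)
```

The scene point slides the opposite way across the image as you move, which is exactly the parallax a fixed stereoscopic-360 rig cannot provide. The hard part, as the shimmering edges show, is estimating that per-pixel depth and filling in surfaces the cameras never saw.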


Mozilla To Drop Support For All NPAPI Plugins In Firefox 52 Except Flash

The Netscape Plugins API is “an ancient plugins infrastructure inherited from the old Netscape browser on which Mozilla built Firefox,” according to Bleeping Computer. But now an anonymous reader writes: Starting March 7, when Mozilla is scheduled to release Firefox 52, all plugins built on the old NPAPI technology will stop working in Firefox, except for Flash, which Mozilla plans to support for a few more versions. This means technologies such as Java, Silverlight, and various audio and video codecs won’t work in Firefox. These plugins once helped the web move forward, but as time advanced, the Internet’s standards groups developed standalone Web APIs and alternative technologies that support most of these features without the need for special plugins. The old NPAPI plugins will continue to work in Firefox ESR (Extended Support Release) 52, but will eventually be deprecated in ESR 53. A series of hacks is available that will allow Firefox users to continue using old NPAPI plugins past Firefox 52, by switching the update channel from Firefox Stable to Firefox ESR. Read more of this story at Slashdot.


Chrome 55 Now Blocks Flash, Uses HTML5 By Default

An anonymous reader quotes Bleeping Computer: Chrome 55, released earlier this week, now blocks all Adobe Flash content by default, according to a plan set in motion by Google engineers earlier this year… While some of the initial implementation details of the “HTML5 By Default” plan have changed since then, Flash has been phased out in favor of HTML5 as the primary technology for playing multimedia content in Chrome. Google’s plan is to turn off Flash and use HTML5 for all sites. Where HTML5 isn’t supported, Chrome will prompt users and ask them if they want to run Flash to view multimedia content. The user’s choice will be remembered for subsequent visits, but there’s also an option in the browser’s settings, under Settings > Content Settings > Flash > Manage Exceptions, where users can add the websites on which they want to allow Flash to run by default. Exceptions will also be made automatically for your most frequently visited sites, which, for many users, will include YouTube. And Chrome will continue to ship with Flash, as well as an option to re-enable Flash on all sites. Read more of this story at Slashdot.


Adobe Photoshop adds Content-Aware Crop and font suggestions

Adobe usually announces significant updates to Creative Cloud every six months, and it’s delivering another right on schedule. While the changes are scattered across all of the apps in the company’s software subscription and its stock-photo service, we’ll focus primarily on Photoshop. For its popular photo-editing app, Adobe is adding Content-Aware Crop to the collection of smart design tools. Here, the software automatically fills in any gaps that are created when you either rotate an image or expand it beyond its original size. This new cropping option joins the handy Content-Aware Fill and other tools that make quick work of photo edits. Photoshop’s Liquify tool, a feature that’s used to tweak facial features, is getting an update as well. It’s now “Face-Aware,” which means it’ll keep the subject’s face in proportion while you make those subtle adjustments. The application also has a new font-recognition tool that will not only identify licensed fonts but also suggest similar options that are available on your computer or through Adobe’s TypeKit service. A notable change across all Creative Cloud apps is the ability to set permissions for design assets in CreativeSync. This means that when you’re working with a team, you can determine who sees what, rather than having all of the images, fonts and other files available to everyone inside the CC software. There are also new search filters so that you can narrow results to still photos, video, vectors and illustrations. Adobe Premiere Pro, the company’s video-editing app, continues to add VR-related tools. This time the software gets a “field of view” preview mode to check progress on that immersive content. In After Effects, you can now match an animated character’s speech and movement with a real-life actor thanks to the Character Animator Preview.
For Illustrator users, expect to easily export assets and artboards in multiple formats and resolutions with one click, rather than having to save separate files individually. All of the above updates are available now in Creative Cloud for subscribers, included in the cost of the software plan. Those prices are set at $10/month for the photography option (Lightroom and Photoshop only) and $50/month for the full suite of apps.


Apple will deactivate Flash by default on Safari 10

You know that Maya Angelou quote that says, “Never make someone a priority when all you are to them is an option”? If Flash were a person following that tenet, then it now has to drop Safari from its dwindling list of priorities. In a post on the WebKit blog, Apple engineer Ricky Mondello revealed that the company is deactivating Adobe Flash by default in Safari 10, the version of the browser shipping with macOS Sierra this fall. If you access a website that has both Flash and HTML5, the browser will opt for the latter. But if the page requires Flash to work, a prompt will pop up asking if you’d like to switch it on. You can choose to activate it just for that session or to keep it on for that URL forever. If you’ll recall, Microsoft and Google have been distancing themselves from Flash for quite some time as well. Edge only displays Flash if it’s a central element on the page you’re looking at (say, a game or a video), while Chrome began blocking Flash ads late last year. On the mobile side of things, Apple announced at WWDC that it’s requiring all iOS apps to connect to the internet via HTTPS by January 1st, 2017. That means developers have to switch on a feature Cupertino launched with iOS 9 called App Transport Security. ATS forces apps to use a secure connection to help keep your data safe. Via: MacRumors | Source: WebKit, TechCrunch
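For developers, ATS is controlled through the app’s Info.plist. A sketch of what a per-domain exception looks like (the domain here is a placeholder; apps that simply use HTTPS everywhere need no exceptions at all):

```xml
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSExceptionDomains</key>
    <dict>
        <!-- Placeholder domain: allow plain HTTP for one legacy host only. -->
        <key>legacy.example.com</key>
        <dict>
            <key>NSExceptionAllowsInsecureHTTPLoads</key>
            <true/>
        </dict>
    </dict>
</dict>
```

Scoped exceptions like this are the sanctioned escape hatch; the blanket NSAllowsArbitraryLoads opt-out is what Apple’s deadline is meant to phase out.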


Adobe Acrobat Reader Can Now Edit PDFs Directly From Your Dropbox On Android

Android: I hate dealing with PDFs. I understand why they’re necessary, but loading them is a pain and editing them is even worse. So, Adobe’s news that Acrobat Reader can edit PDFs stored in your Dropbox is a godsend. Read more…


After Twenty Years of Flash, Adobe Kills the Name

An anonymous reader writes: From January 2016, Adobe Flash will be renamed “Adobe Animate CC,” retiring one of the most unfortunate names in web security as the company pushes the product further and further toward HTML5 output. Adobe’s release about the update, which will form part of the annual Creative Cloud upgrade, states that a third of all material output from the program is now HTML5. The transitional HTML5 Adobe animation program, Edge Animate, will be replaced by the renamed Flash product. Read more of this story at Slashdot.


Study: Ad Blocker Use Jumps 41 Percent

Mickeycaskill writes: A report from Adobe and anti-ad-blocking startup PageFair says the number of ad-block users worldwide has increased by 41 percent in the past 12 months, to 198 million monthly active users. The study suggests the growing popularity of ad-blocking software is set to cost online publishers $21.8 billion in 2015 and could reach $41.4 billion by 2016. “About 45 million of them are in the United States, with almost 15 percent of people in states like New York and California relying on these services. The figures are even higher in Europe, where 77 million people use versions of the software. In Poland, more than a third of people regularly block online ads.” Read more of this story at Slashdot.
