Google Found Over 1,000 Bugs In 47 Open Source Projects

Orome1 writes: In the last five months, Google’s OSS-Fuzz program has unearthed over 1,000 bugs in 47 open source software projects… So far, OSS-Fuzz has found a total of 264 potential security vulnerabilities: 7 in Wireshark, 33 in LibreOffice, 8 in SQLite 3, 17 in FFmpeg — and the list goes on… Google launched the program in December and wants more open source projects to participate, so they’re offering cash rewards for including “fuzz” targets for testing in their software. “Eligible projects will receive $1,000 for initial integration, and up to $20,000 for ideal integration” — or twice that amount, if the proceeds are donated to a charity. Read more of this story at Slashdot.
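
For maintainers wondering what a “fuzz target” actually is: OSS-Fuzz accepts libFuzzer-style entry points, where the fuzzer repeatedly calls a single function with mutated byte buffers and watches for crashes, hangs and sanitizer reports. The sketch below shows the general shape of such a target; ParseDocument is a hypothetical stand-in for whatever parsing or decoding API a given project wants exercised.

```cpp
#include <cstddef>
#include <cstdint>
#include <string>

// Hypothetical function under test -- stands in for the project's own API
// (a file loader, a packet dissector, a codec, etc.).
bool ParseDocument(const std::string& input);

// libFuzzer-style fuzz target: the fuzzer calls this entry point over and
// over with mutated inputs; any crash or memory error is reported as a bug.
extern "C" int LLVMFuzzerTestOneInput(const uint8_t* data, size_t size) {
  if (data == nullptr) return 0;          // guard against the empty first call
  std::string input(reinterpret_cast<const char*>(data), size);
  ParseDocument(input);                   // exercise the code under test
  return 0;                               // non-zero values are reserved
}
```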

View article:
Google Found Over 1,000 Bugs In 47 Open Source Projects

‘Avatar’ sequels start arriving on December 18th, 2020

James Cameron has spent years drumming up hype for his Avatar sequels with little to show for it (the first sequel was originally due this December). However, his team is finally ready to commit to specific release dates — for all the new movies. The production team has revealed that Avatar 2 should arrive on December 18th, 2020, with the rest staggered over the following years. The third movie is slated for December 17th, 2021. There will be a three-year gap before the fourth movie, which debuts on December 20th, 2024, and the fifth and final (?) title will appear on December 19th, 2025, 16 years after the first.

Cameron and crew have started “concurrent” production of the sequels, which are poised to make cases both for high frame rate video and for Avatar’s signature blend of CG with real-world acting. In theory, this gives the team a better sense of the timing than it would have with a serial approach. With that said, you may still want to take these dates with a grain of salt. It’s not just that the releases have been pushed back in the past; it’s that the scope has changed over time. Cameron only added a fourth sequel to the mix in 2016, so it won’t be surprising if the schedule shifts due to further creative changes or unforeseen challenges. Really, the big news is simply that the director is getting the ball rolling after years of prep — the dates just give you a rough idea of what to expect. Via: Variety. Source: Avatar (Facebook)

See the article here:
‘Avatar’ sequels start arriving on December 18th, 2020

Facebook’s new 360 cameras bring multiple perspectives to live videos

Last year, Facebook announced the Surround 360, a 360-degree camera that can capture footage in 3D and then render it online via specially designed software. But it wasn’t for sale. Instead, the company used it as a reference design for others to create 3D 360 content, even going so far as to open source it on Github later that summer. As good as the camera was, though, it still didn’t deliver the full VR experience. That’s why Facebook is introducing two more 360-degree cameras at this year’s F8: the x24 and x6. The difference: these cameras can shoot in six degrees of freedom, which promises to make the 360 footage more immersive than before.

The x24 is so named because it has 24 cameras; the x6, meanwhile, has — you guessed it — six cameras. While the x24 looks like a giant beach ball with many eyes, the x6 is shaped more like a tennis ball, which makes for a less intimidating look. Both are designed for professional content creators, but the x6 is obviously meant to be a smaller, lighter and cheaper version. Both the x24 and the x6 are part of the Surround 360 family. And, as with version one (now called the Surround 360 Open Edition), Facebook doesn’t plan on selling the cameras themselves. Instead, it plans to license the x24 and x6 designs to a “select group of commercial partners.” Still, the versions you see in the images here were prototyped in Facebook’s on-site hardware lab (cunningly called Area 404) using off-the-shelf components. The x24 was made in partnership with FLIR, a company mostly known for its thermal imaging cameras, while the x6 prototype was made entirely in-house.

But before we get into all of that, let’s talk a little bit about what sets these cameras apart from normal 360 ones. With a traditional fixed camera, you see the world through its fixed lens. So if you’re viewing this content (also known as stereoscopic 360) in a VR headset and you decide to move around, the world stays still as you move, which is not what it would look like in the real world. This makes the experience pretty uncomfortable and takes you out of the scene. It becomes less immersive.

With content that’s shot with six degrees of freedom, however, this is no longer an issue. You can move your head to a position where the camera never was and still view the world as if you were actually there. Move your head from side to side, forwards and backwards, and the camera is smart enough to reconstruct what the view looks like from different angles. All of this is due to some special software that Facebook has created, along with the carefully designed pattern of the cameras. According to Brian Cabral, Facebook’s Engineering Director, it’s an “optimal pattern” to get as much information as possible.

I had the opportunity to look at a couple of different videos shot with the x24 at Facebook’s headquarters (using the Oculus Rift, of course). One was of a scene shot in the California Academy of Sciences, specifically at the underwater tunnel in the Steinhart Aquarium. I was surprised to see that the view of the camera would follow my own as I tilted my head from left to right and even when I crouched down on the floor. I could even step to the side and look “through” where the camera was, as if it wasn’t there at all. If the video had been shot with a traditional 360 camera, I would likely have seen the camera tripod when I looked down. But with the x24, I just saw the floor, as if I were a disembodied ghost floating around.
Another wonderful thing about videos shot with six degrees of freedom is that each pixel has depth. Each pixel is literally in 3D. This is a breakthrough for VR content creators, and it opens up a world of possibilities in visual effects editing. It means you can add 3D effects to live-action footage, a feat that would usually have required a green screen. I saw this demonstrated in the other video, which was of a scene shot on the roof of one of Facebook’s buildings. Facebook, along with Otoy, a Los Angeles-based cloud rendering company, was able to add effects directly to the scene. Examples include floating butterflies, which wafted around when I swiped at them with a Touch controller. They also did a visual trick where I could step “outside” of the scene and encapsulate the entire video in a snow globe. All of this is possible because of the layers of depth that the footage provides.

That’s not to say there weren’t bugs. The video footage I saw had shimmering around the edges, which Cabral said is basically a flaw in the software that they’re working to fix. Plus, the camera is unable to see what’s behind people, so there’s a tiny bit of streaking along the edges. Still, there’s lots of potential with this kind of content. “This is a new kind of media in video and immersive experiences,” said Eric Cheng, Facebook’s head of Immersive Media, who was previously the Director of Photography at Lytro. “Six degrees of freedom has traditionally been done in gaming and VR, but not in live action.” Cheng says that many content creators have told him they’ve been waiting for a way to bridge live action into these “volumetric editing experiences.” Indeed, that’s partly why Facebook is partnering with post-production companies like Adobe, Foundry and Otoy to develop an editing workflow for these cameras. “Think of these cameras as content acquisition tools for content creators,” said Cheng.

But what about other cameras, like Lytro’s Immerge, for example? “There’s a large continuum of these things,” said Cabral. “Lytro sits at the very, very high end.” It’s also not nearly as portable as the x24 and x6, which are both designed for a much more flexible and nimble approach to VR capture. As for when cameras like these will make their way down to the consumer level, Facebook says that will come in future generations. “That’s the long arc of where we’re going with this,” said CTO Mike Schroepfer. “Our goal is simple: We want more people producing awesome, immersive 360 and 3D content. We want to bring people up the immersion curve. We want to be developing the gold standard and say this is where we’re shooting for.”
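
To see why per-pixel depth makes this possible, consider that a depth map plus a camera model lets you turn every pixel into a 3D point, and a point cloud can be re-rendered from viewpoints the camera never occupied. The sketch below is a generic back-projection under a pinhole camera model — not Facebook’s actual reconstruction pipeline — and the intrinsics (fx, fy, cx, cy) and the BackProject helper are illustrative assumptions.

```cpp
#include <vector>

struct Point3 { float x, y, z; };

// Back-project a depth map into a 3D point cloud with a pinhole camera model.
// fx, fy are focal lengths in pixels; (cx, cy) is the principal point;
// depth[v * width + u] is the metric depth of pixel (u, v).
std::vector<Point3> BackProject(const std::vector<float>& depth,
                                int width, int height,
                                float fx, float fy, float cx, float cy) {
  std::vector<Point3> cloud;
  cloud.reserve(depth.size());
  for (int v = 0; v < height; ++v) {
    for (int u = 0; u < width; ++u) {
      float z = depth[v * width + u];
      if (z <= 0.0f) continue;             // skip pixels with no depth estimate
      cloud.push_back({(u - cx) * z / fx,  // X: horizontal offset scaled by depth
                       (v - cy) * z / fy,  // Y: vertical offset scaled by depth
                       z});                // Z: depth along the optical axis
    }
  }
  return cloud;
}
```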

See more here:
Facebook’s new 360 cameras bring multiple perspectives to live videos

Driverless pods begin ferrying the public around Greenwich

It’s been almost a year since the UK’s Transport Research Laboratory (TRL) opened sign-ups for a driverless pod trial in Greenwich. The original plan was to start before Christmas, but given today’s date that obviously didn’t happen. Still, better late than never, eh? Over the next three weeks, roughly 100 people will clamber aboard “Harry,” a self-driving shuttle named after clockmaker John Harrison. It will take them around a two-mile course in North Greenwich, near The O2, to demonstrate how the technology could be used for “last mile” trips in urban areas. The shuttle is a repurposed Ultra Pod, which is already in operation at London’s Heathrow Airport. With a maximum speed of 10MPH (16KPH), it’s not the fastest electric vehicle — you could beat it on a Boosted Board — however, it’s hoped the leisurely pace will reassure pedestrians and minimise dangerous incidents.

Each pod carries up to four people, including a safety operator who can apply the brakes in an emergency. It’s able to ‘see’ its surroundings using a mixture of cameras and lasers, and it uses that information to track obstacles and plot a collision-free route. Notably, it doesn’t need to rely on GPS for any of these calculations. The purpose of the trials is to see how the public reacts to self-driving vehicles, and to examine how the technology can best be applied in built-up areas. Each trip will give the research team a wealth of valuable information — four terabytes of data every eight hours, to be precise. It’ll be supplemented with passenger interviews, taken before and after each trip, and written feedback that anyone can submit online through an interactive map. “It is critical that the public is fully involved as these technologies become a reality,” said Professor Nick Reed, academy director at TRL.

The “GATEway Project” at Greenwich is one of many research initiatives being funded by the UK government. We’ve already seen the “Lutz Pathfinder” pod, which is being tested in Milton Keynes, and a modified Land Rover that’s serving as a research testbed in Bristol. Plans are also underway for a 41-mile “connected corridor,” which will be used to test LTE, local WiFi hotspots and other forms of connectivity in self-driving vehicles. In the private sector, Nissan is testing its electric Leaf cars in the capital, and Roborace is developing a driverless motorsport. It’s an impressive hub of activity, even without Google and Uber’s involvement. Via: BBC

Read the original post:
Driverless pods begin ferrying the public around Greenwich

Microsoft reduced Windows update sizes by 35 percent

Fans of Windows and snappy downloads will be relieved to learn that Microsoft’s Unified Update Platform (UUP), which has been rolling out to Windows Insiders since November, will be available to all retail users starting with the release of the Creators Update later this spring. In addition to those very handy snooze and schedule features, the UUP significantly shrinks the size of future updates by saving users the trouble of downloading an entire build of their operating system. That feature is called differential download packages, which is a technical term for “only downloads what you need.” A differential download looks at the files already on your system and uses them to rebuild the new OS version from there. It sounds simple, but as Microsoft’s Laura Butler points out, it’s no easy task given the decades of legacy code and patches. For a major release like the Creators Update, you’ll still need to download a full build, but the next feature update after that should be significantly smaller — about 35 percent smaller on average, according to Microsoft’s Director of Program Management Bill Karagounis. For Windows Insiders, those downloads should be even smaller, but it’s a bit of a trade-off since those systems get hit with more frequent updates. Source: Windows Blog
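
The general idea behind a differential download can be illustrated with a block-level delta: the server publishes a manifest of content hashes for the new build, the client hashes what it already has on disk, and only the blocks whose hashes differ are fetched. This is a conceptual sketch under those assumptions, not Microsoft’s actual UUP implementation; the toy FNV-1a hash and the fetch callback are illustrative.

```cpp
#include <cstdint>
#include <functional>
#include <vector>

using Block = std::vector<uint8_t>;

// Toy content hash (FNV-1a); a real updater would use a cryptographic hash.
uint64_t HashOf(const Block& block) {
  uint64_t h = 1469598103934665603ULL;
  for (uint8_t byte : block) { h ^= byte; h *= 1099511628211ULL; }
  return h;
}

// Rebuild the new build from local blocks plus a manifest of expected hashes,
// downloading (via `fetch`) only the blocks that differ from what is on disk.
std::vector<Block> RebuildNewBuild(const std::vector<Block>& local_blocks,
                                   const std::vector<uint64_t>& manifest,
                                   const std::function<Block(size_t)>& fetch) {
  std::vector<Block> rebuilt;
  rebuilt.reserve(manifest.size());
  for (size_t i = 0; i < manifest.size(); ++i) {
    if (i < local_blocks.size() && HashOf(local_blocks[i]) == manifest[i]) {
      rebuilt.push_back(local_blocks[i]);  // block unchanged: reuse local copy
    } else {
      rebuilt.push_back(fetch(i));         // block changed: download just this one
    }
  }
  return rebuilt;
}
```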

See original article:
Microsoft reduced Windows update sizes by 35 percent