Edge, VMware, Safari, and Ubuntu Linux Hacked at Pwn2Own 2017

The 10th annual Pwn2Own hacking competition ended Friday in Vancouver. Some of the highlights: Ars Technica reports one team “compromised Microsoft’s heavily fortified Edge browser in a way that escapes a VMware Workstation virtual machine it runs in… by exploiting a heap overflow bug in Edge, a type confusion flaw in the Windows kernel and an uninitialized buffer vulnerability in VMware.” Digital Trends reports “Samuel Groß and Niklas Baumstark used a number of logic bugs to exploit the Safari browser and eventually take root control of macOS on a MacBook Pro, [and] impressed onlookers even more by adding a custom message to the Touch Bar which read: ‘pwned by niklasb and saelo.’” Ubuntu 16.10 Linux was also successfully attacked by exploiting a flaw in the Linux 4.8 kernel, “triggered by a researcher who only had basic user access but was able to elevate privileges with the vulnerability to become the root administrative account user…” reports eWeek. “Chaitin Security Research Lab didn’t stop after successfully exploiting Ubuntu. It was also able to successfully demonstrate a chain of six bugs in Apple Safari, gaining root access on macOS.” Another attacker “leveraged two separate use-after-free bugs in Microsoft Edge and then escalated to SYSTEM using a buffer overflow in the Windows kernel.” None of the attendees registered to attempt an attack on the Apache Web Server on Ubuntu 16.10 Linux, according to eWeek, but the contest’s blog reports that “We saw a record 51 bugs come through the program. We paid contestants $833,000 USD in addition to the dozen laptops we handed out to winners. And, we awarded a total of 196 Master of Pwn points.” Read more of this story at Slashdot.

Microsoft’s Project Scorpio Will Pack Internal PSU, 4K Game DVR Capture

According to an exclusive report from Windows Central, Microsoft’s upcoming “Project Scorpio” gaming console will feature an internal power supply unit (PSU), similar to the Xbox One S, and 4K game DVR and streaming at 60 frames per second (FPS). From the report: In Microsoft’s efforts to make Project Scorpio a true 4K system, it will also feature HEVC and VP9 codecs for decoding 4K streams for things such as Netflix, just like the Xbox One S. It will also leverage HEVC for encoding 2160p, 60-frames-per-second (FPS) video for Game DVR and streaming. Microsoft’s Beam streaming service has been running public 4K stream tests for some time, and it’s now fair to assume it will not only be PC streamers who benefit. Project Scorpio’s Game DVR will allow you to stream and record clips in 4K resolution at 60FPS, according to our sources, which is a massive, massive step up from the 720p, 30FPS you get on the current Xbox One. With every bit of information we receive about Project Scorpio, the theme of native 4K keeps appearing — not only for games, but also console features. We now believe Scorpio will sport 4K Game DVR, 4K Blu-ray playback, and 4K streaming apps, but the real showstopper will be the 4K games Microsoft will likely flaunt at E3 2017. Read more of this story at Slashdot.

How The FBI Used Geek Squad To Increase Secret Public Surveillance

In 2011, a gynecologist took his computer for repairs at Best Buy’s Geek Squad. But the repair technician was a paid FBI informant — one of several working at Geek Squad — and the doctor was ultimately charged with possessing child pornography, according to OC Weekly. An anonymous reader quotes their new report: Recently unsealed records reveal a much more extensive secret relationship than previously known between the FBI and Best Buy’s Geek Squad, including evidence the agency trained company technicians on law-enforcement operational tactics, shared lists of targeted citizens and, to covertly increase surveillance of the public, encouraged searches of computers even when unrelated to a customer’s request for repairs. Assistant United States Attorney M. Anthony Brown last year labeled allegations of a hidden partnership as “wild speculation.” But more than a dozen summaries of FBI memoranda filed inside Orange County’s Ronald Reagan Federal Courthouse this month in USA v. Mark Rettenmaier contradict the official line… Other records show how [Geek Squad supervisor Justin] Meade’s job gave him “excellent and frequent” access for “several years” to computers belonging to unwitting Best Buy customers, though agents considered him “underutilized” and wanted him “tasked” to search devices “on a more consistent basis”… evidence demonstrates company employees routinely snooped for the agency, contemplated “writing a software program” specifically to aid the FBI in rifling through its customers’ computers without probable cause for any crime that had been committed, and were “under the direction and control of the FBI.” The doctor’s lawyer argues Best Buy became an unofficial wing of the FBI by offering $500 each time technicians found evidence leading to criminal charges. Read more of this story at Slashdot.

T-Mobile Raises Deprioritization Threshold To 30GB

An anonymous reader quotes a report from TmoNews: T-Mobile’s new deprioritization threshold is 30GB of usage in a single billing cycle. While T-Mo didn’t make an official announcement about the change, you can see in this cached page that the network management policy previously said 28GB: “Based on network statistics for the most recent quarter, customers who use more than 28GB of data during a billing cycle will have their data usage prioritized below other customers’ data usage for the remainder of the billing cycle in times and at locations where there are competing customer demands for network resources.” Navigating to the webpage today shows 30GB. What this change means is that if you use more than 30GB of data in one billing cycle, your data usage will be prioritized below others’ for the remainder of that billing cycle. The only time you’re likely to see the effects of that, though, is when you’re at a location on the network that is congested, during which time you may see slower speeds. Once you move to a different location or the congestion subsides, your speeds will likely go back up. And once the new billing cycle rolls around, your usage will be reset. Read more of this story at Slashdot.
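For readers who want the policy spelled out, the rule described above boils down to two conditions: usage past the threshold and a congested serving site. The sketch below is purely illustrative (the function name and the congestion flag are assumptions; only the 30GB figure comes from the report):

```python
# Toy model of the deprioritization rule described above; illustrative only,
# not T-Mobile's actual network logic. The 30GB threshold is from the report.
DEPRIORITIZATION_THRESHOLD_GB = 30

def is_deprioritized(usage_gb: float, site_congested: bool) -> bool:
    """A line over the threshold is only put behind other traffic while the
    serving site is congested; otherwise speeds are unaffected."""
    return usage_gb > DEPRIORITIZATION_THRESHOLD_GB and site_congested

# Example: 35GB used this billing cycle.
print(is_deprioritized(35, site_congested=False))  # False: uncongested site, full speed
print(is_deprioritized(35, site_congested=True))   # True: temporarily lower priority

# Usage (and therefore the flag) resets when a new billing cycle starts.
```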

Windows 10 Build 15048 Has a Windows Mixed Reality Demo You Can Try

Microsoft’s big push into mixed reality involves headsets from multiple manufacturers (including ASUS, Dell, HP, and Lenovo), and developer kits with Acer’s headset will begin a phased rollout this month. But Windows 10’s latest “Insider Preview” build already includes a mixed reality simulator with a first-person 3D environment that can be navigated with the W, A, S and D keys. Slashdot reader Mark Wilson writes: From the look of the changelog for Windows 10 build 15048 that was released a few days ago to Insiders, it looked to be little more than a bug-fixing release. But in fact Microsoft has already started to include references to — and even a portal for — Windows Mixed Reality. We have seen references to Windows Holographic in Windows 10 before, but this is the first time there has been anything to play with. It coincides nicely with Microsoft revealing that Windows Mixed Reality is the new name for Windows Holographic, and it gives Insiders the chance not only to see if their computer meets the recommended specs, but also to try out a Windows Mixed Reality simulation. Read more of this story at Slashdot.

Streaming TV Sites Now Have More Subscribers Than Cable TV

Nielsen reported this week that millennials “spend about 27% less time watching traditional TV than viewers over the age of 35,” possibly threatening the dominance of cable TV. An anonymous reader quotes Axios: Streaming service subscribers (free or paid) increased again (68% in 2016 vs. 63% in 2014) and have caught up with the percentage of paid TV subscribers (67%) for the first time ever, according to the Consumer Technology Association’s new study, The Changing Landscape for Video and Content. The rise of streaming services represents a shift in consumption habits towards cord-cutting, primarily amongst millennials. Some other trends are impossible to ignore. 2016 also saw dramatic drops in the use of physical discs — from 41% in 2015 to just 28% — as well as another big drop in the use of antennas, from 18% to just 10%. Read more of this story at Slashdot.

Apple Losing Out To Microsoft and Google in US Classrooms

Apple is losing its grip on American classrooms, which technology companies have long used to hook students on their brands for life. From a report on MacRumors: According to research company Futuresource Consulting, in 2016 devices running iOS and macOS fell to third place in American classrooms, behind both Google-powered laptops and Windows devices. Out of 12.6 million mobile devices shipped to primary and secondary schools in the U.S., Chromebooks accounted for 58 percent of the market, up from 50 percent in 2015. Meanwhile, school shipments of iPads and Mac laptops fell to 19 percent, from about 25 percent, over the same period, while Microsoft Windows laptops and tablets stayed relatively stable at about 22 percent. Read more of this story at Slashdot.

Researchers Store Computer OS, Short Movie On DNA

An anonymous reader quotes a report from Phys.Org: In a new study published in the journal Science, a pair of researchers at Columbia University and the New York Genome Center (NYGC) show that an algorithm designed for streaming video on a cellphone can unlock DNA’s nearly full storage potential by squeezing more information into its four base nucleotides. They demonstrate that this technology is also extremely reliable. Yaniv Erlich and his colleague Dina Zielinski, an associate scientist at NYGC, chose six files to encode, or write, into DNA: a full computer operating system, an 1895 French film, “Arrival of a Train at La Ciotat,” a $50 Amazon gift card, a computer virus, a Pioneer plaque and a 1948 study by information theorist Claude Shannon. They compressed the files into a master file, and then split the data into short strings of binary code made up of ones and zeros. Using an erasure-correcting algorithm called fountain codes, they randomly packaged the strings into so-called droplets, and mapped the ones and zeros in each droplet to the four nucleotide bases in DNA: A, G, C and T. The algorithm deleted letter combinations known to create errors, and added a barcode to each droplet to help reassemble the files later. In all, they generated a digital list of 72,000 DNA strands, each 200 bases long, and sent it in a text file to a San Francisco DNA-synthesis startup, Twist Bioscience, that specializes in turning digital data into biological data. Two weeks later, they received a vial holding a speck of DNA molecules. To retrieve their files, they used modern sequencing technology to read the DNA strands, followed by software to translate the genetic code back into binary. They recovered their files with zero errors, the study reports. The study also notes that “a virtually unlimited number of copies of the files could be created with their coding technique by multiplying their DNA sample through polymerase chain reaction (PCR).” The researchers also “show that their coding strategy packs 215 petabytes of data on a single gram of DNA.” Read more of this story at Slashdot.
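The report describes the pipeline at a high level, but the basic trick of packing bits into bases is simple enough to sketch. The snippet below is a toy two-bits-per-base mapping, not the researchers' actual DNA Fountain scheme, which additionally packages data into fountain-code droplets, screens out error-prone sequences, and barcodes each droplet:

```python
# Toy bit-to-nucleotide mapping: two bits per base (00->A, 01->C, 10->G, 11->T).
# Illustrative only; the published scheme layers fountain codes, sequence
# screening, and per-droplet barcodes on top of this basic idea.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"hi")
print(strand)          # CGGACGGC
print(decode(strand))  # b'hi'
```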

An Incorrect Command Entered By Employee Triggered Disruptions To S3 Storage Service, Knocking Down Dozens of Websites, Amazon Says

Amazon is apologizing for the disruptions to its S3 storage service that knocked down, or otherwise affected, dozens of websites earlier this week. The company also outlined what caused the issue: the event was triggered by human error. The company said an authorized S3 team member using an established playbook executed a command that was intended to remove a small number of servers for one of the S3 subsystems that is used by the S3 billing process. “Unfortunately, one of the inputs to the command was entered incorrectly and a larger set of servers was removed than intended,” the company said in a press statement Thursday. It adds: The servers that were inadvertently removed supported two other S3 subsystems. One of these subsystems, the index subsystem, manages the metadata and location information of all S3 objects in the region. This subsystem is necessary to serve all GET, LIST, PUT, and DELETE requests. The second subsystem, the placement subsystem, manages allocation of new storage and requires the index subsystem to be functioning properly to correctly operate. The placement subsystem is used during PUT requests to allocate storage for new objects. Removing a significant portion of the capacity caused each of these systems to require a full restart. While these subsystems were being restarted, S3 was unable to service requests. Other AWS services in the US-EAST-1 Region that rely on S3 for storage, including the S3 console, Amazon Elastic Compute Cloud (EC2) new instance launches, Amazon Elastic Block Store (EBS) volumes (when data was needed from a S3 snapshot), and AWS Lambda were also impacted while the S3 APIs were unavailable. Read more of this story at Slashdot.
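Amazon's description amounts to a dependency chain: the index subsystem has to be healthy for any object request, and the placement subsystem (needed for PUTs) depends on the index in turn. A minimal model of that chain, assuming nothing beyond what the statement says, looks like this:

```python
# Toy model of the subsystem dependencies described in Amazon's statement
# (illustrative only, not AWS code). The index subsystem serves object metadata;
# the placement subsystem allocates storage for new objects and needs the index.
index_healthy = True
placement_healthy = True

def can_serve(request: str) -> bool:
    if request in ("GET", "LIST", "DELETE"):
        return index_healthy
    if request == "PUT":
        # PUTs need placement, and placement only works with a healthy index.
        return index_healthy and placement_healthy
    return False

# Removing too much index capacity forced a full restart; while it restarts,
# every request type fails.
index_healthy = False
print(can_serve("GET"))  # False
print(can_serve("PUT"))  # False
```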

Netflix Uses AI in Its New Codec To Compress Video Scene By Scene

An anonymous reader shares a Quartz report: Annoying pauses in your streaming movies are going to become less common, thanks to a new trick Netflix is rolling out. It’s using artificial intelligence techniques to analyze each shot in a video and compress it without affecting the image quality, thus reducing the amount of data it uses. The new encoding method is aimed at the growing contingent of viewers in emerging economies who watch video on phones and tablets. “We’re allergic to rebuffering,” said Todd Yellin, a vice president of innovation at Netflix. “No one wants to be interrupted in the middle of Bojack Horseman or Stranger Things.” Yellin hopes the new system, called Dynamic Optimizer, will keep those Netflix binges free of interruption when it’s introduced sometime in the next “couple of months.” He was demonstrating the system’s results at “Netflix House,” a mansion in the hills overlooking Barcelona that the company has outfitted for the Mobile World Congress trade show. In one case, the image quality of a 555 kilobits-per-second (kbps) stream looked identical to that of one using half the bandwidth. Read more of this story at Slashdot.
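Quartz doesn't spell out the algorithm, but the gist of per-shot optimization can be sketched: encode each shot at several candidate bitrates, score each encode with a perceptual quality metric, and keep the cheapest one that clears a quality target, so simple scenes stop paying for the bitrate complex scenes need. The sketch below is hypothetical (the metric, target, and numbers are made up; it is not Netflix's actual Dynamic Optimizer):

```python
# Hypothetical per-shot bitrate selection; illustrative only. shot_scores maps a
# candidate bitrate (kbps) to a predicted perceptual quality score (VMAF-like).
QUALITY_TARGET = 80.0  # made-up target score

def pick_bitrate(shot_scores: dict) -> int:
    viable = [rate for rate, score in shot_scores.items() if score >= QUALITY_TARGET]
    # Fall back to the best-looking candidate if nothing reaches the target.
    return min(viable) if viable else max(shot_scores, key=shot_scores.get)

# A static dialogue shot clears the bar cheaply; an action shot needs more bits.
dialogue = {235: 81.0, 555: 92.0, 1100: 96.0}
action = {235: 55.0, 555: 74.0, 1100: 88.0}
print(pick_bitrate(dialogue))  # 235
print(pick_bitrate(action))    # 1100
```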
