Amazon floats Windows Server 2012 into AWS cloud

Amazon Web Services announced today that it will now offer virtual Windows Server 2012 instances as part of its Elastic Compute Cloud (EC2) service. AWS Windows team General Manager Tom Rizzo—who until this June was Microsoft’s Senior Director for the Office and Office 365 teams, and had previously run Microsoft’s SharePoint team—revealed the addition of the Server 2012 platform in a post on the AWS team’s official blog. As Ars found in our review of Windows Server 2012, the operating system has a number of advantages for cloud users over previous Windows Server releases, including better software-defined networking and improved remote configuration through PowerShell commands.

Amazon is hardly the first to offer Server 2012 as a public cloud service—Microsoft’s Azure and a number of smaller cloud providers have had Server 2012 instances available since the operating system was released (and in some cases, before that). But Amazon has done a number of things with Windows Server 2012 that are sure to draw attention from companies and developers looking to ease into Server 2012 or go big right away. One is support for Server 2012 in AWS’s Elastic Beanstalk, a service that automatically handles much of the capacity provisioning and other work involved in deploying an application to the AWS cloud. Amazon is also offering Server 2012 as part of its “free” tier of services—up to 750 hours of EC2 “Micro Instance” compute time per month, for up to a year. There’s also direct integration with Microsoft Visual Studio 2012 through the AWS Explorer.
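
For developers who would rather script this than click through the AWS Management Console, here is a minimal sketch of launching a free-tier-eligible Micro Instance with the Python boto library. The AMI ID and key pair name below are placeholders (actual Windows Server 2012 AMI IDs vary by region), so treat this as an illustration rather than a recipe from Amazon's announcement.

    import boto.ec2

    # Connect to a region; credentials come from boto's usual
    # configuration files or environment variables.
    conn = boto.ec2.connect_to_region("us-east-1")

    # "ami-00000000" is a placeholder; substitute the Windows Server 2012
    # AMI ID listed for your region, and use your own key pair name.
    reservation = conn.run_instances(
        "ami-00000000",
        instance_type="t1.micro",  # the free-tier "Micro Instance" size
        key_name="my-keypair",
    )

    print(reservation.instances[0].id)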

Review: Ubuntu 12.10 Quantal Quetzal a mix of promise, pain

Tux shares a perch with Ubuntu 12.10’s namesake bird. (Image credit: Aurich Lawson / Thinkstock)

Write this down: Ubuntu 12.10, the late-year arrival from Canonical’s six-month standard release factory, marks the first new release within the company’s current long-term support cycle. Got it? Good, because it may be the best takeaway from the latest Ubuntu release, codenamed Quantal Quetzal. After that, it’s a bit of a rocky ride.

The product’s development lineage is important to note from a business and adoption perspective. The release of Ubuntu 12.04 LTS in April was Canonical’s fourth long-term support product and signaled the end of one full two-year development cycle. Quantal Quetzal is the first standard release on the road to Ubuntu 14.04 LTS in spring 2014 (undoubtedly to be codenamed “Uber-rocking Unicorn” if the pattern holds), and it sets up themes and directions that will mature over the next two years. Standard releases aren’t terribly different from the biennial LTS products, though they tend to be slightly less conservative in what code they ship. The Ubuntu development community lets off the brakes a little and sticks some shiny back in.

Apple’s stock price falls to lowest point in six months

On Friday Apple’s stock price closed at $527.68 per share, the lowest it’s been in six months. Since September, the company has lost about 25 percent of its value from its peak of $702 per share. So what’s gone wrong? Analysts say Apple has had a string of misfortunes lately, from missed earnings estimates, management shakeups, missteps on mapping software, and supply chain problems to increased pressure from competitors. “I think it’s the perfect storm for Apple,” Van Baker, an analyst with Gartner Research, told Ars. “There’s a combination of a lot of things, and add to that, people are starting to think that Apple won’t bring out something that’s truly innovative every few years.”

Best of both worlds: Setting up Wi-Fi for iOS on 2.4 and 5GHz

For a while, it seemed that Wi-Fi was becoming a victim of its own success. In many cities there are numerous active Wi-Fi networks crowding the precious few non-overlapping channels—and that’s in addition to microwaves, Bluetooth, cordless phones, and baby monitors, which all share the 2.4GHz band. But since about 2007, Apple has also built support for 802.11n Wi-Fi on the 5GHz band into its computers and AirPort line of Wi-Fi base stations. Now, the iPhone 5 and the latest iPod touch also have that support. (The iPad has had it since day one.) So, how do you set up a Wi-Fi network that makes the most of this confluence of Wi-Fi bands?

Not created equal

First of all, it’s important to realize that the two bands are very different. The 2.4GHz band suffers from a lack of non-overlapping channels and from interference from other devices, but its lower frequencies pass through walls and floors reasonably well. The 5GHz band, on the other hand, has a much larger number of channels—and they don’t overlap—but the higher frequencies have reduced range, even in open air. In addition, Apple only supports combining two channels into a single, double-speed “wide” channel in the 5GHz band. If all else is equal, 5GHz is twice as fast as 2.4GHz.
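
Before deciding how to arrange a network across the two bands, it can help to see how crowded each one already is nearby. The following is a minimal sketch for OS X (not something from the article itself) that runs Apple's bundled airport scanning tool and counts visible networks per band; the tool's path and output format are undocumented, so the parsing here is an assumption that may need adjusting.

    import re
    import subprocess
    from collections import Counter

    # Apple's private "airport" utility; this path is where OS X ships it,
    # but it is not a documented, stable interface.
    AIRPORT = ("/System/Library/PrivateFrameworks/Apple80211.framework/"
               "Versions/Current/Resources/airport")

    scan = subprocess.check_output([AIRPORT, "-s"]).decode("utf-8", "replace")

    # Find the BSSID (MAC address) on each line, then read the RSSI and
    # channel fields that follow it; channel numbers above 14 are 5GHz.
    row = re.compile(r"(?:[0-9A-Fa-f]{2}:){5}[0-9A-Fa-f]{2}\s+(-?\d+)\s+(\S+)")

    bands = Counter()
    for line in scan.splitlines():
        match = row.search(line)
        if match:
            channel = int(match.group(2).split(",")[0])
            bands["5GHz" if channel > 14 else "2.4GHz"] += 1

    print(bands)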

Judge blocks California’s new ban on anonymity for sex offenders

On Tuesday, voters in California overwhelmingly approved Proposition 35, which ratcheted up penalties for those convicted of sex crimes, including human trafficking. The proposition included a provision requiring registered sex offenders to disclose all of their Internet connections and online identities to law enforcement. On Wednesday, two of the 73,900 registered sex offenders in the state who would be affected by the law filed a lawsuit challenging the constitutionality of these provisions. The two plaintiffs argued that forcing them to expose their online identities would violate their First Amendment right to speak anonymously. Their challenge is supported by the American Civil Liberties Union of Northern California and the Electronic Frontier Foundation. Late on Wednesday, Judge Thelton Henderson granted a temporary restraining order barring the law from going into effect until he had time to consider the plaintiffs’ constitutional arguments.

Google infringes old Lycos patents, must pay $30 million

Vringo is a little company that has made a huge bet on suing Google over patents. Today that bet paid off, although to a much lesser degree than its investors had hoped. After a two-week trial in Virginia, a jury found that Google’s advertising system infringes two old Lycos patents purchased by Vringo in 2011, and that those patents are valid. Google and several of its advertising partners were ordered to pay a total of about $30 million. That’s a lot of money, but far less than the $493 million Vringo was seeking. According to a report just published in the Virginian-Pilot, the jury found that Google will have to pay $15.9 million. Its advertising partners must pay smaller amounts: $7.9 million in damages for AOL, $6.6 million for IAC Search & Media, $98,800 for Target, and $4,000 for Gannett. The jury also said Google should pay an ongoing royalty, but whether that ultimately sticks is up to the judge.

The Vringo case is remarkable for two reasons. First, it’s rare to see a high-profile patent attack played out directly in the stock market, with investors speculating on each move in court. Second, the demonstratives submitted in Vringo’s case tell a fascinating story in pictures of how a company that’s more or less a “patent troll” tries to convince a jury to shower it with money. Some of those visuals are posted below.

IBM prepares for end of process shrinks with carbon nanotube transistors

Carbon nanotubes sit on top of features etched in silicon. (Image credit: IBM Research)

The shrinking size of features on modern processors is slowly approaching a limit where the wiring on chips will be only a few atoms across. As this point approaches, both making these features and controlling the flow of current through them become a serious challenge, one that bumps up against basic limits of materials. During my visit to IBM’s Watson Research Center, it was clear that people in the company are already thinking about what to do when they run into these limits. For at least some of them, the answer would involve a radical departure from traditional chipmaking approaches, switching from traditional semiconductors to carbon nanotubes. And, while I was there, the team was preparing a paper (now released by Nature Nanotechnology) reporting some significant progress: a chip with 10,000 working transistors made from nanotubes, formed at a density two orders of magnitude higher than any previously reported effort.

During my visit to Watson, I spoke with George Tulevski, who works on the nanotube project and is one of the authors of the recent paper. Tulevski described nanotubes as a radical rethinking of how you build a chip. “Silicon is a solid you carve down,” he told Ars, “while nanotubes are something you have to build up.” In other words, you can’t start with a sheet of nanotubes and etch them until you’re left with the wiring you want.

$99 Raspberry Pi-sized “supercomputer” hits Kickstarter goal

A prototype of Parallella; the final version will be the size of a credit card. (Image credit: Adapteva)

A month ago, we told you about a chipmaker called Adapteva that turned to Kickstarter in a bid to build a new platform that would be the size of a Raspberry Pi and serve as an alternative to expensive parallel computing platforms. Adapteva needed at least $750,000 to build what it is calling “Parallella”—and it has hit the goal. Today is the Kickstarter deadline, and the project is up to more than $830,000 with a few hours to go. (UPDATE: The fundraiser hit $898,921 when time expired.) As a result, Adapteva will build 16-core boards capable of 26 gigaflops of performance, costing $99 each. The board’s RISC cores run at speeds of up to 1GHz each. There is also a dual-core ARM A9-based system-on-chip, with the 16-core RISC chip acting as a coprocessor to speed up tasks. Adapteva fell well short of its stretch goal of $3 million, which would have resulted in a 64-core board hitting 90 gigaflops, built using a more expensive 28-nanometer process rather than the 65-nanometer process used for the base model. The 64-core board would have cost $199.

US federal agency dropping 17,000 BlackBerrys in favor of iPhones

It’s no secret that Research In Motion, the maker of the fabled BlackBerry, is on the decline. If falling subscriber numbers last month weren’t bad enough, last week the United States Immigration and Customs Enforcement agency (ICE) said that it will end its contract with RIM, replacing over 17,000 employees’ devices with iPhones in a deal worth $2.1 million. “The RIM technology, however, can no longer meet the mobile technology needs of the agency,” the agency wrote in a 10-page document, adding that “no other company’s products can meet the agency’s needs.”

Dept. of Veterans Affairs spent millions on PC software it couldn’t use

Rolling out new software to a few thousand users is an involved process for any organization. But installing software that affects hundreds of thousands of PCs, as part of a response to a data breach and under embarrassing scrutiny, is a task that would challenge even the most well-managed IT departments. And, apparently, the answer of the Department of Veterans Affairs’ Office of Information Technology (OIT) to that challenge was to sweep it under the rug.

After removable hard disks containing unencrypted personal identifying information of 26 million military veterans were stolen from the home of a VA employee in 2006, then-Secretary of Veterans Affairs R. James Nicholson mandated that the VA’s Office of Information Technology install encryption software on all of the department’s notebook and desktop computers. But while the VA purchased 400,000 licenses for Symantec’s GuardianEdge encryption software, more than 84 percent of those licenses—worth about $5.1 million, including the maintenance contracts for them—remain uninstalled, a VA Inspector General’s audit has found. The VA’s OIT purchased 300,000 licenses and maintenance agreements for GuardianEdge in 2006 and continued to pay for maintenance on those licenses for the next five years. In 2011, the VA purchased 100,000 more licenses from Symantec and extended maintenance on all 400,000 licenses for two years.
