Google search redesign hews closer to competitor DuckDuckGo

Google’s makeover kicks the underlined URL to the curb, along with a few other changes.

Experiencing mild disorientation while using Google today? Google has quietly rolled out a subtle redesign of its search results that, among other things, removes the age-old hyperlink underline, bumps the font size up two points, and evens out the line spacing.

Google search results have received incremental changes over the years, and the search page certainly no longer looks like it did when the site first launched. Jon Wiley, the lead designer for Google search, took to Google+ Wednesday to say that the new look “improves readability and creates an overall cleaner look.” Having gone nearly a decade without underlined hyperlinks, we here at Ars wholeheartedly agree with the decision.

The redesign moves Google up and away from competitors like Yahoo and Bing, which preserve the underline. However, it only catches Google up to the upstart DuckDuckGo, which does not use underlines and is cleaner still on its search results page, with truncated URLs for each result.


Mozilla strives to take Web gaming to the next level with Unreal Engine 4

Around this time last year, Mozilla and Epic Games showed off the Unreal 3 game engine running in the browser, using a combination of the WebGL 3D graphics API and asm.js, the high-performance subset of JavaScript. Commercial games built with this technology launched late in the year.

With this apparently successful foray into using the browser as a rich gaming platform, Mozilla and Epic today demonstrated a preview of Epic’s next engine, Unreal Engine 4, again boasting near-native speeds. The Web version of UE4 uses Emscripten to compile regular C and C++ code into asm.js.

Unreal Engine 4 running within Firefox.

Over the past year, Mozilla has improved asm.js’s performance from around 40 percent of native speed to something like 67 percent. Our own testing largely supported the organization’s claims, though we noted certain limitations at the time, such as JavaScript’s lack of multithreading.


Refinements, additions, and un-breaking stuff: iOS 7.1 reviewed

Time to update! iOS 7.1 is here, and it fixes a lot of iOS 7.0’s biggest problems.

There were about six months between the ouster of Scott Forstall from Apple in late October of 2012 and the unveiling of iOS 7.0 in June of 2013. Jony Ive and his team redesigned the software from the ground up in that interval, a short amount of time given that pretty much everything in the operating system was overhauled and that it was being done under new management. The design was tweaked between that first beta in June and the final release in mid-September, but the biggest elements were locked in place in short order.

iOS 7.1’s version number implies a much smaller update, but it has spent a considerable amount of time in development. Apple has issued five betas to developers since November of 2013, and almost every one of them has tweaked the user interface in small but significant ways. It feels like Apple has been taking its time with this one, weighing different options and attempting to address the harshest criticism of the new design without the deadline pressure that comes with a major release.

We’ve spent a few months with iOS 7.1 as it has progressed, and as usual we’re here to pick through the minutiae so you don’t have to. iOS 7.1 isn’t a drastic change, but it brings enough new design elements, performance improvements, and additional stability to the platform that it might just win over the remaining iOS 6 holdouts.


Critical crypto bug leaves Linux, hundreds of apps open to eavesdropping

Hundreds of open source packages, including the Red Hat, Ubuntu, and Debian distributions of Linux, are susceptible to attacks that circumvent the most widely used technology to prevent eavesdropping on the Internet, thanks to an extremely critical vulnerability in a widely used cryptographic code library.

The bug in the GnuTLS library makes it trivial for attackers to bypass secure sockets layer (SSL) and Transport Layer Security (TLS) protections on websites that depend on the open source package. Initial estimates included in Internet discussions such as this one indicate that more than 200 different operating systems or applications rely on GnuTLS to implement crucial SSL and TLS operations, but it wouldn’t be surprising if the actual number is much higher. Web applications, e-mail programs, and other code that use the library are vulnerable to exploits that allow attackers monitoring connections to silently decode encrypted traffic passing between end users and servers.

The bug is the result of commands in a section of the GnuTLS code that verifies the authenticity of TLS certificates, which are often known simply as X.509 certificates. The coding error, which may have been present in the code since 2005, causes critical verification checks to be terminated early, drawing ironic parallels to the extremely critical “goto fail” flaw that for months put users of Apple’s iOS and OS X operating systems at risk of surreptitious eavesdropping attacks. Apple developers have since patched the bug.


Comcast subscriber spinoff could create a new cable company

Comcast’s plan to divest itself of 3 million subscribers, which it hopes will help it win approval of a merger with Time Warner Cable, could result in the creation of a new cable company. Rather than selling off territories to existing cable companies, Comcast is considering an option to “[spin] them off in a new publicly traded company,” Bloomberg reported, citing anonymous sources.

“Regulators may push for the spin-out because it would create a new competitor,” Bloomberg wrote. “A new company formed in such a way would be the fourth-largest US cable company by subscribers, trailing the merged Comcast-Time Warner Cable, Cox Communications Inc., and Charter Communications Inc.”

Creating a new company with those customers wouldn’t result in more choices for consumers in individual markets. Despite being the two largest cable companies in the US, Comcast and Time Warner Cable don’t compete against each other in any regional territory.


MtGox code posted by hackers as company files for bankruptcy protection

Cross Office Shibuya Medio, the office building in Tokyo that is home to MtGox and Mark Karpeles’ other companies.

As MtGox CEO Mark Karpeles and his lawyers officially filed for court-supervised restructuring of the Bitcoin exchange, someone posted a chunk of code to Pastebin that would appear to lend credence to Karpeles’ contention that his company was hacked. The block of PHP code appears to be part of the backend for MtGox’s Bitcoin exchange site, and it includes references to IP addresses registered to Karpeles’ Web hosting and consulting company, Tibanne.

In an update to the MtGox website late Monday, the company reasserted its claim that it had been hacked through an exploit of a weakness in its exchange website code. “Although the complete extent is not yet known, we found that approximately 750,000 bitcoins deposited by users and approximately 100,000 bitcoins belonging to us had disappeared,” the company’s spokesperson said in the latest update at the MtGox website. “We believe that there is a high probability that these bitcoins were stolen as a result of an abuse of this bug and we have asked an expert to look at the possibility of a criminal complaint and undertake proper procedures.”

That loss was discovered on February 24. On the same day, the company found “large discrepancies between the amount of cash held in financial institutions and the amount deposited from our users. The amounts are still under investigation and may vary, but they approximate JPY 2.8 billion [$27 million US].”


Snow Leopard updates are probably done—here are your OS X upgrade options

End of the line, Snowy.

Apple offers no end-of-life roadmaps for its operating systems, and it doesn’t officially comment on whether support has dried up for this or that version of OS X. The best you can do is look at historical data. Since switching to a yearly release cadence with Lion back in 2011, Apple seems to be willing to support whatever the latest version is plus the two preceding versions. When OS X 10.9.2 was released earlier this week, it was accompanied by security updates for OS X 10.8 and 10.7 but not for 2009’s OS X 10.6. It’s the first major security update that Snow Leopard has missed—the OS is still getting iTunes updates, but its last major security patch happened back in September.

This has prompted a flurry of posts from various outlets. All point to the same Net Applications data showing that 10.6 still powers around 19 percent of Macs. Most compare the OS X support cycle to the much longer Windows cycle. Some make a bigger deal of it than others. None really tell anyone in that 19 percent what to do next.

You’ll need to know the exact kind of Mac you’re using before proceeding—typing your serial number into this Service and Support page should give you the information you need if you’re not sure. Launching the System Profiler application from the Utilities folder will show you your serial number and your Mac’s specific model identifier (something like MacBook4,1 or iMac11,2), the latter of which can be used with this EveryMac lookup page to find what you’re looking for.


The day the Mario Kart died: Nintendo’s kill switch and the future of online consoles

Nintendo fans, mark your calendars for May 20, 2014. As Nintendo announced yesterday, that’s the last day you’ll be able to use the Nintendo Wi-Fi Connection to play hundreds of online games on the Wii and Nintendo DS. Single-player modes for those games will still work, of course, but any parts of the games that require an Internet connection will be completely non-functional in a matter of months.

The shutdown will affect some of both systems’ most popular games, including some of the best-selling games of all time. Suddenly, over 34 million copies of Mario Kart Wii and 23 million copies of Mario Kart DS will be severely diminished. The tens of millions of people who own the DS Pokemon games will no longer be able to trade their beasts or battle online. Animal Crossing: Wild World and Super Smash Bros. Brawl will be less functional for over 11 million players each.

Sure, as a practical matter, relatively few of these tens of millions of players are still making regular use of online servers for games that are sometimes pushing nine years old. If they were, Nintendo would probably have more interest in continuing to maintain those servers on the theory that it would lead to more very-long-tail sales of its online-enabled games. On the other hand, Nintendo could be more interested in trying to force players off its “legacy systems” and on to the Wii U and 3DS, which of course still have active online support.


ESA’s Gaia mission set to survey the galaxy with biggest camera in space

An artist’s rendering of what Gaia will look like when deployed in space.

After its successful launch in December, the European Space Agency’s (ESA) Gaia has now taken up its position in orbit and is ready to survey the skies. With the help of two onboard telescopes focused onto the largest camera ever sent to space, the observatory is expected to catalog nearly one billion stars over its five-year mission.

ESA’s Gaia will map stars in the Milky Way. It will do this by measuring the brightest billion objects and determining their three-dimensional distribution and velocities. It can also measure the temperature, mass, and chemical composition of each of those objects. Objects won’t necessarily need to be very bright to be included in the catalog: Gaia will be able to discern objects up to 400,000 times dimmer than those visible to the naked eye. And the positional accuracy of its measurements is akin to measuring the width of a human hair at a distance of 500 km.


Exoplanet discovery rate goes from a trickle to a flood

The Kepler spacecraft.

Today, NASA’s Kepler team announced that it has developed a new technique to verify the existence of many of the planetary candidates in its back catalog. The technique, which relies on the presence of multiple planets in a system, has led to the single largest announcement of new planets in history: 715 of them, orbiting a total of 305 stars. Most of these are small, between the sizes of Earth and Neptune, and are tightly packed into the inner regions of the systems in which they reside, but four appear to be in the habitable zone.

If you visit Kepler’s home page, you’ll see a count of confirmed planets in the upper right (it’s currently at 961). Hover over it, and you’ll see there are over 3,800 unconfirmed planetary candidates. Those candidates come from the method Kepler uses to discover planets: watching for a mini-eclipse that causes a slight dimming of the host star’s light. A similar pattern can be caused by a dim star orbiting in the system (a configuration called an eclipsing binary), which raises the prospect of false positives. In the past, ruling these out has generally involved multiple follow-up observations with a large telescope, which has held the announcement of confirmed planets to a relative trickle.

However, a number of discoveries have been based on Kepler data alone. These have come from multi-planet systems, where the planets gravitationally interact, speeding up or slowing each other down. This activity creates regular variations in the timing and duration of the eclipses as the exoplanets transit between their host star and Earth.
