Physicists have created a fluid with “negative mass,” which accelerates towards you when pushed. From a report on BBC: In the everyday world, when an object is pushed, it accelerates in the same direction as the force applied to it; this relationship is described by Isaac Newton’s Second Law of Motion. But in theory, matter can have negative mass in the same sense that an electric charge can be positive or negative. Prof Peter Engels, from Washington State University (WSU), and colleagues cooled rubidium atoms to just above absolute zero (close to -273C), creating what’s known as a Bose-Einstein condensate. In this state, particles move extremely slowly and, following behaviour predicted by quantum mechanics, act like waves. They also synchronise and move together in what’s known as a superfluid, which flows without losing energy. Read more of this story at Slashdot.
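The sign flip is easiest to see in equation form (a one-line aside, not from the BBC report). Newton's second law rearranges to

```latex
\vec{a} = \frac{\vec{F}}{m}
```

so if the effective mass $m$ is negative, the acceleration $\vec{a}$ points opposite to the applied force $\vec{F}$: push the fluid away from you and it accelerates towards you.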
Archive for April 20th, 2017
A new book published by the Imperial War Museum features a rare collection of color photos from World War II, some of which haven’t been seen in over 70 years. From P-51D Mustangs and Flying Fortresses through to anti-aircraft spotters and flame-hurling tanks, these images cast the war in a vibrant new light. Read more…
Microsoft is making a few changes to how it will service Windows, Office 365 ProPlus and System Center Configuration Manager. From a report: Announced today, Microsoft will be releasing two feature updates a year for Windows 10, in March and September, and with each release, System Center Configuration Manager will support this new aligned update model for Office 365 ProPlus and Windows 10, making both easier to deploy and keep up to date. This is a big change for Microsoft, as Windows will now be on a more predictable pattern for major updates, and by aligning it with Office 365 ProPlus, this should make the two platforms easier to service from an IT Pro perspective. The big news here is also that Microsoft is announcing when Redstone 3 is targeted for release. The company is looking at a September release window, but it is worth pointing out that it traditionally releases the month after the code is completed. Read more of this story at Slashdot.
It’s been a little over a year since Dyson launched its first connected air purifier, the Pure Cool Link, and to celebrate the occasion, the company is about to update its entire line of air purifiers with improvements aimed at the Chinese market. Well, what’s going to be different is really just the replaceable cylindrical glass HEPA filter: the new version’s inner layer will pack three times more specially treated graphite crystals than before, which helps remove more gaseous pollutants (and odors) in addition to the usual particulates as small as PM 0.1. This graphite upgrade is the result of Dyson’s study of used filters collected from over 200 Chinese homes, from which it discovered that harmful gaseous pollutants such as formaldehyde, benzene, naphthalene and unburnt gases have a higher presence than in other markets. Better yet, the new filter comes in the same shape and volume as before, and Dyson has confirmed that it is backwards compatible, so existing Pure users won’t feel left out. To ensure all corners of the room are covered, each Pure machine can pump out over 200 liters of smooth air per second — so much that it can blow large bubbles over a distance of five meters, as demonstrated at the event in Beijing. As with the current Pure line, Dyson will continue to offer both connected and offline models for its heaters and fans to cover more price tiers. The connected ones will still pack a set of sensors for monitoring temperature, humidity, volatile organic compounds and dust, so that they can feed live data back to the app and toggle auto mode, but these are now powered by an optimized algorithm based on data collected from around the world over the past year. China will start selling the refreshed Pure machines from May 4th, and we’ll be keeping an eye out for other regional launch dates later on.
Last year, Facebook announced the Surround 360, a 360-degree camera that can capture footage in 3D and then render it online via specially designed software. But it wasn’t for sale. Instead, the company used it as a reference design for others to create 3D 360 content, even going so far as to open-source it on GitHub later that summer. As good as the camera was, though, it still didn’t deliver the full VR experience. That’s why Facebook is introducing two more 360-degree cameras at this year’s F8: the x24 and x6. The difference: these cameras can shoot with six degrees of freedom, which promises to make 360 footage more immersive than before. The x24 is so named because it has 24 cameras; the x6, meanwhile, has — you guessed it — six. While the x24 looks like a giant beach ball with many eyes, the x6 is shaped more like a tennis ball, which makes for a less intimidating look. Both are designed for professional content creators, but the x6 is obviously meant to be a smaller, lighter and cheaper version. Both the x24 and the x6 are part of the Surround 360 family. And, as with version one (now called the Surround 360 Open Edition), Facebook doesn’t plan on selling the cameras themselves. Instead, it plans to license the x24 and x6 designs to a “select group of commercial partners.” Still, the versions you see in the images here were prototyped in Facebook’s on-site hardware lab (cunningly called Area 404) using off-the-shelf components. The x24 was made in partnership with FLIR, a company mostly known for its thermal imaging cameras, while the x6 prototype was made entirely in-house. But before we get into all of that, let’s talk a little bit about what sets these cameras apart from normal 360 ones. With a traditional fixed camera, you see the world through its fixed lens.
So if you’re viewing this content (also known as stereoscopic 360) in a VR headset and you decide to move around, the world stays still as you move, which is not what it would look like in the real world. This makes the experience pretty uncomfortable and takes you out of the scene. It becomes less immersive. With content that’s shot with six degrees of freedom, however, this is no longer an issue. You can move your head to a position where the camera never was and still view the world as if you were actually there. Move your head from side to side, forwards and backwards, and the software is smart enough to reconstruct what the view looks like from different angles. All of this is due to some special software that Facebook has created, along with the carefully designed arrangement of the cameras. According to Brian Cabral, Facebook’s engineering director, it’s an “optimal pattern” for gathering as much information as possible. I had the opportunity to look at a couple of different videos shot with the x24 at Facebook’s headquarters (using the Oculus Rift, of course). One was of a scene shot in the California Academy of Sciences, specifically at the underwater tunnel in the Steinhart Aquarium. I was surprised to see that the view of the camera would follow my own as I tilted my head from left to right, and even when I crouched down on the floor. I could even step to the side and look “through” where the camera was, as if it weren’t there at all. If the video had been shot with a traditional 360 camera, I would likely have seen the camera tripod when I looked down. But with the x24, I just saw the floor, as if I were a disembodied ghost floating around. Another wonderful thing about videos shot with six degrees of freedom is that each pixel has depth. Each pixel is literally in 3D. This is a breakthrough for VR content creators, and it opens up a world of possibilities in visual effects editing.
This means that you can add 3D effects to live-action footage, a feat that would usually have required a green screen. I saw this demonstrated in the other video, which was of a scene shot on the roof of one of Facebook’s buildings. Facebook, along with Otoy, a Los Angeles-based cloud-rendering company, was able to add effects to the scene. Examples include floating butterflies, which wafted around when I swiped at them with a Touch controller. They also did a visual trick where I could step “outside” of the scene and encapsulate the entire video in a snow globe. All of this is possible because of the layers of depth that the footage provides. That’s not to say there weren’t bugs. The video footage I saw had shimmering around the edges, which Cabral said is basically a flaw in the software that they’re working to fix. Plus, the camera is unable to see what’s behind people, so there’s a tiny bit of streaking along the edges. Still, there’s lots of potential with this kind of content. “This is a new kind of media in video and immersive experiences,” said Eric Cheng, Facebook’s head of Immersive Media, who was previously the director of photography at Lytro. “Six degrees of freedom has traditionally been done in gaming and VR, but not in live action.” Cheng says that many content creators have told him they’ve been waiting for a way to bring live action into these “volumetric editing experiences.” Indeed, that’s partly why Facebook is partnering with post-production companies like Adobe, Foundry and Otoy to develop an editing workflow for these cameras. “Think of these cameras as content acquisition tools for content creators,” said Cheng. But what about other cameras, like Lytro’s Immerge? “There’s a large continuum of these things,” said Cabral.
“Lytro sits at the very, very high end.” It’s also not nearly as portable as the x24 and x6, which are both designed for a much more flexible and nimble approach to VR capture. As for when cameras like these will make their way down to the consumer level, Facebook says that will come in future generations. “That’s the long arc of where we’re going with this,” said CTO Mike Schroepfer. “Our goal is simple: we want more people producing awesome, immersive 360 and 3D content,” said Schroepfer. “We want to bring people up the immersion curve. We want to be developing the gold standard and say this is where we’re shooting for.” Click here to catch up on the latest news from F8 2017!
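The depth-per-pixel idea behind six-degrees-of-freedom footage can be sketched with a standard pinhole-camera unprojection: once every pixel carries a depth value, each one can be lifted into a 3D point, which is what makes reconstructing new viewpoints and compositing 3D effects possible. This is a generic illustration of the concept, not Facebook's actual x24/x6 reconstruction pipeline, and the focal length and principal point below are made-up example values:

```python
# Lift a pixel with known depth into a 3D point (pinhole camera model).
# fx, fy: focal lengths in pixels; cx, cy: principal point (image centre).
# Illustrative only -- not Facebook's Surround 360 software.

def unproject(u, v, depth, fx, fy, cx, cy):
    """Map pixel (u, v) with depth (metres along the optical axis)
    to a 3D point (x, y, z) in camera coordinates."""
    x = (u - cx) / fx * depth
    y = (v - cy) / fy * depth
    return (x, y, depth)

# A pixel at the image centre sits on the optical axis, so x = y = 0.
point = unproject(u=320, v=240, depth=2.0, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
print(point)  # (0.0, 0.0, 2.0)
```

With a cloud of such points from many cameras, a renderer can synthesize the view from a head position the physical camera never occupied, which is the effect described in the aquarium demo above.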
Enlarge / Hydrophylax bahuvistara (credit: Sanil-George-Jessica-Shartouny) From the slimy back of a South Indian frog comes a new way to blast influenza viruses. A compound in the frog’s mucus—long known to have germ-killing properties—can latch onto flu virus particles and cause them to burst apart, researchers report in Immunity. The peptide is a potent and precise killer, able to demolish a whole class of flu viruses while leaving other viruses and cells unharmed. But scientists don’t know exactly how it pulls off the viral eviscerations. No other antiviral peptide of its ilk seems to work the same way. The study authors, led by researchers at Emory University, note that the peptide appears uniquely nontoxic—something that can’t be said of many other frog-derived compounds. Thus, the peptide on its own holds promise as a potential therapy someday. But simply figuring out how it works could move researchers closer to a vaccine or therapy that could take out all flus, ditching the need for yearly vaccinations against each season’s flavor of flu. Read 10 remaining paragraphs | Comments
It seems like every day there’s news of another significant data breach, so here’s today’s: An internal investigation by the InterContinental Hotel Group, which owns Holiday Inn, has revealed that guests at more than a thousand of their hotels had their credit card details stolen. The company identified malware on… Read more…
Enlarge / Orion Hindawi, co-founder and chief technology officer of Tanium Inc. Information security company Tanium is a relatively well-established “next-generation” cybersecurity vendor, founded 10 years ago—far ahead of the wave of venture capital-funded newcomers, like Cylance, who have changed the security software space. (Tanium has reached a market valuation of more than $3 billion, though there are no indications of when it will make an initial public offering.) Starting in 2012, Tanium apparently had a secret weapon to help it compete with the wave of newcomers, one the company’s executives used in sales demonstrations: a live customer network they could tap into for product demos. There was just one problem: the customer didn’t know that Tanium was using its network. And since the customer was a hospital, the Tanium demos—which numbered in the hundreds between 2012 and 2015, according to a Wall Street Journal report—exposed live, sensitive information about the hospital’s IT systems. Until recently, some of that data was shown in publicly posted videos. In 2010, Tanium’s software was installed at Allscripts Healthcare Solutions’ El Camino Hospital (which markets itself as “the hospital of Silicon Valley”) in Santa Clara County, California. The hospital no longer has a relationship with Tanium. While Tanium did not have access to patient data, the demos showed desktop and server management details that were not anonymized. Read 3 remaining paragraphs | Comments
New submitter omaha393 quotes a report from R&D Magazine: Toyota announced a new initiative on Wednesday aimed at advancing its work in vehicles powered by alternative energy sources. The automaker unveiled Project Portal, a novel hydrogen fuel cell system designed for heavy-duty truck use at the Port of Los Angeles. A proof-of-concept truck powered by this fuel cell will be part of a feasibility study held at the Port this summer, with the goal of examining the potential of the technology in heavy-duty applications. The test vehicle will produce more than 670 horsepower and 1,325 pound-feet of torque from two of these novel fuel cell stacks along with a 12 kWh battery. Overall, the truck has a combined weight capacity of 80,000 pounds and a driving range of over 200 miles. omaha393 adds: “While hydrogen fuel has been criticized due to the high cost of production and safety concerns, recent advances in catalysis and solid storage systems have made hydrogen an attractive commercial prospect for the future.” Read more of this story at Slashdot.
Our fingerprints are quickly replacing PINs and passwords as our primary means of unlocking our phones, doors and safes. They’re convenient, unique and ultimately more secure than easily guessed or forged passwords and signatures. So it makes sense that fingerprint sensors are coming to protect our credit and debit cards. MasterCard is testing new fingerprint sensor-enabled payment cards that, combined with the onboard chips, offer a new, convenient way to authorize your in-person transactions. Instead of signing a paper receipt or entering your PIN while struggling to cover up the number pad, you simply place your thumb on your card to prove your identity. The new cards are currently being tested in South Africa, and MasterCard hopes to roll them out to the rest of the world by the end of 2017. Even if that happens, though, you’ll still have to wait for your bank or financial institution to get on board. Once the technology is ready for the public, here’s how it should work. Your bank will inform you that the biometric card is available, and if you’re interested, you’ll have to go to an enrollment center (most likely a bank) to get your fingers scanned. An encrypted digital template of your fingerprint is stored on the card’s EMV chip. You can save up to two prints, but they both have to be yours — you can’t authorize someone else to use your card with their fingers. After your templates are saved, your card is ready to be used at compatible terminals worldwide — merchants don’t have to get new equipment to accept your fingerprint-enabled plastic. The card itself is, surprisingly, no thicker than a regular credit card. The fingerprint sensor is a small, thumbnail-sized rectangle that sits at the top right corner and is easily accessible when you stick the card into a payment terminal. During a recent demo, I tried to use a MasterCard rep’s biometric card with my finger, and received a “Transaction denied” message from the test payment terminal.
When she carried out the faux purchase, the payment went through, and the machine began printing a receipt. What really surprised me was the speed at which it happened. When the terminal asks you to insert the card, it sends the bank information like your identity and the amount of the transaction. Then it verifies your identity by asking for your fingerprint. The sensor reads your finger and sends the information to the card’s chip, which determines if you’re the owner. If you are, it sends a “Yes” or “Authorized” message to the bank, which then allows the payment to go through. At my demo, the authorization process happened almost instantly, which is reasonable given that it all happens on the card instead of going through the bank. When I used the card, however, there was a slight pause before it registered that I wasn’t authorized. I didn’t have trouble learning the new process at all, either — it’s intuitive and straightforward to simply leave your finger on the card as you slide it into a payment terminal. Of course, this method is only compatible with chip-and-PIN cards, so it won’t work at stores that only accept the older magnetic-stripe models. But embedded chip technology has become increasingly popular in the US, thanks largely to regulations making financial institutions and merchants liable for breaches resulting from a lack of support for chip cards. Getting a new biometric card is troublesome, since it requires a trip to the bank and a potentially long wait. But the convenience — and the joy you’ll get from waving that fancy new plastic in your friends’ faces — may make that agony worthwhile.
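The flow described above — the sensor reads the finger, the chip matches it against stored templates, and only a yes/no answer leaves the card — can be sketched as follows. This is a toy model of the decision logic, not MasterCard's implementation: real matchers compare fingerprint minutiae with a similarity score rather than exact equality, and every name and threshold here is hypothetical.

```python
# Toy model of on-card fingerprint verification. The enrolled template
# never leaves the chip; the terminal only sees the authorized/denied
# result. All names and the 0.9 threshold are illustrative assumptions.

def similarity(scan_a, scan_b):
    """Crude stand-in for a real matcher: fraction of shared features."""
    matches = len(set(scan_a) & set(scan_b))
    return matches / max(len(scan_a), len(scan_b))

class CardChip:
    def __init__(self, enrolled_templates, threshold=0.9):
        # Up to two stored prints, both belonging to the cardholder.
        self.enrolled = enrolled_templates
        self.threshold = threshold

    def verify(self, live_scan):
        """Match the live scan against stored templates, on the chip itself."""
        best = max(similarity(live_scan, t) for t in self.enrolled)
        return "Authorized" if best >= self.threshold else "Transaction denied"

owner_print = ["ridge1", "ridge2", "whorl3", "loop4"]
chip = CardChip(enrolled_templates=[owner_print])

print(chip.verify(["ridge1", "ridge2", "whorl3", "loop4"]))  # Authorized
print(chip.verify(["other1", "other2", "other3", "other4"]))  # Transaction denied
```

Keeping the match on the chip is also why the demo felt instant: no round trip to the bank is needed for the biometric check, only for the payment authorization itself.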