Gizmodo: Don’t Buy Anyone an Amazon Echo Speaker

Adam Clark Estes, writing for Gizmodo: Three years ago, we said the Echo was “the most innovative device Amazon’s made in years.” That’s still true. But you shouldn’t buy one. You shouldn’t buy one for your family. Your family members do not need an Amazon Echo or a Google Home or an Apple HomePod or whatever that one smart speaker that uses Cortana is called. And you don’t either. You only want one because every single gadget-slinger on the planet is marketing them to you as an all-new, life-changing device that could turn your kitchen into a futuristic voice-controlled paradise. You probably think that having an always-on microphone in your home is fine, and that tech companies only record and store snippets of your most intimate conversations. No big deal, you tell yourself. Actually, it is a big deal. The newfound privacy conundrum presented by installing a device that can literally listen to everything you’re saying represents a chilling new development in the age of internet-connected things. By buying a smart speaker, you’re effectively paying money to let a huge tech company surveil you. And I don’t mean to sound overly cynical about this, either. Amazon, Google, Apple, and others say that their devices aren’t spying on unsuspecting families. The only problem is that these gadgets are both hackable and prone to bugs.

AI film editor can cut scenes in seconds to suit your style

AI has won at Go and done a few other cool things, but so far it’s been mighty unimpressive at harder tasks like customer service, Twitter engagement and script writing. However, a new algorithm from researchers at Stanford and Adobe has shown it’s pretty damn good at video dialogue editing, something that requires artistry, skill and considerable time. The bot not only removes the drudgery, but can edit clips using multiple film styles to suit the project.

First of all, the system can organize “takes” and match them to lines of dialogue from the script. It can also do voice, face and emotion recognition to encode the type of shot, the intensity of the actor’s feelings, the camera framing and other attributes. Since directors can shoot up to 10 takes per scene (or way more, in the case of auteurs like Stanley Kubrick), that alone can save hours.

The real power of the system, however, is “idiom” editing based on the rules of film language. For instance, many scenes start with a wide “establishing” shot so that the viewer knows where they are. You can also use leisurely or fast pacing, emphasize a certain character, intensify emotions or keep shot types (like wide or closeup) consistent. Such idioms are generally used to best tell the story the way the director intended. All the editor has to do is drop their preferred idioms into the system, and it will automatically cut the scene to match, following the script.

In one example, the team selected “start wide” to establish the scene, “avoid jump cuts” for a cinematic (non-YouTube) style, “emphasize character” (“Stacey”) and a faster-paced performance. The system instantly created a cut that was pretty darn watchable, closely hewing to the comedic style the script was going for. The team then shuffled the idioms, and it generated a “YouTube” style that emphasized hyperactive pacing and jump cuts. What’s best (or worst, perhaps, for professional editors) is that the algorithm was able to assemble the 71-second cut within two to three seconds and switch to a completely different style instantly. Meanwhile, it took an editor three hours to cut the same sequence by hand, counting the time it took to watch each take.

The system only works for dialogue, not action or other types of sequences. It also has no way to judge the quality of the performance, naturalism or emotional beats in a take. Editors, producers and directors still have to examine all the video that was shot, so AI is not going to take those jobs away anytime soon. However, it looks about ready to replace the assistant editors who organize all the material, or at least do a good chunk of their work. More importantly, it could remove a lot of the slogging normally required to edit, and let an editor see quick cuts based on different styles. That would leave more time for fine-tuning, where their skill and artistic talent are most crucial. Source: Stanford
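To make the idiom idea concrete, here is a minimal sketch of how idiom-driven cut selection might work. The Take fields, idiom names and scoring rules are illustrative assumptions, not the researchers’ actual implementation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Take:
    line_id: int      # which script line this take covers
    shot: str         # "wide", "medium" or "closeup"
    character: str    # character emphasized in the frame
    intensity: float  # recognized emotional intensity, 0..1

def score(take: Take, prev: Optional[Take], idioms: set, emphasize: str = "") -> float:
    """Score a candidate take against the editor's chosen idioms (assumed rules)."""
    s = take.intensity  # break ties toward stronger performances
    if "start wide" in idioms and prev is None:
        s += 2.0 if take.shot == "wide" else -2.0
    if "avoid jump cuts" in idioms and prev is not None:
        # Penalize cutting between near-identical framings of the same character.
        if take.shot == prev.shot and take.character == prev.character:
            s -= 1.5
    if "emphasize character" in idioms and take.character == emphasize:
        s += 1.0
    return s

def edit(takes: list, idioms: set, emphasize: str = "") -> list:
    """Greedily pick the best-scoring take for each script line, in order."""
    cut, prev = [], None
    for line_id in sorted({t.line_id for t in takes}):
        candidates = [t for t in takes if t.line_id == line_id]
        best = max(candidates, key=lambda t: score(t, prev, idioms, emphasize))
        cut.append(best)
        prev = best
    return cut

takes = [
    Take(0, "wide", "Stacey", 0.4), Take(0, "closeup", "Stacey", 0.7),
    Take(1, "closeup", "Stacey", 0.8), Take(1, "medium", "Mike", 0.5),
]
for t in edit(takes, {"start wide", "avoid jump cuts", "emphasize character"}, "Stacey"):
    print(t.shot, t.character)
```

Swapping the idiom set (say, dropping “avoid jump cuts”) re-ranks the same takes instantly, which is essentially the style-switching trick described above.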

Overclocking to 7GHz takes more than just liquid nitrogen

Over the years, I’ve been fascinated by two kinds of events at Computex: eSports and extreme PC overclocking competitions. I doubt I’d ever make it as a professional gamer (I’m more of a Counter Fight kind of guy than a Counter-Strike man these days), but I’d jump at any opportunity to pour liquid nitrogen onto a PC motherboard, because even if I screw something up, chances are I’d still look cool doing so. It just so happened that at this year’s Computex, gaming accessory maker G.SKILL invited me to its extreme overclocking workshop behind its contest stage. As a total newbie with absolutely zero knowledge of overclocking, I quickly took up the offer.

The one-on-one workshop was jointly set up by G.SKILL and overclocking enthusiast group HWBOT. Rather than getting me to build from scratch, the instructors had already put together a rig that would let me dive right into the overclocking process. At first sight, I was slightly overwhelmed by the setup in front of me: an ASUS ROG Maximus IX APEX motherboard carrying two 8GB G.SKILL Trident Z DDR4 RAM sticks, an Intel Core i7-7700K and a chunky copper pot sitting directly on top of the CPU to hold liquid nitrogen. A fan hung off the pot to suck vapor away, in order to avoid condensation on the motherboard. The monitor was showing the ASUS TurboV Core software along with CPU-Z — the former for adjusting various CPU parameters, and the latter for keeping an eye on the CPU’s status. To keep track of the CPU’s temperature, an industrial thermometer was hooked up to a thermal probe inside the liquid nitrogen pot.

The main objective of the workshop was to push the CPU from its 4.2GHz base frequency all the way to 7GHz, and this required first lowering the CPU’s temperature to nitrogen’s boiling point: -195.8°C or -320.4°F. My instructor, HWBOT director Pieter-Jan Plaisier, started by running Cinebench in Windows to check the CPU’s stability while I slowly poured liquid nitrogen into the pot. Once the pot reached just a little below -190°C, the liquid nitrogen stopped boiling frantically, and this was when I could actually start overclocking.

Plaisier set me off with a couple of settings in TurboV Core: he bumped the CPU ratio to 55 to reach a 5.5GHz clock speed (derived from the 100MHz default base clock), then pushed the CPU core voltage to 1.855V. As I took the CPU ratio from 55 to 65, I went up by increments of two each time before hitting the “Apply” button, and I would always wait until CPU-Z reflected the new clock speed before applying the next settings. Meanwhile, I also made sure that the CPU temperature wasn’t fluctuating too much by occasionally refilling the pot (thankfully, G.SKILL had plenty of liquid nitrogen to share).

So far so good. When the CPU reached 6.5GHz, I started nudging the CPU ratio by increments of just one, while also gradually bumping up the CPU core voltage — it’d need about 1.925V to be stable at 7GHz. I became more mindful of my actions while juggling the monitor, the mouse, the thermometer, the rig and the flask; but I kept my cool, because the last thing I wanted was to spill liquid nitrogen all over the place — especially not in front of my friend Lau Kin Lam, the champion of G.SKILL’s OC World Cup 2015, whom I’d brought along for support. By taking my time to fine-tune each parameter, I eventually saw the “Core Speed” figure in CPU-Z floating around 7GHz.
Just as I was about to give myself a pat on the back, the monitor went black. The computer had crashed. My first instinct was to check the thermometer, but the temperature was still at around -192°C, so it wasn’t clear what had caused the crash. To my surprise, Plaisier then brought a blowtorch out of nowhere and started blasting fire into the pot. Don’t worry, he wasn’t mad at me; he just needed to bring the CPU temperature up to around -170°C / -274°F so that the system would be able to boot. And sure enough, soon we were back in Windows.

I repeated the same steady process and reached 7GHz again, and this time, both Plaisier and Lau encouraged me to go further. I obliged. As I started pushing the CPU to its limit, I had to take baby steps in TurboV Core — leaving the CPU ratio as-is and tweaking the base clock speed instead. First, I had to push the CPU core voltage to 1.955V, and then I started nudging the 100MHz base clock up by 0.1MHz or 0.2MHz at a time. The resulting gain in CPU clock speed was obviously much smaller than before, but my patience eventually paid off: I somehow managed to break HWBOT’s own 7.05GHz record with that particular chip, and I eventually hit 7.08372GHz before the system froze up (no pun intended). That remained the record for the chip at the show until someone else struck back with 7.09744GHz in a later workshop session. I’d be lying if I said I didn’t mind, but still, the clock speed I got would come in ninth in HWBOT’s worldwide overall ranking for the Core i7-7700K. Not bad for a first-timer.

Of course, I don’t plan to submit my achievement to HWBOT, because after all, I had most of the dirty work taken care of beforehand. For instance, it wasn’t until after the workshop that I realized that in order to place the pot directly on the CPU’s silicon, someone would have had to use a delid tool — like the upcoming der8auer Delid Die Mate-X — to pop the CPU’s lid off first. You’d also have to apply fresh thermal paste between the silicon and the pot, and if the paste isn’t applied properly, you’d end up with uneven temperatures across the silicon, leading to faulty operation. Lau also made a good point before we wrapped up: it’s absolutely crucial to waterproof the area around the pot, not because of the liquid nitrogen (it’d just roll off the motherboard thanks to the Leidenfrost effect) but because of water condensing on the outside of the pot. Water on a motherboard would be “game over” for the system, which is why there was a blue towel delicately wrapped around the base of the pot we used.

The overclocking fun doesn’t stop there, though. If you’re adventurous enough, you can also use liquid nitrogen to overclock memory — G.SKILL’s very own Trident Z 2,133MHz DDR4 RAM was the first DDR4 module to break the 5GHz barrier last year. But if you ask me, I’ll probably stick to Plaisier’s advice and learn from scratch by building my own liquid-cooling system first. One step at a time.
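For a rough sense of the arithmetic behind all that knob-turning (effective clock = base clock × CPU ratio), here is a quick sketch. The numbers mirror the session above, but the helper itself is just illustrative and isn’t tied to TurboV Core or any real tuning API:

```python
def cpu_clock_ghz(bclk_mhz: float, ratio: int) -> float:
    """Effective CPU clock: base clock (BCLK, in MHz) times the multiplier."""
    return bclk_mhz * ratio / 1000.0

print(cpu_clock_ghz(100.0, 55))    # 5.5 GHz: the starting point
print(cpu_clock_ghz(100.0, 70))    # 7.0 GHz: the target, via ratio steps alone
# Past the ratio limit, nudging BCLK in 0.1-0.2MHz steps gives a finer grain:
print(cpu_clock_ghz(101.196, 70))  # 7.08372 GHz, the record-breaking result
```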

Apple releases iOS 10.1, adds Portrait mode to the iPhone 7 Plus

The Portrait mode for Apple’s iPhone 7 Plus has been in the works for months, and now it’s ready for the masses… sort of. 7 Plus users running beta software have been able to shoot photos full of artificial bokeh for over a month now, but Apple just pushed out its iOS 10.1 update and Portrait mode came along for the ride. Now, here’s the thing: even though you no longer need to be enrolled in the iOS beta program to use the feature, the feature itself still isn’t completely done. Once the update is installed, the camera app asks if you’d like to “try the beta” when you swipe over to the new Portrait position. Our professional recommendation? Dive right in. Portrait mode might not be finished, but it’s still capable of producing seriously nice headshots.

In case you missed it the first time around, the feature uses the iPhone 7 Plus’s two cameras in tandem — the primary 12-megapixel sensor captures the image as normal, while the second, telephoto camera is used to determine how far away the subject is. All of that data gets mashed up into a nine-layer depth map, providing the context needed to artfully blur out backgrounds while keeping faces and subjects closer to the phone crisp and intact. Apple’s goal was to build a dead-simple photography experience that yields pictures that look like they were shot on expensive SLR cameras, and for the most part, Apple’s work is very impressive.

This photo is representative of the sort of quality you can expect out of Portrait mode: the focus stays locked on the face and hands, and the windows in the background are blurred pretty dramatically. Thanks to that nine-layer depth map, you can also see areas where the blurring is very subtle, like the top of the subject’s head and the bottom of her scarf. You don’t need to take photos of people to get some mileage out of Portrait mode, either. Have cats prancing around? Or a sweet new mug you need to share? In my experience, as long as you’re within proper range (the app tells you when you are) and there’s enough contrast between the foreground and background, you’ll get that pleasant background blurring.

It’s when you’re in well-lit environments with lots of similar colors that Portrait mode seems to have trouble — that’s often when you’ll see edges blurred when they shouldn’t be. Just check out this photo of a cactus precariously perched on a railing: the camera didn’t have trouble differentiating between the cool blue of the pot and the trees in the background, but it obviously had trouble telling where the cactus ended and the trees began. These disappointments are rare, though, and will probably get ironed out as people continue to put Portrait mode through its paces. Most of the big problems have been solved — now Apple just has to focus on the fine-tuning (which is obviously easier said than done).

At this point, Portrait mode is still far from perfect, but there’s a lot to like about just how simple it is to use. It’s fast, it’s impressive and it’s only going to get better with time. Interested in taking it for a spin? Jump into your iPhone 7 Plus’s settings and mash that software update button — it’ll show up sooner or later.
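To illustrate the depth-map idea, here is a minimal sketch of layered depth-of-field blur, assuming you already have a depth map normalized to [0, 1]. Apple’s actual pipeline is proprietary; the nine-layer quantization below simply mirrors the description above:

```python
import cv2
import numpy as np

def portrait_blur(image: np.ndarray, depth: np.ndarray,
                  layers: int = 9, max_kernel: int = 31) -> np.ndarray:
    """Composite a 3-channel `image` from `layers` slices of `depth`
    (0 = nearest, 1 = farthest), blurring each slice more with distance."""
    out = np.zeros_like(image, dtype=np.float32)
    bins = np.clip((depth * layers).astype(int), 0, layers - 1)
    for i in range(layers):
        # Kernel size grows with depth; the nearest layer stays sharp.
        k = 1 + 2 * round(i / (layers - 1) * (max_kernel // 2))
        blurred = cv2.GaussianBlur(image, (k, k), 0) if k > 1 else image
        mask = (bins == i).astype(np.float32)[..., None]
        out += blurred.astype(np.float32) * mask
    return out.astype(image.dtype)
```

With depth quantized into nine bins, the subject bin keeps the original pixels while each farther bin gets a larger Gaussian kernel, which is roughly the “artful blur” behavior described above.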

Jailbreak Turns Cheap Walkie-Talkie Into DMR Police Scanner

An anonymous reader writes: At last Shmoocon, famous reverse engineer Travis Goodspeed presented his jailbreak of the Chinese MD380 digital handheld radio. The hack has since been published on GitHub with all the source code needed to turn a cheap digital radio into the first hardware scanner for DMR digital mobile radio: a firmware patch enables a promiscuous mode that puts all talk groups through the speaker, including private calls. In the U.S., the competing APCO-25 suite of standards covers digital radio communications for federal users, but many state, county, and local public safety organizations, including city police dispatch channels, use Motorola’s MOTOTRBO implementation of the DMR digital standard.
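For the curious, here is a conceptual sketch of the receive-path filter that a promiscuous-mode patch bypasses. The field names and talk group IDs are invented for illustration; this is not the actual MD380 firmware logic:

```python
SUBSCRIBED_GROUPS = {91, 3100}  # talk groups this radio is programmed for
MY_RADIO_ID = 2340001           # this radio's individual DMR ID

def accept_frame(talk_group: int, dest_id: int, private_call: bool,
                 promiscuous: bool = False) -> bool:
    """Decide whether a decoded DMR voice frame reaches the speaker."""
    if promiscuous:
        return True  # patched behavior: everything on the channel is audible
    if private_call:
        return dest_id == MY_RADIO_ID  # stock: only calls addressed to you
    return talk_group in SUBSCRIBED_GROUPS  # stock: subscribed groups only
```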

New Russian Laboratory To Study Mammoth Cloning

An anonymous reader writes: While plans to clone a woolly mammoth are not new, a lab used in a joint effort by Russia and South Korea is. The new facility is devoted to studying extinct animal DNA in the hope of creating clones from the remains of animals found in the permafrost. IBTimes reports: “The Sakha facility has the world’s largest collection of frozen ancient animal carcasses and remains, with more than 2,000 samples in its possession, including some that are tens of thousands of years old, such as a mammoth discovered on the island of Maly Lyakhovsky; experts believe it may be more than 28,000 years old.”

A Glue That Only Hardens When Electrified Will Even Work Under Water

Have you ever gotten a piece of tape wet and noticed that it lost its stickiness? Water and adhesives usually don’t mix, but researchers from Nanyang Technological University in Singapore have created a new type of glue that works in wet environments because it only hardens when a voltage is applied.

Newly discovered frog species looks a lot like Kermit the Frog

We’ve found Kermit the Frog in real life: it’s a recently discovered species of glassfrog from Costa Rica called Hyalinobatrachium dianae. It’s bright green just like Kermit, it has big, adorable white eyeballs just like Kermit, and the males have a distinctive mating call… just like Kermit, I guess? Anyway, the resemblance is uncanny.

iOS 8.3 Prevents Desktop File Explorers from Accessing Apps

Apple’s recently released iOS 8.3 update prevents desktop file explorers like iFunBox, iExplorer, iTools, and others from accessing the app directories on your devices. A few have updated with temporary fixes, but it might take a little while before everything’s working again.

Watching uranium emit radiation inside a cloud chamber is mesmerizing

Here’s a really neat, classic experiment that’s always fun to see. When you place uranium inside a cloud chamber, you can see it decay and emit bits of radiation. It’s like seeing little alpha particle torpedoes shooting out in every direction, leaving a trail behind.
