Tech Today w/ Ken May

Featured entries

The Reality of the iPhone Line Is a Black Market Nightmare

Posted by kenmay on September - 20 - 2014

This week, people camped outside Apple stores for days anticipating the iPhone 6. But those line-waiters weren’t all frenzied Apple fans high on the joy of a new smartphone: As filmmaker Casey Neistat portrays it, many of the line-sitters were buying the new iPhone to immediately resell it on the black market. Read more…

San Francisco’s current tech-led boom has seen slick new housing high-rises pop up all across the grid, but Bay Area urban renewal in the 1970s had a very different look. Photographer Dave Glass is a native of the city’s Western Addition, and he snapped these images of Victorians being driven around town like massive domestic trailers roughly four decades ago. Read more…

When Alibaba stopped trading its shares on Friday, the Chinese e-commerce company had officially logged the biggest Initial Public Offering (IPO) in US history, raising $21.8 billion in its first day on the New York Stock Exchange. The company’s first-day performance gives it a market capitalization of over $200 billion, “putting it among the 20 biggest companies by market cap in the US,” the Wall Street Journal notes. Alibaba’s IPO beat out record IPOs like Visa’s $17.9 billion IPO in 2008 and General Motors’ $15.8 billion sale in 2010. And Alibaba beat out its peers in the tech sector too, like Facebook (whose IPO raised $16 billion) and Google (whose 2004 IPO raised only $1.67 billion—paltry in today’s terms). Earlier this month, the company announced that it would price its shares at $66. This morning around noon ET, the NYSE gave the go-ahead for the company, whose ticker symbol is BABA, to start trading. Shares started at $92.70, well above what the company was aiming for, and ended the day at $93.89 after reaching a high of $99.70. In after-hours trading, Alibaba is down slightly at $93.60 per share as of this writing. Read 3 remaining paragraphs | Comments

A federal judge in Texas has ruled that a local man conducted a massive Bitcoin-based Ponzi scheme, ordering him to pay $40.4 million. The court found on Friday that Trendon Shavers had created a virtual Bitcoin-based hedge fund that many suspected of being a scam—and it turned out they were right. The Bitcoin Savings and Trust (BTCST) shut down in August 2012, and by June 2013 the Securities and Exchange Commission (SEC) had filed charges against its founder. In a statement at the time, the SEC said Shavers “raised at least 700,000 Bitcoin in BTCST investments, which amounted to more than $4.5 million based on the average price of Bitcoin in 2011 and 2012 when the investments were offered and sold.” Judge Amos Mazzant wrote: Read 2 remaining paragraphs | Comments

wabrandsma (2551008) writes with this excerpt from The Verge: Last night, researchers at Malwarebytes noticed strange behavior on sites like The Times of Israel and The Jerusalem Post. Ads on the sites were being unusually aggressive, setting off anti-virus warnings and raising flags in a number of Malwarebytes systems. After some digging, researcher Jerome Segura realized the problem was coming from Google’s DoubleClick ad servers and the popular Zedo ad agency. Together, they were serving up malicious ads designed to spread the recently identified Zemot malware. A Google representative has confirmed the breach, saying “our team is aware of this and has taken steps to shut this down.” Read more of this story at Slashdot.

UK Engineers 3D Print Their Own Raspberry Pi Laptop

Posted by kenmay on September - 19 - 2014

Is there anything a robotic system for the extrusion of plastic into solid forms over time can’t do? We present to you today the Pi-Top, a Raspberry Pi-based laptop that is completely 3D printed and lasts for hours on a single charge. The kit, which will launch on Kickstarter soon, offers a 13.3-inch screen and a little keyboard and trackpad combo for data entry. Voilà! A little… Read More

Hack runs Android apps on Windows, Mac, and Linux computers

Posted by kenmay on September - 19 - 2014

The official Android Twitter app running on Mac OS. (Ron Amadeo) If you remember, about a week ago, Google gave Chrome OS the ability to run Android apps through the “App Runtime for Chrome.” The release came with a lot of limitations—it only worked with certain apps and only worked on Chrome OS. But a developer by the name of “Vladikoff” has slowly been stripping away these limits. First he figured out how to load any app on Chrome OS, instead of just the four that are officially supported. Now he’s made an even bigger breakthrough and gotten Android apps to work on any desktop OS that Chrome runs on. You can now run Android apps on Windows, Mac, and Linux. The hack depends on App Runtime for Chrome (ARC), which is built using Native Client, a Google project that allows Chrome to run native code safely within a web browser. While ARC was only officially released as an extension on Chrome OS, Native Client extensions are meant to be cross-platform. The main barrier to entry is obtaining ARC from the Chrome Web Store, which flags desktop versions of Chrome as “incompatible.” Vladikoff made a custom version of ARC, called ARChon, that can be sideloaded simply by dragging the file onto Chrome. It should get Android apps up and running on any platform running the desktop version of Chrome 37 and up. The hard part is getting Android apps that are compatible with it. ARC doesn’t run raw Android app packages (APKs)—they need to be converted into a Chrome extension—but Vladikoff has a tool called “chromeos-apk” that will take care of that, too. Read 4 remaining paragraphs | Comments

TrueCrypt Gets a New Life, New Name

Posted by kenmay on September - 19 - 2014

storagedude writes: Amid ongoing security concerns, the popular open source encryption program TrueCrypt may have found new life under a new name. Under the terms of the TrueCrypt license — which was a homemade open source license written by the authors themselves rather than a standard one — a forking of the code is allowed if references to TrueCrypt are removed from the code and the resulting application is not called TrueCrypt. Thus the fork, now called CipherShed, will be released under a standard open source license, with long-term ambitions to become a completely new product. Read more of this story at Slashdot.

NVIDIA’s latest GPU crams 4K images on 1080p displays

Posted by kenmay on September - 19 - 2014

Back in February, NVIDIA trotted out the very first desktop GPUs to feature its new Maxwell architecture: the GeForce GTX 750 and 750 Ti. These entry-level cards were paragons of efficiency, but they were hardly strong examples of what the company’s latest graphics technology was truly capable of. No, NVIDIA revealed those graphics cards today — the GeForce GTX 980 and 970 desktop GPUs. The new flagship GPUs still benefit from the efficiency gains made by the first-generation Maxwell cards, but lean far more heavily toward performance. If you’re a PC gamer with a GTX 680 or 560 in your tower, these are the cards NVIDIA wants you to upgrade to. On paper, there’s reason enough to appreciate these cards’ power: the $549 GTX 980 boasts a 1.1GHz base clock speed (1.2GHz with boost), 2048 CUDA cores and 4GB of GDDR5 video memory. The $329 GTX 970 sheds a few of those CUDA cores (totaling 1664) and clocks down to 1GHz (1.1GHz with boost), but it consumes a little less power for the downsizing: 145W to the 980’s 165W. In NVIDIA’s tests (viewable in the gallery above), the new cards reportedly outperformed AMD’s kit with almost half the power draw. Still, even NVIDIA knows stats and core counts mean bupkis to the general consumer — gamers want to know what all these specifications are going to do for them. We met up with Scott Herkelman, NVIDIA’s general manager of GeForce, to learn about Maxwell’s new tricks. “One of the things that we thought about when we wanted to launch Maxwell is this dichotomy that gamers are running into today,” Herkelman told Engadget. NVIDIA found that gamers either wanted to increase visuals past a game’s prescribed performance settings or maximize framerate without sacrificing image quality. Surprise, surprise: Maxwell’s second-generation GPUs introduce two new technologies that can help. Dynamic Super Resolution, for instance, lies to your game to make it output a higher resolution than your display actually supports.
“We render a 4K image in the background and then put it through a 13-tap Gaussian filter,” he explained. “Then we bring that down to a 1080p monitor.” As far as the game is concerned, it’s piping out an ultra-high-resolution image to a 4K monitor, but Maxwell is forcing it to run on your 1080p display. This feature is designed to improve picture quality in a game that is already tuned to its best visual settings. Basically, it makes downsampling easy. It looks pretty good in action too, but it isn’t perfect: some 4K UI elements don’t scale well on smaller monitors. Herkelman says NVIDIA is continuing to improve and tweak the feature. “The other new technology we have is called MFAA, or Multi-Frame Sampled Anti-Aliasing,” Herkelman said. “This is for those games where you already have great image quality but you want more performance.” Like traditional anti-aliasing, it can sample a pixel multiple times, but MFAA splits the work up over multiple frames. Herkelman says this can improve performance by as much as 30 percent. Finally, high-end Maxwell cards will be able to take advantage of games that use Voxel Global Illumination, a new dynamic lighting technology that promises to enable destructible environments with active, realistic lighting. NVIDIA says the new lighting solution will be available for UE4 and other major engines later this year. Not the bells and whistles you’re looking for? Fine — Maxwell has a few more features hidden away, but you won’t be able to use them until the consumer virtual reality market takes off. NVIDIA’s VR Direct program is working to bring low-latency graphics to consumer VR headsets like the Oculus Rift. Herkelman showed off a Maxwell-powered Eve: Valkyrie demo as an example. Indeed, the demo was smooth, but VR Direct’s future impact on GeForce Experience really caught our attention.
In addition to supporting SLI, DSR and MFAA, NVIDIA’s VR Direct promises “auto stereo,” a feature designed to bend a game not intended for virtual reality into the Oculus Rift’s stereoscopic perspective. Herkelman told us that the feature would probably have a whitelist of compatible games, not unlike how the company implements NVIDIA 3D Vision. So, when can consumers get their hands on the new Maxwell cards? Soon. NVIDIA CEO Jen-Hsun Huang officially announced the new GeForce GTX cards at Game24 this evening, and they should be available for sale tomorrow morning from NVIDIA’s usual hardware partners: EVGA, ASUS, Gigabyte, MSI and PNY, among others. Are you planning to upgrade, or will you wait to see what AMD cooks up in competition? Let us know what you think in the comments section below. Filed under: Gaming, NVIDIA | Comments
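The core idea behind Dynamic Super Resolution, rendering at a higher resolution and then filtering the result down to the display, can be sketched in a few lines. This is a toy illustration, not NVIDIA's driver implementation: plain box averaging stands in for the 13-tap Gaussian filter Herkelman describes, and the "image" is just a nested list of brightness values.

```python
# Toy sketch of supersampling-based downscaling (the idea behind DSR):
# render a frame at 2x the display resolution, then collapse each
# 2x2 block of "supersampled" pixels into one display pixel.
# (NVIDIA's real pipeline uses a 13-tap Gaussian filter; simple
# box averaging stands in for it here.)

def downsample_2x(image):
    """Average each 2x2 block of a 2H x 2W image into one pixel."""
    h, w = len(image) // 2, len(image[0]) // 2
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            block_sum = (image[2 * y][2 * x] + image[2 * y][2 * x + 1] +
                         image[2 * y + 1][2 * x] + image[2 * y + 1][2 * x + 1])
            row.append(block_sum / 4.0)
        out.append(row)
    return out

# A 4x4 "supersampled" frame becomes a 2x2 "display" frame; hard
# edges in the high-res frame come out as smoothed averages.
hi_res = [
    [1.0, 1.0, 0.0, 0.0],
    [1.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 0.0, 1.0, 1.0],
]
lo_res = downsample_2x(hi_res)
print(lo_res)  # [[1.0, 0.0], [0.0, 1.0]]
```

The game thinks it is rendering to the big grid; the display only ever sees the small one, which is why edges and fine detail look cleaner than a native low-resolution render.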

NVIDIA’s new GPU proves moon landing truthers wrong

Posted by kenmay on September - 19 - 2014

Despite overwhelming evidence to the contrary, there still exist some people on planet Earth who believe it’s the only celestial body humanity has ever walked upon. You’ve heard it before — the moon landing was a hoax, a mere TV drama produced by Stanley Kubrick and presented as fact to dupe the Soviet Union into giving up the space race. This deliciously ludicrous conspiracy theory has been debunked countless times, but now its advocates have one more refutation to deny: NVIDIA’s Voxel Global Illumination tech demo. It’s a GPU-powered recreation of the Apollo 11 landing site that uses dynamic lighting technology to address common claims of moon-deniers, and it’s pretty neat. Mark Daly, NVIDIA’s senior director of content development, told Engadget its Apollo 11 demo was created as an answer to Sponza — a popular global illumination model frequently used by the academic crowd. It’s a good model, he says, but it’s not very interesting to watch. “Jen-Hsun [Huang], our CEO, looked at it and said ‘Isn’t there something better?’ Anyway, one of our research engineers happened to put this slide up of Buzz Aldrin on the moon in a meeting and said ‘this speaks global illumination to me because of all the hoaxers and deniers of the moon landing.’” Conspiracy theorists say that Aldrin simply couldn’t have been lit up the way he is in the picture. NVIDIA took it as a challenge. Buzz Aldrin (right) next to his computer-generated doppelganger (left). NVIDIA chose to create a 3D rendition of a photograph showing Buzz Aldrin descending a ladder to the moon’s surface. Folks who insist the landing was a hoax claim that without the light-diffusing effect of an atmosphere, the shadow of the lander should cast Aldrin in almost complete darkness. “You can explain it,” Daly says, “and say light bounces around even on the moon… or you can show it.
We decided to take the approach to show it, but it turns out that it’s not that easy — there isn’t a lot of light on [Aldrin].” Daly’s challenge was not in placing lights around a computer-simulated scene of the Apollo 11 landing, but in using NVIDIA’s Voxel Global Illumination to make a single light source, the simulated sun, correctly reflect off of every material in the scene. To do this, he had to research the materials of NASA’s lander, the brightness of our local star and even the reflectivity of the moon’s surface. “It turns out there is a lot of information about the astronomical bodies floating out there in space,” he explains. “Starting with the sun. The sun itself is 128,500 lux — that’s lumens per square meter — but it turns out the moon is a crappy reflector of light.” Daly discovered that the moon is only 12-percent reflective, and absorbs most of the sunlight hitting it. On the other hand, 12 percent of 128,500 lux is quite a lot. “It’s the equivalent to ten 100-watt lightbulbs per square meter of light bouncing off the moon.” More than enough to make Aldrin visible under the lander’s shadow. While this exercise showed that the moon was reflective enough to highlight Aldrin, something was still wrong. Daly noticed that the astronaut’s side wasn’t lit the same in NVIDIA’s simulation as it was in NASA’s photograph, but he wasn’t sure why. “A couple of people really into the moon landing told me, ‘by the way, you should take into account Neil Armstrong and the light coming off of him.’ At first I was like, yeah, whatever — the sun is doing all the work — something the size of a guy in a space suit isn’t going to contribute much light.” He quickly learned his assumption was wrong: the material on the outside of the astronauts’ suits is 85-percent reflective.
“Sure enough, we put him in there, adjusted the reflectivity of his suit, put him in the position where the camera would be… and it contributed another 10% or so of light to the side of Buzz Aldrin.” Daly found that his own doubt mirrored the claims of some landing-deniers. Some claim that because Aldrin is in shadow, there would need to be some sort of auxiliary lighting behind the camera — supposed proof that the image was taken in a studio. “As it turns out, yes! They’re right — there was a light there; it was the sun reflecting off of Neil Armstrong’s suit. I really didn’t believe it would contribute that much.” It’s the dynamic nature of Voxel Global Illumination that allows NVIDIA to poke fun at these hoax claims: the entire scene renders light reflection on the fly, based solely on the illumination provided by the simulated sun. “We learned a heck of a lot about how all these materials reflect light and put them into the material descriptions, the BRDF (bidirectional reflectance distribution function),” Daly said, explaining how developers create a VXGI lighting environment. “The VXGI we’ve integrated into Unreal Engine 4 reads all those materials you’ve given it and, based on the reflectivity of those materials, constructs a lighting model.” It’s a lot of work to set up, but it makes adjusting the light easy after the fact. NVIDIA is able to drag the sun to new positions, add new elements to the scene or even remove the moon’s natural reflectivity to create the false conditions moon-truthers think represent the lunar surface. This versatility allowed NVIDIA to address one more hoax claim before our demo ended: the stars. If NASA really landed on the moon, why can’t we see the stars in any of the Apollo 11 photographs? Well, that’s more of a matter of film exposure than lighting trickery.
Because the unfiltered sun is so ridiculously bright (128,500 lux, remember?), the astronauts’ cameras were set to use a small aperture, letting in only a fraction of the available light in order to keep the picture from blowing out. NVIDIA was able to simulate this too, and widened the virtual camera’s aperture to reveal the demo’s simulated stars. It worked, but at the expense of the camera’s true subject matter: Aldrin’s descent to the lunar surface became a blown-out, over-exposed mess. Science has been able to debunk these moon hoax theories for decades, but it’s nice to see a real-time simulation that helps illustrate those explanations. Better still, Daly says NVIDIA is currently building a consumer UI for the demo, and will release it to the public sometime in the next several weeks. It’s also a project that has become important to him. “Because I got to see a lot of this live when I was a kid, it has a special meaning to me. I know in Apollo 1 three men died, and other men risked their lives to get into these crazy contraptions to actually do this. It’s kind of offensive to me when people say this didn’t happen,” he explains. “I want to show that it really happened and these people risked their lives. They actually did go to the moon.” Post by NVIDIA. Comments
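Daly's back-of-the-envelope illumination numbers are easy to check. Here is a rough sketch using the figures quoted in the article; the 1,600-lumen output assumed for a 100-watt incandescent bulb is a typical value, not something the article states.

```python
# Rough check of the demo's illumination arithmetic: sunlight at the
# lunar surface, times the moon's reflectivity, gives the light
# bounced back toward an astronaut standing in the lander's shadow.

sunlight_lux = 128_500      # lumens per square meter, per the article
moon_reflectivity = 0.12    # the moon absorbs the other ~88% of sunlight

reflected_lux = sunlight_lux * moon_reflectivity
print(reflected_lux)  # 15420.0 lumens per square meter

# Daly compares this to "ten 100-watt lightbulbs per square meter".
# Assuming ~1,600 lumens per 100 W incandescent bulb (a common
# figure, not from the article), that comparison roughly holds:
bulb_lumens = 1_600
print(reflected_lux / bulb_lumens)  # ~9.6 bulbs' worth per square meter
```

Even a 12-percent reflector bounces back thousands of lux, which is why Aldrin is visible inside the lander's shadow without any studio lighting.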