Bay Area tech company caught paying imported workers $1.21 per hour

Ever heard of Electronics for Imaging? We hadn’t either until this morning, but it’s apparently a multimillion-dollar, multinational, public corporation based out of Fremont, California. And the United States Department of Labor just caught EFI red-handed: an investigation found that “about eight employees” were flown in from India to work 120-hour weeks for $1.21 per hour. EFI apparently thought it was okay to pay the employees the same wages they’d be paid in India (in Indian rupees). Here’s the unbelievably crazy-sounding quote EFI gave to NBC’s Bay Area affiliate: “We unintentionally overlooked laws that require even foreign employees to be paid based on local US standards.” Just so we’re clear: is there anyone reading this who doesn’t know that any person working in the United States is legally required to be compensated according to United States law?

Alberto Raymond, an assistant district director with the US Department of Labor, told NBC, “It is certainly outrageous and unacceptable for employers here in Silicon Valley to bring workers and pay less than the minimum wage.” That applies to EFI especially, which posted just shy of $200 million in revenue in its last financial quarter. EFI is publicly traded on the NASDAQ exchange, and the company is in the business of computer peripherals (mainly printer-related products). The eight employees are being paid $40,000 in owed wages; they were reportedly installing computer systems at the company’s headquarters. EFI itself was fined $3,500 (yes, seriously) for being at fault.

[Image credit: Shutterstock]
Source: NBC Bay Area

Technology changed product placement (and you didn’t even notice)

As the music video starts, Avicii nonchalantly wanders into Stockholm’s Tele2 Arena. He strolls past the venue’s reception; a Grand Marnier poster gets some vital screen time. The bass drops. The crowd goes wild. For some reason, I feel like drinking.

Over the past few weeks, Avicii fans in the US have been unknowingly drawing an association between their favorite Swedish DJ’s progressive-house hit “Lay Me Down” and orange-flavored cognac. Everywhere else in the world, the brand is never seen; a plain wall lies in its place. It’s one of the first examples of a new kind of temporary product placement called “digital insertion.” Product placement typically takes the form of a lingering product shot, like a Beats Pill speaker at the start of a Miley Cyrus video. With recent advances, companies can now use algorithms to digitally serve you unique product placements based on where you live, your age or your salary. It’s a creepy concept, but it could change advertising forever.

The Grand Marnier spot is the work of Mirriad, an agency that sells what it calls “advertising for the skip generation.” Mirriad uses highly complex analysis tools to map video clips, automatically discerning the best places to insert products, billboards and other adverts. The software it created tracks objects and backgrounds in each frame, creating an optical flow of how objects move from second to second and essentially mapping the video in 3D. This enables both planar tracking (for modifying flat surfaces like walls, computer screens or newspapers) and 3D tracking (for placing complex 3D objects into a moving scene). Mirriad CEO Mark Popkiewicz explains the potential of the company’s technology: “We can embed brand assets, digital forms of whatever the brand is. It could be signage, like posters or billboards; it could be actual products. Anything from a can of Coke, a packet of Frosties, a mobile phone. You name it. It can even be a car; we’ve done many of those.”

Mirriad has signed some big deals with Vevo and Universal Music Group (UMG) over the past six months. It also recently announced a partnership with advertising firm Havas to match the right companies to the right videos. Havas is an industry giant with huge brands on its books, and the first wave of Mirriad-UMG placements will include Coca-Cola, LG and Dish Network.

Product placement is obviously nothing new. It dates back almost a century in radio and film, and has its beginnings in literature: companies reportedly clamored to get a mention in Jules Verne’s 19th-century novel Around the World in Eighty Days. Music videos, too, have long been firmly in the grasp of brands, with many clips acting as thinly veiled advertisements for Beats, Coca-Cola and countless others. However, these placements come with their problems. Ever seen the first minute of Hilary Duff’s “All About You” video? It’s essentially an Amazon Fire Phone commercial. How valuable will that ad be to Amazon in five years’ time? You need only look at the countless ’00s musicians flashing two-way pagers for your answer. Regular advertising, be it in print, web or TV, is ephemeral; the ads running alongside this article, for example, are for current products and companies. Why should product placement be any different? Once Grand Marnier’s contract expires, Avicii may be walking past a Ford poster or a can of Sprite.
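The planar case Popkiewicz describes (replacing a wall, a screen or a poster) comes down to warping a flat graphic through a plane-to-plane mapping, frame by frame. Here’s a rough sketch of that single step using stock OpenCV; the corner coordinates are invented, and this illustrates the general technique rather than Mirriad’s pipeline, which also has to track those corners automatically across thousands of frames.

# Rough sketch of planar ad insertion: warp a flat graphic onto a tracked
# quadrilateral (e.g. a wall) in one video frame. Illustrative only; this
# uses stock OpenCV, not Mirriad's tracking software.
import cv2
import numpy as np

frame = np.zeros((720, 1280, 3), dtype=np.uint8)   # stand-in video frame
poster = np.zeros((300, 200, 3), dtype=np.uint8)   # stand-in ad graphic
poster[:] = (40, 90, 200)                          # flat orange-ish fill

# Corners of the poster image and of the tracked wall region in the frame.
src = np.float32([[0, 0], [200, 0], [200, 300], [0, 300]])
dst = np.float32([[420, 180], [610, 200], [600, 520], [415, 500]])

h, _ = cv2.findHomography(src, dst)                # plane-to-plane mapping
warped = cv2.warpPerspective(poster, h, (frame.shape[1], frame.shape[0]))

mask = np.zeros(frame.shape[:2], dtype=np.uint8)
cv2.fillConvexPoly(mask, dst.astype(np.int32), 255)  # region the ad occupies
frame[mask > 0] = warped[mask > 0]                   # composite ad into frame
cv2.imwrite("frame_with_ad.png", frame)

In a real clip the destination corners would come from the tracker for every frame, and the composite would also need to respect lighting and occlusion, which is where the hard work lives.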
But let’s not forget location. At the time of writing, the Fire Phone is available in exactly three countries, yet anyone in the world can watch “All About You.” With digital product placement, the same artist can plug different brands depending on where the video is viewed.

When it comes to buying these ads, Mirriad’s software automatically generates metadata about the videos it processes, cataloging not only the advertising opportunities in each, but also the ideal target market and the value of placements; in fact, it’s really quite similar to web advertising. Rather than Microsoft placing branding on Taylor Swift’s wall, the company need only come to Mirriad and explain what kind of people it wants to advertise to. A campaign could target a million views from 16- to 24-year-olds in the US over a four-week period. Mirriad then embeds the relevant ads into as many videos as necessary to meet that target, using existing analytics from YouTube and others to prove their worth.

“Our algorithms monitor down to a pixel level the actual exposure on screen, time, size, location and orientation of the brand so that we’re always meeting and exceeding a minimum level of exposure,” says Popkiewicz. “Our technology is monitoring that, so that when you buy a campaign from us, you’re going to get a guaranteed level of exposure … For the brands, it takes the uncertainty out of advertising.” Of course, there are limits to what can be automated. “There’s no algorithm in the world that can tell you, ‘This is a good place for Smirnoff because it’s a party atmosphere,’ as opposed to, ‘This is a good place for Starbucks because it’s an office environment.’ Those sorts of things we have to leave to human judgment.”
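That buying model (name an audience, a view target and a window, then let the system pick videos until the forecast adds up) looks a lot like any impression-targeted ad buy. Below is a toy sketch of the allocation step with an invented catalog; it is illustrative only, not Mirriad’s actual system, which would also weigh placement quality and the exposure guarantees Popkiewicz describes.

# Toy version of filling a digitally inserted campaign: pick videos whose
# audience matches the target demographic until the forecast views add up.
# Purely illustrative; the catalog and numbers are made up.
from dataclasses import dataclass

@dataclass
class VideoSlot:
    title: str
    audience: str         # dominant demographic for the video
    forecast_views: int   # expected views over the campaign window

CATALOG = [
    VideoSlot("pop_video_a", "US 16-24", 420_000),
    VideoSlot("pop_video_b", "US 16-24", 350_000),
    VideoSlot("rock_video_c", "US 25-34", 500_000),
    VideoSlot("edm_video_d", "US 16-24", 300_000),
]

def fill_campaign(target_views: int, audience: str) -> list[VideoSlot]:
    """Greedily book matching videos until the view target is covered."""
    chosen, booked = [], 0
    for slot in sorted(CATALOG, key=lambda s: -s.forecast_views):
        if slot.audience == audience and booked < target_views:
            chosen.append(slot)
            booked += slot.forecast_views
    return chosen

placements = fill_campaign(1_000_000, "US 16-24")
print([s.title for s in placements])  # videos that receive the inserted ad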
Mirriad has already brought its ads to TV, and it’s not the first company to do so, either. If you’re in the UK and you watch Hannibal or Bones, chances are you’ve seen some digital product placement, while in the US, rival firm SeamBI offered a similar service that was used to, among other things, insert up-to-date ads into reruns of How I Met Your Mother. SeamBI was founded almost a decade ago, but it’s unclear what’s happened to the company: it hasn’t issued a press release in over two years, its founders are all working elsewhere and a request for comment on this article went unanswered. For now, it seems, Mirriad has this potentially lucrative market largely to itself. Popkiewicz is coy when quizzed on where the company’s placements might end up next, but it’s clear the company has big ambitions.

TV could potentially be a far bigger market for Mirriad and other firms than music videos. There’s an obvious trend away from traditional television and toward digital content, whether through on-demand services from existing TV companies (think Hulu or HBO Go) or from all-digital services like Netflix and Amazon Prime Instant Video. As we move away from watching live broadcasts or buying Blu-ray box sets, Mirriad’s techniques become more and more feasible, and with a growing audience, the potential for more complex placements increases. Although none of the big streaming players are keen on discussing the viability of product placement, TV studios are happy to explain its potential benefits and drawbacks behind closed doors. “As you offer your shows around the world through syndication, you encounter different laws about product placement,” one executive, who prefers to remain anonymous, explains. “Adding ads after the fact increases the amount of money you can make from syndication because each country that airs your show can potentially generate revenue.” Another executive felt similarly upbeat about the financial possibilities, but noted that placements would have to be tasteful: “If you’re not careful to be tasteful, you’ll just end up upsetting your biggest fans.”

Services like Netflix could be key to kicking product placement up a gear. There’s nothing preventing distributors from supplying streaming sites with special versions of your favorite show for various territories, each with different product placements from the version that aired on TV. Similarly, a service could, at any given moment, have hundreds of versions of a particular video for targeted advertising, serving Coca-Cola ads to teens or Grand Marnier to 20-somethings. Of course, this would require a lot of work on Netflix’s end (the company told us it has “nothing to share” on the matter), but should it make financial sense for both parties, it’s hard to see it not happening in some form. The same could be true for on-demand movies. Of course, there would be some backlash if, say, Quentin Tarantino’s Big Kahuna Burger joints suddenly turned into McDonald’s, but with a subtle hand, there’s a chance you may not even notice a new bottle of Coke in the background of your favorite Pulp Fiction scene.

The NSA can now use Samsung’s Galaxy phones for classified work

Samsung Galaxy phones and tablets have just become the first consumer mobile devices approved by the US National Security Agency (NSA) to carry classified documents. The approval covers most of Samsung’s newer Galaxy devices, including the Galaxy S5, Galaxy Note 4 and the Galaxy Note 10.1 tablet (2014 edition), as long as they’re equipped with Knox, Samsung’s mobile security app. Knox-enabled devices have already been approved by the US Department of Defense, but only for general, not classified, use. That’s a welcome bit of good news for Samsung in the face of recent dismal earnings, and it no doubt wants to translate the NSA’s golden nod into consumer and corporate sales. Ironically, many of those potential customers may be paranoid about… the NSA itself.

Via: PC World
Source: Samsung

Dish loses Cartoon Network, CNN and other Turner channels

Starting today, Dish customers will no longer have access to a number of networks from Turner Broadcasting, after the two parties failed to come to terms on a contract extension. Among the channels now removed from Dish’s lineup are Boomerang, Cartoon Network, CNN, CNN en Español, HLN, truTV and Turner Classic Movies. As you’ll notice, others like TBS and TNT aren’t included here; that’s because they’re part of a different agreement. Dish is unsure of when, or if, the missing Turner channels will be brought back, but the company says it is “committed to reaching an agreement that promptly returns this content to Dish’s programming lineup.” If and when that happens, we’ll let you know.

Source: Dish

PhotoMath uses your phone’s camera to solve equations

Need a little help getting through your next big math exam? MicroBlink has an app that could help you study more effectively (perhaps too effectively). Its newly unveiled PhotoMath for iOS and Windows Phone (an Android version is due in early 2015) uses your smartphone’s camera to scan math equations and not only solve them, but show the steps involved. Officially, it’s meant to save you time flipping through a textbook to check answers when you’re doing homework or cramming for a test. However, there’s a concern that this could trivialize learning: just because it shows you how to solve a problem doesn’t mean the knowledge will actually sink in. And if teachers don’t confiscate smartphones at the door, unscrupulous students could cheat when no one is looking. The chances of that happening aren’t very high at this stage, but apps like this suggest that schools might have to be vigilant in the future.

Via: Quartz, TechCrunch
Source: PhotoMath
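To give a sense of what “showing the steps” amounts to, here’s a toy solver for simple equations of the form ax + b = c. It’s a hand-rolled illustration of the idea only; PhotoMath’s actual engine also has to recognize the equation from a camera image and handles far more than this one template.

# Toy step-by-step solver for equations of the form a*x + b = c.
# Illustrates the "show your work" style of output; it is not PhotoMath's engine.
from fractions import Fraction

def solve_linear(a: Fraction, b: Fraction, c: Fraction) -> Fraction:
    print(f"Solve: {a}x + {b} = {c}")
    rhs = c - b
    print(f"Step 1: subtract {b} from both sides -> {a}x = {rhs}")
    x = rhs / a
    print(f"Step 2: divide both sides by {a} -> x = {x}")
    return x

solve_linear(Fraction(3), Fraction(5), Fraction(20))
# Solve: 3x + 5 = 20
# Step 1: subtract 5 from both sides -> 3x = 15
# Step 2: divide both sides by 3 -> x = 5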

Google wants to help you leave iOS for Android Lollipop

So, you’ve taken a look at the new iPhones and iPads and thought to yourself: “Nah, it’s time to see if the grass really is greener on the other side.” Well, good timing, because Google has published a guide to help you switch from iOS to its newest platform, Android Lollipop. The tech giant has laid it all out for you: its instructions cover how to upload photos stored on iPhones and iPads to Google+, transfer music from iTunes to Google Play Music, keep all your contacts and even set up mail and messaging, among other things. In short, it’s what you need to read if the only thing keeping you from moving platforms is the process itself. If you’re ready to switch allegiance, keep an eye out for the Nexus 6 smartphone, the Nexus 9 tablet or the Nexus Player, as those will be the first devices to come loaded with Lollipop (though some older devices are also getting it through software upgrades). But in case you’re actually having issues switching to iOS instead of from it, don’t worry: Apple has also published a guide to help you become a bona fide iOS user.

Via: Droid Life
Source: Android Switch

We rode a $10,000 hoverboard, and you can too

It’s impossible to talk about hoverboards without invoking a particular movie title, so we’re not even going to try: remember that awesome scene from Back to the Future Part II? It’s one step closer to reality: a California startup just built a real, working hoverboard. Arx Pax is attempting to crowdfund the Hendo Hoverboard as a proof of concept for its hover engine technology. It’s not quite the floating skateboard Marty McFly rode through Hill Valley (and the Wild West), but it’s an obvious precursor to the imagined rideable: a self-powered, levitating platform with enough power to lift a fully grown adult.

I initially approached the floating pallet with caution, expecting it to dip and bob under my weight like a piece of driftwood. It didn’t. The levitating board wiggled slightly under my 200-pound frame, but maintained its altitude (a mere inch or so) without visible strain. Arx Pax tells me that the current prototype can easily support 300 pounds, and future versions will be able to hold up to 500 pounds without issue. Either way, you’ll need to hover over a very specific kind of surface to get it to hold anything: the Hendo uses the same kind of electromagnetic field technology that floats maglev trains, meaning it will only levitate over non-ferrous metals like copper or aluminum.

Riding the contraption was a lot of fun, but also quite a challenge: the Hendo hoverboard doesn’t ride at all like McFly’s flying skateboard. In fact, without a propulsion system, it tends to drift aimlessly. Arx Pax founder and Hendo inventor Greg Henderson says it’s something the company is working on. “We can impart a bias,” he tells me, pointing out pressure-sensitive pads on the hoverboard’s deck that manipulate the engines. “We can turn on or off different axes of movement.” Sure enough, leaning on one side of the board convinces it to rotate and drift in the desired direction. Without feeling the friction of the ground, however, I had trouble knowing how much pressure to exert; Henderson’s staff had to jump in and save me from spinning out of control. Clearly, this might take some practice.
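Henderson’s “impart a bias” remark suggests steering by differential thrust: press harder over one side and that side’s engines push the board off-center. Here’s a toy model of that mapping, with made-up pad positions and values; Arx Pax hasn’t published how its control scheme actually works, so treat this purely as an illustration of the idea.

# Toy model of steering by leaning: four pressure pads sit over four hover
# engines. Pressing harder on one side biases that side's engines, and the
# board tends to drift toward the weighted side.
# Entirely hypothetical; not Arx Pax's actual control scheme.

# Pad positions (x, y) in metres from the board's centre.
PADS = {"front_left": (-0.3, 0.5), "front_right": (0.3, 0.5),
        "rear_left": (-0.3, -0.5), "rear_right": (0.3, -0.5)}

def drift_bias(pressure: dict[str, float]) -> tuple[float, float]:
    """Direction the board is nudged, from per-pad pressure in the range 0..1."""
    mean = sum(pressure.values()) / len(pressure)
    bias_x = sum((pressure[p] - mean) * x for p, (x, _) in PADS.items())
    bias_y = sum((pressure[p] - mean) * y for p, (_, y) in PADS.items())
    return bias_x, bias_y

# Rider leans toward the right-hand edge of the board:
print(drift_bias({"front_left": 0.2, "front_right": 0.8,
                  "rear_left": 0.2, "rear_right": 0.8}))

Lean toward the right edge and the computed bias points to the rider’s right, which matches the rotate-and-drift behavior described above.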
As fun as its current form is, Henderson didn’t necessarily set out to reinvent transportation. The Hendo engine’s original inspiration came from architecture. “It came from the idea of hovering a building out of harm’s way,” he says. “If you can levitate a train that weighs 50,000 kilograms, why not a house?” After some prodding, he clarifies the idea as a sort of emergency lifting system that could theoretically raise a building off its foundation during an earthquake, essentially rendering the natural disaster’s tremors harmless. The idea sounds as fictional as, well, a hoverboard, but he already built one of those.

Henderson says that floating a building is a long-term goal. Right now, the technology is in its early stages, and he’s just trying to get it in the hands of engineers with big ideas. That’s where the Hendo “white box” comes in. Backers who contribute to the company’s Kickstarter at the $299 level will receive a complete, working Hendo hover engine and enough hover surface to play around with. It’s a developer kit, Henderson says, and he wants makers to use it to build their own hover projects. If they have an idea they want to take to market, Arx Pax will work with them to make it a reality. “The most important piece of it all for me is the idea of taking away the limitations of how we think about problems in general. Not just thinking outside the box, but off the page,” he says, explaining how hover technology could be used to solve old problems in new ways. “When you do that — when you approach problems that were seemingly impossible in different ways — you’ll never cease to be amazed by the solutions you can come up with.”

While the long-term goals go far beyond the not-so-humble hoverboard, there are plenty of Kickstarter rewards focused on scratching that itch exclusively. At the $100 level, 250 backers will be eligible for a five-minute ride on one of the company’s prototype boards, and $1,000 buys a privately coached, hour-long ride. Not content with merely renting hover time? Okay, okay: the first ten backers to contribute $10,000 will get a hoverboard to keep. The delivery date? 10/21/2015, the day Marty McFly arrives in the future.

Source: Hendo Hover

Ferrari’s hybrid commits sacrilege, rolls in electric-only mode

Pop quiz, hotshot: how much horsepower do you get from a 789HP V12 and a 160HP electric motor? Any Ferrari fan knows the (insane) answer: 949. That’s the total output of the $1.69 million hybrid LaFerrari, and Ferrari has always said that the engine and motor would work as one to produce it: no electric-only mode. But a new video shows the limited-edition supercar rolling out of its garage as silently as a cat before the V12 comes alive. It’s mighty strange to see a dead-quiet Ferrari (especially considering the company’s past stance on EVs), even if it was just for a few hundred yards. We’re not sure if that means it’ll now do the same in stop-and-go driving like your hippie uncle’s Prius, though Ferrari has promised a 5-mile EV-only mode on future cars. Not that it’s going to help the EPA numbers; it is a 217 mph car, after all.

Via: Autoblog
Source: Carlo Delucis

Tractor beams are suddenly a lot more plausible

Tractor beams now have a better shot at crossing over from science fiction trope to reality, thanks to scientists at The Australian National University (ANU). They managed to push and pull a 0.2mm particle nearly 20cm using a “hollow” laser beam. That’s a hundred-fold improvement over recent efforts at light propulsion, which have only moved microscopic particles short distances. The ANU team placed gold-coated glass spheres in the light-free center of the beam, creating hotspots on the surface that propelled the spheres via reactions with the surrounding air. The hotspot’s location could be changed by adjusting the beam’s polarization, giving the scientists full control over each sphere’s motion. Sure, it’s not exactly the Death Star, but the scientists think the technique will work over long distances, meaning it could one day be used to, say, control pollution or move dangerous particles in the lab.

[Image credits: Patrick Kovarik/AFP/Getty Images, ANU]
Via: CNET
Source: Nature

Disney rendered its new animated film on a 55,000-core supercomputer

Disney’s upcoming animated film Big Hero 6, about a boy and his soft robot (and a gang of super-powered friends), is perhaps the largest big-budget mash-up you’ll ever see. Every aspect of the film’s production represents a virtual collision of worlds. The story, something co-director Don Hall calls “one of the more obscure titles in the Marvel universe,” has been completely re-imagined for parent company Disney. Then there’s the city of San Fransokyo it’s set in, an obvious marriage of two of the most tech-centric cities in the world. And, of course, there’s the real-world technology that not only takes center stage as the basis for characters in the film, but also powered the onscreen visuals. It’s undoubtedly a herculean effort from Walt Disney Animation Studios, and one that’s likely to go unnoticed by audiences.

“We’ve said it many, many times. We made the movie on a beta renderer,” says Hank Driskill, technical supervisor for Big Hero 6. “It was very much in progress.” Driskill is referring to Hyperion, the software Disney created from the ground up to handle the film’s impressive lighting. It’s just one of about three dozen tools the studio used to bring the robotics-friendly world of San Fransokyo to life. Some, like the program Tonic originally created for Rapunzel’s hair in Tangled, are merely improved versions of software built for previous efforts, or “shows” as Disney calls them. Hyperion, however, represents the studio’s greatest and riskiest commitment to R&D in animation technology thus far. And its feasibility wasn’t always a sure thing, something Disney’s Chief Technology Officer Andy Hendrickson underscores when he says, “It’s the analog to building a car while you’re driving it.”

For that reason, Hendrickson instructed his team to embark on two development paths for Big Hero 6: the experimental Hyperion and a Plan B that hinged on a commodity renderer. It took a team of about 10 people over two years to build Hyperion, during which time Driskill says resources were being spread thin: “We were running with a backup plan until around June of last year … [and] we realized we were spending too much energy keeping the backup plan viable. It was detracting in manpower … from pursuing the new idea as fully as we could. So we just said, ‘We’re gonna go for it.’ And we turned off the backup plan.”

Hyperion, as the global-illumination simulator is known, isn’t the kind of technology that would excite the average moviegoer. As Hendrickson explains, it handles incredibly complex calculations to account for how “light gets from its source to the camera as it’s bouncing and picking up colors and illuminating other things.” The software allowed animators to eschew the incredibly time-consuming manual effort of animating single-bounce, indirect lighting in favor of 10 to 20 bounces simulated by the software. It’s responsible for environmental effects, the stuff most audiences might take for granted, like when they see Baymax, the soft, vinyl robot featured in the film, illuminated from behind. That seemingly mundane lighting trick is no small feat; it required the use of a 55,000-core supercomputer spread across four geographic locations.
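That 10-to-20-bounce figure is easy to build an intuition for with a toy calculation. In the sketch below, every surface in an imaginary scene both emits a little light and reflects most of what hits it, so each additional bounce adds another albedo-weighted term and cutting paths off early visibly loses energy. This is only the geometric-series intuition behind multi-bounce global illumination; it says nothing about how Hyperion is actually implemented.

# Toy "every surface glows and reflects" scene: each extra bounce adds another
# albedo-weighted contribution, which is why a renderer that follows 10-20
# bounces looks richer than a single-bounce one. Generic illustration only.

ALBEDO = 0.8  # fraction of incoming light a surface reflects at each bounce
EMIT = 1.0    # light emitted at every surface the path touches

def radiance(max_bounces: int) -> float:
    """Light reaching the camera when paths are cut off after max_bounces."""
    total, throughput = 0.0, 1.0
    for _ in range(max_bounces + 1):  # +1 for the directly visible surface
        total += throughput * EMIT
        throughput *= ALBEDO          # every bounce dims the path a little
    return total

for bounces in (1, 5, 10, 20):
    print(f"{bounces:>2} bounces -> {radiance(bounces):.3f}")
# 1 bounce gives 1.800; 20 bounces gives about 4.954, close to the full-series
# value 1 / (1 - ALBEDO) = 5.0. Truncating bounces early loses visible energy.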
[Image: Disney Animation CTO Andy Hendrickson demonstrates Hyperion’s real-world lighting simulation.]

“This movie’s so complex that humans couldn’t actually handle the complexity. We have to come up with automated systems,” says Hendrickson. To manage that cluster and the 400,000-plus computations it processes per day (roughly 1.1 million computational hours), his team created software called Coda, which treats the four render farms like a single supercomputer. If one or more of those thousands of jobs fails, Coda alerts the appropriate staffers via an iPhone app. To put the enormity of this computational effort into perspective, Hendrickson says that Hyperion “could render Tangled from scratch every 10 days.”

If that doesn’t drive the power of Disney’s proprietary renderer home, then consider this: San Fransokyo contains around 83,000 buildings, 260,000 trees, 215,000 streetlights and 100,000 vehicles (plus thousands of crowd extras generated by a tool called Denizen). What’s more, all of the detail you see in the city is actually based on assessor data for lots and street layouts from the real San Francisco. As Visual Effects Supervisor Kyle Odermatt explains, animating a city that lively and massive simply would not have been possible with previous technology. “You couldn’t zoom all the way out [for a] wide shot down to just a single street level the way we’re able to,” he says.
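Coda, as described, is essentially a distributed job monitor: several farms presented as one pool of work, with failures pushed straight to a person. Disney hasn’t published how Coda is built, so the sketch below is only a guess at the general shape of such a system; the class and job names are invented, and a print statement stands in for the iPhone alert.

# Minimal sketch of a Coda-like coordinator: many render farms, one job pool,
# and any failure surfaced to a human. Hypothetical names; not Disney's code.
from dataclasses import dataclass

@dataclass
class Job:
    job_id: str
    farm: str                # which farm is running the job
    status: str = "running"  # "running", "done" or "failed"

class RenderCoordinator:
    def __init__(self) -> None:
        self.jobs: dict[str, Job] = {}

    def submit(self, job_id: str, farm: str) -> None:
        self.jobs[job_id] = Job(job_id, farm)

    def report(self, job_id: str, status: str) -> None:
        self.jobs[job_id].status = status
        if status == "failed":
            self.alert(self.jobs[job_id])

    def alert(self, job: Job) -> None:
        # Stand-in for the push notification the article mentions.
        print(f"ALERT: job {job.job_id} failed on farm {job.farm}")

coordinator = RenderCoordinator()
coordinator.submit("shot_042_lighting", "farm-west")
coordinator.report("shot_042_lighting", "failed")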
Beyond the supercomputer cluster and software tools devised to make the movie, Big Hero 6 leans heavily on cutting-edge technology for its visual majesty in one other way: its characters. Both Baymax, the aforementioned lovable robot sidekick, and the microbots, swarm-like mini-drones controlled by telepathy, are steeped in some very real scientific research. That decision to ground the world of Big Hero 6 in near-future technologies led Hall and co-director Chris Williams on research trips to MIT, Harvard and Carnegie Mellon in the US, and even to Tokyo University in Japan.

[Image: A soft robotic arm developed by researchers at Carnegie Mellon University.]

“You know, we try to look at, like, five to 10 years down the road at what was coming … It seems counterintuitive because in animation you can do anything, but it still has to be grounded in a believable world,” says Hall. Indeed, there’s even a moment where supergenius lead character Hiro Hamada uses a 3D printer in his garage to create an outfit for Baymax. In discussing the scene, Roy Conli, the film’s producer, credits the “maker movement that’s going on right now.” He adds, “These kids are makers. So it’s a little bit the celebration of the nerd.”

It was during a visit to Carnegie Mellon that Hall came across researcher Chris Atkeson, who’d been working in the field of inflatable soft robotics: robots intended for the health care industry. Hall says Atkeson pleaded with him to “make a movie where the robot is not the villain.” But Atkeson didn’t have to do much convincing; Hall’s vision for Baymax meshed nicely with his research. He’d wanted a robot audiences hadn’t seen on screen before. Hall continues, “The minute I saw this [research], I knew that we had our huggable robot. I knew that we had found Baymax.” The team also drew inspiration for Baymax from existing compassionate-care tech out of Japan. “They’re a little ahead of the curve,” Hall says. “I mean, [health care robots] are actually in practice in some of the hospitals in Japan. They’re not vinyl; they’re not Baymax. They’re plastic robotics.”

[Image: The high-tech city of San Fransokyo represents a mash-up of Eastern and Western culture.]

Robotics research out of Carnegie Mellon also provided the basis for the unwitting pawns of the film: the Lego-like, mind-controlled microbots. Of course, the version we see in the film is a much more fantastical take on the simple, water-walking bots Hall’s team glimpsed during their visit. That, coupled with a heavy dose of inspiration from swarm-drone tech, led to the insect-like creepiness of the microbots in the final film. By design, the electromagnetic microbots move as if part of a chain: each individual “link” travels from front to back to propel the swarm forward in a circuit-board-like pattern. On average, the visual effects team says, there are about 20 million microbots onscreen in a given shot, and that level of complexity is where Hyperion once again comes crucially into play. Originally, however, the team didn’t think its full vision of the microbots would even be possible to render. “We thought the technology would never actually be able to handle it happening in all of the shots,” explains Head of Effects Michael Kaschalk. “And to do that from shot to shot, that takes artists’ work to just be able to create the [lighting] cheat. But as Hyperion developed, and we actually built the system, we found that it was handling all of this data just fine. So we actually built the real thing.”

[Image: Hiro scans Baymax to create 3D-printed armor.]

Though tech innovation clearly plays an important role in development at Disney Animation Studios, it’s not the sole guiding force for each film and, for that matter, neither is the story. The studio’s process is entirely collaborative. “We are looking for input from everybody that works here for storytelling … there’s no doubt that those ideas can rise up from anywhere to become a big piece or small piece of the story,” says Odermatt. There’s no single source of motivation other than a love of research and functional design, key concepts imparted by Chief Creative Officer John Lasseter.

In a way, Big Hero 6 is a love letter to technology. It’s a fantasy film that gives audiences a knowing wink toward the robot-assisted near future, as if to say, “This is exactly where you’re headed. And it’s coming soon.” Big Hero 6 also represents a perfect storm for Disney: the subject matter (makers and robotics) and setting (hyper-tech San Fransokyo) dovetailed with the economic feasibility of cutting-edge computational hardware (that massive render farm) and the development of advanced animation techniques (Hyperion). It’s a film for, by and from lovers of technology. That Big Hero 6 has a technological heart and soul is not lost on Hall. In fact, he’s keenly aware of it: “The movie does celebrate science and technology in a way that we haven’t really done before.”

[Image credit: Walt Disney Animation; Carnegie Mellon University (soft robotic arm)]
