Johnny Depp surprises Pirates of the Caribbean riders as live animatronic at Disneyland

Haha! Lucky riders on the Pirates of the Caribbean ride at Disneyland were treated to a live Jack Sparrow animatronic performance yesterday by Johnny Depp, who made a surprise visit to the park as part of a PR stunt. It’s fun to hear the passengers as they realize that the real Johnny Depp is standing right in front of them.

Nintendo programmer coded Game Boy classic without using a keyboard

Nintendo programmer Masahiro Sakurai coded the Game Boy classic Kirby’s Dream Land on a cartridge-based Famicom console and Disk System that lacked a hardware keyboard. According to a recent presentation given by Sakurai, “values had to be input using a trackball and an on-screen keyboard.” Sakurai, who was 20 years old at the time, said he just thought that was “the way it was done.”

From Game Watch’s report in Japanese, translated by Source Gaming:

At the time, the development tool that HAL Laboratory was using was the Twin Famicom, a console that combined the Famicom and the Famicom Disk System. A trackball made specifically for the Twin Famicom was used with the machine, which read data from and wrote data to floppy disks during development. Essentially, they were using a Famicom to make Famicom games. Sakurai told the crowd, “It’s like using a lunchbox to make lunch.” However, because of that, they were able to create a functional test product before the project plan was even completed.

(via Ars Technica)

Few sad as About.com closure announced

When long-lived websites close down, they often give little notice, sending archivists scrambling to rescue their work for posterity. About.com, the venerable topic-mining hive abruptly put to death, seems to be a counter-example: a faceless mountain of bland, undifferentiated, half-plagiarized content that no one seems sad to see vanish. Even its own CEO, Neil Vogel, is plainly contemptuous of it. That’s why Vogel tells Business Insider he’s going to shut the site down as of May 2nd.

“I got a phone call from Joey Levin, who is the CEO of IAC [About.com Group’s parent company]. He asked, ‘What do you think of About.com?'” Vogel told BI. “My answer — in perfect arrogance — was ‘I don’t.’ Who thinks of About.com? Nobody.”

But not all of About.com is necessarily going away. Vogel says he will take parts of the website and turn them into separate niche verticals, then announce a new name for the overarching brand at a conference in New Orleans.

“A year ago we were a general interest site,” Vogel told The Drum in March. “We were not growing. In fact, we were kind of shrinking. We had great content, but we were doing the wrong thing.”

About.com was one of the earliest big web successes to cash out: to Primedia in 2000 for $690m, then to the New York Times in 2005 for $410m, to IAC in 2012 for $300m, and now to the deep void for sweet fuck all—but also the hope that the staff and infrastructure can be used to launch something new.

“I’m not going to be the guy who ruined About.com,” Vogel told Business Insider. “It’s already ruined, so this is all upside here.”

This worm eats plastic bags

Humans discard a trillion single-use plastic bags every year. If you were a wax worm, this statistic would make you drool. The caterpillar loves to eat them. From Atlas Obscura:

Federica Bertocchini, a biologist at the Institute of Biomedicine and Biotechnology in Spain, noticed some wax worms had managed to eat their way through the plastic bags they were being kept in. While other organisms can take weeks or months to break down even the smallest amount of plastic, the wax worm can get through more in a far shorter period of time. The researchers let 100 wax worms chow down on a plastic grocery bag, and after just 12 hours they’d eaten about 4 percent of the bag, according to findings published Monday in the journal Current Biology. That may not sound like much, but it’s a vast improvement over fungi, which weren’t able to break down a noticeable amount of polyethylene after six months.

Image of wax worm: skeeze/Pixabay

Mafia used the text-message ticker at the bottom of a sports broadcast to get messages to mob bosses

Quelli che il Calcio (roughly, “Those Who Follow Football”) is one of Italy’s top sports broadcasts, and it is shown in the country’s prisons. It has a ticker to which viewers can send SMS messages that then appear on screen.

In Paraguay, the "heist of the century" is blamed on a notorious Brazilian prison-gang

Fifty armed men in camouflage flak jackets, driving armored cars, cordoned off the roads leading to a transportation company’s office in Ciudad del Este, Paraguay (a “smugglers’ haven in the border region with Brazil and Argentina”), used demolition equipment to blow off the entire face of the building, stole an estimated $40M, and escaped by motorboat up the Paraná River.

Internet Archive to ignore robots.txt directives

Robots (or spiders, or crawlers) are little computer programs that search engines use to scan and index websites. Robots.txt is a little file placed on webservers to tell search engines what they should and shouldn’t index. The Internet Archive isn’t a search engine, but it has historically obeyed exclusion requests from robots.txt files. Now it’s changing its mind, because robots.txt is almost always crafted with search engines in mind and rarely reflects the intentions of domain owners when it comes to archiving.

Over time we have observed that the robots.txt files that are geared toward search engine crawlers do not necessarily serve our archival purposes. Internet Archive’s goal is to create complete “snapshots” of web pages, including the duplicate content and the large versions of files. We have also seen an upsurge of the use of robots.txt files to remove entire domains from search engines when they transition from a live web site into a parked domain, which has historically also removed the entire domain from view in the Wayback Machine. In other words, a site goes out of business and then the parked domain is “blocked” from search engines and no one can look at the history of that site in the Wayback Machine anymore. We receive inquiries and complaints on these “disappeared” sites almost daily.

A few months ago we stopped referring to robots.txt files on U.S. government and military web sites for both crawling and displaying web pages (though we respond to removal requests sent to info@archive.org). As we have moved towards broader access it has not caused problems, which we take as a good sign. We are now looking to do this more broadly.

An excellent decision. To be clear, they’re ignoring robots.txt even if you explicitly identify and disallow the Internet Archive. It’s a splendid reminder that nothing published on the web is ever meaningfully private, and will always go on your permanent record.
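For the curious, here is a small sketch of the mechanism the Archive is abandoning: how a compliant crawler reads robots.txt rules, using Python’s standard-library parser. The rules, user-agent names, and URLs below are hypothetical examples for illustration, not taken from any real site (though “ia_archiver” is the user-agent string the Internet Archive’s crawler has traditionally used).

```python
# Sketch: how a well-behaved crawler interprets robots.txt, using the
# Python standard library. Example rules only.
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that bars the Internet Archive's crawler
# entirely while letting everyone else index all but one directory.
robots_txt = """\
User-agent: ia_archiver
Disallow: /

User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The Archive's crawler is disallowed everywhere on this (imaginary) site...
print(rp.can_fetch("ia_archiver", "https://example.com/page.html"))  # False

# ...while a search-engine crawler is allowed in, except under /private/.
print(rp.can_fetch("Googlebot", "https://example.com/page.html"))    # True
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))    # False
```

Under the new policy, the Wayback Machine would archive the page in the first case anyway: the explicit `User-agent: ia_archiver` / `Disallow: /` block no longer keeps a site out of the archive.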
