Slashdot reader v3rgEz writes: When MuckRock started using public records to find the oldest computer in use by the U.S. government, they scoured the country, but it wasn't until a few tipsters chimed in that they set their sights a little higher and found that the oldest computer in use by the government might not be on this planet at all. The oldest computers still in use by the U.S. government appear to be the on-board systems of the Voyager 1 and 2 space probes: nearly 40 years old, and 12.47 billion miles from Earth. (Last year NASA put out a call for a FORTRAN programmer to upgrade the probes' software.) An earlier MuckRock article identified the government's oldest software still in use on Earth ("the computers inside the IRS that makes sure everybody is paying their taxes") and its oldest hardware still in use ("the machines running the nuclear defense system"); the launch commands are still stored on 8-inch floppy disks. Read more of this story at Slashdot.
Read the original:
MuckRock Identifies The Oldest US Government Computer Still in Use
prisoninmate writes: What's Kali Linux 2016.2? It's an updated Live ISO image of the popular GNU/Linux distribution for ethical hackers and security professionals who want to harden their networks, containing the latest software versions and enhancements for those deploying the OS on new systems. It has been quite some time since the last update to the official Kali Linux Live ISOs, and new upstream releases are announced every day, which means the packages in the previous images have grown stale; bug fixes and improvements land only in the most recent versions of the respective security tools. Best of all, the new Kali Linux 2016.2 release comes in KDE, MATE, Xfce, LXDE, and Enlightenment E17 flavors. The Kali blog also points out that the distro recently appeared in an episode of Mr. Robot.
An anonymous reader shares a CNET report: Australia is changing from "down under" to "down under and across a bit". The country is shifting its longitude and latitude to fix a discrepancy with global satellite navigation systems. Government body Geoscience Australia is updating the Geocentric Datum of Australia, the country's national coordinate system, to bring it in line with international data. The reason Australia is slightly out of whack with global systems is that the country moves about 7 centimetres (2.75 inches) per year due to the shifting of tectonic plates. Since 1994, when the datum was last fixed, that has added up to a misalignment of about a metre and a half. While that might not seem like much, various new technologies require pinpoint-accurate location data: self-driving cars, for example, need extremely precise positioning to avoid accidents, and drones used for package delivery and driverless farming vehicles also require spot-on information. ABC has more details.
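The "metre and a half" figure follows directly from the drift rate and the elapsed time. A minimal sketch of the arithmetic, assuming the ~7 cm/year rate stated above and a 1994-to-2016 window (the story's publication year):

```python
# Rough check of Australia's accumulated coordinate drift, assuming
# (as the article states) ~7 cm/year of tectonic motion since the
# datum was last fixed in 1994. The 2016 end year is an assumption
# based on the story's publication date.
CM_PER_YEAR = 7
YEARS = 2016 - 1994

drift_m = CM_PER_YEAR * YEARS / 100
print(f"Accumulated drift: {drift_m:.2f} m")  # prints "Accumulated drift: 1.54 m"
```

That 1.54 m matches the article's "about a metre and a half".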
An anonymous reader writes from a report via PCWorld: Google says its Tensor Processing Unit (TPU) advances machine learning capability by a factor of three generations. "TPUs deliver an order of magnitude higher performance per watt than all commercially available GPUs and FPGAs," said Google CEO Sundar Pichai during the company's I/O developer conference on Wednesday. The chips powered the AlphaGo computer that beat Lee Sedol, the world champion of the game Go. "We've been running TPUs inside our data centers for more than a year, and have found them to deliver an order of magnitude better-optimized performance per watt for machine learning. This is roughly equivalent to fast-forwarding technology about seven years into the future (three generations of Moore's Law)," said Google's blog post. "TPU is tailored to machine learning applications, allowing the chip to be more tolerant of reduced computational precision, which means it requires fewer transistors per operation. Because of this, we can squeeze more operations per second into the silicon, use more sophisticated and powerful machine learning models, and apply these models more quickly, so users get more intelligent results more rapidly." The chip is called the Tensor Processing Unit because it underpins TensorFlow, the open-source software engine that powers Google's deep learning services.
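Google's "seven years" framing can be sanity-checked with back-of-the-envelope math: if performance per watt doubles roughly every two years (one Moore's Law generation, an assumed cadence not stated in the article), a 10x improvement corresponds to about 6.6 years, or a bit over three generations:

```python
import math

# Back-of-the-envelope check of the "seven years ahead" claim.
# Assumptions (not from the article): one Moore's Law generation
# doubles performance per watt and takes ~2 years.
DOUBLING_PERIOD_YEARS = 2.0
SPEEDUP = 10.0  # "an order of magnitude"

generations = math.log2(SPEEDUP)            # ~3.3 doublings
years_ahead = DOUBLING_PERIOD_YEARS * generations

print(f"~{years_ahead:.1f} years, ~{generations:.1f} generations")
# prints "~6.6 years, ~3.3 generations"
```

Both numbers round to the "about seven years (three generations)" in Google's blog post.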