NVIDIA proves the cloud can replace a high-end gaming rig

A year ago, NVIDIA’s GeForce Now game-streaming service let me play The Witcher 3, a notoriously demanding PC-only title, on a MacBook Air. This year, NVIDIA finally unveiled the Windows version of the service, and it was even more impressive: I was able to play Rainbow Six: Siege and PlayerUnknown’s Battlegrounds on underpowered PCs that sell for $200 to $300. If NVIDIA’s Mac demo was a revelation, playing high-end PC games on discount hardware felt like a miracle. Now, after testing the GeForce Now beta on PCs for a week, I’m even more intrigued by the possibilities of game streaming.

To put it simply, the service lets you remotely tap into the power of an expensive gaming rig from any computer. It runs on remote servers powered by NVIDIA’s GTX 1080 Ti GPUs. While the company isn’t divulging further specifications, you can bet they’re also stuffed with more than enough RAM and CPU horsepower. (A year ago, NVIDIA claimed they were the equivalent of a $1,500 gaming PC.) When you launch GeForce Now, you’re actually watching a video stream on your PC. But since there’s very little latency between what you’re seeing and your keyboard and mouse inputs, it feels as if the games are running right on your computer.

You don’t need a very powerful PC to run the GeForce Now client. At minimum, NVIDIA recommends a 3.1GHz Core i3 processor and 4GB of RAM, along with Intel HD 2000, GeForce 600-series or Radeon HD 3000 graphics. Those are all specs you’ll find in PCs four to six years old. But of course, solid internet access is a must: you’ll need speeds of at least 25Mbps, though NVIDIA advises a 50Mbps connection for the best experience. You’ll also have to make sure your computer has a reliable link to your router — which means you’ll either need to use an Ethernet cable or a 5GHz Wi-Fi network.

At the moment, GeForce Now on Macs and PCs only lets you play games you already own on Steam, Blizzard’s Battle.net or Ubisoft’s Uplay; anything you don’t own can be purchased through the streaming platform. That’s a major difference from GeForce Now on NVIDIA’s SHIELD tablet and set-top box, which includes a handful of titles as part of its $7.99 monthly fee as well as games for purchase. Both versions of the service support popular titles like Overwatch, Call of Duty: WWII and The Witcher 3, but you’ll probably have to wait a bit for them to work with lesser-known games. NVIDIA isn’t specifying what it takes to make a game compatible with the service, but I’d wager it has to test each one to make sure nothing breaks in the streaming process.

Setting up GeForce Now is as easy as downloading and installing the client and choosing a title to play. Then you just need to provide your login information for whichever service hosts the game. If you’re launching a Steam title, you’ll see the service’s familiar Windows interface, where you can either buy the game or download it to your library. One big downside with GeForce Now is that you’ll have to install games every time you want to play them, since you’re thrown onto a different server whenever you log in. It’s not a huge problem, though, since the remote machines are plugged into a fat network pipe and offer unlimited storage. PUBG, which weighs in at 12GB, installed in around four minutes, while The Witcher 3 (31.7GB) took over ten minutes.
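Those install times hint at just how fat that server-side pipe is. A quick back-of-the-envelope check (my arithmetic, assuming decimal gigabytes):

```python
def effective_throughput_mbps(size_gb: float, minutes: float) -> float:
    """Average download throughput in megabits per second."""
    bits = size_gb * 8 * 1000**3        # decimal gigabytes -> bits
    return bits / (minutes * 60) / 1e6  # bits per second -> Mbps

print(effective_throughput_mbps(12, 4))     # PUBG, 12GB in ~4 min: 400.0
print(effective_throughput_mbps(31.7, 10))  # The Witcher 3, 31.7GB in ~10 min: ~422.7
```

In other words, the servers were pulling roughly 400Mbps, an order of magnitude beyond the 25 to 50Mbps NVIDIA asks of your home connection.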
On the Surface Laptop — a great ultrabook marred only by its weak integrated graphics — running over our office’s WiFi, PUBG felt almost as smooth as it does on my dedicated gaming rig. It ran at a steady 60 frames per second, even though I cranked the graphics settings to “Ultra” and the resolution to 2,560 by 1,440. After a few minutes of running around the game’s apocalyptic European town and taking out other players, I almost forgot I was playing something running on a server hundreds of miles away. The game’s excellent audio design also survived: I had no trouble pinpointing people sneaking around a house while wearing headphones, and the bomb strikes in “Red Zones” still rattled my skull. Mostly, though, I was surprised that I didn’t feel any lag while using the Surface Laptop’s keyboard and a Logitech G903 wireless gaming mouse. Moving the camera and aiming my weapons felt incredibly responsive, and I even managed to outgun some players in heated shootouts.

That lack of latency was even more impressive with Overwatch, an even faster-paced game. Characters like Tracer and Genji, both of whom would be tough to play with any noticeable lag, felt as nimble as they do on my desktop. I didn’t even have trouble landing shots with snipers like Hanzo and Ana. I was simply able to enjoy playing the game as I normally do. And, even more so than with PUBG, I was impressed by how well GeForce Now handled Overwatch’s vibrant, colorful graphics. Gorgeous maps like Ilios and Dorado appeared as detailed as ever, and the same goes for the game’s imaginative character models and costumes.

GeForce Now easily handled graphically intensive titles like Destiny 2 and The Witcher 3, which felt even more impressive to play on the Surface Laptop. Both games managed to run at 60 FPS at 2,560 by 1,440 (the service supports up to 2,560 by 1,600), with all of their graphics settings turned all the way up. Even though Destiny 2 isn’t exactly a fast-paced shooter, it still benefited from the service’s low latency, which helped me mow down waves of enemies without much trouble. And with The Witcher 3, I was impressed that its graphically rich world didn’t lose any fidelity while being streamed. Perhaps because these games are particularly demanding, I occasionally experienced connection hiccups while playing them. They only lasted a few seconds, but if I had been fighting tough bosses, they could easily have led to my doom.

Those disruptions also made it clear that your experience with GeForce Now will depend largely on your internet connection. I had a mostly trouble-free experience in our office and at home, where I have 100Mbps cable service. But if you don’t have a steady 25Mbps connection, Ethernet access or strong wireless reception, you’ll likely see more gameplay-disrupting issues. I wasn’t able to run any games at Starbucks locations around NYC, and based on my terrible experiences with hotel WiFi, I’d wager you’d have trouble using GeForce Now while traveling, too. (The service is only supported in the US and Europe at the moment.)

The big problem with GeForce Now? We don’t know what the service will look like once it leaves beta. You can request access now, and if you’re lucky enough to get in, you can test the service for free. NVIDIA isn’t giving us a timeframe for an official release, or saying how much it’ll eventually cost.
Based on what we typically see with streaming services, I’d also expect GeForce Now’s smooth performance to take a hit once it’s open to the hordes of frag-happy gamers. For now, though, it’s a glimpse at the true future of gaming — a world where we don’t have to worry about whether our video cards are fast enough, or whether we have enough hard drive space for a massive open-world game. Well, as long as you have an internet connection fast enough to handle all of that gaming goodness.

NVIDIA’s GeForce GTX 1070 Ti battles AMD’s latest video cards

NVIDIA has largely been sitting pretty since the GeForce 10-series arrived and gave it a comfortable performance lead in the graphics realm, but things have changed: AMD’s Vega cards are at least fast enough that you might consider them instead. Needless to say, NVIDIA isn’t about to let that situation stand. It’s launching the GeForce GTX 1070 Ti, a $449 upper mid-range card that could outperform the $399 Vega 56 and undercut the $499 Vega 64 on price.

For all intents and purposes, it’s very nearly as powerful as a GTX 1080: you get the same core clock speed as the pricier board, and only slight hits to the CUDA core count (2,432 vs. 2,560), texture units (152 vs. 160) and boost clock (1,683MHz vs. 1,733MHz). About the only major difference is that you’re still limited to ‘just’ GDDR5 memory instead of the speedier GDDR5X on the 1080. Pre-orders start today ahead of the November 2nd release, and it’s notable that the Founders Edition card sells for the official suggested retail price; you’re not necessarily paying extra to go with the first-party design this time around.

The real dilemma is whether it’s worth springing for the 1070 Ti, at least for those models that cling to the stock specifications. It’s entirely possible to score an overclocked GTX 1070 for less than $400 if you play your cards right, and the Vega 56 (if you can find it; it’s not as common) still packs quite a punch. The 1070 Ti is mostly alluring if you prefer NVIDIA hardware and want near-1080 speed without paying the well over $500 it typically costs to get a full-fledged 1080. With that said, keep an eye out for overclocked third-party boards that don’t carry a significant premium — those may hit the sweet spot and give you a reason to jump to the 1070 Ti instead of sticking with the regular 1070 or AMD’s offerings.

Via: Ars Technica. Source: NVIDIA (1), (2)
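To put those spec gaps in perspective, here’s a rough peak-throughput estimate using the standard cores-times-clock-times-two formula (my arithmetic, not an NVIDIA figure):

```python
def fp32_tflops(cuda_cores: int, boost_mhz: int) -> float:
    """Peak single-precision throughput: cores x clock x 2 FLOPs per cycle (FMA)."""
    return cuda_cores * boost_mhz * 1e6 * 2 / 1e12

print(f"GTX 1070 Ti: {fp32_tflops(2432, 1683):.2f} TFLOPS")  # ~8.19
print(f"GTX 1080:    {fp32_tflops(2560, 1733):.2f} TFLOPS")  # ~8.87
```

By that crude measure the 1070 Ti lands within about 8 percent of the 1080, before accounting for the slower GDDR5 memory.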

NVIDIA introduces a computer for level 5 autonomous cars

At the center of many of the semi-autonomous cars currently on the road is NVIDIA hardware. Once automakers realized that GPUs could power their latest features, the chipmaker, best known for the graphics cards that make your games look outstanding, became the darling of the car world. But while automakers are still dropping level 2 and sometimes level 3 vehicles into the market, NVIDIA has announced its first AI computer, the NVIDIA Drive PX Pegasus, which it says is capable of level 5 autonomy. That means no pedals, no steering wheel, no need for anyone to ever take control. The new computer delivers 320 trillion operations per second, 10 times that of its predecessor.

Before you start squirreling away cash for your own self-driving car, though, NVIDIA’s senior director of automotive, Danny Shapiro, notes that it’s likely going to be robotaxis that drive us around. In fact, the company said that over 25 of its partners are already working on fully autonomous taxis.

The goal with this smaller, more powerful computer is to replace the huge computer arrays that sit in the prototype vehicles of OEMs, startups and any other company trying to crack the autonomous-car nut. NVIDIA’s announcement should make all those companies happy. The computing needed to power a self-driving car’s AI and data crunching, not to mention to ingest the huge amounts of data coming from potentially dozens of cameras, LiDAR sensors, and short- and long-range radar, is staggering, and usually means there’s a small server room stored in the trunk. All that processing sucks up a ton of power from the vehicle, and as more cars go electric, the last thing an automaker wants is a system that cuts into the range of its new car. The new NVIDIA Drive PX Pegasus AI computer is the size of a license plate and uses far less power than the current model.

But it’s going to be a while before anyone gets their hands on one. The new computer will be available in the second half of 2018 with next-generation GPUs that NVIDIA hasn’t actually announced yet. There’s already one institution that’s ready to go autonomous, though: Deutsche Post DHL. The delivery service is looking to deploy a pilot fleet with the current Drive PX in 2018. The hope is to have the vehicle shadow its delivery people as they drop off packages: a driver could get out of the truck or van with a few packages for a block, and when they’re finished, the vehicle will be waiting for them outside the last house. So the autonomous future isn’t just about driving people around; it’s also about delivering your online purchases.

Source: Nvidia
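To get a feel for why prototype trunks fill up with servers, here’s a rough tally of raw sensor bandwidth. The sensor counts and per-sensor rates below are illustrative assumptions of mine, not NVIDIA’s figures:

```python
def camera_gbps(width: int, height: int, bytes_per_px: int, fps: int) -> float:
    """Uncompressed bandwidth of one camera, in gigabits per second."""
    return width * height * bytes_per_px * fps * 8 / 1e9

# Hypothetical suite: 12 x 1080p30 RGB cameras plus LiDAR and radar
total_gbps = 12 * camera_gbps(1920, 1080, 3, 30)  # ~17.9 Gbps of camera data
total_gbps += 2 * 0.1    # two LiDAR units at ~100 Mbps each (rough ballpark)
total_gbps += 6 * 0.01   # six radar units at ~10 Mbps each (rough ballpark)
print(f"~{total_gbps:.1f} Gbps of raw sensor data")  # roughly 18 Gbps
```

Even under these conservative assumptions, the computer has to ingest on the order of tens of gigabits per second before it does any actual driving.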

iOS 11 Released

Today, Apple released the final version of iOS 11, its latest mobile operating system. If you have an iPhone or iPad that was released within the last few years, you should be able to download the new update by navigating to the Settings panel and checking for a software update under the General tab. The Verge reports: iOS 11, first unveiled in detail back at Apple’s WWDC in June, is the same incremental annual refresh we’ve come to expect from the company, but it hides some impressive complexity under the surface. Not only does it add some neat features to iOS for the first time, like ARKit capabilities for augmented reality and a new Files app, but it also comes with much-needed improvements to Siri; screenshot capture and editing; and the Control Center, which is now more fully featured and customizable. For iPads, iOS 11 is more of an overhaul. The software now better supports multitasking, so you can more easily bring two apps into split-screen mode, or even add a third. The new drag-and-drop features are also much more powerful on iPad, letting you manage stuff in the Files app more intuitively and even letting you drag and drop photos and text from one app to another.

HP unveils its insanely upgradeable Z-class workstations

No matter how great your latest PC build is, HP’s new Z Workstation lineup can probably top it. The company’s latest Z8, Z6 and Z4 desktop workstations are its most powerful and ridiculously upgradeable ever, it says. The top-end Z8 features 24 RAM slots and up to 3TB of RAM, dual Xeon CPUs (with up to 56 cores), dual M.2 SSDs and dual NVIDIA Quadro Pro graphics cards. It’s aimed squarely at VFX artists, letting them run 3D simulations, edit 8K video and do Nuke compositing, probably all at the same time. Just to rattle off a few more specs (because there aren’t many machines like this), it offers 10 USB 3.1 Gen 2 Type-C ports, dual Gig-E ports, seven full-length, full-height PCIe slots (nine total) and optional Thunderbolt 3. All of that is housed in a cleanly laid out, tool-free chassis with a 1,700-watt power supply and effective ducting.

Just buying the Z8 box without much inside will cost you $2,439, but if you want, say, a pair of the latest Xeon Platinum 8180 chips with 28 cores each and two of NVIDIA’s 24GB Quadro P6000 graphics adapters, those items alone would run you a cool $35,000. Overall, the new machine boosts the memory, CPU core count, graphics and PCIe bandwidth capacity significantly across the board compared to its previous Z840 flagship model.

The HP Z6 Workstation ($1,919) dials that craziness down a notch, with 384GB of max system memory and fewer slots and ports, but still has the dual Xeon CPU option. The Z4 Workstation ($1,239), meanwhile, lets you install 256GB of RAM and a single Xeon CPU, limited to Intel’s new W-series. The top end of that right now is the Xeon W-2155 with 10 cores and 20 threads, and HP hasn’t said whether it’ll support the flagship 18-core Xeon W that’s coming later in the year. Remember, these prices are for the bare boxes only, not the graphics cards, memory, CPUs and other stuff you’ll need. On the Z4 model, however, the 10-core Xeon W chip is “just” $1,000, so it should be no problem to build a box for around $3,000 to $4,000.

HP also unveiled a few new displays, including the 38-inch Z38c curved display, which is a bit of an odd duck for HP’s video-oriented Z lineup. Much like LG’s 38UC99 38-inch screen, it has a sort-of-4K, 21:9 resolution (3,840 x 1,600) that can’t actually handle full-resolution UHD video, so it seems better suited to gaming. It also lacks other features like 10-bit capability, which is becoming increasingly important for 4K HDR video editing and color correction.

HP’s Z8 and Z6 Workstations are coming in October, while the Z4 arrives sometime in November. This end-of-year timing should help film and VFX houses fit them into their fiscal purchase budgets; other than the Z4, these aren’t really meant for you or me.
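A quick sanity check on that headline 3TB figure (my arithmetic; HP’s announcement doesn’t spell out the DIMM configuration):

```python
ram_slots = 24
max_ram_gb = 3 * 1024          # the advertised 3TB maximum
print(max_ram_gb / ram_slots)  # 128.0 -> implies 128GB LRDIMMs in all 24 slots
```

In other words, hitting the maximum means populating every one of the 24 slots with the largest (and priciest) server-class DIMMs available.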

Sharp Announces 8K Consumer TVs Now That We All Have 4K

Thuy Ong reports via The Verge: Now that you’ve upgraded to a shiny new 4K TV, Sharp has revealed its latest screen to stoke your fear of missing out: a 70-inch Aquos 8K TV. That 8K (7,680 x 4,320) resolution is 16 times that of your old Full HD (1,920 x 1,080) TV. Sharp calls it “ultimate reality, with ultra-fine details even the naked eye cannot capture,” which doesn’t seem like a very good selling point. Keep in mind that having a screen with more pixels doesn’t buy you much after a certain point, because those pixels are invisible from a distance — while an 8K panel would be beneficial as a monitor, where you’re sitting close, it won’t buy you much when leaning back on the couch watching TV. HDR, however, is something else entirely, and fortunately, Sharp’s new 8K set is compatible with Dolby Vision HDR and BDA-HDR (for Blu-ray players). The lack of available 8K HDR content is also a problem, but there is some content floating around. The TV will roll out to China and Japan later this year, and then to Taiwan in February 2018. Sharp is repurposing its 70-inch 8K TV as an 8K monitor (model LV-70X500E) for Europe, where it will go on sale in March. There is no news about a U.S. release.
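That “16 times” claim checks out on pixel count alone (simple arithmetic, not a statement about perceived sharpness):

```python
pixels_8k = 7680 * 4320        # 33,177,600 pixels
pixels_fhd = 1920 * 1080       #  2,073,600 pixels
print(pixels_8k / pixels_fhd)  # 16.0
```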

What kind of gaming rig can run at 16K resolution?

The consumer gaming world might be in a tizzy about 4K consoles and displays of late, but that resolution standard wasn’t nearly enough for one team of PC tinkerers. The folks over at Linus Tech Tips have posted a very entertaining video showing off a desktop PC build capable of running (some) games at an astounding 16K resolution. That’s 15,360×8,640, for those counting: over 132 million pixels pushed every frame, 64 times the raw pixel count of a standard 1080p display and 16 times that of a 4K display. The key to the build is four Quadro P5000 video cards provided by Nvidia. While each card performs similarly to a consumer-level GTX 1080 (8.9 teraflops, 2,560 parallel cores), these are pro-tier cards designed for animators and other high-end graphics work, often used for massive jumbotrons and other multi-display or multi-projector installations. The primary difference between the Quadro and consumer cards is that these come with 16GB of video RAM. Unfortunately, the multi-display Mosaic technology syncing the images together means that mirrored memory doesn’t stack, leading to the rig’s most significant bottleneck. All told, the graphics cards alone would cost over $10,000, including a Quadro Sync card that ties them all together to run a single image across 16 displays.
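To see why that mirrored VRAM limit bites, here’s the raw framebuffer math (my arithmetic; actual memory use depends on the game and rendering pipeline):

```python
w, h = 15360, 8640
pixels = w * h                 # 132,710,400 pixels per frame
print(pixels / (1920 * 1080))  # 64.0x a 1080p display
print(pixels / (3840 * 2160))  # 16.0x a 4K display
print(pixels * 4 / 1e6)        # ~530.8 MB for one 32-bit color framebuffer
```

Every card has to hold that half-gigabyte frame plus all of the game’s textures and geometry within its 16GB, since the mirrored memory across the Mosaic setup doesn’t stack.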

Slackware, Oldest Linux Distro Still In Active Development, Turns 24

sombragris writes: July 17 marked the 24th anniversary of Slackware Linux, the oldest GNU/Linux distribution still in active development. It was created in 1993 by Patrick Volkerding, who still serves as its BDFL. Version 14.2 was launched last year, and the development version (Slackware-current) currently offers kernel 4.9.38, gcc 7.1, glibc 2.25, mesa 17.1.5, and KDE and Xfce as official desktops, with many others available as third-party packages. Slackware is also among the Linux distributions that have not adopted systemd as their init system; instead, it uses a modified BSD init, which is quite simple and effective. Slackware is known to be a solid, stable and fast setup with easy defaults, which many Linux users worldwide appreciate. Phoronix has a small writeup noting the anniversary, and there’s also a nice reddit thread.

Asus ROG GX800VH review: A ludicrous liquid-cooled $6,000-plus laptop

The Asus ROG GX800VH, a liquid-cooled monstrosity of a gaming laptop, is one of those things that, like 4K phones or the Apple Watch, is wholly unnecessary yet awfully desirable. Beneath its fully mechanical, RGB-lit keyboard is Intel’s top-of-the-line mobile i7-7820HK processor, which is based on the same Kaby Lake architecture as the i7-7700K and is similarly overclockable. There are two Nvidia GeForce GTX 1080 graphics cards paired in SLI, 64GB of DDR4 memory, and an 18.4-inch 4K display with G-Sync. Buying one costs £6,600/$6,300, which is an astonishing amount of money even considering the tech that’s included.

Specs at a glance: Asus ROG GX800VH
Screen: 3840×2160 18.4-inch IPS G-Sync display, 100 percent RGB
OS: Windows 10 Home x64
CPU: 4C/8T 2.9GHz Core i7-7820HK (OC to 4.4GHz)
RAM: 64GB 2800MHz DDR4
GPU: 2x Nvidia GTX 1080
Storage: 2x 512GB NVMe SSD in RAID 0
Networking: 802.11ac WiFi, Bluetooth 4.1, Gigabit Ethernet
Ports: 1x microphone-in jack, 1x headphone-out jack (SPDIF), 1x Type-C USB 3.1 (Gen 2) Thunderbolt, 3x Type-A USB 3.0 (USB 3.1 Gen 1), 1x RJ45 LAN jack, 1x HDMI, 1x docking port (hot swap), 1x mini DisplayPort, 1x SD card reader
Size: Laptop: 45.8 x 33.8 x 4.54 cm (WxDxH); Thermal dock: 35.9 x 41.8 x 13.3 cm (WxDxH)
Other perks: 8-cell 71Wh battery, HD web camera, mechanical keyboard
Warranty: 1 year
Price: £6,600/$6,300

The GX800VH certainly isn’t for everyone, then, least of all those who want the most bang for the buck. But as an example of what’s possible on the bleeding edge when money is no object, it’s one of the finest pieces of technological willy-waving that we’ve ever seen.

Buying a GX800VH requires a commitment from both your credit card and your ego. Not only is the laptop itself physically large and covered in orange highlights, but it comes with both a backpack and a suitcase to carry the accompanying liquid cooling unit around — and the graphics on the suitcase are hardly what you’d call subtle. Still, the suitcase, which is filled with a pre-cut foam insert for the liquid cooling unit and extra power supply, and the bag do make carrying the whole setup around that much easier, should you want to lug it to a friend’s house or, if you’re seriously committed to gaming, on holiday.

Apple To Phase Out 32-Bit Mac Apps Starting In January 2018

Apple will be phasing out 32-bit apps with iOS 11, and soon the company will make the same changes in its macOS operating system. During its Platform State of the Union keynote at the Worldwide Developers Conference, Apple told developers that macOS High Sierra will be the “last macOS release to support 32-bit apps without compromises.” MacRumors reports: Starting in January of 2018, all new apps submitted to the Mac App Store must be 64-bit, and all apps and app updates submitted must be 64-bit by June 2018. With the next version of macOS after High Sierra, Apple will begin “aggressively” warning users about 32-bit apps before eventually phasing them out altogether. In iOS 11, 32-bit apps cannot be installed or launched; attempting to open a non-supported 32-bit app gives a message notifying users that the app needs to be updated before it can run on iOS 11. Prior to phasing out 32-bit apps on iOS 11, Apple gave both end users and developers several warnings, and the company says it will follow the same path for the macOS operating system.
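If you’re curious which of your own apps are on borrowed time, macOS’s built-in system_profiler tool already reports each app’s 64-bit status. Here’s a minimal sketch, assuming Python 3.7+ on macOS and the plain-text layout of system_profiler’s applications report; the parsing is a crude heuristic, not an official API:

```python
import subprocess

# Ask macOS for its applications inventory (this can take a minute or two)
report = subprocess.run(
    ["system_profiler", "SPApplicationsDataType"],
    capture_output=True, text=True, check=True,
).stdout

# Crude text parse: app names are lines ending in ':', fields are "Key: Value"
current_app = None
for line in report.splitlines():
    stripped = line.strip()
    if stripped.endswith(":") and ": " not in stripped:
        current_app = stripped.rstrip(":")   # start of a new application entry
    elif stripped == "64-Bit (Intel): No" and current_app:
        print(current_app)  # a 32-bit app that will eventually stop launching
```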
