Intel squeezed an AMD graphics chip, RAM and CPU into one module

Intel may have unveiled its latest Core CPUs for mainstream laptops, but the company has something more advanced up its sleeve for what it calls its “enthusiast” customers. The new chip will be part of the 8th-generation Core H series of processors and comes with discrete-level graphics built in, as well as its own RAM. Having all this built into the processor frees up space for other components inside a laptop, so device manufacturers can squeeze in things like larger batteries or better fan designs. Intel is not sharing performance details for the new CPUs yet, but it’s promising power that will be good enough for gamers or content creators who often run taxing programs like Adobe Photoshop and Lightroom.

Specifically, the new processor integrates a “semi-custom” AMD graphics chip and the second generation of Intel’s High Bandwidth Memory (HBM2), which is comparable to GDDR5 in a traditional laptop. The three typically distinct components are able to coexist on one chip because of Intel’s Embedded Multi-Die Interconnect Bridge (EMIB), which “allows heterogeneous silicon to quickly pass information in extremely close proximity.” The company also came up with a power-sharing framework that lets the GPU manage each component’s temperature, performance and energy use. This infrastructure should free up about three square inches of board space that could either be used for other components, as described above, or make for thinner laptops altogether. The idea is that powerful laptops for gamers no longer have to be chunky beasts.

The new Core H processor is the first consumer product to use EMIB and will be released in the first quarter of 2018; many laptop makers are expected to offer products powered by the chip. This is a significant development that not only benefits the enthusiast audience, but could also have trickle-down effects that improve mainstream laptops (and even other devices) in the future. Source: Intel
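The power-sharing idea can be illustrated with a toy budget allocator. Intel has not published the framework's interface, so every name and number below is a hypothetical sketch of the general concept — one package-level power budget split between dies according to demand — not Intel's actual implementation:

```python
# Toy sketch of a shared power budget between a CPU die and a GPU die on
# one package. All names and wattages are illustrative assumptions, not
# Intel's power-sharing framework.

def share_power(total_budget_w, cpu_demand_w, gpu_demand_w):
    """Split a package-level power budget between two dies.

    Each die gets what it asks for when the combined demand fits the
    budget; otherwise both allocations are scaled down proportionally.
    """
    demand = cpu_demand_w + gpu_demand_w
    if demand <= total_budget_w:
        return cpu_demand_w, gpu_demand_w
    scale = total_budget_w / demand
    return cpu_demand_w * scale, gpu_demand_w * scale

# Demand (60 W) exceeds a 45 W package budget, so both dies are throttled
# proportionally (scale factor 0.75).
cpu_w, gpu_w = share_power(45.0, 25.0, 35.0)
```

The point of putting all three components under one budget is that a game that is GPU-bound can, in principle, hand thermal headroom to the graphics die instead of leaving it reserved for an idle CPU.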

View the original here:
Intel squeezed an AMD graphics chip, RAM and CPU into one module

Researchers craft an LED just two atoms thick

[Image: Hexagonal boron nitride, one of the materials used here. Credit: Wikimedia Commons]

Modern computers are, in many ways, limited by their energy consumption and cooling requirements. Some of that comes from the process of performing calculations. But often, the majority of energy use comes from simply getting data to the point where calculations are ready to be performed. Memory, storage, data-transfer systems, and more all create power draws that, collectively, typically end up using more power than the processor itself. Light-based communication offers the possibility of dropping power consumption while boosting the speed of connections. In most cases, designs have focused on situations where a single external laser supplies the light, which is divided and sent to the parts of the system that need it. But a new paper in Nature Nanotechnology suggests an alternate possibility: individual light sources on the chip itself.

To demonstrate this possibility, the team put together an LED just two atoms thick and integrated it with a silicon chip. Better still, the same material can act as a photodetector, providing a way of building all the needed hardware using a single process.

Atomic

The work relied on two different atomically thin materials. These materials consist of a planar sheet of atoms chemically linked to each other. While the study of such materials was pioneered using graphene, a sheet of carbon atoms, researchers have since developed a variety of others with similar structures. The materials being used here are molybdenum ditelluride (MoTe2), a semiconductor, and hexagonal boron nitride.

More:
Researchers craft an LED just two atoms thick

Surface Book 2: More cores, more GPU, and more screen

Just over two years ago, Microsoft unveiled its Surface Book hybrid laptop: a tablet with a detachable hinged keyboard base. It was a compelling concept, with Microsoft pulling off some clever tricks. The base contained a battery, boosting the life of the tablet portion substantially, and could optionally contain a discrete GPU, too. A little under a year ago, the Surface Book was partially refreshed: a new base was offered with a bigger battery and a faster GPU. The tablet portion, however, was left unchanged.

Today, Microsoft unveiled not only a full refresh of the system—both tablet and base are being updated—but a whole new version of the machine. Surface Book 2 (Microsoft is using numerical version suffixes here, even after abandoning the practice with the Surface Pro) will come in two sizes. There’s a 13-inch model, same as before, but this is now paired with a 15-inch version. The broad concept of Surface Book remains the same. The screen half of the “laptop” is in fact a tablet computer, containing the processor, memory, mass storage, and a battery; the “keyboard” half contains a larger battery, some expansion ports, and, optionally, a discrete GPU. The systems look essentially the same as the old versions, too, with the 15-inch version looking for all intents and purposes like a scaled-up version of the 13-inch one.

Read more here:
Surface Book 2: More cores, more GPU, and more screen

More progress on carbon nanotube processors: a 2.8GHz ring oscillator

Back in 2012, I had the pleasure of visiting the IBM Watson research center. Among the people I talked with was George Tulevski, who was working on developing carbon nanotubes as a possible replacement for silicon in some critical parts of transistors. IBM likes to think about developing technology with about a 10-year time window, which puts us about halfway to when the company might expect to be making nanotube-based hardware. So, how’s it going?

This week, a bit of a progress report was published in Nature Nanotechnology (with Tulevski among its authors). In it, IBM researchers describe how they’re now able to put together test hardware that pushes a carbon nanotube-based processor up to 2.8GHz. It’s not an especially useful processor, but the methods used for assembling it show that some (but not all) of the technology needed to commercialize nanotube-based hardware is nearly ready.

Semiconducting hurdles

The story of putting together a carbon nanotube processor is largely one of overcoming hurdles. You wouldn’t necessarily expect that: given that the nanotubes can be naturally semiconducting, they’d seem like a natural fit for existing processor technology. But it’s a real challenge to get the right nanotubes in the right place and have them play nicely with the rest of the processor. In fact, it’s a series of challenges.
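A ring oscillator, the test structure named in the headline, is a loop of an odd number of inverting stages; its frequency follows directly from the per-stage delay, f = 1 / (2 · N · t_pd). Working backwards from the reported 2.8GHz gives a feel for the implied gate speed. The stage count below is an assumption for illustration; the paper's actual topology may differ:

```python
# A ring oscillator of N inverting stages oscillates at
#   f = 1 / (2 * N * t_pd)
# where t_pd is the propagation delay of one stage. Given a measured
# frequency, we can back out the per-stage delay. The 5-stage count is
# an illustrative assumption, not taken from the paper.

def stage_delay(freq_hz, n_stages):
    """Per-stage propagation delay implied by a ring oscillator frequency."""
    return 1.0 / (2 * n_stages * freq_hz)

t_pd = stage_delay(2.8e9, 5)  # a 5-stage ring running at 2.8 GHz
print(f"{t_pd * 1e12:.1f} ps per stage")
```

Under that assumption, 2.8GHz implies roughly 36 picoseconds per nanotube inverter stage, which is why the frequency is a meaningful proxy for device speed even though the circuit itself does no useful computation.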

Excerpt from:
More progress on carbon nanotube processors: a 2.8GHz ring oscillator

A Beginner’s Introduction to Overclocking Your Intel Processor

If you want to squeeze every last ounce of processing power out of your new computer or aging system, overclocking is a great—if slightly nerve-racking—option. Here are some simple guidelines for safely overclocking your processor.
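The arithmetic behind the basic approach is simple: an Intel core clock is the base clock (BCLK) multiplied by a multiplier, and overclocking an unlocked part usually means raising that multiplier. A minimal sketch, with illustrative numbers rather than values for any specific chip:

```python
# Core clock = base clock (BCLK) x multiplier. On unlocked ("K") Intel
# parts, overclocking typically means raising the multiplier in small
# steps and stress-testing at each one. Numbers are illustrative.

def core_clock_ghz(bclk_mhz, multiplier):
    """Effective core clock in GHz for a given BCLK and multiplier."""
    return bclk_mhz * multiplier / 1000.0

stock = core_clock_ghz(100, 35)        # 100 MHz BCLK x 35 -> 3.5 GHz
overclocked = core_clock_ghz(100, 42)  # raising the multiplier to 42 -> 4.2 GHz
```

Raising the multiplier rather than the BCLK is the usual first step because the BCLK also clocks other subsystems, so pushing it destabilizes more than just the CPU cores.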

Excerpt from:
A Beginner’s Introduction to Overclocking Your Intel Processor

AMD wins race to 5GHz CPU clock speed, in which it was the sole participant

AMD has refreshed its lineup of eight-core FX chips in what sounds like some straightforward overclocking of last year’s products. The FX-9590 claims a clock speed of 5GHz in turbo mode, making it the “world’s first commercially available 5GHz CPU processor,” while the FX-9370 lags slightly behind at 4.7GHz, as compared to the 4.2GHz top speed of the current FX-8350. Both new CPUs are based on the familiar Piledriver core, which has a reputation for being relatively cheap and easily overclockable (honestly, the 5GHz barrier was obliterated long ago), but far behind an Intel Core i5 in terms of all-around computing. This is especially true since the launch of Haswell, which largely avoided clock speed increases in favor of architectural tweaks that didn’t compromise efficiency. Maingear plans to pick up the 5GHz part for use in a gaming system coming this summer, but there’s no word yet on pricing or even general availability for DIY upgraders. Now, we’re just speculating, but with AMD increasingly focused on APUs, it’s possible that today’s chips will represent the FX’s lap of glory.

Read More:
AMD wins race to 5GHz CPU clock speed, in which it was the sole participant

Intel: Haswell will boost laptop battery life by 50 percent

When Intel launched Haswell, it promised a generational leap in battery life, and now the chip giant’s talking numbers to back that up. Architecture Group VP Rani Borkar said that laptops packing the chipset should get 50 percent more battery life than current Ivy Bridge models and last up to 20 times longer in standby or idle mode, without any cost to performance. She said that the chip’s lower power requirements are one factor in the reduced consumption, but an all-new architecture, including a power management chip, will also help cut the energy draw. We’ll have to see whether that encouraging piece of news will help the moribund PC notebook market regain lost ground from tablets, or whether companies will just keep blurring the line. Source: Computerworld
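It's worth spelling out what a 50 percent battery-life gain implies: with battery capacity held fixed, runtime is inversely proportional to average power draw, so the claim amounts to the platform drawing about two-thirds of the Ivy Bridge baseline. A small sketch of that arithmetic:

```python
# At fixed battery capacity, runtime scales as 1 / (average power draw).
# A 50% runtime gain therefore implies power falls to 1 / 1.5 ~ 67% of
# the baseline. This is back-of-the-envelope math, not an Intel figure.

def power_ratio(life_gain):
    """Relative power draw implied by a fractional runtime gain
    at unchanged battery capacity."""
    return 1.0 / (1.0 + life_gain)

ratio = power_ratio(0.50)  # Haswell's claimed 50% battery-life gain
print(f"implied power draw: {ratio:.0%} of baseline")
```

In other words, the headline claim is roughly a one-third cut in average platform power, which is why the standby figure (up to 20x) is the more dramatic number: idle states can be gated far more aggressively than active ones.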

Continued here:
Intel: Haswell will boost laptop battery life by 50 percent

MIT imaging chip creates natural-looking flash photos

Mobile image processing in itself isn’t special when even high dynamic range shooting is virtually instant, at least with NVIDIA’s new Tegras. A new low-power MIT chip, however, may prove its worth by being a jack of all trades that works faster than software. It can apply HDR to photos and videos through near-immediate exposure bracketing, and it can also produce natural-looking flash images by combining the lit photo with an unassisted shot to fill in missing detail. Researchers further claim automatic noise reduction that safeguards detail through bilateral filtering, an established technique that uses brightness detection to avoid blurring edges. If you’re wondering whether MIT’s work will venture beyond the labs, don’t worry: the project was financed by contract manufacturing giant Foxconn, and it’s already catching the eye of Microsoft Research. As long as Foxconn maintains interest through to production, pristine mobile photography won’t be limited to a handful of devices. Source: MIT
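The bilateral filtering mentioned above is a standard, well-documented technique, so its core idea can be sketched independently of MIT's hardware: each pixel is replaced by a weighted average of its neighbors, where the weight combines spatial distance with brightness difference, so smoothing stops at sharp edges. A minimal one-dimensional version (parameters are illustrative, not the chip's values):

```python
import math

# Minimal 1-D bilateral filter. Each output sample is a weighted average
# of nearby samples; the weight is a product of a spatial Gaussian and a
# brightness ("range") Gaussian, so samples across a sharp edge get a
# near-zero weight and the edge is preserved. Parameters are illustrative.

def bilateral_1d(signal, radius=2, sigma_s=1.0, sigma_r=10.0):
    out = []
    for i, center in enumerate(signal):
        total = weight_sum = 0.0
        for j in range(max(0, i - radius), min(len(signal), i + radius + 1)):
            w = math.exp(-((i - j) ** 2) / (2 * sigma_s ** 2)
                         - ((signal[j] - center) ** 2) / (2 * sigma_r ** 2))
            total += w * signal[j]
            weight_sum += w
        out.append(total / weight_sum)
    return out

# A step edge stays sharp: samples near 0 and samples near 100 barely mix,
# while the small noise wiggles on each side are smoothed.
smoothed = bilateral_1d([0, 1, 0, 100, 101, 100])
```

A plain Gaussian blur would smear the 0-to-100 step across several samples; the brightness term is what lets the filter denoise flat regions without blurring that edge.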

See original article:
MIT imaging chip creates natural-looking flash photos

Intel gets go-ahead for $4 billion chip plant in Ireland, will produce its next-gen 14nm processors

Intel has been planning to make its Ireland base one of three global manufacturing sites for its 14nm chips since May last year, and it’s now been given the okay by Ireland’s lead planning agency. The new $4 billion plant will create around 4,300 jobs in Co. Kildare, where Intel already has around 4,000 on staff. The two-year plan involves redeveloping its existing operation, expanding it and shifting to the smaller, more efficient 14nm process. Intel’s plans don’t stop there, however: it still plans to roll out 10nm products sometime in 2015. Via: Silicon Republic Source: Pleanala

More:
Intel gets go-ahead for $4 billion chip plant in Ireland, will produce its next-gen 14nm processors

NVIDIA officially unveils Tegra 4: offers quad-core Cortex A15, 72 GPU cores, LTE support

One new SoC per year? That’s what NVIDIA pledged back in the fall of 2010, and today at its CES 2013 presser, it delivered with the Tegra 4’s official unveiling. The chip, which retains the same 4-plus-1 arrangement as its predecessor, arrives with a whopping 72 GeForce GPU cores (effectively six times the Tegra 3’s visual output) and is based on the 28nm process. It is also the first quad-core processor with Cortex A15 cores on board, and offers compatibility with LTE networks through an optional chip. NVIDIA claims this piece of silicon is the world’s fastest mobile processor, and showed a demonstration in which a Tegra 4 went head-to-head against a Nexus 10 in loading websites (you can guess which one won).

The Tegra 4 also introduces a new computational photography architecture, adding a dedicated engine that drives image processing and performs the necessary calculations up to 10 times faster than current platforms. To show off its power, NVIDIA demonstrated HDR rendering on live video. The chip is also capable of implementing HDR in burst shots and with LED flash. The idea, NVIDIA says, is to eventually make our mobile cameras more powerful than DSLRs, and this is certainly a step in the right direction. Joseph Volpe contributed to this report.

See the article here:
NVIDIA officially unveils Tegra 4: offers quad-core Cortex A15, 72 GPU cores, LTE support