When metals become insulators and back again, it’s math to the rescue

Written by admin. Posted in Hardware, Science, Tech



Making insulators conduct like metals is the bread and butter of our electronic world. Despite knowing many ways to do this — doping semiconductors, applying heat or pressure, or otherwise reducing the spacing between adjacent atoms — we lack a general theory that explains all the observed physics. A recent paper by Carnegie’s Russell Hemley and Ivan Naumov provides a new mathematical foundation that successfully predicts not only when external forces drive metals to become insulators, but also when the reverse transition makes insulators conduct like metals.

Dating back to the origins of quantum mechanics, it was generally held that once a material turned fully metallic under pressure, it would stay that way as pressure increased further. Not so. We now know that certain metals, like lithium, defy this assumption and actually switch from metal to semiconductor under pressure: lithium begins to act like a semiconductor at 80 gigapascals, then reverts to metallic behavior at 120 gigapascals. Sodium, calcium, and nickel are all predicted to show similar behavior.

With the limits of current doped semiconductor technology now looming, new ways to control the metal-to-insulator transition are highly sought. We recently discussed how some metals, gold for example, make such a transition when the number of atoms in a cluster is increased from 102 to 144. While the new results may not yet apply to all metals, the researchers were able to show that in some cases, in order for a metal to become an insulator, the overlap in its electronic structure must be organized into specific kinds of asymmetry. When these symmetry breaks occur, electrons localized in the spaces between the atoms no longer flow as freely as they do in the metallic form.


Backblaze pulls 3TB Seagate HDDs from service, details post-mortem failure rates

Written by admin. Posted in Hardware


For the past few years, online backup company Backblaze has published hard drive reliability data on the tens of thousands of hard drives it’s bought and put into service. Throughout that time period, one drive family has persistently cropped up as failing at rates far above the rest of the industry — a Seagate 3TB model, ST3000DM001. Now, the company has decided to pull that drive from its operations altogether, and it’s published a detailed account of how the 3TB Seagate stood up to its competition (or failed to).

According to Backblaze, the shift to 3TB Seagates was the result of the Thailand flood crisis and the massive disruption to the HDD industry that resulted. Seagate was largely unaffected by the floods that paralyzed production at its competitors’ plants, and so reaped the lion’s share of benefits from the event. Backblaze purchased nearly 5000 ST3000DM001 drives over the course of a year, with a nearly 50/50 split between internal and external drives (the company removed the external drives from their enclosures and used them in its storage pods).

The result? Massive failures beginning in 2013 and rising throughout 2014. Instead of the typical “bathtub” failure curve, in which a handful of units fail immediately and the rest continue operating normally for years, the Seagate drives showed a period of strong operation followed by high failure rates.
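Backblaze reports reliability as an annualized failure rate (AFR): failures per drive-year of service. Here's a minimal sketch of that calculation; the drive counts and failure numbers below are illustrative examples, not Backblaze's actual figures:

```python
def annualized_failure_rate(failures, drive_days):
    """AFR = failures per drive-year of service, expressed as a percentage."""
    drive_years = drive_days / 365.0
    return 100.0 * failures / drive_years

# Illustrative numbers only: 1000 drives running for a full year,
# with 120 failures over that period.
afr = annualized_failure_rate(failures=120, drive_days=1000 * 365)
print(f"AFR: {afr:.1f}%")  # AFR: 12.0%
```

Counting drive-days rather than drives is what lets a fleet with constant churn (drives added and retired mid-year) still produce a comparable failure rate.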


Nvidia GeForce GTX Titan X reviewed: Crushing the single-GPU market

Written by admin. Posted in Hardware

Titan X


Today, Nvidia is launching its new ultra-high-end luxury GPU, the GeForce GTX Titan X. This is the fourth GPU to carry the Titan brand, but only the second architecture to do so. When Nvidia launched the first Titan, it used a cut-down version of its workstation and HPC processor, the GK110, with just 14 of its 15 SMX units enabled. Later cards, like the Titan Black, enabled the last SMX unit and added RAM, while the dual-GPU Titan Z packed two Titan Black GPUs onto a single card, with mixed results.


GM200, full fat edition


The Titan X is based on Nvidia’s GM200 processor and ships with all 24 of its SMMs enabled (the current term for Nvidia’s compute units). The chip has 3072 CUDA cores and a whopping 12GB of GDDR5 memory. To those of you concerned about a GTX 970-style problem, rest assured: there are no bifurcated memory issues here.
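The core count follows directly from the SMM count: each Maxwell SMM contains 128 CUDA cores, so a fully enabled GM200 works out as a quick bit of arithmetic:

```python
SMM_COUNT = 24       # streaming multiprocessors on a fully enabled GM200
CORES_PER_SMM = 128  # CUDA cores per Maxwell SMM

total_cores = SMM_COUNT * CORES_PER_SMM
print(total_cores)  # 3072
```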


Which SSDs are the most reliable? Massive study sheds some light

Written by admin. Posted in Hardware

NAND flash


Ever since SSDs started showing up in consumer hardware, end users have had one consistent question: how long do the drives live, and how robust are they compared with conventional hard drives? Data on these metrics is often difficult to find, and the complexity of the drives themselves makes it hard to isolate which kinds of failure are more or less likely on a given drive. Manufacturers publish lifetime write specifications and recommended usage patterns, but this data tends to be extremely general.

18 months ago, Tech Report set out to test the limits of SSD endurance and catalog how a set of six drives would fail under load. The drives chosen: Corsair Neutron GTX, Intel 335 Series, Kingston HyperX 3K (two units), Samsung 840, and Samsung 840 Pro. All were in the 240GB to 256GB class of hardware, and all started the experiment pristine.


All six drives made it several hundred terabytes past their manufacturer-set limits, but four of the six died before or just after the 1PB mark. Intel’s SSD died first, of a self-inflicted wound (the drive is designed to stop working once it begins having problems), but two drives — the Kingston and the Samsung 840 Pro — made it past the 2PB mark. Of course, six drives aren’t a representative sample of how all SSDs perform, and TR doesn’t recommend treating this test as such. SSDs can fail for a variety of reasons — this particular test measured how a drive handles steady wear and repeated write cycles, not how it handles repeated power-loss events.

What it does point to, however, is that at least in this one particular metric, manufacturers appear to set their guidelines extremely conservatively. That’s good news for anyone looking to jump from HDDs to SSDs, though we should note that recent EVO drives have left us wary of TLC-based NAND once again.
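To put those endurance numbers in perspective, here's a rough sketch of how long a drive would last at a given daily write volume. The write-limit and workload figures below are hypothetical examples, not the specs of any drive in TR's test:

```python
def years_until_write_limit(limit_tb, daily_writes_gb):
    """Rough lifespan estimate: rated total writes divided by daily write volume."""
    limit_gb = limit_tb * 1000
    return limit_gb / daily_writes_gb / 365.0

# Hypothetical: a drive that survives 700TB of writes, under a
# heavy 50GB/day consumer workload.
years = years_until_write_limit(limit_tb=700, daily_writes_gb=50)
print(f"{years:.1f} years")
```

Even the drives that died near the 1PB mark would outlast a typical consumer workload by decades; steady-wear endurance is rarely the limiting factor for desktop use.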

One thing to be aware of, as Tech Report puts it, is that “SSDs don’t always fail gracefully… watch for bursts of reallocated sectors.” This is good advice for any storage medium — hard drives don’t always fail gracefully, either. One of the most profound disconnects in computing is the vast difference between the value of hard drives or solid state drives as a blank storage medium (where they both cost well under $1 per GB) and the extraordinarily high cost of recovering that lost information in the event of failure. As always, ExtremeTech recommends a good backup solution, no matter what kind of storage you use.
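On Linux, the reallocated-sector count TR mentions is exposed as SMART attribute 5 (`Reallocated_Sector_Ct` in `smartctl -A` output from the smartmontools package). Here's a minimal sketch of pulling that value out of smartctl's attribute table; the sample output is an illustrative excerpt, not from a specific drive:

```python
def reallocated_sectors(smartctl_output):
    """Extract the raw value of SMART attribute 5 from `smartctl -A` output."""
    for line in smartctl_output.splitlines():
        fields = line.split()
        if fields and fields[0] == "5":
            return int(fields[-1])  # RAW_VALUE is the last column
    return None  # attribute table not found

# Illustrative excerpt of a smartctl attribute table:
sample = """\
ID# ATTRIBUTE_NAME          FLAG     VALUE WORST THRESH TYPE      RAW_VALUE
  5 Reallocated_Sector_Ct   0x0033   100   100   036    Pre-fail  24
"""
count = reallocated_sectors(sample)
if count:
    print(f"Warning: {count} reallocated sectors")
```

A nonzero count isn't necessarily fatal on its own; it's a sudden jump between checks that should send you reaching for your backups.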

USB-C vs. USB 3.1: What’s the difference?

Written by admin. Posted in Hardware, Tech

USB Type-C


With the launch of the Apple MacBook and Google’s Chromebook Pixel, USB-C (also called USB Type-C) and the accompanying USB 3.1 standard are both hitting the market somewhat earlier than we initially expected. If you’re curious about the two standards and how they interact, we’ve dusted off and updated our guide to the technology. The situation is more nuanced than with previous USB standard updates — USB 3.1 and USB Type-C connectors may be arriving together on the new machines, but they aren’t joined at the hip the way you might think.

USB Type-C: Fixing an age-old problem

The near-universal frustration of trying to connect USB devices to computers has long been a staple of nerd humor, lampooned in various ways until Intel finally found a way to take the joke quantum.

Super-positioned USB

Researchers create glasses-free 3D display with tiny spherical lenses

Written by admin. Posted in Hardware, Tech


Several industries have tried to get consumers excited about 3D, but it simply hasn’t taken hold. The most successful foray into the realm of 3D technology is probably the Nintendo 3DS, which has sold quite well by the standards of handheld game consoles. Part of that is the effective use of 3D in games, but more importantly, you don’t need glasses to experience the 3D effect. Glasses-free 3D comes with drawbacks, but a team of researchers from Chengdu, China might have figured out how to make this type of 3D viable by using spherical lenses in the display.

The screen on the 3DS can produce a 3D image without glasses, making it an “autostereoscopic” screen. It works on the concept of parallax: each eye sees a slightly different image, which is beamed from the screen using a parallax barrier. This does away with the need for glasses to split up the image with shutters or polarized light. You just look at the screen and see a 3D effect. The main drawback is that parallax screens only work within a very narrow viewing angle of 20 to 30 degrees, and the effect is considerably weaker toward the high end of that range.

Most people can tolerate a narrow viewing angle on a handheld device, but with anything larger it’s far too inconvenient. The spherical-lens display design featured in the new paper has the potential to boost the viewing angle of an autostereoscopic screen dramatically: the researchers’ proof-of-concept display works at 32 degrees, with a theoretical viewing angle of up to 90 degrees. Additionally, microsphere-lens (MSL) arrays can be produced inexpensively using ball-placement technology.

3D panel

Intel confirms Skylake upgrade for Core M later this year

Written by admin. Posted in Hardware, Tech News

Intel Xeon E7 Ivy Bridge-EX die (15 core)


Intel’s ultra low-power Core M has been available on the market since the back half of 2014. As the first Broadwell chip, Core M had the twin tasks of improving Intel’s performance in the lowest power segments while simultaneously allowing it to push into smaller form factors and tighter thermal envelopes. The chip achieved both of these goals to some extent, but OEM design decisions have sapped some of the potential out of the CPU. Intel is apparently going to keep pushing the ultramobile form factor front and center — at the Goldman Sachs Technology and Internet conference this week, Intel CEO Brian Krzanich told analysts that the company would launch Core M on Skylake later this year.

Information on the Core M version of the platform refresh is still limited. Krzanich referred to the usual suspects — improved battery life, improved performance — but didn’t give specifics on either front. It’s interesting that Intel’s Skylake predictions have been fairly muted compared to what the company had released for Haswell by this point. This may be a marketing decision — with Broadwell still rolling out, Intel likely doesn’t want to put too much emphasis on its next-gen platform or risk a short-term Osborne effect. Intel’s programming documentation suggests that AVX-512, at least, will only be deployed in Xeon-branded Skylake chips — but since AVX-512 is explicitly designed to focus on HPC workloads, consumers may not mind the lack.

14nm yield trend

Krzanich did note that the Core M Skylake would also receive an upgraded version of Intel’s RealSense 3D camera, and that the platform would support Windows 10, Android, and Google Chrome. He didn’t say whether this Windows 10 support includes full DirectX 12 support. Intel has demonstrated DX12 running on its own hardware, but the state of DX12 support is somewhat fluid — there’s a base level of minimum compatibility required for the spec, and there are advanced secondary areas that GPUs can optionally support. It’s still unclear exactly which chips from which vendors will tick all the checkboxes, and complete support will require a robust driver stack (Intel’s 3D drivers have historically lagged behind its competitors’ in compatibility and overall performance).

If Broadwell’s debut has demonstrated anything, it’s that an improved processor isn’t always sufficient to drive a compelling product. Systems like the Lenovo Yoga 3 Pro drew relatively mediocre reviews because Lenovo chose to push multiple boundaries simultaneously: trim the system weight, cut the battery capacity, and include an ultra-high-resolution display. Together, those choices effectively negated the power gains the new CPU offered.

Intel’s second-generation architecture refreshes on a given process still tend to improve overall power efficiency, so it’s possible we’ll see further gains from Skylake on this front — or more horsepower in the same TDP bands, which amounts to the same thing. Either way, if Intel keeps its schedule, second-generation Core M systems should be on shelves by Christmas.


Canon G7 X Review: Canon’s Best Point-And-Shoot Camera In Years

Written by admin. Posted in etc, Everything Else, Featured, Gossip, Hardware, Lifestyle, Science, Tech, Tech News

Canon G7 X Review: Canon's Best Point-and-Shoot Camera in Years

Five years ago, I was broke, but I still needed a great camera. The Canon S90 was the perfect fit for my needs, and my credit card balance. And I wasn’t the only one who thought so. The amazing S90 and successors made Canon a mint — at least until Sony’s RX100 came along with higher quality images. With the G7 X, Canon is striking back with specs, plus a little bit of the charm that made Canon compacts so easy to love in the first place.

What Is It?

  • Resolution: 20.2 megapixels
  • Sensor Size: 1-inch backlit CMOS
  • Screen: 3-inch LCD
  • Video: Full HD 1080p
  • Lens Mount: N/A (fixed lens)
  • Warranty: 2 Years

Why a 50mm Lens Is Your New Best Friend

Written by admin. Posted in Hardware, Lifestyle, Tech, Tech News


You may have heard the term Nifty Fifty before. If you haven’t, it usually refers to the Canon 50mm f/1.8 lens, but for the purposes of this article I’m going to use it synonymously with any 50mm prime lens.

What’s the best “next” lens to buy?

I’m asked all the time by my students what lens they should buy next after the basic kit lens that came with their camera. I almost always recommend picking up a simple 50mm prime. Let’s look at some reasons why.

Reasons why this lens should be in your bag

How to Get a Strong Wi-Fi Signal in Every Room of Your House

Written by admin. Posted in Hardware, Tech

How to Get a Strong Wi-Fi Signal in Every Room of Your House

If you live in a particularly tall or wide house, or one with a complicated layout, then you might have problems with Wi-Fi dead zones where your high-speed wireless broadband connection just can’t reach. That can seriously hamper your Netflix binge-watching or Spotify streaming. You don’t have to settle for patchy coverage, though; there are several ways to extend the reach of your Wi-Fi.