Can You Get Flash Without Adobe? Meet DivX Plus Web Player

DivX Plus Web Player is the first alternative to Adobe's Flash Player. It promises better performance, battery life, and video quality, all in one shot. If Flash content brings your computer to its knees, read on to find out if DivX is the answer.

A while back, Steve Jobs posted six reasons why Apple wouldn't support Adobe's Flash-based products on the iPad and iPhone. Those reasons ranged from questions about the software's openness, to the fact that it comes from a third party, to its effect on battery life. Also on the list: the performance of mobile devices playing back Flash-based content. We already spent a lot of time digging into the product's effect on system resources in Adobe Flash: A Look At Browsers, Codecs, And System Performance. If Flash can torpedo the performance of a desktop PC (and it can; we've seen it), just imagine what it could do to a smartphone.

Now, no one is debating the merits of streaming video; Apple simply believes the problem lies in how that video is delivered. That's why it is promoting HTML5 for its mobile devices. Ironically, Apple's overwhelming popularity is one of the factors limiting wider adoption on the desktop: most Web sites only enable HTML5 content when they detect the iOS user agent. So, if you're cruising the Internet on a PC, HTML5 support might not be very relevant right now. That leaves a big question mark over the desktop systems (and laptops and netbooks) that still struggle with Flash-based content.
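
Sites typically make this call server-side by sniffing the User-Agent request header. The sketch below is purely illustrative Python (the function name and sample strings are invented for this example), showing the kind of iOS-only gating described above:

```python
# Hypothetical sketch of user-agent gating: many sites in this era served
# HTML5 <video> only to browsers identifying as iOS, and Flash to the rest.
# Function name and UA strings are illustrative, not any site's real code.

def pick_video_delivery(user_agent: str) -> str:
    """Return 'html5' for iOS clients, 'flash' for everyone else."""
    ios_tokens = ("iPhone", "iPad", "iPod")
    if any(token in user_agent for token in ios_tokens):
        return "html5"
    return "flash"

# A desktop browser gets Flash even if it could handle HTML5 video:
desktop_ua = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.30"
ipad_ua = "Mozilla/5.0 (iPad; U; CPU OS 4_3 like Mac OS X) AppleWebKit/533.17.9"

print(pick_video_delivery(desktop_ua))  # flash
print(pick_video_delivery(ipad_ua))     # html5
```

This is exactly why desktop users rarely see HTML5 video even on sites that serve it to iPads.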

This is where DivX steps in. The company is sidestepping a format war altogether by providing a player that supports both HTML5 and Flash-encoded video. For a long time, Adobe's Flash Player was the only game in town able to play back Flash video. But that's in the past. DivX isn't out to trash Flash, but it is out to prove that it can deliver better performance, battery life, and image quality.

CNN requires Adobe's Flash Player

How important is Flash, really? If you enjoy following news sites like CNN, you generally have to install Flash to access all of the content. ComScore estimates that roughly 75% of all online video is Flash-based. For better or for worse, Adobe's Flash format is a fact of today's Web surfing experience. Clearly, DivX has a tall order to fill. Let's see how its solution stacks up.



Curbing Your GPU's Power Use: Is It Worthwhile?

In many cases, the graphics card is the most power-hungry component in a PC. The enthusiast community is no stranger to CPU tweaking, so why hasn't GPU modification caught on? We're going to see just how much you stand to gain (or lose) from tweaking.

Introduction

An increasing number of enthusiasts are becoming aware that their GPUs are the primary consumers of power in their PCs. For some, especially for our readers outside of the U.S., power consumption is an important factor in choosing a graphics card, along with performance, price, and noise levels. At the same time, as we begin to focus more intently on GPU power consumption, it is disconcerting to see the real cost of owning a high-end graphics card. If you read What Do High-End Graphics Cards Cost In Terms Of Electricity?, then you already know what we’re talking about. 
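
The math behind that electricity story is simple enough to sketch. The wattage, daily usage, and rate below are assumed example figures, not measurements from that article:

```python
# Back-of-the-envelope estimate of a graphics card's annual electricity
# cost. All input figures here are illustrative assumptions.

def yearly_energy_cost(watts: float, hours_per_day: float,
                       price_per_kwh: float) -> float:
    """Annual cost in the same currency as price_per_kwh."""
    kwh_per_year = watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# Example: a card drawing 200 W for four hours a day at $0.20/kWh.
print(round(yearly_energy_cost(200, 4, 0.20), 2))  # 58.4
```

Scale the wattage up to a high-end card under load and the yearly bill grows accordingly, which is why power draw matters to buyers paying steep electricity rates.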

Now, you're probably wondering what you can do to help alleviate the issue. Graphics vendors like AMD and Nvidia build in technologies that help cut power use during idle periods, but the only surefire way to slash consumption is to use a mainstream graphics card instead of a high-end model. At the end of the day, simpler cards based on less-complex GPUs require less power than their high-end siblings.

You end up making sacrifices when you give up the displacement of a big graphics engine, though. Most mainstream cards don't offer enough performance to play the latest games at the highest resolutions using the most realistic detail settings. If you want to play games completely maxed out, a high-end card is the only option.

Is there really no alternative to using mainstream graphics cards for the power-conscious? What if there was a way to manually cut down the power consumption of faster graphics cards? These are questions we ask (and try to answer) today.

A Short Overview of GPU Power Management

Although add-in graphics cards for desktop PCs don't curb power consumption as aggressively as discrete notebook GPUs, they still employ power management. In fact, power management technologies for desktop graphics cards have been available for quite a while. They usually manifest themselves as separate clocks for 2D (desktop) mode and 3D mode. Think of these as P-states on modern processors. With the availability of hardware-accelerated video playback, vendors have also added a new mode for video playback.
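
The mode-switching described above can be sketched as a simple state table. This is a toy model, not any vendor's actual driver logic, and the clock figures are invented for illustration:

```python
# Toy model of desktop GPU power states: separate clock profiles for 2D
# (desktop), video playback, and 3D load, akin to CPU P-states.
# All clock values are illustrative, not real product specifications.

GPU_PSTATES = {
    "2d":    {"core_mhz": 157, "mem_mhz": 300},   # idle / desktop work
    "video": {"core_mhz": 400, "mem_mhz": 900},   # accelerated playback
    "3d":    {"core_mhz": 850, "mem_mhz": 1200},  # full 3D load
}

def select_pstate(gpu_load_pct: float, video_decode_active: bool) -> str:
    """Pick a state the way a driver might: heavy load wins, then video."""
    if gpu_load_pct > 50:
        return "3d"
    if video_decode_active:
        return "video"
    return "2d"

print(select_pstate(5, False))   # 2d
print(select_pstate(10, True))   # video
print(select_pstate(95, False))  # 3d
```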

What's missing from most graphics cards is an option to cap power consumption outright, similar to what the “Power saver” preset in the Windows Control Panel does for CPUs: enable it and the processor runs at lower clocks to keep consumption in check. There are new approaches to this problem, though. AMD introduced PowerTune for its Radeon HD 6900-series cards, and we'll look into the effectiveness of that capability later in this piece.

Finding the Right Combination

Lowering operating clocks is one way to reduce power consumption. This is as true for GPUs as it is for CPUs. However, clock speed (core and memory) is only one part of the equation. As with CPUs, the graphics processor’s operating voltage also plays a role.

If we wanted to limit or lower power consumption, couldn't we just manually set clocks and voltages? It’s actually really easy to modify frequencies using vendor-provided utilities and third-party software. And why not? Finding the right combination of clock and voltage can offer significant power savings.
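
The reason voltage matters so much is that dynamic switching power scales roughly with the square of voltage but only linearly with frequency (P ≈ C·V²·f). A quick sketch, using invented example clocks and voltages:

```python
# Relative dynamic power after underclocking and undervolting, from the
# rough rule P = C * V^2 * f (the capacitance term C cancels out).
# The example voltages and clocks below are illustrative assumptions.

def relative_power(v_new: float, v_old: float,
                   f_new: float, f_old: float) -> float:
    """New dynamic power as a fraction of the old."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# Drop a hypothetical card from 850 MHz @ 1.15 V to 700 MHz @ 1.00 V:
print(round(relative_power(1.00, 1.15, 700, 850), 2))  # 0.62
```

So even a modest undervolt compounds with an underclock: this hypothetical tweak cuts dynamic power by nearly 40 percent.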

Altering voltages is a different matter. Most graphics cards don't offer an easy way to adjust voltage. In fact, after certain individuals blew up their GeForce GTX 590s using unrealistic voltage settings, Nvidia locked out voltage manipulation on those cards altogether. It's not clear whether that will apply to just the GTX 590 or to a broader sampling of the company's portfolio, but it demonstrates the harm that can come from too much tinkering.

How about the other cards out there that can still be modified? Unfortunately, voltage adjustments are limited to 3D mode; most cards offer no way to adjust voltages at idle or in intermediate modes.

But today we're going to perform an experiment. We're going to see just how much power we can save by lowering clocks and voltages. In the process, we're going to measure the associated performance hit to gauge whether those changes are worthwhile. We'll be using two high-end cards, AMD's Radeon HD 5870 and 6970, along with a Radeon HD 5770 for comparison.



System Builder Marathon, June 2011: $500 Gaming PC

Here are links to each of the five articles in this month’s System Builder Marathon (we’ll update them as each story is published). And remember, these systems are all being given away at the end of the marathon.

To enter the giveaway, please fill out this Google form, and be sure to read the complete rules before entering!

Day 1: The $2000 Performance PC
Day 2: The $1000 Enthusiast PC
Day 3: The $500 Gaming PC
Day 4: Performance And Value, Dissected
Day 5: Tom's Hand-Picked SuperCombo

In the first quarter of this year, we went a bit over budget on our $500 Gaming PC, squeezing in both a quad-core AMD Phenom II processor and Radeon HD 6850 graphics. The resulting build, which did leave some room for improvements, still packed serious punch for the money we spent.

Additional price drops over the past three months mean we could have taken that same configuration and jumped up to an even more attractive Phenom II X4 955 Black Edition (BE). The 955 BE would have not only given us a nice frequency increase across its four physical cores, but also the flexibility of a fully unlocked CPU multiplier, and even a better cooling solution for overclocking. Of course, overclockability plays heavily into component selection for our System Builder Marathons (SBMs), typically providing the benchmark data set we value the most.

And here's the point where some folks are going to be disappointed, because this lead-up depicts what we need to compete against, and not what we actually built. This month’s $500 gaming rig departs from the norm by centering on a budget-oriented Intel Sandy Bridge-based platform that cannot be overclocked at all, really. It'll either stand or fall based on its out-of-box performance.

Memory: Crucial 4 GB (2 x 2 GB) DDR3-1333 CT2KIT25664BA1339
Graphics: Sapphire 100315L Radeon HD 6850 1 GB
Hard Drive: Seagate Barracuda ST3500413AS 500 GB, SATA 6Gb/s
Case: Xigmatek Asgard II B/S CPC-T45UE-U01
Power: Antec EarthWatts Green EA380D 380 W
Optical: Asus 24X DVD Burner SATA Model DRW-24B1ST/BLK/B/AS


Attempting a Sandy Bridge-based gaming PC was contingent on two self-set stipulations. First, I already broke the bank last round, and had to draw a line somewhere on spending. So, it was imperative to avoid outspending last quarter's PC. Second, an even more demanding suite of 3D titles meant that graphics horsepower couldn't be sacrificed.

The only way to achieve both goals was starting with the cheapest available platform possible: a feature-stripped ASRock H61M-VS microATX motherboard and a Core i3-2100 processor. Apart from a couple of insignificant differences, we ended up with component prices almost exactly the same as this machine's predecessor's. Higher-capacity storage was three dollars cheaper than our previous 320 GB drive, but securing a DVD burner ate up a couple of those bills.

Now, we already know from stories like Don Woligroski's Who's Got Game? Twelve Sub-$200 CPUs Compared that the stock Core i3-2100 is a capable gaming processor. In fact, by avoiding some low-resolution CPU limitations, we should easily be able to set a new bar for the $500 SBM build in terms of frame rates at stock settings. But with 70% of the overall performance evaluation weighted outside of games, will a lack of overclocking become a deal-breaker for this machine, preventing us from reusing Intel's severely limited entry-level parts in future Marathons?
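
That 70/30 weighting is easy to illustrate. The scores below are made-up example numbers, not benchmark results from this build:

```python
# Weighted overall score with 30% of the weight on gaming and 70% on
# everything else, as the evaluation described above is structured.
# Integer percentages keep the arithmetic exact.

def overall_score(gaming: float, non_gaming: float,
                  gaming_weight_pct: int = 30) -> float:
    """Weighted average of gaming and non-gaming scores."""
    return (gaming * gaming_weight_pct
            + non_gaming * (100 - gaming_weight_pct)) / 100

# A build that wins in games (110) but trails at stock elsewhere (90):
print(overall_score(110, 90))  # 96.0
```

With most of the weight sitting outside of games, strong frame rates alone can't carry a build that gives up overclocking headroom everywhere else.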



The Brazos Round-Up: Eight AMD E-350-Based Motherboards

AMD’s Brazos platform, driven by the Zacate APU, offers a lot of performance per watt. It comes up short on features, though. Eight manufacturers try to change that perception by adding slots, controllers, and even overclocking in a couple of cases.

Buying a low-cost, high-efficiency platform is kind of like choosing a cable package. For a few dollars more you can always get a killer feature that you really want. But then it seems like the features that are still missing really aren’t that much more expensive. It's a clever scheme to keep you constantly thinking about the next-best thing just a few dollars up the ladder.

You might face the same sticky decision when it comes time to shop for that inexpensive PC: go for the budget gold, or spend a few extra dollars on a faster processor (and then a few more dollars on a better graphics card, and a bit more on more memory, and...it really never ends). But AMD is hoping that its Brazos platform gives you enough compute horsepower, enough graphics performance, and enough value to squelch that never-ending desire to push just a little bit further. If you want a solid idea of what the platform includes, check out ASRock's E350M1: AMD's Brazos Platform Hits The Desktop First.

Today we have eight different Brazos-based platforms, priced from $115 to $175. If you're dead set on saving money, one of these setups should be able to satisfy you. Now it's time to figure out which tier in this little sub-market serves up the best value.

E350 Fusion Motherboard Features (excerpt): Sapphire Pure Fusion Mini E350
Other Devices: Bluetooth transceiver, 802.11n antennas
PCIe x1/x4: 1x mini PCIe (filled), 1x x4 (open for x16 cards)


Three PCI Express-Based SSDs: When SATA 6 Gb/s Is Too Slow

When it comes time to hunt down the ultimate in storage performance, you simply cannot settle for standard SSDs. Instead, look to PCI Express-based drives that circumvent the limitations of SATA. We have products from Fusion-io, LSI, and OCZ on the bench.

SSD vendors are all in the middle of transitioning from 3 Gb/s to 6 Gb/s interface speeds, effectively doubling the bandwidth available to solid state drives. But if you think that achieving 500 MB/s is fast, think again. Flash-based storage can be made to move data much faster than that once it is no longer limited by Serial ATA. Today we're comparing the latest offerings from Fusion-io, LSI, and OCZ to figure out who makes the fastest solid state drive available today. To do that, we must say goodbye to SATA and hello to PCI Express!
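
The roughly 600 MB/s ceiling of SATA 6 Gb/s falls out of its encoding scheme: the line rate carries 10 bits for every 8 bits of data. A quick sketch of the arithmetic:

```python
# Theoretical SATA payload bandwidth after 8b/10b encoding. Real-world
# throughput is lower still once protocol overhead is accounted for.

def sata_payload_mbps(line_rate_gbps: float) -> float:
    """Payload bandwidth in MB/s for a given SATA line rate."""
    bits_per_second = line_rate_gbps * 1e9
    data_bits = bits_per_second * 8 / 10  # 8b/10b: 8 data bits per 10 sent
    return data_bits / 8 / 1e6            # bits -> bytes -> MB

print(sata_payload_mbps(6.0))  # 600.0 (SATA 6 Gb/s)
print(sata_payload_mbps(3.0))  # 300.0 (SATA 3 Gb/s)
```

PCI Express sidesteps that ceiling entirely, which is the whole premise of the drives in this roundup.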

The idea behind the four products we compare in this article is simple: their creators want to maximize throughput, I/O performance, or both. Cost ends up being secondary on this venture. Fusion-io, LSI Corporation, and OCZ Technology all share a common opinion of the Serial ATA interface. Mainly, it's inadequate for a true high-performance product, as the bandwidth is limited to less than 600 MB/s on SATA 6Gb/s. Therefore, all of the products in this roundup center on PCI Express, which directly attaches flash storage to the fastest available system interface. With that said, this doesn’t mean that SATA cannot be used at all. In fact, both LSI and OCZ employ SATA to connect flash memory to their solutions internally.

The individual approaches to extracting maximum performance differ quite a bit. While LSI and OCZ create cards that employ RAID-based configurations using multiple controllers attached to dedicated NAND flash, Fusion-io is the first and only firm to provide a direct PCI Express storage solution that doesn't utilize an internal storage interface like SATA. Therefore, we decided to put the LSI WarpDrive and OCZ's Ibis up against the ioDrive and the ioXtreme by Fusion-io.

As always, different implementations have their own unique pros and cons. As mentioned, LSI and OCZ rely on conventional RAID and storage controllers to create powerful devices, while Fusion-io created new silicon to minimize the number of interfaces involved. The latter appears to be the more elegant solution, but it's still not bootable. That might not matter much in enterprise environments, where lots of capacity and high performance are used to accelerate I/O-intensive workloads. It is a problem in the enthusiast space, though.

Be that as it may, in the end, we’re interested in understanding how each product is designed and how it works. And what matters most are the benchmark results, right? Let’s look at the ioDrive (160 GB) and ioXtreme (80 GB) by Fusion-io, the LSI WarpDrive Accelerator Card SLP-300 (300 GB), and OCZ's Ibis. The Ibis is technically very similar to the RevoDrive X2 that Chris reviewed in January 2011 (Ed.: And, in fact, I took an early look at the Ibis in OCZ's HSDL: A New Storage Link For Super-Fast SSDs, too).

Before we dive too deeply into this comparison, which may strike some readers as imbalanced based on the pricing of each product, it is important to consider the markets addressed by high-end PCI Express-based SSDs. The solutions sold by Fusion-io and LSI are clearly geared towards an enterprise audience. Their design, components, firmware, support and pricing are totally different from OCZ's Ibis and its more enthusiast/workstation-oriented specifications. The Ibis just happens to interface via PCI Express as well. To make a long story short, please don't take this review as a shootout, but as a look at different concepts and options. We think the conclusion reflects unique considerations for each dissimilar piece of hardware.



System Builder Marathon, June 2011: $2000 Performance PC

Here are links to each of the five articles in this month’s System Builder Marathon (we’ll update them as each story is published). And remember, these systems are all being given away at the end of the marathon.

To enter the giveaway, please fill out this Google form, and be sure to read the complete rules before entering!

Day 1: The $2000 Performance PC
Day 2: The $1000 Enthusiast PC
Day 3: The $500 Gaming PC
Day 4: Performance And Value, Dissected
Day 5: Tom's Hand-Picked SuperCombo

While last year’s systems often focused on the expandability of high-end platforms, those platforms (largely based on Intel's X58 Express chipset) no longer support the fastest CPUs for our applications. Even enthusiasts must live with the reality that most programs can use no more than four CPU cores, and the fastest four-core processors have used decidedly mainstream chipsets since January. Those are the parts on which our most recent $2000 builds have centered.

Intel’s compelling upgrade for its mainstream processor interface turned the market on its head by adding ultra-fast video transcoding that could not be matched even by today’s fastest discrete GPUs, let alone an ultra-expensive six-core CPU. We've already seen what Quick Sync can do. With that said, none of the tests in our System Builder Marathon suite are optimized to exploit it; perhaps that's something we'll change going into next quarter's comparison. The real question is: should we even bother to upgrade?

Of course, the answer is yes. We've shown via extensive testing that Intel's Z68 Express platform loses nothing by way of overclocking or performance compared to the P67-based boards previously included in this series. These motherboards do cost a little more, which means today’s system takes a small hit in the value calculations, especially given that our transcoding-oriented benchmarks are processor-bound, and not able to enjoy the speed-up enabled through Quick Sync. Still, we had to follow our hearts on this one and think of what we’d build if these were our own machines. At the end of the day, Z68's additional functionality is really worthwhile.

Oh yes, and we ditched the often-favored Antec Three Hundred Illusion for something with a little more flash and twice the cost. With the same CPU, GPUs, and SSDs carried over from our most recent $2000 build, we're leaning hard on our partner Newegg's recent price drops to retain the high-end value score of our former build.

Motherboard: ASRock Z68 Extreme4: LGA 1155, Intel Z68 Express
Graphics: 2x MSI R6970-2PM2D2GD5: Radeon HD 6970 2 GB, CrossFire
Processor: Intel Core i7-2600K: 3.4 GHz to 3.8 GHz, 8 MB Cache
Memory: G.Skill F3-12800CL8D-8GBXM: DDR3-1600 C8, 4 GB x2 (8 GB)
System Drive: 2x A-Data S599 64 GB, SATA 3Gb/s SSD
Storage Drive: Samsung F3 HD103SJ 1 TB, 7200 RPM HDD
Optical: LG WH10LS30 BD-RE: 10x BD-R, 16x DVD±R
Power: Seasonic SS-850HT: 850 W, ATX12V v2.31, 80 PLUS Silver


Yet, value can't be completely quantified by our scoring system. For example, how would a real-world user feel about having only 128 GB of storage? To address that, we added a 1 TB drive, along with a BD-RE for backups. This month's disc burner is also cheaper than that of our previous build, which helps to offset the cost of that gorgeous case. Many of our readers have, after all, made it clear that one of the things they value is the look on their friends' faces when they show off their latest creation!



AMD A8-3850 Review: Llano Rocks Entry-Level Desktops

Earlier this month we previewed AMD's Llano architecture in a notebook environment. Now we have the desktop version with a 100 W TDP. How much additional performance can the company procure with a loftier thermal ceiling and higher clocks?

Editor's Note: As we've done so many times before, we're partnering up with CyberPower to give away one of the first Llano-based desktop machines, which the builder calls its Gamer Ultra, to one of our readers. Flip through our review and, on the last page, enter to win a brand new PC, compliments of CyberPower!

A8-3850 Makes Its Desktop Debut

Don Woligroski did an absolutely killer job on our first look at AMD’s Llano architecture. If you haven’t yet read that story and you want to know more about the plumbing inside the company’s first mainstream APU, you really owe it to yourself to check out The AMD A8-3500M APU Review: Llano Is Unleashed before diving into this piece.

Because Don covered the underlying architecture so well, I’m going to use our first experiences with AMD’s Llano-based desktop platform, code-named Lynx, to dive deeper into the stuff I know you guys love: benchmark results and analysis. What kind of performance can you expect out of Dual Graphics? How does Sandy Bridge with discrete graphics compare? What effect does memory performance have on gaming frame rates? How does integrated USB 3.0 support measure up to some of the add-on controllers we’ve seen? I’ll answer all of that.

But first let’s go over the basics of AMD’s first desktop-class Llano-based APUs.

Llano: The Recession-Friendly APU

Oh, Audi would be so proud (or maybe not, given the entry-level pedigree of these processors). AMD is using the same A8 and A6 designators to distinguish between the perceived performance levels of its four launch SKUs.

There are two A8s and two A6s. The Llano-based flagship is A8-3850, a 100 W part with Radeon HD 6550D graphics, four execution cores with 1 MB L2 cache each, and a 2.9 GHz clock rate. That part does not offer Turbo Core support—the only way to get it running faster than 2.9 GHz is through overclocking. AMD says to expect pricing around $135.

The A6-3650 is also rated at 100 W, even though it's armed with Radeon HD 6530D graphics and a more conservative 2.6 GHz clock rate (again, Turbo Core isn't available). The -3650 boasts four cores as well, includes the same 4 MB of L2 cache, and supports DDR3-1866 data rates, just like the other three models. That one is expected to run $115.

Model      GPU              TDP    Cores  Base CPU Clock  Max. Turbo  L2 Cache  Shaders  Turbo Core
A8-3850    Radeon HD 6550D  100 W  4      2.9 GHz         -           4 MB      400      No
A6-3650    Radeon HD 6530D  100 W  4      2.6 GHz         -           4 MB      320      No
A8-3800    Radeon HD 6550D  65 W   4      2.4 GHz         2.7 GHz     4 MB      400      Yes
A6-3600    Radeon HD 6530D  65 W   4      2.1 GHz         2.4 GHz     4 MB      320      Yes


Interestingly, dipping down to the 65 W level doesn’t seem to sacrifice much in the way of functionality. AMD’s A8-3800 includes the capable Radeon HD 6550D engine, quad-core Stars architecture, and 4 MB L2 repository. However, Turbo Core helps compensate for a fairly severe drop to 2.4 GHz, kicking frequency up to 2.7 GHz in situations where thermal headroom allows for it. Unfortunately, AMD didn’t send over any Turbo Core-equipped processors for testing, so it’s impossible to gauge how much time this four-core part spends at its elevated setting.

Finally, the A6-3600 is also a 65 W component. It scales way back, though, giving up not only processor clock rate (its four cores run at 2.1 GHz by default and up to 2.4 GHz with Turbo Core), but also graphics performance via the less-complex Radeon HD 6530D engine. It still includes 4 MB of L2 cache, however, complementing each core with 1 MB.

What's the difference, exactly, between the Radeon HD 6550D and Radeon HD 6530D GPU engines? One SIMD engine, for the most part.

Graphics Processor Classification
Radeon HD 6550D (A8-Series APUs): 5 SIMD engines, 400 shaders, 20 texture units
Radeon HD 6530D (A6-Series APUs): 4 SIMD engines, 320 shaders, 16 texture units


When you look at a block diagram of Llano's GPU component, it's easy to see how AMD differentiates these two lineups. Each SIMD hosts 80 ALUs and is associated with four texture units. Turning one SIMD off yields the 320 shaders and 16 texture units offered by Radeon HD 6530D.
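
The shader counts follow directly from that SIMD arithmetic. A small sketch (the per-SIMD figures come straight from the paragraph above):

```python
# Llano GPU configurations from per-SIMD resources: each SIMD engine
# carries 80 ALUs (shaders) and four texture units.

ALUS_PER_SIMD = 80
TEX_PER_SIMD = 4

def gpu_config(simd_count: int) -> dict:
    """Shader and texture-unit totals for a given SIMD count."""
    return {"shaders": simd_count * ALUS_PER_SIMD,
            "texture_units": simd_count * TEX_PER_SIMD}

print(gpu_config(5))  # Radeon HD 6550D: {'shaders': 400, 'texture_units': 20}
print(gpu_config(4))  # Radeon HD 6530D: {'shaders': 320, 'texture_units': 16}
```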

A Small Launch Gets Smaller

With four SKUs in the initial Llano-based desktop portfolio, a zero-hour revelation that the 65 W A8-3800 and A6-3600 won't be available until an undisclosed date narrows the family down to two models: A8-3850 and A6-3650, both 100 W parts. As a result, it won't be possible to test Turbo Core functionality on Llano until AMD addresses the availability of its lower-power offerings.

