Monday, September 12, 2011

Seagate hard disk 4TB external

I'm off to IDF this week while Ryan and Brian cover Microsoft's BUILD conference, so expect lots of CPU and Windows 8 news in the coming days. Just before I left, however, Seagate sent me a review sample of its recently announced GoFlex Desk 4TB drive. Eager to find out if anything had changed since I reviewed last year's 3TB model, I dove right into testing.
The GoFlex connector standard

Seagate's GoFlex Desk is a line of external 3.5" hard drives with interchangeable GoFlex Desk docks. Internally, all GoFlex Desk drives use a standard Seagate 3.5" SATA hard drive; it's the GoFlex Desk dock that converts SATA into USB 3.0, USB 2.0 or FireWire 800. Since 3.5" drives require more power than a single USB port can deliver, Seagate's GoFlex Desk requires an external power adapter, which comes with the drive.

Although the SATA power and data connectors on the GoFlex Desk are in a standardized location, to date all implementations of Seagate's GoFlex spec have been designed for 2.5" drives. As a result, the only real advantage to this being a GoFlex drive is that you can swap out docks to get support for different interfaces.

By default the GoFlex Desk bundle comes with a USB 3.0 dock that's obviously backwards compatible with USB 2.0 ports. Seagate offers an optional USB 2.0/FireWire 800 dock, presumably for Mac users with FireWire 800 ports. The dock features five LEDs: one for power, while the other four indicate capacity used in 25% increments.
Seagate sent me the standard 4TB USB 3.0 bundle; with it you get the drive, power adapter and a USB 3.0 cable. The drive comes preloaded with Seagate's Dashboard as well as Memeo Instant Backup. Seagate will part with a 4TB GoFlex Desk bundle for $249.

As I mentioned in our initial post on the 4TB GoFlex Desk, Seagate uses a 5-platter, 7200RPM, 3.5" 6Gbps SATA Barracuda hard drive inside the GoFlex Desk. At 4TB that works out to 800GB per platter.

Hard drive capacities are specified in base 10, where 1TB = 1 trillion bytes. That works out to 3725GiB of storage on the GoFlex Desk 4TB. We've addressed the issues with hard drives larger than 2TB in previous articles; the same discussion applies here.
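For reference, here's a minimal sketch of that base-10 versus base-2 math (Python, purely for illustration; the figures are the ones quoted above):

# Decimal "marketing" terabytes vs. the binary GiB an OS reports,
# plus the per-platter figure, using the numbers from the text above.
ADVERTISED_TB = 4
bytes_total = ADVERTISED_TB * 10**12      # 1TB = 1 trillion bytes
print(round(bytes_total / 2**30))         # ~3725 GiB when reported in binary units
print(ADVERTISED_TB * 1000 // 5)          # 800 GB per platter across 5 platters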
A Redesigned Chassis

Despite retaining the name, the 4TB GoFlex Desk introduces a new external enclosure. It's sleeker and more angular than last year's, but more importantly it has better cooling properties. For now it looks like you can only get the new chassis if you buy the 4TB drive; the smaller capacities still ship with the old chassis.

Seagate GoFlex Desk 3TB 2010 (left) vs. GoFlex Desk 4TB 2011 (right)

In our review of the 3TB GoFlex Desk we found that under hours of continued use the drive got quite warm: up to 69C. The high temperatures resulted from two things: the GoFlex Desk enclosure had very little ventilation, and the 5-platter 7200RPM drive inside put out a lot of heat. With the move to 4TB Seagate stuck with a 5-platter 7200RPM design, but gave the enclosure more holes for ventilation:

Seagate GoFlex Desk 3TB 2010 (left) vs. GoFlex Desk 4TB 2011 (right)
The top and back of the new GoFlex Desk are vented to bring down drive temperatures. The old design had dents that looked like holes, but they were simply there to give the plastic texture; they weren't functional. Western Digital's My Book Essential is still better ventilated, but this is definitely a step in the right direction.

Hooray, vents!

The new chassis definitely keeps temperatures cooler for longer under light usage; however, if you're copying a lot of data to the GoFlex Desk, temperatures will climb. After one hour of sequential writes over USB 3.0 I measured a drive temperature of 63C. In just under two hours the drive got up to 67C, a bit lower than last year's model but still troubling. The good news is that, unlike last year's model, the drive will continue to operate at full performance in this state. When testing the 3TB version last year we found that sequential write speeds dropped to 50MB/s when the temperature got into the upper 60s.

Granted, that's after copying nearly 1TB of data without pause, so you shouldn't see these numbers other than the very first time you copy all of your data to the drive. During normal use, and even when moving around a couple hundred GB of data, the 4TB GoFlex Desk kept to 51C and below. I'm happy to see that Seagate redesigned the chassis, but I'd still feel more comfortable with even more ventilation or at least a cooler running drive inside.

The internal Barracuda plus USB 3.0 dock consumes 11.9W at idle and 13.7W under load. The drive whine is audible when the drive is on, but it's not overly loud. If you're using anything other than a very quiet notebook you'll likely not be too bothered by the drive.

Monday, September 5, 2011

2nd Generation Intel® Core™ i7 Extreme Processor Specifications

Processor Number | Cache | Clock Speed | Cores / Threads | Max TDP (W) | Memory Types | Graphics
Intel® Core™ i7-2920XM Extreme Edition | 8.0 MB | 2.50 GHz | 4/8 | 55 | DDR3-1066/1333/1600 | Intel® HD Graphics 3000

2nd Generation Intel® Core™ i3 Processor

Processor Number | Cache | Clock Speed | Cores / Threads | Max TDP (W) | Memory Types | Graphics
Intel® Core™ i3-2330E | 3.0 MB | 2.20 GHz | 2/4 | 35 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i3-2330M | 3.0 MB | 2.20 GHz | 2/4 | 35 | DDR3-1066/1333 | No
Intel® Core™ i3-2357M | 3.0 MB | 1.30 GHz | 2/4 | 17 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i3-2340UE | 3.0 MB | 1.30 GHz | 2/4 | 17 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i3-2100 | 3.0 MB | 3.10 GHz | 2/4 | 65 | DDR3-1066/1333 | Intel® HD Graphics 2000
Intel® Core™ i3-2100T | 3.0 MB | 2.50 GHz | 2/4 | 35 | DDR3-1066/1333 | Intel® HD Graphics 2000
Intel® Core™ i3-2120 | 3.0 MB | 3.30 GHz | 2/4 | 65 | DDR3-1066/1333 | Intel® HD Graphics 2000
Intel® Core™ i3-2310M | 3.0 MB | 2.10 GHz | 2/4 | 35 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i3-2102 | 3.0 MB | 3.10 GHz | 2/4 | 65 | DDR3-1066/1333 | Intel® HD Graphics 2000
Intel® Core™ i3-2312M | 3.0 MB | 2.10 GHz | 2/4 | 35 | DDR3-1066/1333 | Processor Graphics
Intel® Core™ i3-2310E | 3.0 MB | 2.10 GHz | 2/4 | 35 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i3-2105 | 3.0 MB | 3.10 GHz | 2/4 | 65 | DDR3-1066/1333 | Intel® HD Graphics 3000

Intel Core i5 Processor Specifications

Processor Number | Cache | Clock Speed | Cores / Threads | Max TDP (W) | Memory Types | Graphics
Intel® Core™ i5-2557M | 3.0 MB | 1.70 GHz | 2/4 | 17 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i5-2467M | 3.0 MB | 1.60 GHz | 2/4 | 17 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i5-2540M | 3.0 MB | 2.60 GHz | 2/4 | 35 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i5-2520M | 3.0 MB | 2.50 GHz | 2/4 | 35 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i5-2390T | 3.0 MB | 2.70 GHz | 2/4 | 35 | DDR3-1066/1333 | Intel® HD Graphics 2000
Intel® Core™ i5-2510E | 3.0 MB | 2.50 GHz | 2/4 | 35 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i5-2515E | 3.0 MB | 2.50 GHz | 2/4 | 35 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i5-2410M | 3.0 MB | 2.30 GHz | 2/4 | 35 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i5-2537M | 3.0 MB | 1.40 GHz | 2/4 | 17 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i5-2300 | 6.0 MB | 2.80 GHz | 4/4 | 95 | DDR3-1066/1333 | Intel® HD Graphics 2000
Intel® Core™ i5-2400 | 6.0 MB | 3.10 GHz | 4/4 | 95 | DDR3-1066/1333 | Intel® HD Graphics 2000
Intel® Core™ i5-2400S | 6.0 MB | 2.50 GHz | 4/4 | 65 | DDR3-1066/1333 | Intel® HD Graphics 2000
Intel® Core™ i5-2500 | 6.0 MB | 3.30 GHz | 4/4 | 95 | DDR3-1066/1333 | Intel® HD Graphics 2000
Intel® Core™ i5-2500K | 6.0 MB | 3.30 GHz | 4/4 | 95 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i5-2500S | 6.0 MB | 2.70 GHz | 4/4 | 65 | DDR3-1066/1333 | Intel® HD Graphics 2000
Intel® Core™ i5-2500T | 6.0 MB | 2.30 GHz | 4/4 | 45 | DDR3-1066/1333 | Intel® HD Graphics 2000
Intel® Core™ i5-2310 | 6.0 MB | 2.90 GHz | 4/4 | 95 | DDR3-1066/1333 | Intel® HD Graphics 2000
Intel® Core™ i5-2405S | 6.0 MB | 2.50 GHz | 4/4 | 65 | DDR3-1066/1333 | Intel® HD Graphics 3000

2nd Generation Intel® Core™ i7 Processor Specifications

Processor Number | Cache | Clock Speed | Cores / Threads | Max TDP (W) | Memory Types | Graphics
Intel® Core™ i7-2677M | 4.0 MB | 1.80 GHz | 2/4 | 17 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i7-2637M | 4.0 MB | 1.70 GHz | 2/4 | 17 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i7-2655LE | 4.0 MB | 2.20 GHz | 2/4 | 25 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i7-2610UE | 4.0 MB | 1.50 GHz | 2/4 | 17 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i7-2620M | 4.0 MB | 2.70 GHz | 2/4 | 35 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i7-2629M | 4.0 MB | 2.10 GHz | 2/4 | 25 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i7-2649M | 4.0 MB | 2.30 GHz | 2/4 | 25 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i7-2657M | 4.0 MB | 1.60 GHz | 2/4 | 17 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i7-2617M | 4.0 MB | 1.50 GHz | 2/4 | 17 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i7-2720QM | 6.0 MB | 2.20 GHz | 4/8 | 45 | DDR3-1066/1333/1600 | Intel® HD Graphics 3000
Intel® Core™ i7-2600 | 8.0 MB | 3.40 GHz | 4/8 | 95 | DDR3-1066/1333 | Intel® HD Graphics 2000
Intel® Core™ i7-2600K | 8.0 MB | 3.40 GHz | 4/8 | 95 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i7-2600S | 8.0 MB | 2.80 GHz | 4/8 | 65 | DDR3-1066/1333 | Intel® HD Graphics 2000
Intel® Core™ i7-2630QM | 6.0 MB | 2.00 GHz | 4/8 | 45 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i7-2820QM | 8.0 MB | 2.30 GHz | 4/8 | 45 | DDR3-1066/1333/1600 | Intel® HD Graphics 3000
Intel® Core™ i7-2635QM | 6.0 MB | 2.00 GHz | 4/8 | 45 | DDR3-1066/1333 | Intel® HD Graphics 3000
Intel® Core™ i7-2710QE | 6.0 MB | 2.10 GHz | 4/8 | 45 | DDR3-1066/1333/1600 | Intel® HD Graphics 3000
Intel® Core™ i7-2715QE | 6.0 MB | 2.10 GHz | 4/8 | 45 | DDR3-1066/1333/1600 | Intel® HD Graphics 3000

AMD Radeon HD 6990 And Nvidia GeForce GTX 590


In March of 2011, both AMD and Nvidia showed us their ultra high-end dual-GPU flagships. The Radeon HD 6990 has a combined total of 3072 ALUs, an 830 MHz core clock, and dual 2 GB banks of GDDR5 memory at 5 GT/s. The GeForce GTX 590 features a total of 1024 CUDA cores, a 607 MHz core clock, an independent 1215 MHz shader clock, and two banks of RAM at 3412 MT/s. These cards trade blows depending on the application and settings, and were released at darn close to the same time. Some folks (like our editor-in-chief Chris Angelini) might argue that the GeForce GTX 590 should win based on the Radeon HD 6990's overly loud cooling fan (Ed.: Actually, I'd skip both cards, personally), but we'll keep raw performance as the main criterion for this list and call this one a tie. Unfortunately, these cards quickly sold out, and we're seeing dismal availability of both flagships.
At this point, it's up to Nvidia or AMD to change the status quo with their next-generation graphics chipsets, and as always, we're looking forward to reporting what the future brings. We hope you enjoyed our trip through graphics card history as much as we did; it stirred up a lot of good memories.

Best cpu for gaming/games 2011

The i5-2500K is the best bang for your buck by far. You will not see an improvement in performance with the i7-2600K unless you're going to utilize Hyper-Threading. So I would recommend the i5-2500K.


Best per budget, lowest first:
Athlon X3
Pentium Gxxx series
i3-2100
i5-2500
I can't see anyone disagreeing with this list.

Yahoo.com: a highly specialised site

Yahoo provides many free services: search, Yahoo Mail, maps, yellow pages, games, shopping, jobs, movies, news, finance, sports, Messenger and much more.

It's not as big as Google, but it is competing with Google.

Google.com: the Best and the Good

Google is undoubtedly the world's No. 1 search engine. Besides search, Google also provides many other products like Gmail, AdWords, Google Earth, Voice, Talk, Groups, Sitemaps and Webmaster Tools.

Google is one of the leading companies and has huge demand in the market.

Sunday, September 4, 2011

AMD Raises the Mobile Performance Bar with Radeon HD 6990M

The battle for graphics supremacy has been going on for well over a decade now, with several casualties of war along the way (RIP 3dfx, Trident, S3, etc.). The primary competitors continue to be NVIDIA and AMD, and with NVIDIA having recently reclaimed the single GPU performance crown on both desktops and laptops with their GTX 580/580M, it’s time for AMD to respond. We’re not presenting any details for next generation desktop parts at present, and in fact the HD 6990M isn’t much of a surprise, but either way AMD is ready to release the details of their next mobile GPU.

We recently covered the mobile GPU landscape, with a discussion of the various performance levels and price segments. The price/performance ratio is actually pretty similar between AMD and NVIDIA mobile GPUs (at least until we hit the top-tier models), and both have a decent number of design wins with notebook ODMs. The current mobile performance crown goes to NVIDIA’s recently launched GTX 580M, but along with the performance crown comes a hefty bill that needs to be paid. AMD’s top mobile part prior to today’s announcement is the 6970M, which is basically a lower clocked version of the desktop Barts core with some of the Stream processors disabled (essentially a mobile HD 6850). Our testing has shown the 6970M to offer just slightly less performance on average compared to the GTX 485M, but interestingly enough NVIDIA managed to use less power at low/idle loads than AMD. Of course, even the HD 6970M is a trimmed Barts core, and there’s still the desktop 6950/6970 Cayman core that has yet to see a mobile variant, which brings us to today’s announcement.

If you were hoping to see a truly crazy mobile GPU running off the Cayman architecture, we’re unfortunately not getting that. Unlike the desktop 6990, we’re also not talking about a dual-GPU, single-card solution. Instead, the HD 6990M will be a full Barts core, with all 1120 shaders enabled. (The closest desktop equivalent is the HD 6870, which comes clocked at 900MHz, 25% higher than the 6990M.) Besides the now-standard DX11 support that AMD has been shipping since the first HD 5000 parts, the 6990M also includes HD3D (stereoscopic 3D), OpenCL 1.1, and DirectCompute 11 support. AMD groups many of these features under the umbrella of "AMD App Acceleration", though there's technically nothing new here, as all the 5000M and 6000M DX11 parts use the same drivers and support nearly the same features.

Looking at the mobile parts, the shader count gives the 6990M an immediate 17% boost in performance relative to the 6970M, and with a slightly higher core clock as well (715MHz on the 6990M vs. 680MHz on the 6970M), we’re looking at up to 23% higher performance than the 6970M. Both the 6970M and 6990M continue to feature 3600MHz GDDR5 memory, although the 6990M comes with 2GB instead of 1GB. AMD also enabled OverDrive up to 740MHz for the 6990M if you want to try some quick overclocking. Here’s how performance between the AMD parts stacks up, according to AMD’s internal testing (using a desktop 3.4GHz Phenom II CPU):
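The arithmetic behind those claimed gains is easy to sanity check; here's a minimal sketch using only the shader counts and clocks quoted above (illustrative only, not AMD's test methodology):

# Theoretical throughput scaling of the 6990M over the 6970M.
shaders_new, shaders_old = 1120, 960
clock_new, clock_old = 715, 680   # core clocks in MHz
print(shaders_new / shaders_old - 1)                               # ~0.17 from the extra shaders alone
print((shaders_new * clock_new) / (shaders_old * clock_old) - 1)   # ~0.23 once the clock bump is included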



On the other side of the fence, NVIDIA’s GTX 580M has the same number of CUDA cores as the GTX 485M (384 cores), but with an 8% increase in clock speed. (The closest desktop equivalent is the GTX 560 Ti, which comes clocked 37% higher than the 580M.) Our earlier testing of the 485M and 6970M resulted in nearly identical average gaming performance across eight tested games, with both sides winning a few titles. In theory, then, the HD 6990M should retake the mobile performance crown given the greater increase in compute and clock speeds relative to the second-tier parts. The following slide uses simulated performance (e.g. a downclocked desktop GTX 560 Ti GPU running at mobile speeds and with only 1GB GDDR5, again with a 3.4GHz Phenom II CPU), so take these results with a grain of salt:



Both the AMD and NVIDIA parts should be plenty fast for 1080p mobile gaming, so the real question is more likely to be who offers the best overall value. Sure, value in a gaming notebook is something of an oxymoron, but unless you absolutely need CUDA/PhysX support on the NVIDIA side or are looking at Bitcoin mining on the AMD side, performance is going to be close enough that pricing will sway the vote. Availability of the 6990M starts today, with the Alienware M18x coming in both single and CrossFire configurations. Clevo will also support the HD 6990M in their P170HM, P150HM, and X7200 notebooks, which means we’ll see whitebooks from the usual suspects like Eurocom, AVADirect, and others. Here's AMD's complete high-end mobile GPU lineup:

AMD Mobility Radeon 6800M and 6900M Lineup

Specification | Radeon HD 6990M | Radeon HD 6970M | Radeon HD 6950M | Radeon HD 6870M | Radeon HD 6850M | Radeon HD 6830M
Model Name (Code Name) | Blackcomb Pro (Barts) | Blackcomb Pro (Barts) | Blackcomb Pro (Barts) | Granville Pro (Juniper) | Granville Pro (Juniper) | Granville Pro (Juniper)
Stream Processors | 1120 | 960 | 960 | 800 | 800 | 800
Texture Units | 56 | 48 | 48 | 40 | 40 | 40
ROPs | 32 | 32 | 32 | 16 | 16 | 16
Core Clock | 715MHz | 680MHz | 580MHz | 675MHz | 625MHz | 575MHz
Memory Clock | 900MHz (3.6GHz) GDDR5 | 900MHz (3.6GHz) GDDR5 | 900MHz (3.6GHz) GDDR5 | 1000MHz (4.0GHz) GDDR5 | 1000MHz (4.0GHz) GDDR5 | 900MHz (3.6GHz) GDDR5
Memory Bus Width | 256-bit | 256-bit | 256-bit | 128-bit | 128-bit | 128-bit
Memory Bandwidth | 115.2GB/s | 115.2GB/s | 115.2GB/s | 64GB/s | 64GB/s | 57.6GB/s
VRAM | 2GB | 1GB | 1GB | 1GB | 1GB | 1GB
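As a quick sanity check on the bandwidth row, GDDR5 bandwidth is simply the effective data rate multiplied by the bus width in bytes; a small sketch using the table's figures:

# GDDR5 bandwidth = effective rate (MT/s) x bus width (bytes)
def bandwidth_gbs(effective_mts, bus_bits):
    return effective_mts * 1e6 * (bus_bits // 8) / 1e9

print(bandwidth_gbs(3600, 256))  # 6990M/6970M/6950M: 115.2 GB/s
print(bandwidth_gbs(4000, 128))  # 6870M/6850M: 64.0 GB/s
print(bandwidth_gbs(3600, 128))  # 6830M: 57.6 GB/s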


As we've noted in the past, the 6000M series consists of parts from both the Evergreen and Northern Islands generations of graphics chips. For many users the difference between the two isn't all that important, but Northern Islands does upgrade the video engine to UVD3, whereas Evergreen uses UVD2.2. Also worth remembering is that the 6800M parts are really just renamed 5800M parts with slightly altered clocks in some cases, so they're not as attractive as the 6900M parts. Finally, the 6800M parts can come with either GDDR5 or DDR3, the latter being significantly slower and thus less desirable. Our table only uses the specs from GDDR5 variants, so if you're shopping for a 6800M make sure you get a GDDR5 model.

Outside of their newest mobile GPU, we also asked AMD about the current state of their switchable graphics on Intel platforms. AMD says they should have some partners releasing laptops with application based switching (e.g. similar to NVIDIA’s Optimus), but that will likely be with lower performance GPUs. In contrast, NVIDIA is touting Optimus support on certain GTX 580M configurations, though as always it’s up to the notebook vendors to utilize the feature. We haven’t had a chance to get hands on time with any form of AMD switchable graphics for some time, so the jury is still out. We hope to have an appropriate laptop for testing in the not-too-distant future, at which time we’ll be able to provide a better answer on which solution is the overall winner.

As for the question of who actually takes home the mobile gaming performance crown, we hope to have both GTX 580M and HD 6990M notebooks for testing in the coming weeks. On paper and using our previous 6970M and GTX 485M results, it looks like the 6990M should come out on top, but with various driver updates in the past several months we’re not ready to declare an official winner. If you’re looking for more than a few slides and potentially biased game selections, stay tuned: we’ll provide our usual in-depth look at real-world performance as soon as we can get hardware into our labs. Our money is still going to be on whoever can come in at a lower price point, and if recent history is any indication, that will likely be AMD with the 6990M. Update: Alienware now has both the GTX 580M SLI and HD 6990M CrossFire configurations available on their web site; at present, the SLI setup costs $700 more than the CrossFire configuration, which makes the 6990M an easy recommendation.

The HP TouchPad Review: webOS on the Big Screen






If this were a race of numbers, Apple would have already won. It isn't. The iPad 2, as successful as it is, isn't perfect. There's tons of room for innovation and we're seeing its competitors offer clear examples of that innovation. As with any market, the lower your market share the more likely you are as a company to take risks. After all, you've got nothing to lose. It's in breaking the mold and taking these risks that great ideas are often born.

For HP there wasn't much of a risk to take with their first entry into the new tablet market, thanks to Palm's risk taking three years ago. For those who have used a webOS phone in the past, the OS needed very little functional improvement. It was just a matter of needing better hardware, squashing bugs and improving performance. The fundamentals were sound.

In fact, I'm still surprised that no one has managed to really copy the things that made webOS so great given how much time has passed since the Palm Pre first went on sale. Even today with multitasking improvements in Honeycomb and iOS, it's still easier to launch, exit and switch between apps on webOS.



Has HP been able to give webOS the rest of the ingredients it needs to succeed? On the hardware side I think that's definitely the case. The new webOS family is powered by the latest and greatest from Qualcomm. Fast single core SoCs in the phones and Qualcomm's fastest dual-core SoC in the tablet. It's the software that remains webOS' blessing and curse. The functionality is there and remains unrivaled in many ways, but the platform is still buggy and is at times seriously limited in the performance department.

We'll get to all of that throughout the course of this review but first let's meet the TouchPad.



The hardware itself is pretty. The TouchPad is made entirely of glossy black plastic around the back, with a 9.7-inch glass touchscreen on the front. The edges are all curved, making the tablet easy to hold. While the glossy black plastic looks elegant at first, it shows fingerprints just like an old iPod.



The TouchPad is thicker than the iPad 2 or Galaxy Tab 10.1, but it's not the thickness that bothers me. The TouchPad is the heaviest tablet we've reviewed. At 730g it's over 20% heavier than the iPad 2 and the weight is noticeable. If you're using it on its dock or on your lap the weight isn't a problem, but holding it up for long periods of time can be fatiguing.

Build quality is good but not great. I detected a little bit of movement in the chassis if I tried to flex the TouchPad slightly. The micro USB connector at the bottom isn't perfectly lined up with the cutout in the chassis either, requiring me to insert its USB cable at an angle. The volume rocker on the right side of the unit wiggles a bit in place. All of these are minor complaints in the grand scheme of things but they're worth pointing out.



There's a microphone up top as well as your standard power/lock button and 1/8" headset/mic jack. A physical home button is in the usual place with a built in white LED notification indicator.

The TouchPad has two speakers along its left side:



Like its competitors the TouchPad has a built-in accelerometer and gyroscope to detect rotation and movement along multiple axes. You can orient the TouchPad in all four directions and the OS will rotate accordingly. The accelerometer in the TouchPad is extremely sensitive, often rotating the screen for very slight movements of the tablet itself. While this sounds like a good thing, in practice it's not. The TouchPad usually rotated when I didn't want it to and then seemed to lose its sensitivity when I tried to rotate it back. The problem here is likely in software.

2011 Tablet Comparison
Specification | Apple iPad 2 | ASUS Eee Pad Transformer | HP TouchPad | Samsung Galaxy Tab 10.1
SoC | Apple A5 (Dual ARM Cortex A9 @ 1GHz) | NVIDIA Tegra 2 (Dual ARM Cortex A9 @ 1GHz) | Qualcomm APQ8660 (Dual Scorpion @ 1.2GHz) | NVIDIA Tegra 2 (Dual ARM Cortex A9 @ 1GHz)
GPU | PowerVR SGX 543MP2 | NVIDIA GeForce | Adreno 220 | NVIDIA GeForce
RAM | 512MB | 1GB | 1GB | 1GB
Display | 9.7-inch 1024 x 768 IPS | 10.1-inch 1280 x 800 IPS | 9.7-inch 1024 x 768 IPS | 10.1-inch 1280 x 800 PLS
NAND | 16GB - 64GB | 16GB - 32GB | 16GB - 32GB | 16GB - 32GB
Dimensions | 241.2mm x 185.7mm x 8.8mm | 271mm x 175mm x 12.95mm | 240mm x 190mm x 13.7mm | 256.6mm x 172.9mm x 8.6mm
Weight | 601g | 695g | 730g | 565g
Price | $499 | $399 | $499 | $499


There is no support for external storage, and HP offers 16GB and 32GB versions at $499 and $599 respectively. Both are WiFi models, although AT&T has already announced an HSPA+ version for use on its network.

LG N2A2 NAS Review

The consumer Network Attached Storage (NAS) market has seen tremendous growth over the past few years. As connected homes become more ubiquitous, the need for centralised storage has become very important. On one hand, we have full blown NAS appliances like the ones from Synology and QNAP. They are aimed at the SMB market, but also serve home consumers well. On the other hand, we have appliances targeted specifically towards home consumers. D-Link and Netgear are some of the more famous companies catering to this market. LG is also trying to cater to this market with some NAS offerings aimed at the home consumer.

LG differentiates itself from the rest of the competitors' units by offering DVD and Blu-ray rewriters as part of the NAS unit. However, the unit we are looking at today is the plain vanilla offering.

The LG N2A2DD2 2TB (2x1TB) comes with two preinstalled hard disks. The disks in the N2A2 can't be replaced by the user. The diskless version is the N2R1D (based on a previous generation Marvell chipset), and it is much cheaper than the N2A2. We will come back to the comparisons with the N2R1D towards the end of the review, but first, let us look at the specs.



The full brochure from the LG website can be found here. In the next section we will take a look at the package contents.

Sandy Bridge Memory Scaling: Choosing the Best DDR3


Intel's Second Generation Core processors, based on the Sandy Bridge architecture, include a number of improvements over the previous generation's Nehalem architecture. We’ll be testing one specific area today: the improved memory controller. Current Sandy Bridge based processors officially support up to DDR3-1333 memory. Unfortunately, due to changes in the architecture, using faster rated memory (or overclocking memory) on Sandy Bridge via raising the base clock is extremely limited. Luckily, there are additional memory multipliers that support DDR3-1600, DDR3-1866, and DDR3-2133 memory. Some motherboards include support for even higher memory multipliers, but we’ll confine our investigations to DDR3-2133 and below.

Since Sandy Bridge is rated for up to DDR3-1333 memory, we will start there and work our way up to DDR3-2133 memory. We'll also be testing a variety of common CAS latency options for these memory speeds. Our purpose is to show how higher bandwidth memory affects performance on Sandy Bridge, and how latency changes—or doesn’t change—the picture. More specifically, we’ll be looking at the impact of memory speed on application and gaming performance, with some synthetic memory tests thrown into the mix. We’ll also test some overclocked configurations. So how much difference will lowering the CAS latency make, and does memory performance scale with processor clock speed?

Back when I originally envisioned this comparison, the price gap between DDR3-1333 and DDR3-2133 memory was much wider. A quick scan of Newegg reveals that a mere $34 separates those two 4GB kits. Below is a breakdown of the lowest prices (as of 7/16/2011) for various memory configurations.

4GB 2x2GB Kits
DDR3-1333 CL9 $31
DDR3-1333 CL8 $40
DDR3-1600 CL9 $40
DDR3-1600 CL8 $41
DDR3-1333 CL7 $45
DDR3-1600 CL7 $50
DDR3-1866 CL9 $60
DDR3-2133 CL9 $65


8GB 2x4GB Kits
DDR3-1333 CL9 $58
DDR3-1600 CL9 $66
DDR3-1333 CL7 $75
DDR3-1600 CL8 $80
DDR3-1866 CL9 $85
DDR3-1600 CL7 $115
DDR3-2133 CL9 $150


You can see from the above chart that balancing memory clocks with latency results in some interesting choices, particularly on the 8GB kits where price differences are a bit larger. Is it best to go with a slower clock speed and better timings, or vice versa, or is the optimal path somewhere in between? That’s the aim of this article.

Apple Updates Cinema Display, It's a Thunderbolt Display Now



Along with today’s MacBook Air and Mac mini updates, Apple has also updated their 27” Cinema Display. The display now goes by a new name: the Apple Thunderbolt Display (ATD). As the name implies, the display now features Intel’s new Thunderbolt interface, which Apple has adopted across all new 2011 Macs. The ATD is the world’s first commercially available Thunderbolt display and only the second Thunderbolt device, the first being Promise’s Pegasus enclosure.



Let's go through the specifications now:

Apple Thunderbolt Display Specifications
Screen size 27"
Resolution 2560x1440
Panel type In-plane switching (IPS)
Brightness 375 cd/m2
Viewing angles 178°/178°
Contrast ratio 1000:1
Response time 12ms
Cables (built-in) Thunderbolt, MagSafe
Ports 3x USB 2.0, FireWire 800, Gigabit Ethernet, Thunderbolt
Video and audio FaceTime HD camera with mic, 2.1 speaker system
Dimensions (WxDxH) 25.7" x 8.15" x 19.35"
Weight 23.5lb
Price $999


Essentially, the ATD is just a 27” Cinema Display with Thunderbolt. The screen size is the same, the resolution is the same, and I wouldn’t be surprised if the panel were exactly the same as well. From the outside you can’t see any difference aside from the extra ports. The dimensions are a match. Even the price stays at $999.

The difference comes when we talk about Thunderbolt and what it brings. The Cinema Display had three cables: Mini DisplayPort, MagSafe (power) and USB 2.0. Thanks to Thunderbolt, mDP and USB 2.0 have been merged into one and there are now only two cables: MagSafe and Thunderbolt.

Laptop-as-a-desktop users rejoice: the Thunderbolt Display features FireWire 800, USB 2.0 and Gigabit Ethernet, all of which are carried over the single Thunderbolt cable. There is also a second Thunderbolt port for daisy-chaining. As Thunderbolt provides up to 10Gb/s per channel, it’s more than adequate for a 2560x1440 display and an external RAID box, as we mentioned in our Promise Pegasus R6 & Mac Thunderbolt Review.
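A rough back-of-the-envelope check of that claim, assuming 24-bit color at 60Hz and ignoring blanking intervals and protocol overhead (so the real requirement is somewhat higher):

# Approximate uncompressed video bandwidth for 2560x1440 at 60Hz, 24bpp.
width, height, bpp, refresh = 2560, 1440, 24, 60
gbps = width * height * bpp * refresh / 1e9
print(round(gbps, 1))  # ~5.3 Gb/s of Thunderbolt's 10 Gb/s per channel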





Example of daisy-chaining


Apple's Thunderbolt Display really shows us the potential of Thunderbolt by integrating many different interface standards into a single cable. Honestly the only thing that's missing is audio-out on the Thunderbolt Display itself for users who prefer external speakers.

The biggest, and possibly the only, issue here is USB 2.0; it feels outdated considering that USB 3.0 is now common on new PCs. We probably won't see USB 3.0 support from Apple until Ivy Bridge brings it natively in 2012. However, even with only USB 2.0, the ATD is a great option for owners of 2011 Macs with Thunderbolt. Apple will continue to sell the existing 27-inch Cinema Display, as the new Thunderbolt Display will not work with machines that don't support Thunderbolt.

The Apple Thunderbolt Display is available from Apple's Online Store with an estimated shipping time of 6-8 weeks.

The 2011 MacBook Air (11 & 13-inch): Thoroughly Reviewed



I've always liked ultraportables. Back when I was in college I kept buying increasingly more portable notebooks until I eventually ended up with something horribly unusable for actual work. When Apple introduced the first MacBook Air back in 2008 I fell in love. It finally stuck a fast enough CPU in a small enough chassis and gave me a full sized keyboard to type on. I was set.

Last year Apple introduced the first major update to the MacBook Air, bifurcating the lineup with the first ever 11-inch model in addition to the standard 13. With last year's update the MacBook Air did so well that it actually started outselling the base MacBook. Apple isn't a fan of large complicated lineups so it retired the MacBook. If you want a portable Mac you can buy a MacBook Air or a MacBook Pro.

As the mainstream counterpart to the MacBook Pro, Apple had to do something about the performance of the MacBook Air. While last year's updates were great alternatives to cheap, underpowered netbooks, they weren't fast enough to be a mainstream computer in 2011. Last year's Air featured Intel's Core 2 Duo processors, based on an architecture that debuted in 2006. Intel has released two major architectures since then.

Just nine months after the release of the 2010 MacBook Air, Apple fixed the problem. Meet the new Air:



If these systems look identical to the ones they're replacing that's because they are, at least from the outside. With the exception of a backlit keyboard, some differences in the row of function keys and a Thunderbolt logo, these babies look identical to last year's models.

You shouldn't judge a (Mac)book by its cover, because the MacBook Air's internals are much improved.

2011 MacBook Air Lineup

Specification | 11.6-inch | 11.6-inch (high-end) | 13.3-inch | 13.3-inch (high-end)
Dimensions (H x W x D) | 0.11-0.68" x 11.8" x 7.56" (0.3-1.7cm x 30cm x 19.2cm) | 0.11-0.68" x 11.8" x 7.56" (0.3-1.7cm x 30cm x 19.2cm) | 0.11-0.68" x 12.8" x 8.94" (0.3-1.7cm x 32.5cm x 22.7cm) | 0.11-0.68" x 12.8" x 8.94" (0.3-1.7cm x 32.5cm x 22.7cm)
Weight | 2.38 lbs (1.08kg) | 2.38 lbs (1.08kg) | 2.96 lbs (1.35kg) | 2.96 lbs (1.35kg)
Base CPU | 1.6GHz dual-core Core i5 | 1.6GHz dual-core Core i5 | 1.7GHz dual-core Core i5 | 1.7GHz dual-core Core i5
Graphics | Intel HD 3000 | Intel HD 3000 | Intel HD 3000 | Intel HD 3000
RAM | 2GB DDR3-1333 | 4GB DDR3-1333 | 4GB DDR3-1333 | 4GB DDR3-1333
SSD | 64GB SSD | 128GB SSD | 128GB SSD | 256GB SSD
Display Resolution | 1366 x 768 | 1366 x 768 | 1440 x 900 | 1440 x 900
Ports | Thunderbolt, 2x USB 2.0, composite audio in/out jack | Thunderbolt, 2x USB 2.0, composite audio in/out jack | Thunderbolt, 2x USB 2.0, SDHC slot, composite audio in/out jack | Thunderbolt, 2x USB 2.0, SDHC slot, composite audio in/out jack
Price | $999 | $1199 | $1299 | $1599

Motorola Droid 3 Review - Third Time's a Charm



If ever a product has summed up the progression of the Android ecosystem, it’s the Motorola Droid. The first Droid catapulted Android into the mainstream with its first 2.x release, and since then the Droid itself has seen a yearly update cadence that honestly has shown no sign of stopping. The updates thus far track the trends that we’ve seen affect the Android ecosystem as a whole - newer and better versions of Android alongside ever increasing SoC performance, display improvements, camera improvements, and refined hardware design.



I think that pretty much sums up what kind of update the Motorola Droid 3 (henceforth just Droid 3) is. It’s an iterative product launch, for sure, but that belies just how good the improvements all around really are. I noted a few of them already - the Droid 3 includes a dual core OMAP 4430 SoC, larger 4” qHD display, more internal storage, better camera, front facing camera, and most notably a much improved 5 row QWERTY keyboard.

Of course the huge question mark is what has improved connectivity-wise on the Droid 3. There’s no 4G LTE baseband; instead of repeating the Droid 2 and Droid 2 Global split, however, Motorola just went ahead and made the Droid 3 global from the start. That’s right, it’s a dual-mode phone. It’s no consolation if you’re still waiting for an LTE enabled device with a physical keyboard (for that, you’ll have to wait for Samsung to release its rumored next device), but as things stand right now you can either have multi-mode global (CDMA2000 and GSM/UMTS) compatibility or multi-mode (CDMA2000 and LTE) 4G connectivity. As of yet there’s no having it both ways.

We’ll talk more about all of that in due time, but for now let’s just go over the Droid 3’s outward physical appearance and hardware.



First off, the Droid 3 is notably larger than its predecessor. It’s 3 mm wider, 7 mm taller, but almost 1 mm thinner. Those changes in outline accommodate both the 4” screen (up from 3.7”) and the additional keyboard row. Mass is up as well, from 169 to 184 grams. I won’t bore you with all the specifications that have changed; you can just check out the table below.

Physical Comparison
Specification | HTC Thunderbolt | Motorola Droid 2 | Motorola Droid X2 | Motorola Droid 3
Height | 122 mm (4.8") | 116.3 mm (4.6") | 126.5 mm (4.98") | 123.3 mm (4.85")
Width | 67 mm (2.63") | 60.5 mm (2.4") | 65.5 mm (2.58") | 64.1 mm (2.52")
Depth | 13.2 mm (0.52") | 13.7 mm (0.54") | 9.9 - 14.4 mm (0.39"-0.57") | 12.9 mm (0.51")
Weight | 183.3 g (6.46 oz) | 169 g (5.9 oz) | 148.8 g (5.25 oz) | 184 g (6.49 oz)
CPU | 1 GHz MSM8655 45nm Snapdragon | 1 GHz Cortex-A8 OMAP 3620 | 1 GHz Dual Core Cortex-A9 Tegra 2 AP20H | 1 GHz Dual Core Cortex-A9 OMAP 4430
GPU | Adreno 205 | PowerVR SGX 530 | ULP GeForce | PowerVR SGX 540
RAM | 768 MB LPDDR2 | 512 MB LPDDR2 | 512 MB LPDDR2 | 512 MB LPDDR2
NAND | 4 GB NAND with 32 GB microSD Class 4 preinstalled | 8 GB integrated, preinstalled 8 GB microSD | 8 GB NAND, 8 GB microSD Class 4 preinstalled | 16 GB NAND, up to 32 GB microSD
Camera | 8 MP with autofocus and dual LED flash, 720p30 video recording, 1.3 MP front facing | 5 MP with dual LED flash and autofocus | 8 MP with AF/LED flash, 720p30 video recording | 8 MP with AF/LED flash, 1080p30 video recording, VGA (0.3MP) front facing
Screen | 4.3" 800 x 480 LCD-TFT | 3.7" FWVGA 854 x 480 IPS-LCD | 4.3" 960 x 540 RGBW LCD | 4.0" 960 x 540 RGBW LCD
Battery | Removable 5.18 Whr | Removable 5.2 Whr | Removable 5.65 Whr | Removable 5.65 Whr


Subjectively however, I was shocked at just how thin the Droid 3 feels in the hand in spite of the slide-out keyboard. It seems like there’s generally a certain amount of unavoidable overhead that comes along with including an actual keyboard, yet the Droid 3 manages to do it without making it painfully obvious that everything was designed around it instead of with it.

Apple's 11-inch MacBook Air (Core i7 1.8GHz) Review Update

Last week we published our review of the new 2011 MacBook Air. Both the 11 and 13-inch models ship with ultra low-voltage (ULV) dual-core Sandy Bridge CPUs, a first for the lineup. Another first for the lineup is that you can now get identically specced CPUs in both models. In theory you'd be able to get the same performance regardless of chassis size.



The table below highlights the three CPUs available on the new MBAs:

2011 Apple MacBook Air CPU Comparison

Specification | 1.6GHz Core i5 | 1.7GHz Core i5 | 1.8GHz Core i7
Available in | 11-inch (default) | 13-inch (default) | high-end 11-inch (option), high-end 13-inch (option)
Intel Model | Core i5-2467M | Core i5-2557M | Core i7-2677M
Cores/Threads | 2/4 | 2/4 | 2/4
Base Clock Speed | 1.6GHz | 1.7GHz | 1.8GHz
Max SC Turbo | 2.3GHz | 2.7GHz | 2.9GHz
Max DC Turbo | 2.0GHz | 2.4GHz | 2.6GHz
L3 Cache | 3MB | 3MB | 4MB
GPU Clock | 350MHz / 1.15GHz | 350MHz / 1.2GHz | 350MHz / 1.2GHz
Quick Sync | Yes | Yes | Yes
AES-NI | Yes | Yes | Yes
VT-x | Yes | Yes | Yes
VT-d | No | Yes | Yes
TDP | 17W | 17W | 17W


The 1.8GHz Core i7 is offered as an upgrade to both the 11 and 13-inch MacBook Air. With much higher max turbo speeds and another megabyte of L3 cache, it's clear this is going to be a big upgrade over the standard 11-inch Air.



Last week we got our hands on one of these upgraded 11-inch models to find out just how much faster it is. We also wanted to find out what sort of an impact the faster CPU would have on the 11's thermals and battery life. It just so happens that our upgraded 11 gave us more than just that to investigate.
The Panel Lottery

For commodity parts within its systems Apple typically sources from two different vendors. This is done to avoid shortages caused by relying on a single component vendor, and it also puts Apple in a good negotiating position. The MacBook Air is no different. Each model (11 & 13-inch) ships with one of two panels. Word on the street is that one of those panels is better than the other. It was time to find out if that's the case.



If you want to know who makes the display in your Mac, and Apple hasn't overridden the EDID information from the panel, simply open up Terminal and execute this command:

ioreg -lw0 | grep IODisplayEDID | sed "/[^<]*</s///" | xxd -p -r | strings -6
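For reference, here's roughly what each stage of that pipeline does (annotation added for clarity; the command itself is unchanged):

ioreg -lw0 |            # dump the I/O Registry (-l lists properties, -w0 removes the line-width limit)
  grep IODisplayEDID |  # keep the line containing the raw EDID hex blob
  sed "/[^<]*</s///" |  # strip everything up to and including the opening '<'
  xxd -p -r |           # convert the plain hex dump back to binary
  strings -6            # print readable strings of 6+ characters (panel model and name)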

The output will look something like this:

LP116WH4-TJA3
Color LCD

The first line is the panel's model number. Typically a quick Google search of the first few characters will give you the manufacturer's name. In this case, the LP116WH4 is made by LG Philips (hence the LP prefix). This happens to be the panel in the 11-inch Core i7 MacBook Air I just got my hands on. If you read my original review of the 2011 MBAs you'll know that both of the systems I tested there had panels by a different manufacturer:

LTH133BT01A03
LTH116AT01A04

The LT prefix on both of those part numbers implies Samsung is the OEM. The current working theory is that the LG panel in the new MacBook Air is somehow worse than the Samsung panel. Given the vast difference we saw in SSD performance between the Samsung and Toshiba drives, is it possible that Apple has allowed a similarly large gap to form between LCD vendors? Not so much:



The LG panel is slightly dimmer than the Samsung panel I originally tested.

Although you don't get as high a peak brightness, you do get lower black levels on the LG panel:





The combination of the two actually gives us a healthy boost in max on/off contrast:



In normal usage I never noticed the increase in contrast, nor did I feel the panel was any dimmer, but there is technically an advantage here.



Color accuracy is also slightly better on the LG panel, although this small of a difference is basically impossible to notice.



Perhaps due to backlight differences the LG panel does have a narrower color gamut.

Apple calibrates all of its systems with integrated displays before shipment so the LG panel has a similar white point to the earlier Samsung panel we tested (6700 - 6800K):



Based on these numbers alone I don't see any reason to believe that the LG panel in the new MacBook Air is any worse than the Samsung panel. However I do believe that there may be an explanation for the perceived inferior quality of the LG panel. The LG panel exhibits vertical color/contrast shift more readily than the Samsung panel. Unfortunately I don't have them both here to show you a side by side comparison but the LG panel seems to be slightly more sensitive to vertical viewing angle. As I mentioned in our original 2011 MacBook Air review, the 11 is a particularly tough system to use due to the height of its display. In order to get perpendicular line of sight to the display you need to tilt the display back and your head down. If you're off by just a few degrees you'll start to see color/contrast shift. On the LG panel that classic TN panel distortion seems to come a bit sooner than on the Samsung panel.
Viewed Above Center
Viewed Perpendicular to the Screen

Viewed Below Center


The issue was most noticeable to me when I had the 11 on a desk rather than on my lap. While it was particularly bothersome when I first got it, I've since become used to the display. Obviously these machines are expensive enough that I believe you should be happy with your purchase, but from my perspective the two panels are close enough that it's not worth losing sleep over. Both the LG and Samsung panels are TN panels. They may have better display characteristics than your typical cheap TN panel, but they still have the same viewing angle limitations as other TN panels. Both panels exhibit the same issues, the LG may just show them off a few degrees sooner.

The bigger problem for some is that the 11-inch MacBook Air has the highest pixel density of anything Apple ships:



While I'm glad Apple opted for a high resolution 11-inch display, not everyone will find it easy to read. This isn't something that varies with panel type; it's just a side effect of having a small display with a high pixel density.
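For a rough sense of just how dense that is, take the 11-inch panel's 1366 x 768 resolution over its 11.6-inch diagonal:

# Approximate pixel density of the 11.6-inch 1366x768 panel.
import math
print(math.hypot(1366, 768) / 11.6)   # ~135 PPI
print(math.hypot(1440, 900) / 13.3)   # ~128 PPI for the 13-inch 1440x900 panel, for comparison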

Samsung TouchWiz UX Review: Honeycomb Gets Skinned






It wasn’t long after the thinner, lighter, better Galaxy Tab 10.1 was announced that we heard Samsung would bring its TouchWiz skin to Android’s tablet OS, Honeycomb. After much debate over whether Honeycomb was truly ‘open’ and not a closed, iOS-like environment, here we are. Starting today, Samsung Galaxy Tab 10.1 owners will start to see OTA updates pushed to their devices, offering the first skinned Honeycomb experience. We’ve got it now, and this is more than just a few widgets. Was it worth the wait, or will users avoid this optional update as long as they can?



Skins are Evil/Great


Let’s go over a few things that this update is not. This update is not Android 3.2; however, we did receive assurances that the heavy lifting for TouchWiz has already been done, so there should be no delay in rolling out 3.2 and its graphics and UI enhancements. This update is not a cure for all the bugs that continue to ail Honeycomb; the occasional sluggishness and random Force Close events persist. This update is also not a lag-inducing crime against Android users; any enhancement that burdens the CPU or GPU will inherently result in some lag, either in the UI or in opening, closing or using apps, but the lag we’ve seen is nothing too jarring, nor is it far off from what’s normally found on Honeycomb. So, what is this update? Samsung described three key areas they wanted to enhance with what’s now known as TouchWiz UX: Ease of Use, Fun and Entertainment, and Open for Business. By far the most outward of these enhancements is Ease of Use, so we will start there.

Gallery: Samsung TouchWiz UX - UI Changes








Widgets and Apps

TouchWiz introduces three major enhancements: LivePanels, MiniApps and the Quick Panel. The last is the simplest of the lot and the easiest to dispense with; the same set of toggles we’ve seen stuffed into most skinned UIs on Android is now present within the Notification shade, allowing users to manipulate all the device's radios, volume, screen brightness and vibrate functions with just a press. Hardly revolutionary, but a good way to bring up some settings that were previously buried in menus.


The MiniApps may be one of the more compelling additions. Tucked in a hidden OS X style dock are six applications that are unique and redundant at once. The applications are a task manager, calculator, calendar, world clock, music player and a finger or stylus driven memo pad. None of these apps is terribly novel, but they are each blessed with something not seen in Android yet. When the app is pulled up it appears as an overlay atop the screen; any apps previously on the screen will remain open below the MiniApp. The app will then remain overlaid until closed manually, meaning you can continue working in other apps and even change home screens and the MiniApp will still be there. It even takes on a transparency effect when focus is moved from it, so you can see what’s behind it.


The best use case for MiniApps is writing in the memo app. On a PC it’s easy to keep a text window open alongside or over a browser window opened to a relevant item and switch between them. Previously on smartphones and tablets this sort of workflow was dreadful because of the jarring and often slow transition from one app to the next and back. This enhancement solves that problem for these six apps. If you have little use for any of these apps (certainly the World Clock’s utility is beyond me) then you’ll have little use for MiniApps; we’re inquiring whether this technique will be accessible to devs and will eventually grant the ability to add or delete apps from the dock bar.

Airport Extreme (5th Gen) and Time Capsule (4th Gen) Review - Faster WiFi




Apple has been playing it cool on the WiFi side of things lately. It started with the previous Airport Extreme (Gen 4), which quietly introduced three spatial stream support, followed by the Early 2011 MacBook Pro update, which brought a three spatial stream compliant WLAN stack, and now continues with an even more understated update for the Time Capsule (4th generation) and Airport Extreme (5th generation).



Both updates launched just prior to this latest round of Apple launches, which included the Mac Mini, MacBook Air, and Thunderbolt Display, but unlike those three, the Time Capsule and Airport Extreme updates saw almost no mention. Starting with the exterior packaging, you’d be hard pressed to tell that a particular Time Capsule or Airport Extreme is the newer refresh. I no longer have the old Airport Extreme packaging, but the new device box is virtually indistinguishable. Outside of bumping the supported storage capacity for the Time Capsule up to 3TB, there’s no real obvious giveaway for the Time Capsule either.

The only way to tell which version is which by looking at the box is by the model numbers—MD031LL/A for the 5th generation Airport Extreme, and MD032LL/A for the 2TB 4th generation Time Capsule.

The contents of the Airport Extreme box remain the same as well, starting with the device itself on top, and underneath it, a power cable, 12 volt power supply (model A1202) and some literature about setup in a white plastic bag.



The Time Capsule box is much the same affair, with the device inside, a power cable, no power supply (since it’s internal), and some literature.



I stacked all three devices up so you can compare physically. Really the only big giveaway between the two Airport Extremes is an extra line of text on the previous generation, and of course the model number or FCC ID. Both the Time Capsule and Airport Extreme still retain the same port configuration—four GigE ports, one USB 2.0 port, power, reset, and a Kensington security slot. Those four gigabit ethernet ports can either be used as a switch, or you can use the device as a router and then the leftmost port becomes WAN and the right three become LAN.



At this point it isn’t really looking like there’s much different, but exterior appearances can be deceptive.

The Sony Ericsson Xperia Play




The push (or threat, depending on your perspective) for cellular handsets to supplant dedicated portable gaming consoles was already at the 'dull roar' stage when Steve Jobs unveiled the first-generation iPhone in January 2007. Successive iPhone iterations, along with iOS ecosystem expansion to the iPod touch and iPad, have upped the argument amplification a notch or few, as have competitive offerings based on the Android, RIM, WebOS and Windows Mobile (now Windows Phone) operating systems.

Sony's approach to addressing the standalone-versus-cellphone debate is, if nothing else, intriguing. The multi-subsidiary company includes the game console division, of course, which is determined to do its utmost to maintain a lucrative dedicated-function portable hardware business. Yet Sony Computer Entertainment is also responsible for a plethora of game software titles, whose developers are likely challenged to sell as much content as possible, ideally but not necessarily exclusively running on Sony-branded hardware. And then there's Sony Ericsson, a joint venture company chartered with maintaining and expanding its status as a top-tier cellular handset manufacturer.

On one end of the strategy spectrum, Sony has to date produced four different models in its PlayStation Portable line: the original PSP-1000, PSP-2000, PSP-3000 and PSP Go. The upcoming PlayStation Vita successor, formally unveiled at June's E3 Conference with availability (beginning in Japan) slated for later this year, aspires to one-up even the most powerful current-generation smartphone with features such as an SoC containing a quad-core ARM Cortex-A9 CPU (clock speed currently unknown) and a quad-core 200 MHz Imagination Technologies SGX543MP4+ GPU, not to mention the PS Vita's 5" OLED display. On the other end of the spectrum is Sony's PlayStation Certified program, unveiled in late January, which conceptually enables generic Android-based hardware to run PlayStation Suite content.

And in-between these two extremes is the subject of this particular writeup, Sony Ericsson's Xperia Play gaming cellphone:



Gallery: Sony Xperia Play







The mythical 'PlayStation Phone' had been rumored for several years, but when it finally appeared in late March in 11 countries (not then including the United States), it was curiously absent any explicit 'PlayStation' branding. Sony Ericsson's initial U.S. carrier partner was Verizon, who began selling the handset in late May subsequent to its first official U.S. unveiling, a commercial which ran during February's Super Bowl. More recently, AT&T picked up the handset in mid-July. One week later, Verizon dropped the Xperia Play's contract-subsidized price to $99.99, from $199.99 at introduction. Was Verizon's action a competitive response to AT&T's entry, a reaction to poor Xperia Play sales, or some combination of these and/or other factors? Verizon's not saying, but let's see how well (or not) the handset performs to get a sense of its degree of market appeal.

The SandForce Roundup: Corsair, Kingston, Patriot, OCZ, OWC & MemoRight SSDs Compared



It's a depressing time to be covering the consumer SSD market. Although performance is higher than it has ever been, we're still seeing far too many compatibility and reliability issues from all of the major players. Intel used to be our safe haven, but even the extra reliable Intel SSD 320 is plagued by a firmware bug that may crop up unexpectedly, limiting your drive's capacity to only 8MB. Then there are the infamous BSOD issues that affect SandForce SF-2281 drives like the OCZ Vertex 3 or the Corsair Force 3. Despite OCZ and SandForce believing they were on to the root cause of the problem several weeks ago, there are still reports of issues. I've even been able to duplicate the issue internally.



It's been three years since the introduction of the X25-M and SSD reliability is still an issue, but why?

For the consumer market it ultimately boils down to margins. If you're a regular SSD maker then you don't make the NAND and you don't make the controller.

A 120GB SF-2281 SSD uses 128GB of 25nm MLC NAND. The NAND market is volatile, but a 64Gb 25nm NAND die will set you back somewhere from $10 to $20. If we assume the best case scenario, that's $160 for the NAND alone. Add another $25 for the controller and you're up to $185 without the cost of the other components, the PCB, the chassis, packaging and vendor overhead. Let's figure another 20% for everything else needed for the drive, bringing us up to $222. You can buy a 120GB SF-2281 drive in e-tail for $250, putting the gross profit on a single SF-2281 drive at $28, or 11%.

Even if we assume I'm off in my calculations and the profit margin is 20%, that's still not a lot to work with.
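Laid out as a quick sketch (the dollar figures are the rough estimates above, not actual bill-of-materials data):

# Back-of-the-envelope cost model for a 120GB SF-2281 drive.
nand_cost = 16 * 10           # 16 x 64Gb (8GB) dies at a best-case $10 each = $160
controller = 25
bom = nand_cost + controller  # $185 before everything else
drive_cost = bom * 1.20       # +~20% for PCB, chassis, packaging, overhead -> ~$222
street_price = 250
profit = street_price - drive_cost
print(round(profit), round(100 * profit / street_price))  # ~$28 gross profit, ~11% margin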

Things aren't that much easier for the bigger companies either. Intel has the luxury of (sometimes) making both the controller and the NAND. But the amount of NAND you need for a single 120GB drive is huge. Let's do the math.


8GB IMFT 25nm MLC NAND die - 167mm2

The largest 25nm MLC NAND die you can get is an 8GB capacity. A single 8GB 25nm IMFT die measures 167mm2. That's bigger than a dual-core Sandy Bridge die and 77% the size of a quad-core SNB. And that's just for 8GB.

A 120GB drive needs sixteen of these die for a total area of 2672mm2. Now we're at over 12 times the wafer area of a single quad-core Sandy Bridge CPU. And that's just for a single 120GB drive.

This 25nm NAND is built on 300mm wafers just like modern microprocessors, giving us 70685mm2 of area per wafer. Assuming you can use every single square mm of the wafer (which you can't), that works out to be 26 120GB SSDs per 300mm wafer. Wafer costs are somewhere in the four-digit range - let's assume $3000. That's $115 worth of NAND for a drive that will sell for $230, and we're not including controller costs, the other components on the PCB, the PCB itself, the drive enclosure, shipping and profit margins. Intel, as an example, likes to maintain gross margins north of 60%. For its consumer SSD business to not be a drain on the bottom line, sacrifices have to be made. While Intel's SSD validation is believed to be the best in the industry, it's likely not as good as it could be as a result of pure economics. So mistakes are made and bugs slip through.
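The wafer math, laid out as a quick sketch (idealized: it assumes every square millimeter of the wafer is usable and yield is perfect, which is never the case):

# Idealized NAND wafer math from the paragraph above.
import math
die_area = 167                   # mm^2 per 8GB (64Gb) 25nm IMFT die
dies_per_drive = 16              # 128GB of raw NAND for a 120GB drive
wafer_area = math.pi * 150 ** 2  # 300mm wafer -> ~70,686 mm^2
drives_per_wafer = int(wafer_area // (die_area * dies_per_drive))
print(drives_per_wafer, round(3000 / drives_per_wafer))  # 26 drives, ~$115 of NAND each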

I hate to say it but it's just not that attractive to be in the consumer SSD business. When these drives were selling for $600+ things were different, but it's not too surprising to see that we're still having issues today. What makes it even worse is that these issues are usually caught by end users. Intel's microprocessor division would never stand for the sort of track record its consumer SSD group has delivered in terms of show stopping bugs in the field, and Intel has one of the best track records in the industry!

It's not all about money though. Experience plays a role here as well. If you look at the performance leaders in the SSD space, none of them had any prior experience in the HDD market. Three years ago I would've predicted that Intel, Seagate and Western Digital would be duking it out for control of the SSD market. That obviously didn't happen, and as a result you have a lot of players that are still fairly new to this game. It wasn't too long ago that we were hearing about premature HDD failures due to firmware problems; I suspect it'll be a few more years before the current players get to where they need to be. Samsung may be one to watch here going forward as it has done very well in the OEM space. Apple had no issues adopting Samsung controllers, while it won't go anywhere near Marvell or SandForce at this point.