Archive for August, 2010

Intel set to take leap in solid-state drives

Monday, August 30th, 2010

(Credit: Intel Corp.)

Finally, Winslow addressed the price collapse in the flash market in general–a topic that generated a lot of press after the Intel analyst meeting on Wednesday. “A majority of flash is being sold in very cyclical consumer electronics devices. Q1 and Q2 are soft quarters,” he said. On top of this, suppliers continue to shrink manufacturing process technologies, leading to more capacity at lower cost, he said.

SSD Primer, Part 1: SSDs are based on flash memory chip technology and have no moving parts. Hard-disk drives (HDDs), in contrast, use read-write heads that hover over spinning platters to access and record data. With no moving parts, SSDs avoid both the risk of mechanical failure and the mechanical delays of HDDs. Therefore, SSDs are generally faster and more reliable. The catch is the cost: SSDs are currently much more expensive than HDDs.

Intel believes 2008 is the year of the SSD. (See SSD primer below.) “For the first time, flash is going into the compute environment. In the last nine years or so when it experienced all of its growth, this has been in digital cameras and USB keys,” Winslow said. But now flash memory, in the form of SSDs, will be used as the main storage device in PCs. “When you’re putting all your critical applications and data into notebook or server (SSDs), who knows those markets better than the manufacturer that’s supplying the world with CPUs,” Winslow added.

With new competition, drive speeds will jump. Currently, the fastest SSDs from companies like Samsung approach 100MB/second for reading data. “What I can tell you is ours is much better than that,” Winslow said. Hard drives typically read data at about half this speed.
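If you want to sanity-check read-speed claims like these on your own machine, a crude timed sequential read gives a ballpark figure. The sketch below is illustrative only: the file path is a placeholder, and the result will be inflated by the operating system's cache unless the file is much larger than RAM.

```python
# Crude sequential-read timer for a ballpark MB/s figure. The path is a
# placeholder; use a file much larger than RAM, or cached reads will
# inflate the number.
import time

def sequential_read_mb_per_s(path, block_size=1024 * 1024):
    total_bytes = 0
    start = time.monotonic()
    with open(path, "rb") as f:
        while True:
            block = f.read(block_size)
            if not block:
                break
            total_bytes += len(block)
    elapsed = time.monotonic() - start
    return (total_bytes / (1024 * 1024)) / elapsed

# Example: print(sequential_read_mb_per_s("/path/to/big_file.bin"))
```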

“When Intel launches its…products, you’ll see that not all SSDs are created equal,” Winslow said. “The way the SSDs are architected, the way the controller and firmware operates makes a huge difference,” he said, referring to the chip (controller) that manages the SSD and software (firmware) that the controller uses.

“We will be supplementing our product line with a SATA offering,” he said. Serial ATA, or SATA, is an interface used in high-performance hard disk drives. Intel’s products will be based on the SATA II specification that offers speeds of 3 gigabits (Gb) per second. Samsung is now shipping 64GB SSDs to Dell using the same technology.
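For context, 3Gbps is the raw SATA II line rate. Because SATA uses 8b/10b encoding, ten bits travel on the wire for every byte of data, so the usable ceiling works out to roughly 300MB/s before protocol overhead, comfortably above the roughly 100MB/s drives mentioned earlier. A quick back-of-envelope check:

```python
# Back-of-envelope SATA II throughput (illustrative): 3Gb/s is the raw line
# rate, and 8b/10b encoding spends 10 line bits per data byte.
line_rate_bps = 3_000_000_000      # SATA II raw line rate, bits per second
line_bits_per_data_byte = 10       # 8b/10b encoding overhead
usable_bytes_per_s = line_rate_bps / line_bits_per_data_byte
print(usable_bytes_per_s / 1_000_000, "MB/s")  # 300.0 MB/s ceiling before protocol overhead
```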

Intel's current offering: the Z-P140 PATA solid-state drive.

SSD Primer, Part 2: Intel will be shipping a Multi-Level Cell, or MLC, solid-state drive in the second quarter. This is a more sophisticated technology than the current Single-Level Cell, or SLC, approach. The advantage is larger capacity, since MLC uses multiple levels per cell to store more bits. The disadvantage is added complexity, which can result in lower performance. “Inherently, MLC is slower and inherently fewer write cycling endurance,” Winslow said. Intel, however, has technology that will get around these problems, he said.
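The capacity advantage comes down to simple arithmetic: a cell that can be programmed to one of 2^n distinguishable charge levels stores n bits. A quick illustration of that generic NAND math (not Intel's specific design):

```python
# Generic NAND arithmetic (not Intel's specific design): a cell with 2**n
# distinguishable charge levels stores n bits.
import math

def bits_per_cell(levels):
    return math.log2(levels)

print(bits_per_cell(2))   # SLC: 2 levels -> 1.0 bit per cell
print(bits_per_cell(4))   # MLC: 4 levels -> 2.0 bits per cell, twice the density
```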

Intel doesn’t enter markets gently. Its new high-capacity solid-state drives (SSDs) are expected to jolt a market currently dominated by Samsung, Toshiba, and SanDisk.

Also, like Samsung, Intel sees SSDs playing a role in the server market as a “performance accelerator.” Winslow said that Intel recently did a video-on-demand demonstration where it streamed 4,000 videos simultaneously. Just to do the streaming (not to store the video), it took 62 15,000 RPM (very high-performance) hard drives, he said. “We were able to replace those 62 hard drives with 10 SATA (SSD) technology drives,” he said.
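Some rough arithmetic shows why such a swap is plausible. Streaming thousands of videos is largely a random-read workload, and a 15,000 RPM disk is commonly credited with something on the order of 180 random reads per second; that figure is a rule of thumb assumed here, not a number Intel quoted. On that assumption, the 62 drives supply roughly 11,000 IOPS, so each of the 10 SSDs would need to sustain about 1,100:

```python
# Rough I/O arithmetic for the video-on-demand demo. The 180 IOPS figure for a
# 15,000 RPM disk is a common rule of thumb, not a number Intel quoted.
hdd_count = 62
assumed_hdd_random_iops = 180
total_iops = hdd_count * assumed_hdd_random_iops   # ~11,160 random reads/s

ssd_count = 10
iops_needed_per_ssd = total_iops / ssd_count       # ~1,116 per SSD
print(total_iops, iops_needed_per_ssd)
```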

At the moment, Intel offers small-capacity chip-level (what are called Thin Small Outline Packages or TSOPs) technology that provides end-product sizes ranging up to 16GB. But this modest line of products will get a big boost in the second quarter when Intel offers 1.8- and 2.5-inch SSDs ranging from 80GB to 160GB in capacity, said Troy Winslow, marketing manager for the NAND Products Group at Intel. Intel’s new SSDs will compete with Samsung, for example, which is slated to bring out a 128GB SSD in the third quarter.

But to be competitive with hard drives, SSD prices have to come down–a lot. In many cases, upgrading from a hard drive to an SSD in a notebook can mean paying an extra $1,000. Intel, like Samsung and Toshiba, sees steep declines in cost in the next two years. “Price declines are historically 40 percent per year,” Winslow said. “And in 2009, a 50 percent reduction, then again in 2010.”
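Taken at face value, those projections compound quickly: two consecutive 50 percent cuts would leave a drive at a quarter of today's price. A quick sketch using Winslow's quoted percentages, which should be treated as projections rather than guarantees:

```python
# Compounding the quoted price declines (Winslow's projections, illustrative).
price = 1000.0                # hypothetical SSD upgrade cost today
for year in (2009, 2010):
    price *= 0.5              # a 50 percent reduction each year
    print(year, price)        # 2009: 500.0, then 2010: 250.0 (25 percent of today's price)
```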

While the latter statement seems like typical marketing spin, it’s more than just spin in Intel’s case. The largest chipmaker in the world is in a competitive position because it already supplies many of a PC’s core components including the processor, chipset, communications silicon, and in some cases, the graphics processor. Add the main storage device to the mix, and–with the exception of an optical drive and screen–that’s all the core components in a notebook PC.

Intel Flash/SSD capacity: Intel and Micron have a joint venture called IM Flash Technologies. Both companies are currently making flash on a 50-nanometer process with plans to move to 40nm later this year. There are three NAND flash fabrication plants and one more currently being built in Singapore. The Intel-Micron venture provides funding for the development of silicon technology and the capacity to produce that silicon, according to Winslow. But marketing and end-product decisions are “absolutely separate,” he said.
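A rough rule of thumb explains why shrinks add capacity at lower cost: die area scales roughly with the square of the feature size, so a move from 50nm to 40nm fits about 1.5 times as many dies on the same wafer. This is textbook scaling math for illustration, not IM Flash's actual yield model:

```python
# Rule-of-thumb process-shrink math (not IM Flash's actual yield model):
# die area scales roughly with the square of the feature size.
old_node_nm, new_node_nm = 50, 40
area_ratio = (new_node_nm / old_node_nm) ** 2   # 0.64
dies_per_wafer_gain = 1 / area_ratio            # ~1.56x more dies per wafer
print(round(area_ratio, 2), round(dies_per_wafer_gain, 2))
```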

Nvidia, AMD gaming graphics buck green-PC trend

Tuesday, August 24th, 2010

And the trend in power supplies exemplifies how this market has changed. “The power supply used to be just a silver box, and nobody gave it a second thought,” he said. “(But) as graphics cards have evolved, they have forced the power supply makers to keep providing more and more power pipes–or cabling–to the graphics cards”–increasing the unit’s complexity, he said.

Neither Nvidia nor ATI shows any signs of slowing down, according to Reeves. “Eventually these chips get so hot that their own heat becomes a barrier to performance,” he said.

But Paul claims the performance per watt is the key yardstick, not raw power. “Where you see a little under 2X increase in maximum power, you’ve seen probably 3-times or 4-times (the) increase in the level of performance. So, overall we see a substantial improvement in performance per watt. This is the big metric we track to ensure we’re delivering efficient architectures.”
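Taking Paul's figures at face value, the arithmetic looks like this; the midpoints below are assumptions chosen for illustration, not Nvidia's published numbers:

```python
# Performance-per-watt arithmetic from Paul's quoted ranges. The midpoints are
# assumptions chosen for illustration, not Nvidia's published figures.
power_increase = 1.9   # "a little under 2X increase in maximum power"
perf_increase = 3.5    # midpoint of "3-times or 4-times" the performance
print(round(perf_increase / power_increase, 2), "x performance per watt")   # ~1.84x
```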

(Credit: Cooler Master)

(Credit: Dell Computer)

Paul says Nvidia has implemented power savings techniques on its GTX 280 that keep the power down when it’s not running at top performance loads. “With the GTX 280 at idle, that card runs at about 25 watts, which is one-tenth of its absolute worst-case power,” he said. Nvidia also offers hybrid graphics technology that turns off all the power-sucking boards when they’re not in use.

Green PC designs have become more than just practical; they’re cool. Power-sipping Netbooks are in, as are small desktops like the Dell Studio Hybrid and Hewlett-Packard Pavilion Slimline.

This symbolized why Apple eventually abandoned PowerPC: The platform wasn’t efficient with power.

There is an ungreen revolution taking place in enthusiast game PC circles.

Moreover, Paul says that the multiboard systems are limited to a small niche at the very top of the market. “There’s definitely a segment of the market that wants more and more performance. Remember, however, that this is the ultimate performance (segment).”

Nvidia admits that its chips are drawing more power than before. “If we go back about three years, our graphics card power was in the 120- to 130-watt range,” said Jason Paul, product manager in charge of enthusiast GPUs (graphics processing units) at Nvidia. “The GTX 280 which we launched a couple of months back, it’s around 230 watts (of) graphics card power,” he said.

Reeves says that 1,200 watts is now essential for gaming systems based on multiple boards from Nvidia or AMD’s ATI graphics unit. “With three GTX 280s or two of the R700 cards, we’re recommending they go with a 1,200-watt power supply,” Reeves said, referring to the newest graphics chips from Nvidia and ATI respectively.

Dell XPS 730 game box uses special liquid cooling to control heat.

Fast-forward to 2008. Game rig makers are cramming as many as four graphics chips into high-end boxes that are notable not only for performance but also for the power they consume. As a consequence, big power supply units are in vogue. Today, bragging rights extend to the units themselves: some systems boasting boutique brand names such as Cooler Master and SilverStone draw 1,200 watts–roughly three times the power requirements of game systems a few years ago.

It’s an ominous trend, according to box makers. “If this trend does continue, then, yes, it will give us problems,” said George Yang, an engineer at Los Angeles-based game rig maker IBuyPower. “A regular home user would have to have an electrician come in, get the outlet out, and plug in a higher breaker,” Yang said. Today, some of the higher-end systems with big power supplies require a special wall power socket, according to Yang.

A 1,250-watt power supply–this one from Cooler Master–is the largest a game PC maker will install today.

“A regular home user would have to have an electrician come in, get the outlet out, and plug in a higher breaker.” –George Yang, IBuyPower engineer

This is not the case for high-end gaming PCs, where bigger is better. How far this trend can go isn’t clear, but a seminal event in Apple’s history may offer a lesson. In 2001, Apple unveiled one of the first dual-processor consumer systems, based on the overheating-prone PowerPC G4 processor. The original Apple tower design had a Rube Goldberg feel to it, with a host of fans straining to rid the system of heat. A noise like that emitted by a wind tunnel, generated by the power supply and fans, forced Apple to redesign the system.

Reeves cites GPUs, not CPUs from Intel, as the culprit. “The latest CPUs use very little wattage. If you overclock a 3GHz Intel CPU to 4GHz, you might pull 40 more watts. Whereas a graphics card, you put three of them in a system, they’ll pull 800 watts running some of the higher-end games,” he said.

Other game rig makers are equally concerned. “I swore that I’d never break 1,000 (watts),” said Kelt Reeves, president of game PC maker Falcon Northwest. “Unfortunately, that’s been the solution for the past several years. Bigger, bigger, bigger power supplies.”

This is just about the limit, he said. “We can’t go too much more over that before–if you actually pull that (power)–you start tripping the client’s household circuit breaker.”
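The wall-outlet math backs him up. A standard 15-amp, 120-volt US circuit tops out at 1,800 watts, with roughly 1,440 watts as the usual continuous-load guideline, and a power supply pulls more from the wall than it delivers because of conversion losses. A rough sketch, with the 85 percent efficiency figure an assumption rather than something Reeves cited:

```python
# Why a ~1,200-watt build flirts with a household breaker (rough arithmetic;
# the 85 percent efficiency figure is an assumption, not from Reeves).
psu_output_watts = 1200
assumed_efficiency = 0.85
wall_draw_watts = psu_output_watts / assumed_efficiency   # ~1,412W pulled from the outlet

circuit_watts = 15 * 120                      # standard 15A, 120V US circuit: 1,800W peak
continuous_guideline = 0.8 * circuit_watts    # ~1,440W for a continuous load
print(round(wall_draw_watts), circuit_watts, continuous_guideline)
```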

The eye-opening graphics possible on today’s game PCs come at a cost: light-dimming power consumption. The trend, rooted in the perennial quest for more speed, bucks the overall greening of the PC industry.

But game box makers ship many–if not most–of their systems to the very niche that Paul is describing. “We’re all about the high end. The higher-end the graphics card is, and the more expensive, the more we sell,” said Reeves.

Facebook kicks off developer funding competition

Tuesday, August 24th, 2010

Additionally, FBFund has heretofore flown under the radar, which is unusual for something that has come out of a publicity magnet like Facebook, and some of the modest press it has gotten has been fairly negative. Holding a contest is probably a decent way to drum up some attention.

Facebook is drawing developer attention to its platform at a crucial time: first, it’s expanded its API to the Facebook Connect initiative; and second, it’s now competing for geek attention not only with rival social-networking platforms but also with Apple’s iPhone, the hot platform du jour.

Developers, start your engines: submissions are now open for the developer application contest that Facebook created for its FBFund grant program. Winning developers, who submit business plans for their prototype Facebook Platform applications, will receive between $25,000 and $250,000 in grant money. The company plans to give away $10 million total.

Monday saw the kickoff of the competition’s Round 1, in which 25 winning proposals announced on September 22 will each be awarded $25,000. The winners of that round will have the option to apply for Round 2, in which five final winners will receive $250,000 to fund the development of their Facebook applications. Winners will also have access to “mentorship” from Facebook as well as a boost in publicity and marketing resources.

The contest was originally detailed at this year’s F8 conference, at which the 10 original FBFund selectees were also unveiled.

Antivirus dominates PC application sales chart

Tuesday, August 24th, 2010

According to research group NPD, this month’s list of top 10 PC software applications contains three video games, one productivity tool, and six antivirus/security tools.

It’s amazing that Microsoft has created more of a market for applications that fix the problems Windows causes than it has for entertainment or business.

1. Spore
2. MS Office 2007 Home & Student
3. Warhammer Online: Age of Reckoning
4. Trend Micro AntiVirus 2008 Plus Anti-Spyware
5. Spy Sweeper
6. Norton 360 2.0 3User
7. The Sims 2/Apartment Life
8. Norton Antivirus 2008
9. Trend Micro Internet Security 2008 3User
10. Spy Sweeper with Antivirus

Vista price cuts show how much trouble Microsoft is in

Saturday, August 21st, 2010

Ever since Microsoft released Vista to the masses, most people have known just how bad the operating system is. Instead of offering the kind of functionality already found on Linux or Mac OS X and the stability that we had come to welcome in XP, Vista was nothing more than a beta release on day one, and very few improvements have been made to change that.

To make matters worse, most companies and individuals are more than happy to keep XP running, and even Apple has been able to capitalize somewhat on the issues people have had with Microsoft’s latest operating system.

Obviously realizing that there is some trouble afoot, Microsoft on Thursday announced price cuts on its most expensive versions of Vista and said those discounts will range from 20 percent to 48 percent. Ironically, those discounts are designed to coincide with the release of Vista Service Pack 1, which, according to Microsoft, will usher in a slew of security fixes and improvements that should make the Vista experience much better.

And while I applaud Microsoft for finally dropping the price on its ill-fated software, the price drop looks more like a PR move than something that will have an impact on consumers and, most importantly, shows just how much trouble this company is in with Windows.

First off, let’s not kid ourselves. This price drop will have no impact on Microsoft’s bottom line and is nothing more than a ploy to show that it’s trying to do all it can to attract customers. After all, how many people actually buy retail versions of Windows?

According to Goldman Sachs, approximately 5 percent of all Windows sales are executed through retail chains and the vast majority–80 percent–come from OEMs (original equipment manufacturers). Knowing this, what sort of impact does anyone actually think this will have?

If only 5 percent of Microsoft’s customers are affected by this price drop, why is this even news? Wouldn’t it make more sense from a business standpoint to drop the price to OEMs (which it hasn’t done), if all it really wants to do is sell more copies of Vista? If it did that, OEMs would finally feel a little bit better about the software and could pass some of that savings on to the consumer, who would then more readily choose the more expensive Vista over XP.

But alas, this price cut has nothing to do with revenue or any other excuse Microsoft can come up with. The Vista price cut is designed specifically to show people that this company is doing all it can to create a worthwhile product and is trying to make its customers happy.

Of course, what it fails to realize is that what customers really want is a robust operating system with the stability and functionality this one is missing. And although Microsoft would be quick to mention that it’s doing just that with SP1, and that the price cut makes its product more attractive, I would argue that the company is ignoring consumers and doing all it can to force a crappy product on you so it can turn a profit on its huge investment in Vista.

But in the end, I just don’t know if that will ever happen. As long as companies like Dell and Acer continue to have reservations about Vista and Microsoft tries to play the PR game instead of the “make this crap better” game, Windows is in trouble.

Vista is a damaged product that lacks many of the important elements a good operating system would boast. And although it may be a bit cheaper, it’s still not the OS XP was. Say what you will about Windows, but as it stands, XP is one of the best operating systems on the market today and Microsoft shouldn’t lose sight of that.

The future of Vista is bleak. Regardless of what Microsoft does to erect a facade that entices consumers to buy a dilapidated operating system, the strategy will eventually come back to haunt the company, and that will be bad for everyone.

Go back to work, Microsoft. We’re waiting for something better.

My Bono moment…in 3D

Saturday, August 21st, 2010

PARK CITY, Utah–Last night I saw U2 live in concert here at the local high school performing arts center…at least it felt that way.

Bono and I even had a moment–during “Sunday Bloody Sunday” he reached out his hand and almost touched me. He had to be singing to me, and not Robert Redford, Google’s founders, or the rest of the Hollywood glitterati in my company. Right?

It wasn’t actually a concert. Rather, I was attending a screening for the concert film U2 3D at the Sundance Film Festival. But same diff. It really felt like I was on the concert floor. Better yet, at times I felt like I was one of those waify teenage girls at concerts who gets hoisted onto someone’s shoulders for a bird’s-eye view.

That's me, getting used to my cool glasses before the screening gets under way.

(Credit: Michelle McPherson, festival-goer from Rippen, Calif.)

I don’t use this term lightly, but I really felt like I was witnessing something “revolutionary” in filmmaking. The 90-minute compilation of footage from the band’s Vertigo tour in South America was shot using a new generation of 3D technology provided by Burbank, Calif.-based 3ality, which co-director Catherine Owens said was initially conceived for sports footage. For the Sundance screening, it was projected in Dolby 3D Digital Cinema. (More to come on 3D tech following a related panel discussion later Sunday.)

What blew me away was the seamlessness and subtlety of the 3D tech, combined with the surround sound. You quickly forgot you were wearing those goofy glasses (in my case, over my own specs). It was hard to tell whether the applause and singing were coming from the film itself or from the Sundance audience members. When Bono asked the crowd to show him the light of their digital devices, the glow of cell phones from the festival audience blended right in with those of the concert audience.

Bono speaks to a star-studded crowd just before the screening of U2 3D at the Sundance Film Festival on Saturday night.

(Credit: Michelle Meyers/CNET News.com)

Never, in my five years of covering the festival, have I seen such a hot and hyped ticket. Only two screenings of the film were scheduled, both of them taking place Saturday night. One was at 9:45 p.m. and the other at midnight.

One of the first festival-goers to arrive on the scene in hopes of getting a wait list ticket to the first showing was Nick Buckmaster, a huge U2 fan from Sausalito, Calif., who had been waiting since 10 a.m. He made the trek to the festival to see U2 (he’s seen them perform 60 times) and also because he heard such amazing reviews of the film, which screened at the Cannes Film Festival among other places.

Despite their early arrival, festival staff members didn’t let Buckmaster and his fellow fans start lining up officially until 7:45 p.m. And first dibs for wait list tickets went to those who had been waiting in line unsuccessfully to see the prior star-studded Robert DeNiro film, What Just Happened. Buckmaster did get into the show, which he said “was really far better than I expected.”

He had worried a little that the 3D would be gimmicky, as it was, in his opinion, in some 1980s-era 3D films like Jaws. “This was more an enhancement of the experience,” he said, adding that he was also happy it featured all the band members, not just Bono.

Ticket scalping at Sundance is very uncool; however, rumor has it that tickets to last night’s show were going for up to $1,000. Kind of crazy for a film that opens in wide release next week both in IMAX and digital cinema.

A still from the film, U2 3D

(Credit: U2 3D)

The band’s presence, however, did make the screening extra special. Bono opened the show by touting the importance of Sundance and the special mood that exists despite the “celebrity clusterf***.”

“There is a lot of love and Irish whiskey in the air,” he told the crowd, adding that if Sundance were in Dublin, it would be called “Raindance.”

Of course, he had to sneak in his comments in between yells of “I love you, Bono.” (I promise, it wasn’t me.)

Owens, in her closing remarks after the Q&A, emphasized that her ability to put together U2 3D with no filmmaking background says much about the technology and its power as a new medium.

The cloud is not a computer

Saturday, August 21st, 2010

My hat goes off to Preston Monroe, the developer of iCopy, an online service that adds cut and paste functionality to the iPhone’s browser and e-mail apps. As you probably know, Apple’s handheld computer bizarrely omits this feature.

Cut-and-paste on the iPhone, via a Web service.

(Credit: iCopy video)

iCopy is a clever hack that lets you select text or a link from a Web page and paste it into another page, or an e-mail. It gets around the lack of iPhone-native copy and paste by sending selected text to a temporary online repository when you “copy,” and retrieving it when you “paste.” In operation, it’s a horrible kludge–it requires a lot of Web page switching and too many visits to the iCopy site to do a simple copy/paste operation. But the fact that Monroe figured out a way to make the Web a giant clipboard in the sky is pretty cool.
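The idea is simple enough to sketch. What follows is a minimal, hypothetical imitation of the approach, not Monroe's actual code; the /copy and /paste endpoints and the token scheme are invented for illustration. A tiny web service stores text under a token when you "copy" and hands it back when you "paste":

```python
# A minimal, hypothetical web-clipboard service in the spirit of iCopy.
# This is NOT Monroe's code; the /copy and /paste endpoints and the token
# scheme are invented for illustration. Text lives only in server memory.
from flask import Flask, request

app = Flask(__name__)
clipboard = {}  # token -> copied text

@app.route("/copy", methods=["POST"])
def copy_text():
    # "Copy": the client POSTs the selected text plus a token it chooses.
    token = request.form["token"]
    clipboard[token] = request.form["text"]
    return "stored"

@app.route("/paste/<token>")
def paste_text(token):
    # "Paste": the client fetches the text back using the same token.
    return clipboard.get(token, "")

if __name__ == "__main__":
    app.run(port=8080)
```

A real service would also need expiration and some defense against token collisions, but the mechanism (POST on copy, GET on paste) is the whole trick.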

iCopy illustrates that while the Web can be employed to do a lot of things we’ve formerly thought of as belonging solely in the domain of local computing, that doesn’t mean we always should.

I edit a blog about Web 2.0 apps. It’s my job to push the vision of Web-based products and cloud-based resources. But even I realize that local processing has a place. I find it curious that many people I talk to think Microsoft’s rumored Midori project, for instance, is a “cloud OS.” While there’s no question that an operating system written from the ground up today should use Internet resources in a more native fashion than most OSes do today, the change should be seen as one of degree, not replacement.

The Internet can be used to deliver apps and updates, for storage and backup, for social networking and person-to-person communications, and other functions. But for the moment and the near future, you need local processing to maintain speed and robustness of applications, and native graphics capability to present the interface. One of the reasons Web 2.0 apps can work well today is because today’s browsers have deep user interface and graphics capabilities, and because they run on powerful local PCs. Many popular Web apps–like Google Docs and Microsoft Live Search Maps–rely on capabilities that were simply not present in PCs only a few years ago.

That’s why I continue to refer to Web operating systems like G.ho.st as science fair projects. They’re really cool, and they provide glimpses of the evolution of personal computing. Much of what we do on a PC today can be done over the Web. But a lot cannot, at least not well. To deliver the best experience–the best user interface, reliability, collaboration, and so on–smart developers don’t force all their apps either onto the Web or the local PC. Today’s architectures make distributing applications among platforms easier than ever. They even make it possible for apps to adapt to their environment and redistribute themselves depending on circumstance (see Google Gears). The really interesting upcoming apps and operating systems will not just be hybrid (online/offline), but adaptive.
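In practice, "hybrid" often just means code that prefers the network but degrades gracefully to a local copy. Here is a minimal sketch of that pattern; the URL, cache directory, and fetch function are illustrative placeholders, and this is not how Google Gears itself works:

```python
# A minimal sketch of an online-first, offline-fallback fetch. The URL and
# cache path are placeholders; this illustrates the pattern, not Gears' API.
import os
import urllib.request

CACHE_DIR = "offline_cache"

def fetch(url, cache_name, timeout=5):
    """Return the document from the network if possible, else the cached copy."""
    cache_path = os.path.join(CACHE_DIR, cache_name)
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            data = resp.read()
        os.makedirs(CACHE_DIR, exist_ok=True)
        with open(cache_path, "wb") as f:
            f.write(data)                  # refresh the offline copy
        return data
    except OSError:                        # network down, DNS failure, timeout...
        with open(cache_path, "rb") as f:  # fall back to the last good copy
            return f.read()

# Example: html = fetch("https://example.com/doc.html", "doc.html")
```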

Meanwhile, if you’re interested in how copy and paste might work on the iPhone, check out Proximi’s Magicpad, a text editing app that offers cut and paste controls. Proximi has also published video proposing a user interface for general cut and paste on the iPhone. This is the work Apple should have done. Although for all we know, the company has done it already, but in secret.

PlayStation 3 gets firmware upgrade

Saturday, August 21st, 2010

Update: Sony has suspended the 2.40 upgrade, following reports that it has fouled up some PS3 systems (see Joystiq for more). While the two PS3s we have here at CNET were able to install the update with no adverse effects, it appears that some users were considerably less fortunate.

The 2.40 firmware update for the PlayStation 3 is now available. The free update, which Sony has been talking up for the past several days, adds a smattering of new features, including in-game access to the XMB (Cross Media Bar) home screen, custom soundtracks, a new trophy system, and a shortcut to Google searches.

The new features carry a host of caveats: the in-game XMB, customized soundtracks, and trophies aren’t supported on all games; in-game XMB features are fairly limited; and the trophy system (with a few exceptions) won’t be retroactive to already-accomplished goals.

Trophies, for instance, are better viewed as a feature that will become more useful as future games add support for it. That said, both the trophy system and the in-game XMB help the PS3 better compete with the achievements and the Xbox Live and Dashboard features offered by the Xbox 360.

PS3 users will also notice a handful of other simple but useful touches with this update, including an on-screen clock, a quick shutdown icon, and a shortcut to Google searches. And speaking of the PS3’s browser: while it’s not new for 2.40, it’s worth noting that the Web browser splash screen now includes shortcuts to YouTube, Flickr, and Facebook.

So what do you think? Does the 2.40 update add some worthwhile features to the PS3, or does the Xbox 360 or Nintendo Wii still have an edge? And what other features would you like to see come in future PS3 software upgrades?

PlayStation.com: Firmware 2.40 walk-through, part 1 (embedded above)
PlayStation.com: Firmware 2.40 walk-through, part 2
PlayStation.com: Firmware 2.40 FAQ

The Digital Home Video Hands-on with Silent Hill

Saturday, August 21st, 2010

Should you buy Silent Hill: Homecoming? I’ll tell you in my latest video.

Even better news: you can now subscribe to this show. Just add it right here!

And as always, drop me a line or follow me on Twitter!

Hybrid patrol vehicle called ‘safest of its kind’

Saturday, August 21st, 2010

(Credit: Rheinmetall)

Check out the schnoz on this one. Düsseldorf-based Rheinmetall advertises its Gefas (Geschütztes Fahrzeugsystem) as the “safest, most future-proof system of its kind anywhere.”

The conceptual model shown here was configured for convoy protection. Some options include a high-powered, electromagnetic, counter-IED system, an automated weapons station controlled from safely inside the vehicle, electro-optical sensor systems with downstream image processing for detecting and tracking moving targets, a 12-meter tactical radar, laser-optical sensors for detecting enemy optics, and an “instantaneously activated” smokescreen.

(Credit: Rheinmetall)

Gefas replaces standard steering and braking with a hybrid-electric Renk “drive by wire” system. Each Timoney double-wishbone axle has its own electric drive (comes in 4×4 to 8×8) which–along with the all-wheel-steering and run-flat, auto-inflating tires–gives you a better chance of gettin’ in or out of a fracas. Up to six passengers ride in a suspended safety cell attached to the roof, protecting them from landmines and booby traps.

But what really distinguishes the Gefas is its modular, building-block assembly. The modules, composed of axles, power pack, and a main building block, are held together by connectors that can break at predetermined points when damaged in battle. This allows the surviving modules to be reconnected to form another completely viable vehicle, according to Rheinmetall.