Thursday, September 6, 2007

What Is the Difference Between Data Deduplication, File Deduplication, and Data Compression?

Data deduplication is one of the hottest topics in storage. eWEEK IT expert W. Curtis Preston, vice president of data protection for GlassHouse Technologies, explains how it differs from other storage technologies.

Q: Can you explain the differences between compression, file deduplication and data deduplication?
A: All of these products fit into an overall market and technical concept, which is capacity optimization or data reduction. This refers to a broad group of products that seek to reduce the amount of data that has to be stored. Roughly speaking, you can rank these techniques by the amount of data reduction they yield. Compression might typically get you a 2-to-1 reduction. File deduplication, which is commonly known as content addressable storage or CAS, might yield a 3-to-1 or 4-to-1 reduction. But data deduplication—which is deduplication at the level of individual disk blocks or "chunks" rather than entire files—can often give you a 20-to-1 reduction or better, depending on the type of data. Remember, we're talking about the aggregate reduction in the total amount of data stored on your backup storage device, not necessarily the reduction in any particular file or block, which can vary considerably.

Q: Why is data deduplication so much more effective in reducing data than file deduplication?
A: Data deduplication examines all your data on the block level and eliminates redundant blocks. So obviously it will take care of entire files that are redundant, but unlike file deduplication it will also eliminate the redundant pieces that occur when many slightly different versions of the same file are created by users or by applications like Microsoft Exchange. If users have been e-mailing back and forth a PowerPoint file while making minor changes, you can end up storing 10 or 20 files whose content is 95 percent identical. Data deduplication will catch that.
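To make the block-level idea concrete, here is a minimal, hypothetical sketch in Python: files are split into fixed-size blocks, each block is fingerprinted with SHA-256, and a duplicate block is stored only once no matter how many files contain it. (Commercial products typically use variable-size, content-defined chunking rather than fixed blocks; the function and file names here are purely illustrative.)

```python
import hashlib

def dedupe_blocks(files: dict[str, bytes], block_size: int = 8) -> tuple[dict, int]:
    """Store each unique block once, keyed by its SHA-256 fingerprint."""
    store = {}      # fingerprint -> block bytes (the single stored copy)
    manifests = {}  # filename -> ordered list of fingerprints
    for name, data in files.items():
        fps = []
        for i in range(0, len(data), block_size):
            block = data[i:i + block_size]
            fp = hashlib.sha256(block).hexdigest()
            store.setdefault(fp, block)  # duplicate blocks are not stored again
            fps.append(fp)
        manifests[name] = fps
    stored = sum(len(b) for b in store.values())
    return manifests, stored

# Two "versions" of a file that differ only in the last few bytes,
# like the PowerPoint example above:
v1 = b"slide one...slide two...slide ten v1"
v2 = b"slide one...slide two...slide ten v2"
manifests, stored = dedupe_blocks({"deck_v1.ppt": v1, "deck_v2.ppt": v2})
raw = len(v1) + len(v2)
print(f"raw: {raw} bytes, stored: {stored} bytes")
```

Because the two versions share all but their final block, the store holds the common blocks once plus the two differing tails, which is where the aggregate reduction comes from.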

Q: When should you use data deduplication and when should you use file deduplication?
A: A very short answer would be that file deduplication is often used for backup solutions in so-called ROBO environments (remote office, branch office). Data deduplication can be used either in the data center itself, as a software function installed on the intelligent disk target, or on the backup client side in a ROBO environment.

Q: Who are some of the more commonly used data deduplication vendors?
A: There are plenty of vendors, because data deduplication is a very hot area these days, especially now that the VTL (virtual tape library) vendors are getting involved. There is Avamar (acquired by EMC), Symantec PureDisk, Asigra, Data Domain, Diligent Technologies, FalconStor, Sepaton and Quantum. Network Appliance has a product in beta.

Q: Who are some of the more commonly used file deduplication or content addressable storage vendors?
A: EMC has the Centera product line. Then there is Archivas (recently acquired by Hitachi Data Systems) and Caringo.

Q: What accounts for the difference in yield between compression and file deduplication?
A: With compression you are using some algorithm or other to reduce the size of a particular file by eliminating redundant bits. But if your users or applications have stored the same file multiple times, then no matter how good your compression method is your backup storage will end up with multiple copies of the compressed files. File deduplication goes a step further and eliminates these redundant copies, storing only one. So it gives you more reduction than just compression alone.

Q: Where does delta block optimization fit in?
A: This is another capacity optimization technique. It's used by incremental remote backup products like Connected (acquired by Iron Mountain) and EVault (acquired by Seagate). When you go to back up the most recent version of a file that has already been backed up, the software examines it and tries to figure out which blocks are new. Then it writes only those blocks to backup and ignores the blocks in the file that haven't changed. But again, this technique has the same shortcoming relative to file deduplication that compression does. If two users sitting in the same office have identical copies of the same file, delta block optimization will create two identical backups instead of storing just one, as file deduplication would.
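A hypothetical sketch of the delta block idea: compare per-block hashes of the current version against the previous backup and ship only the blocks that changed. (Real products such as those named above have their own formats; the names and the fixed 8-byte block size here are illustrative.)

```python
import hashlib

BLOCK = 8  # toy block size; real systems use kilobyte-scale blocks

def block_hashes(data: bytes) -> list[str]:
    """Hash each fixed-size block of the data."""
    return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

def delta_backup(previous: bytes, current: bytes) -> dict[int, bytes]:
    """Return only the blocks of `current` that differ from `previous`."""
    old = block_hashes(previous)
    changed = {}
    for i, h in enumerate(block_hashes(current)):
        if i >= len(old) or old[i] != h:
            changed[i] = current[i * BLOCK:(i + 1) * BLOCK]
    return changed

v1 = b"01234567" * 4                 # first full backup: 4 blocks
v2 = b"01234567" * 3 + b"0123456X"   # one byte changed in the last block
delta = delta_backup(v1, v2)
print(f"blocks written to backup: {sorted(delta)}")
```

Only the single modified block is written; the three unchanged blocks are skipped. But as the answer notes, if two users each back up identical files, each user's backup stream is deduplicated only against that user's own history, so the file is still stored twice.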

Are Open-Source Databases Ready for Production Applications?

Open source in the enterprise is growing steadily, but for what applications? eWEEK IT expert Kevin Closson, chief software architect for PolyServe-HP, answers the question.

Q: Are open-source databases like PostgreSQL, Ingres and MySQL becoming serious alternatives to Oracle for enterprise applications?

A: Yes, more and more, depending on the application.

Oracle became the leading database in the 1990s because it ran better on high-end SMP Unix servers. But in those days most applications were still just dumb terminals talking to the big Unix box, so the database software had to be very sophisticated to perform well.

But in modern multi-tier applications you have a lot of intelligence in the application server tier and even in the browser on the user's desktop. If the database server goes down, all is not lost, because you have persistence in the front-end and the middle tier. After a certain delay you will be able to reconnect and finish your transaction.

In that situation it may not make sense any more to spend $40,000 or even $60,000 per license for a database like Oracle 10g when MySQL or Postgres or Ingres could do the job. They might not be good enough for a 911 emergency call center.

But if your application is a Web store that sells widgets, they might be fine. Given the high MTBF [mean time between failure] on the servers and disks that are available now, you might only get one or two unplanned outages per year, and for many small or mid-size businesses, that is perfectly acceptable.

But today I see even high-end IT shops asking these questions, too.

IT Workers Second-Guess Career Choice

Analysis: In general, IT workers are feeling unsettled about the state of the IT workplace.

Though it has come a long way from the gloom-and-doom days of the dot-com bust, the state of the IT workplace isn't shiny and happy.

If you ask an IT pro what they think of their chosen career path, a surprising number might pause before giving you a litany of reasons that the technology workplace leaves them feeling unsettled.

They love what they do, but they're not sure IT is a great place to be doing it anymore. Even worse, they're not sure that they would encourage their own computer-inclined children to pursue the same line of work.

Fortunately, this isn't the case for everyone. Several reports point to the vigor of the IT job market, the overall health of the U.S. IT sector, the preponderance of bright students in the talent pipeline and soaring salaries in some subsectors.

Yet, good news can only go so far to undo the damage wrought by some cold, hard IT workplace facts.

No matter how many stories crop up in which CIOs confess "outsourcing didn't work for me," the trend toward the commoditizing of IT and development work, not to mention sending IT overseas to save money, shows little sign of letting up.

Will their jobs be next? IT workers worry every day. Technology company CEOs predicted that their use of offshore services would increase over the next several years, according to a 2007 CEO Survey released by Deloitte, a Swiss company, on May 1.

Nearly half (45 percent) of the respondents stated that they were currently offshoring, and 55 percent said they planned to in the coming years; nearly one-third expected to have 10 percent of their work force offshore in five years' time.

The good news on the offshoring front is a little more difficult to track down, but it mostly involves senators stepping forward to offer protection to victims of the global talent market and coming down on firms that abet companies that disregard protections for U.S. workers.

However, as long as outsourcing lays down onshore and nearshore roots, it's not likely that IT professionals will feel any extra job security.

Is there a shortage of IT professionals? Is there not? It depends on who you talk to. Yet if you speak to enough people, one message becomes clear: There is a shortage, but it's of workers with the most highly sought-after skill sets. Everyone else is having a harder time finding work.

In a way, there are two IT work forces: those whose salaries and opportunities climb by head-turning percentages each year, and those whose skill sets are left behind, and whose heads spin when they read yet another account of the "vibrant health" of IT.

In the first category, there is a shortage. Companies scramble to find IT workers with SAP skills or project management skills, and end up paying premium prices for them. In the latter category, those having trouble finding work are often not able to find the companies looking for them, which are often small and lack the resources of a big tech HR department. In both categories, the luck of the draw seems to reign supreme.

Any person not living under a rock has watched the writing appear on the wall: Housing prices are falling into a crisis zone; the credit market is collapsing; the Fed even cut the discount rate. It could only be a matter of time before this recession takes its toll on IT workers, a population still scarred from the dot-com bust.

A recession seems to be looming for the U.S. economy this fall, and no matter how many glowing reports about the IT job market are released, the slipping economy will inevitably have its way with IT professionals.

The view isn't much brighter from the top. A report released Aug. 30 by ExecuNet, an executive recruiting firm in Norwalk, Conn., documented a slide in confidence in the executive employment market since the spring. The confidence rate is down to 55 percent from 80 percent in April.

As user technology advances by leaps and bounds, logic stands that this would make the jobs of IT professionals easier. In reality, this is not the case: For IT pros, technology advances mean more security risks, more demands on their time and more to worry about.

Driven by the consumerization of technology, the effectiveness of centralized IT has slipped, argued an Aug. 6 report by the Yankee Group, as 50 percent of employees reported that their personal technology was more advanced than their workplace's technology.

The report reasoned that IT could either ban employee technology, creating an endless game of whack-a-mole, or manage both the technology and the rogue employee.

While analysts agreed that the latter had the potential to improve long-term internal customer satisfaction, there is little doubt that IT professionals know that the IT care co-op model is just more work for them.

A July 30 article in the Wall Street Journal on the topic of IT limitations of employee technology took more of a guerrilla warfare approach, providing readers with a how-to manual to make an end-run around the IT department, and painting IT pros as control freaks.

While IT professionals raged against the article, others took a more forward-looking approach.

This quieter minority reasoned that because users already know how to use Google to find ways around their technology regulations, smart IT departments should take steps to bridge the gap between what users want and what they know is safe.

If the dot-com bust was the first nail in the IT work force's coffin and offshore outsourcing the second, the decline in student enrollments in computer science programs and a dearth of qualified candidates may just be the third. Worse yet, many IT professionals admit that they don't feel comfortable ushering their own children down a career path so fraught with land mines.

Recruiters facing difficulties finding the right IT candidate for a job bemoan the fact that after the dot-com bust, parents told their kids not to go into technology and haven't changed their message since. Yet the issue runs deeper than parents disregarding signs that IT may be back and healthier than ever.

"The shine is off the apple," one told eWEEK. "Outsourcing… H-1Bs… the commoditization of the IT workforce. Other career paths seem a safer bet."

Microsoft fails to win global standard approval

BRUSSELS, Belgium (AP) -- Microsoft Corp. has failed in a first step to win enough support to make the data format behind its flagship Office software a global standard, the International Organization for Standardization said Tuesday.

This weekend's vote by national standards agencies from 104 nations did not provide the two-thirds majority needed to give Microsoft's format the ISO stamp of approval. But they will meet again in February to try to reach a consensus, and Microsoft could yet win them over.

ISO approval for Microsoft's Office Open XML would encourage governments and libraries to recognize the format for archiving documents, which in turn could help ensure that people using different technologies in the future could still open and read documents written today in Office Open XML.

Approval of its system as a standard would also help Microsoft tamp down competition from the OpenDocument Format, created by open source developers and pushed by such Microsoft rivals as IBM Corp.

Massachusetts state government stirred huge interest in the matter when it advocated saving official documents for long-term storage in the nonproprietary ODF format. That prompted Microsoft to seek recognition of Open XML by the global standards body.

The company has offered to license Office Open XML for free to anyone who wants to build products that access information stored in Office documents.

It claims the format is richer than ODF because, being based on XML computer language, it can store the layout of spreadsheets and legal documents created with Office 2007.

But Shane Coughlan of the Free Software Foundation Europe, a group of open source developers, questioned whether Office Open XML would truly live up to its name and be open to all.

Coughlan said it was unclear whether some of the code requires Microsoft's permission to be used. "It is important that everyone owns their data, that access does not depend on any one company," he said. "Any serious corporation or government should be dubious about using it if the legality is unclear."

Publishing an open standard means it will be available to everyone, a sort of Rosetta stone that makes sure the key documents of today -- whether they be legal texts, novels-in-progress or accounting spreadsheets -- don't become unreadable hieroglyphics to future generations.

Despite losing the initial round of voting with ISO, Microsoft was confident of future success, saying many of the ISO members that did not vote for the format said they would do so when certain criticisms have been addressed.

"This preliminary vote is a milestone for the widespread adoption of the Open XML formats around the world for the benefit of millions of customers," said Microsoft's general manager for interoperability, Tom Robertson. "We believe that the final tally in early 2008 will result in the ratification of Open XML as an ISO standard."

According to ISO, Microsoft had 53 percent of the votes in favor -- instead of the 66 percent it needed.

The ISO process is essentially a debate that tries to fix outstanding problems so a format can win sufficient support. But Coughlan said Microsoft's heavy lobbying for Office Open XML had shown that the ISO selection process needs to be reviewed to make sure one voice cannot shout louder than others.

Coughlan and others have alleged that Microsoft unduly influenced the industry committees that advise national standards bodies on ISO votes.

Race for ‘next big thing’ in Silicon Valley

Silicon Valley’s annual coming-out season for tech start-ups is about to turn into a stampede.

In the next few weeks, the wraps will be removed from some 150 new companies and products at a handful of events in California competing to identify the tech industry’s Next Big Thing.

The race to find the Valley’s hottest new idea reflects growing investor interest triggered by the high prices paid for recent internet start-ups such as YouTube, as well as the increasingly fierce Darwinian struggle among the newcomers to get noticed.

The large number of companies formed around hot trends such as web search, social networking and online video has added spice to the importance of the autumn events, according to entrepreneurs and venture capitalists.

“At this stage of the frothiness, it’s extremely difficult to get attention,” says Munjal Shah, founder of an image search engine.

“The capital cost of starting a business today is very low,” says Chris Shipley, producer of Demo, one of the first tech events. “We’re seeing a lot of ideas make it from the spare bedroom to a showcase or the marketplace very quickly.”

Mr Shah’s company was the sole start-up featured two years ago at a party thrown by Mike Arrington, whose widely read TechCrunch blog has made him the Valley’s latest kingmaker.

For his first formal conference this month, Mr Arrington has just doubled the number of companies presenting to 40 because, according to his website, there are “just too many strong start-ups”.

Other events that hope to unveil hot companies and products in the coming weeks include the Web 2.0 conference, the event that gave its name to the latest wave of online innovation, and Demo, which has expanded to two events a year.

The scramble for attention is another symptom of Silicon Valley’s latest start-up boom. The amount of venture capital being invested in the US is at its highest level since 2001 and it has led to a rash of “me-too” companies.

The flood of copycat companies is a sign of the over-heated phase of the investment cycle, according to observers.

However, for most of those that make it to the big showcase events, the attention from being in the spotlight is likely to be fleeting.

Being named “the coolest, hottest thing” can produce a “drug-induced traffic high” as users rush to try out the latest websites.

Once that initial surge of interest falls off, the hard work of building a lasting business really begins.

Japan to fight Google search dominance

Tokyo, alarmed by the global dominance of Google and other foreign internet services, is spearheading a project to try to seize the lead in new search technologies for electronic devices.

The push has been sparked by concerns in Japan that the country’s pre-eminence in consumer electronics has faded and value in the technology industry is moving away from hardware.

As South Korean and Taiwanese electronics companies churn out products nearly identical to those of the Japanese majors, there are fears in Tokyo that the country’s manufacturers are falling behind in innovation.

“The question is how Japanese companies like Sharp and Matsushita can be encouraged to provide services. They clearly have the know-how to build things,” says Toshihide Yahiro, director of the information service industry division at the ministry of trade. “The key to Japan’s competitiveness has been our core technology but we need to create a new value-added service that is personalised.”

The shift of focus away from hardware echoes attempts by some of the biggest personal technology companies to become stronger in software and services. In one of the most prominent examples, Nokia last week outlined plans for an online music store and other services.

Tokyo hopes to use Japan’s strength in developing devices, such as mobile phones and car navigation systems, to create proprietary search and information retrieval functions. But some question whether a state-led project is capable of overhauling Google.

The Japanese project comprises 10 partnerships, each tasked with a specific next-generation search function. For example, the government has matched NTT Data with Toyota InfoTechnology Center and Toyota Mapmaster to create an interactive, personalised car navigation system. Other partnerships involve NEC, Hitachi and Sony Computer Science Laboratories. The ministry of trade has allocated Y14bn-Y15bn (€89m-€95m) to the project.

“Seventy per cent of car navigation systems are made in Japan. There is scope for more personalization,” says Mr Yahiro. “There is a need for car navigation systems that are capable of searching for which bathrooms are equipped with baby-changing stations and other necessities.”

Some blame Japan’s copyright laws for holding back the development of web services. Services such as Google hold copies of other companies’ web pages on their servers. Because Japanese law forbids the duplication of copyrighted works without the rights holders’ permission, Yahoo Japan, Google Japan and other search engines offered in Japan operate from US-based servers.

The specific focus on search reflects the prominence that this service has achieved since the rise of Google, while also reflecting broader international concerns about US dominance of an important information business.

France and Germany launched a plan of their own to seed development of a “next generation” European search engine nearly two years ago, though Germany pulled out of the plan.


Computer Scientists Take the “Why” out of WiFi

Why isn’t my wireless working? Is yours? – Computer Scientists Respond

“People expect WiFi to work, but there is also a general understanding that it’s just kind of flakey,” said Stefan Savage, one of the UCSD computer science professors who led development of an automated, enterprise-scale WiFi troubleshooting system for UCSD’s computer science building. The system is described in a paper presented last week in Kyoto, Japan at ACM SIGCOMM, one of the world’s premier networking conferences.

“If you have a wireless problem in our building, our system automatically analyzes the behavior of your connection – each wireless protocol, each wired network service and the many interactions between them. In the end, we can say ‘it’s because of this that your wireless is slow or has stopped working’ – and we can tell you immediately,” said Savage.

For humans, diagnosing problems in the now ubiquitous 802.11-based wireless access networks requires a huge amount of data, expertise and time. In addition to the myriad complexities of the wired network, wireless networks face the additional challenges of shared spectrum, user mobility and authentication management. Finally, the interaction between wired and wireless networks is itself a source of many problems.

“Wireless networks are hooked on to the wired part of the Internet with a bunch of ‘Scotch tape and baling wire’ – protocols that really weren’t designed for WiFi,” explained Savage. “If one of these components has a glitch, you may not be able to use the Internet even though the network itself is working fine.”

There are so many moving pieces, so many things you cannot see. Within this soup, everything has to work just right. When something goes wrong, identifying which piece failed is tough and requires sifting through a lot of data. For example, someone using a microwave oven two rooms away may cause enough interference to disrupt your connection.

“Today, if you ask your network administrator why it takes minutes to connect to the network or why your WiFi connection is slow, they’re unlikely to know the answer,” explained Yu-Chung Cheng, a computer science Ph.D. student at UCSD and lead author on the paper. “Many problems are transient – they’re gone before you can even get an admin to look at them – and the number of possible reasons is huge,” explained Cheng, who recently defended his dissertation and will join Google this fall.

“Few organizations have the expertise, data or tools to decompose the underlying problems and interactions responsible for transient outages or performance degradations,” the authors write in their SIGCOMM paper.

The computer scientists from UCSD’s Jacobs School of Engineering presented a set of modeling techniques for automatically characterizing the source of such problems. In particular, they focus on data transfer delays unique to 802.11 networks – media access dynamics and mobility management latency.

The UCSD system runs 24 hours a day, constantly churning through the flood of data relevant to the wireless network and catching transient problems.

“We’ve created a virtual wireless expert who is always at work,” said Cheng.

Within the UCSD Computer Science building, all the wireless help-desk issues go through the new automated system, which has been running for about 9 months. The data collection has been going on for almost 2 years.

One of the big take-away lessons is that there is no one thing that affects wireless network performance. Instead, there are a lot of little things that interact and go wrong in ways you might not anticipate.

“I look at this as an engineering effort. In the future, I think that enterprise wireless networks will have sophisticated diagnostics and repair capabilities built in. How much these will draw from our work is hard to tell today. You never know the impact you are going to have when you do the work,” said Savage. “In the meantime, our system is the ultimate laboratory for testing new wireless gadgets and new approaches to building wireless systems. We just started looking at WiFi-based Voice-Over-IP (VOIP) phones. We learn something new every week.”


The State of the Desktop

The desktop computer market is facing a replenishment phase. Continued purchases of desktop PCs will be primarily made by existing desktop owners who need to upgrade their hardware, though more and more frequently, those consumers will seriously consider and in fact decide to spend their money on a fully powered laptop instead.

The laptop computer has been gaining on traditional desktop PCs for some time. Replacing one's desktop completely with a portable computer that has enough power to handle any common task is now a feasible option for consumers, and more are heading that direction. Laptops are siphoning off sales of desktops.

As more and more customers look to smaller computing solutions, desktops are undergoing a transition. With many models, manufacturers are turning away from big, clunky, energy-hogging boxes to smaller, thinner and more energy efficient solutions.

"Customers want the best use of their dollar for the PC they buy. We've addressed customer pain points with ergonomic designs. Companies are starting to question if they need all the bells and whistles. Customers want to buy the right system and forget it for three or four years and then replace it again," Tom Tobul, executive director of emerging products marketing at computer maker Lenovo, told TechNewsWorld.
Shrinking Stateside Market

About a decade ago, computer manufacturers had few new roads to explore, having sold PCs to nearly all of the 850 million people worldwide who wanted and could afford a machine, according to Stephen Dukker, chairman of NComputing and former CEO of eMachines. Citing a Gartner Research report, Dukker said there is a potential market of 755 million new computer users who can't afford desktops as they are priced today.

"The desktop market has not been growing until recently with the rise of developing countries," Dukker told TechNewsWorld.

A shrinking list of PC makers is voraciously pursuing these potential foreign buyers, thinning out the pool of available sales. However, some manufacturers are seeing signs of renewed interest in new desktop sales.

Learning Desktops
One of the biggest developing PC markets in the U.S. is education, according to Dukker. It represents 15 to 17 percent of the market, based on one computer for every five students. So there are lots of new users waiting for a product they can afford to buy, he noted.

By far, however, the best hope for tapping into a steady stream of new customers for desktop computers lies in foreign markets, other PC makers assert.

"We are seeing some resurgence of desktop opportunity in the U.S. market. [Compare that to a] 38 percent market share for desktops in China," countered Tobul. "We are also seeing very positive growth in India."
Desktop Greenery

Desktop manufacturers are facing a double-edged sword. While the desktop market slogs through a replenishment phase, companies are discovering that green PC initiatives -- efforts to make computers that require less electricity to run -- are increasing costs.

"Green PCs use less power and give more performance," Steve Bulling, senior product manager for professional desktops and displays for Gateway (NYSE: GTW), told TechNewsWorld.

For instance, new technologies are reducing power specifications for desktop PCs from 95 watts to 60 watts while still maintaining performance, he explained.

Related to the green PC influences are shifting attitudes over outfitting every computer user with top-of-the-line performance. There is a growing viewpoint in corporate management circles that few workers need maximum features and power to do their jobs, Bulling said.

"Consumers are starting to want smaller form factors and are becoming receptive to energy efficiency with the ability to put the box under the desk or behind other items on the desk surface," suggested Bulling.

Desktop Trends
There will always be users who need tall towers with maximum computing power, conceded Tobul. Still, Lenovo and other desktop manufacturers are developing new designs from the ground up, he said.

One focus is on acoustics, for example. This includes new boxes to better integrate with a worker's cubicle environment and meet new concerns over thermal and physical footprints and energy efficiency, he explained.

A growing acceptance of the Linux operating system over a forced upgrade to Windows Vista may also affect desktop trends, Bulling predicted. First, though, IT managers have to want to change, which is not yet happening in large numbers because many are generally content with existing applications.

"We are starting to see more use of Linux, but still a ways off. We may start to see more movement towards Linux on the desktop with reduced hardware needs once companies have to decide about upgrading to Vista, because XP support ends in a few years," he said.
New Products

"People have to consider trade-offs. Computer makers have to make desktops more sellable for the small- to medium-sized business market," Tobul said.

To that end, Lenovo is planning a major new desktop announcement regarding its ThinkCentre line in a few weeks. Tobul declined to discuss specifics other than saying the new line combines form factors and design features totally new to the ThinkCentre line.

Consumers may expect similar desktop line adjustments from other desktop PC makers over the next year.

Courting Innovation
Perhaps one of the more radical changes in the desktop computing concept is a solution developed by NComputing. It reinvents an older concept based on the thin client model. Its goal is to reduce the cost of buying multiple desktop computers.

"eMachines took the US$800 PC and sold it for $400. That was the last major expansion in the user base. People still pay about $700 today. The cost to build hasn't changed. Only the performance has changed," Dukker explained.

By comparison, today's PCs are supercomputers with 1,000 times more power than 10 years ago, he said. Now PC makers have to worry about a trend for all applications going to the Web.

"Nobody can make any money selling desktops. The margin is 6 percent. There is so little money that Emachines had to sell out to a competitor in a similar fashion to Compaq being absorbed by HP," Dukker said.

A Different Take
Dukker's desktop solution is a new twist on the thin client concept. However, the term "thin client" is not something he likes to use to describe his desktop alternative, he said.

The product uses two components. One is software for a terminal server running six or seven Linux distros and Windows. The other is the hardware device itself, consisting of a keyboard, a mouse, a flat display screen, speakers and a connection bridge to a standard base computer.

The X Series connects through a hardware connection up to 30 feet from a shared PC. The network card can support up to seven users. It does not require separate virtualization software because the chip that handles the process lives on the network card in the shared desktop PC.

It draws only 1.5 watts of power per user and can be powered from the base PC, much like a USB interface powers a USB device. It costs $11 to build, and Dukker sells the system for $70 per seat.

The L Series, which costs $35 to build and sells for $149 per seat, uses an Ethernet connection to a remote server. It can be used much like a standalone unit anywhere that has access to a wired broadband connection to the server. Up to 30 L Series terminals can run on seven $800 desktop computers to give each tethered user a full PC experience, Dukker said.

Cost Reduction
Power consumption for the L Series is 6 watts. Compared to the standard 200 watts that powers a desktop PC, at 8 cents per kilowatt hour, NComputing's solution can pay for itself in 18 months, according to Dukker.
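The article's per-seat economics can be sanity-checked with a little arithmetic. The sketch below uses only figures quoted above (terminal and host PC prices, wattages, the 8-cents-per-kWh rate); the 24/7 duty cycle is an assumption for illustration, which is why it yields a somewhat faster payback than Dukker's 18-month figure.

```python
# Back-of-the-envelope check of NComputing's L Series cost claims.
# All dollar and wattage figures come from the article; the 24/7
# duty cycle is an assumption added for this sketch.

HOST_PC_COST = 800.0        # shared desktop PC, per the article
L_SERIES_PRICE = 149.0      # L Series terminal, per seat
SEATS_PER_HOST = 30 / 7     # "up to 30 L Series terminals ... on seven $800 desktops"

# Hardware cost per seat: a share of the host PC plus one terminal.
cost_per_seat = HOST_PC_COST / SEATS_PER_HOST + L_SERIES_PRICE
print(f"Hardware cost per seat: ${cost_per_seat:.0f}")  # well under a ~$700 standalone PC

# Energy payback: a 6 W terminal replacing a 200 W desktop at $0.08/kWh.
WATTS_SAVED = 200 - 6
RATE_PER_KWH = 0.08
hourly_saving = WATTS_SAVED / 1000 * RATE_PER_KWH        # dollars saved per hour running
payback_hours = L_SERIES_PRICE / hourly_saving
payback_months = payback_hours / (24 * 365.25 / 12)      # assuming 24/7 operation
print(f"Energy payback: about {payback_months:.0f} months")
```

At round-the-clock operation the electricity savings alone repay the $149 terminal in roughly 13 months; at a more typical part-time duty cycle, the figure stretches toward the 18 months Dukker cites.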

Neither model has a hard drive for program or data storage. The X Series has a USB port on the network card that allows the user of the thin client to connect to an external storage device via a long cable. The L Series has a USB port at the thin client location.

In both cases the USB external device can be seen by the shared PC. The thin client can also see the internal hard drive of the shared PC.

MS Says Vista SP1 Is Almost Baked

Microsoft talked up Vista Service Pack 1 on a company blog Wednesday, promising a beta version in the next few weeks and a full version in Q1 of next year. The first service pack of a Microsoft operating system represents a milestone of sorts, as some businesses use the occasion as a signal to begin enterprise-wide deployment of the latest OS.

Microsoft (Nasdaq: MSFT) on Wednesday announced details and a timeframe for the release of Vista's first service pack, giving a heads-up to businesses waiting on information needed to plan possible deployment of the company's latest operating system.

"Now is the time and the time is now: Let's talk about Windows Service Pack 1 (SP1)," Nick White, Vista product manager, wrote on the Windows Vista Blog. "Much has been made of what will or will not be included in SP1 and when it will be released -- some accurate, some otherwise.

"I'm here to set the story straight: We're in the process of developing and deploying a beta version of SP1," he continued.

Missed Anniversary
The Redmond, Wash.-based company said it will release a beta version of the Vista service pack in "a few weeks." The final version of Vista SP1, Microsoft said, should be available during the first quarter of 2008.

The scheduled release of SP1 is late; it should have been available by November at the latest, Michael Silver, an analyst at Gartner (NYSE: IT), told TechNewsWorld.

"It would have been nice if it could have been available for the 11/30 enterprise availability anniversary," he stated.

Circle of Life
For PC users and IT managers, service packs have become regular milestones in the life of a Windows operating system. They are, according to White, part of Microsoft's "commitment to continuous improvement." For Windows XP, the company released two service packs, each of which contained significant alterations and enhancements to the software.

While consumers look forward to service packs to fix pesky glitches in their applications, desktop managers have been anxious for Microsoft to make a Windows Vista SP1 announcement because it directly affects their plans for deploying a new operating system, Benjamin Gray, a Forrester Research analyst, told TechNewsWorld.

"Regardless of whether this is justified or not, experience tells desktop managers to not deploy a new Windows operating system until SP1," he said.

Kicking Enterprises Into Action
This, he continued, has proven particularly true with Windows Vista not only because of the ever-expanding number of devices and applications needing certification but also due to the importance of said applications. Vista has had compatibility issues with applications, including virtual private networking (VPN) and antivirus applications.

"A few major software vendors have been somewhat sluggish to get their enterprise-class applications certified for Vista," he explained. "So what good is a new OS if, for example, the VPN connection doesn't work for remote access or the antivirus software can't run?"

Since Vista's launch, more than 70 major enterprise software makers have released applications compatible with Vista, including Adobe, Citrix, Oracle, Sun, HP, LANDesk and IBM, according to Microsoft. SP1 will also include support for technologies such as flash memory storage and consumer devices that will use the exFAT file system, support for Secure Digital (SD) Advanced Direct Memory Access (DMA), as well as support for Direct3D 10.1.

"SP1 is technically important because it will add support for emerging technologies, devices and standards and will address some early end-user feedback, but it's also symbolically significant for enterprises that have temporarily held off evaluating or deploying Vista," Gray pointed out. "After SP1 starts shipping by the major OEMs in Q1, I expect full-scale enterprise adoption to really kick off in mid-2008 -- in line with the natural PC refresh cycle of enterprises."

The Big WU
Technology and tools included in Vista, however, have slightly altered what Microsoft includes in its service packs and when they are deployed. With the addition of the new Windows Update (WU) tool, the software maker no longer relies exclusively on its SPs as the primary method to disseminate system fixes and enhancements.

Instead, Microsoft can use the WU online service to deliver system repairs and other improvements. The new system alleviates the waiting game many enterprises and home users had to endure while the company rolled its corrections into a single service pack. On Wednesday, Microsoft used WU to release two separate fixes to bump up Vista's reliability and performance.

"WU makes service packs less important because many important updates will be available long before the SP," Gartner's Silver noted. "But SPs are still important because they mark when the clock starts ticking for the end of support of the previous SP."

With mainstream deployment of the OS already underway, Silver said, adoption will pick up after the release of SP1. "Most organizations will still need a few months to test the SP and integrate it into the process."
Upping Quality

Among the improvements included in the upcoming SP, security, reliability and performance are top priorities.

Users frustrated with Vista's sometimes sluggish performance may see a boost after installing the SP. It's designed to improve the speed of extracting and copying files and to shorten the time the machine needs to become active again after "Hibernate" or "Resume." Domain-joined PCs will function more quickly when operating off the domain.

Surfing the Internet will also take on new speed: improvements to Windows Internet Explorer 7 for Vista reduce CPU (central processing unit) utilization and increase JavaScript parsing speed. Notebook battery life will also receive a bump as CPU utilization is reduced by not redrawing the screen as frequently on certain computers.

On the security front, SP1 will provide security software vendors with a more secure method to communicate with Windows Security Center. It will include application programming interfaces (APIs) by which third-party security and malicious software detection applications can work with kernel patch protection on x64 versions of Vista. The API, Microsoft said, will help ISVs develop software that extends the functionality of the Windows kernel on x64 computers without disabling or weakening the protection offered by kernel patch protection.

SP1 will also improve the security of running RemoteApp programs and desktops by allowing Remote Desktop Protocol files to be signed, thus allowing users to differentiate user experiences based on the publisher's identity.

In terms of reliability, Microsoft said it continues to look at some of the most common causes of crashes and hangs in order to give users a more consistent experience. In this category, many of the planned improvements will specifically address issues identified in feedback provided by the Windows Error Reporting tool.

The service pack, Microsoft said, will improve reliability and compatibility of the OS when used with cutting-edge graphics cards in several scenarios and configurations as well as bump up reliability in networking configuration scenarios and for systems upgraded from Windows XP to Vista. Laptops used in conjunction with external displays will be more reliable as well. Vista will provide more compatibility with printer drivers. SP1 will also increase reliability and performance as the OS moves from sleep to active use and vice versa, according to Microsoft.

IBM Takes Baby Steps Toward Atomic-Level Data Storage

Researchers at IBM said research into the magnetic properties of atoms has uncovered ways to possibly create electronics on an extremely small scale. Theoretically, such research could pave the way to dramatically minimizing storage space. The entire library of YouTube, for instance, could exist on a device the size of an iPod. That sort of technology, though, is decades away at best.

In two big breakthroughs on the smallest scale, IBM (NYSE: IBM) researchers say they've made discoveries about the nature of individual atoms and molecules that could someday lead to dramatic improvements in computing and other consumer technologies.

In Friday's edition of the journal Science, the nanotechnology researchers report they have discovered new ways to measure the magnetic properties of individual atoms and also how to use a single atomic molecule like a miniature electronic switch, such as those found in computer chips.

It could be decades -- if ever -- before devices based on the technology are available. However, the researchers say the discoveries could be key to developing new types of semiconductors and data storage devices.

All of YouTube on an iPod
Using atoms or clusters of atoms to store data magnetically, for instance, could allow a device the size of an iPod to store nearly 30,000 feature-length movies or the entire contents of the YouTube video-sharing Web site.

Using single molecules to replace integrated circuits on computer chips could help usher in another new age of ever-smaller computers, cell phones or other electronics.

"We are working at the ultimate edge of what is possible -- and we are now one step closer to figuring out how to store data at the atomic level," Gian-Luca Bona, manager of science and technology at the IBM Almaden Research Center in San Jose, Calif., said in a statement.

Using a special scanning tunneling microscope, IBM researchers were able to arrange individual iron atoms on a copper surface and measure their magnetic properties.

In doing so, they took a step toward being able to manipulate such atoms to represent a "1" or a "0" -- the basis for magnetic storage and digital data today.

Decades Out
IBM researchers in Switzerland, meanwhile, say they've determined how to use molecules formed from hydrogen atoms as logic switches, much like the "gates" used in today's integrated computer circuits to turn electric currents on and off.

Creating molecule-sized gates could let semiconductor engineers design ever-smaller microprocessors, the "brains" of any electronic device.

IBM researcher Cyrus Hirjibehedin said if IBM's new technology makes it out of the lab and into real-world production, it could be a decade or more before devices at such small scale and with such storage capacity hit the market.

Given the pace of existing technologies, though, it would take as much as five decades to reach similar levels, he said.

Microsoft Adds Linux Concession to Silverlight Release

Microsoft and Novell are working together to develop a Linux version of the Silverlight video plug-in, which is slated as a contender to Adobe's Flash. Redmond, which has historically been no friend of the open source movement, is increasingly sharing with the community, said Miguel de Icaza, lead Mono project developer and Novell vice president of developer platforms.

Microsoft (Nasdaq: MSFT) on Wednesday announced that its cross-browser plug-in, Silverlight, will eventually appear on Linux.

Silverlight, a challenger to Adobe's (Nasdaq: ADBE) Flash, means Microsoft will have to convince Web users that they really can "light up the Web" with Silverlight, as the ad copy states, and view high-quality video.

The announcement was paired with news that Microsoft has partnered with Novell (Nasdaq: NOVL) to produce a future release of a Silverlight Linux version, dubbed Moonlight, which should be available in about six months.
No Zero-Sum Game

The collaboration goes against the perceived "zero-sum game mentality" that has gone on between free software and Microsoft camps, Miguel de Icaza, lead Mono project developer and Novell vice president of developer platforms, told LinuxInsider.

Novell-supported Mono has been working on the Moonlight implementation.

"We should be working together," de Icaza said. He called the collaboration between Microsoft and Novell developers historic.

"I think historically, the birth of Linux and the creation of a completely free operating system was positioned by the press and the world as being a zero-sum game between Microsoft and the free software world," he said.

"Folks presented Linux as the operating system that would destroy Microsoft. The reality is that this is not a zero-sum game, that we can all grow the market together."

De Icaza pointed to Microsoft's steps toward collaborations. "They are starting to open source fairly important projects and starting to adopt open source collaboration practices, IronRuby, for example," he said, pointing to the .NET implementation of the Ruby programming language, which Microsoft is developing under the Microsoft Permissive License.

Desktop Linux

The "Moonlight" Linux announcement, though, is a first, he told LinuxInsider. "This was the first time that Microsoft would put a significant set of resources into making a product that would directly benefit Linux on the desktop."

Novell, nonetheless, sees challenges. Collaborations with Microsoft are easily taken in a positive light by corporate customers, said Justin Steinman, marketing director for open platform solutions at Novell.

"When we talk about making directories, virtualization, and documents communicate across Windows and Linux, they like that," he said. "At the same time, there are some in the community who don't like our agreement with Microsoft. So that's made our communications challenges with the community more difficult."

Novell's Goal
Novell's Linux goal and Microsoft collaborations worked hand in hand, Steinman said. "We are increasing Linux adoption and making it work better with Windows. Our overarching goal is interoperability. That's what customers want. And we're delivering that."

The more applications on Linux, the more attractive it becomes, Steinman stated.

"I would argue that the announcement around Moonlight is very much supportive of an open information world," he said.

Tuesday, September 4, 2007

Nokia Opens Doors to the Internet

In a paradigm shift, Nokia today introduced 'Ovi', its new Internet services brand name.

'Ovi' means 'door' in Finnish, and promises to enable consumers to easily access their existing social networks, communities, and content.

As part of 'Ovi', Nokia has announced the Nokia Music Store, N-Gage, and Nokia Maps. As such, 'Ovi' will act as the gateway to all of these Internet services.

It will be an open door to Web communities, enabling people to access their content, communities, and contacts from a single place, either directly from a compatible Nokia device or from a PC.

The first version of 'Ovi' is slated to go live in English during Q4 2007, with additional features and languages expected during the first half of 2008.

Part of 'Ovi', the Nokia Music Store will offer millions of tracks from major artists and independent labels, as well as local artists available only through Nokia.

The store can be accessed via a desktop computer, or directly from a compatible Nokia device such as the Nokia N81 or Nokia N95 8GB multimedia computer.

Consumers will be able to browse for new music, buy what they want, or add a song to their wish-list to download later. They will also be able to transfer purchased songs to their mobile devices. With the built-in music player, they will be able to create playlists on the go and manage their music collection.

The store will open across key European markets this fall, with additional stores in Europe and Asia opening over the coming months.

In Europe, individual tracks will cost 1 euro and albums will be upwards of 10 euros, with a monthly subscription for PC streaming of 10 euros.

Also part of 'Ovi', N-Gage will offer an easy way to find, try, and buy great quality games directly from compatible Nokia devices.

By selecting the N-Gage application on compatible Nokia devices, users will be able to preview available games, connect with friends, read reviews, or download free demos.

Games can be bought either with a credit card or by charging the purchase to the user's monthly phone bill. Electronic Arts and Gameloft, among others, are making some of their big brands available through N-Gage.

The application is expected to be available for download in November 2007.

Meanwhile, Nokia Maps, as the name suggests, will offer maps, city guides, and more directly to compatible mobile devices.

Commenting on Nokia's shift from mobile to Internet services, Olli-Pekka Kallasvuo, president and chief executive officer of Nokia, said that the industry is converging towards an Internet-driven experience, and that 'Ovi' represents Nokia's vision in combining the Internet with mobility.

Microsoft to acquire group chat provider Parlano

Microsoft Corp. said Wednesday it will buy a small Chicago-based technology company and add its group-chat software to a broad vision for integrated office communications programs.

Microsoft and Parlano did not disclose financial details of the deal but said they expect it to close in October. Executives for the two companies said they do not yet know if any of Parlano's staff will be laid off, or if the company will relocate to Redmond, Wash., where Microsoft is based.

Parlano, a seven-year-old company with 50 employees, makes the MindAlign group chat program, which lets users log on to a permanent chat room and send messages in real time, or search through an archive of the conversation later.

Microsoft has been working to integrate software for e-mail, instant messaging, video conferencing, office and mobile phones. Under this "unified communications" vision, PC users will be able to see whether the person they want to contact is available by IM but not by phone, for example, or move seamlessly from an e-mail to an IM conversation to a video chat.

One of the key pieces of software, Office Communications Server 2007, is slated to launch Oct. 16.

Call centre workers often frustrated and under stress

The phone rings off the hook, callers are often in foul moods and the pressure to end the call as quickly as possible is enormous. Work in a call centre can be stressful. Call centre agents can expect to field between 50 and 200 calls a day, many of them in the evening. Many people complain of health problems. On top of that comes time pressure, bad work environments and low salaries.

"But it's a field that's exploding," says Erich Welthe, head of the ver.di Service Workers Union in Neubrandenburg-Greifswald. In the German state of Mecklenburg-Western Pomerania alone, around 13,000 people are employed in call centres, selling products, taking complaints or conducting surveys.

"The working conditions are difficult," says Welthe. Gross pay is between 5 and 6.5 euros ($7-9) an hour. But with only 30 hours a week, most employees rarely manage to scrape together 800 euros a month.

"Most call centre agents have to round off their income with state support."

But it's not just the low pay that annoys Welthe. The hours of work are unacceptable, he bemoans.

Working conditions in call centres leave a lot to be desired, says labour economist Bernd Bienzeisler of the Fraunhofer Institute for Industrial Engineering in Stuttgart. Large rooms with 150 workers are standard.

"It's very stressful work. People can't be expected to be on the telephone for more than four hours consecutively."

Mental stress also runs high. "Call centres stipulate how long it should take to solve a customer's problem," explains Bienzeisler. Agents are often expected to solve a problem within 90 seconds. But customers can be frustrated and vent their anger on the agents.

Around 30,000 workers are employed in call centres across Germany, and the numbers are growing. Bienzeisler says no training is required to become an agent. Many of those employed in call centres are housewives or students trying to earn a little on the side.

"But now there's a trend towards qualified personnel, who have experience in the field," he says.
The high number of sick days call centre employees clock up indicates the enormous pressure exerted on them.

Last year, around 5.75 percent of workers with AOK, Germany's biggest health insurance company, called in sick, says Klaus Pelster, deputy director at the Institute for Advancement of Workplace Health in Cologne.

"That's a full percentage point higher than the average for the entire Rheinland region in Germany."

On average, every call centre employee is officially registered as sick twice a year - which Pelster attributes mostly to stress and time pressure.

Of those workers who called in sick, around 25 percent suffered from respiratory problems. "They included cold symptoms, but also lung infections," said Pelster, who suspects people are straining their voices.

"The second biggest problem was muscular pain. Two thirds of all diagnoses point to back problems."

Beate Beermann, scientific director at the Federal Institute for Occupational Safety and Health in Dortmund, advises against sitting at a computer and telephoning continuously.

"There should be a 5-10 minute break every hour." Companies should allow employees access to rooms for relaxation.

"After all, customers notice when the agents are stressed." Regular education and diction training also encourages workplace motivation.

Jump-Start Your Job Satisfaction

Ever have those periods at work when you find yourself singing the famous refrain from the Rolling Stones' hit "(I Can't Get No) Satisfaction"? When you feel like there's something missing in your work, answer these questions to spur you to take action against job funk.

What's Right About Your Job?
Figuring out what's good about your work situation can make it easier to identify the one or two things you're really missing in your job. Perhaps you work with colleagues and clients who stimulate you or you're comfortable with. Maybe your job gives you the chance to exercise the skills that you enjoy using the most. Whatever it is that gives you satisfaction and is important to you, write it down.

What's Missing?
Consider what's lacking in your job. Just because you are good at doing something doesn't necessarily mean you want to do it anymore. Maybe your work-life balance is out of whack. Or you might be stuck in the same old rut. Create a second list of things you don't like about your job.

Can You Get What You Want Where You Are?
Before you start looking for another job, figure out if you can get the things that are missing in your current job. Let's say you haven't learned anything new in several years. Is there an aspect of your company or industry that is growing that you'd like to learn more about? Are there specific skills that other people in your organization have that you can use or plan to learn? What about improving a weak skill, like public speaking, training, financial management, or marketing?

How Can You Make a Difference at Work?
If you really like what you do but feel like you need a change, get involved in a special project that focuses on improving your employer's business. Identify an area within your company that needs help, think through the details of how you can improve it, and put together a brief proposal to present to your boss.

You could also get involved in your industry's key association as a way to meet new people in your industry who might give you a new perspective on potential opportunities in your work.

Can You Get What You Need Elsewhere?
If you've taken a hard look at your current job and company and realize that you just can't create a situation or do the work that satisfies you, then it makes sense to start looking for opportunities in other companies or industries.

Who Can Help You To Find a New Job?
Now is the time to start networking with as many people as possible. You may not find a new opportunity as quickly as you'd like, but you are planting seeds. Seek out acquaintances who have experience in your desired field or who can introduce you to decision-makers who will value your skills and experience.

Finally, you'll want to build on the lists you created in response to the first two questions. The information you recorded will help you prepare key questions and answers for the next stage in your pursuit of satisfaction -- the job interview.

How to Ace Your Performance Review

In many workplaces, the end of the year brings not just the holiday party but also the dreaded annual ritual of performance reviews. Experts say preparation is the key to making yours productive rather than painful.

* Gather your evidence. List your accomplishments for the year -- and have documentation to back them up. "Very often managers getting ready to do reviews can't remember everything that the employee has done," says Ann J. Willson, a human resources consultant and owner of Human Resource Directions in Raleigh, N.C.

At many companies, employees are given a form to fill out before the review, listing their accomplishments and goals. Take the time to prepare this document carefully, says Diane Foster, principal of executive coaching and consulting firm Diane Foster & Associates in Alameda, Calif. When listing career goals, Foster advises focusing on "what you see as your next career step within this next year."

* Know what you want. Performance reviews aren't just a time for you to listen to your boss. "In every performance review, you are directly or indirectly coaching your boss," Foster says.

If you have a particular skill -- public speaking, for example -- that you'd like to improve, ask your boss for help. Perhaps you could take a class, or maybe your boss could coach you. "It's up to the employee to really kind of push on the boss for commitment on that," Foster says. After outlining your request, you can say, "I'd love to have that be one of our goals for next year on the performance review," Foster says.

* Face problems in advance. Perhaps you were part of a team whose project wasn't exactly a glowing success. Glenn Shepard, a management consultant and owner of Glenn Shepard Seminars, says it's best to bring up the issue yourself. Some situations are complicated: Perhaps you weren't able to complete your part on time because someone else missed a deadline for getting you crucial information, for example. If you broach the topic, you can explain the part you played, what you could not have changed and what you would do differently next time.

"If the manager brings up that marginal performance first, then the employee looks defensive," he says.

* Expect to hear criticism. It's part of managers' jobs to point out areas where their employees could improve. "Even if they're happy with you, they strive to find something to make better," says Debra Benton, executive coach and author of "Executive Charisma."

If your boss says you lack leadership, for example, ask your boss to describe a time when you didn't demonstrate leadership. Then ask for examples of situations that will come up in the upcoming year when you can practice and demonstrate leadership.

Finally, as long as the criticism is balanced with praise, be glad your boss has taken the time to tell you how to improve. "If they have nothing negative, I think that's a bad sign, because they don't care," Benton says.

Increase Your Efficiency at Work

The vague but important notion of "getting things done" is a key success factor for most employees. Those "things" may be complex processes or simple tasks. Nevertheless, they involve a continuous circle of thinking, planning, finding time to make it happen, and taking action, as outlined in the simple steps below.

Step 1: Think about getting things done.

This step starts with a positive mental attitude and approach. How often do you get stuck in a non-productive dwelling mode, or have a pity party happening in your head? Dwelling accomplishes nothing, and negative thinking slows or stops your actions.

Since positive thinking yields productivity, here's one way to get out of the negative or dwelling mode. Make an appointment (15 minutes or more) with yourself to think only about that particular negative or dwelling issue -- and nothing else. This system will free you to be more productive during the remainder of the day. Also, the concentrated focus during the "appointment" will help dilute the issue's negative power over you.

Step 2: Plan it on paper.

Keep the plan simple to increase the probability of accomplishment. Write it down, because writing increases personal commitment and establishes a record.

Categorize issues and items into:

  • Opportunities, which create growth.
  • Problems, whose solutions can either create new opportunities or simply take a load off your mind.
  • Dust, those necessary maintenance issues such as filing, paying bills, or expense reports. Dust issues just keep coming back no matter what you do.

Make a plan that:

  • Focuses on the opportunities.
  • Solves the problems.
  • Creates efficient systems for the dust.
  • Is ranked and prioritized by importance to you (or your boss).
  • Includes a timeline with a sense of urgency to help prevent procrastination.
  • Has a measuring tool for your results.

Step 3: Find time to get it done.

Everyone has the same amount of time each week -- 168 hours -- yet some are able to accomplish a great deal more than others.

To add hours to your day:

  • Get up earlier, stay up later, or both.
  • Focus on the priorities.
  • Learn to say "yes" to those actions which help you reach your goals and "no" to those that don't -- unless the boss says otherwise.
  • Use self-imposed deadlines to increase your speed.
  • Be organized so you can find things within three minutes. That's the rule.
  • Do the hardest stuff when you're at your best. Work with your body's rhythm: morning people work best in the morning, while night people work best in the afternoon or evening.

Step 4: Take action!

Every action and activity should make a positive contribution to your plan. Having trouble getting started? Set a time to start, and make sure you start the action -- even if you are not in the mood or don't want to. More often than not, your motivation will catch up with your actions.

Having trouble keeping motivated? One of the strongest motivators is checking off things that are "done" and seeing results. Record the completion date to help measure your speed of progress. You are now getting things done.

Are you a workaholic?

In a nation of overachievers, hard work is a virtue. If you work hard, you'll achieve your goals. If you work even harder, you'll achieve even more. Right?

Perhaps not. There are, in fact, several downsides to working too hard. Being the office workaholic can cost you coveted promotions, hurt your home life, and even turn friends into enemies. Evaluate yourself with the following five questions.

1. Are you busy ... or disorganized?

Are you constantly staying late and coming in early yet producing the same output as others? If so, your boss may come to view you as inefficient and possibly disorganized. Dave Cheng, an executive coach with Athena Coaching, says, "There are some people, type A's, who get a lot of satisfaction from doing lots of work, but the quality isn't necessarily superior."

Focus on getting your work done in a reasonable time frame. If you have perfectionism or time-management issues, ask your supervisor to help you prioritize things and learn when to let go of a task. Cheng says, "Just because you're working longer doesn't mean you're working better."

2. Are you delegating ... or hoarding?

If you have any aspirations at all to move into management, you must learn to delegate work. Again, tasks need to be completed in a timely fashion; if you're having trouble finishing a project, you must delegate to other team members, even if you happen to relish the task you're giving away.

Cheng, who has more than 12 years of experience in corporate human resources, reveals, "Some workers feel like if they do everything and they're the only one who knows how to do it, they're making themselves irreplaceable. However, sharing information and teaching others around you is a valued skill as far as management is concerned."

Focus on completion and quality and be generous enough to let a colleague learn and shine. If you lack sufficient support, ask your boss about expanding your group.

3. Are you hungry ... or is your plate full?

Once you've solidified your reputation as the office workaholic, you may find that when your dream project comes through the door, you aren't asked to work on it. Why? Your boss probably thinks you don't have the bandwidth to take on anything else. Always keep a bit of room in your schedule to sink your teeth into new challenges and opportunities.

Cheng reminds professionals, "Your ability to say no to certain things gives you the freedom to say yes to others."

4. Do you have friends ... or 'frenemies'?

Your workaholic ways are likely alienating once-valued associates. Above and beyond the obvious grumblings of "You're making the rest of us look bad," your colleagues may dread collaborating on a project with you.

Lose the overly methodical approach, don't expect colleagues to come in early or stay late for meetings, and focus on outcomes rather than rigid process.

5. Do you work to live ... or live to work?

The best workers are well-rounded professionals with full lives, in and out of the office. Each year, new studies abound about the importance of vacations, hobbies, and enjoying your leisure time. But are you listening?

Your friends and family will be in your life a lot longer than you'll hold most jobs. Also, pursuing leisure activities you're passionate about can lead to a second career.

Cheng concludes, "Work-life balance is a choice. If you reflexively say yes to taking on extra work, you may live to regret it."