In-Depth
Unix: The 64-Bit Gold Standard
Many say it will be years before 64-bit Windows becomes a serious challenger.
Microsoft may be the behemoth of the software industry, dominating lucrative
markets like desktop operating systems, productivity applications and application
development tools, but there is one area where its influence is minuscule, not
monstrous: it still lags behind in high-performance computing.
"For compute-intensive applications, medium and large companies still
turn to Oracle running on a Unix server rather than SQL Server running on a
PC server," says John Enck, a vice president at Gartner Inc.
An important reason for the continued performance delta between Unix and Windows
is the former's superior support for 64-bit processing. In the Unix market,
the migration to 64-bit computing has become routine. On the other hand, Windows
still finds itself in a relatively embryonic stage of 64-bit computing. At the
turn of the millennium, Microsoft made significant investments in this area,
but they resulted in little to no progress. There are a handful of reasons for
that.
"Microsoft has been trying to get software vendors to move to 64-bit computing,
but most just haven't seen a compelling reason to do that," says Joe Clabby,
president of market research firm Clabby Analytics.
Despite its lack of progress, Microsoft continues to throw research and development
dollars at the high end of the computing market. SQL Server has offered 64-bit
processing for a few years, Vista ships in both 32-bit and 64-bit versions, Exchange
Server 2007 and other new server products were built to run on 64-bit microprocessors,
and the company is requiring that third-party vendors deliver 64-bit versions
of their products to earn its certification going forward.
While these steps should help Microsoft present a stronger case to Fortune
500 companies, observers expect many more years will pass before they rely on
Windows for complex, back-end processing. "It took close to a decade for
Microsoft to move Windows from 16-bit to 32-bit processing, and it looks like
that will also be the case with its migration from 32-bit to 64-bit processing,"
says Gartner's Enck.
Performance Matters
In the heart of the data center, where high-performance applications reside,
performance is king. And 64-bit processing flat out delivers more than 32-bit.
The difference centers on how the system addresses data. A 32-bit system can
place a maximum of 4GB of data in a computer's internal memory. Keeping data
in internal memory, as opposed to reading it from disk, improves performance
because disk input/output takes significantly longer than working directly with
information in memory.
A 64-bit system can work with up to 16TB of internal memory. Consequently,
64-bit systems can address far more memory and process more data per clock cycle,
which greatly improves the performance of complex applications.
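The 4GB ceiling falls directly out of pointer width: 32 bits can distinguish only about 4 billion byte addresses, while 64 bits raises the theoretical ceiling to 2^64 bytes, of which operating systems expose a practical slice. A minimal C sketch (illustrative only, not from any vendor's code) prints the limit for whatever build it's compiled as:

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* A pointer must be able to name every byte of addressable
           memory: 4-byte pointers cap the space at 4GB, while 8-byte
           pointers raise the theoretical ceiling to 2^64 bytes. */
        printf("pointer size: %u bytes\n", (unsigned)sizeof(void *));
        printf("highest addressable byte: %llu\n",
               (unsigned long long)UINTPTR_MAX);
        return 0;
    }

Compiled as a 32-bit program, the second line prints 4,294,967,295 (the 4GB boundary); compiled as a 64-bit program, it prints a figure roughly four billion times larger.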
In 2003, Microsoft released its Windows Server 2003 Datacenter Edition. Executives
boldly discussed burrowing their way into the back end of the data center. "I
did expect faster adoption of 64-bit computing on Windows," notes Aaron
Foint, Windows systems administrator at Worcester Polytechnic Institute. "Right
now, there are just not a lot of 64-bit applications available."
One necessary building block has fallen into place. Many servers (estimates
range as high as 90 percent of all servers sold since 2006) can indeed run 64-bit
applications, even though most now work with 32-bit operating systems, says
Jason Hermitage, senior product manager at Microsoft.
The crooked path of Microsoft's 64-bit server strategy has been a problem,
though. Initially, the company crafted Windows XP to run on Intel Corp.'s Itanium
microprocessor line as its primary 64-bit platform. That may not have been the
best choice.
"Application developers were unfamiliar with the Itanium processor,"
notes Brian Corcoran, manager of Windows host development at SAS Institute Inc.'s
JMP division.
Compounding that drawback was application compatibility: the first few 64-bit
versions of Windows didn't seamlessly run native 32-bit and 64-bit Windows
applications side by side. Because of this, Microsoft has been moving away from
the Itanium architecture, which has its roots in the Unix market, and has instead
focused on x64 microprocessors, which extend the familiar PC architecture and
run existing 32-bit code natively.
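Today's x64 editions of Windows bridge the two worlds with the WOW64 layer, which runs 32-bit applications inside a 64-bit operating system. As a rough illustration of how a 32-bit program can tell it's running on 64-bit Windows, the C sketch below uses the standard Win32 pattern of resolving IsWow64Process at run time, since older versions of kernel32 don't export it:

    #include <windows.h>
    #include <stdio.h>

    /* IsWow64Process isn't present in older kernel32 versions, so the
       conventional approach is to look it up at run time instead of
       linking against it directly. */
    typedef BOOL (WINAPI *IsWow64Process_t)(HANDLE, PBOOL);

    int main(void)
    {
        BOOL isWow64 = FALSE;
        IsWow64Process_t fn = (IsWow64Process_t)GetProcAddress(
            GetModuleHandleA("kernel32"), "IsWow64Process");

        if (fn != NULL && fn(GetCurrentProcess(), &isWow64) && isWow64)
            puts("32-bit process running under WOW64 on 64-bit Windows");
        else
            puts("native process (or a system without WOW64)");
        return 0;
    }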
When Will 64-Bit Computing Arrive at the Desktop?
Like throwing a rock into the middle of a lake, moving to 64-bit computing
starts at the heart of the data center and gradually ripples
out to the edges of a company's network. The first ripple
is evident. Hardware vendors have been delivering 64-bit servers
for a few years, and a select number of applications now take
advantage of that extra processing power.
Now it's clear that desktop hardware manufacturers are also
getting ready for 64-bit processing. "Many desktop systems
already come with 1GB of internal memory," notes Nathan
Brookwood, principal analyst at market research firm Insight
64.
It now costs a company only a few hundred dollars
to outfit a PC with 4GB of memory, 32-bit processing's
upper threshold. Intense competition is expected
to push memory prices down and the amount of memory on these
systems up, so in the next 12 to 18 months a growing
number of desktop systems will be able to support 64-bit
computing.
Microsoft laid the foundation for its movement to 64-bit
computing at the desktop with Windows Vista, which supports
both 32-bit and 64-bit processing. While the operating system
is 64-bit-ready, few applications require that much memory
or processing power. High-end imaging, complex multimedia
and financial analysis are the application categories expected
to lead the migration.
A broad migration has yet to appear on the horizon, though. Servers
can skirt limitations, like a lack of device drivers and infrastructure
software, because they often operate in a closed environment,
moving information from internal to external storage. Desktop
computers need all the 64-bit accoutrements to be in place
before they make the switch. So even though the 64-bit-processing
rock has been dropped in the data center pond, its ripples
are still a long way from reaching the desktop. -P.K.
Missing Pieces
Yet another hurdle is that the entire Windows ecosystem (software, peripherals
and device drivers) needs to be rebuilt to take full advantage of 64-bit processors.
For instance, a 32-bit DLL can't address memory space larger than 4GB, which
a 64-bit processor does easily. Currently, 64-bit apparel for the Windows world
is more of a fig leaf than a full wardrobe.
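The mismatch is enforced at load time: a 64-bit process simply refuses to load a 32-bit DLL, and vice versa, which is why drivers, plug-ins and infrastructure components all have to be rebuilt together. A hedged C sketch of the failure mode, where legacy32.dll stands in for any hypothetical 32-bit library:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* Windows won't mix pointer widths in one address space, so
           loading a 32-bit DLL into a 64-bit process fails outright. */
        HMODULE h = LoadLibraryA("legacy32.dll"); /* hypothetical 32-bit DLL */

        if (h == NULL && GetLastError() == ERROR_BAD_EXE_FORMAT)
            printf("legacy32.dll is the wrong architecture for this process\n");
        else if (h != NULL)
            FreeLibrary(h);
        return 0;
    }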
Device drivers for hardware peripherals, like scanners and printers, are hard
to find. Few application-development tools have been rewritten to support 64-bit
processing. Application infrastructure software, like vital anti-virus software,
is also missing.
Because application development is such a chore, only companies that really
need the extra processing power have taken on the challenge. The first wave
of applications has included large database-management systems, decision-support
and business-intelligence systems, medical applications like drug discovery
and medical imaging, computer-aided design and computer-aided engineering, enterprise
resource planning (ERP), customer relationship management (CRM) and supply chain
management (SCM), video production and gaming-software design.
Cakewalk, which develops desktop music and sound software, is a true pioneer.
It moved to 64-bit Windows in 2005. Because its software manipulates multimedia
files, the extra processing power was desirable. Its migration did present a
few challenges, however.
"In theory, moving to 64-bit computing should have been simple,"
says Noel Borthwick, chief technology officer at Cakewalk. "In reality,
we ran into a few unexpected gotchas." Cakewalk found that many development
tools still assumed 32-bit, rather than 64-bit, pointers and data types. Something
as simple as using a pointer to tell an application where to locate data became
a cumbersome programming task.
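The gotcha Borthwick describes typically looks something like the hedged C sketch below: 32-bit-era code often stashed pointers in 32-bit integers, a habit that round-trips safely on 32-bit Windows but silently truncates addresses under the LLP64 model that 64-bit Windows uses (where long stays 32 bits while pointers grow to 64):

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        int value = 42;
        int *p = &value;

        /* 32-bit-era habit: storing a pointer in a 32-bit integer.
           Under LLP64 this cast silently drops the upper half of
           the address. */
        unsigned long lossy = (unsigned long)(uintptr_t)p;
        int *back = (int *)(uintptr_t)lossy; /* may no longer equal p */

        /* The portable fix: an integer type sized to hold a pointer. */
        uintptr_t safe = (uintptr_t)p;

        printf("original %p, lossy round-trip %p, safe round-trip %p\n",
               (void *)p, (void *)back, (void *)(int *)safe);
        return 0;
    }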
Database management systems (DBMSes) are an area where 64-bit computing is
taking root. If a company can place an entire database in memory and process
a query without having to read it from a disk, then it can provide significantly
faster results.
In 2006, Gainesville State College (GSC) in Gainesville, Ga., which has 7,500
students and 750 faculty and staff, decided to upgrade to the 64-bit version
of SQL Server. After sorting through some pesky problems, like getting its 32-bit
applications and 64-bit applications to work harmoniously on the server, the
college found that the 64-bit technology delivered a significant performance
boost, according to Brandon Haag, executive director of IT.
GSC, which relies solely on Microsoft software, has been testing Exchange Server
2007 and SharePoint Server 2007. The goal is to have them fully operational
by the end of the year. Deployments like this reflect several steps that
Microsoft is taking to prod its customer base and third-party supporters to
move to 64-bit computing. Starting in 2008, independent software vendors will
need to deliver 64-bit versions of their apps in order to earn Microsoft's certification.
Skeptics Reign Supreme
Even with those moves, many remain skeptical of using Windows to support complex
applications. Although Windows now has 64-bit capabilities, it lacks other
needed features. "Reliability is a key function for high-performance applications.
Users don't want their systems going down," says Nathan Brookwood, principal
analyst for market research firm Insight 64. Unix systems are more resilient
than PC servers because they support features like hot failovers, where transactions
are completed even during an outage.
Personnel requirements are another obstacle. While there are oodles of Microsoft-certified
engineers sitting in IT department cubicles, relatively few of them actually
understand how to deploy and support complex, high-end applications.
Consequently, Microsoft professionals will need knowledge transfers from
more experienced Unix systems administrators. It's unclear how much help these
individuals may be, however. In some cases, they may push their enterprises
toward Linux alternatives and away from Windows.
Inertia appears to be yet another force working against Microsoft. "Large
companies are extremely cautious with their key applications," explains
Clabby Analytics' Clabby. "They won't move to a new computing platform
unless something is tried and true and offers them compelling economic advantages."
To date, Windows simply hasn't given them a good reason to make that change.
While Microsoft has dominated many other markets, the back end of the data center
is one area where the company is now, and will remain for at least a few more
years, persona non grata.
About the Author
Paul Korzeniowski is a freelance writer based in Sudbury, Mass. He has been writing about networking issues for two decades, and his work has appeared in Business 2.0, Entrepreneur, Investor's Business Daily, Newsweek and InformationWeek.