Monday, 20 February 2006 00:12
Good question. And here is the answer in a single sentence: we have everything, and still we have nothing. Of course, this sounds cynical and heavily biased; the truth lies somewhere in the middle. It is probably best to take a look around and analyse the situation in detail.
64-bit systems have been around for more than a decade now, but we have only just begun to take a closer look at them. One possible reason: the technology has only recently become widely available. Those who are desperate, or simply not paying attention, can still build 32-bit computers today; but if we make the right choices, we can have a 64-bit system for the same money. It often happens that we end up with a 64-bit computer even though that was not a priority, and this is exactly the point. 32-bit parts (motherboards, CPUs) are losing ground even in the lower market segments, and if this trend continues, it will be only a matter of months before 64-bit systems dominate the market completely. Hardware alone is not enough, however: suitable software is also needed, and that is the bottleneck standing in the way of widespread adoption today. Before we take a more detailed look, here is a brief history lesson:
1991: MIPS Technologies produces the first 64-bit CPU, the R4000. R4000 CPUs are used in SGI graphics workstations running the 64-bit version of the IRIX operating system.
1992: Digital Equipment Corporation introduces the DEC Alpha architecture.
1997: IBM releases its 64-bit PowerPC processors, the RS64.
1998: IBM releases its 64-bit PowerPC/POWER processors, the POWER3.
1999: Intel releases the instruction set for the IA-64 architecture. AMD discloses its 64-bit plans named x86-64.
2001: Intel ships its Itanium processors, targeting mainly high-end servers.
2002: Intel introduces the Itanium 2 systems.
2003: AMD introduces its 64-bit Opteron and Athlon 64 CPU product lines. Apple ships its new computers with 64-bit PowerPC CPUs and the new Mac OS X operating system. Several 64-bit Linux distributions support the x86-64 platform. Microsoft announces that it will also support the new 64-bit AMD processors at the Windows level.
2004: Intel develops 64-bit CPUs supporting the EM64T instruction set. The EM64T instructions are supported by the new, updated versions of Xeon and Pentium 4 processors.
30 April, 2005: Microsoft releases the Windows XP x64 Edition operating system.
May 2005: AMD announces its dual-core CPU family called Athlon 64 X2. The new chip consists of 233.2 million transistors.
July 2005: IBM announces its dual-core 64-bit PowerPC 970MP processor.
64-bit systems have existed for 15 years now, but initially they were used only in high-end server and workstation environments. The turning point came when Microsoft announced that it would support 64-bit systems at the Windows level as well. The pace has accelerated ever since: today the competition between Intel and AMD is, in essence, a competition between 64-bit processors. A couple of years ago we already had the means to build a 64-bit system at home using the Linux operating system, but Linux alone could not accelerate the spread of the architecture.
It is important to note that the ideas in this article are based on facts, but we should not forget the business interests and marketing aspects of the IT industry either. These also play a significant role in the history of the 64-bit architecture; the parts cannot be separated from the whole, nor can the borders between them be drawn exactly. Information technology is at once a science, a set of technical opportunities and an everyday reality with a peculiar sub-culture, all of which travel hand in hand along the unforeseeable paths of business and marketing. I believe the chicken-or-egg problem of the 64-bit world was solved when Microsoft announced, and then released, the 64-bit version of Windows. Or perhaps that is not so important after all; what matters is that the 64-bit system is finally here, at our fingertips.
It is very easy to obtain a computer with 64-bit hardware, but that alone does not make it a 64-bit system: proper software is also needed. There is a good chance that the 64-bit version of Windows XP Professional can be installed on a machine whose motherboard supports 64-bit CPUs and which contains no exotic hardware. Chipset manufacturers release 64-bit drivers for their chipsets as a matter of course, and ATI, NVIDIA and Matrox all develop 64-bit drivers for their cards (even for older models), so there is a fair chance that older cards can also be used. You will also need a network card; if one is integrated on the motherboard, there is no problem, and if not, an x64-compatible network controller can be bought at a very favourable price. So, as you can see, building the system is the easy part.
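As a present-day illustration of the hardware-versus-software distinction above (using tooling that obviously did not exist in 2006), a short Python script can report whether the running process itself is 32-bit or 64-bit and what architecture the underlying hardware reports. This is exactly how you would spot 32-bit software running on 64-bit hardware:

```python
import platform
import struct

# Pointer size of the running process: 8 bytes in a 64-bit build, 4 in 32-bit.
pointer_bits = struct.calcsize("P") * 8

# Machine architecture as reported by the OS (e.g. 'x86_64' or 'AMD64').
machine = platform.machine()

print(f"Process word size:     {pointer_bits}-bit")
print(f"Hardware architecture: {machine}")

# A 32-bit process can run happily on 64-bit hardware, which is precisely
# the "64-bit hardware but not a 64-bit system" situation described above.
if pointer_bits == 32 and machine.lower() in ("x86_64", "amd64"):
    print("This is 32-bit software running on 64-bit hardware.")
```

The key point the snippet makes is that "64-bit" is a property of the software build as much as of the CPU: the same hardware can host either kind of process.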
However, it is not certain that all peripherals will work. Although the XP operating system supports quite a number of hardware devices out of the box, it is worth checking in advance, because manufacturers do not always release 64-bit drivers for devices that are not supported by default. The situation is similar to the release of Windows 2000 about five years ago: it was nice and stable, but there were still cases where it was not the right choice because it did not support some key devices. That interim period was comparatively short for Windows 2000 (Windows NT 5.0), and with the release of XP (Windows NT 5.1) the problem was more or less solved.
The security aspect, however, is a completely different matter. Microsoft continuously develops and publishes security fixes, but there are huge gaps in the field of 64-bit firewalls, anti-virus and anti-spyware software. Currently there are two native 64-bit anti-virus programs: CA eTrust Antivirus r7.1 x64 and its updated version, eTrust Antivirus r8. Anti-virus programs written in hybrid code are available from three more manufacturers: Eset NOD32 Anti-Virus, avast! Antivirus and Symantec AntiVirus 10.0 Corporate Edition Client for 64-bit; in these, only the deepest (core) level of the code is 64-bit. The firewall segment is even worse off, offering only Tiny Firewall 64 and a beta version of ZoneAlarm 64. The worst situation is in anti-spyware software, where the single 64-bit solution is the recently released Windows Defender Beta 2 build 1051 x64.
Of course, you can experiment with 32-bit software, but do not forget that the programs discussed here are system utilities which integrate with the operating system at a low level. They may show some signs of life, and may even work under certain circumstances, but we cannot claim in general that they are compatible with the 64-bit environment, because they may behave completely differently in another situation. Their vendors also test and support them only in the original 32-bit environment, and there is always the chance that they will malfunction after an update.
The situation is better in the segment of user applications, where 32-bit software operates well. A native 64-bit version of the Office suite has been promised for some time in the future. It is rather difficult to demonstrate the advantages of the 64-bit architecture in this category; most probably it will not be the one to greatly promote the architecture, but it has one great benefit: it cannot hinder it either, because 32-bit applications run well on 64-bit systems (in theory, and in informatics one should never forget that qualifier).
In the field of graphics and music workstations, 64-bit already offers a huge advantage. The companies developing this specialised software released 64-bit versions of their products as soon as they could. These are horrendously expensive programs, and their target users will pay almost anything for the benefits the new architecture brings. This field is the real driving force of the 64-bit world, even though machines in this category are rarely seen.
Only a few 64-bit games have been released so far, and they have mainly demonstrative value. Once the 64-bit architecture gains momentum, it will likely provide great motivation for game developers. It is not easy to convince an executive to replace the company's IT infrastructure with 64-bit solutions, but his child, relying on far more modest means, can achieve the same goal for the home computer in a rather short time.
Where is the truth?
And here are some ideas to conclude my article. In my experience, it is much easier to follow platform trends in the case of free-licence and open-source programs. One might believe that it would not be too complicated to run the source code through a 64-bit development system, which could generate the 64-bit version in the blink of an eye. The situation is not quite that simple, but by building on the existing foundations we could reach the final product with considerably less effort. If that were really all it took, we would already have a huge number of 64-bit programs. There is, however, another aspect: the 64-bit software, as a final product, is a different program, even if it is built from the same or a very similar source code. As a different program it needs new product-support resources, which brings up new problems. So while it is easy and not too costly to build on the existing foundations and produce 64-bit versions, product support is relatively expensive, since compared with the 32-bit market, far fewer customers are expected to buy 64-bit software. This is why open-source programs have an advantage here: they come without formal product support. On the other hand, vendors who do not release 64-bit versions of their software may well find themselves lagging behind the others.

It is an interesting game with special rules. There are plenty of aspects, it is difficult to get one's bearings, and luck is as important a factor as wisdom. Bill Gates is just another engineer: in his biography he writes that he did not actually accomplish any wonders, he was simply at the right place at the right time, and the next moment the wonder was in the making.
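The claim that a 64-bit build is "a different program" has a concrete technical basis: the sizes of fundamental C types change between the 32-bit and 64-bit data models (ILP32 versus LP64 on Unix-like systems, LLP64 on 64-bit Windows), so code that assumed int, long and pointers were all 4 bytes has to be revised, not merely recompiled. A small Python sketch using the standard ctypes module reports the sizes the platform's C ABI actually uses:

```python
import ctypes

# Sizes of fundamental C types under this platform's ABI.
# ILP32 (32-bit systems):       int = long = pointer = 4 bytes
# LP64  (64-bit Linux, macOS):  int = 4, long = pointer = 8 bytes
# LLP64 (64-bit Windows):       int = long = 4, pointer = 8 bytes
for name, ctype in [("int", ctypes.c_int),
                    ("long", ctypes.c_long),
                    ("void*", ctypes.c_void_p)]:
    print(f"sizeof({name}) = {ctypes.sizeof(ctype)} bytes")
```

On an LP64 system the output makes the porting hazard obvious: a pointer no longer fits in an int, and under LLP64 it no longer fits in a long either. This is why the 64-bit binary must be treated, tested and supported as a separate product.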
I have seen several tests comparing the performance of 32-bit and 64-bit systems. It can be argued rationally, and even shown mathematically, that 64-bit systems are more powerful; yet in practice the measurements often fail to confirm this. Where can the truth be? I think the development of informatics has shifted its focus from quality to quantity, which in this specific case means that computer programs are not properly optimized. I cannot go into the details here, but the experience of many supports this idea. So, there are two basic situations:
- Inefficiently optimized 32-bit software is competing with other inefficiently optimized 32-bit software running on a 64-bit system.
- Inefficiently optimized 32-bit software is competing with inefficiently optimized 64-bit software which is running on a 64-bit system, has a source code heavily relying on its 32-bit foundations and was developed using a 64-bit development system.