64 bits. What does it offer, and what does it not?

Good question. And here is the answer in a single sentence: we have everything, and still we have nothing. Of course, this sounds cynical and highly biased; the truth lies somewhere in the middle. It will probably be best for us to take a look around. A detailed analysis follows.

64-bit systems have been around for more than a decade now, but we have only just begun to take a closer look at them. One possible reason: the 64-bit technology has only recently become affordable to ordinary users. The desperate and the inattentive still have the opportunity to build 32-bit computers today, but if we make the right choices, we can have a 64-bit system for the same amount of money. It even happens that we end up with a 64-bit computer although that was not a priority. And this is the point: 32-bit parts (motherboards, CPUs) are visibly losing ground even in the lower market segments. If this tendency continues, it will be only a matter of months before 64-bit systems achieve complete domination of the market. But hardware alone is not enough, as there is also a need for suitable software; this is the bottleneck today, standing in the way of widespread adoption. But before we move on to a more detailed insight, let us have a brief history lesson:



  • 1991: MIPS Technologies produces the first 64-bit CPU, the R4000. R4000 CPUs are used in SGI graphics workstations running the 64-bit version of the IRIX operating system.
  • 1992: Digital Equipment Corporation introduces the DEC Alpha architecture.
  • 1994: Intel announces plans for the 64-bit IA-64 architecture as a successor to its 32-bit IA-32 processors. Its launch is scheduled for 1998-1999.
  • 1995: Fujitsu-owned HAL Computer Systems launches workstations based on a 64-bit CPU, HAL's independently designed first generation SPARC64. IBM releases its 64-bit AS/400 systems.
  • 1996: Sun and HP release their 64-bit processors, the UltraSPARC and the PA-8000, running Solaris and HP-UX respectively.
  • 1997: IBM releases its 64-bit PowerPC processors, the RS64.
  • 1998: IBM releases its 64-bit PowerPC/POWER processors, the POWER3.
  • 1999: Intel releases the instruction set for the IA-64 architecture. AMD discloses its 64-bit plans named x86-64.
  • 2000: IBM ships its first 64-bit mainframe, the zSeries z900, with its new z/OS operating system.
  • 2001: Intel ships its Itanium processors, targeting mainly high-end servers.
  • 2002: Intel introduces the Itanium 2 systems.
  • 2003: AMD introduces its 64-bit Opteron and Athlon 64 CPU product lines. Apple ships its new computers with 64-bit PowerPC CPUs running the Mac OS X operating system. Several 64-bit Linux distributions support the x86-64 platform. Microsoft announces that it will also support the new 64-bit AMD processors at the Windows level.
  • 2004: Intel develops 64-bit CPUs supporting the EM64T instruction set. The EM64T instructions are supported by the new, updated versions of Xeon and Pentium 4 processors.
  • March 2005: Intel announces that its first dual-core processors, Pentium Extreme Edition 840 and Pentium D, will be shipped in the same year.
  • 30 April, 2005: Microsoft releases the Windows XP x64 Edition operating system.
  • May 2005: AMD announces its dual-core CPU family called Athlon 64 X2. The new chip consists of 233.2 million transistors.
  • July 2005: IBM announces its dual-core 64-bit PowerPC 970MP processor.

64-bit systems have existed for 15 years now, but initially they were used only in high-end and mainframe environments. The turning point came when Microsoft announced that it would support 64-bit systems at the Windows level. The pace has accelerated since then: today the competition between Intel and AMD is a competition of 64-bit processors. A couple of years ago we already had the means to build a 64-bit system at home using the Linux operating system, but Linux alone could not actually accelerate the spread of this architecture.

It is important to note here that the ideas in this article are based on facts, but we should not forget about the business interests and marketing aspects of the IT industry, either. These also play a significant role in the history of the 64-bit architecture; the parts cannot be separated from the whole, nor can exact borders be drawn between them. Information technology is at once a science, a set of technical opportunities and an everyday reality with its own peculiar subculture, all of which travel together along the unforeseeable paths of business interests and marketing. I believe that the 'chicken or the egg?' problem of the 64-bit world was solved by the announcement, and then the release, of the 64-bit version of Windows. Or perhaps that is not so important after all; what matters is that the 64-bit system is finally here, at our fingertips.



It is very easy to obtain a computer with 64-bit hardware, but the system will not necessarily qualify as a 64-bit system, because it also needs the proper software. There is a good chance that the 64-bit version of Windows XP Professional can be installed on a system whose motherboard supports 64-bit CPUs and which contains no exotic hardware. Chipset manufacturers release 64-bit drivers for their chipsets by default, and ATI, NVIDIA and Matrox all develop 64-bit drivers for their cards (even for older models), which means there is a fair chance that older cards can also be used. You will also need a network card: if one is integrated on the motherboard, you will have no problem in that respect, and if not, you can buy an x64-compatible network controller at a very favourable price. So, as you can see, it is easy to build the system.

However, it is not certain that all peripherals will work. Although the XP operating system supports quite a number of hardware devices out of the box, it is worth looking around first, because manufacturers do not always release 64-bit drivers for devices that are not supported by default. The situation is similar to when Windows 2000 was released about five years ago: it was nice and stable, but even so, there were cases where it was not the right choice because it did not support some key devices. In the case of Windows 2000 (Windows NT 5.0) this interim period was comparatively short, and with the release of XP (Windows NT 5.1) the problem was more or less solved.



The security aspect, however, is a completely different matter. Microsoft continuously develops and publishes security fixes, but there are huge gaps in the field of 64-bit firewalls, anti-virus and anti-spyware software. Currently there are two 64-bit anti-virus programs: CA eTrust Antivirus r7.1 x64 and its updated version, eTrust Antivirus r8. Anti-virus programs written in hybrid code are available from three more manufacturers: Eset NOD32 Anti-Virus, avast! Antivirus and Symantec AntiVirus 10.0 Corporate Edition Client for 64-bit. In these latter programs, only the deepest (core) level of the code is written in 64-bit. The situation in the firewall segment is even worse: Tiny Firewall 64 and a beta version of ZoneAlarm 64. And we face the worst situation in the field of anti-spyware software, where the single 64-bit solution is the recently released Windows Defender Beta 2 build 1051 x64.

Of course, you can experiment with 32-bit software, but do not forget that the programs discussed here are system utilities, which integrate with the system at a low level. They may show some signs of operation, and they may even work under certain circumstances, but we cannot make a general claim that they are compatible with the 64-bit environment, because they may behave completely differently in another situation. Their product support is also tested only in the original 32-bit environment, and there is always the chance that they will malfunction after an update.

The situation is better in the segment of user applications: 32-bit software operates well. It has been promised that the Office suite will be released in a native 64-bit version sometime in the future. It is rather difficult to demonstrate the advantages of the 64-bit architecture in this category, so it will probably not be the one to greatly promote 64-bit computing, but it has one great benefit: it cannot hinder it either, because 32-bit applications run well on 64-bit systems (in theory; never forget that qualifier when speaking about computing!).

In the field of 64-bit graphics and music workstations we already have a huge advantage. The companies which develop special software released the 64-bit version of the products as soon as they could. These are horrendously expensive programs and their target users will pay anything for the advantages granted by these new opportunities. This field is the actual driving force in the 64-bit world, even though we rarely come across the devices belonging to this group.

Only a few 64-bit games have been released so far, and they have mainly demonstrative value. When the cause of the 64-bit architecture moves forward, it will likely provide great motivation for game developers. It is not easy to convince an executive to have the company IT infrastructure replaced with 64-bit solutions, but his child, relying on far more modest means, can achieve the same goal for the home computer in a rather short time.


Where is the truth?

And here are some ideas to conclude my article. In my experience, it is much easier to follow platform trends in the case of free licences and open-source programs. It might seem that you could simply feed the source code through a 64-bit development system, which would generate the 64-bit version in the blink of an eye. The situation is not quite that simple, but by making use of the existing foundations, we could reach the final product with considerably less effort. And if it took only that little, we would surely already have a huge number of 64-bit programs.

But there is yet another aspect: the 64-bit software, as a final product, is a different program, even if it has the same or a very similar source code. As a different program it needs new product-support resources, which brings up new problems. Thus, it is easy and not too costly to build on the existing foundations and produce 64-bit versions, but product support is relatively expensive, since, compared to the 32-bit market, not many people are expected to buy 64-bit software. This is why open-source programs have an advantage in this respect: they come without commercial product support. On the other hand, those who do not release a 64-bit version of their software may well find themselves lagging behind the others.

It is an interesting game with special rules. There are plenty of aspects, it is difficult to get one's bearings, and luck is as important a factor as wisdom. Bill Gates is, in this sense, just another engineer. In his biography he writes that he did not actually accomplish any wonders; he was simply at the right place at the right time, and the next moment the wonder was in the making.

I have seen several tests comparing the performance of 32-bit and 64-bit systems. It can be rationally argued, and on paper shown, that 64-bit systems are more powerful, yet in practice the measurements often fail to confirm this. Where can the truth be? I think the development of software has shifted from quality to quantity; in this specific case, it means that computer programs are not properly optimized. I cannot go into the details here, but the experience of many supports that idea. So, there are two basic situations:

  1. Inefficiently optimized 32-bit software is competing with other inefficiently optimized 32-bit software running on a 64-bit system.
  2. Inefficiently optimized 32-bit software is competing with inefficiently optimized 64-bit software which is running on a 64-bit system, has a source code heavily relying on its 32-bit foundations and was developed using a 64-bit development system.
It is easy to understand that in either case the drawbacks of insufficient optimization accumulate in the 64-bit system, so it falls behind, or at least cannot produce the results expected from its true capacities.
