The decision to use Microsoft Windows NT Server or one of the many Unix operating systems concerns IS managers around the world today. Unix is not a single operating system; it refers to a family of operating systems that includes AIX, BSDI, Digital UNIX, FreeBSD, HP-UX, IRIX, Linux, NetBSD, OpenBSD, Pyramid, SCO, Solaris, and SunOS, to name a few. Microsoft Windows NT has a well-known reputation, but these managers must consider whether choosing a Microsoft product will increase the company's profits.
The cost of the network operating system (NOS) will be the ultimate factor in their decision. The initial cost of the hardware is not the only expense, however; many other factors must be considered to ensure that ongoing maintenance costs do not become overwhelming. Software licenses will need to be procured, technical support agreements assessed, and the costs of upgrades, service packs, and hardware upgrades weighed for both types of systems. Determining which system has a greater occurrence of glitches can help estimate lost profits for every hour of downtime.
If the company experiences a glitch, how substantial will the personnel costs for recovering or recreating data be? Knowledgeable systems administrators will need to be employed to maintain the system. This task is not to be taken lightly, as these are only some of the situations to consider before deciding which NOS to purchase. Since controlling costs is a primary concern for managers, the conditions discussed above suggest that a combination of server hardware and operating systems is the most cost-effective option for long-term use.
Unix is a fully developed group of operating systems known for its performance, reliability, and security in a server environment. Windows NT Server, on the other hand, has the advantage of Windows 95's popularity: that desktop operating system is already used in homes and offices everywhere. Before making the operating system decision, a manager should consider visiting the local library to research the subject. It will be difficult to find current unbiased literature, but a determined manager or QM student should be able to separate the important information from personal preferences.
Most of the older books are concerned with theory, using Unix as a guide. For current information, periodicals are the best source, but as stated earlier, much of that material is biased one way or the other. Preferences are split down the middle, with half of the professionals supporting Unix or a Unix variant and the other half supporting Microsoft products.

Operating Systems

Operating systems (OS) were originally developed as large sets of instructions for mainframe computers in order to control the hardware resources of the mainframe.
They have since been developed to run on smaller and smaller computers, first minicomputers and then the new personal computers (PCs), but the main job of the OS has remained the same: to act as a layer between the hardware and the user. The main reason for having an OS is to give application programmers a common base on which to run their applications, no matter what hardware is being used. One important function of the OS is file management, which allows applications to read and write files regardless of the underlying hardware or how the data is physically stored.
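The abstraction described above can be illustrated with a minimal sketch: the same file-handling code runs unchanged on Windows NT or any Unix, because the operating system, not the application, deals with the underlying disk hardware. The file name used here is a hypothetical example.

```python
# A minimal sketch of OS file management: the application asks the OS
# to write and read a file through a portable interface, and the OS
# handles the actual disk hardware. "report.txt" is a made-up name.
import os
import tempfile

# Write a file using only the portable OS interface.
path = os.path.join(tempfile.gettempdir(), "report.txt")
with open(path, "w") as f:
    f.write("quarterly figures\n")

# Read it back; the application never touches the disk controller.
with open(path) as f:
    contents = f.read()

print(contents.strip())
os.remove(path)  # clean up the example file
```

The same script runs on NT or Unix because both expose the same style of file interface to applications, which is exactly the portability argument made above.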
Without this feature, programmers would have to write new programs for every type of hardware and every hardware configuration. Because Microsoft Windows is the dominant PC OS, most applications written today are written for the Windows environment.

Network Operating Systems

When businesses first began to use desktop PCs in the 1980s, there was no connection between PCs and mainframes or between the PCs themselves. The PC was normally used for word processing, spreadsheets, and similar tasks. Soon users wanted to share resources more efficiently than disk swapping allowed.
The solution that emerged was networking, and network operating systems (NOSs) were developed to control these shared resources. At first, NOSs allowed only the most basic functions, such as sharing printers and files. Soon the NOS's role expanded greatly to managing the resources of the local network and linking up with other local area networks (LANs), thereby creating wide area networks (WANs). A NOS controlled the network through a server; the server controlled only the resources directly attached to it, while the PCs ran a second OS that controlled their own hardware. Peer-to-peer networks developed later.
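The server model described above can be sketched in miniature: one process controls a shared resource and hands it out to clients that connect over the network. The resource contents and all names here are hypothetical, and a loopback connection stands in for the LAN.

```python
# A minimal sketch of the NOS client/server idea: a server controls a
# shared resource and a "client PC" requests it over the network.
# The shared data and names are invented for this example.
import socket
import threading

SHARED_FILE = b"annual budget figures\n"  # resource the server controls

# Set up the listening socket first so the client cannot connect too early.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]

def serve_once():
    """Accept one client and send it the shared resource."""
    conn, _ = srv.accept()
    conn.sendall(SHARED_FILE)
    conn.close()

t = threading.Thread(target=serve_once)
t.start()

# The "client PC" fetches the shared file from the server.
cli = socket.create_connection(("127.0.0.1", port))
data = cli.recv(1024)
cli.close()
t.join()
srv.close()
print(data.decode().strip())
```

The key point of the design mirrors the text: the server is the only process that holds the resource, and every client obtains it by asking the server rather than by accessing the hardware directly.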
A peer-to-peer LAN requires no dedicated server, which was great for small businesses with few users. But with many users and large amounts of data, a greater need for a dedicated server surfaced.

Methodology

Managers without knowledge of or experience with systems and network administration find it difficult to choose a server platform. This report will attempt to compare and contrast Microsoft Windows NT Server and Unix, a mixture of commercial and non-commercial operating systems that originate from the same source and therefore share many similarities.
The main focus of the paper is to assist managers in choosing a network operating system using quantitative methods. The issues compared are product costs and licensing, functionality, reliability, and performance, presented to provide a more complete view of these products.

Product costs and licensing issues

Most managers will agree that the mere cost of an operating system is trivial when evaluating the big picture. Although Windows NT Server 4.0 can be expensive, a Unix variant can be bought for a minor dollar amount. In order to match the functionality of a BSDI (a variant of Unix) installation, additional Microsoft products and third-party solutions would bring the final price of a comparable NT solution within a reasonable cost.

Functionality

What can you expect from Windows NT Server and from Unix immediately after acquiring the systems? NT and Unix can both communicate with many different types of computers, and both can secure sensitive data and keep unauthorized users off the network.
Essentially, both operating systems meet the minimum requirements for operating in a networked environment.

Reliability

As computers become ever more widely used, reliability is the most significant feature, even more important than speed. Although performance is largely a function of the hardware platform, reliability is the area where the choice of operating system has the most influence. An operating system may offer more functionality, may be more scalable, and may even offer greater ease of system management.
But if you are constantly challenged by glitches and cannot get any use out of a system because it is always down, what good are those advantages?

Performance

Processing power is largely a function of computer hardware rather than of the operating system. Since most commercial Unix operating systems run only on high-end workstations or servers, Unix has historically been considered an operating system for high-end hardware. To say that Unix outperforms NT based on results from different hardware would be unfair to Microsoft.
A fairer comparison is NT Server's performance against that of Linux or FreeBSD, since all three operating systems run on the same Intel hardware, the platform most often used with NT. A truly unbiased comparison of performance would have to be based on benchmarks, but these are few and usually focus on specific areas such as web performance. Some specific issues do affect performance: Unix does not require a graphical user interface to function, while NT does, and graphics, like sound files, consume large amounts of disk space and memory.
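The benchmarking idea above can be sketched as follows: time one narrow, repeatable operation rather than claiming to measure whole-system performance. Run on identical hardware, the same script yields a like-for-like number on NT or Unix. The operation, sizes, and iteration count here are arbitrary choices for illustration.

```python
# A minimal sketch of a narrow benchmark: time a single repeatable
# operation (writing and re-reading a file) under fixed parameters.
# The workload sizes are invented for this example.
import os
import tempfile
import time

def bench_file_io(iterations=100, size=64 * 1024):
    """Time `iterations` cycles of writing and reading back `size` bytes."""
    payload = b"x" * size
    path = os.path.join(tempfile.gettempdir(), "bench.tmp")
    start = time.perf_counter()
    for _ in range(iterations):
        with open(path, "wb") as f:
            f.write(payload)
        with open(path, "rb") as f:
            f.read()
    elapsed = time.perf_counter() - start
    os.remove(path)
    return elapsed

elapsed = bench_file_io()
print(f"100 write/read cycles of 64 KiB took {elapsed:.3f} s")
```

Like the web-performance benchmarks the text mentions, a result such as this says something only about the one operation measured, which is why broad OS comparisons built from a few narrow benchmarks deserve the skepticism expressed above.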