When computers for personal use were introduced in the 1970s, it was common practice for the computer’s hardware and its main operating software (which, back then, usually amounted to little more than a simple command-line interface for standard I/O operations or a BASIC interpreter) to be delivered by the same company. As a full package, the combination of hardware and software made up “the computer”. Even during the 80s, when computers became much more powerful and hence more useful to end users, computer companies from the early days still held to the basic principle of developing their own hardware and software. Think of the Mac, the Commodore Amiga, the Atari ST or the countless other computer systems of the time, all of which were made unique not only by their hardware specs, but mainly by the things the computer’s operating system would let the machine actually do.
There was of course one other company, which thought up a rather different business model. This company devoted itself solely to developing the operating system, which it would license to any hardware manufacturer willing to pay for it. Obviously, this company’s flagship product, MS-DOS, became rather popular and soon evolved into the de facto standard for computer operating software. Although the system was among the least advanced of all operating software efforts of that time, precisely because it was not capable of doing much advanced stuff it could run on inexpensive hardware. This quickly led to a huge growth in sales, and hence in the availability of third-party software. Microsoft laid the groundwork of its empire, and formed the foundation for spin-offs of its popular DOS operating system in the form of its many Windows iterations. The companies that sold these DOS- and Windows-compatible computers (increasingly referred to as “PCs”) had virtually no way to differentiate themselves from their competitors on anything but price.