History

Institutional or corporate computer owners in the 1960s had to write their own programs to do any useful work with the machines.

While personal computer users may develop their own applications, these systems usually run commercial software, free-of-charge software ("freeware"), which is most often proprietary, or free and open-source software, all provided in "ready-to-run", or binary, form. Software for personal computers is typically developed and distributed independently of the hardware and operating system manufacturers.[2] Many personal computer users no longer need to write their own programs to make use of the machine, although end-user programming remains feasible. This contrasts with mobile systems, where software is often available only through a manufacturer-supported channel,[3] and end-user program development may be discouraged by lack of manufacturer support.[4]

Since the early 1990s, Microsoft operating systems and Intel hardware have dominated much of the personal computer market, first with MS-DOS and then with Microsoft Windows. Alternatives to Microsoft's Windows operating systems occupy a minority share of the industry. These include Apple's macOS and free and open-source Unix-like operating systems.

The advent of personal computers and the concurrent Digital Revolution have significantly affected the lives of people in all countries.