What’s the difference between a microcontroller and a microprocessor?

There are no hard definitions, and many devices blur the distinction, but these are general guidelines:

A microcontroller is designed to be a self-contained device that can be used to control a wide range of machines. In this context it is called an embedded controller. In addition to the processor, a microcontroller includes program and data memory and a range of peripherals such as timers, input and output ports, and analog-to-digital and digital-to-analog converters. The outputs of a microcontroller can usually provide enough current to drive LEDs and possibly even small relays. A microcontroller typically does not run an operating system; it runs only the program necessary to control the machine. A microcontroller is designed for building embedded systems with as few external components as possible.

A microprocessor is only the core of a computer: the part that reads data from memory, manipulates that data, and stores the results back in memory. A microprocessor is expected to be part of a larger system, and lacks memory and peripherals. Because it is expected to interface with other integrated circuits, a microprocessor has very limited ability to provide current. This current is enough to drive the inputs of other integrated circuits, but probably not enough to drive even LEDs. Microprocessors range from very simple and relatively slow devices, such as those that might serve as the core of a microcontroller, to very complex and fast ones, such as those used in modern desktop computers.

