Inter-task communication refers to the mechanisms that enable tasks to coordinate, cooperate, and synchronise by sending or receiving information. This information falls into two logical categories: Signals and Messages.
- Signals: A signal is entirely defined by its presence or absence. Its meaning is implicit.
- Messages: When the operations tasks use to communicate can also convey arbitrary data, the mechanism is referred to as Message Passing.
Note that message passing is also a means of coordination: after all, if Task A sends a message to Task B, Task B can only receive it at some point in time after A has sent it.
To make this distinction clear, let us consider two mechanisms: a Binary Semaphore and a Mailbox. Assume both are public, i.e., any task can signal or write to them, and any task can wait on or read from them.
A Binary Semaphore can take the values ‘1’ or ‘0’. A value of ‘1’ means that some task has signalled the semaphore to record that an event has occurred. However, this record is limited to a single occurrence and carries no further information.
A task interested in this event (typically a single one) can be notified when it happens, if it is blocked waiting for the event, or can catch up with it later, and then perform the corresponding action, for example reading the value from a temperature sensor. In either case, there is a single possible interpretation associated with the semaphore.
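As a minimal sketch of the semaphore case, the code below uses FreeRTOS-style primitives (xSemaphoreCreateBinary, xSemaphoreGive, xSemaphoreTake); any RTOS with a binary semaphore would behave the same way, and the driver call read_temperature_sensor() is a hypothetical placeholder.

```c
#include "FreeRTOS.h"
#include "semphr.h"

extern int read_temperature_sensor(void);   /* hypothetical sensor driver */

static SemaphoreHandle_t xEventSem;          /* records: has the event occurred? */

void vCommInit(void)
{
    xEventSem = xSemaphoreCreateBinary();    /* created in the '0' (empty) state */
}

/* Producer side: signals that the event has occurred (sets the semaphore to '1'). */
void vSignalEvent(void)
{
    xSemaphoreGive(xEventSem);
}

/* Consumer side: blocks until the event is signalled, then performs the single
 * action associated with this semaphore, i.e. reading the temperature sensor. */
void vTemperatureTask(void *pvParameters)
{
    (void)pvParameters;
    for (;;) {
        if (xSemaphoreTake(xEventSem, portMAX_DELAY) == pdTRUE) {
            int temperature = read_temperature_sensor();
            (void)temperature;               /* use the value as required */
        }
    }
}
```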
A Mailbox holds a single message. If we replace the binary semaphore with a mailbox, a reader task can now alter its execution based on the contents of the message.
For instance, one message might instruct the task to read the temperature sensor and store the result in a buffer; another message might instruct it to read the humidity sensor and transmit the result over a serial line, and so on. In this case, the event is associated with the mailbox and with the message it conveys.
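A comparable sketch for the mailbox case could look as follows. FreeRTOS has no dedicated mailbox object, so the common idiom of a queue of length one is used here; the helper functions (read_temperature_sensor, read_humidity_sensor, store_in_buffer, send_over_serial) are hypothetical placeholders.

```c
#include "FreeRTOS.h"
#include "queue.h"

typedef enum { CMD_READ_TEMPERATURE, CMD_READ_HUMIDITY } command_t;

extern int  read_temperature_sensor(void);   /* hypothetical sensor drivers */
extern int  read_humidity_sensor(void);
extern void store_in_buffer(int value);      /* hypothetical output helpers */
extern void send_over_serial(int value);

static QueueHandle_t xMailbox;                /* queue of length 1, used as a mailbox */

void vMailboxInit(void)
{
    xMailbox = xQueueCreate(1, sizeof(command_t));
}

/* Writer side: deposits a command message in the mailbox. */
void vPostCommand(command_t cmd)
{
    xQueueSend(xMailbox, &cmd, portMAX_DELAY);
}

/* Reader side: the action performed depends on the contents of the message. */
void vSensorTask(void *pvParameters)
{
    command_t cmd;
    (void)pvParameters;
    for (;;) {
        if (xQueueReceive(xMailbox, &cmd, portMAX_DELAY) == pdTRUE) {
            switch (cmd) {
            case CMD_READ_TEMPERATURE:
                store_in_buffer(read_temperature_sensor());
                break;
            case CMD_READ_HUMIDITY:
                send_over_serial(read_humidity_sensor());
                break;
            }
        }
    }
}
```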
Nevertheless, both binary semaphores and mailboxes are latching: they maintain state that records whether an event has occurred.
