Utility is a generic term specifying how well a certain action's results satisfy an agent's preferences. Its unit, the util or utilon, is an abstract, arbitrary measure that takes a concrete value only once the agent's preferences have been captured by a utility function.
The concept originates in economics and game theory, where it measures how much a given commodity increases welfare. One of the clearest examples in that field is money: it directly represents the price a person is willing to pay to satisfy a preference (that is, to acquire something they desire). Although it has been argued that utility is hard to quantify for human agents, it is widely used when designing AI systems capable of planning.
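A planning agent of this kind can be sketched as choosing the action with the highest expected utility. The sketch below is illustrative, not from the text: the outcomes, utility values, and rain probability are all assumed for the example.

```python
# Minimal sketch of expected-utility maximization. The utility function
# maps outcomes to utils (arbitrary units); all values are hypothetical.
utility = {
    "umbrella_rain": 8, "umbrella_sun": 4,
    "no_umbrella_rain": 0, "no_umbrella_sun": 10,
}

# Assumed subjective probability of rain.
p_rain = 0.3

def expected_utility(action: str) -> float:
    """Utility of an action, averaged over outcomes by their probability."""
    return (p_rain * utility[f"{action}_rain"]
            + (1 - p_rain) * utility[f"{action}_sun"])

def best_action(actions):
    """Pick the action whose expected utility is highest."""
    return max(actions, key=expected_utility)

print(best_action(["umbrella", "no_umbrella"]))  # → no_umbrella
```

With these numbers, carrying the umbrella yields 0.3·8 + 0.7·4 = 5.2 utils against 7.0 for leaving it behind, so the agent leaves it; changing the probabilities or utilities changes the choice, which is the sense in which the utility function encodes the agent's preferences.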
Utilitarianism is a moral philosophy advocating the actions that bring the greatest welfare to the greatest number of agents (humans, in this case) involved.
Further Reading & References
- Mistakes in Choice-Based Welfare Analysis by Botond Kőszegi and Matthew Rabin
- Russell, Stuart J.; Norvig, Peter (2003), Artificial Intelligence: A Modern Approach (2nd ed.), Upper Saddle River, New Jersey: Prentice Hall, ISBN 0-13-790395-2
- Purchase Fuzzies and Utilons Separately
- Post your Utility Function
- Applying utility functions to humans considered harmful
- Do Humans Want Things?
- Money: The Unit of Caring