Information plays an important role in our understanding of the physical world. We therefore propose an entropic measure of information for any physical theory that admits systems, states, and measurements. In the quantum and classical worlds, our measure reduces to the von Neumann and Shannon entropies, respectively. It can even be used in a quantum or classical setting where only a limited set of operations is allowed. In a world that admits superstrong correlations in the form of non-local boxes, our measure can be used to analyze protocols such as superstrong random access encodings and the violation of 'information causality'. However, we also show that in such a world no entropic measure can exhibit all the properties we commonly accept in a quantum setting; for example, there exists no 'reasonable' measure of conditional entropy that is subadditive. Finally, we prove a coding theorem for some theories that is analogous to its quantum and classical counterparts, providing us with an appealing operational interpretation of our measure.