The Mathematics of Humour

Snunkoople, howaymb, finglam and quingel – no, these aren’t real words, but they are funny, according to a study carried out last year that managed to find humorous words using the mathematics of probability.

In November 2015, a paper was published concerning a study that aimed “to directly test Schopenhauer’s hypothesis that greater incongruity between expectation and events produces a stronger feeling of humour”. [1] By calculating the entropy of some made-up words (‘non-words’), a group of researchers at the University of Alberta, Canada, led by psychology professor Chris Westbury, tested the hypothesis that uncommon letter combinations within a word increase its humour. The idea for the study came from earlier work in which Westbury had noticed participants finding some non-words funnier than others.

The study involved two parts, the first of which asked participants to choose the funnier of two non-words. The second experiment asked the same participants to rate non-words on a humour scale from 1 to 100. Predictions of the participants’ decisions were made beforehand, and for the most predictable participant these predictions were 92% accurate [2] – a very high figure when we consider that this is a prediction of humour, a supposedly subjective topic.

New research suggests that uncommon letters placed together in a non-word make the word funnier

These predictions were made based on the entropy of the words. Entropy is a measure of how disordered, or unpredictable, a system is. [3] In his 1951 paper, “Prediction and Entropy of Printed English”, Claude E. Shannon describes entropy as “the uncertainty regarding which symbols are chosen from a set of symbols with given a priori probabilities”. [4] He gave two ways to calculate the entropy of the English language. First, he used “N-grams” – sequences of N items, which in this case are the letters in a word. This allows us to work out the entropy of the Nth letter based on the N-1 previous letters in the sequence. The calculation involves the probability of the N-gram being formed and the conditional probability of the Nth letter following the previous N-1 letters. [5]
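In symbols, the N-gram entropy can be written as follows (a light paraphrase of the formula in Shannon’s paper, writing $b_i$ for a block of N-1 letters and $j$ for the letter that follows):

$$F_N = -\sum_{i,j} p(b_i, j)\,\log_2 p(j \mid b_i)$$

Here $p(b_i, j)$ is the probability of the block $b_i$ being followed by the letter $j$, and $p(j \mid b_i)$ is the conditional probability of $j$ given that $b_i$ came before. Summing over all blocks and letters gives the average uncertainty about the Nth letter once the previous N-1 letters are known.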

The second method Shannon used was to estimate the entropy of whole words in the English language and convert it to a per-letter figure by assuming the average word has 4.5 letters. This produced an entropy value of 2.62 bits per letter. [5]
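As a back-of-the-envelope check (using the word-entropy figure from Shannon’s paper, roughly 11.82 bits per word):

$$\frac{11.82\ \text{bits per word}}{4.5\ \text{letters per word}} \approx 2.62\ \text{bits per letter}$$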

The pattern observed in this study of non-word entropy was that the words with lower entropy were funnier. Lower entropy corresponds to uncommon letter combinations, and hence to a stranger non-word. The predictions in the first part of the study were therefore easier when the difference between the entropies of the two non-words being compared was larger.
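To make the scoring concrete, here is a minimal sketch in Python of how a per-letter entropy score for a non-word might be computed. It assumes simple unigram letter frequencies (the study’s actual computation is more involved), and ‘rention’ is a tame-looking non-word invented here purely for contrast:

```python
import math

# Illustrative relative frequencies of English letters
# (rounded from standard frequency tables).
LETTER_P = {
    'a': 0.082, 'b': 0.015, 'c': 0.028, 'd': 0.043, 'e': 0.127,
    'f': 0.022, 'g': 0.020, 'h': 0.061, 'i': 0.070, 'j': 0.002,
    'k': 0.008, 'l': 0.040, 'm': 0.024, 'n': 0.067, 'o': 0.075,
    'p': 0.019, 'q': 0.001, 'r': 0.060, 's': 0.063, 't': 0.091,
    'u': 0.028, 'v': 0.010, 'w': 0.024, 'x': 0.002, 'y': 0.020,
    'z': 0.001,
}

def per_letter_entropy(word: str) -> float:
    """Average of -p * log2(p) over the word's letters.

    Rare letters have small p and so contribute small -p*log2(p)
    terms; a word built from uncommon letters therefore scores a
    LOWER entropy, matching the pattern described in the study.
    """
    total = -sum(LETTER_P[c] * math.log2(LETTER_P[c]) for c in word.lower())
    return total / len(word)

for w in ["quingel", "snunkoople", "rention"]:
    print(f"{w:12s} {per_letter_entropy(w):.4f}")
```

Running this, the stranger non-words score a lower per-letter entropy than the tame one, which is exactly the direction of the pattern reported in the study.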

So what does this mean for our perception of humour? Westbury admits that “There are many theories of humour and no final consensus on what makes something funny”. [1] The German philosopher Arthur Schopenhauer defined humour in terms of incongruity: what matters is the probability of an event occurring at the same time as a pre-existing expectation that is inconsistent with that event. Surprise, in other words, could be an important part of what makes something funny. Shannon, who wrote the 1948 article “A Mathematical Theory of Communication”, formalised surprise as an inverse measure of probability, in the quantity now known as Shannon surprisal. In measuring surprise, Shannon made it possible to consider humour mathematically.
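In symbols (these are the standard information-theoretic definitions, not notation taken from the study itself): an event $x$ with probability $p(x)$ has surprisal

$$s(x) = -\log_2 p(x)$$

so the rarer the event, the greater the surprise. Entropy is then just the expected surprisal over all possible events, $H = -\sum_x p(x)\,\log_2 p(x)$, which ties the two ideas together.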

The study could have commercial applications; for example, a company naming a serious product might want to avoid words with low entropy. More broadly, it has raised some interesting ideas about thinking of humour in terms of probability, and about how this can help us understand why different things are seen as funny.


References:

[1] C. Westbury, C. Shaoul, G. Moroschan and M. Ramscar, “Telling the world’s least funny jokes: On the quantification of humor as entropy”, Journal of Memory and Language

[2] https://uofa.ualberta.ca/science/science-news/2015/november/the-snunkoople-effect

[3] http://people.seas.harvard.edu/~jones/cscie129/papers/stanford_info_paper/entropy_2.htm

[4] C. E. Shannon, “Prediction and Entropy of Printed English”, Bell System Technical Journal, 1951

[5] http://people.seas.harvard.edu/~jones/cscie129/papers/stanford_info_paper/entropy_of_english_9.htm

Image: http://media.graytvinc.com/images/scrabble4.jpg

