The yuzz-a-ma-tuzz and snunkoople of funny words

Can something as intangible as humour be understood scientifically, its parts isolated and explained mathematically? It would appear so

What is funnier than a dead scientist? A dead scientist in a clown suit. There might be as many readers offended by that joke as there are readers still laughing. Humour is a subjective, personal trait, shaped by countless variables in each individual. But could something as intangible as humour truly be understood scientifically, its parts isolated, measured and explained mathematically?

The idea of some formal metric for funniness and, by extension, artificially engineered humour, might be cause for a wry laugh, but psychologists from the University of Alberta, Canada, have developed a method for measuring humour in words. Or nonwords, to be more precise.

“We were originally conducting research on people with aphasia [a speech and language disorder caused by brain damage],” says Dr Chris Westbury, lead author of the study.

“Test subjects were shown computer-generated letter strings and asked to determine whether they were actual words or not. We noticed a pattern: participants would consistently laugh at certain made-up words but not others.”

"Snunkoople", for example, always gave people a chuckle, whereas "clester" did not. Why? Westbury hypothesised that the answer lay in a word's entropy. In this context, entropy relates to the measure of unpredictability of various combinations of letters. The lower the entropy, the funnier the nonword is likely to be.

In this study, volunteers were asked to compare two nonwords and choose the one they thought was funnier. They were then shown a single nonword, which they had to rate for funniness on a scale of 1 to 100.

“We found the bigger the difference in the entropy between the two words, the more likely the subjects were to choose the way we expected them to,” says Westbury. “What we computed in our research was the contribution of each nonword to the total entropy of the English language.”

“In essence, we treated each nonword as but one part of a very long signal that is the English language. Contribution to total entropy is a measure of how unexpected, improbable or weird a particular string is, but that is not quite entropy, because, in the strictest sense, it is a metric for global rather than local probability,” he says.

“If I say to you, ‘I love the cat, I love you, and I love hablump’, you will be struck by ‘hablump’ because it is unexpected, improbable, weird. We quantified how unexpected, improbable or weird each nonword was (the local probability of that part of the signal), in the context of the predictability of the signal that is English as it is spoken (or written).”
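The calculation Westbury describes can be sketched in a few lines of Python. This is a minimal illustration, not the study’s method: it assumes simple single-letter frequencies (round figures from commonly cited tables, not the corpus probabilities the Alberta team used) and sums each letter’s -p log2(p) contribution, so strings built from improbable letters score lower.

```python
import math

# Approximate relative frequencies of letters in English text.
# These are assumed round figures, not the study's corpus values.
LETTER_FREQ = {
    'e': 0.127, 't': 0.091, 'a': 0.082, 'o': 0.075, 'i': 0.070,
    'n': 0.067, 's': 0.063, 'h': 0.061, 'r': 0.060, 'd': 0.043,
    'l': 0.040, 'c': 0.028, 'u': 0.028, 'm': 0.024, 'w': 0.024,
    'f': 0.022, 'g': 0.020, 'y': 0.020, 'p': 0.019, 'b': 0.015,
    'v': 0.010, 'k': 0.008, 'j': 0.002, 'x': 0.002, 'q': 0.001,
    'z': 0.001,
}

def mean_letter_entropy(word: str) -> float:
    """Average -p * log2(p) over the word's letters, where p is each
    letter's frequency in English. Rare letters contribute little to
    the sum, so strings made of improbable letters score LOWER."""
    letters = [c for c in word.lower() if c in LETTER_FREQ]
    return sum(-LETTER_FREQ[c] * math.log2(LETTER_FREQ[c])
               for c in letters) / len(letters)

# "snunkoople" (with its rare k) should score lower than "clester",
# matching the observation that it was judged funnier.
print(mean_letter_entropy("snunkoople") < mean_letter_entropy("clester"))  # True
```

Under these assumptions, “snunkoople” comes out around 0.22 bits per letter against roughly 0.27 for “clester”, in line with the pattern the researchers report.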

Amazing predictability

Put simply, the research demonstrates that the more improbable a nonword sounds, the funnier it is likely to be. One test subject’s choices matched the entropy-based prediction 92 per cent of the time.

“To be able to predict with that level of accuracy is amazing,” says Westbury. “You hardly ever get that in psychology research.”

Although this study is the first of its kind to look at humour from a quantitative angle, it doesn’t take an academic paper to know that some weird-sounding words are funny. “Comedians have long argued that certain sounds are inherently funnier than others – such as the ‘k’ and ‘oo’ sounds in ‘snunkoople’,” says Dr Tony Veale of UCD’s department of computer science.

“There are many theories of humour, none of which are entirely satisfactory or complete, suggesting it is a complex category with no shared characteristics that apply to every single instance. Rather, humour can only be described by a series of family resemblances.”

Resolving incongruity

What makes something fundamentally funny, according to Veale, is the resolution of incongruity. “A ‘what the hell’ moment,” he says. “If you come home after work and are surprised to find the house is totally empty but later learn everyone is next door, it’s not funny, but you resolve the incongruity.

“If, however, you come home to a seemingly empty house only to be surprised by a big party in your honour, this would be a funny resolution of incongruity.

“If you came home to find your entire family dead, it would be far from funny, but there would still be a resolution. Humour, therefore, could be defined as appropriate incongruity.”

There are several theories as to the evolutionary basis for humour. “We think that it is personal, but evolutionary psychologists have described humour as a message-sending device,” says Westbury. “By laughing, we let others know that someone or something isn’t a threat.”

It’s also an effective social adhesive. “Humour helps to strengthen group cohesion, often at the expense of an outsider,” says Veale.

“Then you have irony, which is used to point to a failure of expectations. Unlike other types of humour, there is always a critical element to irony, making it a very powerful communication device. Humour has a range of functions,” says Veale. “While some overlap, all provide value from an evolutionary perspective.”

American writer and illustrator Theodor Seuss Geisel, aka Dr Seuss, probably wasn’t thinking about entropy levels when inventing nonwords such as “snergelly”, “truffula” or “barbaloots” for his children’s books.

However, psychologists from the University of Alberta have found striking parallels between their research into the relationship between humour and entropy and the creative approach to language taken by Seuss in classics such as The Lorax and The Cat in the Hat.

"We have found that Dr Seuss, who was well-known for his ability to make funny nonwords, did so using combinations of letters that were predictably lower in entropy," says Dr Chris Westbury of the University of Alberta. Take a Seuss word such as "yuzz-a-ma-tuzz" from the book On Beyond Zebra.

“Here we find it has a low entropy because of the frequency of an improbable letter like Z. Of course for Seuss it was intuitive. He was simply inventing words that he thought sounded funny, but essentially it comes down to the probability of the individual letters.”
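The point about the letter Z can be seen in a two-letter sketch, again using assumed round-figure frequencies rather than the study’s own: a rare letter’s -p log2(p) term is tiny, so z-heavy strings like “yuzz-a-ma-tuzz” pull the entropy sum down.

```python
import math

# Assumed English letter frequencies: z is rare, e is common.
contribution = {c: -p * math.log2(p) for c, p in {'z': 0.001, 'e': 0.127}.items()}
print(round(contribution['z'], 3))  # 0.01
print(round(contribution['e'], 3))  # 0.378
```

A z contributes roughly a fortieth of what an e does, which is why a string stuffed with z’s scores so low on this measure.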

John Holden

John Holden is a contributor to The Irish Times specialising in science, technology and innovation