
Scientists decode books to detect literary 'fingerprint'

Last Updated 11 December 2009, 17:31 IST

New research from a group of Swedish physicists at Umeå University describes the ‘meta-book’: a concept that uses the rate at which authors introduce new words into their work to discern distinct patterns in their written styles.

For more than 75 years, George Kingsley Zipf’s maxim, famously illustrated with the Brown Corpus, a carefully selected compilation of American English, has suggested a universal pattern for the frequency of the words authors use.

Zipf’s law holds that a word’s frequency is inversely proportional to its frequency rank: the second most common word appears roughly half as often as the most common, the third a third as often, and so on. The new research suggests, however, that the truth behind word frequency is less universal than Zipf asserted, and is tied more to an author’s individual linguistic style than to any overarching linguistic rule.
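Readers can check the pattern themselves with a minimal sketch like the one below (this is not the Umeå team's code, whose details the article does not give). It tallies rank multiplied by frequency for any plain text; under Zipf's law that product stays roughly constant. The filename in the usage comment is a hypothetical placeholder.

```python
import re
from collections import Counter

def zipf_table(text, top=10):
    """Rank words by frequency and report rank * frequency.
    Under Zipf's law this product stays roughly constant."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    for rank, (word, freq) in enumerate(counts.most_common(top), start=1):
        print(f"{rank:>4}  {word:<12} freq={freq:<6} rank*freq={rank * freq}")

# Usage: any plain-text novel, e.g. downloaded from Project Gutenberg.
# with open("moby_dick.txt") as f:
#     zipf_table(f.read())
```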

The researchers first found that the rate at which new words appear in novels by Hardy, Lawrence and Melville drops off as each book gets longer, despite new settings and plot twists.

Their evidence also shows, however, that the rate of this unique-word drop-off varies from author to author and, most significantly, is consistent across the entire body of work of each of the three authors analysed.
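One simple way to visualise such a drop-off, sketched below under the assumption of plain-text input, is to track how many distinct words have been seen after each block of tokens; a curve that flattens faster means the author introduces new words more sparingly. This is an illustrative reconstruction, not the researchers' published method.

```python
import re

def vocab_growth(text, step=1000):
    """Number of distinct words seen after every `step` tokens.
    A curve that flattens quickly means new words arrive ever more rarely."""
    words = re.findall(r"[a-z']+", text.lower())
    seen, curve = set(), []
    for i, word in enumerate(words, start=1):
        seen.add(word)
        if i % step == 0:
            curve.append((i, len(seen)))
    return curve

# Usage: compare the shape of the curve for two authors
# (filenames here are placeholders).
# for tokens, vocab in vocab_growth(open("tess.txt").read()):
#     print(tokens, vocab)
```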

The statistical analysis was applied to entire novels, sections of novels, complete works and amalgamations of different works by the same author; in every case, the texts carried the same author-specific word-frequency “fingerprint”.
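A fingerprint of this kind could, for instance, be summarised by the exponent of a power-law fit to the vocabulary-growth curve (a Heaps'-law style fit, which is our assumption here, since the article does not name the researchers' exact statistic). The sketch below estimates that exponent by least squares in log-log space; comparable values from chapters, whole novels and concatenations of one author's books would behave like the consistency described above.

```python
import math
import re

def heaps_exponent(text, step=1000):
    """Estimate beta in V ~ K * n**beta, where n is the number of
    tokens read and V the number of distinct words seen so far.
    Assumes a long text (several thousand words at least)."""
    words = re.findall(r"[a-z']+", text.lower())
    seen = set()
    xs, ys = [], []
    for i, word in enumerate(words, start=1):
        seen.add(word)
        if i % step == 0:
            xs.append(math.log(i))
            ys.append(math.log(len(seen)))
    # Ordinary least-squares slope in log-log space.
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    var = sum((x - mean_x) ** 2 for x in xs)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    return cov / var

# A beta that barely moves across slices and amalgamations of one
# author's texts would be the kind of "fingerprint" described above.
```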

Using the statistical patterns evident in their study, the researchers have pondered the idea of a meta-book: a code for each author that could represent their entire body of work, whether completed or still in the mental pipeline.

