It's been theorized that some combination of widespread literacy and large numbers of people speaking the same language should slow language innovation. And indeed, mining Google's English corpus data, we see a growth rate of about 8,500 new words a year, and that rate is slowing. But another theory put forward is that the marginal utility of new words is decreasing. One reason we start using new words is to describe new technologies, and technological change certainly hasn't slowed in recent centuries relative to earlier ones, so it's hard to see how the marginal-utility theory could apply.
Another way to test this would be a study of whether new technologies are being described by new combinations of existing words, or by words that have undergone semantic drift (the same word applied with a new meaning). This would be harder to do, because the machines would have to parse semantic content, and if we could do that well we'd already have lots of Turing-test-passing computers.
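That said, a crude proxy for semantic drift doesn't require full semantic parsing. One common approach (my illustration, not something proposed in the comment above) is to compare a word's co-occurrence contexts in corpora from two different eras: if the contexts barely overlap, the word has likely drifted. A minimal sketch using toy two-era corpora and cosine similarity of context-count vectors:

```python
from collections import Counter
import math

def cooc_vector(word, corpus, window=2):
    """Counts of words appearing within +/-window tokens of `word`."""
    tokens = corpus.split()
    vec = Counter()
    for i, t in enumerate(tokens):
        if t == word:
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    vec[tokens[j]] += 1
    return vec

def cosine(a, b):
    """Cosine similarity of two sparse count vectors (Counters)."""
    dot = sum(a[k] * b[k] for k in set(a) | set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Toy "old era" vs "new era" corpora: "mouse" drifts from animal to device.
old_corpus = "the mouse ate the cheese in the barn and the cat chased the mouse"
new_corpus = "click the mouse to move the cursor and the mouse scrolls the page"

# Drift score in [0, 1]: higher means the contexts share less.
drift = 1 - cosine(cooc_vector("mouse", old_corpus),
                   cooc_vector("mouse", new_corpus))
```

A real study would use embeddings trained on decade-sliced corpora and filter out stopwords (here "the" dominates and dampens the score), but the basic signal is the same: distance between a word's contexts across time.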