Algorithm::Diff::XS's performance can be rather unpredictable. I'm diffing a document with 14k tokens and runtimes vary between 9ms and 900ms. The overall number of changes doesn't seem to affect that much; instead, it looks like long runs of consecutive insertions are causing the slowness.
Reducing the number of tokens by doing a line-based rather than word-based diff speeds things up considerably, but the coarser output is less useful.
Maybe I should try a two-step diff: first find the changed lines, then apply a word-based diff only within those.
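A rough sketch of that two-step idea, using Python's `difflib` rather than Perl's Algorithm::Diff::XS (the structure would be the same either way; function and variable names here are made up for illustration):

```python
import difflib

def two_step_diff(old_text, new_text):
    """Diff at line granularity first, then refine only the changed
    blocks with a word-based diff. Keeps the expensive fine-grained
    pass away from long runs of unchanged or purely inserted lines."""
    old_lines = old_text.splitlines()
    new_lines = new_text.splitlines()
    ops = []
    line_sm = difflib.SequenceMatcher(a=old_lines, b=new_lines, autojunk=False)
    for tag, i1, i2, j1, j2 in line_sm.get_opcodes():
        if tag == 'replace':
            # Only lines that actually changed get the word-level pass.
            old_words = ' '.join(old_lines[i1:i2]).split()
            new_words = ' '.join(new_lines[j1:j2]).split()
            word_sm = difflib.SequenceMatcher(a=old_words, b=new_words,
                                              autojunk=False)
            for wtag, a1, a2, b1, b2 in word_sm.get_opcodes():
                ops.append((wtag, old_words[a1:a2], new_words[b1:b2]))
        else:
            # equal / insert / delete blocks stay line-based.
            ops.append((tag, old_lines[i1:i2], new_lines[j1:j2]))
    return ops
```

For example, diffing `"the quick brown fox"` against `"the quick red fox"` inside an otherwise unchanged document yields a single word-level `('replace', ['brown'], ['red'])` op instead of replacing the whole line.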