
I'm one of those people guilty of a big backlog, with 13 books in various stages of completion in post-processing after going through DP. I always do SGML, XML, and HTML versions, and I take on heavily illustrated works with lots of tables, which need a lot of work no matter what the proofreaders do. A few times I've left out the most horrendous tables to type them myself later, or even dropped them from the work altogether. Adding ASCII versions adds further delay: I can generate the HTML from the SGML automatically, but ASCII is simply too hard to automate, which means redoing the tables by hand. An average novel can be PP-ed in a few hours, but these scientific works take many more. My longest-running project (not through DP) is Alberuni's India, with long citations in Greek, Persian, Sanskrit, etc., often in the original scripts (five different scripts in this book). Spread out over five years, hundreds of hours have gone into it, and it is not yet done.

Jeroen.

Joshua Hutchinson wrote:
DP produces 200+ new books a month. Unfortunately, the proofers at DP finish about 250 books a month, which means we have an ungodly backlog of texts that need to be post-processed (over 450 books right now). Our proofing output has scaled up since last year, but our post-processing has not kept pace. There are plans to ease the bottleneck; unfortunately, finding developers to implement those ideas is another bottleneck.
As big_bill at DP always says, though ... those books aren't going anywhere, and we will get to them eventually. :)
JHutch