pontifications from mount high horse -- #1500

remember i said that, in the early days of d.p., many people doubted that it would even work, because too many cooks would spoil the soup? the same scenario is now playing out again at d.p., as some people there believe postprocessing can't be distributed, giving similar reasoning. meanwhile, other people -- including don -- think it'd be wise to test to see if distribution will work well enough to clear a bad backlog.

plus, as i said, i believe that digitization will increasingly be done "for the love of a book", by people who'll do that book "end-to-end". while some of them -- like roger frank, or nick hodson -- will do hundreds of books, others might only do one or two, or a few... whatever the case, it would be good if they had a website that guides them through it, and helps 'em attain a good level of quality.

this project, the one i started christmas day, might eventually grow into that kind of site:
because although some people might get a sense of satisfaction by finishing a page, as roger recently argued, i think more will seek that feeling by finishing a whole book, especially a book that's meaningful to them.

as one aspect of "book-dig", i built an editor:
its name comes from the "betty lee" book that we've been using for sample content lately. like the rest of the "book-dig" system, bettyedit is still "under construction", and thus a bit raw (e.g., edits aren't "sticky" yet, but will be soon). but as an example of the type of help it'll give, i've currently coded it to render italic highlights (in magenta) and doublequote-checks (in blue), with due apologies to the color-blind out there.

cleaning the text of books is a lot of fun for me. and coding apps to do that job is even more fun.
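just to give a feel for what those two checks amount to, here's a rough sketch in python -- this is _not_ bettyedit's actual code, just my stand-in; the terminal colors, the <i>...</i> italics convention, and the odd-quote-count rule are all assumptions on my part:

    import re

    # ansi colors standing in for bettyedit's on-screen highlights
    MAGENTA = "\033[35m"   # italic spans
    BLUE = "\033[34m"      # doublequote warnings
    RESET = "\033[0m"

    def highlight_italics(text):
        # wrap <i>...</i> spans in magenta (assumed markup convention)
        return re.sub(r"<i>(.*?)</i>",
                      lambda m: MAGENTA + m.group(1) + RESET, text)

    def flag_odd_doublequotes(paragraph):
        # a paragraph with an odd number of doublequotes probably has
        # one missing (or one extra) -- paint it blue for a human to check
        if paragraph.count('"') % 2 != 0:
            return BLUE + paragraph + RESET
        return paragraph

    def clean_view(page_text):
        # apply both checks paragraph-by-paragraph
        paragraphs = page_text.split("\n\n")
        return "\n\n".join(flag_odd_doublequotes(highlight_italics(p))
                           for p in paragraphs)

    if __name__ == "__main__":
        sample = ('she opened <i>betty lee, freshman</i> and read.\n\n'
                  '"where did the closing quote go? asked betty.')
        print(clean_view(sample))

run it on a page and you get the text with italic spans in magenta, and any paragraph with an unbalanced doublequote painted blue for a human to inspect.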
***

have a nice year in 2012. i will.

-bowerbird

BB>the same scenario is now playing out again at d.p., as some people there believe postprocessing can't be distributed, giving similar reasoning.

Not to go too far in trying to defend DP, but some years ago I was complaining bitterly that as the queue backlog increased, the proportion of unfinished books was increasing right along with it, such that 1/3 of all books effectively "never" get finished (statistically and queuing-theory speaking). But this is no longer the case. The backlog is being somewhat better managed, with a larger proportion of books actually getting finished. Now, this is probably just due to them chasing away newbies....

Not sure anyone can come up with a system whereby books are finished for some reason other than "a labor of love", because let's be honest, finishing up a book is a pain in the behind. Getting it "close" is duck soup. Ultimately *someone* has got to actually *care.*

http://www.pgdp.net/c/stats/stats_central.php
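For what it's worth, the back-of-the-envelope arithmetic behind that 1/3 figure is easy to sketch (the monthly rates below are invented for illustration, not DP's actual numbers): if completions run at only two-thirds of arrivals, the remaining third piles up without bound, which in queuing terms means it effectively never gets finished.

    # toy sketch of the "1/3 effectively never finished" claim;
    # the rates are hypothetical, not DP's real statistics
    arrivals_per_month = 30    # new projects entering the queue
    finished_per_month = 20    # projects actually completed

    backlog = 0
    for month in range(60):    # five hypothetical years
        backlog += arrivals_per_month - finished_per_month

    stuck = (arrivals_per_month - finished_per_month) / arrivals_per_month
    print(f"backlog after 5 years: {backlog} books")
    print(f"fraction of arrivals that never clear the queue: {stuck:.0%}")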