Re: In search of a more-vanilla vanilla TXT

jim said:
In the case of this book, the answer was more than 1000 "losses" -- or an average of about 3 losses per page. And this is NOT counting about an additional 1000 losses in the representation of emphasis.
what was the book? i'd like to compare the versions myself.

the question is, "why are you having _any_ losses in the .txt files?" the answer, i am sure, will once again be, "you're doing it wrong". it sucks, yeah, but it will be important to fix your broken workflow.

seriously, if you are stripping out emphasis, you're making a mistake. (unless the "emphasis" had no meaning, and was simply ornate decor.)

in the meantime, if you do things intentionally that harm the .txt file, then yes, the .txt file is going to seem awfully incapable to you, so it's not a big surprise that you keep insisting that such is the case.

-bowerbird

bowerbird said:
what was the book? i'd like to compare the versions myself.
The book is E-text #29452. And contrary to your complaint, I didn't "strip out" anything. A more accurate statement of your complaint is that I didn't waste a lot of my time and energy inventing and manually inserting semantic markings into a legacy file format that is a hopelessly broken representation of this book in the first place. If you want to hack up the TXT file in some way that makes you happier, feel free to do so -- you're a "volunteer" too. I'm certainly not going to be reading the TXT file, so personally I don't care what you do!
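
For what it's worth, carrying emphasis into a plain-text version is mechanical when the master copy is HTML: the usual convention is to mark italics with underscores rather than drop them. Here is a minimal sketch in Python, assuming an HTML master and the common <i>/<em>/<b>/<strong> tags -- the tag list and function name are illustrative only, not anyone's actual workflow:

    import re

    # Map HTML emphasis tags to the plain-text underscore convention
    # instead of silently dropping them. The tag list is illustrative;
    # nested or attribute-bearing tags would need a real HTML parser.
    EMPHASIS_TAGS = re.compile(r"</?(i|em|b|strong)>", re.IGNORECASE)

    def html_emphasis_to_txt(html: str) -> str:
        """Replace <i>word</i> (and friends) with _word_."""
        return EMPHASIS_TAGS.sub("_", html)

    print(html_emphasis_to_txt("the <i>Beagle</i> sailed on"))
    # -> the _Beagle_ sailed on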