

Hi there,

Let me chime in here. Yes, you can use these tools as a start and for casual use, but otherwise you can forget them as a professional tool:

- Due to the statistical model you get only 80-90% accuracy.
- I see a lot of sites on which the content for the different languages differs, so no comparison is possible.
- I have worked on and helped develop such tools, and I know that they give interesting results, in the range above. Yet these methods are only good as an analysis tool.
- A system with fairly decent grammar models and lexicons gives better results using fewer resources. The problem here is that such systems are not publicly available.

The actual problem with translation is the so-called extra-linguistic part: culture-related facts, context, register, etc. To get the last 5% needed for a decent translation, the effort and resources rise exponentially. As proof: in the 80s the Japanese said they would have real-time translation for telephones on the market within 5 years. This was, and is, vaporware.

The method is not new. It was used successfully for weather reports already in the 80s, but it works only for small areas of knowledge/language. In the 70s, word-for-word translation used to be good enough. Now they have something they say is "good enough".

My two Euro cents' worth,
Keith

On 09.03.2006 at 09:41, Bowerbird@aol.com wrote:
http://www.dancohen.org/blog/posts/no_computer_left_behind said:
Google researchers have demonstrated (but not yet released to the general public) a powerful method for creating 'good enough' translations—not by understanding the grammar of each passage, but by rapidly scanning and comparing similar phrases on countless electronic documents in the original and second languages. Given large enough volumes of words in a variety of languages, machine processing can find parallel phrases and reduce any document into a series of word swaps. Where once it seemed necessary to have a human being aid in a computer's translating skills, or to teach that machine the basics of language, swift algorithms functioning on unimaginably large amounts of text suffice. Are such new computer translations as good as a skilled, bilingual human being? Of course not. Are they good enough to get the gist of a text? Absolutely. So good the National Security Agency and the Central Intelligence Agency increasingly rely on that kind of technology to scan, sort, and mine gargantuan amounts of text and communications (whether or not the rest of us like it).
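The "parallel phrases and word swaps" idea above can be illustrated with a toy sketch. This is not Google's system, just a minimal, invented example of the general approach: mine a phrase table from aligned sentence pairs, then translate greedily by longest-phrase match. All data and function names here are hypothetical.

```python
# Toy sketch of phrase-based translation: build a crude phrase table
# from aligned sentence pairs, then translate by greedy phrase swaps.
from collections import Counter

def mine_phrases(pairs, max_len=3):
    """Count same-length target n-grams for each source n-gram across
    aligned pairs (a crude stand-in for real phrase alignment)."""
    counts = {}
    for src, tgt in pairs:
        s, t = src.split(), tgt.split()
        for i in range(len(s)):
            for j in range(i + 1, min(i + max_len, len(s)) + 1):
                phrase = " ".join(s[i:j])
                n = j - i
                counts.setdefault(phrase, Counter()).update(
                    " ".join(t[k:k + n]) for k in range(len(t) - n + 1))
    # keep the most frequent co-occurring target phrase for each source phrase
    return {p: c.most_common(1)[0][0] for p, c in counts.items()}

def translate(sentence, table, max_len=3):
    """Greedy longest-match decoding: swap phrases left to right."""
    words, out, i = sentence.split(), [], 0
    while i < len(words):
        for j in range(min(i + max_len, len(words)), i, -1):
            phrase = " ".join(words[i:j])
            if phrase in table:
                out.append(table[phrase])
                i = j
                break
        else:
            out.append(words[i])  # unknown word: pass it through unchanged
            i += 1
    return " ".join(out)

table = mine_phrases([("das haus", "the house"),
                      ("das auto", "the car"),
                      ("haus ist gross", "house is big")])
print(translate("das haus", table))  # -> "the house"
```

With enough parallel text, frequency alone picks plausible swaps, which is the "good enough" effect the post describes; the last 5% (context, register, culture) is exactly what this kind of counting cannot see.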
sounds like something you might find interesting, michael. of course, a "good enough" translation probably wouldn't be good enough, not for literature, where the realm of creativity is instantiated,

but could it work as a "first pass" that would do the bulk of the "heavy lifting", so that a person knowledgeable in both languages could come in and spend relatively little time smoothing it out? well, it's certainly possible, i would think. and maybe probable, especially if progress on the technique proves forthcoming...
-bowerbird

_______________________________________________
gutvol-d mailing list
gutvol-d@lists.pglaf.org
http://lists.pglaf.org/listinfo.cgi/gutvol-d