Clarification of Question by mikael-ga on 08 May 2006 22:46 PDT
Hi again,
Here are my attempts at some clarifications:
1) We receive papers from many different authors in most of the major
languages (English, German, French, Russian, and we also see Arabic
coming in the future). This means that roughly 90% of each paper
(everything except the initial abstract) is written in the author's
original language.
2) My understanding of these software tools is that they run locally on
a PC (or are hosted somewhere) and you supply an original text to be
cross-checked against the program's reference database. This database
can be either proprietary or built from a series of defined web searches
against web sites. My problem is that I need to be sure that the
system checks against the web sites that we know our authors use when
doing their research or fact gathering. Otherwise, I'm not really sure
how confident we could be that the system would actually detect any
"copy and pasting" from such sites.
But I could be wrong in my understanding of how these systems work, so
please also include some links on the technical foundations of these
systems. (I have put a rough sketch of my assumption at the end of this
message.)
3) A system that I have tried is called "EVE - Essay Verification
Engine". It runs locally on my PC, but seems a little too lightweight
for us.
It might also be that another source of such systems is the traditional
search engine vendors like Autonomy, FAST, Google, dTSearch, etc., who
could supply specially tailored versions of their software that provide
this kind of functionality.
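To illustrate what I mean in point 2, here is a very rough sketch (in
Python) of how I imagine a check against a fixed list of sites could
work. The URLs and the 8-word-sequence matching are just my own
assumptions for illustration, not how any of these products is actually
implemented:

# Rough sketch of my assumption: fetch a fixed list of sites our authors
# use, and flag any 8-word sequences (n-grams) that the submitted text
# shares with those pages. The URLs below are hypothetical placeholders.
import re
import urllib.request

SOURCE_URLS = [
    "http://example.org/known-source-1",  # placeholder, not a real source
    "http://example.org/known-source-2",
]

def ngrams(text, n=8):
    # All overlapping n-word sequences, lower-cased
    words = re.findall(r"\w+", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def check_document(doc_text):
    doc_grams = ngrams(doc_text)
    for url in SOURCE_URLS:
        page = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
        page_text = re.sub(r"<[^>]+>", " ", page)  # crude HTML stripping
        shared = doc_grams & ngrams(page_text)
        if shared:
            print(url, "shares", len(shared), "8-word sequences, for example:")
            print("   ", sorted(shared)[0])

# Example use:
# check_document(open("submission.txt", encoding="utf-8").read())

My question is basically whether the commercial systems let us control
that list of sites (or the underlying web searches) ourselves, or
whether the reference database is fixed by the vendor.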
regards / Mikael