In the Handbook of Terminology Management, Wright and Budin define terminology management as "any deliberate manipulation of terminological information" (Wright and Budin 1997: 1). This is an admittedly broad definition, but the key word is "deliberate." Terminology management, as a series of actions carried out in a planned manner, ensures the availability of terms, definitions, metadata, and other information pertaining to terminology. Terminology management guarantees that terminology is a known entity: one manages terminology so that one knows what terminology one has. Unmanaged terminology, on the other hand, is terminology that has not been documented; it represents terms and definitions about which one knows little or nothing. Like other types of data, terminology is most useful when it is documented and organized.
At many software companies, "terminology management" follows a scenario similar to the following: After a source-language (typically U.S. English) product is created, a documentation specialist compiles a glossary (i.e., a list of terms and definitions in the source language). If the product is to be localized, the key source terms are collected and passed - along with the glossary - to target-language localizers. The "target language" is the language into which the source language is to be translated (Japanese, for example). Target-language localizers grapple with inconsistencies in the source terminology and may have time to query the software company about a few items. The timeline for providing localized versions is typically extremely short; even so, localized versions are sometimes more consistent than source-language versions, since localizers tend to pay much more attention to terminology and consistency than developers do.
While the target-language terminology in this scenario qualifies as "managed" under the definition given in the Handbook, the source-language terminology does not. The fact is that most software companies have no specific process for managing terminology beyond collecting terms for their source glossary. Most do not, for example, check source terminology for internal consistency or for consistency across products. And many smaller companies may rely on the terms used by large software publishers, such as Microsoft, which themselves employ inconsistent terminology.
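To make the idea of an internal-consistency check concrete, here is a minimal sketch of the kind of automated pass a company could run over its source strings. The variant groups and sample strings are hypothetical illustrations, not drawn from any actual product or terminology tool.

```python
# Minimal sketch: flag source strings that mix variants of the same concept.
# The variant groups and sample strings below are hypothetical examples.

variant_groups = {
    "terminate-application": {"cancel", "quit", "close", "end", "stop"},
}

source_strings = [
    "Click Quit to close the application.",
    "The program will end when you press Stop.",
    "Save your work before you exit.",
]

for string in source_strings:
    words = {w.strip(".,").lower() for w in string.split()}
    for concept, variants in variant_groups.items():
        found = words & variants
        if len(found) > 1:
            print(f"Mixed terms for '{concept}' in {string!r}: {sorted(found)}")
```

A real process would of course work from a curated termbase and handle multi-word terms and inflection, but even a crude pass like this surfaces the kind of inconsistency that otherwise reaches the localizer undetected.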
Traditionally, the localization community in each company has by default been the group most likely to try to convince source development teams that terminology documentation is important. Even so, those who develop software, user interfaces, and user documentation may never see the localization costs of not managing terminology, arguably the most compelling reason to implement source terminology documentation processes. For example, a development team in the U.S. may not keep track of the terms it uses to refer to closing or stopping an application, using terms such as cancel, quit, close, end, and stop inconsistently in user interface elements and error messages. No one on the development team will think twice about this inconsistent usage - they know what these terms mean. What developers don't realize is that the localizer must treat every change in form as a change in meaning. If a localizer has a match for the term close, for example, but not for quit, she must do some research to determine whether quit means the same thing as close, and whether she should then use the same translation for both. This probably involves contacting the vendor manager on the U.S. team, who must then try to find someone on the U.S. team who can definitively answer the question. It would not be unusual for five to seven people to be involved in resolving this kind of problem, each of whom spends ½ to 1 hour on it. This might cost a software company an average of $50 per hour per person involved, plus basic overhead costs. If the same question comes in from multiple languages to different U.S. team contacts, and the company maintains no central clearinghouse for localization issues, this could easily become a $1,000 question.
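The figures above can be combined into a rough back-of-envelope estimate. The sketch below uses the headcount, time range, and hourly rate from the paragraph; the overhead factor and the number of languages raising the same question are assumptions added for illustration.

```python
# Rough back-of-envelope estimate of the cost of one terminology query,
# using the illustrative figures from the text (not measured data).

people_involved = 6        # "five to seven people" -> midpoint
hours_per_person = 0.75    # "1/2 to 1 hour" each -> midpoint
hourly_rate = 50           # average cost per hour per person, USD
overhead_factor = 1.2      # assumed 20% overhead on top of labor

cost_per_language = people_involved * hours_per_person * hourly_rate * overhead_factor
print(f"Cost to resolve the query once: ${cost_per_language:,.0f}")

# With no central clearinghouse, the same question may be asked
# independently by several target-language teams.
languages_asking = 4       # assumed number of languages raising the same query
total_cost = cost_per_language * languages_asking
print(f"Cost across {languages_asking} languages: ${total_cost:,.0f}")
```

Under these assumptions a single unresolved term costs roughly $270 per language, and four languages asking independently push the total past $1,000 - which is how an undocumented term quietly becomes a "$1,000 question."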
In addition, if the localization tasks are outsourced, the localization vendor may not (or may not be able to) push for increased efficiency on the source side. Thus the cycle of unmanaged terminology continues for all languages. This classic scenario has several repercussions: it enables the bad habits of those creating the source terminology (typically U.S. English at U.S. companies), it ensures customer confusion at some level, and it ensures that these software companies will continue to spend more on localization than they have to. Companies that put most of their energy towards cleaning up localization issues are in fact dealing with the symptoms rather than the root causes of the problem.
A great illustration of the cost of not managing terminology is the error message, a dialog box that appears on the screen when a user or program has performed an unexpected or illegal action. A typical program can contain hundreds of error messages; larger programs, such as operating systems, contain thousands.