IEEE Transactions on Professional Communication
December 1979
-
Abstract
From empirical data on the authorship of scientific papers, Alfred Lotka deduced an inverse-square law relating the number of authors of scientific papers to the number of papers written by each author. A basic assumption underlying Lotka's law is that the number of papers published by a scientist is a measure of his contribution to science. This assumption is debatable. In this paper Lotka's law is applied to the literature of computer science. The inconsistent results of earlier attempts to apply Lotka's law to the literature of various scientific disciplines, including computer science, are ascribed to the differences in sampling procedure and treatment of multiple authorship.
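The inverse-square relationship can be sketched numerically. The figure of 600 single-paper authors below is invented purely for illustration:

```python
# Lotka's inverse-square law: if a1 authors each write exactly one
# paper, roughly a1 / n**2 authors write exactly n papers each.
# The starting count of 600 single-paper authors is hypothetical.

def lotka_predicted_authors(a1, n):
    """Predicted number of authors contributing exactly n papers."""
    return a1 / n ** 2

predictions = {n: lotka_predicted_authors(600, n) for n in (1, 2, 3, 4)}
# 600 authors with 1 paper implies 150 with 2, ~67 with 3, ~38 with 4
```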
-
Abstract
The teaching system of the Open University of Britain is briefly summarized and related to a feedback model of the educational process. It is shown that the main further development needed is to improve the remedial components of the system and several developments towards this goal are described. These include computer-based information retrieval and educational systems; a dedicated teleconferencing network; and use of the standard TV set coupled, for example, to an audio-cassette recorder, a telephone line, and a light pen.
September 1979
-
Abstract
A major motivation is to achieve in man-machine interactions the efficiency of speech communication among humans. Continuous speech is more difficult to understand than are isolated words. Commercially available speech recognition systems of the latter type are highly successful despite their limited capability. To recognize continuous speech, more information is needed than is contained in acoustic waves alone. The linguistic and contextual knowledge that must be supplied or programmed into a computer to accomplish speech interpretation is the subject of several research activities which are described. Speech synthesis systems face similar problems but are further advanced.
-
Abstract
Words such as `input' and `feedback' can be useful and meaningful when accurately defined in a technical context such as computer science. English professor John McCall at the University of Cincinnati has begun a semi-serious campaign against the metaphoric use of such technical terms in non-technical or unrelated conversation and writing.
-
Abstract
The bibliographic reference and citations which exist among documents in a given document collection can be used to study the history and scope of particular subject areas and to assess the importance of individual authors, documents, and journals. A clustering study of computer science literature is described, using bibliographic citations as a clustering criterion, and conclusions are drawn regarding the scope of computer science and the characteristics of individual documents in the area. In particular, the clustering characteristics lead to a distinction between core and fringe areas in the field and to the identification of particularly influential articles.
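One simple way to realize citation-based clustering is to measure the overlap of two documents' reference lists; the abstract does not specify the paper's actual similarity measure or algorithm, so the Jaccard coefficient, the documents, and the threshold below are all illustrative assumptions:

```python
# Minimal sketch of clustering documents by shared bibliographic
# citations. Documents citing many of the same references group
# together; isolated documents fall on the 'fringe'.

def jaccard(refs_a, refs_b):
    """Similarity of two documents = overlap of their reference lists."""
    a, b = set(refs_a), set(refs_b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical documents and the references they cite:
docs = {
    "D1": ["R1", "R2", "R3"],
    "D2": ["R2", "R3", "R4"],
    "D3": ["R9"],            # shares nothing: a 'fringe' document
}
core = [d for d in docs
        if any(jaccard(docs[d], docs[e]) > 0.3 for e in docs if e != d)]
# D1 and D2 couple strongly; D3 falls outside the core.
```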
December 1978
-
Abstract
The design of man-computer dialog requires the underlying theoretical framework of basic concepts present in other applied arts. Four basic aspects of man-computer interaction are predictability, implication (the extensive use made by humans of context), experimentation (the importance of trial and error procedures), and motivation (the part played by feelings such as trust, hostility, etc.). Nonverbal communication should also be exploited to widen the rather limited `bandwidth' of current computer terminals.
-
Abstract
This article reports an investigation of the technical and economic aspects of the multiple use of bibliographic data and abstracts of articles in machine readable format. Various techniques such as OCR, word processing, and photocomposition are available for data acquisition and transmission. There is also sufficient economic justification to encourage sharing of such data among publishers. But questions of cost sharing, standards, copyright, and procedural adjustments in management and accounting are powerful deterrents to the multiple use of publication data both within an organization and among different publishing concerns. The advantages of sharing bibliographic data can be achieved only if publishing concerns in both the public and private sectors are willing to cooperate and adjust.
September 1978
-
Abstract
This paper presents a series of graphic design experiments using an experimental colour graphic display system. Design principles and capabilities of the system are discussed from a graphic designer's point of view. The system allows a designer to choose freely among 128 different colours, various form modes, and collage capabilities, including image mixing. The designer need be neither a programmer nor one who understands the technical aspects of the system to use it creatively. Experimental results are shown visually here, some of which have been used as cover designs for IBM publications.
June 1978
-
Abstract
Retrieval of all the works of a given author in one, or several, computer-searchable bibliographic data bases is made difficult, if not impossible, by inconsistent author name data. Varying forms of an author's name appear as different author names in a computer-generated author index. Authors, publishers, and data base generators should adopt policies that support a consistent name form to alleviate this information problem.
March 1978
-
Abstract
The technique called Information Mapping applies concepts of modularity and system analysis to the preparation of training, product, accounting, procedural, and computer software manuals. The How To volume for Information Mapping is a course in structured writing — a combination textbook, programmed self-instruction manual, guide for teachers, discussion of rationale, presentation of examples, and basis for seminars.
-
Abstract
Soundspel is a phonetic spelling system based on the transliteration of 44,000 most-used English words. It uses letters and letter-pairs consistently to represent the sounds in those words. A computer programmed with both the traditional and the “logical” spellings can provide Soundspel output in numerous stages of conversion from traditional English input.
December 1977
-
Abstract
A personal-use (i.e., small-scale, individual-use), computer-based microfiche filing and retrieval system is described which differs principally from a large-scale literature or library system in that indexing is not based on document content but results from causal relations in the user's work-based activity patterns. Immediate retrieval of the full document gives the user a high level of awareness. Field experience indicated that the number of keywords needed is easily manageable (less than 200) and that ease of document location was enhanced over paper-file searching in two-thirds of the attempts. The application experience is of value in introducing automated retrieval into the general office environment.
-
Abstract
A five-year interdisciplinary effort by speech scientists and computer scientists has demonstrated the feasibility of programming a computer system to `understand' connected speech, i.e., translate it into operational form and respond accordingly. An operational system (HARPY) accepts speech from five speakers, interprets a 1000-word vocabulary, and attains 91 percent sentence accuracy. This Steering Committee summary report describes the project history, problem, goals, and results.
-
Abstract
Citation data from the 1975 and 1976 Journal Citation Reports were used to develop a `computer science impact factor' for ranking `core' journals in computer science. The starting set of source journals included Communications of the ACM, Computer, IEEE Transactions on Computers, and Journal of the Association for Computing Machinery, to which Computer Journal was added in a later iteration. The problems and limitations of citation analysis are discussed and results are compared with a previous analysis (K. Subramanyam, IEEE Trans. Prof. Commun., vol.PC-19, no.2, p.22-25, Dec. 1976). Librarians' use of the results should be made in the light of their own experience of user requirements.
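The impact-factor idea can be illustrated with a toy calculation. The formula shape follows the usual Journal Citation Reports definition, and the citation figures are invented:

```python
# A journal's impact factor for year Y: citations received in Y to its
# items from years Y-1 and Y-2, divided by the number of items it
# published in those two years. All figures here are hypothetical.

def impact_factor(citations_to_recent_items, recent_items_published):
    return citations_to_recent_items / recent_items_published

# e.g. 300 citations in 1976 to a journal's 1974-75 articles,
# with 150 articles published in 1974-75:
print(impact_factor(300, 150))  # 2.0
```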
November 1977
-
Abstract
This paper consists of two parts: (1) a broadstroke survey of the substance, history, and purpose of copyright legislation; and (2) a discussion of the social, economic, and legal challenges to copyright resulting from an almost myopic emphasis on the social value of rapid accumulation and dissemination of information through the devices of computer technology.
-
Abstract
paid circulation and a smaller base over which to distribute the expenses of the publications program, driving up unit costs to unacceptable levels. Such appreciable increases in cost would defeat the Society's efforts to maintain the broad and effective distribution of its publications. "The Society therefore urges that the systematic library reproduction of copyrighted scientific articles without permission of or payment to the publisher will have a deleterious effect on the dissemination of chemical knowledge and on efforts to promote scientific research in the field of chemistry." If unlicensed and unpaid-for dissemination of such journals or the articles or data contained therein is accomplished through remote-user retrieval from central information systems rather than by photoduplication, the result will be the same, namely, the elimination of the nonprofit publisher's ability to recover its expenses through sales of its publications and hence a negation of its ability to continue publishing activities. The only realistic publishing alternative to the elimination of the system of economic incentive and financial return established by copyright is government or government-subsidized dissemination. However, this must be considered unacceptable to both scientists and the general public. It necessarily involves official bureaucratic control over the choice of works to receive exposure, whether in conventional printed form or by computer storage and retrieval, as well as the potential for suppression of unorthodox views.
-
Abstract
The National Commission on New Technological Uses of Copyrighted Works (CONTU) is directed to evaluate the effects of the new copyright law with respect to new, developing technologies. Its two basic concerns are with the ends to which copyrighted materials are used in or via computers and the effects of ubiquitous copying machines. Four subcommittees deal with these areas: software, data bases, new works, and photocopying. CONTU has promulgated guidelines for interlibrary-loan photocopying developed in conjunction with author, publisher, and library groups.
-
Abstract
This is a combined reprint of section 117 of the Copyright Law of 1976 and of the short explanatory statement common to both Senate Report 94–473 and House Report 94–1476. In essence, it simply preserves the status quo of any author's rights or restrictions based in law on December 31, 1977.
September 1977
-
Abstract
The EPC concept is based on sharing the use of highly automated editorial, production, marketing, and business systems by a group of publishers large enough to attain useful economies in operation. The EPC is viewed as a way to provide computer support to the publishers of many small scientific journals that are currently experiencing technical and financial difficulties. Studies directed toward generalized conclusions about the EPC, in all its organizational and economic complexities, have been supported by the National Science Foundation since 1973. Results from these projects and plans for the future are discussed.
-
Abstract
The Institute for Scientific Information® (ISI®) is a multinational corporation that provides a wide variety of information services to scientists and librarians throughout the world. Included are the Science Citation Index®, Current Contents®, and others which depend on sophisticated computer processing for timely production. This paper describes how certain information elements are extracted from each journal article and processed through the ISI system. Examples are given of how recent computer technology has been applied to keep ISI services cost-effective as well as to improve their quality.
-
Abstract
Much of today's CRT photocomposition is being processed through computer systems from typed OCR input or from magnetic tapes. Both approaches have some attraction for publishers who may have the capability of preparing such input within their own editorial offices, or from available computerized master files of data. Advantages of these procedures and some problems which have been encountered are discussed. The importance of personnel selection and training, attention to details of specifications, and equipment maintenance are emphasized. With careful programming, data files can frequently be converted into typesetting language, selecting and rearranging material to meet a publisher's needs.
June 1977
-
Abstract
Programming and English text writing are creative activities that usually require editing, and both kinds of editing can be enhanced via computer. Microprocessors with as little as 4 kbytes of memory can be used. Choice of hardware and its limitations are discussed. Differences between program editing and English text editing can be reflected in software but are not incompatible. Software features such as scrolling, indexing, and word wrap are described along with editing functions such as `insert' and `delete'.
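The two editing functions named can be sketched over a plain string buffer. This is a toy stand-in, not the representation any of the microprocessor editors described actually used:

```python
# Toy 'insert' and 'delete' primitives over an immutable string buffer.
# Real editors of the era used gap buffers or line tables; a string
# is used here only to keep the sketch short.

def insert(buffer, pos, text):
    """Return buffer with text inserted at character position pos."""
    return buffer[:pos] + text + buffer[pos:]

def delete(buffer, pos, length):
    """Return buffer with length characters removed starting at pos."""
    return buffer[:pos] + buffer[pos + length:]

line = insert("editing computer", 8, "via ")   # 'editing via computer'
line = delete(line, 0, 8)                      # 'via computer'
```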
-
Abstract
Terminal technology in the next two decades will be based largely on current and near-future research and development (R&D) which, in turn, is based largely on what today's terminal users and R&D managers think will be wanted in the future. In the realm of office work, the R&D priority is predicted to be the use of computers to regain the advantages of 1) human communication, e.g., in teleconferencing; 2) paper as an office medium; and 3) intensive and accessible work environments.
December 1976
-
Abstract
Attempts to identify the `core' primary journals in the newly emerging discipline of computer science. Two measures of significance have been used to draw up a core list of journals: Productivity in terms of the number of source items drawn from each journal for review in a secondary journal (Computer Reviews); and extent of use, as reflected by the frequency with which the journals are cited in a leading primary journal (IEEE Transactions on Computers) over a period of two years. This study has shown that the primary journal is the most favored medium for scholarly communication, and that computer science journal literature has consolidated into a relatively few very productive and highly cited primary journals. These core journals are indispensable in any specialized collection of computer science literature.
March 1976
-
Abstract
The typesetting of mathematics requires consistency and high quality, which has formerly meant high labor costs and long production schedules. OCR devices, computers, and CRT phototypesetters can provide a low-cost, high-speed alternative to labor-intensive metal typesetting when they are coupled to an adequate software system. The software must encompass the many complex, though precisely definable, rules for math setting and an easy-to-learn, labor-efficient input language. This paper describes such a system, currently in use, including hardware, the logic and assumptions of the software, and examples of the input language.
December 1975
-
On-line generation of terminological digests in language translation. An aid in terminology processing
Abstract
This paper describes a technique for generating terminological digests speedily on terminals connected to a computer, in order to overcome the impediments of manual dictionary lookup and aid the translator in streamlining the translation production process. A terminological digest represents the glossarial framework of a translation, a unique dictionary constructed automatically for the text to be translated. The user produces a terminological digest by invoking the appropriate program on his terminal and entering on the keyboard the terms he wishes to have looked up. All terms entered are immediately retrieved from an up-to-date scientific-technical dictionary and provided with target-language equivalents and other pertinent information. At the user's option, the dictionary entries may be presented singly, as a list in the order in which the terms were entered (e.g., the order in which they occur in the text to be translated), or as an alphabetically sorted list. These lists may be displayed, typed out, or printed and saved as `minidictionaries' for a particular field.
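The lookup-and-sort behaviour described can be sketched as follows. The three-entry German-to-English dictionary is a toy stand-in for the up-to-date scientific-technical dictionary the system consulted:

```python
# Sketch of a terminological digest: look up each entered term and
# return a 'minidictionary', either in entry order or alphabetically.

DICTIONARY = {  # hypothetical German -> English entries
    "Rechner": "computer",
    "Speicher": "memory",
    "Datei": "file",
}

def terminological_digest(terms, alphabetical=False):
    """Pair each term with its target-language equivalent."""
    entries = [(t, DICTIONARY.get(t, "<not found>")) for t in terms]
    return sorted(entries) if alphabetical else entries

digest = terminological_digest(["Speicher", "Rechner"], alphabetical=True)
# [('Rechner', 'computer'), ('Speicher', 'memory')]
```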
-
Abstract
Information interaction can mean many things to different people, from on-line interaction with a computer terminal to direct human-to-human communication. The theme, "Information Interaction," is people-oriented. It is especially fitting for the 1976 Mid-Year Meeting, since the tradition of Mid-Year Meetings is to provide a comfortable informal setting where people-to-people interaction is promoted and ideas are interchanged freely. Previous meetings have dealt with the theory, tools, and techniques for information handling. We want to concentrate on people and information. What do people really do with information once modern technology has provided it? How are people's decision-making processes affected? How do people deal with nonfidelity or "noise" in information to which they are exposed? What are the educational aspects? To what extent do people deny themselves access to needed information because of misconceptions? Papers are solicited for the 1976 Mid-Year Meeting on the following general topics. The questions are to stimulate; they are not exhaustive. People-to-People Interaction: How do people transfer information among one another? Is the "gatekeeper" concept valid? How can person-to-person communication be enhanced? User-Information System Interaction: What impact has on-line searching had on users? Will the user pay the costs? Education as an Information Interaction Process: Are current educational techniques getting the job done? How can information science contribute to education? What implications does research in education have?
September 1975
-
Abstract
As the literature of science has grown and as the editorial and production costs of publications have escalated, it has become mandatory to eliminate the duplication of intellectual effort, and if possible the mechanical effort, which today goes into the preparation and analysis of a scientific document. The primary journal is the true repository of the original scientific and technological data. The secondary service provides access to that original data; it does not replace it; it is not a surrogate. Experiments to date include the exchange of abstracts, index entries, and uniform bibliographic citations; also, the simultaneous editorial processing of the primary manuscript and the required secondary records. In the latter example, it is possible to produce primary journal indexes automatically from the secondary service database and to plan for one-time keyboarding of specific input data needed by both services. Large computer composition systems developed for the needs of secondary services are now composing primary journals economically.
-
Abstract
Typesetting of journals by computer still is more expensive today than typewriter composition. Economics favors the computer only if multiple use can be made of the material after it has been captured in machine-readable form. Such is the case with the material that is keyboarded into the bibliographic database of the American Institute of Physics (AIP). The records in this database are produced directly from the manuscripts submitted by authors and are then used over and over again: to photocompose a part of the article in the primary journal itself, to photocompose pages for an “advance abstracts” publication, to photocompose selected abstracts needed by The Energy Research and Development Administration's (ERDA's) Nuclear Science Abstracts and to provide ERDA with a corresponding tape which eventually goes to the International Nuclear Information System in Vienna, to produce a monthly tape for information centers offering selective dissemination of information (SDI) and other services, to produce multiple entries in the new quarterly Current Physics Index which covers all the physics journals published by AIP, to produce multiple entries in the annual subject and author indexes in each journal, and to produce cumulative 5 to 20-year journal indexes. The multiple-use concept cannot be used to justify computer typesetting of the full text of journal articles. We will specify the conditions that must be met before full-text computer composition can become competitive with other methods, and discuss some of the advantages of such a system.
-
Abstract
Composition at the American Society of Civil Engineers (ASCE) has progressed in 20 years from hot type to author-prepared camera-ready copy, to IBM Executive typing, to math by hanger keys, to multilevel mathematics on the cathode-ray tube (CRT). ASCE's technical journals have incorporated each of these nonfederally subsidized innovations to deliver better, faster, lower cost final products in 13 journals that include about 13 000 pages per year and indexing of 5 000 articles, papers, and discussions per year from staff-typed optical-character-recognition (OCR) material prepared by an editor, half-time, and an editorial assistant. Civil Engineering magazine, the biennial membership directory, and annual committee personnel listings are also computer composed.
-
Abstract
“Fair use” is at best inconclusive. It does not solve the two real problems: 1) the increasing need of education science, government, and business for multiple copies of documents; and 2) the fact that since the copyright owner's compensation is the total return from the use of his work, the loss through “fair use” of his work cannot be measured in terms of any individual use but only in terms of the total use and total copying. Therefore, we feel that the present provision for fair use, while making possible some types of research use of copyrighted material in computer and microfilm storage devices, cannot solve the “computer problem,” let alone the direct copying problem. At best, it will serve as a temporary safety valve for the user and eventually the courts, until some clearinghouse system is established. At that time, the concept of fair use should lose its importance and die off.
-
Abstract
While what constitutes fulfillment varies considerably from one publisher to another, and even from one journal to another, the key word is always service. Subscription fulfillment is defined as the processing and servicing of subscriptions from any source. This can be accomplished in a number of ways; in fact, it is not unusual to convert systems several times to find the best way of fulfilling your subscriptions. The advent of the computer has made it possible to gather statistics and demographics more easily, and an analysis of the statistics is a basic tool in new promotion. As costs increase, greater selectivity is needed in list selection, advertising, and special offers. The basic source of subscribers is renewals. As much, if not more, effort should be spent on obtaining renewals as on seeking new subscribers. When do you stop renewal efforts? When the cost reaches the figure for adding a new subscriber. Credit and collection efforts follow the same pattern. Fulfillment is a cycle composed of promotion, servicing, collection, and renewal.
March 1975
-
Abstract
Describes a computer program written for the UNIX time-sharing system which reduces by several orders of magnitude the task of finding words in a document which contain typographical errors. The program is adaptive in the sense that it uses statistics from the document itself for its analysis. In a first pass through the document, a table of digram and trigram frequencies is prepared. The second pass through the document breaks out individual words and compares the digrams and trigrams in each word with the frequencies from the table. An index is given to each word which reflects the hypothesis that the trigrams in the given word were produced from the same source that produced the trigram table. The words are sorted in decreasing order of their indices and printed. Printing is suppressed for words appearing in a table of 2726 common technical English words.
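The two-pass scheme can be sketched in a few lines. The average-trigram-frequency score below is a simple stand-in for the statistical index the original program computed, and the common-word suppression table is omitted:

```python
# Adaptive typo-finder sketch: pass 1 builds trigram frequencies from
# the document itself; pass 2 scores each word by how typical its
# trigrams are, so words with rare trigrams (likely typos) sort first.

from collections import Counter

def trigrams(word):
    w = f" {word} "                       # pad so word edges count too
    return [w[i:i + 3] for i in range(len(w) - 2)]

def suspicious_words(text):
    words = text.lower().split()
    freq = Counter(t for w in words for t in trigrams(w))    # pass 1
    def score(w):                                            # pass 2
        tris = trigrams(w)
        return sum(freq[t] for t in tris) / len(tris)
    return sorted(set(words), key=score)  # rarest trigrams first

print(suspicious_words(
    "the program scans the program text for the misspelled prgram"))
```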
March 1974
-
Abstract
Complexity of information storage and retrieval systems may range from simple, personal desk file indexes to interactive computer systems operating on large data bases. All of these systems have certain common characteristics that help to bring the needed information and the investigator together. The commonality of these elements is discussed, so that the user may understand what he can and cannot expect from a retrieval system.
September 1973
-
Abstract
Using a computer text processing system as the entry and change vehicle for a photocomposition system affects the publishing function in many ways. Costs are reduced, quality and readability are enhanced, esthetics are more controllable, and entry personnel require little training. Proofreading is almost entirely replaced by a computer-generated concordance. Mechanicals for reproduction are completed at the editor's site, not at the printer's, completing one more step in the movement to the automated office. The Honeywell Computer Journal is published concurrently on hard copy, microfiche, and magnetic tape. The tape can be used to drive other photocomposition systems that differ from our own, just as a computer can translate COBOL programs to the running instructions of a particular computer. Thus our work has shown the way to a common composition language that can describe all formats and identify uniquely the universe of printed symbols.
-
Abstract
Although computer technology has been introduced into virtually every phase of scientific communication, relatively little use has yet been made of it in primary dissemination, perhaps because of the limited operational scale of the typical scientific publisher. An Editorial Processing Center (EPC) is conceived as a mechanism for combining small publishing operations to achieve a scale great enough for significant computerization while leaving each editor in full command of his own publication. The EPC's computer assists authors, editors, and referees to perform their essential, intellectual functions by relieving them of nonessential, programmable functions. Its final output is a magnetic tape for use in photocomposition. Its potential benefits include immediate operating economies, more effective communication, a base for innovation in the form of publication, and benefits to secondary processors and analysis centers. A number of questions remain, however, chiefly in relation to the exact operating point at which any given configuration would become economically advantageous. Work is in progress to provide the answers.
-
Abstract
The American Institute of Physics (AIP), as an essential aspect of its Current Physics Information (CPI) program, has been implementing a new procedure for the production of both its primary journals and its secondary-information products which relies on a single processing of the common elements in both. This processing includes copy editing, keyboarding, proofreading, and indexing of such items as article titles, authors, by-lines, abstracts, and references. The single computer tape produced by this processing is used for the photocomposition of the elements involved for the primary journals, as well as for AIP's secondary services including the volume indexes to the primary journals themselves. I will discuss the reasons for this change in procedure, and its technical and economic aspects. I will also sketch out possible future developments in the system, which would rely on much greater use of computer processing, and attempt to assess the economic benefits. Finally, I will discuss the effects of AIP's secondary-information capabilities on the dissemination of primary information in traditional formats, as well as possible alternative formats.
March 1973
-
Abstract
A speaker conveys information not only by pronouncing words; he also uses pitch, inflection, and varying amplitude. The writer conveys information using printed letters concatenated to make words and these, in turn, made into sentences. He has available similar mechanisms to supplement his written communication but rarely uses them: he can use different type faces; he can lay out the information upon the page in different fashions; he can vary the size of the printed material. Only the first of these, the use of different type faces to supplement communication, is examined here, with examples from books which teach computer science. The salient feature of multiple type faces is that they increase intelligibility with no diminution in the information transmitted.
-
Abstract
It is the author's thesis that more battles have been lost because of misunderstood orders than because of the failure of strategy, tactics or logistics; that the most important tool of the manager is not the computer, the high speed copier, the automatic voice network and other devices about which there is so much concern, but the language itself which, almost unnoticed, continues to disintegrate at an ever faster rate. The importance of language on communication is discussed.
June 1972
-
Abstract
In spite of instant intercontinental communication by satellite and computer, there are still barriers to efficient information transfer and attainment of comprehensive awareness between scientists/technologists in different countries. The tremendous growth rate of technology has, in fact, brought new problems in its wake. These are grouped under seven headings: language, overpopulation, pollution, jargon, time, economics, and migration. Each of these is briefly examined and its consequence discussed. The language problem is dynamic and keeps changing with time. There is an increasing overpopulation of papers, rehashes, and abstracts, all contributing to pollution and diffusion of data. Noise is getting worse due to redundant data, wrong indexing, etc. For years, a jargon explosion has been going on without control. There are too many built-in delays in the conventional publication-translation-processing-search-feedback chain. Information search is hampered by overpopulation and pollution. Most methods of communication have become more expensive with passage of time, while budgets at both company and national level have become tighter. In this age of rapid innovation, scientists and engineers keep changing their field of work and job location, delaying and making less effective attempts at direct communication.
March 1972
-
Abstract
Some typographic conventions relating to line composition and page makeup are presented in the context of the problems they pose to the designer of a computer program for typographic composition.