-------------------------------------------------------------------------------
TITLE: INFORMATION AND MEDICINE
SOURCE: Dept. of Otolaryngology, UTMB, Grand Rounds
DATE: October 1989
FACULTY: Francis B. Quinn, Jr., M.D., FACS
DATABASE ADMINISTRATOR: Melinda McCracken, M.S.
-------------------------------------------------------------------------------

"This material was prepared for resident physicians in partial fulfillment of educational requirements established for the Postgraduate Training Program of the UTMB Department of Otolaryngology/Head and Neck Surgery and was not intended for clinical use in its present form. It was prepared for the purpose of stimulating group discussion in a conference setting. No warranties, either express or implied, are made with respect to its accuracy, completeness, or timeliness. The material does not necessarily reflect the current or past opinions of members of the UTMB faculty and should not be used for purposes of diagnosis or treatment without consulting appropriate literature sources and informed professional opinion."

INFORMATION AND MEDICINE

A. INTRODUCTION

One hears much about the information explosion in medicine. We must consider the concept of "information" in general, as well as in its particular application to clinical medicine. The practice of clinical medicine is increasingly information-intensive with respect to the body of knowledge which the practitioner is expected to have at his fingertips. This information goes beyond current concepts of pathophysiology and therapeutics, and increasingly includes demographic and social data relating to the individual patient, as well as administrative and legal requirements imposed upon the activities of the physician himself. Additionally, this information load has a half-life which is sometimes shorter than the time required for its transmission and dissemination through our current system of professional communication (the medical chart, memoranda, flyers, handbooks, journals, CME, and face-to-face conversation). Finally, the word "information" itself has several levels of meaning, and we must try to understand these meanings before we can begin to cope with the load.

B. CONCEPTS OF INFORMATION AND COMMUNICATION

1. OBSOLETE: The obsolete definition of the verb, to inform, is "to give material form to".

2. CURRENT: The word, information, has several current definitions, depending on the domain of use:

a. the communication of knowledge or intelligence

b. knowledge obtained by investigation, study, or instruction

c. the attribute inherent in and communicated by one of two or more alternative sequences or arrangements of something (such as DNA nucleotides, or binary digits in a computer program) that produce specific effects

d. a signal or character (as in a computer system) that represents data

e. a message that justifies change in a construct (as a plan or theory) that represents physical or mental experience or another construct

f. a numerical quantity representing the content of a message or symbol string, expressed in terms of its distinguishing characteristics or the number of possible alternatives from which it is chosen. For example, the information content of a semantically legal symbol string varies with the reciprocal of the number of semantically legal interpretations available to its receiver. In other words, an "ambiguous" statement carries less information, quantitatively speaking, than an "explicit" statement.
3. FUNDAMENTAL THEORIES: The study of information as a subject of interest in and for itself can begin with its fundamental theories:

a. theories of "speakers" (senders): syntax, grammars, lexicon, codes

b. theories of "listeners" (receivers): semantics, ambiguity, "meaning" (the characteristics -- the semantic repertoire -- of the receiver influence the information content of the received symbol string)

c. theories of communication channels: limits upon channel capacity, noise, error detection, redundancy

d. theories of decisionmaking: pragmatics, utility, metacommunication, meta-information

4. The engineering and mathematical view of information:

a. Early in this century the field of radio engineering gave rise to a new interest in the limits of communication. In the USA this was called the theory of information, and in Great Britain, communication theory.

b. Early in the days of radiotelegraphy, H. Nyquist and K. Kuepfmueller independently pointed out that in order to transmit signals at some chosen rate, a calculable bandwidth or frequency range was required.

c. HARTLEY. This law was reformulated by R. V. L. Hartley in 1927 to state that, in order to transmit a specified message, a certain fixed product (bandwidth x time) is required. If one wants to transmit the message in half the time, a communication channel with twice the bandwidth must be provided. In Hartley's theory, a quantity was assigned to the particular message, and that quantity was expressed as information. The quantity was calculated from the size of the set of all possible messages from which the unique symbol string was chosen:

    H = n log N

where n is the number of symbols in the message and N is the number of symbols available at each position. Hartley's view of information ignores all concepts of "meaning" of a message or symbol string, and while essential to the design and construction of information and communication systems, it causes much difficulty in discussions of information systems with persons and groups not familiar with this sense of the word.

d. SHANNON. In 1948, C. E. Shannon of Bell Laboratories extended Hartley's ideas to include the effects of noise and coding procedures upon communication channel capacity. Once again, he cautioned that "In any case, meaning is quite irrelevant to the problem of transmitting the information. ...Thus in information theory, information is thought of as a choice of one message from a set of possible messages." Shannon's work related to communication channels which, unlike those of Nyquist and Hartley, were assumed to contain noise as well as messages. Shannon went on to express channel capacity limits as reflected in requirements for unambiguous transmission of symbol strings, requirements which can include redundancy, error detection protocols, and error detection and correction algorithms. Here we begin to see an approximation of communication between humans, as for instance over a noisy telephone link, but without any reference to or concern for the "meaning" of the symbol strings transmitted.

5. COMMUNICATION: The foregoing mathematical view of information may seem tangential to the nature of medical information and its communication. Nevertheless, if we are to comprehend the limits and abilities of information processing systems, notably the general purpose electronic digital computer system, we must bear in mind that this is precisely the nature of information as it is acquired, stored, manipulated, communicated, and displayed by these systems.
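The Hartley quantity, and Shannon's generalization of it to unequally probable symbols, can be computed in a few lines. The sketch below is illustrative only -- the alphabet size and probabilities are invented -- but it shows how the engineering sense of "information" reduces to counting alternatives:

```python
import math

def hartley(n, N):
    """Hartley's H = n log N: the information, in bits, of a message of
    n symbols drawn from an alphabet of N equally likely symbols."""
    return n * math.log2(N)

def shannon_entropy(probs):
    """Shannon's generalization: average information per symbol, in bits,
    when the alternatives are not equally probable."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Ten symbols from a four-symbol alphabet, all symbols equally likely:
# log2(4) = 2 bits per symbol, so 20 bits for the whole message.
print(hartley(10, 4))                              # 20.0

# The same alphabet with unequal symbol probabilities carries less
# information per symbol: the receiver can partly predict each choice.
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits per symbol
```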
The digital computer has absolutely no way of appreciating what we humans refer to as "meaning", and we must accustom ourselves to that reality if we choose to take advantage of these machines.

Warren Weaver (with Shannon, 1949) stated the human communication problem as consisting of three levels:

a. How accurately can the symbols of communication be transmitted, stored, and recovered? (the technical problem)

b. How precisely do the transmitted symbols convey the desired message? (the problems of syntax and semantics)

c. How effectively does the received message affect the conduct of the receiver in the desired way? (the pragmatics problem)

6. SYNTAX AND SEMANTICS

We must distinguish between syntax and semantics. The sentence "Colorless green ideas sleep furiously." is syntactically correct according to the rules of modern English, but is semantically null, inasmuch as it is void of meaning. It is an absurd statement, but grammatically correct. Similarly, the phrase "The policeman's beard is half-constructed" is an example of a syntactically correct statement, produced by a computer program, which is devoid of any meaning. It is nonsense, semantically null.

The concept of syntax, therefore, includes rules governing the combination of, and arrangement of combinations of, symbols, whether they be alphabetical characters, ideograms, phonemes, or the components of a computer instruction. In fact, a compiler for a programming language can be regarded as a body of constraints upon the set of all possible combinations of binary digits to which the registers of the computer can be set. In its broadest sense, syntax includes rules governing not only the combination of words into phrases and sentences, but, as syntactics, it deals with the formal relations between signs or expressions in abstraction from their signification and their interpreters. It can, therefore, subsume the idea of a lexicon, dictionary, or vocabulary allowable under various conventions. In our own context, the vocabulary of medicine, including allowable abbreviations, is governed by conventions of our profession, as influenced by editors of scholarly journals, third-party payers, and standards-setting organizations.

The concept of semantics includes the study of the relation between signs, or symbols, or words, and what they refer to, whether it be objects, actions, ideas, or other signs, symbols, or words.

* Simply, if a statement makes sense in some way, it is semantically correct.

* If nonsense, then it is semantically void.

* If it makes sense, but can be interpreted in more than one way, as "Ship sinks today", or "Farmer's wife is best shot", or "Fire!", then it is semantically correct but ambiguous, and, in the Nyquist-Hartley sense, of less informational value than an unambiguous statement.

7. SUMMARY:

* Information as a substance or commodity: it has value, and it is obtained at some cost.

* Information as a process: it subsumes all four theories listed above; it is the reduction of uncertainty.

* Information as a function, or information as the computed result of some function evaluated at x.

In a more general sense, information can be seen as the property of a message which has the effect of reducing uncertainty in the receiver of the message. The information value of a message is expressed in the degree of assistance which the message provides to an individual faced with the task of making a decision concerning the subject of the message.
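The "reduction of uncertainty" view lends itself to the same arithmetic. In the sketch below (the probabilities are invented for illustration), a receiver entertains four alternatives; a message re-weights them, and the information delivered is measured by the drop in entropy:

```python
import math

def entropy(probs):
    """The receiver's uncertainty, in bits, over a set of alternatives."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Before the message: four alternatives entertained with equal credence.
before = [0.25, 0.25, 0.25, 0.25]

# After the message (say, one new finding): the same four alternatives,
# re-weighted. The figures are invented for illustration.
after = [0.70, 0.20, 0.05, 0.05]

print(round(entropy(before), 3))                   # 2.0 bits of uncertainty
print(round(entropy(after), 3))                    # about 1.257 bits remain
print(round(entropy(before) - entropy(after), 3))  # about 0.743 bits conveyed
```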
In an even more general sense, information forms the currency -- the stuff of communication -- and especially human communication. This, of course, is a broad and deep area of study in and for itself, and we will not concern ourselves with it here except to point out that every communication is also a metacommunication. In other words, every message contains, in addition to its formal semantic freight, information about the sender, and derivatively, about the receiver. This can be seen clearly in ordinary conversation, where the vocabulary, grammar, accent, world-knowledge, and emotional state of the sender, as reflected in the message, can give a fair picture of that individual and his place in the social organism. Additionally, the message he chooses to send reveals at least some of his assumptions concerning these same parameters of his intended receiver. While an acquaintance with the practice of human communication is essential to the understanding of information systems, an understanding of the rules, the calculus, of human communication is likely to recede from us however hard we pursue it, if only because of the paradoxical consequences of self-reflexiveness, as shown in Kurt Gödel's paper, published in 1931, on formally undecidable propositions.

C. CONCEPTS OF MEANING

The concept of meaning, as in the meaning of a message, suffers from the limitation of self-reflexive statements, as, for example, when we struggle to discover the meaning of "meaning". Immediately we must distinguish between the "meaning" of the message as constructed and uttered by the sender, and the "meaning" of the message as construed by the receiver. In the abstract, one can characterize "meaning" as a mapping between a sign (or message) and the object, process, or concept with which it is by convention associated. Should sender and receiver share this unique mapping function, then the message is unambiguous and clear. Should the case be otherwise, then the message is at best ambiguous, and communication is disturbed. Examples of such faulty communication are phrases such as "Do you think that one will do?", and "If you think the waiters here are unpleasant, you should see the manager." Perhaps the most familiar illustration of disturbed communication is the political debate, where an impartial but informed observer often has great difficulty discerning exactly what is being debated.

A special case of communication is represented by the interaction of the human with the general purpose electronic digital computer system. While the human enters and retrieves information from the computer system in relation to the meaning or semantic value of the messages exchanged, the computer stores and manipulates these messages simply as groups of binary digits -- as symbols in a two-valued, or Boolean, algebra -- which, by successive levels of encoding and decoding, capture, store, transform, and display symbol strings which have meaning for the human, but none at all for the computer itself.

It is a particularly enlightening experience for the clinical physician personally to design and operate a clinical database, using any one of the several commercially available database management systems written for desktop single-user computers. He soon learns the anguishing degree of rigor and specificity required of data descriptions and taxonomy if he is to take advantage of the almost overwhelming power of even the small computer to retain, analyze, and display his data.
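That rigor is easily experienced at first hand. The following sketch uses Python's built-in sqlite3 module rather than a commercial desktop DBMS, and its table and field names are invented for illustration, but the lesson is the same: every field must be named, typed, and constrained in advance, and retrieval matches symbols, not meanings:

```python
import sqlite3

# A toy clinical table: every field must be declared, typed, and
# constrained in advance -- the rigor of description the text refers to.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE encounter (
        patient_id   INTEGER NOT NULL,
        visit_date   TEXT    NOT NULL,   -- ISO format: YYYY-MM-DD
        diagnosis    TEXT    NOT NULL,   -- one term from an agreed lexicon
        icd9_code    TEXT,               -- nullable: coding may lag
        CHECK (length(diagnosis) > 0)
    )
""")
con.execute("INSERT INTO encounter VALUES (1, '1989-10-04', 'otitis media', '382.9')")
con.execute("INSERT INTO encounter VALUES (2, '1989-10-04', 'hoarseness', NULL)")

# Retrieval is exact: to the machine, 'otitis media', 'Otitis Media',
# and 'OM' are three unrelated symbol strings.
for row in con.execute("SELECT * FROM encounter WHERE diagnosis = 'otitis media'"):
    print(row)
```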
While this is not the place for a discussion of data structures, mathematical logic, and relational algebra, even a superficial understanding of these subjects will help immeasurably when the physician undertakes the methodical collection of clinical data.

D. CONCEPTS OF DISEASE AND DIAGNOSIS

Two quite different views of disease coexist today -- the nominalist and the attribute interpretations.

* The "nominalist" (or "name") view is held by the public health officer and epidemiologist, whose interest is in the disease as a named entity residing in identifiable groups of persons.

* The "attribute" view is the frame of reference of the clinician, whose task relates to detecting the features of a particular patient's illness, and matching this collection of features with the attributes of a named disease entity.

We refer to this process as diagnosis, a process which has as its formal endpoint the assignment of a nominal, or name, chosen from a bounded set of legal, or consensually validated, disease names. Often, this process is difficult and unsatisfying, as when the patient fails to manifest a sufficient number of the stipulated attributes of a specific disease. One example I would like to mention of diagnosis -- the naming of a disease by reference to tables of attributes -- is the Diagnostic and Statistical Manual (DSM-III), published by the American Psychiatric Association. Other clinical specialties have not achieved this degree of rigor in establishing a shared lexicon of diseases, and it is quite possible that a good part of the education of the medical student and resident involves his acquiring the ability to think clearly in the face of shifting, inexact, and often fuzzy diagnostic terminology.

Another, less successful example of naming diseases by reference to lists of attributes was published by the American Medical Association in 1971 as "Current Medical Information and Terminology". This is a list of over thirty-two hundred disease names, with the signs and symptoms listed for each disease. This list has been built into a computer program named "Reconsider", which is available here at UTMB. One enters the features of one's patient's illness, and the computer returns an ordered list of disease names, each of which contains at least one attribute corresponding to a feature of the patient's illness. Typically, the novice user is overwhelmed with a disease list numbering in the hundreds, and concludes that the program is of little help in solving this patient's problem. A second problem with "Reconsider" is that its knowledge base is almost twenty years old, and there have been no attempts to update and reissue the work.

Greater and more widely accepted efforts have been expended in developing systems of classification of the names of diseases, principally to serve the needs of those upon whom falls the cost of treatment, typically insurance carriers and government organizations. Systems with which many of us are familiar are the Standard Nomenclature of Diseases and Operations (SNDO) and the several revisions of the International Classification of Diseases, currently ICD-9-CM.
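The matching process "Reconsider" performs can be sketched in miniature. The disease-attribute lists below are invented toys, not the CMIT knowledge base, but the principle -- rank every disease sharing at least one attribute with the patient's findings -- is the same, and even this miniature hints at why the novice is overwhelmed: two findings already return three candidate diseases.

```python
# A toy attribute-matching diagnostic aid in the style of "Reconsider":
# each disease name maps to its listed attributes, and diseases are
# ranked by how many attributes match the patient's findings. The
# disease/attribute lists are invented miniatures, not CMIT data.
disease_attributes = {
    "acute otitis media":  {"otalgia", "fever", "bulging tympanic membrane"},
    "serous otitis media": {"hearing loss", "retracted tympanic membrane"},
    "Meniere disease":     {"vertigo", "tinnitus", "hearing loss"},
    "acoustic neuroma":    {"tinnitus", "hearing loss", "facial numbness"},
}

def reconsider(findings):
    scored = [(len(attrs & findings), name)
              for name, attrs in disease_attributes.items()]
    # Keep every disease sharing at least one attribute, best match first.
    return [name for score, name in sorted(scored, reverse=True) if score > 0]

print(reconsider({"hearing loss", "tinnitus"}))
# ['acoustic neuroma', 'Meniere disease', 'serous otitis media']
```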
The Systematized Nomenclature of Pathology (SNOP), on the other hand, consists of a standardized list of terms arranged in the form of a lexicon, in which each term stands for a disease attribute and is assigned a unique code number. These disease attributes are assigned in SNOP to one of four categories (or "axes"):

* topography -- the anatomic site affected;

* morphology -- the structural changes (usually microscopic) associated with disease;

* etiology -- the cause of disease; and

* function -- the physiologic alterations associated with disease.

These terms are assigned code numbers, and a particular instance of disease occurring in an individual patient can then be described by a series of SNOP code numbers corresponding to the attributes that are observed. In order to increase the range and types of attributes that can be coded, an extended version of SNOP has been developed, known as the Systematized Nomenclature of Medicine (SNOMED). Three more "axes" were added:

* "diseases" (which correspond in form to those given in the International Classification of Diseases);

* therapeutic "procedures"; and

* "patient occupation".

By using all seven categories, a large number of the attributes of illnesses can be numerically coded within the SNOP/SNOMED scheme. While all of these systems of classification and encoding allow efficient storage and recall of information concerning disease, whether in relation to the individual patient or to large groups of patients, they all

* suffer from the loss of potentially significant details, an inevitable consequence of any encoding process, and

* suffer from the rapid expansion, and in many cases obsolescence, of the nosology.

As a diagnostic classification scheme seeks to serve the current requirements of practitioners and third-party payers, it must grow with the knowledge base, and it must represent the widest possible consensus, else it will join the Current Medical Information and Terminology as a discarded curio. We must accept the migration of nosology as a fact of life. As we reflect on the Civil War physician's recognition of "nostalgia" as a formal disease entity -- and, in fact, as a cause of death to be listed on the soldier's death certificate -- we must believe that physicians one hundred fifty years hence may look upon our nomenclature of disease as equally quaint. Migration, or evolution, of nosology is inevitable, as --

* we change in our ability to recognize disease, resulting from improvements in our techniques of measurement and observation;

* we change in our efforts to detect disease, as a consequence of new incentives, earlier detection, public education, and case finding;

* we change our definition of disease, by increasing our proficiency in noting the subtle details, and cataloging the features, of our patients' illnesses.

E. CALIBRATING OUR TOOLS (OUR MEASUREMENT TECHNIQUES)

Edmond A. Murphy, of Johns Hopkins University, in his book The Logic of Medicine, discusses and distinguishes among resolution, precision, and accuracy of clinical observations.

Resolution: He points out that resolution is the inherent degree of refinement of the measurement technique as applied by the observer. It is an expression of the lower limits of uncertainty regarding whether two data elements represent the same or different values of the observed parameter. An example can be seen in the optical microscope, where the limit of resolution is the wavelength of the light transmitted to the observer, as modified by the quality of the optics. It is an expression of the indeterminacy of the technique of observation, in the hands of the observer.

Precision, according to Murphy, refers to the reproducibility of some particular estimate; it is an attribute not of a measurement but of an estimate, in that each measurement represents an approximation of the true value, as modified by the resolving power of the measurement technique and by random variations in its application.

Accuracy, then, indicates just how well sample estimates of the value of interest represent the true value. Formally speaking, an estimate is accurate if, on the average, its deviation from the true value is small. If precision represents the variance of the estimate, then accuracy represents the bias of the estimate.
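The distinction is easy to exhibit numerically. In this sketch (the figures are invented), two simulated instruments estimate a true value of 100: instrument A is unbiased but noisy (accurate, imprecise), while instrument B repeats itself closely around the wrong value (precise, inaccurate):

```python
import random
import statistics

random.seed(2)
TRUE_VALUE = 100.0

# Instrument A: unbiased but noisy -- accurate, imprecise.
a = [random.gauss(TRUE_VALUE, 10.0) for _ in range(1000)]

# Instrument B: tightly reproducible but miscalibrated -- precise, inaccurate.
b = [random.gauss(TRUE_VALUE + 8.0, 1.0) for _ in range(1000)]

for name, data in (("A", a), ("B", b)):
    bias = statistics.mean(data) - TRUE_VALUE   # accuracy: average deviation
    spread = statistics.stdev(data)             # precision: reproducibility
    print(f"{name}: bias {bias:+.2f}, standard deviation {spread:.2f}")
# A shows bias near 0 but spread near 10; B shows bias near +8, spread near 1.
```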
A clear understanding of these terms is essential to the work of scientific investigation, and equally essential, though far less prevalent, in the work of clinical diagnosis and assessment of therapy. The process of clinical observation as carried on by the physician involves highly complex intuitive techniques of measurement by a highly skilled individual, and must be interpreted as integrating the preconceptions and expectations of the physician as well as the clinical phenomena characterizing the condition of the patient. Accepting this premise, we see that the proper study of medical information includes the study of the physician as an element of the measurement process -- as a potentially confounding variable in the process of estimating the values of clinical parameters.

With this in mind, we have begun a study of inter-rater agreement, or inter-observer variability, with physicians as subjects. Thus far we have completed two studies within which the departments of Pathology and Otolaryngology have cooperated successfully:

* one on inter-observer variability in fine-needle aspiration biopsy of masses presenting in the head and neck,

* and another on inter-observer variability in the interpretation of the technique of brush biopsy and cytopathologic diagnosis, in diseases of the upper aerodigestive passages.

* We began, then halted, and are about to resume a study of inter-rater agreement in clinical staging of malignant tumors of the head and neck,

* and we have had discussions with the Department of Radiology on the subject of a cooperative study on the interpretation of radiographic imaging of the head and neck.

ALVIN R. FEINSTEIN

In Annals of Internal Medicine 1983;99:843-848, Alvin R. Feinstein introduces the word "clinimetrics" and says:

"The domain of clinimetrics is concerned with quantitative methods in the collection and analysis of comparative clinical data, and particularly with improved 'measurement' of the distinctively clinical and personal phenomena of patient care. The main requirement for scientific quality in data is a consistent, reproducible process of observation and expression. This requirement can be attained with appropriate attention to basic descriptive activities and to the operational criteria that convert raw descriptions into the variables, categories, and composite aggregates of suitably chosen clinimetric scales. For this work, clinicians will be challenged to 'dissect' and stipulate the components of decisions that are now made with unspecified methods or judgements. Clinimetric science provides opportunities for new approaches, new sites, and new personnel in an additional type of clinical investigation that can augment the scientific basis of clinical practice, while rehumanizing the contents of research data and restoring analytic emphasis to the art of patient care."
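Agreement in studies of this kind is conventionally calibrated with a chance-corrected index such as Cohen's kappa. A minimal sketch, with two hypothetical raters and invented readings: raw agreement is 80 percent, but once the agreement expected by chance is discounted, kappa is considerably lower:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance)."""
    n = len(rater1)
    observed = sum(x == y for x, y in zip(rater1, rater2)) / n
    # Chance agreement: product of each rater's marginal category rates.
    c1, c2 = Counter(rater1), Counter(rater2)
    chance = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (observed - chance) / (1 - chance)

# Two hypothetical cytopathologists reading the same ten specimens:
r1 = ["benign", "benign", "malignant", "benign", "malignant",
      "benign", "benign", "malignant", "benign", "benign"]
r2 = ["benign", "malignant", "malignant", "benign", "malignant",
      "benign", "benign", "benign", "benign", "benign"]

print(round(cohens_kappa(r1, r2), 2))   # 0.52, despite 80% raw agreement
```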
Feinstein recalls that when biometry was founded as a discipline about a century ago, the goal was to join the -metry of measurement with the bio- of human and nonhuman biology, to "fuse mathematical sagacity and biologic wisdom in achieving quantitative analysis for biological data."

The requirements for "hardness" in measurement data include reproducibility, objectivity, and dimensionality. The changeable, even transient, nature of many clinical observations makes it difficult to reproduce these observations, but objectivity, as reflected by consistency among several competent observers, can be calibrated and quantified. If checks of interobserver variability show substantial disagreements, the data are inconsistent and the measurement process unreliable, regardless of whether the data consist of verbal statements or scalar measurements. Improvements in the "hardness" of data can be made by paying attention to the observational methods with which phenomena are perceived, the descriptive methods with which the observations are converted to raw expressions, and the taxonomic methods with which the descriptive expressions are assigned to specific categories. While these categories are, in the case of clinical observations, usually nominal, ordinal, or binary, rather than scalar, the assignment can be made consistently, provided explicit criteria are formulated and specified for each category. We already have examples of such processes of assignment of observations to conventionally validated and prognostically useful systems of categories, such as

* the Apgar score for expressing the condition of a newborn infant,

* the Glasgow coma scale,

* the Jones criteria for diagnosis of acute rheumatic fever, and

* the ASA anesthesia risk categories.

No one today denies the utility of these measurement tools.

F. CALIBRATING UNCERTAINTY, AND STATISTICS (THE STUDY OF VARIABILITY)

* measurement error
* sensitivity, specificity, predictive value
* sampling theory
* population statistics
* inferential statistics
* exploratory data analysis (just "messing around with the data")

G. CALIBRATING THE DECISION PROCESS

1. Shortcomings of human judgment

a. slow computational speed
b. severely limited memory span and capacity
c. primacy/recency effects upon memory
d. framing bias
e. preference for loss-avoidance over gain-realization
f. severely limited capacity to compute probabilities
g. fluctuation in judgment induced by physiologic and emotional perturbations

2. Typical flaws in human decisionmaking

a. failure to consider base rate data (prevalence) in assigning probabilities to a set of differential diagnoses (see the sketch following this list)
b. failure to allow for regression to the mean when evaluating effects of therapy
c. overlooking the inherently higher variability of smaller samples
d. loss aversion, or the tendency to accept a much higher price to avoid a loss than one would be willing to pay for the same chance to gain an equal amount (people tend to avoid risks when seeking gains, but choose and accept risks to avoid sure losses)

3. Sources of human error in clinical reasoning

a. availability, in which the probability of an event is overestimated because of the prominent position of the event in the clinician's memory
b. representativeness, in which the frequency of an event is estimated incorrectly because the event resembles another event in salient features
c. adjustment or anchoring, in which the clinician adjusts his initial probability value to conform to a particular diagnostic solution
d. unwarranted confidence in the accuracy of one's own guess
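The first flaw under item 2 -- neglect of base rates -- is easy to make vivid with Bayes' theorem. The test characteristics in the sketch below are invented, but the arithmetic is general: a test that is 95 percent sensitive and 95 percent specific yields a positive predictive value of only about 16 percent when the disease prevalence is one percent.

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """Bayes' theorem: probability of disease given a positive test."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# A respectable test (hypothetical figures): 95% sensitive, 95% specific,
# applied where the disease has a prevalence of 1 in 100:
print(round(positive_predictive_value(0.95, 0.95, 0.01), 3))   # 0.161

# The same test where prevalence is 1 in 2 (a referral population):
print(round(positive_predictive_value(0.95, 0.95, 0.50), 3))   # 0.95
```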
4. Formalizing the decision process

During the Battle of the Atlantic in World War Two (the struggle between the U-boat and the Allied merchant fleet), a new applied science was born, called operations research. It drew upon the mathematical techniques of linear algebra, probability and statistics, and finite mathematics to solve the problem of frustrating the attempts of the German undersea fleet to choke the supply lifeline to Britain. After the war it, together with von Neumann and Morgenstern's theory of games, gave rise to a new field of study which we call "decision theory." Applications of this new field to medicine have emerged in the form of "decision analysis", a method of using weighted Markov chains to represent information, choice, and outcome. This method, as presently used, gives promise of, if not rationalizing medical decisionmaking, then at least formalizing its expression, and providing an opportunity for inspection and discussion of the components of a particular decision. Its usefulness, of course, is dependent upon the resolution, precision, and accuracy of the input information, as well as upon the representativeness of the weights assigned to inputs and outcomes.

H. EXPERT SYSTEMS

The term "decision support system" encompasses such things as a pocket notebook of normal laboratory values, a hand-held calculator for computing therapeutic drug dosages, an applications software package for descriptive and inferential statistics, linear regression, factor analysis, and computer simulations, and finally, expert systems built using artificial intelligence techniques.

"Expert systems", sometimes called "consultation systems", use symbolic, non-numeric computation to perform an approximation of human reasoning. They represent a new and stimulating level of man-machine interaction in that they can simulate the performance of the human expert in well-defined and sharply limited domains. Present expert systems have grave shortcomings, and should always be employed subject to human override, for they have nowhere near the grasp of "world knowledge", sometimes called "common sense", possessed by even the nonexpert human, and most are absolutely literal in interpreting data, rules, and instructions.

An expert system can be regarded as an abstract machine that uses a collection of facts, a set of rules, and instructions for applying these rules and deriving inferences therefrom. Expert systems have been used to assist in medical diagnosis, mineral resource exploration, oil well log interpretation, nuclear power plant design, chemical synthesis, and planetary fly-bys. They differ from ordinary computer programs in that the tasks do not lend themselves to algorithmic (step-by-step) solutions, and conclusions must often be made based on incomplete or uncertain information. They differ further in that while many computer programs can be very difficult to modify to reflect changes in the task environment, expert systems are specifically designed so that the knowledge base, i.e., that body of facts and rules specific to the problem domain, is separate from the "inference engine" or control structure of the program. Thus, the knowledge base is easily accessible for modification, and the system can provide an explanation of its reasoning to the human user.
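The separation of knowledge base from inference engine can be shown in miniature. The facts and rules below are invented toys, not clinical knowledge, but the forward-chaining control structure -- apply every rule whose conditions are satisfied, and repeat until nothing new can be concluded -- is the general pattern, and the recorded trace is the raw material of the system's explanation of its reasoning:

```python
# A miniature forward-chaining "expert system": the knowledge base (facts
# and rules) is data, held apart from the inference engine that applies it.
# The facts and rules are invented toys, not a clinical knowledge base.
facts = {"otalgia", "fever", "bulging tympanic membrane"}

rules = [
    ({"otalgia", "fever"},                   "acute inflammation suspected"),
    ({"acute inflammation suspected",
      "bulging tympanic membrane"},          "acute otitis media suspected"),
    ({"acute otitis media suspected"},       "consider antimicrobial therapy"),
]

def infer(facts, rules):
    """Apply every rule whose conditions are satisfied, repeating until no
    new conclusion appears; record each firing so the system can explain
    its reasoning to the human user."""
    trace = []
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                trace.append(f"{sorted(conditions)} -> {conclusion}")
                changed = True
    return facts, trace

derived, trace = infer(set(facts), rules)
for step in trace:
    print(step)
```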
I. KNOWLEDGE ENGINEERING

For a task to be suitable for processing by an expert system, it is first described to a "knowledge engineer" by a human expert. A knowledge engineer is a computer adept who interacts with the human expert, casting the expert's knowledge in the form of facts and rules that then represent the problem domain. There are several requirements:

1. There must be at least one human expert who is acknowledged to perform the task (obtain the solution) well.

2. The primary sources of this human expert's ability must be specialized domain knowledge and experienced judgment.

3. The expert must be able to describe that special knowledge and judgmental ability in unambiguous language, and be able to explain the methods by which it is applied to a particular task or class of tasks.

4. The task must have a carefully limited range of application.

It may be necessary to interview several human experts, repeatedly and at considerable length, to construct a satisfactory knowledge base. Competing viewpoints can be captured and compared using consultation systems. At the very least, these systems can be used to collect, preserve, and utilize human judgment and experience laboriously acquired over a lifetime of clinical activity.

The clinician who participates in the construction of an expert system can expect some interesting experiences. He will almost certainly stumble on hidden assumptions in his thinking and will be presented with the opportunity to reexamine these assumptions as he explains his reasoning to his knowledge engineer. He will reevaluate observations, measurements, and tests, to derive the best possible estimate of their predictive value. He will come to grips with concepts such as interobserver variability, receiver operating characteristic curves, base rate data, and migration of the nosology itself. While these experiences may not be uniformly pleasant, they can build a better understanding of the limits of accuracy and inference in the diagnostic process. Thus, the physician will gradually gain a new understanding of

* what is truly known about the clinical problem,

* what is unknown,

* and, perhaps most important, what is unknowable in light of current measurement techniques and biologic variability.

Perhaps the greatest benefits will flow from the fact that when we set out to construct an expert clinical decision support system, we begin by studying the patient and his problem, and we soon find ourselves studying the physician and his methods.

J. WHAT DO WE EXPECT IN THE NEXT FEW YEARS?

We have noticed that more incoming residents are familiar with computer systems, at least at the level of text processing and file handling, and most of the resident applicants are ten-finger typists. We believe this will diminish resistance. We expect to develop formal residency curricular segments addressing both the UNIX System V environment and biostatistics and decisionmaking. We foresee, at some time in the next decade, a specialty board requirement for some level of computer literacy, and at least a conversational acquaintance with biostatistical concepts and procedures. We expect that as medical school entry slots become less and less competitive, more individuals from the fields of business, engineering, the law, and the liberal arts will find their way into M.D. degree programs, and will provide an interesting mix of attitudes and skills, with less reluctance to engage personally the tools of late twentieth century information technology.

The currently increasing climate of medical malpractice litigation may bring about the equivalent of the T. J. Hooper dictum (Judge Learned Hand, 1932).
He established that failing to have and use a radio receiver when towing barges offshore was culpable negligence, despite the fact that it was not common practice for ships of that size to be so equipped. We may find some judge imposing a similar requirement upon the medical profession with regard to the use of medical decision support systems (consulting systems, "expert systems"), despite the profession's current reluctance to develop and adopt these devices.

Sooner or later, the leaders of our medical education establishment will begin to realize that the clinical years of the medical student and the resident physician are spent largely in what we now call knowledge engineering. When the student becomes a resident, the memorized lists of diagnoses, drugs, side effects, symptoms, and complications suddenly must be integrated into a knowledge structure which will provide guidance or justification for some course of action in response to increasing responsibility. The new physician must become his own knowledge engineer -- eagerly, almost desperately, soaking up what passes for knowledge from whatever source is available (most often, the immediately senior resident). He evaluates and organizes this knowledge as he matures in experience, to escape the grip of the constant fear which we all recall from our early months as intern and resident. Someday, some innovative educators will decide that this process -- "do-it-yourself knowledge engineering" -- would benefit from professional assistance, and the topics subsumed by the term medical decisionmaking will become standard academic fare. We believe that the availability of digital computer systems and the emergence of computer-adept young physicians-in-training will provide the milieu in which such a paradigm shift can occur and prosper.

We have been introducing the general purpose digital computer system into the bedside physician's toolkit, alongside his stethoscope, ophthalmoscope, and sphygmomanometer. We are learning a great deal about the nature and flow of the medical information that comprises the medical record. We are beginning to learn about the language of clinical medicine as a reflection of the values, symbol systems, and reasoning of the clinical physician. We are learning to cope with "cultural impedance" and to identify vectors of "system resonance".

K. HUMAN COMMUNICATION -- COMPLEXITIES

* semiotic: the general theory of signs and language; consists of syntax, semantics, and pragmatics

* pragmatics: the study of the sender-receiver relation, as mediated by communication. The meaning of a message has its expression in what it causes the receiver to do.

L. GÖDEL'S THEOREM: THE PROBLEM OF METACOMMUNICATION

In setting out to discuss the intersection of information and medicine, we are in fact grappling with an understanding of human communication within an admittedly restricted but nevertheless broad-ranging domain. We are attempting to communicate with other humans on the matter of communication among humans, and as such, we risk encountering the bemusing frustrations of self-reflexive statements, and the problem of formal undecidability. As Ludwig Wittgenstein pointed out in his Tractatus Logico-Philosophicus (1922), we can know something about the world in its totality only if we could step outside it; but if this were possible, the world would no longer be the "whole" world, and its totality would escape us.
Similarly with human communication: as we set out to understand the totality of human communication, we may enlarge its boundaries, increase its effectiveness, and multiply its speed, but, being human, and communicating, we are powerless to view the system from outside, and ever unable to comprehend it, or any part of its domain, in its totality.

In 1931, Kurt Gödel published an epochal paper on formally undecidable propositions. While his paper dealt with mathematical logic, its consequences were seen to go far beyond mathematics. He established, once and for all, that any formal system of sufficient expressive power (mathematical, symbolic, linguistic) is necessarily incomplete, and that, furthermore, the consistency of such a system can only be proved by recourse to methods of proof more general than those the system itself can generate.

M. THE COMING PARADIGM SHIFT

If we consider the changes in the science and practice of medicine brought about by the introduction of the optical microscope, and later by the roentgen ray, we can believe that the general purpose digital computer system will be another agent of paradigmatic change. Our view is that as physicians come to make use of its power and scope as an information tool, we will begin to think of our science, our patients, our profession, and ourselves in ways that are not only different, but beyond our present ability to predict.

--------------------------------END---------------------------------------