Common Threads in Research Across Disciplines: A Reflection


Research lies at the heart of scholarship in all academic disciplines. The author identifies common threads of thought and practice that run through what is called “research.” He describes six phases common to all such scholarly activities (observation, vision, logistics, experiment, assessment, and communication), exploring by example, anecdote, and analysis this taxonomy of research as a proposed heuristic for aspiring researchers.

Keynote Address
First Annual Scholars Day Conference
University of North Texas
April 15, 2004



    Common threads of thought and practice regarding scholarship lace their way through many disciplines, so that a common taxonomy presents itself when we reflect on the process we call “research.” There are common activities performed by all who are engaged in the research enterprise. While it is true that how one regards research—as a diligent seeking after ultimate “Truth” or as a creative act of construction of a “Narrative” that works—will depend upon one’s epistemology, almost all researchers work as Pragmatic Realists, acting as if truth were unitary, albeit always imperfectly known and laden with the limitations of context. Thus, between the poles of positivism and social constructivism, practical inquiry is performed. The goal of research, I contend, is reliable knowledge, validated understanding, experience-tested theory. Research is defined in the Oxford English Dictionary as a “search or investigation directed to the discovery of some fact by careful consideration or study of a subject; a course of critical or scientific inquiry” (Oxford University Press, 2004). Research is thus the creative act of insight put to the test. It is the act of intellectual exploration by the human imagination, obligated to a unique reality and disciplined by logic to describe the actual state of affairs as precisely, accurately, and consistently as is humanly possible. In each discipline, research finds itself manifested in superficially peculiar activities that at a more profound level are nearly identical.

    I offer below a tentative heuristic taxonomy to provoke discussion and to stimulate thought regarding the similarities and differences among our respective disciplines. I have identified six unique components or phases of the research enterprise. I see them operating concurrently, recursively and often in alternative sequence to that in which I have presented them. These are what I have observed in my career as a researcher and as a student of my fellow scholars. While other categories or descriptors may better apply (and will happily be appropriated), I present these as a “straw man” upon which to practice our thrust and parry.


    All inquiry begins with some observation, be it in the physical world, the cultural universe of language or art, or the behavior of human beings. “I have noticed…” are the three most welcome words I can hear from one of my students in an introductory science class, for attention to reality and a sense of wonder are what I seek for them. Curiosity is the cradle of research. As Einstein remarked, “The important thing is not to stop questioning. Curiosity has its own reason for existing” (Harris, 1995). The goal of the Observation Phase of research is the framing of a question. The art of effective research is founded on the ability to ask appropriate and well-articulated questions, “critical questions.” It is at this stage that investigators must learn the context, master the language, and become familiar with prior knowledge, the “received wisdom” regarding the object of their interest, even if they are disposed to disagree with it. While Thomas Kuhn (1970) maintained that science advances by paradigm shifts, by revolutions, one must nevertheless articulate the paradigm shift in the language of the discipline if one is to proceed effectively. One of the hallmarks of what is often called “voodoo science” is a propensity for the invention of idiosyncratic language and imprecise vocabulary.

    Therefore, it is imperative that research includes an exhaustive and continuing literature search, not only to learn the language but also to learn of the collective experience and prevailing interpretations of the community. Scholarship is a communal activity. This aspect of the initial phase is often overlooked by novitiates of research. One’s own observations must be compared with the observations of others even as one begins one’s inquiry.


    A second phase of research is intuitive, inspirational, and tentative. An idea, for example, may spring from a reaction against the “received wisdom,” a novel idea from a synthesis of existing theories or from a dream or flight of fancy. In one of the earliest uses of the word “research,” in 1799, J. Robertson remarked: “Our most profound researches are frequently nothing better than guessing at the causes of the phenomena” (OED, 2004). Indeed, the description of phenomena requires insight or vision. Perhaps the most creative of all the stages, this activity seeks to produce a hypothesis, a theory, or a concept that can be articulated, tested or manifested. Depending upon the rigor and the accepted modes of expression of the field of research, the inquiry may be qualitative or quantitative in analysis. In the physical sciences it is rarely considered adequate to stop at a vague description. What is required is a fully fleshed-out “model” that makes quantitative predictions regarding the outcome of a proposed experiment. (I am reminded of the quip of Niels Bohr, the great theoretical physicist: “Prediction is very difficult, especially about the future” (Bohr, 2004).) Often one is presented in physics with the puzzle of observation, and one must devise a reasonable fundamental model that “predicts” the previously observed behavior. This prediction-testing dialog is absolutely fundamental to the idea of “scientific” research. Indeed, Karl Popper maintained in such works as Logik der Forschung (The Logic of Scientific Discovery) (1959) that unless a premise is testable it cannot even be considered “scientific.” He argued, moreover, that premises could not be proven inductively; they could only be disproved by experiment. This principle of testability may be extended or generalized, I argue, to areas of research other than science. A concept, a notion, or a vision must be actualized or manifested before it can be fully realized or evaluated. Only when the vision is thus manifest can its “truthfulness” or intellectual or aesthetic value be assessed.


    Before an idea can be adequately tested or given substance, the investigator must secure the resources and formulate an experimental design. Often overlooked, this phase involves the logistics of resources, personnel, space, time, or required skills. In many instances, it demands the writing of a proposal that is designed to persuade a benefactor, client or patron to fund the project. The proposal is a valuable exercise because it forces one to plan the experiment, investigation or activity. A well-designed experiment can be a “critical experiment” that powerfully discriminates between competing hypotheses with a minimum of resources, but only if it is well thought out. Rarely does a critical experiment happen by chance. One of the most frustrating experiences is to expend great energy in a research activity and learn little or nothing at the end because of a poorly designed experiment. The Logistics Phase of research is essential to an efficient inquiry, and it is a delight to observe a master experimentalist design a critical experiment that with elegance and efficiency of effort winnows the possibilities. Enrico Fermi is widely regarded as one of the most gifted physicists of the modern era. He wisely jested: “There are two possible outcomes [of a critical experiment]: If the result confirms the hypothesis, then you've made a measurement. If the result is contrary to the hypothesis, then you've made a discovery” (Fermi, 2004). The end of such a Logistics Phase is the preparation for the experiment, the analysis of a piece of literature or the creation of a work of art. The plan precedes the act. Between the vision in the mind’s eye and the image on the canvas, the artist mixes his paint.


    The Experiment component is what most folk think of when “research” is uttered. Indeed, it is often the most prominent. In this phase the test is performed, the measurement taken, the data collected, the musical theme tried out, the draft written, the pen put to paper or the chisel to wood. The task of data collection may be a nearly life-long occupation of observing chimpanzees in the wild, or it can be a few hours when a spacecraft zooms past a planet, after years of preparation and months of waiting following the rocket’s ascent. The process is often taxing, but as Sophocles mused, whatever the case, there are principles that apply to valid experimentation: “Knowledge must come through action; you can have no test which is not fanciful, save by trial” (Sophocles, 430 B.C.E.).

    Intellectual integrity also is essential to meaningful data acquisition. One must devise strategies that divorce the outcome from personal interest and bias. The well-known Placebo Effect, in which the patient’s (and the experimentalist’s) expectations significantly affect the outcome, is a classic example. A similar effect has been reported in educational research, in which simply telling a class that they are part of an innovative educational study will improve their performance. In a more sinister form, bias has clouded judgment in numerous scandals such as the “Cold Fusion” fiasco of the 1980s, in which the standard protocol of disinterested and critical science was subverted in an infamous career-ending debacle. This and other such “bad” science are recounted by the physicist Bob Park in his book Voodoo Science: The Road from Foolishness to Fraud (2000).

    Valid inference requires that trustworthy data be available. The product of inference cannot rise much higher than the foundation upon which it is built, especially when that floor is dubious, compromised, cracked and crumbling. Measurements filtered by expectation or polluted by excessive noise can be insidious. I have observed the gradual drifting of the measurement of a universal physical parameter from an initial (grossly erroneous) value to another more accurate value, the change coming not all at once but slowly, as if there were a social pressure not to deviate too radically from the previous results. Over time, however, a new consensus emerged and the measurements of the value stabilized. Thus, I suspect that experimentalists in the “hard sciences” are not immune to the sickness of bias. Moreover, ill-conceived experiments may obscure the reality with noise. I have witnessed such error in the efforts of novice researchers (middle school Science Fair participants) trying to measure the speed of sound using a short string with a stopwatch, only to produce data that measured their reaction time and told absolutely nothing about the phenomenon they sought to investigate.

    How an experiment is done is important. With what insight and skill the activity is performed is crucial. Here lies the art of the experimentalist, the beauty of imagination made material. How one asks a question can make the difference in what one learns. This point is dramatically illustrated by a classic Hungarian joke that recounts how a traveler’s auto broke down near the village of Hatvan, 20 km from Budapest. The stranded tourist stops a farmer driving his horse-drawn cart: “May I have a ride with you?” he asks. “Igen! Climb up here and sit beside me.” “Is it far to Budapest?” the visitor continues. “Nem, it is not far to Budapest.” So the visitor, relieved, climbs aboard. They ride for an hour or more in silence. The rider then remarks, “I thought you said it was not far to Budapest,” to which the farmer replies, “Oh, now it is very far to Budapest.” (P. Revesz, personal communication, 1977). The questions one asks are very important in determining what one learns.

    The wise experimentalist is one who continually performs a meta-experiment refining and re-refining the instrument he uses to make better measurements or more skillful tests of a concept. The goal of this stage is the discrimination of information, the sifting of data to uncover fact. We seek differences and magnitudes of effect with controlled parameters.


    After the long night of data collection, we sit, clear-eyed, looking steadfastly at the fruit of our experimentation and ask, “What does this mean? Did our experiment tell us anything? Did our hypothesis pass or fail the test?” Assessment of the results of the experiment is a unique and significant phase of any research project. Here we draw conclusions, identify errors that qualify our results, conceive improvements to our design and—through reflection—ponder the implications of our work. It is here that we build knowledge; we have either made a measurement or a discovery, if we have done the other stages well.

    Once I knew a scientist who had many ideas. This is a positive good; as Linus Pauling said, “The best way to have a good idea is to have lots of ideas” (2004). However, my colleague had no internal critic and had difficulty discriminating between his brilliant ideas and those that were…well, “crackpot.” He could not perform a critical experiment—much to the chagrin of his mentors and collaborators. He wasted many resources in fruitless peregrinations. He is doomed, I fear, to a career of disappointment and stymied success. A researcher must have a cool objective detachment from the experiment but a warm empathy for people to function effectively in a group. The researcher is served well when he is most critical of his own data and has examined it thoroughly, honestly, and objectively. Then he can be self-assured when he reveals his findings to his peers (and critics). A reputation for careful scholarship may take a lifetime to build but can be lost in a single event.


    “Publish or perish,” the old saw reads. There is truth in such proverbs. I recall a cartoon, however, that showed a tombstone whose epitaph read, “He published, but he perished anyway.” Publication is not a guarantee of acceptance or accuracy. I must often correct the assumption of my graduate students that if something is in print it must be true. They will remark that they are confident of their measurements because their work agrees with the X group or Y, et al. I rebut with the following dictum: “Your results are not valid because they agree with all who went before; rather, you can have confidence in your work because you have done it conscientiously and well. What you have done has only corroborated (or refuted) the received wisdom.”

    It is selfish and foolish to keep such knowledge from the widest audience. Knowledge kept secret has the scent of the occult. Seneca reminds us that “[t]he best ideas are the common property of all.” The enterprise that we are about is that of building the knowledge of the community. We have an obligation to communicate our findings, the fruit of our labor, to our peers. If the flowers of our research can only grow in our own garden then the world soon and rightly suspects that they are made of silk; their scent is indeed faint and “off.” We must be willing to allow our ideas to be tested by others, exposed to the critiques of other minds, to be appropriated, refined or refuted by others. “Truth emerges from the clash of adverse ideas” (Mills, 2004). We must have the moral courage to let our research stand or fall on its own merits.

    Research prospers best when there is an open communication of results. Effective verbal communication is an important activity in the process of inquiry, beginning with trusted colleagues or fellow group members, extending to informal discussions with—often competing—peers at professional meetings, and culminating in formal papers “read” at conferences and symposia. It is an exciting experience to witness or participate in the birth of an idea that changes the landscape of a field of intellectual inquiry. Verbal exchange is an important means of rapid communication with immediate feedback. We should assist and encourage our students in such communication. They need to learn how to share and receive information, as well as when to keep their mouths shut. Research, in reality, is not always a cooperative exercise; it is sometimes very cutthroat. It also is important that researchers learn the culture and protocol of the community in which they are immersed. Informal verbal exchanges and conversation can reveal not only the merits of an idea, but also the credibility and care of the researcher. I have observed revolutionary ideas initially rejected (that were ultimately accepted) simply because the method of presentation was ill-advised. The sociology and psychology of the community are important and under-appreciated factors in scientific communication.

    It is imperative that research be communicated in written form, as well, to document its existence and to permit it to be subjected to peer review, where it can begin to be validated. The significance of the first point is driven home by the case of Denton’s lost Nobel Prize. Dr. Clayton Teague received his Ph.D. from the Department of Physics of North Texas State University in 1978. He failed, however, to publish the results of his research in the “open literature.” In his Nobel Prize lecture of 1986, Gerd Binnig recounted how he and E. Hoenig published their independent observation of the results that Teague had obtained in Denton. “Eighteen months later [after our success], we were informed that E.C. Teague, in his Thesis, had already observed similar I(s) curves which at the time were not commonly available in the open-literature” (Binnig & Rohrer, 1986). Binnig and Heinrich Rohrer received the Nobel Prize in Physics for their work. Teague did not. Subsequent to this event, all dissertation research in the Department of Physics at UNT has been required to be published before a degree is granted.

    The internet has made written communication instantaneously available to millions. We, as academics, must learn to use this new electronic means of publication and encourage our students to do so, as well. However, we must also teach our students how to accurately assess the value of web-based information. Not all sources are equally credible, just as Science and The National Enquirer are not of equal stature even though each print publication avers that it promotes “inquiry.” I propose that all student research be internally peer-reviewed and a written document detailing this work be posted on the honors research web. I encourage all of my colleagues to post “pre-prints” or abstracts of research work on the internet. However, standards should be identified or developed (where there are none) and promulgated for communicating the source, the credentials of the author, the level of peer review and the permitted use of the information by readers. This is a new opportunity for scholarship in communication that is sorely needed.

    Peer review is essential, but is not infallible. The U.S. Patent Office has come under fire in recent years for granting a patent for a device claimed to produce energy from nothing. The device violates the first law of Thermodynamics, a law that is universally accepted, not because it is the majority opinion, but because in the 400 years of inquiry in Natural Philosophy no credible evidence has ever been forthcoming that one can get more energy out of a system than is there to begin with. This law, also known as the Conservation of Energy, essentially states “You can’t get something for nothing.” But claims for such Perpetual Motion Machines of the First Kind continue, along with claims from their proponents of “suppression of intellectual freedom” by their critics. Peer review is essential if information is to become reliable knowledge.

    I contend that a hypothesis must pass three tests if it is to be considered viable as a “scientific” theory: (1) its assumptions must not contradict well-established evidence; (2) it must be logically consistent; and (3) it must describe the actual state of affairs; that is, it must agree with reality. I once reviewed a paper for Physical Review in which the authors derived a mathematical model intended to represent the motion of atoms of a solid under the impact of a stream of swift atoms. This is a physical process inelegantly but historically called “Sputtering.” Upon review, I found that their model made an assumption that violated the law of Conservation of Energy. Moreover, they made an algebraic sign error in the third line of their mathematical derivation, a mistake we all are familiar with from trying to balance a checkbook. Unfortunately, the new derivation agreed with their previously published equation! I recommended that the paper be published only after major modification. Agreement of a hypothesis with data is not sufficient to “prove” a theory. In my field of ion-solid interactions, I have actually seen a colleague publicly ridiculed when he protested criticism of his model with the statement “But it fits the data.” He was unaware that the same argument had been made for several competing theories two decades earlier, some of which were later repudiated by their creators as “unphysical,” because they ultimately realized the inconsistency of their assumptions.

    The field of research is fraught with dangers for the careless. I have observed the disappointment and embarrassment of researchers who have presented experimental results as novel only to learn from a comment from the audience at a conference that the experiment was performed and published decades before. Goethe (2004) said it well: “Everything has been thought of before, but the problem is to think of it again.” The irony of that remark is that it paraphrases Solomon: “There is no new thing under the sun” (Ecclesiastes 1:9). Not only is it imperative to know what has been published previously, it also is essential to report prior work when it is relevant. Appropriate attribution of sources is one of the most fragile and neglected areas in written scholarly communication. Use of ideas without citation is theft of intellectual property, and the practice prohibits the assessment of the credibility or veracity of the source. Plagiarism is so common these days as to be endemic among students. Often many do not even realize that they are behaving in an unethical manner when they present ideas, words or even whole works as their own creation. A few notorious cases have made front-page news in the last few years. We must instill in our students of research a professional ethic for honest and conscientious attribution that conforms to the accepted style for the discipline. This is not pedantry; this is integrity.


    Therefore, the stages, phases or categories of the act of research can be identified as (1) Observation, (2) Vision, (3) Logistics, (4) Experiment, (5) Assessment, and (6) Communication. I have analyzed the activities that I and other researchers actually do to make it easier to compare and contrast research as it is done in the physical sciences with activities in other disciplines. This paradigm is only a descriptive hypothesis, and is—as yet—untested. I submit it to your scrutiny as my peers to critique its logic, its assumptions and its comportment with the facts. Only by such a procedure can we be consistent or be sure that this narrative actually is what we do.

    Reference List

    • Binnig, G., & Rohrer, H. (1986). Scanning tunneling microscopy from birth to adolescence. Nobel Lecture, December 8, 1986. Retrieved January 7, 2004.
    • Bohr, N. (2004). Retrieved January 7, 2004.
    • Ecclesiastes 1:9. (n.d.). King James Authorized Version.
    • Fermi, E. (2004). Retrieved January 7, 2004, from The Quotations Page.
    • Goethe, J.W. (2004). Retrieved January 7, 2004.
    • Harris, K. (1995). Retrieved January 7, 2004.
    • Kuhn, T. (1970). The Structure of Scientific Revolutions. Chicago: University of Chicago Press.
    • Mills, J.S. (2004). Retrieved January 7, 2004.
    • Oxford University Press. (2004). OED Online. Retrieved January 7, 2004.
    • Park, R. (2000). Voodoo Science: The Road from Foolishness to Fraud. New York: Oxford University Press.
    • Pauling, L. (2004). Retrieved January 7, 2004.
    • Popper, K. (1959). Logik der Forschung (The Logic of Scientific Discovery). New York: Basic Books.
    • Seneca, L.A. (2004). Epistles. Retrieved January 7, 2004.
    • Sophocles. (430 B.C.E.). Trachiniae. Retrieved January 7, 2004.
    • Teague, E.C. (1978). Room temperature gold-vacuum-gold tunneling experiments (Doctoral dissertation, North Texas State University, 1978; Microfilm F1345). Ann Arbor: University Microfilms International.