Can the new category of ‘intermediate science’ resolve the problems of contemporary philosophy of science?

This blog post examines the limitations revealed by Popper’s falsificationism and explores how the concept of ‘intermediate science’—designed to encompass probabilistic theories and incomplete theories—can fill the gaps in modern philosophy of science.


In 1897, J. J. Thomson discovered the electron through cathode ray experiments, a finding that became the basis for his so-called ‘plum pudding’ model of the atom. The model was widely accepted as orthodox at the time, but it was disproved by Rutherford’s alpha particle scattering experiments. Nevertheless, Thomson’s discovery made a significant contribution to humanity’s understanding of atomic structure, and today students in South Korea still encounter his theory in their science textbooks. Although his model was falsified, there is broad agreement that it represents a scientifically and historically important discovery. According to the perspective of the prominent 20th-century philosopher of science Karl Popper, however, Thomson’s theory becomes nothing more than a theory that should have been discarded the moment it was falsified, and thus, on a strict dichotomous reading, a theory stripped of its scientific status.
The philosophy of science, whose lineage is often traced back to the Greek philosopher Thales, in whose work science and philosophy are said to have emerged together, aimed to break free from older metaphysical worldviews and construct a worldview grounded in science. Within this discipline, which explores the meaning and legitimacy of the scientific method, the problem of ‘demarcation’, distinguishing science from non-science, forms the foundation on which everything else rests. Against this background, Popper advocated falsificationism as a criterion for distinguishing science from pseudoscience. Popper held that for a theory to attain scientific status, it must be falsifiable. On his account, a theory that lacks falsifiability, one that can always explain every phenomenon in its own terms, is pseudoscience and should be discarded. However, the author believes that as times change, the limitations of Popper’s dichotomous demarcation criterion are becoming increasingly apparent. This post therefore highlights the problems that criterion faces in the modern scientific environment, and the negative impact such thinking could have on scientific progress.
Entering the 21st century, science began relying on probabilistic interpretations to explain the dramatically increased complexity of natural phenomena, inevitably increasing science’s own uncertainty. Quantum mechanics is a prime example of a scientific theory analyzing phenomena based on probability. In the microscopic world, invisible to the naked eye, particles appear and vanish without continuity, scatter in unpredictable directions, and exhibit other bizarre phenomena. After studying the elusive properties of these ever-changing particles in this unknown realm, scientists concluded that it is impossible to precisely measure a particle’s momentum and position simultaneously (Heisenberg’s Uncertainty Principle). In other words, a particle’s momentum and position can only be described probabilistically. Furthermore, the behavior of electrons is also expressed probabilistically through wave functions. In quantum mechanics, it even becomes possible for a cat to exist in a superposition state, equally likely to be ‘dead’ or ‘alive’ under specific conditions.
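The uncertainty principle mentioned above is usually stated as a quantitative bound. A standard formulation, in terms of the standard deviations of position and momentum, is:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

where \(\Delta x\) and \(\Delta p\) are the standard deviations of position and momentum measurements and \(\hbar\) is the reduced Planck constant: the more sharply one quantity is determined, the less sharply the other can be.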
Thus, quantum mechanics explains all phenomena through probabilistic descriptions. Although based on the uncertain element of probability, quantum mechanics fits remarkably well with the bizarre phenomena of the microscopic world. In fact, it could be argued that it was precisely the introduction of probability that made it possible to explain the complex characteristics of the microscopic world. Not only quantum mechanics, but also data mining and data science rely on probabilistic analysis to predict and explain various phenomena. Modern humanity increasingly relies on probabilistic data to interpret the diverse phenomena of a complex and pluralistic world. As a result, quantum mechanics has driven revolutionary advances in physics and chemistry, distinguishing conductors from insulators and elucidating the properties of semiconductors, thereby becoming the cornerstone of modern industrial technological development. Indeed, nearly all electronic and information technologies in modern society—including semiconductor chips, quantum computers, quantum communication, quantum sensors, MRI equipment, lasers, GPS synchronization technology, transistor circuits in smartphones, and optical communication networks—are fundamentally based on quantum mechanics. Particularly since the 2020s, quantum-based technologies have expanded further. Global companies like Google, IBM, and Intel continue competing to increase the number of qubits in quantum computers, and several nations, including South Korea, have adopted quantum infrastructure development as a national strategic industry. Furthermore, quantum cryptography has already been demonstrated and commercialized in some financial institutions and national backbone networks, while quantum sensors are demonstrating performance that surpasses the limitations of conventional sensors in the field of ultra-precise measurement.
Meanwhile, data science has also expanded explosively since the 2020s. It has become an essential technology across all industries, including semiconductors, distribution, insurance, security, as well as healthcare, autonomous driving, financial algorithms, climate prediction, aviation, and smart cities. Particularly, the emergence of generative AI and large language models (LLMs) dramatically broadened the scope of data science applications, becoming a core driver of the global ICT market since 2023. Gartner, an international technology analysis firm, has consecutively selected generative AI, data and analytics, and quantum technology as key strategic technologies in its 2024 and 2025 technology outlooks, emphasizing that data-driven decision-making and probabilistic modeling have become central pillars of modern industry.
These facts confirm that scientific theories built on probability have been validated again and again in the modern technological environment, both in the legitimacy of their content and in their practical results. However, on Popper’s argument, quantum mechanics is not science. When an experiment is conducted to test a probabilistic proposition, the result manifests as a single, definite value rather than a probability, so no single outcome can contradict the proposition; probabilistic propositions therefore cannot be falsified. If we adhere strictly to Popper’s demarcation criterion, which holds falsifiability as the sole standard for science, quantum mechanics should be classified as pseudoscience. Is it truly right to dismiss a theory that has produced such practical and outstanding results as pseudoscience? I find that conclusion very hard to accept. Popper’s demarcation criterion has a clear problem: it fails to encompass theories that rely on probability yet are thoroughly scientific and demonstrably useful.
Furthermore, Popper’s demarcation criterion reveals serious limitations not only for probabilistic theories but also for major modern theories such as the multiverse hypothesis and string theory. String theory holds that the fundamental structure of the universe is formed by extremely small, one-dimensional entities called strings, together with higher-dimensional entities called branes (membranes), existing in a multidimensional space. The quantum multiverse interpretation asserts that in certain situations, each possible outcome of a measurement branches off into a different universe; Schrödinger’s cat would then exist in two different universes, alive in one and dead in the other. These two theories share a common characteristic: they cannot be falsified. String theory can enhance its plausibility through Bayesian methodology, updating its credibility as it proves compatible with observed reality, but this remains a probabilistic form of theoretical confirmation; experimental falsification is impossible with current technology. The multiverse likewise cannot be tested with current human technology.
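The Bayesian updating mentioned above can be sketched with a toy calculation. The numbers below are purely hypothetical and chosen only for illustration: a prior credence of 0.10 in a theory is updated by a piece of evidence judged three times as likely if the theory is true than if it is false.

```python
def bayes_update(prior: float, likelihood_h: float, likelihood_not_h: float) -> float:
    """Posterior probability of hypothesis H given evidence E, via Bayes' theorem.

    prior            -- P(H), credence in the theory before the evidence
    likelihood_h     -- P(E | H), probability of the evidence if the theory is true
    likelihood_not_h -- P(E | not H), probability of the evidence if it is false
    """
    # Total probability of the evidence under both hypotheses
    evidence = likelihood_h * prior + likelihood_not_h * (1 - prior)
    return likelihood_h * prior / evidence

# Hypothetical numbers for illustration only:
posterior = bayes_update(prior=0.10, likelihood_h=0.6, likelihood_not_h=0.2)
print(round(posterior, 3))  # → 0.25
```

Even after the update, credence rises only from 0.10 to 0.25: Bayesian confirmation raises a theory’s plausibility gradually rather than delivering a decisive verdict, which is exactly the contrast with experimental falsification that the text draws.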
Therefore, applying Popper’s criterion of falsifiability, both theories must be classified as pseudoscience. The author has serious reservations about this classification. Of course, unlike quantum mechanics, neither theory is yet complete, and neither has been technically implemented or applied to real life. Nevertheless, the author focuses on the creativity and potential inherent in these two theories. Although clear experimental evidence has not yet been presented, if these theories prove true they would constitute monumental discoveries capable of revolutionizing the history of science, perhaps even opening possibilities as radical as travel between dimensions. Realizing that potential requires scientists to persist in their research. But if these theories are branded pseudoscience, as Popper’s criterion demands, can scientists truly dedicate themselves to such work? Probably not. The term ‘pseudoscience’ carries inherently negative connotations and can instill deep skepticism in researchers, weakening their resolve to pursue such studies; indeed, many still strongly criticize these theories on falsificationist grounds. I believe we should not dismiss a theory as unscientific merely because definitive proof has not yet been established. Just as the modern atomic model was refined gradually over a long period, successful research requires patience, and Popper’s dichotomous demarcation criterion seems insufficiently attuned to that necessity.
Therefore, I propose solutions to the problems inherent in Popper’s demarcation criterion. First, the criterion itself requires modification. For ordinary scientific theories, falsifiability should remain the standard for distinguishing science from non-science; for probabilistic theories, however, the standard should shift to ‘data reliability’. Factors such as the theory underlying a prediction and the experimental apparatus used to test it should all enter into the reliability assessment. This criterion is needed because probabilistic discourse, too, contains predictions that are well supported by evidence and predictions that are not, and distinguishing between them requires an evaluation system based on reliability. Furthermore, the author worries that sorting all of science into only ‘science’ and ‘pseudoscience’ risks sliding into scientism, which could itself hinder scientific progress. Accordingly, the author proposes three categories: ‘science’, ‘pseudoscience’, and ‘intermediate science’, which occupies the middle ground. ‘Intermediate science’ denotes theories in an embryonic stage that hold genuine potential, standing between science and pseudoscience. Under this scheme, theories like the multiverse and string theory would fall within ‘intermediate science’, which could help prevent the indiscriminate dismissal of theories that are meaningful but not yet proven.
Although the classification method proposed by the author may not be perfect, Popper’s criteria for classification, as examined earlier, must be revised for the advancement of modern science. Since science is the discipline that deals with an ever-changing world, its criteria must also be updated to suit the times. Therefore, the author emphasizes the importance of redefining the classification criteria in a direction that protects the potential and latent value of new theories and enriches the future of science.


About the author

Writer

I'm a "Cat Detective": I help reunite lost cats with their families.
I recharge over a cup of café latte, enjoy walking and traveling, and expand my thoughts through writing. By observing the world closely and following my intellectual curiosity as a blog writer, I hope my words can offer help and comfort to others.