Philosophy, humanities
Arturo Tozzi
Former Center for Nonlinear Science, Department of Physics, University of North Texas, Denton, Texas, USA
Former Computationally Intelligent Systems and Signals, University of Manitoba, Winnipeg, Canada
ASL Napoli 1 Centro, Distretto 27, Naples, Italy
For years, I have published across diverse academic journals and disciplines, including mathematics, physics, biology, neuroscience, medicine, philosophy and literature. Now, having no further need to expand my scientific output or advance my academic standing, I have chosen to shift my approach. Instead of writing full-length articles for peer review, I now focus on recording and sharing original ideas, i.e., conceptual insights and hypotheses that I hope might inspire experimental work by researchers more capable than myself. I refer to these short pieces as nugae, a Latin word meaning “trifles” or “playful thoughts”. I invite you to use these ideas as you wish, in any way you find helpful. I ask only that you kindly cite my writings, which are accompanied by a DOI for proper referencing.
TRUTH DEPENDS ON THE AVAILABLE INGREDIENTS: A THEORY OF TRUTH GROUNDED IN LINEAR LOGIC
Various theories of truth have been proposed, e.g., correspondence, coherence, pragmatic and deflationary accounts. Yet most of these assume idealized, fully rational agents with unlimited access to relevant beliefs and facts. Many treat truth as a static property of propositions rather than something constructed or enacted through reasoning. These theories tend to overlook the practical constraints faced by real-world knowers, such as limited information, cognitive resources and temporal access to data. As a result, many of them fall short of capturing the real-time nature of reasoning in decision-making under uncertainty, scientific inquiry and artificial intelligence. What is needed is a model of truth able to capture how knowledge is built, shaped and constrained in practice.
Linear logic and truth: We propose a resource-sensitive theory of truth based on the principles of linear logic, developed by Jean-Yves Girard, which, unlike classical logic, treats information as a resource. A proposition φ is true for an agent or system if and only if it can be constructively derived from a finite set of available informational resources through a valid sequence of linear inference steps:
a ⊗ b ⊗ c ⊢ φ
Here, ⊗ (the tensor operator) signifies that all resources must be present, while ⊢ φ indicates that the proposition φ can be constructively derived. Linear logic is also equipped with the linear implication operator ⊸ (often written –o), which governs the transformation from resource configuration to truth claim, preserving the constraints of non-replicability and non-disposability unless explicitly overridden by the exponential modalities (e.g., the ! annotation, which marks a resource as freely reusable). Therefore, we can express a personal theory of truth in linear logic as: “What I think is true becomes true, but only if I have the right resources (a, b and c) available and I handle them correctly.”
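The resource-consumption reading of ⊗ and ⊢ can be made concrete in a small toy interpreter. This is a hypothetical illustration, not part of Girard’s formalism: deriving φ spends the tokens a, b and c, so the same tokens cannot support a second derivation unless explicitly marked reusable (a crude stand-in for the ! exponential).

```python
from collections import Counter

def derive(resources, required, persistent=frozenset()):
    """Attempt a linear derivation: consume `required` tokens from
    `resources`. Tokens listed in `persistent` play the role of the
    ! exponential and are not spent. Returns (success, remaining pool)."""
    pool = Counter(resources)
    for token in required:
        if pool[token] <= 0:
            return False, Counter(resources)  # derivation blocked: a resource is missing
        if token not in persistent:
            pool[token] -= 1                  # linear use: the token is consumed
    return True, pool

# a ⊗ b ⊗ c ⊢ φ : all three resources must be present, and proving φ spends them
ok, remaining = derive(["a", "b", "c"], ["a", "b", "c"])
print(ok)        # True: φ is derivable
ok2, _ = derive(remaining, ["a", "b", "c"])
print(ok2)       # False: the first proof exhausted the resources
```

Marking a token as persistent (e.g., `persistent={"a"}`) lets it survive the derivation, mirroring how the ! modality licenses reuse in linear logic.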
Overall, our approach points towards a conditional and resource-bound truth rooted in the availability and transformation of informational resources. It’s not enough to just believe something: truth must be constructed from ingredients, much like a proof or a recipe. Truth becomes a dynamic, context-dependent process emerging from what the agent has, knows and can do.
Examples: The following examples illustrate how truth may depend on the presence of specific epistemic resources (e.g., ingredients or inputs) without which a conclusion, however valid, cannot be accessed, confirmed or justified.
- A chef prepares a signature dish and confirms that the ingredients are fresh basil (a), ripe tomatoes (b) and high-quality olive oil (c). Using these, she concludes (φ): the dish will have its characteristic flavor. This belief holds true if all three ingredients are available and used in preparation.
- A team of physicists investigates the existence of the Higgs boson (φ) using theoretical predictions (a), particle collision data (b) and high-precision detector technology (c). Each of these elements is essential: without the theory to guide inquiry, the data to examine or the tools to detect relevant signals, the truth about the Higgs boson cannot be accessed. If all the necessary ingredients and proper epistemic resources are not available, truth is not denied by absence, but made unreachable.
Discussion: By rethinking truth as an operational, context-dependent construct grounded in linear transformations, we situate it within the act of reasoning itself. Our approach thereby aligns with constructivist and proof-theoretic traditions, yet introduces an innovation: reasoning processes are constrained by resource flow. Compared to coherence and correspondence theories, our model tracks how a belief becomes true, not just whether it fits. Compared to deflationary theories, it provides a constructive semantics: truth is not just a label; rather, it is something that must be earned through effort. Compared to Bayesian epistemology, it does not require probabilistic access to all alternatives, but instead allows for partial, context-specific reasoning with limited inputs. Finally, it may explain Gettier-type problems by identifying when a belief is derived without sufficient or properly used resources.
In our framework, a distinction must be drawn between local truth and global derivability. Local truth refers to what an agent can justify or construct at a given moment, based on the specific resources currently available, like data, concepts, tools or evidence. In contrast, global derivability represents what could be validated under ideal conditions, where all relevant resources are accessible and sharable among agents. This distinction may preserve our dynamic, resource-sensitive approach to truth while allowing for objectivity across contexts.
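The local/global distinction can be sketched in a minimal set-based model (a deliberate simplification that ignores resource consumption and multiplicity; the agent names and resources below are invented for illustration): φ is locally true for an agent when that agent’s own resources cover what the derivation requires, and globally derivable when the pooled resources of all agents do.

```python
def locally_true(required, agent_resources):
    """phi is locally true for an agent iff every required resource
    is among the resources that agent currently holds."""
    return set(required) <= set(agent_resources)

def globally_derivable(required, all_agents):
    """phi is globally derivable iff the pooled, shareable resources
    of all agents jointly cover what the derivation requires."""
    pooled = set().union(*(set(r) for r in all_agents))
    return set(required) <= pooled

# Hypothetical agents, echoing the Higgs example above
agents = {"theorist": {"theory"},
          "experimentalist": {"collision_data", "detector"}}
phi = {"theory", "collision_data", "detector"}  # resources the derivation needs

print(locally_true(phi, agents["theorist"]))       # False: missing data and tools
print(globally_derivable(phi, agents.values()))    # True: jointly, the resources exist
```

The gap between the two predicates is exactly the gap the framework describes: a proposition can be globally derivable while remaining locally unreachable for every individual agent.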
Our theory could be useful for modeling belief revision, limited rationality, context-sensitive reasoning and inferential constraints in real-world agents. In philosophy of science, it may reframe how theories are justified in terms of finite data sets and experimental evidence. In artificial intelligence, it can support the development of explainable systems justifying conclusions based on consumable data sources. In cognitive science, it may provide a model for bounded rationality and decision-making under informational constraints. In legal reasoning, it can formalize how evidence is used and exhausted in constructing a legal claim. In educational technologies, it can help students learn to construct justified answers based on limited premises. Future research might explore connections between linear truth derivation and epistemic virtue, temporal logic or interactive reasoning in dialogue systems.
Moreover, our theory generates testable predictions. First, agents trained with resource-sensitive reasoning will outperform classical logic-based agents in environments with limited data access. Second, human subjects will prefer explanations that mirror linear inference patterns, drawing only on the available evidence. Third, Gettier-style belief derivations will fail under linear truth derivation models, offering better alignment with intuitive notions of knowledge.
Overall, we suggest that truth is a construct shaped by the correct resources at hand to derive a proposition and the care with which we use them. Like a well-prepared dish or a carefully reasoned argument, truth emerges through action, i.e., through the thoughtful consumption of evidence, context and inference.
Philosophical Addendum: A Heideggerian Integration. A fruitful bridge can be drawn between our resource-sensitive account of truth and Heidegger’s account of language, since both emphasize that disclosure depends on the availability of enabling conditions. Heidegger’s reflections in The Essence of Language deepen our framework by showing that naming is not a secondary act of attaching a label to something already present, but the very event through which a being comes into the open. Without the name, the thing remains hidden, undifferentiated and without presence in a meaningful world. In our resource-sensitive account of truth, this resonates with the claim that truth depends on the availability of specific informational ingredients. Just as for Heidegger the name is the resource that discloses being and allows it to stand within human understanding, in our model data, concepts and tools are the resources that disclose truth and allow propositions to be constructed and validated. Naming, then, may be seen as the primordial epistemic resource: it grants existence within the horizon of meaning, paralleling the way linear logic grants truth within the horizon of reasoning.
QUOTE AS: Tozzi A. 2025. Nugae - truth depends on the available ingredients: a theory of truth grounded in linear logic. DOI: 10.13140/RG.2.2.12834.34245
MATHEMATICAL TOPOLOGIES OF THOUGHT: A STRUCTURAL METHOD FOR ANALYZING PHILOSOPHICAL FRAMEWORKS
The intersection of mathematics and philosophy has traditionally focused on logic, set theory, and formal semantics. While these tools have proven effective for clarifying arguments and propositions, they often fail to capture the deeper relational and hierarchical structures underlying philosophical systems. Traditional methods like predicate logic, modal analysis, or set-theoretic classification are limited in assessing global structural coherence and dynamic interdependencies in complex conceptual frameworks. To address these limitations, we propose a novel methodological approach that applies topological, algebraic, and probabilistic tools—specifically homotopy theory, sheaf cohomology, and convergence theorems—to the analysis of philosophical arguments. Rather than merely translating arguments into logical syntax, our method maps epistemic claims into mathematical spaces characterized by topological invariants and algebraic structures. This allows us to model conceptual coherence, continuity, and transformation across different systems of thought.
Our method treats philosophical doctrines as embedded structures in higher-dimensional conceptual spaces. Using tools like the Seifert–van Kampen theorem, Kolmogorov’s zero-one law, and the Nash embedding theorem, we may analyze how philosophical positions can be decomposed into substructures, reassembled through formal operations, and compared based on structural similarity or divergence. Homotopy equivalence provides a way to understand how epistemological models can be transformed while preserving essential properties. Meanwhile, probability theory enables a measure of epistemic stability or variability within those models.
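One entry point to such structural comparison can be sketched with graph invariants. For a graph viewed as a one-dimensional complex, homotopy equivalence is fully captured by its Betti numbers (b0, the number of connected components, and b1, the cycle rank). The toy concept graphs below are invented illustrations, not reconstructions of the cited texts:

```python
def betti_numbers(nodes, edges):
    """Topological invariants of a concept graph:
    b0 = connected components, b1 = independent cycles
    (cycle rank = |E| - |V| + b0, the rank of the graph's first homology)."""
    parent = {n: n for n in nodes}
    def find(n):  # union-find with path halving
        while parent[n] != n:
            parent[n] = parent[parent[n]]
            n = parent[n]
        return n
    for u, v in edges:
        parent[find(u)] = find(v)
    b0 = len({find(n) for n in nodes})
    b1 = len(edges) - len(nodes) + b0
    return b0, b1

# Two hypothetical "frameworks": concepts as nodes, dependencies as edges
framework_1 = ({"universals", "signs", "particulars"},
               [("universals", "signs"), ("signs", "particulars")])
framework_2 = ({"experience", "environment", "statements"},
               [("experience", "environment"), ("environment", "statements"),
                ("statements", "experience")])

print(betti_numbers(*framework_1))  # (1, 0): connected and acyclic (tree-like)
print(betti_numbers(*framework_2))  # (1, 1): connected with one feedback loop
```

Two connected concept graphs are homotopy equivalent exactly when their cycle ranks agree, so even this crude invariant already separates tree-like from circularly interdependent conceptual structures.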
This structural formalization provides several potential advantages. It allows for more rigorous comparison between philosophical frameworks, enables the visualization of conceptual dependencies, and provides metrics for evaluating internal consistency and coherence. The methodology opens the door to applications in artificial intelligence, where philosophical concepts could be operationalized within machine learning systems or automated reasoning engines. It also holds promise for digital humanities and the computational analysis of historical texts. Our approach is distinct from existing techniques in its ability to preserve both local and global properties of philosophical structures, offering a new level of precision and formal clarity. Unlike purely logical or interpretive analyses, it introduces measurable, testable models of philosophical thought, setting the stage for future empirical investigations. Testable hypotheses include whether philosophical systems exhibiting homotopy equivalence correspond to similar cognitive models or whether certain topological features predict conceptual evolution.
Overall, this method represents a conceptual innovation to bridge mathematical formalism and philosophical analysis, providing a framework for future interdisciplinary research in epistemology, metaphysics, AI, and beyond.
QUOTE AS: Tozzi A. 2025. Topological and Algebraic Patterns in Philosophical Analysis: Case Studies from Ockham’s Quodlibetal Quaestiones and Avenarius’ Kritik der Reinen Erfahrung. Preprints. https://doi.org/10.20944/preprints202502.1518.v1.