13/11/2019 – Maíra Bittencourt

Three Bayesian concepts in the context of the breaking of the naval Enigma code during World War II

Maíra Bittencourt

IFCH – Unicamp

Abstract. Jack Good (1916-2009) was a Bayesian statistician who worked as Alan Turing's (1912-1954) assistant at Bletchley Park during World War II, from 1941 to 1943. They worked as cryptanalysts at the Government Code and Cypher School (GC&CS), with about eight other people, trying to break the naval Enigma code used by the German navy. They were able to break the code by a process Turing called Banburismus. Good (2000, p. 101) refers to this method as a “sequential Bayesian procedure”. The main goal of this seminar is to explain three Bayesian concepts related to this procedure: the Bayes factor, the weight of evidence, and information update.
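To fix notation, the three concepts can be stated compactly (the notation below is mine; Good defines the weight of evidence as the logarithm of the Bayes factor, which at Bletchley Park was measured in base-10 units, bans and decibans):

```latex
% Bayes factor in favour of hypothesis H provided by evidence E
\[
  B(H : E) = \frac{P(E \mid H)}{P(E \mid \neg H)}
\]
% Weight of evidence: the logarithm of the Bayes factor
\[
  W(H : E) = \log B(H : E)
\]
% Information update: posterior odds = Bayes factor times prior odds
\[
  \frac{P(H \mid E)}{P(\neg H \mid E)} = B(H : E) \cdot \frac{P(H)}{P(\neg H)}
\]
```

Because the update multiplies odds by the Bayes factor, the weights of evidence from independent pieces of evidence simply add, which is what makes the procedure sequential.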

COURSERA. Bayesian Statistics: From Concept to Data Analysis. 2019. Accessed 13 Sep. 2019.

GOOD, I. J. The Population Frequencies of Species and the Estimation of Population Parameters. Biometrika, Vol. 40, No. 3/4 (Dec., 1953), pp. 237-264.

________. Studies in the History of Probability and Statistics. XXXVII A. M. Turing’s Statistical Work in World War II. Biometrika, Vol. 66, No. 2 (Aug., 1979), pp. 393-396.

________. Good thinking: the foundations of probability and its applications. Minneapolis: University of Minnesota Press, 1983.

________. Enigma and Fish. In: HINSLEY, F. H.; STRIPP, A. (Eds). Codebreakers: the inside story of Bletchley Park. Oxford University Press, 1994.

________. Turing’s anticipation of empirical Bayes in connection with the cryptanalysis of the naval Enigma. Journal of Statistical Computation and Simulation, 2000, Vol. 66, No. 2, pp. 101-111.

GOOD, I. J.; TOULMIN, G. H. The Number of New Species, and the Increase in Population Coverage, when a Sample is Increased. Biometrika, Vol. 43, No. 1/2 (Jun., 1956), pp. 45-63.

________. Coding Theorems and Weight of Evidence. J. Inst. Maths Applics, 1968, 4, pp. 94-105.

HINSLEY, F. H.; STRIPP, A. (Eds). Codebreakers: the inside story of Bletchley Park. Oxford University Press, 1994.

KAHN, D. The Codebreakers: The Comprehensive History of Secret Communication from Ancient Times to the Internet. New York: Scribner, 1996.

MCGRAYNE, S. B. The theory that would not die: how Bayes’ rule cracked the enigma code, hunted down Russian submarines, and emerged triumphant from two centuries of controversy. New Haven & London: Yale University Press, 2011.

STERN, J. M. Constructive Verification, Empirical Induction, and Falibilist Deduction: A Threefold Contrast. Information 2011, 2, 635-650.

23/10/2019 – Ekaterina Kubyshkina

A logic for factive ignorance

Ekaterina Kubyshkina

CLE – Unicamp

Abstract. In the current debate there are two epistemological approaches to the definition of ignorance: the Standard View and the New View. The former defines ignorance simply as not knowing, while the latter defines it as the absence of true belief. One of the main differences between these two positions lies in rejecting (Standard View) or accepting (New View) the factivity of ignorance, i.e., the principle that if an agent is ignorant of p, then p is true. In the present talk, I first provide a criticism of the Standard View in favour of the New View. Second, I propose a formal setting to represent the notion of factive ignorance.

16/10/2019 – David Fuenmayor

Automated Reasoning with Ethical Theories — A Case Study Towards Responsible AI

David Fuenmayor

Freie Universität Berlin

Abstract. In this talk we start by discussing current trends in statistical machine learning and related challenges, mostly concerning data biases, security vulnerabilities and poor interpretability. We then contrast so-called “bottom-up” and “top-down” approaches to machine ethics and argue for the use of expressive logic formalisms (and related theorem proving infrastructure) for the explicit representation of ethical theories, thus mechanizing ethical reasoning in a “top-down” fashion. In particular, we showcase an approach based on the utilization of classical higher-order logic (i.e. Church’s type theory) as a meta-logic to encode some combinations of non-classical logics useful for normative reasoning. The implementation of this approach makes use of automated theorem provers and proof assistants for higher-order logic, whose application will be showcased live (using Isabelle/HOL).

09/10/2019 – Evandro Gomes

Remarks Concerning Semantic and Syntactic Contributions to the History of Logic

Evandro Gomes

Universidade Estadual de Maringá

Abstract. What the historian seeks to identify and analyze in the sources of the history of logic is the notion of logical contribution. This notion was analyzed by Vega Reñon (1997, p. 40–45), whose conceptual framework we summarize here. This author characterizes the definition of logical contribution in an intuitively recursive way. Let 𝑇 be a text. In principle, 𝑇 is logically significant if 𝑇 has to do with the presuppositions, questions, or applications in the field of knowledge covered by logic in a certain historical landmark 𝑀. Consequently,

(i) if 𝑇𝐿 is a logically significant text, then 𝑇𝐿∗ is a logical contribution with respect to the notions, problems, methods, or results that have characterized the cultivation of logic as a discipline at some moment of its historical course;
(ii) 𝑇𝐿∗𝑝 is a potential logical contribution in a determined historical landmark 𝑀, if 𝑇𝐿∗ could be recognized by practitioners of logic in 𝑀 as a logical contribution;
(iii) 𝑇𝐿∗𝑒 is an effective logical contribution in a determined historical landmark 𝑀, if 𝑇𝐿∗ is in fact recognized or assumed by practitioners of logic in 𝑀 as a logical contribution;
(iv) 𝑇𝐿∗𝑙𝑡 is an historical contribution in the broad sense (a memorable contribution), if there is some historical landmark 𝑀 from which 𝑇𝐿∗ comes to be seen as either a potential or effective logical contribution;
(v) 𝑇𝐿∗𝑠𝑡 is an historical contribution in the strict sense, if there is a historical landmark 𝑀 in which 𝑇𝐿∗ was an effective logical contribution 𝑇𝐿∗𝑒.

We propose that these notions can be adapted to the historiography of every logic or family of logical systems. For instance, in the case of the paraconsistent logics, we can specify the notion of logical contribution, and introduce by analogy the notion of contribution to the history of paraconsistent logic 𝑇𝐿∗𝑃.

To the historiographical categories presented so far, which give prominence to contextual historical elements, may be added other conceptual categories that are applicable and quite appropriate to the comparative historical study of logical systems. Categories such as syntactic and semantic contribution to the history of logic may also be required from a historiographical point of view as internal criteria. These categories permit the determination of clear chains of theoretical formation and make evident the interdependencies and intercorrelations of authors among themselves, among their logical contributions, and among their communities. Of course, these categories can also be refined with the objective of better assessing the historical development of a logical theory, a logic, or an entire branch of logical systems.

We consider a contribution to the history of logic syntactic or semantic depending on the degree of conscious motivation demonstrated by an author at the moment of the proposal of his/her contribution to the field of logic. Intentionality is thus a decisive factor in determining whether a contribution to the history of logic is purely syntactic or also semantic. These notions can be stated more precisely as follows:

1. A syntactic or accidental contribution to the history of logic (or a purely formal one) occurs when an author proposes a logical innovation (in either the broad or the strict sense) in accordance with interpretations proper to the historical landmark 𝑀 within which it appears, but does not offer an explicit interpretation of it or has little or no consciousness of what he or she has just proposed.
2. A semantic or intentional contribution to the history of logic occurs when a logical innovation (in either the broad or the strict sense), in accordance with interpretations proper to the historical landmark 𝑀 within which it appears, is introduced with an explicit motivation and with full awareness on the part of its contributor (see Gomes (2013), p. 9–12 and Gomes & D’Ottaviano (2017), p. 33–36).

These new categories permit objective analysis, including analysis of possible priority disputes in the history of logic, and also establish clear criteria which historically enumerate different logical systems, whether or not they are mutually dependent in a branch of the development of logic. In addition, such intentional character must be readily recognized in the relevant historical landmark 𝑀 in the community of practitioners of logic at the time.

For example, the history of paraconsistent logic, founded on these categories, is supported by the historiographic premise according to which effective logical contributions 𝑇𝐿∗𝑃 must be conscious or recognized within their historical-theoretical landmark – that is, within the state-of-the-art in the community of practitioners at the time (Gomes & D’Ottaviano, 2017, p. 31-45). For this reason, in order to consider a paraconsistent logician in the strict sense to be a precursor or pioneer as a founder of the theoretical field of paraconsistent logic, it is necessary that his/her contributions be intentional or semantic and that they be chronologically appropriate.

In light of the reasons given above, purely chronological criteria appear to be simplistic and inefficient. If such criteria were enough to determine priority in the discovery of paraconsistency, its inauguration would be placed far back in the history of formal Western logic. And just as it would be improper to attribute to Aristotle the notable role of the founder of paraconsistency, it would likewise be improper to do so with regard to authors such as Peter of Spain, William of Ockham, and others. Although parts of their logical theories may at present be considered paraconsistent in the broad sense, these thinkers did not perceive the unusual and non-classical character that their theories implied. A similar situation is found in the cases of Kolmogorov, Johansson, Nelson, and other thinkers of our own era. These authors cannot, under the view we have adopted here, be considered founders of paraconsistency, even though their logical theories can today be interpreted and considered as paraconsistent in the broad or even in the strict sense. In terms of the historiographical premises here assumed, the contributions of Stanisław Jaśkowski (1906–1965) and Newton da Costa (b. 1929) are situated on another level. Motivated by problems arising from the presence of contradictions in specific rational contexts, they proposed and developed logical systems capable of dealing with contradictions or inconsistencies without the trivialization of the theories implied by these systems, completely fulfilling the requirements of historical postulates (1) and (2) above.

BOCHEŃSKI, I. M. [1961]. A History of Formal Logic. Translated from German and edited by I. Thomas. University of Notre Dame Press.
GOMES, E. L. (2013). Sobre a história da paraconsistência e a obra de da Costa: a instauração da Lógica Paraconsistente [On the history of paraconsistency and da Costa’s work: the establishment of Paraconsistent Logic]. (Dec., 2013). 535 p. + appendixes. Thesis (PhD in Philosophy) – Institute of Philosophy and Human Sciences and Centre for Logic, Epistemology and the History of Sciences, State University of Campinas, Campinas, SP.
GOMES, E. L. & D’OTTAVIANO, I. M. L. (2017). Para além das Colunas de Hércules, uma história da paraconsistência: de Heráclito a Newton da Costa [Beyond the Pillars of Hercules, a history of paraconsistency: from Heraclitus to Newton da Costa]. Campinas: Editora da Unicamp; Centro de Lógica, Epistemologia e História da Ciência. (Unicamp Ano 50, 50; CLE, 80) → http://www.editoraunicamp.com.br/produto_detalhe.asp?id=1151
GUILLAUME, M. (1994) La logique mathématique en sa jeunesse. In Development of Mathematics: 1900-1950. J. P. PIER (ed.). Basel, Boston, Berlin: Birkhäuser Verlag. p. 185–321.
VEGA REÑÓN, L. (1997). Una guía de historia de la lógica. Madrid: Universidad Nacional de Educación a Distancia.

02/10/2019 – Francesco Maria Ferrari

Formal and empirical issues against Campbell’s Trope-Theory

Francesco Maria Ferrari

CLE – Unicamp

Abstract. K. Campbell, in the 6th chapter of [1], gives a trope interpretation of fields. My contribution challenges the inner consistency of Campbell’s Trope Theory (TT) and its adequacy relative to one of the current unified pictures of fundamental physics, Quantum Field Theory (QFT) [2, 3].

Campbell’s trope-ontology is based on (i) trope-simplicity, i.e., tropes are unstructured and basic, ultimate entities from which, (ii) by the so-called (Humean) supervenience relation, every other further entity – from atoms to those of our manifest world – is to be interpreted as derivative. Supervenience is, thus, here shaped as a reductive relation.

I simply challenge TT by arguing that (i) is a false assumption and that (ii) generates some inconsistencies.

I will proceed as follows. First, I recall the basic features of TT. Second, I present my argument on the generation of inconsistencies in TT as a supervenience-based ontology. I argue for the restriction of TT’s adequacy to a non-unified version of QFT – called background dependent – which, contrary to what TT supposes, is not our current best picture of fundamental physics, being coherent only with Quantum Mechanics (statistics) but not with General Relativity. Thus, I conclude, TT is an essentially incomplete theory w.r.t. our physical world, and present some further insights towards a process ontology.


[1] Campbell, K. (1990). Abstract Particulars (Philosophical Theory), Cambridge, USA: Blackwell.
[2] Umezawa, H. (1993). Advanced Field Theory. NY: American Institute of Physics.
[3] Royal Swedish Academy of Sciences (2015). Neutrino Oscillations, Scientific Background on the Nobel Prize in Physics, available online.
[4] Campbell, R. J., & Bickhard, M. H. (2011). Physicalism, Emergence and Downward Causation. Axiomathes 21:33-56.
[5] Del Giudice, E., Pulselli, R., & Tiezzi, E. (2009). Thermodynamics of irreversible processes and quantum field theory: an interplay for understanding of ecosystem dynamics, Ecological Modelling 220:1874-1879.
[6] Blasone, M., Jizba, P., & Vitiello, G. (2011). Quantum field theory and its macroscopic manifestations, Amsterdam: John Benjamins Pub. Co.

11/09/2019 – Aldo Figallo-Orellano

About the notion of consistency for Tarskian logics

Aldo Figallo-Orellano

CLE (Unicamp) & Universidad Nacional del Sur (Argentina)

Abstract. The problem of the completeness of classical first-order predicate logic was formulated, for the first time in precise mathematical terms, in 1928 by Hilbert and Ackermann, and solved positively by Gödel in his Ph.D. thesis one year later. In 1947, in his Princeton Ph.D. thesis, Henkin presented an alternative and simpler proof that became standard in logic textbooks. The main advantage of Henkin’s method is that it shows how to construct, by means of a maximal consistent theory, term-models that invalidate a non-derivable formula of the calculus.

In this talk we will discuss the notion of consistent theories for certain algebraizable logics and show that Henkin’s maximal theories are linked to Monteiro’s notion of maximal deductive systems of the first-order Lindenbaum–Tarski algebra. First, in order to exhibit the mentioned relation, we present a first-order trivalent calculus which has an implication but no negation in its language. Next, the completeness theorem will be proved using results from universal algebra, and a general presentation of these ideas will be given. Finally, as an application of this, we will present algebraic-like models for da Costa’s first-order logic Cω.

28/08/2019 – Pedro Carrasqueira

Structural equations models for deontic logic

Pedro Carrasqueira
IFCH – Unicamp

Abstract. Standard deontic logic (SDL) has been found faulty as a logical theory of normative systems for a number of reasons. Among other criticisms, it has been said to fail to account for: the difference between ought to be and ought to do; conditional norms; and the difference between regulative and constitutive rules.

In my talk I will, first of all, present SDL and the philosophical conception of normativity underlying it; then I will address its problems, and briefly discuss some of the extended and alternative systems of deontic logic proposed as solutions to them. As I will argue, all of them seem to me insufficient as theories of normative systems, as none of them account for all of SDL’s deficiencies in a single, uniform framework.

Thus, following a suggestion by the philosopher of law Hans Kelsen to the effect that normativity is in some sense analogous to causality, and drawing from recent formal work on causality by Halpern and others, I will present a sketch of my attempt at a novel formal analysis of normativity (making use of so-called structural equations models) that may be able to provide such a framework.
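As a rough illustration of the structural-equations machinery the talk draws on (here in Halpern’s causal setting, before any deontic adaptation), a minimal sketch follows; the toy fire example and all names in it are mine, not from the talk:

```python
# Minimal structural-equations model in the style of Halpern's causal models:
# each endogenous variable is computed from the values of its parent variables,
# and an intervention do(X = x) overrides whatever value X would otherwise get.

def evaluate(equations, exogenous, interventions=None):
    """Solve an acyclic structural-equations model, honouring interventions."""
    interventions = interventions or {}
    values = dict(exogenous)
    values.update(interventions)              # do(X = x) fixes X everywhere
    pending = set(equations) - set(interventions)
    while pending:
        for var in list(pending):
            try:
                values[var] = equations[var](values)
                pending.discard(var)
            except KeyError:
                continue                      # a parent is not yet computed
    return values

# Toy example: a struck match (M) and oxygen (O) jointly cause fire (F).
equations = {"F": lambda v: v["M"] and v["O"]}
context = {"M": True, "O": True}

actual = evaluate(equations, context)
counterfactual = evaluate(equations, context, interventions={"M": False})
print(actual["F"], counterfactual["F"])       # → True False
```

The counterfactual query "had the match not been struck, would there have been fire?" is answered by the intervention do(M = false); the talk’s proposal, as I read the abstract, is that an analogous overriding mechanism can model how norms determine what ought to be the case.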


Briggs, R. Interventionist counterfactuals. In: Philosophical studies, v. 160, pp. 139-166. Springer Netherlands, 2012 (online resource).

Gabbay, D. et al. Handbook of deontic logic and normative systems. Milton Keynes: College Publications, 2013.

Halpern, J. Y. Actual causality. Cambridge: MIT, 2016.

Kelsen, H. Teoria pura do direito. Trad. José Baptista Machado. São Paulo: Martins Fontes, 2006.

Navarro, P. E.; Rodríguez, J. L. Deontic logic and legal systems. Cambridge: Cambridge University, 2014.

Searle, J. The construction of social reality. New York: The Free Press, 1995.