Work in progress

T.F.S. and R. de Heide:
On the truth-convergence of open-minded Bayesianism.
Submitted.
Formalizing suggestions going back to Shimony (1970), Wenmackers and Romeijn (2016) work out an extension of Bayesian confirmation theory that can deal with newly proposed hypotheses. We demonstrate that their "open-minded Bayesianism" does not preserve the classic guarantee of almost-sure merger with the true hypothesis, and advance a forward-looking open-minded Bayesian that does.
Journal publications, to appear

T.F.S. (2019):
The meta-inductive justification of induction: The pool of strategies. [philsci]
Accepted for publication in Philosophy of Science, Proceedings of the 2018 Biennial Meeting of the PSA.
In this follow-up paper I pose a challenge to Schurz's proposed meta-inductive justification of induction. I argue that Schurz's argument requires a dynamic notion of optimality that can deal with an expanding pool of prediction strategies.

T.F.S. (20xx):
The meta-inductive justification of induction. [doi] [philsci]
Accepted for publication in Episteme.
I investigate Schurz's proposed meta-inductive justification of induction, a refinement of Reichenbach's pragmatic justification that is grounded in results from machine learning. My conclusion is that the argument, suitably explicated, goes a long way; but there are qualifications. One is that the argument can at most justify sticking with object-induction for now; another I work out in a follow-up paper.
Journal publications

T.F.S. (2019):
Putnam's diagonal argument and the impossibility of a universal learning machine. [doi] [philsci]
Erkenntnis 84(3): 633–656.
The diagonalization argument of Putnam (1963) denies the possibility of a universal learning machine. Yet the proposal of Solomonoff (1964), made precise by Levin (1970), promises precisely such a thing. In this paper I discuss how this proposed measure function is designed to evade diagonalization, but the corresponding prediction method still falls prey to it.

T.F.S. (2017):
A generalized characterization of algorithmic probability. [doi] [arxiv]
Theory of Computing Systems 61(4): 1337–1352.
In this technical paper I employ a fixed-point argument to show that algorithmic probability can equivalently be defined as the universal transformation of any continuous computable measure (rather than just the uniform one). A motivation for establishing this result was to question the view that algorithmic probability incorporates principles of indifference and simplicity.
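For context, a sketch of the standard notions involved (notation varies across the literature, and the paper's exact statement may differ):

```latex
% The transformation of a measure \nu on programs by a universal
% monotone machine U, evaluated on a finite string x, sums \nu over
% the minimal programs whose output extends x:
\[
  \nu_U(x) \;=\; \sum_{\substack{p \,:\, x \preceq U(p) \\ p \text{ minimal}}} \nu(p).
\]
% Algorithmic probability is the special case of the uniform measure
% \lambda(p) = 2^{-|p|} on programs:
\[
  \mathbf{M}(x) \;=\; \lambda_U(x) \;=\; \sum_{\substack{p \,:\, x \preceq U(p) \\ p \text{ minimal}}} 2^{-|p|}.
\]
```

The equivalence result then says, roughly, that replacing the uniform measure \(\lambda\) by any continuous computable measure yields the same universal semimeasure up to a multiplicative constant.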

T.F.S. (2016):
Solomonoff prediction and Occam's razor. [doi] [philsci]
Philosophy of Science 83(4): 459–479.
Many writings on the subject suggest that algorithmic probability can offer a formal justification of Occam's razor. In this paper I make this argument precise and show why it does not succeed. The broader purpose of the paper is to give an overview for philosophers of Solomonoff's theory of prediction.

G. Barmpalias and T.F.S. (2011):
On the number of infinite sequences with trivial initial segment complexity. [doi] [preprint]
Theoretical Computer Science 412(52): 7133–7146.
In this technical paper, based on results from my MSc thesis [pdf], we answer an open problem [pdf] in the field of algorithmic randomness. This problem concerns infinite sequences of minimal Kolmogorov complexity. Specifically, we determine the arithmetical complexity of calculating the number of such sequences for a given constant. Along the way we prove several results on the complexity of trees.
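As a sketch of the notion at issue (the paper's precise formulation may differ): a sequence has trivial initial-segment complexity when its prefixes are no more complex than their lengths.

```latex
% A sequence X has trivial initial-segment complexity with constant b
% when, for all n,
\[
  K(X \upharpoonright n) \;\leq\; K(n) + b,
\]
% where K is prefix-free Kolmogorov complexity. The question is then
% the arithmetical complexity of the counting function
\[
  G(b) \;=\; \bigl|\{\, X \in 2^{\omega} : \forall n\; K(X \upharpoonright n) \leq K(n) + b \,\}\bigr|.
\]
```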
Conference proceedings

E.R.G. Quaeghebeur, C.C. Wesseling, E.M.A.L. Beauxis-Aussalet, T. Piovesan, and T.F.S. (2017):
The CWI world cup competition: Eliciting sets of acceptable gambles. [pdf] [poster]
Proceedings of Machine Learning Research 62: Proceedings of the Tenth International Symposium on Imprecise Probability: Theories and Applications, 10–14 July 2017, pp. 277–288. Poster presented at ISIPTA '15.
Other

T.F.S. (2018):
What's hot in mathematical philosophy. [pdf]
The Reasoner 12(12), pp. 97–98.
Installment of a monthly column run by the MCMP; my contribution is on formal epistemology and machine learning.

J.W. Romeijn, T.F.S. and P.D. Grünwald (2012):
Good listeners, wise crowds, and parasitic experts. [doi] [pdf]
Analyse & Kritik 34(2), pp. 399–408.
Comment on Meta-induction and the wisdom of crowds [pdf] by P. Thorn and G. Schurz.
Their paper investigates the tension between the provable optimality of meta-inductive methods, which aggregate the judgements of the other available experts, and the Wisdom of Crowds effect, which presupposes diverse and independent judgements by the experts. In our discussion we shift attention from optimality, or relative reliability, to the absolute reliability of experts.
PhD dissertation (2018, cum laude)

Universal prediction: A philosophical investigation. [cwirepo] [handle] [philsci]
Supervisors: J.W. Romeijn (U Groningen) and P.D. Grünwald (Centrum Wiskunde & Informatica, Amsterdam; Leiden U).
Assessment committee: H. Leitgeb (LMU Munich), A.J.M. Peijnenburg (U Groningen), and S.L. Zabell (Northwestern U).
Examining committee: the assessment committee, and R. Verbrugge (U Groningen), L. Henderson (U Groningen), and W.M. Koolen (CWI Amsterdam).
In this thesis I investigate the theoretical possibility of a universal method of prediction. A prediction method is universal if it is always able to learn what there is to learn from data: if it is always able to extrapolate given data about past observations to maximally successful predictions about future observations. The context of this investigation is the broader philosophical question of the possibility of a formal specification of inductive or scientific reasoning, a question that also touches on modern-day speculation about a fully automated data-driven science.
I investigate, in particular, a specific mathematical definition of a universal prediction method that goes back to the early days of artificial intelligence and has a direct line to modern developments in machine learning. This definition essentially aims to combine all possible prediction algorithms. An alternative interpretation is that this definition formalizes the idea that learning from data is equivalent to compressing data. In this guise, the definition is often presented as an implementation and even as a justification of Occam's razor, the principle that we should look for simple explanations.
The conclusions of my investigation are negative. I show that the proposed definition cannot be interpreted as a universal prediction method: it turns out to fall prey to the very mathematical argument it was intended to overcome. Moreover, I show that the suggested justification of Occam's razor does not work, and I argue that the relevant notion of simplicity as compressibility is itself problematic.
My thesis was one of the three winners of the triennial Wolfgang Stegmüller Award of the Gesellschaft für Analytische Philosophie. Here is a picture from the ceremony, taken from this slideshow.