Articles and blog posts found on 02 September 2022
Published by Reblogs - Credits in Posts
Barry Maguire: Efficient Markets and Alienation (pdf, 12619 words)
Hasen Khudairi: Epistemicism and Moral Vagueness (pdf, 6750 words)
Janis David Schaab: Moral Obligation: Relational or Second-Personal? (pdf, 15624 words)
Janis David Schaab: Conspiracy Theories and Rational Critique: A Kantian Procedural Approach (pdf, 12522 words)
Robert K. Meyer, Chris Mortensen: Alien Intruders in Relevant Arithmetic (pdf, 10930 words)
Ross T. Brady: The Formalization of Arithmetic in a Logic of Meaning Containment (pdf, 10027 words)
Shay Logan, Graham Leach-Krouse: On Not Saying What We Shouldn’t Have to Say (pdf, 17798 words)
Thomas Macaulay Ferguson, Graham Priest: Meyer’s Relevant Arithmetic: Introduction to the Special Issue (pdf, 4948 words)
Thomas Macaulay Ferguson, Graham Priest: Robert Meyer’s Publications on Relevant Arithmetic (pdf, 1127 words)
Veronica J. Vieland, Sang-Cheol Seok: Absolutely Zero Evidence (pdf, 5089 words)
Barry Maguire: Efficient Markets and Alienation (pdf, 12619 words)
In his Notes on James Mill, in 1844, Karl Marx complained that: … the mediating process between men engaged in exchange is not a social or human process, not human relationship; it is the abstract relationship of private property to private property … men engaged in exchange do not relate to each other as men.
Hasen Khudairi: Epistemicism and Moral Vagueness (pdf, 6750 words)
This essay defends an epistemicist response to the phenomenon of vagueness concerning moral terms. I outline a traditional model of – and then two novel approaches to – epistemicism about moral predicates, and I demonstrate how the foregoing are able to provide robust explanations of the source of moral, as epistemic, indeterminacy. The first model of epistemic indeterminacy concerns the extensions of moral predicates, as witnessed by the non-transitivity of a value-theoretic sorites paradox. The second model of moral epistemicism is induced by the status of moral dilemmas in the epistemic interpretation of two-dimensional semantics. The third model is argued to consist in the formal invalidation of modal axiom K – and thus of epistemic closure – in the derivation of Curry’s paradox. I examine the philosophical significance of the foregoing, and compare the proposal to those of ethical expressivism, constructivism, and scalar act-consequentialism. Finally, I examine the status of moral relativism in light of the epistemicist models of moral vagueness developed in the paper, and I argue that the rigidity of ethical value-theoretic concepts adduces in favor of an epistemic interpretation of the indeterminacy thereof.
Janis David Schaab: Moral Obligation: Relational or Second-Personal? (pdf, 15624 words)
The Problem of Obligation is the problem of how to explain the features of moral obligations that distinguish them from other normative phenomena. Two recent accounts, the Second-Personal Account and the Relational Account, propose superficially similar solutions to this problem. Both regard obligations as based on the legitimate claims or demands that persons as such have on one another. However, unlike the Second-Personal Account, the Relational Account does not regard these claims or demands as based on persons' authority to address them. Advocates of the Relational Account accuse the Second-Personal Account of falling prey to the Problem of Antecedence. According to this objection, the Second-Personal Account is committed to the implausible claim that we have an obligation to φ only if, and because, others demand that we φ. Since the Relational Account’s proposed solution to the Problem of Obligation does not face the Problem of Antecedence, its advocates argue that it is dialectically superior to the Second-Personal Account. In this paper, I defend the Second-Personal Account by arguing that, first, the Relational Account does not actually solve the Problem of Obligation and, second, the Second-Personal Account does not fall prey to the Problem of Antecedence.
Janis David Schaab: Conspiracy Theories and Rational Critique: A Kantian Procedural Approach (pdf, 12522 words)
Unlike most philosophical approaches, a procedural approach does not purport to condemn conspiracy theorists directly on the basis of features of their theories. Instead, it focuses on the patterns of thought involved in forming and sustaining belief in such theories. Yet, unlike psychological approaches, a procedural approach provides a rational critique of conspiracist thought patterns. In particular, it criticises these thought patterns for failing to conform to procedures prescribed by reason. The specific procedural approach that I develop takes its cue from the Kantian notion that reason must be used self-critically. I tentatively suggest that conspiracy theorists fail to engage in the relevant sort of self-critique in at least three ways: they do not critically examine their own motivations, they avoid looking at matters from the point of view of others, and they fail to reflect on the limits of human knowledge.
Robert K. Meyer, Chris Mortensen: Alien Intruders in Relevant Arithmetic (pdf, 10930 words)
The system R# of first-order relevant arithmetic was introduced in [12], as the result of adding the (first-order version of the) Peano postulates to the relevant predicate calculus RQ. The following model was exhibited to show the system non-trivial (thus partially circumventing Gödel’s Second Theorem). We pick as our domain D of objects the integers mod 2, with +, ·, 0 interpreted in the obvious way; on this plan, the successor operation is evidently interpreted so that 0′ = 1 and 1′ = 0. As our collection V of truth-values we pick the set 3 = {T, N, F}, with the sentential connectives &, ∨, ∼, → defined on the (classical) subset 2 = {T, F} in the usual classical way. To complete the definition of the connectives on 3, we define
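The excerpt breaks off before the connectives on 3 are defined. As a rough illustration of how such a model works, the following sketch implements the mod-2 domain together with one standard way of completing the three-valued tables (the RM3/Sugihara tables, ordered F < N < T); the choice of tables is an assumption here, not a claim about the paper's exact definitions.

```python
# Three truth values, ordered F < N < T, encoded as integers.
T, N, F = 1, 0, -1

def neg(a):     return -a            # ~T = F, ~N = N, ~F = T
def conj(a, b): return min(a, b)     # & as meet in the order
def disj(a, b): return max(a, b)     # v as join in the order

def imp(a, b):
    # RM3-style implication (an assumption): classical-like when the
    # antecedent is no truer than the consequent, otherwise F.
    return disj(neg(a), b) if a <= b else F

# Atomic identities n = m are evaluated in the integers mod 2.
def eq(n, m):
    return T if n % 2 == m % 2 else F

# Successor in the two-element domain: 0' = 1 and 1' = 0.
succ = lambda n: (n + 1) % 2

# Taking T and N as the designated (holding) values, the "false"
# identity 0 = 1 receives the undesignated value F, so no system sound
# for this model can derive it -- the point of the non-triviality proof.
def designated(v):
    return v in (T, N)

print(designated(eq(0, 1)))   # 0 = 1 fails in the model
print(designated(eq(succ(1), 0)))   # 1' = 0 holds in the model
```

On the classical subset {T, F} these tables agree with the usual two-valued connectives, as the abstract requires; the third value N is what leaves room for a non-trivial interpretation of the arithmetic.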
Ross T. Brady: The Formalization of Arithmetic in a Logic of Meaning Containment (pdf, 10027 words)
We assess Meyer’s formalization of arithmetic in his [21], based on the strong relevant logic R, and compare this with arithmetic based on a suitable logic of meaning containment, which was developed in Brady [7]. We argue in favour of the latter as it better captures the key logical concepts of meaning and truth in arithmetic. We also contrast the two approaches to classical recapture, again favouring our approach in [7]. We then consider our previous development of Peano arithmetic including primitive recursive functions, finally extending this work to that of general recursion.
Shay Logan, Graham Leach-Krouse: On Not Saying What We Shouldn’t Have to Say (pdf, 17798 words)
In this paper we introduce a novel way of building arithmetics whose background logic is R. The purpose of doing this is to point in the direction of a novel family of systems that could be candidates for being the infamous R#½ that Meyer suggested we look for.
Thomas Macaulay Ferguson, Graham Priest: Meyer’s Relevant Arithmetic: Introduction to the Special Issue (pdf, 4948 words)
Over the decades of Bob Meyer’s prodigious career as philosopher and logician, a topic to which he reliably—if intermittently—returned is relevant arithmetic. Fragmented across a series of abstracts, technical reports, and journal articles, Meyer outlined a research program in nonclassical mathematics that rivals that of the intuitionists in its maturity, depth, and perspicacity.
Thomas Macaulay Ferguson, Graham Priest: Robert Meyer’s Publications on Relevant Arithmetic (pdf, 1127 words)
The bibliography appearing below collects the publications in which Meyer’s investigations into relevant arithmetic saw print. Each bibliographic item is accompanied by a short description of the text or other remarks. We include papers on relevant arithmetic coauthored by Meyer, but omit both Meyer’s work on relevant logic and the work published independently by his collaborators.
Veronica J. Vieland, Sang-Cheol Seok: Absolutely Zero Evidence (pdf, 5089 words)
Statistical analysis is often used to evaluate the strength of evidence for or against scientific hypotheses. Here we consider evidence measurement from the point of view of representational measurement theory, focusing in particular on the 0-points of measurement scales. We argue that a properly calibrated evidence measure will need to count up from absolute 0, in a sense to be defined, and that this 0-point is likely to be something other than what one might have expected. This suggests the need for a new theory of statistical evidence in the context of which calibrated evidence measurement becomes tractable.