Logical vs. historical order

The logical order of primacy in an argument isn’t necessarily the same as the historical order of primacy. For example, Morris Cohen argued in An Introduction to Logic and Scientific Method (1934) that the axioms of a system are often discovered after the theorems. According to Cohen, many of Euclid’s theorems were already known to the ancient Greeks for hundreds of years before Euclid did his groundbreaking work. Euclid’s contribution wasn’t so much to discover the theorems as to discover the axioms for theorems already known. His contribution was largely to systematize already-existing knowledge.

In other words: The order in which ideas are come up with and made known in the history of thought doesn’t necessarily match up with the order of what’s logically prior and what’s logically posterior.

The denial of the possibility of social science

I’ve always been interested in the questions of why people do what they do and why people feel and think as they do. Being of a philosophical and scientific orientation, my interest in those questions has led me to study the most philosophically deep schools of thought in economics, linguistics, and some of the other sciences of human action and the human mind.

To my surprise, though, many or even most people, at least in the modern West, find it uncomfortable to generalize about people. The problem with that is: Science, whether it’s about people or things, is about generalization. Thus, to find it uncomfortable to generalize about people is to find it uncomfortable to do science about people. In other words: To my surprise, the controversies in the sciences of human action and the human mind aren’t only about what the best models are but are also about whether models are even possible.

See below for some of what I’ve written on the psychology and sociology of that debate:

  1. Is the mental subject to scientific law?
  2. The Enlightenment and Romanticism

Natural order

In Christian cosmogony, it was God who (1) made something out of nothing and then (2) gave that something the order that it has. In Dialogues Concerning Natural Religion (1779), David Hume argued for “a new hypothesis of cosmogony,” a challenge to the latter doctrine, i.e., to the claim that without God, the order in the world of matter has no good explanation. In essence, he argued that the order in the world of matter can be accounted for without hypothesizing supernatural intervention, for that order is the natural result of something that everybody knows: that some configurations of matter are more stable than others. If some matter in an unstable form by chance falls into another unstable form, then by definition (i.e., by definition of the term “unstable”) it’s unlikely for the matter to stay in that form for a long time. It’s when matter instead by chance falls into a stable form that it’s likely to stay like that. Chaos falls into chaos until it settles into order.

Interestingly: In The Selfish Gene (1976), Richard Dawkins used that Humean argument in order to contextualize biological evolution. Hume explained how chaos naturally settles into order (which is an explanation of any kind of evolution, whether biological or not), and to that explanation Dawkins added the idea of a replicator (which is how biological evolution works in that context).

Why are there so many rocks? Because rocks are especially stable. If some matter by chance falls into the form of a rock, then it’s likely to stay like that. And why are there so many birds? Not because birds are especially stable on the level of the individual, like rocks, but because birds are especially stable on the level of the group. They’re especially good at replicating themselves, and thus keeping the group in existence, before themselves falling out of existence.
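To make the mechanism concrete, here’s a minimal simulation sketch in Python, under toy assumptions of my own (three made-up configurations with made-up leave-probabilities; none of this is Hume’s or Dawkins’s notation): unstable forms are quickly left, so over time, matter is found mostly in the stable form.

```python
import random

# A toy model: matter wanders among configurations at random, and a
# "stable" configuration is, by definition, one that's unlikely to be left.
LEAVE_PROBABILITY = {"chaos_a": 0.9, "chaos_b": 0.9, "rock": 0.05}

random.seed(0)
state = "chaos_a"
time_spent = {form: 0 for form in LEAVE_PROBABILITY}

for _ in range(100_000):
    time_spent[state] += 1
    if random.random() < LEAVE_PROBABILITY[state]:
        # By chance, fall into some other (or the same) form.
        state = random.choice(list(LEAVE_PROBABILITY))

print(time_spent)  # the "rock" form dominates, with no designer needed
```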

That Humean argument, however, falls to thoroughgoing subjectivism. The difference between chaos and order isn’t inherent to the world of matter. The difference instead comes out of something subjective: categorization.

Rocks are stable because rocks are rocks whether they’re big or small, rough or smooth, etc. But why categorize like that? A big “rock” can fall and break into small “rocks,” and a rough “rock” can be made into a smooth “rock” after enough time in a river. Our categorization scheme is such that through those transformations they’re all still “rocks.” How stable! Theoretically speaking, though, it’s possible to use any categorization scheme that you want. Anything can be thought of as staying the same through any transformation, and anything can be thought of as not staying the same through any transformation. It’s possible to imagine a categorization scheme that puts even rocks into chaotic flux.
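Here’s a minimal sketch of that point in Python, with two toy categorization schemes of my own invention: the very same sequence of physical states comes out “stable” under one scheme and “chaotic flux” under the other.

```python
# The same history: a big rough rock breaks, then gets smoothed by a river.
states = [
    {"material": "mineral", "size": "big", "texture": "rough"},
    {"material": "mineral", "size": "small", "texture": "rough"},
    {"material": "mineral", "size": "small", "texture": "smooth"},
]

def scheme_a(state):
    return state["material"]                  # a "rock" is a "rock," big or small

def scheme_b(state):
    return (state["size"], state["texture"])  # every transformation yields a new thing

def is_stable(scheme, history):
    # Stable iff the category survives every transformation in the history.
    return len({scheme(state) for state in history}) == 1

assert is_stable(scheme_a, states)      # how stable!
assert not is_stable(scheme_b, states)  # chaotic flux
```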

Phenomenalist linguistics

In the late 2000s and early 2010s, I put a lot of effort into studying David Hume’s work and building a phenomenalist foundation for linguistics with the help of his work. I shelved the project after a while (not because it wasn’t going well but because I got sidetracked). And then in the early 2020s, I happened to go back to Friedrich Hayek’s book The Sensory Order (1952), which I had read long before that but without getting much out of it. Suddenly, though, it all made sense. And with Hayek’s orienting influence, I got to work putting into writing my late-2000s, early-2010s work on Humean linguistics, now Humean-Hayekian.

I should also mention something else that influenced me between my original late-2000s, early-2010s work on Humean linguistics and my recent work on Humean-Hayekian linguistics: Around when I went back to The Sensory Order, I spent a lot of time and energy on logic. Besides doing my own thinking, I studied with great interest John Stuart Mill’s 1,000+ page textbook A System of Logic (1843) and Morris Cohen’s more concise and eloquently written textbook An Introduction to Logic and Scientific Method (1934). Thus, between (1) my late-2000s, early-2010s work on adapting Humean phenomenalism to linguistics, (2) my newfound appreciation, as of the early 2020s, for Hayek’s effort to reconcile phenomenalism, which is a traditional doctrine in philosophy, with 20th-century science, and (3) my newfound understanding of logic, I found myself better oriented than ever in all of the ways that mattered for building the phenomenalist foundation for linguistics that I envisioned first in the late 2000s. That project had turned into Humean-Hayekian logico-linguistic system building.

See below for what I’ve written so far on that system:

  1. Sensation as such vs. sensation of
  2. From linguistics to logic
  3. The mental vs. the physical
  4. The subjective vs. the objective
  5. The self vs. the other
  6. Words as sets
  7. Form and substance
  8. Semantics and syntax
  9. Word-thought overwriting
  10. An analogy to word-thought overwriting
  11. The phenomenalism of categorization
  12. The a priori and the a posteriori
  13. The branches of linguistics
  14. The reification of words and money

Deixis

When two people are talking to each other, each utterance is such that there’s a speaker and a listener. Furthermore, there’s everybody who’s neither the speaker nor the listener.

When a first-person pronoun is used (e.g., “I,” “me”), the speaker is referring to themselves. But it’s also possible for the speaker to refer to something close to themselves (e.g., “this,” “these”) or somewhere close to themselves (e.g., “here”).

In deixis, there’s:

  1. The speaker
  2. The listener
  3. Neither the speaker nor the listener
  4. The location in space of the speaker, the listener, or neither the speaker nor the listener
  5. The location in time of the utterance

Thus, it’s possible to refer to:

  1. The speaker of the utterance
  2. The listener
  3. Neither the speaker nor the listener
  4. Something near the speaker of the utterance
  5. Something near the listener
  6. Something near neither the speaker nor the listener
  7. Somewhere near the speaker of the utterance
  8. Somewhere near the listener
  9. Somewhere near neither the speaker nor the listener
  10. The past with respect to the utterance
  11. The present with respect to the utterance
  12. The future with respect to the utterance

There’s also (see the code sketch after this list):

  1. Male vs. female
  2. Singular vs. plural
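Here’s a minimal sketch in Python pulling the above together (the type and field names are my own, not a standard linguistic inventory): a deictic reference is a choice of participant, place, time, gender, and number, with any of those left unspecified.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Participant(Enum):
    SPEAKER = "speaker"    # "I," "me"
    LISTENER = "listener"  # "you"
    NEITHER = "neither"    # "he," "she," "they"

class Place(Enum):
    NEAR_SPEAKER = "near speaker"    # "this," "these," "here"
    NEAR_LISTENER = "near listener"  # e.g., Japanese それ, そこ
    NEAR_NEITHER = "near neither"    # "that," "those," "there"

class Time(Enum):
    PAST = "past"
    PRESENT = "present"
    FUTURE = "future"

@dataclass
class DeicticReference:
    participant: Optional[Participant] = None
    place: Optional[Place] = None
    time: Optional[Time] = None
    plural: Optional[bool] = None  # singular vs. plural
    female: Optional[bool] = None  # male vs. female

# "these": things near the speaker, plural
these = DeicticReference(place=Place.NEAR_SPEAKER, plural=True)

# "she": neither the speaker nor the listener, singular, female
she = DeicticReference(participant=Participant.NEITHER, plural=False, female=True)
```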

Notation in logic

My interest in using notation in logic (e.g., ~ for “not,” strictly defined) is in part a result of often finding it useful to keep track of how I would notate the logical skeleton of what’s fleshed out as natural prose. For example, in natural prose the word “not” doesn’t always mean ~. In natural prose, there’s no 1-to-1 correspondence between the linguistic form and the logical substance. With the symbol ~ strictly defined, you can, whenever doing so would be useful, ask yourself: “Is the present usage of the word ‘not’ equivalent to ~?”

When I’m reading or writing, my internal experience is often such that I visualize ~ and other symbols as furigana (the small glosses that Japanese writing places above characters). That is, I often ask myself whether I could justifiably put a certain logical symbol above a given word or phrase.

When writing, that technique helps you get the best of both worlds of the artificial and the natural: the artificiality of scientific writing and the naturalness of artistic writing. It helps you keep track of the logic without there being any need for you to artificially limit or regiment how you use natural language.
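Here’s a minimal sketch of that in Python (the function name is mine): with ~ strictly defined as truth-functional negation, checks like double negation always go through, whereas natural-language “not” carries no such guarantee.

```python
def tilde(p: bool) -> bool:
    """~ strictly defined: True becomes False, and False becomes True."""
    return not p

# Under the strict definition, ~~p is always p:
for p in (True, False):
    assert tilde(tilde(p)) == p

# No such guarantee holds for the word "not" in natural prose: "not
# unhappy," for example, doesn't reduce to "happy." That gap is exactly
# what the question "Is this usage of 'not' equivalent to ~?" probes.
```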

Digits etc.

  1. There are the symbols 1, 2, 3, 4, 5, 6, 7, 8, 9, and 0. 1 is ⚫︎, 2 is ⚫︎⚫︎, etc., and 0 is nothing. When put together: For the numbers 1, 2, 3, 4, 5, 6, 7, 8, 9, and 0, xy = x * (9 + 1) + y. For example: 23 = 2 * (9 + 1) + 3. 45 = 4 * (9 + 1) + 5. And 67 = 6 * (9 + 1) + 7. (See the code sketch after this list.)
  2. Importantly, the xy in that algebraic equation isn’t another way of writing x * y. In the present context, xy means writing the symbol for the number x in front of the symbol for the number y, with the possible numbers being from ⚫︎ to ⚫︎⚫︎⚫︎⚫︎⚫︎⚫︎⚫︎⚫︎⚫︎, e.g. writing the symbol 2 in front of the symbol 3.
  3. Put differently: For the digits (the term “digit” meaning a certain kind of symbol) 1, 2, 3, 4, 5, 6, 7, 8, 9, and 0, the digit x written in front of the digit y = the number associated with the digit x * ⚫︎⚫︎⚫︎⚫︎⚫︎⚫︎⚫︎⚫︎⚫︎⚫︎ + the number associated with the digit y.
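Here’s the rule from item 1 as a minimal Python sketch, plus an extension of my own that folds the same rule across any run of digits:

```python
def two_digits(x: int, y: int) -> int:
    # Writing digit x in front of digit y stands for x * (9 + 1) + y.
    return x * (9 + 1) + y

assert two_digits(2, 3) == 23
assert two_digits(6, 7) == 67

def digits_to_number(digits: list[int]) -> int:
    # The same rule applied repeatedly: each new digit shifts the
    # accumulated number one place to the left.
    number = 0
    for digit in digits:
        number = number * (9 + 1) + digit
    return number

assert digits_to_number([1, 2]) == 12
assert digits_to_number([4, 0, 7]) == 407
```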

Positive and negative, plus and minus

  1. In arithmetical notation, there are the seemingly fundamental symbols +, -, *, and /. Interestingly, though: The symbols + and - are ambiguous between (1) +1 vs. -1 read as “positive one” vs. “negative one” and (2) 1 + 1 vs. 1 - 1 read as “one plus one” vs. “one minus one.” That is: + is ambiguous between “positive” and “plus,” and - is ambiguous between “negative” and “minus.”
  2. Semantically speaking, the difference between those two conceptual pairs is that a number being positive or negative is a static state, and a number being added (with a + read “plus”) or subtracted (with a - read “minus”) is a dynamic state. For example, let’s say that you’re looking at your bank account. If you have a positive bank balance (e.g., +50 thousand dollars, with the + usually being left off), then the bank owes you $50,000. And if you have a negative bank balance (e.g., -50 thousand dollars), then you owe the bank that amount of money. That’s about the static state of your bank account. But if you add money to or subtract money from your bank account—i.e., if you make a deposit or withdrawal—then you change the bank balance in a positive or negative direction (whether or not in doing so you go far enough to change whether you’re “in the black” or “in the red”). That’s about the dynamic state of your bank account. (See the code sketch after this list.)
  3. Thus: In the logical language, there will be: (1) a symbol for the static state of being a positive number, (2) a symbol for the static state of being a negative number, (3) a symbol for the dynamic state of a number moving, or being made to move, in a positive direction, and (4) a symbol for the dynamic state of a number moving, or being made to move, in a negative direction.
  4. What about the symbols * and /, though? Interestingly, there’s no analogous ambiguity with those symbols. * is straightforwardly just multiplication, and / is straightforwardly just division.
  5. Addition and subtraction are counterpart operations in arithmetic in that what one does, the other undoes—the term “operation” here implying a dynamic state. Multiplication and division too are counterpart operations in the same sense. For example: Take 3 + 4 = 7. The start is 3, the operation is + 4, and the end is 7. Next, take that end as the start and undo what was done: 7 – 4 = 3. Analogously, consider the “doing” of 3 * 4 = 12 and the counterpart “undoing” of 12 / 4 = 3.
  6. To generalize: (1) For all numbers x, y, and z, x + y = z implies z – y = x. And (2) for all numbers x, y, and z, x * y = z implies z / y = x.
  7. It’s inelegant that the traditional notation is such that ab is the same as a * b but 22 isn’t the same as 2 * 2.
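Here’s a minimal Python sketch of the static/dynamic distinction from items 2 and 3 and the do/undo pairs from item 6 (the function names and the bank-account encoding are mine):

```python
def is_positive(balance: float) -> bool:
    # Static state: "in the black."
    return balance > 0

def deposit(balance: float, amount: float) -> float:
    # Dynamic state: move the balance in a positive direction.
    return balance + amount

def withdraw(balance: float, amount: float) -> float:
    # Dynamic state: move the balance in a negative direction.
    return balance - amount

balance = -50_000.0                   # static: you owe the bank
balance = deposit(balance, 60_000.0)  # dynamic: a positive-direction change
assert is_positive(balance)           # static again: now the bank owes you

# The do/undo pairs of item 6, checked for a few sample numbers:
for x in (3.0, -2.0, 0.5):
    for y in (4.0, 7.0):
        assert (x + y) - y == x  # subtraction undoes addition
        assert (x * y) / y == x  # division undoes multiplication
```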

The vocabulary and grammar of arithmetic and algebra

  1. Operands and operators. The “operands” of arithmetic are 1, 0, etc., and the “operators” are +, -, *, /, etc. Algebra, then, which builds on top of arithmetic, introduces the “operands” x, y, etc.
  2. The operands as distinguished into constants and variables. For example: 1 and 0 are “constants,” and x and y are “variables.” In arithmetic and algebra, the constants are numbers (e.g., 1, 0). The variables, then, are like blanks to be filled in with those constants (which, again, are numbers in the present context). x + 1 = 2 is like _ + 1 = 2. What constant/number, if put into that blank, would make a true proposition? The answer is of course 1. Thus, x = 1. That is: _ = 1. But why aren’t blanks actually used? The reason is that x is like a _ that must be filled in with the same constant/number everywhere—well, everywhere in the circumscribed sphere of that x’s usage. For example, consider: 2x + 3x = 10. That’s like 2_ + 3_ = 10 with the constraint that both _ must be filled in with the same constant/number. By contrast: 2x + 3y = 10 is like 2_ + 3_ = 10 without that constraint.
  3. Relations. The “relations” of arithmetic and algebra are =, >, <, etc. Without the relations, no proposition can be made (a proposition being anything that’s either true or false). For example: 1 + 1 isn’t a proposition, for it can be neither true nor false. But both 1 + 1 = 2 and 1 + 1 = 3 are propositions (with the former happening to be true and the latter happening to be false).
  4. x^2 + 6 = 5x is such that x = 2 or 3. Consider, though, that x + y = y + x is such that x can be any number and so can y. Thus: For some number(s) x, x^2 + 6 = 5x. And for all numbers x and y, x + y = y + x. (See the code sketch after this list.)
  5. A pair of propositions in English analogous to the foregoing: (1) “For some American people x, x’s parents are from America.” (2) “For all Japanese people x, x’s parents are from Japan.”
  6. Another pair: (1) “For some species of birds x, the prototypical x can fly.” (2) “For all species of fish x, the prototypical x can swim.”
  7. I’ll also need to define the vocabulary 1, 2, 3, 4, 5, 6, 7, 8, 9, and 0, along with the grammatical system inherent in putting, say, 1 (⚫︎) before 2 (⚫︎⚫︎), and getting 12 (⚫︎⚫︎⚫︎⚫︎⚫︎⚫︎⚫︎⚫︎⚫︎⚫︎⚫︎⚫︎). That should come before bringing up the operands, the operators, and the relations, along with bringing up quantifiers etc.
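Here’s a minimal Python sketch of items 2 through 4 (the encoding is mine): a relation turns terms into a proposition, a variable is a blank filled in by substitution, and a quantifier says how many substitutions must make the proposition true.

```python
def proposition(x: int) -> bool:
    # The relation "=" turns the terms x^2 + 6 and 5x into something
    # that's either true or false.
    return x**2 + 6 == 5 * x

# "For some number(s) x, x^2 + 6 = 5x": true, since x = 2 and x = 3 work.
assert any(proposition(x) for x in range(-100, 101))

# "For all numbers x and y, x + y = y + x": checked here only over a
# finite range; a genuine universal claim needs a proof, not a loop.
assert all(x + y == y + x for x in range(-10, 11) for y in range(-10, 11))

# Item 2: "2x + 3x = 10" is like "2_ + 3_ = 10" with both blanks filled
# the same way; substituting x = 2 everywhere makes it true.
x = 2
assert 2 * x + 3 * x == 10
```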

Right shifting, continued

In the logical language, the joint-attentional frame variant [establish], which is symbolized both linearly and diagrammatically as a square and is one of the four joint-attentional frame variants, will work as follows (see also the code sketch after this list):

  1. In “bicycle [establish] electric,” the joint-attentional frame is established on “bicycle” (whether referentially or categorically) and then “electric” is said about that frame. To translate that into English (if interpreted based on the extralinguistic context of the utterance as referentially definite and singular): “The bicycle is electric.”
  2. In “electric [establish] bicycle,” the reverse is true. The joint-attentional frame is established on “electric (thing)” and then “bicycle” is said about that frame. To translate that into English in the foregoing way: “The electric (thing) is (a) bicycle.”
  3. In “bicycle electric [establish]” or “electric bicycle [establish]”—those two phrases being logically identical to each other—the joint-attentional frame is established on what’s both a “bicycle” and “electric.”
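Here’s a minimal sketch of those three variants in Python, under an encoding that’s entirely my own (a “world” of things, each carrying category labels) rather than anything from the logical language itself:

```python
# A toy world: each thing is just the set of labels that apply to it.
world = [
    {"bicycle", "electric"},
    {"bicycle"},
    {"car", "electric"},
]

def establish(label: str) -> list:
    # Point the joint-attentional "camera" at whatever the label picks out.
    return [thing for thing in world if label in thing]

def say_about(frame: list, label: str) -> bool:
    # Label what the camera is pointing at.
    return all(label in thing for thing in frame)

# "bicycle [establish] electric": frame on the bicycles, then say "electric."
say_about(establish("bicycle"), "electric")  # False here: one bicycle isn't electric

# "electric [establish] bicycle": frame on the electric things, then say "bicycle."
say_about(establish("electric"), "bicycle")  # False here: the car is electric too

# "bicycle electric [establish]": frame on what's both, with nothing said yet.
frame = [thing for thing in world if {"bicycle", "electric"} <= thing]
```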

To compare that to how English and Japanese work:

  1. X口Y is like “X is Y” and XはY
  2. Y口X is like “Y is X” and YはX
  3. XY口 and YX口 are like “X Y is” and “Y X is,” and XYは and YXは

For now, though, let’s analyze only X口Y and YX口. For example, in English and Japanese:

  1. X口Y in English is like “X is Y,” e.g. “(the) cat is black”
  2. X口Y in Japanese is like XはY, e.g. 日本人は時間を守る (“Japanese people keep time,” i.e., are punctual)
  3. YX口 in English is like “Y X is,” e.g. “(the) black cat is”
  4. YX口 in Japanese is like YXは, e.g. 時間を守る日本人は (“the time-keeping Japanese people” plus the topic marker は)

In both English and Japanese, then, the grammar uses the position of X and Y with respect to “is” or は (the reversal of the order of X and Y in the above examples being incidental for the present purpose) in order to differentiate between “using a label in order to point the camera” and “labeling what the camera is pointing at” (to return to the camera analogy, which is, strictly speaking, ideal only in the very limited scope of the speaker talking about something referential in the visual modality).

That is, both English and Japanese use word order as a grammatical tool for the purpose of differentiating between (1) establishing a joint-attentional frame and (2) saying something about that frame.

But how should the logical language handle the non-口 cases, i.e. the other three joint-attentional frame variants? And how do English and Japanese handle those cases?