More questions


Julius Hamilton

Sep 27, 2025, 12:29:18 PM
to bfo-d...@googlegroups.com

Hello,

Thanks to everyone who responded to my last post; I am going to take my time going through the responses.

I am genuinely obsessed with formal ontology, and I'm really pleased that the responses I got are from seasoned minds steeped in these questions for a long time. I feel like I can learn a lot from the people in this group.

To me philosophy is a very mentally dynamic activity. I once wondered whether an unconventional definition of philosophy could be "any intellectual inquiry in which the distinction between the methods of inquiry and the content of inquiry collapses". This is why I feel philosophy is so hard. Many other fields apply routine methods to specific objects, like running a placebo-controlled study for a drug or proving a mathematical theorem. In philosophy, progress is possible but difficult because you can't separate the foundations from the results; sometimes one step forward is actually two steps backward. It's like a "dynamical system": the rules of the game change while you are playing it, and it's very easy to get stranded.

The plus side of doing formal ontology is how easy it is to come up with simple, concrete questions that launch into very abstract ones, which are nevertheless intended eventually to terminate in tangible answers to the original concrete question.

I was just thinking about a formal ontology for characters, sort of like how Unicode is a universal catalogue of symbols. I will show what questions come up as I try to define this ontology in BFO.
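As a concrete anchor for that comparison: Unicode already functions as a crude catalogue-style ontology of characters, in that every code point has a canonical name and a general category. Python's standard `unicodedata` module exposes both (nothing here is hypothetical; these are stable Unicode Character Database properties):

```python
import unicodedata

# Unicode as a rough "ontology" of characters: each code point
# carries a canonical name and a general category code.
for ch in ["A", "ß", "∀"]:
    print(ch, unicodedata.name(ch), unicodedata.category(ch))
# → A LATIN CAPITAL LETTER A Lu
#   ß LATIN SMALL LETTER SHARP S Ll
#   ∀ FOR ALL Sm
```

A BFO-based ontology of characters would presumably go beyond this flat catalogue, e.g. distinguishing the abstract character from its concrete glyph inscriptions.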

In Basic Formal Ontology, my current understanding is:

1. The top level is entity/thing/object.

I wonder, though, whether someone has written up a thorough analysis of what this actually means and why it is justified.

For example: do people feel relatively strongly that this is only because the ontology is capturing/mirroring the structure of the human mind when we think of things? Would someone claim that "entities" map to what are clearly identifiable as "concepts" in the human neural-cognitive-semantic system - that we can ask you to think about a certain word and see neural activity in an fMRI scan reflecting how your mind conceived of that concept? My biggest issue with "entity" as a top-level category is that I believe there are phenomena in the mind that are not "thing-like" and which we don't necessarily "regard", contemplate, or direct attention to. I feel like when we force a non-thing-like phenomenon to be modeled as a thing, our mind is capable of generating such an entity, but doing so changes its original nature; it's a bit of an artificial illusion.

Is it rather an artifact of *language* - a structural necessity, in that language more or less requires us to form discrete entities in order to say something about them? I have considered using the term "referent" instead. I feel it shifts the connotations away from implying that "what we are talking about" has something I am calling, for now, "inherent thing-ness".

For example, in BFO, qualities are "specifically dependent continuants". But how much of this is for its syntactic benefit, as opposed to a metaphysical assertion? A thing can be red. Lately I have been taken with the idea that qualities are modes of things, and that modes are a fundamentally different upper-level type than entity. I can think of an entity or object and imagine it in its red or green form; something about the entity changes in my mind's conception of it. We could use this to claim, "I think there is such a thing as red, but it's not a thing in the same way that inherently thing-like things are." Maybe this line of thought just serves to help someone understand that the top level in BFO should not be mistaken for thing-like things (which are basically independent continuants). I just worry that someone might see "red" as an instance of a subclass of "entity" and think, "Oh, they're talking about redness." And we might need to say, "*No* - redness would be an independent continuant, an abstract concept of pure color as a thing-like thing. There is a difference between red as an SDC and redness as an IC."
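One way to keep the SDC reading of "red" straight is a toy model - plain Python, with class names of my own choosing rather than BFO's official terms or IRIs. The point it illustrates is only this: a specifically dependent continuant is individuated by its particular bearer, so "the red of this apple" and "the red of that apple" are distinct particulars even though they share a label.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class IndependentContinuant:
    """A bearer: a thing-like thing (toy stand-in, not BFO's class)."""
    label: str


@dataclass(frozen=True)
class Quality:
    """A specifically dependent continuant: it exists only as the
    quality *of* one particular bearer, which is part of its identity."""
    label: str
    bearer: IndependentContinuant


this_apple = IndependentContinuant("this apple")
that_apple = IndependentContinuant("that apple")

red_of_this = Quality("red", bearer=this_apple)
red_of_that = Quality("red", bearer=that_apple)

# Same label, distinct particulars: the bearer individuates the quality.
assert red_of_this != red_of_that
```

There is deliberately no `Redness` class instantiated on its own here: in this sketch a quality cannot even be constructed without a bearer, which is the SDC intuition in miniature.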

I wonder if the best solution is to assert that an ontology provides a structure for us to think *through*; we train and learn to remember exactly how we have regarded "redness" in *this* ontology; we are not saying that there really is a thing called redness. In a way, if we did not embrace that an ontology is an optional presentation of reality and instead claimed it was an obligatory representation, then we would just be doing physics and empirical cognitive science and so on. The question "what kind of thing is redness" would be the physical description of photons; the cognitive question "how does the human mind conceive of redness" would be a study of neural systems in the brain. I'm still very confused about this topic, though. Maybe I can summarize the problem as: it is not clear whether the kinds of things we want to express have a good absolute representation that we just need to find, or whether all the upper-ontology categories we use are themselves original conceptual constructs in our minds, which we can use to partially reflect the other things going on in our minds - but BFO is not "above and outside" our minds and thoughts; it's something we constructed and introduced into the rest of our pre-existing thought system.

2. My current working understanding of continuants versus occurrents is that occurrents have "temporal parts", whereas continuants are wholly present at any time at which they exist.
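That distinction can be pictured with a toy sketch (my own, not BFO machinery): an occurrent spans an interval and can be sliced into temporal parts, each itself an occurrent; a continuant has no such slices - it is wholly present throughout.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Interval:
    start: float
    end: float

    def contains(self, other: "Interval") -> bool:
        return self.start <= other.start and other.end <= self.end


@dataclass(frozen=True)
class Occurrent:
    """An occurrent unfolds over a span of time; its temporal parts
    are its restrictions to sub-intervals of that span."""
    label: str
    span: Interval

    def temporal_part(self, sub: Interval) -> "Occurrent":
        if not self.span.contains(sub):
            raise ValueError("a temporal part must lie within the span")
        return Occurrent(f"{self.label} during [{sub.start}, {sub.end}]", sub)


run = Occurrent("a run", Interval(0.0, 60.0))
first_half = run.temporal_part(Interval(0.0, 30.0))
# first_half is itself an occurrent. The runner, a continuant,
# has no "first half" in this sense: she is wholly present at
# every moment of the run.
```

The asymmetry in the sketch - `Occurrent` gets a `temporal_part` method, a continuant would not - is the whole point of the continuant/occurrent split as I currently understand it.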

Anyway, I'm going to wrap up here; I didn't get very far, just warming up various thoughts.

Thank you.

Melissa Clarkson

Sep 28, 2025, 9:48:20 AM
to BFO Discuss
Hello Julius,

I think you will be interested in the book Building Ontologies with Basic Formal Ontology. The introductory chapters address the foundations of realist ontologies. Chapter 1 addresses your question about concepts: realist ontologies strive to represent types of things in reality, based on the best evidence available to us, independently of anyone's mind. Concepts exist in someone's mind as thoughts, mental images, and their understanding of the meaning of terms.

The later chapters walk through the structure of BFO. Chapter 5 has a section on specifically dependent continuants, relevant to your thoughts on "redness".

I have found this book quite helpful and returned to it repeatedly in my research and teaching.

–Melissa Clarkson