I think it's extremely hard to say, but I'd say it's hard to argue for sentience when the neural system is so small.
This is a serious problem in philosophy. The concept of "philosophical zombies" poses a hard problem. The fact is that we could get perfectly "sentient behavior" from creatures/beings without any actual consciousness involved. The sentience of humans is debatable, even though it is readily apparent that we experience qualia.
Unless you're willing to accept that a rock, or a series of silicon chips, or even a massive mechanical computing machine can have consciousness, the blurry edge where neural systems go from being mere electric discharges to vehicles for qualia is an extremely difficult question.
Is there anything we know about the relationship between consciousness/sentience and brain size?
A computer neural network can be arbitrarily large, and there is no evidence it will ever become sentient. We're related to insects, though, and AFAIK share some functions with them, so I can definitely picture their consciousness just being a more basic version of ours. But I'm curious whether there is any evidence or reasoning that consciousness actually emerges once some threshold gets crossed.
I mean, we don't know. We literally don't know where or how consciousness exists. We just know that we have it, and we try to extrapolate backwards.
Is it a product of the prefrontal cortex? Arguably. Is it the product of the mammalian mind? Seems plausible. The product of a small-but-thresholded nervous system? I think many would agree here. Is it the product of any nervous system? Doubtful, but maybe. Is it the product of any electrical system? Who knows, it's possible. Is it an innate part of matter? I mean...
Are insects conscious? I'm skeptical. Are mussels and/or oysters conscious? I'm very skeptical. Are plants conscious? I'm extremely skeptical.
This always used to be my thought as well. Consciousness and intelligence had a clear hierarchy in my mind. Slowly, though, I am not so sure anymore. Watching plants grow in their own time frame in the documentary "The Private Life of Plants", reading about the communication networks of trees in "The Hidden Life of Trees", and having observed nature a lot, this hierarchy starts to fade.
There is a big connection between consciousness and morality. Killing a family of bears to save a human child would be seen as a moral thing to do. We explain this to ourselves by using levels of consciousness: it is okay to kill, enslave, etc. a species with a lower level of consciousness for the benefit of one with a higher level. And humans have the highest level of consciousness.
I believe this is an illusion, or even a coping mechanism that keeps us from admitting that we simply find our own species more important.
Think of it this way: what if there existed an alien race superior to us in intellect and consciousness? Would we be okay with it enslaving us? Would we kill one of our own if it meant saving one of them? I think not. I think the connection between consciousness and morality would disappear.
When consciousness is connected with morality, attributing consciousness to a life form becomes subjective and a costly dilemma.
I think that by just admitting that we prefer the survival of our own kind, it would be easier to see the rich world of many possible conscious entities in this world, be they humans, insects, trees, or even interconnected systems of species.
If we set aside the framing of the trolley problem and only talk about values: given the choice between two human lives and the last two members of some insect species, I would choose not to drive that species extinct.
I agree on the comparison with a consciousness higher than humans' visiting Earth. In most ordinary cases I would protect my fellow humans over a bear or an alien.
But if a human were about to kill a defenceless alien without cause, I would save that alien.
You're conflating consciousness with intelligence. The two are not the same.
One is the experience of qualia, and the ability to hold preferences.
The other is the behaviors associated with making predictions and intentionally manipulating the world to achieve tasks.
We have hyper-intelligent chess engines that cannot seriously be argued to be conscious. We also have small mammalian minds, like koalas, that are dumb as a brick but are very likely conscious.
Some level of intelligence is a prerequisite for experience of qualia.
A rock, for example, is below that threshold. The question then becomes what exactly is the minimum level of perception we are willing to accept. Individual cells respond to their environment indirectly; that's enough to qualify IMO, but I accept people would disagree.
Which just illustrates that people don't generally use philosophical definitions directly. What they mean by qualia isn't subjective conscious experience; it's something else which they can't actually describe, but which just so happens to conform exactly to their preconceived notions.
The idea is that if you had some theoretical mechanical computer built of stone (something trivially producible), and if we believe that any artificial intelligence can be sentient, then it would be trivial to extend this to create something sentient that is nonetheless essentially just a complex stone object.
A single stone is different from a system consisting of large numbers of stones in some non-random configuration.
It’s like the flaw in the “Chinese room” thought experiment. The person following instructions isn’t the room, any more than the walls are. All that thought experiment demonstrates is that a CPU has a more limited perspective than the computer it’s part of.
A specific rock inherently is whatever that rock is. That same rock can represent all finite messages at the same time, in relation to its position along arbitrarily defined number lines. Nothing inherent to the rock changed, as that meaning is external to it.
Consciousness is an actual thing; it's not an idea. The idea that a collection of stone objects somehow springs forth a consciousness when arranged in one pattern versus another... is certainly an odd concept.
Why would you assume consciousness is a thing rather than a property of a system?
Temperature, for example, isn’t a substance. I am not saying consciousness couldn’t be a thing, but that just seems to be an assumption without justification.
As for arrangements of stuff being important: the arrangement of individual pieces can make the difference between a watch and a pile of parts, or between a person and a dead body. Many emergent properties depend on the specific arrangement of components.
>Why would you assume consciousness is a thing rather than a property of a system?
If consciousness were simply the property of a system, then I'd assume it is effectively an illusion. I'm not sure, and I'm not saying it would be de facto wrong; I'm just saying it's absurd for me to conceive of it this way, because if it is the case that it's a byproduct of a system, then there is effectively no self. That may well be true, but it's useless to conceive of it.
Is math an illusion? Math is a property of systems, as is our ability (and our calculators' ability) to do it. I would argue that makes it more real.
When you see an object in reality, you're really making a series of assumptions. What you see is reflected light, and you construct a highly inaccurate but useful model of what that light logically implies. Something external can only be perceived epistemically, as a sufficiently good guess. There's no evidence we're not in a simulation, so it's a toss-up as to whether anything physical exists at all.
Contrast the concept of self-awareness. "I think, therefore I am" is not a guess. It is a certainty, regardless of the broader context, that somewhere, something is a host to my experiences and thoughts.
Tangent: You can extend that a little bit by reasoning that you exist, therefore the things you interact with must exist as something. That doesn't get you past the possibility of the matrix, but it does escape total nihilism.
As I said in my first post “What they mean by qualia isn’t subjective conscious experience, it’s something else which they can’t actually describe that just so happens to conform exactly to their preconceived notions.”
I realized recently that there’s no evidence animals don’t have an inner life, but a wealth of evidence that they do.
I watch animals very closely now, and I see more all the time that they are just like me in most ways. I suppose I used to watch them as a person who eats them, and my comfort with doing so depended on not seeing them as conscious creatures like me. Now that I don’t eat them, it seems there are more similarities than differences.
I suspect if animals could speak, we’d realize they’re remarkably similar to us. This probably sounds ridiculous to many people, and would have to me once too.
A key part of this is not that I’m elevating animals’ consciousness to that of humans so much as lowering humans’. We tend to think we’re exceptional, but I think our exception is probably just intelligence. The rest, I don’t know; I suspect we’re all very much alike.
I am hypo-intelligent, but my coworkers definitely appear unconscious: incomplete, ill-posed statement fragments over chat, or better yet, in needless "sync" meetings.
Insects display unbelievable forms of intelligence to anyone who pays attention.
Even if there is a chance it’s all a mirage, wouldn’t it be better to behave as if these and other creatures have an inner life?
Appealing to the philosophical zombie seems like an excuse to discard the possibility that our actions cause real suffering from their perspective, on the unknowable chance that they don’t mind.