I don't think it's incoherent to make probabilistic claims like this. It might be incoherent to make deeper claims about underlying laws given the distribution itself. Either way, what I think is interesting is that, if we also think there is such a thing as an amount of consciousness a thing can have, as in the panpsychist view, these two things combine into an inverse-square law of moral consideration that matches the shape of most people's intuitions oddly well.
For example: let's say a rock is probably not conscious, P(rock) < 1%. Even if it is, it doesn't seem like it would be very conscious. A low probability of a low amount multiplies out to a very low expected value, and that matches our intuitions about how much moral weight to give rocks.
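The multiplication above can be sketched in a few lines. This is only an illustration of the expected-value reasoning; the probabilities and "degree of consciousness" numbers are made-up assumptions, not claims from the original comment.

```python
def expected_moral_weight(p_conscious: float, degree: float) -> float:
    """Expected value = P(conscious) * amount of consciousness if conscious."""
    return p_conscious * degree

# Hypothetical estimates on a 0-1 scale, for illustration only.
entities = {
    "rock":   (0.01, 0.001),  # low probability, low degree -> tiny expected value
    "insect": (0.60, 0.05),
    "human":  (0.99, 1.00),
}

for name, (p, d) in entities.items():
    print(f"{name}: {expected_moral_weight(p, d):.5f}")
```

Because both factors are small for the rock, its expected weight is far below the others, which is the shape of intuition the comment describes.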
By incoherent I was referring to the internal inconsistencies of a model, not to probabilistic claims. I.e., a model that denies your own consciousness but accepts the consciousness of others is a difficult one to defend. I agree with your statement here.
Thanks for your comment; I enjoyed thinking about this. I learned the estimating-distributions approach from the rationalist/betting/LessWrong folks and think it works really well, but I've never thought much about how it applies to something unfalsifiable.