We have no idea how to give anything a subjective experience of itself. We do, however, know how to make something behave, outwardly, as if it has one.
One of the worst versions of AGI might be a system that convinces us it has an inner life while in reality having no subjective experience of itself at all.