I'm not saying this will happen, but it seems to me like an incredibly silly move.
We barely understand how consciousness works, so we should stop talking about "AGI". It's just empty, ridiculous techno-babble. Sorry for the harsh language, but there's no nice way to drive this point home.
Also, I hope my response to tempestn clarifies a bit more.
Edit: To be more explicit about what I mean by "nuance": see Stuart Russell. Check out his book, "Human Compatible". It's written with cutting clarity, restraint, thoughtfulness, and simplicity (not to be confused with "easy"!), and it's an absolute delight to read. It's excellent science writing, and a model for anyone thinking of writing a book in this space. (See also Russell's principles for "provably beneficial AI".)