Unsubstantiated? You need to do some reading. Innovators like Elon Musk and Bill Gates have spoken and written on the AI issue for years. You remind me of a blacksmith seeing his first car: pure denial. True AI will change every way we live, and we're in the embryonic stages of development, just as we were when the first PCs hit the market not so long ago. Show me an industry that has not changed due to computers, and it will probably be one that requires minimal education and a pool of cheap labor.
"The object of teaching a child is to enable the child to get along without the teacher. We need to educate our children for their future, not our past." Arthur C. Clarke.
I think you may be misinterpreting the hype. As I read Musk's statements on AI, true AI is something to be fearful of because it is largely unpredictable; once it is established, there is no basis to believe it can be controlled by humans. The premise of Terminator is AI run amok, which, although fictional in the movie, is entirely plausible. I predict that within the next 10 years (well before we can achieve true AI) there will be a major cyber attack or disruption built on the internet of things and its interconnections, one that causes a huge paradigm shift in how we weigh the benefits of networking everything together against the possible bad outcomes. Up until now we have seen a predominant benefit, but because of the sheer power of such systems, a huge catastrophe is possible; it is only a matter of time.

Once we see the consequences of that event, and put it together with AI being an even greater power, I think we will logically determine that the AI singularity must be prevented because it is too dangerous. This goes well beyond the threshold of the nuclear era we have been living in for the past 70 years. Think about how Hiroshima and Nagasaki changed the world; we have not yet had the equivalent event for networks and computers. Black Friday and the flash crash are just minor glitches that hint at what could happen.
I think an entirely plausible scenario is that, for practical ethical reasons, we decide AI cannot be allowed because it is too much of a Pandora's box. So it is hype in the sense that there is a strong possibility that, even if it is possible, we never use it.
True AI would basically be the equivalent of a god-like intelligence. Even though it would be based on human technology, at some level following it would be no different, philosophically, from following a religion. In this respect it's a weird paradox.