I think Prof. Hawking is off on this one. AI has yet to come anywhere near awareness, a trait it would need in order to take over humankind. Computers are great at computational tasks, e.g. playing chess and crunching huge data sets. To show the scale of what they would need to do to take over effectively, consider this:
A toddler sees a crayon on the other side of the room. Having previously watched you use the crayon to draw, it walks across the room, picks up the crayon, and starts drawing on the wall.
Now, this task requires a number of physical motor skills, which a robot may feasibly be able to perform. But the initial thought, "I'm going to use that crayon to draw on the wall," is not something a computer can come up with on its own. It requires awareness. A computer would have to be programmed in a top-down way to do this; i.e. it would have to be told by a human to do it.
No computer has passed a Turing test (despite reports to the contrary; the often-cited case is not relevant because the test was framed as a conversation with a 13-year-old Ukrainian boy), and until one does, I wouldn't worry about being overrun by cyborgs. Even if one did, I still wouldn't worry, as passing the test does not equate to awareness, which means humans would still come out on top in any dystopian future where we are at war with computers.