Yes, there is some chance of it happening, but I put that chance as infinitesimal.
I agree with Isaac Asimov that people will be scared enough of machines to demand very strong safeguards. And by the time we get to robots building other robots beyond human review, I highly doubt they will be organisms that want to wage war.
I want you to consider just how unenlightened war actually is.
These robots wouldn't need resources. They wouldn't have an ideology they needed to prove. They might care about us enough to want us to stop hurting each other, and they might defend themselves if we attacked them, but neither of those necessitates mass death or extermination. Nor would they have aggression to work out or sublimate.
Without any of the reasons humans make war, robots would have no need for that level of organized violence.