Some extra points:
(1) The first electronic computer I know of, a horrendously huge thing using valves, was actually decimal. It was still based on the concept of on/off rather than ten levels of "on-ness", but it used a ring of 10 bits, only one of which could be on at a time, to represent a decimal digit. It was called ENIAC. It's easy to be critical in retrospect, but those 10 bits could have stored one of 1024 values instead of just one of ten.
(2) Analogue computers did work. The Bush differential analyzer is probably the best known, but there have been numerous analogue computing devices over the years. Mostly they amount to a mechanical expression of a mathematical equation, with an ability to use it to calculate or plot values. In a sense, orreries (mechanical models of the solar system) were analogue computers.
(Extending that, the gearing between the minute and hour hands of your watch is probably one of the most common calculating engines in existence - even if it merely divides by 12.)
(3) Maybe the reason binary is so popular is that it works so well, not just in electronics but also in Boolean algebra/logic. Have a think about the philosophy of a ternary computer and its logic! What do you do with three values? 0=No, 1=Maybe, 2=Yes? What's the result of an or operation? Yup, you can work this sort of thing out: No or No = No; anything or Yes = Yes; of the remainder, anything or Maybe = Maybe.
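As a rough sketch of that "maybe" logic (the 0/1/2 encoding and the max/min rules here are just my illustration - essentially Kleene's three-valued logic - not how any real machine does it):

    # Three-valued logic: 0 = No, 1 = Maybe, 2 = Yes.
    NO, MAYBE, YES = 0, 1, 2

    def tri_or(a, b):
        # "No or No = No; anything or Yes = Yes; anything or Maybe = Maybe"
        # falls out as simply taking the larger of the two values.
        return max(a, b)

    def tri_and(a, b):
        # The dual rule: anything and No = No, and so on.
        return min(a, b)

    def tri_not(a):
        # No and Yes swap; not-Maybe is still Maybe.
        return 2 - a

    assert tri_or(NO, NO) == NO
    assert tri_or(MAYBE, NO) == MAYBE
    assert tri_or(MAYBE, YES) == YES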
But there are other ways you could interpret the three values: 0=No, 1="Yes, I just know it", and 2="Yes, and I can prove it". This can change how the logic ought to work.
Also, it gets harder to work out how to build an adder from logic gates (a very simple thing in binary, where the sum bit is just XOR and the carry is AND). For our logical or, 1 or 1 = 1, but an adder needs to recognise that 0+2 gives the same result as 1+1.
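For concreteness, a one-digit ternary half adder would have to behave like this (again just an illustrative sketch in ordinary arithmetic, not a gate-level design):

    # One-digit ternary (base-3) half adder: inputs are digits 0, 1 or 2.
    def tri_half_add(a, b):
        total = a + b                   # ranges over 0..4
        return total % 3, total // 3    # (sum digit, carry digit)

    # 0+2 and 1+1 both give sum 2 with no carry...
    assert tri_half_add(0, 2) == (2, 0)
    assert tri_half_add(1, 1) == (2, 0)
    # ...which is exactly the case the gates must handle, whereas
    # the logical tri_or(1, 1) above would give just 1.
    assert tri_half_add(2, 2) == (1, 1)  # 2+2 = 11 in base 3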