Hardly: as long as you have discrete numbers, digital logic applies. The use of binary arithmetic in computers is an accident of engineering, not an essential part of computation theory. Two mathematical formalisms are used to define computation, the Turing machine and the lambda calculus, and they have been proven expressively equivalent: anything one can compute, the other can compute too.
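To make "expressively equivalent" a little more concrete, here is a minimal sketch of the lambda-calculus side in Python: Church numerals encode numbers as nothing but functions, yet ordinary arithmetic falls out of function composition. The names (`church`, `to_int`, `add`, `mul`) are my own illustrative choices, not anything standard.

```python
def church(n):
    """Encode the integer n as a Church numeral: a function that
    applies f to x exactly n times."""
    return lambda f: lambda x: x if n == 0 else f(church(n - 1)(f)(x))

def to_int(c):
    """Decode a Church numeral by counting how many times it applies
    a successor function to 0."""
    return c(lambda k: k + 1)(0)

# Arithmetic defined purely in terms of function application.
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
mul = lambda m: lambda n: lambda f: n(m(f))

print(to_int(add(church(3))(church(4))))  # 7
print(to_int(mul(church(3))(church(4))))  # 12
```

A Turing-machine formulation would compute the same results with a tape and state transitions instead of functions; the equivalence proof is exactly the statement that each style can simulate the other.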
Quantum computing may give us things like massive parallelism, which will change how we engineer computers and shift the tradeoffs. Cryptography is the obvious example: our current public-key cryptography rests on the assumption that factoring large numbers takes an impractical amount of processor time, and if quantum computers can factor large numbers quickly (as Shor's algorithm promises), cryptography will have to change. But unless something happens that makes a computer capable of calculating things that cannot be expressed by a Turing machine or the lambda calculus, it's not going to make a theoretical difference.
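To show that dependency concretely, here is a toy RSA round-trip in Python in which factoring the public modulus immediately yields the private key. The numbers are deliberately tiny so that trial division works; this is a sketch of the assumption cryptography makes, not usable cryptography.

```python
def trial_factor(n):
    """Factor n by trial division -- feasible only because n is tiny.
    A quantum computer running Shor's algorithm could do this step
    quickly even for the enormous moduli used in practice."""
    d = 2
    while n % d != 0:
        d += 1
    return d, n // d

# Public key: modulus n = p * q and exponent e (textbook-sized values).
p, q = 61, 53
n = p * q            # 3233
e = 17

# Anyone can encrypt with the public key alone.
m = 42
ciphertext = pow(m, e, n)

# An attacker who factors n can derive the private exponent d...
p2, q2 = trial_factor(n)
phi = (p2 - 1) * (q2 - 1)
d = pow(e, -1, phi)  # modular inverse; requires Python 3.8+

# ...and recover the plaintext.
print(pow(ciphertext, d, n))  # 42
```

Real RSA is safe only because nothing like `trial_factor` is feasible for 2048-bit moduli; fast quantum factoring would remove exactly that barrier, which is why the cryptography would have to change while the theory of what is computable stays put.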