Today's conventional digital computers are limited by their inability to process data that isn't broken down into binary form. A circuit stores a "bit" of information by either holding a charge on a transistor or not, which the machine reads as a one or a zero. Continual advances in photolithography, the technology for printing circuits on semiconductor wafers, have allowed the number of transistors on a chip, and with it computing power, to double roughly every two years, a trend known as Moore's Law.
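To make that reduction to ones and zeros concrete, here is a minimal Python sketch (purely illustrative, not from the original article) showing how even a single character is stored as a string of bits, each bit corresponding to a charge that is either present or absent:

```python
# Every piece of data a digital computer handles is reduced to bits.
# Here, the character "A" becomes the integer 65, which in turn is
# stored as eight ones and zeros (charges present or absent).
value = ord("A")             # "A" maps to the integer 65
bits = format(value, "08b")  # its 8-bit binary representation
print(bits)                  # prints: 01000001
```

Every operation the machine performs, no matter how sophisticated, ultimately manipulates strings like this one.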