There are a lot of ways to count, but when it comes to computers there is only binary: 0 and 1. Each one is considered a "bit." That means 1-bit computing gives you two possible values; 2-bit means four values; at 3 bits you double that to eight (2 to the third power, aka 2 cubed).
Keep going exponentially and you eventually reach 32-bit (2 to the 32nd power), which gives you 4,294,967,296 values; 64-bit (2 to the 64th power) gives you 18,446,744,073,709,551,616 values.
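If you want to see that doubling for yourself, here is a quick sketch in Python (the bit widths chosen are just illustrative) that prints how many values each width can represent:

```python
# Each additional bit doubles the number of representable values: 2**bits.
for bits in (1, 2, 3, 8, 16, 32, 64):
    print(f"{bits:>2}-bit: {2**bits:,} values")

# The last two lines show the jump described above:
# 32-bit: 4,294,967,296 values
# 64-bit: 18,446,744,073,709,551,616 values
```

Note that the jump from 32 to 64 bits doesn't double the count; it squares it.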
That's a lot of bits, and the numbers show just how much more powerful a chip that supports higher-bit computing can be. It's a lot more than double.
Every few years, the chips inside computers (even smartphones) and the software running on those chips take a leap forward to support the next bit width. For example: