As voyager was hinting - all modern digital logic is designed at the analog level. If you want it to go fast and/or not chew massive amounts of power, and be cost competitive, you have no choice. The simplistic idea of 1s and 0s just isn’t how it happens; that is an abstraction given to the higher-level logic designers. But the people at the coal face, the ones actually making these things work, deal with analog systems, and wrestle with them to make the binary abstraction stable enough to rely on.
Modern computer systems are designed out of binary logic not due to any fundamental limitation, but because engineering them this way turns out to be the best way of coping with a wide range of competing constraints. Early digital computers were not necessarily binary, with ternary logic being tried, amongst others. There remain some interesting possible advantages in ternary even now.
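(For the curious, the ternary advantage usually cited is balanced ternary, where each digit is -1, 0, or +1, so negative numbers need no separate sign convention and negation is just flipping every digit. A minimal Python sketch of the idea, purely for illustration:

```python
def to_balanced_ternary(n: int) -> list[int]:
    """Convert an integer to balanced ternary digits (-1, 0, +1),
    least significant digit first."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3            # remainder in {0, 1, 2}
        if r == 2:           # represent 2 as -1 plus a carry
            r = -1
            n += 1
        digits.append(r)
        n //= 3
    return digits

print(to_balanced_ternary(8))   # [-1, 0, 1]: -1*1 + 0*3 + 1*9 = 8
print(to_balanced_ternary(-8))  # [1, 0, -1]: every digit flipped
```

Elegant on paper; whether it pays off in silicon is exactly the engineering question below.)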
Nobody has mentioned analog computers until now. Before the advent of cheap digital computers, every engineering lab would have an analog computer. No problem at all representing values to quite reasonable precision.
Storage of a multi-level value can be trivially managed - and you almost certainly already own a device that stores millions of analog values on a chip, and manages the transfer of those values between storage cells. I’m referring to the CCD sensor in a digital camera. Each pixel holds a voltage proportional to the light that fell on the photosensitive part of the cell. When the exposure is done, these voltages are transferred cell by cell across the face of the chip, and fed to an amplifier and digital encoder when they reach the edge. CCDs can hold the image for quite some time before readout. When CCDs were first invented, people imagined that they could form the basis of all sorts of interesting circuits. The fundamental storage element - basically a capacitor - is trivial; similar capacitors form the basis of the dynamic memory in all modern computers.

It isn’t a big deal to imagine taking a variant of modern digital chip technology and making a multi-value logic system from it. The question is not whether it is feasible, but whether it will result in something superior to implementing the same functionality in conventional digital logic, where the different values are represented by aggregating bits. If your multi-value system can reliably represent 8 discrete levels, I only need three bits to do the same job (2³ = 8). Freed of the constraint of maintaining those 8 values, and only needing to maintain 2, my final design might be smaller, faster, and cheaper, even if it needs three times as many components. Overall this has been the experience of the last 60 years of computer design. It could change in the future, but there is little concrete on the horizon that portends such a change.
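To make the trade-off concrete, here is a toy Python calculation (my own illustration, not from any particular design) of how many binary cells are equivalent to one multi-level cell:

```python
import math

def binary_cells_needed(levels: int) -> int:
    """Number of 2-level (binary) cells needed to represent at least
    as many distinct states as one multi-level cell."""
    return math.ceil(math.log2(levels))

for levels in (2, 3, 4, 8, 16):
    print(f"{levels:2d} levels per cell == {binary_cells_needed(levels)} bits")
# 8 levels == 3 bits: if each binary cell is more than 3x cheaper,
# smaller, or faster than the 8-level one, binary wins.
```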
About now, conversations tend to need my standard tirade about information. There is always someone who will pipe up with the notion that an analog signal contains infinite information, compared to a digital signal’s limited number of levels. This is fundamentally not true. Our universe is noisy. There is nowhere you can go, and no technology you can apply, that will remove all the noise. Where there is noise there is uncertainty in the signal, and the range of values that can be usefully represented is intrinsically limited. This rather neatly gets us back to Shannon.
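Shannon’s channel capacity theorem pins this down: a channel of bandwidth B with signal-to-noise ratio S/N can carry at most C = B · log2(1 + S/N) bits per second, no matter how cleverly it is encoded. A quick back-of-envelope in Python (the channel numbers are just an example I picked):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + S/N) in bits/second."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 1 MHz analog channel with a quite respectable 30 dB SNR:
c = shannon_capacity(1e6, 30)
print(f"{c / 1e6:.2f} Mbit/s")  # ~9.97 Mbit/s - finite, not infinite
```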
People get rather surprised that the information that can be represented in an analog signal is measured in bits, and that it may contain a non-integer number of bits. This measure allows us to directly compare the information content of signals no matter whether they are digital or analog. (And it is worth adding the note that “analog” is a travesty of a word. What we colloquially call analog systems are more properly termed “continuous”; digital systems are analogues just as much as any other. But the usage has become commonplace and there is no point fighting it.)
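To see the non-integer bits appear: a single sample passed through Gaussian noise can convey at most ½ · log2(1 + S/N) bits (the per-sample form of Shannon’s Gaussian channel result), and that is almost never a whole number:

```python
import math

def bits_per_sample(snr_linear: float) -> float:
    """Information one sample can carry through Gaussian noise:
    0.5 * log2(1 + S/N) bits per sample."""
    return 0.5 * math.log2(1 + snr_linear)

for snr in (1, 10, 100, 1000):
    print(f"S/N = {snr:4d} -> {bits_per_sample(snr):.2f} bits/sample")
# S/N = 100 gives ~3.33 bits: between the 3 bits of an 8-level cell
# and the 4 bits of a 16-level one.
```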
The point for the OP: there is a well-defined equivalence between all the different implementations you can imagine for a thinking brain. Anything from numerical simulation of the electrochemistry of a real brain, to simulation of the neuron/synapse abstraction in a range of manners (conventional computer programs, or bespoke computer systems also running code to simulate the abstraction), down to custom digital or analog circuits that directly represent the abstraction. The choice is a matter of engineering tradeoffs, nothing more. Some ways get results quicker, some leverage existing experience and technologies to good effect, and others may actually turn out to be superior in the long run but may require serious investment to overcome the technological lead conventional technologies enjoy.
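For a flavour of what “simulating the neuron/synapse abstraction in a conventional program” means, here is a minimal leaky integrate-and-fire neuron in Python - one of the simplest abstractions in use, with parameter values that are purely illustrative:

```python
def lif_neuron(inputs, threshold=1.0, leak=0.9, weight=0.3):
    """Leaky integrate-and-fire: the membrane potential decays each
    step, accumulates weighted input, and emits a spike (then resets)
    when it crosses the threshold. Returns the spike train."""
    potential, spikes = 0.0, []
    for x in inputs:
        potential = potential * leak + weight * x  # leak, then integrate
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0                        # reset after firing
        else:
            spikes.append(0)
    return spikes

# A steady input drives periodic firing - a spike every fourth step:
print(lif_neuron([1] * 20))
```

The same abstraction could equally be baked into a custom digital circuit or an analog one; which you choose is, again, pure engineering tradeoff.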
Eventually all of this is arguing about the possibility of creating an existence proof of an artificial thinking brain. We assume that a silicon brain that has been engineered to exactly mimic a human brain, and that then exhibits the same external behaviour as a wet-ware brain, is a very good indicator that the silicon brain has captured everything needed to be a self-aware thinking being. Since we have not engineered it with modern AI “tricks”, and have only copied the innate structures we found in a real brain, we can avoid claims of “tricks” that are designed to blindly mimic but don’t actually think. With such an existence proof we can dismiss the ghost in the machine, and look towards more interesting, non-human minds.