Detecting a change in a number

People’s attention is limited, and I want to see if there’s been any work done on determining how people notice and act on a change in a number.

Take for example a number on a screen. People may easily notice a 0 changing to a 1 or a 5 to an 8, but may not see a 14 change to an 11.

Or at least, I find that I may miss that unless I am paying very close attention. I think people notice relative change, but I’ve no idea, and my searches keep turning up change blindness, which I’m not sure is what I’m looking for.

Is there a rule of thumb for this? It feels like it should be tied back to the scale of the change, but…

Change blindness essentially scales with how small the changed region of the visual field is and how far it sits from the center of attention: the smaller and more peripheral the change, the more likely it is to be missed. It happens way upstream of any processing of the semantic content of the change.

Having said that, a number that changes from 1 or 2 digits to 5 digits is going to stand out more than one that stays at 5 digits while changing value. Simply because the shape of the as-yet-meaningless blob of “retina pixels” has changed, which triggers your shape recognizers. Which may in turn trigger further processing downstream that eventually plays out as conscious recognition of the changing magnitude of a number.


You might usefully look up Hamming distance, which is a measure of the “distance” between two points in an encoding scheme. In some fonts a 1 and a 7, or a 3 and an 8, look very similar, whereas an 8 and a 1 look quite different.

So you could make some arguments, testable empirically, about which digits shifting between which values (visual shapes, actually) would be most or least likely to be spotted, all else being equal.
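
To make that concrete, here’s a rough Python sketch that ranks single-digit substitutions by a Hamming-style “glyph distance.” The seven-segment encoding is an assumed stand-in for glyph shape, so treat the exact numbers as illustrative only; real font outlines would score differently, but the ranking idea is the same.

```python
# Crude "visual Hamming distance" between digits, using a seven-segment
# encoding as an assumed stand-in for glyph shape (real fonts will differ).
SEGMENTS = {                 # a=top, b=top-right, c=bottom-right, d=bottom,
    "0": "abcdef",           # e=bottom-left, f=top-left, g=middle
    "1": "bc",
    "2": "abdeg",
    "3": "abcdg",
    "4": "bcfg",
    "5": "acdfg",
    "6": "acdefg",
    "7": "abc",
    "8": "abcdefg",
    "9": "abcdfg",
}

def glyph_distance(x: str, y: str) -> int:
    """How many segments differ between the shapes of two digits."""
    return len(set(SEGMENTS[x]) ^ set(SEGMENTS[y]))

for pair in [("1", "7"), ("3", "8"), ("6", "8"), ("1", "8")]:
    print(pair, glyph_distance(*pair))
# ('1', '7') 1  and  ('6', '8') 1  -- very similar shapes, easy to confuse
# ('3', '8') 2
# ('1', '8') 5  -- very different shapes, hard to miss
```

A table of all 45 digit pairs built this way would give a first-pass, testable prediction of which substitutions are most and least likely to slip past a reader.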


“Chunking” also plays a role in the process of recognizing and converting a string of digits into a numeric value. Part of why transposition errors are both common and hard to spot is that “chunking” recognizes groups of digits as groups.

In Hamming terms the glyphs “69” and “96” are very close together despite being very different numerically. The perceptual error is happening at the chunking and chunk recognition / remembering level, far below the perceptual level where any meaning is assigned to any of this. In fact even before the glyphs are recognized as representing numerals.

If you mean the numeral, why wouldn’t they notice both equally? In both cases a single digit is changing.

It’s bad enough when the human visual processing system fails to spot a subtle bug like this. But compound that with photocopiers that introduce subtle mis-digits into the copies they produce! Imagine a page full of digits with a few digits altered. Spot that if you can!

Who here remembers the Xerox Clusterfuck of 2013, in which computer scientist David Kriesel caught some Xerox copiers doing just that?

Sample fragment of a document, showing original and Xerox result:

What happened? The copier used a compression algorithm to scan and store the document, using a “chunking” algorithm (apparently not unlike the “chunking” discussed by @LSLGuy just above). When it found similar chunks in multiple places in the document, it just stored the compressed version of the chunk once, with each instance in the document stored simply as a link to the stored chunk. Except sometimes it mis-recognized a chunk and printed a different chunk in the copy. Oops.
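
(Per Kriesel’s report, the culprit was the pattern-matching mode of JBIG2 compression.) As a toy illustration of the mechanism, with made-up “patches” and a made-up threshold rather than anything resembling the real codec, the failure looks roughly like this in Python:

```python
# Toy sketch of pattern-matching-and-substitution compression: store each
# distinct-looking patch once, and turn anything "close enough" to a stored
# patch into a reference to it.  Patches and threshold are invented for
# illustration; this is not the actual JBIG2 algorithm.
def similarity(a: str, b: str) -> float:
    """Fraction of matching 'pixels' between two equally sized patches."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def compress(patches: list[str], threshold: float) -> tuple[list[str], list[int]]:
    dictionary: list[str] = []
    references: list[int] = []
    for patch in patches:
        for i, stored in enumerate(dictionary):
            if similarity(patch, stored) >= threshold:  # "close enough" -> reuse
                references.append(i)
                break
        else:                                           # genuinely new patch
            dictionary.append(patch)
            references.append(len(dictionary) - 1)
    return dictionary, references

def decompress(dictionary: list[str], references: list[int]) -> list[str]:
    return [dictionary[i] for i in references]

# Pretend these 9-'pixel' patches are scans of the digits 6, 8, 6.
scanned = ["111101111", "111111111", "111101111"]
dictionary, refs = compress(scanned, threshold=0.85)
print(decompress(dictionary, refs))
# The "8" patch differs from the stored "6" patch by only one 'pixel', so it
# gets matched to it, and the decompressed copy silently reads 6, 6, 6.
```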

What could go wrong?

Engineering and construction documents.
Invoices and other financial documents.
Medical prescription documents.

If only we had that useful vBulletin smilie: :eek: (ETA: It occurs to me that codinghorror’s avatar will do quite well.)

Article in The Register:

Kriesel’s full report on the matter:

He displays this error sample, among others. Can you spot the errors? (Two digits are changed.)

Original:

Copy:

A 6 became an 8 on the first line and on the third line. Kriesel admits that he himself didn’t even see the error on the first line until someone pointed it out to him.

The situation I have is several blocks of digits, where the numbers range from 0 to 999. In the course of normal operations there will be some value in each block.

Note that these are just one component of a larger UI, so normally these blocks are positioned along the top of the screen.

The question I have comes down to: “Is a change of X steps (or maybe a given ratio of final to initial value) more noticeable than a smaller change, and how can this be characterized?”

True. But I think people would notice a 1 changing to a 2 more than an 11 changing to a 12. Same step, same change in a single digit, but a different relative change: 100% vs. 10% (more or less :slight_smile: ).

I think the answer to your question of whether a bigger step (or ratio) is inherently more noticeable is no. The context is much more important than the numbers involved. What other information is the observer observing? How meaningful are the numbers, and a change in those numbers, to them? How often will they be checking the numbers, and how intently? Etc., etc.

IMO … you can start by abandoning the misconception that the numeric value has anything to do with the noticeability. It does not.

What matters is (roughly) the number of pixels changing within the space occupied by the 3-digit number, multiplied by some factor for how different the shapes of the changing glyphs are. There may well be an effect where the leading or trailing digit changing is slightly more noticeable than the middle digit changing. But that’s due to geometry, not due to our powers-of-10 numerical notation.

In my business we have similar short alphabetic abbreviation fields that are displayed peripheral to the main control info, and that are static for minutes at a time. But when they do change, it’s important that the difference be noticed promptly and incorporated into the operators’ awareness of the system state.

Typically when a change occurs the new data is either shown in a different color, is flashed, or is surrounded by a flashing box for a few seconds (3-10). At the end of the timer it returns to the standard color for that field with no flashing or boxing.

The whole and entire point of this maneuver is to create a geometrically larger difference that has some hope of penetrating our evolved filters, which are designed to dismiss minor changes in the periphery of our zone of attention.

You may well be able to do something similar. Depending on what action the user should take upon a change, you may want the highlighting to apply only to numeric value changes above a certain threshold (absolute or percentage), or have the nature and degree of the highlighting convey a subjective degree of urgency / importance connected to the magnitude of the change.
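
For what it’s worth, here’s a minimal Python sketch of that thresholding idea. The thresholds, durations, and field names are all assumptions to be tuned for the actual application, not recommendations:

```python
# Flag a change for temporary emphasis only when it crosses an absolute or
# relative threshold; bigger changes get a longer / more aggressive flash.
# All numbers here are placeholders, not recommendations.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Highlight:
    flash: bool      # flash / box the field, vs. just a brief color change
    seconds: float   # how long before reverting to the normal presentation

def classify_change(old: int, new: int,
                    abs_threshold: int = 10,
                    rel_threshold: float = 0.10) -> Optional[Highlight]:
    """Decide how much emphasis (if any) a value change deserves."""
    if old == new:
        return None
    step = abs(new - old)
    ratio = step / max(abs(old), 1)            # avoid dividing by zero
    if step >= abs_threshold or ratio >= rel_threshold:
        return Highlight(flash=True, seconds=10 if ratio >= 0.5 else 3)
    return Highlight(flash=False, seconds=3)   # small change: subtle cue only

print(classify_change(11, 12))  # below both thresholds -> brief, subdued cue
print(classify_change(11, 21))  # crosses both thresholds -> flash for 10 s
```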

All this is Control Panel Design 101.

Well lack of a rule of thumb means I can’t be wrong. Or too wrong. :slight_smile:

I’m not sure if this has anything to do with the OP or not, but the topic of “detecting a change in a number” made me think of Gray code. It’s “an ordering of the binary numeral system such that two successive values differ in only one bit.” (Wiki.) It is commonly used by digital encoders.
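
For anyone curious, the binary-reflected Gray code conversion is a one-liner; a quick Python sketch shows the “successive values differ in only one bit” property:

```python
# Binary-reflected Gray code: successive values differ in exactly one bit,
# which is why rotary encoders use it (no ambiguous multi-bit transitions).
def to_gray(n: int) -> int:
    return n ^ (n >> 1)

for n in range(8):
    print(n, format(to_gray(n), "03b"))
# 0 000, 1 001, 2 011, 3 010, 4 110, 5 111, 6 101, 7 100
```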

Which is doubly funny given the OP’s similar but slightly different username. A difference worth noticing but easy to overlook.

Right, and I’m saying I think you are wrong about that.

The fact that the numerical change is bigger is irrelevant to whether the visual change is bigger.

If you have a 1 on a screen and it changes to a 2, it shouldn’t matter if there was a 1 just to the left of it or not. As long as the numerals are the same size and they all fit in your normal field of view, then they should be equally easy to see. As you get to more and more digits, it does become somewhat harder, because you have to pay attention to more places where a change might occur, but with just a handful of digits, that’s irrelevant.

Is 11 changing to 21 a more noticeable change than 11 changing to 12? I would say that it is not, despite one being a bigger numerical change than the other. The amount of visual information changing is exactly the same.

In my defence I’m assuming people are lazy and inattentive :slight_smile:

I think I’ll see if I can rough up a pixel/count type model and play with it and see how it works out.
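
Something like the following Python sketch might be a starting point. It reuses the seven-segment stand-in from upthread and just counts how many segments flip anywhere in a 0-999 block; counting changed pixels in a rendering of your actual font would be more faithful, but the structure would be the same.

```python
# Rough "how much ink changed" score for a 3-digit block, using a seven-segment
# encoding as an assumed stand-in for glyph shape.
SEGMENTS = {
    "0": "abcdef", "1": "bc", "2": "abdeg", "3": "abcdg", "4": "bcfg",
    "5": "acdfg", "6": "acdefg", "7": "abc", "8": "abcdefg", "9": "abcdfg",
    " ": "",   # a blank position (a leading digit appearing or disappearing)
}

def visual_change(old: int, new: int, width: int = 3) -> int:
    """Total segments that flip anywhere in the block when old -> new."""
    a, b = str(old).rjust(width), str(new).rjust(width)
    return sum(len(set(SEGMENTS[x]) ^ set(SEGMENTS[y])) for x, y in zip(a, b))

for old, new in [(1, 2), (11, 12), (11, 21), (14, 11), (99, 100)]:
    print(f"{old:>3} -> {new:<3} changes {visual_change(old, new)} segments")
```

In this model 1 → 2, 11 → 12, and 11 → 21 all score the same (5 segments), while 14 → 11 scores only 2, which lines up with both the pixels-changed argument above and the 14 → 11 example from the top of the thread.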

Yes, exactly. They are inattentive, because human perception systems are designed to ignore anything unchanging and most things undergoing subtle change. Continuous change, like ever-fluctuating, continuously updating numbers, is quickly filtered out too, just as moving leaves on trees are.

Lazy doesn’t (necessarily) have anything to do with failures that occur deep inside the unconscious part of humans’ brains / minds. Though for damn sure, if they’re lazy and have their feet up reading a magazine that obscures their view of your display, it doesn’t matter how exciting you make it: they won’t see it, because they can’t see it through the magazine.