It must indeed be an intoxicating thought to conclude that, of all the great minds who have considered this question, you alone have got it right. While it is, at a certain level of abstraction, possible for you to be uniquely right on this pretty basic point, you don’t seem to have a sense of how radically unlikely that is.
You seem to be approaching the real number system as though it were an object in the real world, like the Himalayas or some such, about which truths can be determined by empirical methods of discovery. The real number system consists purely of intellectual objects manipulated according to intellectual rules, all of which objects and rules exist only as a result of their a priori definitions. You can't "discover" a number between 0.999… and 1.0, because a consequence of our definitions of these things is that no such number exists. Your attempt to do so is simply notational abuse. The number you have proposed resolves to zero: substituting 1.0 for 0.999…, it becomes (1.0-1.0)/2 + 1.0-1.0 = 0.
This is not an "assumption"; it is a consequence of the definitions.
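To spell out what "consequence of the definitions" means here (this is just the standard textbook derivation, nothing specific to your post): the notation 0.999… is defined as the limit of the sequence 0.9, 0.99, 0.999, …, that is, as the sum of an infinite geometric series, and that limit is exactly 1:

\[
0.999\ldots \;=\; \sum_{n=1}^{\infty}\frac{9}{10^{n}}
\;=\; \lim_{N\to\infty}\sum_{n=1}^{N}\frac{9}{10^{n}}
\;=\; \lim_{N\to\infty}\bigl(1-10^{-N}\bigr)
\;=\; 1.
\]

The difference 1 − 0.999… is therefore exactly 0, which is why there is no room for any number strictly between them.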
I well recall, as a child, having division explained to me as the number of times one number will fit into another. This was a rough approximation of the principle for childish ears, but it led me to become very frustrated with teachers who said that division by zero was undefined. It seemed intuitively obvious that zero "went into" 1 as many times as you wanted it to, in other words infinitely many times. Thus, to me, 1/0 = infinity, and ordinary algebra naively applied turns this into 1/infinity = 0.
But the childish definition of division is inadequate for an adult. Infinity is not a number at all, and zero, though a number, cannot be plugged into algebraic equations as a divisor the way 6 and the other ordinary numbers can. Division by zero is left undefined because admitting it leads to inconsistency that makes the entire enterprise valueless: 2 × 0 = 3 × 0, but applying ordinary algebraic rules and dividing both sides by zero, as if zero worked the same way as, say, 6, results in 2 = 3. The same thing happens when you manipulate infinity that way, treating it as though it were a number of the same order as 6. For that reason, it is a mistake to think of 1/infinity as meaningful; infinity simply cannot be plugged into processes like that. Put another way, if 0.999… ≠ 1.0, there would have to be a positive difference 1 − 0.999…, which could only happen if the string of 9s stopped after finitely many places; the "infinite" expansion would in effect have collapsed to a finite one, making the concept incoherent, and that incoherence manifests itself in the legitimising of "proofs" like those above that 1 = 2.
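As a loose computational illustration of that point (an aside of my own, using Python's IEEE-754 floats rather than the real numbers, but the failure mode is the same once you pretend infinity is just another number):

import math

inf = math.inf

# Adding a finite amount changes nothing, so "cancellation" is meaningless.
print(inf + 1 == inf)      # True
print(inf - inf)           # nan: infinity minus infinity is undefined
print(inf / inf)           # nan: you cannot "cancel" infinities

# Both sides are infinite, yet "dividing through by infinity"
# would purport to prove 2 == 3, mirroring the division-by-zero fallacy above.
print(2 * inf == 3 * inf)  # True

Floating-point infinity is only a crude stand-in for the mathematical concept, but it makes the same point: the moment you feed infinity into rules designed for finite numbers, the rules stop returning coherent answers.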
And that is one among many ways of identifying your error. Buried in all of your thinking is the idea that infinity is a mathematical object in the same class as finite numbers, so that ordinary algebraic rules apply to it. They don't. You can't deal algebraically with the concept of infinity in the way you are attempting to, any more than you can divide by "red" or take the square root of "health". That does not mean you can't do any maths with infinity, any more than the fact that you can't divide by zero means you can't do anything with zero. It just means you can't treat it as you would wish to.