The issue is that we use the same keyboard symbol for unary (single-operand) negation of a value and for the negative sign on a literal number, i.e. a string of digits. Those are legitimately different operations with significantly different precedence. Converting a digit string into an arithmetic value is about the highest precedence there is; unary negation, by contrast, sits at the lowest precedence level, alongside dyadic (two-operand) addition and subtraction.
The string “-9” does not mean “perform unary negation on the positive integer nine”, nor does it mean “perform unary negation on the entire expression to the right of the ‘-’”. Because it is a hyphen followed by digits, the string “-9” means “the negative version of that digit string”.
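Python is a handy illustration of the one-glyph situation: it has no negative literals at all, so a leading “-” is always the unary negation operator, and exponentiation binds tighter than it. The grouping therefore depends on explicit parentheses:

```python
# Python has no negative literals; '-' is always unary negation,
# and '**' binds tighter than unary minus.
a = -9 ** 2    # parsed as -(9 ** 2), i.e. negative eighty-one
b = (-9) ** 2  # parentheses force negation first: eighty-one

print(a)  # -81
print(b)  # 81
```

This is exactly the “seemingly wacky precedence” the single shared symbol produces: a reader who sees “-9 ** 2” as “negative nine squared” gets a sign error.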
Other programming languages that use different syntax or glyphs for unary negation and for “this literal is a negative number” don’t have that problem of seemingly wacky precedence.
In APL, for example, “ ¯9*2 ” is unambiguously (negative nine) to the power of two, which is eighty-one, while “ -9*2 ” is unambiguously unary negation of (positive nine to the power of two), which is negative eighty-one. Further, “ ¯(9*2) ” is a syntax error, since the negative-literal signifier cannot validly be applied to an expression.
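The two-glyph resolution can be sketched with a toy evaluator in Python. This is an illustrative sketch, not real APL: it assumes only integer literals, APL's high-minus negative sign ¯ (treated as part of the number token), the unary negation function -, and the power function *, evaluated right to left as APL does:

```python
import re

# Tokens: an optional high minus fused onto a digit string (a negative
# literal), or the function symbols '-' (negation) and '*' (power).
TOKEN = re.compile(r"¯?\d+|[-*]")

def evaluate(expr: str) -> int:
    """Toy right-to-left evaluator covering only negation and power."""
    tokens = TOKEN.findall(expr.replace(" ", ""))

    def parse(toks: list[str]) -> int:
        tok = toks.pop(0)
        if tok == "-":
            # Unary negation is a function: it applies to the value of
            # everything to its right (APL evaluates right to left).
            return -parse(toks)
        # '¯' is not a function; it is part of the literal itself.
        value = int(tok.replace("¯", "-"))
        if toks and toks[0] == "*":
            toks.pop(0)
            return value ** parse(toks)  # power, right-associative
        return value

    return parse(tokens)

print(evaluate("¯9*2"))  # (-9) ** 2  →  81
print(evaluate("-9*2"))  # -(9 ** 2)  →  -81
```

Because ¯ is consumed by the tokenizer as part of the digit string, it naturally has "highest precedence", while - is an ordinary function applied to a fully evaluated right-hand side: the two behaviors fall out of the grammar rather than from a precedence table.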
Note that in APL the asterisk denotes exponentiation, not multiplication (multiplication is written ×).