I agree (as a lighting engineer). Measured (empirical) efficacy will necessarily include the secondary losses of the particular technology, be it filament resistance, circuit impedance, etc. Determining a theoretical maximum efficacy gets trickier, however. How efficient is your ballast? What is the ambient environment, and how does that affect the component? What is the size and shape of the LED die and lens? What undiscovered “miracle” phosphor could be created? Is it “cheating” to count ballast efficiency in your spec?
Industry LED makers “bias” the efficacy numbers in their spec sheets to compete in the market by idealizing the conditions under which their particular part was measured. “Doesn’t perform per spec sheet” is a common complaint. E.g., “I used an array of spec 60 lm/W LEDs in my flashlight, but our lab is only measuring 50 lm/W from the flashlight!”
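To make that 60-versus-50 lm/W gap concrete, here's a toy sketch of the loss bookkeeping (all the numbers here are invented for illustration, not from any real spec sheet): once you multiply the die efficacy by driver and optical efficiencies, the shortfall stops being mysterious.

```python
def system_efficacy(led_lm_per_w, driver_eff, optical_eff):
    """Luminous efficacy of the whole fixture, in lm per W of input power.

    led_lm_per_w: the die's spec-sheet efficacy (measured under ideal conditions)
    driver_eff:   fraction of input power the driver/ballast delivers to the die
    optical_eff:  fraction of emitted light that escapes the lens/reflector
    """
    return led_lm_per_w * driver_eff * optical_eff

# Hypothetical numbers: a "60 lm/W" die behind a ~90% efficient driver
# and optics that pass ~92.5% of the light lands right around the
# 50 lm/W measured from the whole flashlight.
print(system_efficacy(60, 0.90, 0.925))  # ~50 lm/W
```

The point isn't these particular percentages; it's that the spec-sheet number describes the bare die, while the lab measures the whole system, so every downstream loss compounds.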
I can’t say what a theoretical maximum LED efficacy would be, though, or how close to the ideal 683 lm/W an LED could get; that’s for solid-state physicists and semiconductor manufacturers to figure out.