I agree. I was working at Boeing at the time, and they made a big deal out of the fact that no full-scale mock-ups were made. They had computer models of ‘average’ cabin crew and mechanics that they used to digitally test fit and access. The software used was CATIA, and IBM made a small fortune selling us DASD to hold it all.
Enigma-encrypted traffic was not decrypted using computers; it was broken semi-brute-force using machines called bombes, which in no way can be considered computers.
FWIW, the first graphical CAD application was called Sketchpad, designed by Ivan Sutherland at MIT in 1963. I don’t think it was ever actually used in production, though; it was just a development step.
The word ‘computer’ is mainly defined so as to make sure your machine was the first.* That, or some joker makes the jape that humans are computers because we compute things, don’t we?** This is complicated by the fact that a number of mechanical and semi-mechanical devices were used to do things vaguely related to numbers all the way back to rocks and sticks and charcoal on cave walls. These days, however, ‘computer’ means an all-electronic digital device with real software (no plugboards) and at least one level of digital storage (Williams tubes count, mercury delay lines count, water levels do not***). By this definition the first computer ran in June 1948 in Manchester, and it wasn’t the original ENIAC: it was the Manchester Small-Scale Experimental Machine, aka the Baby. (The original ENIAC didn’t have real software. It used plugboards, meaning it had to be physically rewired to be ‘reprogrammed’. That is little different from the Polish bombes, or an abacus, for that matter.)
*(This works for other things: I’ve heard someone define ‘Personal Computing’ such that IBM invented it in the 1960s when they put CP/CMS on System/360 mainframes.)
**(This happens to be congruent with a very old definition of the term: “One who computes for money.” Some of the last large pools of human computers were women doing routine arithmetic by hand during WWII.)