This is nonsense with respect to cultural/technological change, because we are not (yet) changing our mental hardware and operating system along with our culture. Humans have been born with essentially the same brains for thousands of years, and each new human, running on that same unchanged hardware, must acquire all cultural knowledge and understanding of technology from scratch.
So even if we grant the dubious assumption that there has been no qualitative change and that we're talking about continuous exponential growth of technology, the later stages of exponential growth do not look the same as the earlier ones: each successive step implies a much larger absolute change measured against our static innate mental capacity.
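To make the arithmetic concrete (a toy illustration, assuming a simple doubling model of technological capability), each new doubling adds as much as all previous doublings combined, plus the original starting amount:

$$2^{n} - 2^{n-1} \;=\; 2^{n-1} \;=\; \underbrace{\sum_{k=0}^{n-2} 2^{k}}_{\text{all prior increments}} + \; 1$$

So the same constant growth rate that once produced changes a fixed human mind could comfortably absorb eventually produces, in a single period, more change than in all of prior history.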
I don’t know what you’re talking about here, but “singularity” with respect to AI means something specific: a qualitative change, certainly not just a continuation of past exponential processes.
If we achieve AGI, it opens up the possibility of positive feedback whereby the AGI can improve its own software and hardware. Even if the first version were only equal to or slightly better than human programmers, it could run a large number of instances of itself, and each iteration could become better and faster at improving itself. Such a runaway positive-feedback process might rapidly achieve superintelligence.
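Here is a minimal sketch of that feedback loop. Every number in it is a made-up assumption for illustration, not a claim about real systems; the capability-squared coupling stands in for "smarter per copy, and more copies running in parallel":

```python
# Toy model of recursive self-improvement. All parameters are
# illustrative assumptions, not predictions about real systems.
#
# capability: skill at AI R&D, with 1.0 = parity with human programmers.
# Effort scales as capability**2: the system is both smarter per copy
# and (by assumption) able to run more copies of itself in parallel.

capability = 1.0   # first version starts at roughly human level
gain = 0.01        # assumed capability gained per unit of R&D effort per step

for step in range(1, 101):
    effort = capability ** 2        # the positive-feedback coupling
    capability += gain * effort     # improvement feeds back into itself
    if step % 20 == 0:
        print(f"step {step:3d}: capability ~ {capability:8.2f}")
```

Run it and the curve looks nearly flat for most of the steps, then explodes near the end. That shape is the point: with effort proportional to capability alone the model merely grows exponentially, but the squared coupling (the continuous analogue is dC/dt = kC², which diverges in finite time) is what produces the abrupt blow-up people usually mean by "singularity".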