All of the things you ask about have complicated answers, and they depend a great deal on the exact model of phone you’re talking about.
Modern phone charge times are not linear. Charging is slow at the very bottom and the very top of the battery’s range, and can be very fast in the middle. Charging speed is also limited by temperature: a battery that is very cold or very hot will charge more slowly. This is all done to protect the battery. That all assumes a charger that can feed the phone at its highest speed. If the charger is low power, the phone may sit below its “safe” charging rate across the entire charge range, so the charge curve looks much flatter.
Many phones will advertise something like “30 minutes to charge from 20-80%,” and then it might take another 30 minutes to go from 80% to full. If you attach a Kill A Watt or a USB power meter to the charger, you can watch the voltage and amperage move up and down over the charging cycle.
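Here’s a rough sketch, in Python, of the kind of decision the phone’s charge controller is making. Every number in it is invented for illustration, not taken from any real phone:

```python
def allowed_charge_watts(soc_percent, battery_temp_c, charger_max_watts):
    """Illustrative charge curve: slow at the extremes, fast in the middle.
    All thresholds and rates here are made up, not from any real phone."""
    if soc_percent < 10:
        rate = 5.0                                  # gentle trickle at the bottom
    elif soc_percent < 80:
        rate = 25.0                                 # fast-charge window in the middle
    else:
        rate = 25.0 * (100 - soc_percent) / 20.0    # taper off toward full

    # Temperature protection: back way off if the battery is cold or hot.
    if battery_temp_c < 10 or battery_temp_c > 40:
        rate = min(rate, 5.0)

    # A weak charger caps the whole curve, which is why a low-power
    # charger makes the charge rate look flat from 0 to 100%.
    return min(rate, charger_max_watts)

print(allowed_charge_watts(50, 25, 30))   # 25.0 (mid-range, full speed)
print(allowed_charge_watts(95, 25, 30))   # 6.25 (tapering near the top)
print(allowed_charge_watts(50, 25, 10))   # 10.0 (limited by the charger)
```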
There are also lots of different ways that USB chargers and devices communicate how much power to provide. Default USB maxes out at 5 volts and 0.5 amps (500 mA). To get more power than that, the charger and the device have to negotiate. Sometimes the negotiation is as simple as the charger putting a fixed voltage on one of the USB data lines, or a resistance between some of the lines. That leads to things like Apple’s 2 amp charging, Qualcomm’s Quick Charge system, and USB-PD (Power Delivery).
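As a sketch of how simple the legacy signaling can be: the device just measures the voltages on the D+ and D- data lines. The levels below are approximate and from memory, so treat this as illustrative rather than a faithful implementation of any spec:

```python
def classify_legacy_charger(d_plus_volts, d_minus_volts):
    """Rough sketch of pre-USB-PD charger detection via the data lines.
    The voltage levels are approximate; real detection (USB BC 1.2,
    Apple's scheme, etc.) has more steps than this."""
    # Apple-style chargers park fixed voltages on D+ and D-.
    if d_plus_volts > 2.5 and d_minus_volts > 2.5:
        return "Apple high-current charger: OK to draw ~2 A or more at 5 V"
    # A BC 1.2 dedicated charger shorts D+ to D-, so a ~0.6 V probe
    # signal driven on D+ shows up on D-.
    if abs(d_plus_volts - 0.6) < 0.2 and abs(d_minus_volts - 0.6) < 0.2:
        return "BC 1.2 dedicated charging port: OK to draw up to ~1.5 A"
    return "Unknown charger: stick to the 5 V / 0.5 A USB default"

print(classify_legacy_charger(2.7, 2.7))
print(classify_legacy_charger(0.6, 0.6))
print(classify_legacy_charger(0.0, 0.0))
```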
I believe modern Apple devices can speak both the old 2 amp language and USB-PD. The fastest charger I have for my iPad is an 18 W USB-C PD charger that came with my Google Pixel phone. It is much faster than the 10 watt (or whatever it is) charger that came with the iPad.
Quick Charge and USB-PD adjust both the voltage and the amperage to deliver more power than default USB.
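The arithmetic is just power = volts × amps. The voltage/current pairs below are typical of PD chargers, but any given charger advertises its own list:

```python
# Typical USB-PD fixed-voltage offers from a mid-size charger.
# Any particular charger advertises its own set of profiles.
pd_profiles = [(5, 3.0), (9, 3.0), (15, 3.0), (20, 2.25)]

for volts, amps in pd_profiles:
    print(f"{volts:>2} V x {amps} A = {volts * amps:.0f} W")

#  5 V x 3.0 A = 15 W
#  9 V x 3.0 A = 27 W
# 15 V x 3.0 A = 45 W
# 20 V x 2.25 A = 45 W
```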
Wireless charging is similarly complex, with a variety of standards for delivering more power. The wireless charging pad, the power adapter it is plugged into, and the device being charged all have to agree on how much power will be sent.
So, there is a real difference between a 5 watt and a 15 watt charging pad. The electronics in the pad need to support the higher power, and have the right smarts to negotiate with both the device and the charger to get it. There may also be licensing fees to use some of the higher power charging methods.
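Here’s a toy model of that chain of agreement. The function and the 75% pad-efficiency figure are my own assumptions for illustration, not anything from the Qi spec:

```python
def negotiated_wireless_watts(pad_rating, adapter_watts, device_request):
    """Toy model: delivered power is limited by every link in the chain.
    The 0.75 pad-efficiency factor is an assumption for illustration."""
    pad_budget = min(pad_rating, adapter_watts * 0.75)
    return min(pad_budget, device_request)

# A "15 watt" pad fed by a 10 watt adapter can't actually deliver 15 watts:
print(negotiated_wireless_watts(15, 10, 15))   # 7.5
# Give the same pad a 30 watt adapter and the pad becomes the limit:
print(negotiated_wireless_watts(15, 30, 15))   # 15
```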
So, it is far more complex than plugging a 9 watt lightbulb or a 1500 watt space heater into the same wall receptacle and having each draw whatever amperage it needs, with nothing but Ohm’s law (the resistance of the load setting the current) deciding how much flows.
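For contrast, here’s that simple resistive case worked out, treating each load as a plain resistor. No negotiation at all: the current follows from I = P / V and R = V² / P, assuming 120 volt mains:

```python
line_volts = 120  # typical North American wall receptacle

for name, watts in [("9 W light bulb", 9), ("1500 W space heater", 1500)]:
    amps = watts / line_volts            # I = P / V
    ohms = line_volts ** 2 / watts       # R = V^2 / P
    print(f"{name}: {amps:.3f} A through a {ohms:.0f} ohm load")

# 9 W light bulb: 0.075 A through a 1600 ohm load
# 1500 W space heater: 12.500 A through a 10 ohm load
```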