A little while back, we were helping a woman who could not start her car. The battery was low. We were able to jump-start her car, but she was worried about getting stranded elsewhere, so she called AAA. When the technician arrived, trying to be helpful, I gave him the symptoms, including that the battery was down to 10 volts. He responded that "the voltage doesn't matter, just the amps." Not wanting to get into a debate with him, I didn't challenge it, but he continued to harp on the topic anyhow. Of course, the low battery voltage was an indication that the battery was not charged, since it should have read better than 12 volts. He also did not understand that there is no current without voltage.
I have seen other iterations of this idea that current is independent of voltage. Another common one is that it is current that kills, not voltage. I saw this used as an argument that 5 volts could electrocute someone if there was enough current.
None of these ideas take into account known physics, specifically Ohm's law and Watt's law.
So E/I = R, I × R = E, and E/R = I.
Performing a given amount of work requires power, measured in watts, which is calculated as I × E = P.
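The three forms of Ohm's law and the power formula above can be sketched as small helper functions. The function names here are my own, just for illustration:

```python
# The three rearrangements of Ohm's law, plus the power formula.
def resistance(volts, amps):
    return volts / amps        # R = E / I

def current(volts, ohms):
    return volts / ohms        # I = E / R

def power(volts, amps):
    return volts * amps        # P = I * E

print(resistance(12, 80))  # 0.15
print(power(12, 80))       # 960
```

Note how each quantity is fully determined by the other two: you cannot change the voltage across a fixed resistance without the current changing too.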
So in the car problem, you need a certain amount of power to start the engine. If normal is 12 volts at 80 amps, then 12 × 80 = 960 watts. This also shows the starter has a resistance of 12/80 = 0.15 ohms. So if the voltage drops to 5 during starting, 5/0.15 = 33.3 amps, and the power will be 5 × 33.3 ≈ 167 watts. This is considerably less power. At the extreme end, if you have zero volts, there will be no current.
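The starter-motor arithmetic above can be worked through step by step, using only the numbers already given (12 volts at 80 amps nominal, battery sagging to 5 volts):

```python
# Starter modeled as a fixed resistance, per the text's simplification.
R = 12 / 80        # starter resistance: 0.15 ohms
I_low = 5 / R      # current when the battery sags to 5 volts
P_low = 5 * I_low  # power delivered at 5 volts

print(round(I_low, 1), round(P_low, 1))  # 33.3 166.7
```

About 167 watts against a normal 960: the sagging voltage cuts both the current and the power, which is why the cranking is so weak. (A real starter is not a pure resistance, but the fixed-resistance simplification is the one the text uses.)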
In the electrocution myth, people talk about current through the heart but fail to take into account body resistance and the dispersion of the energy.
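Ohm's law settles the 5-volt claim directly. As a hedged sketch, I'll assume a dry-skin body resistance of roughly 100,000 ohms, a commonly cited textbook figure; actual values vary widely with skin condition and contact area:

```python
# Assumed dry-skin resistance; real values vary widely with conditions.
body_resistance = 100_000   # ohms (assumption, not from the text)
volts = 5
amps = volts / body_resistance

print(amps * 1000)  # current in milliamps: 0.05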