So let’s take the PM542 drivers (PDF): they have a typical supply voltage of 36V, with a maximum of 50V.

A forum member recently told me I should supply them at the maximum (50V). Is this the correct thing to do, or would it be better to supply the driver at the typical rated voltage?

I want to get the best speed I can from the motors. Will supplying the driver at the maximum voltage mean the motor also receives a higher voltage and gives me better speeds?

If we take the Nema23 3Nm motor (PDF), the document says:

Bipolar Parallel: 2.73V, 4.2A, 3Nm

So from that, I take it that for the motor to run at 3Nm holding torque, it will require 2.73V and 4.2A from the driver?
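For what it's worth, those two rated figures are related by Ohm's law: the 2.73V is just the voltage needed to push the rated 4.2A through the winding's resistance. A quick sketch of that arithmetic, using only the numbers from the datasheet line above:

```python
# Rated phase figures from the datasheet (bipolar parallel wiring)
rated_voltage = 2.73  # V
rated_current = 4.2   # A

# Ohm's law: the winding resistance implied by the rated figures
phase_resistance = rated_voltage / rated_current
print(f"Phase resistance = {phase_resistance:.2f} ohms")  # = 0.65 ohms

# Power dissipated per phase when held at the rated current
phase_power = rated_voltage * rated_current
print(f"Power per phase = {phase_power:.1f} W")  # = 11.5 W
```

This is why the driver supply voltage (36V to 50V) is so much higher than the motor's 2.73V rating: a chopper driver uses the high supply voltage to force current into the winding quickly, then regulates the current down to the set limit, rather than applying the rated voltage directly.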

The other question I have is about the motor: when a phase is "in use", are the volts and amps it pulls from the driver divided by the number of wires coming from the motor?