The servo motors I'm looking at (as I'm sure they all do) have a rated speed and a maximum speed; for argument's sake, 2k rpm rated and 3k rpm max. I don't think the manufacturer (Delta) has produced a torque vs speed graph.

Do I understand correctly that the motor can happily generate its rated torque continuously at any speed up to the rated speed, and that above this speed the torque falls off, generally in a linear fashion, down to the maximum speed, at which point it no longer generates any torque and presumably starts to shake itself to pieces (pointless)?
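As a sanity check on that mental model, here's a quick sketch of the assumed curve: flat at rated torque up to rated speed, then linear derating to zero at max speed. The rated torque figure here is a placeholder, not from any Delta datasheet; substitute the real value.

```python
# Figures from the post: 2000 rpm rated, 3000 rpm max.
RATED_RPM = 2000
MAX_RPM = 3000
RATED_TORQUE_NM = 10.0  # placeholder; substitute the motor's datasheet figure

def available_torque(rpm):
    """Continuous torque under the assumed model: flat to rated speed,
    then falling linearly to zero at max speed."""
    if rpm <= RATED_RPM:
        return RATED_TORQUE_NM
    if rpm >= MAX_RPM:
        return 0.0
    return RATED_TORQUE_NM * (MAX_RPM - rpm) / (MAX_RPM - RATED_RPM)

print(available_torque(1500))  # 10.0 (full torque below rated speed)
print(available_torque(2500))  # 5.0  (halfway through the derating band)
```

Whether the real curve is actually linear above rated speed depends on the drive and bus voltage, so treat this as the model being asked about, not a statement of how the Delta motors behave.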

My question is: do I want to calculate my gearing, including the pinion (assume 37.5mm pitch diameter), so that the motor theoretically only operates in the 0-2k rpm window? Say 5:1, giving me roughly 47 m/min feed rate at 2k rpm with maximum torque.
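Working through the numbers in the post (37.5mm pitch diameter, 5:1, 2k rpm), the feed rate comes out closer to 47 m/min than 50:

```python
import math

PITCH_DIA_MM = 37.5   # pinion pitch diameter from the post
MOTOR_RPM = 2000      # rated speed
RATIO = 5             # motor revs per pinion rev

# Linear feed = pinion circumference * pinion rpm
feed_m_per_min = math.pi * PITCH_DIA_MM / 1000 * MOTOR_RPM / RATIO
print(round(feed_m_per_min, 1))  # 47.1
```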

Or

is it advisable to theoretically operate the motor up to its maximum speed, in the 2-3k rpm band, to gain some resolution perhaps? If I modified the gearing to, say, 7:1, I could still reach 50 m/min. The torque would be tailing off, but I guess at these higher speeds I wouldn't be cutting anyway; it would just be rapids, so would it matter?
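For comparison, here's the 7:1 case at max speed, plus the linear travel per motor revolution for each ratio, which is one way to see the resolution gain the shorter gearing buys (same assumed 37.5mm pitch diameter as above):

```python
import math

PITCH_DIA_MM = 37.5  # pinion pitch diameter from the post

def feed_m_per_min(motor_rpm, ratio):
    """Linear feed for a given motor speed and reduction ratio."""
    return math.pi * PITCH_DIA_MM / 1000 * motor_rpm / ratio

def travel_per_motor_rev_mm(ratio):
    """Rack travel per motor revolution; smaller means finer resolution."""
    return math.pi * PITCH_DIA_MM / ratio

print(round(feed_m_per_min(3000, 7), 1))        # 50.5 m/min at max speed
print(round(travel_per_motor_rev_mm(5), 2))     # 23.56 mm per motor rev at 5:1
print(round(travel_per_motor_rev_mm(7), 2))     # 16.83 mm per motor rev at 7:1
```

So 7:1 does recover the ~50 m/min rapid at 3k rpm and moves less per motor rev, at the cost of the top of the speed range sitting in the (assumed) torque-derating band.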

I keep using the word "theoretically" because I'm sure there will be many factors during implementation that stand in the way of achieving this. I just thought it would be better practice to design like this and accept limitations when implementing, rather than designing in limitations at this stage?