Depends.
At a full-step position, it will hold up to its holding torque, say 10 Nm; only when the applied torque exceeds 10 Nm will it jump a step.
That is because holding torque is at its strongest at the full-step positions.

So below 10 Nm of load, at the full-step position, it will not rotate at all.

At 1/10 microstepping, the incremental torque per microstep is ≈ 10% of holding torque, iirc.

So, if the stop point is 9 microsteps away from a full step, the motor will "bend", or comply, under load until it reaches the full-step point.
It does not lose steps, but acts as a spring.
When torque is removed, it goes back to the position it was meant to be in.
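That spring-like behavior can be sketched with the common sinusoidal stiffness model for stepper motors. This is a rough approximation, and `restoring_torque` and its parameters are my own names for illustration, not any driver's API:

```python
import math

def restoring_torque(holding_torque_nm, microsteps_per_full_step, offset_microsteps):
    """Approximate restoring torque when the rotor is displaced by some
    number of microsteps from its commanded position.

    Uses the common sinusoidal stiffness model: torque rises as
    sin(90 deg * offset / full_step) and peaks one full step away.
    """
    electrical_angle = (math.pi / 2) * (offset_microsteps / microsteps_per_full_step)
    return holding_torque_nm * math.sin(electrical_angle)

# 10 Nm holding torque, 1/10 microstepping:
for n in (1, 5, 9, 10):
    print(n, "microsteps off ->", round(restoring_torque(10.0, 10, n), 2), "Nm")
```

Note the model actually gives about 15% of holding torque for the first microstep and the full 10 Nm only one whole step away, which is exactly why the motor complies like a spring rather than losing steps.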

Servos are different.
A servo "knows" where it is supposed to be, i.e. what the encoder count must be when it stops.
If I rotate the chuck, the servo drive LED shows how many counts off it is.
It uses peak torque, 3x rated torque, to try to get back to the position it needs to be in.

So the 2.5 kW servo, with 10 Nm rated and 30 Nm peak torque, can apply 30 Nm x 3 (via the 1:3 belt drive) = 90 Nm at the spindle to get back to the position it "wants" to be in.
The peak torque is available for up to 3 secs, after which the drive drops back to sustained rated torque, 10 Nm in my case (2.5 kW).
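The torque arithmetic above, spelled out (numbers from my setup):

```python
rated_torque_nm = 10.0   # 2.5 kW servo, rated torque
peak_factor = 3.0        # peak torque is ~3x rated, for up to ~3 s
belt_ratio = 3.0         # a 1:3 reduction multiplies torque at the spindle

peak_torque_motor = rated_torque_nm * peak_factor      # 30 Nm at the motor shaft
peak_torque_spindle = peak_torque_motor * belt_ratio   # 90 Nm at the spindle
print(peak_torque_spindle)  # 90.0
```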

The effect is very obvious and intuitive.
You can see the error, in encoder counts, on the servo drive LED display in real time.
If you apply enough force to exceed the servo's max-error setting, D iirc, it faults.

This takes less than 1 ms, or 0.001 secs.
Typical servo loops run at 12 kHz, i.e. roughly 0.08 ms per cycle.
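Checking the timing (loop rates vary by drive; 12 kHz is just a typical figure):

```python
loop_hz = 12_000                             # typical servo position-loop rate
period_ms = 1000 / loop_hz                   # ~0.083 ms per control cycle
cycles_before_fault = int(0.001 * loop_hz)   # ~12 cycles fit inside a 1 ms window
print(round(period_ms, 3), "ms per cycle,", cycles_before_fault, "cycles per ms")
```

So a sub-millisecond fault decision still leaves the drive about a dozen loop cycles to detect the error.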

Mostly, the spindle can be off by ≈ 0.1 mm at the outer edge of a 12" chuck before it faults.
And that is with very, very loose (poor) servo tuning (factory defaults).
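For scale, 0.1 mm at the chuck edge is a tiny angle. A quick sketch (the 17-bit encoder resolution below is a hypothetical example, not my drive's actual spec):

```python
import math

chuck_radius_mm = 12 * 25.4 / 2     # 12" chuck -> 152.4 mm radius
edge_error_mm = 0.1                 # observed error at the edge before fault

angle_rad = edge_error_mm / chuck_radius_mm          # small-angle: arc / radius
angle_deg = math.degrees(angle_rad)                  # ~0.038 degrees

# With a hypothetical 17-bit (131072 counts/rev) encoder on the spindle:
error_counts = angle_rad / (2 * math.pi) * 131072    # roughly 14 counts
print(round(angle_deg, 4), round(error_counts, 1))
```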

Servos do not have "microsteps" and cannot lose steps.
Servos position perfectly at rest, and then lock (modern ac servos).
When in use, they lag a bit, and this depends on tuning.
The lag is shown on the LED display at the drive.

The faster the servo runs (or the harder it accelerates), the more it lags.
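That speed-dependent lag falls out of the control loop itself. A toy model, not any particular drive's firmware: for a proportional-only position loop, the steady-state following error is velocity divided by the loop gain, so faster moves lag more. Real drives add integral action and feedforward, which is why better tuning shrinks the lag.

```python
def following_error_counts(velocity_counts_per_s, kp_per_s):
    """Steady-state lag of a pure proportional position loop.

    For a P-only loop tracking a constant-velocity move, the error
    settles at velocity / Kp: double the speed, double the lag.
    """
    return velocity_counts_per_s / kp_per_s

# e.g. 1,000,000 counts/s at a loop gain of 50 /s -> 20,000 counts of lag
print(following_error_counts(1_000_000, 50.0))
```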