The speed is determined automatically. The rotating field in the stator is locked to the frequency of the AC sine wave, and the rotor tries to follow it (hence the term "synchronous speed").
The synchronous speed (in RPM) is [the frequency (in Hz) x 120] / the number of poles of the motor stator (the way the motor is wound).
You then multiply the result by some constant < 1 to compensate for motor "slip". In an induction motor, slip has to exist for the motor to produce torque.
I say constant, but this number will vary (a little) with motor load, etc.
For instance, at 50 Hz: (50 Hz x 120) / 2 poles = 3000 RPM
-->Your motor nameplate says 2650 RPM at 50 Hz, so your slip factor for real-world RPM is 2650/3000 = .883.
For instance, at 60 Hz: (60 Hz x 120) / 2 poles = 3600 RPM
-->Your motor nameplate says 3000 RPM at 60 Hz, so your slip factor for real-world RPM is 3000/3600 = .833.
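The arithmetic above can be sketched in a few lines of Python (the function names are mine, not from any motor library; the nameplate numbers are the ones from the examples):

```python
def synchronous_rpm(freq_hz, poles):
    """Speed of the rotating stator field in RPM."""
    return freq_hz * 120 / poles

def slip_factor(nameplate_rpm, freq_hz, poles):
    """Ratio of actual (nameplate) speed to synchronous speed."""
    return nameplate_rpm / synchronous_rpm(freq_hz, poles)

print(synchronous_rpm(50, 2))                  # 3000.0
print(round(slip_factor(2650, 50, 2), 3))      # 0.883
print(synchronous_rpm(60, 2))                  # 3600.0
print(round(slip_factor(3000, 60, 2), 3))      # 0.833
```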
edit: By running an induction motor at reduced voltage and constant frequency (as you would be doing with a variable transformer), you are allowing the rotor to "slip" more, thus reducing the motor speed.
The rotating field still sweeps around the stator at the same angular velocity -- you are just letting it outrun the rotor by reducing the magnetic coupling between them. (You are reducing the motor torque.)
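A rough way to see why lower voltage means more slip: near synchronous speed, induction-motor torque is approximately proportional to slip times voltage squared (T ~ k*s*V^2 -- a textbook small-slip approximation, not something stated in this post). Holding the load torque constant, the slip the motor must run at grows as the voltage drops. A minimal sketch, with hypothetical numbers:

```python
def required_slip(load_torque, voltage, k=1.0):
    """Slip needed to carry a given load torque, using the small-slip
    approximation T ~ k * s * V^2 (an assumption, not from the post)."""
    return load_torque / (k * voltage ** 2)

# Same load at full vs. reduced voltage (hypothetical values):
s_full = required_slip(load_torque=10.0, voltage=230.0)
s_reduced = required_slip(load_torque=10.0, voltage=180.0)
print(s_reduced > s_full)  # slip rises as voltage drops
```

This is only illustrative; real slip-torque curves are nonlinear away from synchronous speed, and running a motor at high slip heats the rotor.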
__________________
Jack of all Trades, Master of None.
Last edited by CoolROD; 12-06-2004 at 12:57 PM.