>> ... next step be to variac the primary way down and see what the ratio of power increase is?
>> Or is there more still to learn before I plug it in?
I'm still concerned about your very junior level of knowledge about electricity. Transformers do not increase power; in fact, they invariably lose some. What's the name (or even just the initials) of your adult mentor who has experience with electrical connections and measurements at voltages over 100 V? Over 300 V?
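To make that point concrete, here's a back-of-the-envelope sketch in Python. All the numbers are made-up assumptions, not specs for any real XRT: an ideal transformer trades voltage for current, so output power can at best equal input power.

```python
# Ideal-transformer sketch: power is conserved, never multiplied.
# All numbers are illustrative assumptions, not real XRT specs.
V_primary = 120.0     # volts RMS applied to the primary
I_primary = 0.5       # amps RMS drawn by the primary
turns_ratio = 100.0   # assumed secondary:primary step-up

V_secondary = V_primary * turns_ratio   # 12,000 V: voltage steps up...
I_secondary = I_primary / turns_ratio   # 5 mA: ...and current steps down
P_in = V_primary * I_primary            # 60 W in
P_out = V_secondary * I_secondary       # 60 W out at best; a real core loses some

print(f"P_in = {P_in:.0f} W, ideal P_out = {P_out:.0f} W")
```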
You could do what I did when I had an unfamiliar x-ray transformer. It's slightly different, in some important ways, from your proposal copied above.
Step 1. Set up instrumentation that can measure AC voltages up to a few thousand. This can be tested, and its sensitivity verified, with regular house power. It could be an analog or digital multimeter on an AC volts range, with simple external attenuation as described in the FAQs.
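If it helps, here's the divider arithmetic as a short Python sketch. The resistor values are assumptions for illustration only; see the FAQs for proper HV resistor selection and construction.

```python
# Resistive attenuator arithmetic for reading kV on an ordinary meter.
# Values are illustrative assumptions; a real HV string needs many
# resistors in series, each rated for its share of voltage and power.
# (Meter input capacitance is ignored here; it matters on AC ranges.)
R_string = 990e6   # ohms: HV series string (e.g., 99 x 10 Mohm)
R_meter = 10e6     # ohms: typical DMM input impedance, the bottom leg

attenuation = (R_string + R_meter) / R_meter   # 100:1 in this example
meter_reading = 24.0                           # volts displayed
actual = meter_reading * attenuation
print(f"{meter_reading} V on the meter -> ~{actual:.0f} V actual")
```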
Alternative: Set up to measure DC voltages up to a few thousand, and connect it to the _rectified_ output of your XRT under test. You'll need that anyway for your demo or real fusor.
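One thing to size that DC instrumentation for: a capacitor-filtered rectifier at light load charges toward the AC peak, about 1.414 times the RMS reading. A rough sketch, with an assumed secondary voltage:

```python
import math

# Peak vs RMS: a capacitor-filtered rectifier at light load sits near
# the AC peak, not the RMS value. The secondary voltage is assumed.
V_rms = 1500.0               # assumed XRT secondary output, volts RMS
V_dc = math.sqrt(2) * V_rms  # ~2121 V at the filter capacitor
print(f"{V_rms:.0f} V RMS -> up to ~{V_dc:.0f} V DC at light load")
```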
Step 2. Get your Variac, but don't use it to directly feed any XRT primary, with or without ballast.
Put a step-down transformer in between, one that normally reduces house voltage to 12 volts or less. These are common for low-voltage outdoor lighting, and for indoor lights that use low-voltage halogen bulbs. They generally have their own fuses on the low-voltage side. I'd find one rated for at least 50 VA, as opposed to (say) a doorbell transformer or something out of an inexpensive cordless tool charger. The filament winding of a MOT (2 or 3 volts) might serve, with suitable precautions.
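Here's the chain arithmetic as a sketch, with assumed ratios (a 10:1 lighting transformer and a nominal 100:1 XRT step-up; substitute your own measured ratios):

```python
# Variac -> step-down transformer -> XRT primary -> XRT secondary.
# Both ratios below are assumptions for illustration; measure yours.
stepdown_ratio = 120.0 / 12.0   # assumed 10:1 lighting transformer
xrt_stepup = 100.0              # assumed XRT secondary:primary ratio

for variac_volts in (0, 30, 60, 90, 120):
    xrt_primary = variac_volts / stepdown_ratio      # 0 to 12 V
    xrt_secondary = xrt_primary * xrt_stepup         # ideal, no load
    print(f"Variac {variac_volts:3d} V -> primary {xrt_primary:4.1f} V "
          f"-> secondary ~{xrt_secondary:4.0f} V")
```

Note that even with the Variac wide open, the XRT primary sees only about a tenth of nominal, and the whole Variac sweep is spread across that small window.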
Then you can use most of your Variac range, for much better resolution and repeatability, without having to put your XRT properly under oil. You could still die from touching an XRT secondary connection, even with primary voltage much less than 10% of nominal. But it won't jump through (much) air to get you.