I think that shield would help, but the transformer does not appear to have a flux band, which would be the first thing I would try if I had the trouble.
It is not an end-user addition, for two reasons: the screws that retain it fit into pre-existing threaded brass inserts in the case (I checked my machine), and the plate has been finished in a factory after it was cut. If it is aluminium it has been black anodized, and if it is steel it has been painted black to a very uniform finish. It doesn't look like a home job (unless it was mine of course).
I think that all/most PETs like this were intended to have this shield plate, but most likely what happened was that, along the way, they simply found it was not required, or perhaps they redesigned the power transformer for a lower peak primary magnetisation flux density, solved the problem that way, and dropped the shield. Just guessing here, but probably the earlier the generation of the PET and transformer, the more likely it is to have required the shield.
There is another explanation too, though it's a little more "out there" (not as bad as alien abductions):
It has been observed that many vintage power transformers appear to have high radiated magnetic flux. This has shown up in vintage radios, TVs, audio amplifiers, etc. The level of hum injection from magnetic interference has in some cases been so high that it seems almost impossible to believe the transformer could have been like this from new.
As noted, the problem gets much worse if a transformer designed to run on 60 Hz is run on 50 Hz, even at exactly the same RMS line voltage, and worse in either case with line voltages at the high end of the range. So the question got raised: if the transformer was not like this from new, what could have changed over time to cause it to radiate higher magnetic fields?
Keep in mind the magnetization flux has nothing to do with the load current; it is set by the primary voltage, the frequency and the core.
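To put rough numbers on that: the peak magnetisation flux density follows from the standard transformer EMF equation, V_rms = 4.44 · f · N · A · B_peak, so at the same RMS voltage the same core runs 60/50 = 1.2x higher peak flux on 50 Hz than on 60 Hz, and higher again on a high line. A minimal Python sketch, where the turns count and core area are made-up example values rather than actual PET figures:

```python
# Peak flux density from the transformer EMF equation:
#   V_rms = 4.44 * f * N * A * B_peak  =>  B_peak = V_rms / (4.44 * f * N * A)
# N (turns) and A (core area) below are invented example values, not PET figures.

def peak_flux_density(v_rms, freq_hz, turns=300, core_area_m2=11e-4):
    """Peak core flux density in tesla for a simple E-I transformer primary."""
    return v_rms / (4.44 * freq_hz * turns * core_area_m2)

b_60 = peak_flux_density(115, 60)   # design point: 115 V at 60 Hz
b_50 = peak_flux_density(115, 50)   # same RMS voltage at 50 Hz -> 1.2x higher B_peak
b_hi = peak_flux_density(127, 50)   # high-range line voltage on 50 Hz, worse again

print(f"115 V / 60 Hz: B_peak = {b_60:.2f} T")
print(f"115 V / 50 Hz: B_peak = {b_50:.2f} T ({b_50 / b_60:.2f}x)")
print(f"127 V / 50 Hz: B_peak = {b_hi:.2f} T ({b_hi / b_60:.2f}x)")
```

Anything that pushes B_peak up like that drives the core closer to, or into, saturation, and that is where the stray field comes from.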
One theory, and the only one that seems to wash, is a change in the magnetic properties of the core. If that is true, what would be the mechanism(s)? It could be that over time the insulation layers between the laminations fail, and as a result the eddy current losses increase and so does the primary current. To check this out, on a number of transformers with the problem I sanded down the edges of the lamination stack to make connections to the laminations, and I could not convince myself, on testing, that this had happened to any significant extent.
The other theory is that relentless thermal cycling of the core has changed its metallurgy and magnetic properties, causing the B-H profile of the core to start to saturate at a lower magnetic flux density than it did when it was new. This is the only theory that I think makes any sense (if it were true that the transformer is different from its new state). It is also consistent with the core saturating very abruptly on line voltages in the high range, and with the interference effects dropping with relatively small line voltage reductions of 5 to 10 V on a Variac. This theory about the core material is, though, very difficult to prove or disprove without a time machine.
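As a rough illustration of why 5 to 10 V off a Variac can make such a large difference, here is a toy calculation using a crude exponential-knee model of the exciting current. Every constant in it is invented for illustration (it is not a measurement of any real core); the point is only that once the peak flux sits near the knee, the exciting current and the stray field fall very steeply with a small voltage reduction:

```python
# Toy model: peak exciting current rises roughly exponentially once B_peak is near
# the knee of the B-H curve: i_exc ~ sinh(B_peak / b0), with b0 setting how sharp
# the knee is. All constants are made up for illustration only.
import math

def exciting_current_ma(b_peak, b0=0.12, i0_ma=0.002):
    """Very rough peak exciting current (mA) for a core with knee 'sharpness' b0."""
    return i0_ma * math.sinh(b_peak / b0)

b_nominal = 1.55                     # assumed peak flux density at nominal line, tesla
for drop in (0.00, 0.04, 0.08):      # 0%, ~5 V and ~10 V off a 120 V line
    b = b_nominal * (1 - drop)       # B_peak scales linearly with line voltage
    print(f"line -{drop * 100:.0f}%: B_peak = {b:.2f} T, i_exc ~ {exciting_current_ma(b):.0f} mA")
```

With these made-up numbers, an 8% (roughly 10 V) reduction cuts the exciting current to about a third, which is the kind of steepness you only see when the core is being run hard up against its knee.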
The primary magnetization currents, though, are enormously different between modern and old transformers. In one case I compared a line transformer made in the late 1930s with a modern Hammond transformer of the same power rating and core size, built with modern core material: the Hammond's magnetization current was 47 mA, while the vintage transformer's was over 500 mA, at least 10x higher. Again, the radiated field from the old transformer was very high, so much so that it significantly deflected the beam of the CRT in the scope it was used in, and it could not have been like this from new. There was zero detectable beam deflection with the modern transformer. A case like this makes it fairly convincing that the old transformer's original core has changed.