Sorry, I'm so overwhelmed at work that I'm just now checking messages.
If it were me (and I'm not a magnet expert - at all), I would go with the manufacturer's recommendation of a year (if that is who calibrated it). The definition (highly simplified, and my version of it): a calibration interval is the amount of time an item can be expected to remain within its specified tolerance, to some defined statistical confidence level (quite often 95%). In my 35+ years of calibrating a little of everything, most items remain within tolerance longer than their interval (there's the 95%).
All that said: first, I would go with the manufacturer's recommended interval (in establishing an interval, there should be a legitimate basis for it). BUT... once you have had it recertified a few times, if your certifications include the actual measurements and perhaps the uncertainties, you can plot over a few years how well it holds its specifications and how far it actually drifts over each interval, and eventually calculate your own interval, which may be longer than a year.
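As a rough sketch of the kind of analysis I mean, here is a minimal example that fits a linear drift rate to a few recertifications and projects when that trend would cross the tolerance limit. All the numbers, units, and the linear-drift assumption are made-up placeholders for illustration, not real magnet data:

```python
# Hypothetical calibration history: years since first cal vs. observed error.
# (All values invented for illustration; units are % of reading.)
years = [0.0, 1.0, 2.0, 3.0]       # time of each recertification
errors = [0.02, 0.05, 0.09, 0.12]  # error reported on each certificate
tolerance = 0.25                   # assumed spec limit from the manufacturer

# Least-squares fit of a simple linear drift model: error = rate * t + offset
n = len(years)
mean_t = sum(years) / n
mean_e = sum(errors) / n
rate = (sum((t - mean_t) * (e - mean_e) for t, e in zip(years, errors))
        / sum((t - mean_t) ** 2 for t in years))
offset = mean_e - rate * mean_t

# Project when the fitted drift line reaches the tolerance limit.
years_to_limit = (tolerance - offset) / rate
print(f"drift rate: {rate:.4f} %/yr")
print(f"projected out-of-tolerance at about {years_to_limit:.1f} years")
```

With several certificates' worth of data like this you have a defensible basis for stretching (or shortening) the interval, rather than an arbitrary guess; a real interval analysis would also account for the calibration uncertainties and some guard band, not just the fitted trend.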
I do not recommend arbitrarily setting a longer interval just to save money. You may save some vendor calibration costs, but until you establish a statistical trail of evidence, you are increasing the risk of making bad measurements based on that item's calibration.
Just my two cents.