A long time ago, I used to write heavy-duty calculation software. We tested it: fed it known inputs and verified the outputs. By verified, I mean we used alternate means to calculate the same results. And our customers WOULD NEVER blindly accept our updates without retesting. But it was a highly regulated industry.
More recently, for grins, I fed the worked Gauge R&R example from the MSA manual into a commercial Gauge R&R package we had purchased from a professional software company. The results didn't match.
Having written such code myself (years ago) and having watched supposedly professional-grade (tested?) software fail, I can attest to how easy it is for a bug to hide in the software.
You would be foolish, foolish, foolish not to test.
But your problem is FEA. And while the principle is the same, my gosh, how many individual results must you have? I don't know what you're using it for, but in even a simple model every node carries at least 3 pieces of data, if not 12 or more, and verifying all of that by hand would be daunting.
I would try this. These packages are usually three pieces: a pre-processor (which builds the mesh, displays it graphically, and lets you assign the loads and boundary conditions), a processor (which goes away and does the calculations), and a post-processor (which reads the big results file and draws the pretty colored pictures).
Well, it's really the output of the processor you're concerned with: the results file the post-processor reads. I would run some model before updates (a simple one, since processing time is expensive, but not TOO simple), generate that output file, and store it. Then run the same model AFTER updates and use a utility to compare the two files bit by bit, as in the sketch below. Unless the update changes something in the header of the output file (a version stamp or date, say), it shouldn't change anything in the results. The software vendor ought to be able to tell you whether that's the case, and if the headers do differ you may be able to open the files in a text editor and check them yourself.
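Here's a minimal sketch of that comparison in Python, assuming ordinary binary results files on disk; the file names and the optional header-skip size are hypothetical placeholders, not anything specific to your solver:

    def files_match(path_a, path_b, skip_header_bytes=0, chunk_size=1 << 20):
        # Compare two files byte for byte, optionally skipping a
        # fixed-size header where a version stamp or date might live.
        with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
            fa.seek(skip_header_bytes)
            fb.seek(skip_header_bytes)
            while True:
                chunk_a = fa.read(chunk_size)
                chunk_b = fb.read(chunk_size)
                if chunk_a != chunk_b:
                    return False
                if not chunk_a:  # both files exhausted at the same point
                    return True

    # Hypothetical file names; substitute your solver's actual output files.
    if files_match("results_before.out", "results_after.out"):
        print("MATCH")
    else:
        print("DIFFER")

If you don't need to skip a header, the standard library call filecmp.cmp(path_a, path_b, shallow=False) does the same job in one line.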
But a bit-by-bit comparison of the before and after output files wouldn't take any longer than copying them somewhere (copying is essentially moving them bit by bit anyway).
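And if keeping the whole baseline results file around is inconvenient, a checksum captures the same pass/fail information in a few dozen characters. A sketch using Python's hashlib, again with a hypothetical file name:

    import hashlib

    def sha256_of(path, chunk_size=1 << 20):
        # Hash the file in chunks so even a huge results file
        # never has to fit in memory all at once.
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Record this string with the baseline run, then recompute after the update.
    print(sha256_of("results_before.out"))

Note this only works if the output file contains no header that changes from run to run; otherwise skip past the header before hashing, or stick with the direct comparison above.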