Insist? No, probably not. I'm not aware of anything in the regs that REQUIRES code coverage analysis.
I concur that there is no mandate to show code coverage in either unit testing or verification testing.
FPGA development has converted much hardware into software, which is a good thing technically, but it's a development the regulatory bodies haven't clearly responded to. You'd think the VHDL would be explicitly identified as a software component.
Aside:
There are suggested metrics for code test coverage in the 80% range. That is, 80% of all source lines are traversed during testing. My opinion is that this is unrealistic and leads to writing software that produces good numbers rather than good software in the first place. The primary obstacle is error handling. Some ("crappy") engineers insist that there is no reason to have error codes, or to test the ones that are returned, because the called software should be perfect. Having error codes on each function and testing them bulks up the code substantially on a SLOC basis and makes it hard to reach a high coverage number, since each call must be made to "fail" at a specific line. Very hard to do.
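To illustrate the point, here is a minimal sketch of the conventional error-code style. All names (`read_sensor`, `scale_value`, the status codes) are hypothetical, invented for this example; the post doesn't give actual code. Note how each check line is an error path that a coverage tool will count, and each one needs a test that forces that specific call to fail:

```cpp
#include <cassert>

// Hypothetical status codes -- illustrative only, not from any real project.
enum Status { OK = 0, ERR_SENSOR = 1, ERR_RANGE = 2 };

// Each function returns a status; every caller must check every return.
Status read_sensor(int raw, int* out) {
    if (raw < 0) return ERR_SENSOR;   // covering this line needs a forced bad read
    *out = raw;
    return OK;
}

Status scale_value(int in, int* out) {
    if (in > 1000) return ERR_RANGE;  // and another test must force this failure
    *out = in * 2;
    return OK;
}

// The call site roughly doubles in size: one line of work, one of checking.
Status process(int raw, int* result) {
    int v = 0;
    Status rc = read_sensor(raw, &v);
    if (rc != OK) return rc;          // error-path line needing its own coverage
    rc = scale_value(v, result);
    if (rc != OK) return rc;          // ...and another
    return OK;
}
```

With dozens of calls per function, those `if (rc != OK)` lines add up quickly, and each one drags the coverage percentage down until some test exercises it.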
On one project we were given that mandate. I responded by having an engineer analyze how the SLOC count (Source Lines of Code) was determined and how best to write code that still allowed thorough error reporting and handling without damaging the metric. He came up with a C++ scheme using macros and throw/catch that produced a smaller count of error-handling lines, and, since a "catch" can group errors from multiple function calls, we could expect to traverse those lines. This drove our metric up without requiring more detailed and difficult testing. I dislike throw/catch, as it can be used ignorantly, but we needed to get good numbers. And so it goes...
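A minimal sketch of how such a scheme might look; this is my guess at the idea, not the actual macros that engineer wrote, and every name here is invented. A macro raises on failure instead of returning a code, so call sites carry no per-call check lines, and one shared catch block handles failures from every call in the try body:

```cpp
#include <cassert>
#include <stdexcept>

// Hypothetical macro in the spirit of the described scheme: one line reports
// the error; no return-code plumbing at each call site.
#define CHECK(cond, msg) do { if (!(cond)) throw std::runtime_error(msg); } while (0)

int read_sensor(int raw) {
    CHECK(raw >= 0, "bad sensor read");
    return raw;
}

int scale_value(int in) {
    CHECK(in <= 1000, "out of range");
    return in * 2;
}

// One catch groups failures from every call in the try body, so a single
// failing test case traverses the shared error-handling lines.
int process(int raw) {
    try {
        return scale_value(read_sensor(raw));
    } catch (const std::runtime_error&) {
        return -1;   // grouped error path: covered by any one failure mode
    }
}
```

The error-handling SLOC shrinks (no per-call checks), and the catch lines are easy to cover because any single induced failure exercises them, which is exactly why the coverage number goes up without the testing getting more thorough.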