This is an interesting one from a privacy POV, as calls to third-party CDNs (for fonts, say) leak the referring domain (but not the specific page, whose full URL would only be sent on a same-origin request).
Nonetheless, the CDN sees the domain being visited, the user's IP, and so on. Caching should (in theory) reduce this exposure, but due to various clever cross-site attacks, Chrome's HTTP cache is now partitioned by top-level site. I assume the same is true of Safari.
Therefore, shipping "up to date" copies of static assets could be beneficial, as long as you have confidence those assets won't get "sherlocked" and silently updated upstream. With subresource integrity, people shouldn't really be updating their old JS (or whatever) without bumping the version, but who knows - plenty of websites are never actively maintained and will never update their dependencies, so you could see a CDN updating its copy in place to patch an exploit.
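To make the SRI point concrete, here's a minimal sketch (the asset bytes are made up) of computing an integrity value the way browsers check it: any in-place change to the file, even a one-byte security patch, produces a different hash and fails the page's `integrity` attribute.

```python
import base64
import hashlib

def sri_hash(data: bytes) -> str:
    """Compute an SRI value in the form browsers verify,
    e.g. <script src="..." integrity="sha384-...">."""
    digest = hashlib.sha384(data).digest()
    return "sha384-" + base64.b64encode(digest).decode()

# Hypothetical asset contents; changing a single byte changes the hash,
# which is why silently updating a pinned CDN copy breaks SRI pages.
print(sri_hash(b'console.log("hi");\n'))
```

The same function works for sha256/sha512 by swapping the hashlib call and prefix; sha384 is just the most common choice in the wild.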
I used to use LocalCDN, but found pages would occasionally break (on other browsers, pre-Orion). I think it's important, if you do implement this, that you don't risk breaking sites. Ideally you never violate subresource integrity: fall back to a real CDN pull (maybe without a Referer? Maybe with a very bland, uninteresting Referer like google.com?) rather than serve up a mismatched resource that the page will refuse to load.
Possibly useful article - https://httptoolkit.tech/blog/public-cdn-risks/