We use a lot of AJAX calls that load JSON files served with an application/json MIME type. In some cases these JSON files are used to assemble elements on the page, so we don't want to hide them from Google; we want Google to see the page fully rendered.
My question is whether we should add an X-Robots-Tag: noindex HTTP header to these JSON files. On one hand, we don't want them returned in the search results because they're JSON files, not pages. On the other hand, these files are building blocks for the page we do want indexed, so they should arguably be available to Google the same way our CSS and JS files are. We wouldn't noindex our CSS, after all. Right?
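For context, if we did go the noindex route, the header could be set with something like the following Apache configuration (a sketch only; it assumes Apache with mod_headers enabled, and that all files ending in .json should be covered):

```apache
# Hypothetical: send X-Robots-Tag: noindex for every .json response.
# Requires mod_headers; adjust the pattern if only some JSON files apply.
<FilesMatch "\.json$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

Note that noindex is distinct from blocking crawling in robots.txt: with the header, Googlebot can still fetch and use the files when rendering the page, which is the part we don't want to break.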