Michael Mozina said:
You really need to get into the image at the pixel level to see these details, ...
One more example of the amateur approach to image processing. One must approach the "pixel level" with caution. The pixels on the AIA detector project onto the sky with an angular size of 0.6 arcseconds. I can't find the assumed or measured point spread function (PSF), but if the instrument is Nyquist sampled (as is usually the case), then the PSF is likely about twice that, or roughly 1.2 arcseconds across. In the absence of resolution enhancement, one should never trust the physical reality of anything in an image that is smaller than the point spread function (which sets the smallest angular scale the optics can resolve), or more properly the point response function (the convolution of the optical point spread function with the detector pixel response).
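To make the PSF-versus-PRF distinction concrete, here is a minimal sketch that convolves an assumed 1.2-arcsecond Gaussian PSF with a 0.6-arcsecond box pixel and measures how wide the resulting point response function is in pixel units. The numbers are the assumed values from the paragraph above, not measured AIA calibration values.

```python
# Sketch of the point response function (PRF) as the convolution of an assumed
# optical PSF with the detector pixel.  The 1.2-arcsec Gaussian FWHM and the
# 0.6-arcsec pixel are assumptions from the discussion, not AIA measurements.
import numpy as np

PIXEL_ARCSEC = 0.6          # assumed plate scale
PSF_FWHM_ARCSEC = 1.2       # assumed Nyquist-sampled optical PSF

# Fine 1-D grid (0.01 arcsec steps) spanning +/- 3 arcsec
x = np.arange(-3.0, 3.0, 0.01)

# Optical PSF: Gaussian with the assumed FWHM
sigma = PSF_FWHM_ARCSEC / (2.0 * np.sqrt(2.0 * np.log(2.0)))
psf = np.exp(-0.5 * (x / sigma) ** 2)

# Detector pixel: a 0.6-arcsec top-hat response
pixel = (np.abs(x) <= PIXEL_ARCSEC / 2.0).astype(float)

# PRF = PSF convolved with the pixel response, normalized to its peak
prf = np.convolve(psf, pixel, mode="same")
prf /= prf.max()

# Full width at half maximum of the PRF, expressed in detector pixels
above_half = x[prf >= 0.5]
fwhm_arcsec = above_half.max() - above_half.min()
print(f"PRF FWHM ~ {fwhm_arcsec:.2f} arcsec ~ {fwhm_arcsec / PIXEL_ARCSEC:.1f} pixels")
```

Under those assumptions the PRF comes out a bit over two pixels wide, which is where the rule of thumb below comes from.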
This means that no feature smaller than about 2x2 pixels is likely to be real, and you probably want something rather larger than that if you are serious. It also depends on redundancy: is the image in question a single exposure, or a mosaic of many strongly overlapping images? In other words, how many individual images contribute input to any given pixel in the final image? If the answer is 1 or 2, the single-pixel level is not trustworthy. If the answer is more, it might be, but that depends critically on the image restoration technique.
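As a rough illustration of why single-pixel "detail" is suspect, the sketch below (again using the assumed 1.2-arcsecond Gaussian PSF and 0.6-arcsecond pixels, not any measured AIA response) puts a true point source through the blur and bins it onto detector pixels: the light always lands on a block of neighboring pixels, never on just one.

```python
# Rough 2-D illustration of the 2x2-pixel rule of thumb, using the assumed
# numbers above (1.2-arcsec Gaussian PSF, 0.6-arcsec pixels), not a measured
# AIA point response function.
import numpy as np
from scipy.ndimage import gaussian_filter

PIXEL_ARCSEC = 0.6
PSF_FWHM_ARCSEC = 1.2
OVERSAMPLE = 20                      # fine subpixels per detector pixel
N_PIX = 11                           # small detector patch

# One delta-function point source at the center of a finely sampled scene
n_fine = N_PIX * OVERSAMPLE
scene = np.zeros((n_fine, n_fine))
scene[n_fine // 2, n_fine // 2] = 1.0

# Blur by the assumed Gaussian PSF (sigma converted to fine-grid units)
sigma_fine = (PSF_FWHM_ARCSEC / PIXEL_ARCSEC) * OVERSAMPLE / (2 * np.sqrt(2 * np.log(2)))
blurred = gaussian_filter(scene, sigma_fine)

# Bin the fine grid down to detector pixels and normalize to the peak pixel
image = blurred.reshape(N_PIX, OVERSAMPLE, N_PIX, OVERSAMPLE).sum(axis=(1, 3))
image /= image.max()

# A genuine point source lights up a block of pixels, never just one
print("pixels above 10% of the peak:", int((image > 0.1).sum()))
```

With these assumptions roughly a 3x3 block of pixels carries appreciable flux from a single point source, so anything confined to one pixel is more plausibly noise or a detector artifact than real solar structure.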
The lesson here is to never blindly trust any image at the level of single pixels.