Elaborate, please. Did you invent this test yourself? Are you aware of standard color controls in the photography industry? Do they use only RGB?
...with the two acting in concert to demonstrate without a doubt the presence of ENCRUSTED material (body image as well as blood!).
Are you looking at the image when applying these algorithms? Or are you using the histogram(s) as a statistical model to guide your application? It's really a very simple question, and your growing reluctance to address it is probably leading your readers here to infer an answer that is not favorable to your claim.
Answer - I was using only the first 3 of the available settings, having overlooked two that were lurking behind a somewhat inconspicuous "More" tab.
Can you describe the algorithms used by any of these "settings"? Can you describe in exactly what way they will tend to reveal information that would support a finding that the Shroud is "encrusted" and not produce a false positive?
...again, checked against RGB composition to ensure the changes are minimal in colour terms
What other color models did you use in your attempt to validate your findings? Are there color models besides RGB that would be more appropriate to your study?
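To make the point concrete: device-dependent RGB values are a poor yardstick for claims like "the changes are minimal in colour terms." A perceptual space such as CIELAB is built so that distance corresponds roughly to how different two colours actually look, and the CIE 1976 colour difference (ΔE) turns "minimal change" into a measurable quantity. A minimal, self-contained sketch of the standard sRGB-to-CIELAB conversion (this is an illustration of one alternative model, not a claim about what the software in question does):

```python
import math

def srgb_to_lab(r, g, b):
    """Convert 8-bit sRGB to CIELAB (D65 white point), per the standard formulas."""
    def lin(c):  # undo the sRGB gamma encoding
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = lin(r), lin(g), lin(b)
    # linear RGB -> CIE XYZ (sRGB primaries, D65 illuminant)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    def f(t):  # CIE nonlinearity
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def delta_e76(c1, c2):
    """CIE 1976 colour difference: Euclidean distance in Lab space."""
    return math.dist(srgb_to_lab(*c1), srgb_to_lab(*c2))
```

A ΔE near 1 is roughly a just-noticeable difference, so "the changes are minimal in colour terms" becomes a checkable claim rather than an impression.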
Thank you for the feedback (well, some of it).
The criticism of your method is valid for the reasons given. You seem reluctant to address the reasons. You just seem to be chafing at the fact that you're being criticized, even going so far as to insinuate that questioning your methods amounts to a personal attack or invasion of privacy. While it is sometimes disappointing to face criticism for work in which you've invested a lot of time, it is necessary to the process. The strength of your findings lies not in how much time you spent arriving at them, but in how well they weather the worst of valid criticism.
as a scientific (i.e. trial and error) learning curve...
You spend a lot of time trying to tell people what science is. Specifically you seem to spend a lot of time describing your approach and then just slapping the label "scientific" on it. That puts the cart before the horse. If you are going to style your results as scientifically sound, then you bear the burden to prove you have conformed to the appropriate methods and understanding. If you don't know what those are, well then you have more homework to do.
Your ongoing desire to lecture the readership about how to practice science once again makes it ambiguous whether you're claiming expertise. It's incongruous to approach your topic from the "trial and error" point of view and (as you do below) beg forgiveness for incidental errors or omissions in method, while at the same time rebutting criticism by trying to instruct the critics on what is appropriate practice in science and insisting that you are following it. While expertise exists along a continuum, it would be wise for you to state unequivocally where along that continuum you want your presentation to fall.
...reported warts an' all online in real time (first time ever?)
Responsible scientists don't draw conclusions or publish findings until they are confident the results are sound enough to be trusted by a lay public. That's not to say partial results aren't shared among peers for comment and review. However, if that's what you're doing here and if you're thus going to admit your findings have "warts," then you can't have an emotional response every time someone notices a wart. That makes it seem like you're less interested in determining how the image on the Shroud was produced than in being praised as a clever and skilled scientist.
And you don't get to assume all warts are small. You don't get to assume your approach is fundamentally sound and could err only in a detail here or there. You have to consider the possibility that your image analysis techniques have no power to discover what you want to find out.
So if the methodology looks a bit tentative at times, that is indeed the case...
Then why do you seem defensive about questions directed at your methodology? Validation of method is essentially what the process of peer review in science hopes to accomplish, and it's a strong pillar of scientific practice. You don't get to be coy about your methods and simultaneously bristle when your approach is then characterized as amateur.
the essence of science, in my view, is always to view each new promising but non-validated tool with a degree of scepticism initially, indeed to break off from the main project and make the tool itself a project within a project, starting as I said earlier with a blank sheet of paper, and putting the tool through its paces with known reference systems.
Yes, you have the responsibility to validate your methods before you use them and before you draw conclusions. The easiest and best way to do that is to understand the tools that already exist and the sciences that created them. Making up tools and techniques as you go, without due regard to the state of the art, is a hallmark of pseudoscience. If it's important to you to avoid being lumped in with the pseudoscientists, then you need to be more forthright and less defensive about the review you're receiving here.
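The "tool as a project within a project" idea can be made concrete. Here is a hedged sketch (the function names and the toy percentile stretch are hypothetical, since the actual algorithms behind the software's "settings" are undisclosed): run the enhancement on synthetic references where the right answer is known in advance, including a control designed to expose false positives.

```python
def contrast_stretch(pixels, lo_pct=5, hi_pct=95):
    """Toy stand-in for an enhancement 'setting': linear percentile
    stretch of pixel intensities to the full 0-255 range."""
    values = sorted(pixels)
    lo = values[int(len(values) * lo_pct / 100)]
    hi = values[int(len(values) * hi_pct / 100) - 1]
    span = max(hi - lo, 1e-9)
    return [min(255.0, max(0.0, (p - lo) / span * 255)) for p in pixels]

def naive_detector(pixels, threshold=200):
    """Declares a 'find' if any stretched pixel is very bright."""
    return any(p > threshold for p in pixels)

# Reference 1: background plus a genuinely brighter patch -> should be found.
with_feature = [100.0] * 90 + [130.0] * 10
assert naive_detector(contrast_stretch(with_feature))  # true positive

# Reference 2: featureless background with mild noise -> should NOT be found.
# Yet the stretch amplifies the noise to full brightness and the detector
# fires anyway: a false positive the tool itself manufactured.
featureless = [100.0 + (i % 7) for i in range(100)]
assert naive_detector(contrast_stretch(featureless))  # false positive!
```

The second reference is the one that matters: a tool that "reveals" structure in data known to contain none has no power to support a finding of encrustation, no matter how striking its output looks on the real image.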