You raised an objection to my advocacy of alternate formulations in physics as a way to resolve anomalies (Q's were offered as one possible reformulation): since alternate representations produce "literally the same thing in different notation", recommending the exploration of alternate formulations is unwarranted.
I agreed with your claim, but pointed out that, while true, it missed the point I was advocating: the calculations we are likely or able to perform depend greatly on the notation we choose, as your own examples illustrated.
You missed one detail: calculations look different from one another only until you distill them down to raw group theory, and then you're at the core. Binary arithmetic looks different from decimal arithmetic, but both are representations of the group of integers under addition. Maybe "thinking about rotations of a cube" inspires different calculations than "thinking about symmetries of a tetrahedron", but "thinking about the finite group Sym(4)" is, inherently, doing both. (Note that the rotation group of the cube, which permutes its four space diagonals, is isomorphic to Sym(4); adding reflections doubles the group to order 48.)
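The binary/decimal point can be made concrete with a minimal Python sketch (the numerals and variable names here are illustrative, not from the discussion): the numeral strings differ, but they denote the same integer, and the group operation is notation-independent.

```python
# Two numeral systems, one group: '101011' in binary and '43' in decimal
# are different notations for the same element of the integers under addition.
assert int('101011', 2) == int('43', 10) == 43

# Addition "in binary" and addition "in decimal" are the same group
# operation; only the rendering of the result differs.
x, y = int('1101', 2), int('1010', 2)   # the integers 13 and 10
total = x + y
assert total == 23                      # decimal rendering of the sum
assert format(total, 'b') == '10111'    # the very same sum, rendered in binary
```

The representation changes what the digits look like; it changes nothing about which element of the group you are holding.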
History shows we are simply unlikely to try or look for things that seem implausible under older formulations.
That does not mean that "Hey guys! Try new formulations!" is a productive piece of advice, especially since today's physicists have been specifically trained to seek new formulations all the time.
My question: do we agree that if we had built decimal-logic computers, or had forced people to do computer-related arithmetic in binary, many results we currently consider trivial would not exist for us, because they would have been expressed in alternate representations?
No, I don't agree. I mean, if we did arithmetic in binary, then elementary schools would not teach the "all multiples of nine have a digit sum which is also a multiple of nine" fact. (Although this sort of fact is known to mathematicians in, guess what, general formulations that are true in all bases.) If we did arithmetic in binary, schoolkids would make different mistakes than they make now. If we had forced computers to use decimal, no one would have invented the fast inverse square root, but someone would probably have invented equally useful tricks with decimal-digit arithmetic.
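Both examples above can be spelled out. A short Python sketch (the names `digit_sum` and `fast_inv_sqrt` are my own, illustrative choices): first, the digit-sum fact in its base-general form, where in base b every multiple of b - 1 has a digit sum divisible by b - 1; second, the well-known fast-inverse-square-root bit trick, which only exists because floats are stored in binary.

```python
import struct

def digit_sum(n, base):
    """Sum of the digits of n when written in the given base."""
    s = 0
    while n:
        s += n % base
        n //= base
    return s

# "Casting out nines" in its base-general form: in base b, every multiple
# of (b - 1) has a digit sum that is also a multiple of (b - 1).
assert all(digit_sum(9 * k, 10) % 9 == 0 for k in range(1, 500))
assert all(digit_sum(7 * k, 8) % 7 == 0 for k in range(1, 500))   # octal analogue

def fast_inv_sqrt(x):
    """Approximate 1/sqrt(x) via the classic binary bit trick."""
    i = struct.unpack('<I', struct.pack('<f', x))[0]   # float bits as uint32
    i = 0x5f3759df - (i >> 1)                          # the famous magic constant
    y = struct.unpack('<f', struct.pack('<I', i))[0]   # bits back to float
    return y * (1.5 - 0.5 * x * y * y)                 # one Newton-Raphson step
```

The digit-sum fact survives any change of base in generalized form; the bit trick, by contrast, is parasitic on the binary representation itself, which is exactly the asymmetry under dispute.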
If so, and if history is any guide, it suggests that reconfiguring the categories of a science (via alternate representations) is a key characteristic of the kind of revolutionary advance that is generally accepted as needed in physics.
If there exist transformative advances in any STEM discipline which do not feature such recategorization, I would be interested to learn of them.
The problem with your management scheme: knowing you need a recategorization is easy. Getting the right one is hard. Physicists already know that present-day theories are probably a limited version---a low-energy limit, or a subgroup, or a 4D projection, or a set of emergent statistical properties---of a theory we haven't seen. They spend all day attempting to think about current theories in different ways.
On closer review, and based on information-systems project-management criteria, it seems the most promising reformulation in many years is the amplituhedron.
Hey, look! Something physicists discovered, using physicists' methods and physicists' motivations over the course of a decade, gradually converged on something that was widely recognized, by physicists, using physicists' own version of "project management", to be important and interesting.
And you have read a popular article about it---one in which the journalist presumably interviews a physicist saying "this is a very important reformulation, may point to new truths about spacetime, and will be pursued". And now you look back and declare, retrospectively, that your IT-management principles would have labeled this as something to be pursued?
I've asked this question repeatedly: what do your management techniques do differently from what physicists already do? Because this discovery qualifies as "the sort of thing physicists already do", and it looks like that IT-management-free method has discovered good things. Do you think your technique would have done even better? I'm more than a little skeptical of that.
In fact, I might hazard a guess that your management scheme would have downgraded the priority of the study of Grassmannians, which (until the excitement around 2012) would have appeared, to you, to be a boring pursuit of "routine" science in the boring, inside-the-box, non-revolutionary business of Yang-Mills theory. If, in 2004, you had been looking for possibilities for a paradigm shift in the study of spacetime, you would have looked away from "Coplanarity In Twistor Space Of N=4 Next-To-MHV One-Loop Amplitude Coefficients" by Britto, Cachazo, and Feng ( http://arxiv.org/abs/hep-th/0411107 ). And what if the IT-based analysis had diverted funding away from that and towards (picking from the 400+ hep-th uploads on that day's arXiv) "The Exact Geometry of a Kerr-Taub-NUT Solution of String Theory", or "Dark Entropy: Holographic Cosmic Acceleration", or "Chromogravity - An Effective Diff(4,R) Gauge for the IR region of QCD"?