I was highlighting JayUtah's claim that you must at the first digit state the unit...
Here's the relevant paragraph from the post to which you refer, which clearly states that I do not claim the canonical base unit must be specified in order to establish what extent is being measured.
Context can never redefine what the base unit is for some particular extent. Otherwise the system collapses. Context properly established, we sometimes omit the base unit if its value is zero and we don't therefore need the abbreviation to further expand the context. We don't need to properly title our musical composition 0h 4′ 33″, because we establish by other means that the context is time duration. What is meant thereafter by the primes is unassailably unambiguous.
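To put that in concrete terms, here is a minimal sketch (purely illustrative; the function name and patterns are mine, not any standard library) of how a parser could read time written in primes notation once context has fixed the hour as the base unit. Omitting the zero-valued base unit loses nothing, because ′ and ″ keep their fixed meanings.

```python
import re

def parse_time_primes(text: str) -> float:
    """Return total seconds for a duration written in primes notation.

    The base unit (hours, "h") may be omitted when its value is zero;
    the primes never change meaning: a single prime is minutes (the first
    cut), a double prime is seconds (the second cut).
    """
    def grab(pattern: str) -> float:
        m = re.search(pattern, text)
        return float(m.group(1)) if m else 0.0

    hours = grab(r"(\d+(?:\.\d+)?)\s*h")
    minutes = grab(r"(\d+(?:\.\d+)?)\s*[′']")
    seconds = grab(r'(\d+(?:\.\d+)?)\s*[″"]')
    return hours * 3600 + minutes * 60 + seconds

print(parse_time_primes("0h 4′ 33″"))  # 273.0
print(parse_time_primes("4′ 33″"))     # 273.0 -- dropping the zero hours changes nothing
```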
And let's revisit the gaffe that has amused so many.
Thank you for confirming that it would not be correct to write 6ft, 2' ⅛ ", as I was pointing out.
But let's restore the full context. You wrote
Nota Bene: when using the notation for feet and inches, the feet immediate become first prime. There is no need to state, say, 6ft, 2' ⅛ ". It is just 6'2 ⅛".
Your objection is not that 6 ft 2′ ⅛″ is nonsensical--which it is--but that it's over-specified (yet apparently otherwise acceptable). And that opinion is driven by the apparent belief that you can willy-nilly restate the canonical base units for some extent and thereby change what the first and subsequent cuts mean.
You emphatically cannot.
Your ongoing duplicity on this point is illustrated by statements such as the following.
Vixen said:
Only if you want to make clear yards are being brought into it. Then that works for me: 3' 2" 11"' as long as we are all clear the first is in yards. Saves an awful lot of writing things out in full.
Establishing that "the first is in yards" is unnecessary. The base unit for length is yards--never anything else. Hence ′ is always feet, the first cut; ″ is always inches, the second cut. Yes, we habitually keep such measurements unnormalized, and therefore habitually omit naming yards as the nominal unit. But the immutability of the yard as the canonical base unit of length is the authority by which ′ and ″ keep their immutable meanings as feet and inches. We seem to properly agree on this point.
Hours, minutes, and seconds work the same way, despite your self-serving equivocation.
It's unclear whether you consider 6 ft 2′ ⅛″ to be improper because it nominates a different base unit or because it employs redundant indicators: both names and primes for feet. Both are mistakes, but unraveling your gyrations proves difficult. If you say 6 ft, you're not using primes notation for length. Why? Because it uses nominal abbreviations for what, in primes notation, would only properly be identified by a prime. Neither 6 ft 2′ nor 6 ft 2″ is meaningful, because the named unit is not the right one.
If you're using primes notation and if you include a nominal unit, then the named unit must be the canonical base unit. Otherwise the notation is inconsistent and therefore incorrect. That is not a requirement to use a nominal/canonical base unit in all cases.
As we've belabored, nothing illustrates this better than the preference of feet to yards. The meanings of the primes don't change, even though we omit the nominal units and denormalize the quantity of feet.
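A quick sketch makes the point (the function names and the sample number are mine, chosen only for illustration): whether or not the yards are written, and whether or not the feet are normalized against them, ′ stays feet and ″ stays inches.

```python
def to_feet_inches(total_inches: float) -> str:
    """Format a length with the feet left denormalized (no yards written)."""
    feet, inches = divmod(total_inches, 12)
    return f"{int(feet)}′ {inches:g}″"

def to_yards_feet_inches(total_inches: float) -> str:
    """Format the same length normalized against the canonical base unit, yards."""
    feet, inches = divmod(total_inches, 12)
    yards, feet = divmod(int(feet), 3)
    return f"{yards} yd {feet}′ {inches:g}″"

height = 74.125  # 6 ft 2 1/8 in, given in inches
print(to_feet_inches(height))        # 6′ 2.125″
print(to_yards_feet_inches(height))  # 2 yd 0′ 2.125″ -- same length, same primes
```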
But then you say
Vixen said:
Like[]wise if you state one day to be first base then 1' 20" easily translates into one day 20 hours, if you want to write shorthand for the rest of the piece.
You cannot do this. You can no more renominate the canonical base unit for time (hours) than you can renominate the base unit for length (yards) or angles (degrees). ′ is an unambiguous measurement of length because it's always the first cut of the canonical base unit, even when the base unit is omitted. ′ is an unambiguous measurement of an angle because the canonical base unit never changes, and therefore the meaning of its first cut never changes. ′ is an unambiguous measurement of time because its canonical base unit (hours) never changes.
Even still you can't get the terminology right. There is a "base unit" and there is a "first cut." There is no such thing as a "first base," except in American baseball and canoodling. The base unit of time is the hour, abbreviated nominally as h. This never changes. The first cut of time is the minute, defined as 1/60 of an hour and noted immutably in primes notation by a single prime ′. This, too, does not change. The second cut of an hour is the second of time. That's literally why it's called "seconds."
By the same logic that lets us omit yards and write feet and inches solely with primes, we can omit hours and write minutes and seconds unambiguously using primes.
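Again, a minimal sketch (hypothetical, offered only to illustrate the rule): with the hour fixed as the base unit, a formatter never has to guess what ′ and ″ mean, and thirty-five minutes can only ever come out as 35′, never 35″.

```python
def format_duration(total_seconds: int) -> str:
    """Format a duration in primes notation; drop the hours when they are zero."""
    hours, rest = divmod(total_seconds, 3600)
    minutes, seconds = divmod(rest, 60)
    prefix = f"{hours}h " if hours else ""
    return f"{prefix}{minutes}′ {seconds}″"

print(format_duration(35 * 60))             # 35′ 0″  -- thirty-five minutes is 35′
print(format_duration(35))                  # 0′ 35″  -- 35″ can only mean thirty-five seconds
print(format_duration(4 * 3600 + 35 * 60))  # 4h 35′ 0″
```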
In an attempt to justify your original error, you insinuate that we can simply renominate days as the base unit for time. You suggest that doing this redefines the first cut to be hours and the second cut, notated ″, to be minutes. Therefore your usage 35″ should have been properly understood as "thirty-five minutes."
No. You cannot do this.
Not only did you give no indication whatsoever that you wanted a recontextualization to make days the new base unit of time, but your own explanation of feet and inches betrays that you know you cannot renominate a base unit without introducing the very ambiguity that the primes notation was invented to avoid. You concede that feet and inches are immutably and unambiguously identified using ′ and ″, and you even give the correct reason why. But then you abandon the whole system and claim that none of those rules should be in force while you tacitly and arbitrarily redefine what the symbols mean when measuring time instead of distance or angles.
Why? The reason is obvious. You wrote 35″ when you meant "thirty-five minutes," and refuse to concede that you didn't know that was the wrong notation.
Your first ruse was to insist that this is a perfectly ordinary convention--a bluff. Then when you compounded your error by writing 0.35″, at first you didn't even see the problem. Then when it was spelled out to you in excruciating detail, you deployed the second nonsensical explanation: that the 0. portion was somehow a cobbled-up way to express hours, and therefore to disambiguate the primes. When that fell flat you said
It is something I have always done. I had no idea you and others had never heard of it.
That's a backhanded concession. We went from an alternative convention to simply "something [you] have always done," irrespective of what others might have done. You insinuate that a different education produces a different convention that explains your usage, but your fellow Britons have contradicted you. You even insinuate that it's our fault we've never heard of this singular, confusing exception to the rules of a system that has been in widespread use for hundreds of years.
Here's how we know this is bollocks.
Do you remember your idiom of the FX prefix? Remember how you claimed it was common notation in screenplays? It isn't. After pages and pages of refutation, you finally fell back to the notion that FX was just something you personally used among your girlfriends. And here we are again.
After first claiming your usage was proper according to some standard or convention, you've fallen back to the irrefutable, "Well, it's just what I use." Your first inclination whenever any error is pointed out to you is to double or triple down and insist that you are still right even when the evidence of your error is plain. Your first inclination is to lie. Only much later, if ever, do you come clean and contradict your first lies by admitting that your usage is just your personal habit. I suppose in your mind that equates to something like, "I can't be wrong if it's something I've always done and was never contradicted."
I assure you a Vixen-only "convention" can most certainly be an error. And it very much is in this case. What you propose to do is contradicted by the standard you say you are conforming to. You are simply wrong, full stop.
Why are we so focused on what are, by any metric, insignificant errors? Precisely because they are insignificant. You lose little if any face to say, "Oh, I just thought that's what FX meant," or "Oops, I wrote ″ when I meant ′," or "I misspoke when I said 'perpendicular.'" People will see that you're amenable to contrary facts, and that you will adjust your beliefs accordingly. Further, even very knowledgeable people make silly, inattentive mistakes. They are corrected, they accept it graciously, and their credibility hardly suffers. Your insufferable insistence on a mantle of infallibility is pathological.
What we learn about you in this thread is that under no circumstances will you retract a statement that is shown to be wrong. Instead, you will go to extreme lengths and tell all kinds of lies in order to maintain the illusion that you are still somehow right. And yes, we can tell that you do this. If you legitimately believe your own lies, then you have serious issues we can't address in this forum. Whichever it is, it is clear to all of us that maintaining the illusion that you're infallible is more important to you than actually having the correct facts and arriving at well-reasoned conclusions.
This means you're neither technically nor morally qualified to question other people's expert work. You cannot be trusted to respect facts, or the people who know them better than you do. Hence until you can show some semblance of intellectual honesty, you're more likely to be mocked at this forum than debated.