So we can finally dispense with the copy-paste straw man. You were asked at the very beginning to post the original conversation in context. Thank you for finally doing so.
And I think you mean screen shot. A screen print is how you make funny T-shirts.
I'll do so when you address all the evidence, not just what might be responsible for re-encoding text. How the typographical quotes got there is less important than why they remained, and what they say about the person who wrote the message. If we give you the benefit of the doubt and accept the possibility some unknown program or setting somewhere along the line re-encoded the text to produce the typographical quotes, you still haven't addressed the more pressing parts of the argument.
You purport this to be a conversation with someone you want us to accept as a highly qualified academic mathematician and physicist, and therefore an expert on mathematical and physics notation. The conversation includes a statement from him that you expect us to receive as authoritative and which, although it's a bit ambiguously worded, you insinuate proves your claim that ″ is acceptable as either minutes or seconds of time.
A mathematician would not refer to the primes as "apostrophes," because they aren't. You already told him you were looking for information on "primes," so there's no reason he wouldn't use the correct term. And the statement, "Primes uses apostrophes...," is curiously phrased. A more mathsy way of saying it would be, "The notation uses primes..." Prime is the name of the symbol, not the name of the notation.
A mathematician would not cobble up a triple-prime out of a ' and a ". We frequently approximate ′ with ' and ″ with " as long as those are the only two in use, such as for angle measurements or feet and inches. When we're working in a more general context, where we'll need triple-primes and greater, the convention for decades has been to use only single primes as approximations for all of them: ' for ′, '' (two single-quotes) for ″, ''' for ‴, and so forth.
Why? First, because mixing symbols almost always looks wrong on screen or in print. Your "mathematician" seems to acknowledge this by telling you to ignore the spacing. But that's the tell. This is not a new problem. We've been dealing with writing math notation in ASCII since the early 1980s. Spacing for three or more primes is a solved problem: we just use single-quotes if we need the full gamut of primes.
Second, you can plug that ASCII/ISO-8859 approximation (e.g., 35''', using only single-quotes) directly into programs like LaTeX and it gives you the proper typesetting. Someone who practiced mathematics and physics, either professionally or academically, starting in the 1980s and extending until the first rudimentary support for equations in word processors, would have written many papers requiring considerable mathematical notation. Until comparatively recently, programs like LaTeX were the only option. This was my bread and butter, both in academia and in professional practice.
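A minimal LaTeX fragment makes the point (the variable x is my illustration, not anything from the conversation): in math mode, each plain single-quote is shorthand for a superscript \prime, so runs of single-quotes typeset as proper prime marks with correct spacing.

```latex
% Minimal sketch: ASCII single-quotes typeset as real primes.
\documentclass{article}
\begin{document}
% Each ' in math mode is shorthand for ^{\prime}:
$x'$ is the same as $x^{\prime}$,
$x''$ is the same as $x^{\prime\prime}$, and
$x'''$ is the same as $x^{\prime\prime\prime}$.
\end{document}
```

There is no need to mix ' and " to build a triple-prime; the typesetter handles any count of single-quotes.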
And during that same period, we all had to deal with communicating mathematical notation amongst ourselves using primitive text-only methods. We still do. There were, and still are, conventions for it, such as writing usec when we can't use the proper SI μs, and writing ''' (three single-quotes) when we can't use ‴. Combinations of ' and " in a single symbol were never used.
So to bring this back to the point: you're telling me that a person we're supposed to respect as a highly qualified mathematician and physicist, and an authority on notation, is going to choose the one wrong way out of the several methods he would have had to employ in his career, and then expressly acknowledge the reason why it's the wrong way! No, I'm not buying it.
And the "smart" editors only made things worse. Typing ' gets you a typographical single-quote (either open or close), but depending on what the editor thinks you're doing, it might give you the wrong one. In some typefaces the open-single-quote glyph looks close enough to a forward prime, but you only have to get bitten once by the glyph resembling a reverse prime in a different typeface to stop letting it happen. And when you see “” after typing two counts of ", you know that's wrong. It's not just that it doesn't look good or isn't the "proper" symbol, ⁗. It's that “ is the wrong symbol, one that in this case has the opposite of the desired meaning. And mixing them is mathematically nonsensical. You only need to deal with difficult (human) editors and typesetters once to eschew auto-correct altogether, and to affirmatively correct it when it nevertheless happens.
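If there's any doubt that these are all distinct characters with distinct meanings, a few lines of Python (my illustration, not anything from the conversation) will print their official Unicode names:

```python
import unicodedata

# The straight ASCII marks, the real primes, and the typographical
# quotes are all separate code points with separate official names.
for ch in ("'", "\u2032", '"', "\u2033", "\u2034", "\u2057",
           "\u201C", "\u201D"):
    print(f"U+{ord(ch):04X}  {ch}  {unicodedata.name(ch)}")
# APOSTROPHE, PRIME, QUOTATION MARK, DOUBLE PRIME, TRIPLE PRIME,
# QUADRUPLE PRIME, LEFT/RIGHT DOUBLE QUOTATION MARK
```

An auto-correcting editor that rewrites " as “ or ” is silently replacing the stand-in for DOUBLE PRIME with a quotation mark, which is exactly the substitution a notation expert would catch.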
So again, you're telling me that a purported expert on mathematics and physics notation is going to let those marks get rewritten incorrectly, in an explanation of how to use the notation properly!? Just no. I expect someone I'm being asked to qualify as an expert in mathematical notation to understand that “ is the opposite of ”, especially as these marks are used to indicate multiples and subdivisions, and not to allow such confusing mis-notation to go out as an explanation.
"It uses 'apostrophes'..."
Here's the really strange part. You're proffering this guy as a fully qualified mathematician and physicist, and someone you want us to accept as a recognized or recognizable expert on the proper usage of primes to indicate subdivisions of time, because the proper and standard use of the notation would at this point be second nature to him. You didn't tell him why you wanted the information you were asking for, but his first and only thought is to give you exactly the answer you needed. Funny how this highly qualified expert first goes to the backwoods, vernacular usage you say wasn't okay for homework. How would he even know about it, if you admit it was just your informal variant? He doesn't default to the usage everyone else accepts as the inviolable standard: ′ for minutes and ″ for seconds. Why not explain the universally accepted standard first, and then go into whatever variant or vernacular usages you might want to ask about?
Anyone who's read the standard and done his history homework on this notation knows of the controversy over what º should mean (degrees or hours or both or something else entirely), and whether h for hours and d for degrees might have been better in both cases. And nowhere do we read that the primes can just shift left or right as needed, or that ′ can sometimes mean hours, as you both now say it can.
How convenient that your 'expert' defaults to the one nonstandard, unofficial, vernacular usage you need authority for, without the slightest bit of prompting from you, and for which there is not one shred of documentary evidence?