So 4 pages of bluster and this is what you come up with? So the investigators adjusted the parameters of the model so that it accurately models what happened. What is your issue? The more severe case was the one that actually reflects the video and photographic evidence.
Frankly, I don't see your comments as a valid criticism of the NIST report. The severe case was modeled because it was the one that most accurately reflected the observed damage. Why would NIST choose to model less severe cases that patently disagreed with things like the presence of smoke and fire and broken windows?
Your criticism stems from a blatant misrepresentation of NIST's methodology. You, like Griffin, think that NIST changed the model to fit the hypothesis. Rather, the NIST report clearly states that the model was changed to fit the data, giving precedence to observable data, not to the model.
Bluster? LOL. You folks have that game down pat. I wouldn't think of spoiling your post frenzies.
I can't wait for some of your future thread topics:
What Dylan had for breakfast today.
What Dylan had for lunch today.
What Alex Jones had for breakfast today.
What Alex Jones had for lunch today.
Yadadada
Back to NIST:
I have no quibble that the extreme case model was the one that NIST required to generate their ultimate conclusion.
Since the collapse of the twin towers was such an unlikely event under the circumstances that existed, it stands to reason that extreme data would be required to achieve collapse initiation.
What I take issue with in particular is that when the extreme data set failed to match the video evidence of the collapse, they further adjusted the input data until they persuaded the model to respond more closely to the observed data.
They never said how much they adjusted the inputs, only qualifying this activity as:
"To the extent that the simulations deviated from the photographic evidence or eyewitness reports [e.g., complete collapse occurred], the investigators adjusted the input, but only within the range of physical reality. For instance, ...the pulling forces on the perimeter columns by the sagging floors were adjusted... (NIST, 2005, p. 142)
Physical reality? Well how definitive is that?
In a paper, UK fire-engineering experts Lane and Lamont stated:
The basis of NIST’s collapse theory is… column behaviour in fire… However, we believe that a considerable difference in downward displacement between the [47] core and [240] perimeter columns, much greater than the 300 mm proposed, is required for the collapse theory to hold true… [Our] lower reliance on passive fire protection is in contrast to the NIST work where the amount of fire protection on the truss elements is believed to be a significant factor in defining the time to collapse… The [proposed effect] is swamped by thermal expansion… Thermal expansion and the response of the whole frame to this effect has NOT been described as yet [by NIST]. (Lane and Lamont, 2005.)
That models of WTC trusses subjected to fires at Underwriters Laboratories (UL) did NOT fail is also admitted in the final NIST report:
NIST contracted with Underwriters Laboratories, Inc. to conduct tests to obtain information on the fire endurance of trusses like those in the WTC towers.... All four test specimens sustained the maximum design load for approximately 2 hours without collapsing... The Investigation Team was cautious about using these results directly in the formulation of collapse hypotheses. In addition to the scaling issues raised by the test results, the fires in the towers on September 11, and the resulting exposure of the floor systems, were substantially different from the conditions in the test furnaces.
Nonetheless, the [empirical test] results established that this type of assembly was capable of sustaining a large gravity load, without collapsing, for a substantial period of time relative to the duration of the fires in any given location on September 11. (NIST, 2005, p. 141)
An article in the journal New Civil Engineer (NCE) lends support to concerns about the NIST analysis of the WTC collapses. It states:
World Trade Center disaster investigators [at NIST] are refusing to show computer visualizations of the collapse of the Twin Towers despite calls from leading structural and fire engineers, NCE has learned.
Visualisations of collapse mechanisms are routinely used to validate the type of finite element analysis model used by the [NIST] investigators. The collapse mechanism and the role played by the hat truss at the top of the tower has been the focus of debate since the US National Institute of Standards & Technology (NIST) published its findings....
University of Manchester [U.K.] professor of structural engineering Colin Bailey said there was a lot to be gained from visualising the structural response. “NIST should really show the visualisations; otherwise the opportunity to correlate them back to the video evidence and identify any errors in the modeling will be lost,” he said....
A leading US structural engineer said NIST had obviously devoted enormous resources to the development of the impact and fire models. “By comparison the global structural model is not as sophisticated,” he said. “The software used [by NIST] has been pushed to new limits, and there have been a lot of simplifications, extrapolations and judgment calls.” (Parker, 2005; emphasis added.)
MM