Merged Apollo "hoax" discussion - continuation thread

Kyoon said:
... Talk of holograms, technological feasibility and the "moon landing":

The use of computing technology to get there is frequently alluded to, but exactly kind of hardware and software was required?
A troubling question - just think about the exponential progress in computer software and in particular computer hardware since Neil Armstrong and Buzz Aldrin allegedly landed on the moon, in 1969. Something anyone can understand.
The answer is not easy to find. Actually NASA apparently doesn't care about the key piece of evidence - the software source code.
For anyone that believes the "greatest achievement of Mankind" to be a fake, this is not surprising.
Let's review the answer of NASA. If you have a minimal knowledge of hardware and software and afterwards you still believe in the moon landing hoax, then you are a victim of the BIG LIE technique.

As a software developer for nearly 20 years who has been programming for nearly 30, in nearly a dozen languages from assembly code to C++, and with a good knowledge of the Apollo Guidance Computer, let me answer this.

The use of computing technology to get there is frequently alluded to, but exactly [what] kind of hardware and software was required?

It's not alluded to at all, it's well defined.

Rather than looking at the equipment itself, it's easier to look at the job required. What did the system have to do? In the end the answer is "Not a lot." When we travel in space, we follow a trajectory determined by the gravity of nearby planetary and stellar bodies along with the velocity of our spacecraft. We generally call these trajectories "orbits". They have an elliptical shape and can be derived mathematically with considerable accuracy; so much so, in fact, that people have been doing it for hundreds of years with planets and moons.

Knowing this, to get from Point A to Point B in space we need to determine one or more intersecting elliptical orbits that between them contain both points. If we have multiple ellipses, then we need to know how to transition from one to the other. Again, this is relatively easy from a mathematical point of view: all we need to do is change our velocity to that required by the new orbit at the point where the two orbits intersect. The difference between the two velocities is called the Δv.
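
To put a number on how simple that mathematics is, here is a minimal sketch in Python (my own illustration, nothing to do with Apollo's actual software) of the simplest two-burn case, a Hohmann transfer between two circular orbits, using the vis-viva equation. The function name and the example radii are mine.

```python
# A minimal sketch (not flight code): the delta-v for a Hohmann transfer
# between two coplanar circular orbits, via the vis-viva equation
#   v = sqrt(mu * (2/r - 1/a))
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def hohmann_delta_v(r1: float, r2: float, mu: float = MU_EARTH):
    """Return (dv1, dv2): the two burns to go from a circular orbit
    of radius r1 to one of radius r2 (both in metres)."""
    a_transfer = (r1 + r2) / 2.0                     # semi-major axis of the transfer ellipse
    v_circ1 = math.sqrt(mu / r1)                     # speed on the initial circular orbit
    v_circ2 = math.sqrt(mu / r2)                     # speed on the final circular orbit
    v_peri = math.sqrt(mu * (2 / r1 - 1 / a_transfer))  # transfer-ellipse speed at r1
    v_apo = math.sqrt(mu * (2 / r2 - 1 / a_transfer))   # transfer-ellipse speed at r2
    return v_peri - v_circ1, v_circ2 - v_apo

# Example with illustrative radii: ~300 km LEO up to geostationary altitude
dv1, dv2 = hohmann_delta_v(6.678e6, 4.2164e7)
print(f"burn 1: {dv1:.0f} m/s, burn 2: {dv2:.0f} m/s")
```

The point is how little arithmetic is involved: a couple of square roots and a subtraction per burn, well within reach of 1960s machines, let alone the ground computers that did most of this work.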

Now remember that velocity is a vector: it has both a direction and a magnitude. For our spaceship to change from one orbit to another, we need to know our current velocity (magnitude and direction) and our required Δv (the change in magnitude and direction). Once we know our Δv, we can determine two more things.

The change in the magnitude of our velocity will be made with the thrust of our main engines. We know this is done by applying a force for a period of time, and that the required time depends on our desired change in magnitude and on the mass of our ship. We also know how much mass we burn as fuel over that period, and thus how the mass of our ship changes during the thrust.
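
As a hedged aside, this is just the classical rocket equation at work. The sketch below (my own Python, with invented numbers rather than Apollo figures) shows how a required burn time falls out of a desired Δv for a constant-thrust engine.

```python
# Illustrative sketch: with constant thrust F and propellant mass flow mdot,
# the rocket equation  dv = v_e * ln(m0 / (m0 - mdot*t)),  v_e = F / mdot,
# can be inverted to give the burn time t for a desired dv.
import math

def burn_time(dv: float, thrust: float, mdot: float, m0: float) -> float:
    """Seconds of burn to deliver delta-v `dv` (m/s), given constant
    thrust (N), mass flow mdot (kg/s), and initial mass m0 (kg)."""
    v_e = thrust / mdot                            # effective exhaust velocity, m/s
    return (m0 / mdot) * (1.0 - math.exp(-dv / v_e))

# Made-up example: a 15-tonne craft, 100 kN engine, 30 kg/s propellant flow
print(f"{burn_time(dv=500.0, thrust=100e3, mdot=30.0, m0=15e3):.1f} s")  # ~69.6 s
```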

The second thing we need to determine is the orientation of the ship during that burn, as this will determine the change in direction.

Now that we know what we need to get from Point A to B, we can determine what it is that our flight computer needs to be able to do.

It needs to be aware of the spacecraft's current velocity (speed and direction). It needs to be aware of the spacecraft's target velocity (speed and direction). It needs to know, or be able to calculate, the Δv required to get between these two states. It needs to be aware of the spacecraft's current orientation and of the orientation it will need for the burn. It needs to be able to change or maintain the orientation of the spacecraft as required. It needs to be able to precisely control the burn time of the main engines. Finally, it needs to know when to do all of this.

We can actually simplify this even more. If we give the computer the ability to accept most of this data rather than having to calculate it itself, then we can end up with this list...

1) Current velocity
2) Current orientation
3) Required burn orientation
4) Burn start time
5) Burn duration in seconds

With just this information, our computer can activate the rocket motors on our spacecraft and put us into a new orbit.
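
To show how little logic those five values imply, here is a toy sketch of such a controller. It is emphatically not the AGC's actual program; the clock, IMU, RCS, and engine interfaces are hypothetical stand-ins for the real sensors and effectors.

```python
# A toy sketch of the burn controller described above -- NOT the AGC's logic.
# The clock, imu, rcs, and engine objects are hypothetical stand-ins.
from dataclasses import dataclass

@dataclass
class BurnPlan:
    burn_orientation: tuple[float, float, float]  # required attitude for the burn
    ignition_time: float                          # mission time to light the engine, s
    burn_duration: float                          # how long to burn, s

def fly_burn(plan: BurnPlan, clock, imu, rcs, engine) -> None:
    # Hold the commanded attitude until it is time to ignite...
    while clock.now() < plan.ignition_time:
        rcs.steer_toward(plan.burn_orientation, current=imu.orientation())
    # ...then fire the main engine for exactly the commanded duration,
    # still holding attitude, and shut down.
    engine.ignite()
    while clock.now() < plan.ignition_time + plan.burn_duration:
        rcs.steer_toward(plan.burn_orientation, current=imu.orientation())
    engine.cutoff()
```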

All we need to make this work are sensors such as radar and gyroscopes that tell our computer the spacecraft's current velocity and orientation, backup systems such as ground telemetry and manual orientation checks to confirm that those figures are correct, and the ability to upload the start time, burn time, and required burn orientation into our computer from the ground.

We can now see that the amount of actual computing power required to do the job was really quite small: little more than an overgrown calculator with some hard-coded controllers and sensor readers. And in fact, this is pretty much what the AGC was.

Now I hear you saying: wait, what about landing on the Moon itself? Well, really we just do the same thing, except that we are aiming first for an orbit that intersects the lunar surface, and ultimately for a final "orbit" that sits at the surface with zero velocity. The calculation and control are pretty much identical. Add in the ability for the pilots to control the system manually and we're all but done. Taking off is again the same, but this time we are trying to match the orbital velocity of the CM, which we get via the rendezvous radar. Essentially, flying between two points requires the same information regardless of what those points are.

A troubling question - just think about the exponential progress in computer software and in particular computer hardware since Neil Armstrong and Buzz Aldrin allegedly landed on the moon, in 1969. Something anyone can understand.

This simply is irrelevant. All that is relevant is: "Was the computer technology of the day able to do the job required?" As shown above, the job was remarkably simple, so the answer is a resounding YES!

The answer is not easy to find. Actually NASA apparently doesn't care about the key piece of evidence - the software source code.

Took me 30 seconds to find it.

Here are scanned versions of the AGC software for Apollo 11's CSM and for Apollo 11's LM.

The rest are there too.

Here is a slightly modernised version for anyone who wants to build their own AGC.

For anyone that believes the "greatest achievement of Mankind" to be a fake, this is not surprising.

I agree, it is surprising that the ability to use Google seems remarkably lacking in those who believe such rubbish as the claim that Apollo was hoaxed.

Let's review the answer of NASA. If you have a minimal knowledge of hardware and software and afterwards you still believe in the moon landing hoax, then you are a victim of the BIG LIE technique.

I have in-depth knowledge of hardware and software, and I am more than happy that the AGC and its software acted exactly as advertised by NASA. Furthermore, I have yet to see a CT with any real knowledge of how these systems worked or what they had to do. They do little more than handwave about how the computer power wasn't enough, without any knowledge of what the power actually was, or what amount of power was realistically required.

The only ones believing a big lie are those that are victims of the Apollo Deniers and their lies that Apollo was hoaxed.
 
I'm assuming most Apollo hoaxers are older than college age, so hopefully my daughter won't have to deal with such nonsense as she continues to pursue an Aerospace Engineering degree in college. She did see the Mythbusters episode about it.

I actually had the opportunity to briefly tell Adam Savage about my background and that I was quite satisfied with the moon hoax episode (given the constraints of a 1-hour episode of a commercial TV show).
 
I have in-depth knowledge of hardware and software...

As do a number of people 50 years hence -- something not likely foreseeable by NASA. If one takes the position that what we regard as the Apollo record is merely a cover story, then at some point that story has to break down. It is unlikely that anyone in 1965 foresaw the dominating role computing would play in our lives 50 years later, and that a degree of computer competency achieved by only a few in 1965 would be widespread in 2016. And the "cover story" holds up completely under a level of scrutiny its original designers could hardly have imagined.

And then there is the rather glaring statement by Kyoon, which amounts to an admission of ignorance. "NASA is hiding X!" Yet a normal person can find X in seconds. That's just trying to parlay one's own colossal ignorance into some kind of evidence that smart, dedicated people can't possibly have done a thing.
 
I actually had the opportunity to briefly tell Adam Savage about my background and that I was quite satisfied with the moon hoax episode (given the constraints of a 1-hour episode of a commercial TV show).

I consulted on that episode. My name is in the credits. Talks on what to do actually began during Mythbusters' first season. Time wasn't the only constraint. Budget was a big one too, as well as access to equipment. It wasn't until the show was well established that a budget and experiment list came into congruence.
 
As do a number of people 50 years hence -- something not likely foreseeable by NASA. If one takes the position that what we regard as the Apollo record is merely a cover story, then at some point that story has to break down. It is unlikely that anyone in 1965 foresaw the dominating role computing would play in our lives 50 years later, and that a degree of computer competency achieved by only a few in 1965 would be widespread in 2016. And the "cover story" holds up completely under a level of scrutiny its original designers could hardly have imagined.

And then there is the rather glaring statement by Kyoon, which amounts to an admission of ignorance. "NASA is hiding X!" Yet a normal person can find X in seconds. That's just trying to parlay one's own colossal ignorance into some kind of evidence that smart, dedicated people can't possibly have done a thing.

That's the elephant in the room that hoax proponents never seem to consider. Or do they really imagine that they're smarter than half a century's worth of engineers and scientists the world over?
 
Or do they really imagine that they're smarter than half a century's worth of engineers and scientists the world over?

They think that they are either all in on it, or that they are brainwashed into believing it as part of their training to become engineers and scientists. This is why they accept the self-taught engineers over real ones.
 
Or do they really imagine that they're smarter than half a century's worth of engineers and scientists the world over?

It varies. Some believe they are that smart -- essentially that there is no such thing as genuine expertise and that whatever dribblings they glean from the internet are just as valid as years of professional training and experience. Others substitute ideological purity for knowledge. "Engineers" and "scientists" are just specially brainwashed, and it takes the intuitive understanding of the dedicated conspiracy theorist, along with his common sense, to see through the lies of the professionals.
 
I always look at the 60's / 70's spacecraft in awe.
Simple, functional and robust, with computers that had to be programmed to use every single bit of memory as efficiently as possible.
Take the Voyagers: a computer that most pocket calculators nowadays would laugh at, but after decades in space with full radiation exposure it still works. And still gives us information.

A truly worthy stepping stone in our exploration of the universe, and I hope to see a time when we find space exploration that important again.
 
I always look at the 60's / 70's spacecraft in awe.
Simple, functional and robust, with computers that had to be programmed to use every single bit of memory as efficiently as possible.
Take the Voyagers: a computer that most pocket calculators nowadays would laugh at, but after decades in space with full radiation exposure it still works. And still gives us information.

A truly worthy stepping stone in our exploration of the universe, and I hope to see a time when we find space exploration that important again.

Sadly I think that with technology's growth, we have lost a lot of the art of clean coding in computer software development. When you only had 32K of RAM and 10 registers to store your data, you had to be clever in your code, prioritizing what you wanted to keep stored and where, and what you could calculate on the spot and throw away when done. Nowadays, with gigabytes of RAM, it's not an issue, and coders get sloppy, which leads to memory leaks and taking up more resources than you actually need; but it doesn't matter because computers perform at such a high level that they compensate for the lousy coding.
 
Is it lousy, or is it coded to the standard needed to do the job on the given hardware?
There are time and cost constraints to consider; making your code more 'clean' and less 'sloppy' is traded off against the hardware's capability.
 
Is it lousy, or is it coded to the standard needed to do the job on the given hardware?
There are time and cost constraints to consider; making your code more 'clean' and less 'sloppy' is traded off against the hardware's capability.

Very true. Still, there's a certain elegance and grace that we're missing out on. It really is amazing what could be done with 16K.
 
They think that they are either all in on it, or that they are brainwashed into believing it as part of their training to become engineers and scientists. This is why they accept the self-taught engineers over real ones.

I think they just imagine themselves as Roddy Piper in shades. They're the only ones who can see that half the "people" we interact with are actually aliens.
 
Having once tried to decode my old car's ECU with no documentation or annotation, I'm horribly aware of how much you can cram into 32k of machine code, and it does make me wonder how the simplest thing now seems to require megabytes.

Meanwhile, I'm pretty sure Kyoon has nothing further to add in this thread. It might be fun if he did, but it looks like Apollo is just another item on his checklist of things not to believe in. He may have an opinion, but he doesn't have an argument.

Probably best he stays in his other threads, making loony prophecies in the third person and confusing TV with reality. (I particularly enjoyed his announcement that Hillary Clinton and Kevin Spacey were secretly siblings, like Princess Leia and Luke Skywalker. Spacey is just going to bring the house down with his scene where he discovers Hitler is his real father. :D)
 
Having once tried to decode my old car's ECU with no documentation or annotation, I'm horribly aware of how much you can cram into 32k of machine code, and it does make me wonder how the simplest thing now seems to require megabytes.

To follow this rabbit trail just a little bit, the answer is that modern programming has traded space optimization for programming convenience. Most "things" built these days are built using higher-level programming languages, which utilize standard libraries for common operations. These standard libraries can get kinda big, but they allow the programmer to avoid reinventing the wheel and focus on solving the logic of the application itself. It's a side effect of the rapidly expanding storage capacity of modern computing; programmers have taken advantage of it to increase code reuse, at the cost of making their programs bigger.
 
Sadly I think that with technology's growth, we have lost a lot of the art of clean coding in computer software development.

The biggest factor is feature list versus time. The AGC team could literally spend months bumming a few instructions out of an 18-instruction subroutine. Today a software development team would be expected to produce a major release involving tens of thousands of lines of code and perhaps dozens of features or corrections.

...but it doesn't matter because computers perform at such a high level that they compensate for the lousy coding.

Again, it's economic. If I assign a senior software developer to a problem for 2 days, that's $1,300 in labor costs. Apply that same $1,300 to more RAM for a rack-mount server, and problem solved. Yes, I can tolerate such things as memory leakage in a hastily coded program, especially if it's something that can be solved by restarting the program.

And that brings us back to Apollo. Read how many times in the transcripts the crew was asked to cycle a breaker for this or that thing. That was a common method of resetting equipment that had become wedged in some unforeseen state.

You're right in that we tolerate much sloppier coding these days than we did back then, and we have the resources to do it. But then as now we also tolerate lapses in correctness and performance. Microsoft very carefully trained its users to believe that PCs were inherently flaky, unstable things that needed to be restarted several times a day and reinstalled several times a year. That's in contrast to other computers in other contexts that could literally run unmodified and unattended for years.
 
To follow this rabbit trail just a little bit, the answer is that modern programming has traded space optimization for programming convenience. Most "things" built these days are built using higher-level programming languages, which utilize standard libraries for common operations. These standard libraries can get kinda big, but they allow the programmer to avoid reinventing the wheel and focus on solving the logic of the application itself. It's a side effect of the rapidly expanding storage capacity of modern computing; programmers have taken advantage of it to increase code reuse, at the cost of making their programs bigger.

Yes, look at the amazing things software can do for us these days. As I'm writing this, Google just popped up a notice on my mobile device telling me my morning commute will be a little bit longer this morning due to traffic conditions. That can only happen by means of these abstraction layers you speak of.

And again, Apollo wins. The AGC used a then-revolutionary concept of time-sharing. The 1201 and 1202 alarms that popped up during the Apollo 11 landing were proximally caused by what we would today call a high load average. The only native data type in the AGC was a 15-bit ones'-complement word, but the programmers very quickly realized they needed a higher-level vector data type and a code library to work on it. And the instruction set was tiny, so the designers borrowed a concept from IBM's channel-command words and implemented an interpreter for higher-level "instructions."
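
For anyone wondering what a "15-bit ones'-complement word" means in practice, here is a small Python model of that style of adder, end-around carry and all. The helper names are mine; only the arithmetic rule reflects the AGC's design.

```python
# Sketch of 15-bit ones'-complement addition with end-around carry,
# the arithmetic style the AGC's adder used. Values shown in octal,
# as AGC documentation typically does.
MASK15 = 0o77777  # fifteen one-bits

def agc_add(a: int, b: int) -> int:
    """Add two 15-bit ones'-complement words with end-around carry."""
    s = a + b
    if s > MASK15:                 # a carry out of bit 15...
        s = (s & MASK15) + 1       # ...wraps back around into bit 1
    return s & MASK15

def to_ones_complement(n: int) -> int:
    """Encode a small signed integer as a 15-bit ones'-complement word."""
    return n & MASK15 if n >= 0 else (~(-n)) & MASK15

# 5 + (-3) = +2, exercised via the end-around carry
print(oct(agc_add(to_ones_complement(5), to_ones_complement(-3))))  # 0o2
```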

And of course none of these were or are secret. You can download the yAGC emulator and run the actual MIT Draper Lab programs on them. I've read about hardware reimplementations too. And if you broaden the scope, engineer Scott Sullivan -- using only published drawings -- has made accurate CAD models of the CSM and LM (Virtual Apollo, and Virtual LM). To claim Apollo details are being kept secret is the epitome of either laziness or ignorance.
 
Looking at the output capabilities and internal computing power, the AGC would be considered a moderately high-end washing machine by current standards.
 
Looking at the output capabilities and internal computing power, the AGC would be considered a moderately high-end washing machine by current standards.

Yes, and there are much smarter washers and dryers out there. Yet somehow my dryer with a simple twist knob and no processing at all dries my clothes. (This is an observation Jay has made several times.)
 
Looking at the output capabilities and internal computing power, the AGC would be considered a moderately high-end washing machine by current standards.

An apt comparison.

Some low-end washing machines still use a cam-based mechanical sequencer. If you have to grind a knob around to the starting point of a cycle and pull out the knob to initiate it, you've got one of those. As the drum behind the knob slowly turns via clockwork, cams on its perimeter operate mechanical switches that implement the sequence of valve openings and closings, motor activations, etc. The Saturn V launch sequencer, believe it or not, was just a more sophisticated version of the cam drum in a washing machine.
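
If it helps, a cam drum is just a table-driven program in mechanical form. Here is a toy Python model of the idea; the event timeline is invented for illustration, not taken from any real washing machine or Saturn V document.

```python
# A cam drum modelled as a fixed timeline of contact openings and closings.
# Entries must be in time order, just as cams are laid out around the drum.
CAM_PROGRAM = [
    # (seconds from cycle start, switch name, new state)
    (0.0,   "fill_valve",  True),
    (90.0,  "fill_valve",  False),
    (90.0,  "agitator",    True),
    (600.0, "agitator",    False),
    (600.0, "drain_pump",  True),
]

def drum_state(program, now: float) -> dict:
    """Return the switch states the drum would be holding at time `now`."""
    state = {}
    for t, switch, value in program:
        if t <= now:               # every cam edge up to `now` has taken effect
            state[switch] = value
    return state

print(drum_state(CAM_PROGRAM, now=120.0))  # {'fill_valve': False, 'agitator': True}
```

The whole "program" is the drum itself; there is nothing to compute, which is exactly why such sequencers were trusted.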

Because today it's so very easy to implement a control system using an electronic microcontroller, a lot of people don't know (or have forgotten) how it was done in the past. This leads some conspiracy theorists to claim it cannot have been done, because "we lacked the computer power." Even with the AGC -- whose hardware design went on to have a successful career as a flight control computer in a jet -- many of the critical functions in the Apollo spacecraft (e.g., Earth landing sequences) were still implemented using diode-, resistor-, and relay-based logic. These were considered safer because by 1965 they had a very long design heritage that came with oodles of operational experience and know-how. They were also almost completely impervious to disasters like the Apollo 12 lightning strike.

Safety and reliability always sit in the front seat when it comes to designing computers for these applications. Speed isn't that important, and generally is specified as "just fast enough." Yes, in terms of computing capability, today's high-end smartphones outpace the entire Real Time Computing Center at Mission Control during Apollo. But that's not what we care about. Even a cheap laptop can easily outperform many of the computers we put on spacecraft. The difference is how well your laptop performs in a paint mixer, in an oven, cycling the power every two seconds. In space we don't want a muscle car; we want a dump truck.
 
(This is an observation Jay has made several times.)

Fairly recently, too. :D

The Mercury spacecraft didn't have a computer at all. The Gemini spacecraft had only a rudimentary computer, which went on to become the Saturn V autopilot.

Two arguments typically arise when we talk about computers and Apollo. The first is, "There wasn't enough computing power available." Qualitatively we didn't solve problems in the 1960s with computers the way we do now. Quantitatively, the rebuttal is "Okay then, how much computing power was needed to fly to the Moon?" The conspiracy theorists almost never have an answer.

The second is "With all the advances in computing power, how come we haven't been back to the Moon?" You can't compute your way to the Moon. While we have seen an explosion of electronics technology in the past 50 years, the rest of rocketry hasn't kept pace. You don't need a big, fast computer to get to the Moon, and having one doesn't make the mechanical engineering of the vehicle orders of magnitude cheaper or easier.
 
