Moderated Iron sun with Aether batteries...

Status
Not open for further replies.
I really don't have the slightest idea what you're talking about.

I want quantitative predictions, complete with error bars about the location of the sphere in an RD image at various iron ion wavelengths with respect to the chromosphere. I need numbers. Got numbers?
 
The neon layer and oxygen and other ionized elements inside that neon. If I could figure out what absorbs 94A, I might figure out what "cools off" and then is re-excited, and that might allow me to give you a temperature range of something. Without that bit of info, however, I'm clueless how to give you temperatures at the surface of the photosphere, so I'll stick with the standard model for the time being.

Sorry, Michael---as Sol says, we KNOW exactly what an ultra hot and transparent neon plasma looks like. It emits primarily far-UV (because it's so hot) and primarily emission lines (because it's transparent---like you requested---and therefore not a blackbody). Ditto for a transparent, ultrahot silicon plasma---it emits a UV silicon emission line spectrum. Ditto for any combination of Ne, Si, and trace elements---if it's transparent then it has an emission line spectrum, not a blackbody.

If it's a blackbody it's not transparent.

What emits the 6000K blackbody spectrum?
 
I want quantitative predictions, complete with error bars about the location of the sphere in an RD image at various iron ion wavelengths with respect to the chromosphere. I need numbers. Got numbers?


What do you mean by quantitative predictions? About what? What do you mean by error bars? What sphere are you talking about? In which running difference image(s)? Which various iron wavelengths? With respect to the chromosphere how? Really, Michael, what the hell are you babbling about?

Does anyone else have the slightest idea what this constant incoherent badgering and taunting is all about?
 
I really need a quantitative prediction out of you folks with respect to RD imagery if we are going to compare the predictions of this solar model and the standard solar model. If the standard theory is so 'right' and physics is all about math, and GM is the king of RD imagery, how hard can it be to do a little napkin math and come up with some ballpark figures?
 
What do you mean by quantitative predictions? About what? What do you mean by error bars? What sphere are you talking about? In which running difference image(s)? Which various iron wavelengths? With respect to the chromosphere how? Really, Michael, what the hell are you babbling about?

Does anyone else have the slightest idea what this constant incoherent badgering and taunting is all about?

So I'll just mark you down as "clueless coward"?
 
For five years GM you have dogged me around and complained about how I never quantified anything and knew nothing about RD images. I have now put numbers on the table related to RD images with respect to the chromosphere. It's your turn. You've probably called me a crackpot on the internet 10,000 times by now. Don't you think a "professional" would put up some numbers if a mere hick from Mt. Shasta can? Can't you compete with a guy who can't balance his checkbook?
 
Running difference images made from the color separated pairs of images above wouldn't show much contrast because the source images are only 10 frames apart. But with a 100 frame offset the running difference images would be what you see in any single frame from the videos below.


This first video was made by removing the green and blue from the source video, leaving just the red. All the red was then converted to grayscale. Next, two frames are taken from the video 100 frames apart, starting with frames 1 and 100. Then 50% gray is added to each pixel in the first image, and the second image is subtracted pixel by pixel from the first. The result becomes frame 1 in the output. Then move on to frames 2 and 102, and repeat for every frame X and frame X+100... (These pixel values are numerical values of gray, from 0 = black to 255 = white.)
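Since the actual script is proprietary, here is a minimal illustrative sketch in Python/NumPy of the pixel arithmetic just described; the function name and the representation of frames as 2-D uint8 arrays are my own assumptions, not the author's code:

```python
import numpy as np

def running_difference(frames, offset=100):
    """Running-difference frames: add 50% gray (128) to frame X,
    then subtract frame X+offset pixel by pixel (gray values
    0 = black to 255 = white), clipping to the valid range."""
    out = []
    for x in range(len(frames) - offset):
        first = frames[x].astype(np.int16)            # frame X
        second = frames[x + offset].astype(np.int16)  # frame X+offset
        diff = first + 128 - second                   # add 50% gray, subtract
        out.append(np.clip(diff, 0, 255).astype(np.uint8))
    return out
```

With identical frames the output is uniform 50% gray (128); pixels brighter or darker than 128 mark where the scene dimmed or brightened over the offset interval.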

Then of course these videos are sized down to 640x320 and letterboxed to 640x480 to make them fit YouTube and common video viewers. And I trimmed them down to just the first 20 seconds to make more reasonable download sizes.


The second video, above, is the same only I took out the red and the blue, leaving only the green from the original video. And the third, below, is the running difference video made from just the blue. I believe these red, green, and blue colors represent 211Å, 193Å, and 171Å source data respectively.


I use a proprietary script I wrote myself to do this processing, so I won't be more specific. Anyone with a little math background, a modicum of expertise in computer video and graphics, and some reasonable programming skills could certainly do this.




This is a very important point for this discussion. Although the running difference material we find at NASA and LMSAL is probably made pretty much exactly the way I've done it, the results can look quite different if you adjust only a couple of things.

First, if you subtract frame x+100 from frame x you get the videos we see above. If you instead subtract frame x from frame x+100, you end up with something that looks like a negative of that video. Either the lighting comes from the other side of the mountain or your mountain turns into a valley. Take your pick.

Second, the contrast between images will obviously be affected by the offset, or how many frames apart the compared images are. And remember that it also depends on the time difference between the original frames. A running difference video made with an offset of 10 frames might show so little change that it would look almost like smooth gray throughout. Compare images 100 frames apart and you can see the changes between source images more clearly. You can shrink and grow your mountains by comparing frames closer together or further apart in the sequence.

Many of the running difference images available from NASA and LMSAL have quite different sizes of mountains, some so huge that it's amazing we don't see them with the naked eye when there's a solar eclipse! ;)
Wow, thanks very much GM! :)

Given the time delay between my post and yours, would it be reasonable to say that producing RD videos from calibrated, science images is relatively straight-forward, and not particularly time- or resource-consuming? I imagine the file sizes are a good deal bigger, and that there might be some additional overhead for validation, but otherwise?

One thing that's obvious - to me, at least - is that it's very hard to work out where the limb is in RD images, based on the images alone. In hindsight that's not surprising (pixels on, or near, the limb aren't likely to change much in intensity, except in quite localised regions; contrast at the limb is already pretty low in any case).

Presumably it would be quite easy to colorise an RD image; e.g. make an RGB one by combining separate R, G, and B ones; or convert a grey scale into a colour one (e.g. 0-20 red; 21-40 yellow; 41-60 green; 61-80 blue; 81-100 black). Also, contrast could be enhanced by stretching.
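A banding scheme like the one suggested could indeed be sketched in a few lines of NumPy; the exact thresholds here (the 0-100% bands scaled to 0-255) and the function name are illustrative assumptions, not anything from the original posts:

```python
import numpy as np

# Upper bounds (exclusive, on a 0-255 scale) and colours for the
# five suggested bands: red, yellow, green, blue, black.
BANDS = [
    (51, (255, 0, 0)),     # roughly 0-20%
    (102, (255, 255, 0)),  # roughly 21-40%
    (153, (0, 255, 0)),    # roughly 41-60%
    (204, (0, 0, 255)),    # roughly 61-80%
    (256, (0, 0, 0)),      # roughly 81-100%
]

def colorise(gray):
    """Map a grayscale RD image (uint8) to an RGB image by banding."""
    g = gray.astype(int)
    rgb = np.zeros(g.shape + (3,), dtype=np.uint8)
    lower = 0
    for upper, colour in BANDS:
        rgb[(g >= lower) & (g < upper)] = colour
        lower = upper
    return rgb
```

Contrast stretching would be a similar one-liner (rescale the gray values to span the full 0-255 range before banding).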

Now I see MM has written this:
I want quantitative predictions, complete with error bars about the location of the sphere in an RD image at various iron ion wavelengths with respect to the chromosphere. I need numbers. Got numbers?
Do you - or any reader for that matter - know why MM would want to determine "the location of the sphere" from an RD image?!?

Also, given that the chromosphere is very cool, compared with the corona and plasmas which radiate strongly "at various iron ion wavelengths" (presumably the ones SDO's AIA takes images in), how does he expect he - or anyone, for that matter - could determine the location of the chromosphere? I mean, it'd be hard enough to spot in the science images, and as there is little or no change to the chromosphere (except locally) from one time to another, it would be invisible in an RD image (no matter what iron ion wavelength it was taken in).

What am I missing?
 
For five years GM you have dogged me around and complained about how I never quantified anything and I knew nothing about RD images. I have now put numbers on the table related to RD images with respect to the chromosphere. It's your turn. You've probably called me a crackpot on the internet 10,000 times by now. Don't you think a "professional" would put up some numbers if a mere hick from Mt. Shasta can put up some numbers. Can't you compete with a guy that can't balance his checkbook?

You're asking people to make a prediction regarding something that nobody can see but you. You say "the green sphere" as if it actually meant something. You are completely incapable of breaking free from your own fantasy to engage with folks in the real world on any level.

Give it a rest and focus your energies on substantiating your claims instead of posturing on the internet.
 
Yes. According to the standard model, that is where your coronal loops should start to light up. Don't you think we should use that as the starting point?

According to this model, the solid surface is located about 4800 km below the base of the chromosphere.

And you really insist that the neon is perfectly transparent? Really 100% transparent? As in, I could look horizontally across an 80,000 km-wide plain (under a 4800 km-tall atmosphere) and you think I could see any hot iron on the far side? Great, that makes it easy for me to calculate what the 2D projection of this whole business would look like.

Listen to me carefully Michael: I have done a calculation of the 2D projection of the model you just agreed to. It doesn't look like the SDO first-light image. In fact, I have no idea what you were talking about when you said it did. It doesn't look like TRACE VUV limb images either. What it looks like---to nobody's surprise---is that you made some trivial mistake when translating your 3-D mental picture into a 2-D image, and you thought the SDO image agrees with the mistaken projection.

How DID you translate your 3D mental model into a 2D image Michael? What 2D features do you think this 3D model has? Certainly not the features I'm seeing from SDO (modulo PR caveats) nor TRACE (real data).
 
For the record.

How many times did you claim I could not even balance a checkbook?

If I can come up with a ballpark figure related to that 171A wavelength, which the TRACE RD image on my website is based upon, why can't you?

All that I'm looking for is where you expect to see that "opaque" (your definition) limb appear in an RD image with respect to the chromosphere. How hard can it be?
 
So the 3D model is (from the outside in)

coronal loops
--------------
bright and electrically-active layer
------------
"transparent neon-based substance TBD"
ETA "transparent silicon substance TBD"
------------
ultrahot iron vapor emitting 171A
----
solid iron at 2000K

Do I have that right? This is the first time you've mentioned light from the electrical activity so forgive me if my mind-reading skills have put it in the wrong place. Which layer is responsible for the "green" emissions which end up in the famous 20-pixel-wide stripe on the 2D image provided by the SDO PR department? I've asked eight times, explain already.
OK, so adding "photosphere" - meaning where the white light we see when we look at the Sun comes from - "chromosphere", and "transition layer", the MM model looks like this:

coronal loops
--------------
bright and electrically-active layer } this is the chromosphere
------------
"transparent neon-based substance TBD" } this is
"transparent silicon substance TBD" } the photosphere
------------
ultrahot iron vapor emitting 171A } this is the transition layer
----
solid iron at 2000K


Well, I'm not sure if the transition layer is there, or whether it's a layer between the "transparent neon-based substance TBD" and iron vapour emitting 17.1 nm. However, I'm pretty sure that, in the MM model, the chromosphere is above (= further from the centre of the Sun) the "transparent neon-based substance TBD".
 
All that I'm looking for is where you expect to see that "opaque" (your definition) limb appear in an RD image with respect to the chromosphere. How hard can it be?

"How hard can it be to predict the length of my cat's tail? To predict the St. Louis Cardinal's performance this postseason? To predict the exact balance between emission and absorption in a tangential slice through the solar corona?"

Very hard. It depends on the detailed structure of the corona.
 
It's really hard to tell based on the few RD images, but I think I lucked out. It looks to me that, thanks to the cadence of the equipment and the variation in emissions, the 4800 km in question "blinks". It could be related to an inner spinning core. If that timing is related to the sun's 5-minute cycles, that could be a very important clue about what is inside the sun. If that is current-flow variation in the 4800 km in question, it should be really easy to spot along the limbs of the RD images. WOW am I looking forward to the FITS files. :)
 
"How hard can it be to predict the length of my cat's tail? To predict the St. Louis Cardinal's performance this postseason? To predict the exact balance between emission and absorption in a tangential slice through the solar corona?"

Very hard. It depends on the detailed structure of the corona.

Where would you expect to find the opaque edge of an RD image in say 171A with respect to the chromosphere?
 
Where would you expect to find the opaque edge of an RD image in say 171A with respect to the chromosphere?

IT DEPENDS ON THE DETAILS OF THE STRUCTURE OF THE CORONA. I DO NOT KNOW THE DETAILS OF THE STRUCTURE OF THE CORONA.

You've found a great sentence there, Michael, if you keep repeating it you can probably avoid thinking about blackbodies for another 50 pages.
 
The partition function, you mean? It's a sort of generating function that contains any information you'd like to know about a thermodynamic system (one in thermodynamic equilibrium, anyway). You start with a Hamiltonian H, which gives the energy of the system for a certain microscopic state. H depends on a bunch of parameters, some of which are microscopic degrees of freedom, and others of which are external, constant, or controllable. To get the partition function, you evaluate exp(-H/kT) for every possible microscopic state and sum over all of them---this sum is the partition function Z, which is a function of those parameters that aren't microscopic degrees of freedom (such as temperature, pressure, volume, external fields, etc.).

A lot of information can be obtained from Z by taking derivatives with respect to the parameters. Some useful thermodynamic identities can be proven from it, too.
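As a concrete toy illustration of the recipe above (nothing to do with solar physics specifically), here is Z and the thermal average of the energy for a system with a handful of discrete microstates; the function names are mine:

```python
import math

def partition_function(energies, kT):
    """Z = sum over microstates of exp(-E/kT)."""
    return sum(math.exp(-E / kT) for E in energies)

def mean_energy(energies, kT):
    """Thermal average <E> = sum of E*exp(-E/kT) over states, / Z.
    This equals -d(ln Z)/d(beta) with beta = 1/kT."""
    Z = partition_function(energies, kT)
    return sum(E * math.exp(-E / kT) for E in energies) / Z
```

For a two-level system with energies 0 and 1, Z = 1 + exp(-1/kT); at very high temperature both states become equally likely and the mean energy approaches 1/2, which is the sort of sanity check the derivative identities make easy.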

I can't say I know anything about solar or plasma physics, but if someone is trying to describe the macroscopic properties of the plasma, solar corona, or what-have-you by starting with microscopic properties, they'd probably be making use of the partition function at some point. Though they'd probably also be using some fairly advanced techniques in addition that I don't yet know. Looking at the Saha equation (which the post you quoted mentioned), it seems that one form of it does make use of Z.

And I seriously doubt someone would be able to overthrow any standard model in physics without knowing something as basic and ubiquitous as the partition function. =p

Thanks!
 
A thought occurs to me (it happens occasionally - usually I try to suppress it before I hurt myself). If Michael's solid surface is at 4800 km, and he is claiming to see this surface by counting pixels at the limb, would the necessary oblique line-of-sight angle through the sun to the limb pass below 4800 km at some point? In other words, to see 4800 km deep at the limb in a 2d image, would he have to see through his iron surface on the way through?

Somebody not named Mozina please set me straight.
No. The key assumption is that the allegedly solid surface is being seen in profile, which is equivalent to assuming that the line of sight is tangent to the allegedly solid surface. So long as the allegedly solid surface is spherical, the tangent line will not intersect the allegedly solid surface.

What's more, you needn't worry about the observer-at-infinity assumption. The 80000+ km of plasma along the line of sight remains the same even when the observer is at some other distance, such as an astronomical unit, but the calculation becomes slightly harder to visualize.

An observer closer than infinity sees less than half the surface of the sun. That changes the projection seen by the observer, which could invalidate any distances obtained by counting pixels in the projected image. As it happens, the camera was far enough away to justify the observer-at-infinity approximation, so counting pixels is not a bad way to estimate the size of features that are known to be orthogonal to the line of sight.

There is no reason to assume the bright green line near the limb represents a feature that's orthogonal to the line of sight. We're accepting Michael Mozina's 4800 km estimate just for the sake of argument. That 4800 km estimate implies the line of sight passes through 80000+ km of plasma. If the physicists tell me that such transparent plasma is highly improbable, then I'm going to conclude that Michael Mozina's interpretation of the image is highly improbable.
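The "80,000+ km of plasma" figure follows from simple chord geometry: a line of sight tangent to a sphere sitting 4800 km below the outer surface travels sqrt(R^2 - (R - 4800)^2) km through the overlying shell on each side of the tangent point. A quick sanity check (the solar radius value is approximate, and the function name is mine):

```python
import math

R_SUN_KM = 696_000.0  # approximate solar radius in km

def half_chord_through_shell(depth_km, R=R_SUN_KM):
    """Distance (km) from the tangent point to where a tangent
    line of sight exits a shell of thickness depth_km lying on
    top of the sphere it grazes."""
    r = R - depth_km  # radius of the grazed inner sphere
    return math.sqrt(R * R - r * r)
```

half_chord_through_shell(4800.0) gives about 81,600 km, so the full path near the tangent point is well over 80,000 km, consistent with the figure quoted above.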
 
