Regarding that... Steep equipment costs for the same energy conversion efficiency are the main, but not the only, reason I think that. At an individual consumer level, I learned that the cost of the equipment, the area available for the solar array, the weather, and the geographic location all influence how well the system performs. 6/10 days were cloudy, and pretty much every "house" was in the negative when everything was said and done. We wound up a little bit positive only because we minimized power use (the design involved a lot of glazing [aka daylighting], which was itself a large cost component).
At the city level, you need a huge area of land to set up the solar array. That hasn't sat particularly well with environmentalists in a number of cases, particularly given that where I live the only area available for an array of the required size happens to be a national park. I'm not sure they're cozy with the idea. Nor, I suspect, are pilots cozy with the idea of blinding reflected sunlight hitting their flights.
That doesn't mean solar's a bad idea; I think we should pursue it. But if it's supposed to help eventually replace fossil fuels, it needs a lot of passive design built into the architecture. I don't see large solar fields doing the trick.
Actually, solar thermal is mostly off-the-shelf technology, and the costs are comparable to gas-turbine electric generation systems. It is only suitable for southwest regional applications, but the fuel is free, and when combined with either molten-salt or pumped-hydro storage systems, it can provide power 24 hours a day even if you have a few days in a row of heavy cloud cover. As for PV systems, the only time home systems get really pricey is when you are talking about off-grid systems that generate 100% of the power, retrofitted to older homes with low-efficiency appliances. While such systems make sense in remote locations, they really won't be the primary use of PV solar for a long time (if ever). The use of PV panels that does make sense in a grid-access setting is panels that provide 25-60% of an efficiently designed and applianced home's electrical needs, feeding most of that into the grid and lowering the customer's net draw from it. This is best handled through adjustments to new-construction and major-remodel building codes, but this type of arrangement will also benefit those who just want to add a few panels to their older home in order to reduce their monthly electric bill.
Again, however, solar in either form is really not in a position to "replace all fossil fuels," but I really don't know many serious alternative-energy advocates who promote it as such. Solar is good for some regions. Wind is good for some regions. Tidal and wave power generation is good for some regions. Geothermal is available just about everywhere, but is only practical where the Earth is appropriately hot relatively close to the surface. In many areas, combinations of these systems should work. Add advanced-design nuclear and hydroelectric, and most importantly a networked, intelligent national grid backbone, and there is no reason we can't seamlessly and profitably transition away from using fossil fuels to generate electricity over the next 30 years.
We already know that the amount of carbon in the atmosphere can be measured, and that a reasonable picture of the distant past versus our current production can be drawn from the CO2 concentrations themselves.
More than this, because the carbon in fossil fuels has been deeply buried for so long, it is depleted in carbon-14, the radioactive isotope present in the carbon circulating through the active carbon cycle of our planet. The shifting isotopic ratios of atmospheric carbon demonstrate that the carbon building up in our atmosphere is coming from the fossil fuels we are burning. These changing ratios line up well with two centuries of fossil fuel producers' business records, providing a means of cross-checking our data evaluations.
The problem lies in the fact that specifics about the impact from direct atmospheric analysis became available only in the last 40 years.
This is simply inaccurate. This issue has been a topic of scientific investigation since Fourier discovered and began studying the atmospheric greenhouse effect in the 1820s. Tyndall isolated and quantified the individual greenhouse-gas contributions of the atmospheric components responsible for this effect in 1859.
In 1896 Arrhenius completed a laborious numerical computation which suggested that cutting the amount of CO2 in the atmosphere by half could lower the temperature in Europe some 4-5°C (roughly 7-9°F) — that is, to an ice age level. But this idea could only answer the riddle of the ice ages if such large changes in atmospheric composition really were possible. For that question Arrhenius turned to a colleague, Arvid Högbom. It happened that Högbom had compiled estimates for how carbon dioxide cycles through natural geochemical processes, including emission from volcanoes, uptake by the oceans, and so forth. Along the way he had come up with a strange, almost incredible new idea.
It had occurred to Högbom to calculate the amounts of CO2 emitted by factories and other industrial sources. Surprisingly, he found that human activities were adding CO2 to the atmosphere at a rate roughly comparable to the natural geochemical processes that emitted or absorbed the gas. As another scientist would put it a decade later, we were "evaporating" our coal mines into the air. The added gas was not much compared with the volume of CO2 already in the atmosphere — the CO2 released from the burning of coal in the year 1896 would raise the level by scarcely a thousandth part. But the additions might matter if they continued long enough. (By recent calculations, the total amount of carbon laid up in coal and other fossil deposits that humanity can readily get at and burn is some ten times greater than the total amount in the atmosphere.) So the next CO2 change might not be a cooling decrease, but an increase. Arrhenius made a calculation for doubling the CO2 in the atmosphere, and estimated it would raise the Earth's temperature some 5-6°C (averaged over all zones of latitude).
(excerpted from the American Institute of Physics hypertext book "The Discovery of Global Warming" -
http://www.aip.org/history/climate/index.htm#contents )
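Högbom's "scarcely a thousandth part" figure in the excerpt above can be sanity-checked with simple arithmetic. The round numbers below are my own assumptions, not from the excerpt: roughly 0.4 GtC of fossil carbon emitted in 1896, an atmospheric CO2 level of about 295 ppm at the time, and the standard conversion of about 2.13 GtC per ppm.

```python
# Rough sanity check of Hogbom's "scarcely a thousandth part" claim.
# Assumed round numbers (not from the excerpt): ~0.4 GtC of fossil
# carbon emitted in 1896; ~295 ppm atmospheric CO2 at ~2.13 GtC/ppm.

emissions_1896_gtc = 0.4           # one year's fossil carbon emissions, GtC
atmosphere_gtc = 295 * 2.13        # atmospheric carbon stock, GtC (~628)

fraction = emissions_1896_gtc / atmosphere_gtc
print(f"One year's emissions ~= {fraction:.4f} of the atmospheric stock")
```

Under those assumptions the ratio comes out on the order of a thousandth, consistent with the excerpt.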
That can't establish much of a pattern when you're dealing with long-term climate trends, especially when within these last 40 years there have been smaller fluctuations in climatological patterns. If reducing CO2 emissions is a priority, go for it; my disagreements with AGW are essentially separate from my thoughts about the potential of alternative energy. But I think there are other components of climate change that are a more immediate, ground-level concern than the CO2 levels in the atmosphere, if I'm to take the crisis of sea level rise, warming, etc. as already having been "locked in" at face value.
Near-term minimums are locked in from past emissions. Each day of emissions compounds the problem and pushes us toward a point where we overwhelm natural carbon sinks and trigger a cascading natural release from CO2 reservoirs that will match or exceed human emissions. We may have already crossed these thresholds, which creates the need not only to quickly reduce and eliminate our emissions, but also to actually draw CO2 out of the atmosphere and sequester it ourselves to avoid that potential.
Well, the problem is that fossil fuels are the de facto and most established source, and it's remained that way despite solar power not exactly being a new concept. There's also ethanol, which has its own challenges interfering with food demand and prices. Let's not forget wind power, which is restricted by geography and climate. They're seeing adoption, sure, but how long have some of these alternatives already been around? Solar's been around for nearly 40 years. I can't speak for future improvements to the technology, so maybe you'll be proven right, but for now those technologies need enough of an efficiency-and-cost balance to allow adoption to ramp up. And manufacturers of these products need to be able to stay afloat.
The reason fossil fuels are "de facto and established" is that they haven't had to cover the externalities associated with the impact of their use on society. With the full cost of using these fuels included in their price, coal becomes the most expensive way to generate electricity, with oil-derived fuels running a close second. The total uncovered externalities of using coal for electrical generation in the US amount, by some detailed estimates, to more than $500B per year.
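To put that $500B/year figure on a per-kWh footing, here is a back-of-envelope calculation. The generation figure is my own assumption (roughly 2 trillion kWh of US coal-fired generation per year in that era), not a number from the linked study:

```python
# Back-of-envelope: what ~$500B/yr of uncovered externalities would
# mean per kWh of coal power, assuming (my estimate, not the study's)
# roughly 2 trillion kWh/yr of US coal-fired generation.

externalities_usd = 500e9          # estimated annual externalized cost, $
coal_generation_kwh = 2e12         # assumed annual US coal generation, kWh

hidden_cost_per_kwh = externalities_usd / coal_generation_kwh
print(f"Hidden cost ~= ${hidden_cost_per_kwh:.2f}/kWh")
```

Under those assumptions, the hidden cost works out to roughly a quarter per kWh, several times the typical retail price of electricity at the time.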
http://chge.med.harvard.edu/sites/default/files/epstein_full cost of coal.pdf
These are a hidden tax levied by the fossil fuel industries upon Americans each and every year, as they earn record profits and receive state and federal tax breaks and subsidies.