Land Change in Australia

This is for my Aussie fans. I show how Australia’s landscape changed from 2001 to 2019 using the best available satellite data.

Parts of Indonesia and Papua New Guinea that would otherwise appear on the map are masked as water.

Changes:

Type                                |   2001  |   2019  |  Change |  % Chg
------------------------------------+---------+---------+---------+---------
Evergreen Needleleaf Forest         |  0.4615 |  0.4885 | +0.0270 |   +5.85%
Evergreen Broadleaf Forest          |  2.4580 |  2.5659 | +0.1079 |   +4.39%
Deciduous Broadleaf Forest          |  0.0094 |  0.0167 | +0.0073 |  +77.66%
Mixed Forests                       |  0.0570 |  0.0737 | +0.0167 |  +29.30%
Closed Shrubland                    |  3.8315 |  4.0744 | +0.2429 |   +6.34%
Open Shrublands                     | 54.0555 | 54.7268 | +0.6713 |   +1.24%
Woody Savannas                      |  1.6631 |  1.9115 | +0.2484 |  +14.94%
Savannas                            |  6.4679 |  6.6289 | +0.1610 |   +2.49%
Grasslands                          | 24.9655 | 23.3947 | -1.5708 |   -6.29%
Permanent Wetlands                  |  0.2007 |  0.2198 | +0.0191 |   +9.52%
Croplands                           |  3.4671 |  3.5146 | +0.0475 |   +1.37%
Urban and Built-up                  |  0.1321 |  0.1388 | +0.0067 |   +5.07%
Cropland/Natural Vegetation Mosaic  |  0.0065 |  0.0110 | +0.0045 |  +69.23%
Snow and Ice                        |  0.0001 |  0.0005 | +0.0004 | +400.00%
Barren or Sparsely Vegetated        |  2.2241 |  2.2339 | +0.0098 |   +0.44%

The 2001 and 2019 columns show each type’s share of Australia’s land area (water excluded) in percent; Change is the difference between them, and % Chg is the percent change from 2001 to 2019.

Enjoy 🙂 -Zoe

Code

# Zoe Phin, 2021/02/28
# File: australia.sh
# Run: source australia.sh; require; download <user> <pass>; prepare; maps; animate; analyze

require() { sudo apt-get install hdf4-tools imagemagick; }

download() { base="https://e4ftl01.cr.usgs.gov/MOTA/MCD12C1.006"
    wget -O 2001.hdf --user=$1 --password=$2 $base/2001.01.01/MCD12C1.A2001001.006.2018053185512.hdf
    wget -O 2019.hdf --user=$1 --password=$2 $base/2019.01.01/MCD12C1.A2019001.006.2020220162300.hdf
}

parse_mlc() {
    ncdump-hdf -v Majority_Land_Cover_Type_1 $1.hdf | sed 1,702d | tr -d ',;}' | awk '{
        for (i=1; i<=NF; i++) printf "%02d ", $i}' | fold -w21600 > $1.mlc
}

parse_lct() {
    ncdump-hdf -v Land_Cover_Type_1_Percent $1.hdf | sed 1,702d | tr -d ',;}' | awk '{
        for (i=1; i<=NF; i++) printf "%03d ", $i}' | fold -w489600 > $1.dat
}

prepare() { parse_mlc 2001; parse_mlc 2019; parse_lct 2001; parse_lct 2019; }

aus() { ( echo -e 'P3\n830 680\n255'; sed -n 2001,2680p $1.mlc | awk '{
    for (x=5851; x<=6680; x++) {
        if ( (NR<20 && x<6100) || (NR<50 && x>6500) ) printf "000 000 128 "
        else { if ($x==0) printf "000 000 128 "
            if ($x == 01) printf "024 160 064 "
            if ($x == 02) printf "041 216 082 "
            if ($x == 03) printf "156 216 083 "
            if ($x == 04) printf "158 251 183 "
            if ($x == 05) printf "151 202 178 "
            if ($x == 06) printf "193 163 181 "
            if ($x == 07) printf "244 230 206 "
            if ($x == 08) printf "219 240 188 "
            if ($x == 09) printf "249 224 000 "
            if ($x == 10) printf "239 198 160 "
            if ($x == 11) printf "087 153 208 "
            if ($x == 12) printf "246 242 153 "
            if ($x == 13) printf "251 005 000 "
            if ($x == 14) printf "156 168 128 "
            if ($x == 15) printf "250 250 250 "
            if ($x == 16) printf "195 195 195 "
        }
    } print "" }' ) > .pnm 
    convert .pnm -fill white -stroke white -pointsize 30 -gravity NorthEast -annotate 0 "$1" aus$1.png
}

maps() { aus 2001; aus 2019; }

animate() { convert -loop 0 -delay 200 aus*.png animaus.gif; }

count() {
    sed -n 2001,2680p $1.dat | awk 'BEGIN { 
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180 
    } {
        l=(NR+2001)*180/3600-90.025
        A=(0.05*a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2
        for (t=2; t<=17; t++)
            for (x=5851; x<=6680; x++) {
                if ( (NR<20 && x<6100) || (NR<50 && x>6500) ) continue
                S[t] += $(17*x+t)/100*A
            }
    } END {
        for (t=2; t<=17; t++) SA+=S[t]
        for (t=2; t<=17; t++) {
            printf "%07.4f\n", S[t]/SA*100
        }
    }' 
}

analyze() { 
    echo 'Evergreen Needleleaf Forest
        Evergreen Broadleaf Forest
        Deciduous Needleleaf Forest
        Deciduous Broadleaf Forest
        Mixed Forests
        Closed Shrubland
        Open Shrublands
        Woody Savannas
        Savannas
        Grasslands
        Permanent Wetlands
        Croplands
        Urban and Built-up
        Cropland/Natural Vegetation Mosaic
        Snow and Ice
        Barren or Sparsely Vegetated' | tr -d '\t' > .type

    count 2001 > .2001
    count 2019 > .2019

    echo
    echo 'Type                                |   2001  |   2019  |  Change |  % Chg' 
    echo '------------------------------------+---------+---------+---------+---------'
    paste -d, .type .2001 .2019 | awk -F, '$2!=0 {
        printf "%-35s | %7.4f | %7.4f | %+7.4f | %+7.2f%\n", $1, $2, $3, $3-$2, ($3/$2-1)*100
    }'
    echo
}

This data requires user registration. Substitute <user> and <pass> with your credentials.

Surface Change

NASA provides global land cover classification data:

2011, Source

Unfortunately it stops in 2011. I did a little bit more digging and found a great resource here. What I wanted to do was show surface changes over time. Here’s my result:

Each year column shows coverage in percent, and the last column shows the percent change from 2001 to 2019.

No analysis in this post. Enjoy 🙂 -Zoe

Code

# Zoe Phin, 2021/02/25
# File: landchg.sh
# Run: source landchg.sh; require; download <user> <pass>; prepare; analyze

require() { sudo apt-get install hdf4-tools; }

download() { base="https://e4ftl01.cr.usgs.gov/MOTA/MCD12C1.006"
    wget -O 2001.hdf --user=$1 --password=$2 $base/2001.01.01/MCD12C1.A2001001.006.2018053185512.hdf
    wget -O 2010.hdf --user=$1 --password=$2 $base/2010.01.01/MCD12C1.A2010001.006.2018053185051.hdf
    wget -O 2019.hdf --user=$1 --password=$2 $base/2019.01.01/MCD12C1.A2019001.006.2020220162300.hdf
}

parse() {
    ncdump-hdf -v Land_Cover_Type_1_Percent $1.hdf | sed 1,702d | tr -d ',;}' | awk '{
    for (i=1; i<=NF; i++) 
        printf "%03d ", $i
    }' | fold -w489600 | awk '{
    for (t=1; t<=17; t++) {
        for (l=0; l<=7199; l++)
            sum += $(17*l+t)
        printf "%.4f ", sum/7200
        sum = 0
    }
    print ""
    }' > $1.lat
}

area() { awk 'BEGIN { a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180
    for (l=-89.975; l<=89.975; l+=0.05)
        printf "%.9f\n",(0.05*a*r)^2*(1-e)*cos(r*l)/(1-e*sin(r*l)^2)^2/70842.4493856
    }' > .area
}

whole() { paste -d ' ' .area $1.lat | awk '{ 
        for (i=2; i<=NF; i++) t[i] += $1*$i; 
    } END { 
        for (i=2; i<=NF; i++) printf "%.4f ", t[i]  # keep type order ("for (i in t)" iteration order is undefined)
        print "" 
    }'
}

prepare() { parse 2001; parse 2010; parse 2019; }

analyze() { area; echo 'Water
    Evergreen Needleleaf Forest
    Evergreen Broadleaf Forest
    Deciduous Needleleaf Forest
    Deciduous Broadleaf Forest
    Mixed Forests
    Closed Shrubland
    Open Shrublands
    Woody Savannas
    Savannas
    Grasslands
    Permanent Wetlands
    Croplands
    Urban and Built-up
    Cropland/Natural Vegetation Mosaic
    Snow and Ice
    Barren or Sparsely Vegetated' | tr -d '\t' > .type

    whole 2001 | tr ' ' '\n' > .2001
    whole 2010 | tr ' ' '\n' > .2010
    whole 2019 | tr ' ' '\n' > .2019
    echo 'Type                                |  2001  |  2010  |  2019  |   % Chg' 
    echo '------------------------------------+--------+--------+--------+----------'
    paste -d, .type .2001 .2010 .2019 | sed '$d' | awk -F, '{
        printf "%-35s | %6.3f | %6.3f | %6.3f | %+7.3f%\n", $1, $2, $3, $4, ($4/$2-1)*100
    }'
}

This data requires user registration. Substitute <user> and <pass> with your credentials.

Us and Enceladus

Enceladus is the 6th largest moon of Saturn. It has the distinction of being the most reflective object in the solar system.

Photo by NASA’s Cassini Probe

The bond albedo of Enceladus is 0.81.

Let’s figure out what the average temperature of Enceladus should be using the standard approach. This is determined by 2 things:

  1. Insolation
  2. Longwave Radiation from Saturn.

The combined formula is:

( ( TSI*(1-Ea)/4 + (TSI*(1-Sa)/(Ds/Rs)^2)/4 )/σ )^0.25

TSI = Total Solar Irradiance

Ea = Enceladus Bond Albedo, Sa = Saturn Albedo

Rs = Saturn Radius, Ds = Distance from Saturn to Enceladus

We use data from here, here, Albedo from [Howett 2010] & Emissivity = 1 from [Howett 2003].

Do the math:

14.82*(1-0.81)/4 + (14.82*(1-0.342)/3.9494^2)/4 =

0.704 + 0.156 = 0.86

(0.86 / 5.67e-8)^0.25 = 62.4K
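
A minimal awk sketch of the arithmetic above (not from any script in this post), using only the inputs already quoted (TSI = 14.82 W/m² at Saturn, Ea = 0.81, Sa = 0.342, Ds/Rs = 3.9494):

# Greybody estimate for Enceladus from the numbers quoted above
awk 'BEGIN {
    TSI = 14.82       # solar irradiance at Saturn, W/m^2
    Ea  = 0.81        # Enceladus bond albedo
    Sa  = 0.342       # Saturn bond albedo
    DR  = 3.9494      # Enceladus orbital distance / Saturn radius
    sig = 5.67e-8     # Stefan-Boltzmann constant
    F = TSI*(1-Ea)/4 + TSI*(1-Sa)/DR^2/4
    printf "Absorbed: %.2f W/m^2  ->  T = %.1f K\n", F, (F/sig)^0.25
}'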

So 62.4 K should be the average temperature of Enceladus. Is it?

No it’s not.

The average is about 13 K higher. This just goes to show that the standard climate science approach of using greybody calculations is wrong. It’s wrong everywhere except where temperatures accidentally correspond.

There is also definitely no explanation for Enceladus’ south pole aside from internal heat.

And if tiny planetary bodies have plenty of leaking internal heat, may not the Earth?

Based on data from previous flybys, which did not show the south pole well, team members expected that the south pole would be very cold, as shown in the left panel. Enceladus is one of the coldest places in the Saturn system because its extremely bright surface reflects 80 percent of the sunlight that hits it, so only 20 percent is available to heat the surface. As on Earth, the poles should be even colder than the equator because the sun shines at such an oblique angle there…

Equatorial temperatures are much as expected, topping out at about 80 degrees Kelvin (-315 degrees Fahrenheit), but the south pole is occupied by a well-defined warm region reaching 85 Kelvin (-305 degrees Fahrenheit). That is 15 degrees Kelvin (27 degrees Fahrenheit) warmer than expected. The composite infrared spectrometer data further suggest that small areas of the pole are at even higher temperatures, well over 110 degrees Kelvin (-261 degrees Fahrenheit). Evaporation of this relatively warm ice probably generates the cloud of water vapor detected above Enceladus’ south pole by several other Cassini instruments.

The south polar temperatures are very difficult to explain if sunlight is the only energy source heating the surface, though exotic sunlight-trapping mechanisms have not yet been completely ruled out. It therefore seems likely that portions of the polar region are warmed by heat escaping from the interior of the moon. This would make Enceladus only the third solid body in the solar system, after Earth and Jupiter’s volcanic moon Io, where hot spots powered by internal heat have been detected.

NASA

Don’t expect NASA to tell you how much Earth’s internal hotspots contribute to recent warming.

-Z

Matter only cooling to matter Matters

About a year ago, researcher Willis Eschenbach proposed a simple problem at WUWT that relates conduction and radiation in order to show that, at steady state, Cold Side Radiation (CSR) equals Conductive Heat Flux (CHF), and that therefore my articles (here and here) “must” be in error. Here’s a brief outline:

Problem Parameters: Block in Space

Absorptivity = Emissivity = 0.95
Thermal conductivity: k = 0.8 W/(m*K)
Volume = 1 m³ … A = L = 1
No radiation to / from stars
Adiabatic Wall on 4 of 6 sides
Input = 1360 W/m² (~Sun at Earth Distance)

Willis’ solution setup was:

We have two conditions that must be met at steady-state. First, the amount of energy entering the block must be equal to the amount of energy leaving the block. The amount entering is equal to 1360 * epsilon, which I’ve said is the emissivity (and thus the absorptivity) at all frequencies. The amount leaving the block is equal to sigma epsilon (T_hot^4 +T_cold^4). So the first equation is:

sigma epsilon (T_hot^4 + T_cold^4) == 1360 epsilon [eqn1]

The second condition at steady-state is that the flow through the block has to be equal to the flow out of the cold side. The flow through the block is k (T_hot – T_cold), and the cold side radiation is sigma epsilon T_cold ^4, so the second equation is:

sigma epsilon T_cold^4 == k (T_hot – T_cold) [eqn2]

Willis’ Comment

The majority of commenters had pretty much the same idea. I will call Willis et al.‘s solution Wal for short, so as not to single out poor Willis.

Let’s clean up the equations:

This is actually a complicated pair of simultaneous equations to solve by hand. I liked Greg’s gnuplot solution the best:

                   Temperature   Radiation
Sunny (Hot) Side   383.338 K     1163.2 W/m²
Far (Cold) Side    221.410 K     129.5 W/m²

Wal’s Steady State “Solution”

2D Representation of “Solution”
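
For reference, here is a small awk bisection (just a check, not Greg’s gnuplot script) that solves Wal’s eqn1 and eqn2 with ε = 0.95 and k = 0.8; it lands within a fraction of a Kelvin of the table above (the small residual comes down to which TSI and σ digits are used):

# Wal's system:  eqn1: sigma*(Th^4 + Tc^4) = 1360    (the epsilons cancel)
#                eqn2: sigma*eps*Tc^4     = k*(Th - Tc)
# Bisect on T_cold; T_hot follows from eqn1.
awk 'BEGIN {
    sig = 5.67e-8; eps = 0.95; k = 0.8; S = 1360
    lo = 1; hi = 390                  # keep hi below (S/sig)^0.25 so Th stays real
    for (i = 0; i < 60; i++) {
        Tc = (lo + hi) / 2
        Th = (S/sig - Tc^4)^0.25      # from eqn1
        f  = sig*eps*Tc^4 - k*(Th - Tc)   # eqn2 residual
        if (f > 0) hi = Tc; else lo = Tc
    }
    printf "Hot side: %.2f K   Cold side: %.2f K\n", Th, Tc
}'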

So is this correct?

I’m afraid not. One of Wal’s two criteria is an insistence that the block must radiate out to empty space. The implicit justification for this is Stefan-Boltzmann’s Law, as Wikipedia explains:

That’s all well and good, but I don’t recall Boltzmann doing or analyzing experiments with radiation into the void of space, only inside cavities. Are cavities matter or space?

And what is up with j*? What does the star mean? Wikipedia doesn’t go into that very important detail. But it does accidentally hint at it:

The fact that the energy density of the box containing radiation is proportional to T^4 can be derived using thermodynamics.

A box you say? Interesting. The star probably indicates potential, not guaranteed, radiation.

It is common for textbooks to completely leave out the detail and context of the development of Stefan-Boltzmann’s Law (SB Law). For example:

This gives the impression that a body will radiate into the void of space with nothing in sight. And so people who don’t know any better might assume that’s the case!

But how did that chapter begin?

Oh, it begins with a hot object inside a vacuum chamber! Later on …

Kirchhoff’s Law is also derived from a “small body contained in a large isothermal enclosure”.

As you’re probably aware, SB Law is intimately connected to Planck’s Law. In fact, it’s just the integral over all wavelengths using Planck’s spectral density formula. So it’s very important to know how Planck derived his formula.

You caught it? “in a cavity”, “no net flow of energy between the body and its environment”.

But if a body were forced to emit radiation to empty space (without getting equal radiation in return), that would violate the “no net flow of energy between the body and its environment” rule. This is obviously not a problem “in a cavity”.

Planck’s law describes the unique and characteristic spectral distribution for electromagnetic radiation in thermodynamic equilibrium, when there is no net flow of matter or energy. Its physics is most easily understood by considering the radiation in a cavity with rigid opaque walls. Motion of the walls can affect the radiation. If the walls are not opaque, then the thermodynamic equilibrium is not isolated. It is of interest to explain how the thermodynamic equilibrium is attained. There are two main cases: (a) when the approach to thermodynamic equilibrium is in the presence of matter, when the walls of the cavity are imperfectly reflective for every wavelength or when the walls are perfectly reflective while the cavity contains a small black body (this was the main case considered by Planck); or (b) when the approach to equilibrium is in the absence of matter, when the walls are perfectly reflective for all wavelengths and the cavity contains no matter.

Wikipedia

All I see is cavity, cavity, cavity. Where’s the radiation to nowhere? Nowhere. Let’s see the starting steps of Planck’s Law derivation:

Wikipedia

The derivation of Planck’s Law starts off by considering how much photon energy can fit in a box. As you can see, the dimension of the box matters! The intensity of photons in each wavelength is constrained by length (L)!

Only later is the volume of the box taken out to find the spectral density. The key thing to note is that you can only derive Planck’s Law by using Wave Theory, never Particle Theory. And you need these waves to go from matter to matter, never matter to nothing. If you try to think about waves going from matter to nothing, why would there be a constraint by length (L)? There would be no such constraint in this case. But if there’s no L constraint, then there should be no difference between intensity at different wavelengths. Then Planck’s Law would be wrong, the blackbody curve would be wrong, and SB Law would be wrong. They’re not wrong. There’s simply no emission from matter to NO thing.

For matter not enclosed in such a cavity, thermal radiation can be approximately explained by appropriate use of Planck’s law.

Wikipedia

The appropriate use of Planck’s Law and SB Law would incorporate the View Factor(s) between 2 or more surfaces.

The source gives a lot of examples of different view factors for different problems.

The main point is that actual heat transfer occurs between matter, never matter and nothing.

I like the resistor analogy. Notice the space resistance. According to Wal’s theory, empty space has a view factor of 1, and thus their space resistance is 1/A, or just 1 by the chosen parameters. In actuality, empty space has a view factor of ZERO, and thus the resistance to emit to empty space is infinite. So it doesn’t happen.

What Wal would like to do is treat space as a surface with a view factor of 1, and drop the T₂ (and ε₂) terms.

Let’s think about why this is physically inappropriate. If space really had a view factor of 1, every textbook heat transfer problem would have arrows drawn out in every direction from the hotter object, not just toward another surface. All those arrows would indicate a radiative drain on the hotter object. Every textbook would thus be incorrectly draining the hotter object and deceiving its students. Do you think textbooks are deceiving their students by offering a simplified view that neglects the “obvious” radiation to space?

Or maybe the more likely explanation is this radiation to empty space doesn’t happen?

Let’s see what NASA scientists think. Here’s a link to their paper: A METHOD FOR THE THERMAL ANALYSIS OF SPACECRAFT, INCLUDING ALL MULTIPLE REFLECTIONS AND SHADING AMONG DIFFUSE, GRAY SURFACES.

Source, Pages 10-11

Do you see the emission to empty space? Me neither. It doesn’t exist.

Now I know this paper is from 1970, but it’s the best I could find on the topic. Being right doesn’t change with time, so there’s no need to update the fundamentals. If you can find emission to space inside NASA’s formulas elsewhere, please let me know.

If the lack of emission to space in this document is wrong, then NASA’s scientists are underestimating the cooling, and thus risking the lives of astronauts. All of these simultaneous equations would give the wrong results if the emission to empty space actually occurred. Luckily it doesn’t, and astronauts landed on the moon and returned safely because of it.

I hope I have convinced you that there’s no emission (cooling) to empty space. Matter only cools to matter matters!


Let’s move on. The other problem with Wal’s theory is their understanding of what Steady State implies in their problem.

They think that steady state means you set the conductive flux equal to the radiative output of the cold side. Wrong!

You need to set the heat transfer of the input equal to the conductive flux.

They do not satisfy the third criterion. The “heat in” is not 1360 W/m²; that’s only the input in the 1st second to a block at 0 Kelvin.

There is no heat out at all on the space side, as we just discussed.

The proper solution to the problem was the minority opinion:

Both sides will come to the same temperature of 393.534 Kelvin. We satisfy all steady state criteria. Disagree? Look again: the 2nd criterion says “can”, not “must”.
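
A one-liner check of where that figure comes from, assuming it is just the temperature at which the sunlit face’s emission σεT⁴ balances the absorbed 1360ε W/m² (the ε’s cancel; σ = 5.670374e-8 W/m²K⁴):

awk 'BEGIN { printf "%.3f K\n", (1360/5.670374e-8)^0.25 }'   # ~393.534 K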

Any questions?

Why was this example important? Because mainstream climate scientists also feel that greenhouse gases block radiation to nothing, just because they sense a lot of it on a remote satellite, which is not nothing. 🤣. They too are in error, but this discussion will come later.

Enjoy 🙂 -Zoe

Clarification

I’m not suggesting an object can only cool if another object is present; I’m only saying it can only cool radiatively to another object. If an object’s internal energy is lowered, then it cools. Internal processes are not perpetual; they will degrade over time.

Update

A paper with no mention of heat transfer to space.

Trend in Global Fires

Climate alarmists claim that an increase in man-made greenhouse gas emissions will cause more fires. For example …

Human-induced climate change promotes the conditions on which wildfires depend, increasing their likelihood …

ScienceDaily

Funk … says there is very well documented scientific evidence that climate change has been increasing the length of the fire season, the size of the area burned each year and the number of wildfires.

DW

The clearest connection between global warming and worsening wildfires occurs through increasing evapotranspiration and the vapor-pressure deficit.  In simple terms, vegetation and soil dry out, creating more fuel for fires to expand further and faster.

… Global warming will keep worsening wildfires …

SkepticalScience

Sounds serious. Is it true?

We show that fire weather seasons have lengthened across 29.6 million km2 (25.3%) of the Earth’s vegetated surface, resulting in an 18.7% increase in global mean fire weather season length. We also show a doubling (108.1% increase) of global burnable area affected by long fire weather seasons and an increased global frequency of long fire weather seasons across 62.4 million km2 (53.4%) during the second half of the study period.

— Nature: Climate-induced variations in global wildfire danger from 1979 to 2013

This is just about the most scientific paper I could find on the issue. Why are they obsessed with the length of the fire season? Why can’t they just answer the simple question: Is there more or less fire?

NASA has collected daily data on Active Fires since 2000.

Active Fires, March 2000 [Source]

I downloaded and analyzed all of their Active Fires data. Here’s the result:

Now it all makes sense. Climate scammers need to cherrypick locations and seasons in order to distract from the empirical truth that global fires have been decreasing. Disgusting.

Enjoy 🙂 -Zoe

# Zoe Phin, 2021/02/16
# File: fire.sh
# Run: source fire.sh; require; sets; download; index; plot

require() { sudo apt-get install -y gmt gnuplot; }

sets() {
    for y in {2000..2021}; do
        wget -qO- "https://neo.sci.gsfc.nasa.gov/view.php?datasetId=MOD14A1_E_FIRE&year=$y"
    done | awk -F\' '/"viewDataset/{print $4" "$2}' > sets.csv 
}

download() {
    rm -f wget.log [0-9]*.csv
    awk '{print "wget -a wget.log -O "$1".csv \"https://neo.sci.gsfc.nasa.gov/servlet/RenderData?si="$2"&cs=rgb&format=CSV&width=360&height=180\""}' sets.csv > sets.sh
    bash sets.sh
}

area() {
    seq -89.5 1 89.5 | awk '{
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180
        printf "%.9f\n",(a*r)^2*(1-e)*cos(r*$1)/(1-e*sin(r*$1)^2)^2/1416867.06
    }' > .area
}

avg() {
    awk -F, '{
        for (i=2;i<=NF;i++) { 
            if ($i~/999/) $i=0
            S+=$i; N+=1 }
        printf "%s %.4f\n", $1, S/N
    }' | awk '{ S+=$1*$2 
    } END { printf "%0.4f\n", S }'
}

index() { area
    for f in $(ls -1 2*.csv); do
        echo -n "${f/.csv/} "
        paste -d, .area $f | avg
    done > .csv
}

plot() { 
    awk '$2>0.02 {"date +%j -d "$1 | getline t; 
        print substr($1,1,4)+t/365" "$2 }' .csv | gmt gmtregress | tee .trend | sed 1d | tr '\t' ' ' | cut -c-25 > plot.csv
    echo "
        set term png size 740,420
        set key outside top center horizontal
        set ytics format '%4.2f'
        set ytics 0.01; set mytics 5
        set xtics 2; set mxtics 2
        set xrange [2000:2021]
        set grid xtics mxtics ytics
        plot 'plot.csv' u 1:2 t 'Active Fires Index' w lines lw 2 lc rgb '#DD0000',\
                     '' u 1:3 t 'Linear Trend'  w lines lw 3 lc rgb '#440000'		
    "| gnuplot > fire.png 
}

Fortunate Global Greening

NASA offers a data product called a Vegetation Index. This can be used to track how green the Earth is.

February 2000, [Source]

Although many are familiar with recent global greening, I prefer to always check the source data. And so I downloaded all of their available 16-day-increment data from 2000 to 2021. Here’s my result:

0.0936 --> 0.1029 is +9.94%

10% global greening in 20 years! We are incredibly fortunate!

I just wish everyone felt that way. But you know not everyone does. The extent to which humans enhance global greening is precisely what social parasites want to tax and regulate. No good deed goes unpunished.

Anyway, Enjoy 🙂 -Zoe

P.S. The Earth is ~29% land. A Veg Index of ~0.29 would therefore mean all land is covered by heavy vegetation.

# Zoe Phin, 2021/02/16
# File: veg.sh
# Run: source veg.sh; sets; download; index; plot

sets() {
    for y in {2000..2021}; do
        wget -qO- "https://neo.sci.gsfc.nasa.gov/view.php?datasetId=MOD_NDVI_16&year=$y" 
    done | awk -F\' '/"viewDataset/{print $4" "$2}' > sets.csv 
}

download() {
    rm -f wget.log [0-9]*.csv
    awk '{print "wget -a wget.log -O "$1".csv \"https://neo.sci.gsfc.nasa.gov/servlet/RenderData?si="$2"&cs=rgb&format=CSV&width=360&height=180\""}' sets.csv > sets.sh
    bash sets.sh
    rm -f 201[789]-12-31.csv
}

area() {
    seq -89.5 1 89.5 | awk '{
        a=6378.137; e=1-6356.752^2/a^2; r=atan2(0,-1)/180
        printf "%.9f\n",(a*r)^2*(1-e)*cos(r*$1)/(1-e*sin(r*$1)^2)^2/1416867.06
    }' > .area
}

avg() {
    awk -F, '{
        for (i=2;i<=NF;i++) { 
            if ($i~/999/) $i=0
            S+=$i; N+=1
        }
        printf "%s %.4f\n", $1, S/N
    }' | awk '{ 
        S+=$1*$2 
    } END { 
        printf "%0.4f\n", S
    }'
}

yoy() {
    cat .csv | cut -c12- | tr '\n' ' ' | awk -vp=$1 '{
        for (i=0;i<p/2;i++) print ""
        for (i=p/2;i<=NF-p/2;i++) { s=0
            for (j=i-p/2; j<=i+p/2; j++)
                s+=$j/(p+1)
            printf "%.4f\n", s
        }
    }' > .yoy
}

index() { area
    for f in $(ls -1 2*.csv); do
        echo -n "${f/.csv/} "
        paste -d, .area $f | avg
    done > .csv
}

plot() { 
    yoy 22; paste -d ' ' .csv .yoy > plot.csv
    sed -n '12p;$p' .yoy | tr '\n' ' ' | awk '{printf "%s --> %s, is %+0.2f%%\n", $1, $2, ($2/$1-1)*100 }'
    echo "
        set term png size 740,620
        set key outside top center horizontal
        set timefmt '%Y-%m-%d'
        set xdata time
        set xtics format '%Y'
        set ytics format '%4.2f'
        set ytics 0.01
        set mxtics 2
        set mytics 5
        set xrange ['2000-01-01':'2020-12-31']
        set grid xtics mxtics ytics
        plot 'plot.csv' u 1:2 t 'Vegetation Index ' w lines lw 2 lc rgb '#00CC00',\
                     '' u 1:3 t '1-Year Moving Avg' w lines lw 3 lc rgb '#005500'		
    "| gnuplot > veg.png 
}

Average Moon Day and Night Temperatures

NASA’s Moon Fact Sheet doesn’t give the diurnal temperature range for the entire moon, just the equator:

Diurnal temperature range (equator): 95 K to 390 K

Strange. They have collected the data. Why didn’t they do the calculations? So I could do it?

I went through every 15 degree increment longitude data available here.

Day is the center hot spot +/- 90 degrees. Night is everything outside of that.

Here’s my result:

Lon    Day    Night
000: 303.914 099.629 
015: 304.115 099.809 
030: 304.250 099.569 
045: 304.342 099.402 
060: 303.527 099.818 
075: 303.196 099.688 
090: 302.704 099.543 
105: 302.347 099.650 
120: 301.705 099.676 
135: 301.474 099.267 
150: 301.550 099.314 
165: 300.939 099.281 
180: 300.458 099.378 
195: 301.062 099.347 
210: 301.293 099.516 
225: 302.147 099.307 
240: 303.114 099.249 
255: 302.813 099.433 
270: 302.921 099.221 
285: 303.267 099.054 
300: 303.318 099.161 
315: 303.682 099.245 
330: 303.588 099.397 
345: 304.116 099.122 

Avg: 302.743 099.420

Whole Moon:  201.082

As you can see, the whole Moon averages 99.420 K over its night hemisphere and 302.743 K over its day hemisphere, with a 24-moon-hour average of 201.082 K.

I assume that day and night are each a 12-moon-hour period. This may not philosophically be so, but my whole purpose was to figure out the difference between the light and dark equal-area hemispheres, not to compare unequal light and dark areas.
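
A quick check of the bottom lines, given the equal-weight day and night hemispheres just described:

awk 'BEGIN { d = 302.743; n = 99.420; printf "Whole Moon: %.4f K\n", (d+n)/2 }'   # 201.0815, i.e. the 201.082 above within rounding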

I’ll contact NASA’s Fact Sheet webadmin and ask them to update it.

Enjoy 🙂 -Zoe

# Zoe Phin, 2021/02/15
# File: moont.sh
# Run: source moont.sh; download; calc

download() {
    for l in {000..345..15}; do                   
        echo http://luna1.diviner.ucla.edu/~jpierre/diviner/level4_raster_data/diviner_tbol_snapshot_${l}E.xyz        
    done | wget -ci -
}

avg() {
    awk '{ f=1/581.9; e=2*f-f^2; r=atan2(0,-1)/180
        T[$2]+=$3; N[$2]+=1; A[$2]+=r/1438.355*(1-e)*cos(r*$2)/(1-e*sin(r*$2)^2)^2
    } END {for (L in T) printf "%+06.2f %7.3f %.15f\n", L, T[L]/N[L], A[L]}' | awk '{
        T+=$2*$3 } END { printf "%07.3f ", T }'
}

calc() {
    for l in {000..345..15}; do                   
        echo -n "$l: "
        cat *${l}E.xyz | awk -vl=$l '
        (l=="000" && ($1>-90 && $1<90 ))			{ print }
        (l=="015" && ($1>-75 && $1<105))			{ print }
        (l=="030" && ($1>-60 && $1<120))			{ print }
        (l=="045" && ($1>-45 && $1<135))			{ print }
        (l=="060" && ($1>-30 && $1<150))			{ print }
        (l=="075" && ($1>-15 && $1<165))			{ print }
        (l=="090" && ($1>0   && $1<180))  			{ print }
        (l=="105" && ($1>15 && $1<180 || $1<-165))	{ print }
        (l=="120" && ($1>30 && $1<180 || $1<-150))	{ print }
        (l=="135" && ($1>45 && $1<180 || $1<-135))	{ print }
        (l=="150" && ($1>60 && $1<180 || $1<-120))	{ print }
        (l=="165" && ($1>75 && $1<180 || $1<-105))	{ print }
        (l=="180" && ($1>90 && $1<180 || $1<-90 ))	{ print }
        (l=="195" && ($1>105 || $1<-75))			{ print }
        (l=="210" && ($1>120 || $1<-60))			{ print }
        (l=="225" && ($1>135 || $1<-45))			{ print }
        (l=="240" && ($1>150 || $1<-30))			{ print }
        (l=="255" && ($1>165 || $1<-15))			{ print }
        (l=="270" && ($1<0 ))						{ print }
        (l=="285" && ($1<15 && $1>-165))			{ print }
        (l=="300" && ($1<30 && $1>-150))			{ print }
        (l=="315" && ($1<45 && $1>-135))			{ print }
        (l=="330" && ($1<60 && $1>-120))			{ print }
        (l=="345" && ($1<75 && $1>-105))			{ print }
        ' | avg
        cat *${l}E.xyz | awk -vl=$l '
        (l=="000" && !($1>-90 && $1<90 ))			{ print }
        (l=="015" && !($1>-75 && $1<105))			{ print }
        (l=="030" && !($1>-60 && $1<120))			{ print }
        (l=="045" && !($1>-45 && $1<135))			{ print }
        (l=="060" && !($1>-30 && $1<150))			{ print }
        (l=="075" && !($1>-15 && $1<165))			{ print }
        (l=="090" && !($1>0   && $1<180))  			{ print }
        (l=="105" && !($1>15 && $1<180 || $1<-165))	{ print }
        (l=="120" && !($1>30 && $1<180 || $1<-150))	{ print }
        (l=="135" && !($1>45 && $1<180 || $1<-135))	{ print }
        (l=="150" && !($1>60 && $1<180 || $1<-120))	{ print }
        (l=="165" && !($1>75 && $1<180 || $1<-105))	{ print }
        (l=="180" && !($1>90 && $1<180 || $1<-90 ))	{ print }
        (l=="195" && !($1>105 || $1<-75))			{ print }
        (l=="210" && !($1>120 || $1<-60))			{ print }
        (l=="225" && !($1>135 || $1<-45))			{ print }
        (l=="240" && !($1>150 || $1<-30))			{ print }
        (l=="255" && !($1>165 || $1<-15))			{ print }
        (l=="270" && !($1<0 ))						{ print }
        (l=="285" && !($1<15 && $1>-165))			{ print }
        (l=="300" && !($1<30 && $1>-150))			{ print }
        (l=="315" && !($1<45 && $1>-135))			{ print }
        (l=="330" && !($1<60 && $1>-120))			{ print }
        (l=="345" && !($1<75 && $1>-105))			{ print }
        ' | avg
        echo
    done | awk '
        BEGIN { print "Lon    Day    Night" }
              { D+=$2; N+=$3; print }
        END   { printf "\nAvg: %07.3f %07.3f\n\nWhole Moon:  %07.3f", D/24, N/24, D/48+N/48}'
}

### Blog Extra ###

require() { sudo apt-get install -y imagemagick; }

dlimgs() {
    for l in {000..345..15}; do                   
        wget -O L$l.png http://luna1.diviner.ucla.edu/~jpierre/diviner/level4_raster_data/diviner_tbol_snapshot_${l}E.png        
    done
}

scale() { n=0
    for l in {000..345..15}; do                   
        convert -quality 30% -scale 12% L$l.png SL$(printf "%03d" $n).jpg
        let n++
    done
}

animate() { convert -delay 30 -loop 0 SL*.jpg animoon.gif; }

Big Blue Marble

I don’t know about you, but I always thought this was a beautiful image:

Blue Marble ; Source: NASA, 2012, Showing 2004.

I decided to fix it per this article, and make a large (1920×960) animated version. The result is here. It’s 7MB, so please wait for it to load. Right-click and save image in case wordpress is annoying. I made it my wallpaper, and so can you!

Enjoy 🙂 -Zoe

# Zoe Phin, 2021/02/14
# File: terra.sh
# Run: source terra.sh; require; download; fix; animate

require() { sudo apt-get install -y netpbm imagemagick; }

download() {
    list=$(wget -qO- 'https://neo.sci.gsfc.nasa.gov/view.php?datasetId=BlueMarbleNG' | grep '"viewDataset' | cut -f2 -d "'" | tr '\n' ' ')	
    let n=1
    for si in $list; do 
        N=$(printf "%02d" $n)
        wget -O terra$N.jpg "https://neo.sci.gsfc.nasa.gov/servlet/RenderData?si=$si&cs=rgb&format=JPEG&width=1920&height=960"
        let n++
    done
}

fix() { 
    for jpg in `ls -1 terra*.jpg`; do
        pnm=${jpg/.jpg/.pnm}
        echo -e 'P3\n1920 960\n255\n' > $pnm
        jpegtopnm $jpg | pnmtoplainpnm | sed 1,3d | sed 's/  /\n/g' | awk '{printf "%03d %03d %03d ", $1, $2, $3}' | fold -w23040 > .tmp

        cut -c1-756 .tmp > .right
        cut -c756-  .tmp > .left
        paste -d '' .left .right >> $pnm

        pnmtopng $pnm > $jpg
    done
    rm -f *.pnm
}

animate() { convert -delay 50 -loop 0 terra*.jpg animterra.gif; }

Annual Leaf Cycle

Our Beautiful Living and Breathing Planet

The map is fixed per this article.

# Zoe Phin, 2021/02/13
# File: lai.sh
# To Run: source lai.sh; require; download; fix; animate

require() { sudo apt-get install -y netpbm imagemagick; }

download() {
    list=$(wget -qO- 'https://neo.sci.gsfc.nasa.gov/view.php?datasetId=MOD15A2_M_LAI&date=2016-01-01' | grep '"viewDataset' | cut -f2 -d "'" | tr '\n' ' ')	
    let n=1
    for si in $list; do 
        N=$(printf "%02d" $n)
        wget -O lai$N.jpg "https://neo.sci.gsfc.nasa.gov/servlet/RenderData?si=$si&cs=rgb&format=JPEG&width=750&height=375"
        sleep 1
        let n++
    done
}

fix() { 
    for jpg in `ls -1 lai*.jpg`; do
        pnm=${jpg/.jpg/.pnm}
        echo -e 'P3\n750 375\n255\n' > $pnm
        jpegtopnm $jpg | pnmtoplainpnm | sed 1,3d | sed 's/  /\n/g' | awk '{printf "%03d %03d %03d ", $1, $2, $3}' | fold -w9000 > .tmp

        cut -c1-300 .tmp > .right
        cut -c300-  .tmp > .left
        paste -d '' .left .right >> $pnm

        pnmtopng $pnm > $jpg
    done
    rm -f *.pnm
}

animate() { convert -delay 25 -loop 0 lai*.jpg animlai.gif; }

Happy Valentines Day!

❤ -Zoe

Effect of Clouds on Global Upwelling Radiation

I downloaded and analyzed 10 gigabytes of data, fully covering the years 2003 to 2019, from “the only project worldwide whose prime objective is to produce global climate data records of ERB [Earth’s Radiation Budget] from instruments designed to observe the ERB” [site] [data]. The goal was to see the effect of clouds at the surface, especially on Upwelling Longwave Radiation (LW_UP).

NASA Reminds us …

High clouds are much colder than low clouds and the surface. They radiate less energy to space than low clouds do. The high clouds in this image are radiating significantly less thermal energy than anything else in the image. Because high clouds absorb energy so efficiently, they have the potential to raise global temperatures. In a world with high clouds, much of the energy that would otherwise escape to space is captured in the atmosphere. High clouds make the world a warmer place. If more high clouds were to form, more heat energy radiating from the surface and lower atmosphere toward space would be trapped in the atmosphere, and Earth’s average surface temperature would climb.

NASA

In contrast to the warming effect of the higher clouds, low stratocumulus clouds act to cool the Earth system. Because lower clouds are much thicker than high cirrus clouds, they are not as transparent: they do not let as much solar energy reach the Earth’s surface. Instead, they reflect much of the solar energy back to space (their cloud albedo forcing is large). Although stratocumulus clouds also emit longwave radiation out to space and toward the Earth’s surface, they are near the surface and at almost the same temperature as the surface. Thus, they radiate at nearly the same intensity as the surface and do not greatly affect the infrared radiation emitted to space (their cloud greenhouse forcing on a planetary scale is small). On the other hand, the longwave radiation emitted downward from the base of a stratocumulus cloud does tend to warm the surface and the thin layer of air in between, but the preponderant cloud albedo forcing shields the surface from enough solar radiation that the net effect of these clouds is to cool the surface.

NASA

Here’s the global percent of clouds by type:

Clouds  Average    2003     2004     2005     2006     2007     2008     2009     2010     2011     2012     2013     2014     2015     2016     2017     2018     2019     
Type_1  008.379  007.999  007.839  008.140  008.443  008.367  008.345  008.524  008.550  008.229  008.157  007.984  007.999  008.028  008.256  008.465  009.469  009.641
Type_2  024.677  023.556  023.799  024.149  024.438  024.168  024.382  024.580  024.419  024.181  024.766  024.539  024.534  024.796  025.193  025.493  026.317  026.195
Type_3  036.259  035.815  035.721  035.894  036.028  035.646  036.004  036.248  035.742  035.566  036.363  036.144  036.194  036.563  036.856  036.918  037.531  037.173
Type_4  066.637  067.458  067.597  067.381  066.701  066.395  066.248  066.500  066.579  066.149  066.087  066.093  066.003  066.103  066.577  066.569  067.425  066.972
Type_5  133.275  134.917  135.194  134.763  133.403  132.790  132.496  133.001  133.157  132.298  132.173  132.186  132.007  132.206  133.154  133.139  134.851  133.944

Cloud Types:  1 = High (50-300 mb), 2 = UpperMid (300-500 mb), 3 = LowerMid (500-700 mb), 4 = Low (700 mb-Surface), 5 = Total (50 mb - Surface)

The project keeps track of 4 different types of observed LW_UP: All, Clr, AllNoAero, and Pristine. All is normal observed sky. Clr (clear) is no clouds. AllNoAero is All minus aerosols. Pristine is Clr minus aerosols.

Since clouds play an important role in Earth’s supposed greenhouse effect, and this effect leads to a supposed serious warming at the surface, we should see a very large difference between all these 4 scenarios.

The results (Units = W/m²):

Series               Average     2003     2004     2005     2006     2007     2008     2009     2010     2011     2012     2013     2014     2015     2016     2017     2018     2019     
clr_sfc_lw_up        397.445  397.191  396.820  397.667  397.222  397.033  396.243  396.924  397.166  396.364  396.883  397.063  397.361  398.266  398.894  398.455  398.166  398.848
all_sfc_lw_up        398.167  397.921  397.559  398.404  397.945  397.750  396.955  397.632  397.876  397.076  397.598  397.795  398.090  398.992  399.625  399.189  398.874  399.551
pristine_sfc_lw_up   397.387  397.135  396.763  397.610  397.165  396.974  396.182  396.866  397.107  396.306  396.825  397.006  397.305  398.207  398.836  398.397  398.106  398.790
allnoaero_sfc_lw_up  398.129  397.885  397.522  398.368  397.907  397.711  396.914  397.594  397.838  397.038  397.560  397.758  398.054  398.953  399.587  399.152  398.834  399.513

But in fact there is very little difference. The difference in surface LW_UP between a Pristine sky (no clouds, no aerosols) and All sky (see above cloud data) is just 0.78 W/m².

I would even argue it might be ZERO. It’s only not zero because a satellite can’t measure the same scenario in the same place at the same time; it can only measure some place nearby, or the same place at another time. Even if I’m wrong on this, this value is still very unimpressive.
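
That 0.78 W/m² is just the All minus Pristine entries in the Average column above:

awk 'BEGIN { printf "%.2f W/m^2\n", 398.167 - 397.387 }'   # all_sfc_lw_up minus pristine_sfc_lw_up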

Now let’s look at downwelling longwave radiation (LW_DN) and longwave radiation at the top of the atmosphere (TOA_LW):

Series               Average    2003     2004     2005     2006     2007     2008     2009     2010     2011     2012     2013     2014     2015     2016     2017     2018     2019     
clr_sfc_lw_dn        317.924  317.702  317.175  318.077  317.760  317.364  316.483  317.572  318.370  316.923  317.328  317.615  318.045  319.242  319.663  318.692  318.146  318.559
all_sfc_lw_dn        347.329  347.436  347.344  348.132  347.250  346.673  345.582  346.526  347.440  346.029  346.573  347.385  347.673  348.678  349.256  348.454  346.994  347.173
pristine_sfc_lw_dn   316.207  316.004  315.473  316.394  316.063  315.611  314.691  315.852  316.654  315.192  315.589  315.934  316.384  317.490  317.954  316.968  316.400  316.867
allnoaero_sfc_lw_dn  346.359  346.490  346.395  347.196  346.297  345.669  344.546  345.549  346.466  345.048  345.590  346.448  346.754  347.694  348.296  347.489  345.987  346.195

Series               Average    2003     2004     2005     2006     2007     2008     2009     2010     2011     2012     2013     2014     2015     2016     2017     2018     2019     
clr_toa_lw_up        262.503  262.373  262.267  262.645  262.446  262.584  262.087  262.268  262.521  262.179  262.185  262.499  262.543  262.668  263.075  262.942  262.535  262.735
all_toa_lw_up        237.889  237.990  237.924  238.257  237.970  238.339  237.685  237.764  238.165  237.975  237.581  237.895  237.973  238.027  237.999  237.848  237.167  237.557
pristine_toa_lw_up   262.979  262.833  262.720  263.102  262.911  263.070  262.598  262.743  262.988  262.665  262.684  262.965  263.009  263.165  263.547  263.419  263.033  263.198
allnoaero_toa_lw_up  238.168  238.260  238.189  238.523  238.242  238.626  237.987  238.042  238.438  238.260  237.874  238.167  238.245  238.320  238.274  238.126  237.456  237.827

Let’s now compare the averages side by side for all 3:

Series               Average

clr_toa_lw_up        262.503
all_toa_lw_up        237.889
pristine_toa_lw_up   262.979
allnoaero_toa_lw_up  238.168

clr_sfc_lw_dn        317.924
all_sfc_lw_dn        347.329
pristine_sfc_lw_dn   316.207
allnoaero_sfc_lw_dn  346.359

clr_sfc_lw_up        397.445
all_sfc_lw_up        398.167
pristine_sfc_lw_up   397.387
allnoaero_sfc_lw_up  398.129

The standard greenhouse effect narrative is that infrared-absorbing gases prevent radiation from reaching space and this causes warming at the surface (thus more radiation). Well, we clearly see that’s not the case. If clouds (water vapor + aerosols) hardly change outgoing surface radiation, then the whole hypothesis is in error. Less top-of-atmosphere outgoing radiation doesn’t cause surface heating and thus more radiation from the surface, despite the increase in downwelling radiation.

Enjoy 🙂 -Zoe

Update 02/28

Resident Biden’s Senior Climate Advisor reminds us

We quantify the impact of each individual absorber in the total effect by examining the net amount of long‐wave radiation absorbed in the atmosphere (G, global annual mean surface upwelling LW minus the TOA LW upwelling flux) [Raval and Ramanathan, 1989; Stephens and Greenwald, 1991]. This is zero in the absence of any long‐wave absorbers, and around 155 W/m2 in the present‐day atmosphere [Kiehl and Trenberth, 1997]. This reduction in outgoing LW flux drives the 33°C greenhouse effect defined above, and is an easier diagnostic to work with.

Gavin Schmidt et al.

that the greenhouse effect (G) is just SFC_LW_UP minus TOA_LW_UP. So let’s do that for all scenarios:

clr         397.445 - 262.503 = 134.942
all         398.167 - 237.889 = 160.278
pristine    397.387 - 262.979 = 134.408
allnoaero   398.129 - 238.168 = 159.961
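
The same arithmetic, scripted from the two Average columns above (just a check, not part of clouds.sh):

# G = SFC_LW_UP - TOA_LW_UP for each sky scenario (Average columns above)
awk 'BEGIN {
    n = split("clr all pristine allnoaero", s, " ")
    split("397.445 398.167 397.387 398.129", sfc, " ")
    split("262.503 237.889 262.979 238.168", toa, " ")
    for (i = 1; i <= n; i++) printf "%-10s G = %7.3f W/m^2\n", s[i], sfc[i]-toa[i]
}'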

So there is definitely a mathematical “greenhouse effect” difference between the 4 scenarios, and yet this makes no difference to surface upwelling radiation, and by extension to surface temperature.

Varying the amount of “greenhouse effect” means nothing to surface temperature.

Since the absorption of radiation by IR active gases makes no difference to surface temperature, the greenhouse effect hypothesis is simply incorrect and should be abandoned for the sake of empirical science.

-Zoe

Code clouds.sh:

# Zoe Phin, 2021/02/09

require() { sudo apt install -y hdf4-tools; }

download() { 
    mkdir -p ceres; n=4
    for y in {2003..2019}; do 
        for m in {01..12}; do
            [ $y$m -ge 201507 ] && n=5
            [ $y$m -ge 201603 ] && n=6
            [ $y$m -ge 201802 ] && n=7
            wget -O ceres/$y$m.hdf -c "https://opendap.larc.nasa.gov/opendap/hyrax/CERES/SYN1deg-Month/Terra-Aqua-MODIS_Edition4A/$y/$m/CER_SYN1deg-Month_Terra-Aqua-MODIS_Edition4A_40${n}406.$y$m.hdf"
        done
    done
}

cmd() { ncdump-hdf -l999 ceres/$1$2.hdf -v "$3"; }

lwup() { series='init_clr_sfc_lw_up init_all_sfc_lw_up init_pristine_sfc_lw_up init_allnoaero_sfc_lw_up'; lw; }
lwdn() { series='init_clr_sfc_lw_dn init_all_sfc_lw_dn init_pristine_sfc_lw_dn init_allnoaero_sfc_lw_dn'; lw; }
lwta() { series='init_clr_toa_lw_up init_all_toa_lw_up init_pristine_toa_lw_up init_allnoaero_toa_lw_up'; lw; }

lw() {
    printf "\n%-20s %-11s" Series Average
    for y in {2003..2019}; do printf "$y     "; done; echo

    for s in $(echo $series); do
        printf "%-20s = " $s
        for y in {2003..2019}; do
            for m in {01..12}; do
                cmd $y $m ${s}_glob | sed -n 3173,+2p
            done | awk -vv="${s}_glob" '$1==v{s+=$3}END{printf "%07.3f ",s/12}'
        done
        echo
    done | awk '{ s=0
        for(i=3;i<=NF;i++) s+=$i; 
        $2 = sprintf("%07.3f", s/17); 
        printf "%s\n", $0
    }' | sed -r 's/init_|adj_//' | column -t
}

clouds() {
    rm -f .m* .y* .cld
    printf "\n%-7s %-11s" Clouds Average
    for y in {2003..2019}; do printf "$y     "; done; echo

    printf "Type_%d =\n" $(seq 5) > .cld
    for y in {2003..2019}; do 
        for m in {01..12}; do 
            cmd $y $m obs_cld_amount_glob | sed -n 3173,+2p | grep -o '[0-9].*[0-9]' | tr ',' '\n' > .m$m
        done 
        paste .m* | awk '{ s=0; for(i=1;i<=NF;i++) s+=$i; printf "%07.3f\n", s/12 }' > .y$y
    done
    ( 	paste -d ' ' .cld .y* | awk '{ s=0
        for(i=3;i<=NF;i++) s+=$i; 
        $2 = sprintf("%07.3f", s/17); 
        printf "%s\n", $0
        }' | column -t
        echo -e '\nCloud Types:  1 = High (50-300 mb), 2 = UpperMid (300-500 mb), 3 = LowerMid (500-700 mb), 4 = Low (700 mb-Surface), 5 = Total (50 mb - Surface)'
    )
}

Run:

$ source clouds.sh; require && download
$ clouds; lwup; lwdn; lwta

Greenhouse Gases as Coolants

There, I said it. Don’t believe me? I will show you …

NASA offers an online tool for measuring the effects of clouds, aerosols, and greenhouse gases.

Set Output to OUTPUT_details. Note the CO2 (ppmv) setting in the bottom left. Click the Compute button to apply your form changes. The output appears below the form, so scroll down. Result:

Purple Ellipse = LWUP @ one meter above surface

I wrote a program to see changes to Upwelling Longwave Radiation (LWUP) at 1 meter above the surface under different CO2 ppmv settings and zones. Here is the result:

PPM  Trop   MLS    MLW    SAS    SAW
 15 456.36 421.41 309.39 382.31 246.71 
 30 456.35 421.41 309.41 382.31 246.75 
 60 456.34 421.41 309.43 382.31 246.80 
120 456.33 421.40 309.46 382.31 246.87 
180 456.32 421.40 309.47 382.31 246.91 
240 456.32 421.40 309.49 382.31 246.95 
300 456.31 421.40 309.50 382.31 246.97 
360 456.31 421.40 309.50 382.31 246.99 
420 456.30 421.40 309.51 382.31 247.01 
480 456.30 421.39 309.51 382.31 247.02 
540 456.29 421.39 309.52 382.30 247.03 
600 456.29 421.39 309.52 382.30 247.04 

Units are in W/m²

(Trop=Tropics, MLS=Mid-Latitude Summer, MLW=Mid-Latitude Winter, SAS=Subarctic Summer, SAW=Subarctic Winter)

You see it ? ? ?

NASA’s tool also allows you to edit the atmospheric composition of water vapor, by setting Atmosphere EDIT to Detail.

I automated changes to sea-level water vapor content while maintaining the same CO2 level (410 ppm) and the same zone (Mid-Latitude Summer). Result:

0.001 423.39 
0.002 423.39 
0.004 423.39 
0.010 423.39 
0.020 422.07 
0.040 421.78 
0.100 421.31 
0.200 421.13 
0.400 421.24

Anyway, that’s all the time I have for now. -Zoe

Update 02/08

While my analysis for CO2 is correct, it appears my H2O analysis was too simplistic. I have re-written the code. What I do now is use all 5 climate zones and change the water vapor content in the whole atmospheric column, not just near the surface. I divide the original content by 2, 4, 8, and 16, and then multiply by the same factors.

New Result:

  WV-X   Trop   MLS    MLW    SAS    SAW
0.0625X 455.63 420.61 308.94 381.72 246.23 
 0.125X 455.73 420.76 309.09 381.86 246.42 
  0.25X 455.83 420.91 309.24 381.98 246.61 
   0.5X 456.01 421.09 309.37 382.11 246.80 
     1X 456.30 421.40 309.51 382.31 247.01 
     2X 456.40 421.70 309.70 382.59 247.22 
     4X 456.02 421.64 309.95 382.65 247.48 
     8X 455.53 421.31 310.08 382.33 247.82 
    16X 455.41 421.12 309.91 381.96 248.11  

There is now warming in every zone but the tropics. No problem … The extra energy needed to raise the water vapor content is exactly what these calculations reflect. What you’re seeing are the raised fluxes needed to sustain the higher WV content.

Apologies if you feel the title of this article is now misleading. I strive for truth and accuracy.

Another Update 02/08 🙂

I updated the code to check every spectral surface type, not just the Ocean. The function h2o_diff tracks the change in effect from 0.0625X to 16X water vapor (a 256-fold increase). Here is the result:

   Type  Trop   MLS    MLW    SAS    SAW
    01  -1.31  -0.59  -0.22  -0.93  +0.82
    02  -1.31  -0.59  -0.22  -0.93  +0.82
    03  +0.28  +0.85  +0.80  +0.39  +1.52
    04  +0.28  +0.85  +0.80  +0.39  +1.52
    05  -0.54  +0.10  +0.27  -0.29  +1.15
    06  +2.59  +2.98  +2.45  +2.42  +2.75
    07  +8.74  +8.61  +6.77  +7.76  +5.92
    08  -0.45  +0.17  +0.30  -0.24  +1.15
    09  -0.45  +0.17  +0.30  -0.24  +1.15
    10  -0.45  +0.17  +0.30  -0.24  +1.15
    11  -0.32  +0.36  +0.64  +0.01  +1.52
    12  -0.45  +0.17  +0.30  -0.24  +1.15
    13  -2.47  -1.63  -0.94  -1.90  +0.32
    14  -0.47  +0.16  +0.30  -0.24  +1.17
    15  -2.44  -1.60  -0.91  -1.86  +0.35
    16 +11.84 +11.45  +8.95 +10.45  +7.52
    17  -0.22  +0.51  +0.97  +0.24  +1.88

 1 Evergreen Needle Forest   11 Wetlands
 2 Evergreen Broad Forest    12 Crops
 3 Deciduous Needle Forest   13 Urban
 4 Deciduous Broad Forest    14 Crop/Mosaic
 5 Mixed Forest              15 Permanent Snow
 6 Closed Shrub              16 Barren / Desert
 7 Open Shrub                17 Ocean
 8 Woody Savanna             18 Tundra
 9 Savanna                   19 Fresh Snow
10 Grassland                 20 Sea Ice

I did the same for CO2 (co2_diff):

   Type  Trop   MLS    MLW    SAS    SAW
    01  -0.09  -0.06  -0.06  -0.09  +0.10
    02  -0.09  -0.06  -0.06  -0.09  +0.10
    03  -0.07  -0.03  +0.00  -0.05  +0.15
    04  -0.07  -0.03  +0.00  -0.05  +0.15
    05  -0.07  -0.04  -0.03  -0.07  +0.13
    06  -0.04  +0.01  +0.11  +0.01  +0.30
    07  +0.02  +0.11  +0.41  +0.17  +0.63
    08  -0.08  -0.04  -0.05  -0.07  +0.11
    09  -0.08  -0.04  -0.05  -0.07  +0.11
    10  -0.08  -0.04  -0.05  -0.07  +0.11
    11  -0.07  -0.03  +0.05  -0.04  +0.23
    12  -0.08  -0.04  -0.05  -0.07  +0.11
    13  -0.10  -0.07  -0.10  -0.12  +0.06
    14  -0.07  -0.04  -0.03  -0.07  +0.12
    15  -0.10  -0.08  -0.09  -0.11  +0.07
    16  +0.06  +0.16  +0.55  +0.25  +0.80
    17  -0.07  -0.02  +0.13  -0.01  +0.33

 1 Evergreen Needle Forest   11 Wetlands
 2 Evergreen Broad Forest    12 Crops
 3 Deciduous Needle Forest   13 Urban
 4 Deciduous Broad Forest    14 Crop/Mosaic
 5 Mixed Forest              15 Permanent Snow
 6 Closed Shrub              16 Barren / Desert
 7 Open Shrub                17 Ocean
 8 Woody Savanna             18 Tundra
 9 Savanna                   19 Fresh Snow
10 Grassland                 20 Sea Ice

Code rtransfer.sh:

# Zoe Phin, v2.2: 2021/02/08

url='https://cloudsgate2.larc.nasa.gov/cgi-bin/fuliou/runfl.cgi?CASE=A
&Compute=Compute&ID=014605%0D%0A&DOUT=F&FOUT=1
&SELOUT=OUTPUT_details
&ATM=mls.atm&EATM=No
&CZA=0.5&VZA=1.0
&STREAM=GWTSA&SFCALB=IGBP
&SFCTYPE=17
&FOAM=OFF&WIND=5.0
&CF3=0.0&CHL=0.1
&CF1=1.0&COD1=1.0&CLDTOP1=250&CLDBOT1=300&PHASE1=ICE&CLDPART1=60&CINH1=100
&CF2=0.0&COD2=10.0&CLDTOP2=850&CLDBOT2=900&PHASE2=WATER&CLDPART2=20&CINH2=100
&AOT1=0.20&AOTTYPE1=continental&AOTSH1=4
&AOT2=0.00&AOTTYPE2=0.5_dust_l2004&AOTSH2=1
&CONT=2.1_ckd&ELEV=0.0
&RES=HI
&CO2=X'

types() { echo '
     1 Evergreen Needle Forest   11 Wetlands
     2 Evergreen Broad Forest    12 Crops
     3 Deciduous Needle Forest   13 Urban
     4 Deciduous Broad Forest    14 Crop/Mosaic
     5 Mixed Forest              15 Permanent Snow
     6 Closed Shrub              16 Barren / Desert
     7 Open Shrub                17 Ocean
     8 Woody Savanna             18 Tundra
     9 Savanna                   19 Fresh Snow
    10 Grassland                 20 Sea Ice
    ' | tr -d '\t'
}

co2() {
    echo "PPM  Trop   MLS    MLW    SAS    SAW"
    for ppm in 15 30 60 120 180 240 300 360 420 480 540 600; do
        printf "%3d " $ppm
        for zone in trop mls mlw sas saw; do
            echo $url | sed "s/ //g; s/CO2=X/CO2=$ppm/; s/ATM=mls/ATM=$zone/" | wget -qO- -i- | awk '/SLW2 7-20/{printf "%s ", $6}'
        done
        echo
    done 
}

co2_diff() {
    echo "   Type  Trop   MLS    MLW    SAS    SAW"
    for t in {1..17}; do
        T=$(printf "%02d" $t)
        sed -n "/Type $T/,/^$/p" co2.csv | sed -n '3,14p' | cut -c4- | awk -vt=$t '
            NR==1{A=$1;B=$2;C=$3;D=$4;E=$5}END{printf "    %02d %+6.2f %+6.2f %+6.2f %+6.2f %+6.2f\n",t,$1-A,$2-B,$3-C,$4-D,$5-E}'
    done
    types
}

h2o() {
	for atm in trop mls mlw sas saw; do
		echo $url | sed "s/ //g; s/EATM=No/EATM=Detail/; s/ATM=mls/ATM=$atm/" | wget -qO- -i- | sed -n '/<textarea /,/\/textarea>/p;' | sed '1d;$d' > $atm.prof
	done

    echo "  WV-X   Trop   MLS    MLW    SAS    SAW"
    for w in 0.0625 0.125 0.25 0.5 1 2 4 8 16; do
        printf "%6gX " $w
        for zone in trop mls mlw sas saw; do
            atmo=$(awk -vw=$w '{printf "%-7G %8.4f %13G %13G%0D%0A\n", $1, $2, $3*w, $4}' $zone.prof | tr ' ' '+')

            (echo $url | sed "s/ //g; s/CO2=X/CO2=410/; s/EATM=No/EATM=Detail/; s/ATM=mls/ATM=$zone/"; 
            echo "&ATMOSPHERE=$atmo") | tr -d '\n' | wget -qO- -i- | awk '/SLW2 7-20/{printf "%s ", $6}'
        done
        echo
    done | tee h2o.csv
}

h2o_diff() {
    echo "   Type  Trop   MLS    MLW    SAS    SAW"
    for t in {1..17}; do
        sed -n "/Type $t/,/^$/p" h2o.csv | sed -n '3,11p' | cut -c9- | awk -vt=$t '
            NR==1{A=$1;B=$2;C=$3;D=$4;E=$5}END{printf "    %02d %+6.2f %+6.2f %+6.2f %+6.2f %+6.2f\n",t,$1-A,$2-B,$3-C,$4-D,$5-E}'
    done
    types
}

Run it:

$ . rtransfer.sh; co2
$ . rtransfer.sh; h2o


$ . rtransfer.sh; co2_diff  # (must be run after co2)
$ . rtransfer.sh; h2o_diff  # (must be run after h2o)
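
One note on the listing: as posted, co2() and h2o() each produce a single table for one surface type (SFCTYPE=17, Ocean), while co2_diff and h2o_diff expect co2.csv and h2o.csv to hold one such table per surface type under “Type NN” headers. A sketch of a wrapper that would build co2.csv in that shape (my guess at the missing step, not necessarily how the original tables were generated):

# Re-run co2() once per IGBP surface type, tagging each table with a
# "Type NN" header and a trailing blank line, as co2_diff expects.
co2_all() {
    orig=$url
    for t in {1..17}; do
        printf "Type %02d\n" $t
        url=${orig/SFCTYPE=17/SFCTYPE=$t}   # swap the surface type in the request
        co2
        echo
    done | tee co2.csv
}

An analogous wrapper around h2o() would build h2o.csv (with unpadded “Type $t” headers, since that is what h2o_diff greps for), after removing the tee inside h2o() so the two don’t clash.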

Automated Twit

I needed to create a throwaway Twitter account for a research project. I decided to automate its creation and share the code with you 🙂

I used the handy service of 10minutemail for receiving Twitter’s verification code.

I used the wonderful browser automation tool Nightmare.

Automation in action:

Patiently wait 10 seconds at the beginning, and again to receive the verification code.

Code twit.sh:

# Zoe Phin, 2021/01/28

require() { sudo apt-get -y install npm && npm install nightmare; }

newtwit() { echo "nm = require('nightmare')
    main().catch(e=>{console.log('done.')})
    async function main() {
        e = nm({show: false})
        await e.goto('https://10minutemail.com').wait(2000)

        email = await e.evaluate( ()=> {return document.querySelector('input').value} )
        console.log(email)

        n = nm({show: true}).viewport(740,680)
        await n.goto('https://twitter.com/i/flow/signup').wait(6000)

        await n.insert('input[name=name]','Unique Snowflake')

        await n.evaluate( ()=> { 
            document.querySelectorAll('div[role=button]')[1].children[0].click() } )

        await n.insert('input[name=email]', email)

        await n.select('#Month','1').select('#Day','29').select('#Year','1999')

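        // tab through Twitter's signup dialog and press space on the "Next" button
        // (assumption: this matches Twitter's signup form layout as of early 2021)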
        await n.type('body','\t\t\t\t\t\t ')
        await n.type('body','\t\t\t \t\t\t ')
        await n.type('body','\t\t\t\t\t\t\t\t\t ')
 
        vcode = await e.wait(10000).evaluate( ()=> {
            return document.querySelector('div.small_subject').innerText.substr(0,6) })

        await n.insert('input', vcode).type('body','\t\t\t ')
        console.log(vcode)
        
        await n.wait(2000).type('input[name=password]', 'Un1qu3 Sn0wfl4k3!')
        await n.wait(1000).type('body','\t\t\t ')
        await n.wait(1000).type('body','\t\t ')
        await n.wait(1000).type('body','\t\t ')
        await n.wait(2000).type('body','\t\t ')
        await n.wait(2000).type('body',' ')
        await n.wait(2000).type('body',' ')
        await n.wait(2000).type('body','\t ')
    //	await n.wait(5000).end()
    } 
    " | node 
}

Setup NodeJS and Nightmare:

$ . twit.sh; require

Run:

$ . twit.sh; newtwit

Note: As I’m not an adept browser-bot coder, this code may fail once or twice before working. Just run it until it does. Hopefully someone with more time can fix it. It’s good enough for me.

Enjoy 🙂 -Zoe

P.S. Did you notice Twitter is pushing POTUS as the most important person to follow?

White House Youtube Dislike Manipulation

I’ve seen screenshots of YouTube modifying dislikes of White House videos. I decided I would do a thorough analysis myself. I wrote a script to check video stats every 80 seconds for 24 hours – for all videos on White House’s YouTube channel.

The collected data is archived here and here. The format is space-separated “CSV”, as follows:

VideoURL UnixTimestamp Date,Time Views Likes Dislikes

Here is a sample of the most egregious manipulation:

Some videos were delisted in minutes!:

https://www.youtube.com/watch?v=2bpSkdYUtNU 1611771197 01/27/2021,13:13:17      1227       437      2963
https://www.youtube.com/watch?v=2bpSkdYUtNU 1611771285 01/27/2021,13:14:45      1463       441      2999
https://www.youtube.com/watch?v=2bpSkdYUtNU 1611771372 01/27/2021,13:16:12      1763       449      3030
https://www.youtube.com/watch?v=2bpSkdYUtNU 1611771459 01/27/2021,13:17:39      2476       455      3060
https://www.youtube.com/watch?v=2bpSkdYUtNU 1611771546 01/27/2021,13:19:06      2640       459      3098
https://www.youtube.com/watch?v=2bpSkdYUtNU 1611771720 01/27/2021,13:22:00      3588       470      3183
https://www.youtube.com/watch?v=Fxo3OHKjfxs 1611699362 01/26/2021,17:16:02       918       405      4942
https://www.youtube.com/watch?v=Fxo3OHKjfxs 1611699448 01/26/2021,17:17:28      1202       412      4976
https://www.youtube.com/watch?v=Fxo3OHKjfxs 1611699534 01/26/2021,17:18:54      1375       415      5026
https://www.youtube.com/watch?v=juqHZYKzyx0 1611766646 01/27/2021,11:57:26       255       375      1771
https://www.youtube.com/watch?v=juqHZYKzyx0 1611766733 01/27/2021,11:58:53       455       380      1823
https://www.youtube.com/watch?v=juqHZYKzyx0 1611766819 01/27/2021,12:00:19       455       383      1852
https://www.youtube.com/watch?v=juqHZYKzyx0 1611766906 01/27/2021,12:01:46       819       387      1886
https://www.youtube.com/watch?v=juqHZYKzyx0 1611766992 01/27/2021,12:03:12      1148       393      1932
https://www.youtube.com/watch?v=juqHZYKzyx0 1611767079 01/27/2021,12:04:39      1462       397      1971
https://www.youtube.com/watch?v=juqHZYKzyx0 1611767166 01/27/2021,12:06:06      1830       398      2019
https://www.youtube.com/watch?v=ucvgAZG_IT4 1611770591 01/27/2021,13:03:11      1587        83      2040
https://www.youtube.com/watch?v=ucvgAZG_IT4 1611770764 01/27/2021,13:06:04      3014        95      2114

In some cases, Likes + Dislikes was greater than Views. Although that seems impossible, YouTube updates view counts more slowly than like/dislike counts, so the Views column does not reflect real views at the time. For example:

https://www.youtube.com/watch?v=jw1_00uI02U 1611720090 01/26/2021,23:01:30     44404       924      8099
https://www.youtube.com/watch?v=jw1_00uI02U 1611720176 01/26/2021,23:02:56     44404       924      8118
https://www.youtube.com/watch?v=jw1_00uI02U 1611720260 01/26/2021,23:04:20     44404       925      8132
https://www.youtube.com/watch?v=jw1_00uI02U 1611720345 01/26/2021,23:05:45     44404       925      8151
https://www.youtube.com/watch?v=jw1_00uI02U 1611720429 01/26/2021,23:07:09     44404       925      8168
https://www.youtube.com/watch?v=jw1_00uI02U 1611720514 01/26/2021,23:08:34     44556       925      8184
https://www.youtube.com/watch?v=jw1_00uI02U 1611720599 01/26/2021,23:09:59     44556       925      8199
https://www.youtube.com/watch?v=jw1_00uI02U 1611720683 01/26/2021,23:11:23     44556       928      8219
https://www.youtube.com/watch?v=jw1_00uI02U 1611720768 01/26/2021,23:12:48     44556       928      8237

So it’s possible for likes and dislikes to accumulate while the view count stays the same. Eventually, the view count jumps up to better reflect reality.
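
If you want to flag these rows in data.csv yourself (using the format shown above), a quick awk one-liner does it; it simply prints every record where likes plus dislikes exceed the reported view count:

$ awk '$5+$6 > $4' data.csv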

The record of every time dislikes were removed is archived at https://pastebin.com/raw/F4ELDc4R

Grand Total         -130321

130 Thousand dislikes were removed in a 24hr period!

And this is for the most popular US President of all time!

Enjoy 🙂 -Zoe


Update

The timezone for the charts is UTC (London). Did you notice the huge drop at 06 hour (1 AM US Eastern)? Most working people go to sleep by that time. Coincidence? I think not.

Update

This research was featured in a youtube video.


Code wh.sh:

# Zoe Phin, 2021/01/26

require() { sudo apt-get install curl gnuplot; }

stats() {
    list=$(curl -s 'https://www.youtube.com/c/WhiteHouse/videos' | grep -o 'watch?v=[^"]*')
    for i in $list; do
        link="https://www.youtube.com/$i"
        date=$(date +"%s %x,%R:%S" | tr -d '\n')
        curl -s $link | tr -d ',' | tr '}' '\n' > new 
        grep -m 1 -o '[0-9,]* views' new > .views
        grep -m 1 -o '[0-9,]* likes' new > .likes
        grep -m 1 -o '[0-9,]* dislikes' new  > .dislikes

        paste .views .likes .dislikes | awk -vL=$link -vD="$date" '
            NF==6{printf "%s %s %9s %9s %9s\n", L, D, $1, $3, $5}'
    done
}

collect() {
    while true; do
        stats; sleep 75
    done | tee -a data.csv
}

dislikes() {
    list=$(cut -c1-44 data.csv | sort -u)

    for vid in $list; do	
        echo $vid
        grep ^$vid data.csv | awk '{
            DiffD=$6-D
            if (DiffD < 0) { 
                printf "%s %+7d\n", $3, DiffD 
                DLost+=DiffD
            }
            D=$6
        } END {
            printf "%-19s %7d\n", "Total", DLost
        }' 
        echo
    done | awk '{ print } $1=="Total" { GT+=$2 } 
        END { printf "%-17s %9d\n", "Grand Total", GT 
    }'
}

plot() {
    list=$(cut -c1-44 data.csv | sort -u)
    let n=0

    for vid in $list; do	
        let n++
        awk -vV=$vid '$1==V {print $2" "$4" "$5" "$6}' data.csv > plot.csv

        echo "set term png size 740,740
        set key top left
        set grid xtics ytics
        set title '$vid'
        set timefmt '%s'
        set xdata time
        set xtics format '%Hh'
        plot 'plot.csv' u 1:2 t 'Views'    w lines lc rgb 'black' lw 2,\
                     '' u 1:3 t 'Likes'    w lines lc rgb 'green' lw 2,\
                     '' u 1:4 t 'Dislikes' w lines lc rgb 'red'   lw 2
        " | gnuplot > example${n}.png 
    done
}

Run:

$ source wh.sh; require

Collect data:

$ collect

( press Ctrl-C when you're done )

Record of dislike drops:

$ dislikes

Generate charts:

$ plot

Version 2.0: Cleaner, and grabs stats at a random 60-to-120-second interval.

# Zoe Phin, v2.0 - 2021/02/20

require() { sudo apt-get install gnuplot; }

collect() {
    url="https://www.youtube.com"
    while true; do
        for vid in $(wget -qO- "$url/c/WhiteHouse/videos" | grep -o 'watch?v=[^"]*'); do
            wget -qO- $url/$vid | egrep -o '[0-9,]* (views|likes|dislikes)' |\
            sed -n 1~2p | tr -d '[:alpha:],\n' |\
            awk -vL=$url/$vid -vD="$(date +"%s %x,%R:%S" | tr -d '\n')" '
                NF==3 { printf "%s %s %9s %9s %9s\n", L, D, $1, $2, $3 }'
        done
        sleep $(seq 60 120 | shuf | head -1)
    done | tee -a data.csv
}

dislikes() {
    for vid in $(cut -c1-44 data.csv | sort -u); do	
        awk -vv=$vid 'BEGIN { print v } $1==v { 
            Diff=$6-Last
            if (Diff < 0) printf "%s %+7d\n", $3, Lost+=Diff 
            Last=$6
        } END {
            printf "%-19s %7d\n\n", "Total", Lost
        }' data.csv
    done | awk '{ print } $1=="Total" { GT+=$2 } 
        END { printf "%-17s %9d\n", "Grand Total", GT 
    }'
}

plot() { n=0
    for vid in $(cut -c1-44 data.csv | sort -u); do	let n++
        awk -vv=$vid '$1==v {print $2" "$4" "$5" "$6}' data.csv > plot.csv
        echo "set term png size 740,740
        set key top left
        set grid xtics ytics
        set title noenhanced '$vid'
        set xdata time
        set timefmt '%s'
        set xtics format '%Hh'
        plot 'plot.csv' u 1:2 t 'Views'    w lines lc rgb 'black' lw 2,\
                     '' u 1:3 t 'Likes'    w lines lc rgb 'green' lw 2,\
                     '' u 1:4 t 'Dislikes' w lines lc rgb 'red'   lw 2
        " | gnuplot > example${n}.png 
    done
}

Something Rotten in Georgia

Updated – 2021/01/06, 06:50 PM EST

The results of the Georgia runoff election do not make any logical sense to me. In the last 2 months I have probably seen political ads over 1000 times! No exaggeration. All for the 2 senate seats. There was not a single ad for Public Service Commission District 4, and yet the Republican running for this seat got more votes than the Republicans running for either senate seat:

You can also grab NY Times’ data using this Linux one-liner (assuming you have curl and jq installed):

$ curl -s https://static01.nyt.com/elections-assets/2020/data/api/2021-01-05/state-page/georgia.json | jq -Mc '.data.races[].candidates[]|[.votes,.last_name,.party_id]' | tr -d '["]'

The result matches official GA site:

2272277,Warnock,democrat
2189111,Loeffler,republican
2253203,Ossoff,democrat
2208129,Perdue,republican
2227870,McDonald,republican
2185670,Blackman,democrat

McDonald got ~15K more votes than Perdue, and ~38K more than Loeffler.

So the question is: How did this happen? How did Republicans manage to vote more for a less important race?

Do you really believe Republicans would vote more for Public Service Commission District 4 than two senate seats ???

No way!

It sure smells like fraud. As if ballots were thrown out … or switched to Democrats.

Also, the Democrat for Commission District 4 got fewer votes than the other Democrats. As if many fake ballots were produced rapidly just for the senate seats, and the perpetrators didn’t have time to fill in this seat.

How about November 2020 senate election data? I combine both senate races:

$ curl -s 'https://static01.nyt.com/elections-assets/2020/data/api/2020-11-03/state-page/georgia.json' | jq -Mc '.data.races[1,2].candidates[]|[.party_id,.last_name,.votes]' | tr -d '["]' | tee november.csv

republican,Perdue,2462617
democrat,Ossoff,2374519
libertarian,Hazel,115039
write-ins,Write-ins,265
democrat,Warnock,1617035
republican,Loeffler,1273214
republican,Collins,980454
democrat,Jackson,324118
democrat,Lieberman,136021
democrat,Johnson-Shealey,106767
democrat,James,94406
republican,Grayson,51592
democrat,Slade,44945
republican,Jackson,44335
republican,Taylor,40349
republican,Johnson,36176
libertarian,Slowinski,35431
democrat,Winfield,28687
democrat,Tarver,26333
independent,Buckley,17954
green,Fortuin,15293
independent,Bartell,14640
independent,Stovall,13318
independent,Greene,13293
write-ins,Write-ins,34

I open the results in Excel and try to combine the data into Left and Right candidates. Republican and Libertarian are obviously Right. Democrat and Green are obviously Left. I give the Left a huge boost by including the Independents with the Left, and despite this …
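
If you’d rather skip Excel, a rough awk tally over november.csv does the same grouping (Republican + Libertarian as Right; Democrat + Green, plus the Independents, as Left; write-ins ignored). This is just a sketch of the combination described above:

awk -F, '
    $1 == "republican" || $1 == "libertarian"                 { R += $3 }
    $1 == "democrat" || $1 == "green" || $1 == "independent"  { L += $3 }
    END { printf "Right: %d\nLeft:  %d\n", R, L }
' november.csv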

The Right Wins. What’s surprising is that the Right lost both Senate Runoff Elections. Why?

Fraud!

Something is very rotten in the state of Georgia!

-Zoe

If you missed it: https://phzoe.com/2021/01/06/georgia-senate-runoff-timestamp-data/

Georgia Senate Runoff Timestamp Data

I was trying to find timestamp data for the Georgia 2021 Senate Runoff election. I couldn’t find it easily via a Google search, but I kept digging, and managed to find it and extract it from NY Times’ live prediction data feed. Here it is …

Senate Race 1
Senate Race 2

Code … georgia.sh:

# Zoe Phin, 2021/01/06

require() { sudo apt-get -y install curl jq gnuplot; }

download() { 
    curl -o ga1.json https://static01.nyt.com/elections-assets/2020/data/liveModel/2021-01-05/senate/GA-G-S-2021-01-05.json
    curl -o ga2.json https://static01.nyt.com/elections-assets/2020/data/liveModel/2021-01-05/senate/GA-S-S-2021-01-05.json
}

timeseries() {
    jq -Mc '.races[0].timeseries[]|[.timestamp,.vote_counted,.republican_voteshare_counted,.democrat_voteshare_counted]' ga1.json | tr -d '["]' > ga1.csv
    jq -Mc '.races[0].timeseries[]|[.timestamp,.vote_counted,.republican_voteshare_counted,.democrat_voteshare_counted]' ga2.json | tr -d '["]' > ga2.csv
}

format() {
    for i in 1 2; do
        (echo "Timestamp            Votes  Rep %   Dem %   Rep     Dem"
         awk -F, '{ "TZ=US/Eastern date +%x,%R:%S -d "$1 | getline t; printf "%s %7d %6.4f %6.4f %7d %7d\n", t, $2, $3, $4, $2*$3, $2*$4 }' ga$i.csv
        ) > ga$i.txt
    done
}

plot() {
    awk -F, '{ "TZ=US/Eastern date +%d%H%M%S -d "$1 | getline t; printf "%s %7d %7d\n", t, $2*$3, $2*$4 }' ga$1.csv > ga$1.dat
    (echo 'set term png size 640,480
    set key top left
    set grid xtics ytics
    set ylabel "Million Votes"
    set timefmt "%d%H%M%S"
    set xdata time
    set xtics format "01/%d\n%H:%M"
    set ytics format "%.1f"
    set mytics 5'
    echo "plot 'ga${1}.dat' u 1:(\$2/1e6) t '$2' w lines lc rgb 'red','' u 1:(\$3/1e6) t '$3' w lines lc rgb 'blue'"
    ) | gnuplot > ga$1.png
}

Run it:

$ source georgia.sh; require; download; timeseries; format

format generates timestamp data into two files: ga1.txt and ga2.txt. The results are archived here and here, respectively.

Race 1 is Perdue vs. Ossoff, and Race 2 is Loeffler vs. Warnock

To plot the data:

$ plot 1 Perdue Ossoff
$ plot 2 Loeffler Warnock

This generates ga1.png and ga2.png, which I present above.

I left my opinion out of this post. Curious Windows coders should follow instructions here.

Enjoy the data 🙂 -Zoe

Heat flux in the Sun

The sun is known to emit ~63 megawatts per square meter from its photosphere. But what is the heat flux inside this emissive photosphere?

Source

Heat flux formula: q = k*ΔT/L

q = k * (6600-4400 Kelvin) / (500,000 meters)

What is the thermal conductivity (k) value of hydrogen at these temperatures? [1]

This is actually very difficult to find, but I managed to find something:

Thermal Conductivity of Hydrogen, Source, Figure 5

This y-axis needs to be divided by 10 to get units (W/m*K).

The range of pressure in the photosphere is: 0.0009 atm to 0.123 atm. I think it’s safe to say that thermal conductivity of hydrogen is definitely no more than 2.5 W/m*K in our desired range. That will be our upper limit. Thus,

q = 2.5 * 2200 / 500000 = 0.011 W/m² [2]
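
To check the arithmetic yourself (just the numbers above, nothing more):

$ awk 'BEGIN { printf "%.3f W/m2\n", 2.5 * (6600-4400) / 500000 }'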

As you can see, there is no problem with 0.011 W/m² “supporting” a 63 MW/m² output.

My critics will be quick to point out that I can’t use the conduction formula because the sun only has radiative transfers in the photosphere. But that’s just their excuse for NEVER figuring out what the internal heat flow is. Any of their attempts at doing so will be embarrassing for them, and so they will avoid it at all cost. Surely there is a heat flux, and surely it can be figured out.

My critics believe in conservation of heat flow, that is: internal heat flux => emergent radiation. There must be equality of these two things, otherwise there will be rapid cooling. Well, the sun has had 4.5 billion years to reach their “steady-state” equilibrium nonsense and it’s nowhere close. Maybe despite all their chest thumping, they have no idea what they’re talking about?

What goes for the sun here goes for my geothermal theory as well.

Just as <0.011 W/m² internal heat flux can “support” a 63 MW/m² emission from the sun, so too can a ~0.9 W/m² geothermal heat flux “support” a ~334 W/m² emission from Earth’s surface.

And why is that? See here and here.

Think about it!

Enjoy 🙂 -Zoe

Note:

[1] I left out helium. I don’t care to include it, as it makes little difference. I only care about being right within an order of magnitude.

[2] I don’t include surface area of emission, because the difference in solar radius of top and bottom of photosphere is too small.

Geothermal Denial

Climate “scientists” look at Earth’s geothermal heat flux, see that it’s small (~0.1 W/m²), and then conclude geothermal can’t possibly predominantly explain Earth’s surface temperature. This is plain wrong. I came up with an illustration to demonstrate my point a while ago:

The Heat Flux Fallacy

This is a fictional planet. I did this on purpose to accentuate my point. The question here is: Why is the surface 1000°C?

Mainstream climate “scientists” would see the small geothermal heat flux (0.1 W/m²), ignore it, and conclude it must be because Solar + Atmosphere delivers 148,971 W/m² to the surface. It couldn’t have anything to do with the 1010°C a hundred meters below the surface. Oh no, it can’t be that!

But did I say this fantasy planet even has an atmosphere? What if not? The sun only delivers 165 W/m² … Where’s the rest of the energy coming from to make the surface 1000°C?

And if there is an atmosphere … where did the atmosphere get 148,806 W/m² to give to the surface?

Climate cranks come to the rescue and claim that due to infrared absorbing gases in the atmosphere, the Sun’s 165 W/m² gets auto-magically boosted to 148,971 W/m², because that 165 W/m² can’t escape to space. You know what I have to say to that?

The Greenhouse Effect

Some people have serious problems accepting the truth: The surface here is 1000°C because it’s 1010°C a hundred meters below the surface. Simple.

Now why is that so hard to accept? Ideological presupposition. That’s why!

Let’s take a look at a recent discussion here:

Geothermal heat is about 0.1 W/m². Solar absorption is around 161 W/m². All solar is lost on a regular basis and heat loss by the surface is very (!) dynamic. Which means that a little bit more energy from below (for example 0.1 W/m2) is easily lost, together with the dynamic 1610 times higher ‘standard heat loss’.

Wim Röst

I said that her claim, that the heat flux leaving the ground was thousands of times larger than the heat flux passing through the ground, was physically impossible.

Willis Eschenbach

So what they are both saying in our context is that a 0.1 W/m² geothermal heat flux can’t “support” a 148,971 W/m² emission from the surface, so I must be wrong!

In Willis’ view, there must be equality between geothermal heat flux and surface emission.

He believes this is needed to preserve conservation of energy. But what is he really doing?

He’s equating a heat flux between two locations to an absolute energy flux equivalent at one location. Is that conservation of energy? No!

Energy is energy, and heat flux is the energy transfer from hot to cold, i.e. a DIFFERENCE of energies at two locations. How can you compare a differential to an absolute? It makes no sense. But don’t believe me, check your own eyes:

The top of the water represents the planet’s surface.

If what Willis et al were saying was true, we should expect a steep thermal gradient from the bottom to the top of the water column – so that conductive heat flux equals emergent radiation. But what actually happens?

As you can see, the top and bottom of the water column becomes the same temperature. In other words, the conductive (“geothermal”) heat flux becomes 0 W/m².

The top of this water column is capable of emitting εσT⁴, or (5.67e-8)*(273.15+83)^4 =~ 912 W/m²
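
A quick check of that number with awk (Stefan-Boltzmann at 83°C, emissivity 1):

$ awk 'BEGIN { printf "%.0f W/m2\n", 5.67e-8 * (273.15+83)^4 }'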

In this case, 0 W/m² has no problem “supporting” 912 W/m² ! That’s infinite times!

So why can’t a 0.1 W/m² geothermal heat flux “support” a 148,971 W/m² emergent flux?

Of course it can. It’s not a problem at all. These people are simply confused on the physics.

Now imagine they had to explain this infrared electric kettle hot water video without mentioning the heat source below. How would they do that? Well…

The top of the kettle (“sun”) emits 20°C worth of radiation to the top of the water (419 W/m²) . The water vapor and carbon dioxide in the air above the water prevents radiation from leaving to colder space, and so auto-magically the top of the water becomes 912 W/m², or 83°C. Simple!

Now do you see it?

Greenhouse Effect == Geothermal Denial.

Note: Real planetary subsurface has a density gradient, and so you will see a small geothermal heat flux. There is no density gradient in this water example, so the conductive heat flux goes to zero.


Now we take my argument down to Earth, literally. Why is Earth’s surface ~15°C?

Because geothermal “delivers” 0°C to the surface. Add insolation and subtract latent and sensible heat, and you get your 15°C. Simple. No Greenhouse scam necessary.

More details in other articles, such as here, here, and here.

Enjoy 🙂 -Zoe

Happy New Year, Everybody!

Ozone Hole Watch

NASA provides extensive ozone data. I wanted to see what it looks like over time. The following charts describe the following:

The ozone hole area is determined from total ozone satellite measurements. It is defined to be that region of ozone values below 220 Dobson Units (DU) located south of 40°S.

Link
Daily
Annual Mean

Note: 1995 and most of 1996 are missing from their data. I delete 1996 entirely for the annual average.

Code: ozone.sh

# Zoe Phin
# 2020/12/22

require() { sudo apt-get install curl gnuplot; }

download() { for y in {1979..2020}; do
    curl -o y$y.csv "https://ozonewatch.gsfc.nasa.gov/meteorology/figures/ozone/to3areas_${y}_toms+omi+omps.txt"
done; }

daily() { awk '
    BEGINFILE { d=0 }
    FNR>6 && $2!~/9999/ {
        y=substr($1,1,4); d+=1/366
        printf "%8.3f %s\n", y+d, $2
    }' y*.csv > daily.csv
}

# Run daily() first
annual() { cut -c 1-4,9- daily.csv | awk '{
        S[$1]+=$2; N[$1]+=1 
    } END {
        for (y in S) print y" "S[y]/N[y]
    }' | sed '/^1996/d' > annual.csv
}

# Arg 1 - 'daily' or 'annual'
plot() { echo "set term png size 740,370
    set nokey 
    set title 'Ozone Hole Area (mil. km²)'
    set xrange [1979:2021]
    set mxtics 5
    set grid ytics
    plot '$1.csv' u 1:2 w l lw 1 lc rgb 'blue'
    " | gnuplot > $1.png
}

Run it:

$ source ozone.sh
$ require && download
$ daily && annual
$ plot daily
$ plot annual

Enjoy 🙂 -Zoe

US Deaths

There are some articles claiming that US deaths from all causes are lower this year than in previous years. Is it true?

No, it is not true. Here’s my result:

Year Pop.      Deaths  %
2011 311556874 2512442 0.806%
2012 313830990 2501531 0.797%
2013 315993715 2608019 0.825%
2014 318301008 2582448 0.811%
2015 320635163 2699826 0.842%
2016 322941311 2703215 0.837%
2017 324985539 2788163 0.858%
2018 326687501 2824382 0.865%
2019 328239523 2835038 0.864%
2020 331937300 3290723 0.991%

Code… uscovid.sh:

# Zoe Phin
# 2020/12/15

download() {
    wget -O covid.csv -c 'https://data.cdc.gov/api/views/xkkf-xrst/rows.csv?accessType=DOWNLOAD&bom=true&format=true%20target='
    wget -O popul.csv -c 'https://www2.census.gov/programs-surveys/popest/datasets/2010-2019/national/totals/nst-est2019-alldata.csv'
}

usdeaths() {
    echo 'Year Pop.      Deaths  %'
    (sed -n 2p popul.csv | tr ',' '\n' | sed -n 9,17p > .pop
    sed -n 2p popul.csv | tr ',' '\n' | sed -n 39,47p > .death
    for y in {2011..2019}; do echo $y; done | paste -d ' ' - .pop .death
    awk -F, 'NR<9999 && $2~/States/ && substr($1,1,4)==2020 { 
        S+=$3; "date +%j -d "$1 | getline D } END { 
        printf "2020 331002651 %d\n",366/D*S }' covid.csv
    ) | awk '{
        printf "%s %5.3f%\n", $0, $3/$2*100
    }'
}

Run it:

$ source uscovid.sh && download && usdeaths

Note 1: As of writing this, the current year has 340 days of data. What I did was multiply the death count by 366/340.

Note 2: US 2020 population estimate number came from here. It will already be obsolete by the time you read this. Expect a little error in %.

Solar Spectrum

I wrote some code to generate a solar spectrum chart from official data. Thought I’d share it with you …

ssi.sh:

# Zoe Phin
# 2020/12/20

require() {
    sudo apt-get install curl nco gnuplot
}

download() {
    curl -o ssi.nc https://www.ncei.noaa.gov/data/solar-spectral-irradiance/access/daily/ssi_v02r01_daily_s20190101_e20191231_c20200226.nc
}

yearavg() {
    ncks --trd -HC -v SSI ssi.nc | awk -F '[= ]' '
        { D[$4]+=$6 } END { for (d in D) printf "%.4f %11.9f\n", d/1000, D[d]/365*1000 }
    ' | sort -n | sed 1d | awk '
        BEGIN { pi=atan2(0,-1); c=299792458; h=6.62607015E-34; k=1.38106485E-23 
            r=695700000; d=149597870700 }
        function P(T,w) { return ((2*pi*h*c^2/(w/1e6)^5)/(exp(h*c/(k*T*(w/1e6)))-1))/1e6 }
        { printf "%s %.9f\n", $0, P(5772,$1)*(r/d)^2 }
    ' > ssi.csv
}

plot() {
    echo 'set term png size 740,370
        set xrange [0:2.5]
        set mxtics 5
        set grid xtics ytics
        set ylabel "Radiance (W/m²/μm)"
        set xlabel "Wavelength (μm)"

        plot "ssi.csv" u 1:2 t "Solar Spectrum" w l lw 1 lc rgb "orange",\
             "" u 1:3 t "5772K Blackbody" w l lw 1 lc rgb "black"
    ' | gnuplot > solar.png
}

Run it:

$ source ssi.sh
$ require
$ download
$ yearavg
$ plot

or

$ . ssi.sh; require && download && yearavg && plot

Data and image contained in ssi.csv and solar.png

Enjoy 🙂 and Happy Holidays! -Zoe

Update

I also made a version for nanometer wavelengths rather than microns …

ssi-nm.sh:

# Zoe Phin
# 2020/12/22

require() {
    sudo apt-get install curl nco gnuplot
}

download() {
    curl -o ssi.nc https://www.ncei.noaa.gov/data/solar-spectral-irradiance/access/daily/ssi_v02r01_daily_s20190101_e20191231_c20200226.nc
}

yearavg() {
    ncks --trd -HC -v SSI ssi.nc | awk -F '[= ]' '
        { D[$4]+=$6 } END { for (d in D) printf "%.4f %11.9f\n", d, D[d]/365 }
    ' | sort -n | sed 1d | awk '
        BEGIN { pi=atan2(0,-1); c=299792458; h=6.62607015E-34; k=1.38106485E-23 
            r=695700000; d=149597870700 }
        function P(T,w) { return ((2*pi*h*c^2/(w/1e9)^5)/(exp(h*c/(k*T*(w/1e9)))-1))/1e9 }
        { printf "%s %.9f\n", $0, P(5772,$1)*(r/d)^2 }
    ' > ssi.csv
}

plot() {
    echo 'set term png size 740,370
        set xrange [0:2500]
        set xtics 200; set mxtics 2
        set ytics 0.2; set mytics 2
        set format y "%.1f"
        set grid xtics ytics
        set ylabel "Radiance (W/m²/nm)"
        set xlabel "Wavelength (nm)"

        plot "ssi.csv" u 1:2 t "Solar Spectrum" w l lw 1 lc rgb "orange",\
             "" u 1:3 t "5772K Blackbody" w l lw 1 lc rgb "black"
    ' | gnuplot > ssi-nm.png
}

Run:

$ . ssi-nm.sh; require && download && yearavg && plot

Climate Scientists vs Air Force

MODTRAN is a tool developed by US Air Force and Spectral Science, Inc to model absorption in the atmosphere. A free version is available from University of Chicago here.


Let’s start playing with this tool. We set all atmospheric gases and other parameters to zero.

CO2 = 0 ppm

The ground temperature doesn’t change – contra the opinion of mainstream climate “scientists”. Now we set carbon dioxide to 99.9999%:

CO2 = 999999 ppm

The ground temperature doesn’t change – contra the opinion of mainstream climate “scientists”.


We look at temperature height profiles for both minimum and maximum carbon dioxide concentration:

CO2 = 0 ppm
CO2 = 999999 ppm

The temperature at various heights doesn’t change – contra the opinion of mainstream climate “scientists”.

Now we look at absorption with various levels of CO2:


CO2 = 0 ppm
CO2 = 999999 ppm
CO2 = 410 ppm
CO2 = 820 ppm

While absorption obviously changes, surface (boundary) temperature doesn’t change – contra the opinion of mainstream climate “scientists”.

Now we add 100 degrees to the surface temperature:

Ground Temperature = 388 K, CO2 = 410 ppm
Ground Temperature = 388 K, CO2 = 410 ppm

The absorption factor and transmittance remains the same (~255.7 CM-1, 0.8836).

Absorption factor doesn’t change based on temperature, and we’ve already seen absorption doesn’t change temperature – contra the opinion of mainstream climate “scientists”.

Play with UChicago MODTRAN yourself, and see that I’m correct.

Enjoy 🙂 -Zoe

Scraping 2020 US Election Data

I thought this code could be useful for researchers. I wrote it a while ago, then never used it. Too busy 😦

elec.sh:

# Zoe Phin
# 2020/11/11

require() {
   sudo apt-get install curl jq gnuplot
}

download() {
   curl -O 'https://static01.nyt.com/elections-assets/2020/data/api/2020-11-03/state-page/{alabama,alaska,arizona,arkansas,california,colorado,connecticut,delaware,florida,georgia,hawaii,idaho,illinois,indiana,iowa,kansas,kentucky,louisiana,maine,maryland,massachusetts,michigan,minnesota,mississippi,missouri,montana,nebraska,nevada,new-hampshire,new-jersey,new-mexico,new-york,north-carolina,north-dakota,ohio,oklahoma,oregon,pennsylvania,rhode-island,south-carolina,south-dakota,tennessee,texas,utah,vermont,virginia,washington,west-virginia,wisconsin,wyoming}.json'
}

timelines() {
   for state in `ls *.json | sed s/.json//`; do
      jq -Mc '.data.races[0].timeseries[]|[.timestamp,.votes,.votes*.vote_shares.trumpd,.votes*.vote_shares.bidenj]' $state.json | tr -d '["]' |\
      awk -F, '{"TZ=US/Eastern date +%d%H%M%S -d "$1 | getline t; printf "%s %8d %8d %8d\n",t,$2-vlast,$3-tlast,$4-blast; vlast=$2; tlast=$3; blast=$4}' > $state.csv
   done
}

merged() {
   sort -n *.csv | awk '{vtotal+=$2; ttotal+=$3; btotal+=$4; printf "%s %9d %9d %9d\n",$1,vtotal,ttotal,btotal}' > elec.dat
}

plot() {
   echo 'set term png size 740,740
   set key top left 
   set grid xtics ytics
   set ylabel "Million Votes"
   set timefmt "%d%H%M%S"
   set xdata time
   set xtics format "11/%d\n%H:%M"
   set ytics format "%.0f"
   set xrange ["03190000":"04040000"]
   plot "elec.dat" u 1:($3/1e6) t "Trump" w lines lc rgb "red" lw 1,\
                "" u 1:($4/1e6) t "Biden" w lines lc rgb "blue" lw 1
   ' | gnuplot > elec.png
}

Run it:

$ source elec.sh
$ require && download
$ timelines && merged && plot

Result: elec.png

Enjoy 🙂 -Zoe

Update 2020/12/06

County level data can be obtained with this code snippet:

counties() {
   echo " County              |    Votes |    Votes    Votes    Votes |      %          %         %   "
   echo " Name                |    Total |    Biden    Trump    Jorg. |    Biden      Trump     Jorg."
   for state in `ls *.json | sed s/.json//`; do
      echo -e "\n--- $state ---\n"
      jq -Mc '.data.races[0].counties[]|[.name,.votes,.results.bidenj,.results.trumpd,.results.jorgensenj]' $state.json | tr -d '["]' |\
      awk -F, -vs=$state '$2!=0{ v+=$2; b+=$3; t+=$4; j+=$5;
         printf "%-20s | %8s | %8s %8s %8s | %9.3f %9.3f %9.3f\n",$1,$2,$3,$4,$5,$3/$2*100,$4/$2*100,$5/$2*100} END {
         printf "\nTotal                | %8s | %8s %8s %8s | %9.3f %9.3f %9.3f\n",v,b,t,j,b/v*100,t/v*100,j/v*100
      }'
   done
}

Run it:

$ . elec.sh; counties > counties.txt

Contents of counties.txt is archived here.

Geothermal to the Moon!

The Earth and the Moon have been neighbors for a very long time. We should expect the long-term steady-state heat transfer from Earth’s internal energy (without the Sun) to the Moon to be equal to Earth’s geothermal heat flux. To my knowledge, no one has ever made this claim. Sound outrageous? Let’s see…

According to [Davies 2010], the geothermal flux is 46.7 TW:

We conclude by discussing our preferred estimate of 47 TW, (rounded from 46.7 TW given that our error estimate is ± 2 TW)

— [Davies 2010]

When we divide this figure by the surface area of the Earth, we get:

(46700000000000 ± 2000000000000)/510065728777855 = 0.09155683 ± 0.0039 W/m²

This is not an emergent radiative flux, but a conductive heat flux.

I will now argue that this number is not accidental, but is based on heat transfer from the Earth to the Moon.

In my article Measuring Geothermal …, I extracted 335.643 W/m² as an emergent flux equivalent from geothermal.

In my article Deducing Geothermal, I deduced 332.46 W/m² as an emergent flux equivalent from geothermal.

Let’s average those two values to get: 334.05 W/m²

Now we apply the inverse square law for distant radiation, using data from NASA:

334.05 W/m² × (6371 km/ 378000 km)² = 0.09489495 W/m²

This value falls within the error range of measured geothermal flux. Do you think this is a coincidence? I think not. It makes perfect sense.
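
Both numbers are easy to reproduce with awk (same inputs as above: 46.7 TW, Earth’s surface area in m², the 334.05 W/m² average, Earth’s radius and the lunar distance):

awk 'BEGIN {
    printf "heat flux:      %.8f W/m2\n", 46.7e12 / 510065728777855    # Davies 2010 over Earth surface area
    printf "inverse square: %.8f W/m2\n", 334.05 * (6371/378000)^2     # emergent flux diluted to lunar distance
}'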

Enjoy 🙂 -Zoe

The Steel Greenhouse Ruse

Amateur scientist Willis Eschenbach developed a thought experiment to demonstrate how the greenhouse effect “works”:

It’s been refuted many times before, but I’ll make it even simpler.

The main claim is that the outer shell’s presence will force the inner core to warm up and radiate twice as much compared to no shell at all.

We start with 235 W/m² emerging from core and going to shell. I’ll use an inner and outer surface area of 1, and consider what is going on every second to make things simple.

235 Joules emerges from the core in the 1st second and goes to the shell . Willis reminds us:

In order to maintain its thermal equilibrium, the whole system must still radiate 235 W/m² out to space.

However … the first thing Willis does is break this rule, and sends 235 J back to the core. Nothing to space.

Now a new 235 J emerges in the 2nd second from the core which gets added with the 235 J that’s coming back from the shell from the 1st second.

The core now sends 470 J to shell. This 470 J now gets split into 235 J back to core and 245 J to space.

Second 3 and on just repeats. So you see what he did there? By violating a rule, he gets to cycle in an extra 235 J every second.

There’s a more accurate variation where the rule is violated several times but with less offending Joules each cycle. It goes like this:

Second 1: 235 core->shell, 117.5 shell->core, 117.5 -> space
Second 2: 352.5 core->shell, 176.25 shell->core, 176.25 -> space
Second 3: 411.25 ... 205.625 ... 205.625
Second 4: 440.625 ... 220.3125 ... 220.3125
Second 5: 455.3125 ... 227.65625 ... 227.65625
...
Second X: 470 ... 235 ... 235
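
That sequence is easy to reproduce. Here’s a minimal awk sketch (not Willis’ code, just the 50/50 split described above: each second the core emits a fresh 235 J plus whatever the shell sent back, and the shell halves what it receives between core and space):

awk 'BEGIN {
    back = 0
    for (s = 1; s <= 10; s++) {
        to_shell = 235 + back       # fresh 235 J plus the returned portion
        back     = to_shell / 2     # half goes back to the core, half to space
        printf "Second %2d: %9.4f core->shell, %9.4f shell->core, %9.4f -> space\n", s, to_shell, back, back
    }
}'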

I think you get the idea. I wrote a program to do all this, here. The first variation is easier to describe. Here’s some fun satire to illustrate the main point:


Imagine you’re the head manager of a sugar factory.

Every minute, a bag filled with 235 grams of sugar slides down a chute and lands in a basket. You take this bag and walk across the factory to place it inside a truck for later delivery.

You’ve been trying to figure out how to cheaply increase your production for a while now, and one day you finally got a great idea …

You decide to place a table at a halfway point between basket and truck.

In the first minute of implementing your great idea, you move the bag from basket to table. You decide not to then carry the bag to the truck, but back to the basket. You drop the bag in the basket a second before a new bag comes down the chute. When that new bag drops in the basket, and you see two bags, you say to yourself: “I’m a genius! I just doubled production!”.

You now carry two bags to the table. Then one bag to the truck, and one bag back to the basket. You then repeat this over and over.

You convince yourself that seeing two bags in the basket and carrying it to the table means that you’ve doubled production. The proof is self-evident. Congratulations!

Unfortunately not everyone agreed with you. Many thought you are crazy. So you fired them and hired those that agreed with you. You wanted consensus, and you got it!


Now I’m going to illustrate the greenhouse gas fallacy in the most primitive way, using only 2 water molecules:

Core
Shell

We’re at second 0, before any greenhouse magic begins, so the shell is still at 0 J, but the nuclear core is at 235 J. The intensity of motion represents the amount of energy present.

Energy is in fact motion. The universe has only two things: things and motion of them. I’m excluding space.

Willis (and all greenhouse gas junkies in general) believe that energy is just like matter, and you can pass it back to where it came from to have more of it.

What Willis et al end up doing is adding motion to existing motion to intensify motion. They believe this is science, but it’s actually a false philosophy.

Philosophy – core vibrates at twice the intensity

Core
Shell

In actual science, we know that the max energy into a system is the max energy THROUGHOUT the system. But in Willis’ philosophy, you can create a feedback loop that causes more energy (motion) somewhere in the system, but it’s all fine as long as just the final output (to space) obeys conservation of energy in regard to original input. This is completely false. Conservation of energy must be followed at every boundary.

Science – shell achieves vibrational resonance with core

Core
Shell

In reality, the shell will just come to resonate with the core. There will never be a molecule that vibrates more intensely than what the original energy supplied into the system allows.

This is all just 220 year old basic science. Hopefully, climate scientists might learn some basic experimental thermodynamics rather than relying on a falsified thought experiment.

Summary: You can’t make something vibrate more vigorously by confining it with another thing vibrating at an equal or lower rate.

Enjoy 🙂 -Zoe

Fourier’s Accidental Confession

Fourier is considered a direct predecessor to mainstream climatology. Mainstream climatology follows him and purposefully neglects geothermal energy in Earth’s energy budget due to the belief that it is too small. This then allows them to make the outrageous claim that it is IR-absorbing gases in the atmosphere that boosts surface temperatures to what we measure with thermometers.

So is it true that geothermal is negligible?

According to Fourier’s translated 1827 paper:

The effect of the primitive heat which the globe has retained has therefore
become essentially imperceptible at the Earth’s surface …

the effect of the interior heat is no longer perceptible at the surface of the Earth

– Temperatures of the Terrestrial Sphere, Page 15

Well that looks settled. Doesn’t it? Let’s see the whole context:

Temperatures of the Terrestrial Sphere, Page 15

This is a very curious paragraph, for it admits too much.

The only way to melt ice is to provide at least 0°C worth of energy. Right?

0°C is not “negligible”, now is it?

I can already hear my critics saying: “But Zoe, he said over a century!”

Sure. It’s so marginally over 0°C, that it takes a century to melt 3 cubic meters of ice. So what? It’s still at least 0°C. And it’s coming from the Earth.

Fourier contradicts himself when he claims Earth’s internal heat is imperceptible. Is ice melting not perceptible? What if he chose dry ice? More perceptible. What about nitrogen or oxygen “ice”? Even more perceptible!

Is 0°C correct? What do modern geophysicists think?

https://www.routledgehandbooks.com/doi/10.1201/9781315371436-4

Same thing! 0°C is still the convention.

The radiative equivalent of 0°C at emissivity=1 is 315.6 W/m²

Can this really be excluded from the energy budget? No.

What’s the significance of this?

It means the greenhouse effect is junk science. The surface has enough energy from geothermal and solar to explain surface temperatures.

I have two previous articles describing how the geothermal contribution can be computed more accurately using two different methods:

https://phzoe.com/2020/02/13/measuring-geothermal-a-revolutionary-hypothesis/

https://phzoe.com/2020/02/25/deducing-geothermal/

It’s nice to know that the geothermal hypothesis was accidently scientifically supported by the very guy that unfortunately rejected it. A guy who modern academics follow uncritically. The answer was right beneath his feet, but unfortunately his head was in the clouds. Because of him, modern academics truly believe that it is the atmosphere that provides raw energy to the surface, rather than geothermal. What a colossal mistake. They flipped reality completely upside down.

While my critics like to claim that geothermal can only provide ~36 Kelvin because they applied Stefan-Boltzmann formula to the small conductive heat flux of 91.6 mW/m², actual scientists know that geothermal can melt ice. And this knowledge is 200 years old! When are climate scientists going to wake up?

-Zoe

Update 10/02/2020

My critics point out that Fourier meant to add that 318 mW/m² over a course of a century; 3 centuries by today’s known geothermal heat flux: 91 mW/m².

That’s not the point. The point was to expose Fourier’s own confusion over the difference between heat and energy. Fourier’s conduction formula applies to HEAT flow, not energy. 318 mW/m² or 91 mW/m² of total emissive energy will NEVER melt ice. But 318 or 91 mW/m² of HEAT flow might, depending on the temperature the ice is sitting on.
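
For what it’s worth, the critics’ century figure does check out arithmetically. A rough awk check (the latent heat of fusion ~334 kJ/kg and ice density ~917 kg/m³ are my own standard values, not from the post):

awk 'BEGIN {
    q = 0.318                        # W/m2
    E = q * 100 * 365.25 * 86400     # joules per m2 over one century
    printf "%.1f m of ice melted per m2\n", E / (334000 * 917)
}'

That comes out to roughly 3 m of ice per square meter, in line with Fourier’s figure.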

Bottom line: Did Fourier claim geothermal could melt ice? YES. Did he give a good explanation? NO.

Is Fourier a good choice to be a father of climate science? That’s a big NO.

But … since Fourier claimed geothermal could melt ice, I will take his word for it, because in this case he is absolutely right.

Equating Perpendicular Planes is Plain Nonsense

Many people believe you can compare the Geothermal Heat Flux to Insolation, see that it’s pitiful and then exclude Geothermal from the energy budget. I have touched on this subject several times: here, here, and here. Today I will again show that this idea is plain nonsense.

Let’s start with the basics of radiation:

Stefan-Boltzmann’s Law

The radiation emerging out of a plane in the (x,y) dimension is proportional to the fourth power of its temperature. The choice of variable names x,y is arbitrary. Now what about conduction?

Conduction through matter

Geothermal Heat Flux has been globally measured to be ~ 91.6 mW/m²; a very small number. Many people claim that you can convert this figure into a value that tells you what the surface temperature would be in the absence of the sun.

What they do is equate the radiation emerging out of a plane with the internal conductive heat flux. In the language of my previous articles, they equate Cold Side Radiation to Conductive Heat Flux: CSR = CHF. Then they solve for T_cold.

This is kind of funny, because even though we have proof that geothermal will deliver ~273 K, they still think geothermal can only deliver ~36 K.

They believe their argument is reasonable because both CSR and CHF are in units W/m², and therefore they can be equated to one another.

What they don’t understand is that the meters squared (m²) are in completely different dimensions.

In radiative flux, the m² comes from the surface plane. But in conductive flux, the m² comes from dividing the thermal conductivity constant (k) by the depth (L).

The depth is orthogonal (perpendicular) to the surface plane!

How much sense does it make to compare emergent radiation to something based on a 90 degree angle to it? None at all.

I derived the proper relationship between CSR and CHF in my previous articles:

Proper relationship between CSR and CHF

Now I do proper dimensional analysis:

Dimensional Analysis

Yes, their idea doesn’t make any sense at all, but it does make for great rhetorical pseudoscience.

Now for some satire …

Question: How much rain falls on a flat roof top?

Answer: It depends on the building material and height of the building.

Normal Person: Say what?

This is the best analogy I could come up with what their idea represents. Maybe someone else could come up with a better one. Main point: they’re 90 degrees wrong.

I hope this is the last time I have to repeat it: the Geothermal Heat Flux is NOT enough information to say what radiation emerges out of the surface. There are many possibilities with the same heat flux value, as shown here. CHF divided by k (thermal conductivity) yields a temperature gradient. A gradient measure tells you nothing about what’s at the top.

Take care, -Zoe

Zoe’s 35th Birthday

COVID19 in Georgia

Today I analyze COVID19 data for my home state of Georgia. I thought it would be interesting because there is an anomaly. Let’s see the anomaly:

Cases per 100K (Source)
Population Density

You see it? The largest density of cases does not match the largest density of population. We would expect most cases per 100K to be in the 9th largest metropolis in the US (Atlanta), but it’s not!

How could this be? What could cause such an anomaly?

Might it have something to do with foreign labor? Georgia was the 2nd largest recipient of temporary agricultural H-2A visas in 2019 (Source). Trend:

There’s no data as to which counties migrant workers go to, but we can take a logical leap: The most agriculturally productive counties probably have the most migrant workers.

We would expect those counties with the largest share of agriculture to be those disproportionately affected by COVID19. Let’s see …

Corn, 2019
Cotton, 2019
Peanuts, 2019
Corn, 2018
Cotton, 2018
Peanuts, 2018

It’s not a perfect match, but I think there’s something to it. Maybe I am wrong, but I haven’t found a better explanation from my local media. In fact, the issue was not even addressed by anyone.

Other states also have low density counties with high COVID19 densities, but they seldom surpass the rates in their major metro areas. Georgia is anomalous in this regard.

Thoughts? Comments?

Peace, -Zoe

CO2 Versus Global COVID19 Response

With the global economic response to the COVID19 epidemic, we would expect global CO2 to be rising much less than other years, if the theory of man-made global warming is indeed true.

I use data from NOAA to see what’s going on.

The estimated daily global seasonal cycle and trend value for CO2 are determined from the daily averaged CO2 data from the four NOAA/ESRL/GMD Baseline observatories. A smoothed seasonal cycle and a smoothed de-seasonalized trend curve are determined for each observatory record at daily intervals. An estimated global seasonal cycle and trend are computed by averaging the four individual observatory seasonal cycle and trend curves at each daily interval.

— ftp://aftp.cmdl.noaa.gov/products/trends/co2/co2_trend_gl.txt

I chose the most official processed data there is, so I can’t be accused of cherrypicking. What I do is compare May 1st to Jan 1st of every year from 2010 to 2020. Results:

2010 1.96 0.90
2011 0.95 0.58
2012 1.50 0.68
2013 2.25 0.98
2014 1.54 0.59
2015 1.87 0.72
2016 2.34 1.20
2017 1.64 0.67
2018 1.93 0.78
2019 1.88 0.86
2020 2.07 0.96

Results are in increased ppm (parts per million). 2nd column is smoothed seasonal cycle. 3rd column is smoothed de-seasonalized trend curve.

As you can see, 2020 was the 3rd largest increasing year, after 2016 and 2013.

We would expect it to come in last. Looks like nature doesn’t respond that quickly … or at all.

Peace, -Zoe

Update 06/05/2020

# For Jan 1 to Jun 4

2010 1.55 1.16
2011 0.56 0.72
2012 0.95 0.90
2013 1.99 1.23
2014 1.43 0.76
2015 1.31 0.95
2016 1.89 1.53
2017 1.33 0.84
2018 1.71 1.00
2019 1.58 1.12
2020 1.53 1.18

Code

co2.sh:

wget -qO co2.txt -c ftp://aftp.cmdl.noaa.gov/products/trends/co2/co2_trend_gl.txt      
awk '!/#/ && $2==1 && $3==1 { print $1" "$4" "$5 }' co2.txt > .start       
# Change $2 and $3 to Month and Day: (Ex: $2==5 && $3==1 for May 1st )
awk '!/#/ && $2==5 && $3==1 { print $4" "$5 }' co2.txt > .end 
paste .start .end | awk '{printf "%s %.2f %.2f\n", $1, $4-$2, $5-$3}'
rm -f .start .end

Run it:

$ bash co2.sh

The Irrelevance of Geothermal Heat Flux

You’ve probably heard it before: the geothermal heat flux is so small (91.6 mW/m²) that it can be effectively ignored in Earth’s energy budget. The first part is true, the heat flux is small, but this fact is completely irrelevant. And what is relevant is popularly denied and masked as something else.

I’ve already explained the problem here and here. Unfortunately not everyone understood the point I was trying to make, so I made a visualization:

Various Profiles with the same Geothermal Heat Flux (CF). Emissivity=1

CF (Conductive Flux) is the Geothermal Heat Flux, EF is the Emergent Geothermal Flux, Th and Tc are the temperatures of the hot side and cold side. d is depth. Compatibility with my previous terminology: CF = CHF and EF = CSR.

As you can see all of these profiles have the same geothermal heat flux (CF), and all of them produce a very different emergent flux (EF) out of the surface. The popularly stated geothermal heat flux is NOT a value that you can compare to insolation. The value itself gives you NO clue as to what can emerge at the top. Anyone telling you otherwise is stupid or lying.

The geothermal heat flux and the thermal conductivity factor determines the temperature gradient. A gradient can never tell you either what kinetic energy is at the bottom or the top. Never.

So what really emerges at the top on Earth? In this visualization, the closest answer is ~5°C or ~340 W/m² – what was calculated and observed here and here. ~340 W/m² is what is claimed for the total greenhouse gas backradiation effect, as shown in the “official” energy budget here. That’s not surprising, because the greenhouse gas effect is secretly just geothermal flipped upside down. It’s the biggest scam in climate science, and you heard it here first.

Geothermal provides a tremendous amount of energy, even more than the sun, but climate scientists ignore it because they are looking at a component of a gradient/slope measure, rather than the temperature (kinetic energy) it delivers to the surface.

I invite everyone to give this some serious thought and not just dismiss it using sophistry.

Love, -Zoe

Extra

  1. Geothermal Heat Flux (CF) is a very useful value for commercial geothermal energy prospectors, but not for atmospheric scientists creating an energy budget. EF is what they need to use. They do use it, but they flip it upside down and call it GHE.
  2. The temperature gradient value used is 27.5 °C/km, which I got from here: “it is about 25–30 °C/km”. This makes k = 3.33 W/(m*K) (see the quick check below).
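
A quick check of that k value from the numbers above (0.0916 W/m² heat flux over a 27.5 °C/km gradient):

$ awk 'BEGIN { printf "%.2f W/(m*K)\n", 0.0916 / (27.5/1000) }'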

The Strange Case of Mimas

Mimas is a small moon of Saturn. It is most famous for being the inspiration for the Death Star in the popular movie Star Wars.

Mimas

But from this day it will be famous for refuting mainstream climate science.

How you ask?

Well … let’s examine its external energy sources:

1) Insolation. The insolation at Mimas should be approximately the same as that for Saturn.

2) Saturn. Radiation received from Saturn should equal Saturn’s emission, diluted by the square of (Saturn’s radius divided by the Saturn-to-Mimas distance), and divided by 4.

NASA’s Facts Sheets (Saturn, Saturn Satellites) provides us all the numbers we need.

Apply standard formulas:

1) 14.82 * (1 - 0.6) / 4 = 1.482 W/m²

2) (5.67e-8)*(81)^4 * (54364/185539)^2 / 4 = 0.0524 W/m²

The total is 1.5344 W/m²

Let’s convert that back to a temperature (assuming emissivity = 1, by [Howett 2003]):

(1.5344/5.67e-8)^0.25 = 72.1 K
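
The same three steps in one awk block, using only the numbers quoted above:

awk 'BEGIN {
    sun    = 14.82 * (1 - 0.6) / 4                      # insolation at Saturn, albedo 0.6
    saturn = 5.67e-8 * 81^4 * (54364/185539)^2 / 4      # Saturn emission (81 K) diluted at Mimas
    total  = sun + saturn
    printf "%.4f + %.4f = %.4f W/m2 -> %.1f K\n", sun, saturn, total, (total/5.67e-8)^0.25
}'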

According to mainstream climate science, only special gases in the atmosphere can boost the surface temperature beyond what external radiation (the sun) alone can do. On Earth, they claim these gases boost the surface temperature by ~33 K.

Mimas has no greenhouse gases or even an atmosphere, so its average temperature should never exceed 72.1 K.

But in reality …

https://www.nasa.gov/mission_pages/cassini/multimedia/pia12867.html

It looks like PacMan is powering the Death Star, and the surface temperature is boosted by 2 to 24 K beyond what external radiation alone can do. There is nothing below 74 K!

Isn’t it obvious that Mimas is geothermally boosted?

Neither the greenhouse effect theory of mainstream climate science or the atmospheric pressure theory of Nikolov & Zeller, et al can explain this!

Nothing else can explain PacMan and the thermal boost other than geothermal.

And if a tiny planetoid like Mimas has its own oddly distributed internal energy, maybe the Earth, which is 158,730 times more massive, could as well?

Think!

Love, -Zoe

Lunar Warming

In a previous article, I examined the average moon temperature (AMT). You may have noticed that there has been about 3 K of warming in the last decade.

According to [Vasavada 2012], the mean equatorial temperature between 2009 and 2011 was about 213K, whereas the 2017-2018 data from UCLA and WUSTL shows that to be about 216K.

For AMT, the increase has been from ~197K to ~200K.

Perhaps there is some error in the exactness, but that the moon has warmed is not actually controversial; it is accepted by mainstream scientists. I wanted to share with you today their theory as to the cause. Are you ready?

Google “lunar warming”. Here is what you will get:

Mainstream Nonsense

Livescience reports:

According to the new study, the 12 Apollo astronauts who walked on the moon between 1969 and 1972 kicked aside so much dust that they revealed huge regions of darker, more heat-absorbing soil that may not have seen the light of day in billions of years. Over just six years, this newly exposed soil absorbed enough solar radiation to raise the temperature of the entire moon’s surface by up to 3.6 degrees F (2 degrees C), the study found.

Livescience

You got that? They didn’t just raise the temperature where they walked but the ENTIRE moon!

You buy it? I hope not. Great laugh, right?

What is the ratio of surface area walked to the entire moon? I don’t know, but it’s ultra tiny. Seems like heat capacity calculations were ignored. The walked surface area might have to be millions (if not billions) of degrees to raise the entire surface area of the moon by a single degree – ASSUMING there’s horizontal heat transfer via conduction.

Now why would they say something that absurd?

I’ll tell you. Scientists have known that Total Solar Irradiance has been decreasing since the 1950s, and the moon has virtually no atmosphere. Because there is no atmosphere there can’t be any stupid greenhouse effect at work.

That would leave geothermal (lunathermal, I guess) warming as the only culprit!

And if the surface of the moon can warm up due to more internal energy coming up from beneath the surface, perhaps the same thing can be at work on Earth …

Think about it, Occam’s Razor sharp … (answer)

-Zoe

Do blankets warm you?

Believers of the Greenhouse Effect all use the same analogy to get you to believe in their junk science. The site Skeptical Science sets the standard in this article:

So have climate scientists made an elementary mistake? Of course not! The skeptic is ignoring the fact that the Earth is being warmed by the sun, which makes all the difference.

To see why, consider that blanket that keeps you warm. If your skin feels cold, wrapping yourself in a blanket can make you warmer. Why? Because your body is generating heat, and that heat is escaping from your body into the environment. When you wrap yourself in a blanket, the loss of heat is reduced, some is retained at the surface of your body, and you warm up. You get warmer because the heat that your body is generating cannot escape as fast as before.

Link

And more:

To summarise: Heat from the sun warms the Earth, as heat from your body keeps you warm. The Earth loses heat to space, and your body loses heat to the environment. Greenhouse gases slow down the rate of heat-loss from the surface of the Earth, like a blanket that slows down the rate at which your body loses heat. The result is the same in both cases, the surface of the Earth, or of your body, gets warmer.

Link

NASA reminds us that:

The greenhouse effect is the way in which heat is trapped close to the surface of the Earth by “greenhouse gases.” These heat-trapping gases can be thought of as a blanket wrapped around the Earth, which keeps it toastier than it would be without them.

Link

You got that? Blankets warm you! Their logic is so sound that they couldn’t possibly be wrong, could they?

What empirical evidence do they provide for such an assertion? None!

Do they even attempt to predict what temperature a blanket could force? No!

Any such attempt would be very embarrassing for them, so instead they just leave it to the reader’s imagination.

First a note: there is no doubt that a blanket can make you warmer by blocking convection. The issue at hand is whether there is a warming due to radiative heat transfer, as is claimed for the greenhouse effect by analogy.

Let's consider the case of a typical cotton blanket, whose emissivity ranges from 0.81 to 0.88 [Bellivieu 2019], depending on humidity. I will choose 0.85 for an average humidity condition; the exactness hardly matters. According to the verified program provided in my article The Dumbest Math Theory Ever, a blanket with an emissivity of 0.85 placed on a human being whose normal temperature is 37°C should produce a final skin temperature of …

$ ALB=0 TSI=2090.8 bash gheffect 0.85

Sec | Upwelling |   Temp    | GH Effect |  Trapped  | To Space
  1 | 522.700 W |  36.701 C | 444.295 W | 222.148 W | 300.553 W
  2 | 744.848 W |  65.389 C | 410.973 W |  94.413 W | 428.287 W
  3 | 839.260 W |  75.642 C | 396.811 W |  40.125 W | 482.575 W
  4 | 879.386 W |  79.738 C | 390.792 W |  17.053 W | 505.647 W
  5 | 896.439 W |  81.436 C | 388.234 W |   7.248 W | 515.452 W
  6 | 903.687 W |  82.151 C | 387.147 W |   3.080 W | 519.620 W
  7 | 906.767 W |  82.453 C | 386.685 W |   1.309 W | 521.391 W
  8 | 908.076 W |  82.582 C | 386.489 W |   0.556 W | 522.144 W
  9 | 908.632 W |  82.636 C | 386.405 W |   0.236 W | 522.464 W
 10 | 908.869 W |  82.659 C | 386.370 W |   0.100 W | 522.600 W
 11 | 908.969 W |  82.669 C | 386.355 W |   0.043 W | 522.657 W
 12 | 909.012 W |  82.673 C | 386.348 W |   0.018 W | 522.682 W
 13 | 909.030 W |  82.675 C | 386.345 W |   0.008 W | 522.692 W
 14 | 909.038 W |  82.676 C | 386.344 W |   0.003 W | 522.697 W
 15 | 909.041 W |  82.676 C | 386.344 W |   0.001 W | 522.699 W
 16 | 909.042 W |  82.676 C | 386.344 W |   0.001 W | 522.699 W
 17 | 909.043 W |  82.676 C | 386.344 W |   0.000 W | 522.700 W

82.6°C ! Really hot!
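
A quick closed-form check of that final row, for anyone who doesn't want to run the loop: the program's update rule is U = SUN + ε·U/2, which converges to U = SUN/(1 − ε/2), with SUN = TSI·(1 − ALB)/4. Using the same constants as above:

$ awk 'BEGIN { SIG=5.67e-8; S=2090.8*(1-0)/4; e=0.85; U=S/(1-e/2)
      printf "U = %.3f W/m2, T = %.3f C\n", U, (U/SIG)^0.25-273.16 }'

U = 909.043 W/m2, T = 82.676 C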

Note that I set the albedo to zero. This is because I figure any scattering of photons between human and blanket will find its path back to the human (and thus “should” cause warming), with very little leakage at the edges of the blanket. But let us be as generous as possible to climate alarmists and say the blanket has an albedo of 0.22 (The highest value found for cotton in scientific literature: Source 1, Source 2). What then?

$ ALB=0.22 TSI=2090.8 bash gheffect 0.85

Sec | Upwelling |   Temp    | GH Effect |  Trapped  | To Space
  1 | 407.706 W |  18.040 C | 346.550 W | 173.275 W | 234.431 W
  2 | 580.981 W |  44.999 C | 320.559 W |  73.642 W | 334.064 W
  3 | 654.623 W |  54.635 C | 309.513 W |  31.298 W | 376.408 W
  4 | 685.921 W |  58.484 C | 304.818 W |  13.302 W | 394.404 W
  5 | 699.222 W |  60.081 C | 302.823 W |   5.653 W | 402.053 W
  6 | 704.875 W |  60.752 C | 301.975 W |   2.403 W | 405.303 W
  7 | 707.278 W |  61.036 C | 301.614 W |   1.021 W | 406.685 W
  8 | 708.299 W |  61.157 C | 301.461 W |   0.434 W | 407.272 W
  9 | 708.733 W |  61.208 C | 301.396 W |   0.184 W | 407.522 W
 10 | 708.918 W |  61.230 C | 301.368 W |   0.078 W | 407.628 W
 11 | 708.996 W |  61.239 C | 301.357 W |   0.033 W | 407.673 W
 12 | 709.029 W |  61.243 C | 301.352 W |   0.014 W | 407.692 W
 13 | 709.043 W |  61.245 C | 301.349 W |   0.006 W | 407.700 W
 14 | 709.049 W |  61.245 C | 301.349 W |   0.003 W | 407.703 W
 15 | 709.052 W |  61.246 C | 301.348 W |   0.001 W | 407.705 W
 16 | 709.053 W |  61.246 C | 301.348 W |   0.000 W | 407.706 W
 17 | 709.054 W |  61.246 C | 301.348 W |   0.000 W | 407.706 W

61.2°C ! Still very hot.

OK, I’m now going to be extremely generous, and use an emissivity value of 0.5, which is not even scientifically justifiable, but let’s give the alarmists a huge advantage. What then?

$ ALB=0.22 TSI=2090.8 bash gheffect 0.5

Sec | Upwelling |   Temp    | GH Effect |  Trapped  | To Space
  1 | 407.706 W |  18.040 C | 203.853 W | 101.927 W | 305.780 W
  2 | 509.633 W |  34.746 C | 152.890 W |  25.482 W | 382.224 W
  3 | 535.114 W |  38.525 C | 140.149 W |   6.370 W | 401.336 W
  4 | 541.485 W |  39.448 C | 136.964 W |   1.593 W | 406.113 W
  5 | 543.077 W |  39.678 C | 136.167 W |   0.398 W | 407.308 W
  6 | 543.475 W |  39.735 C | 135.968 W |   0.100 W | 407.606 W
  7 | 543.575 W |  39.750 C | 135.919 W |   0.025 W | 407.681 W
  8 | 543.600 W |  39.753 C | 135.906 W |   0.006 W | 407.700 W
  9 | 543.606 W |  39.754 C | 135.903 W |   0.002 W | 407.704 W
 10 | 543.607 W |  39.754 C | 135.902 W |   0.000 W | 407.706 W
 11 | 543.608 W |  39.754 C | 135.902 W |   0.000 W | 407.706 W

Now we get only 39.8°C, for a total warm-up of 2.8°C, produced by a blanket that can only be heated by the human and that starts off colder than (or the same temperature as) the human.

So is there any evidence to support the heating of human skin by a passively heated blanket via backradiation ?

However, if a cotton blanket heated to 90°C is in contact with skin the patient does not experience the same tissue injuries, because the blanket has less than one third the specific heat of skin. In addition, the blanket has less than 1/1000 the density of skin (the density of a blanket is about 1 kg/m³ because it is roughly half cotton and half air.) The blanket can give up all of its heat to the skin yet raise the temperature no more than 1/80th of the 70°C temperature difference, or about 1°C.

[ House 2011 ]

This scientist rightfully does not acknowledge warming by radiative effect. The blanket must be theoretically warmed to 90°C to achieve a rise of about 1°C. A table of empirical results is also provided in [House 2011]:

Body Part  | Unheated Blankets | Blankets Warmed to 43.3°C | Blankets Warmed to 65.6°C
Abdomen    | 0.17°C            | 1.11°C                    | 2.39°C
Lower Legs | 0.33°C            | 0.89°C                    | 1.11°C
[House 2011], Table 2, converted to Celsius

Though there is obviously a tiny amount of warming due to blocking convection, we don’t see any warming as predicted by GH effect radiative heat transfer theory. We should’ve seen a very generous 2.8°C warming as predicted by such a theory in the column Unheated Blankets. We don’t even see such a high number with blankets externally heated to 65.6°C !

Now we move on to [Kabbara 2002]. In this paper we see how expensive equipment can be used to maintain a patient's temperature. Figure 6 shows how externally heated air prevents a patient's temperature from falling. But one may ask: what is the purpose of this expensive equipment when climate "scientists" already know that a non-externally heated blanket should raise skin temperature by at least the very generous 2.8°C?

Would you trust these climate “scientists” with your health? Do you think they really believe what they claim?

And now we move on to: US Patent – US6078026A

The blanket A has a maximum power draw of 6.5 amps. With fully charged batteries, the blanket will reach its target temperature (i.e. 100 degrees Fahrenheit or 38 degrees Celsius) approximately 5 minutes and will remain heated for five to eight hours.

Patent US6078026A

An external power source to raise T to 38°C?

Why would you need external power, or even a patent, when a simple blanket ought to do the trick?

Please do not object to this article because I based it on a normal temperature of 37°C. Even a hypothermic temperature of 33°C should be raised by 2.72°C, IF the GH effect blanket analogy held any merit.

A search on Google Scholar for "hospital blankets temperature" should convince anyone with integrity that blankets don't raise your skin temperature in accordance with radiative transfer theory. For if they did, most of the discussion and science in that search would be moot: human-only heated blankets would solve the problems, and special technology would not be necessary.

Skeptical Science finishes off their article:

So global warming does not violate the second law of thermodynamics. And if someone tells you otherwise, just remember that you’re a warm human being, and certainly nobody’s dummy.

Link

I’ll translate that for you: If you believe their sophistry, you are a dummy!

With poetic license it is all right to say that blankets warm you, but as a matter of actual science it is not correct. The best a blanket can do is keep you warm; it can never make you warmer.

Enjoy 🙂 -Zoe

Addendum

Blanket(s) can suppress your perspiration and make you sick from your own urea, thus causing your temperature to go up. However, this could never be a proper analogy for the greenhouse effect.

Geothermal Animated

Geothermal Emission @ the Surface

This was derived from NCEP Reanalysis data, in the tradition of Measuring Geothermal …

Enjoy 🙂 -Zoe

Addendum

geochg.sh:

# source geochg.sh
# Zoe Phin 2020/03/13
    
F=(0 ulwrf dswrf uswrf lhtfl shtfl)                                                  
O=(0 3201.5 3086.5 3131.5 856.5 2176.5)

require() { sudo apt install nco gnuplot imagemagick; } # Linux Only
    
download() {
    b="ftp://ftp.cdc.noaa.gov/Datasets/ncep.reanalysis2.derived/gaussian_grid"
    for i in ${F[*]}; do wget -O $i.nc -c $b/$i.sfc.mon.mean.nc; done
}

extract() {
    for t in {000..491}; do echo "$t" >&2
        for i in {1..5}; do 
            ncks --trd -HC ${F[$i]}.nc -v ${F[$i]} -d time,$t | sed \$d | awk -F[=\ ] -vO=${O[$i]} '{ 
                printf "%7s %7s %7.3f\n", $4, $6, $8/10+O }' > .f$i
        done
        paste .f1 .f2 .f3 .f4 .f5 | awk '{ 
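            # after paste: $3=ulwrf, $6=dswrf, $9=uswrf, $12=lhtfl, $15=shtfl,
            # so the value below is geo = ulwrf - (dswrf - uswrf) + lhtfl + shtfl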
            printf "%s %s %7.3f\n", $1, $2, $3-($6-$9)+$12+$15 }' > .geo$t
    done
}

annualize() {
    for y in {0..40}; do 
        args=`for m in {0..11}; do printf ".geo%03d " $((12*y+m)); done`
        paste $args | awk '{ a=0; for (i=3;i<=NF;i+=3) a+=$i; print $1" "$2" "a/12 }' > .y$((1979+y))
    done
}

colorize() {
    range=(`sort -nk 3.1 .y* | awk 'NR==1{min=$3} END { print min" "$3 }'`)
    echo ${range[*]}
    for y in {1979..2019}; do awk -vmin=${range[0]} -vmax=${range[1]} 'BEGIN { dlt=max-min }
        {   if ($2 < 191) {$2+=169} else {$2-=191} 
            printf "%s %s %4d\n", $1, $2, 1023*($3-min)/dlt }' .y$y | awk 'BEGIN { n=0
            for (i=255; i>=0; i--) { pal[n] = sprintf("%d 0 255", i); n++ }
            for (i=0; i<=255; i++) { pal[n] = sprintf("0 %d %d", i, 255-i); n++ }
            for (i=0; i<=255; i++) { pal[n] = sprintf("%d 255 0", i); n++ }
            for (i=255; i>=0; i--) { pal[n] = sprintf("255 %d 0", i); n++ }
        } { 
            printf "%s %s %s\n", $2, $1, pal[$3] }
        ' > .c$y
    done
}

scale() {
    rm -f .scale
    range=(`sort -nk 3.1 .y* | awk 'NR==1{min=$3} END { printf "%d %d %d\n", min, $3, $3-min }'`)

    min=${range[0]}; max=${range[1]}; dlt=${range[2]}

    for h in {0..100}; do 
        seq 0 1023 | awk -vh=$h -vmin=$min -vdlt=$dlt 'BEGIN { n=0
                for (i=255; i>=0; i--) { pal[n] = sprintf("%d 0 255", i); n++ }
                for (i=0; i<=255; i++) { pal[n] = sprintf("0 %d %d", i, 255-i); n++ }
                for (i=0; i<=255; i++) { pal[n] = sprintf("%d 255 0", i); n++ }
                for (i=255; i>=0; i--) { pal[n] = sprintf("255 %d 0", i); n++ }
            } { 
            print $1*dlt/1023+min" "h" "pal[$1]
        }' >> .scale 
    done

    echo "set term jpeg size 740,140; set nokey; 
        set title 'Flux (W/m²)'
        set xtics 100 out nomirror
        unset ytics; set noborder
        set xrange [$min:$max]; set yrange [0:100]
        rgb(r,g,b) = int(r)*65536 + int(g)*256 + int(b)
        plot '.scale' u 1:2:(rgb(\$3,\$4,\$5)) w dots lc rgb variable lw 1
    " | gnuplot > scale.jpg
}

plot() {
    for y in {1979..2019}; do echo $y >&2; echo "  
        set term jpeg size 740,420; set nokey
        set title '$y'
        set yrange [-180:180]; set xrange [0:720]
        set noborder; unset colorbox 
        unset xtics; unset ytics
        rgb(r,g,b) = int(r)*65536 + int(g)*256 + int(b)
        plot '.c${y}' u (\$1*2):(\$2*2):(rgb(\$3,\$4,\$5)) pt 5 ps 1 lc rgb variable
        " | gnuplot > c$y.jpg
    done
}

animate() {
    convert -loop 0 -delay 50 c*.jpg geoanim.gif
}

clean() { rm -f .geo* .[fyc]* .scale; }

Run it:

$ source geochg.sh
$ require  # Linux Only
$ download
$ extract
$ annualize
$ colorize
$ scale
$ plot
$ animate

Windows users need imagemagick package from Cygwin.

What caused 40 years of global warming?

I’m going to ignore the typical nonsense mainstream narrative, and do this analysis in the tradition of: Measuring Geothermal – A Revolutionary Hypothesis.

I will use 41 years of NCEP Reanalysis Data. Create a new file fluxchange.sh, and paste:

# source fluxchange.sh
# Zoe Phin 2020/03/10
    
F=(0 ulwrf dswrf uswrf lhtfl shtfl)                                                  
O=(0 3201.5 3086.5 3131.5 856.5 2176.5)

require() { sudo apt install nco gnuplot; } # Linux Only
    
download() {
    b="ftp://ftp.cdc.noaa.gov/Datasets/ncep.reanalysis2.derived/gaussian_grid"
    for i in ${F[*]}; do wget -O $i.nc -c $b/$i.sfc.mon.mean.nc; done
}

extract() {
    rm -f .fx*
    for i in {1..5}; do echo $i of 5 >&2
        for t in {000..491}; do
            ncks --trd -HC ${F[$i]}.nc -v ${F[$i]} -d time,$t | sed \$d | awk -F[=\ ] -vO=${O[$i]} -vt=$t '{ 
                W[$4]+=$8/10+O } END { for (L in W) { T += W[L]/192*cos(L*atan2(0,-1)/180) }
                printf "%04d %02d %7.3f\n", t/12+1979, t%12+1, T/60.1647 }'
        done | tee -a .fx$i
    done
}

annualize() {
    for i in {1..5}; do
        awk '{ T[$1]+=$3 } END { for (y in T) printf "%04d %7.3f\n", y, T[y]/12 }' .fx$i | sort -n > .af$i
    done
}

change() {
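    # fluxchg.csv columns: year ulwrf dswrf uswrf lhtfl shtfl | ulwrf+lhtfl+shtfl  dswrf-uswrf  geothermal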
    paste .af1 .af2 .af3 .af4 .af5 | awk '{ 
        printf "%s %s %s %s %s %s | %7.3f %7.3f %7.3f\n", 
            $1, $2, $4, $6, $8, $10, $2+$8+$10, $4-$6, $2-($4-$6)+$8+$10 }' | tee fluxchg.csv | awk '
        NR==1 { Ui=$2; Ni=$5+$6; Si=$9; Gi=$10 } END { 
            dU=$2-Ui; dN=$5+$6-Ni; dS=$9-Si; dG=$10-Gi
            printf "Upwelling Change:\t%7.3f W/m^2\n", dU
            printf "NonRadiative Change:\t%7.3f W/m^2\n\n", dN
            printf "Net Solar Change:\t%7.3f W/m^2\n", dS
            printf "Geothermal Change:\t%7.3f W/m^2\n", dG
            
        }'
}

plot() {
    echo "set term png size 740,550 font 'arial,12'; unset key; set grid
    plot 'fluxchg.csv' u 1:9 t 'Net Solar' w lines lw 3 lc rgb 'orange'" | gnuplot > slrchg.png

    echo "set term png size 740,550 font 'arial,12'; unset key; set grid
    plot 'fluxchg.csv' u 1:10 t 'Geothermal' w lines lw 3 lc rgb 'green'" | gnuplot > geochg.png
}

Run it:

$ source fluxchange.sh
$ require                 # linux only
$ download
$ extract
$ annualize
$ change

Upwelling Change:         3.401 W/m^2
NonRadiative Change:      4.784 W/m^2

Net Solar Change:         1.419 W/m^2
Geothermal Change:        6.766 W/m^2

The results are changes for years 1979 to 2019 (inclusive). The upwelling radiation flux and non-radiative flux equivalent have changed by 3.401+4.784 = 8.185 W/m², and the attribution is properly divided between:

  1. The change in insolation (primarily due to reduced cloud cover) – 1.419 W/m²
  2. Internal geothermal changes within the Earth – 6.766 W/m²
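
A quick bookkeeping check (assuming fluxchg.csv from the change step is still in the working directory): the 1979 and 2019 rows alone reproduce the four deltas, and the two attributions sum to the same total as the left-hand side:

$ awk 'NR==1 { U=$2; N=$5+$6; S=$9; G=$10 }
       END   { printf "dU+dN = %.3f W/m2, dS+dG = %.3f W/m2\n", ($2-U)+($5+$6-N), ($9-S)+($10-G) }' fluxchg.csv

dU+dN = 8.185 W/m2, dS+dG = 8.185 W/m2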

The crackpot mainstream greenhouse gas theory lacks empirical evidence, and yet its followers have the nerve to claim that humans are mostly responsible for recent warming. Nonsense. The cause was always #1 and #2.

Plot results:

$ plot

Two new files created: slrchg.png and geochg.png

Net Solar @ Surface (W/m²)
Geothermal @ Surface (W/m²)

Enjoy 🙂 – Zoe

Addendum

$ cat fluxchg.csv

1979 395.786 186.921 26.919 87.056 7.335 | 490.177 160.002 330.175
1980 396.248 186.179 27.066 88.049 7.452 | 491.749 159.113 332.636
1981 395.826 186.227 26.999 87.365 6.982 | 490.173 159.228 330.945
1982 395.207 187.333 26.994 87.962 7.934 | 491.103 160.339 330.764
1983 396.148 187.429 27.092 87.688 7.736 | 491.572 160.337 331.235
1984 395.186 187.830 26.942 86.853 7.945 | 489.984 160.888 329.096
1985 394.914 187.021 27.373 86.890 7.407 | 489.211 159.648 329.563
1986 395.503 186.138 26.799 88.883 7.567 | 491.953 159.339 332.614
1987 396.042 186.786 27.074 89.541 7.123 | 492.706 159.712 332.994
1988 396.403 185.917 26.844 88.428 7.598 | 492.429 159.073 333.356
1989 395.692 187.332 26.878 88.224 7.659 | 491.575 160.454 331.121
1990 396.751 185.962 26.491 88.830 7.405 | 492.986 159.471 333.515
1991 396.649 186.832 26.841 88.516 7.939 | 493.104 159.991 333.113
1992 395.438 187.175 27.030 89.625 8.814 | 493.877 160.145 333.732
1993 395.298 187.121 26.921 89.680 8.397 | 493.375 160.200 333.175
1994 395.859 187.384 27.092 90.347 8.459 | 494.665 160.292 334.373
1995 396.609 186.948 26.720 90.761 8.292 | 495.662 160.228 335.434
1996 395.938 186.889 27.293 92.229 8.509 | 496.676 159.596 337.080
1997 396.798 187.287 26.950 92.895 7.849 | 497.542 160.337 337.205
1998 397.931 186.646 26.847 92.945 7.873 | 498.749 159.799 338.950
1999 396.517 187.779 26.880 92.037 8.052 | 496.606 160.899 335.707
2000 396.340 187.967 27.053 94.193 7.936 | 498.469 160.914 337.555
2001 397.321 187.681 26.674 94.724 7.890 | 499.935 161.007 338.928
2002 397.710 188.238 26.708 94.694 7.949 | 500.353 161.530 338.823
2003 397.804 188.196 26.783 95.019 7.977 | 500.800 161.413 339.387
2004 397.397 187.912 26.836 95.386 7.933 | 500.716 161.076 339.640
2005 398.228 187.168 26.577 93.679 7.619 | 499.526 160.591 338.935
2006 397.815 187.226 26.505 93.671 6.732 | 498.218 160.721 337.497
2007 397.605 186.780 26.632 93.904 6.247 | 497.756 160.148 337.608
2008 397.106 188.004 27.021 92.936 6.565 | 496.607 160.983 335.624
2009 397.900 187.515 26.678 93.026 7.348 | 498.274 160.837 337.437
2010 398.153 186.115 26.535 94.463 6.238 | 498.854 159.580 339.274
2011 397.179 186.900 26.747 93.863 6.269 | 497.311 160.153 337.158
2012 397.674 187.437 26.768 93.584 6.606 | 497.864 160.669 337.195
2013 397.805 187.167 26.953 93.387 6.382 | 497.574 160.214 337.360
2014 398.137 187.495 26.908 94.028 6.460 | 498.625 160.587 338.038
2015 398.972 187.381 26.727 94.254 6.797 | 500.023 160.654 339.369
2016 399.599 185.911 25.952 93.387 6.276 | 499.262 159.959 339.303
2017 399.039 186.358 26.212 94.697 6.395 | 500.131 160.146 339.985
2018 398.596 187.198 26.528 94.232 6.356 | 499.184 160.670 338.514
2019 399.187 187.909 26.488 93.321 5.854 | 498.362 161.421 336.941
Columns:

Column 1: Year
Column 2: Earth Longwave Upwelling
Column 3: Solar Shortwave Downwelling
Column 4: Solar Shortwave Upwelling
Column 5: Latent Heat
Column 6: Sensible Heat
Column 7: Total Equivalent Received by Atmosphere
Column 8: Net Solar (Shortwave Down minus Up)
Column 9: Geothermal (#2 - (#3 - #4) + #5 + #6)

Dumbest Math Theory Ever

Mainstream climate scientists believe in the dumbest math theory ever devised to try and explain physical reality. It is called the Greenhouse Effect. It’s so silly and unbelievable that I don’t even want to give it the honor of calling it a scientific theory, because it is nothing but ideological mathematics that has never been empirically validated. In fact it is nothing but a post hoc fallacy: the surface is hotter than what the sun alone can do, therefore greenhouse gases did it!

Today we will play with this silly math theory called the greenhouse effect. Here are two examples of its typical canonical depiction:


Let’s get started. Please create a new file called gheffect, and paste the following into it:

# bash gheffect
# Zoe Phin, 2020/03/03

[ -z $TSI ] && TSI=1361
[ -z $ALB ] && ALB=0.306

echo $1 | awk -vALB=$ALB -vTSI=$TSI 'BEGIN { 
		SIG = 5.67E-8 ; CURR = LAST = SUN = TSI*(1-ALB)/4
		printf "Sec | Upwelling |   Temp    | GH Effect |  Trapped  | To Space\n"
	} {
	for (i=1 ;; i++) {
		printf "%3d | %7.3f W | %7.3f C ", i, CURR, (CURR/SIG)^0.25-273.16

		CURR = SUN + $1*LAST/2 ; GHE = SUN - (LAST*(1-$1))

		printf "| %7.3f W | %7.3f W | %07.3f W\n", GHE, CURR-LAST, CURR-GHE

		if ( sprintf("%.3f", CURR) == sprintf("%.3f", LAST) ) break

		#if ( CURR==LAST ) break

		LAST = CURR
	}
}'

Now run it with atmospheric emissivity = 0.792:

$ bash gheffect 0.792

Sec | Upwelling |   Temp    | GH Effect |  Trapped  | To Space
  1 | 236.133 W | -19.125 C | 187.018 W |  93.509 W | 142.625 W
  2 | 329.642 W |   2.971 C | 167.568 W |  37.030 W | 199.104 W
  3 | 366.672 W |  10.419 C | 159.866 W |  14.664 W | 221.470 W
  4 | 381.336 W |  13.212 C | 156.816 W |   5.807 W | 230.327 W
  5 | 387.142 W |  14.296 C | 155.608 W |   2.300 W | 233.834 W
  6 | 389.442 W |  14.722 C | 155.130 W |   0.911 W | 235.223 W
  7 | 390.352 W |  14.890 C | 154.940 W |   0.361 W | 235.773 W
  8 | 390.713 W |  14.957 C | 154.865 W |   0.143 W | 235.991 W
  9 | 390.856 W |  14.983 C | 154.835 W |   0.057 W | 236.077 W
 10 | 390.912 W |  14.994 C | 154.824 W |   0.022 W | 236.111 W
 11 | 390.935 W |  14.998 C | 154.819 W |   0.009 W | 236.125 W
 12 | 390.944 W |  14.999 C | 154.817 W |   0.004 W | 236.130 W
 13 | 390.947 W |  15.000 C | 154.816 W |   0.001 W | 236.132 W
 14 | 390.949 W |  15.000 C | 154.816 W |   0.001 W | 236.133 W

W is shorthand for W/m². Parameters are taken from NASA Earth Fact Sheet.

As you can see, by delaying outgoing radiation for 14 [¹] seconds [²], we have boosted surface up-welling radiation by an additional ~66% (154.8/236.1 W/m²). Amazing, right? That’s what my program shows, and that’s what is claimed:

This is zero in the absence of any long‐wave absorbers, and around 155 W/m² in the present‐day atmosphere [Kiehl and Trenberth, 1997]. This reduction in outgoing LW flux drives the 33°C greenhouse effect …

Attribution of the present‐day total greenhouse effect

The main prediction of the theory is that as the atmosphere absorbs more infrared radiation, the surface will get warmer. Let’s rerun the program with a higher atmospheric emissivity = 0.8

$ bash gheffect 0.8

Sec | Upwelling |   Temp    | GH Effect |  Trapped  | To Space
  1 | 236.133 W | -19.125 C | 188.907 W |  94.453 W | 141.680 W
  2 | 330.587 W |   3.168 C | 170.016 W |  37.781 W | 198.352 W
  3 | 368.368 W |  10.746 C | 162.460 W |  15.113 W | 221.021 W
  4 | 383.481 W |  13.614 C | 159.437 W |   6.045 W | 230.088 W
  5 | 389.526 W |  14.738 C | 158.228 W |   2.418 W | 233.715 W
  6 | 391.944 W |  15.184 C | 157.745 W |   0.967 W | 235.166 W
  7 | 392.911 W |  15.361 C | 157.551 W |   0.387 W | 235.747 W
  8 | 393.298 W |  15.432 C | 157.474 W |   0.155 W | 235.979 W
  9 | 393.453 W |  15.461 C | 157.443 W |   0.062 W | 236.072 W
 10 | 393.515 W |  15.472 C | 157.431 W |   0.025 W | 236.109 W
 11 | 393.539 W |  15.477 C | 157.426 W |   0.010 W | 236.124 W
 12 | 393.549 W |  15.478 C | 157.424 W |   0.004 W | 236.130 W
 13 | 393.553 W |  15.479 C | 157.423 W |   0.002 W | 236.132 W
 14 | 393.555 W |  15.479 C | 157.423 W |   0.001 W | 236.133 W

A 1% rise in atmospheric emissivity (0.8/0.792) predicts a 0.479 °C rise in surface temperature.
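
The same 0.479°C can be read off without iterating: the loop converges to U = S/(1 − ε/2), with S = 236.133 W/m² of absorbed solar (the first-second upwelling in the tables above), so a one-liner reproduces both final temperatures:

$ awk 'BEGIN { SIG=5.67e-8; S=236.133; split("0.792 0.8", e)
      for (i=1; i<=2; i++) { U=S/(1-e[i]/2)
        printf "eps = %.3f  ->  U = %.3f W/m2, T = %.3f C\n", e[i], U, (U/SIG)^0.25-273.16 } }'

eps = 0.792  ->  U = 390.949 W/m2, T = 15.000 C
eps = 0.800  ->  U = 393.555 W/m2, T = 15.479 C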

You would think such intelligent and “correct” mathematics would be based on actual experiments, but you would be wrong; it is not based on anything other than its presuppositions, and has been so for more than a century by name, and two centuries by concept.

Let’s outline a very simple experiment to test whether the greenhouse effect is true:

          Solid Surface
               v

1) Person   => |     IR Camera

2) Person   <- | ->  IR Camera

And repeats until "equilibrium"

Radiation leaves the body and strikes a screen. After absorption, some radiation will go out to the IR camera, and the rest will go back to the person, thereby warming them up further, according to greenhouse effect theory. Note that we don't even need absorption; merely reflecting a person's radiation back at them should warm them up.

Let’s assume the human body emits 522.7 W/m² (37 °C) (Emissivity: 0.9961, [Sanchez-Marin 2009]). For compatibility with my program, we multiply this figure by 4, and call it TSI. Let’s assume the screen and air in between together has a total emissivity of 0.9. Now run:

$ TSI=2090.8 bash gheffect 0.9
Sec | Upwelling |   Temp    | GH Effect |  Trapped  | To Space
  1 | 362.754 W |   9.658 C | 326.478 W | 163.239 W | 199.515 W
  2 | 525.993 W |  37.188 C | 310.154 W |  73.458 W | 289.296 W
  3 | 599.451 W |  47.498 C | 302.809 W |  33.056 W | 329.698 W
  4 | 632.507 W |  51.830 C | 299.503 W |  14.875 W | 347.879 W
  5 | 647.382 W |  53.725 C | 298.016 W |   6.694 W | 356.060 W
  6 | 654.076 W |  54.566 C | 297.346 W |   3.012 W | 359.742 W
  7 | 657.088 W |  54.943 C | 297.045 W |   1.356 W | 361.398 W
  8 | 658.443 W |  55.112 C | 296.909 W |   0.610 W | 362.144 W
  9 | 659.053 W |  55.188 C | 296.848 W |   0.274 W | 362.479 W
 10 | 659.328 W |  55.222 C | 296.821 W |   0.124 W | 362.630 W
 11 | 659.451 W |  55.238 C | 296.809 W |   0.056 W | 362.698 W
 12 | 659.507 W |  55.244 C | 296.803 W |   0.025 W | 362.729 W
 13 | 659.532 W |  55.248 C | 296.801 W |   0.011 W | 362.743 W
 14 | 659.543 W |  55.249 C | 296.799 W |   0.005 W | 362.749 W
 15 | 659.548 W |  55.250 C | 296.799 W |   0.002 W | 362.752 W
 16 | 659.550 W |  55.250 C | 296.799 W |   0.001 W | 362.753 W
 17 | 659.552 W |  55.250 C | 296.799 W |   0.000 W | 362.753 W

We see that the screen is “trapping” a lot of human radiation from reaching the IR camera, and we expect an extra 296.8 W/m² greenhouse effect, bringing us up to 55°C. Merely placing a screen in front of us should make us feel as if we’re stepping inside a sauna.

https://youtu.be/fpx7hsoYEt4 – Look at all the trapped radiation!
https://youtu.be/Fx49t4sv7f0 – Look at all the trapped radiation!

These people must really be feeling the heat. Except they aren't, and for good reason: preventing radiation from reaching a colder place does not cause heating back at the source. Had these people had thermometers strapped to them, they would have noted a virtually zero temperature rise (just the bit due to blocked convection). Look very closely at the videos. Note the seconds when the screens are placed in front of their faces, and notice the lack of any change in the thermal readings. None!

All empirical evidence shows the opposite of the claims of the greenhouse effect.

So the question remains, why is the surface hotter than the sun can make it alone?

Energy Budget

If we look at the energy budget, we can see a dependency loop between surface and atmosphere: Surface -> Atmo = 350 and Atmo -> Surface = 324. So which came first, the chicken or the egg? This is nonsense. You can't have a dependency loop for heat flow. Let's try a theory that causes no mental anguish and does not lack empirical evidence. For this, we ignore the climate "scientists" and go to the geophysicists:

https://www.routledgehandbooks.com/doi/10.1201/9781315371436-4

Here we see that Earth's geothermal energy is capable of delivering 0 °C to the surface; this is equivalent to 315.7 W/m². We add the sun and subtract latent + sensible heat:

315.7 + 168 - 24 - 78 = 381.7 = Upwelling Radiation

Now we get a figure that is 390 - 381.7 = 8.3 W/m² off, but that's OK, because latent and sensible heat are not directly measured but estimated with certain physical assumptions, and/or the 0 °C geothermal figure is an approximation too.
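
For the record, the 315.7 W/m² is nothing more than the Stefan-Boltzmann emission at 0 °C; a one-liner (using the same σ and 273.16 K used elsewhere on this blog) reproduces it and the sum above:

$ awk 'BEGIN { SIG=5.67e-8; G=SIG*273.16^4
      printf "sigma*T^4 = %.1f W/m2; %.1f + 168 - 24 - 78 = %.1f W/m2\n", G, G, G+168-24-78 }'

sigma*T^4 = 315.7 W/m2; 315.7 + 168 - 24 - 78 = 381.7 W/m2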

Now we finally realize that the greenhouse effect is a hoax, and nothing but geothermal flipped upside down. There is no Downwelling Radiation; there is only Upwelling-from-measurement-instrument Radiation (see here). Those who read Why is Venus so hot? probably already saw where I was going. Doesn't this make more sense than temperature raising by backradiation? Reality shows absolutely normal geothermal and solar combining to produce what we observe. We see all normal heating, and no ugly backwards zig-zag heating.

Let’s summarize:

     Upwelling
         ^
  |      |       ^        ^
  v      |       |        |
===============================
         |    Latent  Sensible
Solar ---+     Heat      Heat 
         |       ^        ^         
         |       |        |
         +------ Geothermal

Now which explanation does Occam’s Razor favor?

I hope you have enjoyed the return to sanity.

Sincerely, -Zoe

Notes

[¹] We only care about matching 3 decimal places. If we want to extend it to IEEE754 64-bit precision, it takes 40 seconds. Not that this matters much; Most work is accomplished in the first 5 seconds.

[²] I debated with myself whether to use the term seconds or iterations. Real physical calculations would take mass and heat capacity into account, but since greenhouse theorists don’t use these, I won’t either. Their simple model is in seconds.

Instructions for Windows Users

You can run all the code at this blog on Windows, rather than Linux. I will show you how. Preferred Windows is version 10. That’s the only one I’ve set up. Any Windows >7.0 should work in theory.


Run Windows Command Line (<Win>+R; type “cmd” then <enter>)

cd \
mkdir Zoe

Zoe is your variable. Make it whatever you want, remember it, then “exit” <enter>


Download Cygwin Setup Program. [Direct Link to 64-bit Exe]

Run Exe, click <Next>

Choose 1st
Set the root directory
Set to same previous directory
Choose a connection method, I choose 1st
Choose a mirror
Change View to “Full”

You will need 3 packages: wget, lua (5.3), gnuplot.

Type the package name in the search box, then under New, CLICK on the word "Skip"; a version number will appear

For lua, keep clicking “Skip/Version #” until version 5.3.x appears. Failure to do so will break gnuplot.

When done click <Next>. Next screen suggests dependencies; Just click <Next>, then wait through download & install process.

First is required, 2nd is your choice

Download Miniconda3. [Direct Link to 64-bit Exe]. Run Exe, click <Next>, click <I Agree>

Your choice. I do recommend it.
Place it inside previous Cygwin directory for neatness
I select both. Caution if you are a python developer.

When complete, click <Next>, click <Next>, deselect both “Learn …”, click <Finish>

Double click

A new user home environment will be created.

The Path is “C:\Zoe\home\<Windows Username>”. All code can just go here.

Inside the Terminal:

$ conda config --add channels conda-forge
$ conda install nco numpy

Second command requires your confirmation. Type “y” and <enter>.

When done, exit with “exit” <enter>, and relaunch Cygwin Terminal.


Use any text editor you like (it must be able to save Unix-style line endings), or get Sublime. [Direct Link to 64-bit Exe]

After install, run, and select: File > New File

Make sure to set View > Line Endings > Unix, otherwise script will not run!

Copy and paste the following sample code into the empty new file:

# source map.sh

download() {
	wget -c ftp://ftp.cdc.noaa.gov/Datasets/ncep.reanalysis/surface/land.nc
}

extract() {
	ncks --trd -HC land.nc | awk -F [=\ ] '{
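		# shift longitudes (0-360 grid) to recentre the map before plotting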
		if ($6 < 191) {$6+=169} else {$6-=191}
		print $6" "$4" "$8
	}' > map.dat
}

plot() {
	echo 'set term png size 362,182; set nokey
		set yrange [-90:90]; set xrange [0:360]
		set noborder; unset colorbox 
		unset xtics; unset ytics
		set palette define (0 "blue", 1 "orange") 
		set margin 0,0,0,0
		plot "map.dat" u 1:2:3 pt 5 ps 1 lc palette' | gnuplot > map.png
}

Save this file to: C:\Zoe\home\<Windows Username>\map.sh

Now execute from Cygwin Terminal:

$ source map.sh
$ download
$ extract
$ plot

* or *

$ . map.sh && download && extract && plot

The result is a file called map.png inside C:\Zoe\home\<Windows Username>. Use Windows Explorer to find it and launch it.

If map.png looks like this … you are done! Most of my blog code will work.


Happy Coding 🙂 -Zoe

Deducing Geothermal

I used to be a fan of Joseph Postma before I realized he’s very stubborn and on the wrong track headed for a dead end. I hope he turns around.

I highly recommend that everyone read his great series … The Fraud of the Greenhouse Effect (1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19). You will learn a lot!

Today I will be critiquing Joseph Postma’s paper A Discussion on the Absence of a Measurable Greenhouse Effect in light of what I know about his current view. I take no issue with his central proposition that there is no Greenhouse Effect, but I do take issue with his current personal conclusion that the sun, and the sun alone, is enough to explain the surface temperature. His latent heat doesn’t occur until there’s energy to evaporate water first, and the only energy he has is from the sun. My goal is to show that in fact the sun is not enough, and Postma’s implicit hypothesis is in error, though his paper is still very good and highly recommended reading.

We will use some basic data from NASA:

Solar irradiance (W/m2): 1361

Bond albedo: 0.306

Average temperature: 15°C

Diurnal temperature range: 10°C to 20°C

NASA’s Earth Fact Sheet

See code in the addendum. To run first part of analysis:

> source gmodel.sh && require && download && justsun

Sphere     ... Hard Way: 236.1329 W/m^2 , Easy Way: 236.1335 W/m^2
Hemisphere ... Hard Way: 472.2659 W/m^2 , Easy Way: 472.2659 W/m^2

The hard way performs the actual integration (an approximation), while the easy way just uses a divisor (1/4 for the sphere and 1/2 for the hemisphere). The hard way is needed to build the average insolation across all latitudes as a function of local time. An average location on Earth will see this kind of daily insolation, averaged over the year. Keep in mind that this is an average value for a typical year, and therefore my code purposefully disregards the math for Earth's tilt and the resulting seasonal daylight variation. It's just not necessary. Resulting graph:

Global Average Insolation, using standard Albedo of ~0.3
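
(As an aside, the "easy way" figures are just divisor arithmetic on the NASA numbers, which a one-liner confirms:)

> awk 'BEGIN { TSI=1361; A=0.306
      printf "Sphere: %.2f W/m2, Hemisphere: %.2f W/m2\n", TSI*(1-A)/4, TSI*(1-A)/2 }'

Sphere: 236.13 W/m2, Hemisphere: 472.27 W/m2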

It should be noted that we can't use the standard albedo of ~0.3, because this albedo is an average for the entire surface+atmosphere ensemble. We need to know the actual portion of solar radiation that is absorbed at the surface. This is different from the official standard albedo.

> absorbs
Surface Absorption: 0.4874

Flux data from NASA’s ISCCP Project (1983-2004 data here), yields a surface absorption of 0.4874, while NASA’s “official” energy budget [2008] shows: 163.3 / 340.3 = 0.4799. I am just going to choose an “albedo” of 0.52 (1-0.4799). We re-run with an ALB parameter set to 0.52:

> ALB=0.52 justsun
Sphere     ... Hard Way: 163.3196 W/m^2 , Easy Way: 163.3200 W/m^2
Hemisphere ... Hard Way: 326.6392 W/m^2 , Easy Way: 326.6392 W/m^2
Daily Insolation @ Surface for Average Location, Annually Averaged

Now we’re ready to move on to serious analysis. Postma has provided the only formula we will need:

Equation 11 (Page 13)

where C(t) is literally a climate term which could be either positive or negative (adding heat or taking heat away) in total, or composed of several unique contributions depending on if there is an additional heat source such as the “greenhouse effect”, or chemical and geologic sources, etc.

A.d.o.t.A.o.a.M.G.E, Page 13

“emis = 1.0; % emissivity, using 1.0 for surface” [ I will use the same ]

— Same, Page 63

The following code snippets (see addendum) summarize this formula (emis = 1):

$1+ADD                          # As in Current Flux += ADD  # ADD = C(t)
T = T + ($1 - SIG*T^4)/TAU      # As in Tnew = Told + (Current Flux - SIG*Told^4)/TAU

We must choose a value for τ (TAU) that will produce a diurnal difference of 10°K (or °C) as shown in NASA’s Earth Fact Sheet above. The needed TAU equals 12940.

> TAU=12940; ADD=0; manydays; lastday

24HR  Sun ... 163.32 W/m2	
Day   Sun ... 326.64 W/m2	
Day   Max ... 236.64 K    -36.52 C	
Night Min ... 226.64 K    -46.52 C	
Max - Min ...  10.00 K     10.00 C	
24HR  Avg ... 231.59 K    -41.57 C	
24HR Flux ... 163.10 W/m2

We start off at T=0, and after 40 days we get to a stable typical day:

Note the result: Max - Min … 10.00 K. We have satisfied one of the criteria, but notice that our 24HR average is -41.57°C. That's not the 15°C we need. Obviously the sun is not enough! And the real diurnal temperature does not go from a night minimum of -46.5°C to a day maximum of -36.5°C. We must satisfy all 4 temperature criteria in NASA's Fact Sheet shown above, not just 1.

Postma spends a part of the paper analyzing a “C(t)” value of 324 W/m² (what is claimed for GHG backradiation) along with arbitrary (though intelligently guessed) τ values. He then dismisses the results for fairly good reasons. However he missed the crucial point: the sun is not enough. I’m going to show you what values he should have used. I have a parameter in my program (ADD) that is equivalent to C(t). I have found the necessary parameters to be TAU=12895, ADD=227.66. ADD is what I believe to be the radiative component of geothermal.

> TAU=12895; ADD=227.66; manydays; lastday

24HR  Sun ... 390.98 W/m2	
Day   Sun ... 781.96 W/m2	
Day   Max ... 293.21 K     20.05 C	
Night Min ... 283.21 K     10.05 C	
Max - Min ...  10.00 K     10.00 C
24HR  Avg ... 288.11 K     14.95 C	
24HR Flux ... 390.67 W/m2	
Geothermal (Green) + Solar (Yellow)
Geothermal (Green) + Solar (Yellow)

Notice that we satisfied all criteria set forth in NASA’s Earth Fact Sheet (with only 0.05°C error):

Day Max … 20.05 C
Night Min … 10.05 C
Max – Min … 10.00 K
24HR Avg … 14.95 C

But we’re not done. ADD is just the radiative component of geothermal. Let’s add Sensible and Latent Heat from NASA’s “official” energy budget [2008]:

227.66 + 18.4 + 86.4 = 332.46 W/m²

This result is not much different than the 335.64 W/m² result I got here: Measuring Geothermal, using NCEP Reanalysis data.

332.46 vs 335.64 ! What’s the significance of this? I was able to approximately get the same geothermal emergent radiative flux from a very simple model! I don’t know about you, but I’m impressed.

How about Postma? Where’s his mind today?

Yah…that’s definitely my Zoe…one of my most attractive stalkers for sure [Zoe: How sweet, TY] . She thinks that the flat Earth theory model is all totally fine [Zoe: strawman; no one presents an actual flat earth model, including those that print maps on flat paper rather than globes]…but backradiation isn’t from the atmosphere “because that’s impossible, but it is from geothermal.”

So…she wants to keep the flat Earth theory [Zoe: Two hemisphere 24hr heat capacity theory, actually] accounting where the Sun can’t heat the Earth or create and sustain the weather/climate, and where there’s some additional energy source which provides twice more energy than the Sun…but instead of it being “backradiation” she wants it to be geothermal….providing twice the energy than the Sun. She went on about this here for months…finally banned her…because the heat from geothermal is known and measured…and flat Earth theory would be the WRONG way to try to incorporate it anyway!

These people are sick, sick demented freaks, and they seem to really want to keep their flat Earth theory no matter what mechanisms need to change to make it work.

Joseph Postma

Apparently "stalker" means criticizing a public scientist for their ideas. I agree with him that there is no Greenhouse Effect, but he goes too far, way too far. As you can see, he still thinks the sun, and the sun alone, is enough to explain surface temperatures and their diurnal variation, and that everyone who disagrees is a flat earther!

One thing from his publication I found very interesting:

Solar forcing acts directly only on the top few millimeters of surface soil itself (the penetration depth is larger for ocean water and some heating occurs directly in the atmosphere via extinction), and this is where the incoming short wave radiant energy performs work and raises the temperature. This heat energy will then conduct its way down into the subsurface until it merges with the geothermal temperature at a depth of somewhere around, say, 5 to 10 meters and temperature of approximately 5°C to 10°C [Zoe: I found 0 to 10] … and this much larger thermal-mass system will respond much more slowly, in aggregate, to the solar variation. This low-frequency [Zoe: Goes to nil frequency] aggregate response will provide a baseline upon which the daily variations will oscillate at the top

A.d.o.t.A.o.a.M.G.E, Page 16

A baseline you say? Maybe this baseline would exist without the sun? Maybe this baseline is capable of its own thermal action and emission? Too bad he did not pursue this line of thought. But thanks for reaffirming my intuition at the time.

I hope you’ve enjoyed this article.

Love, -Zoe

Update

Guys I just had several dozen notifications come through of Zoe trying to link to my blog from what is apparently her new blog.

WHAT A FN STALKER! [Zoe: You made yourself a public scientist so stop playing the victim]

“Zoe’s geothermal insights” or some retardation.

“Flat Earth is OK! [Zoe: usual strawman]. We just need to use geothermal to make up the temperature instead!”

Basic bitch. [Zoe: How lovely]

— Joseph E Postma says:2020/02/25 at 8:24 AM

Nice! Joseph, your solar-only theory can’t explain observations. It would be great if you could rejoin reality and continue to make contributions to science, as you did in the past.

Addendum

Code gmodel.sh:

# source gmodel.sh
# Zoe Phin, 2020/02/20

plothead="set term png size 740,550 font 'arial,12'; unset key
	set xtics 360 out nomirror; set mxtics 6; set grid xtics ytics
	set xtics add ('0h' 0,'6h' 360,'12h' 720,'18h' 1080,'24h' 1440)
	set yrange [0 to 800]; set ytics 100,100,800
"

require() { sudo apt-get install gnuplot; }

download() {
	wget -O sdn.bin -c https://isccp.giss.nasa.gov/pub/data/FC/FDAVGANN__SWFLSRFDW
	wget -O sup.bin -c https://isccp.giss.nasa.gov/pub/data/FC/FDAVGANN__SWFLSRFUW
}

absorbs() {
	od='od -An -f -w4 --endian=big'; $od sup.bin > .sup; $od sdn.bin > .sdn
	paste .sdn .sup | awk '{S+=$1-$2} END { 
		printf "Surface Absorption: %.4f\n", (S/NR)/(1361/4) }'
}

justsun() {
	[ -z $ALB ] && ALB=0.306
	seq -89.875 0.25 89.875 | awk -vA=$ALB 'BEGIN {pi = atan2(0,-1); r=pi/180} {
		CONVFMT="%.8f"; if ($1 < 0) $1 = 0 - $1
		a = sin(r*($1+0.125))-sin(r*($1-0.125))
		for (m=0; m<1440; m++) {
			y = cos((m-720)*pi/720)
			print m" "$1" "a/2" "1361*(1-A)*cos($1*pi/180)*((y<0)?0:y)
		}
	}' | awk '{ M[$1]+=$3*$4 } END {
		for (i=0; i<1440; i++) print M[i]   # print minutes in order (awk for-in order is not guaranteed)
	}' | tee solar.dat | awk -vA=$ALB '$1>0 { S+=$1 } END { CONVFMT="%.4f"
		HW = S/720; EW = 1361*(1-A)/2
		print "Sphere     ... Hard Way: " HW/2 " W/m^2 , Easy Way: " EW/2 " W/m^2"
		print "Hemisphere ... Hard Way: " HW   " W/m^2 , Easy Way: " HW   " W/m^2" }'

	echo "$plothead set title 'Solar Flux (W/m²)'
	plot 'solar.dat' u (\$1==0?-1:\$1) w filledcu above fc 'yellow',\\
	     513.085*cos((x-720)*pi/720) w lines lc rgb 'orange' lt 8 lw 3
#	     741.835*cos((x-720)*pi/720) w lines lc rgb 'orange' lt 8 lw 3
	" | gnuplot > justsun.png
}

manydays() {
	[ -z $ADD ] && ADD=0; [ -z $DAYS ] && DAYS=15
	for n in `seq $DAYS`; do cat solar.dat | awk -vADD=$ADD '{ 
		printf "%7.3f\n", $1+ADD }'
	done | awk -vTAU=$TAU 'BEGIN { SIG=5.67e-8 } { 
		T=T+($1-SIG*T^4)/TAU
		printf "%10.6f %10.6f %10.6f\n", $0, T, SIG*T^4
	}' > many.dat

	echo "$plothead set key samplen 0; set title '$DAYS Days of Flux (W/m²)'
	set xtics format ''; set xtics 1440 in mirror; unset mxtics
	plot 'many.dat' u (\$1==0?-1:\$1) title '' w filledcurves above y=$ADD fc 'yellow',\\
	$ADD t '' w filledcu above y=0 fc 'dark-green' fs solid 0.8 border lt 2 lw 4,\\
	'' u 3 t 'τ = $TAU' w lines lw 2 lc 8" | gnuplot > many-t${TAU}.png
}

lastday() {
        tail -n 1440 many.dat | nl | tee last.dat | awk 'BEGIN { MIN=999 }
                   $3 > MAX { MAX=$3 }
                   $3 < MIN { MIN=$3 }
                        $2>0 { S+=$2 }
        NR >360 && NR <=1080 { D+=$3 }
        NR<=360 || NR > 1080 { N+=$3 }
        END { SIG=5.67e-8; D24 = (D+N)/1440; C=273.16
                printf "24HR  Sun ... %6.2f W/m2        \n", S/1440
                printf "Day   Sun ... %6.2f W/m2        \n", S/720
                printf "Day   Max ... %6.2f K %9.2f C   \n", MAX, MAX-C
                printf "Night Min ... %6.2f K %9.2f C   \n", MIN, MIN-C
                printf "Max - Min ... %6.2f K %9.2f C   \n", MAX-MIN, MAX-MIN
#               printf "Day   Avg ... %6.2f K %9.2f C   \n", D/720, D/720-C
#               printf "Night Avg ... %6.2f K %9.2f C   \n", N/720, N/720-C
#               printf "D-N Delta ... %6.2f K           \n", (D-N)/720
                printf "24HR  Avg ... %6.2f K %9.2f C   \n", D24, D24-C
                printf "24HR Flux ... %6.2f W/m2        \n", SIG*D24^4
        }'

        echo "$plothead set key samplen 0
        set ylabel 'Flux (W/m²)'; set y2label 'Temperature (K)'
        set ytics out 50 nomirror; set mytics 5; 
        set y2tics 210,10,800; set my2tics 2
        set link y2 via (y/5.67e-8)**0.25 inverse 5.67e-8*(y**4)

        plot $ADD t '' w filledcu above y=0 fc 'dark-green' fs solid 0.8 border lt 2 lw 4,\\
        'last.dat' u 1:2 t '' w filledcu above y=$ADD+1 fc 'yellow' fs solid 1 border lt 5 lw 4,\\
        '' u 1:3 t 'Surface T (τ=$TAU)' w lines lw 3 lc 8 axes x1y2" | gnuplot > last.png
}

Two theories, one ideological, other verified

About a dozen people who have read my article, the case of two different fluxes, have dismissed my central argument by invoking a silly theory. The most famous critic, Willis Eschenbach (of WUWT fame), thus writes:

Zoe, I just took a look at your page. I fear that you’ve made a mathematical mistake. The problem is that you have over-specified the equation. Let me explain by a parallel example:

It is a physical impossibility for there to be more water flowing out of the end of a hose than there is flowing through the hose. Can’t happen. The flow through the hose must be equal to the flow out the end.

In the same way, It is a physical impossibility for there to be more energy flowing out of the end of a block of concrete than there is flowing through the block. It is logically impossible. The flow through the block must be equal to the flow out the end.

— Willis Eschenbach

Willis then went on to resolve my equations using his key "insight" that the radiation emerging out of an object "must" equal its conductive heat flux. In the language of my article, the assertion is: CSR = CHF, i.e. Cold Side Radiation (the radiation at the end of interest) equals Conductive Heat Flux.

The emission is at any moment εσT⁴. If the emission is not balanced by absorption or heat flux the temperature and consequently the emission will drop.

— Dirk Visser

This is essentially the same as Willis’ argument.

Other critics write:

If the heat flux is only 92 mW/m², then obviously geothermal can only make the surface about 36 kelvin.

— Unnamed

Again we see the CSR = CHF assumption, then evaluated with Stefan-Boltzmann’s Law.

Geothermal is negligible.

— Joseph Postma

Sun is more than 500 times as powerful as geothermal.

— Unnamed

Both of these comments implicitly assume CSR = CHF.

All other critiques are just variations on the same theme. Only difference is how many implicit logical leaps they are from the core assumption that CSR “must equal” CHF.

In my article I clearly explained that there is a difference between the conductive heat flux within a medium and the emergent electromagnetic radiation out of the medium, but it has fallen on deaf ears for some people. I don't know why (their denial), but I feel the need to shame them a little.

Let’s see what wikipedia says about a black body:

A black body in thermal equilibrium (that is, at a constant temperature) emits electromagnetic radiation called black-body radiation. The radiation is emitted according to Planck’s law, meaning that it has a spectrum that is determined by the temperature alone…

— Wikipedia

What is the conductive heat flux (CHF) of an object at thermal equilibrium (a uniform temperature)?

The conduction formula is:

Conduction Formula

CHF = Q/(A*Δt) = k*ΔT/L

Obviously with a uniform temperature, ΔT equals 0, and thus CHF is also ZERO!

And what did Ludwig Boltzmann and Max Planck discover emitted from their radiation cavities, which had a CHF of zero? Was it also zero, as my critics assert with their CHF=CSR theory? No, of course not! What comes out of an object with CHF=0 is CSR=εσT⁴, and not CHF=εσT⁴ [ as my 2nd critic evaluated ]. Nor is this CSR transient and headed for zero, as Willis and Dirk would have you believe.

Just as the wikipedia snippet above implies: ONLY the TEMPERATURE on the edge matters.

a spectrum that is determined by the temperature alone

— Wikipedia

Now wikipedia is not always right about everything, but this is so commonly well known that I don't need any other source. You can find essentially the same thing in every high school or college textbook. Every experiment since Gustav Kirchhoff [1859] has invalidated the CHF=CSR hypothesis, and reaffirmed my hypothesis: CHF and CSR are completely different and their relationship is inverse:

CSR = εσ(T-CHF*L/k)⁴

The greatest external emission is achieved at the lowest internal heat flux, assuming the hot side temperature is the same.

At thermal equilibrium (CHF=0), this formula drops to:

CSR = εσT⁴

Yes, just Stefan-Boltzmann’s Law

If my critics were correct, then experiments since 1859 (even just one!) would show their claim to be true. Yet none of them do, because my critics are … merely engaging in ideological mathematics and not real physics.

Summary:

CHF = CSR               | CSR = εσ(T - CHF*L/k)⁴
Ideological Mathematics | Physics

Geothermal is more than capable of delivering 0°C (CSR=~315 W/m²), despite the fact that its near surface CHF is ~92 mW/m². In fact, assuming same temperature at same depth, a smaller CHF yields a higher CSR. The CHF (~92 mW/m²) alone is not even enough information to determine the final temperature, and hence radiation out of the medium. Quoting CHF and comparing it to insolation is nothing but junk science.

A CHF of 92 mW/m² does not inhibit CSR of 315 W/m²
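
To put illustrative numbers on that, here is the CSR formula above evaluated at 0°C with ε = 1 (as in the 315.7 W/m² figure earlier), a 92 mW/m² CHF, and, purely as assumptions for the example, L = 1 m of material with conductivity k = 2 W/m·K. The correction term barely moves the emission:

$ awk 'BEGIN { SIG=5.67e-8; T=273.16; CHF=0.092; L=1; k=2   # L and k assumed for illustration
      printf "CSR (CHF = 0)     : %.2f W/m2\n", SIG*T^4
      printf "CSR (CHF = 0.092) : %.2f W/m2\n", SIG*(T-CHF*L/k)^4 }'

CSR (CHF = 0)     : 315.68 W/m2
CSR (CHF = 0.092) : 315.47 W/m2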

Sincerely Yours, -Zoe

Update 2020/03/03

This video shows CHF through the water approaching zero. Gets to ~0.01 W/m² at the end.

This video shows CHF through the pan getting to zero. See time 01:53.

Measuring Geothermal – A Revolutionary Hypothesis

I’m proposing a brand new hypothesis. Here it is:

  1. The so called greenhouse effect is nothing but an artifact of geothermal, flipped upside down.
  2. We can measure geothermal quite easily:

Geothermal Emission = Upwelling Longwave Radiation – (Downwelling Shortwave Radiation – Upwelling Shortwave) + Latent Heat Flux + Sensible Heat Flux

In the language of NCEP Reanalysis, the formula is:

geo = ulwrf – (dswrf – uswrf) + lhtfl + shtfl

Program: fluxes.sh:

# source fluxes.sh
# Zoe Phin 2020/02/02
    
F=(0 ulwrf dswrf uswrf lhtfl shtfl)                                                  
O=(0 3201.5 3086.5 3131.5 856.5 2176.5)

require() { sudo apt install nco; }
    
download() {
    b="ftp://ftp.cdc.noaa.gov/Datasets/ncep.reanalysis2.derived/gaussian_grid"
    for i in ${F[*]}; do wget -O $i.nc -c $b/$i.sfc.mon.mean.nc; done
}
 
extract() {
    for i in {1..5}; do echo $i of 5 >&2
        ncks --trd -HC ${F[$i]}.nc -v ${F[$i]} | awk -F[=\ ] -vO=${O[$i]} '  
        NF != 0 { W[$4]+=$8/10+O } END { for (L in W)                              
            printf "%07.3f %07.3f\n", L, W[L]/94464}' | sort -n > .w$i
    done
}
    
geo() {
    paste .w1 .w2 .w3 .w4 .w5 | tr ' ' '\t' | cut -f 1,2,4,6,8,10 | awk '  
        function S(x) { return cos(x*atan2(0,-1)/180) } {
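        # after cut: $1=lat, $2=ulwrf, $3=dswrf, $4=uswrf, $5=lhtfl, $6=shtfl; S(lat) is the cos-latitude area weight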
        A+=$2*S($1); B+=$3*S($1); C+=$4*S($1); D+=$5*S($1); E+=$6*S($1);    
        G=$2+$5+$6-($3-$4); F+=G*S($1); print $0" "G } END { X=60.1647         
        printf "\nAverage:%07.3f %07.3f %07.3f %07.3f %07.3f %07.3f\n",     
            A/X, B/X, C/X, D/X, E/X, F/X }'    
}

Run it:

> source fluxes.sh
> require
> download
> extract
> geo

Result:

-88.542	160.572	130.617	109.567	001.947	-52.117  89.352
-86.653	169.731	129.611	107.707	002.169	-52.952  97.044
-84.753	179.068	129.526	106.648	004.162	-53.255 107.097
-82.851	183.643	130.241	108.115	003.545	-49.320 115.742
-80.947	184.386	132.014	110.074	002.310	-44.688 120.068
-79.043	186.170	133.439	111.061	002.894	-43.264 123.422
-77.139	192.359	134.189	106.598	004.336	-39.082 130.022
-75.235	201.453	135.480	107.015	004.942	-37.415 140.515
-73.331	213.577	137.236	107.364	006.520	-36.813 153.412
-71.426	230.078	138.094	105.162	009.339	-37.852 168.633
-69.522	249.329	137.100	092.216	013.147	-29.933 187.659
-67.617	266.633	135.069	077.029	017.929	-21.059 205.463
-65.713	282.318	134.510	068.697	018.535	-10.248 224.792
-63.808	296.958	129.902	042.475	022.244	-00.262 231.513
-61.903	307.732	128.396	029.934	025.698	-00.212 234.756
-59.999	315.924	128.434	021.671	030.174	-00.973 238.362
-58.094	322.948	128.718	014.388	034.865	-01.690 241.793
-56.189	328.903	129.930	010.542	038.852	-04.608 243.759
-54.285	334.101	132.116	009.320	041.700	-08.192 244.813
-52.380	339.183	134.906	009.415	044.105	-10.608 247.189
-50.475	344.712	138.185	009.791	046.719	-11.703 251.334
-48.571	351.177	142.292	010.109	051.982	-10.207 260.769
-46.666	358.411	147.196	010.329	058.012	-07.006 272.550
-44.761	366.359	152.414	010.643	066.823	-03.668 287.743
-42.856	375.462	158.630	011.047	077.603	002.155 307.637
-40.952	384.799	165.580	011.384	090.318	007.026 327.947
-39.047	392.785	173.228	011.964	099.120	009.369 340.010
-37.142	399.903	181.187	013.021	103.336	011.744 346.817
-35.237	405.908	189.572	013.564	105.950	012.320 348.170
-33.333	411.725	196.549	015.631	104.388	016.865 352.060
-31.428	417.538	202.463	017.816	105.123	018.885 356.899
-29.523	422.830	207.009	019.005	109.663	018.734 363.223
-27.619	427.894	210.176	019.605	113.762	018.905 369.990
-25.714	432.643	212.739	019.920	117.687	019.344 376.855
-23.809	436.710	216.051	021.043	121.615	018.673 381.990
-21.904	440.565	219.806	020.820	127.532	018.441 387.552
-20.000	444.026	222.918	019.940	133.700	018.789 393.537
-18.095	446.273	224.950	019.483	139.782	016.592 397.180
-16.190	448.825	225.921	018.564	144.738	016.498 402.704
-14.286	451.648	226.038	017.898	150.638	014.189 408.335
-12.381	454.276	226.553	017.043	151.484	013.125 409.375
-10.476	456.781	226.559	016.880	148.830	012.093 408.025
-08.571	458.403	225.052	016.935	142.993	012.168 405.447
-06.667	459.008	221.504	017.089	136.852	010.468 401.913
-04.762	459.537	216.769	016.681	127.963	009.479 396.891
-02.857	458.436	210.836	016.090	118.484	006.789 388.963
-00.952	458.345	205.033	015.510	108.482	005.658 382.962
000.952	459.718	198.242	015.232	111.546	006.438 394.692
002.857	462.011	191.416	014.820	117.553	007.996 410.964
004.762	462.380	187.711	014.821	119.988	008.598 418.076
006.667	462.323	190.948	015.854	122.820	009.177 419.226
008.571	462.349	199.546	016.969	128.123	009.886 417.781
010.476	462.901	210.785	017.116	130.262	013.019 412.513
012.381	462.791	220.387	018.292	135.495	011.610 407.801
014.286	462.307	225.155	020.657	132.269	016.258 406.336
016.190	459.679	224.369	024.034	126.352	015.114 400.810
018.095	455.597	221.616	027.286	120.399	012.692 394.358
020.000	451.165	218.953	028.889	114.864	012.807 388.772
021.904	447.301	216.573	029.350	110.086	014.447 384.611
023.809	443.590	215.060	029.035	107.981	016.573 382.119
025.714	438.637	213.123	028.970	101.096	020.691 376.271
027.619	430.920	212.546	029.715	093.543	021.592 363.224
029.523	422.402	214.091	030.752	089.999	025.102 354.164
031.428	413.805	214.585	030.174	091.357	027.343 348.094
033.333	405.764	214.814	028.743	095.340	026.950 341.983
035.237	397.747	212.258	028.421	093.293	025.672 332.875
037.142	392.201	206.275	026.668	091.617	027.492 331.703
039.047	386.381	198.060	028.017	087.172	024.349 327.859
040.952	378.644	191.873	027.470	074.502	022.918 311.661
042.856	369.978	186.137	027.338	063.643	017.247 292.069
044.761	362.398	179.232	028.366	058.155	013.394 283.081
046.666	354.823	172.723	031.027	054.427	009.174 276.728
048.571	347.668	166.185	032.164	054.681	004.311 272.639
050.475	342.281	159.203	030.852	055.521	000.441 269.892
052.380	338.019	152.781	029.620	057.063	-04.077 267.844
054.285	334.264	146.661	028.962	056.201	-04.137 268.629
056.189	330.580	141.411	028.965	055.405	-04.194 269.345
058.094	325.893	137.501	030.894	053.379	-03.215 269.450
059.999	318.476	134.071	034.188	048.998	-04.704 262.887
061.903	309.515	130.940	038.740	045.262	-13.202 249.375
063.808	301.891	127.182	041.305	040.833	-15.629 241.218
065.713	294.212	123.862	044.413	034.349	-18.023 231.089
067.617	287.976	121.268	048.410	028.725	-16.735 227.108
069.522	282.844	120.001	052.192	023.153	-09.531 228.657
071.426	278.003	118.977	060.422	021.281	-10.585 230.144
073.331	271.879	117.964	066.545	015.967	-11.538 224.889
075.235	266.203	115.823	069.800	013.482	-10.869 222.793
077.139	259.555	116.130	078.981	008.960	-16.821 214.545
079.043	254.037	115.012	084.006	006.061	-21.808 207.284
080.947	250.528	113.843	087.865	005.230	-24.534 205.246
082.851	247.531	113.681	093.175	002.866	-27.892 201.999
084.753	247.433	112.121	093.849	002.508	-27.153 204.516
086.653	247.837	110.516	092.525	002.951	-27.169 205.628
088.542	247.890	109.704	092.062	002.538	-27.169 205.617

Average:396.983 187.114 026.813 091.591 007.370 335.643

Columns:

Column 1: lat    Latitude
Column 2: ulwrf  Upwelling Longwave Radiation
Column 3: dswrf  Downwelling Shortwave Radiation
Column 4: uswrf  Upwelling Shortwave Radiation
Column 5: lhtfl  Latent Heat Flux
Column 6: shtfl  Sensible Heat Flux
Column 7: geo    Geothermal

The value all the way on the bottom right: 335.643, is our geothermal emission in W/m² (for whole Earth). This means that geothermal supplies a temperature of:

(335.643/5.67e-8)^0.25-273.16 = 4.22°C
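
As a cross-check, the Average row above, plugged into the formula from the top of this post, reproduces both the 335.643 and the 4.22°C:

> awk 'BEGIN { SIG=5.67e-8
      geo = 396.983 - (187.114 - 26.813) + 91.591 + 7.370   # ulwrf - (dswrf - uswrf) + lhtfl + shtfl
      printf "geo = %.3f W/m2, T = %.2f C\n", geo, (geo/SIG)^0.25-273.16 }'

geo = 335.643 W/m2, T = 4.22 C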

Hmm, If only there was some way to reaffirm this via the scientific literature…

Sure looks like geothermal delivers at least 0 degrees Celsius. Our calculation is slightly off, but keep in mind that Sensible and Latent Heat are not directly measured by satellite but approximated via satellite data and some physical assumptions. Also, these geotherm diagrams have been around for decades, and it's possible they are just sticking to a convention while the actual measured surface T has changed.

It appears that everyone in geophysics already knows the truth. It’s only climate “scientists” who think greenhouse gases raise temperature, and without them the surface would be ~-18°C. Nope, without GHGs or even the Sun, it would be at least 0°C.

Geothermal and Solar completely explain the surface temperature and the remaining energy that goes into the atmosphere. No silly greenhouse effect necessary.

This is just a teaser to get people thinking. More to follow. Subscribe and stay tuned.

Love, -Zoe

Average Moon Temperature

Answer = 200.2 K

The Diviner Lunar Radiometer Experiment onboard the Lunar Reconnaissance Orbiter (LRO) has been acquiring moon data since July 2009. A very detailed paper [Williams 2017] was written, yet nowhere does it answer the simple question: what is the average temperature of the moon?

[Nikolov & Zeller 2014] intelligently guessed that the mean moon temperature is 197.3 K, based on techniques and data from [Vasavada 2012].

Today I will be using derived Diviner data to calculate the average temperature of the moon. My data has two sources: UCLA and WUSTL. The first has hourly data, and the other has 15-minute data (although many fields are missing). The final results of the two should not differ much.

I will be using a moon flattening parameter of 1/581.9 from [Araki 2009].

Paste the following into a new file called moontemp.sh:

# source moontemp.sh
# Zoe Phin, 2020/02/12

download1() {
    for l in {000..345..15}; do                   
        echo http://luna1.diviner.ucla.edu/~jpierre/diviner/level4_raster_data/diviner_tbol_snapshot_${l}E.xyz        
    done | wget -ci -
}

download2() {
    base="https://pds-geosciences.wustl.edu/lro/lro-l-dlre-4-rdr-v1/lrodlr_1001/data/gcp"
    for l in {0..8}; do let m=l+1; wget -c $base/global_cumul_avg_cyl_${l}0n${m}0n_002.tab; done    
    for l in {9..1}; do let m=l-1; wget -c $base/global_cumul_avg_cyl_${l}0s${m}0s_002.tab; done 
}

avg() { 
    awk '{ f=1/581.9; e=2*f-f^2; r=atan2(0,-1)/180
        T[$1]+=$3; N[$1]+=1; A[$1]+=r/2876.71*(1-e)*cos(r*$1)/(1-e*sin(r*$1)^2)^2
    } END { for (L in T) print L" "T[L]/N[L]" "A[L]}' | sort -n | awk '{
        printf "%6.2f %7.3f %.9f\n", $1, $2, $3; T+=$2*$3
    } END {printf "\nMoonT: %7.3f\n", T}'
}

calc_ucla() {
    awk '{T[$2" "$1]+=$3;N[$2" "$1]+=1} END {for (L in T) print L" "T[L]/N[L]}' *.xyz | avg 
}

calc_wustl() {
    awk -F, '$11>0{T[$2" "$1]+=$11;N[$2" "$1]+=1} END {for (L in T) print L" "T[L]/N[L]}' *.tab | avg          
}

Source the file and download all:

> source moontemp.sh
> download1
> download2

Run UCLA data:

> calc_ucla

-89.75 110.538 0.000019126
-89.25 111.747 0.000057376
-88.75 100.500 0.000095622
-88.25  86.845 0.000133861
-87.75  95.390 0.000172088
-87.25  93.020 0.000210302
-86.75 102.408 0.000248500
-86.25 111.180 0.000286677
-85.75 115.274 0.000324832
-85.25 116.388 0.000362962
-84.75 108.500 0.000401062
-84.25 114.433 0.000439131
-83.75 121.392 0.000477164
-83.25 127.846 0.000515160
-82.75 131.893 0.000553116
-82.25 128.598 0.000591027
-81.75 129.399 0.000628891
-81.25 128.545 0.000666706
-80.75 133.838 0.000704468
-80.25 136.010 0.000742174
-79.75 138.313 0.000779822
-79.25 137.705 0.000817407
-78.75 138.701 0.000854928
-78.25 139.370 0.000892381
-77.75 144.802 0.000929764
-77.25 146.782 0.000967073
-76.75 144.926 0.001004310
-76.25 148.518 0.001041460
-75.75 149.069 0.001078530
-75.25 152.516 0.001115520
-74.75 152.217 0.001152410
-74.25 152.631 0.001189220
-73.75 152.798 0.001225930
-73.25 155.469 0.001262550
-72.75 156.950 0.001299060
-72.25 159.616 0.001335480
-71.75 160.702 0.001371780
-71.25 161.436 0.001407980
-70.75 161.486 0.001444070
-70.25 161.645 0.001480050
-69.75 163.806 0.001515910
-69.25 164.137 0.001551640
-68.75 165.462 0.001587260
-68.25 167.220 0.001622750
-67.75 168.373 0.001658120
-67.25 168.937 0.001693350
-66.75 170.094 0.001728450
-66.25 169.680 0.001763410
-65.75 170.751 0.001798240
-65.25 173.129 0.001832920
-64.75 173.381 0.001867460
-64.25 173.632 0.001901860
-63.75 173.722 0.001936100
-63.25 174.360 0.001970190
-62.75 175.614 0.002004130
-62.25 177.573 0.002037910
-61.75 179.016 0.002071530
-61.25 178.989 0.002104990
-60.75 179.494 0.002138290
-60.25 179.960 0.002171410
-59.75 181.011 0.002204370
-59.25 182.461 0.002237150
-58.75 182.690 0.002269760
-58.25 183.011 0.002302190
-57.75 183.615 0.002334440
-57.25 184.528 0.002366510
-56.75 185.143 0.002398400
-56.25 186.270 0.002430100
-55.75 186.498 0.002461600
-55.25 187.116 0.002492920
-54.75 187.195 0.002524040
-54.25 188.222 0.002554970
-53.75 188.844 0.002585690
-53.25 189.003 0.002616220
-52.75 189.486 0.002646540
-52.25 190.722 0.002676650
-51.75 191.521 0.002706560
-51.25 192.018 0.002736260
-50.75 192.537 0.002765740
-50.25 192.778 0.002795010
-49.75 193.104 0.002824070
-49.25 193.643 0.002852900
-48.75 193.701 0.002881510
-48.25 194.512 0.002909900
-47.75 194.982 0.002938070
-47.25 195.687 0.002966010
-46.75 196.402 0.002993710
-46.25 196.603 0.003021190
-45.75 197.278 0.003048430
-45.25 197.688 0.003075440
-44.75 197.725 0.003102210
-44.25 198.564 0.003128740
-43.75 199.242 0.003155030
-43.25 199.568 0.003181080
-42.75 199.526 0.003206880
-42.25 199.664 0.003232430
-41.75 200.097 0.003257730
-41.25 200.912 0.003282790
-40.75 201.339 0.003307590
-40.25 202.002 0.003332140
-39.75 202.196 0.003356430
-39.25 202.276 0.003380460
-38.75 202.741 0.003404230
-38.25 203.329 0.003427750
-37.75 203.479 0.003451000
-37.25 203.857 0.003473980
-36.75 204.337 0.003496700
-36.25 204.448 0.003519150
-35.75 204.869 0.003541330
-35.25 205.275 0.003563240
-34.75 205.667 0.003584880
-34.25 205.934 0.003606240
-33.75 206.271 0.003627330
-33.25 206.483 0.003648140
-32.75 206.833 0.003668680
-32.25 207.007 0.003688930
-31.75 207.205 0.003708900
-31.25 207.485 0.003728590
-30.75 207.829 0.003747990
-30.25 208.459 0.003767110
-29.75 208.617 0.003785940
-29.25 208.895 0.003804480
-28.75 209.002 0.003822740
-28.25 208.976 0.003840700
-27.75 209.538 0.003858370
-27.25 209.768 0.003875750
-26.75 210.352 0.003892830
-26.25 210.413 0.003909620
-25.75 210.867 0.003926110
-25.25 211.035 0.003942300
-24.75 211.327 0.003958190
-24.25 211.227 0.003973790
-23.75 211.620 0.003989080
-23.25 212.096 0.004004070
-22.75 212.105 0.004018760
-22.25 212.073 0.004033140
-21.75 212.344 0.004047210
-21.25 212.649 0.004060980
-20.75 212.928 0.004074450
-20.25 213.122 0.004087600
-19.75 213.306 0.004100450
-19.25 213.395 0.004112980
-18.75 213.606 0.004125210
-18.25 213.724 0.004137120
-17.75 213.809 0.004148720
-17.25 214.029 0.004160010
-16.75 214.091 0.004170990
-16.25 214.020 0.004181650
-15.75 214.188 0.004191990
-15.25 214.468 0.004202020
-14.75 214.733 0.004211730
-14.25 214.766 0.004221120
-13.75 214.743 0.004230200
-13.25 215.075 0.004238950
-12.75 215.175 0.004247390
-12.25 215.205 0.004255510
-11.75 215.291 0.004263300
-11.25 215.631 0.004270780
-10.75 215.731 0.004277940
-10.25 215.824 0.004284770
 -9.75 215.847 0.004291280
 -9.25 216.019 0.004297470
 -8.75 216.254 0.004303330
 -8.25 216.301 0.004308870
 -7.75 216.361 0.004314090
 -7.25 216.454 0.004318980
 -6.75 216.384 0.004323550
 -6.25 216.570 0.004327790
 -5.75 216.539 0.004331710
 -5.25 216.662 0.004335300
 -4.75 216.713 0.004338570
 -4.25 216.684 0.004341510
 -3.75 216.822 0.004344120
 -3.25 217.028 0.004346410
 -2.75 217.014 0.004348370
 -2.25 217.030 0.004350000
 -1.75 217.149 0.004351310
 -1.25 217.148 0.004352290
 -0.75 217.118 0.004352940
 -0.25 217.131 0.004353270
  0.25 217.131 0.004353270
  0.75 217.240 0.004352940
  1.25 217.215 0.004352290
  1.75 217.240 0.004351310
  2.25 217.256 0.004350000
  2.75 217.107 0.004348370
  3.25 217.113 0.004346410
  3.75 217.040 0.004344120
  4.25 216.983 0.004341510
  4.75 216.931 0.004338570
  5.25 216.871 0.004335300
  5.75 216.831 0.004331710
  6.25 216.701 0.004327790
  6.75 216.656 0.004323550
  7.25 216.717 0.004318980
  7.75 216.789 0.004314090
  8.25 216.628 0.004308870
  8.75 216.441 0.004303330
  9.25 216.380 0.004297470
  9.75 216.352 0.004291280
 10.25 216.287 0.004284770
 10.75 216.194 0.004277940
 11.25 216.113 0.004270780
 11.75 215.904 0.004263300
 12.25 215.766 0.004255510
 12.75 215.754 0.004247390
 13.25 215.668 0.004238950
 13.75 215.504 0.004230200
 14.25 215.449 0.004221120
 14.75 215.361 0.004211730
 15.25 215.048 0.004202020
 15.75 214.742 0.004191990
 16.25 214.674 0.004181650
 16.75 214.590 0.004170990
 17.25 214.502 0.004160010
 17.75 214.531 0.004148720
 18.25 214.349 0.004137120
 18.75 214.273 0.004125210
 19.25 214.167 0.004112980
 19.75 213.801 0.004100450
 20.25 213.556 0.004087600
 20.75 213.123 0.004074450
 21.25 213.120 0.004060980
 21.75 212.942 0.004047210
 22.25 212.636 0.004033140
 22.75 212.513 0.004018760
 23.25 212.112 0.004004070
 23.75 211.977 0.003989080
 24.25 211.909 0.003973790
 24.75 211.629 0.003958190
 25.25 211.476 0.003942300
 25.75 211.027 0.003926110
 26.25 210.862 0.003909620
 26.75 210.510 0.003892830
 27.25 210.405 0.003875750
 27.75 210.150 0.003858370
 28.25 209.810 0.003840700
 28.75 209.589 0.003822740
 29.25 209.128 0.003804480
 29.75 209.121 0.003785940
 30.25 208.887 0.003767110
 30.75 208.520 0.003747990
 31.25 208.118 0.003728590
 31.75 207.895 0.003708900
 32.25 207.487 0.003688930
 32.75 207.224 0.003668680
 33.25 206.852 0.003648140
 33.75 206.578 0.003627330
 34.25 206.238 0.003606240
 34.75 205.897 0.003584880
 35.25 205.553 0.003563240
 35.75 205.236 0.003541330
 36.25 204.862 0.003519150
 36.75 204.382 0.003496700
 37.25 204.062 0.003473980
 37.75 203.723 0.003451000
 38.25 203.273 0.003427750
 38.75 202.940 0.003404230
 39.25 202.578 0.003380460
 39.75 201.890 0.003356430
 40.25 201.613 0.003332140
 40.75 201.259 0.003307590
 41.25 200.688 0.003282790
 41.75 200.437 0.003257730
 42.25 200.291 0.003232430
 42.75 199.786 0.003206880
 43.25 199.203 0.003181080
 43.75 198.752 0.003155030
 44.25 198.529 0.003128740
 44.75 198.064 0.003102210
 45.25 197.678 0.003075440
 45.75 196.640 0.003048430
 46.25 196.054 0.003021190
 46.75 195.885 0.002993710
 47.25 195.866 0.002966010
 47.75 195.388 0.002938070
 48.25 194.087 0.002909900
 48.75 193.621 0.002881510
 49.25 193.414 0.002852900
 49.75 193.144 0.002824070
 50.25 192.485 0.002795010
 50.75 191.851 0.002765740
 51.25 191.477 0.002736260
 51.75 190.991 0.002706560
 52.25 190.311 0.002676650
 52.75 189.843 0.002646540
 53.25 189.517 0.002616220
 53.75 188.474 0.002585690
 54.25 187.944 0.002554970
 54.75 187.390 0.002524040
 55.25 187.308 0.002492920
 55.75 186.351 0.002461600
 56.25 185.324 0.002430100
 56.75 185.153 0.002398400
 57.25 184.717 0.002366510
 57.75 184.195 0.002334440
 58.25 183.457 0.002302190
 58.75 182.550 0.002269760
 59.25 181.897 0.002237150
 59.75 180.882 0.002204370
 60.25 180.425 0.002171410
 60.75 179.733 0.002138290
 61.25 178.569 0.002104990
 61.75 177.431 0.002071530
 62.25 177.406 0.002037910
 62.75 176.133 0.002004130
 63.25 174.992 0.001970190
 63.75 174.263 0.001936100
 64.25 174.045 0.001901860
 64.75 173.420 0.001867460
 65.25 172.958 0.001832920
 65.75 171.647 0.001798240
 66.25 170.659 0.001763410
 66.75 168.890 0.001728450
 67.25 168.258 0.001693350
 67.75 167.299 0.001658120
 68.25 166.353 0.001622750
 68.75 165.771 0.001587260
 69.25 164.839 0.001551640
 69.75 164.604 0.001515910
 70.25 163.729 0.001480050
 70.75 162.392 0.001444070
 71.25 160.312 0.001407980
 71.75 159.079 0.001371780
 72.25 158.767 0.001335480
 72.75 158.000 0.001299060
 73.25 155.641 0.001262550
 73.75 154.728 0.001225930
 74.25 152.357 0.001189220
 74.75 152.286 0.001152410
 75.25 152.446 0.001115520
 75.75 151.423 0.001078530
 76.25 149.418 0.001041460
 76.75 147.758 0.001004310
 77.25 145.345 0.000967073
 77.75 145.334 0.000929764
 78.25 143.694 0.000892381
 78.75 143.395 0.000854928
 79.25 141.426 0.000817407
 79.75 139.245 0.000779822
 80.25 134.396 0.000742174
 80.75 134.638 0.000704468
 81.25 132.923 0.000666706
 81.75 131.281 0.000628891
 82.25 128.608 0.000591027
 82.75 125.593 0.000553116
 83.25 124.031 0.000515160
 83.75 122.858 0.000477164
 84.25 117.375 0.000439131
 84.75 110.464 0.000401062
 85.25 111.891 0.000362962
 85.75 113.233 0.000324832
 86.25 109.398 0.000286677
 86.75 113.166 0.000248500
 87.25 109.039 0.000210302
 87.75  98.730 0.000172088
 88.25  95.876 0.000133861
 88.75  86.159 0.000095622
 89.25  89.165 0.000057376
 89.75 109.777 0.000019126

MoonT: 201.082

The first column is latitude, the 2nd is the average temperature for that 0.5°-wide latitude band, and the 3rd is the surface area proportion of that band. The 3rd column adds up to 1. MoonT is the area-weighted mean over all latitudes.
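As a quick sanity check, redirect the table above to a file (I'll call it ucla.txt here, purely for illustration) and confirm that the weights sum to about 1 and that the area-weighted sum reproduces MoonT:

> awk 'NF==3 { A+=$3; T+=$2*$3 } END { printf "Area: %.6f  MoonT: %7.3f\n", A, T }' ucla.txt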

Now we do the same for WUSTL data:

> calc_wustl

-89.75 111.192 0.000019126
-89.25 111.660 0.000057376
-88.75 100.290 0.000095622
-88.25  86.871 0.000133861
-87.75  95.084 0.000172088
-87.25  92.831 0.000210302
-86.75 102.601 0.000248500
-86.25 111.243 0.000286677
-85.75 115.981 0.000324832
-85.25 116.198 0.000362962
-84.75 108.497 0.000401062
-84.25 115.692 0.000439131
-83.75 123.413 0.000477164
-83.25 130.003 0.000515160
-82.75 133.344 0.000553116
-82.25 129.484 0.000591027
-81.75 129.702 0.000628891
-81.25 128.091 0.000666706
-80.75 133.846 0.000704468
-80.25 136.168 0.000742174
-79.75 138.256 0.000779822
-79.25 137.502 0.000817407
-78.75 138.429 0.000854928
-78.25 138.730 0.000892381
-77.75 143.939 0.000929764
-77.25 145.831 0.000967073
-76.75 144.331 0.001004310
-76.25 147.966 0.001041460
-75.75 148.465 0.001078530
-75.25 152.140 0.001115520
-74.75 151.776 0.001152410
-74.25 152.297 0.001189220
-73.75 152.585 0.001225930
-73.25 155.127 0.001262550
-72.75 156.952 0.001299060
-72.25 159.367 0.001335480
-71.75 160.486 0.001371780
-71.25 161.450 0.001407980
-70.75 161.246 0.001444070
-70.25 161.774 0.001480050
-69.75 163.612 0.001515910
-69.25 163.996 0.001551640
-68.75 165.422 0.001587260
-68.25 167.346 0.001622750
-67.75 168.544 0.001658120
-67.25 169.355 0.001693350
-66.75 170.667 0.001728450
-66.25 170.258 0.001763410
-65.75 171.491 0.001798240
-65.25 174.150 0.001832920
-64.75 174.220 0.001867460
-64.25 174.762 0.001901860
-63.75 174.905 0.001936100
-63.25 175.696 0.001970190
-62.75 177.189 0.002004130
-62.25 179.499 0.002037910
-61.75 180.929 0.002071530
-61.25 181.092 0.002104990
-60.75 181.633 0.002138290
-60.25 182.008 0.002171410
-59.75 183.258 0.002204370
-59.25 184.545 0.002237150
-58.75 184.681 0.002269760
-58.25 184.960 0.002302190
-57.75 185.818 0.002334440
-57.25 187.257 0.002366510
-56.75 187.572 0.002398400
-56.25 188.767 0.002430100
-55.75 188.732 0.002461600
-55.25 189.269 0.002492920
-54.75 189.528 0.002524040
-54.25 190.825 0.002554970
-53.75 191.269 0.002585690
-53.25 191.363 0.002616220
-52.75 191.864 0.002646540
-52.25 193.107 0.002676650
-51.75 194.078 0.002706560
-51.25 194.501 0.002736260
-50.75 194.990 0.002765740
-50.25 195.240 0.002795010
-49.75 195.366 0.002824070
-49.25 196.196 0.002852900
-48.75 196.037 0.002881510
-48.25 196.876 0.002909900
-47.75 197.586 0.002938070
-47.25 198.416 0.002966010
-46.75 199.182 0.002993710
-46.25 199.469 0.003021190
-45.75 199.996 0.003048430
-45.25 200.541 0.003075440
-44.75 200.536 0.003102210
-44.25 201.071 0.003128740
-43.75 201.588 0.003155030
-43.25 201.680 0.003181080
-42.75 201.665 0.003206880
-42.25 201.937 0.003232430
-41.75 202.546 0.003257730
-41.25 203.719 0.003282790
-40.75 204.205 0.003307590
-40.25 204.736 0.003332140
-39.75 204.970 0.003356430
-39.25 204.972 0.003380460
-38.75 205.850 0.003404230
-38.25 206.663 0.003427750
-37.75 206.584 0.003451000
-37.25 207.040 0.003473980
-36.75 207.439 0.003496700
-36.25 207.656 0.003519150
-35.75 208.246 0.003541330
-35.25 208.680 0.003563240
-34.75 209.273 0.003584880
-34.25 209.473 0.003606240
-33.75 209.612 0.003627330
-33.25 209.509 0.003648140
-32.75 209.973 0.003668680
-32.25 209.766 0.003688930
-31.75 210.415 0.003708900
-31.25 211.005 0.003728590
-30.75 211.611 0.003747990
-30.25 212.332 0.003767110
-29.75 212.587 0.003785940
-29.25 212.645 0.003804480
-28.75 212.918 0.003822740
-28.25 213.020 0.003840700
-27.75 213.443 0.003858370
-27.25 213.599 0.003875750
-26.75 214.268 0.003892830
-26.25 214.016 0.003909620
-25.75 214.252 0.003926110
-25.25 214.595 0.003942300
-24.75 215.184 0.003958190
-24.25 214.881 0.003973790
-23.75 215.254 0.003989080
-23.25 215.482 0.004004070
-22.75 214.994 0.004018760
-22.25 215.303 0.004033140
-21.75 215.699 0.004047210
-21.25 215.885 0.004060980
-20.75 216.148 0.004074450
-20.25 216.335 0.004087600
-19.75 215.992 0.004100450
-19.25 216.290 0.004112980
-18.75 216.212 0.004125210
-18.25 216.517 0.004137120
-17.75 216.406 0.004148720
-17.25 216.614 0.004160010
-16.75 216.567 0.004170990
-16.25 216.449 0.004181650
-15.75 216.859 0.004191990
-15.25 217.096 0.004202020
-14.75 217.008 0.004211730
-14.25 216.947 0.004221120
-13.75 216.897 0.004230200
-13.25 216.589 0.004238950
-12.75 216.806 0.004247390
-12.25 216.571 0.004255510
-11.75 216.447 0.004263300
-11.25 216.478 0.004270780
-10.75 216.249 0.004277940
-10.25 216.115 0.004284770
 -9.75 216.419 0.004291280
 -9.25 216.589 0.004297470
 -8.75 216.901 0.004303330
 -8.25 216.549 0.004308870
 -7.75 216.389 0.004314090
 -7.25 216.261 0.004318980
 -6.75 216.297 0.004323550
 -6.25 216.072 0.004327790
 -5.75 216.128 0.004331710
 -5.25 215.923 0.004335300
 -4.75 215.805 0.004338570
 -4.25 215.933 0.004341510
 -3.75 216.061 0.004344120
 -3.25 216.258 0.004346410
 -2.75 216.291 0.004348370
 -2.25 215.959 0.004350000
 -1.75 215.765 0.004351310
 -1.25 215.792 0.004352290
 -0.75 215.759 0.004352940
 -0.25 215.570 0.004353270
  0.25 215.419 0.004353270
  0.75 215.150 0.004352940
  1.25 215.066 0.004352290
  1.75 214.871 0.004351310
  2.25 214.831 0.004350000
  2.75 214.809 0.004348370
  3.25 214.578 0.004346410
  3.75 214.599 0.004344120
  4.25 214.262 0.004341510
  4.75 213.976 0.004338570
  5.25 213.812 0.004335300
  5.75 213.690 0.004331710
  6.25 213.126 0.004327790
  6.75 212.800 0.004323550
  7.25 212.827 0.004318980
  7.75 212.963 0.004314090
  8.25 212.537 0.004308870
  8.75 212.270 0.004303330
  9.25 211.869 0.004297470
  9.75 211.598 0.004291280
 10.25 211.233 0.004284770
 10.75 211.009 0.004277940
 11.25 210.931 0.004270780
 11.75 210.785 0.004263300
 12.25 210.905 0.004255510
 12.75 210.935 0.004247390
 13.25 210.643 0.004238950
 13.75 210.358 0.004230200
 14.25 210.152 0.004221120
 14.75 209.363 0.004211730
 15.25 208.773 0.004202020
 15.75 208.371 0.004191990
 16.25 208.195 0.004181650
 16.75 208.633 0.004170990
 17.25 208.545 0.004160010
 17.75 208.499 0.004148720
 18.25 208.173 0.004137120
 18.75 208.080 0.004125210
 19.25 207.456 0.004112980
 19.75 207.185 0.004100450
 20.25 206.943 0.004087600
 20.75 206.466 0.004074450
 21.25 206.488 0.004060980
 21.75 206.295 0.004047210
 22.25 205.953 0.004033140
 22.75 205.895 0.004018760
 23.25 205.286 0.004004070
 23.75 205.249 0.003989080
 24.25 204.794 0.003973790
 24.75 204.654 0.003958190
 25.25 204.500 0.003942300
 25.75 203.869 0.003926110
 26.25 203.385 0.003909620
 26.75 203.094 0.003892830
 27.25 203.295 0.003875750
 27.75 203.300 0.003858370
 28.25 203.156 0.003840700
 28.75 202.549 0.003822740
 29.25 202.184 0.003804480
 29.75 201.900 0.003785940
 30.25 201.414 0.003767110
 30.75 201.026 0.003747990
 31.25 200.339 0.003728590
 31.75 200.075 0.003708900
 32.25 199.678 0.003688930
 32.75 199.134 0.003668680
 33.25 199.242 0.003648140
 33.75 198.870 0.003627330
 34.25 198.542 0.003606240
 34.75 197.986 0.003584880
 35.25 197.710 0.003563240
 35.75 197.292 0.003541330
 36.25 196.695 0.003519150
 36.75 196.182 0.003496700
 37.25 195.985 0.003473980
 37.75 195.376 0.003451000
 38.25 195.215 0.003427750
 38.75 194.945 0.003404230
 39.25 194.702 0.003380460
 39.75 194.408 0.003356430
 40.25 194.084 0.003332140
 40.75 193.454 0.003307590
 41.25 192.829 0.003282790
 41.75 192.537 0.003257730
 42.25 192.213 0.003232430
 42.75 191.841 0.003206880
 43.25 191.122 0.003181080
 43.75 190.537 0.003155030
 44.25 190.296 0.003128740
 44.75 190.286 0.003102210
 45.25 189.988 0.003075440
 45.75 189.450 0.003048430
 46.25 188.734 0.003021190
 46.75 188.502 0.002993710
 47.25 188.434 0.002966010
 47.75 187.969 0.002938070
 48.25 186.444 0.002909900
 48.75 186.200 0.002881510
 49.25 186.058 0.002852900
 49.75 185.592 0.002824070
 50.25 185.071 0.002795010
 50.75 184.626 0.002765740
 51.25 184.254 0.002736260
 51.75 184.047 0.002706560
 52.25 183.452 0.002676650
 52.75 182.817 0.002646540
 53.25 182.411 0.002616220
 53.75 181.747 0.002585690
 54.25 181.146 0.002554970
 54.75 180.724 0.002524040
 55.25 180.913 0.002492920
 55.75 180.215 0.002461600
 56.25 179.626 0.002430100
 56.75 179.553 0.002398400
 57.25 178.901 0.002366510
 57.75 178.362 0.002334440
 58.25 177.475 0.002302190
 58.75 176.981 0.002269760
 59.25 176.461 0.002237150
 59.75 176.088 0.002204370
 60.25 175.993 0.002171410
 60.75 175.367 0.002138290
 61.25 174.353 0.002104990
 61.75 173.586 0.002071530
 62.25 173.660 0.002037910
 62.75 172.696 0.002004130
 63.25 171.946 0.001970190
 63.75 171.428 0.001936100
 64.25 171.687 0.001901860
 64.75 171.416 0.001867460
 65.25 171.483 0.001832920
 65.75 170.581 0.001798240
 66.25 169.589 0.001763410
 66.75 168.143 0.001728450
 67.25 167.852 0.001693350
 67.75 167.419 0.001658120
 68.25 166.735 0.001622750
 68.75 166.388 0.001587260
 69.25 165.532 0.001551640
 69.75 165.550 0.001515910
 70.25 164.822 0.001480050
 70.75 163.915 0.001444070
 71.25 162.093 0.001407980
 71.75 161.380 0.001371780
 72.25 161.240 0.001335480
 72.75 160.716 0.001299060
 73.25 158.450 0.001262550
 73.75 157.668 0.001225930
 74.25 155.624 0.001189220
 74.75 155.580 0.001152410
 75.25 156.122 0.001115520
 75.75 155.565 0.001078530
 76.25 154.319 0.001041460
 76.75 153.155 0.001004310
 77.25 150.936 0.000967073
 77.75 151.168 0.000929764
 78.25 149.753 0.000892381
 78.75 148.941 0.000854928
 79.25 146.770 0.000817407
 79.75 143.766 0.000779822
 80.25 138.198 0.000742174
 80.75 137.521 0.000704468
 81.25 134.825 0.000666706
 81.75 132.612 0.000628891
 82.25 129.110 0.000591027
 82.75 124.871 0.000553116
 83.25 122.797 0.000515160
 83.75 121.661 0.000477164
 84.25 116.752 0.000439131
 84.75 110.306 0.000401062
 85.25 111.987 0.000362962
 85.75 113.071 0.000324832
 86.25 109.478 0.000286677
 86.75 113.206 0.000248500
 87.25 108.943 0.000210302
 87.75  98.809 0.000172088
 88.25  96.164 0.000133861
 88.75  86.252 0.000095622
 89.25  89.380 0.000057376
 89.75 110.643 0.000019126

MoonT: 199.386

It’s hard to choose which dataset is better. The former is complete hourly data, while the latter is 15-minute data with many missing pieces.

The results are 201.082 K and 199.386 K.

The average of the two results is 200.2 K, and I will leave it at that.
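For completeness, the averaging itself:

> awk 'BEGIN { printf "%.1f K\n", (201.082 + 199.386)/2 }'

200.2 K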

Answer = 200.2 K

Enjoy 🙂 -Zoe

Update

I was curious about Equations 12 and 13 of [Nikolov & Zeller 2014], so I wrote a little bit more code into moontemp.sh above:

eq12_13() {
    seq 0.05 0.1 89.95 | awk '{ R=atan2(0,-1)/180; L=R*$1
        T=216.313+9.919*L-119.814*L^2+307.116*L^3-466.244*L^4+321.317*L^5-84.973*L^6
        AVG+=T*cos(L)*0.1*R
    } END { printf "MoonT: %7.3f\n", AVG }'
}

I used 216.313 K as my average equatorial temperature, which I got by averaging UCLA & WUSTL data between -0.25 and 0.25 latitude.
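That equatorial figure is simply the mean of the four values straddling the equator in the two tables above (UCLA and WUSTL at -0.25 and 0.25 latitude), which rounds to 216.313:

> awk 'BEGIN { printf "%.5f\n", (217.131 + 217.131 + 215.570 + 215.419)/4 }'

216.31275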

> . moontemp.sh; eq12_13

MoonT: 200.777

The result is only off by half a degree from our answer. This formula is pretty good.

Pressure Change and Real Standard Pressure

The standard mean sea level pressure is defined as 101.325 kPa. This is the standard used in the US and International Standard Atmosphere, and the value is quoted everywhere. In reality, this value was agreed upon by committee and at no time represented the true mean sea level pressure. It most certainly does not represent the true value today. Today I will try to calculate what the real value should be. I will be using data from NOAA's ESRL (Earth System Research Laboratory).

You will need gnuplot:

sudo apt install gnuplot

pres.sh:

# bash pres.sh
# Zoe Phin, 2020/02/05

wget -qO- 'https://www.esrl.noaa.gov/psd/cgi-bin/data/timeseries/timeseries.pl?ntype=1&var=Pressure&level=2000&lat1=-90&lat2=90&lon1=0&lon2=360&iseas=1&mon1=0&mon2=11&iarea=1&typeout=1&Submit=Create+Timeseries' | awk '$1>1947&&$1<2020{print $1" "$2/10}' | sed 1d > surpres.txt

wget -qO- 'https://www.esrl.noaa.gov/psd/cgi-bin/data/timeseries/timeseries.pl?ntype=1&var=Sea+Level+Pressure&level=2000&lat1=-90&lat2=90&lon1=0&lon2=360&iseas=1&mon1=0&mon2=11&iarea=1&typeout=1&Submit=Create+Timeseries' |  awk '$1>1947&&$1<2020{print $1" "$2/10}' | sed 1d > seapres.txt

echo 'set term png size 740,740; set key below
set title "Pressure (kPa)"; set xrange [1947 to 2020]
set yrange [98.58 to 98.46]; set format y "%5.2f"
set y2range [101.24 to 101.12]; set format y2 "%6.2f"
set grid xtics mxtics ytics y2tics mytics my2tics
set xtics 10; set mxtics 2; set ytics 0.02; set y2tics 0.02; set mytics 2; set my2tics 2
plot "surpres.txt" u 1:2 axes x1y1 t "Surface (Left)" w lines lt 1 lw 2 lc rgb "red",\
     "seapres.txt" u 1:2 axes x1y2 t "Sealevel (Right)" w lines lt 1 lw 2 lc rgb "blue"' | gnuplot > pres.png

Run it:

> bash pres.sh

Result is three files: surpres.txt, seapres.txt, and pres.png

Mean Surface & Sealevel Pressure (kPa)

I have trouble believing global data before the 1979 full global satellite era. In any case we see that both surface and sea-level pressure have been decreasing. This is odd considering that temperatures have been going up, but I will not go into that today.

To figure out the real mean sea-level pressure I will simply average the data between 1979 and 2019 (inclusive):

>  awk '$1>=1979 && $1<=2019 { SUM+=$2; NUM+=1 } END {print SUM/NUM}' seapres.txt

101.159

The mean sea-level pressure in the post-satellite era is: 101.159 kPa and NOT 101.325 kPa.

All those using the committee-established value of 101.325 kPa will not be reflecting reality.

The average surface pressure between 1979 & 2019 is 98.4976 kPa.
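That figure comes from the same averaging applied to surpres.txt:

> awk '$1>=1979 && $1<=2019 { SUM+=$2; NUM+=1 } END {print SUM/NUM}' surpres.txt

98.4976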

Enjoy 🙂 -Zoe

North and South Hemisphere

Today I will analyze some differences between the north and south hemisphere. I'll be using NCEP's Long Term Mean Air Surface Temperature for 1979-2017, and NASA's ISCCP Project Insolation data from 1983-2009. Sure, the years don't overlap, but we are using long term averages anyway and don't care about the time trend. First we need one tool:

$ sudo apt-get install nco

Create a new file called hemi.sh, with the following:

# source hemi.sh
# Zoe Phin, 2020/01/24

download() {             
    wget -O air.nc -c ftp://ftp.cdc.noaa.gov/Datasets/ncep/air.sfc.day.ltm.nc
    wget -O wtr.nc -c http://research.jisao.washington.edu/data_sets/elevation/fractional_land.1-deg.nc
    wget -O ele.nc -c http://research.jisao.washington.edu/data_sets/elevation/elev.1-deg.nc
    wget -O sup.fl -c https://isccp.giss.nasa.gov/pub/data/FC/FDAVGANN__SWFLSRFDW
    wget -O sdn.fl -c https://isccp.giss.nasa.gov/pub/data/FC/FDAVGANN__SWFLSRFUW
}

water() { # Arg: 1 - Min Latitude, 2 - Max Latitude
    ncks --trd -HC wtr.nc -v data | awk -F[=\ ] -vm=$1 -vM=$2 '   
        $4!=NIL && $4>m && $4 <M {
            a = 6378138; b = 6356753; e = 1-(b/a)^2; r = atan2(0,-1)/180
            A = (a*r)^2*(1-e)*cos(r*$4)/(1-e*sin(r*$4)^2)^2/510072e9
            printf "%6.2f %5.1f %.9f %6.2f\n", $4, $6, A, $8/10000
        }' | awk '
            { A+=$3; L+=$3*$4 } END { printf "Water Fraction: %7.4f\n", 1-L/A }'
}

elev() { # Arg: 1 - Min Latitude, 2 - Max Latitude
    ncks --trd -HC ele.nc -v data | awk -F[=\ ] -vm=$1 -vM=$2 '
        $4!=NIL && $4>m && $4 <M {
            a = 6378138; b = 6356753; e = 1-(b/a)^2; r = atan2(0,-1)/180
            A = (a*r)^2*(1-e)*cos(r*$4)/(1-e*sin(r*$4)^2)^2/510072e9
            if ($8 < 0) $8 = 0
            printf "%6.2f %5.1f %.9f %6.2f\n", $4, $6, A, $8
        }' | awk '
            { A+=$3; E+=$3*$4 } END { printf "Avg Elevation: %7.4f\n", E/A }'
}

solar() { # Arg: 1 - N or S, Empty Arg = All
    od -An -w4 -f --endian=big sup.fl > .sup
    od -An -w4 -f --endian=big sdn.fl > .sdn

    H='1,6596p'; D=3298; 
    [[ -z $1 ]] && D=6596
    [[ $1 = "S" ]] && H='1,3298p'
    [[ $1 = "N" ]] && H='3299,$p'

    paste .sup .sdn | sed -n $H | awk '{
        printf "%7.3f %7.3f %7.3f %7.3f\n", $1, $2, $1-$2, 1-$2/$1
    }' | awk -vD=$D '{UP+=$1;DN+=$2;NT+=$3;AB+=$4} END {
        print "Averages:" 
        printf "%7.3f %7.3f %7.3f %7.3f\n", UP/D, DN/D, NT/D, AB/D
    }'
}

temp() {    # Arg: 1 - Min Latitude, 2 - Max Latitude
    for d in `seq 0 364`; do
        ncks --trd -HC air.nc -v air -d time,$d |\
        awk -F[=\ ] -vm=$1 -vM=$2 '$4!=NIL && $4>=m && $4<=M {
            if ($4 < 0) { $4 += 1.25 } else { $4 -= 1.25 }
            a = 6378138; b = 6356753; e = 1-(b/a)^2; r = atan2(0,-1)/180
            A = (a*2.5*r)^2*(1-e)*cos(r*$4)/(1-e*sin(r*$4)^2)^2/510072e9

            printf "%6.2f %5.1f %.9f %6.2f\n", $4, $6, A, $8/100+477.65-273.16
        }' | awk -vd=$d '
            { A+=$3; T+=$3*$4 } END { printf "%03d %7.4f\n", d+1, T/A }'
    done
}       

tempsavg() {    # Not generic
    for f in ans nor sou; do
        cat $f.csv | awk '{S+=$2}END{print S/NR}'
    done
}

tempsplot() {   # Not generic
    echo "set term png size 740,370;set grid;set key below;set xrange [0:365]
    set title 'Long Term Day of the Year Mean (°C)'; set xtics 30
    plot 'ans.csv' u 1:2 t 'Whole' w lines lw 2 lc rgb 'black',\
         'nor.csv' u 1:2 t 'North' w lines lw 2 lc rgb 'orange',\
         'sou.csv' u 1:2 t 'South' w lines lw 2 lc rgb 'blue'" | gnuplot > allhemi.png
}

We will source the code to have its functions run as separate command-line commands:

$ source hemi.sh

First we download the necessary data:

$ download

We extract the necessary data:

$ temp -90 90 > ans.csv  # Whole Earth
$ temp 0 90 > nor.csv    # North hemisphere
$ temp -90 0 > sou.csv   # South hemisphere

We plot the data:

$ tempsplot

A file called allhemi.png will be generated:

Day of the Year Mean (1979-2017), 0 = Jan 1st

We can see that there’s a lot more fluctuation in the north than south hemisphere. This is most likely due to more ocean in the south having a moderating influence. Let’s see what the actual averages are:

$ tempsavg

# Result:
# 14.9809 - Whole
# 15.6322 - North
# 14.3295 - South

The north is actually over a degree warmer than the south. I did not expect that. I would’ve thought that more ocean would have made it warmer. Let’s move on to elevation analysis.

$ elev -90 90    # Whole Earth

# Avg Elevation: 232.8598 (meters)

$ elev 0 90      # North Hemisphere

# Avg Elevation: 273.7237 (meters)

$ elev -90 0     # South hemisphere

# Avg Elevation: 191.9959 (meters)

The northern hemisphere is on average 81.7 meters higher. From my previous article Air Temperatures and Average Lapse Rate, we learned that the average lapse rate is ~0.0056 °C/m. 81.7 * 0.0056 = 0.458 °C advantage for the south.
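A quick back-of-the-envelope check of that adjustment, using the two hemispheric elevation averages above:

$ awk 'BEGIN { d = 273.7237 - 191.9959; printf "%.1f m * 0.0056 C/m = %.3f C\n", d, d*0.0056 }'

81.7 m * 0.0056 C/m = 0.458 C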

Now let’s take a look at the water fraction:

$ water -90 90   # Whole Earth

Water Fraction:  0.7110

$ water 0 90     # North Hemisphere

Water Fraction:  0.6092

$ water -90 0    # South Hemisphere

Water Fraction:  0.8127

The southern hemisphere has 0.8127/0.6092 = 33 % MORE water than the northern hemisphere.
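The percentage works out like this:

$ awk 'BEGIN { printf "%.0f %%\n", (0.8127/0.6092 - 1)*100 }'

33 %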

Now we do insolation analysis. I expect that the south will receive more insolation given that perihelion occurs while the sun is in the south, and aphelion occurs while the sun is in the north.

$ solar       # Whole Earth
Averages:
189.141  23.309 165.832   0.854

$ solar N     # North hemisphere
Averages:
187.909  24.750 163.159   0.847

$ solar S     # South hemisphere
Averages:
190.373  21.868 168.504   0.861

The results are best explained in a table:

      | Downwelling | Surface Albedo | Net Solar  | Absorption
------+-------------+----------------+------------+-----------
Whole | 189.1 W/m²  |     23.3 W/m²  | 165.8 W/m² |    85.4 %
North | 187.9 W/m²  |     24.8 W/m²  | 163.2 W/m² |    84.7 %
South | 190.4 W/m²  |     21.9 W/m²  | 168.5 W/m² |    86.1 %
Insolation

The southern hemisphere has a higher absorption fraction, higher net insolation, more water and lower elevation. It has every advantage to be hotter than the northern hemisphere, and yet it is not: it is 15.6322 °C (North) - 14.3295 °C (South) = 1.3 °C cooler. How come?

Who can solve this mystery?

Enjoy 🙂 -Zoe

Update 2020/01/31

110 views and two dozen comments later, nobody has solved the mystery. The best answer was the heat capacity difference of land and water, but as pointed out, heat capacity controls both the heating and the cooling rate.

Astute readers of this blog may have guessed where I was going: geothermal. Indeed, I do think it is geothermal, and I have thought this for about half a year now.

I have a simple formula for guessing the radiative component of geothermal. It is:

RadGeo = Longwave Upwelling IR - (Shortwave Downwelling - Shortwave Upwelling)

RadGeo = Longwave Upwelling IR - Net Solar

msolver.sh:

# source msolver.sh
# Zoe Phin, 2020/01/31

download() {
    wget -O ldn.fl -c https://isccp.giss.nasa.gov/pub/data/FC/FDAVGANN__LWFLSRFUW
    wget -O sup.fl -c https://isccp.giss.nasa.gov/pub/data/FC/FDAVGANN__SWFLSRFDW
    wget -O sdn.fl -c https://isccp.giss.nasa.gov/pub/data/FC/FDAVGANN__SWFLSRFUW
}

msolver() { # Arg: 1 - N or S, Empty Arg = All
    od -An -w4 -f --endian=big ldn.fl > .ldn
    od -An -w4 -f --endian=big sup.fl > .sup
    od -An -w4 -f --endian=big sdn.fl > .sdn

    H='1,6596p'; D=3298;
    [[ -z $1 ]] && D=6596
    [[ $1 = "S" ]] && H='1,3298p'
    [[ $1 = "N" ]] && H='3299,$p'

    paste .ldn .sup .sdn | sed -n $H | awk '{
        printf "%7.3f %7.3f %7.3f %7.3f %7.3f\n", $1, $2, $3, $2-$3, $1-$2+$3
    }' | awk -vD=$D '{UP+=$1;NT+=$4;GE+=$5} END { 
        printf "%7.3f %7.3f %7.3f\n", GE/D, NT/D, UP/D            
    }'
}

Run it:

$ source msolver.sh
$ download
$ msolver             # Whole Earth - 227.632 165.832 393.463
$ msolver N           # North Hemi. - 236.605 163.159 399.764
$ msolver S           # South Hemi. - 218.659 168.504 387.163
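As an arithmetic check of the RadGeo definition, the whole-Earth value is just upwelling minus net solar (the last digit differs from the printed 227.632 only because of per-cell rounding):

$ awk 'BEGIN { printf "%.3f\n", 393.463 - 165.832 }'

227.631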

Summary Table

      | Rad. Geothermal | Net Solar  | Upwelling Result
------+-----------------+------------+-----------------
Whole |     227.6 W/m²  | 165.8 W/m² |      393.5 W/m²
North |     236.6 W/m²  | 163.2 W/m² |      399.8 W/m²
South |     218.7 W/m²  | 168.5 W/m² |      387.2 W/m²

Summary: The North has a geothermal advantage that outweighs all of the South’s non-geothermal advantages.

For those that still believe geothermal is tiny and negligible, please read my other articles to get informed:

https://phzoe.wordpress.com/2019/12/04/the-case-of-two-different-fluxes/

https://phzoe.wordpress.com/2019/12/06/measuring-geothermal-1/

Enjoy 🙂 -Zoe

Precipitable Water as Temperature Proxy

Precipitable water is a measure of how high water would stack up if all the water vapor in the atmosphere rained down, right now! It typically ranges between 22.4 and 24.2 millimeters, i.e. all the water vapor raining down would add up to about 0.9 inches.

Now a little bit of logic: the amount of water vapor in the atmosphere depends on how hot the oceans, lakes, rivers, and whatever water is on or in the ground are. The hotter, the more evaporation. Simple. Therefore precipitable water should be a good proxy for surface water temperature. Let's see what the history of precipitable water looks like. For that we go to NOAA's ESRL.

We fill out the form, like this:

And this is what we get:

Precipitable Water Column Height, in millimeters

One would think that with constant warming, we should see the precipitable water always going up. But we don’t see that. We clearly see a CYCLE here, an invisible letter U or V. In fact, it reminds me of something we discovered here:

Global Average Temperature Anomaly after Latitude Drift Adjustment

Let’s combine the two, while shifting temperatures forward 7 years:

Latitude Drift Adjusted Berkeley Temperature vs. Precipitable Water

Now that makes sense. You know what doesn’t make sense? The “consensus” temperature data. Here it is:

Berkeley Global Summary Temperature vs. Precipitable Water

It is clear that Berkeley (and other similar outfits) do not perform proper latitude drift adjustment and so their result does not match what we should expect to happen to precipitable water level.

What we have here is a great confirmation that mainstream climate science has gone off the rails.

Enjoy 🙂 -Zoe

Air Temperatures and Average Lapse Rate

Today we will examine 40 years (1979-2018) of air temperature data and derive the average lapse rate. First we need some tools:

$ sudo apt install nco gmt

We will be using NCEP Reanalysis 2 data. Grab it:

$ wget -c -O air.nc ftp://ftp.cdc.noaa.gov/Datasets/ncep.reanalysis2.derived/pressure/air.mon.mean.nc

Create a new file air.sh with:

#!/usr/bin/bash
# Zoe Phin, 2019/12/17

P=(1000 925 850 700 600 500 400 300 250 200 150 100 70 50 30 20 10) # Size: 17 

aircsv() {
    for t in `seq 0 479`; do
        for p in `seq 0 16`; do
            for l in `seq 0 72 | sed s/36//`; do
                ncks -v air -d time,$t,$t -d lat,$l,$l -d level,$p,$p air.nc |\
                sed -n "/air =/,/^$/p" | egrep -o '[-0-9].*[0-9]' | tr -s ', ' '\n' | awk -v l=$l '
                    function rad(x) { return x*atan2(0,-1)/180 } {
                    lat = l * 2.5 - 90; lon = n * 2.5; n += 1
                    if (lat < 0) { lat += 1.25 } else { lat -= 1.25 }
                    a = 6378137.678; b = 6356752.964; E = 1-b^2/a^2; r = rad(lat)
                    A = (a*rad(2.5))^2*(1-E)*cos(r)/(1-E*sin(r)^2)^2/510065728777854
                    printf "%5.2f %5.1f %6.2f %12.10f\n", lat, lon, $1/100+465.15, A}'
            done | awk -v T=$t -v P=${P[p]} '
                {S+=$3*$4} END {printf "%3d %4d %6.2f\n", T, P, S}'
        done
    done
}

avgair() {
    for p in ${P[*]}; do
        awk -v P=$p '$2==P{S+=$3;N+=1} END {printf "%8.1f %6.2f\n",P*100,S/N}' air.csv
    done
}

plotavgair() {
    avgair > avgair.csv
    echo 'set term png size 740,740; unset key; 
    set mxtics 2; set mytics 2; set ytics 10000
    set grid mytics ytics xtics
    set title "Temperature (K) vs. Pressure (Pa)"
    set yrange [101325 to 0]
    plot "avgair.csv" u 2:1 w linespoints ps 1 pt 3 lw 3 lc rgb "red"' | gnuplot
}

slopes() {
    for p in ${P[*]}; do 
        printf "%4d: " $p; sed 's/\... / /' .p$p.dat |\
            awk '{T[$1]+=$2}END{for (i in T) printf "%s %6.2f\n",i,T[i]/12}' |\
            gmt gmtregress -Fp -o5 | awk '{printf "%9.6f\n", $1}'
    done
}

plotair() {
    for p in ${P[*]}; do 
        awk -v P=$p '$2==P{printf "%7.2f %6.2f\n", $1/12+1979+1/24, $3}' air.csv > .p$p.dat
    done
    (echo 'set term png size 740,1000; set key below; 
    set mxtics 5; set mytics 5; set grid mytics ytics xtics
    set title "Temperature (K) at Pressure (hPa)"
    set xrange [1978 to 2020]; set yrange [300 to 200]; plot \';
    for p in ${P[*]}; do 
        echo "'.p$p.dat' u 1:2 t '$p' w points,\\"
    done ) | sed '$s/..$//' | gnuplot
}

lapse(){
    avgair > avgair.csv
    I=(`head -1 avgair.csv`)

    awk -v Ps=${I[0]} -v Ts=${I[1]} 'BEGIN{
    #   g  = 9.7977074331       # Based on EGM2008
        g  = 9.7977115          # Account for using Atmosphereless GM constant
        g  = g - 0.16           # Adjust g to middle of the troposphere
        R  = 8.31446261815324   # Gas Constant
        M  = 0.0289644          # from US Standard Atmosphere
    } 
    $1 >= 10000 && $1 < 100000 {
        P = $1; T = $2
        L = (g*M/R)*log(T/Ts)/log(P/Ps)
        printf "%s %10.6f\n", $1, L
    }' avgair.csv |\
        awk '{S+=$2; print} END {printf "\nAVG = %12.10f\n", S/NR}'
}

We source the code, to allow its functions to act like command-line commands we can run in parts.

$ source air.sh

The first thing we must do is generate the data for all the other commands:

$ aircsv > air.csv

This will take several hours to run on an average laptop. Afterwards we are free to run various things quickly. When complete, let’s plot the air data:

$ plotair > p.png

What’s the trend for each pressure?

$ slopes

1000:  0.019277
 925:  0.017171
 850:  0.017911
 700:  0.021696
 600:  0.015600
 500:  0.018723
 400:  0.017133
 300:  0.013247
 250:  0.006388
 200: -0.005289
 150: -0.022488
 100: -0.054646
  70: -0.051159
  50: -0.043313
  30: -0.047701
  20: -0.055906
  10: -0.110508

The above is the trend in °K/year. You’ll notice that the lower atmosphere is getting hotter while the upper atmosphere is getting colder.

This is a signature of reduced insolation together with an even greater reduction in cloud cover. In simple terms: there is less sun power but even fewer clouds, allowing MORE of that sun to get through.

This is NOT a signature of greenhouse gas forcing, as the literature has the magical pivot point at either the beginning of the tropopause (P~100 hPa) or in the middle of the troposphere (P~500 hPa) or where T~242°K (P~410 hPa), and certainly not at P~225 hPa as we see here.

What’s the average for the entire 40 years for each pressure? (Now in Pascals)

$ avgair

100000.0 288.28
 92500.0 284.13
 85000.0 280.74
 70000.0 272.96
 60000.0 266.39
 50000.0 257.76
 40000.0 247.02
 30000.0 233.47
 25000.0 226.07
 20000.0 219.35
 15000.0 212.70
 10000.0 206.58
  7000.0 207.84
  5000.0 211.68
  3000.0 217.00
  2000.0 221.34
  1000.0 229.13

What does that look like?

$ plotavgair > a.png

The troposphere is wherever pressure exceeds 10000 pascals. This may not be the exact figure, but our resolution doesn’t allow any better.

Now I want to know the tropospheric lapse rate.

$ lapse

92500.0   0.006245
85000.0   0.005475
70000.0   0.005140
60000.0   0.005190
50000.0   0.005420
40000.0   0.005660
30000.0   0.005881
25000.0   0.005887
20000.0   0.005700
15000.0   0.005381
10000.0   0.004859

AVG = 0.0055307273

Don't let the precision of the average fool you; the 99%-confidence answer is between 0.0055 and 0.0057 °C/meter. This is an average for a mixed wet/dry troposphere, i.e. the actual troposphere: it is what is actually observed, with the average water vapor level being what it actually is. All those using the 0.0065 °C/m of the US/International Standard Atmosphere are not reflecting reality (only an idealized dry atmosphere model), and should abandon doing so.
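As a spot check of the formula inside lapse(), the first row of its output can be reproduced by hand from the avgair.csv values and the constants in the script:

$ awk 'BEGIN { g = 9.7977115 - 0.16; R = 8.31446261815324; M = 0.0289644
    Ps = 100000; Ts = 288.28     # first row of avgair.csv
    P  = 92500;  T  = 284.13     # second row
    printf "%10.6f\n", (g*M/R)*log(T/Ts)/log(P/Ps) }'

  0.006245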

Enjoy 🙂 -Zoe

Earth Average Surface Gravity

Happy new year, everybody 🙂

It would be interesting to know the average surface gravity of Earth. I looked over at Wikipedia: Standard Gravity and found:

The standard acceleration due to gravity (or standard acceleration of free fall), sometimes abbreviated as standard gravity, usually denoted by ɡ₀ or ɡₙ, is the nominal gravitational acceleration of an object in a vacuum near the surface of the Earth. It is defined by standard as 9.80665 m/s²

If we look at US Standard Atmosphere, we find the same thing. So what’s the problem?

Both sources tell us:

The value of ɡ0 defined above is a nominal midrange value on Earth, originally based on the acceleration of a body in free fall at sea level at a geodetic latitude of 45°.

Nominal midrange? What is that? That’s not an average for the whole Earth. We will have to figure it out by other means.

A National Geospatial Intelligence Agency document tells us how to do it. Appendix B gives us all the math we need. We will however use the latest values for a and b derived from here. Create a new file called efacts.sh with:

#!/usr/bin/bash
# Zoe Phin 2019/12/15

echo '  function __(s, p, n) { printf "%-3s\t%25.*f\n", s, p, n } 
BEGIN { π = atan2(0,-1)

        # Given

    GM = 3.9860046055e14                    # Earth Mass * G
    a  = 6378137.678                        # Semi-major axis
    b  = 6356752.964                        # Semi-minor axis
    ω  = 7.292115e-5                        # Angular Velocity
    G  = 6.67428e-11                        # Gravity "Constant"
    Mₐ = 5.148e18                           # Mass of the Atmosphere    

        # Derived

    Mₑ  = GM / G                            # Mass of the Earth (with Atmosphere)
    Mₒ  = Mₑ − Mₐ                           # Mass of the Earth (w/o Atmosphere)
    GMₒ = G × Mₒ                            # Atmosphere-less Constant

    F  = (a − b) / b                        # Flattening Factor
    f  = 1 / F                              # Inverse Flattening
    e  = √(1 − b²/a²)                       # 1st Eccentricity
    E  = √(a²/b² − 1)                       # 2nd Eccentricity

    Rₚ = a²/b                               # Polar Radius of Curvature
    R₁ = a × (1 − F/3)                      # Mean Radius of the Three Semi-Axes
    R₂ = Rₚ × (1 − (⅔)E² +                 \# Radius of a Sphere of Equal Area 
        (26/45)E⁴ −                         \
        (100/189)E⁶ +                        \
        (7034/14175)E⁸ −                      \
        (220652/467775)E¹⁰ )

    R₃ = ³√(a² × b)                         # Radius of a Sphere of Equal Volume

    t  = tan⁻¹(E)
    q  = ½ × ( (t + 3×t/E²) − 3/E )
    Q  = 3 × ( (1 + 1/E²) × (1 − t/E) )-1

    __( "GM" , 00 , GM )
    GM  = GMₒ               # Warning ! Switching to Atmosphere-less Constant

    m  = (ω² × a ² × b) / GM                # Normal Gravity Formula Constant
    J₂ = (e²/3) × (1−(2×m×E)/(15×q))        # Dynamical  Form  Factor
    U₀ = t*GM/(a×e) + ⅓×ω²×a²               # Normal Gravity Potential
    Gₑ = GM/(a×b) × (1 − m − (m×E×Q)/(6×q)) # Normal Gravity at the Equator
    Gₚ = GM/a² × (1 + m × (E×Q)/(3×q))      # Normal Gravity at the Poles
    k  = (b × Gₚ)/(a × Gₑ) − 1

    Gₘ = Gₑ × (1 + (⅙)e² + (⅓)k +          \# Mean Value of Normal Gravity
        (59/360)e⁴ + (5/18)ke² +            \
        (2371/15120)e⁶ + (259/1080)ke⁴ +     \
        (270229/1814400)e⁸ + (9623/45360)ke⁶) \

    __( "a"  , 03 , a  )
    __( "b"  , 03 , b  )
    __( "ω"  , 11 , ω  )
    __( "G"  , 16 , G  )
    __( "Mₑ" , 00 , Mₑ )
    __( "Mₐ" , 00 , Mₐ )
    __( "Mₒ" , 00 , Mₒ )
    __( "GMₒ", 03 , GMₒ)
    __( "F"  , 16 , F  )
    __( "f"  , 09 , f  )
    __( "e"  , 15 , e  )
    __( "E"  , 15 , E  )
    __( "Rₚ" , 04 , Rₚ )
    __( "R₁" , 04 , R₁ )
    __( "R₂" , 04 , R₂ )
    __( "R₃" , 04 , R₃ )
    __( "q"  , 12 , q  )
    __( "Q"  , 12 , Q  )
    __( "m"  , 14 , m  )
    __( "J₂" , 15 , J₂ )
    __( "U₀" , 04 , U₀ )
    __( "Gₑ" , 10 , Gₑ )
    __( "Gₚ" , 10 , Gₚ )
    __( "k"  , 15 , k  )
    __( "Gₘ" , 10 , Gₘ )
}'| sed 's/#.*//' | sed '/__/s/,/,\\\n/' | sed -r '
        /__/! { y,πωₚₑₐₒₘ₀₁₂₃₄×−,pWpeaom01234*-,;
        s,³√(.*),(\1)^(1/3),g; s,tan⁻¹\(E\),atan2(E\,1),g;
        s,½,1/2,g; s,⅓,1/3,g; s,⅙,1/6,g; s,⅔,2/3,g; s,²,^2,g; 
        s,⁴,^4,g;  s,⁶,^6,g;  s,⁸,^8,g; s,¹⁰,^10,g; s,√,sqrt,g;
        s,\)ke,\)*k*e,g; s,\)([Eke]),\)*\1,g }' | awk -f -

The program is written in unicode to make reading it and following Appendix B much easier. Sorry if your browser doesn’t render the code properly. I just noticed it doesn’t on my smartphone.

Make this file executable and run it:

> chmod +x efacts.sh
> ./efacts.sh

GM 	          398600460550000
a  	              6378137.678
b  	              6356752.964
ω 	            0.00007292115
G  	       0.0000000000667428
Mₑ	5972186671071637111046144
Mₐ	      5148000000000000000
Mₒ	5972181523071637240414208
GMₒ	      398600116958065.625
F  	       0.0033640939204509
f  	            297.256861326
e  	        0.081819240444396
E  	        0.082094488053751
Rₚ	             6399594.3322
R₁	             6370985.4599
R₂	             6371007.8495
R₃	             6371001.4586
q  	           0.000073346392
Q  	           0.002688044572
m  	         0.00344979040432
J₂	        0.001082631230267
U₀	            62636794.1768
Gₑ	             9.7803152684
Gₚ	             9.8321748731
k  	        0.001931854280022
Gₘ	             9.7976331557

Gₘ (mean normal gravity) is the value we’re looking for, and it’s 9.7976331557.

Now if only the actual earth was a model, then we’d already be done. We want the most accurate answer though, and so we need measured data. We will use the EGM2008 standard, as it’s the most popular. We’ll be using this great resource to seek our answer. But first we’ll need to generate a table of latitudes and longitudes:

> for x in `seq -89.5 89.5`; do for y in `seq -179.5 179.5`; do echo $x $y; done; done > latlon.csv

We will upload this file to get the results. Now go there and fill out the form exactly as shown here (note Gm is our atmosphere-less value GMₒ, not GM)

This will take a while …

But Zoe, why are we interested in Normal Gravity and not Gravity? Because we're interested in the acceleration toward the surface, not toward the center of the earth.

To find the average for the whole Earth we will have to go through each latitude,longitude pair and find out their surface area using the formula from here:

dA = a² (1 − e²) cos(ø) dø dl / (1 − e² sin²(ø))²

This will generate an error of ~0.00001281 when we add up all the pairs, but that’s alright because we’ll just divide by a surface area 1.00001281 times as large.

Click on “Download results” when the site is finished, and save to EGM2008.csv in the same folder as the following code.

egm.sh:

#!/usr/bin/bash
# Zoe Phin 2019/12/17

sed 1,32d EGM2008.csv | awk '
    function r(x) { return x*atan2(0,-1)/180 } {
    e = 0.081819240444396; printf "%.12f %.12f\n", $5/1e5, \
        (6378137.678*r(1))^2*(1-e^2)*cos(r($3))/(1-e^2*sin(r($3))^2)^2
}' | awk '
    { S += $1*$2; } END { printf "%.10f\n", S/510072261022076.375 }'

Run it

> bash egm.sh

9.7977074331

The best estimate for Earth's average surface gravity using the most popular reference model is: 9.7977074331 m/s²

Note how little difference there is between this number and the previous mathematically derived value; it's only 0.0000742774.

No one should use the “Standard Gravity” value of 9.80665 if they are interested in the whole earth. Such use will lead to a geometric (relative) error of ~0.0009 and an arithmetic (absolute) error of ~0.0089.
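For reference, here is how those two error figures are computed (standard value versus the EGM2008-derived mean above):

> awk 'BEGIN { g0 = 9.80665; gm = 9.7977074331
    printf "arithmetic: %.7f   geometric: %.7f\n", g0 - gm, g0/gm - 1 }'

arithmetic: 0.0089426   geometric: 0.0009127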

Now these errors might be small, but why not make them smaller? Use my number. Thank you.

Enjoy 🙂 -Zoe

What Global Warming?

Berkeley Earth is a popular resource among climate alarmists. Today I will examine their most popular data, available here. This data is the basis of a very popular meme, the global warming color stripes. But how valid is it? Could it be a misinterpretation?

First, we will need three tools:

> sudo apt install gnuplot ncview nco

Then download & plot:

> (echo 'set term png size 740,370;set grid;unset key;plot "-" u 1:2 w lines lw 2 lc rgb "red"'; wget -qO- http://berkeleyearth.lbl.gov/auto/Global/Land_and_Ocean_summary.txt | awk 'length($1)==4{print $1" "$2}') | gnuplot > t.png
http://berkeleyearth.lbl.gov/auto/Global/Land_and_Ocean_summary.txt (1st and 2nd Column)

Looks like legit global warming! OK, we’re done!

We must demand extremely costly legislation!

No wait …

Let’s download their whole dataset from here. (Warning: File is 400MB+, Don’t Click!)

> wget -c -O temps.nc http://berkeleyearth.lbl.gov/auto/Global/Gridded/Land_and_Ocean_LatLong1.nc

Open up the file with ncview:

> ncview temps.nc

Here is coverage for January of years: 1850, 1875, 1900, 1925, 1950, and 2019

White = Missing Data

As you can see, we obviously did not have global coverage for most of the period of interest. Could it possibly be that the “global” warming reported by Berkeley Earth and similar outfits is actually a statistical artifact of increasing coverage and incompatible comparisons? In other words, is it the whole globe that's warming, or is it just the shifting and growing subset of the globe that's warming?

Good question! Let’s investigate …

We need elevation data so that we can see if that’s a factor. We get that from here. Download using:

> wget -c http://research.jisao.washington.edu/data_sets/elevation/elev.1-deg.nc

The Berkeley Earth data file has information about the baseline temperatures for Jan 1950 through Dec 1980. We examine this first. Create a new file called base.sh, and paste in the following text:

#!/usr/bin/bash
# Zoe Phin, 2019/12/29

ncks --trd -HC elev.1-deg.nc -v data | sed \$d | awk -F[=\ ] '$8>=0{print $8} $8<0{print 0}' > .elev

onemonth() {
	for l in `seq 0 179`; do
		ncks -v climatology -d month_number,$1,$1 -d latitude,$l,$l temps.nc |\
		sed -n '/climatology =/,/;/p' | sed 1d | tr -d ' ;' | tr ',' '\n' |\
		awk -v l=$l 'function rad(x) { return x*atan2(0,-1)/180 } {
			lat = l - 89.5; lon = n - 179.5; n += 1

			a = 6378137.678; b = 6356752.964; E = 1-b^2/a^2;
			x = rad(lat)
			A = (a*rad(1))^2*(1-E)*cos(x)/(1-E*sin(x)^2)^2

			printf "%5.1f %6.1f %10.6f %20.18f\n", lat, lon, $1, A/510072261022077
		}'    
	done > .t$1
	paste .t$1 .elev
}

allmonths() {
	for t in `seq 0 11`; do
		printf "%2d " $(($t+1)); onemonth $t |\
			awk '$3!="nan"{T+=$3*$4;L+=$1*$4;A+=$4;E+=$5*$4}END{
				printf "%10.6f %5.3f %6.3f %7.2f\n",T,A,L/A,E/A}'
	done | awk '{print;S+=$2}END{print "AVG: "S/12}'
}

allmonths

Make this file executable and run it:

> chmod +x base.sh
> ./base.sh

 1  12.243634 1.000  0.000  232.86
 2  12.429942 1.000  0.000  232.86
 3  13.030878 1.000  0.000  232.86
 4  13.999172 1.000  0.000  232.86
 5  14.955463 1.000  0.000  232.86
 6  15.675272 1.000  0.000  232.86
 7  15.915606 1.000  0.000  232.86
 8  15.757542 1.000  0.000  232.86
 9  15.143898 1.000  0.000  232.86
10  14.242063 1.000  0.000  232.86
11  13.198164 1.000  0.000  232.86
12  12.521963 1.000  0.000  232.86
AVG: 14.0928

Here we see the month number, the average area-weighted temperature, coverage (100%), average latitude (equator), and average elevation. Average elevation will come in handy later. At the end is an annual average temperature. So far so good. Now we do latitude analysis. Create a new file latitudes.sh, and paste in the following:

#!/usr/bin/bash
# Zoe Phin, 2019/12/29

paste .t0 .t1 .t2 .t3 .t4 .t5 .t6 .t7 .t8 .t9 .t10 .t11 | awk '{
	$1 += 0.5; if ($1 < 0) $1 = 0 - $1
	printf "%5.1f %6.1f %10.6f\n", $1, $2, 
		($3+$7+$11+$15+$19+$23+$27+$31+$35+$39+$43+$47+$51+$55+$59+$63)/12
}' | awk '{
	L[$1]+=$3 } END {
	for (l in L) {
		L[l]/=720
		if (l == "0.0") { L[l]*=2 }
		printf "%5.1f %10.6f\n", l, L[l]
	}
}'  | sort -n | awk '{print $1" "$2-L;L=$2}'\
	| sed 1d | sed -n 1,24p | awk '{S+=$2}END{print S/NR}'

Now run it:

> bash latitudes.sh

-0.133022

This tells us that in the tropics, for every degree latitude away from the equator, the temperature drops 0.133 °C, according to 1950-1980 baseline data. This fact will come in handy later.
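To put that gradient in perspective, a coverage mix that drifts from the equator to 20° latitude would, from latitude alone, be expected to read roughly 2.66 °C cooler:

> awk 'BEGIN { printf "%.2f\n", 20 * 0.133022 }'

2.66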

Now we are ready to analyze the temporal data. Create a new file called temps.sh and paste the following text into it:

#!/usr/bin/bash
# Zoe Phin, 2019/12/29

ncks -v data elev.1-deg.nc |\
    sed -n '/data =/,/;/p' | sed 1d |\
    tr -d ' ;' | tr ',' '\n' | sed '/^$/d' |\
    awk '$1>=0{print $1}$1<0{print 0}' > .elev

onetime() {
    for l in `seq 0 179`; do
        ncks -v temperature -d time,$1,$1 -d latitude,$l,$l temps.nc |\
        sed -n '/temperature =/,/;/p' | sed 1d | tr -d ' ;' | tr ',' '\n' |\
        awk -v l=$l 'function rad(x) { return x*atan2(0,-1)/180 } {  
            lat = l - 89.5; lon = n - 179.5; n += 1
            a = 6378137.678; b = 6356752.964; E = 1-b^2/a^2;
            x = rad(lat)
            A = (a*rad(1))^2*(1-E)*cos(x)/(1-E*sin(x)^2)^2
            printf "%5.1f %6.1f %10.6f %20.18f\n", lat, lon, $1, A/510072261022077
        }'    
    done > .tmp                                                                                                       
    paste .tmp .elev 
}   

alltimes() {
    for t in `seq 0 2027`; do
        printf "%4d " $t 
        onetime $t | awk '
            $3!="nan"{T+=$3*$4;L+=$1*$4;A+=$4;E+=$5*$4}END{printf "%10.6f %5.3f %6.3f %7.2f\n",T,A,L/A,E/A}'
    done
}   

alltimes

Make the new file executable and run it. Then wait hours. lol

> chmod +x temps.sh
> ./temps.sh | tee temps.csv

While you’re waiting for temps.csv to be completely filled, create a new program that will analyze and adjust the results. Call it analyze.sh

#!/usr/bin/bash
# Zoe Phin, 2019/12/29

monthly() {
    awk '{printf "%4d %02d %10.6f %5.3f %6.3f %6.2f\n",$1/12+1850,$1%12+1,$2,$3,$4,$5}'
}

yearly() {
    awk '{T[$1]+=$3; C[$1]+=$4; L[$1]+=$5; E[$1]+=$6} END {
        for (y in T) { 
            printf "%4d %10.6f %5.3f %6.3f %6.2f\n",y,T[y]/12,C[y]/12,L[y]/12,E[y]/12 
        }
    }' | sort -n
}

adjust() {
    awk '{
        if ($4 < 0) $4 = 0 - $4
        printf "%s  %8.6f  %7.4f\n", $0, $4*0.133022, -0.0057*(232.86-$5)
    }' |\
    awk '{
        printf "%s  = %10.6f\n", $0, $2 + $6 + $7
    }'
}

monthly | yearly | adjust

Notice that I’m using two parameters we discovered earlier: 0.133022 and 232.86 (mean elevation). A third parameter is 0.0057, which is just the average lapse rate in °C/meter.
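To see how these parameters play out, take the 1850 row of the output shown below: its two adjustment columns follow directly from them, and adding those to the raw -0.357290 gives the 0.834285 in the last column.

> awk 'BEGIN { L = 9.075; E = 230.12     # 1850: avg latitude and avg elevation
    printf "%8.6f  %7.4f\n", L*0.133022, -0.0057*(232.86 - E) }'

1.207175  -0.0156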

Make the file executable and run it after temps.csv is completely generated.

> chmod +x analyze.sh
> cat temps.csv | ./analyze.sh | tee final.csv

1850  -0.357290 0.770  9.075 230.12  1.207175  -0.0156 =   0.834285
1851  -0.283336 0.782  7.687 226.25  1.022540  -0.0377 =   0.701504
1852  -0.258400 0.780  7.954 217.99  1.058057  -0.0848 =   0.714857
1853  -0.306750 0.832  5.019 224.64  0.667637  -0.0469 =   0.313987
1854  -0.274175 0.842  3.801 219.65  0.505617  -0.0753 =   0.156142
1855  -0.241830 0.833  3.595 224.71  0.478214  -0.0465 =   0.189884
1856  -0.355853 0.843  3.705 225.26  0.492847  -0.0433 =   0.093694
1857  -0.461720 0.849  3.100 219.24  0.412368  -0.0776 =  -0.126952
1858  -0.354328 0.834  3.874 222.37  0.515327  -0.0598 =   0.101199
1859  -0.313976 0.829  2.207 230.06  0.293580  -0.0160 =  -0.036396
1860  -0.337673 0.747  3.282 224.38  0.436578  -0.0483 =   0.050605
1861  -0.372894 0.664  5.272 187.82  0.701292  -0.2567 =   0.071698
1862  -0.424297 0.689  4.241 216.75  0.564146  -0.0918 =   0.048049
1863  -0.245435 0.604  6.774 167.69  0.901091  -0.3715 =   0.284156
1864  -0.295420 0.690  5.608 193.15  0.745987  -0.2263 =   0.224267
1865  -0.209959 0.606  5.634 190.14  0.749446  -0.2435 =   0.295987
1866  -0.170314 0.587  7.426 210.01  0.987821  -0.1302 =   0.687307
1867  -0.167626 0.605  4.531 208.39  0.602723  -0.1395 =   0.295597
1868  -0.164820 0.693  1.542 221.86  0.205120  -0.0627 =  -0.022400
1869  -0.212144 0.811  3.939 239.15  0.523974   0.0359 =   0.347730
1870  -0.273406 0.841  3.148 237.14  0.418753   0.0244 =   0.169747
1871  -0.281568 0.826  2.503 239.02  0.332954   0.0351 =   0.086486
1872  -0.260434 0.851  2.382 229.30  0.316858  -0.0203 =   0.036124
1873  -0.245535 0.870  3.264 236.25  0.434184   0.0193 =   0.207949
1874  -0.316410 0.858  2.290 236.99  0.304620   0.0235 =   0.011710
1875  -0.343630 0.902  3.548 227.12  0.471962  -0.0327 =   0.095632
1876  -0.351453 0.880  3.113 229.60  0.414097  -0.0186 =   0.044044
1877  -0.010802 0.891  3.061 227.45  0.407180  -0.0308 =   0.365578
1878   0.070294 0.920  3.761 220.60  0.500296  -0.0699 =   0.500690
1879  -0.219576 0.905  3.533 222.53  0.469967  -0.0589 =   0.191491
1880  -0.283391 0.903  2.997 219.83  0.398667  -0.0743 =   0.040976
1881  -0.199763 0.920  3.777 219.21  0.502424  -0.0778 =   0.224861
1882  -0.246604 0.933  3.889 227.58  0.517323  -0.0301 =   0.240619
1883  -0.295346 0.935  3.978 228.95  0.529162  -0.0223 =   0.211516
1884  -0.445828 0.939  3.970 229.57  0.528097  -0.0188 =   0.063469
1885  -0.421912 0.934  4.251 232.04  0.565477  -0.0047 =   0.138865
1886  -0.458854 0.939  4.209 229.30  0.559890  -0.0203 =   0.080736
1887  -0.481525 0.941  4.258 231.56  0.566408  -0.0074 =   0.077483
1888  -0.299911 0.937  4.022 232.11  0.535014  -0.0043 =   0.230803
1889  -0.173886 0.928  3.699 234.59  0.492048   0.0099 =   0.328062
1890  -0.434682 0.914  3.849 232.06  0.512002  -0.0046 =   0.072720
1891  -0.335364 0.905  3.290 240.85  0.437642   0.0455 =   0.147778
1892  -0.382409 0.926  3.818 234.61  0.507878   0.0100 =   0.135469
1893  -0.373769 0.930  3.979 233.86  0.529295   0.0057 =   0.161226
1894  -0.362064 0.913  4.107 233.84  0.546321   0.0056 =   0.189857
1895  -0.301138 0.905  3.449 237.65  0.458793   0.0273 =   0.184955
1896  -0.209349 0.914  3.530 239.58  0.469568   0.0383 =   0.298519
1897  -0.200136 0.919  3.538 238.05  0.470632   0.0296 =   0.300096
1898  -0.395082 0.942  4.022 232.84  0.535014  -0.0001 =   0.139832
1899  -0.239184 0.945  4.072 231.99  0.541666  -0.0050 =   0.297482
1900  -0.134154 0.943  4.175 231.34  0.555367  -0.0087 =   0.412513
1901  -0.205338 0.944  4.127 231.32  0.548982  -0.0088 =   0.334844
1902  -0.334302 0.945  4.046 231.59  0.538207  -0.0072 =   0.196705
1903  -0.435556 0.952  3.585 232.03  0.476884  -0.0047 =   0.036628
1904  -0.472943 0.953  3.521 231.87  0.468370  -0.0056 =  -0.010173
1905  -0.323819 0.952  3.588 232.04  0.477283  -0.0047 =   0.148764
1906  -0.256762 0.952  3.605 232.17  0.479544  -0.0039 =   0.218882
1907  -0.413088 0.952  3.583 232.06  0.476618  -0.0046 =   0.058930
1908  -0.439984 0.953  3.575 231.93  0.475554  -0.0053 =   0.030270
1909  -0.500328 0.953  3.578 231.99  0.475953  -0.0050 =  -0.029375
1910  -0.474652 0.951  3.589 231.81  0.477416  -0.0060 =  -0.003236
1911  -0.487573 0.951  3.567 231.32  0.474489  -0.0088 =  -0.021884
1912  -0.414555 0.954  3.496 231.90  0.465045  -0.0055 =   0.044990
1913  -0.382771 0.953  3.555 231.92  0.472893  -0.0054 =   0.084722
1914  -0.223899 0.951  3.667 232.11  0.487792  -0.0043 =   0.259593
1915  -0.157645 0.949  3.819 232.08  0.508011  -0.0044 =   0.345966
1916  -0.380000 0.941  4.299 232.10  0.571862  -0.0043 =   0.187562
1917  -0.480852 0.931  4.939 229.73  0.656996  -0.0178 =   0.158344
1918  -0.323954 0.908  6.281 225.14  0.835511  -0.0440 =   0.467557
1919  -0.258707 0.918  5.622 226.10  0.747850  -0.0385 =   0.450643
1920  -0.241853 0.943  4.200 232.26  0.558692  -0.0034 =   0.313439
1921  -0.182841 0.948  3.915 232.11  0.520781  -0.0043 =   0.333640
1922  -0.272134 0.953  3.580 232.01  0.476219  -0.0048 =   0.199285
1923  -0.251429 0.950  3.737 232.01  0.497103  -0.0048 =   0.240874
1924  -0.247327 0.951  3.701 231.85  0.492314  -0.0058 =   0.239187
1925  -0.205461 0.950  3.764 231.94  0.500695  -0.0052 =   0.290034
1926  -0.069547 0.952  3.631 232.06  0.483003  -0.0046 =   0.408856
1927  -0.176755 0.952  3.640 232.03  0.484200  -0.0047 =   0.302745
1928  -0.155696 0.950  3.732 232.09  0.496438  -0.0044 =   0.336342
1929  -0.333335 0.951  3.666 231.75  0.487659  -0.0063 =   0.148024
1930  -0.120497 0.950  3.776 232.05  0.502291  -0.0046 =   0.377194
1931  -0.070735 0.952  3.594 231.95  0.478081  -0.0052 =   0.402146
1932  -0.103892 0.953  3.550 231.85  0.472228  -0.0058 =   0.362536
1933  -0.278755 0.954  3.498 232.04  0.465311  -0.0047 =   0.181856
1934  -0.149201 0.953  3.517 232.06  0.467838  -0.0046 =   0.314037
1935  -0.186067 0.953  3.549 232.07  0.472095  -0.0045 =   0.281528
1936  -0.138241 0.955  3.426 231.90  0.455733  -0.0055 =   0.311992
1937   0.023886 0.954  3.484 231.98  0.463449  -0.0050 =   0.482335
1938   0.028060 0.955  3.431 231.90  0.456398  -0.0055 =   0.478958
1939   0.003556 0.950  3.683 230.77  0.489920  -0.0119 =   0.481576
1940   0.099252 0.936  4.594 228.66  0.611103  -0.0239 =   0.686455
1941   0.074650 0.925  5.307 228.02  0.705948  -0.0276 =   0.752998
1942   0.035199 0.925  5.355 227.44  0.712333  -0.0309 =   0.716632
1943   0.070681 0.916  5.850 227.37  0.778179  -0.0313 =   0.817560
1944   0.156531 0.890  7.065 200.57  0.939800  -0.1841 =   0.912231
1945   0.032824 0.877  7.844 205.34  1.043425  -0.1569 =   0.919349
1946  -0.042840 0.950  3.540 229.52  0.470898  -0.0190 =   0.409058
1947   0.050743 0.960  3.004 231.05  0.399598  -0.0103 =   0.440041
1948  -0.064158 0.963  2.719 230.03  0.361687  -0.0161 =   0.281429
1949  -0.084292 0.964  2.721 231.53  0.361953  -0.0076 =   0.270061
1950  -0.161060 0.977  1.753 236.13  0.233188   0.0186 =   0.090728
1951   0.001301 0.985  1.177 235.44  0.156567   0.0147 =   0.172568
1952   0.076559 0.963  2.836 232.62  0.377250  -0.0014 =   0.452409
1953   0.131789 0.960  2.998 232.61  0.398800  -0.0014 =   0.529189
1954  -0.055526 0.975  1.926 231.30  0.256200  -0.0089 =   0.191774
1955  -0.111239 0.981  1.329 229.13  0.176786  -0.0213 =   0.044247
1956  -0.180457 0.997  0.183 232.94  0.024343   0.0005 =  -0.155614
1957   0.051807 0.999  0.078 232.86  0.010376  -0.0000 =   0.062183
1958   0.061004 0.999  0.082 232.72  0.010908  -0.0008 =   0.071112
1959   0.035838 0.997  0.196 232.82  0.026072  -0.0002 =   0.061710
1960  -0.012004 0.998  0.104 232.77  0.013834  -0.0005 =   0.001330
1961   0.073172 0.999  0.078 232.80  0.010376  -0.0003 =   0.083248
1962   0.025351 0.998  0.128 232.91  0.017027   0.0003 =   0.042678
1963   0.065958 0.997  0.165 232.75  0.021949  -0.0006 =   0.087307
1964  -0.204562 0.999  0.051 232.78  0.006784  -0.0005 =  -0.198278
1965  -0.103658 0.999  0.027 232.74  0.003592  -0.0007 =  -0.100766
1966  -0.035893 0.999  0.030 232.84  0.003991  -0.0001 =  -0.032002
1967   0.007587 1.000  0.014 232.89  0.001862   0.0002 =   0.009649
1968  -0.056209 1.000  0.018 232.88  0.002394   0.0001 =  -0.053715
1969   0.086212 1.000  0.009 232.85  0.001197  -0.0001 =   0.087309
1970   0.022673 0.999  0.044 232.89  0.005853   0.0002 =   0.028726
1971  -0.098918 0.999  0.060 232.96  0.007981   0.0006 =  -0.090337
1972  -0.009380 0.999  0.076 233.00  0.010110   0.0008 =   0.001530
1973   0.107283 0.999  0.050 232.97  0.006651   0.0006 =   0.114534
1974  -0.109770 0.999  0.082 232.98  0.010908   0.0007 =  -0.098162
1975  -0.053170 0.999  0.065 232.98  0.008646   0.0007 =  -0.043824
1976  -0.161628 0.998  0.155 233.01  0.020618   0.0009 =  -0.140110
1977   0.136867 0.999  0.041 232.94  0.005454   0.0005 =   0.142821
1978   0.009959 1.000  0.017 232.90  0.002261   0.0002 =   0.012420
1979   0.096247 1.000  0.005 232.87  0.000665   0.0001 =   0.097012
1980   0.215677 1.000  0.000 232.86  0.000000  -0.0000 =   0.215677
1981   0.264261 1.000  0.000 232.86  0.000000  -0.0000 =   0.264261
1982   0.044925 1.000  0.000 232.86  0.000000  -0.0000 =   0.044925
1983   0.237702 1.000  0.000 232.86  0.000000  -0.0000 =   0.237702
1984   0.072032 1.000  0.000 232.86  0.000000  -0.0000 =   0.072032
1985   0.059067 1.000  0.000 232.86  0.000000  -0.0000 =   0.059067
1986   0.113913 1.000  0.000 232.86  0.000000  -0.0000 =   0.113913
1987   0.246261 1.000  0.000 232.86  0.000000  -0.0000 =   0.246261
1988   0.292035 1.000  0.000 232.86  0.000000  -0.0000 =   0.292035
1989   0.175688 1.000  0.000 232.86  0.000000  -0.0000 =   0.175688
1990   0.367576 1.000  0.000 232.86  0.000000  -0.0000 =   0.367576
1991   0.350261 1.000  0.000 232.86  0.000000  -0.0000 =   0.350261
1992   0.163420 1.000  0.000 232.86  0.000000  -0.0000 =   0.163420
1993   0.195115 1.000  0.000 232.86  0.000000  -0.0000 =   0.195115
1994   0.246145 1.000  0.000 232.86  0.000000  -0.0000 =   0.246145
1995   0.390852 1.000  0.000 232.86  0.000000  -0.0000 =   0.390852
1996   0.293093 1.000  0.000 232.86  0.000000  -0.0000 =   0.293093
1997   0.436165 1.000  0.000 232.86  0.000000  -0.0000 =   0.436165
1998   0.587775 1.000  0.000 232.86  0.000000  -0.0000 =   0.587775
1999   0.353628 1.000  0.000 232.86  0.000000  -0.0000 =   0.353628
2000   0.366550 1.000  0.000 232.86  0.000000  -0.0000 =   0.366550
2001   0.508265 1.000  0.000 232.86  0.000000  -0.0000 =   0.508265
2002   0.586163 1.000  0.000 232.86  0.000000  -0.0000 =   0.586163
2003   0.570285 1.000  0.000 232.86  0.000000  -0.0000 =   0.570285
2004   0.473007 1.000  0.000 232.86  0.000000  -0.0000 =   0.473007
2005   0.655408 1.000  0.000 232.86  0.000000  -0.0000 =   0.655408
2006   0.607471 1.000  0.000 232.86  0.000000  -0.0000 =   0.607471
2007   0.616768 1.000  0.000 232.86  0.000000  -0.0000 =   0.616768
2008   0.482518 1.000  0.000 232.86  0.000000  -0.0000 =   0.482518
2009   0.615649 1.000  0.000 232.86  0.000000  -0.0000 =   0.615649
2010   0.688333 1.000  0.000 232.86  0.000000  -0.0000 =   0.688333
2011   0.573430 1.000  0.000 232.86  0.000000  -0.0000 =   0.573430
2012   0.585135 1.000  0.000 232.86  0.000000  -0.0000 =   0.585135
2013   0.611063 1.000  0.000 232.86  0.000000  -0.0000 =   0.611063
2014   0.675435 1.000  0.000 232.86  0.000000  -0.0000 =   0.675435
2015   0.813036 1.000  0.000 232.86  0.000000  -0.0000 =   0.813036
2016   0.951589 1.000  0.000 232.86  0.000000  -0.0000 =   0.951589
2017   0.836485 1.000  0.000 232.86  0.000000  -0.0000 =   0.836485
2018   0.770258 1.000  0.000 232.86  0.000000  -0.0000 =   0.770258

Columns:

1. Year
2. Original Temperature Anomaly
3. Global Coverage (1 = 100%)
4. Average Latitude
5. Average Elevation
6. Latitude Adjustment Needed
7. Elevation Adjustment Needed
8. "=" (literal separator printed by analyze.sh)
9. Final Temperature Anomaly = Column 2 + Column 6 + Column 7

Now we plot final.csv

> echo 'set term png size 740,370;set grid;unset key;plot "final.csv" u 1:9 w lines lw 2 lc rgb "red"' | gnuplot > final.png
Latitude and Elevation Adjusted Temperature Anomaly

Reality sure looks a lot different once we properly adjust for shifting average latitude and elevation.

What are the 10 hottest years?

> cat final.csv | awk '{print $9" "$1}' | sort -rn | head

0.951589 2016
0.919349 1945
0.912231 1944
0.836485 2017
0.834285 1850
0.817560 1943
0.813036 2015
0.770258 2018
0.752998 1941
0.716632 1942

What’s the trend?

> sudo apt install gmt
> cat final.csv | awk '{print $1" "$9}' | gmt gmtregress -Fp -o5

0.0013332235319

From 1850 to 2018, we’ve been warming up by 0.0013 °C/year.
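
If you’d rather not install gmt, a plain least-squares fit in awk should give essentially the same slope (a sketch):

> cat final.csv | awk '{n++; sx+=$1; sy+=$9; sxx+=$1*$1; sxy+=$1*$9} END {print (n*sxy-sx*sy)/(n*sxx-sx*sx)}'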

We’ve warmed up by 0.22 °C since 1850. I wouldn’t worry about it! A linear regression is inappropriate anyway. The data is obviously cyclical.

Summary: After adjusting Berkeley Earth’s data for latitude and elevation, we found no serious global warming.

Latitude and Elevation Adjusted Temperature Anomaly. Trend Slope = 0.00133°C/year.

Enjoy 🙂 -Zoe


Addendum:

Average Latitude Drift; Slope = -0.0327°/year

In case it wasn’t obvious: the so-called “global warming” is primarily due to the incompleteness of the data, and to the average latitude drifting south towards the equator in the data we do have.

Data Coverage, 0 to 1 = 0 to 100%
Mean Elevation, meters

We should rename Climate Change to Historic Mean Latitude Change.

Who’s with me?


Update 2020/01/03

A YouTube channel operator claims that Berkeley already performed a latitude and elevation adjustment. This is absolutely bogus. He has since censored our discussion thread from public view. Let’s address this anyway…

In the Berkeley grid data, there are 16635 cells (out of 64800) that have complete data from 1850 to 2018. I compare these 16635 cells to Berkeley’s global summary (the first chart in this post). Result:

There ought to have been a huge difference between 1850 and ~1980 if they really accounted for latitude. There is hardly any. It’s also painfully obvious that Berkeley stuffs the missing data with neighboring data and model interpolations, and the result is hardly different from just taking a plain area-weighted average of time-persistent locations, which have a VERY Northern bias.

Update 2020/01/04

Below is a very long-term weather station record from the Netherlands, one of the oldest continuously operating stations in existence.

What do you see? Uhuh

Before you accuse me of cherrypicking, consider that this cherry has been getting wrapped in more CO2 and it didn’t make a lick of difference. Why is that? Uhuh

Update 2020/01/17

More confirmation that what I’ve done here is correct is available here.

Ocean Cover and Hypsometry

What percent of the Earth is ocean? We can figure that out by continuing from here. Run:

> awk '$3>0{print 0" "$4}$3<=0{print 1" "$4}' i.csv > x.csv
> awk '{T[$1]+=$2;S+=$2}END{print T[0]/S" "T[1]/S}' x.csv

0.290505 0.709495

The answer is 70.95%.
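
Recall that in i.csv (generated by etop.sh further below), column 3 is elevation and column 4 is cell area. The same two fractions can be computed in a single pass, without the intermediate x.csv (a sketch):

> awk '{S+=$4; if ($3<=0) O+=$4} END {print (S-O)/S, O/S}' i.csv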

We can also generate a hypsometric curve (elevation versus the cumulative fraction of Earth’s surface area at or above that elevation):

> cat i.csv | awk '{print $3" "$4}' | sort -rn > hypso.raw
> awk '{T[$1]+=$2;S+=$2}END{for (i in T){print i" "T[i]}}' hypso.raw | sort -rn -k 1.1 > hypso.dat
> awk '{N+=$2;printf "%d %30.28f\n",$1,N/510065728777854.5264}' hypso.dat > hypso.csv

To plot this curve, create a new file called hypso.plot and paste in the text below:

set term png size 740,370
unset key
set xtics 0.1
set mxtics 2
set grid mxtics xtics ytics
set xrange [-0.01 to 1.01]
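# fill the portion of the curve above sea level in orange (land) and below it in blue (ocean)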
plot "hypso.csv" u 2:1 w filledcurves above y=0 fc "orange",\
     "hypso.csv" u 2:1 w filledcurves below y=1 fc "blue"

Make sure you have gnuplot.

> sudo apt install gnuplot

Generate an image with:

> gnuplot hypso.plot > hypso.png

You should get:

Earth Hypsometry (meters)

Enjoy 🙂 -Zoe

Earth Average Elevations and Depths

What is the average depth of the ocean? What is the average elevation on land? And what is the average elevation of the Earth? I will answer all of these questions.

I get my data from NOAA:

ETOPO1 is a 1 arc-minute global relief model of Earth’s surface that integrates land topography and ocean bathymetry. Built from global and regional data sets, it is available in “Ice Surface” (top of Antarctic and Greenland ice sheets) and “Bedrock” (base of the ice sheets).

I need a tool to read NOAA’s files.

> sudo apt install nco

Download the relevant files:

> wget -c https://www.ngdc.noaa.gov/mgg/global/relief/ETOPO1/data/ice_surface/cell_registered/netcdf/ETOPO1_Ice_c_gmt4.grd.gz
> wget -c https://www.ngdc.noaa.gov/mgg/global/relief/ETOPO1/data/bedrock/cell_registered/netcdf/ETOPO1_Bed_c_gmt4.grd.gz

Decompress and rename the files (etop.sh below expects them as etop01i.nc and etop01b.nc):

> gunzip ETOPO1_Ice_c_gmt4.grd.gz; mv ETOPO1_Ice_c_gmt4.grd etop01i.nc
> gunzip ETOPO1_Bed_c_gmt4.grd.gz; mv ETOPO1_Bed_c_gmt4.grd etop01b.nc

Now here is my main program. Save the following to a new file etop.sh.

#!/usr/bin/bash
# Zoe Phin, 2019/12/17

for l in `seq 0 10799`; do
    ncks -v z -d y,$l,$l etop01$1.nc | sed -n '/z =/,/^$/p' | egrep -o '[-0-9].*[0-9]' | tr -s ', ' '\n' | awk -v l=$l '  
    function rad(x) { return x*atan2(0,-1)/180 }  
    {
        a = 6378137.678; b = 6356752.964; E = 1-b^2/a^2;                         

        lon = n - 180 + 1/120; n+=1/60     
        lat = l/60 - 90 + 1/120

        x = rad(lat)    
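        # A = approximate area (m^2) of this 1-arcminute cell on the ellipsoid, used later as an area weight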
        A = rad(1/60)^2*a^2*(1-E)*cos(x)/(1-E*sin(x)^2)^2      

        printf "%7.3f %8.3f %5.0f %6.3f\n", lat, lon, $1, A 
    }'   
done 

This program takes one argument: b for bedrock or i for ice. Make the program executable and run it. (This will take quite a bit of time and generates two ~8.5 GB files.)

> chmod +x etop.sh
> ./etop.sh b > b.csv
> ./etop.sh i > i.csv

I will use the Earth surface area from this article (the long constant in the script below, in square meters). Now I write an additional program: etop2.sh

#!/usr/bin/bash

echo -n 'Over land, Bedrock, Water surface = 0 ... '
awk '$3>0 {S+=$3*$4}       END {printf "%8.3f\n", S/510065728777854.5264}' b.csv 
echo -n 'Over land, Bedrock, No water included ...'
awk '$3>0 {S+=$3*$4;N+=$4} END {printf "%8.3f\n", S/N}' b.csv 
echo -n 'Under water, Bedrock, No land Included ... '
awk '$3<=0{S+=$3*$4;N+=$4} END {printf "%8.3f\n", S/N}' b.csv
echo -n 'Bedrock, Overall Average ... '
awk '     {S+=$3*$4}       END {printf "%8.3f\n", S/510065728777854.5264}' b.csv 

echo -n 'Over land, Ice Top, Water surface = 0 ... '
awk '$3>0 {S+=$3*$4}       END {printf "%8.3f\n", S/510065728777854.5264}' i.csv 
echo -n 'Over land, Ice Top, No water included ...'
awk '$3>0 {S+=$3*$4;N+=$4} END {printf "%8.3f\n", S/N}' i.csv 
echo -n 'Under water, Ice Top, No land Included ... '
awk '$3<=0{S+=$3*$4;N+=$4} END {printf "%8.3f\n", S/N}' i.csv 
echo -n 'Ice Top, Overall Average ... '
awk '     {S+=$3*$4}       END {printf "%8.3f\n", S/510065728777854.5264}' i.csv 

Make it executable and run it: (This will take >30 minutes on an average laptop)

> chmod +x etop2.sh
> ./etop2.sh

Over land, Bedrock, Water surface = 0 ...  180.573
Over land, Bedrock, No water included ... 651.100
Under water, Bedrock, No land Included ... -3624.931
Bedrock, Overall Average ... -2439.039
Over land, Ice Top, Water surface = 0 ...  231.404
Over land, Ice Top, No water included ... 796.560
Under water, Ice Top, No land Included ... -3683.895
Ice Top, Overall Average ... -2382.302

It’s interesting to note that total ice stacks up to an average of ~51 meters globe-wide (231.4 m minus 180.6 m, the “water surface = 0” land averages for Ice Top and Bedrock).

Using Ice Top data:

The average depth of the ocean is 3683.9 meters.

The average elevation on land is 796.6 meters.

Averaged over the whole globe, with the ocean surface counted as 0 m, the height above sea level is 231.4 meters.

Treating the ocean bottom and land top as one surface, the average height is 2382.3 meters BELOW sea level.

Enjoy 🙂 -Zoe

Why is Venus so hot?

Continuing in the correct tradition, Venus is hot MOSTLY due to internal (“geo”thermal) reasons – and NASA knows it.

https://www.nas.nasa.gov/SC13/assets/images/content/33_Smrekar_S_Figure7_SC13_big.jpg

Duh!

No stupid runaway greenfraud effect necessary.

-Zoe

This article is dedicated to Hans Schreuder, R.I.P.

Update – 2020/01/14

Somebody complained that NASA’s diagram shows geothermal as being a little over 500K, which is not enough for Venus.

This has no credibility. Here is a zoom-in of the diagram:

Treating the 500 marking as pixel 0, the 2500 marking appears at pixel 546. This means each pixel represents 2000 / 546 = 3.663 kelvin.

The center of the dark black geothermal line falls at pixel 65.

500 + 65 * 3.663 = 738.095
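
The same arithmetic as a one-liner, for anyone who wants to check it:

> awk 'BEGIN { print 500 + 65 * (2000/546) }'

738.095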

According to NASA, Venus’ surface temperature is 737 K.